
Media (91)

Other articles (77)

  • Support for all media types

    10 April 2011

    Unlike many other software packages and modern document-sharing platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); textual content, code or other (OpenOffice, Microsoft Office (spreadsheets, presentations), web (html, css), LaTeX, Google Earth) (...)

  • List of compatible distributions

    26 April 2011, by

    The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.

    Distribution name    Version name            Version number
    Debian               Squeeze                 6.x.x
    Debian               Wheezy                  7.x.x
    Debian               Jessie                  8.x.x
    Ubuntu               The Precise Pangolin    12.04 LTS
    Ubuntu               The Trusty Tahr         14.04

    If you want to help us improve this list, you can provide us with access to a machine whose distribution is not mentioned above or send the necessary fixes to add (...)

  • Automated installation script of MediaSPIP

    25 April 2011, by

    To overcome the difficulties caused mainly by the installation of server-side software dependencies, an "all-in-one" installation script written in bash was created to simplify this step on a server running a compatible Linux distribution.
    To use it, you must have SSH access to your server and a root account, which the script uses to install the dependencies. Contact your provider if you do not have these.
    The documentation for using this installation script is available here.
    The code of this (...)

On other sites (12105)

  • How do I add text and a watermark to each image using FFmpeg?

    14 November 2023, by Oleh

    I wrote some code that converts several images into a video using com.arthenica:ffmpeg-kit-full:6.0-2, and everything works, but I also need to draw text on each image and overlay a watermark photo on it, and I can't implement that. Could you please help me with this?

    


    Here is the code that converts an array of images into a video:

    


    public void ImagesToVideo(ArrayList<String> pathList) {
        String outputVideoPath = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS) + "/" + Calendar.getInstance().getTimeInMillis() + ".mp4";
        try {
            StringBuilder strCommand = new StringBuilder();

            int size = pathList.size();

            for (int i = 0; i < size; i++) {
                strCommand.append("-framerate 8 -i '")
                        .append(pathList.get(i))
                        .append("' ");
            }

            strCommand.append("-filter_complex \"");

            for (int i = 0; i < size; i++) {
                strCommand.append("[")
                        .append(i)
                        .append(":v]setpts=PTS-STARTPTS+1[v")
                        .append(i)
                        .append("];");
            }

            for (int i = 0; i < size; i++) {
                strCommand.append("[v")
                        .append(i)
                        .append("]");
            }
            strCommand.append("concat=n=")
                    .append(size)
                    .append(":v=1:a=0,format=yuv420p\" -r 30 -b:v 4M -preset veryfast '")
                    .append(outputVideoPath)
                    .append("'");

            FFmpegKit.executeAsync(strCommand.toString(), new FFmpegSessionCompleteCallback() {
                @Override
                public void apply(FFmpegSession session) {
                    SessionState state = session.getState();
                    ReturnCode returnCode = session.getReturnCode();

                    // CALLED WHEN SESSION IS EXECUTED
                    Log.d("IPRIPR", String.format("FFmpeg process exited with state %s and rc %s.%s", state, returnCode, session.getFailStackTrace()));
                }
            }, new LogCallback() {
                @Override
                public void apply(com.arthenica.ffmpegkit.Log log) {

                    // CALLED WHEN SESSION PRINTS LOGS

                }
            }, new StatisticsCallback() {
                @Override
                public void apply(Statistics statistics) {

                }
            });

        } catch (Exception e) {
        }
    }


  • ffplay: how does it calculate the FPS for playback?

    21 October 2020, by Daniel

    I'm trying to play back a live media stream (H.264) produced by a hardware encoder.


    The desired FPS on the encoder is set to 20, and the encoder's logs print FPS statistics every minute:


    2020-10-21 17:26:54.787 [  info] video_stream_thread(),video chn 0, fps: 19.989270
    2020-10-21 17:27:54.836 [  info] video_stream_thread(),video chn 0, fps: 19.989270
    2020-10-21 17:28:54.837 [  info] video_stream_thread(),video chn 0, fps: 20.005924
    2020-10-21 17:29:54.837 [  info] video_stream_thread(),video chn 0, fps: 19.989270
    2020-10-21 17:30:54.888 [  info] video_stream_thread(),video chn 0, fps: 19.989274
    2020-10-21 17:31:54.918 [  info] video_stream_thread(),video chn 0, fps: 19.989264


    You can see it varies, but not by much; it stays around 20.


    Question 1: Is this normal, or should it be exactly 20 every time? To avoid confusion: I'd like to know whether, by the H.264 standard, this can be accepted as a valid stream or whether it violates some rule.


    I'm trying to play back this stream with ffplay:


    $ ffplay rtsp://this_stream
    Input #0, rtsp, from 'xyz'
      Metadata:
        title           : 
        comment         : substream
      Duration: N/A, start: 0.040000, bitrate: N/A
        Stream #0:0: Video: h264 (Constrained Baseline), yuv420p(progressive), 640x360, 25 fps, 25 tbr, 90k tbn, 180k tbc


    The thing is that ffplay thinks this is a 25 fps stream. It also plays 25 frames per second, causing the stream to stall and rebuffer every few seconds.


    I believe the FPS is calculated from pts/dts values in the stream itself and is not hardcoded. Am I wrong here?


    If I'm not wrong, why does ffplay think this stream runs at 25 fps when it actually runs at (around) 20?
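
    For reference, here is a minimal libavformat sketch (not ffplay's actual code) that prints the frame-rate fields the demuxer derives for each stream; this is roughly the information a player consults when the bitstream itself carries no reliable timing. The build command is an assumption, and the rtsp://this_stream URL is only the placeholder from the question:

    /* fps_probe.c - print the frame-rate fields libavformat derives for each stream.
     * Build (assumed): gcc fps_probe.c -o fps_probe $(pkg-config --cflags --libs libavformat libavutil) */
    #include <stdio.h>
    #include <libavformat/avformat.h>

    int main(int argc, char **argv)
    {
        const char *url = argc > 1 ? argv[1] : "rtsp://this_stream"; /* placeholder input */
        AVFormatContext *fmt = NULL;

        if (avformat_open_input(&fmt, url, NULL, NULL) < 0) {
            fprintf(stderr, "could not open %s\n", url);
            return 1;
        }
        if (avformat_find_stream_info(fmt, NULL) < 0) { /* reads a few packets to estimate rates */
            fprintf(stderr, "could not read stream info\n");
            return 1;
        }

        for (unsigned i = 0; i < fmt->nb_streams; i++) {
            AVStream *st = fmt->streams[i];
            /* roughly what ffplay consults when picking a frame rate */
            AVRational guess = av_guess_frame_rate(fmt, st, NULL);
            printf("stream %u: avg_frame_rate=%d/%d r_frame_rate=%d/%d guess=%.3f fps\n",
                   i,
                   st->avg_frame_rate.num, st->avg_frame_rate.den,
                   st->r_frame_rate.num, st->r_frame_rate.den,
                   (guess.num && guess.den) ? av_q2d(guess) : 0.0);
        }

        avformat_close_input(&fmt);
        return 0;
    }

    One common cause of a guess like this is an encoder that omits timing information from the H.264 VUI, in which case FFmpeg falls back to a default rate rather than the real one.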


  • FFmpeg 5 C API codec end-of-stream situation

    11 March 2023, by Guanyuming He

    I'm new to FFmpeg API programming (I'm using version 5.1) and am learning from the documentation and official examples.


    On the documentation page about the send/receive encoding and decoding API overview, the end-of-stream situation is discussed briefly:


    End of stream situations. These require "flushing" (aka draining) the codec, as the codec might buffer multiple frames or packets internally for performance or out of necessity (consider B-frames). This is handled as follows:


    Instead of valid input, send NULL to the avcodec_send_packet() (decoding) or avcodec_send_frame() (encoding) functions. This will enter draining mode.
    Call avcodec_receive_frame() (decoding) or avcodec_receive_packet() (encoding) in a loop until AVERROR_EOF is returned. The functions will not return AVERROR(EAGAIN), unless you forgot to enter draining mode.
    Before decoding can be resumed again, the codec has to be reset with avcodec_flush_buffers().


    As I understand it, when I get AVERROR_EOF (from av_read_frame(), for example), I have reached a point where I need to drain the buffered data from the codec and finally reset the codec with avcodec_flush_buffers(). Without doing that, I cannot continue decoding/encoding.
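
    A minimal decode-side sketch of that procedure is below; dec_ctx is assumed to be an opened AVCodecContext, frame an allocated AVFrame, and handle_frame() a hypothetical consumer of decoded frames:

    #include <libavcodec/avcodec.h>

    /* hypothetical consumer of decoded frames, defined elsewhere */
    void handle_frame(const AVFrame *frame);

    /* Drain a decoder after the last real packet has been sent. */
    static int drain_decoder(AVCodecContext *dec_ctx, AVFrame *frame)
    {
        int ret;

        /* Enter draining mode: send NULL instead of a real packet. */
        ret = avcodec_send_packet(dec_ctx, NULL);
        if (ret < 0)
            return ret;

        /* Pull out whatever the decoder still buffers (B-frame reordering, etc.). */
        for (;;) {
            ret = avcodec_receive_frame(dec_ctx, frame);
            if (ret == AVERROR_EOF)
                break;               /* nothing left: draining is finished */
            if (ret < 0)
                return ret;          /* a real error */
            handle_frame(frame);     /* frames received while draining are normal output */
            av_frame_unref(frame);
        }

        /* Required before this codec context can be fed new input again. */
        avcodec_flush_buffers(dec_ctx);
        return 0;
    }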


    Then I have some questions:


    1. If I received EOF when I have already finished sending data (e.g. after av_read_frame() returns EOF), how should I tell whether it's really finished?

    2. Should I treat the data returned from the receive_... functions during draining as valid?

    I might have found answers to these in the official examples, but I'm not sure whether the answer is universally true. I noticed that in some official examples, such as transcode_aac.c, draining is only done for the first EOF reached, and once the second one is received, it is taken to mean that there is really nothing left. Any data received during draining is also written to the final output.


    I just wonder: is this true for all multimedia files in FFmpeg?


    I appreciate your response and time in advance. :)
