Advanced search

Media (91)

Other articles (51)

  • Sites built with MediaSPIP

    2 May 2011

    This page presents some of the sites running MediaSPIP.
    You can of course add your own using the form at the bottom of the page.

  • Keeping control of your media in your hands

    13 April 2011

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media-sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

  • Participating in its translation

    10 April 2011

    You can help us improve the wording used in the software, or translate it into any new language, allowing it to reach new linguistic communities.
    To do so, we use SPIP's translation interface, where all of MediaSPIP's language modules are available. You simply need to sign up to the translators' mailing list to ask for more information.
    At present MediaSPIP is only available in French and (...)

On other sites (9569)

  • Can't play rtp stream from ffmpeg/avconv, no data received

    6 March 2014, by Foo Barazz

    I started avserver on my Raspberry Pi with a webcam attached, and I read from /dev/video0 with:

    pi@raspberrypi $ avconv -f video4linux2 -i /dev/video0 -vcodec mpeg2video -r 25 -pix_fmt yuv420p -me_method epzs -b 2600k -bt 256k -f rtp rtp://192.168.0.124:8090

    avconv version 0.8.6-6:0.8.6-1+rpi1, Copyright (c) 2000-2013 the Libav developers
     built on Mar 31 2013 13:58:10 with gcc 4.6.3
    [video4linux2 @ 0x17c1720] Estimating duration from bitrate, this may be inaccurate
    Input #0, video4linux2, from '/dev/video0':
     Duration: N/A, start: 615.594215, bitrate: 36864 kb/s
       Stream #0.0: Video: rawvideo, yuyv422, 320x240, 36864 kb/s, 30 tbr, 1000k tbn, 30 tbc
    [buffer @ 0x17c16e0] w:320 h:240 pixfmt:yuyv422
    [avsink @ 0x17c2f00] auto-inserting filter 'auto-inserted scaler 0' between the filter 'src' and the filter 'out'
    [scale @ 0x17c34c0] w:320 h:240 fmt:yuyv422 -> w:320 h:240 fmt:yuv420p flags:0x4
    Output #0, rtp, to 'rtp://192.168.0.124:8090':
     Metadata:
       encoder         : Lavf53.21.1
       Stream #0.0: Video: mpeg2video, yuv420p, 320x240, q=2-31, 2600 kb/s, 90k tbn, 25 tbc
    Stream mapping:
     Stream #0:0 -> #0:0 (rawvideo -> mpeg2video)
    SDP:
    v=0
    o=- 0 0 IN IP4 127.0.0.1
    s=No Name
    c=IN IP4 192.168.0.124
    t=0 0
    a=tool:libavformat 53.21.1
    m=video 8090 RTP/AVP 32
    b=AS:2600

    Press ctrl-c to stop encoding
    frame=  576 fps= 25 q=2.0 size=    2133kB time=23.00 bitrate= 759.8kbits/s dup=390 drop=0    
    frame=  590 fps= 25 q=2.0 size=    2191kB time=23.56 bitrate= 762.0kbits/s dup=400 drop=0    
    frame= 1320 fps= 25 q=2.0 size=    4932kB time=52.76 bitrate= 765.8kbits/s dup=908 drop=0
    ...

    It seems to work fine; it reads data from the webcam.

    Now I'm simply trying to play the stream with ffplay from my Mac:

    $ ffplay rtp://192.168.0.124:8090
    ffplay version 1.2.4 Copyright (c) 2003-2013 the FFmpeg developers
     built on Mar  1 2014 15:18:21 with Apple LLVM version 5.0 (clang-500.2.79) (based on LLVM 3.3svn)
     configuration: --prefix=/usr/local/Cellar/ffmpeg/1.2.4 --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-nonfree --enable-hardcoded-tables --enable-avresample --enable-vda --cc=clang --host-cflags= --host-ldflags= --enable-libx264 --enable-libfaac --enable-libmp3lame --enable-libxvid --enable-libfreetype --enable-libtheora --enable-libvorbis --enable-libvpx --enable-librtmp --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libvo-aacenc --enable-libass --enable-ffplay --enable-libspeex --enable-libschroedinger --enable-libfdk-aac --enable-libopus --enable-frei0r --enable-libopenjpeg --extra-cflags='-I/usr/local/Cellar/openjpeg/1.5.1/include/openjpeg-1.5 '
     libavutil      52. 18.100 / 52. 18.100
     libavcodec     54. 92.100 / 54. 92.100
     libavformat    54. 63.104 / 54. 63.104
     libavdevice    54.  3.103 / 54.  3.103
     libavfilter     3. 42.103 /  3. 42.103
     libswscale      2.  2.100 /  2.  2.100
     libswresample   0. 17.102 /  0. 17.102
     libpostproc    52.  2.100 / 52.  2.100
       nan A-V:  0.000 fd=   0 aq=    0KB vq=    0KB sq=    0B f=0/0  
       nan A-V:  0.000 fd=   0 aq=    0KB vq=    0KB sq=    0B f=0/0  
       nan A-V:  0.000 fd=   0 aq=    0KB vq=    0KB sq=    0B f=0/0
       ...

    The video doesn't open, and it seems that no data is being received from the Raspberry Pi.

    I use the default configuration for avserver.
    The webcam is definitely working, as I managed to write out images from it with avconv.

    What did I miss?
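    A note on a likely cause (my addition, not from the original thread): ffplay pointed at a bare rtp:// URL has no way to know the payload format; the receiver normally needs the SDP description that the sender printed. A minimal sketch, assuming the SDP block shown above is copied verbatim to the receiving machine:

```shell
# Save the SDP that avconv printed into a file on the receiving machine
# (the values below are copied from the sender's output above), then play
# the SDP file instead of the bare rtp:// URL.
cat > stream.sdp <<'EOF'
v=0
o=- 0 0 IN IP4 127.0.0.1
s=No Name
c=IN IP4 192.168.0.124
t=0 0
a=tool:libavformat 53.21.1
m=video 8090 RTP/AVP 32
b=AS:2600
EOF
# ffplay stream.sdp
# (newer FFmpeg builds may additionally require:
#  ffplay -protocol_whitelist file,udp,rtp stream.sdp)
```

    Note that the m= line carries the RTP port (8090), so the SDP must match the port used in the sender's rtp:// URL.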

  • FFmpeg player backporting to Android 2.1 - one more problem

    22 April 2024, by tretdm

    I looked for a lot of information about how to build and use FFmpeg in early versions of Android, studied the source code of players from 2011-2014, and was able to easily build FFmpeg 4.0.4 and 3.1.4 on the NDKv5 platform. The main points I identified are:

    • <android/bitmap.h> and <android/native_window.h> did not exist before Android 2.2 (API level 8);
    • some effort is needed to implement buffer management for the A/V streams, since in practice, when playing video, the application silently crashed after a few seconds due to a buffer overflow (code examples in C++ and Java below);
    • FFmpeg is, in my opinion, the only way to support a sufficient number of codecs that are not officially included in Android 2.1 and above.

    void decodeVideoFromPacket(JNIEnv *env, jobject instance,
                           jclass mplayer_class, AVPacket avpkt,
                           int total_frames, int length) {
        AVFrame     *pFrame = NULL;
        AVFrame     *pFrameRGB = NULL;
        pFrame = avcodec_alloc_frame();
        pFrameRGB = avcodec_alloc_frame();
        int frame_size = avpicture_get_size(PIX_FMT_RGB32, gVideoCodecCtx->width, gVideoCodecCtx->height);
        unsigned char* buffer = (unsigned char*)av_malloc((size_t)frame_size * 3);
        if (!buffer) {
            av_free(pFrame);
            av_free(pFrameRGB);
            return;
        }
        jbyteArray buffer2;
        jmethodID renderVideoFrames = env->GetMethodID(mplayer_class, "renderVideoFrames", "([BI)V");
        int frameDecoded;
        avpicture_fill((AVPicture*) pFrame,
                       buffer,
                       gVideoCodecCtx->pix_fmt,
                       gVideoCodecCtx->width,
                       gVideoCodecCtx->height
                      );

        if (avpkt.stream_index == gVideoStreamIndex) { // If video stream found
            int size = avpkt.size;
            total_frames++;
            struct SwsContext *img_convert_ctx = NULL;
            avcodec_decode_video2(gVideoCodecCtx, pFrame, &frameDecoded, &avpkt);
            if (!frameDecoded || pFrame == NULL) {
                return;
            }

            try {
                PixelFormat pxf;
                // RGB565 by default for Android Canvas in pre-Gingerbread devices.
                if(android::get_android_api_version(env) >= ANDROID_API_CODENAME_GINGERBREAD) {
                    pxf = PIX_FMT_BGR32;
                } else {
                    pxf = PIX_FMT_RGB565;
                }

                int rgbBytes = avpicture_get_size(pxf, gVideoCodecCtx->width,
                                                gVideoCodecCtx->height);

                // Converting YUV to RGB frame & RGB frame to char* buffer

                buffer = convertYuv2Rgb(pxf, pFrame, rgbBytes); // result of av_image_copy_to_buffer()

                if(buffer == NULL) {
                    return;
                }

                buffer2 = env->NewByteArray((jsize) rgbBytes);
                env->SetByteArrayRegion(buffer2, 0, (jsize) rgbBytes,
                                        (jbyte *) buffer);
                env->CallVoidMethod(instance, renderVideoFrames, buffer2, rgbBytes);
                env->DeleteLocalRef(buffer2);
                free(buffer);
            } catch (...) {
                if (debug_mode) {
                    LOGE(10, "[ERROR] Render video frames failed");
                    return;
                }
            }
        }
    }


    private void renderVideoFrames(final byte[] buffer, final int length) {
        new Thread(new Runnable() {
            @Override
            public void run() {
                Canvas c;
                VideoTrack track = null;
                for (int tracks_index = 0; tracks_index < tracks.size(); tracks_index++) {
                    if (tracks.get(tracks_index) instanceof VideoTrack) {
                        track = (VideoTrack) tracks.get(tracks_index);
                    }
                }
                if (track != null) {
                    int frame_width = track.frame_size[0];
                    int frame_height = track.frame_size[1];
                    if (frame_width > 0 && frame_height > 0) {
                        try {
                            // RGB_565  == 65K colours (16 bit)
                            // RGB_8888 == 16.7M colours (24 bit w/ alpha ch.)
                            int bpp = Build.VERSION.SDK_INT > 9 ? 16 : 24;
                            Bitmap.Config bmp_config =
                                    bpp == 24 ? Bitmap.Config.RGB_565 : Bitmap.Config.ARGB_8888;
                            Paint paint = new Paint();
                            if(buffer != null && holder != null) {
                                holder.setType(SurfaceHolder.SURFACE_TYPE_NORMAL);
                                if((c = holder.lockCanvas()) == null) {
                                    Log.d(MPLAY_TAG, "Lock canvas failed");
                                    return;
                                }
                                ByteBuffer bbuf =
                                        ByteBuffer.allocateDirect(minVideoBufferSize);
                                bbuf.rewind();
                                for(int i = 0; i < buffer.length; i++) {
                                    bbuf.put(i, buffer[i]);
                                }
                                bbuf.rewind();

                                // The approximate location where the application crashed.
                                Bitmap bmp = Bitmap.createBitmap(frame_width, frame_height, bmp_config);
                                bmp.copyPixelsFromBuffer(bbuf);

                                float aspect_ratio = (float) frame_width / (float) frame_height;
                                int scaled_width = (int)(aspect_ratio * (c.getHeight()));
                                c.drawBitmap(bmp,
                                        null,
                                        new RectF(
                                                ((c.getWidth() - scaled_width) / 2), 0,
                                                ((c.getWidth() - scaled_width) / 2) + scaled_width,
                                                c.getHeight()),
                                        null);
                                holder.unlockCanvasAndPost(c);
                                bmp.recycle();
                                bbuf.clear();
                            } else {
                                Log.d(MPLAY_TAG, "Video frame buffer is null");
                            }
                        } catch (Exception ex) {
                            ex.printStackTrace();
                        } catch (OutOfMemoryError oom) {
                            oom.printStackTrace();
                            stop();
                        }
                    }
                }
            }
        }).start();
    }

    Exception (tested in an Android 4.1.2 emulator):

    E/dalvikvm-heap: Out of memory on a 1228812-byte allocation
    I/dalvikvm: "Thread-495" prio=5 tid=21 RUNNABLE
       ................................................
         at android.graphics.Bitmap.nativeCreate(Native Method)
         at android.graphics.Bitmap.createBitmap(Bitmap.java:640)
         at android.graphics.Bitmap.createBitmap(Bitmap.java:620)
         at [app_package_name].MediaPlayer$5.run(MediaPlayer.java:406)
         at java.lang.Thread.run(Thread.java:856)

    For clarification: I first compiled FFmpeg 0.11.x in a virtual machine running Ubuntu 12.04 LTS, using a build script I had written, and looked for player examples suitable for Android below 2.2 (unfortunately there is little information about them). When I opened a file in the player, it crashed with a stack or buffer overflow after showing the first frames, so I put off developing the player for some time.

    Is there anything ready-made, ideally contained in a single C++ file, that takes all the nuances of backporting into account? Thanks in advance.

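    On the out-of-memory crash above: renderVideoFrames() allocates a new direct ByteBuffer and a new Bitmap for every frame, on a new thread per frame, so allocation can outpace the garbage collector. A minimal sketch of the reuse pattern, as an assumption rather than a tested Android patch (Android's Bitmap/SurfaceHolder classes are omitted so the idea can be shown in plain Java):

```java
import java.nio.ByteBuffer;

// Sketch: allocate the per-frame scratch buffer once and refill it, instead
// of allocating a fresh direct buffer (and, on Android, a fresh Bitmap) for
// every decoded frame. The fixed frame size per track is an assumption.
public class FrameBufferReuse {
    private final ByteBuffer scratch;   // allocated once, reused per frame

    public FrameBufferReuse(int frameBytes) {
        scratch = ByteBuffer.allocateDirect(frameBytes);
    }

    // Copy one decoded frame into the reusable buffer; the bulk put() also
    // replaces the per-byte loop from the original renderVideoFrames().
    public ByteBuffer fill(byte[] frame) {
        scratch.clear();
        scratch.put(frame, 0, Math.min(frame.length, scratch.capacity()));
        scratch.rewind();
        return scratch;
    }

    public static void main(String[] args) {
        FrameBufferReuse r = new FrameBufferReuse(16);
        ByteBuffer b1 = r.fill(new byte[]{1, 2, 3, 4});
        ByteBuffer b2 = r.fill(new byte[]{5, 6, 7, 8});
        // Same buffer object both times: no per-frame allocation.
        System.out.println(b1 == b2);   // prints "true"
        System.out.println(b2.get(0));  // prints "5"
    }
}
```

    On the device the same idea would apply to the Bitmap: create it once per track, on one long-lived render thread, call copyPixelsFromBuffer() on each frame, and recycle it only when playback stops.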

  • Split video with ffmpeg segment option is missing frame

    9 February 2024, by Dan

    I’m trying to get the ffmpeg “segment” option to split my video into segments at the I-frames. I'm using ffmpeg v6.1.1.

    First I added time stamps to each frame of my video so that when it plays, I can see exactly which frame is being displayed. I used this command:

    ffmpeg -i In.mp4 -vf "drawtext=fontfile='C:\Windows\Fonts\Arial.ttf': text='%{frame_num} ~ %{pts}':fontsize=200: r=25: x=(w-tw)/2: y=h-(2*lh): fontcolor=white: box=1: boxcolor=0x00000099" -y Out.mp4

    Then I used ffprobe to confirm that the video is 30 FPS and that the I-frames are at the following times:

    0.000000
    4.933333
    10.000000
    11.533333
    18.866667
    24.966667

    Based on these I-frame times, I’d expect the following segments:

    Start Frame   Start Time   End Frame   End Time
    0             0            147         4.900000
    148           4.933333     299         9.966667
    300           10.000000    345         11.500000
    346           11.533333    565         18.833334
    566           18.866667    748         24.933334
    749           24.966667    867         28.906667

    When I use ffmpeg to split the video into segments with the following command, I get six files as expected:

    ffmpeg -i Out.mp4 -f segment -c copy -reset_timestamps 1 -map 0 "Out %d.mp4"


    When I play the segments, they are all correct except the first segment file (Out 0.mp4). It seems to be missing the last frame. It contains frames 0 to 146 (4.866667 sec) but should also include frame 147 (4.9 sec). All the other segment files are as expected.
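    As a cross-check (my addition, not from the original post), the expected frame ranges in the table above can be recomputed from the reported I-frame timestamps and the 30 fps rate; frame 147 should indeed close segment 0:

```shell
# Derive expected segment frame ranges: each I-frame timestamp * fps gives the
# first frame of a segment; a segment ends one frame before the next I-frame.
# The total of 868 frames (last frame 867) is taken from the table above.
printf '%s\n' 0.000000 4.933333 10.000000 11.533333 18.866667 24.966667 |
awk -v fps=30 -v total=868 '
    { start[NR] = int($1 * fps + 0.5) }
    END {
        for (i = 1; i <= NR; i++) {
            end = (i < NR) ? start[i + 1] - 1 : total - 1
            printf "segment %d: frames %d-%d\n", i - 1, start[i], end
        }
    }'
```

    This prints segment 0 as frames 0-147, matching the table, which confirms that an "Out 0.mp4" ending at frame 146 is one frame short.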

    I’ve tried this on several different mp4 videos, and all of them are missing the last frame of the first segment.

    Any idea why my first segment file is missing the last frame of the segment?

    Could this be an ffmpeg bug?

    Thanks for the help!
    Dan

    Here is my console session with all output:

    C:\> ffprobe Out.mp4
    ffprobe version 2023-12-21-git-1e42a48e37-full_build-www.gyan.dev Copyright (c) 2007-2023 the FFmpeg developers
      built with gcc 12.2.0 (Rev10, Built by MSYS2 project)
      configuration: --enable-gpl --enable-version3 --enable-static --pkg-config=pkgconf --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libaribb24 --enable-libaribcaption --enable-libdav1d --enable-libdavs2 --enable-libuavs3d --enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxvid --enable-libaom --enable-libjxl --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-liblensfun --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-dxva2 --enable-d3d11va --enable-libvpl --enable-libshaderc --enable-vulkan --enable-libplacebo --enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame --enable-libvo-amrwbenc --enable-libcodec2 --enable-libilbc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite --enable-libmysofa --enable-librubberband --enable-libsoxr --enable-chromaprint
      libavutil      58. 36.100 / 58. 36.100
      libavcodec     60. 36.100 / 60. 36.100
      libavformat    60. 20.100 / 60. 20.100
      libavdevice    60.  4.100 / 60.  4.100
      libavfilter     9. 14.100 /  9. 14.100
      libswscale      7.  6.100 /  7.  6.100
      libswresample   4. 13.100 /  4. 13.100
      libpostproc    57.  4.100 / 57.  4.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'Out.mp4':
      Metadata:
        major_brand     : isom
        minor_version   : 512
        compatible_brands: isomiso2avc1mp41
        title           : Short 4k video sample - 4K Ultra HD (3840x2160)
        date            : 2014:05:24 19:00:00
        encoder         : Lavf60.20.100
      Duration: 00:00:28.96, start: 0.000000, bitrate: 3181 kb/s
      Stream #0:0[0x1](und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 1920x1080 [SAR 1:1 DAR 16:9], 3045 kb/s, 30 fps, 30 tbr, 15360 tbn (default)
          Metadata:
            handler_name    : VideoHandler
            vendor_id       : [0][0][0][0]
            encoder         : Lavc60.36.100 libx264
      Stream #0:1[0x2](und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
          Metadata:
            handler_name    : SoundHandler
            vendor_id       : [0][0][0][0]

    C:\> ffprobe -loglevel error -skip_frame nokey -select_streams v:0 -show_entries frame=pts_time -of csv=print_section=0 Out.mp4
    0.000000,
    4.933333
    10.000000
    11.533333
    18.866667
    24.966667

    C:\> ffmpeg -i Out.mp4 -f segment -c copy -reset_timestamps 1 -map 0 "Out %1d.mp4"
    ffmpeg version 2023-12-21-git-1e42a48e37-full_build-www.gyan.dev Copyright (c) 2000-2023 the FFmpeg developers
      built with gcc 12.2.0 (Rev10, Built by MSYS2 project)
      configuration: --enable-gpl --enable-version3 --enable-static --pkg-config=pkgconf --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libaribb24 --enable-libaribcaption --enable-libdav1d --enable-libdavs2 --enable-libuavs3d --enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxvid --enable-libaom --enable-libjxl --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-liblensfun --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-dxva2 --enable-d3d11va --enable-libvpl --enable-libshaderc --enable-vulkan --enable-libplacebo --enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame --enable-libvo-amrwbenc --enable-libcodec2 --enable-libilbc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite --enable-libmysofa --enable-librubberband --enable-libsoxr --enable-chromaprint
      libavutil      58. 36.100 / 58. 36.100
      libavcodec     60. 36.100 / 60. 36.100
      libavformat    60. 20.100 / 60. 20.100
      libavdevice    60.  4.100 / 60.  4.100
      libavfilter     9. 14.100 /  9. 14.100
      libswscale      7.  6.100 /  7.  6.100
      libswresample   4. 13.100 /  4. 13.100
      libpostproc    57.  4.100 / 57.  4.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'Out.mp4':
      Metadata:
        major_brand     : isom
        minor_version   : 512
        compatible_brands: isomiso2avc1mp41
        title           : Short 4k video sample - 4K Ultra HD (3840x2160)
        date            : 2014:05:24 19:00:00
        encoder         : Lavf60.20.100
      Duration: 00:00:28.96, start: 0.000000, bitrate: 3181 kb/s
      Stream #0:0[0x1](und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 1920x1080 [SAR 1:1 DAR 16:9], 3045 kb/s, 30 fps, 30 tbr, 15360 tbn (default)
          Metadata:
            handler_name    : VideoHandler
            vendor_id       : [0][0][0][0]
            encoder         : Lavc60.36.100 libx264
      Stream #0:1[0x2](und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
          Metadata:
            handler_name    : SoundHandler
            vendor_id       : [0][0][0][0]
    Stream mapping:
      Stream #0:0 -> #0:0 (copy)
      Stream #0:1 -> #0:1 (copy)
    [segment @ 00000195bbc52940] Opening 'Out 0.mp4' for writing
    Output #0, segment, to 'Out %1d.mp4':
      Metadata:
        major_brand     : isom
        minor_version   : 512
        compatible_brands: isomiso2avc1mp41
        title           : Short 4k video sample - 4K Ultra HD (3840x2160)
        date            : 2014:05:24 19:00:00
        encoder         : Lavf60.20.100
      Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 1920x1080 [SAR 1:1 DAR 16:9], q=2-31, 3045 kb/s, 30 fps, 30 tbr, 15360 tbn (default)
          Metadata:
            handler_name    : VideoHandler
            vendor_id       : [0][0][0][0]
            encoder         : Lavc60.36.100 libx264
      Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
          Metadata:
            handler_name    : SoundHandler
            vendor_id       : [0][0][0][0]
    Press [q] to stop, [?] for help
    [segment @ 00000195bbc52940] Opening 'Out 1.mp4' for writing
    [segment @ 00000195bbc52940] Opening 'Out 2.mp4' for writing
    [segment @ 00000195bbc52940] Opening 'Out 3.mp4' for writing
    [segment @ 00000195bbc52940] Opening 'Out 4.mp4' for writing
    [segment @ 00000195bbc52940] Opening 'Out 5.mp4' for writing
    [out#0/segment @ 00000195bc3e8cc0] video:10757kB audio:456kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
    size=N/A time=00:00:28.86 bitrate=N/A speed= 322x
