Advanced search

Media (91)

Other articles (74)

  • The SPIPmotion queue

    28 November 2010, by

    A queue stored in the database
    When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
    This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document should be attached automatically; objet, the type of object to which (...)

  • Customizing by adding your logo, banner or background image

    5 September 2013, by

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MédiaSpip installation is at version 0.2 or later. If needed, contact the administrator of your MédiaSpip to find out.

On other sites (5883)

  • Changing portrait videos to landscape aspect with latest ffmpeg release is breaking

    19 December 2016, by zeal

    I am having an issue scaling videos recorded in portrait with the latest version of ffmpeg (2.1). This worked fine in a previous version, but I need the latest version to fix a different issue.

    I am trying to take any video passed in and make it 852 wide by 480 high. It works fine when converting videos that are wider than they are tall, but when the video is taller than it is wide, it corrupts the video. It actually adds a letterbox to the top and bottom rather than to the left and right. Also, the metadata shows the correct height and width, but it is wrong when played in Windows.

    Here are the parameters I am using (see also the note after the console output below):

    ffmpeg -i INPUT -s 852x480 -r 30 -aspect 1.775 -b:v 2000000 -vcodec mpeg4 -vf "scale=iw*min(852/iw\,480/ih):ih*min(852/iw\,480/ih),pad=852:480:(852-iw)/2:(480-ih)/2" -ac 2 -b:a 128k -ar 44100 -y OUTPUT

    Console output from ffmpeg -i INPUT -r 30 -b:v 2000000 -vcodec mpeg4 -vf "scale=852:480" -ac 2 -b:a 128k -ar 44100 -y OUTPUT:

    C:\Lib>ffmpeg -i "d\ca96cd13-2995-4794-b753-22be3b918659.mov" -r 30 -b:v 2000000 -vcodec mpeg4 -vf "
    scale=852:480" -ac 2 -b:a 128k -ar 44100 -y "d\portx.mp4"
    ffmpeg version N-58015-g8cdf4e0 Copyright (c) 2000-2013 the FFmpeg developers
     built on Nov 10 2013 18:04:45 with gcc 4.8.2 (GCC)
     configuration: --disable-static --enable-shared --enable-gpl --enable-version3 --disable-w32thread
    s --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-icon
    v --enable-libass --enable-libbluray --enable-libcaca --enable-libfreetype --enable-libgsm --enable-
    libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrw
    b --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsoxr -
    -enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-aacenc --
    enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libx264 --enab
    le-libxavs --enable-libxvid --enable-zlib
     libavutil      52. 52.100 / 52. 52.100
     libavcodec     55. 41.100 / 55. 41.100
     libavformat    55. 21.100 / 55. 21.100
     libavdevice    55.  5.100 / 55.  5.100
     libavfilter     3. 90.102 /  3. 90.102
     libswscale      2.  5.101 /  2.  5.101
     libswresample   0. 17.104 /  0. 17.104
     libpostproc    52.  3.100 / 52.  3.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'd\ca96cd13-2995-4794-b753-22be3b918659.mov':
     Metadata:
       major_brand     : qt
       minor_version   : 0
       compatible_brands: qt
       creation_time   : 2013-11-12 15:02:21
       model           : iPhone 5
       model-eng       : iPhone 5
       encoder         : 7.0.2
       encoder-eng     : 7.0.2
       date            : 2013-11-12T10:02:21-0500
       date-eng        : 2013-11-12T10:02:21-0500
       make            : Apple
       make-eng        : Apple
     Duration: 00:00:15.48, start: 0.000000, bitrate: 780 kb/s
       Stream #0:0(und): Video: h264 (Baseline) (avc1 / 0x31637661), yuv420p(tv, smpte170m), 480x360, 7
    05 kb/s, 29.98 fps, 30 tbr, 600 tbn, 1200 tbc (default)
       Metadata:
         rotate          : 90
         creation_time   : 2013-11-12 15:02:21
         handler_name    : Core Media Data Handler
       Stream #0:1(und): Audio: aac (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 62 kb/s (default)
       Metadata:
         creation_time   : 2013-11-12 15:02:21
         handler_name    : Core Media Data Handler
    Output #0, mp4, to 'd\portx.mp4':
     Metadata:
       major_brand     : qt
       minor_version   : 0
       compatible_brands: qt
       make-eng        : Apple
       model           : iPhone 5
       model-eng       : iPhone 5
       make            : Apple
       encoder-eng     : 7.0.2
       date            : 2013-11-12T10:02:21-0500
       date-eng        : 2013-11-12T10:02:21-0500
       encoder         : Lavf55.21.100
       Stream #0:0(und): Video: mpeg4 ( [0][0][0] / 0x0020), yuv420p, 852x480, q=2-31, 2000 kb/s, 15360
    tbn, 30 tbc (default)
       Metadata:
         rotate          : 90
         creation_time   : 2013-11-12 15:02:21
         handler_name    : Core Media Data Handler
       Stream #0:1(und): Audio: aac (libvo_aacenc) ([64][0][0][0] / 0x0040), 44100 Hz, stereo, s16, 128
    kb/s (default)
       Metadata:
         creation_time   : 2013-11-12 15:02:21
         handler_name    : Core Media Data Handler
    Stream mapping:
     Stream #0:0 -> #0:0 (h264 -> mpeg4)
     Stream #0:1 -> #0:1 (aac -> libvo_aacenc)
    Press [q] to stop, [?] for help
    frame=  465 fps=417 q=3.8 Lsize=    4118kB time=00:00:15.52 bitrate=2173.3kbits/s
    video:3860kB audio:243kB subtitle:0 global headers:0kB muxing overhead 0.348231%
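
    A note on a possible cause, going by the log above: the source is 480x360 with a rotate: 90 tag, and that tag is copied into the 852x480 output, so players that honour it rotate the already letterboxed frame back into portrait. The build shown here (November 2013) predates ffmpeg's automatic rotation on decode, so a hedged sketch under that assumption is to rotate the pixels explicitly and clear the tag; the transpose direction and the rotate=0 override are assumptions to verify against the actual footage:

    ffmpeg -i INPUT -r 30 -b:v 2000000 -vcodec mpeg4 -vf "transpose=1,scale=iw*min(852/iw\,480/ih):ih*min(852/iw\,480/ih),pad=852:480:(852-iw)/2:(480-ih)/2" -metadata:s:v:0 rotate=0 -ac 2 -b:a 128k -ar 44100 -y OUTPUT

    On builds new enough to auto-rotate decoded frames (roughly 2.7 and later), the transpose filter should be dropped, since the frames already arrive in portrait orientation.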
  • FFmpeg avcodec_decode_video2 decode RTSP H264 HD-video packet to video picture with error

    29 May 2018, by Nguyen Ba Thi

    I used FFmpeg library version 4.0 in a simple C++ program, in which a thread receives RTSP H264 video data from an IP camera and displays it in the program window.

    The code of this thread is as follows:

    DWORD WINAPI GrabbProcess(LPVOID lpParam)
    // Grabbing thread
    {
     DWORD i;
     int ret = 0, nPacket=0;
     FILE *pktFile;
     // Open video file
     pFormatCtx = avformat_alloc_context();
     if(avformat_open_input(&pFormatCtx, nameVideoStream, NULL, NULL)!=0)
         fGrabb=-1; // Couldn't open file
     else
     // Retrieve stream information
     if(avformat_find_stream_info(pFormatCtx, NULL)<0)
         fGrabb=-2; // Couldn't find stream information
     else
     {
         // Find the first video stream
         videoStream=-1;
          for(i=0; i < pFormatCtx->nb_streams; i++)
           if(pFormatCtx->streams[i]->codec->codec_type==AVMEDIA_TYPE_VIDEO)
           {
             videoStream=i;
             break;
           }
         if(videoStream==-1)
             fGrabb=-3; // Didn't find a video stream
         else
         {
             // Get a pointer to the codec context for the video stream
             pCodecCtxOrig=pFormatCtx->streams[videoStream]->codec;
             // Find the decoder for the video stream
             pCodec=avcodec_find_decoder(pCodecCtxOrig->codec_id);
             if(pCodec==NULL)
                 fGrabb=-4; // Codec not found
             else
             {
                 // Copy context
                 pCodecCtx = avcodec_alloc_context3(pCodec);
                 if(avcodec_copy_context(pCodecCtx, pCodecCtxOrig) != 0)
                     fGrabb=-5; // Error copying codec context
                 else
                 {
                     // Open codec
                     if(avcodec_open2(pCodecCtx, pCodec, NULL)<0)
                         fGrabb=-6; // Could not open codec
                     else
                     // Allocate video frame for input
                     pFrame=av_frame_alloc();
                     // Determine required buffer size and allocate buffer
                     numBytes=avpicture_get_size(pCodecCtx->pix_fmt, pCodecCtx->width,
                         pCodecCtx->height);
                     buffer=(uint8_t *)av_malloc(numBytes*sizeof(uint8_t));
                     // Assign appropriate parts of buffer to image planes in pFrame
                     // Note that pFrame is an AVFrame, but AVFrame is a superset
                     // of AVPicture
                     avpicture_fill((AVPicture *)pFrame, buffer, pCodecCtx->pix_fmt,
                         pCodecCtx->width, pCodecCtx->height);

                     // Allocate video frame for display
                     pFrameRGB=av_frame_alloc();
                     // Determine required buffer size and allocate buffer
                     numBytes=avpicture_get_size(AV_PIX_FMT_RGB24, pCodecCtx->width,
                         pCodecCtx->height);
                     bufferRGB=(uint8_t *)av_malloc(numBytes*sizeof(uint8_t));
                     // Assign appropriate parts of buffer to image planes in pFrameRGB
                     // Note that pFrameRGB is an AVFrame, but AVFrame is a superset
                     // of AVPicture
                     avpicture_fill((AVPicture *)pFrameRGB, bufferRGB, AV_PIX_FMT_RGB24,
                         pCodecCtx->width, pCodecCtx->height);
                     // initialize SWS context for software scaling to FMT_RGB24
                     sws_ctx_to_RGB = sws_getContext(pCodecCtx->width,
                         pCodecCtx->height,
                         pCodecCtx->pix_fmt,
                         pCodecCtx->width,
                         pCodecCtx->height,
                         AV_PIX_FMT_RGB24,
                         SWS_BILINEAR,
                         NULL,
                         NULL,
                         NULL);

                     // Allocate video frame (grayscale YUV420P) for processing
                     pFrameYUV=av_frame_alloc();
                     // Determine required buffer size and allocate buffer
                     numBytes=avpicture_get_size(AV_PIX_FMT_YUV420P, pCodecCtx->width,
                         pCodecCtx->height);
                     bufferYUV=(uint8_t *)av_malloc(numBytes*sizeof(uint8_t));
                     // Assign appropriate parts of buffer to image planes in pFrameYUV
                     // Note that pFrameYUV is an AVFrame, but AVFrame is a superset
                     // of AVPicture
                     avpicture_fill((AVPicture *)pFrameYUV, bufferYUV, AV_PIX_FMT_YUV420P,
                         pCodecCtx->width, pCodecCtx->height);
                     // initialize SWS context for software scaling to FMT_YUV420P
                     sws_ctx_to_YUV = sws_getContext(pCodecCtx->width,
                         pCodecCtx->height,
                         pCodecCtx->pix_fmt,
                         pCodecCtx->width,
                         pCodecCtx->height,
                         AV_PIX_FMT_YUV420P,
                         SWS_BILINEAR,
                         NULL,
                         NULL,
                         NULL);
                   RealBsqHdr.biWidth = pCodecCtx->width;
                   RealBsqHdr.biHeight = -pCodecCtx->height;
                 }
             }
         }
     }
     while ((fGrabb==1)||(fGrabb==100))
     {
         // Grabb a frame
         if (av_read_frame(pFormatCtx, &packet) >= 0)
         {
           // Is this a packet from the video stream?
           if(packet.stream_index==videoStream)
           {
               // Decode video frame
               int len = avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
               nPacket++;
               // Did we get a video frame?
               if(frameFinished)
               {
                   // Convert the image from its native format to YUV
                   sws_scale(sws_ctx_to_YUV, (uint8_t const * const *)pFrame->data,
                       pFrame->linesize, 0, pCodecCtx->height,
                       pFrameYUV->data, pFrameYUV->linesize);
                   // Convert the image from its native format to RGB
                   sws_scale(sws_ctx_to_RGB, (uint8_t const * const *)pFrame->data,
                       pFrame->linesize, 0, pCodecCtx->height,
                       pFrameRGB->data, pFrameRGB->linesize);
                   HDC hdc=GetDC(hWndM);
                   SetDIBitsToDevice(hdc, 0, 0, pCodecCtx->width, pCodecCtx->height,
                       0, 0, 0, pCodecCtx->height,pFrameRGB->data[0], (LPBITMAPINFO)&RealBsqHdr, DIB_RGB_COLORS);
                   ReleaseDC(hWndM,hdc);
                   av_frame_unref(pFrame);
               }
           }
           // Free the packet that was allocated by av_read_frame
           av_free_packet(&packet);
         }
      }
      // Free the org frame
     av_frame_free(&pFrame);
     // Free the RGB frame
     av_frame_free(&pFrameRGB);
     // Free the YUV frame
     av_frame_free(&pFrameYUV);

     // Close the codec
     avcodec_close(pCodecCtx);
     avcodec_close(pCodecCtxOrig);

     // Close the video file
     avformat_close_input(&pFormatCtx);
     avformat_free_context(pFormatCtx);

     if (fGrabb==1)
         sprintf(tmpstr,"Grabbing Completed %d frames", nCntTotal);
     else if (fGrabb==2)
         sprintf(tmpstr,"User break on %d frames", nCntTotal);
     else if (fGrabb==3)
         sprintf(tmpstr,"Can't Grabb at frame %d", nCntTotal);
     else if (fGrabb==-1)
         sprintf(tmpstr,"Couldn't open file");
     else if (fGrabb==-2)
         sprintf(tmpstr,"Couldn't find stream information");
     else if (fGrabb==-3)
         sprintf(tmpstr,"Didn't find a video stream");
     else if (fGrabb==-4)
         sprintf(tmpstr,"Codec not found");
     else if (fGrabb==-5)
         sprintf(tmpstr,"Error copying codec context");
     else if (fGrabb==-6)
         sprintf(tmpstr,"Could not open codec");
     i=(UINT) fGrabb;
     fGrabb=0;
     SetWindowText(hWndM,tmpstr);
     ExitThread(i);
     return 0;
    }
    // End Grabbing thread  

    When the program receives an RTSP H264 video stream at resolution 704x576, the decoded video pictures are OK. When it receives an RTSP H264 HD video stream at resolution 1280x720, the first video picture appears to be decoded correctly, but the following pictures are always decoded with some error.

    Please help me to fix this problem!

    Here is a brief summary of the problem:
    I have an IP camera, model HI3518E_50H10L_S39 (product of China).
    The camera can provide an H264 video stream either at resolution 704x576 (with RTSP URI "rtsp://192.168.1.18:554/user=admin_password=tlJwpbo6_channel=1_stream=1.sdp?real_stream") or at 1280x720 (with RTSP URI "rtsp://192.168.1.18:554/user=admin_password=tlJwpbo6_channel=1_stream=0.sdp?real_stream").
    Using the FFplay utility I can access and display both streams with good picture quality.
    To test grabbing from this camera, I have the simple program mentioned above, built in VC-2005. In the "Grabbing thread" the program uses FFmpeg library version 4.0 to open the camera RTSP stream, retrieve stream information, find the first video stream... and prepare some variables.
    The core of this thread is a loop: grab a frame (av_read_frame), decode it if it is video (avcodec_decode_video2), convert it to RGB format (sws_scale), and display it in the program window (GDI function SetDIBitsToDevice).
    When the program runs with the camera RTSP stream at resolution 704x576, I get a good video picture. Here is a sample:
    704x576 sample
    When the program runs with the camera RTSP stream at resolution 1280x720, the first video picture is good:
    First good at res. 1280x720
    but then not good:
    not good at res. 1280x720
    It seems that my call to avcodec_decode_video2 cannot fully decode certain packets for some reason (a hedged guess is sketched below).
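
    One hedged guess, given that the 704x576 stream decodes fine while only the 1280x720 stream breaks after the first picture: HD H264 packets are much larger, and over RTSP/UDP they are more likely to arrive truncated or out of order, which avcodec_decode_video2 then reports as corrupt slices. A minimal sketch under that assumption is to ask the RTSP demuxer for TCP interleaving (and a larger receive buffer) in place of the existing avformat_open_input call; rtsp_transport and buffer_size are standard FFmpeg RTSP options, but whether they cure this particular camera is an assumption to test:

     // Hedged sketch: carry RTP over the RTSP TCP connection so large HD
     // packets are not lost or fragmented, as they can be over UDP.
     AVDictionary *rtspOpts = NULL;                       // local to this sketch, not in the original code
     av_dict_set(&rtspOpts, "rtsp_transport", "tcp", 0);  // interleave RTP in the RTSP TCP session
     av_dict_set(&rtspOpts, "buffer_size", "1048576", 0); // bigger socket buffer (mainly matters for UDP)
     pFormatCtx = avformat_alloc_context();
     if (avformat_open_input(&pFormatCtx, nameVideoStream, NULL, &rtspOpts) != 0)
         fGrabb = -1; // Couldn't open stream
     av_dict_free(&rtspOpts);

    Separately, avcodec_decode_video2 is deprecated in FFmpeg 4.0; moving the loop to avcodec_send_packet/avcodec_receive_frame is worth considering, though that on its own is unlikely to change the corruption.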

  • How to keep transparency when scale webm file with ffmpeg

    5 October 2022, by Sonia Kidman

    I'm using ffmpeg to scale my WEBM file, using the command below:
ffmpeg -i in.webm -c:v libvpx -vf scale=100:100 out.webm
    The output has the correct resolution, as expected, but the problem is that the transparency becomes a black background.

    Could someone give me a solution for this?

    Thank you so much.

    Below is the log of the operation (a hedged fix is sketched after it):

    ffmpeg version 3.4 Copyright (c) 2000-2017 the FFmpeg developers
  built with gcc 7.2.0 (GCC)
  configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-bzlib --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvorbis --enable-cuda --enable-cuvid --enable-d3d11va --enable-nvenc --enable-dxva2 --enable-avisynth --enable-libmfx
  libavutil      55. 78.100 / 55. 78.100
  libavcodec     57.107.100 / 57.107.100
  libavformat    57. 83.100 / 57. 83.100
  libavdevice    57. 10.100 / 57. 10.100
  libavfilter     6.107.100 /  6.107.100
  libswscale      4.  8.100 /  4.  8.100
  libswresample   2.  9.100 /  2.  9.100
  libpostproc    54.  7.100 / 54.  7.100
Splitting the commandline.
Reading option '-v' ... matched as option 'v' (set logging level) with argument '56'.
Reading option '-i' ... matched as input url with argument 'in.webm'.
Reading option '-c:v' ... matched as option 'c' (codec name) with argument 'libvpx'.
Reading option '-vf' ... matched as option 'vf' (set video filters) with argument 'scale=320:240'.
Reading option 'out.webm' ... matched as output url.
Finished splitting the commandline.
Parsing a group of options: global .
Applying option v (set logging level) with argument 56.
Successfully parsed a group of options.
Parsing a group of options: input url in.webm.
Successfully parsed a group of options.
Opening an input file: in.webm.
[NULL @ 000002387e6322a0] Opening 'in.webm' for reading
[file @ 000002387e632ea0] Setting default whitelist 'file,crypto'
Probing matroska,webm score:100 size:2048
Probing mp3 score:1 size:2048
[matroska,webm @ 000002387e6322a0] Format matroska,webm probed with size=2048 and score=100
st:0 removing common factor 1000000 from timebase
[matroska,webm @ 000002387e6322a0] Before avformat_find_stream_info() pos: 634 bytes read:32768 seeks:0 nb_streams:1
[matroska,webm @ 000002387e6322a0] All info found
[matroska,webm @ 000002387e6322a0] stream 0: start_time: 0.000 duration: -9223372036854776.000
[matroska,webm @ 000002387e6322a0] format: start_time: 0.000 duration: 0.400 bitrate=1432 kb/s
[matroska,webm @ 000002387e6322a0] After avformat_find_stream_info() pos: 34843 bytes read:65536 seeks:0 frames:1
Input #0, matroska,webm, from 'in.webm':
  Metadata:
    ENCODER         : Lavf57.83.100
  Duration: 00:00:00.40, start: 0.000000, bitrate: 1432 kb/s
    Stream #0:0, 1, 1/1000: Video: vp8, 1 reference frame, yuv420p(progressive), 640x480, 0/1, SAR 1:1 DAR 4:3, 10 fps, 10 tbr, 1k tbn, 1k tbc (default)
    Metadata:
      alpha_mode      : 1
      ENCODER         : Lavc57.107.100 libvpx
      DURATION        : 00:00:00.400000000
Successfully opened the file.
Parsing a group of options: output url out.webm.
Applying option c:v (codec name) with argument libvpx.
Applying option vf (set video filters) with argument scale=320:240.
Successfully parsed a group of options.
Opening an output file: out.webm.
[file @ 000002387e658b40] Setting default whitelist 'file,crypto'
Successfully opened the file.
detected 4 logical cores
Stream mapping:
  Stream #0:0 -> #0:0 (vp8 (native) -> vp8 (libvpx))
Press [q] to stop, [?] for help
cur_dts is invalid (this is harmless if it occurs once at the start per stream)
    Last message repeated 4 times
[Parsed_scale_0 @ 000002387e718a60] Setting 'w' to value '320'
[Parsed_scale_0 @ 000002387e718a60] Setting 'h' to value '240'
[Parsed_scale_0 @ 000002387e718a60] Setting 'flags' to value 'bicubic'
[Parsed_scale_0 @ 000002387e718a60] w:320 h:240 flags:'bicubic' interl:0
[graph 0 input from stream 0:0 @ 000002387e743b00] Setting 'video_size' to value '640x480'
[graph 0 input from stream 0:0 @ 000002387e743b00] Setting 'pix_fmt' to value '0'
[graph 0 input from stream 0:0 @ 000002387e743b00] Setting 'time_base' to value '1/1000'
[graph 0 input from stream 0:0 @ 000002387e743b00] Setting 'pixel_aspect' to value '1/1'
[graph 0 input from stream 0:0 @ 000002387e743b00] Setting 'sws_param' to value 'flags=2'
[graph 0 input from stream 0:0 @ 000002387e743b00] Setting 'frame_rate' to value '10/1'
[graph 0 input from stream 0:0 @ 000002387e743b00] w:640 h:480 pixfmt:yuv420p tb:1/1000 fr:10/1 sar:1/1 sws_param:flags=2
[format @ 000002387e7fe1e0] compat: called with args=[yuv420p|yuva420p]
[format @ 000002387e7fe1e0] Setting 'pix_fmts' to value 'yuv420p|yuva420p'
[AVFilterGraph @ 000002387e634e60] query_formats: 4 queried, 3 merged, 0 already done, 0 delayed
[Parsed_scale_0 @ 000002387e718a60] w:640 h:480 fmt:yuv420p sar:1/1 -> w:320 h:240 fmt:yuv420p sar:1/1 flags:0x4
[libvpx @ 000002387e657fe0] v1.6.1
[libvpx @ 000002387e657fe0] --prefix=/Users/kyle/software/libvpx/win64/libvpx-1.6.1-win64 --target=x86_64-win64-gcc
[libvpx @ 000002387e657fe0] vpx_codec_enc_cfg
[libvpx @ 000002387e657fe0] generic settings
  g_usage:                      0
  g_threads:                    0
  g_profile:                    0
  g_w:                          320
  g_h:                          240
  g_bit_depth:                  8
  g_input_bit_depth:            8
  g_timebase:                   {1/30}
  g_error_resilient:            0
  g_pass:                       0
  g_lag_in_frames:              0
[libvpx @ 000002387e657fe0] rate control settings
  rc_dropframe_thresh:          0
  rc_resize_allowed:            0
  rc_resize_up_thresh:          60
  rc_resize_down_thresh:        30
  rc_end_usage:                 0
  rc_twopass_stats_in:          0000000000000000(0)
  rc_target_bitrate:            256
[libvpx @ 000002387e657fe0] quantizer settings
  rc_min_quantizer:             4
  rc_max_quantizer:             63
[libvpx @ 000002387e657fe0] bitrate tolerance
  rc_undershoot_pct:            100
  rc_overshoot_pct:             100
[libvpx @ 000002387e657fe0] decoder buffer model
  rc_buf_sz:                    6000
  rc_buf_initial_sz:            4000
  rc_buf_optimal_sz:            5000
[libvpx @ 000002387e657fe0] 2 pass rate control settings
  rc_2pass_vbr_bias_pct:        50
  rc_2pass_vbr_minsection_pct:  0
  rc_2pass_vbr_maxsection_pct:  400
[libvpx @ 000002387e657fe0] keyframing settings
  kf_mode:                      1
  kf_min_dist:                  0
  kf_max_dist:                  128
[libvpx @ 000002387e657fe0] 
[libvpx @ 000002387e657fe0] vpx_codec_enc_cfg
[libvpx @ 000002387e657fe0] generic settings
  g_usage:                      0
  g_threads:                    0
  g_profile:                    0
  g_w:                          320
  g_h:                          240
  g_bit_depth:                  8
  g_input_bit_depth:            8
  g_timebase:                   {1/10}
  g_error_resilient:            0
  g_pass:                       0
  g_lag_in_frames:              25
[libvpx @ 000002387e657fe0] rate control settings
  rc_dropframe_thresh:          0
  rc_resize_allowed:            0
  rc_resize_up_thresh:          60
  rc_resize_down_thresh:        30
  rc_end_usage:                 0
  rc_twopass_stats_in:          0000000000000000(0)
  rc_target_bitrate:            200
[libvpx @ 000002387e657fe0] quantizer settings
  rc_min_quantizer:             4
  rc_max_quantizer:             63
[libvpx @ 000002387e657fe0] bitrate tolerance
  rc_undershoot_pct:            100
  rc_overshoot_pct:             100
[libvpx @ 000002387e657fe0] decoder buffer model
  rc_buf_sz:                    6000
  rc_buf_initial_sz:            4000
  rc_buf_optimal_sz:            5000
[libvpx @ 000002387e657fe0] 2 pass rate control settings
  rc_2pass_vbr_bias_pct:        50
  rc_2pass_vbr_minsection_pct:  0
  rc_2pass_vbr_maxsection_pct:  400
[libvpx @ 000002387e657fe0] keyframing settings
  kf_mode:                      1
  kf_min_dist:                  0
  kf_max_dist:                  128
[libvpx @ 000002387e657fe0] 
[libvpx @ 000002387e657fe0] vpx_codec_control
[libvpx @ 000002387e657fe0]   VP8E_SET_CPUUSED:             1
[libvpx @ 000002387e657fe0]   VP8E_SET_ARNR_MAXFRAMES:      0
[libvpx @ 000002387e657fe0]   VP8E_SET_ARNR_STRENGTH:       3
[libvpx @ 000002387e657fe0]   VP8E_SET_ARNR_TYPE:           3
[libvpx @ 000002387e657fe0]   VP8E_SET_NOISE_SENSITIVITY:   0
[libvpx @ 000002387e657fe0]   VP8E_SET_TOKEN_PARTITIONS:    0
[libvpx @ 000002387e657fe0]   VP8E_SET_STATIC_THRESHOLD:    0
[libvpx @ 000002387e657fe0] Using deadline: 1000000
Output #0, webm, to 'out.webm':
  Metadata:
    encoder         : Lavf57.83.100
    Stream #0:0, 0, 1/1000: Video: vp8 (libvpx), 1 reference frame, yuv420p, 320x240 [SAR 1:1 DAR 4:3], 0/1, q=-1--1, 200 kb/s, 10 fps, 1k tbn, 10 tbc (default)
    Metadata:
      alpha_mode      : 1
      DURATION        : 00:00:00.400000000
      encoder         : Lavc57.107.100 libvpx
    Side data:
      cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
Clipping frame in rate conversion by 0.000008
[webm @ 000002387e656880] get_metadata_duration returned: 400000
[webm @ 000002387e656880] Write early duration from metadata = 400
[webm @ 000002387e656880] Writing block at offset 3, size 11223, pts 0, dts 0, duration 100, keyframe 1
[webm @ 000002387e656880] Writing block at offset 11233, size 1288, pts 100, dts 100, duration 100, keyframe 0
[webm @ 000002387e656880] Writing block at offset 12528, size 1504, pts 200, dts 200, duration 100, keyframe 0
[webm @ 000002387e656880] Writing block at offset 14039, size 2481, pts 300, dts 300, duration 100, keyframe 0
[out_0_0 @ 000002387e743d60] EOF on sink link out_0_0:default.
No more output streams to write to, finishing.
[webm @ 000002387e656880] end duration = 400
[webm @ 000002387e656880] stream 0 end duration = 400
frame=    4 fps=0.0 q=0.0 Lsize=      17kB time=00:00:00.30 bitrate= 457.8kbits/s speed=4.45x    
video:16kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 4.413191%
Input file #0 (in.webm):
  Input stream #0:0 (video): 4 packets read (34992 bytes); 4 frames decoded; 
  Total: 4 packets (34992 bytes) demuxed
Output file #0 (out.webm):
  Output stream #0:0 (video): 4 frames encoded; 4 packets muxed (16496 bytes); 
  Total: 4 packets (16496 bytes) muxed
4 frames successfully decoded, 0 decoding errors
[AVIOContext @ 000002387e698c20] Statistics: 14 seeks, 10 writeouts
[AVIOContext @ 000002387cc773e0] Statistics: 71649 bytes read, 0 seeks
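
A note based on the log above: the format filter offers yuv420p|yuva420p, but the scaler settles on fmt:yuv420p, so the alpha plane is dropped before libvpx ever sees it, even though the input stream is tagged alpha_mode: 1. A hedged sketch, assuming the source really does carry alpha, is to pin the alpha-capable pixel format explicitly:

ffmpeg -i in.webm -c:v libvpx -pix_fmt yuva420p -auto-alt-ref 0 -vf scale=100:100 out.webm

-pix_fmt yuva420p keeps the alpha plane through the filter graph and the encoder; -auto-alt-ref 0 is included because some libvpx configurations refuse alt-ref frames when encoding with alpha, an assumption worth dropping if the pixel-format change alone is enough.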