
Other articles (9)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out.

  • Selection of projects using MediaSPIP

    2 May 2011, by

    The examples below are representative of specific uses of MediaSPIP for specific projects.
    MediaSPIP farm @ Infini
    The non-profit organization Infini develops hospitality activities, an internet access point, training, innovative projects in the field of information and communication technologies, and website hosting. It plays a unique and prominent role in the Brest (France) area and at the national level, among the half-dozen such associations. Its members (...)

  • APPENDIX: The extensions and SPIP plugins of the channels

    11 February 2010, by

    A plugin is a functional addition to the main SPIP core. MediaSPIP consists of a deliberate selection of plugins, whether or not they previously existed in the SPIP community, some of which required either creation from scratch or the addition of features.
    The extensions MediaSPIP requires in order to work
    Since version 2.1.0, SPIP allows plugins to be added in the extensions/ directory.
    "Extensions" are nothing more nor less than plugins whose particularity is that they (...)

On other sites (3358)

  • How to make a dynamic timestamp in ffmpeg, not static?

    28 February 2013, by Sergii Rechmp

    I am using ffmpeg (Windows PC) with my webcam to record what happens when I'm out of the house.
    It works fine, but I can't get a dynamic clock and date on screen.
    I have tried:

    -vf "[in]drawtext=fontsize=12:fontfile=arial.ttf:text='%date%':x=10:y=25,  
    drawtext=fontsize=12:fontfile=arial.ttf:text="%TIME:~0,2%-%TIME:~3,2%-%TIME:~6,2%"

    It works, but it only shows the time from the start moment; it won't change during the video recording.
    It looks like this: (screenshot)
    Is there any method to put a running time on screen?
    Thanks.
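
    One approach worth trying (a sketch, not from the original question): let ffmpeg's drawtext filter expand the time itself on every frame, via its %{localtime} text expansion, instead of relying on the Windows %TIME% variable, which the shell substitutes only once when the command is launched. This needs a reasonably recent ffmpeg build; the capture device name below is only a placeholder, and inside a .bat file the percent signs may need doubling:

    rem hypothetical device name; drawtext re-expands %{localtime} for each frame
    ffmpeg -f dshow -i video="USB Webcam" -vf "drawtext=fontfile=arial.ttf:fontsize=12:x=10:y=25:text='%{localtime}'" out.mkv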

  • GStreamer with MPEG-TS Video4Linux ATSC/DVB Recording

    21 June 2013, by Dustin Oprea

    I'm having an impossible time setting up a filtergraph to read from a recording that I made from my DVB video4linux device. Any help would be vastly appreciated.

    How I made the recording:

    To tune the channel:

    azap -c ~/channels.conf "Florida"

    To record the channel:

    cat /dev/dvb/adapter0/dvr0 > /tmp/test

    This is the way that recordings must be made (I cannot use any GST DVB plugins to do this for me).

    I used tstools to identify that the recording is a TS stream:

    tstools/bin$ ./stream_type ~/recordings/20130129-202049
    Reading from /home/dustin/recordings/20130129-202049
    It appears to be Transport Stream

    ...but that there are no PAT/PMT frames:

    tstools/bin$ ./tsinfo ~/recordings/20130129-202049
    Reading from /home/dustin/recordings/20130129-202049
    Scanning 10000 TS packets

    Found 0 PAT packets and 0 PMT packets in 10000 TS packets

    I was able to produce a single ES (elementary stream) by running ts2es:

    tstools/bin$ ./ts2es -pid 97 ~/recordings/20130129-202049 ~/recordings/20130129-202049.es
    Reading from /home/dustin/recordings/20130129-202049
    Writing to   /home/dustin/recordings/20130129-202049.es
    Extracting packets for PID 0061 (97)
    !!! 4 bytes ignored at end of file - not enough to make a TS packet
    Extracted 219258 of 248113 TS packets

    I am able to play the ES stream (even though the video is frozen on the first frame):

    gst-launch-0.10 filesrc location=~/recordings/20130129-202049.es ! decodebin2 ! autovideosink
    gst-launch-0.10 filesrc location=~/recordings/20130129-202049.es ! decodebin2 ! xvimagesink
    gst-launch-0.10 playbin2 uri=file:///home/dustin/recordings/20130129-202049.es

    No matter what I do, though, I can't get GStreamer to open the original TS file. However, it opens perfectly in MPlayer/FFmpeg (but not VLC). This is the output of FFmpeg:

    ffmpeg -i 20130129-202049

    ffmpeg version 0.8.5-4:0.8.5-0ubuntu0.12.04.1, Copyright (c) 2000-2012 the Libav developers
     built on Jan 24 2013 18:03:14 with gcc 4.6.3
    *** THIS PROGRAM IS DEPRECATED ***
    This program is only provided for compatibility and will be removed in a future release. Please use avconv instead.
    [mpeg2video @ 0x9be7be0] mpeg_decode_postinit() failure
       Last message repeated 4 times
    [mpegts @ 0x9be3aa0] max_analyze_duration reached
    [mpegts @ 0x9be3aa0] PES packet size mismatch
    Input #0, mpegts, from '20130129-202049':
     Duration: 00:03:39.99, start: 9204.168844, bitrate: 1696 kb/s
       Stream #0.0[0x61]: Video: mpeg2video (Main), yuv420p, 528x480 [PAR 40:33 DAR 4:3], 15000 kb/s, 30.57 fps, 29.97 tbr, 90k tbn, 59.94 tbc
       Stream #0.1[0x64]: Audio: ac3, 48000 Hz, stereo, s16, 192 kb/s
    At least one output file must be specified

    This tells us that the video stream has PID 0x61 (97).
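
    Since FFmpeg can read the capture, one possible workaround (a sketch, not something tried in the original post) would be to remux it with a stream copy: the mpegts muxer writes fresh PAT/PMT tables, which tstools reported are missing from the raw capture. The output filename is arbitrary:

    # stream-copy remux; avconv (or ffmpeg) regenerates PAT/PMT in the new TS
    avconv -i 20130129-202049 -vcodec copy -acodec copy -f mpegts fixed.ts
    gst-launch-0.10 playbin2 uri=file:///home/dustin/recordings/fixed.ts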

    I have been trying for a few days, so the following are only examples of a couple of attempts. I'll provide the playbin2 example first, since I know that thousands of people will respond, insisting that I just use that. It doesn't work.

    gst-launch-0.10 playbin2 uri=file:///home/dustin/recordings/20130129-202049

    Setting pipeline to PAUSED ...
    Pipeline is PREROLLING ...
    ERROR: from element /GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstDecodeBin2:decodebin20/GstMpegTSDemux:mpegtsdemux0: Could not determine type of stream.
    Additional debug info:
    gstmpegtsdemux.c(2931): gst_mpegts_demux_sink_event (): /GstPlayBin2:playbin20/GstURIDecodeBin:uridecodebin0/GstDecodeBin2:decodebin20/GstMpegTSDemux:mpegtsdemux0:
    No valid streams found at EOS
    ERROR: pipeline doesn't want to preroll.
    Setting pipeline to NULL ...
    Freeing pipeline ...

    It fails [probably] because no PID has been specified with which to find the video (the "EOS" error, I think).

    Naturally, I tried the following to start by demuxing the TS format. I believe it's the "es-pids" property that receives the PID in the absence of PMT information (which, as tstools reported above, is missing), but I tried "program-number", too, just in case. gst-inspect indicates that one is hex and the other is decimal:

    gst-launch-0.10 -v filesrc location=20130129-202049 ! mpegtsdemux es-pids=0x61 ! fakesink
    gst-launch-0.10 -v filesrc location=20130129-202049 ! mpegtsdemux program-number=97 ! fakesink

    Output:

    gst-launch-0.10 filesrc location=20130129-202049 ! mpegtsdemux es-pids=0x61 ! fakesink
    Setting pipeline to PAUSED ...
    Pipeline is PREROLLING ...
    ERROR: from element /GstPipeline:pipeline0/GstMpegTSDemux:mpegtsdemux0: Could not determine type of stream.
    Additional debug info:
    gstmpegtsdemux.c(2931): gst_mpegts_demux_sink_event (): /GstPipeline:pipeline0/GstMpegTSDemux:mpegtsdemux0:
    No valid streams found at EOS
    ERROR: pipeline doesn't want to preroll.
    Setting pipeline to NULL ...
    Freeing pipeline ...
    dustin@dustinmicro:~/recordings$ gst-launch-0.10 -v filesrc location=20130129-202049 ! mpegtsdemux es-pids=0x61 ! fakesink
    Setting pipeline to PAUSED ...
    Pipeline is PREROLLING ...
    ERROR: from element /GstPipeline:pipeline0/GstMpegTSDemux:mpegtsdemux0: Could not determine type of stream.
    Additional debug info:
    gstmpegtsdemux.c(2931): gst_mpegts_demux_sink_event (): /GstPipeline:pipeline0/GstMpegTSDemux:mpegtsdemux0:
    No valid streams found at EOS
    ERROR: pipeline doesn't want to preroll.
    Setting pipeline to NULL ...
    Freeing pipeline ...

    However, when I try mpegpsdemux (for program streams (PS), as opposed to transport streams (TS)), I get further:

    gst-launch-0.10 filesrc location=20130129-202049 ! mpegpsdemux ! fakesink

    Setting pipeline to PAUSED ...
    Pipeline is PREROLLING ...

    (gst-launch-0.10:14805): GStreamer-CRITICAL **: gst_event_new_new_segment_full: assertion `position != -1' failed

    (gst-launch-0.10:14805): GStreamer-CRITICAL **: gst_mini_object_ref: assertion `mini_object != NULL' failed

    (gst-launch-0.10:14805): GStreamer-CRITICAL **: gst_pad_push_event: assertion `event != NULL' failed
    Pipeline is PREROLLED ...
    Setting pipeline to PLAYING ...
    New clock: GstSystemClock

    (gst-launch-0.10:14805): GStreamer-CRITICAL **: gst_event_new_new_segment_full: assertion `position != -1' failed

    (gst-launch-0.10:14805): GStreamer-CRITICAL **: gst_mini_object_ref: assertion `mini_object != NULL' failed

    (gst-launch-0.10:14805): GStreamer-CRITICAL **: gst_pad_push_event: assertion `event != NULL' failed

    (gst-launch-0.10:14805): GStreamer-CRITICAL **: gst_event_new_new_segment_full: assertion `position != -1' failed

    (gst-launch-0.10:14805): GStreamer-CRITICAL **: gst_mini_object_ref: assertion `mini_object != NULL' failed

    (gst-launch-0.10:14805): GStreamer-CRITICAL **: gst_pad_push_event: assertion `event != NULL' failed

    (gst-launch-0.10:14805): GStreamer-CRITICAL **: gst_event_new_new_segment_full: assertion `position != -1' failed

    (gst-launch-0.10:14805): GStreamer-CRITICAL **: gst_mini_object_ref: assertion `mini_object != NULL' failed

    (gst-launch-0.10:14805): GStreamer-CRITICAL **: gst_pad_push_event: assertion `event != NULL' failed


    ...


    Got EOS from element "pipeline0".
    Execution ended after 1654760008 ns.
    Setting pipeline to PAUSED ...
    Setting pipeline to READY ...
    Setting pipeline to NULL ...
    Freeing pipeline ...

    I still get the same problem whenever I use mpegtsdemux, even if it follows mpegpsdemux, as above.

    I don't understand this, since I haven't even picked a PID yet.

    What am I doing wrong?

    Dustin

  • GStreamer RTP video mixer: found a working pipeline, however it needs improvement

    8 February 2015, by alkber

    I'm attempting to mix multiple RTP H.264 payload video streams into a single video stream at 15 FPS.

    A working pipeline that mixes two video streams over a videotestsrc pattern at 15 FPS:

    VIDEO_CAPS="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264"

    gst-launch -vvvve videomixer2 name=mix ! ffmpegcolorspace ! xvimagesink \
    udpsrc caps=$VIDEO_CAPS port=3030 ! .recv_rtp_sink_0 gstrtpbin ! rtph264depay ! ffdec_h264 ! videoscale ! video/x-raw-yuv,width=176,height=144 ! videobox top=0 left=0 ! video/x-raw-yuv,format=\(fourcc\)AYUV ! ffmpegcolorspace ! mix. \
    udpsrc caps=$VIDEO_CAPS port=6666 ! .recv_rtp_sink_0 gstrtpbin ! rtph264depay ! ffdec_h264 ! videoscale ! video/x-raw-yuv,width=176,height=144 ! videobox top=0 left=-178 ! video/x-raw-yuv,format=\(fourcc\)AYUV ! ffmpegcolorspace ! mix. \
    videotestsrc ! video/x-raw-yuv,framerate=15/1,width=640,height=360 ! mix.

    The above pipeline has a strange issue: when attempting to mix, the first video source goes blank. (I'm certain the first source is still streaming; it just never shows up. Instead, the second video stream is shown.) The verbose output is given below.

    Verbose output for the above pipeline:

    /GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)640, height=(int)360, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, color-matrix=(string)sdtv
    Pipeline is live and does not need PREROLL ...
    Setting pipeline to PLAYING ...
    New clock: GstSystemClock
    /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)640, height=(int)360, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, color-matrix=(string)sdtv
    /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)640, height=(int)360, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, color-matrix=(string)sdtv
    /GstPipeline:pipeline0/GstVideoMixer2:mix.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)640, height=(int)360, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1
    /GstPipeline:pipeline0/GstVideoMixer2:mix.GstVideoMixer2Pad:sink_0: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)640, height=(int)360, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, color-matrix=(string)sdtv
    /GstPipeline:pipeline0/GstRtpBin:rtpbin0/GstRtpSession:rtpsession0.GstPad:recv_rtp_sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
    /GstPipeline:pipeline0/GstRtpBin:rtpbin0.GstGhostPad:recv_rtp_sink_0: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
    /GstPipeline:pipeline0/GstRtpBin:rtpbin0.GstGhostPad:recv_rtp_sink_0.GstProxyPad:proxypad0: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
    /GstPipeline:pipeline0/GstRtpBin:rtpbin0/GstRtpSession:rtpsession0.GstPad:recv_rtp_src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
    /GstPipeline:pipeline0/GstRtpBin:rtpbin0/GstRtpSsrcDemux:rtpssrcdemux0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
    /GstPipeline:pipeline0/GstRtpBin:rtpbin0/GstRtpJitterBuffer:rtpjitterbuffer0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
    /GstPipeline:pipeline0/GstRtpBin:rtpbin0/GstRtpJitterBuffer:rtpjitterbuffer0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
    /GstPipeline:pipeline0/GstRtpBin:rtpbin1/GstRtpSession:rtpsession1.GstPad:recv_rtp_sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
    /GstPipeline:pipeline0/GstRtpBin:rtpbin1.GstGhostPad:recv_rtp_sink_0: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
    /GstPipeline:pipeline0/GstRtpBin:rtpbin1.GstGhostPad:recv_rtp_sink_0.GstProxyPad:proxypad1: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
    /GstPipeline:pipeline0/GstRtpBin:rtpbin1/GstRtpSession:rtpsession1.GstPad:recv_rtp_src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
    /GstPipeline:pipeline0/GstRtpBin:rtpbin1/GstRtpSsrcDemux:rtpssrcdemux1.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
    /GstPipeline:pipeline0/GstRtpBin:rtpbin1/GstRtpJitterBuffer:rtpjitterbuffer1.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
    /GstPipeline:pipeline0/GstRtpBin:rtpbin1/GstRtpJitterBuffer:rtpjitterbuffer1.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
    /GstPipeline:pipeline0/GstRtpBin:rtpbin0/GstRtpPtDemux:rtpptdemux0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
    /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)nal
    /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)99
    /GstPipeline:pipeline0/GstRtpBin:rtpbin0.GstGhostPad:recv_rtp_src_0_4186542290_99.GstProxyPad:proxypad2: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)99
    /GstPipeline:pipeline0/ffdec_h264:ffdec_h2640.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)nal
    /GstPipeline:pipeline0/GstRtpBin:rtpbin1/GstRtpPtDemux:rtpptdemux1.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264
    /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay1.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)nal
    /GstPipeline:pipeline0/GstRtpH264Depay:rtph264depay1.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)99
    /GstPipeline:pipeline0/GstRtpBin:rtpbin1.GstGhostPad:recv_rtp_src_0_4186622237_99.GstProxyPad:proxypad3: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)99
    /GstPipeline:pipeline0/ffdec_h264:ffdec_h2641.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)nal
    /GstPipeline:pipeline0/ffdec_h264:ffdec_h2640.GstPad:src: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstVideoScale:videoscale0.GstPad:src: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstVideoScale:videoscale0.GstPad:sink: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstVideoBox:videobox0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)176, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstVideoBox:videobox0.GstPad:sink: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstCapsFilter:capsfilter2.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)176, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstCapsFilter:capsfilter2.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)176, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp1.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)176, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp1.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)176, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstVideoMixer2:mix.GstVideoMixer2Pad:sink_1: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)176, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
    /GstPipeline:pipeline0/ffdec_h264:ffdec_h2641.GstPad:src: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstVideoScale:videoscale1.GstPad:src: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstVideoScale:videoscale1.GstPad:sink: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstCapsFilter:capsfilter3.GstPad:src: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstCapsFilter:capsfilter3.GstPad:sink: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstVideoBox:videobox1.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)354, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstVideoBox:videobox1.GstPad:sink: caps = video/x-raw-yuv, width=(int)176, height=(int)144, framerate=(fraction)15/1, format=(fourcc)I420, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstCapsFilter:capsfilter4.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)354, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstCapsFilter:capsfilter4.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)354, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp2.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)354, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp2.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)354, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstVideoMixer2:mix.GstVideoMixer2Pad:sink_2: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)354, height=(int)144, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1, interlaced=(boolean)false
    /GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp0.GstPad:src: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)640, height=(int)360, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1
    /GstPipeline:pipeline0/GstFFMpegCsp:ffmpegcsp0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)AYUV, width=(int)640, height=(int)360, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1
    /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0.GstPad:sink: caps = video/x-raw-yuv, format=(fourcc)YUY2, width=(int)640, height=(int)360, framerate=(fraction)15/1, pixel-aspect-ratio=(fraction)1/1
    WARNING: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: A lot of buffers are being dropped.
    Additional debug info:
    gstbasesink.c(2875): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
    There may be a timestamping problem, or this computer is too slow.

    (Screenshots: "First video stream missing" and "The first video stream".)

    I highly suspect it has something to do with videobox.
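
    One detail that may explain it (a guess, not a verified fix): videobox left=-178 on the second branch grows that buffer to 354 pixels wide, and with videobox's default border alpha the added border is opaque, so when the mixer composites that buffer at the top-left it can cover the first stream underneath. It might be worth making the border transparent with border-alpha=0 on that branch, everything else unchanged:

    udpsrc caps=$VIDEO_CAPS port=6666 ! .recv_rtp_sink_0 gstrtpbin ! rtph264depay ! ffdec_h264 ! videoscale ! video/x-raw-yuv,width=176,height=144 ! videobox border-alpha=0 top=0 left=-178 ! video/x-raw-yuv,format=\(fourcc\)AYUV ! ffmpegcolorspace ! mix.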