Advanced search

Media (2)

Keyword: - Tags - /documentation

Other articles (62)

  • Accepted formats

    28 January 2010, by

    The following commands list the formats and codecs supported by the local ffmpeg installation:
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    To begin with, we (...)
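The two commands above dump long tables; in practice you usually want to know whether one specific codec is available. A small sketch (the exact column layout of `ffmpeg -codecs` varies between versions, so this only greps for the codec name as a whole word, and it degrades gracefully if ffmpeg is not installed):

```shell
codec=h264
if ffmpeg -hide_banner -codecs 2>/dev/null | grep -qw "$codec"; then
    echo "$codec supported"
else
    echo "$codec not listed (or ffmpeg not installed)"
fi
```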

  • Adding notes and captions to images

    7 February 2011, by

    To add notes and captions to images, the first step is to install the "Légendes" plugin.
    Once the plugin is activated, you can configure it in the configuration area to adjust the rights to create, edit and delete notes. By default, only site administrators can add notes to images.
    Changes when adding a media item
    When adding a media item of type "image", a new button appears above the preview (...)

  • Request to create a channel

    12 March 2010, by

    Depending on the platform configuration, the user may have two different ways of requesting the creation of a channel. The first is at the time of registration; the second, after registration, by filling in a request form.
    Both approaches ask for the same information and work in much the same way: the prospective user must fill in a series of form fields that first give the administrators information about (...)

On other sites (9742)

  • vf_hwmap : Properly free a locally derived device

    16 June 2017, by Mark Thompson
    vf_hwmap : Properly free a locally derived device
    

    Fixes CID 1412853.

    (cherry picked from commit a670eea56087d0ecd4fbeccf3a9beb9110b7031f)

    • [DH] libavfilter/vf_hwmap.c
  • Screen Capture with FFMPEG and Google Native Client

    18 July 2016, by Mohammad Abu Musa

    I am building a screen recorder based on Google Native Client. I have ffmpeg installed and ready, but I have no experience programming with ffmpeg, so I am looking for tips to understand how to build this recorder.

    My goal is to make a screen recorder and export videos as WebM files. I have all the required libraries; I just could not find any code examples to hack on.

    This is what I have achieved so far:

    #define __STDC_LIMIT_MACROS

    #include <cstdio>   // fopen, fwrite (header names lost in scraping; reconstructed from usage)
    #include <cstring>  // memcpy
    #include <iostream>
    #include <sstream>
    #include <string>
    #include <vector>

    #include "ppapi/cpp/instance.h"
    #include "ppapi/cpp/var_dictionary.h"
    #include "ppapi/c/pp_errors.h"
    #include "ppapi/c/ppb_console.h"
    #include "ppapi/cpp/input_event.h"
    #include "ppapi/cpp/module.h"
    #include "ppapi/cpp/rect.h"
    #include "ppapi/cpp/var.h"
    #include "ppapi/cpp/var_array_buffer.h"

    // Begin File IO headers
    #include "ppapi/c/pp_stdint.h"
    #include "ppapi/c/ppb_file_io.h"
    #include "ppapi/cpp/directory_entry.h"
    #include "ppapi/cpp/file_io.h"
    #include "ppapi/cpp/file_ref.h"
    #include "ppapi/cpp/file_system.h"
    #include "ppapi/cpp/instance.h"
    #include "ppapi/cpp/message_loop.h"
    #include "ppapi/cpp/module.h"
    #include "ppapi/cpp/var.h"
    #include "ppapi/cpp/var_array.h"
    #include "ppapi/utility/completion_callback_factory.h"
    #include "ppapi/utility/threading/simple_thread.h"

    #ifndef INT32_MAX
    #define INT32_MAX (0x7FFFFFFF)
    #endif

    #ifdef WIN32
    #undef min
    #undef max
    #undef PostMessage

    // Allow 'this' in initializer list
    #pragma warning(disable : 4355)
    #endif

    namespace {
    typedef std::vector<std::string> StringVector;
    }
    //End File IO headers

    extern "C" {
    #include <libavcodec/avcodec.h>
    #include <libswscale/swscale.h>
    #include <libavformat/avformat.h>
    #include <libavutil/imgutils.h>
    }

    const char* file_name = "/file.txt";
    const char* video_name = "/video.mpeg";
    const char* file_text = "Echo from NaCl: ";

    /**
    * Pixel formats and codecs
    */
    static const AVPixelFormat sourcePixelFormat = AV_PIX_FMT_BGR24;
    static const AVPixelFormat destPixelFormat = AV_PIX_FMT_YUV420P;
    static const AVCodecID destCodec = AV_CODEC_ID_MPEG2VIDEO;

    class RecorderInstance: public pp::Instance {
    public:
       explicit RecorderInstance(PP_Instance instance) :
               pp::Instance(instance), callback_factory_(this), file_system_(this,
                       PP_FILESYSTEMTYPE_LOCALPERSISTENT), file_system_ready_(
                       false), file_thread_(this) {
       }

       virtual ~RecorderInstance() {
           file_thread_.Join();
       }

       virtual bool Init(uint32_t /*argc*/, const char * /*argn*/[],
               const char * /*argv*/[]) {
           file_thread_.Start();
        file_thread_.message_loop().PostWork(
                callback_factory_.NewCallback(
                        &RecorderInstance::OpenFileSystem));
        avcodec_register_all(); // registers all codecs; required before avcodec_find_encoder()

           return true;
       }

    private:
    pp::CompletionCallbackFactory<RecorderInstance> callback_factory_;
       pp::FileSystem file_system_;

       // Indicates whether file_system_ was opened successfully. We only read/write
       // this on the file_thread_.
       bool file_system_ready_;
       pp::SimpleThread file_thread_;

    virtual void HandleMessage(const pp::Var& var_message) {
           if (!var_message.is_dictionary()) {
               LogToConsole(PP_LOGLEVEL_ERROR, pp::Var("Invalid message!"));
               return;
           }

           pp::VarDictionary dict_message(var_message);
           std::string command = dict_message.Get("message").AsString();


           if (command == "sendFrame") {
               pp::VarArrayBuffer data(dict_message.Get("data"));
               uint width = 600;
               uint height = 800;
               uint8_t endcode[] = { 0, 0, 1, 0xb7 };
               /**
                * Create an encoder and open it
                */
               avcodec_register_all();

               AVCodec *h264encoder = avcodec_find_encoder(destCodec);
               AVCodecContext *h264encoderContext = avcodec_alloc_context3(
                       h264encoder);

            h264encoderContext->pix_fmt = destPixelFormat;
            h264encoderContext->width = width;
            h264encoderContext->height = height;
            // time_base must be set before avcodec_open2(), or the open fails
            h264encoderContext->time_base = (AVRational){1, 25};

            if (avcodec_open2(h264encoderContext, h264encoder, NULL) < 0) {
                   ShowErrorMessage("Cannot open codec" ,-1);
                   return;
               }

               /**
                * Create a stream
                */
               AVFormatContext *cv2avFormatContext = avformat_alloc_context();
               AVStream *h264outputstream = avformat_new_stream(cv2avFormatContext,
                       h264encoder);

               AVFrame *sourceAvFrame = av_frame_alloc(), *destAvFrame =
                       av_frame_alloc();
               int got_frame;

               FILE* videoOutFile = fopen("video", "wb");
               /**
                * Prepare the conversion context
                */
               SwsContext *bgr2yuvcontext = sws_getContext(width, height,
                       sourcePixelFormat, width, height, destPixelFormat,
                       SWS_BICUBIC, NULL, NULL, NULL);

               int framesToEncode = 100;
               /**
                * Convert and encode frames
                */
            for (int i = 0; i < framesToEncode; i++) {

                   /**
                    * Allocate source frame, i.e. input to sws_scale()
                    */
                   av_image_alloc(sourceAvFrame->data, sourceAvFrame->linesize,
                           width, height, sourcePixelFormat, 1);

                /**
                 * Copy image data into the AVFrame from the message's
                 * VarArrayBuffer (data.Map() exposes the raw bytes; taking
                 * &(data), as the original did, copies from the address of
                 * the wrapper object, not from the pixels)
                 */
                for (uint32_t h = 0; h < height; h++)
                    memcpy(
                            &(sourceAvFrame->data[0][h
                                    * sourceAvFrame->linesize[0]]),
                            static_cast<uint8_t*>(data.Map()) + h * width * 3,
                            width * 3);

                   /**
                    * Allocate destination frame, i.e. output from sws_scale()
                    */
                   av_image_alloc(destAvFrame->data, destAvFrame->linesize, width,
                           height, destPixelFormat, 1);

                   sws_scale(bgr2yuvcontext, sourceAvFrame->data,
                           sourceAvFrame->linesize, 0, height, destAvFrame->data,
                           destAvFrame->linesize);
                // NB: do not free bgr2yuvcontext here; it is reused on the
                // next iteration and released once after the loop
                   /**
                    * Prepare an AVPacket and set buffer to NULL so that it'll be allocated by FFmpeg
                    */
                   AVPacket avEncodedPacket;
                av_init_packet(&avEncodedPacket);
                   avEncodedPacket.data = NULL;
                   avEncodedPacket.size = 0;

                   destAvFrame->pts = i;
                avcodec_encode_video2(h264encoderContext, &avEncodedPacket,
                        destAvFrame, &got_frame);

                if (got_frame) {
                    // TODO: include avEncodedPacket.size in the message
                    ShowErrorMessage("Encoded a frame\n", -1);

                    if (fwrite(avEncodedPacket.data, 1, avEncodedPacket.size,
                            videoOutFile) < (unsigned) avEncodedPacket.size)
                        ShowErrorMessage(
                                "Could not write all bytes, but will continue..\n",
                                -1);


                       fflush(videoOutFile);

                   }
                /**
                 * Per-frame cleanup: release the packet and the image
                 * buffers; the AVFrame structs themselves are reused on the
                 * next iteration and freed after the loop
                 */
                av_packet_free_side_data(&avEncodedPacket);
                av_free_packet(&avEncodedPacket);
                av_freep(&sourceAvFrame->data[0]);
                av_freep(&destAvFrame->data[0]);

               }

               fwrite(endcode, 1, sizeof(endcode), videoOutFile);
               fclose(videoOutFile);

            /**
             * Final cleanup
             */
            sws_freeContext(bgr2yuvcontext);
            av_frame_free(&sourceAvFrame);
            av_frame_free(&destAvFrame);
            avformat_free_context(cv2avFormatContext);
            avcodec_close(h264encoderContext);
            avcodec_free_context(&h264encoderContext);

           } else if (command == "createFile") {
            file_thread_.message_loop().PostWork(
                    callback_factory_.NewCallback(&RecorderInstance::Save,
                            file_name, file_text));
           }
       }


       void OpenFileSystem(int32_t /* result */) {
           int32_t rv = file_system_.Open(1024 * 1024, pp::BlockUntilComplete());
           if (rv == PP_OK) {
               file_system_ready_ = true;
               // Notify the user interface that we're ready
               ShowStatusMessage("STORAGE READY");
           } else {
               ShowErrorMessage("Failed to open file system", rv);
           }
       }

    void Save(int32_t /* result */, const std::string& file_name,
            const std::string& file_contents) {
           if (!file_system_ready_) {
               ShowErrorMessage("File system is not open", PP_ERROR_FAILED);
               return;
           }
           pp::FileRef ref(file_system_, file_name.c_str());
           pp::FileIO file(this);

           int32_t open_result = file.Open(ref,
                   PP_FILEOPENFLAG_WRITE | PP_FILEOPENFLAG_CREATE
                           | PP_FILEOPENFLAG_TRUNCATE, pp::BlockUntilComplete());
           if (open_result != PP_OK) {
               ShowErrorMessage("File open for write failed", open_result);
               return;
           }

           // We have truncated the file to 0 bytes. So we need only write if
           // file_contents is non-empty.
           if (!file_contents.empty()) {
               if (file_contents.length() > INT32_MAX) {
                   ShowErrorMessage("File too big", PP_ERROR_FILETOOBIG);
                   return;
               }
               int64_t offset = 0;
               int32_t bytes_written = 0;
            do {
                bytes_written = file.Write(offset,
                        file_contents.data() + offset,
                        file_contents.length() - offset,
                        pp::BlockUntilComplete());
                if (bytes_written > 0) {
                    offset += bytes_written;
                } else {
                    ShowErrorMessage("File write failed", bytes_written);
                    return;
                }
            } while (offset < static_cast<int64_t>(file_contents.length()));
           }
           // All bytes have been written, flush the write buffer to complete
           int32_t flush_result = file.Flush(pp::BlockUntilComplete());
           if (flush_result != PP_OK) {
            ShowErrorMessage("File failed to flush", flush_result);
               return;
           }
           ShowStatusMessage("Save success");
       }

       /// Encapsulates our simple javascript communication protocol
    void ShowErrorMessage(const std::string& message, int32_t result) {
        std::stringstream ss;
        ss << "ERROR: " << message << " -- Error #: " << result << "\n";
        PostMessage(ss.str());
       }

    void ShowStatusMessage(const std::string& message) {
        std::stringstream ss;
        ss << "LOG: " << message << "\n";
        PostMessage(ss.str());
       }

    };

    class RecorderModule: public pp::Module {
    public:
       RecorderModule() :
               pp::Module() {
       }

       virtual ~RecorderModule() {
       }

       virtual pp::Instance* CreateInstance(PP_Instance instance) {
           return new RecorderInstance(instance);
       }
    };

    namespace pp {

    /**
    * This function is an entry point to a NaCl application.
    * It must be implemented.
    */
    Module* CreateModule() {
       return new RecorderModule();
    }

    }  // namespace pp
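Setting aside the NaCl plumbing, the arithmetic behind the two av_image_alloc() calls in the loop above can be sanity-checked in isolation. The sketch below is plain Python, not FFmpeg API (the function names are mine): it computes the per-frame buffer sizes for the hard-coded 600x800 dimensions, ignoring any row padding av_image_alloc may add for alignment.

```python
# BGR24 input: 3 bytes per pixel. YUV420P output: full-resolution Y plane
# plus two chroma planes subsampled 2x2, i.e. 1.5 bytes per pixel.
def bgr24_frame_size(width, height):
    return width * height * 3

def yuv420p_frame_size(width, height):
    # Y plane (w*h) + two chroma planes at quarter resolution (w*h/4 each)
    return width * height + 2 * (width * height // 4)

w, h = 600, 800  # the dimensions hard-coded in the snippet above
print(bgr24_frame_size(w, h))    # 1440000
print(yuv420p_frame_size(w, h))  # 720000
```

So sws_scale shrinks each frame from about 1.4 MB to 720 KB before it ever reaches the encoder.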
  • Troubleshooting ffmpeg/ffplay client RTSP RTP UDP * multicast * issue

    6 November 2020, by MAXdB

    I'm having a problem using the udp_multicast transport method with ffmpeg or ffplay as a client to a webcam.

    TCP transport works:

    ffplay -rtsp_transport tcp rtsp://192.168.1.100/videoinput_1/mjpeg_3/media.stm


    UDP transport works:

    ffplay -rtsp_transport udp rtsp://192.168.1.100/videoinput_1/mjpeg_3/media.stm


    Multicast transport does not work:

    ffplay -rtsp_transport udp_multicast rtsp://192.168.1.100/videoinput_1/mjpeg_3/media.stm


    The error message when udp_multicast is chosen reads:

    [rtsp @ 0x7fd6a8000b80] Could not find codec parameters for stream 0 (Video: mjpeg, none(bt470bg/unknown/unknown)): unspecified size


    Run with -v debug: note that the UDP multicast information appears in the SDP even though the transport chosen for this run is unicast. The SDP content is identical for unicast and multicast.


    [tcp @ 0x7f648c002f40] Starting connection attempt to 192.168.1.100 port 554
    [tcp @ 0x7f648c002f40] Successfully connected to 192.168.1.100 port 554
    [rtsp @ 0x7f648c000b80] SDP:
    v=0
    o=- 621355968671884050 621355968671884050 IN IP4 192.168.1.100
    s=/videoinput_1:0/mjpeg_3/media.stm
    c=IN IP4 0.0.0.0
    m=video 40004 RTP/AVP 26
    c=IN IP4 237.0.0.3/1
    a=control:trackID=1
    a=range:npt=0-
    a=framerate:25.0

    Failed to parse interval end specification ''
    [rtp @ 0x7f648c008e00] No default whitelist set
    [udp @ 0x7f648c009900] No default whitelist set
    [udp @ 0x7f648c009900] end receive buffer size reported is 425984
    [udp @ 0x7f648c019c80] No default whitelist set
    [udp @ 0x7f648c019c80] end receive buffer size reported is 425984
    [rtsp @ 0x7f648c000b80] setting jitter buffer size to 500
    [rtsp @ 0x7f648c000b80] hello state=0
    Failed to parse interval end specification ''
    [mjpeg @ 0x7f648c0046c0] marker=d8 avail_size_in_buf=145103
    [mjpeg @ 0x7f648c0046c0] marker parser used 0 bytes (0 bits)
    [mjpeg @ 0x7f648c0046c0] marker=e0 avail_size_in_buf=145101
    [mjpeg @ 0x7f648c0046c0] marker parser used 16 bytes (128 bits)
    [mjpeg @ 0x7f648c0046c0] marker=db avail_size_in_buf=145083
    [mjpeg @ 0x7f648c0046c0] index=0
    [mjpeg @ 0x7f648c0046c0] qscale[0]: 5
    [mjpeg @ 0x7f648c0046c0] index=1
    [mjpeg @ 0x7f648c0046c0] qscale[1]: 10
    [mjpeg @ 0x7f648c0046c0] marker parser used 132 bytes (1056 bits)
    [mjpeg @ 0x7f648c0046c0] marker=c4 avail_size_in_buf=144949
    [mjpeg @ 0x7f648c0046c0] marker parser used 0 bytes (0 bits)
    [mjpeg @ 0x7f648c0046c0] marker=c0 avail_size_in_buf=144529
    [mjpeg @ 0x7f648c0046c0] Changing bps from 0 to 8
    [mjpeg @ 0x7f648c0046c0] sof0: picture: 1920x1080
    [mjpeg @ 0x7f648c0046c0] component 0 2:2 id: 0 quant:0
    [mjpeg @ 0x7f648c0046c0] component 1 1:1 id: 1 quant:1
    [mjpeg @ 0x7f648c0046c0] component 2 1:1 id: 2 quant:1
    [mjpeg @ 0x7f648c0046c0] pix fmt id 22111100
    [mjpeg @ 0x7f648c0046c0] Format yuvj420p chosen by get_format().
    [mjpeg @ 0x7f648c0046c0] marker parser used 17 bytes (136 bits)
    [mjpeg @ 0x7f648c0046c0] escaping removed 676 bytes
    [mjpeg @ 0x7f648c0046c0] marker=da avail_size_in_buf=144510
    [mjpeg @ 0x7f648c0046c0] marker parser used 143834 bytes (1150672 bits)
    [mjpeg @ 0x7f648c0046c0] marker=d9 avail_size_in_buf=2
    [mjpeg @ 0x7f648c0046c0] decode frame unused 2 bytes
    [rtsp @ 0x7f648c000b80] All info found vq=    0KB sq=    0B f=0/0
    [rtsp @ 0x7f648c000b80] rfps: 24.416667 0.018101
        Last message repeated 1 times
    [rtsp @ 0x7f648c000b80] rfps: 24.500000 0.013298
        Last message repeated 1 times
    [rtsp @ 0x7f648c000b80] rfps: 24.583333 0.009235
        Last message repeated 1 times
    [rtsp @ 0x7f648c000b80] rfps: 24.666667 0.005910
        Last message repeated 1 times
    [rtsp @ 0x7f648c000b80] rfps: 24.750000 0.003324
        Last message repeated 1 times
    [rtsp @ 0x7f648c000b80] rfps: 24.833333 0.001477
        Last message repeated 1 times
    [rtsp @ 0x7f648c000b80] rfps: 24.916667 0.000369
        Last message repeated 1 times
    [rtsp @ 0x7f648c000b80] rfps: 25.000000 0.000000
    [rtsp @ 0x7f648c000b80] rfps: 25.083333 0.000370
        Last message repeated 1 times
    [rtsp @ 0x7f648c000b80] rfps: 25.166667 0.001478
        Last message repeated 1 times
    [rtsp @ 0x7f648c000b80] rfps: 25.250000 0.003326
        Last message repeated 1 times
    [rtsp @ 0x7f648c000b80] rfps: 25.333333 0.005912
        Last message repeated 1 times
    [rtsp @ 0x7f648c000b80] rfps: 25.416667 0.009238
        Last message repeated 1 times
    [rtsp @ 0x7f648c000b80] rfps: 25.500000 0.013302
        Last message repeated 1 times
    [rtsp @ 0x7f648c000b80] rfps: 25.583333 0.018105
        Last message repeated 1 times
    [rtsp @ 0x7f648c000b80] rfps: 50.000000 0.000000
    [rtsp @ 0x7f648c000b80] Setting avg frame rate based on r frame rate
    Input #0, rtsp, from 'rtsp://192.168.1.100/videoinput_1/mjpeg_3/media.stm':
      Metadata:
        title           : /videoinput_1:0/mjpeg_3/media.stm
      Duration: N/A, start: 0.000000, bitrate: N/A
        Stream #0:0, 21, 1/90000: Video: mjpeg (Baseline), 1 reference frame, yuvj420p(pc, bt470bg/unknown/unknown, center), 1920x1080 [SAR 1:1 DAR 16:9], 0/1, 25 fps, 25 tbr, 90k tbn, 90k tbc
    [mjpeg @ 0x7f648c02ad80] marker=d8 avail_size_in_buf=145103

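The multicast group and port ffplay should be using are right there in the SDP above. A throwaway parser (illustrative only, not a full RFC 4566 implementation) can pull the media-level connection address and port out for further debugging:

```python
# Extract the media port from the m= line and the connection address from
# the c= line that follows it, stripping the /TTL suffix, as in the SDP
# dump above.
def multicast_target(sdp):
    port, addr = None, None
    for line in sdp.splitlines():
        if line.startswith("m=video"):
            port = int(line.split()[1])
        elif port is not None and line.startswith("c=IN IP4"):
            addr = line.split()[-1].split("/")[0]
            break
    return addr, port

sdp = """v=0
o=- 621355968671884050 621355968671884050 IN IP4 192.168.1.100
s=/videoinput_1:0/mjpeg_3/media.stm
c=IN IP4 0.0.0.0
m=video 40004 RTP/AVP 26
c=IN IP4 237.0.0.3/1
a=control:trackID=1"""
print(multicast_target(sdp))  # ('237.0.0.3', 40004)
```

Note that the session-level `c=IN IP4 0.0.0.0` before the `m=` line is deliberately skipped; only the media-level connection line carries the group address.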

    Here is the same debug section when using udp_multicast. The SDP is identical, as mentioned, but the block after the SDP containing the [mjpeg] codec info (beginning with marker=d8) is entirely missing: the stream is never identified. To the eye this happens instantaneously; there is no indication of a timeout spent waiting unsuccessfully for an RTP packet, though that could simply be insufficient debug output from the driver. Also note that ffmpeg knows the frames are MJPEG and the color primaries are PAL; it just doesn't know the frame size. Curious, though not relevant to the problem: the unicast UDP destination port used for the stream does not appear in the ffmpeg debug dump above, which means part of the RTSP/RTP driver is keeping important information under the kimono, namely that port number and how it knows the frames will be MJPEG.


    [tcp @ 0x7effe0002f40] Starting connection attempt to 192.168.1.100 port 554
    [tcp @ 0x7effe0002f40] Successfully connected to 192.168.1.100 port 554
    [rtsp @ 0x7effe0000b80] SDP:aq=    0KB vq=    0KB sq=    0B f=0/0
    v=0
    o=- 621355968671884050 621355968671884050 IN IP4 192.168.1.100
    s=/videoinput_1:0/mjpeg_3/media.stm
    c=IN IP4 0.0.0.0
    m=video 40004 RTP/AVP 26
    c=IN IP4 237.0.0.3/1
    a=control:trackID=1
    a=range:npt=0-
    a=framerate:25.0

    Failed to parse interval end specification ''
    [rtp @ 0x7effe0008e00] No default whitelist set
    [udp @ 0x7effe0009900] No default whitelist set
    [udp @ 0x7effe0009900] end receive buffer size reported is 425984
    [udp @ 0x7effe0019c40] No default whitelist set
    [udp @ 0x7effe0019c40] end receive buffer size reported is 425984
    [rtsp @ 0x7effe0000b80] setting jitter buffer size to 500
    [rtsp @ 0x7effe0000b80] hello state=0
    Failed to parse interval end specification ''
    [rtsp @ 0x7effe0000b80] Could not find codec parameters for stream 0 (Video: mjpeg, 1 reference frame, none(bt470bg/unknown/unknown, center)): unspecified size
    Consider increasing the value for the 'analyzeduration' (0) and 'probesize' (5000000) options
    Input #0, rtsp, from 'rtsp://192.168.1.100/videoinput_1/mjpeg_3/media.stm':
      Metadata:
        title           : /videoinput_1:0/mjpeg_3/media.stm
      Duration: N/A, start: 0.000000, bitrate: N/A
        Stream #0:0, 0, 1/90000: Video: mjpeg, 1 reference frame, none(bt470bg/unknown/unknown, center), 90k tbr, 90k tbn, 90k tbc
        nan M-V:    nan fd=   0 aq=    0KB vq=    0KB sq=    0B f=0/0


    This is the tcpdump of the traffic. The information in the two streams appears identical.


    19:21:30.703599 IP 192.168.1.100.64271 > 192.168.1.98.5239: UDP, length 60
    19:21:30.703734 IP 192.168.1.100.64270 > 192.168.1.98.5238: UDP, length 1400
    19:21:30.703852 IP 192.168.1.100.64270 > 192.168.1.98.5238: UDP, length 1400
    19:21:30.704326 IP 192.168.1.100.64270 > 192.168.1.98.5238: UDP, length 1400
    19:21:30.704326 IP 192.168.1.100.64270 > 192.168.1.98.5238: UDP, length 1400
    19:21:30.704327 IP 192.168.1.100.64270 > 192.168.1.98.5238: UDP, length 1400
    19:21:30.704327 IP 192.168.1.100.64270 > 192.168.1.98.5238: UDP, length 1400
    19:21:30.704504 IP 192.168.1.100.64270 > 192.168.1.98.5238: UDP, length 1400
    19:21:30.704813 IP 192.168.1.100.64270 > 192.168.1.98.5238: UDP, length 1400
    19:21:30.704814 IP 192.168.1.100.64270 > 192.168.1.98.5238: UDP, length 1400
    19:21:30.704872 IP 192.168.1.100.64270 > 192.168.1.98.5238: UDP, length 732
    19:21:30.704873 IP 192.168.1.100.59869 > 237.0.0.3.40005: UDP, length 60
    19:21:30.705513 IP 192.168.1.100.59868 > 237.0.0.3.40004: UDP, length 1400
    19:21:30.705513 IP 192.168.1.100.59868 > 237.0.0.3.40004: UDP, length 1400
    19:21:30.705513 IP 192.168.1.100.59868 > 237.0.0.3.40004: UDP, length 1400
    19:21:30.705513 IP 192.168.1.100.59868 > 237.0.0.3.40004: UDP, length 1400
    19:21:30.705594 IP 192.168.1.100.59868 > 237.0.0.3.40004: UDP, length 1400
    19:21:30.705774 IP 192.168.1.100.59868 > 237.0.0.3.40004: UDP, length 1400
    19:21:30.706236 IP 192.168.1.100.59868 > 237.0.0.3.40004: UDP, length 1400
    19:21:30.706236 IP 192.168.1.100.59868 > 237.0.0.3.40004: UDP, length 1400
    19:21:30.706236 IP 192.168.1.100.59868 > 237.0.0.3.40004: UDP, length 1400
    19:21:30.706236 IP 192.168.1.100.59868 > 237.0.0.3.40004: UDP, length 732

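One way to rule out a local reception problem, independent of ffmpeg, is to join the group yourself and see whether the datagrams shown in the tcpdump actually reach a socket on this host. A minimal sketch, assuming the group and port from the SDP; `membership_request` builds the standard `struct ip_mreq` payload (group address followed by interface address) for `IP_ADD_MEMBERSHIP`:

```python
import socket

def membership_request(group, interface="0.0.0.0"):
    # struct ip_mreq: 4-byte group address + 4-byte local interface address
    return socket.inet_aton(group) + socket.inet_aton(interface)

def open_multicast_socket(group="237.0.0.3", port=40004, timeout=5.0):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))  # the RTP port from the SDP's m= line
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP,
                    membership_request(group))
    sock.settimeout(timeout)
    return sock

print(membership_request("237.0.0.3").hex())  # ed00000300000000
```

If `open_multicast_socket().recv(2048)` also times out, the problem is host or network configuration (IGMP, the multicast route) rather than ffmpeg's RTSP handling.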

    I hope this is a configuration problem that I can fix on my ffplay/ffmpeg command line, and not a bug in ffmpeg. Thanks for any tips.