
Other articles (50)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, contact your MediaSPIP administrator to find out.

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer flash fallback is used.
    The HTML5 player was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
    These technologies make it possible to deliver video and audio both to conventional computers (...)

On other sites (9094)

  • Loss of frame order when saving AVFrames in a threaded environment

    10 June 2015, by Orphid

    Context: I’m using ffmpeg’s API to extract frames from video files for test purposes. I’ve had a lot of issues getting it to work, and several rewrites. This time I’ve added tests at every stage so I can isolate where errors occur.

    Problem: I want a producer-consumer setup: a FrameProcessor class has a thread that pushes decoded frames into a queue, which a test client consumes. After adding this, I suddenly found that my frames were being saved out of order, but I can’t see why, because everything should be locked and there is only one producer and one consumer. Here is the section of code that adds frames to the queue:

    void FrameProcessor::_do_processing()
    {
       AVPacket packet;

       _is_processing = true;

       Logger::Info("Starting to buffer frames on a separate thread.");

       while(!_stop_processing_requested && !_video->finished())
       {
           boost::mutex::scoped_lock lock(_mutex);

           if(!_video->get_next_packet(&packet))
           {
               Logger::Info("The video ran out of frames, so FrameProcessor producer thread has stopped.");
               lock.unlock();
               break;
           }

           Logger::Info("FrameProcessor producer thread reads a packet from the stream (pts = " + std::to_string(packet.pts) + ").");

           int indx = packet.stream_index;

           auto decoder_it = _decoder_map.find(indx);

           if(decoder_it == _decoder_map.end())
           {
               Logger::Debug("FrameProcessor discarded a packet from stream index " + std::to_string(indx) + " because the stream was not enabled in the FrameProcessor.");
               av_free_packet(&packet);
               continue; // without this, the code below would dereference _decoder_map.end()
           }

           StreamDecoder* strm_dc = decoder_it->second;

           AVFrame* frm = strm_dc->decode_packet(&packet);

           if(frm != NULL)
           {
               Logger::Debug("Producer thread pushed a frame into the FrameProcessor queue (pts = " + std::to_string(frm->pts) + ", coded num = " + std::to_string(frm->coded_picture_number) + ").");
               _frames.push(new TestFrame(frm));
               _print_queue();
           }

           av_free_packet(&packet);

           lock.unlock();

           _wait_for_buffer_space();

       }

       Logger::Info("Finished buffering. Buffer stop requested? " + std::to_string(_stop_processing_requested));

       _is_processing = false;
    }

    The dequeue section of a test before adding the producer thread:

    while(v->get_next_packet(&packet))
    {
        if(packet.stream_index == vi_first)
        {
            AVFrame* f = decoder.decode_packet(&packet);

            if(f != NULL)
            {
                TestFrame* tf = new TestFrame(f);

                Logger::Info("|MAIN| Dequeued frame with pts " + std::to_string(tf->av_frame->pts) + " and coded num " + std::to_string(tf->av_frame->coded_picture_number) + ".");

                int cdc_num = tf->av_frame->coded_picture_number;

                TestFrame* tf1 = formatter.format(*tf);

                Logger::Debug("|MAIN| Saving frame with coded id " + std::to_string(cdc_num) + " as frame" + std::to_string(i) + ".ppm");

                save_ppm_frame(tf1->av_frame, v1_ctx->width, v1_ctx->height, "tst3", i++);

                delete tf1;
            }

            if(i > 40)
                break;
        }
    }

    And the dequeue section of one of my tests after adding the producer thread / FrameProcessor class:

    while(!v->finished() && i < 40)
    {
       tf = processor->next_frame();

       if(tf == NULL)
           continue;

       Logger::Info("|MAIN| Dequeued frame with pts " + std::to_string(tf->av_frame->pts) + " and coded num " + std::to_string(tf->av_frame->coded_picture_number) + ".");

       //formatting the frame somehow seems to cause the coded_picture_number to be zeroed, so get it now.
       int cdc_num = tf->av_frame->coded_picture_number;

       TestFrame* tf1 = formatter->format(*tf);

       Logger::Debug("|MAIN| Saving frame with coded id " + std::to_string(cdc_num) + " as frame" + std::to_string(i) + ".ppm");

       save_ppm_frame(tf1->av_frame, width, height, "tst4", i++);

       delete tf1;
    }

    The only two threads that should be in use are the main program thread and the producer thread. The really odd thing is that my logs show all the coded ids coming out in the correct order, but when I check the actual ppm files, the images are clearly not in order (hopefully you can see the dot in these images jumping backwards and forwards, when it should be moving in a smooth curve):

    (linked example images: frame26, frame27, frame28, frame29)

    To be clear, before adding the threading, these frames were coming out correctly. Here’s the logging from two runs, one without threading and one with (please be aware that coded_picture_number is not the same as display order; I’m just using the values to check that they match the earlier, successful run). I grepped the logs to show just the consumer side:

    Before adding the producer thread and frame queue:

    [2015-Jun-10 11:01:44.783523]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 0.
    [2015-Jun-10 11:01:44.783859]: <debug> |MAIN| Saving frame with coded id 0 as frame0.ppm
    [2015-Jun-10 11:01:44.784811]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 2.
    [2015-Jun-10 11:01:44.784982]: <debug> |MAIN| Saving frame with coded id 2 as frame1.ppm
    [2015-Jun-10 11:01:44.785910]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 3.
    [2015-Jun-10 11:01:44.786075]: <debug> |MAIN| Saving frame with coded id 3 as frame2.ppm
    [2015-Jun-10 11:01:44.786896]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 1.
    [2015-Jun-10 11:01:44.787060]: <debug> |MAIN| Saving frame with coded id 1 as frame3.ppm
    [2015-Jun-10 11:01:44.787967]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 5.
    [2015-Jun-10 11:01:44.788145]: <debug> |MAIN| Saving frame with coded id 5 as frame4.ppm
    [2015-Jun-10 11:01:44.789004]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 6.
    [2015-Jun-10 11:01:44.789173]: <debug> |MAIN| Saving frame with coded id 6 as frame5.ppm
    [2015-Jun-10 11:01:44.789946]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 4.
    [2015-Jun-10 11:01:44.790124]: <debug> |MAIN| Saving frame with coded id 4 as frame6.ppm
    [2015-Jun-10 11:01:44.790909]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 8.
    [2015-Jun-10 11:01:44.791075]: <debug> |MAIN| Saving frame with coded id 8 as frame7.ppm
    [2015-Jun-10 11:01:44.791895]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 9.
    [2015-Jun-10 11:01:44.792060]: <debug> |MAIN| Saving frame with coded id 9 as frame8.ppm
    [2015-Jun-10 11:01:44.793512]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 7.
    [2015-Jun-10 11:01:44.793704]: <debug> |MAIN| Saving frame with coded id 7 as frame9.ppm
    [2015-Jun-10 11:01:44.794566]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 11.
    [2015-Jun-10 11:01:44.794733]: <debug> |MAIN| Saving frame with coded id 11 as frame10.ppm
    [2015-Jun-10 11:01:44.795638]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 12.
    [2015-Jun-10 11:01:44.795802]: <debug> |MAIN| Saving frame with coded id 12 as frame11.ppm
    [2015-Jun-10 11:01:44.796633]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 10.
    [2015-Jun-10 11:01:44.796797]: <debug> |MAIN| Saving frame with coded id 10 as frame12.ppm
    [2015-Jun-10 11:01:44.797594]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 14.
    [2015-Jun-10 11:01:44.797759]: <debug> |MAIN| Saving frame with coded id 14 as frame13.ppm
    [2015-Jun-10 11:01:44.798524]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 15.
    [2015-Jun-10 11:01:44.798666]: <debug> |MAIN| Saving frame with coded id 15 as frame14.ppm
    [2015-Jun-10 11:01:44.799332]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 13.
    [2015-Jun-10 11:01:44.799470]: <debug> |MAIN| Saving frame with coded id 13 as frame15.ppm
    [2015-Jun-10 11:01:44.800176]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 17.
    [2015-Jun-10 11:01:44.800312]: <debug> |MAIN| Saving frame with coded id 17 as frame16.ppm
    [2015-Jun-10 11:01:44.801048]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 18.
    [2015-Jun-10 11:01:44.801186]: <debug> |MAIN| Saving frame with coded id 18 as frame17.ppm
    [2015-Jun-10 11:01:44.801842]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 16.
    [2015-Jun-10 11:01:44.801977]: <debug> |MAIN| Saving frame with coded id 16 as frame18.ppm
    [2015-Jun-10 11:01:44.802629]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 20.
    [2015-Jun-10 11:01:44.802766]: <debug> |MAIN| Saving frame with coded id 20 as frame19.ppm
    [2015-Jun-10 11:01:44.803427]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 21.
    [2015-Jun-10 11:01:44.803564]: <debug> |MAIN| Saving frame with coded id 21 as frame20.ppm
    [2015-Jun-10 11:01:44.804289]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 19.
    [2015-Jun-10 11:01:44.804429]: <debug> |MAIN| Saving frame with coded id 19 as frame21.ppm
    [2015-Jun-10 11:01:44.805131]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 23.
    [2015-Jun-10 11:01:44.805266]: <debug> |MAIN| Saving frame with coded id 23 as frame22.ppm
    [2015-Jun-10 11:01:44.805987]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 24.
    [2015-Jun-10 11:01:44.806132]: <debug> |MAIN| Saving frame with coded id 24 as frame23.ppm
    [2015-Jun-10 11:01:44.806792]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 22.
    [2015-Jun-10 11:01:44.806929]: <debug> |MAIN| Saving frame with coded id 22 as frame24.ppm
    [2015-Jun-10 11:01:44.807604]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 26.
    [2015-Jun-10 11:01:44.807742]: <debug> |MAIN| Saving frame with coded id 26 as frame25.ppm
    [2015-Jun-10 11:01:44.808411]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 27.
    [2015-Jun-10 11:01:44.808550]: <debug> |MAIN| Saving frame with coded id 27 as frame26.ppm
    [2015-Jun-10 11:01:44.809201]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 25.
    [2015-Jun-10 11:01:44.809339]: <debug> |MAIN| Saving frame with coded id 25 as frame27.ppm
    [2015-Jun-10 11:01:44.810079]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 29.
    [2015-Jun-10 11:01:44.810243]: <debug> |MAIN| Saving frame with coded id 29 as frame28.ppm
    [2015-Jun-10 11:01:44.810887]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 30.
    [2015-Jun-10 11:01:44.811001]: <debug> |MAIN| Saving frame with coded id 30 as frame29.ppm
    [2015-Jun-10 11:01:44.811672]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 28.
    [2015-Jun-10 11:01:44.811785]: <debug> |MAIN| Saving frame with coded id 28 as frame30.ppm
    [2015-Jun-10 11:01:44.812450]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 32.
    [2015-Jun-10 11:01:44.812565]: <debug> |MAIN| Saving frame with coded id 32 as frame31.ppm
    [2015-Jun-10 11:01:44.813262]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 33.
    [2015-Jun-10 11:01:44.813380]: <debug> |MAIN| Saving frame with coded id 33 as frame32.ppm
    [2015-Jun-10 11:01:44.813964]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 31.
    [2015-Jun-10 11:01:44.814076]: <debug> |MAIN| Saving frame with coded id 31 as frame33.ppm
    [2015-Jun-10 11:01:44.814664]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 35.
    [2015-Jun-10 11:01:44.814778]: <debug> |MAIN| Saving frame with coded id 35 as frame34.ppm
    [2015-Jun-10 11:01:44.815375]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 36.
    [2015-Jun-10 11:01:44.815489]: <debug> |MAIN| Saving frame with coded id 36 as frame35.ppm

    After adding the producer thread and frame queue:

    [2015-Jun-10 11:01:44.827119]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 0.
    [2015-Jun-10 11:01:44.827459]: <debug> |MAIN| Saving frame with coded id 0 as frame0.ppm
    [2015-Jun-10 11:01:44.829389]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 2.
    [2015-Jun-10 11:01:44.829751]: <debug> |MAIN| Saving frame with coded id 2 as frame1.ppm
    [2015-Jun-10 11:01:44.831429]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 3.
    [2015-Jun-10 11:01:44.831758]: <debug> |MAIN| Saving frame with coded id 3 as frame2.ppm
    [2015-Jun-10 11:01:44.833583]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 1.
    [2015-Jun-10 11:01:44.833906]: <debug> |MAIN| Saving frame with coded id 1 as frame3.ppm
    [2015-Jun-10 11:01:44.834585]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 5.
    [2015-Jun-10 11:01:44.834691]: <debug> |MAIN| Saving frame with coded id 5 as frame4.ppm
    [2015-Jun-10 11:01:44.835167]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 6.
    [2015-Jun-10 11:01:44.835273]: <debug> |MAIN| Saving frame with coded id 6 as frame5.ppm
    [2015-Jun-10 11:01:44.835753]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 4.
    [2015-Jun-10 11:01:44.835857]: <debug> |MAIN| Saving frame with coded id 4 as frame6.ppm
    [2015-Jun-10 11:01:44.836336]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 8.
    [2015-Jun-10 11:01:44.836440]: <debug> |MAIN| Saving frame with coded id 8 as frame7.ppm
    [2015-Jun-10 11:01:44.836897]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 9.
    [2015-Jun-10 11:01:44.837002]: <debug> |MAIN| Saving frame with coded id 9 as frame8.ppm
    [2015-Jun-10 11:01:44.837460]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 7.
    [2015-Jun-10 11:01:44.837565]: <debug> |MAIN| Saving frame with coded id 7 as frame9.ppm
    [2015-Jun-10 11:01:44.838080]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 11.
    [2015-Jun-10 11:01:44.838200]: <debug> |MAIN| Saving frame with coded id 11 as frame10.ppm
    [2015-Jun-10 11:01:44.838666]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 12.
    [2015-Jun-10 11:01:44.838772]: <debug> |MAIN| Saving frame with coded id 12 as frame11.ppm
    [2015-Jun-10 11:01:44.839322]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 10.
    [2015-Jun-10 11:01:44.839428]: <debug> |MAIN| Saving frame with coded id 10 as frame12.ppm
    [2015-Jun-10 11:01:44.839910]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 14.
    [2015-Jun-10 11:01:44.840018]: <debug> |MAIN| Saving frame with coded id 14 as frame13.ppm
    [2015-Jun-10 11:01:44.867898]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 15.
    [2015-Jun-10 11:01:44.868013]: <debug> |MAIN| Saving frame with coded id 15 as frame14.ppm
    [2015-Jun-10 11:01:44.868925]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 13.
    [2015-Jun-10 11:01:44.869039]: <debug> |MAIN| Saving frame with coded id 13 as frame15.ppm
    [2015-Jun-10 11:01:44.869938]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 17.
    [2015-Jun-10 11:01:44.870051]: <debug> |MAIN| Saving frame with coded id 17 as frame16.ppm
    [2015-Jun-10 11:01:44.871000]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 18.
    [2015-Jun-10 11:01:44.871154]: <debug> |MAIN| Saving frame with coded id 18 as frame17.ppm
    [2015-Jun-10 11:01:44.871792]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 16.
    [2015-Jun-10 11:01:44.871907]: <debug> |MAIN| Saving frame with coded id 16 as frame18.ppm
    [2015-Jun-10 11:01:44.872873]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 20.
    [2015-Jun-10 11:01:44.872993]: <debug> |MAIN| Saving frame with coded id 20 as frame19.ppm
    [2015-Jun-10 11:01:44.873606]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 21.
    [2015-Jun-10 11:01:44.873719]: <debug> |MAIN| Saving frame with coded id 21 as frame20.ppm
    [2015-Jun-10 11:01:44.874324]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 19.
    [2015-Jun-10 11:01:44.874432]: <debug> |MAIN| Saving frame with coded id 19 as frame21.ppm
    [2015-Jun-10 11:01:44.874931]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 23.
    [2015-Jun-10 11:01:44.875046]: <debug> |MAIN| Saving frame with coded id 23 as frame22.ppm
    [2015-Jun-10 11:01:44.875507]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 24.
    [2015-Jun-10 11:01:44.875620]: <debug> |MAIN| Saving frame with coded id 24 as frame23.ppm
    [2015-Jun-10 11:01:44.876087]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 22.
    [2015-Jun-10 11:01:44.876190]: <debug> |MAIN| Saving frame with coded id 22 as frame24.ppm
    [2015-Jun-10 11:01:44.876654]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 26.
    [2015-Jun-10 11:01:44.876757]: <debug> |MAIN| Saving frame with coded id 26 as frame25.ppm
    [2015-Jun-10 11:01:44.877242]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 27.
    [2015-Jun-10 11:01:44.877345]: <debug> |MAIN| Saving frame with coded id 27 as frame26.ppm
    [2015-Jun-10 11:01:44.877816]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 25.
    [2015-Jun-10 11:01:44.877919]: <debug> |MAIN| Saving frame with coded id 25 as frame27.ppm
    [2015-Jun-10 11:01:44.878453]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 29.
    [2015-Jun-10 11:01:44.878558]: <debug> |MAIN| Saving frame with coded id 29 as frame28.ppm
    [2015-Jun-10 11:01:44.879066]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 30.
    [2015-Jun-10 11:01:44.879187]: <debug> |MAIN| Saving frame with coded id 30 as frame29.ppm
    [2015-Jun-10 11:01:44.879676]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 28.
    [2015-Jun-10 11:01:44.879790]: <debug> |MAIN| Saving frame with coded id 28 as frame30.ppm
    [2015-Jun-10 11:01:44.908448]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 32.
    [2015-Jun-10 11:01:44.908569]: <debug> |MAIN| Saving frame with coded id 32 as frame31.ppm
    [2015-Jun-10 11:01:44.909642]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 33.
    [2015-Jun-10 11:01:44.909756]: <debug> |MAIN| Saving frame with coded id 33 as frame32.ppm
    [2015-Jun-10 11:01:44.910879]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 31.
    [2015-Jun-10 11:01:44.910992]: <debug> |MAIN| Saving frame with coded id 31 as frame33.ppm
    [2015-Jun-10 11:01:44.911988]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 35.
    [2015-Jun-10 11:01:44.912147]: <debug> |MAIN| Saving frame with coded id 35 as frame34.ppm
    [2015-Jun-10 11:01:44.913108]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 36.
    [2015-Jun-10 11:01:44.913222]: <debug> |MAIN| Saving frame with coded id 36 as frame35.ppm

    And, basically, I can’t see any difference between what the two tests log: as far as I can tell, both save frames with the same coded_picture_numbers in the same order, yet the images are different. If you pick the same frame from both runs (say frame28.ppm), the images at that index are simply not the same. I’m still very new to C++, so apologies if this is just a trivial general error.

    EDIT:

    I’ve worked around this by removing the producer-consumer model entirely and having the FrameProcessor simply decode the next frame when requested. This isn’t really an answer to what’s going wrong here, but I understand that it’s difficult for members to help when it’s impractical to provide enough code to reproduce the problem.
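    One hypothesis worth recording (not something the post confirms): if StreamDecoder::decode_packet returns a pointer into decoder-owned storage and TestFrame stores that AVFrame pointer without deep-copying it (e.g. via av_frame_clone), the producer can overwrite a frame's pixel data after it has been queued but before the consumer saves it. That would produce exactly this symptom: metadata like coded_picture_number is read early and logs in order, while the pixels on disk come out wrong. The toy sketch below (ToyDecoder is a hypothetical stand-in, not ffmpeg) shows the aliasing pitfall in a self-contained form:

    ```cpp
    #include <cassert>
    #include <cstdio>
    #include <queue>
    #include <string>

    // Toy stand-in for a decoder that reuses one internal buffer for every
    // decoded frame, the way an AVFrame's data buffers can be overwritten by
    // the next decode call if the frame is not cloned before queueing.
    struct ToyDecoder {
        char buf[16];
        const char* decode(int n) {
            std::snprintf(buf, sizeof(buf), "frame-%d", n);
            return buf;  // only valid until the next decode() call
        }
    };

    int main() {
        ToyDecoder dec;

        // Queueing the raw pointer: every entry aliases the same buffer, so by
        // the time the consumer reads the first entry it holds the LAST frame.
        std::queue<const char*> by_pointer;
        for (int i = 0; i < 3; ++i) by_pointer.push(dec.decode(i));
        assert(std::string(by_pointer.front()) == "frame-2");  // not "frame-0"

        // Copying the data before queueing (the moral equivalent of
        // av_frame_clone): each entry keeps its own frame.
        std::queue<std::string> by_copy;
        for (int i = 0; i < 3; ++i) by_copy.push(std::string(dec.decode(i)));
        assert(by_copy.front() == "frame-0");
        return 0;
    }
    ```

    Note that no amount of locking fixes this, because both threads are behaving correctly with respect to the queue; the sharing happens inside the queued object itself.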

  • Android bytedeco javacpp ffmpeg decode h264 bytes to yuv and render with openGL ES 2.0. Wrong colors

    9 June 2015, by tema_man

    Hi there! I’m trying to display a video stream that comes from the server as a byte array.
    The data in this array is an H.264-encoded image, and I decode it with the bytedeco javacpp-presets library like this:

    public class DMDecoder {

    private static final String LOG_TAG = "DMDecoder";

    private AVCodec avCodec;
    private AVCodecContext avCodecContext;
    private AVFrame avFrame;
    private AVPacket avPacket;
    private boolean wasIFrame;
    private long IFrameTimeStampMs;
    private int maxFps;
    private int codecId;

    private DMDecoderCallback callback;

    public DMDecoder(DMDecoderCallback cb) {
       this.callback = cb;
       this.codecId = AV_CODEC_ID_H264;
       avcodec_register_all();
       restart();
    }

    public void restart() {
       stop();
       start();
    }

    public void stop() {
       frames = 0;
       if (avCodecContext != null) {
           avcodec_close(avCodecContext);
           avcodec_free_context(avCodecContext);
           avCodecContext = null;
       }

       if (avCodec != null) {
           av_free(avCodec);
           avCodec = null;
       }

       if (avFrame != null) {
           av_frame_free(avFrame);
           avFrame = null;
       }

       if (avPacket != null) {
           av_free_packet(avPacket);
           avPacket = null;
       }
    }

    public void start() {
       avCodec = avcodec_find_decoder(codecId);

       avCodecContext = avcodec_alloc_context3(avCodec);
       AVDictionary opts = new AVDictionary();
       avcodec_open2(avCodecContext, avCodec, opts);

       avFrame = av_frame_alloc();
       avPacket = new AVPacket();
       av_init_packet(avPacket);
    }

    public VideoFrame decode(byte[] data, int dataOffset, int dataSize) {
       avPacket.pts(AV_NOPTS_VALUE);
       avPacket.dts(AV_NOPTS_VALUE);
       avPacket.data(new BytePointer(data).position(dataOffset));
       avPacket.size(dataSize);
       avPacket.pos(-1);

       IntBuffer gotPicture = IntBuffer.allocate(1);

       int processedBytes = avcodec_decode_video2(
               avCodecContext, avFrame, gotPicture, avPacket);

       if (avFrame.width() == 0 || avFrame.height() == 0) return null;

       VideoFrame frame = new VideoFrame();

       frame.colorPlane0 = new byte[avFrame.width() * avFrame.height()];
       frame.colorPlane1 = new byte[avFrame.width() / 2 * avFrame.height() / 2];
       frame.colorPlane2 = new byte[avFrame.width() / 2 * avFrame.height() / 2];

       if (avFrame.data(0) != null) avFrame.data(0).get(frame.colorPlane0);
       if (avFrame.data(1) != null) avFrame.data(1).get(frame.colorPlane1);
       if (avFrame.data(2) != null) avFrame.data(2).get(frame.colorPlane2);

       frame.lineSize0 = avFrame.width();
       frame.lineSize1 = avFrame.width() / 2;
       frame.lineSize2 = avFrame.width() / 2;

       frame.width = avFrame.width();
       frame.height = avFrame.height();

       return frame;
     }
    }
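    A classic cause of wrong colors with this kind of plane extraction (offered as a guess, not a confirmed diagnosis of the code above) is reading avFrame.data(n) as if the row stride were the visible width: FFmpeg typically pads each row out to avFrame.linesize(n) for alignment, so a flat width*height read drags padding bytes in and shifts every subsequent row. A minimal self-contained C++ sketch of the difference between a flat copy and a stride-aware copy:

    ```cpp
    #include <cassert>
    #include <cstring>

    // Copy one image plane out of a padded buffer. `linesize` plays the role
    // of AVFrame.linesize[n]: the allocated row stride, >= the visible width.
    void copy_plane(const unsigned char* src, int linesize,
                    int width, int height, unsigned char* dst) {
        for (int y = 0; y < height; ++y)
            std::memcpy(dst + y * width, src + y * linesize, width);
    }

    int main() {
        const int width = 4, height = 2, linesize = 8;  // 4 padding bytes/row
        // Visible pixels 1..8, padding marked 0xEE.
        unsigned char padded[linesize * height] = {
            1, 2, 3, 4, 0xEE, 0xEE, 0xEE, 0xEE,
            5, 6, 7, 8, 0xEE, 0xEE, 0xEE, 0xEE,
        };

        // Naive flat copy of width*height bytes: row 2 starts with padding.
        unsigned char naive[width * height];
        std::memcpy(naive, padded, sizeof(naive));
        assert(naive[4] == 0xEE);  // garbage where pixel 5 should be

        // Stride-aware copy: every row lands where it should.
        unsigned char correct[width * height];
        copy_plane(padded, linesize, width, height, correct);
        assert(correct[4] == 5);
        return 0;
    }
    ```

    If stride padding is the issue in the Java code, the symptom is usually a diagonally sheared or color-shifted image rather than random noise, since the chroma planes have their own (often different) linesize.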

    The VideoFrame class is just a simple POJO:

    public class VideoFrame {
       public byte[] colorPlane0;
       public byte[] colorPlane1;
       public byte[] colorPlane2;
       public int lineSize0;
       public int lineSize1;
       public int lineSize2;
       public int width;
       public int height;
       public long presentationTime;
    }

    After decoding, I send this frame to my GLRenderer class:

    public class GLRenderer implements GLSurfaceView.Renderer {

       private static final String LOG_TAG = "GLRenderer";

       private TexturePlane plane;

       private ConcurrentLinkedQueue<VideoFrame> frames;
       private int maxFps = 30;
       private VideoFrame currentFrame;
       private long startTime, endTime;
       private int viewWidth, viewHeight;
       private boolean isFirstFrameProcessed;

       public GLRenderer(int viewWidth, int viewHeight) {
           frames = new ConcurrentLinkedQueue<>();
           this.viewWidth = viewWidth;
           this.viewHeight = viewHeight;
       }

       // mMVPMatrix is an abbreviation for "Model View Projection Matrix"
       private final float[] mMVPMatrix = new float[16];
       private final float[] mProjectionMatrix = new float[16];
       private final float[] mViewMatrix = new float[16];

       @Override

       public void onSurfaceCreated(GL10 unused, EGLConfig config) {
           // Set the background frame color
           GLES20.glClearColor(0.1f, 0.1f, 0.1f, 1.0f);

           plane = new TexturePlane();
       }

       public void setMaxFps(int maxFps) {
           this.maxFps = maxFps;
       }

       @Override
       public void onDrawFrame(GL10 unused) {


           // Draw background color
           GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);

           // Set the camera position (View matrix)
           Matrix.setLookAtM(mViewMatrix, 0, 0, 0, -3, 0f, 0f, 0f, 0f, 1.0f, 0.0f);

           // Calculate the projection and view transformation
           Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mViewMatrix, 0);

           if (!isFirstFrameProcessed) checkViewPort(viewWidth, viewHeight);

           if (maxFps > 0 && startTime > 0) {
               endTime = System.currentTimeMillis();
               long time = endTime - startTime;
               //
               long wantedTime = 1000 / maxFps;
               //
               long wait;
               if (time < wantedTime) {
                   wait = wantedTime - time;
                   //
                   try {
                       Thread.sleep(wait);
                   } catch (InterruptedException e) {
                       Log.e(LOG_TAG, "thread interrupted exception");
                   }
               }
           }
           startTime = System.currentTimeMillis();
           tick();
           plane.draw(mMVPMatrix);
       }

       private void updateFrame(VideoFrame frame) {
           plane.updateTexture(frame.colorPlane0, frame.width, frame.height, 0);
           plane.updateTexture(frame.colorPlane1, frame.width / 2, frame.height / 2, 1);
           plane.updateTexture(frame.colorPlane2, frame.width / 2, frame.height / 2, 2);
           plane.setTextureWidth(frame.width);
           plane.setTextureHeight(frame.height);
       }

       private void tick() {

           if (frames.isEmpty()) return;

           VideoFrame frame = frames.peek();
           if (frame == null) return;

           long tms = System.currentTimeMillis();
           if (frame.presentationTime <= tms) {
               updateFrame(frame);
               currentFrame = frame;
               frames.remove(frame);
           }
       }

       @Override
       public void onSurfaceChanged(GL10 unused, int width, int height) {
           checkViewPort(width, height);
           viewWidth = width;
           viewHeight = height;
           plane.setTextureWidth(width);
           plane.setTextureHeight(height);
       }

       private void checkViewPort(int width, int height) {
           float viewRatio = (float) width / height;
           if (currentFrame != null) {
               float targetRatio = (float) currentFrame.width / currentFrame.height;
               int x, y, newWidth, newHeight;
               if (targetRatio > viewRatio) {
                   newWidth = width;
                   newHeight = (int) (width / targetRatio);
                   x = 0;
                   y = (height - newHeight) / 2;
               } else {
                   newHeight = height;
                   newWidth = (int) (height * targetRatio);
                   y = 0;
                   x = (width - newWidth) / 2;
               }
               GLES20.glViewport(x, y, newWidth, newHeight);
           } else {
               GLES20.glViewport(0, 0, width, height);
           }

           Matrix.frustumM(mProjectionMatrix, 0, 1, -1, -1, 1, 3, 4);
       }

       public void addFrame(VideoFrame frame) {
           if (frame != null) {
               frames.add(frame);
           }
       }
    }

    GLRenderer works with a simple OpenGL polygon, on which I draw all the textures:

       public class TexturePlane {

       private static final String LOG_TAG = "TexturePlane";

       private final String vertexShaderCode = "" +
       "uniform mat4 uMVPMatrix;" +
       "attribute vec4 vPosition;" +
       "attribute vec2 a_TexCoordinate;" +
       "varying vec2 v_TexCoordinate;" +

       "void main() {" +
       "  gl_Position = uMVPMatrix * vPosition;" +
       "  v_TexCoordinate = a_TexCoordinate;" +
       "}";

       private final String fragmentShaderCode = "" +
       "precision mediump float;" +
       "varying vec2 v_TexCoordinate;" +
       "uniform sampler2D s_texture_y;" +
       "uniform sampler2D s_texture_u;" +
       "uniform sampler2D s_texture_v;" +

       "void main() {" +
       "   float y = texture2D(s_texture_y, v_TexCoordinate).r;" +
       "   float u = texture2D(s_texture_u, v_TexCoordinate).r - 0.5;" +
       "   float v = texture2D(s_texture_v, v_TexCoordinate).r - 0.5;" +

       "   float r = y + 1.13983 * v;" +
       "   float g = y - 0.39465 * u - 0.58060 * v;" +
       "   float b = y + 2.03211 * u;" +

       "   gl_FragColor = vec4(r, g, b, 1.0);" +

       "}";

       private final FloatBuffer vertexBuffer;
       private final FloatBuffer textureBuffer;
       private final ShortBuffer drawListBuffer;
       private final int mProgram;
       private int mPositionHandle;
       private int mMVPMatrixHandle;

           // number of coordinates per vertex in this array
       private static final int COORDS_PER_VERTEX = 3;
       private static final int COORDS_PER_TEXTURE = 2;

       private static float squareCoords[] = {
           -1f, 1f, 0.0f,
           -1f, -1f, 0.0f,
           1f, -1f, 0.0f,
           1f, 1f, 0.0f
       };

       private static float uvs[] = {
           0.0f, 0.0f,
           0.0f, 1.0f,
           1.0f, 1.0f,
           1.0f, 0.0f
       };

       private final short drawOrder[] = {0, 1, 2, 0, 2, 3}; // order to draw vertices
       private final int vertexStride = COORDS_PER_VERTEX * 4; // 4 bytes per vertex

       private int textureWidth = 640;
       private int textureHeight = 480;

       private int yTextureUniformHandle;
       private int uTextureUniformHandle;
       private int vTextureUniformHandle;

       private int yTextureHandle;
       private int uTextureHandle;
       private int vTextureHandle;

       private int mTextureCoordinateHandle;

       public void setTextureWidth(int textureWidth) {
           this.textureWidth = textureWidth;
       }

       public int getTextureWidth() {
           return textureWidth;
       }

       public void setTextureHeight(int textureHeight) {
           this.textureHeight = textureHeight;
       }

       public int getTextureHeight() {
           return textureHeight;
       }

       /**
        * Sets up the drawing object data for use in an OpenGL ES context.
        */
       public TexturePlane() {
               // initialize vertex byte buffer for shape coordinates
           ByteBuffer bb = ByteBuffer.allocateDirect(squareCoords.length * 4);
           bb.order(ByteOrder.nativeOrder());
           vertexBuffer = bb.asFloatBuffer();
           vertexBuffer.put(squareCoords);
           vertexBuffer.position(0);

               // initialize byte buffer for the draw list
           ByteBuffer dlb = ByteBuffer.allocateDirect(drawOrder.length * 2);
           dlb.order(ByteOrder.nativeOrder());
           drawListBuffer = dlb.asShortBuffer();
           drawListBuffer.put(drawOrder);
           drawListBuffer.position(0);

               // initialize byte buffer for the draw list
           ByteBuffer tbb = ByteBuffer.allocateDirect(uvs.length * 4);
           tbb.order(ByteOrder.nativeOrder());
           textureBuffer = tbb.asFloatBuffer();
           textureBuffer.put(uvs);
           textureBuffer.position(0);

               mProgram = GLES20.glCreateProgram();             // create empty OpenGL Program
               compileShaders();
               setupTextures();
           }

           public void setupTextures() {
               yTextureHandle = setupTexture(null, textureWidth, textureHeight, 0);
               uTextureHandle = setupTexture(null, textureWidth, textureHeight, 1);
               vTextureHandle = setupTexture(null, textureWidth, textureHeight, 2);
           }

           public int setupTexture(ByteBuffer data, int width, int height, int index) {
               final int[] textureHandle = new int[1];

               GLES20.glGenTextures(1, textureHandle, 0);

               if (textureHandle[0] != 0) {
                       // Bind to the texture in OpenGL
                   GLES20.glActiveTexture(GLES20.GL_TEXTURE0 + index);
                   GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle[0]);

                   updateTexture(data, width, height, index);

                       // Set filtering
                   GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
                   GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);

                       // Set wrapping mode
                   GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_REPEAT);
                   GLES20.glTexParameterf(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_REPEAT);
               }

               if (textureHandle[0] == 0) {
                   Log.e(LOG_TAG, "Error loading texture.");
               }

               return textureHandle[0];
           }

           public void updateTexture(byte[] data, int width, int height, int index) {

               if (data == null) {
                   if (width == 0 || height == 0) {
                       width = textureWidth;
                       height = textureHeight;
                   }

                   data = new byte[width * height];
                   if (index == 0) {
                       Arrays.fill(data, y);
                   } else if (index == 1) {
                       Arrays.fill(data, u);
                   } else {
                       Arrays.fill(data, v);
                   }
               }

                // ByteBuffer.wrap is static and returns a new buffer; the result must be kept
                ByteBuffer byteBuffer = ByteBuffer.wrap(data);
                byteBuffer.position(0);

               GLES20.glActiveTexture(GLES20.GL_TEXTURE0 + index);

               GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE,
                   width, height, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, byteBuffer);
           }

           private void compileShaders() {
               // prepare shaders and OpenGL program
               int vertexShader = loadShader(
                   GLES20.GL_VERTEX_SHADER,
                   vertexShaderCode);
               int fragmentShader = loadShader(
                   GLES20.GL_FRAGMENT_SHADER,
                   fragmentShaderCode);

               GLES20.glAttachShader(mProgram, vertexShader);   // add the vertex shader to program
               GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
               GLES20.glLinkProgram(mProgram);                  // create OpenGL program executables
               checkGlError("glLinkProgram");

               // Add program to OpenGL environment
               GLES20.glUseProgram(mProgram);

               mPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
               mTextureCoordinateHandle = GLES20.glGetAttribLocation(mProgram, "a_TexCoordinate");

               GLES20.glEnableVertexAttribArray(mPositionHandle);
               GLES20.glEnableVertexAttribArray(mTextureCoordinateHandle);

               yTextureUniformHandle = GLES20.glGetUniformLocation(mProgram, "s_texture_y");
                // the names must match the shader declarations exactly ("s_Texture_u" would silently return -1)
                uTextureUniformHandle = GLES20.glGetUniformLocation(mProgram, "s_texture_u");
                vTextureUniformHandle = GLES20.glGetUniformLocation(mProgram, "s_texture_v");

               mMVPMatrixHandle = GLES20.glGetUniformLocation(mProgram, "uMVPMatrix");
               checkGlError("glGetUniformLocation");
           }

       /**
        * Utility method for compiling a OpenGL shader.
        * <p></p>
        * <p><strong>Note:</strong> When developing shaders, use the checkGlError()
        * method to debug shader coding errors.</p>
        *
        * @param type       - Vertex or fragment shader type.
        * @param shaderCode - String containing the shader code.
        * @return - Returns an id for the shader.
        */
       public int loadShader(int type, String shaderCode) {

               // create a vertex shader type (GLES20.GL_VERTEX_SHADER)
               // or a fragment shader type (GLES20.GL_FRAGMENT_SHADER)
           int shader = GLES20.glCreateShader(type);

               // add the source code to the shader and compile it
           GLES20.glShaderSource(shader, shaderCode);
           GLES20.glCompileShader(shader);

           return shader;
       }

       /**
        * Utility method for debugging OpenGL calls. Provide the name of the call
        * just after making it:
        * <p></p>
        * <pre>
        * mColorHandle = GLES20.glGetUniformLocation(mProgram, "vColor");
        * MyGLRenderer.checkGlError("glGetUniformLocation");</pre>
        *
        * If the operation is not successful, the check throws an error.
        *
        * @param glOperation - Name of the OpenGL call to check.
        */
       public void checkGlError(String glOperation) {
           int error;
           String errorString;
           while ((error = GLES20.glGetError()) != GLES20.GL_NO_ERROR) {
               errorString = GLU.gluErrorString(error);
               String message = glOperation + ": glError " + error + ": " + errorString;
               Log.e(LOG_TAG, message);
               throw new RuntimeException(message);
           }
       }

       public void draw(float[] mvpMatrix) {

               // Prepare the triangle coordinate data
           GLES20.glVertexAttribPointer(
               mPositionHandle, COORDS_PER_VERTEX,
               GLES20.GL_FLOAT, false,
               vertexStride, vertexBuffer);

           GLES20.glVertexAttribPointer(
               mTextureCoordinateHandle, COORDS_PER_TEXTURE,
               GLES20.GL_FLOAT, false,
               0, textureBuffer);

           GLES20.glUniformMatrix4fv(mMVPMatrixHandle, 1, false, mvpMatrix, 0);
           checkGlError("glUniformMatrix4fv");

           GLES20.glUniform1i(yTextureUniformHandle, 0);
           GLES20.glUniform1i(uTextureUniformHandle, 1);
           GLES20.glUniform1i(vTextureUniformHandle, 2);

               // Draw the square
           GLES20.glDrawElements(
               GLES20.GL_TRIANGLES, drawOrder.length,
               GLES20.GL_UNSIGNED_SHORT, drawListBuffer);
       }
    }

     But I have a problem: my GL surface displays the image with the wrong colors. image

     What am I doing wrong?

    UPDATE :

     As Ronald S. Bultje says, I added a glBindTexture(...) call to my code, and the updateTexture(...) method now looks like this:

    public void updateTexture(byte[] data, int width, int height, int index) {

       if (data == null) {
           if (width == 0 || height == 0) {
               width = textureWidth;
               height = textureHeight;
           }

           data = new byte[width * height];
           if (index == 0) {
               Arrays.fill(data, y);
           } else if (index == 1) {
               Arrays.fill(data, u);
           } else {
               Arrays.fill(data, v);
           }
       }

        // ByteBuffer.wrap is static and returns a new buffer; the result must be kept
        ByteBuffer byteBuffer = ByteBuffer.wrap(data);
        byteBuffer.position(0);

       GLES20.glActiveTexture(GLES20.GL_TEXTURE0 + index);

       int textureHandle = index == 0 ? yTextureHandle : index == 1 ? uTextureHandle : vTextureHandle;
       GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureHandle);

       GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE,
           width, height, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, byteBuffer);
    }
  • FFMPEG error with avformat_open_input returning -135

     28 April 2015, by LawfulEvil

    I have a DLL one of my applications uses to receive video from RTSP cameras. Under the hood, the DLL uses FFMPEG libs from this release zip :

    ffmpeg-20141022-git-6dc99fd-win64-shared.7z

    We have a wide variety of cameras in house and most of them work just fine. However, on one particular Pelco Model Number : IXE20DN-OCP, I am unable to connect. I tested the camera and rtsp connection string on VLC and it connects to the camera just fine.

    I found the connection string here : http://www.ispyconnect.com/man.aspx?n=Pelco

    rtsp://IPADDRESS:554/1/stream1

     Oddly, even if I leave the port off in VLC, it still connects, so I'm guessing it's the default RTSP port, or that VLC tries a variety of things based on your input.

     In any case, when I attempt to connect, I get an error from avformat_open_input. It returns a code of -135. When I looked in the error code list, I didn't see that value listed. For good measure, I printed out all the errors in error.h just to see what their values were.

    DumpErrorCodes - Error Code : AVERROR_BSF_NOT_FOUND = -1179861752
    DumpErrorCodes - Error Code : AVERROR_BUG = -558323010
    DumpErrorCodes - Error Code : AVERROR_BUFFER_TOO_SMALL = -1397118274
    DumpErrorCodes - Error Code : AVERROR_DECODER_NOT_FOUND = -1128613112
    DumpErrorCodes - Error Code : AVERROR_DEMUXER_NOT_FOUND = -1296385272
    DumpErrorCodes - Error Code : AVERROR_ENCODER_NOT_FOUND = -1129203192
    DumpErrorCodes - Error Code : AVERROR_EOF = -541478725
    DumpErrorCodes - Error Code : AVERROR_EXIT = -1414092869
    DumpErrorCodes - Error Code : AVERROR_EXTERNAL = -542398533
    DumpErrorCodes - Error Code : AVERROR_FILTER_NOT_FOUND = -1279870712
    DumpErrorCodes - Error Code : AVERROR_INVALIDDATA = -1094995529
    DumpErrorCodes - Error Code : AVERROR_MUXER_NOT_FOUND = -1481985528
    DumpErrorCodes - Error Code : AVERROR_OPTION_NOT_FOUND = -1414549496
    DumpErrorCodes - Error Code : AVERROR_PATCHWELCOME = -1163346256
    DumpErrorCodes - Error Code : AVERROR_PROTOCOL_NOT_FOUND = -1330794744
    DumpErrorCodes - Error Code : AVERROR_STREAM_NOT_FOUND = -1381258232
    DumpErrorCodes - Error Code : AVERROR_BUG2 = -541545794
    DumpErrorCodes - Error Code : AVERROR_UNKNOWN = -1313558101
    DumpErrorCodes - Error Code : AVERROR_EXPERIMENTAL = -733130664
    DumpErrorCodes - Error Code : AVERROR_INPUT_CHANGED = -1668179713
    DumpErrorCodes - Error Code : AVERROR_OUTPUT_CHANGED = -1668179714
    DumpErrorCodes - Error Code : AVERROR_HTTP_BAD_REQUEST = -808465656
    DumpErrorCodes - Error Code : AVERROR_HTTP_UNAUTHORIZED = -825242872
    DumpErrorCodes - Error Code : AVERROR_HTTP_FORBIDDEN = -858797304
    DumpErrorCodes - Error Code : AVERROR_HTTP_NOT_FOUND = -875574520
    DumpErrorCodes - Error Code : AVERROR_HTTP_OTHER_4XX = -1482175736
    DumpErrorCodes - Error Code : AVERROR_HTTP_SERVER_ERROR = -1482175992

     Nothing even close to -135. I did find this error, sort of, on Stack Overflow ("runtime error when linking ffmpeg libraries in qt creator"), where the author claims it is a DLL loading problem. I'm not sure what led him to think that, but I followed the advice and used Dependency Walker (http://www.dependencywalker.com/) to check which dependencies it thought my DLL needed. It listed a few, but they were already provided in my install package.

     To make sure it was picking them up, I manually removed them from the install and observed a radical change in program behavior (my DLL didn't load and run at all).

    So, I’ve got a bit of init code :

     void FfmpegInitialize()
     {
         av_lockmgr_register(&LockManagerCb);
         av_register_all();
         LOG_DEBUG0("av_register_all returned\n");
     }

    Then I’ve got my main open connection routine ...

    int RTSPConnect(const char *URL, int width, int height, frameReceived callbackFunction)
    {

       int errCode =0;
       if ((errCode = avformat_network_init()) != 0)
       {
           LOG_ERROR1("avformat_network_init returned error code %d\n", errCode);  
       }
       LOG_DEBUG0("avformat_network_init returned\n");
       //Allocate space and setup the the object to be used for storing all info needed for this connection
       fContextReadFrame = avformat_alloc_context(); // free'd in the Close method

       if (fContextReadFrame == 0)
       {
            LOG_ERROR1("avformat_alloc_context returned NULL.   Error code = %d\n", errCode);
           return FFMPEG_OPTION_SET_FAILURE;
       }

       LOG_DEBUG1("avformat_alloc_context returned %p\n", fContextReadFrame);

       AVDictionary *opts = 0;
        if ((errCode = av_dict_set(&opts, "rtsp_transport", "tcp", 0)) < 0)
       {
           LOG_ERROR1("Unable to set rtsp_transport options.   Error code = %d\n", errCode);
           return FFMPEG_OPTION_SET_FAILURE;
       }
       LOG_DEBUG1("av_dict_set returned %d\n", errCode);

       //open rtsp
       DumpErrorCodes();
        if ((errCode = avformat_open_input(&fContextReadFrame, URL, NULL, &opts)) < 0)
       {
           LOG_ERROR2("Unable to open avFormat RF inputs.   URL = %s, and Error code = %d\n", URL, errCode);      
           LOG_ERROR2("Error Code %d = %s\n", errCode, errMsg(errCode));      
           // NOTE context is free'd on failure.
           return FFMPEG_FORMAT_OPEN_FAILURE;
       }
    ...

     To be sure I didn't misunderstand the error code, I printed the error message from ffmpeg, but the code isn't found and my canned error message is returned instead.

     My next step was going to be hooking up Wireshark on my connection attempt and on the VLC connection attempt, and trying to figure out what differences (if any) are causing the problem and what I can do to ffmpeg to make it work. As I said, I've got a dozen other cameras in house that use RTSP, and they work with my DLL. Some use usernames/passwords/etc. as well (so I know that isn't the problem).

    Also, my run logs :

    FfmpegInitialize - av_register_all returned
    Open - Open called.  Pointers valid, passing control.
    Rtsp::RtspInterface::Open - Rtsp::RtspInterface::Open called
    Rtsp::RtspInterface::Open - VideoSourceString(35) = rtsp://192.168.14.60:554/1/stream1
    Rtsp::RtspInterface::Open - Base URL = (192.168.14.60:554/1/stream1)
    Rtsp::RtspInterface::Open - Attempting to open (rtsp://192.168.14.60:554/1/stream1) for WxH(320x240) video
    RTSPSetFormatH264 - RTSPSetFormatH264
    RTSPConnect - Called
    LockManagerCb - LockManagerCb invoked for op 1
    LockManagerCb - LockManagerCb invoked for op 2
    RTSPConnect - avformat_network_init returned
    RTSPConnect - avformat_alloc_context returned 019E6000
    RTSPConnect - av_dict_set returned 0
    DumpErrorCodes - Error Code : AVERROR_BSF_NOT_FOUND = -1179861752
    ...
    DumpErrorCodes - Error Code : AVERROR_HTTP_SERVER_ERROR = -1482175992
    RTSPConnect - Unable to open avFormat RF inputs.   URL = rtsp://192.168.14.60:554/1/stream1, and Error code = -135
    RTSPConnect - Error Code -135 = No Error Message Available

     I'm going to move forward with Wireshark, but I would like to know the origin of the -135 error code from ffmpeg. Looking at the code, if 'ret' is getting set to -135, it must be coming from the return code of a helper method and not directly from the avformat_open_input method itself.

    https://www.ffmpeg.org/doxygen/2.5/libavformat_2utils_8c_source.html#l00398

    After upgrading to the latest daily ffmpeg build, I get data on wireshark. Real Time Streaming Protocol :

    Request: SETUP rtsp://192.168.14.60/stream1/track1 RTSP/1.0\r\n
    Method: SETUP
    URL: rtsp://192.168.14.60/stream1/track1
    Transport: RTP/AVP/TCP;unicast;interleaved=0-1
    CSeq: 3\r\n
    User-Agent: Lavf56.31.100\r\n
    \r\n

    The response to that is the first ’error’ that I can detect in the initiation.

    Response: RTSP/1.0 461 Unsupported Transport\r\n
    Status: 461
    CSeq: 3\r\n
    Date: Sun, Jan 04 1970 16:03:05 GMT\r\n
    \r\n

     I'm going to guess that it means the transport we selected is unsupported. A quick check of the code reveals I picked 'tcp'. Looking through the reply to the DESCRIBE command, it appears:

    Media Protocol: RTP/AVP

    Further, when SETUP is issued by ffmpeg, it specifies :

    Transport: RTP/AVP/TCP;unicast;interleaved=0-1

     I'm going to try, on failure here, to pick another transport type and see how that works. I still don't know where the -135 comes from.