
Media (91)
-
Corona Radiata
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Lights in the Sky
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Head Down
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Echoplex
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Discipline
26 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Letting You
26 September 2011
Updated: September 2011
Language: English
Type: Audio
Other articles (56)
-
Requesting the creation of a channel
12 March 2010
Depending on how the platform is configured, the user may have two different methods available for requesting the creation of a channel: the first at the time of registration, the second after registration, by filling in a request form.
Both methods ask for the same things and work in much the same way: the prospective user must fill in a series of form fields which, first of all, give the administrators information about (...)
-
MediaSPIP v0.2
21 June 2013
MediaSPIP 0.2 is the first MediaSPIP stable release.
Its official release date is June 21, 2013 and is announced here.
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)
-
Mediabox: opening images in the maximum space available to the user
8 February 2011
Image display is constrained by the width allowed by the site design (which depends on the theme in use), so images are shown at a reduced size. To make use of all the space available on the user's screen, it is possible to add a feature that displays the image in a media box appearing above the rest of the content.
To do this, the "Mediabox" plugin must be installed.
Configuring the media box
As soon as (...)
On other sites (6755)
-
Why can't my ffmpeg-based code sync the video's time and the audio's time?
6 July 2021, by ZSpirytus
Background


Recently, I have been using ffmpeg to write my first Android video player, but the video channel's clock runs about 2 to 3 times faster than the audio channel's clock.


Code


In short, I use a PacketDispatcher to read AVPackets from the http hlv source:


void PacketDispatcher::RealDispatch() {
    while (GetStatus() != DISPATCHER_STOP) {
        while (GetStatus() == DISPATCHER_PAUSE) {
            LOGD(TAG, "wait signal");
            pthread_mutex_lock(&mutex);
            pthread_cond_wait(&cond, &mutex);
            pthread_mutex_unlock(&mutex);
        }

        AVPacket *av_packet = av_packet_alloc();
        int ret = av_read_frame(av_format_context, av_packet);
        if (ret) {
            LOGE(TAG, "av_read_frame ret=%d", ret);
            break;
        }

        // PacketDispatcher is who read the AVPacket from http hlv source
        // and dispatch to decoder by its stream index.
        decoder_map[av_packet->stream_index]->Push(av_packet);
    }
}



Next, the Decoder is written with the producer-consumer pattern: it maintains a queue that stores all the AVPackets received from the PacketDispatcher. The code looks like this:


// write to the queue
void BaseDecoder::Push(AVPacket *av_packet) {
    pthread_mutex_lock(&av_packet_queue_mutex);
    av_packet_queue.push(av_packet);
    pthread_cond_signal(&av_packet_queue_cond);
    pthread_mutex_unlock(&av_packet_queue_mutex);
}

// real decode logic
void BaseDecoder::RealDecode() {
    SetDecoderState(START);
    LOGI(LogSpec(), "start decode");

    while (true) {
        // 1. check decoder status and queue size to decide if call thread.wait

        // 2. send packet to codec
        AVPacket* av_packet = av_packet_queue.front();
        int ret = avcodec_send_packet(av_codec_ctx, av_packet);

        // 3. read frame from codec
        AVFrame *av_frame = av_frame_alloc();
        ret = avcodec_receive_frame(av_codec_ctx, av_frame);

        if (m_render) {
            // 4. custom decode logic overridden by child class
            void *decode_result = DecodeFrame(av_frame);
            if (decode_result) {
                // 5. dispatch to render
                m_render->Render(decode_result);
            } else {
                LOGD("BaseDecoder", "decode_result=nullptr");
            }
        }
    }
}



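(As an aside, and not part of the original code: the blocking pop hinted at by the "1. check decoder status and queue size" comment above is elided. A typical implementation, using the same mutex and condition variable as Push(), would look roughly like this.)

// Hypothetical sketch of the consumer-side pop assumed by RealDecode() above;
// the member names mirror those used in BaseDecoder::Push().
AVPacket *BaseDecoder::PopBlocking() {
    pthread_mutex_lock(&av_packet_queue_mutex);
    while (av_packet_queue.empty()) {
        // releases the mutex while waiting and re-acquires it on wake-up
        pthread_cond_wait(&av_packet_queue_cond, &av_packet_queue_mutex);
    }
    AVPacket *av_packet = av_packet_queue.front();
    av_packet_queue.pop();
    pthread_mutex_unlock(&av_packet_queue_mutex);
    return av_packet;
}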
Finally, I do the rendering logic in Render. Render is also written with the producer-consumer pattern: it maintains a queue that stores the AVFrames received from the Decoder. The code looks like this:


// write AVFrame
void BaseRender::Render(void *frame_data) {
    Lock();
    frame_queue.push(frame_data);
    Signal();
    UnLock();
}

// render to surface or Open SL
void BaseRender::RealRender() {
    // frame data that contain frame pts and other metadata
    frame_data->pts = av_frame->pts = av_frame->best_effort_timestamp * av_q2d(GetTimeBase());
    // video only
    frame_data->video_extra_delay = av_frame->repeat_pict * 1.0 / fps * 2.0;
    if (m_render_synchronizer && m_render_synchronizer->Sync(frame_data)) {
        continue;
    }
}



The synchronizer then decides how long to sleep, or whether to drop the video frame, according to the frame pts. The frame pts is:


frame_data->pts = av_frame->best_effort_timestamp * av_q2d(GetTimeBase());



Also, the video extra delay is:


frame_data->video_extra_delay = av_frame->repeat_pict * 1.0 / fps * 2.0;



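For reference (this is a comparison, not a correction confirmed against the full code): the FFmpeg documentation for AVFrame::repeat_pict and the dranger tutorial derive the extra delay as repeat_pict / (2 * fps), i.e. half a frame duration per repeat, whereas the expression above works out to repeat_pict / fps * 2. A minimal sketch of the documented convention, assuming fps holds the stream's frame rate:

// extra_delay = repeat_pict / (2 * fps), per the AVFrame::repeat_pict documentation
double frame_duration = 1.0 / fps;                                    // assumes fps is known
double extra_delay = av_frame->repeat_pict * (frame_duration * 0.5);  // half a frame per repeat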
The RenderSynchronizer code looks like this:


bool RenderSynchronizer::Sync(void *frame_data) {
    auto base_frame_data = static_cast<BaseFrameData *>(frame_data);
    if (base_frame_data->media_type == AVMEDIA_TYPE_AUDIO) {
        return ReceiveAudioFrame(static_cast<PCMData *>(frame_data));
    } else if (base_frame_data->media_type == AVMEDIA_TYPE_VIDEO) {
        return ReceiveVideoFrame(static_cast<RGBAData *>(frame_data));
    }
    return false;
}

bool RenderSynchronizer::ReceiveAudioFrame(PCMData *pcm_data) {
    audio_pts = pcm_data->pts;
    return false;
}

bool RenderSynchronizer::ReceiveVideoFrame(RGBAData *rgba_data) {
    video_pts = rgba_data->pts;

    if (audio_pts <= 0 || video_pts <= 0) {
        return false;
    }

    double diff = video_pts - audio_pts;
    if (diff > 0) {
        if (diff > 1) {
            av_usleep((unsigned int) (rgba_data->extra_delay * 1000000.0));
        } else {
            av_usleep((unsigned int) ((diff + rgba_data->extra_delay) * 1000000.0));
        }
        return false;
    } else if (diff < 0) {
        LOGD(TAG, "drop video frame");
        return true;
    } else {
        return false;
    }
}


Why can't my code keep the video time and the audio time in sync? Thanks for reading and for any answers.
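For comparison, here is a minimal sketch of the classic audio-master-clock decision used by players such as ffplay and the dranger tutorial; it is illustrative only, every name in it is made up, and pts values are assumed to already be in seconds:

// Minimal sketch of audio-master sync (illustrative, not the code above).
#include <algorithm>

struct SyncState {
    double audio_pts = 0.0;      // last pts handed to the audio device
    double last_video_pts = 0.0; // pts of the previously shown video frame
    double last_delay = 0.040;   // fallback frame duration (~25 fps)
};

// Returns the number of seconds to wait before showing this frame,
// or a negative value meaning "drop the frame, it is too late".
double ComputeVideoDelay(SyncState &st, double video_pts) {
    double delay = video_pts - st.last_video_pts;   // nominal frame duration
    if (delay <= 0.0 || delay >= 1.0)
        delay = st.last_delay;                      // guard against bad pts
    st.last_delay = delay;
    st.last_video_pts = video_pts;

    double diff = video_pts - st.audio_pts;         // > 0: video ahead of audio
    double threshold = std::max(delay, 0.010);
    if (diff <= -threshold)
        return -1.0;                                // late: drop the frame
    if (diff >= threshold)
        delay *= 2.0;                               // early: wait roughly one extra frame
    return delay;
}

Using a threshold of roughly one frame duration keeps the loop from reacting to small jitter between the two clocks; whatever sleep primitive consumes the returned delay (av_usleep() in the code above) blocks the render thread for that long, so keeping the value close to one frame duration matters.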


-
Loss of frame order when saving AVFrames in a threaded environment
10 June 2015, by Orphid
Context: I'm using ffmpeg's API to extract frames from video files for test purposes. I've had a lot of issues getting it to work, and several rewrites. This time I've added tests at every stage to make sure I can isolate where errors are occurring.
Problem: I want to set up a producer-consumer relationship - a FrameProcessor class has a thread which pushes decoded frames into a queue, which a test client consumes. After adding this in, I suddenly found that my frames were being saved out of order, but I can't see why this is happening, because everything should be locked and there's only one producer and one consumer. Here is the section of code that adds frames to the queue:
void FrameProcessor::_do_processing()
{
    AVPacket packet;
    _is_processing = true;
    Logger::Info("Starting to buffer frames on a separate thread.");
    while(!_stop_processing_requested && !_video->finished())
    {
        boost::mutex::scoped_lock lock(_mutex);
        if(!_video->get_next_packet(&packet))
        {
            Logger::Info("The video ran out of frames, so FrameProcessor producer thread has stopped.");
            lock.unlock();
            break;
        }
        Logger::Info("FrameProcessor producer thread reads a packet from the stream (pts = " + std::to_string(packet.pts) + ").");
        int indx = packet.stream_index;
        auto decoder_it = _decoder_map.find(indx);
        if(decoder_it == _decoder_map.end())
        {
            Logger::Debug("FrameProcessor discarded a packet from stream index " + std::to_string(indx) + " because the stream was not enabled in the FrameProcessor.");
            av_free_packet(&packet);
        }
        StreamDecoder* strm_dc = decoder_it->second;
        AVFrame* frm = strm_dc->decode_packet(&packet);
        if(frm != NULL)
        {
            Logger::Debug("Producer thread pushed a frame into the FrameProcessor queue (pts = " + std::to_string(frm->pts) + ", coded num = " + std::to_string(frm->coded_picture_number) + ").");
            _frames.push(new TestFrame(frm));
            _print_queue();
        }
        av_free_packet(&packet);
        lock.unlock();
        _wait_for_buffer_space();
    }
    Logger::Info("Finished buffering. Buffer stop requested? " + std::to_string(_stop_processing_requested));
    _is_processing = false;
}

The dequeue section of a test before adding producer thread:
while(v->get_next_packet(&packet))
{
    if(packet.stream_index == vi_first)
    {
        AVFrame* f = decoder.decode_packet(&packet);
        if(f != NULL)
        {
            TestFrame* tf = new TestFrame(f);
            Logger::Info("|MAIN| Dequeued frame with pts " + std::to_string(tf->av_frame->pts) + " and coded num " + std::to_string(tf->av_frame->coded_picture_number) + ".");
            int cdc_num = tf->av_frame->coded_picture_number;
            TestFrame* tf1 = formatter.format(*tf);
            Logger::Debug("|MAIN| Saving frame with coded id " + std::to_string(cdc_num) + " as frame" + std::to_string(i) + ".ppm");
            save_ppm_frame(tf1->av_frame, v1_ctx->width, v1_ctx->height, "tst3", i++);
            delete tf1;
        }
        if(i > 40)
            break;
    }
}

And the dequeue section of one of my tests after adding producer thread / FrameProcessor class:
while(!v->finished() && i < 40)
{
    tf = processor->next_frame();
    if(tf == NULL)
        continue;
    Logger::Info("|MAIN| Dequeued frame with pts " + std::to_string(tf->av_frame->pts) + " and coded num " + std::to_string(tf->av_frame->coded_picture_number) + ".");
    // formatting the frame somehow seems to cause the coded_picture_number to be zeroed, so get it now.
    int cdc_num = tf->av_frame->coded_picture_number;
    TestFrame* tf1 = formatter->format(*tf);
    Logger::Debug("|MAIN| Saving frame with coded id " + std::to_string(cdc_num) + " as frame" + std::to_string(i) + ".ppm");
    save_ppm_frame(tf1->av_frame, width, height, "tst4", i++);
    delete tf1;
}

The only two threads that should be in use are the main program thread and the producer thread. The really odd thing is that I can see from my logs that all the coded ids are coming out in the correct order, but when I check the actual ppm files, the images are clearly not (hopefully you can see the dot in these images jumping backwards and forwards, when it should be moving in a smooth curve):
To be clear, before adding the threading in, these frames were coming out correctly. Here's an example of the logging information from two runs, one without threading and one with (please be aware that coded_picture_number is not the same as display order - I'm just using the values to check that they match the earlier successful run). I grepped the logs to show just the consumer side of the logging:
Before adding producer thread & frame queue:
[2015-Jun-10 11:01:44.783523]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 0.
[2015-Jun-10 11:01:44.783859]: <debug> |MAIN| Saving frame with coded id 0 as frame0.ppm
[2015-Jun-10 11:01:44.784811]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 2.
[2015-Jun-10 11:01:44.784982]: <debug> |MAIN| Saving frame with coded id 2 as frame1.ppm
[2015-Jun-10 11:01:44.785910]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 3.
[2015-Jun-10 11:01:44.786075]: <debug> |MAIN| Saving frame with coded id 3 as frame2.ppm
[2015-Jun-10 11:01:44.786896]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 1.
[2015-Jun-10 11:01:44.787060]: <debug> |MAIN| Saving frame with coded id 1 as frame3.ppm
[2015-Jun-10 11:01:44.787967]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 5.
[2015-Jun-10 11:01:44.788145]: <debug> |MAIN| Saving frame with coded id 5 as frame4.ppm
[2015-Jun-10 11:01:44.789004]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 6.
[2015-Jun-10 11:01:44.789173]: <debug> |MAIN| Saving frame with coded id 6 as frame5.ppm
[2015-Jun-10 11:01:44.789946]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 4.
[2015-Jun-10 11:01:44.790124]: <debug> |MAIN| Saving frame with coded id 4 as frame6.ppm
[2015-Jun-10 11:01:44.790909]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 8.
[2015-Jun-10 11:01:44.791075]: <debug> |MAIN| Saving frame with coded id 8 as frame7.ppm
[2015-Jun-10 11:01:44.791895]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 9.
[2015-Jun-10 11:01:44.792060]: <debug> |MAIN| Saving frame with coded id 9 as frame8.ppm
[2015-Jun-10 11:01:44.793512]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 7.
[2015-Jun-10 11:01:44.793704]: <debug> |MAIN| Saving frame with coded id 7 as frame9.ppm
[2015-Jun-10 11:01:44.794566]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 11.
[2015-Jun-10 11:01:44.794733]: <debug> |MAIN| Saving frame with coded id 11 as frame10.ppm
[2015-Jun-10 11:01:44.795638]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 12.
[2015-Jun-10 11:01:44.795802]: <debug> |MAIN| Saving frame with coded id 12 as frame11.ppm
[2015-Jun-10 11:01:44.796633]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 10.
[2015-Jun-10 11:01:44.796797]: <debug> |MAIN| Saving frame with coded id 10 as frame12.ppm
[2015-Jun-10 11:01:44.797594]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 14.
[2015-Jun-10 11:01:44.797759]: <debug> |MAIN| Saving frame with coded id 14 as frame13.ppm
[2015-Jun-10 11:01:44.798524]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 15.
[2015-Jun-10 11:01:44.798666]: <debug> |MAIN| Saving frame with coded id 15 as frame14.ppm
[2015-Jun-10 11:01:44.799332]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 13.
[2015-Jun-10 11:01:44.799470]: <debug> |MAIN| Saving frame with coded id 13 as frame15.ppm
[2015-Jun-10 11:01:44.800176]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 17.
[2015-Jun-10 11:01:44.800312]: <debug> |MAIN| Saving frame with coded id 17 as frame16.ppm
[2015-Jun-10 11:01:44.801048]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 18.
[2015-Jun-10 11:01:44.801186]: <debug> |MAIN| Saving frame with coded id 18 as frame17.ppm
[2015-Jun-10 11:01:44.801842]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 16.
[2015-Jun-10 11:01:44.801977]: <debug> |MAIN| Saving frame with coded id 16 as frame18.ppm
[2015-Jun-10 11:01:44.802629]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 20.
[2015-Jun-10 11:01:44.802766]: <debug> |MAIN| Saving frame with coded id 20 as frame19.ppm
[2015-Jun-10 11:01:44.803427]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 21.
[2015-Jun-10 11:01:44.803564]: <debug> |MAIN| Saving frame with coded id 21 as frame20.ppm
[2015-Jun-10 11:01:44.804289]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 19.
[2015-Jun-10 11:01:44.804429]: <debug> |MAIN| Saving frame with coded id 19 as frame21.ppm
[2015-Jun-10 11:01:44.805131]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 23.
[2015-Jun-10 11:01:44.805266]: <debug> |MAIN| Saving frame with coded id 23 as frame22.ppm
[2015-Jun-10 11:01:44.805987]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 24.
[2015-Jun-10 11:01:44.806132]: <debug> |MAIN| Saving frame with coded id 24 as frame23.ppm
[2015-Jun-10 11:01:44.806792]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 22.
[2015-Jun-10 11:01:44.806929]: <debug> |MAIN| Saving frame with coded id 22 as frame24.ppm
[2015-Jun-10 11:01:44.807604]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 26.
[2015-Jun-10 11:01:44.807742]: <debug> |MAIN| Saving frame with coded id 26 as frame25.ppm
[2015-Jun-10 11:01:44.808411]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 27.
[2015-Jun-10 11:01:44.808550]: <debug> |MAIN| Saving frame with coded id 27 as frame26.ppm
[2015-Jun-10 11:01:44.809201]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 25.
[2015-Jun-10 11:01:44.809339]: <debug> |MAIN| Saving frame with coded id 25 as frame27.ppm
[2015-Jun-10 11:01:44.810079]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 29.
[2015-Jun-10 11:01:44.810243]: <debug> |MAIN| Saving frame with coded id 29 as frame28.ppm
[2015-Jun-10 11:01:44.810887]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 30.
[2015-Jun-10 11:01:44.811001]: <debug> |MAIN| Saving frame with coded id 30 as frame29.ppm
[2015-Jun-10 11:01:44.811672]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 28.
[2015-Jun-10 11:01:44.811785]: <debug> |MAIN| Saving frame with coded id 28 as frame30.ppm
[2015-Jun-10 11:01:44.812450]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 32.
[2015-Jun-10 11:01:44.812565]: <debug> |MAIN| Saving frame with coded id 32 as frame31.ppm
[2015-Jun-10 11:01:44.813262]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 33.
[2015-Jun-10 11:01:44.813380]: <debug> |MAIN| Saving frame with coded id 33 as frame32.ppm
[2015-Jun-10 11:01:44.813964]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 31.
[2015-Jun-10 11:01:44.814076]: <debug> |MAIN| Saving frame with coded id 31 as frame33.ppm
[2015-Jun-10 11:01:44.814664]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 35.
[2015-Jun-10 11:01:44.814778]: <debug> |MAIN| Saving frame with coded id 35 as frame34.ppm
[2015-Jun-10 11:01:44.815375]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 36.
[2015-Jun-10 11:01:44.815489]: <debug> |MAIN| Saving frame with coded id 36 as frame35.ppm
After adding producer thread & frame queue:
[2015-Jun-10 11:01:44.827119]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 0.
[2015-Jun-10 11:01:44.827459]: <debug> |MAIN| Saving frame with coded id 0 as frame0.ppm
[2015-Jun-10 11:01:44.829389]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 2.
[2015-Jun-10 11:01:44.829751]: <debug> |MAIN| Saving frame with coded id 2 as frame1.ppm
[2015-Jun-10 11:01:44.831429]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 3.
[2015-Jun-10 11:01:44.831758]: <debug> |MAIN| Saving frame with coded id 3 as frame2.ppm
[2015-Jun-10 11:01:44.833583]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 1.
[2015-Jun-10 11:01:44.833906]: <debug> |MAIN| Saving frame with coded id 1 as frame3.ppm
[2015-Jun-10 11:01:44.834585]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 5.
[2015-Jun-10 11:01:44.834691]: <debug> |MAIN| Saving frame with coded id 5 as frame4.ppm
[2015-Jun-10 11:01:44.835167]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 6.
[2015-Jun-10 11:01:44.835273]: <debug> |MAIN| Saving frame with coded id 6 as frame5.ppm
[2015-Jun-10 11:01:44.835753]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 4.
[2015-Jun-10 11:01:44.835857]: <debug> |MAIN| Saving frame with coded id 4 as frame6.ppm
[2015-Jun-10 11:01:44.836336]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 8.
[2015-Jun-10 11:01:44.836440]: <debug> |MAIN| Saving frame with coded id 8 as frame7.ppm
[2015-Jun-10 11:01:44.836897]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 9.
[2015-Jun-10 11:01:44.837002]: <debug> |MAIN| Saving frame with coded id 9 as frame8.ppm
[2015-Jun-10 11:01:44.837460]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 7.
[2015-Jun-10 11:01:44.837565]: <debug> |MAIN| Saving frame with coded id 7 as frame9.ppm
[2015-Jun-10 11:01:44.838080]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 11.
[2015-Jun-10 11:01:44.838200]: <debug> |MAIN| Saving frame with coded id 11 as frame10.ppm
[2015-Jun-10 11:01:44.838666]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 12.
[2015-Jun-10 11:01:44.838772]: <debug> |MAIN| Saving frame with coded id 12 as frame11.ppm
[2015-Jun-10 11:01:44.839322]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 10.
[2015-Jun-10 11:01:44.839428]: <debug> |MAIN| Saving frame with coded id 10 as frame12.ppm
[2015-Jun-10 11:01:44.839910]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 14.
[2015-Jun-10 11:01:44.840018]: <debug> |MAIN| Saving frame with coded id 14 as frame13.ppm
[2015-Jun-10 11:01:44.867898]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 15.
[2015-Jun-10 11:01:44.868013]: <debug> |MAIN| Saving frame with coded id 15 as frame14.ppm
[2015-Jun-10 11:01:44.868925]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 13.
[2015-Jun-10 11:01:44.869039]: <debug> |MAIN| Saving frame with coded id 13 as frame15.ppm
[2015-Jun-10 11:01:44.869938]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 17.
[2015-Jun-10 11:01:44.870051]: <debug> |MAIN| Saving frame with coded id 17 as frame16.ppm
[2015-Jun-10 11:01:44.871000]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 18.
[2015-Jun-10 11:01:44.871154]: <debug> |MAIN| Saving frame with coded id 18 as frame17.ppm
[2015-Jun-10 11:01:44.871792]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 16.
[2015-Jun-10 11:01:44.871907]: <debug> |MAIN| Saving frame with coded id 16 as frame18.ppm
[2015-Jun-10 11:01:44.872873]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 20.
[2015-Jun-10 11:01:44.872993]: <debug> |MAIN| Saving frame with coded id 20 as frame19.ppm
[2015-Jun-10 11:01:44.873606]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 21.
[2015-Jun-10 11:01:44.873719]: <debug> |MAIN| Saving frame with coded id 21 as frame20.ppm
[2015-Jun-10 11:01:44.874324]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 19.
[2015-Jun-10 11:01:44.874432]: <debug> |MAIN| Saving frame with coded id 19 as frame21.ppm
[2015-Jun-10 11:01:44.874931]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 23.
[2015-Jun-10 11:01:44.875046]: <debug> |MAIN| Saving frame with coded id 23 as frame22.ppm
[2015-Jun-10 11:01:44.875507]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 24.
[2015-Jun-10 11:01:44.875620]: <debug> |MAIN| Saving frame with coded id 24 as frame23.ppm
[2015-Jun-10 11:01:44.876087]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 22.
[2015-Jun-10 11:01:44.876190]: <debug> |MAIN| Saving frame with coded id 22 as frame24.ppm
[2015-Jun-10 11:01:44.876654]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 26.
[2015-Jun-10 11:01:44.876757]: <debug> |MAIN| Saving frame with coded id 26 as frame25.ppm
[2015-Jun-10 11:01:44.877242]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 27.
[2015-Jun-10 11:01:44.877345]: <debug> |MAIN| Saving frame with coded id 27 as frame26.ppm
[2015-Jun-10 11:01:44.877816]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 25.
[2015-Jun-10 11:01:44.877919]: <debug> |MAIN| Saving frame with coded id 25 as frame27.ppm
[2015-Jun-10 11:01:44.878453]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 29.
[2015-Jun-10 11:01:44.878558]: <debug> |MAIN| Saving frame with coded id 29 as frame28.ppm
[2015-Jun-10 11:01:44.879066]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 30.
[2015-Jun-10 11:01:44.879187]: <debug> |MAIN| Saving frame with coded id 30 as frame29.ppm
[2015-Jun-10 11:01:44.879676]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 28.
[2015-Jun-10 11:01:44.879790]: <debug> |MAIN| Saving frame with coded id 28 as frame30.ppm
[2015-Jun-10 11:01:44.908448]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 32.
[2015-Jun-10 11:01:44.908569]: <debug> |MAIN| Saving frame with coded id 32 as frame31.ppm
[2015-Jun-10 11:01:44.909642]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 33.
[2015-Jun-10 11:01:44.909756]: <debug> |MAIN| Saving frame with coded id 33 as frame32.ppm
[2015-Jun-10 11:01:44.910879]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 31.
[2015-Jun-10 11:01:44.910992]: <debug> |MAIN| Saving frame with coded id 31 as frame33.ppm
[2015-Jun-10 11:01:44.911988]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 35.
[2015-Jun-10 11:01:44.912147]: <debug> |MAIN| Saving frame with coded id 35 as frame34.ppm
[2015-Jun-10 11:01:44.913108]: <info> |MAIN| Dequeued frame with pts -9223372036854775808 and coded num 36.
[2015-Jun-10 11:01:44.913222]: <debug> |MAIN| Saving frame with coded id 36 as frame35.ppm
And, basically, I can't see the difference between what's being logged in the two tests - they are both saving the frames with the same coded_picture_numbers in the same order as far as I can see, but the images are different. If you select one frame from both runs (say frame28.ppm), you can see that the frames at the same index are just not the same images. I'm still very new to C++, so apologies if this is just a trivial general error.
EDIT:
I've used a workaround to solve this - by removing the whole producer-consumer model and having the FrameProcessor simply decode the next frame when requested. This isn't really an answer to what's going wrong here, but I understand that it's difficult for members to help with something for which it's just impractical to give sufficient code to reproduce.
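One pattern that would match what the logs show (the order looks right but the pixels do not) is every queued TestFrame aliasing a single frame buffer that the decoder keeps reusing; that is only a guess, since decode_packet() is not shown. A minimal sketch of taking an owned copy before queueing, assuming TestFrame takes ownership of the pointer it is given:

// Hypothetical sketch (not the original code): keep our own reference to the
// decoded picture so the decoder can recycle its internal frame afterwards.
AVFrame* frm = strm_dc->decode_packet(&packet);
if(frm != NULL)
{
    AVFrame* owned = av_frame_clone(frm);    // new AVFrame referencing the same data (copies if not ref-counted)
    if(owned != NULL)
        _frames.push(new TestFrame(owned));  // assumes TestFrame later releases it with av_frame_free()
}

av_frame_clone() falls back to allocating and copying the data when the source frame is not reference-counted, so the sketch is safe either way.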
-
Trying to sync audio/visual using FFMpeg and openAL
22 August 2013, by user1379811
Hi, I have been studying the dranger ffmpeg tutorial, which explains how to sync audio and video once you have the frames displayed and the audio playing, which is where I'm at.
Unfortunately, the tutorial is out of date (Stephen Dranger explained that to me himself) and it also uses SDL, which I'm not doing - this is for a BlackBerry 10 application.
I just cannot make the video frames display at the correct speed (they are just playing very fast), and I have been trying for over a week now - seriously!
I have 3 threads running - one to read from the stream into audio and video queues, and then two threads for audio and video.
If somebody could explain what's happening after scanning my relevant code, you would be a lifesaver.
The delay (what I pass to usleep(testDelay)) seems to keep going up (incrementing), which doesn't seem right to me.
count = 1;
MyApp* inst = worker->app;//(VideoUploadFacebook*)arg;
qDebug() << "\n start loadstream";
w = new QWaitCondition();
w2 = new QWaitCondition();
context = avformat_alloc_context();
inst->threadStarted = true;
cout << "start of decoding thread";
cout.flush();
av_register_all();
avcodec_register_all();
avformat_network_init();
av_log_set_callback(&log_callback);
AVInputFormat *pFormat;
//const char device[] = "/dev/video0";
const char formatName[] = "mp4";
cout << "2start of decoding thread";
cout.flush();
if (!(pFormat = av_find_input_format(formatName))) {
printf("can't find input format %s\n", formatName);
//return void*;
}
//open rtsp
if(avformat_open_input(&context, inst->capturedUrl.data(), pFormat,NULL) != 0){
// return ;
cout << "error opening of decoding thread: " << inst->capturedUrl.data();
cout.flush();
}
cout << "3start of decoding thread";
cout.flush();
// av_dump_format(context, 0, inst->capturedUrl.data(), 0);
/* if(avformat_find_stream_info(context,NULL) < 0){
return EXIT_FAILURE;
}
*/
//search video stream
for(int i = 0; i < context->nb_streams; i++){
if(context->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO)
inst->video_stream_index = i;
}
cout << "3z start of decoding thread";
cout.flush();
AVFormatContext* oc = avformat_alloc_context();
av_read_play(context);//play RTSP
AVDictionary *optionsDict = NULL;
ccontext = context->streams[inst->video_stream_index]->codec;
inst->audioc = context->streams[1]->codec;
cout << "4start of decoding thread";
cout.flush();
codec = avcodec_find_decoder(ccontext->codec_id);
ccontext->pix_fmt = PIX_FMT_YUV420P;
AVCodec* audio_codec = avcodec_find_decoder(inst->audioc->codec_id);
inst->packet = new AVPacket();
if (!audio_codec) {
cout << "audio codec not found\n"; //fflush( stdout );
exit(1);
}
if (avcodec_open2(inst->audioc, audio_codec, NULL) < 0) {
cout << "could not open codec\n"; //fflush( stdout );
exit(1);
}
if (avcodec_open2(ccontext, codec, &optionsDict) < 0) exit(1);
cout << "5start of decoding thread";
cout.flush();
inst->pic = avcodec_alloc_frame();
av_init_packet(inst->packet);
while(av_read_frame(context,inst->packet) >= 0 && &inst->keepGoing)
{
if(inst->packet->stream_index == 0){//packet is video
int check = 0;
// av_init_packet(inst->packet);
int result = avcodec_decode_video2(ccontext, inst->pic, &check, inst->packet);
if(check)
break;
}
}
inst->originalVideoWidth = inst->pic->width;
inst->originalVideoHeight = inst->pic->height;
float aspect = (float)inst->originalVideoHeight / (float)inst->originalVideoWidth;
inst->newVideoWidth = inst->originalVideoWidth;
int newHeight = (int)(inst->newVideoWidth * aspect);
inst->newVideoHeight = newHeight;//(int)inst->originalVideoHeight / inst->originalVideoWidth * inst->newVideoWidth;// = new height
int size = avpicture_get_size(PIX_FMT_YUV420P, inst->originalVideoWidth, inst->originalVideoHeight);
uint8_t* picture_buf = (uint8_t*)(av_malloc(size));
avpicture_fill((AVPicture *) inst->pic, picture_buf, PIX_FMT_YUV420P, inst->originalVideoWidth, inst->originalVideoHeight);
picrgb = avcodec_alloc_frame();
int size2 = avpicture_get_size(PIX_FMT_YUV420P, inst->newVideoWidth, inst->newVideoHeight);
uint8_t* picture_buf2 = (uint8_t*)(av_malloc(size2));
avpicture_fill((AVPicture *) picrgb, picture_buf2, PIX_FMT_YUV420P, inst->newVideoWidth, inst->newVideoHeight);
if(ccontext->pix_fmt != PIX_FMT_YUV420P)
{
std::cout << "fmt != 420!!!: " << ccontext->pix_fmt << std::endl;//
// return (EXIT_SUCCESS);//-1;
}
if (inst->createForeignWindow(inst->myForeignWindow->windowGroup(),
"HelloForeignWindowAppIDqq", 0,
0, inst->newVideoWidth,
inst->newVideoHeight)) {
} else {
qDebug() << "The ForeginWindow was not properly initialized";
}
inst->keepGoing = true;
inst->img_convert_ctx = sws_getContext(inst->originalVideoWidth, inst->originalVideoHeight, PIX_FMT_YUV420P, inst->newVideoWidth, inst->newVideoHeight,
PIX_FMT_YUV420P, SWS_BILINEAR, NULL, NULL, NULL);
is = (VideoState*)av_mallocz(sizeof(VideoState));
if (!is)
return NULL;
is->audioStream = 1;
is->audio_st = context->streams[1];
is->audio_buf_size = 0;
is->audio_buf_index = 0;
is->videoStream = 0;
is->video_st = context->streams[0];
is->frame_timer = (double)av_gettime() / 1000000.0;
is->frame_last_delay = 40e-3;
is->av_sync_type = DEFAULT_AV_SYNC_TYPE;
//av_strlcpy(is->filename, filename, sizeof(is->filename));
is->iformat = pFormat;
is->ytop = 0;
is->xleft = 0;
/* start video display */
is->pictq_mutex = new QMutex();
is->pictq_cond = new QWaitCondition();
is->subpq_mutex = new QMutex();
is->subpq_cond = new QWaitCondition();
is->video_current_pts_time = av_gettime();
packet_queue_init(&audioq);
packet_queue_init(&videoq);
is->audioq = audioq;
is->videoq = videoq;
AVPacket* packet2 = new AVPacket();
ccontext->get_buffer = our_get_buffer;
ccontext->release_buffer = our_release_buffer;
av_init_packet(packet2);
while(inst->keepGoing)
{
if(av_read_frame(context,packet2) < 0 && keepGoing)
{
printf("bufferframe Could not read a frame from stream.\n");
fflush( stdout );
}else {
if(packet2->stream_index == 0) {
packet_queue_put(&videoq, packet2);
} else if(packet2->stream_index == 1) {
packet_queue_put(&audioq, packet2);
} else {
av_free_packet(packet2);
}
if(!videoThreadStarted)
{
videoThreadStarted = true;
QThread* thread = new QThread;
videoThread = new VideoStreamWorker(this);
// Give QThread ownership of Worker Object
videoThread->moveToThread(thread);
connect(videoThread, SIGNAL(error(QString)), this, SLOT(errorHandler(QString)));
QObject::connect(videoThread, SIGNAL(refreshNeeded()), this, SLOT(refreshNeededSlot()));
connect(thread, SIGNAL(started()), videoThread, SLOT(doWork()));
connect(videoThread, SIGNAL(finished()), thread, SLOT(quit()));
connect(videoThread, SIGNAL(finished()), videoThread, SLOT(deleteLater()));
connect(thread, SIGNAL(finished()), thread, SLOT(deleteLater()));
thread->start();
}
if(!audioThreadStarted)
{
audioThreadStarted = true;
QThread* thread = new QThread;
AudioStreamWorker* videoThread = new AudioStreamWorker(this);
// Give QThread ownership of Worker Object
videoThread->moveToThread(thread);
// Connect videoThread error signal to this errorHandler SLOT.
connect(videoThread, SIGNAL(error(QString)), this, SLOT(errorHandler(QString)));
// Connects the thread’s started() signal to the process() slot in the videoThread, causing it to start.
connect(thread, SIGNAL(started()), videoThread, SLOT(doWork()));
connect(videoThread, SIGNAL(finished()), thread, SLOT(quit()));
connect(videoThread, SIGNAL(finished()), videoThread, SLOT(deleteLater()));
// Make sure the thread object is deleted after execution has finished.
connect(thread, SIGNAL(finished()), thread, SLOT(deleteLater()));
thread->start();
}
}
} //finished main loop
int MyApp::video_thread() {
//VideoState *is = (VideoState *)arg;
AVPacket pkt1, *packet = &pkt1;
int len1, frameFinished;
double pts;
pic = avcodec_alloc_frame();
for(;;) {
if(packet_queue_get(&videoq, packet, 1) < 0) {
// means we quit getting packets
break;
}
pts = 0;
global_video_pkt_pts2 = packet->pts;
// Decode video frame
len1 = avcodec_decode_video2(ccontext, pic, &frameFinished, packet);
if(packet->dts == AV_NOPTS_VALUE
&& pic->opaque && *(uint64_t*)pic->opaque != AV_NOPTS_VALUE) {
pts = *(uint64_t *)pic->opaque;
} else if(packet->dts != AV_NOPTS_VALUE) {
pts = packet->dts;
} else {
pts = 0;
}
pts *= av_q2d(is->video_st->time_base);
// Did we get a video frame?
if(frameFinished) {
pts = synchronize_video(is, pic, pts);
actualPts = pts;
refreshSlot();
}
av_free_packet(packet);
}
av_free(pic);
return 0;
}
int MyApp::audio_thread() {
//VideoState *is = (VideoState *)arg;
AVPacket pkt1, *packet = &pkt1;
int len1, frameFinished;
ALuint source;
ALenum format = 0;
// ALuint frequency;
ALenum alError;
ALint val2;
ALuint buffers[NUM_BUFFERS];
int dataSize;
ALCcontext *aContext;
ALCdevice *device;
if (!alutInit(NULL, NULL)) {
// printf(stderr, "init alut error\n");
}
device = alcOpenDevice(NULL);
if (device == NULL) {
// printf(stderr, "device error\n");
}
//Create a context
aContext = alcCreateContext(device, NULL);
alcMakeContextCurrent(aContext);
if(!(aContext)) {
printf("Could not create the OpenAL context!\n");
return 0;
}
alListener3f(AL_POSITION, 0.0f, 0.0f, 0.0f);
//ALenum alError;
if(alGetError() != AL_NO_ERROR) {
cout << "could not create buffers";
cout.flush();
fflush( stdout );
return 0;
}
alGenBuffers(NUM_BUFFERS, buffers);
alGenSources(1, &source);
if(alGetError() != AL_NO_ERROR) {
cout << "after Could not create buffers or the source.\n";
cout.flush( );
return 0;
}
int i;
int indexOfPacket;
double pts;
//double pts;
int n;
for(i = 0; i < NUM_BUFFERS; i++)
{
if(packet_queue_get(&audioq, packet, 1) < 0) {
// means we quit getting packets
break;
}
cout << "streamindex=audio \n";
cout.flush( );
//printf("before decode audio\n");
//fflush( stdout );
// AVPacket *packet = new AVPacket();//malloc(sizeof(AVPacket*));
AVFrame *decodedFrame = NULL;
int gotFrame = 0;
// AVFrame* decodedFrame;
if(!decodedFrame) {
if(!(decodedFrame = avcodec_alloc_frame())) {
cout << "Run out of memory, stop the streaming...\n";
fflush( stdout );
cout.flush();
return -2;
}
} else {
avcodec_get_frame_defaults(decodedFrame);
}
int len = avcodec_decode_audio4(audioc, decodedFrame, &gotFrame, packet);
if(len < 0) {
cout << "Error while decoding.\n";
cout.flush( );
return -3;
}
if(len < 0) {
/* if error, skip frame */
is->audio_pkt_size = 0;
//break;
}
is->audio_pkt_data += len;
is->audio_pkt_size -= len;
pts = is->audio_clock;
// *pts_ptr = pts;
n = 2 * is->audio_st->codec->channels;
is->audio_clock += (double)packet->size/
(double)(n * is->audio_st->codec->sample_rate);
if(gotFrame) {
cout << "got audio frame.\n";
cout.flush( );
// We have a buffer ready, send it
dataSize = av_samples_get_buffer_size(NULL, audioc->channels,
decodedFrame->nb_samples, audioc->sample_fmt, 1);
if(!format) {
if(audioc->sample_fmt == AV_SAMPLE_FMT_U8 ||
audioc->sample_fmt == AV_SAMPLE_FMT_U8P) {
if(audioc->channels == 1) {
format = AL_FORMAT_MONO8;
} else if(audioc->channels == 2) {
format = AL_FORMAT_STEREO8;
}
} else if(audioc->sample_fmt == AV_SAMPLE_FMT_S16 ||
audioc->sample_fmt == AV_SAMPLE_FMT_S16P) {
if(audioc->channels == 1) {
format = AL_FORMAT_MONO16;
} else if(audioc->channels == 2) {
format = AL_FORMAT_STEREO16;
}
}
if(!format) {
cout << "OpenAL can't open this format of sound.\n";
cout.flush( );
return -4;
}
}
printf("albufferdata audio b4.\n");
fflush( stdout );
alBufferData(buffers[i], format, *decodedFrame->data, dataSize, decodedFrame->sample_rate);
cout << "after albufferdata all buffers \n";
cout.flush( );
av_free_packet(packet);
//=av_free(packet);
av_free(decodedFrame);
if((alError = alGetError()) != AL_NO_ERROR) {
printf("Error while buffering.\n");
printAlError(alError);
return -6;
}
}
}
cout << "before quoe buffers \n";
cout.flush();
alSourceQueueBuffers(source, NUM_BUFFERS, buffers);
cout << "before play.\n";
cout.flush();
alSourcePlay(source);
cout << "after play.\n";
cout.flush();
if((alError = alGetError()) != AL_NO_ERROR) {
cout << "error strating stream.\n";
cout.flush();
printAlError(alError);
return 0;
}
// AVPacket *pkt = &is->audio_pkt;
while(keepGoing)
{
while(packet_queue_get(&audioq, packet, 1) >= 0) {
// means we quit getting packets
do {
alGetSourcei(source, AL_BUFFERS_PROCESSED, &val2);
usleep(SLEEP_BUFFERING);
} while(val2 <= 0);
if(alGetError() != AL_NO_ERROR)
{
fprintf(stderr, "Error gettingsource :(\n");
return 1;
}
while(val2--)
{
ALuint buffer;
alSourceUnqueueBuffers(source, 1, &buffer);
if(alGetError() != AL_NO_ERROR)
{
fprintf(stderr, "Error unqueue buffers :(\n");
// return 1;
}
AVFrame *decodedFrame = NULL;
int gotFrame = 0;
// AVFrame* decodedFrame;
if(!decodedFrame) {
if(!(decodedFrame = avcodec_alloc_frame())) {
cout << "Run out of memory, stop the streaming...\n";
//fflush( stdout );
cout.flush();
return -2;
}
} else {
avcodec_get_frame_defaults(decodedFrame);
}
int len = avcodec_decode_audio4(audioc, decodedFrame, &gotFrame, packet);
if(len < 0) {
cout << "Error while decoding.\n";
cout.flush( );
is->audio_pkt_size = 0;
return -3;
}
is->audio_pkt_data += len;
is->audio_pkt_size -= len;
if(packet->size <= 0) {
/* No data yet, get more frames */
//continue;
}
if(gotFrame) {
pts = is->audio_clock;
len = synchronize_audio(is, (int16_t *)is->audio_buf,
packet->size, pts);
is->audio_buf_size = packet->size;
pts = is->audio_clock;
// *pts_ptr = pts;
n = 2 * is->audio_st->codec->channels;
is->audio_clock += (double)packet->size /
(double)(n * is->audio_st->codec->sample_rate);
if(packet->pts != AV_NOPTS_VALUE) {
is->audio_clock = av_q2d(is->audio_st->time_base)*packet->pts;
}
len = av_samples_get_buffer_size(NULL, audioc->channels,
decodedFrame->nb_samples, audioc->sample_fmt, 1);
alBufferData(buffer, format, *decodedFrame->data, len, decodedFrame->sample_rate);
if(alGetError() != AL_NO_ERROR)
{
fprintf(stderr, "Error buffering :(\n");
return 1;
}
alSourceQueueBuffers(source, 1, &buffer);
if(alGetError() != AL_NO_ERROR)
{
fprintf(stderr, "Error queueing buffers :(\n");
return 1;
}
}
}
alGetSourcei(source, AL_SOURCE_STATE, &val2);
if(val2 != AL_PLAYING)
alSourcePlay(source);
}
//pic = avcodec_alloc_frame();
}
qDebug() << "end audiothread";
return 1;
}
void MyApp::refreshSlot()
{
if(true)
{
printf("got frame %d, %d\n", pic->width, ccontext->width);
fflush( stdout );
sws_scale(img_convert_ctx, (const uint8_t **)pic->data, pic->linesize,
0, originalVideoHeight, &picrgb->data[0], &picrgb->linesize[0]);
printf("rescaled frame %d, %d\n", newVideoWidth, newVideoHeight);
fflush( stdout );
//av_free_packet(packet);
//av_init_packet(packet);
qDebug() << "waking audio as video finished";
////mutex.unlock();
//mutex2.lock();
doingVideoFrame = false;
//doingAudioFrame = false;
////mutex2.unlock();
//mutex2.unlock();
//w2->wakeAll();
//w->wakeAll();
qDebug() << "now woke audio";
//pic = picrgb;
uint8_t *srcy = picrgb->data[0];
uint8_t *srcu = picrgb->data[1];
uint8_t *srcv = picrgb->data[2];
printf("got src yuv frame %d\n", &srcy);
fflush( stdout );
unsigned char *ptr = NULL;
screen_get_buffer_property_pv(mScreenPixelBuffer, SCREEN_PROPERTY_POINTER, (void**) &ptr);
unsigned char *y = ptr;
unsigned char *u = y + (newVideoHeight * mStride) ;
unsigned char *v = u + (newVideoHeight * mStride) / 4;
int i = 0;
printf("got buffer picrgbwidth= %d \n", newVideoWidth);
fflush( stdout );
for ( i = 0; i < newVideoHeight; i++)
{
int doff = i * mStride;
int soff = i * picrgb->linesize[0];
memcpy(&y[doff], &srcy[soff], newVideoWidth);
}
for ( i = 0; i < newVideoHeight / 2; i++)
{
int doff = i * mStride / 2;
int soff = i * picrgb->linesize[1];
memcpy(&u[doff], &srcu[soff], newVideoWidth / 2);
}
for ( i = 0; i < newVideoHeight / 2; i++)
{
int doff = i * mStride / 2;
int soff = i * picrgb->linesize[2];
memcpy(&v[doff], &srcv[soff], newVideoWidth / 2);
}
printf("before posttoscreen \n");
fflush( stdout );
video_refresh_timer();
qDebug() << "end refreshslot";
}
else
{
}
}
void MyApp::refreshNeededSlot2()
{
printf("blitting to buffer");
fflush(stdout);
screen_buffer_t screen_buffer;
screen_get_window_property_pv(mScreenWindow, SCREEN_PROPERTY_RENDER_BUFFERS, (void**) &screen_buffer);
int attribs[] = { SCREEN_BLIT_SOURCE_WIDTH, newVideoWidth, SCREEN_BLIT_SOURCE_HEIGHT, newVideoHeight, SCREEN_BLIT_END };
int res2 = screen_blit(mScreenCtx, screen_buffer, mScreenPixelBuffer, attribs);
printf("dirty rectangles");
fflush(stdout);
int dirty_rects[] = { 0, 0, newVideoWidth, newVideoHeight };
screen_post_window(mScreenWindow, screen_buffer, 1, dirty_rects, 0);
printf("done screneposdtwindow");
fflush(stdout);
}
void MyApp::video_refresh_timer() {
testDelay = 0;
// VideoState *is = ( VideoState* )userdata;
VideoPicture *vp;
//double pts = 0 ;
double actual_delay, delay, sync_threshold, ref_clock, diff;
if(is->video_st) {
if(false)////is->pictq_size == 0)
{
testDelay = 1;
schedule_refresh(is, 1);
} else {
// vp = &is->pictq[is->pictq_rindex];
delay = actualPts - is->frame_last_pts; /* the pts from last time */
if(delay <= 0 || delay >= 1.0) {
/* if incorrect delay, use previous one */
delay = is->frame_last_delay;
}
/* save for next time */
is->frame_last_delay = delay;
is->frame_last_pts = actualPts;
is->video_current_pts = actualPts;
is->video_current_pts_time = av_gettime();
/* update delay to sync to audio */
ref_clock = get_audio_clock(is);
diff = actualPts - ref_clock;
/* Skip or repeat the frame. Take delay into account
FFPlay still doesn't "know if this is the best guess." */
sync_threshold = (delay > AV_SYNC_THRESHOLD) ? delay : AV_SYNC_THRESHOLD;
if(fabs(diff) < AV_NOSYNC_THRESHOLD) {
if(diff <= -sync_threshold) {
delay = 0;
} else if(diff >= sync_threshold) {
delay = 2 * delay;
}
}
is->frame_timer += delay;
/* computer the REAL delay */
actual_delay = is->frame_timer - (av_gettime() / 1000000.0);
if(actual_delay < 0.010) {
/* Really it should skip the picture instead */
actual_delay = 0.010;
}
testDelay = (int)(actual_delay * 1000 + 0.5);
schedule_refresh(is, (int)(actual_delay * 1000 + 0.5));
/* show the picture! */
//video_display(is);
// SDL_CondSignal(is->pictq_cond);
// SDL_UnlockMutex(is->pictq_mutex);
}
} else {
testDelay = 100;
schedule_refresh(is, 100);
}
}
void MyApp::schedule_refresh(VideoState *is, int delay) {
qDebug() << "start schedule refresh timer" << delay;
typeOfEvent = FF_REFRESH_EVENT2;
w->wakeAll();
// SDL_AddTimer(delay,
}

I am currently waiting on data in a loop in the following way:
QMutex mutex;
mutex.lock();
while(keepGoing)
{
qDebug() << "MAINTHREAD" << testDelay;
w->wait(&mutex);
mutex.unlock();
qDebug() << "MAINTHREAD past wait";
if(!keepGoing)
{
break;
}
if(testDelay > 0 && typeOfEvent == FF_REFRESH_EVENT2)
{
usleep(testDelay);
refreshNeededSlot2();
}
else if(testDelay > 0 && typeOfEvent == FF_QUIT_EVENT2)
{
keepGoing = false;
exit(0);
break;
// usleep(testDelay);
// refreshNeededSlot2();
}
qDebug() << "MAINTHREADend";
mutex.lock();
}
mutex.unlock();

Please let me know if I need to provide any more relevant code. I'm sorry my code is untidy - I'm still learning C++ and have been modifying this code for over a week now, as previously mentioned.
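As a side note, video_refresh_timer() above relies on get_audio_clock(is), which isn't included in the snippets. For reference, in the dranger tutorial that this code is based on, it looks roughly like this - the key point being that the estimate subtracts whatever audio is still buffered but not yet audible (with OpenAL, that would mean accounting for the buffers still queued on the source):

// From the dranger tutorial (for reference, not the poster's code): estimate the
// audible position as the decoded audio clock minus the data still in the buffer.
double get_audio_clock(VideoState *is) {
    double pts = is->audio_clock;                          // maintained by the audio thread
    int hw_buf_size = is->audio_buf_size - is->audio_buf_index;
    int n = is->audio_st->codec->channels * 2;             // bytes per sample set (S16)
    int bytes_per_sec = is->audio_st->codec->sample_rate * n;
    if (bytes_per_sec)
        pts -= (double)hw_buf_size / bytes_per_sec;
    return pts;
}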
Just added a sample of the output I'm seeing from the print-outs I do to the console - I can't get my head around it (it's almost too complicated for my level of expertise), but when you see the frames being played and the audio playing, it's very difficult to give up, especially when it took me a couple of weeks to get to this stage.
Please, someone give me a hand if you spot the problem.
MAINTHREAD past wait
pts after syncvideo= 1073394046
got frame 640, 640
start video_refresh_timer
actualpts = 1.66833
frame lastpts = 1.63497
start schedule refresh timer need to delay for 123
pts after syncvideo= 1073429033
got frame 640, 640
MAINTHREAD loop delay before refresh = 123
start video_refresh_timer
actualpts = 1.7017
frame lastpts = 1.66833
start schedule refresh timer need to delay for 115
MAINTHREAD past wait
pts after syncvideo= 1073464021
got frame 640, 640
start video_refresh_timer
actualpts = 1.73507
frame lastpts = 1.7017
start schedule refresh timer need to delay for 140
MAINTHREAD loop delay before refresh = 140
pts after syncvideo= 1073499008
got frame 640, 640
start video_refresh_timer
actualpts = 1.76843
frame lastpts = 1.73507
start schedule refresh timer need to delay for 163
MAINTHREAD past wait
pts after syncvideo= 1073533996
got frame 640, 640
start video_refresh_timer
actualpts = 1.8018
frame lastpts = 1.76843
start schedule refresh timer need to delay for 188
MAINTHREAD loop delay before refresh = 188
pts after syncvideo= 1073568983
got frame 640, 640
start video_refresh_timer
actualpts = 1.83517
frame lastpts = 1.8018
start schedule refresh timer need to delay for 246
MAINTHREAD past wait
pts after syncvideo= 1073603971
got frame 640, 640
start video_refresh_timer
actualpts = 1.86853
frame lastpts = 1.83517
start schedule refresh timer need to delay for 299
MAINTHREAD loop delay before refresh = 299
pts after syncvideo= 1073638958
got frame 640, 640
start video_refresh_timer
actualpts = 1.9019
frame lastpts = 1.86853
start schedule refresh timer need to delay for 358
MAINTHREAD past wait
pts after syncvideo= 1073673946
got frame 640, 640
start video_refresh_timer
actualpts = 1.93527
frame lastpts = 1.9019
start schedule refresh timer need to delay for 416
MAINTHREAD loop delay before refresh = 416
pts after syncvideo= 1073708933
got frame 640, 640
start video_refresh_timer
actualpts = 1.96863
frame lastpts = 1.93527
start schedule refresh timer need to delay for 474
MAINTHREAD past wait
pts after syncvideo= 1073742872
got frame 640, 640
MAINTHREAD loop delay before refresh = 474
start video_refresh_timer
actualpts = 2.002
frame lastpts = 1.96863
start schedule refresh timer need to delay for 518
MAINTHREAD past wait
pts after syncvideo= 1073760366
got frame 640, 640
start video_refresh_timer
actualpts = 2.03537
frame lastpts = 2.002
start schedule refresh timer need to delay for 575
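Finally, one detail that might be worth double-checking (an observation, not a confirmed diagnosis): POSIX usleep() takes microseconds, but testDelay is computed in video_refresh_timer() as (int)(actual_delay * 1000 + 0.5), i.e. milliseconds. If the main loop sleeps for that many microseconds, every wait is about a thousand times too short, which would make the frames play far too fast and would also let frame_timer run ahead of av_gettime(), so the printed delay keeps climbing, as in the output above (123, 140, 163, ... 575). A minimal sketch of the conversion, reusing the names from the wait loop:

// Sketch only: testDelay is in milliseconds, usleep() wants microseconds.
if(testDelay > 0 && typeOfEvent == FF_REFRESH_EVENT2)
{
    usleep(static_cast<useconds_t>(testDelay) * 1000); // ms -> us
    refreshNeededSlot2();
}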