Advanced search

Media (1)

Keyword: - Tags -/biographie

Other articles (33)

  • Support for all types of media

    10 April 2011

    Unlike many programs and other modern document-sharing platforms, MediaSPIP aims to handle as many different document formats as possible, whether of type: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); textual content, code or other (OpenOffice, Microsoft Office (spreadsheets, presentations), web (HTML, CSS), LaTeX, Google Earth) (...)

  • Requesting the creation of a channel

    12 March 2010

    Depending on how the platform is configured, the user may have two different ways of requesting the creation of a channel. The first is at sign-up time; the second, after signing up, by filling in a request form.
    Both methods ask for the same information and work in much the same way: the prospective user must fill in a series of form fields, which first of all give the administrators information about (...)

  • Submit enhancements and plugins

    13 April 2011

    If you have developed a new extension that adds one or more useful features to MediaSPIP, let us know and its integration into the MediaSPIP core will be considered.
    You can use the development discussion list to ask for help with creating a plugin. As MediaSPIP is based on SPIP, you can also use the SPIP discussion list, SPIP-Zone.

On other sites (3189)

  • Getting an error while executing native Android code

    3 April 2013, by dilipkaklotar

    Error on my console

    bash: cannot set terminal process group (-1): Inappropriate ioctl for device
    bash: no job control in this shell
    Your group is currently "mkpasswd". This indicates that your
    gid is not in /etc/group and your uid is not in /etc/passwd.

    The /etc/passwd (and possibly /etc/group) files should be rebuilt.
    See the man pages for mkpasswd and mkgroup then, for example, run

    mkpasswd -l [-d] > /etc/passwd
    mkgroup -l [-d] > /etc/group

    Note that the -d switch is necessary for domain users.
    DILIP@DILIP-PC
    $

    public class VideoBrowser extends ListActivity implements ListView.OnScrollListener {

    /*this part communicates with native code through jni (java native interface)*/
    //load the native library
    static {
       System.loadLibrary("ffmpeg");
       System.loadLibrary("ffmpeg-test-jni");
    }
    //declare the jni functions
    private static native void naInit(String _videoFileName);
    private static native int[] naGetVideoResolution();
    private static native String naGetVideoCodecName();
    private static native String naGetVideoFormatName();
    private static native void naClose();

    private void showVideoInfo(final File _file) {
       String videoFilename = _file.getAbsolutePath();
       naInit(videoFilename);
       int[] prVideoRes = naGetVideoResolution();
       String prVideoCodecName = naGetVideoCodecName();
       String prVideoFormatName = naGetVideoFormatName();
       naClose();
       String displayText = "Video: " + videoFilename + "\n";
       displayText += "Video Resolution: " + prVideoRes[0] + "x" + prVideoRes[1] + "\n";
       displayText += "Video Codec: " + prVideoCodecName + "\n";
       displayText += "Video Format: " + prVideoFormatName + "\n";
       text_titlebar_text.setText(displayText);
    }


    /*the rest of the file deals with UI and other stuff*/
    private Context mContext;
    public static VideoBrowser self;

    /**
    * activity life cycle: this part of the source code deals with activity life cycle
    */
    @Override
    public void onCreate(Bundle icicle) {
       super.onCreate(icicle);
       mContext = this.getApplicationContext();
       self = this;
       initUI();
    }

    @Override
    protected void onDestroy() {
       super.onDestroy();
       unbindDisplayEntries();
    }

    public void unbindDisplayEntries() {
       if (displayEntries!=null) {
           int l_count = displayEntries.size();
           for (int i = 0; i < l_count; ++i) {
               IconifiedTextSelected l_its = displayEntries.get(i);
               if (l_its != null) {
                   Drawable l_dr = l_its.getIcon();
                   if (l_dr != null) {
                       l_dr.setCallback(null);
                       l_dr = null;
                   }
               }
           }
       }
       if (l_displayEntries!=null) {
           int l_count = l_displayEntries.size();
           for (int i = 0; i < l_count; ++i) {
               IconifiedTextSelected l_its = l_displayEntries.get(i);
               if (l_its != null) {
                   Drawable l_dr = l_its.getIcon();
                   if (l_dr != null) {
                       l_dr.setCallback(null);
                       l_dr = null;
                   }
               }
           }
       }
    }

    /**
    * Data: this part of the code deals with data processing
    */
     public List<IconifiedTextSelected> displayEntries = new ArrayList<IconifiedTextSelected>();
     public static List<IconifiedTextSelected> l_displayEntries = new ArrayList<IconifiedTextSelected>();

     /**load videos
     * this part of the code deals with loading of videos
     */
    private File currentDirectory;
    public int media_browser_load_option = 2;
    private static int last_media_browser_load_option = 2;
    private static int number_of_icons = 0;
    private static final String upOneLevel = "..";

    LoadVideoTask loadTask;
    private void loadVideosFromDirectory(String _dir) {
       try {
           loadTask = new LoadVideoTask();
           loadTask.execute(_dir);
       } catch (Exception e) {
           Toast.makeText(this, "Load media fail!", Toast.LENGTH_SHORT).show();
       }
    }

    private void getVideosFromDirectoryNonRecurAddParent(File _dir) {
       //add the upper one level data
       if (_dir.getParent()!=null) {
           this.displayEntries.add(new IconifiedTextSelected(
                   upOneLevel,
                   getResources().getDrawable(R.drawable.folderback),
                   false, false, 0));
       }
    }

    private void getVideosFromDirectoryNonRecur(File _dir) {
       Drawable folderIcon = this.getResources().getDrawable(R.drawable.normalfolder);
        //add the files and folders found in this directory
       if (!_dir.isDirectory()) {
           return;
       }
       File[] files = _dir.listFiles();
       if (files == null) {
           return;
       }
       Drawable videoIcon = null;
       int l_iconType = 0;
       for (File currentFile : files) {
           if (currentFile.isDirectory()) {
                //if it's a directory
               this.displayEntries.add(new IconifiedTextSelected(
                       currentFile.getPath(),
                       folderIcon, false, false, 0));
           } else {
               String l_filename = currentFile.getName();
               if (checkEndsWithInStringArray(l_filename,
                           getResources().getStringArray(R.array.fileEndingVideo))) {
                    if (number_of_icons < 10) {
                       videoIcon = null;
                       ++number_of_icons;
                       l_iconType = 22;
                   } else {
                       videoIcon = null;
                       l_iconType = 2;
                   }
                   this.displayEntries.add(new IconifiedTextSelected(
                           currentFile.getPath(),
                           videoIcon, false, false, l_iconType));
               }
           }
       }
    }

    private void getVideosFromDirectoryRecur(File _dir) {
       Drawable videoIcon = null;
       File[] files = _dir.listFiles();
       int l_iconType = 2;
       if (files == null) {
           return;
       }
       for (File currentFile : files) {
           if (currentFile.isDirectory()) {
               getVideosFromDirectoryRecur(currentFile);
               continue;
           } else {
               String l_filename = currentFile.getName();
                //if it's a video file
               if (checkEndsWithInStringArray(l_filename,
                       getResources().getStringArray(R.array.fileEndingVideo))) {
                    if (number_of_icons < 10) {
                       videoIcon = null;
                       ++number_of_icons;
                       l_iconType = 22;
                   } else {
                       videoIcon = null;
                       l_iconType = 2;
                   }
                   this.displayEntries.add(new IconifiedTextSelected(
                           currentFile.getPath(),
                           videoIcon, false, false, l_iconType));
               }
           }
       }
    }

    private void getVideosFromGallery() {
       Drawable videoIcon = null;
       Uri uri = MediaStore.Video.Media.EXTERNAL_CONTENT_URI;
       String[] projection = {MediaStore.Video.Media.DATA};
       Cursor l_cursor = this.managedQuery(uri, projection, null, null, null);
       int videoNameColumnIndex;
       String videoFilename;
       File videoFile;
       int l_iconType = 2;
       if (l_cursor!=null) {
           if (l_cursor.moveToFirst()) {
               do {
                   videoNameColumnIndex = l_cursor.getColumnIndexOrThrow(
                           MediaStore.Images.Media.DATA);
                   videoFilename = l_cursor.getString(videoNameColumnIndex);
                   videoFile = new File(videoFilename);
                   if (!videoFile.exists()) {
                       continue;
                   }
                    if (number_of_icons <= 10) {
                       videoIcon = null;
                       ++number_of_icons;
                       l_iconType = 22;
                   } else {
                       videoIcon = null;
                       l_iconType = 2;
                   }
                   this.displayEntries.add(new IconifiedTextSelected(
                           videoFile.getAbsolutePath(),
                           videoIcon, false, false, l_iconType));
               } while (l_cursor.moveToNext());
           }
       }
       if (l_cursor!=null) {
           l_cursor.close();
       }
    }

    private boolean checkEndsWithInStringArray(String checkItsEnd,
           String[] fileEndings){
       for(String aEnd : fileEndings){
           if(checkItsEnd.endsWith(aEnd))
               return true;
       }
       return false;
    }
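
     The checkEndsWithInStringArray helper above can be exercised outside Android. A minimal standalone sketch (the class and method names here are illustrative, not part of the original activity), with the common refinement of matching case-insensitively so that a file named "CLIP.MP4" is not skipped:

     ```java
     import java.util.Locale;

     // Standalone sketch of the suffix test used above to filter video files,
     // made case-insensitive by lowercasing both sides before comparing.
     public class SuffixMatch {
         public static boolean endsWithAny(String name, String[] endings) {
             String lower = name.toLowerCase(Locale.ROOT);
             for (String ending : endings) {
                 if (lower.endsWith(ending.toLowerCase(Locale.ROOT))) {
                     return true;
                 }
             }
             return false;
         }
     }
     ```

     The original version is case-sensitive; whether that matters depends on where the scanned files come from.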

     private class LoadVideoTask extends AsyncTask<String, Void, Void> {
       @Override
       protected void onPreExecute() {
           System.gc();
           displayEntries.clear();
           showDialog(DIALOG_LOAD_MEDIA);
       }
       @Override
       protected Void doInBackground(String... params) {
           File l_root = new File(params[0]);
           if (l_root.isDirectory()) {
               number_of_icons = 0;
               currentDirectory = l_root;
               if (media_browser_load_option == 0) {
                   //list all videos in the root directory without going into sub folder
                   getVideosFromDirectoryNonRecurAddParent(l_root);
                   getVideosFromDirectoryNonRecur(l_root);
               } else if (media_browser_load_option == 1) {
                   //list all videos in the root folder recursively
                   getVideosFromDirectoryRecur(l_root);
               } else if (media_browser_load_option == 2) {
                   //list all videos in the gallery
                   getVideosFromGallery();
               }
           }
           return null;
       }
       @Override
       protected void onPostExecute(Void n) {
           refreshUI();
           dismissDialog(DIALOG_LOAD_MEDIA);
       }
    }

    /**
    * UI: this part of the source code deals with UI
    */
    //bottom menu
    private int currentFocusedBtn = 1;
    private Button btn_bottommenu1;
    private Button btn_bottommenu2;
    private Button btn_bottommenu3;
    //private Button btn_bottommenu4;
    //title bar
    private TextView text_titlebar_text;

    private void initUI() {
       this.requestWindowFeature(Window.FEATURE_NO_TITLE);
       this.setContentView(R.layout.video_browser);
       //title bar
       text_titlebar_text = (TextView) findViewById(R.id.titlebar_text);
       text_titlebar_text.setText("Click a video to display info");

       //bottom menu
       int l_btnWidth = this.getWindowManager().getDefaultDisplay().getWidth()/4;
       btn_bottommenu1 = (Button) findViewById(R.id.video_browser_btn1);
       //btn_bottommenu1 = (ActionMenuButton) findViewById(R.id.main_topsecretimport_btn1);
       btn_bottommenu1.setWidth(l_btnWidth);
       btn_bottommenu1.setOnClickListener(new View.OnClickListener() {
           public void onClick(View v) {
               btn_bottommenu1.setEnabled(false);
               btn_bottommenu2.setEnabled(true);
               btn_bottommenu3.setEnabled(true);
               currentFocusedBtn = 1;
               last_list_view_pos = 0;
               media_browser_load_option = 2;
               last_media_browser_load_option = media_browser_load_option;
               loadVideosFromDirectory("/sdcard/");
           }
       });
       btn_bottommenu2 = (Button) findViewById(R.id.video_browser_btn2);
       btn_bottommenu2.setWidth(l_btnWidth);
       btn_bottommenu2.setOnClickListener(new View.OnClickListener() {
           public void onClick(View v) {
               btn_bottommenu1.setEnabled(true);
               btn_bottommenu2.setEnabled(false);
               btn_bottommenu3.setEnabled(true);
               currentFocusedBtn = 2;
               last_list_view_pos = 0;
               media_browser_load_option = 0;
               last_media_browser_load_option = media_browser_load_option;
               loadVideosFromDirectory("/sdcard/");
           }
       });
       btn_bottommenu3 = (Button) findViewById(R.id.video_browser_btn3);
       btn_bottommenu3.setWidth(l_btnWidth);
       btn_bottommenu3.setOnClickListener(new View.OnClickListener() {
           public void onClick(View v) {
               btn_bottommenu1.setEnabled(true);
               btn_bottommenu2.setEnabled(true);
               btn_bottommenu3.setEnabled(false);
               currentFocusedBtn = 3;
               last_list_view_pos = 0;
               media_browser_load_option = 1;
               last_media_browser_load_option = media_browser_load_option;
               loadVideosFromDirectory("/sdcard/");
           }
       });
       media_browser_load_option = last_media_browser_load_option;
       if (media_browser_load_option==2) {
           btn_bottommenu1.setEnabled(false);
       } else if (media_browser_load_option==0) {
           btn_bottommenu2.setEnabled(false);
       } else if (media_browser_load_option==1){
           btn_bottommenu3.setEnabled(false);
       }
       loadVideosFromDirectory("/sdcard/");
    }
    //refresh the UI when the directoryEntries changes
    private static int last_list_view_pos = 0;
    public void refreshUI() {
       int l_btnWidth = this.getWindowManager().getDefaultDisplay().getWidth()/4;
       btn_bottommenu1.setWidth(l_btnWidth);
       btn_bottommenu2.setWidth(l_btnWidth);
       btn_bottommenu3.setWidth(l_btnWidth);
       //btn_bottommenu4.setWidth(l_btnWidth);

       SlowAdapter itla = new SlowAdapter(this);
       itla.setListItems(this.displayEntries);    
       this.setListAdapter(itla);
       getListView().setOnScrollListener(this);
       int l_size = this.displayEntries.size();
       if (l_size > 50) {
           getListView().setFastScrollEnabled(true);
       } else {
           getListView().setFastScrollEnabled(false);
       }
       if (l_size > 0) {
            if (last_list_view_pos < l_size) {
               getListView().setSelection(last_list_view_pos);
           } else {
               getListView().setSelection(l_size-1);
           }
       }
       registerForContextMenu(getListView());
    }

    @Override
    public void onConfigurationChanged (Configuration newConfig) {
       super.onConfigurationChanged(newConfig);
       refreshUI();
    }

    static final int DIALOG_LOAD_MEDIA = 1;
    static final int DIALOG_HELP = 2;
    @Override
    protected Dialog onCreateDialog(int id) {
       switch(id) {
       case DIALOG_LOAD_MEDIA:
           ProgressDialog dialog = new ProgressDialog(this);
           dialog.setTitle("Load Files");
           dialog.setMessage("Please wait while loading...");
           dialog.setIndeterminate(true);
           dialog.setCancelable(true);
           return dialog;
       default:
           return null;
       }
    }
    /**
    * scroll events methods: this part of the source code contain the control source code
    * for handling scroll events
    */
    private boolean mBusy = false;
    private void disableButtons() {
       btn_bottommenu1.setEnabled(false);
       btn_bottommenu2.setEnabled(false);
       btn_bottommenu3.setEnabled(false);
    }

    private void enableButtons() {
       if (currentFocusedBtn!=1) {
           btn_bottommenu1.setEnabled(true);
       }
       if (currentFocusedBtn!=2) {
           btn_bottommenu2.setEnabled(true);
       }
       if (currentFocusedBtn!=3) {
           btn_bottommenu3.setEnabled(true);
       }
    }
    public void onScroll(AbsListView view, int firstVisibleItem,
           int visibleItemCount, int totalItemCount) {
       last_list_view_pos = view.getFirstVisiblePosition();
    }

    //private boolean mSaveMemory = false;
    public void onScrollStateChanged(AbsListView view, int scrollState) {      
       switch (scrollState) {
       case OnScrollListener.SCROLL_STATE_IDLE:
           enableButtons();
           mBusy = false;
           int first = view.getFirstVisiblePosition();
           int count = view.getChildCount();
           int l_releaseTarget;
            for (int i = 0; i < count; ++i) {
                IconifiedTextSelectedView t = (IconifiedTextSelectedView) view.getChildAt(i);
                //only views tagged in getView() still have an unloaded icon
                if (t != null && t.getTag() != null) {
                    int l_type = 2;
                    Drawable l_icon = null;
                    try {
                        //load the icon for this entry
                    } catch (OutOfMemoryError e) {
                        //if outofmemory, we try to clean up 10 view image resources,
                        //and try again
                        for (int j = 0; j < 10; ++j) {
                           l_releaseTarget = first - count - j;
                           if (l_releaseTarget > 0) {
                               IconifiedTextSelected l_its = displayEntries.get(l_releaseTarget);
                               IconifiedTextSelectedView l_itsv = (IconifiedTextSelectedView)
                                   this.getListView().getChildAt(l_releaseTarget);
                               if (l_itsv!=null) {
                                   l_itsv.setIcon(null);
                               }
                               if (l_its != null) {
                                   Drawable l_dr = l_its.getIcon();
                                   l_its.setIcon(null);
                                   if (l_dr != null) {
                                       l_dr.setCallback(null);
                                       l_dr = null;
                                   }
                               }
                           }
                       }
                       System.gc();
                       //after clean up, we try again
                       if (l_type == 1) {
                           l_icon = null;
                       } else if (l_type == 2) {
                           l_icon = null;
                       }
                   }
                   this.displayEntries.get(first+i).setIcon(l_icon);
                   if (l_icon != null) {
                       t.setIcon(l_icon);
                       t.setTag(null);
                   }
               }
           }
           //System.gc();
           break;
       case OnScrollListener.SCROLL_STATE_TOUCH_SCROLL:
           disableButtons();
           mBusy = true;
           break;
       case OnScrollListener.SCROLL_STATE_FLING:
           disableButtons();
           mBusy = true;
           break;
       }
    }

    /**
    * List item click action
    */
    private File currentFile;
    @Override
    protected void onListItemClick(ListView l, View v, int position, long id) {
       super.onListItemClick(l, v, position, id);
       last_list_view_pos = position;
       String selectedFileOrDirName = this.displayEntries.get((int)id).getText();
       if (selectedFileOrDirName.equals(upOneLevel)) {
           if (this.currentDirectory.getParent()!=null) {
               last_list_view_pos = 0;
               browseTo(this.currentDirectory.getParentFile());
           }
       } else {
           File l_clickedFile = new File(this.displayEntries.get((int)id).getText());
           if (l_clickedFile != null) {
               if (l_clickedFile.isDirectory()) {
                   last_list_view_pos = 0;
                   browseTo(l_clickedFile);
               } else {
                   showVideoInfo(l_clickedFile);
               }
           }
       }
    }

    private void browseTo(final File _dir) {
       if (_dir.isDirectory()) {
           this.currentDirectory = _dir;
           loadVideosFromDirectory(_dir.getAbsolutePath());
       }
    }

    /**
    * Slow adapter: this part of the code implements the list adapter
    * Will not bind views while the list is scrolling
    */
    private class SlowAdapter extends BaseAdapter {
       /** Remember our context so we can use it when constructing views. */
       private Context mContext;

        private List<IconifiedTextSelected> mItems = new ArrayList<IconifiedTextSelected>();

       public SlowAdapter(Context context) {
           mContext = context;
       }

        public void setListItems(List<IconifiedTextSelected> lit)
       { mItems = lit; }

        /** @return The number of items in the list. */
       public int getCount() { return mItems.size(); }

       public Object getItem(int position)
       { return mItems.get(position); }

       /** Use the array index as a unique id. */
       public long getItemId(int position) {
           return position;
       }

        /** @param convertView The old view to overwrite, if one is passed
         * @return an IconifiedTextSelectedView that wraps an IconifiedText */
       public View getView(int position, View convertView, ViewGroup parent) {
           IconifiedTextSelectedView btv;
           if (convertView == null) {
               btv = new IconifiedTextSelectedView(mContext, mItems.get(position));
           } else { // Reuse/Overwrite the View passed
               // We are assuming(!) that it is castable!
               btv = (IconifiedTextSelectedView) convertView;
               btv.setText(mItems.get(position).getText());
           }
           if (position==0) {
               if (VideoBrowser.self.media_browser_load_option==0) {
                   btv.setIcon(R.drawable.folderback);
               } else if (mItems.get(0).getIcon()!=null) {
                   btv.setIcon(mItems.get(position).getIcon());
               } else {
                   btv.setIcon(R.drawable.video);
               }
           }
           //in busy mode
           else if (mBusy){
               //if icon is NULL: the icon is not loaded yet; load default icon
               if (mItems.get(position).getIcon()==null) {
                   btv.setIcon(R.drawable.video);
                   //mark this view, indicates the icon is not loaded
                   btv.setTag(this);
               } else {
                   //if icon is not null, just display the icon
                   btv.setIcon(mItems.get(position).getIcon());
                   //mark this view, indicates the icon is loaded
                   btv.setTag(null);
               }
           } else {
               //if not busy
               Drawable d = mItems.get(position).getIcon();
               if (d == null) {
                   //icon is not loaded, load now
                   btv.setIcon(R.drawable.video);
                   btv.setTag(this);
               } else {
                   btv.setIcon(mItems.get(position).getIcon());
                   btv.setTag(null);
               }
           }
           return btv;
       }
    }

    }

  • FFMPEG - Read video file and convert to Bitmap

    11 April 2013, by user1573610

    I can successfully read a video file using ffmpeg.

    Now I want to show these video frames in my MFC MDI.

    For that I need a bitmap to feed to the CBitmap::FromHandle() function:

    memDC.CreateCompatibleDC(dc);

    CBitmap * bmp = CBitmap::FromHandle();

    CBitmap * oldBmp = memDC.SelectObject(bmp);

    dc->BitBlt(0,0,320,240,&memDC,0,0,SRCCOPY);

    For ffmpeg I am using dranger tutorial 01:

    http://dranger.com/ffmpeg/tutorial01.html

    Please advise how to convert the frames to a bitmap.

    Thanks

  • Trying to sync audio/visual using FFMpeg and openAL

    22 August 2013, by user1379811

    Hi, I have been studying the dranger ffmpeg tutorial, which explains how to sync audio and video once you have frames displaying and audio playing, which is where I'm at.

    Unfortunately, the tutorial is out of date (Stephen Dranger explained that to me himself), and it also uses SDL, which I'm not using - this is for a Blackberry 10 application.

    I just cannot make the video frames display at the correct speed (they just play very fast), and I have been trying for over a week now - seriously!

    I have 3 threads running - one to read from the stream into the audio and video queues, and then 2 threads for audio and video.

    If somebody could explain what's happening after scanning my relevant code, you would be a lifesaver.

    The delay (what I pass to usleep(testDelay)) seems to keep going up (incrementing), which doesn't seem right to me.
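
    For reference, the dranger tutorial derives each frame's wait from the gap between consecutive PTS values, re-anchored to a running wall-clock timer, so scheduling error does not accumulate. A minimal sketch of that bookkeeping (the class, field, and method names are mine, not from the tutorial or the code below); for a constant frame rate the computed sleep stays roughly constant instead of growing:

    ```java
    // Sketch of PTS-based frame timing in the style of the dranger tutorial:
    // the wait is the timestamp gap between frames, measured against a wall
    // clock that records when the next frame is actually due.
    public class FrameTimer {
        private double lastPts = 0.0;
        private double lastDelay = 40e-3; // fallback: ~25 fps
        private double frameTimer;        // wall-clock time the next frame is due

        public FrameTimer(double nowSeconds) {
            frameTimer = nowSeconds;
        }

        /** Returns how long to sleep before showing the frame with this pts. */
        public double actualDelay(double pts, double nowSeconds) {
            double delay = pts - lastPts;
            if (delay <= 0 || delay >= 1.0) {
                delay = lastDelay;        // unusable PTS gap: reuse the last one
            }
            lastDelay = delay;
            lastPts = pts;
            frameTimer += delay;          // when this frame should appear
            double wait = frameTimer - nowSeconds;
            return Math.max(wait, 0.010); // clamp: never sleep less than 10 ms
        }
    }
    ```

    With timestamps arriving every 40 ms, the returned wait stays at 40 ms rather than incrementing, which is the behaviour the raw accumulated delay above is missing.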

    count = 1;
       MyApp* inst = worker->app;//(VideoUploadFacebook*)arg;
       qDebug() << "\n start loadstream";
       w = new QWaitCondition();
       w2 = new QWaitCondition();
       context = avformat_alloc_context();
       inst->threadStarted = true;
       cout &lt;&lt; "start of decoding thread";
       cout.flush();


       av_register_all();
       avcodec_register_all();
       avformat_network_init();
       av_log_set_callback(&log_callback);
       AVInputFormat   *pFormat;
       //const char      device[]     = "/dev/video0";
       const char      formatName[] = "mp4";
       cout &lt;&lt; "2start of decoding thread";
       cout.flush();



       if (!(pFormat = av_find_input_format(formatName))) {
           printf("can&#39;t find input format %s\n", formatName);
           //return void*;
       }
       //open rtsp
       if(avformat_open_input(&context, inst->capturedUrl.data(), pFormat,NULL) != 0){
           // return ;
           cout << "error opening of decoding thread: " << inst->capturedUrl.data();
           cout.flush();
       }

       cout &lt;&lt; "3start of decoding thread";
       cout.flush();
       // av_dump_format(context, 0, inst->capturedUrl.data(), 0);
       /*   if(avformat_find_stream_info(context,NULL) < 0){
           return EXIT_FAILURE;
       }
        */
       //search video stream
       for(int i = 0; i < context->nb_streams; i++){
           if(context->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO)
               inst->video_stream_index = i;
       }
       cout &lt;&lt; "3z start of decoding thread";
       cout.flush();
       AVFormatContext* oc = avformat_alloc_context();
       av_read_play(context);//play RTSP
       AVDictionary *optionsDict = NULL;
       ccontext = context->streams[inst->video_stream_index]->codec;

       inst->audioc = context->streams[1]->codec;

       cout &lt;&lt; "4start of decoding thread";
       cout.flush();
       codec = avcodec_find_decoder(ccontext->codec_id);
       ccontext->pix_fmt = PIX_FMT_YUV420P;

       AVCodec* audio_codec = avcodec_find_decoder(inst->audioc->codec_id);
       inst->packet = new AVPacket();
       if (!audio_codec) {
           cout &lt;&lt; "audio codec not found\n"; //fflush( stdout );
           exit(1);
       }

       if (avcodec_open2(inst->audioc, audio_codec, NULL) < 0) {
           cout << "could not open codec\n"; //fflush( stdout );
           exit(1);
       }

       if (avcodec_open2(ccontext, codec, &optionsDict) < 0) exit(1);

       cout &lt;&lt; "5start of decoding thread";
       cout.flush();
       inst->pic = avcodec_alloc_frame();

       av_init_packet(inst->packet);

       while(av_read_frame(context,inst->packet) >= 0 && inst->keepGoing)
       {

           if(inst->packet->stream_index == 0){//packet is video

               int check = 0;



               // av_init_packet(inst->packet);
               int result = avcodec_decode_video2(ccontext, inst->pic, &check, inst->packet);

               if(check)
                   break;
           }
       }



       inst->originalVideoWidth = inst->pic->width;
       inst->originalVideoHeight = inst->pic->height;
       float aspect = (float)inst->originalVideoHeight / (float)inst->originalVideoWidth;
       inst->newVideoWidth = inst->originalVideoWidth;
       int newHeight = (int)(inst->newVideoWidth * aspect);
       inst->newVideoHeight = newHeight;//(int)inst->originalVideoHeight / inst->originalVideoWidth * inst->newVideoWidth;// = new height
       int size = avpicture_get_size(PIX_FMT_YUV420P, inst->originalVideoWidth, inst->originalVideoHeight);
       uint8_t* picture_buf = (uint8_t*)(av_malloc(size));
       avpicture_fill((AVPicture *) inst->pic, picture_buf, PIX_FMT_YUV420P, inst->originalVideoWidth, inst->originalVideoHeight);

       picrgb = avcodec_alloc_frame();
       int size2 = avpicture_get_size(PIX_FMT_YUV420P, inst->newVideoWidth, inst->newVideoHeight);
       uint8_t* picture_buf2 = (uint8_t*)(av_malloc(size2));
       avpicture_fill((AVPicture *) picrgb, picture_buf2, PIX_FMT_YUV420P, inst->newVideoWidth, inst->newVideoHeight);



       if(ccontext->pix_fmt != PIX_FMT_YUV420P)
       {
           std::cout &lt;&lt; "fmt != 420!!!: " &lt;&lt; ccontext->pix_fmt &lt;&lt; std::endl;//
           // return (EXIT_SUCCESS);//-1;

       }


       if (inst->createForeignWindow(inst->myForeignWindow->windowGroup(),
               "HelloForeignWindowAppIDqq", 0,
               0, inst->newVideoWidth,
               inst->newVideoHeight)) {

       } else {
           qDebug() &lt;&lt; "The ForeginWindow was not properly initialized";
       }




       inst->keepGoing = true;

       inst->img_convert_ctx = sws_getContext(inst->originalVideoWidth, inst->originalVideoHeight, PIX_FMT_YUV420P, inst->newVideoWidth, inst->newVideoHeight,
               PIX_FMT_YUV420P, SWS_BILINEAR, NULL, NULL, NULL);

       is = (VideoState*)av_mallocz(sizeof(VideoState));
       if (!is)
           return NULL;

       is->audioStream = 1;
       is->audio_st = context->streams[1];
       is->audio_buf_size = 0;
       is->audio_buf_index = 0;
       is->videoStream = 0;
       is->video_st = context->streams[0];

       is->frame_timer = (double)av_gettime() / 1000000.0;
       is->frame_last_delay = 40e-3;

       is->av_sync_type = DEFAULT_AV_SYNC_TYPE;
       //av_strlcpy(is->filename, filename, sizeof(is->filename));
       is->iformat = pFormat;
       is->ytop    = 0;
       is->xleft   = 0;

       /* start video display */
       is->pictq_mutex = new QMutex();
       is->pictq_cond  = new QWaitCondition();

       is->subpq_mutex = new QMutex();
       is->subpq_cond  = new QWaitCondition();

       is->video_current_pts_time = av_gettime();


       packet_queue_init(&audioq);

       packet_queue_init(&videoq);
       is->audioq = audioq;
       is->videoq = videoq;
       AVPacket* packet2  = new AVPacket();

       ccontext->get_buffer = our_get_buffer;
       ccontext->release_buffer = our_release_buffer;


       av_init_packet(packet2);
       while(inst->keepGoing)
       {


           if(av_read_frame(context,packet2) &lt; 0 &amp;&amp; keepGoing)
           {
               printf("bufferframe Could not read a frame from stream.\n");
               fflush( stdout );


           }else {



               if(packet2->stream_index == 0) {
                   packet_queue_put(&amp;videoq, packet2);
               } else if(packet2->stream_index == 1) {
                   packet_queue_put(&amp;audioq, packet2);
               } else {
                   av_free_packet(packet2);
               }


               if(!videoThreadStarted)
               {
                   videoThreadStarted = true;
                   QThread* thread = new QThread;
                   videoThread = new VideoStreamWorker(this);

                   // Give QThread ownership of Worker Object
                   videoThread->moveToThread(thread);
                   connect(videoThread, SIGNAL(error(QString)), this, SLOT(errorHandler(QString)));
                   QObject::connect(videoThread, SIGNAL(refreshNeeded()), this, SLOT(refreshNeededSlot()));
                   connect(thread, SIGNAL(started()), videoThread, SLOT(doWork()));
                   connect(videoThread, SIGNAL(finished()), thread, SLOT(quit()));
                   connect(videoThread, SIGNAL(finished()), videoThread, SLOT(deleteLater()));
                   connect(thread, SIGNAL(finished()), thread, SLOT(deleteLater()));

                   thread->start();
               }

               if(!audioThreadStarted)
               {
                   audioThreadStarted = true;
                   QThread* thread = new QThread;
                    AudioStreamWorker* audioThread = new AudioStreamWorker(this);

                    // Give QThread ownership of Worker Object
                    audioThread->moveToThread(thread);

                    // Connect audioThread's error signal to this errorHandler SLOT.
                    connect(audioThread, SIGNAL(error(QString)), this, SLOT(errorHandler(QString)));

                    // Connect the thread's started() signal to the doWork() slot in audioThread, causing it to start.
                    connect(thread, SIGNAL(started()), audioThread, SLOT(doWork()));
                    connect(audioThread, SIGNAL(finished()), thread, SLOT(quit()));
                    connect(audioThread, SIGNAL(finished()), audioThread, SLOT(deleteLater()));

                   // Make sure the thread object is deleted after execution has finished.
                   connect(thread, SIGNAL(finished()), thread, SLOT(deleteLater()));

                   thread->start();
               }

           }

        } //finished main loop
        return NULL;
     }

     int MyApp::video_thread() {
       //VideoState *is = (VideoState *)arg;
       AVPacket pkt1, *packet = &amp;pkt1;
       int len1, frameFinished;

       double pts;
       pic = avcodec_alloc_frame();

       for(;;) {
           if(packet_queue_get(&amp;videoq, packet, 1) &lt; 0) {
               // means we quit getting packets
               break;
           }

           pts = 0;

           global_video_pkt_pts2 = packet->pts;
           // Decode video frame
           len1 =  avcodec_decode_video2(ccontext, pic, &amp;frameFinished, packet);
           if(packet->dts == AV_NOPTS_VALUE
                   &amp;&amp; pic->opaque &amp;&amp; *(uint64_t*)pic->opaque != AV_NOPTS_VALUE) {
               pts = *(uint64_t *)pic->opaque;
           } else if(packet->dts != AV_NOPTS_VALUE) {
               pts = packet->dts;
           } else {
               pts = 0;
           }
           pts *= av_q2d(is->video_st->time_base);
           // Did we get a video frame?

            if(frameFinished) {
                pts = synchronize_video(is, pic, pts);
                actualPts = pts;
                refreshSlot();
            }
            av_free_packet(packet);
       }
       av_free(pic);
       return 0;
    }
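Both worker threads drain queues that the demux loop fills through packet_queue_put()/packet_queue_get(). A minimal standalone sketch of such a thread-safe queue (PacketQueue is a hypothetical name; it uses the C++ standard library instead of FFmpeg's AVPacketList, but guards the queue with the same mutex/condition-variable pattern):

```cpp
#include <queue>
#include <mutex>
#include <condition_variable>

// Hypothetical stand-in for the AVPacket queue behind videoq/audioq:
// put() is called by the demux loop, get() by the decoder threads.
template <typename T>
class PacketQueue {
public:
    void put(T pkt) {
        std::lock_guard<std::mutex> lock(m_);
        q_.push(std::move(pkt));
        cv_.notify_one();            // wake one waiting consumer
    }
    // Blocks until a packet arrives or abort() is called.
    // Returns false only on abort, mirroring packet_queue_get() < 0.
    bool get(T& out) {
        std::unique_lock<std::mutex> lock(m_);
        cv_.wait(lock, [this] { return abort_ || !q_.empty(); });
        if (abort_ && q_.empty()) return false;
        out = std::move(q_.front());
        q_.pop();
        return true;
    }
    void abort() {
        std::lock_guard<std::mutex> lock(m_);
        abort_ = true;
        cv_.notify_all();            // release every blocked consumer
    }
private:
    std::queue<T> q_;
    std::mutex m_;
    std::condition_variable cv_;
    bool abort_ = false;
};
```

abort() plays the role of the keepGoing flag here: it wakes every blocked consumer so the decoder threads can exit cleanly instead of sleeping forever on an empty queue.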


    int MyApp::audio_thread() {
       //VideoState *is = (VideoState *)arg;
       AVPacket pkt1, *packet = &amp;pkt1;
       int len1, frameFinished;
       ALuint source;
       ALenum format = 0;
       //   ALuint frequency;
       ALenum alError;
       ALint val2;
       ALuint buffers[NUM_BUFFERS];
       int dataSize;


       ALCcontext *aContext;
       ALCdevice *device;
        if (!alutInit(NULL, NULL)) {
            fprintf(stderr, "init alut error\n");
        }
        device = alcOpenDevice(NULL);
        if (device == NULL) {
            fprintf(stderr, "device error\n");
        }

        //Create a context (check it before trying to make it current)
        aContext = alcCreateContext(device, NULL);
        if(!aContext) {
            printf("Could not create the OpenAL context!\n");
            return 0;
        }
        alcMakeContextCurrent(aContext);

        alListener3f(AL_POSITION, 0.0f, 0.0f, 0.0f);

        //ALenum alError;
       if(alGetError() != AL_NO_ERROR) {
           cout &lt;&lt; "could not create buffers";
           cout.flush();
           fflush( stdout );
           return 0;
       }
       alGenBuffers(NUM_BUFFERS, buffers);
       alGenSources(1, &amp;source);
       if(alGetError() != AL_NO_ERROR) {
           cout &lt;&lt; "after Could not create buffers or the source.\n";
           cout.flush(  );
           return 0;
       }

       int i;
       int indexOfPacket;
       double pts;
       //double pts;
       int n;


       for(i = 0; i &lt; NUM_BUFFERS; i++)
       {
           if(packet_queue_get(&amp;audioq, packet, 1) &lt; 0) {
               // means we quit getting packets
               break;
           }
           cout &lt;&lt; "streamindex=audio \n";
           cout.flush(  );
           //printf("before decode  audio\n");
           //fflush( stdout );
            AVFrame *decodedFrame = avcodec_alloc_frame();
            int gotFrame = 0;
            if(!decodedFrame) {
                cout << "Run out of memory, stop the streaming...\n";
                cout.flush();
                return -2;
            }

            int len = avcodec_decode_audio4(audioc, decodedFrame, &gotFrame, packet);
            if(len < 0) {
                /* decode error */
                cout << "Error while decoding.\n";
                cout.flush();
                is->audio_pkt_size = 0;
                return -3;
            }
            is->audio_pkt_data += len;
            is->audio_pkt_size -= len;

           pts = is->audio_clock;
           // *pts_ptr = pts;
           n = 2 * is->audio_st->codec->channels;
           is->audio_clock += (double)packet->size/
                   (double)(n * is->audio_st->codec->sample_rate);
           if(gotFrame) {
               cout &lt;&lt; "got audio frame.\n";
               cout.flush(  );
               // We have a buffer ready, send it
               dataSize = av_samples_get_buffer_size(NULL, audioc->channels,
                       decodedFrame->nb_samples, audioc->sample_fmt, 1);

               if(!format) {
                   if(audioc->sample_fmt == AV_SAMPLE_FMT_U8 ||
                           audioc->sample_fmt == AV_SAMPLE_FMT_U8P) {
                       if(audioc->channels == 1) {
                           format = AL_FORMAT_MONO8;
                       } else if(audioc->channels == 2) {
                           format = AL_FORMAT_STEREO8;
                       }
                   } else if(audioc->sample_fmt == AV_SAMPLE_FMT_S16 ||
                           audioc->sample_fmt == AV_SAMPLE_FMT_S16P) {
                       if(audioc->channels == 1) {
                           format = AL_FORMAT_MONO16;
                       } else if(audioc->channels == 2) {
                           format = AL_FORMAT_STEREO16;
                       }
                   }

                   if(!format) {
                       cout &lt;&lt; "OpenAL can&#39;t open this format of sound.\n";
                       cout.flush(  );

                       return -4;
                   }
               }
               printf("albufferdata audio b4.\n");
               fflush( stdout );
               alBufferData(buffers[i], format, *decodedFrame->data, dataSize, decodedFrame->sample_rate);
               cout &lt;&lt; "after albufferdata all buffers \n";
               cout.flush(  );
               av_free_packet(packet);
               //=av_free(packet);
               av_free(decodedFrame);

               if((alError = alGetError()) != AL_NO_ERROR) {
                   printf("Error while buffering.\n");

                   printAlError(alError);
                   return -6;
               }
           }
       }


       cout &lt;&lt; "before quoe buffers \n";
       cout.flush();
       alSourceQueueBuffers(source, NUM_BUFFERS, buffers);
       cout &lt;&lt; "before play.\n";
       cout.flush();
       alSourcePlay(source);
       cout &lt;&lt; "after play.\n";
       cout.flush();
       if((alError = alGetError()) != AL_NO_ERROR) {
           cout &lt;&lt; "error strating stream.\n";
           cout.flush();
           printAlError(alError);
           return 0;
       }


       // AVPacket *pkt = &amp;is->audio_pkt;

       while(keepGoing)
       {
           while(packet_queue_get(&amp;audioq, packet, 1)  >= 0) {
               // means we quit getting packets

               do {
                   alGetSourcei(source, AL_BUFFERS_PROCESSED, &amp;val2);
                   usleep(SLEEP_BUFFERING);
               } while(val2 &lt;= 0);
               if(alGetError() != AL_NO_ERROR)
               {
                   fprintf(stderr, "Error gettingsource :(\n");
                   return 1;
               }

               while(val2--)
               {



                   ALuint buffer;
                   alSourceUnqueueBuffers(source, 1, &amp;buffer);
                   if(alGetError() != AL_NO_ERROR)
                   {
                       fprintf(stderr, "Error unqueue buffers :(\n");
                       //  return 1;
                   }
                    AVFrame *decodedFrame = avcodec_alloc_frame();
                    int gotFrame = 0;
                    if(!decodedFrame) {
                        cout << "Run out of memory, stop the streaming...\n";
                        cout.flush();
                        return -2;
                    }

                   int  len = avcodec_decode_audio4(audioc, decodedFrame, &amp;gotFrame, packet);
                   if(len &lt; 0) {
                       cout &lt;&lt; "Error while decoding.\n";
                       cout.flush(  );
                       is->audio_pkt_size = 0;
                       return -3;
                   }

                   is->audio_pkt_data += len;
                   is->audio_pkt_size -= len;
                   if(packet->size &lt;= 0) {
                       /* No data yet, get more frames */
                       //continue;
                   }


                   if(gotFrame) {
                       pts = is->audio_clock;
                       len = synchronize_audio(is, (int16_t *)is->audio_buf,
                               packet->size, pts);
                       is->audio_buf_size = packet->size;
                       pts = is->audio_clock;
                       // *pts_ptr = pts;
                       n = 2 * is->audio_st->codec->channels;
                       is->audio_clock += (double)packet->size /
                               (double)(n * is->audio_st->codec->sample_rate);
                       if(packet->pts != AV_NOPTS_VALUE) {
                           is->audio_clock = av_q2d(is->audio_st->time_base)*packet->pts;
                       }
                       len = av_samples_get_buffer_size(NULL, audioc->channels,
                               decodedFrame->nb_samples, audioc->sample_fmt, 1);
                       alBufferData(buffer, format, *decodedFrame->data, len, decodedFrame->sample_rate);
                       if(alGetError() != AL_NO_ERROR)
                       {
                           fprintf(stderr, "Error buffering :(\n");
                           return 1;
                       }
                       alSourceQueueBuffers(source, 1, &amp;buffer);
                       if(alGetError() != AL_NO_ERROR)
                       {
                           fprintf(stderr, "Error queueing buffers :(\n");
                           return 1;
                       }
                    }
                }

               alGetSourcei(source, AL_SOURCE_STATE, &amp;val2);
               if(val2 != AL_PLAYING)
                   alSourcePlay(source);

           }


           //pic = avcodec_alloc_frame();
       }
       qDebug() &lt;&lt; "end audiothread";
       return 1;
    }

    void MyApp::refreshSlot()
    {


       if(true)
       {

           printf("got frame %d, %d\n", pic->width, ccontext->width);
           fflush( stdout );

           sws_scale(img_convert_ctx, (const uint8_t **)pic->data, pic->linesize,
                   0, originalVideoHeight, &amp;picrgb->data[0], &amp;picrgb->linesize[0]);

           printf("rescaled frame %d, %d\n", newVideoWidth, newVideoHeight);
           fflush( stdout );
           //av_free_packet(packet);
           //av_init_packet(packet);

           qDebug() &lt;&lt; "waking audio as video finished";
           ////mutex.unlock();
           //mutex2.lock();
           doingVideoFrame = false;
           //doingAudioFrame = false;
           ////mutex2.unlock();


           //mutex2.unlock();
           //w2->wakeAll();
           //w->wakeAll();
           qDebug() &lt;&lt; "now woke audio";

           //pic = picrgb;
           uint8_t *srcy = picrgb->data[0];
           uint8_t *srcu = picrgb->data[1];
           uint8_t *srcv = picrgb->data[2];
           printf("got src yuv frame %d\n", &amp;srcy);
           fflush( stdout );
           unsigned char *ptr = NULL;
           screen_get_buffer_property_pv(mScreenPixelBuffer, SCREEN_PROPERTY_POINTER, (void**) &amp;ptr);
           unsigned char *y = ptr;
           unsigned char *u = y + (newVideoHeight * mStride) ;
           unsigned char *v = u + (newVideoHeight * mStride) / 4;
           int i = 0;
           printf("got buffer  picrgbwidth= %d \n", newVideoWidth);
           fflush( stdout );
           for ( i = 0; i &lt; newVideoHeight; i++)
           {
               int doff = i * mStride;
               int soff = i * picrgb->linesize[0];
               memcpy(&amp;y[doff], &amp;srcy[soff], newVideoWidth);
           }

           for ( i = 0; i &lt; newVideoHeight / 2; i++)
           {
               int doff = i * mStride / 2;
               int soff = i * picrgb->linesize[1];
               memcpy(&amp;u[doff], &amp;srcu[soff], newVideoWidth / 2);
           }

           for ( i = 0; i &lt; newVideoHeight / 2; i++)
           {
               int doff = i * mStride / 2;
               int soff = i * picrgb->linesize[2];
               memcpy(&amp;v[doff], &amp;srcv[soff], newVideoWidth / 2);
           }
           printf("before posttoscreen \n");
           fflush( stdout );

           video_refresh_timer();
           qDebug() &lt;&lt; "end refreshslot";

       }
        else
        {
        }
     }

    void  MyApp::refreshNeededSlot2()
       {
           printf("blitting to buffer");
           fflush(stdout);

           screen_buffer_t screen_buffer;
           screen_get_window_property_pv(mScreenWindow, SCREEN_PROPERTY_RENDER_BUFFERS, (void**) &amp;screen_buffer);
           int attribs[] = { SCREEN_BLIT_SOURCE_WIDTH, newVideoWidth, SCREEN_BLIT_SOURCE_HEIGHT, newVideoHeight, SCREEN_BLIT_END };
           int res2 = screen_blit(mScreenCtx, screen_buffer, mScreenPixelBuffer, attribs);
           printf("dirty rectangles");
           fflush(stdout);
           int dirty_rects[] = { 0, 0, newVideoWidth, newVideoHeight };
           screen_post_window(mScreenWindow, screen_buffer, 1, dirty_rects, 0);
           printf("done screneposdtwindow");
           fflush(stdout);

       }

    void MyApp::video_refresh_timer() {
       testDelay = 0;
       //  VideoState *is = ( VideoState* )userdata;
       VideoPicture *vp;
       //double pts = 0    ;
       double actual_delay, delay, sync_threshold, ref_clock, diff;

       if(is->video_st) {
            if(false) // was: is->pictq_size == 0
           {
               testDelay = 1;
               schedule_refresh(is, 1);
           } else {
               // vp = &amp;is->pictq[is->pictq_rindex];

               delay = actualPts - is->frame_last_pts; /* the pts from last time */
               if(delay &lt;= 0 || delay >= 1.0) {
                   /* if incorrect delay, use previous one */
                   delay = is->frame_last_delay;
               }
               /* save for next time */
               is->frame_last_delay = delay;
               is->frame_last_pts = actualPts;

               is->video_current_pts = actualPts;
               is->video_current_pts_time = av_gettime();
               /* update delay to sync to audio */
               ref_clock = get_audio_clock(is);
               diff = actualPts - ref_clock;

                /* Skip or repeat the frame, taking delay into account.
                   FFPlay still doesn't "know if this is the best guess." */
               sync_threshold = (delay > AV_SYNC_THRESHOLD) ? delay : AV_SYNC_THRESHOLD;
               if(fabs(diff) &lt; AV_NOSYNC_THRESHOLD) {
                   if(diff &lt;= -sync_threshold) {
                       delay = 0;
                   } else if(diff >= sync_threshold) {
                       delay = 2 * delay;
                   }
               }
               is->frame_timer += delay;
               /* computer the REAL delay */
               actual_delay = is->frame_timer - (av_gettime() / 1000000.0);
               if(actual_delay &lt; 0.010) {
                   /* Really it should skip the picture instead */
                   actual_delay = 0.010;
               }
               testDelay = (int)(actual_delay * 1000 + 0.5);
               schedule_refresh(is, (int)(actual_delay * 1000 + 0.5));
               /* show the picture! */
               //video_display(is);


               // SDL_CondSignal(is->pictq_cond);
               // SDL_UnlockMutex(is->pictq_mutex);
           }
       } else {
           testDelay = 100;
           schedule_refresh(is, 100);

       }
    }
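The skip/repeat branch above reduces to a small pure function. This sketch (compute_delay is a hypothetical name, with the AV_SYNC_THRESHOLD and AV_NOSYNC_THRESHOLD constants inlined as the usual 10 ms / 10 s values) reproduces just that decision, leaving out the frame_timer wall-clock adjustment and the 10 ms floor that follow it:

```cpp
#include <cmath>

// Given the current and previous frame PTS plus the audio clock (all in
// seconds), compute the delay before the next refresh, as the branch
// logic in video_refresh_timer does.
double compute_delay(double pts, double last_pts, double last_delay,
                     double audio_clock) {
    const double AV_SYNC_THRESHOLD_S = 0.01;
    const double AV_NOSYNC_THRESHOLD_S = 10.0;

    double delay = pts - last_pts;
    if (delay <= 0 || delay >= 1.0)
        delay = last_delay;                 // bad delta: reuse the previous delay

    double diff = pts - audio_clock;        // video ahead (+) or behind (-) audio
    double sync_threshold = (delay > AV_SYNC_THRESHOLD_S) ? delay
                                                          : AV_SYNC_THRESHOLD_S;
    if (std::fabs(diff) < AV_NOSYNC_THRESHOLD_S) {
        if (diff <= -sync_threshold)
            delay = 0;                      // behind audio: show frame ASAP
        else if (diff >= sync_threshold)
            delay = 2 * delay;              // ahead of audio: hold frame longer
    }
    return delay;
}
```

In the real function the result is then accumulated into frame_timer, compared against av_gettime(), and clamped to a 10 ms minimum before being handed to schedule_refresh().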

    void MyApp::schedule_refresh(VideoState *is, int delay) {
       qDebug() &lt;&lt; "start schedule refresh timer" &lt;&lt; delay;
       typeOfEvent = FF_REFRESH_EVENT2;
       w->wakeAll();
       //  SDL_AddTimer(delay,


    }

     I am currently waiting on data in a loop in the following way:

    QMutex mutex;
       mutex.lock();
       while(keepGoing)
       {



           qDebug() &lt;&lt; "MAINTHREAD" &lt;&lt; testDelay;


           w->wait(&amp;mutex);
           mutex.unlock();
           qDebug() &lt;&lt; "MAINTHREAD past wait";

           if(!keepGoing)
           {
               break;
           }
           if(testDelay > 0 &amp;&amp; typeOfEvent == FF_REFRESH_EVENT2)
           {
               usleep(testDelay);
               refreshNeededSlot2();
           }
           else   if(testDelay > 0 &amp;&amp; typeOfEvent == FF_QUIT_EVENT2)
           {
               keepGoing = false;
               exit(0);
               break;
               // usleep(testDelay);
               // refreshNeededSlot2();
           }
           qDebug() &lt;&lt; "MAINTHREADend";
           mutex.lock();

       }
       mutex.unlock();
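One thing worth checking in the loop above: QWaitCondition::wakeAll() only wakes threads that are already blocked in wait(), so a wakeAll() fired from schedule_refresh() before the main thread reaches w->wait(&mutex) is silently lost. A sketch of a predicate-guarded wait that cannot lose the signal (RefreshSignal is a hypothetical name; std::condition_variable is used here only to keep the example self-contained, the same shape works with QWaitCondition plus a guarded counter):

```cpp
#include <condition_variable>
#include <mutex>
#include <thread>

// A wake-up that is remembered: notify() increments a pending counter under
// the mutex, so wait() returns even if the signal arrived before the wait.
struct RefreshSignal {
    std::mutex m;
    std::condition_variable cv;
    int pending = 0;

    void notify() {                       // called from schedule_refresh()
        std::lock_guard<std::mutex> lock(m);
        ++pending;                        // remembered even if no one waits yet
        cv.notify_all();
    }
    void wait() {                         // called from the main loop
        std::unique_lock<std::mutex> lock(m);
        cv.wait(lock, [this] { return pending > 0; });
        --pending;
    }
};
```

With testDelay and typeOfEvent read under the same mutex, the main loop can no longer sleep through a refresh that was scheduled while it was busy blitting.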

     Please let me know if I need to provide any more relevant code. Sorry my code is untidy; I'm still learning C++ and, as mentioned, have been modifying this code for over a week now.

     I've added a sample of the output from the print-outs I do to the console below. I can't get my head around it (it's almost too complicated for my level of expertise), but when you see the frames being played and hear the audio it's very difficult to give up, especially when it took me a couple of weeks to get to this stage.

     Please give me a hand if you spot the problem.

    MAINTHREAD past wait
    pts after syncvideo= 1073394046
    got frame 640, 640
    start video_refresh_timer
    actualpts = 1.66833
    frame lastpts = 1.63497
    start schedule refresh timer need to delay for 123

    pts after syncvideo= 1073429033
    got frame 640, 640
    MAINTHREAD loop delay before refresh = 123
    start video_refresh_timer
    actualpts = 1.7017
    frame lastpts = 1.66833
    start schedule refresh timer need to delay for 115

    MAINTHREAD past wait
    pts after syncvideo= 1073464021
    got frame 640, 640
    start video_refresh_timer
    actualpts = 1.73507
    frame lastpts = 1.7017
    start schedule refresh timer need to delay for 140

    MAINTHREAD loop delay before refresh = 140
    pts after syncvideo= 1073499008
    got frame 640, 640
    start video_refresh_timer
    actualpts = 1.76843
    frame lastpts = 1.73507
    start schedule refresh timer need to delay for 163

    MAINTHREAD past wait
    pts after syncvideo= 1073533996
    got frame 640, 640
    start video_refresh_timer
    actualpts = 1.8018
    frame lastpts = 1.76843
    start schedule refresh timer need to delay for 188

    MAINTHREAD loop delay before refresh = 188
    pts after syncvideo= 1073568983
    got frame 640, 640
    start video_refresh_timer
    actualpts = 1.83517
    frame lastpts = 1.8018
    start schedule refresh timer need to delay for 246

    MAINTHREAD past wait
    pts after syncvideo= 1073603971
    got frame 640, 640
    start video_refresh_timer
    actualpts = 1.86853
    frame lastpts = 1.83517
    start schedule refresh timer need to delay for 299

    MAINTHREAD loop delay before refresh = 299
    pts after syncvideo= 1073638958
    got frame 640, 640
    start video_refresh_timer
    actualpts = 1.9019
    frame lastpts = 1.86853
    start schedule refresh timer need to delay for 358

    MAINTHREAD past wait
    pts after syncvideo= 1073673946
    got frame 640, 640
    start video_refresh_timer
    actualpts = 1.93527
    frame lastpts = 1.9019
    start schedule refresh timer need to delay for 416

    MAINTHREAD loop delay before refresh = 416
    pts after syncvideo= 1073708933
    got frame 640, 640
    start video_refresh_timer
    actualpts = 1.96863
    frame lastpts = 1.93527
    start schedule refresh timer need to delay for 474

    MAINTHREAD past wait
    pts after syncvideo= 1073742872
    got frame 640, 640
    MAINTHREAD loop delay before refresh = 474
    start video_refresh_timer
    actualpts = 2.002
    frame lastpts = 1.96863
    start schedule refresh timer need to delay for 518

    MAINTHREAD past wait
    pts after syncvideo= 1073760366
    got frame 640, 640
    start video_refresh_timer
    actualpts = 2.03537
    frame lastpts = 2.002
    start schedule refresh timer need to delay for 575