Advanced search

Media (1)

Keyword: - Tags -/iphone

Other articles (56)

  • Formulaire personnalisable

    21 June 2013, by

    This page presents the fields available in the media publication form and indicates the additional fields that can be added. Media creation form
    For a media-type document, the default fields are: Text, Enable/disable the forum (the invitation to comment can be disabled per article), Licence, Add/remove authors, Tags
    This form can be modified under:
    Administration > Configuration of form masks. (...)

  • Improving the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the usability of multiple-selection fields. See the two images below for a comparison.
    Simply enable the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...)

  • What is a form mask?

    13 June 2013, by

    A form mask is a customization of the publication form for media, sections, news items, editorials and links to other sites.
    Each object's publication form can therefore be customized.
    To customize form fields, go to your MediaSPIP administration area and select "Configuration of form masks".
    Then select the form to modify by clicking on its object type. (...)

On other sites (9179)

  • Video Recording on Android from Frames using javaCV

    23 June 2016, by jawad bin zafar

    I am using the javaCV library to record video on Android. The authors provide a sample VideoRecording activity, but there is a bug that I could not figure out; I do not know what I am doing wrong or what is missing.

    package org.bytedeco.javacv.recordactivity;

    import android.app.Activity;
    import android.content.Context;
    import android.content.pm.ActivityInfo;
    import android.hardware.Camera;
    import android.hardware.Camera.PreviewCallback;
    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;
    import android.os.Bundle;
    import android.os.PowerManager;
    import android.util.Log;
    import android.view.Display;
    import android.view.KeyEvent;
    import android.view.LayoutInflater;
    import android.view.SurfaceHolder;
    import android.view.SurfaceView;
    import android.view.View;
    import android.view.View.OnClickListener;
    import android.view.WindowManager;
    import android.widget.Button;
    import android.widget.LinearLayout;
    import android.widget.RelativeLayout;

    import java.io.IOException;
    import java.nio.ShortBuffer;

    import org.bytedeco.javacv.FFmpegFrameRecorder;

    import static org.bytedeco.javacpp.opencv_core.*;

    public class RecordActivity extends Activity implements OnClickListener {

       private final static String CLASS_LABEL = "RecordActivity";
       private final static String LOG_TAG = CLASS_LABEL;

       private PowerManager.WakeLock mWakeLock;

       private String ffmpeg_link = "/mnt/sdcard/stream.flv";

       long startTime = 0;
       boolean recording = false;

       private volatile FFmpegFrameRecorder recorder;

       private boolean isPreviewOn = false;

       private int sampleAudioRateInHz = 44100;
       private int imageWidth = 320;
       private int imageHeight = 240;
       private int frameRate = 30;

       /* audio data getting thread */
       private AudioRecord audioRecord;
       private AudioRecordRunnable audioRecordRunnable;
       private Thread audioThread;
       volatile boolean runAudioThread = true;

       /* video data getting thread */
       private Camera cameraDevice;
       private CameraView cameraView;

       private IplImage yuvIplimage = null;

       /* layout setting */
       private final int bg_screen_bx = 232;
       private final int bg_screen_by = 128;
       private final int bg_screen_width = 700;
       private final int bg_screen_height = 500;
       private final int bg_width = 1123;
       private final int bg_height = 715;
       private final int live_width = 640;
       private final int live_height = 480;
       private int screenWidth, screenHeight;
       private Button btnRecorderControl;

       /** The number of seconds in the continuous record loop (or 0 to disable loop). */
       final int RECORD_LENGTH = 10;
       IplImage[] images;
       long[] timestamps;
       ShortBuffer[] samples;
       int imagesIndex, samplesIndex;

       @Override
       public void onCreate(Bundle savedInstanceState) {
           super.onCreate(savedInstanceState);
           setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);

           setContentView(R.layout.main);

           PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
           mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, CLASS_LABEL);
           mWakeLock.acquire();

           initLayout();
       }


       @Override
       protected void onResume() {
           super.onResume();

           if (mWakeLock == null) {
              PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
              mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, CLASS_LABEL);
              mWakeLock.acquire();
           }
       }

       @Override
       protected void onPause() {
           super.onPause();

           if (mWakeLock != null) {
               mWakeLock.release();
               mWakeLock = null;
           }
       }

       @Override
       protected void onDestroy() {
           super.onDestroy();

           recording = false;

           if (cameraView != null) {
               cameraView.stopPreview();
           }

           if(cameraDevice != null) {
              cameraDevice.stopPreview();
              cameraDevice.release();
              cameraDevice = null;
           }

           if (mWakeLock != null) {
               mWakeLock.release();
               mWakeLock = null;
           }
       }


       private void initLayout() {

           /* get size of screen */
           Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
           screenWidth = display.getWidth();
           screenHeight = display.getHeight();
           RelativeLayout.LayoutParams layoutParam = null;
           LayoutInflater myInflate = null;
           myInflate = (LayoutInflater) getSystemService(Context.LAYOUT_INFLATER_SERVICE);
           RelativeLayout topLayout = new RelativeLayout(this);
           setContentView(topLayout);
           LinearLayout preViewLayout = (LinearLayout) myInflate.inflate(R.layout.main, null);
           layoutParam = new RelativeLayout.LayoutParams(screenWidth, screenHeight);
           topLayout.addView(preViewLayout, layoutParam);

           /* add control button: start and stop */
           btnRecorderControl = (Button) findViewById(R.id.recorder_control);
           btnRecorderControl.setText("Start");
           btnRecorderControl.setOnClickListener(this);

           /* add camera view */
           int display_width_d = (int) (1.0 * bg_screen_width * screenWidth / bg_width);
           int display_height_d = (int) (1.0 * bg_screen_height * screenHeight / bg_height);
           int prev_rw, prev_rh;
           if (1.0 * display_width_d / display_height_d > 1.0 * live_width / live_height) {
               prev_rh = display_height_d;
               prev_rw = (int) (1.0 * display_height_d * live_width / live_height);
           } else {
               prev_rw = display_width_d;
               prev_rh = (int) (1.0 * display_width_d * live_height / live_width);
           }
           layoutParam = new RelativeLayout.LayoutParams(prev_rw, prev_rh);
           layoutParam.topMargin = (int) (1.0 * bg_screen_by * screenHeight / bg_height);
           layoutParam.leftMargin = (int) (1.0 * bg_screen_bx * screenWidth / bg_width);

           cameraDevice = Camera.open();
           Log.i(LOG_TAG, "cameara open");
           cameraView = new CameraView(this, cameraDevice);
           topLayout.addView(cameraView, layoutParam);
           Log.i(LOG_TAG, "cameara preview start: OK");
       }

       //---------------------------------------
       // initialize ffmpeg_recorder
       //---------------------------------------
       private void initRecorder() {

           Log.w(LOG_TAG,"init recorder");

           if (RECORD_LENGTH > 0) {
               imagesIndex = 0;
               images = new IplImage[RECORD_LENGTH * frameRate];
               timestamps = new long[images.length];
               for (int i = 0; i < images.length; i++) {
                   images[i] = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
                   timestamps[i] = -1;
               }
           } else if (yuvIplimage == null) {
               yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
               Log.i(LOG_TAG, "create yuvIplimage");
           }

           Log.i(LOG_TAG, "ffmpeg_url: " + ffmpeg_link);
           recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
           recorder.setFormat("flv");
           recorder.setSampleRate(sampleAudioRateInHz);
           // Set in the surface changed method
           recorder.setFrameRate(frameRate);

           Log.i(LOG_TAG, "recorder initialize success");

           audioRecordRunnable = new AudioRecordRunnable();
           audioThread = new Thread(audioRecordRunnable);
           runAudioThread = true;
       }

       public void startRecording() {

           initRecorder();

           try {
               recorder.start();
               startTime = System.currentTimeMillis();
               recording = true;
               audioThread.start();

           } catch (FFmpegFrameRecorder.Exception e) {
               e.printStackTrace();
           }
       }

       public void stopRecording() {

           runAudioThread = false;
           try {
               audioThread.join();
           } catch (InterruptedException e) {
               e.printStackTrace();
           }
           audioRecordRunnable = null;
           audioThread = null;

           if (recorder != null && recording) {
               if (RECORD_LENGTH > 0) {
                   Log.v(LOG_TAG,"Writing frames");
                   try {
                        int firstIndex = imagesIndex % images.length;
                       int lastIndex = (imagesIndex - 1) % images.length;
                       if (imagesIndex <= images.length) {
                           firstIndex = 0;
                           lastIndex = imagesIndex - 1;
                       }
                       if ((startTime = timestamps[lastIndex] - RECORD_LENGTH * 1000000L) < 0) {
                           startTime = 0;
                       }
                       if (lastIndex < firstIndex) {
                           lastIndex += images.length;
                       }
                       for (int i = firstIndex; i <= lastIndex; i++) {
                           long t = timestamps[i % timestamps.length] - startTime;
                           if (t >= 0) {
                               if (t > recorder.getTimestamp()) {
                                   recorder.setTimestamp(t);
                               }
                               recorder.record(images[i % images.length]);
                           }
                       }

                       firstIndex = samplesIndex % samples.length;
                       lastIndex = (samplesIndex - 1) % samples.length;
                       if (samplesIndex <= samples.length) {
                           firstIndex = 0;
                           lastIndex = samplesIndex - 1;
                       }
                       if (lastIndex < firstIndex) {
                           lastIndex += samples.length;
                       }
                       for (int i = firstIndex; i <= lastIndex; i++) {
                           recorder.record(samples[i % samples.length]);
                       }
                   } catch (FFmpegFrameRecorder.Exception e) {
                       Log.v(LOG_TAG,e.getMessage());
                       e.printStackTrace();
                   }
               }

               recording = false;
               Log.v(LOG_TAG,"Finishing recording, calling stop and release on recorder");
               try {
                   recorder.stop();
                   recorder.release();
               } catch (FFmpegFrameRecorder.Exception e) {
                   e.printStackTrace();
               }
               recorder = null;

           }
       }

       @Override
       public boolean onKeyDown(int keyCode, KeyEvent event) {

           if (keyCode == KeyEvent.KEYCODE_BACK) {
               if (recording) {
                   stopRecording();
               }

               finish();

               return true;
           }

           return super.onKeyDown(keyCode, event);
       }


       //---------------------------------------------
       // audio thread, gets and encodes audio data
       //---------------------------------------------
       class AudioRecordRunnable implements Runnable {

           @Override
           public void run() {
               android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

               // Audio
               int bufferSize;
               ShortBuffer audioData;
               int bufferReadResult;

               bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
               audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

               if (RECORD_LENGTH > 0) {
                   samplesIndex = 0;
                   samples = new ShortBuffer[RECORD_LENGTH * sampleAudioRateInHz * 2 / bufferSize + 1];
                   for (int i = 0; i < samples.length; i++) {
                       samples[i] = ShortBuffer.allocate(bufferSize);
                   }
               } else {
                   audioData = ShortBuffer.allocate(bufferSize);
               }

               Log.d(LOG_TAG, "audioRecord.startRecording()");
               audioRecord.startRecording();

               /* ffmpeg_audio encoding loop */
               while (runAudioThread) {
                   if (RECORD_LENGTH > 0) {
                       audioData = samples[samplesIndex++ % samples.length];
                       audioData.position(0).limit(0);
                   }
                   //Log.v(LOG_TAG,"recording? " + recording);
                   bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
                   audioData.limit(bufferReadResult);
                   if (bufferReadResult > 0) {
                       Log.v(LOG_TAG,"bufferReadResult: " + bufferReadResult);
                       // If "recording" isn't true when this thread starts, it never gets set, according to this if statement...!!!
                       // Why?  Good question...
                       if (recording) {
                           if (RECORD_LENGTH <= 0) try {
                               recorder.record(audioData);
                               //Log.v(LOG_TAG,"recording " + 1024*i + " to " + 1024*i+1024);
                           } catch (FFmpegFrameRecorder.Exception e) {
                               Log.v(LOG_TAG,e.getMessage());
                               e.printStackTrace();
                           }
                       }
                   }
               }
               Log.v(LOG_TAG,"AudioThread Finished, release audioRecord");

               /* encoding finish, release recorder */
               if (audioRecord != null) {
                   audioRecord.stop();
                   audioRecord.release();
                   audioRecord = null;
                   Log.v(LOG_TAG,"audioRecord released");
               }
           }
       }

       //---------------------------------------------
       // camera thread, gets and encodes video data
       //---------------------------------------------
       class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {

           private SurfaceHolder mHolder;
           private Camera mCamera;

           public CameraView(Context context, Camera camera) {
               super(context);
               Log.w("camera","camera view");
               mCamera = camera;
               mHolder = getHolder();
               mHolder.addCallback(CameraView.this);
               mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
               mCamera.setPreviewCallback(CameraView.this);
           }

           @Override
           public void surfaceCreated(SurfaceHolder holder) {
               try {
                   stopPreview();
                   mCamera.setPreviewDisplay(holder);
               } catch (IOException exception) {
                   mCamera.release();
                   mCamera = null;
               }
           }

           public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
               Log.v(LOG_TAG,"Setting imageWidth: " + imageWidth + " imageHeight: " + imageHeight + " frameRate: " + frameRate);
               Camera.Parameters camParams = mCamera.getParameters();
               camParams.setPreviewSize(imageWidth, imageHeight);

               Log.v(LOG_TAG,"Preview Framerate: " + camParams.getPreviewFrameRate());

               camParams.setPreviewFrameRate(frameRate);
               mCamera.setParameters(camParams);
               startPreview();
           }

           @Override
           public void surfaceDestroyed(SurfaceHolder holder) {
               try {
                   mHolder.addCallback(null);
                   mCamera.setPreviewCallback(null);
               } catch (RuntimeException e) {
                   // The camera has probably just been released, ignore.
               }
           }

           public void startPreview() {
               if (!isPreviewOn && mCamera != null) {
                   isPreviewOn = true;
                   mCamera.startPreview();
               }
           }

           public void stopPreview() {
               if (isPreviewOn && mCamera != null) {
                   isPreviewOn = false;
                   mCamera.stopPreview();
               }
           }

           @Override
           public void onPreviewFrame(byte[] data, Camera camera) {
               if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
                   startTime = System.currentTimeMillis();
                   return;
               }
               if (RECORD_LENGTH > 0) {
                   int i = imagesIndex++ % images.length;
                   yuvIplimage = images[i];
                   timestamps[i] = 1000 * (System.currentTimeMillis() - startTime);
               }
               /* get video data */
               if (yuvIplimage != null && recording) {
                   yuvIplimage.getByteBuffer().put(data);

                   if (RECORD_LENGTH <= 0) try {
                       Log.v(LOG_TAG,"Writing Frame");
                       long t = 1000 * (System.currentTimeMillis() - startTime);
                       if (t > recorder.getTimestamp()) {
                           recorder.setTimestamp(t);
                       }
                       recorder.record(yuvIplimage);
                   } catch (FFmpegFrameRecorder.Exception e) {
                       Log.v(LOG_TAG,e.getMessage());
                       e.printStackTrace();
                   }
               }
           }
       }

       @Override
       public void onClick(View v) {
           if (!recording) {
               startRecording();
               Log.w(LOG_TAG, "Start Button Pushed");
               btnRecorderControl.setText("Stop");
           } else {
               // This will trigger the audio recording loop to stop and then set isRecorderStart = false;
               stopRecording();
               Log.w(LOG_TAG, "Stop Button Pushed");
               btnRecorderControl.setText("Start");
           }
       }
    }

    I am getting this error:

    The type org.bytedeco.javacpp.avutil$AVFrame cannot be resolved. It is indirectly referenced from required .class files

    at the following line in the code above:

    if (RECORD_LENGTH <= 0) try {
       recorder.record(audioData);
       //Log.v(LOG_TAG,"recording " + 1024*i + " to " + 1024*i+1024);
    }

    at recorder.record(audioData). I don’t know what I am doing wrong here; I am new to JavaCV. Any help will be appreciated.
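    A "cannot be resolved ... indirectly referenced from required .class files" error usually means a class on the classpath (here FFmpegFrameRecorder) refers to a class whose own jar is absent, i.e. the JavaCPP FFmpeg preset is not on the build path. As a hedged sketch, typical Maven coordinates for JavaCV on Android looked roughly like the following; the version numbers and classifier are illustrative assumptions, not taken from the question:

    ```xml
    <!-- Illustrative versions only; match them to your installed JavaCV release. -->
    <dependency>
        <groupId>org.bytedeco</groupId>
        <artifactId>javacv</artifactId>
        <version>1.2</version>
    </dependency>
    <!-- Without the ffmpeg preset jar, org.bytedeco.javacpp.avutil cannot be resolved. -->
    <dependency>
        <groupId>org.bytedeco.javacpp-presets</groupId>
        <artifactId>ffmpeg</artifactId>
        <version>3.0.2-1.2</version>
    </dependency>
    <dependency>
        <groupId>org.bytedeco.javacpp-presets</groupId>
        <artifactId>ffmpeg</artifactId>
        <version>3.0.2-1.2</version>
        <classifier>android-arm</classifier>
    </dependency>
    ```

    If the jars are added by hand instead of via Maven, the equivalent is making sure the ffmpeg preset jar and its android-arm native jar sit next to javacv.jar in the project's libs directory.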

  • FFmpegFrameGrabber video artefacts from RTSP network camera

    2 February 2015, by UncleChris

    I’m using JavaCV’s FFmpegFrameGrabber to grab frames from my network camera over the RTSP protocol. The simplified code looks like this:

    /* from the init method */
    // url like: rtsp://ip:port/stream1
    grabber = new FFmpegFrameGrabber(stream.getUrl());
    // type: RTP
    grabber.setFormat(stream.getMediaType());
    grabber.start();

    /* called in a while loop from outside */
    public void grab() throws FrameProcessorsException {

       try {
           LOGGER.info(grabber.getFrameNumber());
           frame = grabber.grab();
       } catch (FrameGrabber.Exception e) {
           throw new FrameProcessorsException(e);
       }

       // I save my frames to other grabber, to make mp4 file to watch it later
       try {
           videoRecorder.recordFrame(frame, grabber.getTimestamp(), grabber.getImageWidth(), grabber.getImageHeight(), grabber.getAudioChannels());
       } catch (FrameRecorder.Exception e) {
           throw new FrameProcessorsException(e);
       }

       // my processing, the troublemaker
       long currentFrameNum = grabber.getFrameNumber();
       if (processing && currentFrameNum - lastFrameWithAnalysis >= PROCESS_FREQUENCY) {

           lastFrameWithAnalysis = currentFrameNum;

           Mat frameMat = new Mat(frame, false);
           try {
               LOGGER.info("Processing :" + grabber.getFrameNumber());
               AnalysisResult result = frameAnalyzer.processFrame(frameMat, (int) currentFrameNum);
               videoAnalysisSaver.saveFrameAnalysisResult(frameMat, result, (int) currentFrameNum);
           } catch (ServerErrorException | NotExistException e) {
               LOGGER.warn(e);
            }
        }
     }

    In the code you can see the processing variable. If it is set to false, I can watch my network streams on the page with no problems. But if I set it to true, I suddenly get visual artefacts that look like this:

    http://answers.opencv.org/upfiles/1400931120927032.png

    I can also see some messages in my logs:

    [libx264 @ 0x7fe2a7e2ae00] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
    [libx264 @ 0x7fe2a7e2ae00] profile High, level 4.0
    [libx264 @ 0x7fe2a7e2ae00] 264 - core 142 - H.264/MPEG-4 AVC codec - Copyleft 2003-2014 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1,00:0,00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=12 lookahead_threads=2 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=abr mbtree=1 bitrate=400 ratetol=1,0 qcomp=0,60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1,40 aq=1:1,00
    [mp4 @ 0x7fe2909feee0] Using AVStream.codec.time_base as a timebase hint to the muxer is deprecated. Set AVStream.time_base instead.
    2015-02-02 10:34:31,986 INFO  [img.StreamGrabber] 6
    2015-02-02 10:34:31,998 INFO  [img.StreamGrabber] Processing :1
    2015-02-02 10:34:32,881 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1524, 1564, 678, 718) is above threshold
    2015-02-02 10:34:32,882 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1538, 1577, 678, 717) is above threshold
    2015-02-02 10:34:32,884 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1320, 1420, 298, 398) is above threshold
    2015-02-02 10:34:33,199 INFO  [img.StreamGrabber] 1
    2015-02-02 10:34:33,212 INFO  [img.StreamGrabber] 2
    2015-02-02 10:34:33,222 INFO  [img.StreamGrabber] 3
    2015-02-02 10:34:33,232 INFO  [img.StreamGrabber] 4
    2015-02-02 10:34:33,244 INFO  [img.StreamGrabber] 5
    2015-02-02 10:34:33,255 INFO  [img.StreamGrabber] Processing :6
    2015-02-02 10:34:33,870 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1537, 1578, 678, 719) is above threshold
    2015-02-02 10:34:33,871 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1315, 1422, 298, 405) is above threshold
    2015-02-02 10:34:34,318 INFO  [img.StreamGrabber] 6
    2015-02-02 10:34:34,338 INFO  [img.StreamGrabber] 7
    2015-02-02 10:34:34,347 INFO  [img.StreamGrabber] 8
    2015-02-02 10:34:34,357 INFO  [img.StreamGrabber] 9
    2015-02-02 10:34:34,368 INFO  [img.StreamGrabber] 10
    2015-02-02 10:34:34,379 INFO  [img.StreamGrabber] Processing :11
    2015-02-02 10:34:35,025 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1561, 1618, 477, 534) is above threshold
    2015-02-02 10:34:35,027 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1318, 1421, 300, 403) is above threshold
    2015-02-02 10:34:35,185 INFO  [img.StreamGrabber] 11
    2015-02-02 10:34:35,202 INFO  [img.StreamGrabber] 12
    2015-02-02 10:34:35,213 INFO  [img.StreamGrabber] 13
    2015-02-02 10:34:35,223 INFO  [img.StreamGrabber] 14
    2015-02-02 10:34:35,235 INFO  [img.StreamGrabber] 15
    2015-02-02 10:34:35,286 INFO  [img.StreamGrabber] Processing :16
    2015-02-02 10:34:35,952 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1429, 1470, 703, 744) is above threshold
    2015-02-02 10:34:35,954 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1315, 1422, 295, 402) is above threshold
    2015-02-02 10:34:36,218 INFO  [img.StreamGrabber] 16
    2015-02-02 10:34:36,237 INFO  [img.StreamGrabber] 17
    2015-02-02 10:34:36,246 INFO  [img.StreamGrabber] 18
    2015-02-02 10:34:36,257 INFO  [img.StreamGrabber] 19
    2015-02-02 10:34:36,268 INFO  [img.StreamGrabber] 20
    2015-02-02 10:34:36,279 INFO  [img.StreamGrabber] Processing :21
    2015-02-02 10:34:36,967 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1562, 1616, 480, 534) is above threshold
    2015-02-02 10:34:36,968 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1314, 1420, 296, 402) is above threshold
    2015-02-02 10:34:37,186 INFO  [img.StreamGrabber] 21
    2015-02-02 10:34:37,206 INFO  [img.StreamGrabber] 22
    2015-02-02 10:34:37,217 INFO  [img.StreamGrabber] 23
    2015-02-02 10:34:37,227 INFO  [img.StreamGrabber] 24
    [h264 @ 0x7fe2915b30a0] RTP: missed 1514 packets
    [h264 @ 0x7fe2f1050ea0] Cannot use next picture in error concealment
    [h264 @ 0x7fe2f1050ea0] concealing 4608 DC, 4608 AC, 4608 MV errors in P frame
    2015-02-02 10:34:37,238 INFO  [img.StreamGrabber] 25
    2015-02-02 10:34:37,250 INFO  [img.StreamGrabber] Processing :26
    2015-02-02 10:34:37,944 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1562, 1616, 479, 533) is above threshold
    2015-02-02 10:34:37,945 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1315, 1422, 297, 404) is above threshold
    2015-02-02 10:34:38,107 INFO  [img.StreamGrabber] 26
    [h264 @ 0x7fe2915b30a0] RTP: missed 295 packets
    [h264 @ 0x7fe2a5713e00] Cannot use next picture in error concealment
    [h264 @ 0x7fe2a5713e00] concealing 1996 DC, 1996 AC, 1996 MV errors in P frame
    2015-02-02 10:34:38,120 INFO  [img.StreamGrabber] 27
    2015-02-02 10:34:38,130 INFO  [img.StreamGrabber] 28
    2015-02-02 10:34:38,143 INFO  [img.StreamGrabber] 29
    2015-02-02 10:34:38,231 INFO  [img.StreamGrabber] 30
    2015-02-02 10:34:38,249 INFO  [img.StreamGrabber] Processing :31
    2015-02-02 10:34:38,962 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1170, 1211, 322, 363) is above threshold
    2015-02-02 10:34:38,964 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1316, 1421, 298, 403) is above threshold
    2015-02-02 10:34:39,329 INFO  [img.StreamGrabber] 31
    [h264 @ 0x7fe2915b30a0] RTP: missed 232 packets
    [h264 @ 0x7fe2a4203d80] Cannot use next picture in error concealment
    [h264 @ 0x7fe2a4203d80] concealing 1142 DC, 1142 AC, 1142 MV errors in P frame
    2015-02-02 10:34:39,342 INFO  [img.StreamGrabber] 32
    2015-02-02 10:34:39,352 INFO  [img.StreamGrabber] 33
    [h264 @ 0x7fe2915b30a0] RTP: missed 1 packets
    [h264 @ 0x7fe2915b43c0] corrupted macroblock 86 67 (total_coeff=-1)
    [h264 @ 0x7fe2915b43c0] error while decoding MB 86 67
    [h264 @ 0x7fe2915b43c0] Cannot use next picture in error concealment
    [h264 @ 0x7fe2915b43c0] concealing 83 DC, 83 AC, 83 MV errors in P frame
    2015-02-02 10:34:39,362 INFO  [img.StreamGrabber] Processing :144
    2015-02-02 10:34:40,071 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1563, 1614, 480, 531) is above threshold
    2015-02-02 10:34:40,074 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1318, 1423, 296, 401) is above threshold
    2015-02-02 10:34:40,462 INFO  [img.StreamGrabber] 144
    2015-02-02 10:34:40,482 INFO  [img.StreamGrabber] 145
    [h264 @ 0x7fe2915b30a0] RTP: missed 377 packets
    [h264 @ 0x7fe2a515baa0] Cannot use next picture in error concealment
    [h264 @ 0x7fe2a515baa0] concealing 6822 DC, 6822 AC, 6822 MV errors in P frame
    2015-02-02 10:34:40,494 INFO  [img.StreamGrabber] Processing :167
    2015-02-02 10:34:41,222 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1563, 1615, 479, 531) is above threshold
    2015-02-02 10:34:41,230 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1319, 1421, 295, 397) is above threshold
    2015-02-02 10:34:41,930 INFO  [img.StreamGrabber] 167
    2015-02-02 10:34:41,947 INFO  [img.StreamGrabber] 168
    2015-02-02 10:34:41,958 INFO  [img.StreamGrabber] 169
    2015-02-02 10:34:41,970 INFO  [img.StreamGrabber] 170
    2015-02-02 10:34:41,985 INFO  [img.StreamGrabber] 171
    [h264 @ 0x7fe2915b30a0] RTP: missed 311 packets
    [h264 @ 0x7fe2f10506c0] Cannot use next picture in error concealment
    [h264 @ 0x7fe2f10506c0] concealing 1409 DC, 1409 AC, 1409 MV errors in P frame
    2015-02-02 10:34:41,997 INFO  [img.StreamGrabber] Processing :190
    2015-02-02 10:34:42,715 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1322, 1384, 340, 402) is above threshold
    2015-02-02 10:34:42,717 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1312, 1425, 290, 403) is above threshold
    2015-02-02 10:34:42,929 INFO  [img.StreamGrabber] 190
    [h264 @ 0x7fe2915b30a0] RTP: missed 13 packets
    [h264 @ 0x7fe2f1050ea0] Cannot use next picture in error concealment
    [h264 @ 0x7fe2f1050ea0] concealing 6489 DC, 6489 AC, 6489 MV errors in P frame
    2015-02-02 10:34:42,943 INFO  [img.StreamGrabber] 191
    [h264 @ 0x7fe2915b30a0] RTP: missed 484 packets
    [h264 @ 0x7fe2915b43c0] concealing 6609 DC, 6609 AC, 6609 MV errors in I frame
    2015-02-02 10:34:42,957 INFO  [img.StreamGrabber] 192
    2015-02-02 10:34:42,970 INFO  [img.StreamGrabber] 193
    [h264 @ 0x7fe2915b30a0] RTP: missed 313 packets
    [h264 @ 0x7fe2a51a0fc0] Cannot use next picture in error concealment
    [h264 @ 0x7fe2a51a0fc0] concealing 1666 DC, 1666 AC, 1666 MV errors in P frame
    2015-02-02 10:34:43,271 INFO  [img.StreamGrabber] 194
    2015-02-02 10:34:43,314 INFO  [img.StreamGrabber] Processing :249
    2015-02-02 10:34:44,099 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1322, 1384, 340, 402) is above threshold
    2015-02-02 10:34:44,100 INFO  [img.analysis.face.PersonFaceRecognizer] Predicted face (1313, 1403, 300, 390) is above threshold
    2015-02-02 10:34:45,473 INFO  [img.save.event.EventMaker] Creating 1 face recognition events
    2015-02-02 10:34:45,618 INFO  [core.task.StreamRecordingTaskExecutor] Stream recording task ended: rtsp://MYURL

    My guess is that my computer is simply too busy to catch all the packets from the camera stream. I am operating on two streams: one is low quality at around 3 fps, the other at 30 fps. Naturally the problems show up all the time on the fast stream and rarely on the slow one.
    I am wondering whether there is any way to force FFmpegFrameGrabber not to produce artefacts, but simply to drop the current frame and move on to the next one. Fps and frame continuity are not that important. I tried the grabber's setfps, settimestamp and delayedGrab methods to somehow slow down the 30-fps stream, but it did not even react to them. I am sure I am doing something wrong.
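    Since grabber.grab() and frameAnalyzer.processFrame run on the same thread in the code above, a slow analysis stalls the grab loop and the camera's RTP packets overflow the receive buffer, which matches the "RTP: missed N packets" lines appearing right after each "Processing" log entry. One workaround, independent of JavaCV itself, is to hand each frame to a single worker thread and drop frames while the previous analysis is still running, so the grab loop keeps draining the stream at full rate. The class below is a hypothetical sketch of that idea, not part of the question's code:

    ```java
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.atomic.AtomicBoolean;

    /** Runs at most one analysis at a time; frames arriving while busy are dropped. */
    public class SkipIfBusyProcessor {

        private final ExecutorService worker = Executors.newSingleThreadExecutor();
        private final AtomicBoolean busy = new AtomicBoolean(false);

        /** Returns true if the frame was accepted, false if it was dropped. */
        public boolean submit(Runnable analysis) {
            // Refuse the frame if the previous analysis has not finished yet.
            if (!busy.compareAndSet(false, true)) {
                return false;
            }
            worker.execute(() -> {
                try {
                    analysis.run();
                } finally {
                    busy.set(false); // ready to accept the next frame
                }
            });
            return true;
        }

        public void shutdown() throws InterruptedException {
            worker.shutdown();
            worker.awaitTermination(5, TimeUnit.SECONDS);
        }
    }
    ```

    In the grab loop, the direct frameAnalyzer.processFrame call would then become a processor.submit(...) of that work: grab() keeps returning at the stream rate, only frames that arrive while the analyzer is idle are processed, and the rest are dropped rather than decoded from a starved buffer.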

    I have found some topics related to my problem, but they did not help me; maybe you will see more:
    http://answers.opencv.org/question/34012/ip-camera-h264-error-while-decoding/
    How to deal with cv::VideoCapture decode errors ?
    http://superuser.com/questions/663928/ffmpeg-to-capture-stills-from-h-264-stream

    Thank you for your help.

  • Replaced the inline onclick handler for cboxPhoto with a namespaced e…

    14 May 2015, by jackmoore
    Replaced the inline onclick handler for cboxPhoto with a namespaced event for easier unbinding.  Fixes #719.