Other articles (68)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Multilang: improving the interface for multilingual blocks

    18 February 2011

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
    Once it is enabled, MediaSPIP init automatically applies a preconfiguration so that the new feature is immediately operational; no separate configuration step is required.

  • Sites built with MediaSPIP

    2 May 2011

    This page presents a few of the sites running MediaSPIP.
    You can of course add your own using the form at the bottom of the page.

On other sites (7184)

  • H264 Encoding - Could not play video using VLC Player

    31 March 2016, by bot1131357

    I am having trouble encoding an H264 video correctly using the FFmpeg libav libraries. I could not play the encoded video in VLC media player, and although I could play it in MPC-HC, the time shows 00:00/00:00. Clearly I'm missing something.

    The Media info from MPC-HC shows this:

    General
    Format : AVC
    Format/Info : Advanced Video Codec
    File size : 110 KiB
    Duration : 2s 400ms
    Overall bit rate : 375 Kbps
    Writing library : x264 core 148 r2665 a01e339
    Encoding settings : cabac=0 / ref=3 / deblock=1:0:0 / analyse=0x1:0x111 / me=hex / subme=7 / psy=1 / psy_rd=1.00:0.00 / mixed_ref=1 / me_range=16 / chroma_me=1 / trellis=1 / 8x8dct=0 / cqm=0 / deadzone=21,11 / fast_pskip=1 / chroma_qp_offset=-2 / threads=7 / lookahead_threads=1 / sliced_threads=0 / nr=0 / decimate=1 / interlaced=0 / bluray_compat=0 / constrained_intra=0 / bframes=0 / weightp=0 / keyint=12 / keyint_min=1 / scenecut=40 / intra_refresh=0 / rc_lookahead=12 / rc=abr / mbtree=1 / bitrate=2000 / ratetol=1.0 / qcomp=0.60 / qpmin=0 / qpmax=69 / qpstep=4 / ip_ratio=1.40 / aq=1:1.00

    Video
    Format : AVC
    Format/Info : Advanced Video Codec
    Format profile : Baseline@L2.1
    Format settings, CABAC : No
    Format settings, ReFrames : 3 frames
    Format settings, GOP : M=1, N=12
    Duration : 2s 400ms
    Bit rate : 2 000 Kbps
    Width : 320 pixels
    Height : 240 pixels
    Display aspect ratio : 4:3
    Frame rate mode : Variable
    Frame rate : 20.833 fps
    Color space : YUV
    Chroma subsampling : 4:2:0
    Bit depth : 8 bits
    Scan type : Progressive
    Bits/(Pixel*Frame) : 1.250
    Stream size : 586 KiB
    Writing library : x264 core 148 r2665 a01e339
    Encoding settings : cabac=0 / ref=3 / deblock=1:0:0 / analyse=0x1:0x111 / me=hex / subme=7 / psy=1 / psy_rd=1.00:0.00 / mixed_ref=1 / me_range=16 / chroma_me=1 / trellis=1 / 8x8dct=0 / cqm=0 / deadzone=21,11 / fast_pskip=1 / chroma_qp_offset=-2 / threads=7 / lookahead_threads=1 / sliced_threads=0 / nr=0 / decimate=1 / interlaced=0 / bluray_compat=0 / constrained_intra=0 / bframes=0 / weightp=0 / keyint=12 / keyint_min=1 / scenecut=40 / intra_refresh=0 / rc_lookahead=12 / rc=abr / mbtree=1 / bitrate=2000 / ratetol=1.0 / qcomp=0.60 / qpmin=0 / qpmax=69 / qpstep=4 / ip_ratio=1.40 / aq=1:1.00

    I noticed something odd in the above info:
    - The frame rate is 20.833 fps, instead of the specified 10 fps.
    - The duration of 2s 400ms does not seem right either, since the video plays for more than 4 s.

    Also, (AVFrame* picture)->pict_type is always set to AV_PICTURE_TYPE_NONE. I don’t think this is normal.
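
    For reference, the usual timestamp handling in the libav* encode/mux examples is to express each frame's pts in the codec time base (1/framerate) and to rescale every packet into the stream time base before writing it. The following is only a minimal sketch of that convention, reusing the variable names from the code further below and assuming the same fixed 10 fps setup; it is not a drop-in fix:

       // set the time bases (codec: 1/framerate; the stream starts from the same value, the muxer may adjust it)
       codecCtx->time_base.num = 1;
       codecCtx->time_base.den = framerate;
       pVideoStream->time_base = codecCtx->time_base;

       // number frames in codec time-base units
       picture->pts = i;

       if (avcodec_encode_video2(codecCtx, &avpkt, picture, &got_packet_ptr) == 0 && got_packet_ptr) {
           // convert pts/dts/duration from the codec time base to the stream time base before muxing
           av_packet_rescale_ts(&avpkt, codecCtx->time_base, pVideoStream->time_base);
           avpkt.stream_index = pVideoStream->index;
           av_interleaved_write_frame(pFormatCtx, &avpkt);
       }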

    The library build I'm using is ffmpeg-20160219-git-98a0053-win32-dev. I would really appreciate it if you could help me clear up this confusion.

    /*
    * Video encoding example
    */
    // Headers required by this example (libav* C API plus the C++ iostream bits used below)
    extern "C" {
    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>
    #include <libavutil/imgutils.h>
    }
    #include <algorithm>
    #include <cstdio>
    #include <cstring>
    #include <iostream>
    using namespace std;

    char filename[] = "test.mp4";
    int main(int argc, char** argv)
    {
       AVCodec *codec = NULL;
       AVCodecContext *codecCtx= NULL;
       AVFormatContext *pFormatCtx = NULL;
       AVStream * pVideoStream = NULL;
       AVFrame *picture = NULL;

       int i, x, y,            //
           ret,                // Return value
           got_packet_ptr;     // Data encoded into packet

       printf("Video encoding\n");

       // Register all formats and codecs
       av_register_all();

       // allocate context
       pFormatCtx = avformat_alloc_context();
       memcpy(pFormatCtx->filename,filename,
           min(strlen(filename), sizeof(pFormatCtx->filename)));

       // guess format
       pFormatCtx->oformat = av_guess_format("h264", NULL, NULL);
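       // Note: av_guess_format("h264", ...) selects the raw H.264 (Annex B) bitstream muxer,
       // which carries no container-level timestamps, even though the filename ends in .mp4.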
       if (NULL==pFormatCtx->oformat)
       {
           cerr << "Could not guess output format" << endl;
           return -1;
       }  

       // Find the codec.
       codec = avcodec_find_encoder(pFormatCtx->oformat->video_codec);
       if (codec == NULL) {
           fprintf(stderr, "Codec not found\n");
           return -1;
       }

       // Set context
       int framerate = 10;
       codecCtx = avcodec_alloc_context3(codec);
       avcodec_get_context_defaults3(codecCtx, codec);
       codecCtx->pix_fmt = AV_PIX_FMT_YUV420P;
       codecCtx->profile = FF_PROFILE_H264_BASELINE;
       // Resolution must be a multiple of two.
       codecCtx->width  = 320;
       codecCtx->height = 240;

       codecCtx->bit_rate = 2000000;
       codecCtx->time_base.den = framerate;
       codecCtx->time_base.num = 1;
       codecCtx->gop_size = 12; // emit one intra frame every twelve frames at most

       // Open the codec.  
       if (avcodec_open2(codecCtx, codec, NULL) < 0)
       {
           printf("Cannot open video codec\n");
           return -1;
       }

       // Add stream to pFormatCtx
       pVideoStream = avformat_new_stream(pFormatCtx, codec);
       if (!pVideoStream)
       {
           printf("Cannot add new video stream\n");
           return -1;
       }
       pVideoStream->codec = codecCtx;
       pVideoStream->time_base.den = framerate;
       pVideoStream->time_base.num = 1;

       if (avio_open2(&pFormatCtx->pb, filename, AVIO_FLAG_WRITE, NULL, NULL) < 0)
       {
           printf("Cannot open file\n");
           return -1;
       }

       // Write file header.
       avformat_write_header(pFormatCtx, NULL);

       // Create frame
       picture= av_frame_alloc();
       picture->format = codecCtx->pix_fmt;
       picture->width  = codecCtx->width;
       picture->height = codecCtx->height;

       int bufferImgSize = av_image_get_buffer_size(codecCtx->pix_fmt, codecCtx->width,
                       codecCtx->height, 1);
       av_image_alloc(picture->data, picture->linesize, codecCtx->width, codecCtx->height,
                      codecCtx->pix_fmt, 32);

       AVPacket avpkt;

       /* encode 1 second of video */
       for(i=0;i<50;i++)
       {
           /* prepare a dummy image */
           /* Y */
           for(y=0;y<codecCtx->height;y++)
           {
               for(x=0;x<codecCtx->width;x++)
               {
                   picture->data[0][y * picture->linesize[0] + x] = x + y + i * 3;
               }
           }
           /* Cb and Cr */
           for(y=0;y<codecCtx->height/2;y++)
           {
               for(x=0;x<codecCtx->width/2;x++)
               {
                   picture->data[1][y * picture->linesize[1] + x] = 128 + y + i * 2;
                   picture->data[2][y * picture->linesize[2] + x] = 64 + x + i * 5;
               }
           }

           // Get timestamp
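           // Note: the expression below yields pts in 90 kHz units (milliseconds * 90),
           // which does not match the 1/framerate time base configured on codecCtx above.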
           picture->pts = (float) i * (1000.0/(float)(codecCtx->time_base.den)) * 90;

           // Encode frame to packet
           av_init_packet(&avpkt);
           got_packet_ptr = 0;
           int error = avcodec_encode_video2(codecCtx, &avpkt, picture, &got_packet_ptr);
           if (!error && got_packet_ptr > 0)
           {
               // Write packet with frame.
               ret = (av_interleaved_write_frame(pFormatCtx, &avpkt) == 0);        
           }  
           av_packet_unref(&avpkt);
       }

       // Flush remaining encoded data
       while(1)
       {
           av_init_packet(&avpkt);
           got_packet_ptr = 0;
           // Encode frame to packet.
           int error = avcodec_encode_video2(codecCtx, &avpkt, NULL, &got_packet_ptr);
           if (!error && got_packet_ptr > 0)
           {
               // Write packet with frame.
               ret = (av_interleaved_write_frame(pFormatCtx, &avpkt) == 0);        
           }
           else
           {
               break;
           }
           av_packet_unref(&avpkt);
       }
       av_write_trailer(pFormatCtx);

       av_packet_unref(&avpkt);
       av_frame_free(&picture);

       avcodec_close(codecCtx);
       av_free(codecCtx);

       cin.get();
    }
  • Android recording video with overlay view

    6 March 2016, by t0m

    I am building an Android app that captures video with overlay views. (I also need the onPreviewFrame method.)
    I am doing this via SurfaceView and JavaCV with FFmpeg:


    OpenCVCameraActivity.java:

    import android.app.Activity;
    import android.content.Context;
    import android.hardware.Camera;
    import android.hardware.Camera.PreviewCallback;
    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;
    import android.os.Bundle;
    import android.util.Log;
    import android.view.KeyEvent;
    import android.view.Menu;
    import android.view.SurfaceHolder;
    import android.view.SurfaceView;
    import android.view.View;
    import android.widget.FrameLayout;
    import android.widget.Toast;

    import org.bytedeco.javacv.FFmpegFrameRecorder;
    import org.bytedeco.javacv.Frame;

    import java.io.File;
    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.nio.ShortBuffer;
    import java.util.List;


    @SuppressWarnings("ALL")
    public class OpenCVCameraActivity extends Activity {

       private static final String TAG = OpenCVCameraActivity.class.getSimpleName();

       private long startTime = 0;
       private boolean isPreviewOn = false;

       private int sampleAudioRateInHz = 44100;
       private Camera.Size previewSize;        //preview and Camera and Recorder width and height
       private int recorderFrameRate = 25;

       // audio data getting thread
       private AudioRecord audioRecord;
       private AudioRecordRunnable audioRecordRunnable;
       private Thread audioThread;
       private volatile boolean runAudioThread = true;

       // video data getting thread
       private Camera mCamera;
       private CameraView mPreview;
       private FFmpegFrameRecorder recorder;
       private boolean recording = false;
       private Frame yuvImage = null;

       //storage
       private Storage storage;

       @Override
       public void onCreate(Bundle savedInstanceState) {
           super.onCreate(savedInstanceState);
           if(Static.DEBUG) Log.i(TAG, "onCreate()");

           Thread.setDefaultUncaughtExceptionHandler(uncaughtExceptionHandler);

           setContentView(R.layout.activity_opencv);

           prepareCamera();
       }

       private Thread.UncaughtExceptionHandler uncaughtExceptionHandler =
               new Thread.UncaughtExceptionHandler() {
                   public void uncaughtException(Thread thread, Throwable ex) {
                       if(Static.DEBUG) Log.e(TAG, "Uncaught exception", ex);
                   }
               };

       @Override
       protected void onRestart() {
           super.onRestart();
           if (Static.DEBUG) Log.i(TAG, "onRestart()");
       }

       @Override
       protected void onStart() {
           super.onStart();
           if (Static.DEBUG) Log.i(TAG, "onStart()");
       }

       @Override
       protected void onResume() {
           super.onResume();
           if (Static.DEBUG) Log.i(TAG, "onResume()");

           storage = new Storage(this);
           if(storage.mExternalStorageAvailable == true && storage.mExternalStorageWriteable == false)
               Static.showToast(this, getString(R.string.errExternalStorageReadOnly), Toast.LENGTH_LONG);
           else if (storage.mExternalStorageAvailable == false && storage.mExternalStorageWriteable == false)
               Static.showToast(this, getString(R.string.errExternalStorage), Toast.LENGTH_LONG);
       }

       @Override
       public boolean onCreateOptionsMenu(Menu menu) {
           if (Static.DEBUG) Log.i(TAG, "onCreateOptionsMenu()");
           return super.onCreateOptionsMenu(menu);
       }

       @Override
       protected void onPause() {
           super.onPause();
           if (Static.DEBUG) Log.i(TAG, "onPause()");
       }

       @Override
       protected void onStop() {
           super.onStop();
           if (Static.DEBUG) Log.i(TAG, "onStop()");
       }

       @Override
       protected void onDestroy() {
           super.onDestroy();
           if (Static.DEBUG) Log.i(TAG, "onDestroy()");

           recording = false;

           if (mPreview != null) {
               mPreview.stopPreview();
           }

           if (mCamera != null) {
               mCamera.stopPreview();
               mCamera.release();
               mCamera = null;
           }
       }

       /** Prepare camera object.
        * */
       private void prepareCamera() {
           //1. Open camera object
           try {
               mCamera = getCameraInstance(this);
           } catch (Exception e) {
               e.printStackTrace();
               Static.showToast(this, e.getMessage(), Toast.LENGTH_LONG);
               finish();
               return;
           }

           setOptimalCameraParams();

           //2. Connect Preview + 3. Start Preview + 8e Stop preview v metode surfaceChanged
           mPreview = new CameraView(this, mCamera);
           FrameLayout preview = (FrameLayout) findViewById(R.id.cameraPreview);
           preview.addView(mPreview); //surfaceView to FrameLayout
           if(Static.DEBUG) Log.i(TAG, "camera preview start: OK");
       }

       /**
        * A safe way to get an instance of the Camera object.
        */
       @SuppressWarnings("deprecation")
       public static Camera getCameraInstance(Context ctx) throws Exception {
           Camera c = Camera.open();
           if (c == null)
               throw new Exception(ctx.getString(R.string.errCameraNotAvailable));

           if(Static.DEBUG) Log.i(TAG, "camera open");
           return c; // attempt to get a Camera instance, otherwise null
       }

       /**Method set optmial parameters. Depend on HW possibilities.*/
       @SuppressWarnings("deprecation")
       private void setOptimalCameraParams(){

           // Camera parameters
           Camera.Parameters params = mCamera.getParameters();
           List<String> focusModes = params.getSupportedFocusModes();
           if (Static.DEBUG) Log.i(TAG, "focusModes():" + focusModes.toString());
           if (Static.DEBUG) Log.i(TAG, "Camera parameters:\n" + params.flatten());

           params.setRecordingHint(true); //MediaRecorder.start() to start faster

           //Automatically autofocus if it's possible
           if (params.getSupportedFocusModes().contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
               params.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
           } else if (focusModes.contains(Camera.Parameters.FOCUS_MODE_AUTO)) { //at least focus auto
               params.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
           }

           // set Camera parameters
           mCamera.setParameters(params);
       }

       //---------------------------------------
       // initialize ffmpeg_recorder
       //---------------------------------------
       private void initRecorder() throws Exception {
           if(Static.DEBUG) Log.i(TAG,"init recorder");

           File output = null;

           try {
               output = storage.getOutputMediaFile(storage.MEDIA_TYPE_VIDEO);
               if(output == null)
                   throw new Exception();
           } catch (Exception e) {
               e.printStackTrace();
               throw new Exception(getString(R.string.errSetOutputFile));
           }

           if (yuvImage == null) {
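               // Two 8-bit channels per pixel leaves room for the camera's NV21 preview
               // buffer (12 bits per pixel) that onPreviewFrame() copies into this Frame.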
               yuvImage = new Frame(previewSize.width, previewSize.height, Frame.DEPTH_UBYTE, 2);
               if(Static.DEBUG) Log.i(TAG, "create yuvImage");
           }

           if(Static.DEBUG) Log.i(TAG, "ffmpeg_url: " + output.getPath());
           recorder = new FFmpegFrameRecorder(output.getPath(), previewSize.width, previewSize.height, 1);
           //recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
           //recorder.setVideoOption("preset", "veryfast"); // or ultrafast or fast, etc.
           //recorder.setVideoQuality(0); // maximum quality, replace recorder.setVideoBitrate(16384);
           //recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
           recorder.setFormat("mp4");
           recorder.setSampleRate(sampleAudioRateInHz);
           recorder.setFrameRate(recorderFrameRate);

           if(Static.DEBUG) Log.i(TAG, "recorder initialize success");

           audioRecordRunnable = new AudioRecordRunnable();
           audioThread = new Thread(audioRecordRunnable);
           runAudioThread = true;
       }

       public void startRecording() {

           try {
               initRecorder();
           } catch (Exception e){
               e.printStackTrace();
               Static.showToast(this, e.getMessage(), Toast.LENGTH_LONG);
           }

           try {
               recorder.start();
               startTime = System.currentTimeMillis();
               recording = true;
               audioThread.start();

           } catch (FFmpegFrameRecorder.Exception e) {
               e.printStackTrace();
           }
       }

       public void stopRecording() {

           runAudioThread = false;
           try {
               audioThread.join();
           } catch (InterruptedException e) {
               // reset interrupt to be nice
               Thread.currentThread().interrupt();
               return;
           }
           audioRecordRunnable = null;
           audioThread = null;

           if (recorder != null && recording) {

               recording = false;
               if(Static.DEBUG) Log.i(TAG,"Finishing recording, calling stop and release on recorder");
               try {
                   recorder.stop();
                   recorder.release();
               } catch (FFmpegFrameRecorder.Exception e) {
                   e.printStackTrace();
               }
               recorder = null;

           }
       }

       @Override
       public boolean onKeyDown(int keyCode, KeyEvent event) {

           if (keyCode == KeyEvent.KEYCODE_BACK) {
               if (recording) {
                   stopRecording();
               }

               finish();

               return true;
           }

           return super.onKeyDown(keyCode, event);
       }

       public void onClickBtnStartRecord(View v) {
           if (!recording) {
               startRecording();
               if(Static.DEBUG) Log.i(TAG, "Start Button Pushed");
           } else {
               // This will trigger the audio recording loop to stop and then set isRecorderStart = false;
               stopRecording();
               if(Static.DEBUG) Log.i(TAG, "Stop Button Pushed");
           }
       }

       //---------------------------------------------
       // audio thread, gets and encodes audio data
       //---------------------------------------------
       class AudioRecordRunnable implements Runnable {

           @Override
           public void run() {
               android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

               // Audio
               int bufferSize;
               ShortBuffer audioData;
               int bufferReadResult;

               bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
               audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

               audioData = ShortBuffer.allocate(bufferSize);

               if(Static.DEBUG) Log.i(TAG, "audioRecord.startRecording()");
               audioRecord.startRecording();

               /* ffmpeg_audio encoding loop */
               while (runAudioThread) {
                   //if(Static.DEBUG) Log.i(TAG,"recording? " + recording);
                   bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
                   audioData.limit(bufferReadResult);
                   if (bufferReadResult > 0) {
                       //if(Static.DEBUG) Log.i(TAG,"bufferReadResult: " + bufferReadResult);
                       // If "recording" isn't true when start this thread, it never get's set according to this if statement...!!!
                       // Why?  Good question...
                       if (recording) {
                           try {
                               recorder.recordSamples(audioData);
                               //if(Static.DEBUG) Log.i(TAG,"recording " + 1024*i + " to " + 1024*i+1024);
                           } catch (FFmpegFrameRecorder.Exception e) {
                               if(Static.DEBUG) Log.i(TAG,e.getMessage());
                               e.printStackTrace();
                           }
                       }
                   }
               }
               if(Static.DEBUG) Log.i(TAG,"AudioThread Finished, release audioRecord");

               /* encoding finish, release recorder */
               if (audioRecord != null) {
                   audioRecord.stop();
                   audioRecord.release();
                   audioRecord = null;
                   if(Static.DEBUG) Log.i(TAG,"audioRecord released");
               }
           }
       }

       /**TODO*/
       private Camera.Size getBestPreviewSize(int width, int height, Camera.Parameters parameters) {
           Camera.Size result=null;

           for (Camera.Size size : parameters.getSupportedPreviewSizes()) {

               if(Static.DEBUG) Log.i(TAG, size.width + "/" + size.height);

               if (size.width <= width && size.height <= height) {
                   if (result==null) {
                       result=size;
                   } else {
                       int resultArea=result.width*result.height;
                       int newArea=size.width*size.height;

                       if (newArea>resultArea) {
                           result=size;
                       }
                   }
               }
           }

           return(result);
       }

       //---------------------------------------------
       // camera thread, gets and encodes video data
       //---------------------------------------------
       private class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {

           private final String TAG = CameraView.class.getSimpleName();

           private SurfaceHolder mHolder;
           private Camera mCamera;

           public CameraView(Context context, Camera camera) {
               super(context);
               if(Static.DEBUG) Log.i(TAG, "camera view");

               mCamera = camera;

               // Install a SurfaceHolder.Callback so we get notified when the
               // underlying surface is created and destroyed.
               mHolder = getHolder();
               mHolder.addCallback(CameraView.this);
               mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
               mCamera.setPreviewCallback(CameraView.this);
           }

           @Override
           public void surfaceCreated(SurfaceHolder holder) {
               try {
                   stopPreview();
                   mCamera.setPreviewDisplay(holder);
               } catch (IOException exception) {
                   mCamera.release();
                   mCamera = null;
               }
           }

           public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
               if(Static.DEBUG) Log.i(TAG, "surfaceChanged() => w=" + w + ", h=" + h);

               // If your preview can change or rotate, take care of those events here.
               // Make sure to stop the preview before resizing or reformatting it.

               if (mHolder.getSurface() == null){
                   // preview surface does not exist
                   return;
               }

               // stop preview before making changes
               try {
                   stopPreview();
               } catch (Exception e){
                   // ignore: tried to stop a non-existent preview
               }

               // start preview with new settings
               try {
                   Camera.Parameters params = mCamera.getParameters();

                   previewSize = getBestPreviewSize(w, h, params);
                   if(Static.DEBUG) Log.i(TAG, "getBestPreviewSize() => w=" + previewSize.width + ", h=" + previewSize.height);
                   if (previewSize != null)
                       params.setPreviewSize(previewSize.width, previewSize.height);

                   params.setPreviewFrameRate(recorderFrameRate);
                   if(Static.DEBUG) Log.i(TAG,"Preview Framerate: " + params.getPreviewFrameRate());
                   mCamera.setParameters(params);
                   mCamera.setPreviewDisplay(holder);
                   mCamera.setPreviewCallback(CameraView.this);
                   startPreview();
               } catch (Exception e){
                   if(Static.DEBUG) Log.i(TAG, "Could not set preview display in surfaceChanged");
                   e.printStackTrace();
               }

           }

           @Override
           public void surfaceDestroyed(SurfaceHolder holder) {
               try {
                   mHolder.addCallback(null);
                   mCamera.setPreviewCallback(null);
               } catch (RuntimeException e) {
                   // The camera has probably just been released, ignore.
               }
           }

           public void startPreview() {
               if (!isPreviewOn && mCamera != null) {
                   isPreviewOn = true;
                   mCamera.startPreview();
               }
           }

           public void stopPreview() {
               if (isPreviewOn && mCamera != null) {
                   isPreviewOn = false;
                   mCamera.stopPreview();
               }
           }

           @Override
           public void onPreviewFrame(byte[] data, Camera camera) {
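               // data holds the raw camera preview frame (NV21 by default). Views drawn over the
               // SurfaceView are composited by the window system and never reach this buffer,
               // which is why overlays drawn in the UI do not end up in the recording.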
               if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
                   startTime = System.currentTimeMillis();
                   return;
               }

               // get video data
               if (yuvImage != null && recording) {
                   ((ByteBuffer)yuvImage.image[0].position(0)).put(data);

                   try {
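                       // elapsed wall-clock time since recorder.start(), scaled to microseconds
                       // (FFmpegFrameRecorder timestamps are expressed in microseconds)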
                       long t = 1000 * (System.currentTimeMillis() - startTime);
                       if(Static.DEBUG) Log.i(TAG,"Writing Frame on timestamp: "+t);
                       if (t > recorder.getTimestamp()) {
                           recorder.setTimestamp(t);
                       }
                       recorder.record(yuvImage);
                   } catch (FFmpegFrameRecorder.Exception e) {
                       if(Static.DEBUG) Log.i(TAG,e.getMessage());
                       e.printStackTrace();
                   }
               }
           }
       }
    }

    activity_opencv.xml:

    <?xml version="1.0" encoding="utf-8"?>

    <RelativeLayout>

       <FrameLayout>
       </FrameLayout>

       <ImageButton
           android:id="@+id/btnStartRecord"
           android:layout_width="70dp"
           android:layout_height="70dp"
           android:scaleType="fitXY"
           android:src="@drawable/record_icon"
           android:background="@null"
           android:text="@string/btnStartRecord"
           android:onClick="onClickBtnStartRecord"
           android:clickable="true"
           android:layout_centerVertical="true"
           android:layout_alignParentRight="true"
           android:layout_alignParentEnd="true"/>

       <TextView></TextView>

    </RelativeLayout>

    The overlay views are displayed during preview, but the recorded video does not include them.

  • FFmpeg: avconv and ffserver - connection reset by peer

    5 March 2016, by einsA

    I’m trying to stream a video via ffserver. My dev environment is a Raspberry Pi 2.

    I'm getting an av_interleaved_write_frame(): Connection reset by peer error with the following approach:

    pi@raspberrypi:/opt/livefeed $ avconv -f mjpeg -i movie.mjpg -r 20 -b 2000k -vcodec libx264 -preset ultrafast -f flv -metadata streamName="mjpeg_to_flv" -metadata title="converting mjpeg to flv via avconv" http://localhost:8090/feed1.ffm
    avconv version 11.4-6:11.4-1~deb8u1+rpi1, Copyright (c) 2000-2014 the Libav developers
     built on Jun 16 2015 05:32:34 with gcc 4.9.2 (Raspbian 4.9.2-10)
    [mjpeg @ 0x20461e0] Estimating duration from bitrate, this may be inaccurate
    Input #0, mjpeg, from 'movie.mjpg':
     Duration: N/A, bitrate: N/A
       Stream #0.0: Video: mjpeg, yuvj422p, 768x512, 25 fps, 25 tbn
    [tcp @ 0x2050bc0] Connection to tcp://localhost:8090 failed (Connection refused), trying next address
    [libx264 @ 0x20501a0] using cpu capabilities: none!
    [libx264 @ 0x20501a0] profile Constrained Baseline, level 3.0
    [libx264 @ 0x20501a0] 264 - core 142 r2431 a5831aa - H.264/MPEG-4 AVC codec - Copyleft 2003-2014 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=0:0:0 analyse=0:0 me=dia subme=0 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=250 keyint_min=20 scenecut=0 intra_refresh=0 rc=abr mbtree=0 bitrate=2000 ratetol=1.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=0
    Output #0, flv, to 'http://localhost:8090/feed1.ffm':
     Metadata:
       streamName      : mjpeg_to_flv
       title           : converting mjpeg to flv via avconv
       encoder         : Lavf56.1.0
       Stream #0.0: Video: libx264, yuv420p, 768x512, q=-1--1, 2000 kb/s, 20 fps, 1k tbn, 20 tbc
       Metadata:
         encoder         : Lavc56.1.0 libx264
    Stream mapping:
     Stream #0:0 -> #0:0 (mjpeg (native) -> h264 (libx264))
    Press ctrl-c to stop encoding
    av_interleaved_write_frame(): Connection reset by peer0000.00 bitrate=   0.0kbits/s

    I am able to stream the file when doing it step by step, with the following settings and commands:

    ffserver

    ffserver config:

    pi@raspberrypi:/opt/livefeed $ cat ffserver_flv.conf
    # FLV live feed

    Port 8090
    BindAddress 0.0.0.0
    MaxHTTPConnections 2000
    MaxClients 1000
    MaxBandwidth 1000
    CustomLog -

    <stream>
     Format status
     ACL allow localhost
     ACL allow 192.168.0.0 192.168.255.255
    </stream>

    <feed>
     File /tmp/feed_flv.ffm
     FileMaxSize 1G
     ACL allow 127.0.0.1
    </feed>

    <stream>
     Format flv
     Feed feed1.ffm

     VideoCodec libx264
     VideoFrameRate 30
     VideoBitRate 800
     VideoSize 720x576
     AVOptionVideo crf 23
     AVOptionVideo preset medium
     AVOptionVideo me_range 16
     AVOptionVideo qdiff 4
     AVOptionVideo qmin 10
     AVOptionVideo qmax 51
     AVOptionVideo flags +global_header

     NoAudio
    </stream>

    starting the server with:

    pi@raspberrypi:/opt/livefeed $ ffserver -v debug -f ffserver_flv.conf
    ffserver version N-78909-gfb2f164 Copyright (c) 2000-2016 the FFmpeg developers
     built with gcc 4.9.2 (Raspbian 4.9.2-10)
     configuration: --arch=armel --target-os=linux --enable-gpl --enable-libx264 --enable-nonfree --enable-libass --enable-libfaac --enable-libmp3lame --enable-libfreetype
     libavutil      55. 19.100 / 55. 19.100
     libavcodec     57. 27.101 / 57. 27.101
     libavformat    57. 28.100 / 57. 28.100
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 39.100 /  6. 39.100
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    ffserver_flv.conf:6: Port option is deprecated. Use HTTPPort instead.
    ffserver_flv.conf:7: BindAddress option is deprecated. Use HTTPBindAddress instead.
    ffserver_flv.conf:48: Setting default value for video bit rate tolerance = 200000. Use NoDefaults to disable it.
    ffserver_flv.conf:48: Setting default value for video rate control equation = tex^qComp. Use NoDefaults to disable it.
    ffserver_flv.conf:48: Setting default value for video max rate = 49537952. Use NoDefaults to disable it.
    ffserver_flv.conf:48: Setting default value for video buffer size = 1600000. Use NoDefaults to disable it.
    Sat Mar  5 13:29:46 2016 [file @ 0x2f3d0d0]Setting default whitelist 'file'
    Sat Mar  5 13:29:46 2016 [ffm @ 0x2f40bc0]Format ffm probed with size=2048 and score=101
    Sat Mar  5 13:29:46 2016 [NULL @ 0x2f43760]Setting entry with key 'time_base' to value '1/30'
    Sat Mar  5 13:29:46 2016 [NULL @ 0x2f43760]Setting entry with key 'b' to value '800000'
    Sat Mar  5 13:29:46 2016 [NULL @ 0x2f43760]Setting entry with key 'video_size' to value '720x576'
    Sat Mar  5 13:29:46 2016 [NULL @ 0x2f43760]Setting entry with key 'me_range' to value '16'
    Sat Mar  5 13:29:46 2016 [NULL @ 0x2f43760]Setting entry with key 'qdiff' to value '4'
    Sat Mar  5 13:29:46 2016 [NULL @ 0x2f43760]Setting entry with key 'qmin' to value '10'
    Sat Mar  5 13:29:46 2016 [NULL @ 0x2f43760]Setting entry with key 'qmax' to value '51'
    Sat Mar  5 13:29:46 2016 [NULL @ 0x2f43760]Setting entry with key 'flags' to value '+global_header'
    Sat Mar  5 13:29:46 2016 [NULL @ 0x2f43760]Setting entry with key 'bt' to value '200000'
    Sat Mar  5 13:29:46 2016 [NULL @ 0x2f43760]Setting entry with key 'rc_eq' to value 'tex^qComp'
    Sat Mar  5 13:29:46 2016 [NULL @ 0x2f43760]Setting entry with key 'maxrate' to value '1600000'
    Sat Mar  5 13:29:46 2016 [NULL @ 0x2f43760]Setting entry with key 'bufsize' to value '1600000'
    Sat Mar  5 13:29:46 2016 [AVIOContext @ 0x2f41450]Statistics: 32768 bytes read, 0 seeks
    Sat Mar  5 13:29:46 2016 FFserver started.

    avconv

    I'm converting an MJPEG movie (movie.mjpg) to FLV (movie.flv) via avconv:

    pi@raspberrypi:/opt/livefeed $ avconv -f mjpeg -i movie.mjpg -r 20 -b 2000k -vcodec libx264 -preset ultrafast -f flv -metadata streamName="mjpeg_to_flv" -metadata title="converting mjpeg to flv via avconv" movie.flv
    avconv version 11.4-6:11.4-1~deb8u1+rpi1, Copyright (c) 2000-2014 the Libav developers
     built on Jun 16 2015 05:32:34 with gcc 4.9.2 (Raspbian 4.9.2-10)
    [mjpeg @ 0x34f1e0] Estimating duration from bitrate, this may be inaccurate
    Input #0, mjpeg, from 'movie.mjpg':
     Duration: N/A, bitrate: N/A
       Stream #0.0: Video: mjpeg, yuvj422p, 768x512, 25 fps, 25 tbn
    [libx264 @ 0x3591a0] using cpu capabilities: none!
    [libx264 @ 0x3591a0] profile Constrained Baseline, level 3.0
    [libx264 @ 0x3591a0] 264 - core 142 r2431 a5831aa - H.264/MPEG-4 AVC codec - Copyleft 2003-2014 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=0:0:0 analyse=0:0 me=dia subme=0 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=250 keyint_min=20 scenecut=0 intra_refresh=0 rc=abr mbtree=0 bitrate=2000 ratetol=1.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=0
    Output #0, flv, to 'movie.flv':
     Metadata:
       streamName      : mjpeg_to_flv
       title           : converting mjpeg to flv via avconv
       encoder         : Lavf56.1.0
       Stream #0.0: Video: libx264, yuv420p, 768x512, q=-1--1, 2000 kb/s, 20 fps, 1k tbn, 20 tbc
       Metadata:
         encoder         : Lavc56.1.0 libx264
    Stream mapping:
     Stream #0:0 -> #0:0 (mjpeg (native) -> h264 (libx264))
    Press ctrl-c to stop encoding
    frame=   93 fps=  9 q=-1.0 Lsize=    1078kB time=4.60 bitrate=1919.3kbits/s    ts/s    
    video:1076kB audio:0kB other streams:0kB global headers:0kB muxing overhead: 0.201471%
    [libx264 @ 0x3591a0] frame I:1     Avg QP:19.00  size: 35978
    [libx264 @ 0x3591a0] frame P:92    Avg QP:21.43  size: 11574
    [libx264 @ 0x3591a0] mb I  I16..4: 100.0%  0.0%  0.0%
    [libx264 @ 0x3591a0] mb P  I16..4: 25.9%  0.0%  0.0%  P16..4: 53.6%  0.0%  0.0%  0.0%  0.0%    skip:20.5%
    [libx264 @ 0x3591a0] final ratefactor: 20.82
    [libx264 @ 0x3591a0] coded y,uvDC,uvAC intra: 28.9% 46.9% 18.2% inter: 33.7% 39.8% 2.8%
    [libx264 @ 0x3591a0] i16 v,h,dc,p: 29% 30% 28% 14%
    [libx264 @ 0x3591a0] i8c dc,h,v,p: 53% 26% 15%  7%
    [libx264 @ 0x3591a0] kb/s:1893.82

    send flv to ffserver

    pi@raspberrypi:/opt/livefeed $ ffmpeg -stream_loop -1 -i movie.flv http://localhost:8090/feed1.ffm
    ffmpeg version N-78909-gfb2f164 Copyright (c) 2000-2016 the FFmpeg developers
     built with gcc 4.9.2 (Raspbian 4.9.2-10)
     configuration: --arch=armel --target-os=linux --enable-gpl --enable-libx264 --enable-nonfree --enable-libass --enable-libfaac --enable-libmp3lame --enable-libfreetype
     libavutil      55. 19.100 / 55. 19.100
     libavcodec     57. 27.101 / 57. 27.101
     libavformat    57. 28.100 / 57. 28.100
     libavdevice    57.  0.101 / 57.  0.101
     libavfilter     6. 39.100 /  6. 39.100
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.101 /  2.  0.101
     libpostproc    54.  0.100 / 54.  0.100
    Input #0, flv, from 'movie.flv':
     Metadata:
       streamName      : mjpeg_to_flv
       title           : converting mjpeg to flv via avconv
       encoder         : Lavf56.1.0
     Duration: 00:00:04.65, start: 0.000000, bitrate: 1898 kb/s
       Stream #0:0: Video: h264 (Constrained Baseline), yuv420p, 768x512, 2000 kb/s, 20 fps, 20 tbr, 1k tbn, 40 tbc
    [tcp @ 0x30d2920] Connection to tcp://localhost:8090 failed (Connection refused), trying next address
    [tcp @ 0x329a080] Connection to tcp://localhost:8090 failed (Connection refused), trying next address
    [libx264 @ 0x325fb30] using cpu capabilities: ARMv6 NEON
    [libx264 @ 0x325fb30] profile High, level 3.1
    [libx264 @ 0x325fb30] 264 - core 148 r2665 a01e339 - H.264/MPEG-4 AVC codec - Copyleft 2003-2016 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=10 qpmax=51 qpstep=4 vbv_maxrate=1600 vbv_bufsize=1600 crf_max=0.0 nal_hrd=none filler=0 ip_ratio=1.40 aq=1:1.00
    Output #0, ffm, to 'http://localhost:8090/feed1.ffm':
     Metadata:
       streamName      : mjpeg_to_flv
       title           : converting mjpeg to flv via avconv
       creation_time   : now
       encoder         : Lavf57.28.100
       Stream #0:0: Video: h264 (libx264), yuv420p, 720x576, q=10-51, 800 kb/s, 20 fps, 1000k tbn, 30 tbc
       Metadata:
         encoder         : Lavc57.27.101 libx264
       Side data:
         cpb: bitrate max/min/avg: 1600000/0/800000 buffer size: 1600000 vbv_delay: -1
    Stream mapping:
     Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
    Press [q] to stop, [?] for help
    Past duration 0.999992 too large
       Last message repeated 2 times
    Past duration 0.999992 too large      4kB time=00:00:00.00 bitrate=N/A dup=5 drop=0 speed=   0x    
       Last message repeated 3 times
    Past duration 0.999992 too large      4kB time=00:00:00.00 bitrate=N/A dup=9 drop=0 speed=  
    [cropped]
    video:968kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 3.719636%
    [libx264 @ 0x325fb30] frame I:11    Avg QP:22.49  size: 13888
    [libx264 @ 0x325fb30] frame P:487   Avg QP:16.10  size:  1382
    [libx264 @ 0x325fb30] frame B:980   Avg QP:21.31  size:   168
    [libx264 @ 0x325fb30] consecutive B-frames: 11.0%  1.8%  0.4% 86.9%
    [libx264 @ 0x325fb30] mb I  I16..4: 11.5% 81.2%  7.3%
    [libx264 @ 0x325fb30] mb P  I16..4:  1.6%  5.0%  0.2%  P16..4: 12.0%  1.6%  0.7%  0.0%  0.0%    skip:79.0%
    [libx264 @ 0x325fb30] mb B  I16..4:  0.0%  0.0%  0.0%  B16..8:  5.8%  0.1%  0.0%  direct: 0.0%  skip:94.1%  L0:55.3% L1:44.4% BI: 0.3%
    [libx264 @ 0x325fb30] 8x8 transform intra:75.4% inter:88.7%
    [libx264 @ 0x325fb30] coded y,uvDC,uvAC intra: 38.6% 48.2% 14.9% inter: 1.5% 2.3% 0.0%
    [libx264 @ 0x325fb30] i16 v,h,dc,p: 28% 32% 11% 28%
    [libx264 @ 0x325fb30] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 25% 21% 31%  3%  4%  4%  4%  4%  4%
    [libx264 @ 0x325fb30] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 23% 25% 23%  4%  7%  5%  7%  4%  3%
    [libx264 @ 0x325fb30] i8c dc,h,v,p: 63% 20% 14%  4%
    [libx264 @ 0x325fb30] Weighted P-Frames: Y:1.6% UV:1.2%
    [libx264 @ 0x325fb30] ref P L0: 67.5% 11.6% 13.8%  6.8%  0.3%
    [libx264 @ 0x325fb30] ref B L0: 58.7% 40.4%  0.9%
    [libx264 @ 0x325fb30] ref B L1: 96.2%  3.8%
    [libx264 @ 0x325fb30] kb/s:160.83
    Exiting normally, received signal 2.

    On another machine I can open the FLV stream in VLC at http://ip.running.ffserver:8090/live.flv.

    So why does it work step by step via the movie.flv file but not directly?

    What is the difference between

    • avconv -f mjpeg -i movie.mjpg -r 20 -b 2000k -vcodec libx264 -preset ultrafast -f flv -metadata streamName="mjpeg_to_flv" -metadata title="converting mjpeg to flv via avconv" movie.flv
      followed by
      ffmpeg -stream_loop -1 -i movie.flv http://localhost:8090/feed1.ffm and

    • avconv -f mjpeg -i movie.mjpg -r 20 -b 2000k -vcodec libx264 -preset ultrafast -f flv -metadata streamName="mjpeg_to_flv" -metadata title="converting mjpeg to flv via avconv" http://localhost:8090/feed1.ffm ? Why does this command end with av_interleaved_write_frame(): Connection reset by peer ?
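
    For what it's worth, the failing command differs from the working two-step pipeline in that it forces -f flv onto the feed URL, while the working push (ffmpeg -stream_loop -1 -i movie.flv http://localhost:8090/feed1.ffm) lets the output format be guessed from the .ffm extension; ffserver feeds expect its native ffm format. An untested variant that keeps the direct approach but drops the forced format (an assumption, not a verified fix) would be:

      ffmpeg -f mjpeg -i movie.mjpg -r 20 -b 2000k -vcodec libx264 -preset ultrafast http://localhost:8090/feed1.ffm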


    Information and logs

    • ffserver log (both times):

      Sat Mar  5 13:54:08 2016 127.0.0.1 - - [POST] "/feed1.ffm HTTP/1.1" 200 4096
    • ffserver version

      pi@raspberrypi:/opt/livefeed $ ffserver -version
      ffserver version N-78909-gfb2f164 Copyright (c) 2000-2016 the FFmpeg developers
      built with gcc 4.9.2 (Raspbian 4.9.2-10)
      configuration: --arch=armel --target-os=linux --enable-gpl --enable-libx264 --enable-nonfree --enable-libass --enable-libfaac --enable-libmp3lame --enable-libfreetype
      libavutil      55. 19.100 / 55. 19.100
      libavcodec     57. 27.101 / 57. 27.101
      libavformat    57. 28.100 / 57. 28.100
      libavdevice    57.  0.101 / 57.  0.101
      libavfilter     6. 39.100 /  6. 39.100
      libswscale      4.  0.100 /  4.  0.100
      libswresample   2.  0.101 /  2.  0.101
      libpostproc    54.  0.100 / 54.  0.100
    • ffmpeg version

      pi@raspberrypi:/opt/livefeed $ ffmpeg -version
      ffmpeg version N-78909-gfb2f164 Copyright (c) 2000-2016 the FFmpeg developers
      built with gcc 4.9.2 (Raspbian 4.9.2-10)
      configuration: --arch=armel --target-os=linux --enable-gpl --enable-libx264 --enable-nonfree --enable-libass --enable-libfaac --enable-libmp3lame --enable-libfreetype
      libavutil      55. 19.100 / 55. 19.100
      libavcodec     57. 27.101 / 57. 27.101
      libavformat    57. 28.100 / 57. 28.100
      libavdevice    57.  0.101 / 57.  0.101
      libavfilter     6. 39.100 /  6. 39.100
      libswscale      4.  0.100 /  4.  0.100
      libswresample   2.  0.101 /  2.  0.101
      libpostproc    54.  0.100 / 54.  0.100
    • avconv version:

      pi@raspberrypi:/opt/livefeed $ avconv -version
      avconv version 11.4-6:11.4-1~deb8u1+rpi1, Copyright (c) 2000-2014 the Libav developers
      built on Jun 16 2015 05:32:34 with gcc 4.9.2 (Raspbian 4.9.2-10)
      avconv 11.4-6:11.4-1~deb8u1+rpi1
      libavutil     54.  3. 0 / 54.  3. 0
      libavcodec    56.  1. 0 / 56.  1. 0
      libavformat   56.  1. 0 / 56.  1. 0
      libavdevice   55.  0. 0 / 55.  0. 0
      libavfilter    5.  0. 0 /  5.  0. 0
      libavresample  2.  1. 0 /  2.  1. 0
      libswscale     3.  0. 0 /  3.  0. 0

    Thanks and cheers

    Clemens