
Media (91)

Other articles (95)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed "usable".
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
    Once it is activated, MediaSPIP init automatically applies a preconfiguration so that the new feature is immediately operational. No separate configuration step is therefore required.

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player was created specifically for MediaSPIP and can easily be adapted to fit a particular theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (9156)

  • FFMpeg colorkey/chromakey

    29 February 2016, by Rafał Kowalczyk

    I’m trying to achieve the following effect:

    1. I have a static background texture (for example: a sky).
    2. Using the FFmpeg and SDL 2.0 libraries, I am playing an .avi file of an explosion in which every pixel that should be transparent is plain black.
    3. I want to draw only the non-transparent pixels onto my background.

    I thought I could color-key a single FFmpeg frame, but in SDL 2.0 color keying (SDL_SetColorKey) works only on surfaces, and FFmpeg delivers frames in the YUV420 pixel format.

    How should I proceed? Any code suggestions?
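    One way to realize the effect described above is straight color keying in software: copy a foreground pixel onto the background only when it differs from the key color. The sketch below shows that rule in plain Java over ARGB int arrays; the class and method names are hypothetical, and since FFmpeg decodes to YUV420 the frames would first have to be converted to RGB (for example with libswscale).

    ```java
    import java.util.Arrays;

    // Minimal color-key compositing sketch: copy every foreground pixel that
    // is NOT the key color onto the background. ARGB ints are an assumption;
    // real FFmpeg frames are YUV420 and need conversion to RGB first.
    public final class ColorKey {
        static int[] composite(int[] background, int[] foreground, int keyColor) {
            int[] out = Arrays.copyOf(background, background.length);
            for (int i = 0; i < foreground.length; i++) {
                if (foreground[i] != keyColor) { // non-key pixel: draw it
                    out[i] = foreground[i];
                }
            }
            return out;
        }
    }
    ```

    With black (0xFF000000) as the key color, the background shows through wherever the explosion frame is black, which matches the effect the question asks for.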

  • Android recording video with overlay view [way 1]

    2 March 2016, by t0m

    I am building an Android app that captures video with overlay views. I have tried two approaches (1 and 2):
    1. Via SurfaceView and JavaCV with FFmpeg.
    2. Via OpenCV and JavaCV with FFmpeg.
    3. For API 21+, possibly via MediaProjection.

    (The question is split into two parts because of Stack Overflow's length limit.)

    ad 1. Via SurfaceView and JavaCV with FFmpeg:

    OpenCVCameraActivity.java:

    import android.app.Activity;
    import android.content.Context;
    import android.hardware.Camera;
    import android.hardware.Camera.PreviewCallback;
    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;
    import android.os.Bundle;
    import android.util.Log;
    import android.view.KeyEvent;
    import android.view.Menu;
    import android.view.SurfaceHolder;
    import android.view.SurfaceView;
    import android.view.View;
    import android.widget.FrameLayout;
    import android.widget.Toast;

    import org.bytedeco.javacv.FFmpegFrameRecorder;
    import org.bytedeco.javacv.Frame;

    import java.io.File;
    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.nio.ShortBuffer;
    import java.util.List;


    @SuppressWarnings("ALL")
    public class OpenCVCameraActivity extends Activity {

       private static final String TAG = OpenCVCameraActivity.class.getSimpleName();

       private long startTime = 0;
       private boolean isPreviewOn = false;

       private int sampleAudioRateInHz = 44100;
       private Camera.Size previewSize;        //preview and Camera and Recorder width and height
       private int recorderFrameRate = 25;

       // audio data getting thread
       private AudioRecord audioRecord;
       private AudioRecordRunnable audioRecordRunnable;
       private Thread audioThread;
       private volatile boolean runAudioThread = true;

       // video data getting thread
       private Camera mCamera;
       private CameraView mPreview;
       private FFmpegFrameRecorder recorder;
       private boolean recording = false;
       private Frame yuvImage = null;

       //storage
       private Storage storage;

       @Override
       public void onCreate(Bundle savedInstanceState) {
           super.onCreate(savedInstanceState);
           if(Static.DEBUG) Log.i(TAG, "onCreate()");

           Thread.setDefaultUncaughtExceptionHandler(uncaughtExceptionHandler);

           setContentView(R.layout.activity_opencv);

           prepareCamera();
       }

       private Thread.UncaughtExceptionHandler uncaughtExceptionHandler =
               new Thread.UncaughtExceptionHandler() {
                   public void uncaughtException(Thread thread, Throwable ex) {
                       if(Static.DEBUG) Log.e(TAG, "Uncaught exception", ex);
                   }
               };

       @Override
       protected void onRestart() {
           super.onRestart();
           if (Static.DEBUG) Log.i(TAG, "onRestart()");
       }

       @Override
       protected void onStart() {
           super.onStart();
           if (Static.DEBUG) Log.i(TAG, "onStart()");
       }

       @Override
       protected void onResume() {
           super.onResume();
           if (Static.DEBUG) Log.i(TAG, "onResume()");

           storage = new Storage(this);
           if(storage.mExternalStorageAvailable == true && storage.mExternalStorageWriteable == false)
               Static.showToast(this, getString(R.string.errExternalStorageReadOnly), Toast.LENGTH_LONG);
           else if (storage.mExternalStorageAvailable == false && storage.mExternalStorageWriteable == false)
               Static.showToast(this, getString(R.string.errExternalStorage), Toast.LENGTH_LONG);
       }

       @Override
       public boolean onCreateOptionsMenu(Menu menu) {
           if (Static.DEBUG) Log.i(TAG, "onCreateOptionsMenu()");
           return super.onCreateOptionsMenu(menu);
       }

       @Override
       protected void onPause() {
           super.onPause();
           if (Static.DEBUG) Log.i(TAG, "onPause()");
       }

       @Override
       protected void onStop() {
           super.onStop();
           if (Static.DEBUG) Log.i(TAG, "onStop()");
       }

       @Override
       protected void onDestroy() {
           super.onDestroy();
           if (Static.DEBUG) Log.i(TAG, "onDestroy()");

           recording = false;

           if (mPreview != null) {
               mPreview.stopPreview();
           }

           if (mCamera != null) {
               mCamera.stopPreview();
               mCamera.release();
               mCamera = null;
           }
       }

       /** Prepare camera object.
        * */
       private void prepareCamera() {
           //1. Open camera object
           try {
               mCamera = getCameraInstance(this);
           } catch (Exception e) {
               e.printStackTrace();
               Static.showToast(this, e.getMessage(), Toast.LENGTH_LONG);
               finish();
               return;
           }

           setOptimalCameraParams();

           //2. Connect preview + 3. Start preview (the preview is stopped in surfaceChanged)
           mPreview = new CameraView(this, mCamera);
           FrameLayout preview = (FrameLayout) findViewById(R.id.cameraPreview);
           preview.addView(mPreview); //surfaceView to FrameLayout
           if(Static.DEBUG) Log.i(TAG, "camera preview start: OK");
       }

       /**
        * A safe way to get an instance of the Camera object.
        */
       @SuppressWarnings("deprecation")
       public static Camera getCameraInstance(Context ctx) throws Exception {
           Camera c = Camera.open();
           if (c == null)
               throw new Exception(ctx.getString(R.string.errCameraNotAvailable));

           if(Static.DEBUG) Log.i(TAG, "camera open");
           return c; // camera successfully opened; otherwise an exception was thrown above
       }

       /** Set optimal camera parameters, depending on hardware capabilities. */
       @SuppressWarnings("deprecation")
       private void setOptimalCameraParams(){

           // Camera parameters
           Camera.Parameters params = mCamera.getParameters();
           List<String> focusModes = params.getSupportedFocusModes();
           if (Static.DEBUG) Log.i(TAG, "focusModes():" + focusModes.toString());
           if (Static.DEBUG) Log.i(TAG, "Camera parameters:\n" + params.flatten());

           params.setRecordingHint(true); //MediaRecorder.start() to start faster

           //Automatically autofocus if it's possible
           if (params.getSupportedFocusModes().contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
               params.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
           } else if (focusModes.contains(Camera.Parameters.FOCUS_MODE_AUTO)) { //at least focus auto
               params.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
           }

           // set Camera parameters
           mCamera.setParameters(params);
       }

       //---------------------------------------
       // initialize ffmpeg_recorder
       //---------------------------------------
       private void initRecorder() throws Exception {
           if(Static.DEBUG) Log.i(TAG,"init recorder");

           File output = null;

           try {
               output = storage.getOutputMediaFile(storage.MEDIA_TYPE_VIDEO);
               if(output == null)
                   throw new Exception();
           } catch (Exception e) {
               e.printStackTrace();
               throw new Exception(getString(R.string.errSetOutputFile));
           }

           if (yuvImage == null) {
               yuvImage = new Frame(previewSize.width, previewSize.height, Frame.DEPTH_UBYTE, 2);
               if(Static.DEBUG) Log.i(TAG, "create yuvImage");
           }

           if(Static.DEBUG) Log.i(TAG, "ffmpeg_url: " + output.getPath());
           recorder = new FFmpegFrameRecorder(output.getPath(), previewSize.width, previewSize.height, 1);
           //recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
           //recorder.setVideoOption("preset", "veryfast"); // or ultrafast or fast, etc.
           //recorder.setVideoQuality(0); // maximum quality, replace recorder.setVideoBitrate(16384);
           //recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
           recorder.setFormat("mp4");
           recorder.setSampleRate(sampleAudioRateInHz);
           recorder.setFrameRate(recorderFrameRate);

           if(Static.DEBUG) Log.i(TAG, "recorder initialize success");

           audioRecordRunnable = new AudioRecordRunnable();
           audioThread = new Thread(audioRecordRunnable);
           runAudioThread = true;
       }

       public void startRecording() {

           try {
               initRecorder();
           } catch (Exception e){
               e.printStackTrace();
               Static.showToast(this, e.getMessage(), Toast.LENGTH_LONG);
           }

           try {
               recorder.start();
               startTime = System.currentTimeMillis();
               recording = true;
               audioThread.start();

           } catch (FFmpegFrameRecorder.Exception e) {
               e.printStackTrace();
           }
       }

       public void stopRecording() {

           runAudioThread = false;
           try {
               audioThread.join();
           } catch (InterruptedException e) {
               // reset interrupt to be nice
               Thread.currentThread().interrupt();
               return;
           }
           audioRecordRunnable = null;
           audioThread = null;

           if (recorder != null && recording) {

               recording = false;
               if(Static.DEBUG) Log.i(TAG,"Finishing recording, calling stop and release on recorder");
               try {
                   recorder.stop();
                   recorder.release();
               } catch (FFmpegFrameRecorder.Exception e) {
                   e.printStackTrace();
               }
               recorder = null;

           }
       }

       @Override
       public boolean onKeyDown(int keyCode, KeyEvent event) {

           if (keyCode == KeyEvent.KEYCODE_BACK) {
               if (recording) {
                   stopRecording();
               }

               finish();

               return true;
           }

           return super.onKeyDown(keyCode, event);
       }

       public void onClickBtnStartRecord(View v) {
           if (!recording) {
               startRecording();
               if(Static.DEBUG) Log.i(TAG, "Start Button Pushed");
           } else {
               // This will trigger the audio recording loop to stop and then set isRecorderStart = false;
               stopRecording();
               if(Static.DEBUG) Log.i(TAG, "Stop Button Pushed");
           }
       }

       //---------------------------------------------
       // audio thread, gets and encodes audio data
       //---------------------------------------------
       class AudioRecordRunnable implements Runnable {

           @Override
           public void run() {
               android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

               // Audio
               int bufferSize;
               ShortBuffer audioData;
               int bufferReadResult;

               bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
               audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

               audioData = ShortBuffer.allocate(bufferSize);

               if(Static.DEBUG) Log.i(TAG, "audioRecord.startRecording()");
               audioRecord.startRecording();

               /* ffmpeg_audio encoding loop */
               while (runAudioThread) {
                   //if(Static.DEBUG) Log.i(TAG,"recording? " + recording);
                   bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
                   audioData.limit(bufferReadResult);
                   if (bufferReadResult > 0) {
                       //if(Static.DEBUG) Log.i(TAG,"bufferReadResult: " + bufferReadResult);
                       // If "recording" isn't true when this thread starts, it never gets set according to this if statement...!!!
                       // Why? Good question...
                       if (recording) {
                           try {
                               recorder.recordSamples(audioData);
                               //if(Static.DEBUG) Log.i(TAG,"recording " + 1024*i + " to " + 1024*i+1024);
                           } catch (FFmpegFrameRecorder.Exception e) {
                               if(Static.DEBUG) Log.i(TAG,e.getMessage());
                               e.printStackTrace();
                           }
                       }
                   }
               }
               if(Static.DEBUG) Log.i(TAG,"AudioThread Finished, release audioRecord");

               /* encoding finish, release recorder */
               if (audioRecord != null) {
                   audioRecord.stop();
                   audioRecord.release();
                   audioRecord = null;
                   if(Static.DEBUG) Log.i(TAG,"audioRecord released");
               }
           }
       }

       /**TODO*/
       private Camera.Size getBestPreviewSize(int width, int height, Camera.Parameters parameters) {
           Camera.Size result=null;

           for (Camera.Size size : parameters.getSupportedPreviewSizes()) {

               if(Static.DEBUG) Log.i(TAG, size.width + "/" + size.height);

               if (size.width <= width && size.height <= height) {
                   if (result==null) {
                       result=size;
                   } else {
                       int resultArea=result.width*result.height;
                       int newArea=size.width*size.height;

                       if (newArea>resultArea) {
                           result=size;
                       }
                   }
               }
           }

           return(result);
       }

       //---------------------------------------------
       // camera thread, gets and encodes video data
       //---------------------------------------------
       private class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {

           private final String TAG = CameraView.class.getSimpleName();

           private SurfaceHolder mHolder;
           private Camera mCamera;

           public CameraView(Context context, Camera camera) {
               super(context);
               if(Static.DEBUG) Log.i(TAG, "camera view");

               mCamera = camera;

               // Install a SurfaceHolder.Callback so we get notified when the
               // underlying surface is created and destroyed.
               mHolder = getHolder();
               mHolder.addCallback(CameraView.this);
               mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
               mCamera.setPreviewCallback(CameraView.this);
           }

           @Override
           public void surfaceCreated(SurfaceHolder holder) {
               try {
                   stopPreview();
                   mCamera.setPreviewDisplay(holder);
               } catch (IOException exception) {
                   mCamera.release();
                   mCamera = null;
               }
           }

           public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
               if(Static.DEBUG) Log.i(TAG, "surfaceChanged() => w=" + w + ", h=" + h);

               // If your preview can change or rotate, take care of those events here.
               // Make sure to stop the preview before resizing or reformatting it.

               if (mHolder.getSurface() == null){
                   // preview surface does not exist
                   return;
               }

               // stop preview before making changes
               try {
                   stopPreview();
               } catch (Exception e){
                   // ignore: tried to stop a non-existent preview
               }

               // start preview with new settings
               try {
                   Camera.Parameters params = mCamera.getParameters();

                   previewSize = getBestPreviewSize(w, h, params);
                   if (previewSize != null) {
                       if(Static.DEBUG) Log.i(TAG, "getBestPreviewSize() => w=" + previewSize.width + ", h=" + previewSize.height);
                       params.setPreviewSize(previewSize.width, previewSize.height);
                   }

                   params.setPreviewFrameRate(recorderFrameRate);
                   if(Static.DEBUG) Log.i(TAG,"Preview Framerate: " + params.getPreviewFrameRate());
                   mCamera.setParameters(params);
                   mCamera.setPreviewDisplay(holder);
                   mCamera.setPreviewCallback(CameraView.this);
                   startPreview();
               } catch (Exception e){
                   if(Static.DEBUG) Log.i(TAG, "Could not set preview display in surfaceChanged");
                   e.printStackTrace();
               }

           }

           @Override
           public void surfaceDestroyed(SurfaceHolder holder) {
               try {
                   mHolder.addCallback(null);
                   mCamera.setPreviewCallback(null);
               } catch (RuntimeException e) {
                   // The camera has probably just been released, ignore.
               }
           }

           public void startPreview() {
               if (!isPreviewOn && mCamera != null) {
                   isPreviewOn = true;
                   mCamera.startPreview();
               }
           }

           public void stopPreview() {
               if (isPreviewOn && mCamera != null) {
                   isPreviewOn = false;
                   mCamera.stopPreview();
               }
           }

           @Override
           public void onPreviewFrame(byte[] data, Camera camera) {
               if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
                   startTime = System.currentTimeMillis();
                   return;
               }

               // get video data
               if (yuvImage != null && recording) {
                   ((ByteBuffer)yuvImage.image[0].position(0)).put(data);

                   try {
                       long t = 1000 * (System.currentTimeMillis() - startTime);
                       if(Static.DEBUG) Log.i(TAG,"Writing Frame on timestamp: "+t);
                       if (t > recorder.getTimestamp()) {
                           recorder.setTimestamp(t);
                       }
                       recorder.record(yuvImage);
                   } catch (FFmpegFrameRecorder.Exception e) {
                       if(Static.DEBUG) Log.i(TAG,e.getMessage());
                       e.printStackTrace();
                   }
               }
           }
       }
    }
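    One detail in the listing above deserves a note: onPreviewFrame converts elapsed wall-clock milliseconds to microseconds and only ever moves the recorder's timestamp forward, since encoders generally require non-decreasing timestamps. That guard can be isolated as a tiny sketch (all names here are illustrative, not part of any JavaCV API):

    ```java
    // Minimal sketch of the monotonic-timestamp guard used before recording
    // a frame. Class and method names are illustrative only.
    public final class TimestampGuard {
        private long lastTimestampUs = 0;

        /** Convert elapsed wall-clock millis to micros, never going backwards. */
        long next(long startTimeMs, long nowMs) {
            long tUs = 1000 * (nowMs - startTimeMs); // millis -> micros
            if (tUs > lastTimestampUs) {
                lastTimestampUs = tUs;               // advance, like recorder.setTimestamp(t)
            }
            return lastTimestampUs;
        }
    }
    ```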

    activity_opencv.xml:

    <?xml version="1.0" encoding="utf-8"?>

    <RelativeLayout>

       <FrameLayout>
       </FrameLayout>

       <ImageButton
           android:id="@+id/btnStartRecord"
           android:layout_width="70dp"
           android:layout_height="70dp"
           android:scaleType="fitXY"
           android:src="@drawable/record_icon"
           android:background="@null"
           android:text="@string/btnStartRecord"
           android:onClick="onClickBtnStartRecord"
           android:clickable="true"
           android:layout_centerVertical="true"
           android:layout_alignParentRight="true"
           android:layout_alignParentEnd="true" />

       <TextView></TextView>

    </RelativeLayout>

    The overlay views work on screen, but the recorded video does not contain them.
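    The missing overlays are expected with this approach: the camera writes only its own pixels to the SurfaceView, so the buffers arriving in onPreviewFrame never contain the overlay views. One possible fix is to render the overlays into a pixel buffer and alpha-blend that buffer over every captured frame before it is handed to the recorder. A plain-Java sketch of that blend follows; the ARGB layout and every name in it are assumptions, and on Android the NV21 preview data would first need conversion.

    ```java
    import java.util.Arrays;

    // Sketch: alpha-blend an overlay buffer over a camera frame, pixel by
    // pixel, before the frame goes to the recorder. ARGB ints are an
    // assumption; Android preview frames arrive as NV21 and need conversion.
    public final class OverlayBlend {
        static int[] blend(int[] frame, int[] overlay) {
            int[] out = Arrays.copyOf(frame, frame.length);
            for (int i = 0; i < overlay.length; i++) {
                int a = overlay[i] >>> 24;           // overlay alpha, 0..255
                if (a == 0) continue;                // fully transparent: keep frame
                int r = blendCh(overlay[i] >> 16, frame[i] >> 16, a);
                int g = blendCh(overlay[i] >> 8,  frame[i] >> 8,  a);
                int b = blendCh(overlay[i],       frame[i],       a);
                out[i] = 0xFF000000 | (r << 16) | (g << 8) | b;
            }
            return out;
        }

        private static int blendCh(int fg, int bg, int a) {
            fg &= 0xFF; bg &= 0xFF;
            return (fg * a + bg * (255 - a)) / 255; // standard "over" operator
        }
    }
    ```

    Blending on the CPU for every frame is costly at preview resolution, which is one reason the MediaProjection route mentioned above can be attractive on API 21+.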

    2. Via OpenCV and JavaCV with FFmpeg:

    Here

  • Android recording video with overlay view

    6 March 2016, by t0m

    I am building an Android app that captures video with overlay views. (I also need the onPreviewFrame method.)
    Via SurfaceView and JavaCV with FFmpeg:


    OpenCVCameraActivity.java:

    import android.app.Activity;
    import android.content.Context;
    import android.hardware.Camera;
    import android.hardware.Camera.PreviewCallback;
    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;
    import android.os.Bundle;
    import android.util.Log;
    import android.view.KeyEvent;
    import android.view.Menu;
    import android.view.SurfaceHolder;
    import android.view.SurfaceView;
    import android.view.View;
    import android.widget.FrameLayout;
    import android.widget.Toast;

    import org.bytedeco.javacv.FFmpegFrameRecorder;
    import org.bytedeco.javacv.Frame;

    import java.io.File;
    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.nio.ShortBuffer;
    import java.util.List;


    @SuppressWarnings("ALL")
    public class OpenCVCameraActivity extends Activity {

       private static final String TAG = OpenCVCameraActivity.class.getSimpleName();

       private long startTime = 0;
       private boolean isPreviewOn = false;

       private int sampleAudioRateInHz = 44100;
       private Camera.Size previewSize;        //preview and Camera and Recorder width and height
       private int recorderFrameRate = 25;

       // audio data getting thread
       private AudioRecord audioRecord;
       private AudioRecordRunnable audioRecordRunnable;
       private Thread audioThread;
       private volatile boolean runAudioThread = true;

       // video data getting thread
       private Camera mCamera;
       private CameraView mPreview;
       private FFmpegFrameRecorder recorder;
       private boolean recording = false;
       private Frame yuvImage = null;

       //storage
       private Storage storage;

       @Override
       public void onCreate(Bundle savedInstanceState) {
           super.onCreate(savedInstanceState);
           if(Static.DEBUG) Log.i(TAG, "onCreate()");

           Thread.setDefaultUncaughtExceptionHandler(uncaughtExceptionHandler);

           setContentView(R.layout.activity_opencv);

           prepareCamera();
       }

       private Thread.UncaughtExceptionHandler uncaughtExceptionHandler =
               new Thread.UncaughtExceptionHandler() {
                   public void uncaughtException(Thread thread, Throwable ex) {
                       if(Static.DEBUG) Log.e(TAG, "Uncaught exception", ex);
                   }
               };

       @Override
       protected void onRestart() {
           super.onRestart();
           if (Static.DEBUG) Log.i(TAG, "onRestart()");
       }

       @Override
       protected void onStart() {
           super.onStart();
           if (Static.DEBUG) Log.i(TAG, "onStart()");
       }

       @Override
       protected void onResume() {
           super.onResume();
           if (Static.DEBUG) Log.i(TAG, "onResume()");

           storage = new Storage(this);
           if(storage.mExternalStorageAvailable == true && storage.mExternalStorageWriteable == false)
               Static.showToast(this, getString(R.string.errExternalStorageReadOnly), Toast.LENGTH_LONG);
           else if (storage.mExternalStorageAvailable == false && storage.mExternalStorageWriteable == false)
               Static.showToast(this, getString(R.string.errExternalStorage), Toast.LENGTH_LONG);
       }

       @Override
       public boolean onCreateOptionsMenu(Menu menu) {
           if (Static.DEBUG) Log.i(TAG, "onCreateOptionsMenu()");
           return super.onCreateOptionsMenu(menu);
       }

       @Override
       protected void onPause() {
           super.onPause();
           if (Static.DEBUG) Log.i(TAG, "onPause()");
       }

       @Override
       protected void onStop() {
           super.onStop();
           if (Static.DEBUG) Log.i(TAG, "onStop()");
       }

       @Override
       protected void onDestroy() {
           super.onDestroy();
           if (Static.DEBUG) Log.i(TAG, "onDestroy()");

           recording = false;

           if (mPreview != null) {
               mPreview.stopPreview();
           }

           if (mCamera != null) {
               mCamera.stopPreview();
               mCamera.release();
               mCamera = null;
           }
       }

       /** Prepare camera object.
        * */
       private void prepareCamera() {
           //1. Open camera object
           try {
               mCamera = getCameraInstance(this);
           } catch (Exception e) {
               e.printStackTrace();
               Static.showToast(this, e.getMessage(), Toast.LENGTH_LONG);
               finish();
               return;
           }

           setOptimalCameraParams();

           //2. Connect preview + 3. Start preview (the preview is stopped in surfaceChanged)
           mPreview = new CameraView(this, mCamera);
           FrameLayout preview = (FrameLayout) findViewById(R.id.cameraPreview);
           preview.addView(mPreview); //surfaceView to FrameLayout
           if(Static.DEBUG) Log.i(TAG, "camera preview start: OK");
       }

       /**
        * A safe way to get an instance of the Camera object.
        */
       @SuppressWarnings("deprecation")
       public static Camera getCameraInstance(Context ctx) throws Exception {
           Camera c = Camera.open();
           if (c == null)
               throw new Exception(ctx.getString(R.string.errCameraNotAvailable));

           if(Static.DEBUG) Log.i(TAG, "camera open");
           return c; // camera successfully opened; otherwise an exception was thrown above
       }

       /** Set optimal camera parameters, depending on hardware capabilities. */
       @SuppressWarnings("deprecation")
       private void setOptimalCameraParams(){

           // Camera parameters
           Camera.Parameters params = mCamera.getParameters();
           List<String> focusModes = params.getSupportedFocusModes();
           if (Static.DEBUG) Log.i(TAG, "focusModes():" + focusModes.toString());
           if (Static.DEBUG) Log.i(TAG, "Camera parameters:\n" + params.flatten());

           params.setRecordingHint(true); //MediaRecorder.start() to start faster

           //Automatically autofocus if it's possible
           if (params.getSupportedFocusModes().contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
               params.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
           } else if (focusModes.contains(Camera.Parameters.FOCUS_MODE_AUTO)) { //at least focus auto
               params.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
           }

           // set Camera parameters
           mCamera.setParameters(params);
       }

       //---------------------------------------
       // initialize ffmpeg_recorder
       //---------------------------------------
       private void initRecorder() throws Exception {
           if(Static.DEBUG) Log.i(TAG,"init recorder");

           File output = null;

           try {
               output = storage.getOutputMediaFile(storage.MEDIA_TYPE_VIDEO);
               if(output == null)
                   throw new Exception();
           } catch (Exception e) {
               e.printStackTrace();
               throw new Exception(getString(R.string.errSetOutputFile));
           }

           if (yuvImage == null) {
               yuvImage = new Frame(previewSize.width, previewSize.height, Frame.DEPTH_UBYTE, 2);
               if(Static.DEBUG) Log.i(TAG, "create yuvImage");
           }

           if(Static.DEBUG) Log.i(TAG, "ffmpeg_url: " + output.getPath());
           recorder = new FFmpegFrameRecorder(output.getPath(), previewSize.width, previewSize.height, 1);
           //recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
           //recorder.setVideoOption("preset", "veryfast"); // or ultrafast or fast, etc.
           //recorder.setVideoQuality(0); // maximum quality, replace recorder.setVideoBitrate(16384);
           //recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
           recorder.setFormat("mp4");
           recorder.setSampleRate(sampleAudioRateInHz);
           recorder.setFrameRate(recorderFrameRate);

           if(Static.DEBUG) Log.i(TAG, "recorder initialize success");

           audioRecordRunnable = new AudioRecordRunnable();
           audioThread = new Thread(audioRecordRunnable);
           runAudioThread = true;
       }

       public void startRecording() {

           try {
               initRecorder();
           } catch (Exception e){
               e.printStackTrace();
               Static.showToast(this, e.getMessage(), Toast.LENGTH_LONG);
           }

           try {
               recorder.start();
               startTime = System.currentTimeMillis();
               recording = true;
               audioThread.start();

           } catch (FFmpegFrameRecorder.Exception e) {
               e.printStackTrace();
           }
       }

       public void stopRecording() {

           runAudioThread = false;
           try {
               audioThread.join();
           } catch (InterruptedException e) {
               // reset interrupt to be nice
               Thread.currentThread().interrupt();
               return;
           }
           audioRecordRunnable = null;
           audioThread = null;

            if (recorder != null && recording) {

               recording = false;
               if(Static.DEBUG) Log.i(TAG,"Finishing recording, calling stop and release on recorder");
               try {
                   recorder.stop();
                   recorder.release();
               } catch (FFmpegFrameRecorder.Exception e) {
                   e.printStackTrace();
               }
               recorder = null;

           }
       }

       @Override
       public boolean onKeyDown(int keyCode, KeyEvent event) {

           if (keyCode == KeyEvent.KEYCODE_BACK) {
               if (recording) {
                   stopRecording();
               }

               finish();

               return true;
           }

           return super.onKeyDown(keyCode, event);
       }

       public void onClickBtnStartRecord(View v) {
           if (!recording) {
               startRecording();
               if(Static.DEBUG) Log.i(TAG, "Start Button Pushed");
           } else {
                // This triggers the audio recording loop to stop, then stops and releases the recorder
               stopRecording();
               if(Static.DEBUG) Log.i(TAG, "Stop Button Pushed");
           }
       }

       //---------------------------------------------
       // audio thread, gets and encodes audio data
       //---------------------------------------------
       class AudioRecordRunnable implements Runnable {

           @Override
           public void run() {
               android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

               // Audio
               int bufferSize;
               ShortBuffer audioData;
               int bufferReadResult;

               bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
               audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

               audioData = ShortBuffer.allocate(bufferSize);

               if(Static.DEBUG) Log.i(TAG, "audioRecord.startRecording()");
               audioRecord.startRecording();

               /* ffmpeg_audio encoding loop */
               while (runAudioThread) {
                   //if(Static.DEBUG) Log.i(TAG,"recording? " + recording);
                   bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
                   audioData.limit(bufferReadResult);
                   if (bufferReadResult > 0) {
                       //if(Static.DEBUG) Log.i(TAG,"bufferReadResult: " + bufferReadResult);
                        // If "recording" isn't true when this thread starts, it never gets set according to this if statement...!!!
                        // Why? Good question...
                       if (recording) {
                           try {
                               recorder.recordSamples(audioData);
                               //if(Static.DEBUG) Log.i(TAG,"recording " + 1024*i + " to " + 1024*i+1024);
                           } catch (FFmpegFrameRecorder.Exception e) {
                               if(Static.DEBUG) Log.i(TAG,e.getMessage());
                               e.printStackTrace();
                           }
                       }
                   }
               }
               if(Static.DEBUG) Log.i(TAG,"AudioThread Finished, release audioRecord");

               /* encoding finish, release recorder */
               if (audioRecord != null) {
                   audioRecord.stop();
                   audioRecord.release();
                   audioRecord = null;
                   if(Static.DEBUG) Log.i(TAG,"audioRecord released");
               }
           }
       }

       /**TODO*/
       private Camera.Size getBestPreviewSize(int width, int height, Camera.Parameters parameters) {
            Camera.Size result = null;

            for (Camera.Size size : parameters.getSupportedPreviewSizes()) {

                if(Static.DEBUG) Log.i(TAG, size.width + "/" + size.height);

                if (size.width <= width && size.height <= height) {
                    if (result == null) {
                        result = size;
                    } else {
                        int resultArea = result.width * result.height;
                        int newArea = size.width * size.height;

                        if (newArea > resultArea) {
                            result = size;
                        }
                    }
                }
            }

            return result;
       }

       //---------------------------------------------
       // camera thread, gets and encodes video data
       //---------------------------------------------
       private class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {

           private final String TAG = CameraView.class.getSimpleName();

           private SurfaceHolder mHolder;
           private Camera mCamera;

           public CameraView(Context context, Camera camera) {
               super(context);
               if(Static.DEBUG) Log.i(TAG, "camera view");

               mCamera = camera;

               // Install a SurfaceHolder.Callback so we get notified when the
               // underlying surface is created and destroyed.
               mHolder = getHolder();
               mHolder.addCallback(CameraView.this);
                mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS); // deprecated and ignored since API 11, kept for older devices
               mCamera.setPreviewCallback(CameraView.this);
           }

           @Override
           public void surfaceCreated(SurfaceHolder holder) {
               try {
                   stopPreview();
                   mCamera.setPreviewDisplay(holder);
               } catch (IOException exception) {
                   mCamera.release();
                   mCamera = null;
               }
           }

           public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
               if(Static.DEBUG) Log.i(TAG, "surfaceChanged() => w=" + w + ", h=" + h);

               // If your preview can change or rotate, take care of those events here.
               // Make sure to stop the preview before resizing or reformatting it.

               if (mHolder.getSurface() == null){
                   // preview surface does not exist
                   return;
               }

               // stop preview before making changes
               try {
                   stopPreview();
               } catch (Exception e){
                   // ignore: tried to stop a non-existent preview
               }

               // start preview with new settings
               try {
                   Camera.Parameters params = mCamera.getParameters();

                   previewSize = getBestPreviewSize(w, h, params);
                   if(Static.DEBUG) Log.i(TAG, "getBestPreviewSize() => w=" + previewSize.width + ", h=" + previewSize.height);
                   if (previewSize != null)
                       params.setPreviewSize(previewSize.width, previewSize.height);

                   params.setPreviewFrameRate(recorderFrameRate);
                   if(Static.DEBUG) Log.i(TAG,"Preview Framerate: " + params.getPreviewFrameRate());
                   mCamera.setParameters(params);
                   mCamera.setPreviewDisplay(holder);
                   mCamera.setPreviewCallback(CameraView.this);
                   startPreview();
               } catch (Exception e){
                   if(Static.DEBUG) Log.i(TAG, "Could not set preview display in surfaceChanged");
                   e.printStackTrace();
               }

           }

           @Override
           public void surfaceDestroyed(SurfaceHolder holder) {
               try {
                   mHolder.addCallback(null);
                   mCamera.setPreviewCallback(null);
               } catch (RuntimeException e) {
                   // The camera has probably just been released, ignore.
               }
           }

           public void startPreview() {
                if (!isPreviewOn && mCamera != null) {
                   isPreviewOn = true;
                   mCamera.startPreview();
               }
           }

           public void stopPreview() {
                if (isPreviewOn && mCamera != null) {
                   isPreviewOn = false;
                   mCamera.stopPreview();
               }
           }

           @Override
           public void onPreviewFrame(byte[] data, Camera camera) {
               if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
                   startTime = System.currentTimeMillis();
                   return;
               }

               // get video data
               if (yuvImage != null &amp;&amp; recording) {
                   ((ByteBuffer)yuvImage.image[0].position(0)).put(data);

                   try {
                        // Recorder timestamps are in microseconds; convert the elapsed wall-clock milliseconds.
                        long t = 1000 * (System.currentTimeMillis() - startTime);
                       if(Static.DEBUG) Log.i(TAG,"Writing Frame on timestamp: "+t);
                       if (t > recorder.getTimestamp()) {
                           recorder.setTimestamp(t);
                       }
                       recorder.record(yuvImage);
                   } catch (FFmpegFrameRecorder.Exception e) {
                       if(Static.DEBUG) Log.i(TAG,e.getMessage());
                       e.printStackTrace();
                   }
               }
           }
       }
    }
    </string>

     activity_opencv.xml:

     <?xml version="1.0" encoding="utf-8"?>

     <RelativeLayout>

        <FrameLayout>
        </FrameLayout>

        <ImageButton
            android:id="@+id/btnStartRecord"
            android:layout_width="70dp"
            android:layout_height="70dp"
            android:scaleType="fitXY"
            android:src="@drawable/record_icon"
            android:background="@null"
            android:text="@string/btnStartRecord"
            android:onClick="onClickBtnStartRecord"
            android:clickable="true"
            android:layout_centerVertical="true"
            android:layout_alignParentRight="true"
            android:layout_alignParentEnd="true"/>

        <TextView></TextView>

     </RelativeLayout>

     The overlay views display correctly on screen, but the recorded video does not include them.
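     This is expected with the code above: onPreviewFrame only receives the camera's raw YUV bytes, so anything drawn in overlay Views on top of the SurfaceView never reaches the recorder. The overlay has to be composited into each frame manually before recorder.record(yuvImage). A minimal sketch of the per-pixel "over" blend such compositing needs (plain Java; OverlayBlend and blendOver are hypothetical names, not part of the original code):

     ```java
     public class OverlayBlend {
         // Blend one ARGB overlay pixel over an opaque camera pixel (standard "over" operator).
         // This is the core operation needed to composite overlay Views into recorded frames.
         public static int blendOver(int overlayArgb, int cameraRgb) {
             int a = overlayArgb >>> 24;
             if (a == 0)   return cameraRgb;    // fully transparent: keep the camera pixel
             if (a == 255) return overlayArgb;  // fully opaque: the overlay wins
             int inv = 255 - a;
             int r = ((overlayArgb >> 16 & 0xFF) * a + (cameraRgb >> 16 & 0xFF) * inv) / 255;
             int g = ((overlayArgb >> 8  & 0xFF) * a + (cameraRgb >> 8  & 0xFF) * inv) / 255;
             int b = ((overlayArgb       & 0xFF) * a + (cameraRgb       & 0xFF) * inv) / 255;
             return 0xFF000000 | (r << 16) | (g << 8) | b;
         }

         public static void main(String[] args) {
             // 50%-alpha red over black blends to half-intensity red
             System.out.println(Integer.toHexString(blendOver(0x80FF0000, 0xFF000000))); // -> ff800000
         }
     }
     ```

     On Android one would typically render the overlay View into an ARGB Bitmap via View.draw(Canvas), blend it over an RGB conversion of the preview frame with a loop like this, and only then hand the result to the recorder.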