Media (91)

Other articles (32)

  • Final creation of the channel

    12 March 2010

    Once your request has been validated, you can proceed with the actual creation of the channel. Each channel is a fully fledged site in its own right, placed under your responsibility; the platform administrators have no access to it.
    Upon validation, you receive an email inviting you to create your channel.
    To do so, simply go to its address, in our example "http://votre_sous_domaine.mediaspip.net".
    At that point you are asked for a password; you simply have to (...)

  • The farm's regular Cron tasks

    1 December 2010

    Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of all the farm's instances on a regular basis. Combined with a system Cron on the farm's central site, this generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...)
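    As an illustration only, the "super Cron" pattern described above (a central task that regularly visits every instance so each site's own background jobs get triggered) can be sketched as follows; the class, site list, and interval are assumptions made for the sketch, since MediaSPIP implements this in SPIP/PHP, not Java:

    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.List;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    // Minimal sketch of a "super Cron": once a minute, visit every instance
    // so that each site's own scheduled tasks get a chance to run.
    public class SuperCron {
        public static void main(String[] args) {
            // Placeholder site list; the real farm enumerates its instances itself.
            List<String> sites = List.of("http://site-a.example/", "http://site-b.example/");
            ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
            scheduler.scheduleAtFixedRate(() -> {
                for (String site : sites) {
                    try {
                        HttpURLConnection conn = (HttpURLConnection) new URL(site).openConnection();
                        conn.setRequestMethod("HEAD"); // a lightweight visit is enough to trigger the site's cron
                        conn.getResponseCode();
                        conn.disconnect();
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }, 0, 1, TimeUnit.MINUTES);
        }
    }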

  • Improvements to the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the usability of multiple-selection fields. See the following two images to compare.
    Simply activate the Chosen plugin (Configuration générale du site > Gestion des plugins), then configure it (Les squelettes > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...)

On other sites (3909)

  • Squeezed image while capturing video with FFmpegFrameRecorder

    12 April 2016, by Saty

    I am trying to stream video with FFmpegFrameRecorder using javacv. Everything works well, except that the resulting videos look slightly squeezed vertically.

    I am using the code below, which half of the internet community uses for live streaming:

    public class MainActivity extends Activity implements OnClickListener {

    private final static String LOG_TAG = "MainActivity";

    private PowerManager.WakeLock mWakeLock;

    private String ffmpeg_link = "rtmp://username:password@xxx.xxx.xxx.xxx:1935/live/test.flv";
    //private String ffmpeg_link = "/mnt/sdcard/new_stream.flv";

    private volatile FFmpegFrameRecorder recorder;
    boolean recording = false;
    long startTime = 0;

    private int sampleAudioRateInHz = 44100;
    private int imageWidth = 320;
    private int imageHeight = 240;
    private int frameRate = 30;

    private Thread audioThread;
    volatile boolean runAudioThread = true;
    private AudioRecord audioRecord;
    private AudioRecordRunnable audioRecordRunnable;

    private CameraView cameraView;
    private IplImage yuvIplimage = null;

    private Button recordButton;
    private LinearLayout mainLayout;

    @Override
    public void onCreate(Bundle savedInstanceState) {
       super.onCreate(savedInstanceState);

       setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
       setContentView(R.layout.activity_main);

       initLayout();
       initRecorder();
    }

    @Override
    protected void onResume() {
       super.onResume();

       if (mWakeLock == null) {
           PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
           mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, LOG_TAG);
           mWakeLock.acquire();
       }
    }

    @Override
    protected void onPause() {
       super.onPause();

       if (mWakeLock != null) {
           mWakeLock.release();
           mWakeLock = null;
       }
    }

    @Override
    protected void onDestroy() {
       super.onDestroy();

       recording = false;
    }


    private void initLayout() {

       mainLayout = (LinearLayout) this.findViewById(R.id.record_layout);

       recordButton = (Button) findViewById(R.id.recorder_control);
       recordButton.setText("Start");
       recordButton.setOnClickListener(this);

       cameraView = new CameraView(this);

       LinearLayout.LayoutParams layoutParam = new LinearLayout.LayoutParams(imageWidth, imageHeight);        
       mainLayout.addView(cameraView, layoutParam);
       Log.v(LOG_TAG, "added cameraView to mainLayout");
    }

    private void initRecorder() {
       Log.w(LOG_TAG,"initRecorder");

       if (yuvIplimage == null) {
           // Recreated after frame size is set in surface change method
           yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
           //yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_32S, 2);

           Log.v(LOG_TAG, "IplImage.create");
       }

       recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
       Log.v(LOG_TAG, "FFmpegFrameRecorder: " + ffmpeg_link + " imageWidth: " + imageWidth + " imageHeight " + imageHeight);

       recorder.setFormat("flv");
       Log.v(LOG_TAG, "recorder.setFormat(\"flv\")");

       recorder.setSampleRate(sampleAudioRateInHz);
       Log.v(LOG_TAG, "recorder.setSampleRate(sampleAudioRateInHz)");

       // re-set in the surface changed method as well
       recorder.setFrameRate(frameRate);
       Log.v(LOG_TAG, "recorder.setFrameRate(frameRate)");

       // Create audio recording thread
       audioRecordRunnable = new AudioRecordRunnable();
       audioThread = new Thread(audioRecordRunnable);
    }

    // Start the capture
    public void startRecording() {
       try {
           recorder.start();
           startTime = System.currentTimeMillis();
           recording = true;
           audioThread.start();
       } catch (FFmpegFrameRecorder.Exception e) {
           e.printStackTrace();
       }
    }

    public void stopRecording() {
       // This should stop the audio thread from running
       runAudioThread = false;

       if (recorder != null && recording) {
           recording = false;
           Log.v(LOG_TAG,"Finishing recording, calling stop and release on recorder");
           try {
               recorder.stop();
               recorder.release();
           } catch (FFmpegFrameRecorder.Exception e) {
               e.printStackTrace();
           }
           recorder = null;
       }
    }

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
       // Quit when back button is pushed
       if (keyCode == KeyEvent.KEYCODE_BACK) {
           if (recording) {
               stopRecording();
           }
           finish();
           return true;
       }
       return super.onKeyDown(keyCode, event);
    }

    @Override
    public void onClick(View v) {
       if (!recording) {
           startRecording();
           Log.w(LOG_TAG, "Start Button Pushed");
           recordButton.setText("Stop");
       } else {
           stopRecording();
           Log.w(LOG_TAG, "Stop Button Pushed");
           recordButton.setText("Start");
       }
    }

    //---------------------------------------------
    // audio thread, gets and encodes audio data
    //---------------------------------------------
    class AudioRecordRunnable implements Runnable {

       @Override
       public void run() {
           // Set the thread priority
           android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

           // Audio
           int bufferSize;
           short[] audioData;
           int bufferReadResult;

           bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
                   AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
           audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
                   AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

           audioData = new short[bufferSize];

           Log.d(LOG_TAG, "audioRecord.startRecording()");
           audioRecord.startRecording();

           // Audio Capture/Encoding Loop
           while (runAudioThread) {
               // Read from audioRecord
               bufferReadResult = audioRecord.read(audioData, 0, audioData.length);
               if (bufferReadResult > 0) {
                   //Log.v(LOG_TAG,"audioRecord bufferReadResult: " + bufferReadResult);

                   // Changes in this variable may not be picked up despite it being "volatile"
                   if (recording) {
                       try {
                           // Write to FFmpegFrameRecorder
                           Buffer[] buffer = {ShortBuffer.wrap(audioData, 0, bufferReadResult)};                        
                           recorder.record(buffer);
                       } catch (FFmpegFrameRecorder.Exception e) {
                           Log.v(LOG_TAG,e.getMessage());
                           e.printStackTrace();
                       }
                   }
               }
           }
           Log.v(LOG_TAG,"AudioThread Finished");

           /* Capture/Encoding finished, release recorder */
           if (audioRecord != null) {
               audioRecord.stop();
               audioRecord.release();
               audioRecord = null;
               Log.v(LOG_TAG,"audioRecord released");
           }
       }
    }

    class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {

       private boolean previewRunning = false;

       private SurfaceHolder holder;
       private Camera camera;

       private byte[] previewBuffer;

       long videoTimestamp = 0;

       Bitmap bitmap;
       Canvas canvas;

       public CameraView(Context _context) {
           super(_context);

           holder = this.getHolder();
           holder.addCallback(this);
           holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
       }

       @Override
       public void surfaceCreated(SurfaceHolder holder) {
           camera = Camera.open();

           try {
               camera.setPreviewDisplay(holder);
               camera.setPreviewCallback(this);

               Camera.Parameters currentParams = camera.getParameters();
               Log.v(LOG_TAG,"Preview Framerate: " + currentParams.getPreviewFrameRate());
               Log.v(LOG_TAG,"Preview imageWidth: " + currentParams.getPreviewSize().width + " imageHeight: " + currentParams.getPreviewSize().height);

               // Use these values
               imageWidth = currentParams.getPreviewSize().width;
               imageHeight = currentParams.getPreviewSize().height;
               frameRate = currentParams.getPreviewFrameRate();                

               bitmap = Bitmap.createBitmap(imageWidth, imageHeight, Bitmap.Config.ALPHA_8);


               /*
               Log.v(LOG_TAG,"Creating previewBuffer size: " + imageWidth * imageHeight * ImageFormat.getBitsPerPixel(currentParams.getPreviewFormat())/8);
               previewBuffer = new byte[imageWidth * imageHeight * ImageFormat.getBitsPerPixel(currentParams.getPreviewFormat())/8];
               camera.addCallbackBuffer(previewBuffer);
               camera.setPreviewCallbackWithBuffer(this);
               */              

               camera.startPreview();
               previewRunning = true;
           }
           catch (IOException e) {
               Log.v(LOG_TAG,e.getMessage());
               e.printStackTrace();
           }  
       }

       public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
           Log.v(LOG_TAG,"Surface Changed: width " + width + " height: " + height);

           // We would do this if we want to reset the camera parameters
           /*
           if (!recording) {
               if (previewRunning){
                   camera.stopPreview();
               }
               try {
                   //Camera.Parameters cameraParameters = camera.getParameters();
                   //p.setPreviewSize(imageWidth, imageHeight);
                   //p.setPreviewFrameRate(frameRate);
                   //camera.setParameters(cameraParameters);

                   camera.setPreviewDisplay(holder);
                   camera.startPreview();
                   previewRunning = true;
               }
               catch (IOException e) {
                   Log.e(LOG_TAG,e.getMessage());
                   e.printStackTrace();
               }  
           }            
           */

           // Get the current parameters
           Camera.Parameters currentParams = camera.getParameters();
           Log.v(LOG_TAG,"Preview Framerate: " + currentParams.getPreviewFrameRate());
           Log.v(LOG_TAG,"Preview imageWidth: " + currentParams.getPreviewSize().width + " imageHeight: " + currentParams.getPreviewSize().height);

           // Use these values
           imageWidth = currentParams.getPreviewSize().width;
           imageHeight = currentParams.getPreviewSize().height;
           frameRate = currentParams.getPreviewFrameRate();

           // Create the yuvIplimage if needed
           yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
           //yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_32S, 2);
       }

       @Override
       public void surfaceDestroyed(SurfaceHolder holder) {
           try {
               camera.setPreviewCallback(null);

               previewRunning = false;
               camera.release();

           } catch (RuntimeException e) {
               Log.v(LOG_TAG,e.getMessage());
               e.printStackTrace();
           }
       }

       @Override
       public void onPreviewFrame(byte[] data, Camera camera) {

           if (yuvIplimage != null && recording) {
               videoTimestamp = 1000 * (System.currentTimeMillis() - startTime);

               // Put the camera preview frame right into the yuvIplimage object
               yuvIplimage.getByteBuffer().put(data);

               // FAQ about IplImage:
               // - For custom raw processing of data, getByteBuffer() returns an NIO direct
               //   buffer wrapped around the memory pointed by imageData, and under Android we can
               //   also use that Buffer with Bitmap.copyPixelsFromBuffer() and copyPixelsToBuffer().
               // - To get a BufferedImage from an IplImage, we may call getBufferedImage().
               // - The createFrom() factory method can construct an IplImage from a BufferedImage.
               // - There are also a few copy*() methods for BufferedImage<->IplImage data transfers.

               // Let's try it..
               // This works but only on transparency
               // Need to find the right Bitmap and IplImage matching types

               /*
               bitmap.copyPixelsFromBuffer(yuvIplimage.getByteBuffer());
               //bitmap.setPixel(10,10,Color.MAGENTA);

               canvas = new Canvas(bitmap);
               Paint paint = new Paint();
               paint.setColor(Color.GREEN);
               float leftx = 20;
               float topy = 20;
               float rightx = 50;
               float bottomy = 100;
               RectF rectangle = new RectF(leftx,topy,rightx,bottomy);
               canvas.drawRect(rectangle, paint);

               bitmap.copyPixelsToBuffer(yuvIplimage.getByteBuffer());
               */
               //Log.v(LOG_TAG,"Writing Frame");

               try {

                   // Get the correct time
                   recorder.setTimestamp(videoTimestamp);

                   // Record the image into FFmpegFrameRecorder
                   recorder.record(yuvIplimage);

               } catch (FFmpegFrameRecorder.Exception e) {
                   Log.v(LOG_TAG,e.getMessage());
                   e.printStackTrace();
               }
           }
       }
    }

    }
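    A hedged guess at the cause of the squeeze, based only on the code above: initRecorder() is called from onCreate() and creates the FFmpegFrameRecorder with the default 320x240, but surfaceChanged() later overwrites imageWidth/imageHeight with the camera's actual preview size without recreating the recorder, so frames with one aspect ratio get written into a recorder expecting another. A minimal sketch of one fix, replacing the tail of surfaceChanged() and reusing the same fields as above:

    // Sketch, not the original poster's code: once the real preview size is
    // known, rebuild the recorder so its dimensions match the frames it will
    // receive; a size mismatch shows up as a stretched or squeezed picture.
    imageWidth = currentParams.getPreviewSize().width;
    imageHeight = currentParams.getPreviewSize().height;
    frameRate = currentParams.getPreviewFrameRate();

    if (!recording) {
        recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
        recorder.setFormat("flv");
        recorder.setSampleRate(sampleAudioRateInHz);
        recorder.setFrameRate(frameRate);
    }
    yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);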

  • JavaCV/FFmpeg causes crash only on Lollipop

    13 November 2015, by WeirdHat

    I get reports of this crash from users of my app on Galaxy Note devices running Android Lollipop (it might happen on other devices, but the majority of my users have Galaxy Notes because it's a drawing app). Below is the relevant code from my AsyncTask, which exports a series of images (plus audio) to a video file using JavaCV/FFmpeg, along with the stack trace from the reported crash. It works fine on my Note 8.0 running KitKat (no Lollipop update is available for that device), and works fine on Lollipop in the emulator, so I'm having a hard time tracking down the cause and don't know what to do.

    @Override
    protected String doInBackground(final File... params) {
       FFmpegFrameRecorder recorder = new FFmpegFrameRecorder(params[0],canvas_width,canvas_height, 0);
       FrameGrabber audiograbber = null;
       if(audio!=null) audiograbber = new FFmpegFrameGrabber(audio);
       Frame audioframe;
       Frame image;
       AndroidFrameConverter converter = new AndroidFrameConverter();

       try {
           recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
           recorder.setFrameRate(framerate);
           recorder.setPixelFormat(0);
           if(audio!=null) {
               audiograbber.setFrameRate(framerate);
               audiograbber.start();
               recorder.setAudioChannels(audiograbber.getAudioChannels());
               recorder.setSampleRate(audiograbber.getSampleRate());
           }
           recorder.start();

           Bitmap drawframe = createBitmap(canvas_width, canvas_height, Bitmap.Config.ARGB_8888);
           Canvas c = new Canvas(drawframe);
           Paint p = new Paint();

           for (int frame = 0; frame < numFrames; frame++) {   // numFrames: placeholder; the original loop bound was lost
               // ...code to draw this frame onto drawframe via the canvas...

               image = converter.convert(drawframe);
               recorder.record(image);
           }

           if(isCancelled()) {
               bgbitmap.recycle();
               drawframe.recycle();
               return "Cancelled";
           }

           if(audio!=null) {
               while((audioframe = audiograbber.grabFrame()) != null) {
                   recorder.record(audioframe);
               }
           }

           recorder.stop();
           bgbitmap.recycle();
           drawframe.recycle();
           if(audio!=null) audiograbber.stop();
           return "Saved "+params[0];
       } catch (Exception e) {
           e.printStackTrace();
       }
       return "Failed";
    }


    *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
    Build fingerprint: 'samsung/treltexx/trelte:5.1.1/LMY47X/N910CXXU1COH4:user/release-keys'
    Revision: '21'
    ABI: 'arm'
    pid: 12039, tid: 12053, name: GCDaemon  >>> com.weirdhat.roughanimator <<<
    signal 6 (SIGABRT), code -6 (SI_TKILL), fault addr --------
    Abort message: 'art/runtime/gc/collector/mark_sweep.cc:381] Can't mark invalid object'
       r0 00000000  r1 00002f15  r2 00000006  r3 00000000
       r4 b366bdb8  r5 00000006  r6 00000002  r7 0000010c
       r8 00000001  r9 b446f550  sl b4429000  fp 9a975c60
       ip 00002f15  sp b366b008  lr b6db9cb9  pc b6ddd3ac  cpsr 60070010

    backtrace:
       #00 pc 0003b3ac  /system/lib/libc.so (tgkill+12)
       #01 pc 00017cb5  /system/lib/libc.so (pthread_kill+52)
       #02 pc 000188c7  /system/lib/libc.so (raise+10)
       #03 pc 00015165  /system/lib/libc.so (__libc_android_abort+36)
       #04 pc 00012fac  /system/lib/libc.so (abort+4)
       #05 pc 00242f17  /system/lib/libart.so (art::Runtime::Abort()+170)
       #06 pc 000ad991  /system/lib/libart.so (art::LogMessage::~LogMessage()+1360)
       #07 pc 0013ec53  /system/lib/libart.so (bool art::gc::accounting::HeapBitmap::AtomicTestAndSet(art::mirror::Object const*, art::gc::collector::MarkSweepMarkObjectSlowPath const&)+422)
       #08 pc 0013ed27  /system/lib/libart.so (art::gc::collector::MarkSweep::MarkObjectParallel(art::mirror::Object const*)+142)
       #09 pc 0013ff23  /system/lib/libart.so (art::gc::collector::MarkSweep::MarkRootParallelCallback(art::mirror::Object**, void*, art::RootInfo const&)+26)
       #10 pc 0025893d  /system/lib/libart.so (art::ReferenceMapVisitor::VisitQuickFrame()+1024)
       #11 pc 00258cad  /system/lib/libart.so (art::ReferenceMapVisitor::VisitFrame()+224)
       #12 pc 0024c8e9  /system/lib/libart.so (art::StackVisitor::WalkStack(bool)+276)
       #13 pc 0024e617  /system/lib/libart.so (art::Thread::VisitRoots(void (*)(art::mirror::Object**, void*, art::RootInfo const&), void*)+994)
       #14 pc 0013ef9f  /system/lib/libart.so (art::gc::collector::CheckpointMarkThreadRoots::Run(art::Thread*)+126)
       #15 pc 0025b44d  /system/lib/libart.so (art::ThreadList::RunCheckpoint(art::Closure*)+296)
       #16 pc 0013dc0d  /system/lib/libart.so (art::gc::collector::MarkSweep::MarkRootsCheckpoint(art::Thread*, bool)+96)
       #17 pc 0014165d  /system/lib/libart.so (art::gc::collector::MarkSweep::PreCleanCards()+172)
       #18 pc 001417d3  /system/lib/libart.so (art::gc::collector::MarkSweep::MarkingPhase()+126)
       #19 pc 001418b9  /system/lib/libart.so (art::gc::collector::MarkSweep::RunPhases()+176)
       #20 pc 00138527  /system/lib/libart.so (art::gc::collector::GarbageCollector::Run(art::gc::GcCause, bool)+246)
       #21 pc 0015865b  /system/lib/libart.so (art::gc::Heap::CollectGarbageInternal(art::gc::collector::GcType, art::gc::GcCause, bool)+1426)
       #22 pc 00159af5  /system/lib/libart.so (art::gc::Heap::ConcurrentGC(art::Thread*)+56)
       #23 pc 000003ef  /system/framework/arm/boot.oat
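    For what it's worth, the abort happens on ART's GC daemon thread while marking the heap ("Can't mark invalid object"), which points at heap corruption in the native layer rather than at this Java code; updating JavaCV/JavaCPP is the usual first thing to try. Independently of the crash, the method above never calls release() and skips cleanup when an exception is thrown; a minimal hardening sketch, reusing the same variables as the snippet:

    // Sketch: release native FFmpeg resources even when encoding fails.
    try {
        recorder.start();
        // ... encode frames and audio as in doInBackground() above ...
        recorder.stop();
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        try {
            recorder.release();                        // frees the native encoder contexts
            if (audiograbber != null) audiograbber.release();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }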
  • Can't we use mp4 as the format of FFmpegFrameRecorder while RTMP streaming?

    13 April 2016, by Saty

    I am trying to stream video with FFmpegFrameRecorder using javacv. Everything works well; however, I want to use "mp4" as the recording format instead of FLV. Is that possible? When I set it, the recorder crashes.

    I am using the same code that half of the internet community uses to live stream; it is identical, line for line, to the MainActivity listing in the first question above.
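    On the question itself: RTMP carries FLV-packaged streams, so keeping an rtmp:// URL while calling setFormat("mp4") will make start() fail; "mp4" works when the recorder writes to a local file instead. A minimal sketch, with an illustrative output path rather than one from the original post:

    // Sketch: "mp4" muxing is valid for file output; rtmp:// URLs need "flv".
    String mp4_link = "/mnt/sdcard/new_stream.mp4";     // illustrative path
    recorder = new FFmpegFrameRecorder(mp4_link, imageWidth, imageHeight, 1);
    recorder.setFormat("mp4");                          // file container instead of flv
    recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);   // codec typically paired with mp4
    recorder.setFrameRate(frameRate);
    recorder.setSampleRate(sampleAudioRateInHz);
    recorder.start();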