Advanced search

Media (0)

Keyword: - Tags -/serveur

No media matching your criteria is available on the site.

Other articles (60)

  • The SPIPmotion queue

    28 November 2010

    A queue stored in the database
    When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
    This new table consists of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to process; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document must automatically be attached; objet, the type of object to which (...) A sketch of the implied schema follows this list.

  • Adding information specific to users, and other changes to author-related behaviour

    12 April 2011

    The simplest way to add information to authors is to install the Inscription3 plugin. It also makes it possible to change certain user-related behaviours (see its documentation for more information).
    It is also possible to add fields to authors by installing the plugins champs extras 2 and Interface pour champs extras.

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)
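
The SPIPmotion item above describes its queue table only in prose, so here is a sketch spelling out the implied schema. It is a sketch only: SPIPmotion is a SPIP (PHP/MySQL) plugin, the column types are assumptions, the excerpt truncates after the objet field so any further columns are unknown, and the connection details are placeholders.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    // Hypothetical reconstruction of the queue table named in the teaser.
    // Only the four columns the text names are included; the types are
    // guesses in the style of other SPIP tables.
    public class SpipmotionQueueSketch {
        public static void main(String[] args) throws Exception {
            try (Connection db = DriverManager.getConnection("jdbc:mysql://localhost/spip", "user", "pass");
                 Statement st = db.createStatement()) {
                st.executeUpdate(
                    "CREATE TABLE IF NOT EXISTS spip_spipmotion_attentes ("
                    + " id_spipmotion_attente BIGINT NOT NULL AUTO_INCREMENT," // unique numeric id of the task to process
                    + " id_document BIGINT NOT NULL," // original document to encode
                    + " id_objet BIGINT NOT NULL,"    // object the encoded document is attached to
                    + " objet VARCHAR(25) NOT NULL,"  // type of that object
                    + " PRIMARY KEY (id_spipmotion_attente))");
            }
        }
    }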

On other sites (7438)

  • Squeezed image when images are larger than 1024x768 in FFmpeg with javacv

    14 April 2016, by Saty

    I am using the code below to stream RTMP to my Adobe FMS server, and it works. However, it shows a squeezed image when the camera resolution is above 1024x768.
    The issue appeared when we tested on a tablet whose camera resolution is 1200x800: the recorder automatically falls back to 1024x768, which squeezes both the preview and the actual video. One more thing: the recording format does not support MP4.

    Can anyone explain why it does not work in this case, and whether that format can be used? (One way to address the size mismatch is sketched after the listing below.)

    public class MainActivity extends Activity implements OnClickListener {

        private final static String LOG_TAG = "MainActivity";

        private PowerManager.WakeLock mWakeLock;

        private String ffmpeg_link = "rtmp://username:password@xxx.xxx.xxx.xxx:1935/live/test.flv";
        //private String ffmpeg_link = "/mnt/sdcard/new_stream.flv";

        private volatile FFmpegFrameRecorder recorder;
        boolean recording = false;
        long startTime = 0;

        private int sampleAudioRateInHz = 44100;
        private int imageWidth = 320;
        private int imageHeight = 240;
        private int frameRate = 30;

        private Thread audioThread;
        volatile boolean runAudioThread = true;
        private AudioRecord audioRecord;
        private AudioRecordRunnable audioRecordRunnable;

        private CameraView cameraView;
        private IplImage yuvIplimage = null;

        private Button recordButton;
        private LinearLayout mainLayout;

        @Override
        public void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);

            setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
            setContentView(R.layout.activity_main);

            initLayout();
            initRecorder();
        }

        @Override
        protected void onResume() {
            super.onResume();

            if (mWakeLock == null) {
                PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
                mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, LOG_TAG);
                mWakeLock.acquire();
            }
        }

        @Override
        protected void onPause() {
            super.onPause();

            if (mWakeLock != null) {
                mWakeLock.release();
                mWakeLock = null;
            }
        }

        @Override
        protected void onDestroy() {
            super.onDestroy();

            recording = false;
        }

        private void initLayout() {

            mainLayout = (LinearLayout) this.findViewById(R.id.record_layout);

            recordButton = (Button) findViewById(R.id.recorder_control);
            recordButton.setText("Start");
            recordButton.setOnClickListener(this);

            cameraView = new CameraView(this);

            LinearLayout.LayoutParams layoutParam = new LinearLayout.LayoutParams(imageWidth, imageHeight);
            mainLayout.addView(cameraView, layoutParam);
            Log.v(LOG_TAG, "added cameraView to mainLayout");
        }

        private void initRecorder() {
            Log.w(LOG_TAG, "initRecorder");

            if (yuvIplimage == null) {
                // Recreated after frame size is set in surface change method
                yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
                //yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_32S, 2);

                Log.v(LOG_TAG, "IplImage.create");
            }

            recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
            Log.v(LOG_TAG, "FFmpegFrameRecorder: " + ffmpeg_link + " imageWidth: " + imageWidth + " imageHeight " + imageHeight);

            recorder.setFormat("flv");
            Log.v(LOG_TAG, "recorder.setFormat(\"flv\")");

            recorder.setSampleRate(sampleAudioRateInHz);
            Log.v(LOG_TAG, "recorder.setSampleRate(sampleAudioRateInHz)");

            // re-set in the surface changed method as well
            recorder.setFrameRate(frameRate);
            Log.v(LOG_TAG, "recorder.setFrameRate(frameRate)");

            // Create audio recording thread
            audioRecordRunnable = new AudioRecordRunnable();
            audioThread = new Thread(audioRecordRunnable);
        }

        // Start the capture
        public void startRecording() {
            try {
                recorder.start();
                startTime = System.currentTimeMillis();
                recording = true;
                audioThread.start();
            } catch (FFmpegFrameRecorder.Exception e) {
                e.printStackTrace();
            }
        }

        public void stopRecording() {
            // This should stop the audio thread from running
            runAudioThread = false;

            if (recorder != null && recording) {
                recording = false;
                Log.v(LOG_TAG, "Finishing recording, calling stop and release on recorder");
                try {
                    recorder.stop();
                    recorder.release();
                } catch (FFmpegFrameRecorder.Exception e) {
                    e.printStackTrace();
                }
                recorder = null;
            }
        }

        @Override
        public boolean onKeyDown(int keyCode, KeyEvent event) {
            // Quit when back button is pushed
            if (keyCode == KeyEvent.KEYCODE_BACK) {
                if (recording) {
                    stopRecording();
                }
                finish();
                return true;
            }
            return super.onKeyDown(keyCode, event);
        }

        @Override
        public void onClick(View v) {
            if (!recording) {
                startRecording();
                Log.w(LOG_TAG, "Start Button Pushed");
                recordButton.setText("Stop");
            } else {
                stopRecording();
                Log.w(LOG_TAG, "Stop Button Pushed");
                recordButton.setText("Start");
            }
        }

        //---------------------------------------------
        // audio thread, gets and encodes audio data
        //---------------------------------------------
        class AudioRecordRunnable implements Runnable {

            @Override
            public void run() {
                // Set the thread priority
                android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

                // Audio
                int bufferSize;
                short[] audioData;
                int bufferReadResult;

                bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
                        AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
                audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
                        AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

                audioData = new short[bufferSize];

                Log.d(LOG_TAG, "audioRecord.startRecording()");
                audioRecord.startRecording();

                // Audio Capture/Encoding Loop
                while (runAudioThread) {
                    // Read from audioRecord
                    bufferReadResult = audioRecord.read(audioData, 0, audioData.length);
                    if (bufferReadResult > 0) {
                        //Log.v(LOG_TAG,"audioRecord bufferReadResult: " + bufferReadResult);

                        // Changes in this variable may not be picked up despite it being "volatile"
                        if (recording) {
                            try {
                                // Write to FFmpegFrameRecorder
                                Buffer[] buffer = {ShortBuffer.wrap(audioData, 0, bufferReadResult)};
                                recorder.record(buffer);
                            } catch (FFmpegFrameRecorder.Exception e) {
                                Log.v(LOG_TAG, e.getMessage());
                                e.printStackTrace();
                            }
                        }
                    }
                }
                Log.v(LOG_TAG, "AudioThread Finished");

                /* Capture/Encoding finished, release recorder */
                if (audioRecord != null) {
                    audioRecord.stop();
                    audioRecord.release();
                    audioRecord = null;
                    Log.v(LOG_TAG, "audioRecord released");
                }
            }
        }

        class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {

            private boolean previewRunning = false;

            private SurfaceHolder holder;
            private Camera camera;

            private byte[] previewBuffer;

            long videoTimestamp = 0;

            Bitmap bitmap;
            Canvas canvas;

            public CameraView(Context _context) {
                super(_context);

                holder = this.getHolder();
                holder.addCallback(this);
                holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
            }

            @Override
            public void surfaceCreated(SurfaceHolder holder) {
                camera = Camera.open();

                try {
                    camera.setPreviewDisplay(holder);
                    camera.setPreviewCallback(this);

                    Camera.Parameters currentParams = camera.getParameters();
                    Log.v(LOG_TAG, "Preview Framerate: " + currentParams.getPreviewFrameRate());
                    Log.v(LOG_TAG, "Preview imageWidth: " + currentParams.getPreviewSize().width + " imageHeight: " + currentParams.getPreviewSize().height);

                    // Use these values
                    imageWidth = currentParams.getPreviewSize().width;
                    imageHeight = currentParams.getPreviewSize().height;
                    frameRate = currentParams.getPreviewFrameRate();

                    bitmap = Bitmap.createBitmap(imageWidth, imageHeight, Bitmap.Config.ALPHA_8);

                    /*
                    Log.v(LOG_TAG,"Creating previewBuffer size: " + imageWidth * imageHeight * ImageFormat.getBitsPerPixel(currentParams.getPreviewFormat())/8);
                    previewBuffer = new byte[imageWidth * imageHeight * ImageFormat.getBitsPerPixel(currentParams.getPreviewFormat())/8];
                    camera.addCallbackBuffer(previewBuffer);
                    camera.setPreviewCallbackWithBuffer(this);
                    */

                    camera.startPreview();
                    previewRunning = true;
                } catch (IOException e) {
                    Log.v(LOG_TAG, e.getMessage());
                    e.printStackTrace();
                }
            }

            public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
                Log.v(LOG_TAG, "Surface Changed: width " + width + " height: " + height);

                // We would do this if we want to reset the camera parameters
                /*
                if (!recording) {
                    if (previewRunning) {
                        camera.stopPreview();
                    }
                    try {
                        //Camera.Parameters cameraParameters = camera.getParameters();
                        //p.setPreviewSize(imageWidth, imageHeight);
                        //p.setPreviewFrameRate(frameRate);
                        //camera.setParameters(cameraParameters);

                        camera.setPreviewDisplay(holder);
                        camera.startPreview();
                        previewRunning = true;
                    } catch (IOException e) {
                        Log.e(LOG_TAG, e.getMessage());
                        e.printStackTrace();
                    }
                }
                */

                // Get the current parameters
                Camera.Parameters currentParams = camera.getParameters();
                Log.v(LOG_TAG, "Preview Framerate: " + currentParams.getPreviewFrameRate());
                Log.v(LOG_TAG, "Preview imageWidth: " + currentParams.getPreviewSize().width + " imageHeight: " + currentParams.getPreviewSize().height);

                // Use these values
                imageWidth = currentParams.getPreviewSize().width;
                imageHeight = currentParams.getPreviewSize().height;
                frameRate = currentParams.getPreviewFrameRate();

                // Create the yuvIplimage if needed
                yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
                //yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_32S, 2);
            }

            @Override
            public void surfaceDestroyed(SurfaceHolder holder) {
                try {
                    camera.setPreviewCallback(null);

                    previewRunning = false;
                    camera.release();
                } catch (RuntimeException e) {
                    Log.v(LOG_TAG, e.getMessage());
                    e.printStackTrace();
                }
            }

            @Override
            public void onPreviewFrame(byte[] data, Camera camera) {

                if (yuvIplimage != null && recording) {
                    videoTimestamp = 1000 * (System.currentTimeMillis() - startTime);

                    // Put the camera preview frame right into the yuvIplimage object
                    yuvIplimage.getByteBuffer().put(data);

                    // FAQ about IplImage:
                    // - For custom raw processing of data, getByteBuffer() returns an NIO direct
                    //   buffer wrapped around the memory pointed by imageData, and under Android we can
                    //   also use that Buffer with Bitmap.copyPixelsFromBuffer() and copyPixelsToBuffer().
                    // - To get a BufferedImage from an IplImage, we may call getBufferedImage().
                    // - The createFrom() factory method can construct an IplImage from a BufferedImage.
                    // - There are also a few copy*() methods for BufferedImage<->IplImage data transfers.

                    // Let's try it..
                    // This works but only on transparency
                    // Need to find the right Bitmap and IplImage matching types

                    /*
                    bitmap.copyPixelsFromBuffer(yuvIplimage.getByteBuffer());
                    //bitmap.setPixel(10,10,Color.MAGENTA);

                    canvas = new Canvas(bitmap);
                    Paint paint = new Paint();
                    paint.setColor(Color.GREEN);
                    float leftx = 20;
                    float topy = 20;
                    float rightx = 50;
                    float bottomy = 100;
                    RectF rectangle = new RectF(leftx,topy,rightx,bottomy);
                    canvas.drawRect(rectangle, paint);

                    bitmap.copyPixelsToBuffer(yuvIplimage.getByteBuffer());
                    */
                    //Log.v(LOG_TAG,"Writing Frame");

                    try {
                        // Get the correct time
                        recorder.setTimestamp(videoTimestamp);

                        // Record the image into FFmpegFrameRecorder
                        recorder.record(yuvIplimage);
                    } catch (FFmpegFrameRecorder.Exception e) {
                        Log.v(LOG_TAG, e.getMessage());
                        e.printStackTrace();
                    }
                }
            }
        }
    }
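
    One plausible cause of the squeeze: surfaceCreated and surfaceChanged overwrite imageWidth and imageHeight with whatever preview size the driver picks (1024x768 here), while FFmpegFrameRecorder was already opened with the old values, so the encoder rescales frames to a different aspect ratio. Below is a minimal sketch of one way to keep the two in agreement; PreviewSizeHelper and its scoring are hypothetical, not part of javacv or the listing above, and rely only on the stock android.hardware.Camera API.

    import android.hardware.Camera;

    import java.util.List;

    // Hypothetical helper: pick the supported preview size whose aspect ratio
    // (and then area) is closest to the size the recorder will be opened with,
    // so the preview, the yuvIplimage buffer and the encoded video all agree.
    final class PreviewSizeHelper {
        static Camera.Size chooseClosestSize(Camera.Parameters params, int targetW, int targetH) {
            List<Camera.Size> sizes = params.getSupportedPreviewSizes();
            Camera.Size best = sizes.get(0);
            double targetRatio = (double) targetW / targetH;
            double bestScore = Double.MAX_VALUE;
            for (Camera.Size s : sizes) {
                // Weight aspect-ratio mismatch heavily; use the area difference as a tie-breaker.
                double score = Math.abs((double) s.width / s.height - targetRatio) * 1000.0
                        + Math.abs((double) s.width * s.height - (double) targetW * targetH) / 1e6;
                if (score < bestScore) {
                    bestScore = score;
                    best = s;
                }
            }
            return best;
        }
    }

    In surfaceChanged one would then stop the preview, apply the chosen size with params.setPreviewSize(best.width, best.height) and camera.setParameters(params), restart the preview, and create the FFmpegFrameRecorder and yuvIplimage with that same size, instead of letting the driver fall back to 1024x768 on its own.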

  • Can't we use mp4 as the format of FFmpegFrameRecorder while streaming over RTMP?

    13 April 2016, by Saty

    I am trying to stream video with FFmpegFrameRecorder using javacv. Everything works well; however, I want to use "mp4" as the recording format instead of FLV. Is that possible? When I set it, the recorder crashes.

    I am using the same code that half of the internet community uses to live stream; it is identical to the MainActivity listing in the previous question. (A sketch of an mp4 configuration follows.)

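    As for the crash: the rtmp protocol carries FLV, so recorder.setFormat("mp4") cannot work against an rtmp:// URL; mp4 is only usable when writing to an mp4-capable destination such as a local file. Below is a minimal sketch of an mp4 file recording, assuming a javacv build whose FFmpegFrameRecorder exposes setVideoCodec/setAudioCodec and whose avcodec class defines the usual AV_CODEC_ID_* constants; the output path is illustrative.

    // Sketch only: keep "flv" for the rtmp:// URL, and use "mp4" when the
    // destination is a local file. The path and codec choices are assumptions.
    String mp4_link = "/mnt/sdcard/new_stream.mp4";
    FFmpegFrameRecorder mp4Recorder = new FFmpegFrameRecorder(mp4_link, imageWidth, imageHeight, 1);
    mp4Recorder.setFormat("mp4");                         // mp4 container instead of flv
    mp4Recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);  // H.264 video, the usual pairing for mp4
    mp4Recorder.setAudioCodec(avcodec.AV_CODEC_ID_AAC);   // AAC audio
    mp4Recorder.setFrameRate(frameRate);
    mp4Recorder.setSampleRate(sampleAudioRateInHz);
    mp4Recorder.start();

    To get both a live stream and an mp4, the usual options are to run two recorders in parallel or to record/remux the FLV stream on the server side.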

  • Writing an MP4 file on the Mac with OpenCV ffmpeg

    13 January 2015, by Sameer Parekh

    I am using OpenCV with ffmpeg on a Mac to write video. I’ve been able to successfully write .avi files using the codec/fourcc code FMP4. I would like to write .mp4 files, however. When I try to write an .mp4 file using fourcc FMP4 I get this error:

    [mp4 @ 0x100b4ec00] Tag FMP4/0x34504d46 incompatible with output codec id '13' ( [0][0][0])

    When I use AVC1 I get the following error:

    [libx264 @ 0x104003000] broken ffmpeg default settings detected
    [libx264 @ 0x104003000] use an encoding preset (e.g. -vpre medium)
    [libx264 @ 0x104003000] preset usage: -vpre <speed> -vpre <profile>
    [libx264 @ 0x104003000] speed presets are listed in x264 --help
    [libx264 @ 0x104003000] profile is optional; x264 defaults to high
    Could not open codec 'libx264': Unspecified error

    Does anyone here know the right codec to use with OpenCV and ffmpeg to write to an MP4 container on the Mac? (A sketch using the 'mp4v' fourcc follows this question.)

    If AVC1 is the right codec, how do I install ffmpeg + OpenCV correctly? I did:

    brew install gpac
    brew install ffmpeg
    brew install opencv

    The call I am using to open the VideoWriter:

    fourcc = cv2.cv.CV_FOURCC('A', 'V', 'C', '1')  
    video_out = cv2.VideoWriter(
       filename=output_filename,
       fourcc=fourcc,
       fps=video_fps,
       frameSize=(video_width,video_height),
       isColor=1)

    When I run x264 --help I get

    % x264 --help
    x264 core:125
    Syntax: x264 [options] -o outfile infile

    Infile can be raw (in which case resolution is required),
     or YUV4MPEG (*.y4m),
     or Avisynth if compiled with support (no).
     or libav* formats if compiled with lavf support (no) or ffms support (no).
    Outfile type is selected by filename:
    .264 -> Raw bytestream
    .mkv -> Matroska
    .flv -> Flash Video
    .mp4 -> MP4 if compiled with GPAC support (no)
    Output bit depth: 8 (configured at compile time)

    Options:

     -h, --help                  List basic options
         --longhelp              List more options
         --fullhelp              List all options

    Example usage:

          Constant quality mode:
                x264 --crf 24 -o <output> <input>

          Two-pass with a bitrate of 1000kbps:
                x264 --pass 1 --bitrate 1000 -o <output> <input>
                x264 --pass 2 --bitrate 1000 -o <output> <input>

          Lossless:
                x264 --qp 0 -o <output> <input>

          Maximum PSNR at the cost of speed and visual quality:
                x264 --preset placebo --tune psnr -o <output> <input>

          Constant bitrate at 1000kbps with a 2 second-buffer:
                x264 --vbv-bufsize 2000 --bitrate 1000 -o <output> <input>

    Presets:

         --profile <string>      Force the limits of an H.264 profile
                                     Overrides all settings.
                                     - baseline,main,high,high10,high422,high444
         --preset <string>       Use a preset to select encoding settings [medium]
                                     Overridden by user settings.
                                     - ultrafast,superfast,veryfast,faster,fast
                                     - medium,slow,slower,veryslow,placebo
         --tune <string>         Tune the settings for a particular type of source
                                 or situation
                                     Overridden by user settings.
                                     Multiple tunings are separated by commas.
                                     Only one psy tuning can be used at a time.
                                     - psy tunings: film,animation,grain,
                                                    stillimage,psnr,ssim
                                     - other tunings: fastdecode,zerolatency

    Frame-type options:

      -I, --keyint <integer or "infinite"> Maximum GOP size [250]
         --tff                   Enable interlaced mode (top field first)
         --bff                   Enable interlaced mode (bottom field first)
         --pulldown <string>     Use soft pulldown to change frame rate
                                     - none, 22, 32, 64, double, triple, euro (requires cfr input)

    Ratecontrol:

     -B, --bitrate <integer>     Set bitrate (kbit/s)
         --crf <float>           Quality-based VBR (0-51) [23.0]
         --vbv-maxrate <integer> Max local bitrate (kbit/s) [0]
         --vbv-bufsize <integer> Set size of the VBV buffer (kbit) [0]
     -p, --pass <integer>        Enable multipass ratecontrol
                                     - 1: First pass, creates stats file
                                     - 2: Last pass, does not overwrite stats file

    Input/Output:

     -o, --output <string>       Specify output file
         --sar width:height      Specify Sample Aspect Ratio
         --fps <float|rational>  Specify framerate
         --seek <integer>        First frame to encode
         --frames <integer>      Maximum number of frames to encode
         --level <string>        Specify level (as defined by Annex A)
         --quiet                 Quiet Mode

    Filtering:

         --vf, --video-filter <filter0>/<filter1>/... Apply video filtering to the input file

         Filter options may be specified in <filter>:<option>=<value> format.

         Available filters:
         crop:left,top,right,bottom
         select_every:step,offset1[,...]

    Thanks,
    - s
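
    A sketch of the fourcc fix suggested by the first error above (the mp4 muxer rejects the FMP4 tag but accepts mp4v for the same MPEG-4 codec, id 13), shown here through OpenCV's Java bindings and assuming OpenCV 3.x (org.opencv.videoio.VideoWriter); the file name, fps and frame size are illustrative. 'avc1' failed here because OpenCV's default libx264 settings are rejected by newer x264 builds, as the "broken ffmpeg default settings detected" error above indicates.

    import org.opencv.core.Core;
    import org.opencv.core.CvType;
    import org.opencv.core.Mat;
    import org.opencv.core.Size;
    import org.opencv.videoio.VideoWriter;

    public class Mp4WriterSketch {
        static { System.loadLibrary(Core.NATIVE_LIBRARY_NAME); }

        public static void main(String[] args) {
            // 'mp4v' is the MPEG-4 Part 2 tag the mp4 container expects;
            // FMP4 is an AVI-style tag for the same codec and is rejected.
            int fourcc = VideoWriter.fourcc('m', 'p', '4', 'v');
            VideoWriter out = new VideoWriter("output.mp4", fourcc, 30.0, new Size(640, 480), true);
            if (!out.isOpened()) {
                throw new IllegalStateException("VideoWriter failed to open output.mp4");
            }
            Mat frame = Mat.zeros(480, 640, CvType.CV_8UC3);
            out.write(frame); // write a single black frame as a smoke test
            out.release();
        }
    }

    In the 2.4-era Python API the question uses, the equivalent choice is cv2.cv.CV_FOURCC('m', 'p', '4', 'v').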