Advanced search

Media (1)

Keyword: - Tags -/portrait

Other articles (37)

  • Customising categories

    21 June 2013

    Category creation form
    For those who know SPIP well, a category can be likened to a "rubrique" (section).
    For a document of the category type, the fields offered by default are: Texte
    This form can be edited under:
    Administration > Configuration des masques de formulaire.
    For a document of the media type, the fields not displayed by default are: Descriptif rapide
    It is also in this configuration section that you can specify the (...)

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer flash player is used as a fallback.
    The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
    These technologies make it possible to deliver video and sound both to conventional computers (...)
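
    Both teasers above describe the same fallback pattern; a minimal sketch in plain markup (illustrative only, assuming nothing about MediaSPIP's actual player markup): the browser tries the HTML5 element first and only renders the nested Flash object when the video tag is not understood.

        <video src="clip.mp4" controls>
          <object type="application/x-shockwave-flash" data="flowplayer.swf">
            <!-- Flowplayer flash fallback; player configuration goes here -->
          </object>
        </video>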

On other sites (6877)

  • Video encoding and keyframes

    24 February 2013, by Tishu

    I am transcoding a video frame by frame and using x264+ffmpeg to encode. The original video plays fine, but the first few frames of my transcoded video show grey artefacts. I understand this is because of temporal compression, and these artefacts disappear after a few frames.

    See these two pictures, which are the first and second frames; the third frame is normal (i.e. no grey artefact and not blurry like the second one):
    [Image: first frame]
    [Image: second frame]

    How can I force the first frame to be a key frame (i.e. fully encoded in my output video) so that these artefacts do not show?

    Edit - more details

    Here is what I am doing in more detail. I used bits from different tutorials to read a video frame by frame and re-encode each frame to a new video. My encoding parameters are the following:

    avcodec_get_context_defaults3(c, *codec);
    c->codec_id = codec_id;
    c->bit_rate = output_bitrate;
    /* Resolution must be a multiple of two. */
    c->width    = output_width;
    c->height   = output_height;
    /* timebase: This is the fundamental unit of time (in seconds) in terms
    * of which frame timestamps are represented. For fixed-fps content,
    * timebase should be 1/framerate and timestamp increments should be
    * identical to 1. */
    st->r_frame_rate.num = output_framerate_num;
    st->r_frame_rate.den = output_framerate_den;
    c->time_base.den = output_timebase_den;
    c->time_base.num = output_timebase_num;
    c->gop_size      = 3; /* emit one intra frame every three frames at most */
    c->pix_fmt       = STREAM_PIX_FMT;
    if (c->codec_id == AV_CODEC_ID_MPEG2VIDEO) {
       /* just for testing, we also add B frames */
       c->max_b_frames = 2;
    }
    if (c->codec_id == AV_CODEC_ID_MPEG1VIDEO) {
       /* Needed to avoid using macroblocks in which some coeffs overflow.
        * This does not happen with normal video, it just happens here as
        * the motion of the chroma plane does not match the luma plane. */
       c->mb_decision = 2;
    }
    c->max_b_frames = 2;
    c->scenechange_threshold = 0;
    c->rc_buffer_size = 0;
    c->me_method = ME_ZERO;

    Then I process each frame, probably doing something wrong there. The decoding bit:

    while(av_read_frame(gFormatCtx, &packet)>=0) {
       // Is this a packet from the video stream?
       if(packet.stream_index==gVideoStreamIndex) {
           // Decode video frame
           avcodec_decode_video2(gVideoCodecCtx, pCurrentFrame, &frameFinished, &packet);
           // Did we get a video frame?
           if(frameFinished) {
               [...]
               if(firstPts == -999) /*Initial value*/
                   firstPts = packet.pts;
               deltaPts = packet.pts - firstPts;
               double seconds = deltaPts*av_q2d(gFormatCtx->streams[gVideoStreamIndex]->time_base);
               [...]
               muxing_writeVideoFrame(pCurrentFrame, packet.pts);
           }
       }
    }

    The actual writing:

    int muxing_writeVideoFrame(AVFrame *frame, int64_t pts)
    {
       frameCount = frameCount + 1;
       if (frameCount > 0)
       {
           if (video_st)
               video_pts = (double)video_st->pts.val * video_st->time_base.num /
                           video_st->time_base.den;
           else
               video_pts = 0.0;

           if (video_st && !(video_st && audio_st && audio_pts < video_pts))
           {
               frame->pts = pts;//av_rescale_q(frame_count, video_st->codec->time_base, video_st->time_base);
               write_video_frame(oc, video_st, frame);
           }
       }

       return 0;
    }
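
    One detail worth checking here (a hedged observation, not a verified fix): pts comes straight from the input packet, so it is expressed in the input stream's time_base, while avcodec_encode_video2() expects frame->pts in the output codec's time_base; the commented-out av_rescale_q() call above suggests this conversion was attempted. A minimal sketch, using the same globals as the decoding loop:

        /* Sketch: rescale the decoded packet's pts from the input stream's
         * time_base to the output codec's time_base before encoding. */
        frame->pts = av_rescale_q(pts,
                gFormatCtx->streams[gVideoStreamIndex]->time_base,
                video_st->codec->time_base);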

    static int write_video_frame(AVFormatContext *oc, AVStream *st, AVFrame *frame)
    {
       int ret;
       static struct SwsContext *sws_ctx;
       //LOGI(10, frame_count);
       AVCodecContext *c = st->codec;

       /* encode the image */
       AVPacket pkt;
       int got_output;
       av_init_packet(&pkt);
       pkt.data = NULL;    // packet data will be allocated by the encoder
       pkt.size = 0;
       ret = avcodec_encode_video2(c, &pkt, frame, &got_output);
       if (ret < 0) {
           fprintf(stderr, "Error encoding video frame: %s\n", av_err2str(ret));
           exit(1);
       }
       /* If size is zero, it means the image was buffered. */
       if (got_output) {
           if (c->coded_frame->key_frame)
               pkt.flags |= AV_PKT_FLAG_KEY;
           pkt.stream_index = st->index;
           /* Write the compressed frame to the media file. */
           ret = av_interleaved_write_frame(oc, &pkt);
       } else {
           ret = 0;
       }

       if (ret != 0) {
           LOGI(10, av_err2str(ret));
           exit(1);
       }
       frame_count++;
       return got_output;
    }
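
    A minimal sketch of one way to get the keyframe asked about above, against the same (since-deprecated) avcodec_encode_video2() API used in this code; an illustration rather than a confirmed fix. Requesting an intra picture on the first AVFrame makes x264 emit a keyframe at the very start of the stream:

        /* Sketch: force the first output frame to be a keyframe.
         * Assumes frame_count is still 0 when the first frame reaches
         * write_video_frame(), as in the code above. */
        if (frame_count == 0) {
            frame->pict_type = AV_PICTURE_TYPE_I;
            frame->key_frame = 1; /* advisory; the encoder makes the final call */
        }
        ret = avcodec_encode_video2(c, &pkt, frame, &got_output);
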
  • FFmpeg memory leak

    10 September 2015, by Spamdark

    I have developed just a simple library by modifying a library that I found on the internet.

    What scares me is that when I play an AVI, it plays and frees the memory when the video ends, but when I play the video it looks like a memory leak! Memory grows to 138 MB even though the video has ended and the FreeAll method (a function that deletes the context, etc.) has been called.

    Here is the code of the method that is causing the memory leak:

    int VideoGL::NextVideoFrame(){
    int frameDone = 0;
    int result = 0;
    double pts = 0;

    if(!this->ended){

    if (!_started) return 0;
    AVPacket* packet;

    // Get the number of milliseconds passed and see if we should display a new frame
    int64_t msPassed = (1000 * (clock() - _baseTime)) / CLOCKS_PER_SEC;
    if (msPassed >= _currentPts)
    {
       // If this is not the current frame, copy it to the buffer
       if (_currentFramePts != _currentPts){
           _currentFramePts = _currentPts;
           memcpy(buffer_a,buffer, 3 * _codec_context_video->width * _codec_context_video->height);
           result = 1;
       }

       // Try to load a new frame from the video packet queue
       bool goodop=false;
       AVFrame *_n_frame = avcodec_alloc_frame();
       while (!frameDone && (packet = this->DEQUEUE(VIDEO)) != NULL)
       {
           if (packet == (AVPacket*)-1) return -1;

           goodop=true;

           _s_pts = packet->pts;
           avcodec_decode_video2(_codec_context_video, _n_frame, &frameDone, packet);
           av_free_packet(packet);

           if (packet->dts == AV_NOPTS_VALUE)
           {
               if (_n_frame->opaque && *(uint64_t*)_n_frame->opaque != AV_NOPTS_VALUE) pts = (double) *(uint64_t*)_n_frame->opaque;
               else pts = 0;
           }
           else pts = (double) packet->dts;

           pts *= av_q2d(_codec_context_video->time_base);

       }

       if (frameDone)
       {
           // if a frame was loaded scale it to the current texture frame buffer, but also set the pts so that it won't be copied to the texture until it's time
           sws_scale(sws_ctx,_n_frame->data, _n_frame->linesize, 0, _codec_context_video->height, _rgb_frame->data, _rgb_frame->linesize);


           double nts = 1.0/av_q2d(_codec_context_video->time_base);
           _currentPts = (uint64_t) (pts*nts);

       }

       avcodec_free_frame(&_n_frame);
       av_free(_n_frame);

       if(!goodop){
           ended=true;
       }
    }
    }

    return result;
    }

    I’ll be waiting for answers, thanks.
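
    A hedged reading of the dequeue loop above: av_free_packet() releases a packet's payload but not the AVPacket struct itself, so if DEQUEUE() heap-allocates each packet (the code shown does not say), one struct leaks per decoded frame; the loop also reads packet->dts after av_free_packet() has already run, which is fragile. A minimal sketch of a tighter version, under those assumptions:

        /* Sketch: read the timestamp before releasing the packet, then free
         * both the payload and the struct. Assumes DEQUEUE() allocates each
         * AVPacket with av_malloc()/malloc(); adjust to the real allocator. */
        while (!frameDone && (packet = this->DEQUEUE(VIDEO)) != NULL)
        {
            if (packet == (AVPacket*)-1) return -1;
            goodop = true;
            _s_pts = packet->pts;
            avcodec_decode_video2(_codec_context_video, _n_frame, &frameDone, packet);

            if (packet->dts != AV_NOPTS_VALUE)
                pts = (double) packet->dts;
            else if (_n_frame->opaque && *(uint64_t*)_n_frame->opaque != AV_NOPTS_VALUE)
                pts = (double) *(uint64_t*)_n_frame->opaque;
            else
                pts = 0;
            pts *= av_q2d(_codec_context_video->time_base);

            av_free_packet(packet);  /* frees the payload only */
            av_free(packet);         /* frees the AVPacket struct itself */
        }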

  • Audio & Video not synchronized properly if I merge more videos in mp4parser

    1 October 2013, by maniya

    I have used mp4parser to merge video segments captured with dynamic pause/record, with a maximum recording length of 6 seconds. In preview it works fine when the video is recorded with few pause/record cycles. If I try more than 3 pause/record cycles, the last video file does not get merged properly with the audio: at the start of the video the sync is OK, but at the end the video hangs while the audio keeps playing for the remaining duration, about 1 second.

    My RecordingManager

    public class RecordingManager implements Camera.ErrorCallback, MediaRecorder.OnErrorListener, MediaRecorder.OnInfoListener {

       private static final String TAG = RecordingManager.class.getSimpleName();
       private static final int FOCUS_AREA_RADIUS = 32;
       private static final int FOCUS_MAX_VALUE = 1000;
       private static final int FOCUS_MIN_VALUE = -1000;
       private static final long MINIMUM_RECORDING_TIME = 2000;
       private static final int MAXIMUM_RECORDING_TIME = 70 * 1000;
       private static final long LOW_STORAGE_THRESHOLD = 5 * 1024 * 1024;
       private static final long RECORDING_FILE_LIMIT = 100 * 1024 * 1024;

       private boolean paused = true;

       private MediaRecorder mediaRecorder = null;
       private boolean recording = false;

       private FrameLayout previewFrame = null;

       private boolean mPreviewing = false;

    //    private TextureView mTextureView = null;
    //    private SurfaceTexture mSurfaceTexture = null;
    //    private boolean mSurfaceTextureReady = false;
    //
       private SurfaceView surfaceView = null;
       private SurfaceHolder surfaceHolder = null;
       private boolean surfaceViewReady = false;

       private Camera camera = null;
       private Camera.Parameters cameraParameters = null;
       private CamcorderProfile camcorderProfile = null;

       private int mOrientation = -1;
       private OrientationEventListener mOrientationEventListener = null;

       private long mStartRecordingTime;
       private int mVideoWidth;
       private int mVideoHeight;
       private long mStorageSpace;

       private Handler mHandler = new Handler();
    //    private Runnable mUpdateRecordingTimeTask = new Runnable() {
    //        @Override
    //        public void run() {
    //            long recordingTime = System.currentTimeMillis() - mStartRecordingTime;
    //            Log.d(TAG, String.format("Recording time:%d", recordingTime));
    //            mHandler.postDelayed(this, CLIP_GRAPH_UPDATE_INTERVAL);
    //        }
    //    };
       private Runnable mStopRecordingTask = new Runnable() {
           @Override
           public void run() {
               stopRecording();
           }
       };

       private static RecordingManager mInstance = null;
       private Activity currentActivity = null;
       private String destinationFilepath = "";
       private String snapshotFilepath = "";

       public static RecordingManager getInstance(Activity activity, FrameLayout previewFrame) {
           if (mInstance == null || mInstance.currentActivity != activity) {
               mInstance = new RecordingManager(activity, previewFrame);
           }
           return mInstance;
       }

       private RecordingManager(Activity activity, FrameLayout previewFrame) {
           currentActivity = activity;
           this.previewFrame = previewFrame;
       }

       public int getVideoWidth() {
           return this.mVideoWidth;
       }
       public int getVideoHeight() {
           return this.mVideoHeight;
       }
       public void setDestinationFilepath(String filepath) {
           this.destinationFilepath = filepath;
       }
       public String getDestinationFilepath() {
           return this.destinationFilepath;
       }
       public void setSnapshotFilepath(String filepath) {
           this.snapshotFilepath = filepath;
       }
       public String getSnapshotFilepath() {
           return this.snapshotFilepath;
       }
       public void init(String videoPath, String snapshotPath) {
           Log.v(TAG, "init.");
           setDestinationFilepath(videoPath);
           setSnapshotFilepath(snapshotPath);
           if (!Utils.isExternalStorageAvailable()) {
               showStorageErrorAndFinish();
               return;
           }

           openCamera();
           if (camera == null) {
               showCameraErrorAndFinish();
               return;
           }
       }

       public void onResume() {
           Log.v(TAG, "onResume.");
           paused = false;

           // Open the camera
           if (camera == null) {
               openCamera();
               if (camera == null) {
                   showCameraErrorAndFinish();
                   return;
               }
           }

           // Initialize the surface texture or surface view
    //        if (useTexture() && mTextureView == null) {
    //            initTextureView();
    //            mTextureView.setVisibility(View.VISIBLE);
    //        } else if (!useTexture() && mSurfaceView == null) {
               initSurfaceView();
               surfaceView.setVisibility(View.VISIBLE);
    //        }

           // Start the preview
           if (!mPreviewing) {
               startPreview();
           }
       }

       private void openCamera() {
           Log.v(TAG, "openCamera");
           try {
               camera = Camera.open();
               camera.setErrorCallback(this);
               camera.setDisplayOrientation(90); // Since we only support portrait mode
               cameraParameters = camera.getParameters();
           } catch (RuntimeException e) {
               e.printStackTrace();
               camera = null;
           }
       }

       private void closeCamera() {
           Log.v(TAG, "closeCamera");
           if (camera == null) {
               Log.d(TAG, "Already stopped.");
               return;
           }

           camera.setErrorCallback(null);
           if (mPreviewing) {
               stopPreview();
           }
           camera.release();
           camera = null;
       }




       private void initSurfaceView() {
           surfaceView = new SurfaceView(currentActivity);
           surfaceView.getHolder().addCallback(new SurfaceViewCallback());
           surfaceView.setVisibility(View.GONE);
           FrameLayout.LayoutParams params = new LayoutParams(
                   LayoutParams.MATCH_PARENT, LayoutParams.MATCH_PARENT, Gravity.CENTER);
           surfaceView.setLayoutParams(params);
           Log.d(TAG, "add surface view to preview frame");
           previewFrame.addView(surfaceView);
       }

       private void releaseSurfaceView() {
           if (surfaceView != null) {
               previewFrame.removeAllViews();
               surfaceView = null;
               surfaceHolder = null;
               surfaceViewReady = false;
           }
       }

       private void startPreview() {
    //        if ((useTexture() && !mSurfaceTextureReady) || (!useTexture() && !mSurfaceViewReady)) {
    //            return;
    //        }

           Log.v(TAG, "startPreview.");
           if (mPreviewing) {
               stopPreview();
           }

           setCameraParameters();
           resizePreview();

           try {
    //            if (useTexture()) {
    //                mCamera.setPreviewTexture(mSurfaceTexture);
    //            } else {
                   camera.setPreviewDisplay(surfaceHolder);
    //            }
               camera.startPreview();
               mPreviewing = true;
           } catch (Exception e) {
               closeCamera();
               e.printStackTrace();
               Log.e(TAG, "startPreview failed.");
           }

       }

       private void stopPreview() {
           Log.v(TAG, "stopPreview");
           if (camera != null) {
               camera.stopPreview();
               mPreviewing = false;
           }
       }

       public void onPause() {
           paused = true;

           if (recording) {
               stopRecording();
           }
           closeCamera();

    //        if (useTexture()) {
    //            releaseSurfaceTexture();
    //        } else {
               releaseSurfaceView();
    //        }
       }

       private void setCameraParameters() {
           if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_720P)) {
               camcorderProfile = CamcorderProfile.get(CamcorderProfile.QUALITY_720P);
           } else if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_480P)) {
               camcorderProfile = CamcorderProfile.get(CamcorderProfile.QUALITY_480P);
           } else {
               camcorderProfile = CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH);
           }
           mVideoWidth = camcorderProfile.videoFrameWidth;
           mVideoHeight = camcorderProfile.videoFrameHeight;
           camcorderProfile.fileFormat = MediaRecorder.OutputFormat.MPEG_4;
           camcorderProfile.videoFrameRate = 30;

           Log.v(TAG, "mVideoWidth=" + mVideoWidth + " mVideoHeight=" + mVideoHeight);
           cameraParameters.setPreviewSize(mVideoWidth, mVideoHeight);

           if (cameraParameters.getSupportedWhiteBalance().contains(Camera.Parameters.WHITE_BALANCE_AUTO)) {
               cameraParameters.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_AUTO);
           }

           if (cameraParameters.getSupportedFocusModes().contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
               cameraParameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
           }

           cameraParameters.setRecordingHint(true);
           cameraParameters.set("cam_mode", 1);

           camera.setParameters(cameraParameters);
           cameraParameters = camera.getParameters();

           camera.setDisplayOrientation(90);
           android.hardware.Camera.CameraInfo info = new android.hardware.Camera.CameraInfo();
           Log.d(TAG, info.orientation + " degree");
       }

       private void resizePreview() {
           Log.d(TAG, String.format("Video size:%d|%d", mVideoWidth, mVideoHeight));

           Point optimizedSize = getOptimizedPreviewSize(mVideoWidth, mVideoHeight);
           Log.d(TAG, String.format("Optimized size:%d|%d", optimizedSize.x, optimizedSize.y));

           ViewGroup.LayoutParams params = (ViewGroup.LayoutParams) previewFrame.getLayoutParams();
           params.width = optimizedSize.x;
           params.height = optimizedSize.y;
           previewFrame.setLayoutParams(params);
       }

       public void setOrientation(int ori) {
           this.mOrientation = ori;
       }

       public void setOrientationEventListener(OrientationEventListener listener) {
           this.mOrientationEventListener = listener;
       }

       public Camera getCamera() {
           return camera;
       }

       @SuppressWarnings("serial")
       public void setFocusArea(float x, float y) {
           if (camera != null) {
               int viewWidth = surfaceView.getWidth();
               int viewHeight = surfaceView.getHeight();

               int focusCenterX = FOCUS_MAX_VALUE - (int) (x / viewWidth * (FOCUS_MAX_VALUE - FOCUS_MIN_VALUE));
               int focusCenterY = FOCUS_MIN_VALUE + (int) (y / viewHeight * (FOCUS_MAX_VALUE - FOCUS_MIN_VALUE));
               final int left = focusCenterY - FOCUS_AREA_RADIUS < FOCUS_MIN_VALUE ? FOCUS_MIN_VALUE : focusCenterY - FOCUS_AREA_RADIUS;
               final int top = focusCenterX - FOCUS_AREA_RADIUS < FOCUS_MIN_VALUE ? FOCUS_MIN_VALUE : focusCenterX - FOCUS_AREA_RADIUS;
               final int right = focusCenterY + FOCUS_AREA_RADIUS > FOCUS_MAX_VALUE ? FOCUS_MAX_VALUE : focusCenterY + FOCUS_AREA_RADIUS;
               final int bottom = focusCenterX + FOCUS_AREA_RADIUS > FOCUS_MAX_VALUE ? FOCUS_MAX_VALUE : focusCenterX + FOCUS_AREA_RADIUS;

               Camera.Parameters params = camera.getParameters();
               params.setFocusAreas(new ArrayList() {
                   {
                       add(new Camera.Area(new Rect(left, top, right, bottom), 1000));
                   }
               });
               camera.setParameters(params);
               camera.autoFocus(new AutoFocusCallback() {
                   @Override
                   public void onAutoFocus(boolean success, Camera camera) {
                       Log.d(TAG, "onAutoFocus");
                   }
               });
           }
       }

       public void startRecording(String destinationFilepath) {
           if (!recording) {
               updateStorageSpace();
               setDestinationFilepath(destinationFilepath);
               if (mStorageSpace <= LOW_STORAGE_THRESHOLD) {
                   Log.v(TAG, "Storage issue, ignore the start request");
                   Toast.makeText(currentActivity, "Storage issue, ignore the recording request", Toast.LENGTH_LONG).show();
                   return;
               }

               if (!prepareMediaRecorder()) {
                   Toast.makeText(currentActivity, "prepareMediaRecorder failed.", Toast.LENGTH_LONG).show();
                   return;
               }

               Log.d(TAG, "Successfully prepare media recorder.");
               try {
                   mediaRecorder.start();
               } catch (RuntimeException e) {
                   Log.e(TAG, "MediaRecorder start failed.");
                   releaseMediaRecorder();
                   return;
               }

               mStartRecordingTime = System.currentTimeMillis();

               if (mOrientationEventListener != null) {
                   mOrientationEventListener.disable();
               }

               recording = true;
           }
       }

       public void stopRecording() {
           if (recording) {
               if (!paused) {
                   // Capture at least 1 second video
                   long currentTime = System.currentTimeMillis();
                   if (currentTime - mStartRecordingTime < MINIMUM_RECORDING_TIME) {
                       mHandler.postDelayed(mStopRecordingTask, MINIMUM_RECORDING_TIME - (currentTime - mStartRecordingTime));
                       return;
                   }
               }

               if (mOrientationEventListener != null) {
                   mOrientationEventListener.enable();
               }

    //            mHandler.removeCallbacks(mUpdateRecordingTimeTask);

               try {
                   mediaRecorder.setOnErrorListener(null);
                   mediaRecorder.setOnInfoListener(null);
                   mediaRecorder.stop(); // stop the recording
                   Toast.makeText(currentActivity, "Video file saved.", Toast.LENGTH_LONG).show();

                   long stopRecordingTime = System.currentTimeMillis();
                   Log.d(TAG, String.format("stopRecording. file:%s duration:%d", destinationFilepath, stopRecordingTime - mStartRecordingTime));

                   // Calculate the duration of video
                   MediaMetadataRetriever mmr = new MediaMetadataRetriever();
                   mmr.setDataSource(this.destinationFilepath);
                   String _length = mmr.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION);
                   if (_length != null) {
                       Log.d(TAG, String.format("clip duration:%d", Long.parseLong(_length)));
                   }

                   // Taking the snapshot of video
                   Bitmap snapshot = ThumbnailUtils.createVideoThumbnail(this.destinationFilepath, Thumbnails.MICRO_KIND);
                   try {
                       FileOutputStream out = new FileOutputStream(this.snapshotFilepath);
                       snapshot.compress(Bitmap.CompressFormat.JPEG, 70, out);
                       out.close();
                   } catch (Exception e) {
                       e.printStackTrace();
                   }

    //                mActivity.showPlayButton();

               } catch (RuntimeException e) {
                   e.printStackTrace();
                   Log.e(TAG, e.getMessage());
                   // if no valid audio/video data has been received when stop() is
                   // called
               } finally {
    //          

                   releaseMediaRecorder(); // release the MediaRecorder object
                   if (!paused) {
                       cameraParameters = camera.getParameters();
                   }
                   recording = false;
               }

           }
       }

       public void setRecorderOrientation(int orientation) {
           // For back camera only
           if (orientation != -1) {
               Log.d(TAG, "set orientationHint:" + (orientation + 135) % 360 / 90 * 90);
               mediaRecorder.setOrientationHint((orientation + 135) % 360 / 90 * 90);
           }else {
               Log.d(TAG, "not set orientationHint to mediaRecorder");
           }
       }

       private boolean prepareMediaRecorder() {
           mediaRecorder = new MediaRecorder();

           camera.unlock();
           mediaRecorder.setCamera(camera);

           mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
           mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);

           mediaRecorder.setProfile(camcorderProfile);

           mediaRecorder.setMaxDuration(MAXIMUM_RECORDING_TIME);
           mediaRecorder.setOutputFile(this.destinationFilepath);

           try {
               mediaRecorder.setMaxFileSize(Math.min(RECORDING_FILE_LIMIT, mStorageSpace - LOW_STORAGE_THRESHOLD));
           } catch (RuntimeException exception) {
           }

           setRecorderOrientation(mOrientation);

           if (!useTexture()) {
               mediaRecorder.setPreviewDisplay(surfaceHolder.getSurface());
           }

           try {
               mediaRecorder.prepare();
           } catch (IllegalStateException e) {
               releaseMediaRecorder();
               return false;
           } catch (IOException e) {
               releaseMediaRecorder();
               return false;
           }

           mediaRecorder.setOnErrorListener(this);
           mediaRecorder.setOnInfoListener(this);

           return true;

       }

       private void releaseMediaRecorder() {
           if (mediaRecorder != null) {
               mediaRecorder.reset(); // clear recorder configuration
               mediaRecorder.release(); // release the recorder object
               mediaRecorder = null;
               camera.lock(); // lock camera for later use
           }
       }

       private Point getOptimizedPreviewSize(int videoWidth, int videoHeight) {
           Display display = currentActivity.getWindowManager().getDefaultDisplay();
           Point size = new Point();
           display.getSize(size);

           Point optimizedSize = new Point();
           optimizedSize.x = size.x;
           optimizedSize.y = (int) ((float) videoWidth / (float) videoHeight * size.x);

           return optimizedSize;
       }

       private void showCameraErrorAndFinish() {
           DialogInterface.OnClickListener buttonListener = new DialogInterface.OnClickListener() {
               @Override
               public void onClick(DialogInterface dialog, int which) {
                   currentActivity.finish();
               }
           };
           new AlertDialog.Builder(currentActivity).setCancelable(false)
                   .setTitle("Camera error")
                   .setMessage("Cannot connect to the camera.")
                   .setNeutralButton("OK", buttonListener)
                   .show();
       }

       private void showStorageErrorAndFinish() {
           DialogInterface.OnClickListener buttonListener = new DialogInterface.OnClickListener() {
               @Override
               public void onClick(DialogInterface dialog, int which) {
                   currentActivity.finish();
               }
           };
           new AlertDialog.Builder(currentActivity).setCancelable(false)
                   .setTitle("Storage error")
                   .setMessage("Cannot read external storage.")
                   .setNeutralButton("OK", buttonListener)
                   .show();
       }

       private void updateStorageSpace() {
           mStorageSpace = getAvailableSpace();
           Log.v(TAG, "updateStorageSpace mStorageSpace=" + mStorageSpace);
       }

       private long getAvailableSpace() {
           String state = Environment.getExternalStorageState();
           Log.d(TAG, "External storage state=" + state);
           if (Environment.MEDIA_CHECKING.equals(state)) {
               return -1;
           }
           if (!Environment.MEDIA_MOUNTED.equals(state)) {
               return -1;
           }

           File directory = currentActivity.getExternalFilesDir("vine");
           directory.mkdirs();
           if (!directory.isDirectory() || !directory.canWrite()) {
               return -1;
           }

           try {
               StatFs stat = new StatFs(directory.getAbsolutePath());
               return stat.getAvailableBlocks() * (long) stat.getBlockSize();
           } catch (Exception e) {
               Log.i(TAG, "Fail to access external storage", e);
           }
           return -1;
       }

       private boolean useTexture() {
           return false;
    //        return Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR1;
       }

       private class SurfaceViewCallback implements SurfaceHolder.Callback {

           @Override
           public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
               Log.v(TAG, "surfaceChanged. width=" + width + ". height=" + height);
           }

           @Override
           public void surfaceCreated(SurfaceHolder holder) {
               Log.v(TAG, "surfaceCreated");
               surfaceViewReady = true;
               surfaceHolder = holder;
               startPreview();
           }

           @Override
           public void surfaceDestroyed(SurfaceHolder holder) {
               Log.d(TAG, "surfaceDestroyed");
               surfaceViewReady = false;
           }

       }

       @Override
       public void onError(int error, Camera camera) {
           Log.e(TAG, "Camera onError. what=" + error + ".");
           if (error == Camera.CAMERA_ERROR_SERVER_DIED) {

           } else if (error == Camera.CAMERA_ERROR_UNKNOWN) {

           }
       }

       @Override
       public void onInfo(MediaRecorder mr, int what, int extra) {
           if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_DURATION_REACHED) {
               stopRecording();
           } else if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_FILESIZE_REACHED) {
               stopRecording();
               Toast.makeText(currentActivity, "Size limit reached", Toast.LENGTH_LONG).show();
           }
       }

       @Override
       public void onError(MediaRecorder mr, int what, int extra) {
           Log.e(TAG, "MediaRecorder onError. what=" + what + ". extra=" + extra);
           if (what == MediaRecorder.MEDIA_RECORDER_ERROR_UNKNOWN) {
               stopRecording();
           }
       }

    }

    VideoUtils

    public class VideoUtils {
       private static final String TAG = VideoUtils.class.getSimpleName();

       static double[] matrix = new double[] { 0.0, 1.0, 0.0, -1.0, 0.0, 0.0, 0.0,
               0.0, 1.0 };

       public static boolean MergeFiles(String speratedDirPath,
               String targetFileName) {
           File videoSourceDirFile = new File(speratedDirPath);
           String[] videoList = videoSourceDirFile.list();
           List<Track> videoTracks = new LinkedList<Track>();
           List<Track> audioTracks = new LinkedList<Track>();
           for (String file : videoList) {
               Log.d(TAG, "source files" + speratedDirPath
                       + File.separator + file);
               try {
                   FileChannel fc = new FileInputStream(speratedDirPath
                           + File.separator + file).getChannel();
                   Movie movie = MovieCreator.build(fc);
                   for (Track t : movie.getTracks()) {
                       if (t.getHandler().equals("soun")) {
                           audioTracks.add(t);
                       }
                       if (t.getHandler().equals("vide")) {

                           videoTracks.add(t);
                       }
                   }
               } catch (FileNotFoundException e) {
                   e.printStackTrace();
                   return false;
               } catch (IOException e) {
                   e.printStackTrace();
                   return false;
               }
           }

           Movie result = new Movie();

           try {
               if (audioTracks.size() > 0) {
                   result.addTrack(new AppendTrack(audioTracks
                           .toArray(new Track[audioTracks.size()])));
               }
               if (videoTracks.size() > 0) {
                   result.addTrack(new AppendTrack(videoTracks
                           .toArray(new Track[videoTracks.size()])));
               }
               IsoFile out = new DefaultMp4Builder().build(result);



               FileChannel fc = new RandomAccessFile(
                       String.format(targetFileName), "rw").getChannel();

               Log.d(TAG, "target file:" + targetFileName);
               TrackBox tb = out.getMovieBox().getBoxes(TrackBox.class).get(1);

               TrackHeaderBox tkhd = tb.getTrackHeaderBox();
               double[] b = tb.getTrackHeaderBox().getMatrix();

               tkhd.setMatrix(matrix);

               fc.position(0);
               out.getBox(fc);
               fc.close();
               for (String file : videoList) {
                   File TBRFile = new File(speratedDirPath + File.separator + file);
                   TBRFile.delete();
               }
               boolean a = videoSourceDirFile.delete();
               Log.d(TAG, "try to delete dir:" + a);
           } catch (IOException e) {
               // TODO Auto-generated catch block
               e.printStackTrace();
               return false;
           }

           return true;
       }

       public static boolean clearFiles(String speratedDirPath) {
           File videoSourceDirFile = new File(speratedDirPath);
           if (videoSourceDirFile != null
                   && videoSourceDirFile.listFiles() != null) {
               File[] videoList = videoSourceDirFile.listFiles();
               for (File video : videoList) {
                   video.delete();
               }
               videoSourceDirFile.delete();
           }
           return true;
       }

       public static int createSnapshot(String videoFile, int kind, String snapshotFilepath) {
           return 0;
       };

       public static int createSnapshot(String videoFile, int width, int height, String snapshotFilepath) {
           return 0;
       }
    }
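
    Given the symptom described at the top of this question (video frozen at the end while the audio keeps playing for about a second), a hedged diagnostic for the MergeFiles method above, to be placed inside its per-file loop right after MovieCreator.build(fc): log each segment's audio and video durations before appending, to confirm whether the last segment's audio track is longer than its video track. Track.getDuration() and TrackMetaData.getTimescale() are assumed to exist in the mp4parser version in use; adjust if your release differs.

        // Sketch: compare per-segment track durations before appending.
        for (Track t : movie.getTracks()) {
            double seconds = (double) t.getDuration()
                    / t.getTrackMetaData().getTimescale();
            Log.d(TAG, file + ": " + t.getHandler() + " duration=" + seconds + "s");
        }

    If a segment's audio track does come out longer than its video track, one commonly reported workaround is to drop the trailing audio samples with mp4parser's CroppedTrack before handing the tracks to AppendTrack.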

    My reference code project link is:

    https://github.com/jwfing/AndroidVideoKit