Newest 'ffmpeg' Questions - Stack Overflow
Articles published on the site
-
Crash while transcoding AAC to MP3 using libav?
October 1, 2013, by iota
I am using libav to decode AAC audio and then transcode it to MP3 using libmp3lame. I know that when I decode the AAC stream I get AV_SAMPLE_FMT_FLTP output, while the MP3 encoder needs its input in AV_SAMPLE_FMT_S16P, so I am converting the sample format with swr_convert from libswresample. I also know that the number of samples in the decoder output (1024) differs from what libmp3lame requires (1152), so I am doing buffering as well.
But it is not crashing because of the buffering; execution never even reaches that part. The crash happens inside swr_convert, and the gdb stack trace shows it occurring somewhere in
ff_float_to_int16.next()
What could be the possible problem?
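Crashes inside swr_convert commonly come from an SwrContext whose in/out channel layouts, sample rates, or sample formats were never set (or swr_init() was never called), or from output buffers sized for the wrong number of samples. Independent of the C specifics, the 1024-in/1152-out rebuffering the question describes can be sketched language-neutrally in Python (a minimal illustration, not the libav API):

```python
def reframe(chunks, frame_size=1152):
    """Re-buffer decoder output chunks (e.g. 1024 samples each, as AAC
    decoding produces) into encoder-sized frames (1152 samples for MP3)."""
    fifo = []     # samples waiting for a full encoder frame
    frames = []
    for chunk in chunks:
        fifo.extend(chunk)
        while len(fifo) >= frame_size:
            frames.append(fifo[:frame_size])
            fifo = fifo[frame_size:]
    return frames, fifo  # full frames, plus leftover samples to flush later

# Nine 1024-sample AAC chunks yield exactly eight 1152-sample MP3 frames.
chunks = [[0] * 1024 for _ in range(9)]
frames, leftover = reframe(chunks)
```

In libav terms the same role is played by an audio FIFO between swr_convert and the encoder; the conversion itself must still happen on a properly initialized SwrContext.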
-
ffmpeg DirectShow device original encoding
October 1, 2013, by Max
My capture card is supposed to encode video as MPEG-2. Is there an ffmpeg command that shows whether the card provides an MPEG-2 stream or some other stream format? (I want to confirm that the card has an MPEG-2 encoder inside.)
Here are the options for the capture device:
DirectShow video device options
 Pin "Capture"
  pixel_format=yuyv422 min s=720x480 fps=59.9402 max s=720x480 fps=59.9402
  pixel_format=yuyv422 min s=720x480 fps=29.97 max s=720x480 fps=29.97
  pixel_format=yuyv422 min s=720x576 fps=50 max s=720x576 fps=50
  pixel_format=yuyv422 min s=720x576 fps=25 max s=720x576 fps=25
  pixel_format=yuyv422 min s=640x480 fps=59.9402 max s=640x480 fps=59.9402
  pixel_format=yuyv422 min s=1920x1080 fps=29.97 max s=1920x1080 fps=29.97
  pixel_format=yuyv422 min s=1920x1080 fps=25 max s=1920x1080 fps=25
  pixel_format=yuyv422 min s=1920x1080 fps=24 max s=1920x1080 fps=24
  pixel_format=yuyv422 min s=1280x720 fps=59.9402 max s=1280x720 fps=59.9402
  pixel_format=yuyv422 min s=1280x720 fps=50 max s=1280x720 fps=50
 Pin "Audio"
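The listing above is itself the answer ffmpeg can give: every format on the "Capture" pin is pixel_format=yuyv422, i.e. raw YUY2 video, so this pin is not delivering an MPEG-2 elementary stream (a hardware encoder would typically expose a separate compressed pin or codec entry). The two relevant queries can be scripted; a small sketch, with the device name as a placeholder:

```python
import subprocess

DEVICE = "Your Capture Card"  # placeholder; take the exact name from -list_devices

# Enumerate DirectShow devices, then the formats a given device exposes.
list_devices = ["ffmpeg", "-f", "dshow", "-list_devices", "true", "-i", "dummy"]
list_options = ["ffmpeg", "-f", "dshow", "-list_options", "true",
                "-i", "video=" + DEVICE]

def run(cmd):
    # ffmpeg prints device and option listings on stderr, not stdout.
    return subprocess.run(cmd, capture_output=True, text=True).stderr
```

Compressed output, if the card supports it, would show up in the -list_options report rather than needing a separate probe.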
-
Audio and video not synchronized properly when merging multiple videos with mp4parser
October 1, 2013, by maniya
I have used mp4parser to merge videos, with dynamic pause-and-record capture for a maximum 6-second recording. In preview it works fine when the video is recorded with few pauses, but if I try more than 3 pause/record cycles, the last video file does not get merged properly with the audio. At the start of the video the sync is OK, but at the end the video hangs while the audio keeps playing on screen for the remaining duration, about 1 second.
My RecordingManager:
public class RecordingManager implements Camera.ErrorCallback, MediaRecorder.OnErrorListener, MediaRecorder.OnInfoListener {

    private static final String TAG = RecordingManager.class.getSimpleName();
    private static final int FOCUS_AREA_RADIUS = 32;
    private static final int FOCUS_MAX_VALUE = 1000;
    private static final int FOCUS_MIN_VALUE = -1000;
    private static final long MINIMUM_RECORDING_TIME = 2000;
    private static final int MAXIMUM_RECORDING_TIME = 70 * 1000;
    private static final long LOW_STORAGE_THRESHOLD = 5 * 1024 * 1024;
    private static final long RECORDING_FILE_LIMIT = 100 * 1024 * 1024;

    private boolean paused = true;
    private MediaRecorder mediaRecorder = null;
    private boolean recording = false;
    private FrameLayout previewFrame = null;
    private boolean mPreviewing = false;
    // private TextureView mTextureView = null;
    // private SurfaceTexture mSurfaceTexture = null;
    // private boolean mSurfaceTextureReady = false;
    private SurfaceView surfaceView = null;
    private SurfaceHolder surfaceHolder = null;
    private boolean surfaceViewReady = false;
    private Camera camera = null;
    private Camera.Parameters cameraParameters = null;
    private CamcorderProfile camcorderProfile = null;
    private int mOrientation = -1;
    private OrientationEventListener mOrientationEventListener = null;
    private long mStartRecordingTime;
    private int mVideoWidth;
    private int mVideoHeight;
    private long mStorageSpace;
    private Handler mHandler = new Handler();

    // private Runnable mUpdateRecordingTimeTask = new Runnable() {
    //     @Override
    //     public void run() {
    //         long recordingTime = System.currentTimeMillis() - mStartRecordingTime;
    //         Log.d(TAG, String.format("Recording time:%d", recordingTime));
    //         mHandler.postDelayed(this, CLIP_GRAPH_UPDATE_INTERVAL);
    //     }
    // };

    private Runnable mStopRecordingTask = new Runnable() {
        @Override
        public void run() {
            stopRecording();
        }
    };

    private static RecordingManager mInstance = null;
    private Activity currentActivity = null;
    private String destinationFilepath = "";
    private String snapshotFilepath = "";

    public static RecordingManager getInstance(Activity activity, FrameLayout previewFrame) {
        if (mInstance == null || mInstance.currentActivity != activity) {
            mInstance = new RecordingManager(activity, previewFrame);
        }
        return mInstance;
    }

    private RecordingManager(Activity activity, FrameLayout previewFrame) {
        currentActivity = activity;
        this.previewFrame = previewFrame;
    }

    public int getVideoWidth() { return this.mVideoWidth; }

    public int getVideoHeight() { return this.mVideoHeight; }

    public void setDestinationFilepath(String filepath) { this.destinationFilepath = filepath; }

    public String getDestinationFilepath() { return this.destinationFilepath; }

    public void setSnapshotFilepath(String filepath) { this.snapshotFilepath = filepath; }

    public String getSnapshotFilepath() { return this.snapshotFilepath; }

    public void init(String videoPath, String snapshotPath) {
        Log.v(TAG, "init.");
        setDestinationFilepath(videoPath);
        setSnapshotFilepath(snapshotPath);
        if (!Utils.isExternalStorageAvailable()) {
            showStorageErrorAndFinish();
            return;
        }
        openCamera();
        if (camera == null) {
            showCameraErrorAndFinish();
            return;
        }
    }

    public void onResume() {
        Log.v(TAG, "onResume.");
        paused = false;
        // Open the camera
        if (camera == null) {
            openCamera();
            if (camera == null) {
                showCameraErrorAndFinish();
                return;
            }
        }
        // Initialize the surface texture or surface view
        // if (useTexture() && mTextureView == null) {
        //     initTextureView();
        //     mTextureView.setVisibility(View.VISIBLE);
        // } else if (!useTexture() && mSurfaceView == null) {
        initSurfaceView();
        surfaceView.setVisibility(View.VISIBLE);
        // }
        // Start the preview
        if (!mPreviewing) {
            startPreview();
        }
    }

    private void openCamera() {
        Log.v(TAG, "openCamera");
        try {
            camera = Camera.open();
            camera.setErrorCallback(this);
            camera.setDisplayOrientation(90); // Since we only support portrait mode
            cameraParameters = camera.getParameters();
        } catch (RuntimeException e) {
            e.printStackTrace();
            camera = null;
        }
    }

    private void closeCamera() {
        Log.v(TAG, "closeCamera");
        if (camera == null) {
            Log.d(TAG, "Already stopped.");
            return;
        }
        camera.setErrorCallback(null);
        if (mPreviewing) {
            stopPreview();
        }
        camera.release();
        camera = null;
    }

    private void initSurfaceView() {
        surfaceView = new SurfaceView(currentActivity);
        surfaceView.getHolder().addCallback(new SurfaceViewCallback());
        surfaceView.setVisibility(View.GONE);
        FrameLayout.LayoutParams params = new LayoutParams(
                LayoutParams.MATCH_PARENT, LayoutParams.MATCH_PARENT, Gravity.CENTER);
        surfaceView.setLayoutParams(params);
        Log.d(TAG, "add surface view to preview frame");
        previewFrame.addView(surfaceView);
    }

    private void releaseSurfaceView() {
        if (surfaceView != null) {
            previewFrame.removeAllViews();
            surfaceView = null;
            surfaceHolder = null;
            surfaceViewReady = false;
        }
    }

    private void startPreview() {
        // if ((useTexture() && !mSurfaceTextureReady) || (!useTexture() && !mSurfaceViewReady)) {
        //     return;
        // }
        Log.v(TAG, "startPreview.");
        if (mPreviewing) {
            stopPreview();
        }
        setCameraParameters();
        resizePreview();
        try {
            // if (useTexture()) {
            //     mCamera.setPreviewTexture(mSurfaceTexture);
            // } else {
            camera.setPreviewDisplay(surfaceHolder);
            // }
            camera.startPreview();
            mPreviewing = true;
        } catch (Exception e) {
            closeCamera();
            e.printStackTrace();
            Log.e(TAG, "startPreview failed.");
        }
    }

    private void stopPreview() {
        Log.v(TAG, "stopPreview");
        if (camera != null) {
            camera.stopPreview();
            mPreviewing = false;
        }
    }

    public void onPause() {
        paused = true;
        if (recording) {
            stopRecording();
        }
        closeCamera();
        // if (useTexture()) {
        //     releaseSurfaceTexture();
        // } else {
        releaseSurfaceView();
        // }
    }

    private void setCameraParameters() {
        if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_720P)) {
            camcorderProfile = CamcorderProfile.get(CamcorderProfile.QUALITY_720P);
        } else if (CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_480P)) {
            camcorderProfile = CamcorderProfile.get(CamcorderProfile.QUALITY_480P);
        } else {
            camcorderProfile = CamcorderProfile.get(CamcorderProfile.QUALITY_HIGH);
        }
        mVideoWidth = camcorderProfile.videoFrameWidth;
        mVideoHeight = camcorderProfile.videoFrameHeight;
        camcorderProfile.fileFormat = MediaRecorder.OutputFormat.MPEG_4;
        camcorderProfile.videoFrameRate = 30;
        Log.v(TAG, "mVideoWidth=" + mVideoWidth + " mVideoHeight=" + mVideoHeight);
        cameraParameters.setPreviewSize(mVideoWidth, mVideoHeight);
        if (cameraParameters.getSupportedWhiteBalance().contains(Camera.Parameters.WHITE_BALANCE_AUTO)) {
            cameraParameters.setWhiteBalance(Camera.Parameters.WHITE_BALANCE_AUTO);
        }
        if (cameraParameters.getSupportedFocusModes().contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
            cameraParameters.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
        }
        cameraParameters.setRecordingHint(true);
        cameraParameters.set("cam_mode", 1);
        camera.setParameters(cameraParameters);
        cameraParameters = camera.getParameters();
        camera.setDisplayOrientation(90);
        android.hardware.Camera.CameraInfo info = new android.hardware.Camera.CameraInfo();
        Log.d(TAG, info.orientation + " degree");
    }

    private void resizePreview() {
        Log.d(TAG, String.format("Video size:%d|%d", mVideoWidth, mVideoHeight));
        Point optimizedSize = getOptimizedPreviewSize(mVideoWidth, mVideoHeight);
        Log.d(TAG, String.format("Optimized size:%d|%d", optimizedSize.x, optimizedSize.y));
        ViewGroup.LayoutParams params = (ViewGroup.LayoutParams) previewFrame.getLayoutParams();
        params.width = optimizedSize.x;
        params.height = optimizedSize.y;
        previewFrame.setLayoutParams(params);
    }

    public void setOrientation(int ori) {
        this.mOrientation = ori;
    }

    public void setOrientationEventListener(OrientationEventListener listener) {
        this.mOrientationEventListener = listener;
    }

    public Camera getCamera() {
        return camera;
    }

    @SuppressWarnings("serial")
    public void setFocusArea(float x, float y) {
        if (camera != null) {
            int viewWidth = surfaceView.getWidth();
            int viewHeight = surfaceView.getHeight();
            int focusCenterX = FOCUS_MAX_VALUE - (int) (x / viewWidth * (FOCUS_MAX_VALUE - FOCUS_MIN_VALUE));
            int focusCenterY = FOCUS_MIN_VALUE + (int) (y / viewHeight * (FOCUS_MAX_VALUE - FOCUS_MIN_VALUE));
            final int left = focusCenterY - FOCUS_AREA_RADIUS < FOCUS_MIN_VALUE ? FOCUS_MIN_VALUE : focusCenterY - FOCUS_AREA_RADIUS;
            final int top = focusCenterX - FOCUS_AREA_RADIUS < FOCUS_MIN_VALUE ? FOCUS_MIN_VALUE : focusCenterX - FOCUS_AREA_RADIUS;
            final int right = focusCenterY + FOCUS_AREA_RADIUS > FOCUS_MAX_VALUE ? FOCUS_MAX_VALUE : focusCenterY + FOCUS_AREA_RADIUS;
            final int bottom = focusCenterX + FOCUS_AREA_RADIUS > FOCUS_MAX_VALUE ? FOCUS_MAX_VALUE : focusCenterX + FOCUS_AREA_RADIUS;
            Camera.Parameters params = camera.getParameters();
            params.setFocusAreas(new ArrayList() {
                {
                    add(new Camera.Area(new Rect(left, top, right, bottom), 1000));
                }
            });
            camera.setParameters(params);
            camera.autoFocus(new AutoFocusCallback() {
                @Override
                public void onAutoFocus(boolean success, Camera camera) {
                    Log.d(TAG, "onAutoFocus");
                }
            });
        }
    }

    public void startRecording(String destinationFilepath) {
        if (!recording) {
            updateStorageSpace();
            setDestinationFilepath(destinationFilepath);
            if (mStorageSpace <= LOW_STORAGE_THRESHOLD) {
                Log.v(TAG, "Storage issue, ignore the start request");
                Toast.makeText(currentActivity, "Storage issue, ignore the recording request", Toast.LENGTH_LONG).show();
                return;
            }
            if (!prepareMediaRecorder()) {
                Toast.makeText(currentActivity, "prepareMediaRecorder failed.", Toast.LENGTH_LONG).show();
                return;
            }
            Log.d(TAG, "Successfully prepare media recorder.");
            try {
                mediaRecorder.start();
            } catch (RuntimeException e) {
                Log.e(TAG, "MediaRecorder start failed.");
                releaseMediaRecorder();
                return;
            }
            mStartRecordingTime = System.currentTimeMillis();
            if (mOrientationEventListener != null) {
                mOrientationEventListener.disable();
            }
            recording = true;
        }
    }

    public void stopRecording() {
        if (recording) {
            if (!paused) {
                // Capture at least 1 second video
                long currentTime = System.currentTimeMillis();
                if (currentTime - mStartRecordingTime < MINIMUM_RECORDING_TIME) {
                    mHandler.postDelayed(mStopRecordingTask,
                            MINIMUM_RECORDING_TIME - (currentTime - mStartRecordingTime));
                    return;
                }
            }
            if (mOrientationEventListener != null) {
                mOrientationEventListener.enable();
            }
            // mHandler.removeCallbacks(mUpdateRecordingTimeTask);
            try {
                mediaRecorder.setOnErrorListener(null);
                mediaRecorder.setOnInfoListener(null);
                mediaRecorder.stop(); // stop the recording
                Toast.makeText(currentActivity, "Video file saved.", Toast.LENGTH_LONG).show();
                long stopRecordingTime = System.currentTimeMillis();
                Log.d(TAG, String.format("stopRecording. file:%s duration:%d",
                        destinationFilepath, stopRecordingTime - mStartRecordingTime));
                // Calculate the duration of video
                MediaMetadataRetriever mmr = new MediaMetadataRetriever();
                mmr.setDataSource(this.destinationFilepath);
                String _length = mmr.extractMetadata(MediaMetadataRetriever.METADATA_KEY_DURATION);
                if (_length != null) {
                    Log.d(TAG, String.format("clip duration:%d", Long.parseLong(_length)));
                }
                // Taking the snapshot of video
                Bitmap snapshot = ThumbnailUtils.createVideoThumbnail(this.destinationFilepath, Thumbnails.MICRO_KIND);
                try {
                    FileOutputStream out = new FileOutputStream(this.snapshotFilepath);
                    snapshot.compress(Bitmap.CompressFormat.JPEG, 70, out);
                    out.close();
                } catch (Exception e) {
                    e.printStackTrace();
                }
                // mActivity.showPlayButton();
            } catch (RuntimeException e) {
                e.printStackTrace();
                Log.e(TAG, e.getMessage());
                // if no valid audio/video data has been received when stop() is called
            } finally {
                // releaseMediaRecorder(); // release the MediaRecorder object
                if (!paused) {
                    cameraParameters = camera.getParameters();
                }
                recording = false;
            }
        }
    }

    public void setRecorderOrientation(int orientation) {
        // For back camera only
        if (orientation != -1) {
            Log.d(TAG, "set orientationHint:" + (orientation + 135) % 360 / 90 * 90);
            mediaRecorder.setOrientationHint((orientation + 135) % 360 / 90 * 90);
        } else {
            Log.d(TAG, "not set orientationHint to mediaRecorder");
        }
    }

    private boolean prepareMediaRecorder() {
        mediaRecorder = new MediaRecorder();
        camera.unlock();
        mediaRecorder.setCamera(camera);
        mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
        mediaRecorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        mediaRecorder.setProfile(camcorderProfile);
        mediaRecorder.setMaxDuration(MAXIMUM_RECORDING_TIME);
        mediaRecorder.setOutputFile(this.destinationFilepath);
        try {
            mediaRecorder.setMaxFileSize(Math.min(RECORDING_FILE_LIMIT, mStorageSpace - LOW_STORAGE_THRESHOLD));
        } catch (RuntimeException exception) {
        }
        setRecorderOrientation(mOrientation);
        if (!useTexture()) {
            mediaRecorder.setPreviewDisplay(surfaceHolder.getSurface());
        }
        try {
            mediaRecorder.prepare();
        } catch (IllegalStateException e) {
            releaseMediaRecorder();
            return false;
        } catch (IOException e) {
            releaseMediaRecorder();
            return false;
        }
        mediaRecorder.setOnErrorListener(this);
        mediaRecorder.setOnInfoListener(this);
        return true;
    }

    private void releaseMediaRecorder() {
        if (mediaRecorder != null) {
            mediaRecorder.reset();   // clear recorder configuration
            mediaRecorder.release(); // release the recorder object
            mediaRecorder = null;
            camera.lock();           // lock camera for later use
        }
    }

    private Point getOptimizedPreviewSize(int videoWidth, int videoHeight) {
        Display display = currentActivity.getWindowManager().getDefaultDisplay();
        Point size = new Point();
        display.getSize(size);
        Point optimizedSize = new Point();
        optimizedSize.x = size.x;
        optimizedSize.y = (int) ((float) videoWidth / (float) videoHeight * size.x);
        return optimizedSize;
    }

    private void showCameraErrorAndFinish() {
        DialogInterface.OnClickListener buttonListener = new DialogInterface.OnClickListener() {
            @Override
            public void onClick(DialogInterface dialog, int which) {
                currentActivity.finish();
            }
        };
        new AlertDialog.Builder(currentActivity).setCancelable(false)
                .setTitle("Camera error")
                .setMessage("Cannot connect to the camera.")
                .setNeutralButton("OK", buttonListener)
                .show();
    }

    private void showStorageErrorAndFinish() {
        DialogInterface.OnClickListener buttonListener = new DialogInterface.OnClickListener() {
            @Override
            public void onClick(DialogInterface dialog, int which) {
                currentActivity.finish();
            }
        };
        new AlertDialog.Builder(currentActivity).setCancelable(false)
                .setTitle("Storage error")
                .setMessage("Cannot read external storage.")
                .setNeutralButton("OK", buttonListener)
                .show();
    }

    private void updateStorageSpace() {
        mStorageSpace = getAvailableSpace();
        Log.v(TAG, "updateStorageSpace mStorageSpace=" + mStorageSpace);
    }

    private long getAvailableSpace() {
        String state = Environment.getExternalStorageState();
        Log.d(TAG, "External storage state=" + state);
        if (Environment.MEDIA_CHECKING.equals(state)) {
            return -1;
        }
        if (!Environment.MEDIA_MOUNTED.equals(state)) {
            return -1;
        }
        File directory = currentActivity.getExternalFilesDir("vine");
        directory.mkdirs();
        if (!directory.isDirectory() || !directory.canWrite()) {
            return -1;
        }
        try {
            StatFs stat = new StatFs(directory.getAbsolutePath());
            return stat.getAvailableBlocks() * (long) stat.getBlockSize();
        } catch (Exception e) {
            Log.i(TAG, "Fail to access external storage", e);
        }
        return -1;
    }

    private boolean useTexture() {
        return false;
        // return Build.VERSION.SDK_INT >= Build.VERSION_CODES.JELLY_BEAN_MR1;
    }

    private class SurfaceViewCallback implements SurfaceHolder.Callback {
        @Override
        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
            Log.v(TAG, "surfaceChanged. width=" + width + ". height=" + height);
        }

        @Override
        public void surfaceCreated(SurfaceHolder holder) {
            Log.v(TAG, "surfaceCreated");
            surfaceViewReady = true;
            surfaceHolder = holder;
            startPreview();
        }

        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
            Log.d(TAG, "surfaceDestroyed");
            surfaceViewReady = false;
        }
    }

    @Override
    public void onError(int error, Camera camera) {
        Log.e(TAG, "Camera onError. what=" + error + ".");
        if (error == Camera.CAMERA_ERROR_SERVER_DIED) {
        } else if (error == Camera.CAMERA_ERROR_UNKNOWN) {
        }
    }

    @Override
    public void onInfo(MediaRecorder mr, int what, int extra) {
        if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_DURATION_REACHED) {
            stopRecording();
        } else if (what == MediaRecorder.MEDIA_RECORDER_INFO_MAX_FILESIZE_REACHED) {
            stopRecording();
            Toast.makeText(currentActivity, "Size limit reached", Toast.LENGTH_LONG).show();
        }
    }

    @Override
    public void onError(MediaRecorder mr, int what, int extra) {
        Log.e(TAG, "MediaRecorder onError. what=" + what + ". extra=" + extra);
        if (what == MediaRecorder.MEDIA_RECORDER_ERROR_UNKNOWN) {
            stopRecording();
        }
    }
}
VideoUtils
public class VideoUtils {
    private static final String TAG = VideoUtils.class.getSimpleName();
    static double[] matrix = new double[] { 0.0, 1.0, 0.0, -1.0, 0.0, 0.0, 0.0, 0.0, 1.0 };

    public static boolean MergeFiles(String speratedDirPath, String targetFileName) {
        File videoSourceDirFile = new File(speratedDirPath);
        String[] videoList = videoSourceDirFile.list();
        List
My reference code project link is
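End-of-file drift like this usually means the last clip's audio and video tracks have slightly different durations, which mp4parser does not reconcile when appending tracks. A hedged diagnostic sketch: measure each clip's per-stream durations with ffprobe (assumed to be on the PATH; filenames are placeholders) and flag the clip whose audio outruns its video:

```python
import json
import subprocess

def stream_durations(path):
    """Return {codec_type: duration_seconds} for a media file via ffprobe."""
    cmd = ["ffprobe", "-v", "quiet", "-print_format", "json",
           "-show_streams", path]
    out = subprocess.run(cmd, capture_output=True, text=True).stdout
    return {s["codec_type"]: float(s.get("duration", 0.0))
            for s in json.loads(out)["streams"]}

def av_drift(durations):
    """Audio/video duration mismatch in seconds for one clip."""
    return abs(durations.get("audio", 0.0) - durations.get("video", 0.0))

# Example: a clip whose audio runs about 1 s past its video, the symptom described.
drift = av_drift({"video": 6.0, "audio": 7.02})
```

If one clip shows a large drift, trimming its audio track to the video duration before merging (or re-muxing that clip) is the usual fix.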
-
ffmpeg - Create a video slideshow with effects? [on hold]
October 1, 2013, by Lily Shukla
I want to make a video slideshow from images using ffmpeg. I tried this code:
Using a single image as an input
If you want to create a video out of just one image, this will do (output video duration is set to 30 seconds with -t 30):
ffmpeg -loop 1 -i img.png -c:v libx264 -t 30 -pix_fmt yuv420p out.mp4
But it only shows the picture without any effect. I want to add effects like the transitions in PowerPoint. How can I create a video slideshow with effects in ffmpeg?
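ffmpeg has no PowerPoint-style transition presets, but the fade filter can fade each image clip in and out, which already looks like a basic slideshow transition. A sketch that builds such a command for one image (image name, durations, and output name are placeholders):

```python
def slide_cmd(image, duration=5, fade=1, out="slide.mp4"):
    """Build an ffmpeg command turning one image into a clip that
    fades in at the start and fades out at the end."""
    vf = ("fade=t=in:st=0:d={f},"
          "fade=t=out:st={st}:d={f},"
          "format=yuv420p").format(f=fade, st=duration - fade)
    return ["ffmpeg", "-loop", "1", "-i", image,
            "-t", str(duration), "-vf", vf,
            "-c:v", "libx264", out]

cmd = slide_cmd("img.png")
```

The per-image clips can then be joined with ffmpeg's concat demuxer to form the full slideshow; cross-fades between images require more elaborate filter graphs.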
-
ffmpeg: how to add new audio (not mixing) to a video using HTML or PHP
October 1, 2013, by Lily Shukla
I want to create an HTML or PHP script to add audio (not mixing) to a video. I found this ffmpeg command for doing it:
ffmpeg -i video.avi -i audio.mp3 -map 0 -map 1 -codec copy -shortest output_video.avi
This example will stream copy (no re-encoding, so no quality loss; think of it as copy and paste) all streams from both inputs into the output. The -shortest option makes the output duration the same as that of the shortest input, in case one input is longer than the other. This option is not required but is generally useful in this type of command.
But I do not understand how to do that using HTML or PHP. I want to make an HTML or PHP script that adds the audio to the video from user input in a web browser.
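The browser side is only a file-upload form; the server side shells out to ffmpeg with the command shown above. A minimal server-side sketch in Python (the same pattern translates to PHP with escapeshellarg() plus exec()); the paths are placeholders supplied by your upload handler. Passing the arguments as a list, rather than one shell string, keeps user-supplied filenames from injecting shell commands:

```python
import subprocess

def build_mux_cmd(video_path, audio_path, out_path):
    """ffmpeg arguments mirroring the command above: stream-copy every
    stream from both inputs, stopping at the shortest one."""
    return ["ffmpeg", "-i", video_path, "-i", audio_path,
            "-map", "0", "-map", "1", "-codec", "copy",
            "-shortest", out_path]

def mux(video_path, audio_path, out_path):
    # List form (no shell=True) so untrusted filenames are passed as
    # plain arguments, never interpreted by a shell.
    return subprocess.run(build_mux_cmd(video_path, audio_path, out_path)).returncode

cmd = build_mux_cmd("video.avi", "audio.mp3", "output_video.avi")
```

In PHP the equivalent is an HTML form with enctype="multipart/form-data", then exec() on the same ffmpeg command with every user-derived path wrapped in escapeshellarg().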