Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • Not able to make a playable ASF file with ffmpeg [on hold]

    12 February 2016, by Farukh Zahoor

    I am using this command line to make a playable ASF file from an image:

    ffmpeg -loop 1 -i "G:\ffmpeg\1.jpg" -i "G:\ffmpeg\silence.mp3" -t 5 -vcodec libx264 -acodec libvo_aacenc -pix_fmt yuv420p -f asf -movflags faststart -b 1352k -r 30 -s 704x480 -vf "setdar=4:3" "G:\ffmpeg\output.asf"

    But I am able to make an MP4 file from an image using this command:

    ffmpeg -loop 1 -i "G:\ffmpeg\1.jpg" -i "G:\ffmpeg\silence.mp3" -t 5 -vcodec libx264 -acodec libvo_aacenc -pix_fmt yuv420p -f mp4 -movflags faststart -b 1352k -r 30 -s 704x480 -vf "setdar=4:3" "G:\ffmpeg\output.mp4"

    I get no errors from either command. Can anybody guide me on this?
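
    A hedged note, since the post includes no console output: -movflags faststart is an option of the MP4/MOV muxer and is silently ignored by the ASF muxer, and many ASF/WMV players expect WMV codecs rather than H.264/AAC. A sketch that keeps the question's inputs, duration, and bitrate, but swaps in codecs the ASF container is normally played with (the codec swap is my suggestion, not from the post):

    # Hedged sketch (Windows-style paths kept from the question): encode as
    # WMV2/WMA2, which ASF players generally expect; -movflags faststart is
    # dropped because it only applies to the MP4/MOV muxer.
    ffmpeg -loop 1 -i "G:\ffmpeg\1.jpg" -i "G:\ffmpeg\silence.mp3" -t 5 -c:v wmv2 -c:a wmav2 -b:v 1352k -r 30 -s 704x480 -vf "setdar=4:3" -f asf "G:\ffmpeg\output.asf"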

  • Live streaming video on server side not responding (Wowza, ffmpeg, JavaCV 1.1)

    12 February 2016, by Muthukumar S

    I am new to JavaCV. I can record MP4 and FLV videos using JavaCV 1.1 on the mobile side, and that works fine, but the stream does not show up on the server side. I am using Wowza Streaming Engine on the server. Thanks for any help.

    package com.example.javacv11;
    
    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.nio.ShortBuffer;
    import java.util.Collections;
    import java.util.Comparator;
    import java.util.List;
    
    import org.bytedeco.javacv.FFmpegFrameRecorder;
    import org.bytedeco.javacv.Frame;
    
    import android.app.Activity;
    import android.content.Context;
    import android.content.pm.ActivityInfo;
    import android.hardware.Camera;
    import android.hardware.Camera.PreviewCallback;
    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;
    import android.os.Bundle;
    import android.os.PowerManager;
    import android.util.Log;
    import android.view.Display;
    import android.view.KeyEvent;
    import android.view.LayoutInflater;
    import android.view.SurfaceHolder;
    import android.view.SurfaceView;
    import android.view.View;
    import android.view.View.OnClickListener;
    import android.view.WindowManager;
    import android.widget.Button;
    import android.widget.LinearLayout;
    import android.widget.RelativeLayout;
    
    public class FirstActivity extends Activity implements OnClickListener {
    
        private final static String CLASS_LABEL = "RecordActivity";
        private final static String LOG_TAG = CLASS_LABEL;
    
        private PowerManager.WakeLock mWakeLock;
    
        private String ffmpeg_link = "rtmp://10.0.0.136:1935/live/myStream";
    
        long startTime = 0;
        boolean recording = false;
    
        private FFmpegFrameRecorder recorder;
    
        private boolean isPreviewOn = false;
    
        private int sampleAudioRateInHz = 44100;
        private int imageWidth = 320;
        private int imageHeight = 240;
        private int frameRate = 30;
    
        /* audio data getting thread */
        private AudioRecord audioRecord;
        private AudioRecordRunnable audioRecordRunnable;
        private Thread audioThread;
        volatile boolean runAudioThread = true;
    
        /* video data getting thread */
        private Camera cameraDevice;
        private CameraView cameraView;
    
        private Frame yuvImage = null;
    
        /* layout setting */
        private final int bg_screen_bx = 232;
        private final int bg_screen_by = 128;
        private final int bg_screen_width = 700;
        private final int bg_screen_height = 500;
        private final int bg_width = 1123;
        private final int bg_height = 715;
        private final int live_width = 640;
        private final int live_height = 480;
        private int screenWidth, screenHeight;
        private Button btnRecorderControl;
    
        /* The number of seconds in the continuous record loop (or 0 to disable loop). */
        final int RECORD_LENGTH = 10;
        Frame[] images;
        long[] timestamps;
        ShortBuffer[] samples;
        int imagesIndex, samplesIndex;
    
        @Override
        public void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
    
            setContentView(R.layout.activity_first);
    
            PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE); 
            mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, CLASS_LABEL); 
            mWakeLock.acquire(); 
    
            initLayout();
        }
    
    
        @Override
        protected void onResume() {
            super.onResume();
    
            if (mWakeLock == null) {
               PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
               mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, CLASS_LABEL);
               mWakeLock.acquire();
            }
        }
    
        @Override
        protected void onPause() {
            super.onPause();
    
            if (mWakeLock != null) {
                mWakeLock.release();
                mWakeLock = null;
            }
        }
    
        @Override
        protected void onDestroy() {
            super.onDestroy();
    
            recording = false;
    
            if (cameraView != null) {
                cameraView.stopPreview();
            }
    
            if(cameraDevice != null) {
               cameraDevice.stopPreview();
               cameraDevice.release();
               cameraDevice = null;
            }
    
            if (mWakeLock != null) {
                mWakeLock.release();
                mWakeLock = null;
            }
        }
    
    
        private void initLayout() {
    
            /* get size of screen */
            Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
            screenWidth = display.getWidth();
            screenHeight = display.getHeight();
            RelativeLayout.LayoutParams layoutParam = null; 
            LayoutInflater myInflate = null; 
            myInflate = (LayoutInflater) getSystemService(Context.LAYOUT_INFLATER_SERVICE);
            RelativeLayout topLayout = new RelativeLayout(this);
            setContentView(topLayout);
            LinearLayout preViewLayout = (LinearLayout) myInflate.inflate(R.layout.activity_first, null);
            layoutParam = new RelativeLayout.LayoutParams(screenWidth, screenHeight);
            topLayout.addView(preViewLayout, layoutParam);
    
            /* add control button: start and stop */
            btnRecorderControl = (Button) findViewById(R.id.recorder_control);
            btnRecorderControl.setText("Start");
            btnRecorderControl.setOnClickListener(this);
    
            /* add camera view */
            int display_width_d = (int) (1.0 * bg_screen_width * screenWidth / bg_width);
            int display_height_d = (int) (1.0 * bg_screen_height * screenHeight / bg_height);
            int prev_rw, prev_rh;
            if (1.0 * display_width_d / display_height_d > 1.0 * live_width / live_height) {
                prev_rh = display_height_d;
                prev_rw = (int) (1.0 * display_height_d * live_width / live_height);
            } else {
                prev_rw = display_width_d;
                prev_rh = (int) (1.0 * display_width_d * live_height / live_width);
            }
            layoutParam = new RelativeLayout.LayoutParams(prev_rw, prev_rh);
            layoutParam.topMargin = (int) (1.0 * bg_screen_by * screenHeight / bg_height);
            layoutParam.leftMargin = (int) (1.0 * bg_screen_bx * screenWidth / bg_width);
    
            cameraDevice = Camera.open();
            Log.i(LOG_TAG, "cameara open");
            cameraView = new CameraView(this, cameraDevice);
            topLayout.addView(cameraView, layoutParam);
            Log.i(LOG_TAG, "cameara preview start: OK");
        }
    
        //---------------------------------------
        // initialize ffmpeg_recorder
        //---------------------------------------
        private void initRecorder() {
    
            Log.w(LOG_TAG,"init recorder");
    
            if (RECORD_LENGTH > 0) {
                imagesIndex = 0;
                images = new Frame[RECORD_LENGTH * frameRate];
                timestamps = new long[images.length];
                for (int i = 0; i < images.length; i++) {
                    images[i] = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
                    timestamps[i] = -1;
                }
            } else if (yuvImage == null) {
                yuvImage = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
                Log.i(LOG_TAG, "create yuvImage");
            }
    
            Log.i(LOG_TAG, "ffmpeg_url: " + ffmpeg_link);
            recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
            recorder.setFormat("mp4");
            recorder.setSampleRate(sampleAudioRateInHz);
            // Set in the surface changed method
            recorder.setFrameRate(frameRate);
    
            Log.i(LOG_TAG, "recorder initialize success");
    
            audioRecordRunnable = new AudioRecordRunnable();
            audioThread = new Thread(audioRecordRunnable);
            runAudioThread = true;
        }
    
        public void startRecording() {
    
            initRecorder();
    
            try {
                recorder.start();
                startTime = System.currentTimeMillis();
                recording = true;
                audioThread.start();
    
            } catch (FFmpegFrameRecorder.Exception e) {
                e.printStackTrace();
            }
        }
    
        public void stopRecording() {
    
            runAudioThread = false;
            try {
                audioThread.join();
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            audioRecordRunnable = null;
            audioThread = null;
    
            if (recorder != null && recording) {
                if (RECORD_LENGTH > 0) {
                    Log.v(LOG_TAG,"Writing frames");
                    try {
                    int firstIndex = imagesIndex % images.length; // oldest frame in the images ring buffer
                        int lastIndex = (imagesIndex - 1) % images.length;
                        if (imagesIndex <= images.length) {
                            firstIndex = 0;
                            lastIndex = imagesIndex - 1;
                        }
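                    // timestamps[] holds microseconds; rewind startTime so only
                    // the final RECORD_LENGTH seconds of buffered frames are written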
                        if ((startTime = timestamps[lastIndex] - RECORD_LENGTH * 1000000L) < 0) {
                            startTime = 0;
                        }
                        if (lastIndex < firstIndex) {
                            lastIndex += images.length;
                        }
                        for (int i = firstIndex; i <= lastIndex; i++) {
                            long t = timestamps[i % timestamps.length] - startTime;
                            if (t >= 0) {
                                if (t > recorder.getTimestamp()) {
                                    recorder.setTimestamp(t);
                                }
                                recorder.record(images[i % images.length]);
                            }
                        }
    
                        firstIndex = samplesIndex % samples.length;
                        lastIndex = (samplesIndex - 1) % samples.length;
                        if (samplesIndex <= samples.length) {
                            firstIndex = 0;
                            lastIndex = samplesIndex - 1;
                        }
                        if (lastIndex < firstIndex) {
                            lastIndex += samples.length;
                        }
                        for (int i = firstIndex; i <= lastIndex; i++) {
                            recorder.recordSamples(samples[i % samples.length]);
                        }
                    } catch (FFmpegFrameRecorder.Exception e) {
                        Log.v(LOG_TAG,e.getMessage());
                        e.printStackTrace();
                    }
                }
    
                recording = false;
                Log.v(LOG_TAG,"Finishing recording, calling stop and release on recorder");
                try {
                    recorder.stop();
                    recorder.release();
                } catch (FFmpegFrameRecorder.Exception e) {
                    e.printStackTrace();
                }
                recorder = null;
    
            }
        }
    
        @Override
        public boolean onKeyDown(int keyCode, KeyEvent event) {
    
            if (keyCode == KeyEvent.KEYCODE_BACK) {
                if (recording) {
                    stopRecording();
                }
    
                finish();
    
                return true;
            }
    
            return super.onKeyDown(keyCode, event);
        }
    
    
        //---------------------------------------------
        // audio thread, gets and encodes audio data
        //---------------------------------------------
        class AudioRecordRunnable implements Runnable {
    
            @Override
            public void run() {
                android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
    
                // Audio
                int bufferSize;
                ShortBuffer audioData;
                int bufferReadResult;
    
                bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz, 
                        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
                audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz, 
                        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
    
                if (RECORD_LENGTH > 0) {
                    samplesIndex = 0;
                    samples = new ShortBuffer[RECORD_LENGTH * sampleAudioRateInHz * 2 / bufferSize + 1];
                    for (int i = 0; i < samples.length; i++) {
                        samples[i] = ShortBuffer.allocate(bufferSize);
                    }
                } else {
                    audioData = ShortBuffer.allocate(bufferSize);
                }
    
                Log.d(LOG_TAG, "audioRecord.startRecording()");
                audioRecord.startRecording();
    
                /* ffmpeg_audio encoding loop */
                while (runAudioThread) {
                    if (RECORD_LENGTH > 0) {
                        audioData = samples[samplesIndex++ % samples.length];
                        audioData.position(0).limit(0);
                    }
                    //Log.v(LOG_TAG,"recording? " + recording);
                    bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
                    audioData.limit(bufferReadResult);
                    if (bufferReadResult > 0) {
                        Log.v(LOG_TAG,"bufferReadResult: " + bufferReadResult);
                        // If "recording" isn't true when start this thread, it never get's set according to this if statement...!!!
                        // Why?  Good question...
                        if (recording) {
                            if (RECORD_LENGTH <= 0) try {
                                recorder.recordSamples(audioData);
                                //Log.v(LOG_TAG,"recording " + 1024*i + " to " + 1024*i+1024);
                            } catch (FFmpegFrameRecorder.Exception e) {
                                Log.v(LOG_TAG,e.getMessage());
                                e.printStackTrace();
                            }
                        }
                    }
                }
                Log.v(LOG_TAG,"AudioThread Finished, release audioRecord");
    
                /* encoding finish, release recorder */
                if (audioRecord != null) {
                    audioRecord.stop();
                    audioRecord.release();
                    audioRecord = null;
                    Log.v(LOG_TAG,"audioRecord released");
                }
            }
        }
    
        //---------------------------------------------
        // camera thread, gets and encodes video data
        //---------------------------------------------
        class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {
    
            private SurfaceHolder mHolder;
            private Camera mCamera;
    
            public CameraView(Context context, Camera camera) {
                super(context);
                Log.w("camera","camera view");
                mCamera = camera;
                mHolder = getHolder();
                mHolder.addCallback(CameraView.this);
                mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
                mCamera.setPreviewCallback(CameraView.this);
            }
    
            @Override
            public void surfaceCreated(SurfaceHolder holder) {
                try {
                    stopPreview();
                    mCamera.setPreviewDisplay(holder);
                } catch (IOException exception) {
                    mCamera.release();
                    mCamera = null;
                }
            }
    
            public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
                Camera.Parameters camParams = mCamera.getParameters();
            List<Camera.Size> sizes = camParams.getSupportedPreviewSizes();
            // Sort the supported preview sizes in ascending order of area
            Collections.sort(sizes, new Comparator<Camera.Size>() {
    
                    public int compare(final Camera.Size a, final Camera.Size b) {
                        return a.width * a.height - b.width * b.height;
                    }
                });
    
                // Pick the first preview size that is equal or bigger, or pick the last (biggest) option if we cannot
                // reach the initial settings of imageWidth/imageHeight.
                for (int i = 0; i < sizes.size(); i++) {
                    if ((sizes.get(i).width >= imageWidth && sizes.get(i).height >= imageHeight) || i == sizes.size() - 1) {
                        imageWidth = sizes.get(i).width;
                        imageHeight = sizes.get(i).height;
                        Log.v(LOG_TAG, "Changed to supported resolution: " + imageWidth + "x" + imageHeight);
                        break;
                    }
                }
                camParams.setPreviewSize(imageWidth, imageHeight);
    
                Log.v(LOG_TAG,"Setting imageWidth: " + imageWidth + " imageHeight: " + imageHeight + " frameRate: " + frameRate);
    
                camParams.setPreviewFrameRate(frameRate);
                Log.v(LOG_TAG,"Preview Framerate: " + camParams.getPreviewFrameRate());
    
                mCamera.setParameters(camParams);
                startPreview();
            }
    
            @Override
            public void surfaceDestroyed(SurfaceHolder holder) {
                try {
                    mHolder.addCallback(null);
                    mCamera.setPreviewCallback(null);
                } catch (RuntimeException e) {
                    // The camera has probably just been released, ignore.
                }
            }
    
            public void startPreview() {
                if (!isPreviewOn && mCamera != null) {
                    isPreviewOn = true;
                    mCamera.startPreview();
                }
            }
    
            public void stopPreview() {
                if (isPreviewOn && mCamera != null) {
                    isPreviewOn = false;
                    mCamera.stopPreview();
                }
            }
    
            @Override
            public void onPreviewFrame(byte[] data, Camera camera) {
                if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
                    startTime = System.currentTimeMillis();
                    return;
                }
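            // ring-buffer mode: reuse the oldest preallocated frame and stamp it
            // with a microsecond timestamp relative to startTime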
                if (RECORD_LENGTH > 0) {
                    int i = imagesIndex++ % images.length;
                    yuvImage = images[i];
                    timestamps[i] = 1000 * (System.currentTimeMillis() - startTime);
                }
                /* get video data */
                if (yuvImage != null && recording) {
                    ((ByteBuffer)yuvImage.image[0].position(0)).put(data);
    
                    if (RECORD_LENGTH <= 0) try {
                        Log.v(LOG_TAG,"Writing Frame");
                        long t = 1000 * (System.currentTimeMillis() - startTime);
                        if (t > recorder.getTimestamp()) {
                            recorder.setTimestamp(t);
                        }
                        recorder.record(yuvImage);
                    } catch (FFmpegFrameRecorder.Exception e) {
                        Log.v(LOG_TAG,e.getMessage());
                        e.printStackTrace();
                    }
                }
            }
        }
    
        @Override
        public void onClick(View v) {
            if (!recording) {
                startRecording();
                Log.w(LOG_TAG, "Start Button Pushed");
                btnRecorderControl.setText("Stop");
            } else {
                // This will trigger the audio recording loop to stop and then set isRecorderStart = false;
                stopRecording();
                Log.w(LOG_TAG, "Stop Button Pushed");
                btnRecorderControl.setText("Start");
            }
        }
    
    }
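
    A hedged observation, not from the original post: the recorder targets an rtmp:// URL but sets the container format to "mp4"; RTMP transports FLV, so Wowza may never receive a stream it can play. A minimal sketch of an alternative recorder setup (the avcodec class comes from org.bytedeco.javacpp in JavaCV 1.1):

    import org.bytedeco.javacpp.avcodec;

    // Hedged sketch: FLV is the container RTMP carries, and H.264 + AAC
    // inside FLV is the usual combination for Wowza ingestion.
    recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
    recorder.setFormat("flv");
    recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
    recorder.setAudioCodec(avcodec.AV_CODEC_ID_AAC);
    recorder.setSampleRate(sampleAudioRateInHz);
    recorder.setFrameRate(frameRate);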
    
  • FFMPEG - RTMP to HLS no audio output

    12 February 2016, by DerHighland

    I am currently developing a dynamic HLS segmenter for our livecam application. To that end, I capture the external RTMP stream and convert it into segments with ffmpeg.

    The following command works:

    ffmpeg -i rtmp://"$serverip"/"$application"/mp4:"$stream_name".f4v -c:v libx264 -profile:v baseline -level 5.1 \
    -c:a aac -strict experimental -flags +global_header -f mpegts - | ffmpeg -i - -c copy -map 0 -f segment \
    -segment_list /tmp/hls/"$id"/"$stream_name".m3u8 -segment_format libmp3lame -segment_time 10 \
    -segment_wrap 4 /tmp/hls/"$id"/"$stream_name"%03d.ts
    

    But with this command I get a huge latency between the live stream and the HLS output (around 1-2 minutes!).

    So I tried another command, which results in a latency of only 20-30 seconds! My only problem is that the audio stream is not recognized and not written to the HLS files (meaning I get the video, but no audio at all):

    ffmpeg -probesize 50k -i rtmp://"$serverip"/"$application"/mp4:"$stream_name".f4v \
    -c:v libx264 -b:v 128k -g 90 -c:a aac -strict experimental -flags -global_header -map 0 \
    -f segment -segment_time 3 -segment_list /tmp/hls/"$id"/"$stream_name".m3u8 -segment_list_flags +live \
    -segment_list_type m3u8 -segment_list_size 5 -segment_format mpegts /tmp/hls/"$id"/"$stream_name"%d.ts
    

    I thought the -c:a aac flag would take care of muxing the audio as well.

    Do you have any suggestions about what goes wrong in the second command? I definitely have to segment the audio stream as well!

    Thanks in advance

    Update:

    Some outputs of the FFMPEG command:

    I ran command (2) once and got audio output, but it does not seem to work every time.

    Output from command 2 when audio is working: http://pastebin.com/bhxfNQBg

    Output from the same command 2 when audio is not working (nothing changed): http://pastebin.com/engEuSdn

    What seems strange to me is this line:

    [flv @ 0x1eed900] New audio stream 0:1 at pos:716680 and DTS:0s
    

    This line only appears when the audio on the HLS side is NOT working.

    Any help will be appreciated.

    Update 2:
    It seems there is a problem when I start the ffmpeg command after the stream is already published.
    If I follow these steps, everything works fine:
    1. Start the stream (the nc connection to AMS is established)
    2. Start the ffmpeg command (it idles until the stream publishes)
    3. Start publishing

    But if I do it like this (which is what we will need), no audio is present (a hedged workaround sketch follows this list):
    1. Start the stream
    2. A user joins and starts publishing
    3. Trigger the ffmpeg command
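
    A hedged workaround sketch, not from the original post: the log line above says the audio stream only appears ~700 kB into the FLV stream, and -probesize 50k makes ffmpeg lock in the stream layout long before that, so a late-starting audio track may never be registered. Raising the probe limits and mapping audio explicitly might help:

    # Hedged sketch: larger probesize/analyzeduration let ffmpeg wait for the
    # late audio stream before fixing the stream layout; -map 0:v -map 0:a
    # then forces both streams into the segmenter.
    ffmpeg -probesize 5M -analyzeduration 10M -i rtmp://"$serverip"/"$application"/mp4:"$stream_name".f4v \
    -map 0:v -map 0:a -c:v libx264 -b:v 128k -g 90 -c:a aac -strict experimental -flags -global_header \
    -f segment -segment_time 3 -segment_list /tmp/hls/"$id"/"$stream_name".m3u8 -segment_list_flags +live \
    -segment_list_type m3u8 -segment_list_size 5 -segment_format mpegts /tmp/hls/"$id"/"$stream_name"%d.ts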

  • mp3 audio + background.jpg = result.mp4 in ffmpeg

    12 February 2016, by editinit

    I am having a problem converting a video into MP4 to upload to YouTube.

    I have converted to AVI format with the command below:

    ffmpeg -y -i background.jpg -i deepmix.mp3 -c:a copy result.avi
    

    But when I upload this to YouTube it stays in the "processing" state (0%). As per https://support.google.com/youtube/answer/71674?hl=en I waited 8 hours, but no luck.

    I also tried converting the AVI to MP4 with the command below:

    ffmpeg -i input.avi -c:v libx264 -crf 19 -preset slow -c:a aac -strict experimental -b:a 192k -ac 2 out.mp4
    

    No luck. I am not able to play out.mp4 in VLC (the audio plays, but there is no video), and when I upload this video to YouTube it sits in the "processing" state (95%).

    I also tried:

    ffmpeg  -i result.avi -vcodec libx264 -crf 25 out.mp4
    

    Still stuck in the "processing" state (95%) when I upload to YouTube.

    How do I directly convert one audio file (MP3) and one image (JPG) into a 720p HD MP4 video?
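
    A hedged sketch of a one-step conversion (filenames taken from the question; ffmpeg builds of this era may also need -strict experimental for the aac encoder). The earlier AVI likely stalled on YouTube because encoding a single image without -loop yields a one-frame video:

    # Hedged sketch: loop the image for the whole MP3, scale to 1280x720,
    # and stop when the audio ends (-shortest); yuv420p keeps the H.264
    # output playable in VLC, YouTube, and most other players.
    ffmpeg -loop 1 -i background.jpg -i deepmix.mp3 -c:v libx264 -tune stillimage -vf "scale=1280:720" -pix_fmt yuv420p -c:a aac -b:a 192k -shortest result.mp4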

  • Make a video based on info in another video file with ffmpeg

    12 February 2016, by Farukh Zahoor

    What I am trying to do is merge 2 or more video files using ffmpeg. For the merge to succeed, the video files must have the same resolution, codecs, channels, etc.

    I am making a 5- or 10-second slide video from an image, and the slide video needs to be built with the first video's resolution, codec, and channel info so that the two files can be merged.

    Now the question is: is there a way with ffmpeg to make a slide video on the fly based on the first video's information?
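
    A hedged sketch of one approach (the filenames first.mp4 and slide.jpg are hypothetical): read the first video's parameters with ffprobe, then feed them into the slide-rendering command so the two files match for concatenation:

    # Hedged sketch: ffprobe emits shell-assignable key=value pairs for the
    # first video stream; eval pulls them into $width, $height and
    # $r_frame_rate, which then drive the slide encode.
    eval "$(ffprobe -v error -select_streams v:0 -show_entries stream=width,height,r_frame_rate -of default=noprint_wrappers=1 first.mp4)"
    ffmpeg -loop 1 -i slide.jpg -t 5 -vf "scale=${width}:${height},setsar=1" -r "${r_frame_rate}" -c:v libx264 -pix_fmt yuv420p slide.mp4

    Audio codec and channel count can be probed the same way (e.g. -select_streams a:0 -show_entries stream=codec_name,channels) before choosing the slide's silent audio track.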