Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • FFMPEG with Rails 4.2 in production

    12 February 2016, by Vincent

    I'm using FFmpeg to convert an uploaded wav file into an mp3.

    My code works in development mode but not in production, and I can't find where the error is. As far as I can tell, it's a path problem with the tmp file. FFmpeg itself works in production (DigitalOcean) when I run a command in the terminal.

    My mp3 job:

        class Mp3UploadJob < Struct.new(:audiofile_id)

          def perform
            # NOTE: url can come back as a relative path in production (see the sketch below)
            path = audiofile.wav_file.url.to_s
            mp3_file = File.join(Rails.root, "tmp", "uploads", "#{audiofile.filename}.mp3")
            FileUtils.mkdir_p(File.dirname(mp3_file)) # tmp/uploads may not exist on the server
            %x[ffmpeg -i "#{path}" -b:a 192k -ar 44100 -metadata title="#{audiofile.title}" -metadata artist="#{audiofile.mp3_artist_name}" -metadata album="#{audiofile.mp3_album}" -metadata date="#{audiofile.release_date}" -metadata genre="#{audiofile.genre}" "#{mp3_file}"]
            audiofile.mp3_file = open(mp3_file)
            audiofile.save!
            File.delete(mp3_file)
          end

          private

          # Assumed lookup: Delayed::Job stores only the id; the model name is a guess.
          def audiofile
            @audiofile ||= Audiofile.find(audiofile_id)
          end
        end
    

    Note: the wav file comes from a Dropbox chooser box.

    My wav job (run before the mp3 job):

        class WavUploadJob < Struct.new(:audiofile_id)

          def perform
            audiofile.remote_wav_file_url = audiofile.dropbox_wav
            audiofile.save!
          end

          # Same assumed lookup helper as in Mp3UploadJob.
          def audiofile
            @audiofile ||= Audiofile.find(audiofile_id)
          end
        end
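
    If the tmp-path suspicion is right, the usual difference is in what wav_file.url returns: in development it can resolve to a file ffmpeg can read, while in production it is often a relative URL such as /uploads/foo.wav, which the spawned ffmpeg process cannot open. A small defensive sketch (it assumes a CarrierWave-style uploader exposing path and url):

        # Resolve an input that ffmpeg can actually open: prefer the local
        # filesystem path, fall back to an absolute URL for remote storage.
        def wav_input_for(audiofile)
          file = audiofile.wav_file
          if file.respond_to?(:path) && file.path && File.exist?(file.path)
            file.path     # e.g. /home/rails/app/public/uploads/foo.wav
          else
            file.url.to_s # must be an absolute http(s) URL, not /uploads/foo.wav
          end
        end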
    
  • Java, serve HLS live video streams

    12 February 2016, by momo

    I know the topic is not an easy one, but I am looking for a Java class to send an HLS stream from the server to the client.

    I have segment files being generated, more and more of them over time:

    out.m3u8
    out0.ts
    out1.ts
    out2.ts
    out3.ts
    out4.ts
    out5.ts
    out6.ts
    

    This is generated using ffmpeg from an original source:

    ffmpeg -i http://sourceurl.com:9981/stream/channel/1232131 out.m3u8
    

    I can play it using VLC.

    Somehow, I need to stream this live to the clients.

    At this point, I do not really care about different bit rates; I just want live streaming to work, in mobile browsers and in desktop browsers.
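
    As an aside on the generation step: with the plain command above, old segments stay on disk forever even after they drop out of the playlist. A variant that keeps a sliding live window (the option values are illustrative, and -c copy assumes the source is already H.264/AAC):

        ffmpeg -i http://sourceurl.com:9981/stream/channel/1232131 -c copy -f hls -hls_time 4 -hls_list_size 6 -hls_flags delete_segments out.m3u8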

    I found this class, which looks like it might serve such a stream:

    https://github.com/Red5/red5-hls-plugin/blob/master/plugin/src/main/java/org/red5/stream/http/servlet/PlayList.java

    I have pulled hls.js into my application in hopes of using it for desktop browsers.

    HLS should, however, already work on iOS devices without hls.js.

    How should one serve HLS content from the server? It's very difficult to find a good, simple example of doing that.

    Does anyone know the steps needed?

    I've looked into Wowza and Red5 a little, but I'm unsure what they would give me at this stage, and they seem overly complicated to set up just to serve some files. Please explain why that's not the case if I'm wrong.
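
    For reference, serving already-generated HLS files is mostly static file serving with the right MIME types; the playlist must not be cached, since it changes as segments are added. Below is a minimal servlet sketch; the directory, URL mapping, and class name are assumptions, not something from the question:

        import java.io.File;
        import java.io.IOException;
        import java.nio.file.Files;

        import javax.servlet.http.HttpServlet;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;

        // Sketch: mapped to /hls/*, streams .m3u8 playlists and .ts segments
        // from SEGMENT_DIR, the directory ffmpeg writes into.
        public class HlsServlet extends HttpServlet {

            private static final String SEGMENT_DIR = "/var/hls"; // assumption

            @Override
            protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
                String pathInfo = req.getPathInfo();
                if (pathInfo == null) {
                    resp.sendError(HttpServletResponse.SC_NOT_FOUND);
                    return;
                }
                // Keep only the file name so "../" cannot escape SEGMENT_DIR.
                File file = new File(SEGMENT_DIR, new File(pathInfo).getName());
                if (!file.isFile()) {
                    resp.sendError(HttpServletResponse.SC_NOT_FOUND);
                    return;
                }
                if (file.getName().endsWith(".m3u8")) {
                    resp.setContentType("application/vnd.apple.mpegurl");
                    resp.setHeader("Cache-Control", "no-cache"); // playlist keeps changing while live
                } else if (file.getName().endsWith(".ts")) {
                    resp.setContentType("video/mp2t");
                }
                resp.setContentLengthLong(file.length()); // Servlet 3.1+
                Files.copy(file.toPath(), resp.getOutputStream());
            }
        }

    With something like this mapped to /hls/*, iOS Safari can open /hls/out.m3u8 natively, and hls.js can be pointed at the same URL on desktop browsers.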

  • FFMPEG add text frames to the start of video

    12 February 2016, by Ryan

    I have some videos either in mp4 or webm format, and I'd like to use ffmpeg to add 4 seconds to the start of each video to display some text in the center with no sound.

    Some other requirements:

    • try to avoid re-encoding the video
    • need to maintain the quality (resolution, bitrate, etc.)
    • (optional) to make the text fade in/out

    I am new to ffmpeg and any help will be appreciated.

    Thanks in advance.

    Example ffprobe information for mp4 below:

    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'input.mp4':
      Metadata:
        major_brand     : isom
        minor_version   : 512
        compatible_brands: isomiso2avc1mp41
        encoder         : Lavf55.33.100
      Duration: 00:00:03.84, start: 0.042667, bitrate: 1117 kb/s
        Stream #0:0(eng): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 1280x720, 1021 kb/s, 30 fps, 30 tbr, 15360 tbn, 60 tbc (default)
        Metadata:
          handler_name    : VideoHandler
        Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 140 kb/s (default)
        Metadata:
          handler_name    : SoundHandler
    

    Example ffprobe information for the webm:

    Input #0, matroska,webm, from 'input.webm':
      Metadata:
        encoder         : Lavf55.33.100
      Duration: 00:00:03.80, start: 0.000000, bitrate: 1060 kb/s
        Stream #0:0(eng): Video: vp8, yuv420p, 1280x720, SAR 1:1 DAR 16:9, 30 fps, 30 tbr, 1k tbn, 1k tbc (default)
        Stream #0:1(eng): Audio: vorbis, 48000 Hz, stereo, fltp (default)
    

    [Screenshot of joined.mp4]

    [Screenshot of the step 3 console]
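
    Given the probe output above, one common approach (a sketch, not a guaranteed recipe) leaves the original untouched: encode only a 4-second title clip whose parameters mirror the source, then join the two with the concat demuxer in stream-copy mode. The font path and title text are placeholders; the concat demuxer needs both parts to share codec, resolution, frame rate, and audio parameters, which is why the intro copies the probe output.

        # 1) 4 s black clip with centred text fading in and out, plus a silent
        #    AAC track, matching the mp4 source (1280x720, 30 fps, 48 kHz stereo).
        ffmpeg -f lavfi -i color=c=black:s=1280x720:r=30:d=4 \
               -f lavfi -i anullsrc=channel_layout=stereo:sample_rate=48000 \
               -vf "drawtext=fontfile=/path/to/font.ttf:text='Some title':fontcolor=white:fontsize=48:x=(w-text_w)/2:y=(h-text_h)/2,fade=t=in:st=0:d=1,fade=t=out:st=3:d=1" \
               -c:v libx264 -pix_fmt yuv420p -c:a aac -shortest intro.mp4

        # 2) Join without re-encoding the original video or audio.
        printf "file 'intro.mp4'\nfile 'input.mp4'\n" > list.txt
        ffmpeg -f concat -safe 0 -i list.txt -c copy joined.mp4

    The webm input works the same way with libvpx/libvorbis in step 1. If the splice point glitches, the parameters do not match exactly; re-encoding the 4-second intro is cheap to retry, while requirement 1 rules out re-encoding the main video.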

  • Not able to make a playable ASF file with ffmpeg [on hold]

    12 February 2016, by Farukh Zahoor

    I am using this command line to make a playable ASF file from an image:

    ffmpeg -loop 1 -i "G:\ffmpeg\1.jpg" -i "G:\ffmpeg\silence.mp3" -t 5 -vcodec libx264 -acodec libvo_aacenc -pix_fmt yuv420p -f asf -movflags faststart -b 1352k -r 30 -s 704x480 -vf "setdar=4:3" "G:\ffmpeg\output.asf"

    But I am able to make an mp4 file from an image using this command:

    ffmpeg -loop 1 -i "G:\ffmpeg\1.jpg" -i "G:\ffmpeg\silence.mp3" -t 5 -vcodec libx264 -acodec libvo_aacenc -pix_fmt yuv420p -f mp4 -movflags faststart -b 1352k -r 30 -s 704x480 -vf "setdar=4:3" "G:\ffmpeg\output.mp4"

    Neither command reports an error. Can anybody guide me on this?
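
    Two things stand out in the failing command, for what it's worth: -movflags faststart only applies to the MP4/MOV muxers (elsewhere it is ignored at best), and many ASF players expect WMV/WMA codecs rather than H.264/AAC. A variant worth trying (the codec choice is an assumption about the target player):

        ffmpeg -loop 1 -i "G:\ffmpeg\1.jpg" -i "G:\ffmpeg\silence.mp3" -t 5 -c:v wmv2 -c:a wmav2 -pix_fmt yuv420p -b:v 1352k -r 30 -s 704x480 -vf "setdar=4:3" -f asf "G:\ffmpeg\output.asf"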

  • Live streaming video on the server side not responding (Wowza / ffmpeg / JavaCV 1.1)

    12 February 2016, by Muthukumar S

    I am new to JavaCV. I tried to record mp4 and flv videos using JavaCV 1.1; on the mobile side this works nicely, but the stream never shows up on the server side. I am using Wowza Streaming Engine on the server. Thanks for any help.

    package com.example.javacv11;
    
    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.nio.ShortBuffer;
    import java.util.Collections;
    import java.util.Comparator;
    import java.util.List;
    
    import org.bytedeco.javacv.FFmpegFrameRecorder;
    import org.bytedeco.javacv.Frame;
    
    import android.app.Activity;
    import android.content.Context;
    import android.content.pm.ActivityInfo;
    import android.hardware.Camera;
    import android.hardware.Camera.PreviewCallback;
    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;
    import android.os.Bundle;
    import android.os.PowerManager;
    import android.util.Log;
    import android.view.Display;
    import android.view.KeyEvent;
    import android.view.LayoutInflater;
    import android.view.SurfaceHolder;
    import android.view.SurfaceView;
    import android.view.View;
    import android.view.View.OnClickListener;
    import android.view.WindowManager;
    import android.widget.Button;
    import android.widget.LinearLayout;
    import android.widget.RelativeLayout;
    
    public class FirstActivity extends Activity implements OnClickListener {
    
        private final static String CLASS_LABEL = "RecordActivity";
        private final static String LOG_TAG = CLASS_LABEL;
    
        private PowerManager.WakeLock mWakeLock;
    
        private String ffmpeg_link = "rtmp://10.0.0.136:1935/live/myStream";
    
        long startTime = 0;
        boolean recording = false;
    
        private FFmpegFrameRecorder recorder;
    
        private boolean isPreviewOn = false;
    
        private int sampleAudioRateInHz = 44100;
        private int imageWidth = 320;
        private int imageHeight = 240;
        private int frameRate = 30;
    
        /* audio data getting thread */
        private AudioRecord audioRecord;
        private AudioRecordRunnable audioRecordRunnable;
        private Thread audioThread;
        volatile boolean runAudioThread = true;
    
        /* video data getting thread */
        private Camera cameraDevice;
        private CameraView cameraView;
    
        private Frame yuvImage = null;
    
        /* layout setting */
        private final int bg_screen_bx = 232;
        private final int bg_screen_by = 128;
        private final int bg_screen_width = 700;
        private final int bg_screen_height = 500;
        private final int bg_width = 1123;
        private final int bg_height = 715;
        private final int live_width = 640;
        private final int live_height = 480;
        private int screenWidth, screenHeight;
        private Button btnRecorderControl;
    
        /* The number of seconds in the continuous record loop (or 0 to disable loop). */
        final int RECORD_LENGTH = 10;
        Frame[] images;
        long[] timestamps;
        ShortBuffer[] samples;
        int imagesIndex, samplesIndex;
    
        @Override
        public void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
    
            setContentView(R.layout.activity_first);
    
            PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE); 
            mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, CLASS_LABEL); 
            mWakeLock.acquire(); 
    
            initLayout();
        }
    
    
        @Override
        protected void onResume() {
            super.onResume();
    
            if (mWakeLock == null) {
               PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
               mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, CLASS_LABEL);
               mWakeLock.acquire();
            }
        }
    
        @Override
        protected void onPause() {
            super.onPause();
    
            if (mWakeLock != null) {
                mWakeLock.release();
                mWakeLock = null;
            }
        }
    
        @Override
        protected void onDestroy() {
            super.onDestroy();
    
            recording = false;
    
            if (cameraView != null) {
                cameraView.stopPreview();
            }
    
            if(cameraDevice != null) {
               cameraDevice.stopPreview();
               cameraDevice.release();
               cameraDevice = null;
            }
    
            if (mWakeLock != null) {
                mWakeLock.release();
                mWakeLock = null;
            }
        }
    
    
        private void initLayout() {
    
            /* get size of screen */
            Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
            screenWidth = display.getWidth();
            screenHeight = display.getHeight();
            RelativeLayout.LayoutParams layoutParam = null; 
            LayoutInflater myInflate = null; 
            myInflate = (LayoutInflater) getSystemService(Context.LAYOUT_INFLATER_SERVICE);
            RelativeLayout topLayout = new RelativeLayout(this);
            setContentView(topLayout);
            LinearLayout preViewLayout = (LinearLayout) myInflate.inflate(R.layout.activity_first, null);
            layoutParam = new RelativeLayout.LayoutParams(screenWidth, screenHeight);
            topLayout.addView(preViewLayout, layoutParam);
    
            /* add control button: start and stop */
            btnRecorderControl = (Button) findViewById(R.id.recorder_control);
            btnRecorderControl.setText("Start");
            btnRecorderControl.setOnClickListener(this);
    
            /* add camera view */
            int display_width_d = (int) (1.0 * bg_screen_width * screenWidth / bg_width);
            int display_height_d = (int) (1.0 * bg_screen_height * screenHeight / bg_height);
            int prev_rw, prev_rh;
            if (1.0 * display_width_d / display_height_d > 1.0 * live_width / live_height) {
                prev_rh = display_height_d;
                prev_rw = (int) (1.0 * display_height_d * live_width / live_height);
            } else {
                prev_rw = display_width_d;
                prev_rh = (int) (1.0 * display_width_d * live_height / live_width);
            }
            layoutParam = new RelativeLayout.LayoutParams(prev_rw, prev_rh);
            layoutParam.topMargin = (int) (1.0 * bg_screen_by * screenHeight / bg_height);
            layoutParam.leftMargin = (int) (1.0 * bg_screen_bx * screenWidth / bg_width);
    
            cameraDevice = Camera.open();
            Log.i(LOG_TAG, "cameara open");
            cameraView = new CameraView(this, cameraDevice);
            topLayout.addView(cameraView, layoutParam);
            Log.i(LOG_TAG, "cameara preview start: OK");
        }
    
        //---------------------------------------
        // initialize ffmpeg_recorder
        //---------------------------------------
        private void initRecorder() {
    
            Log.w(LOG_TAG,"init recorder");
    
            if (RECORD_LENGTH > 0) {
                imagesIndex = 0;
                images = new Frame[RECORD_LENGTH * frameRate];
                timestamps = new long[images.length];
                for (int i = 0; i < images.length; i++) {
                    images[i] = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
                    timestamps[i] = -1;
                }
            } else if (yuvImage == null) {
                yuvImage = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
                Log.i(LOG_TAG, "create yuvImage");
            }
    
            Log.i(LOG_TAG, "ffmpeg_url: " + ffmpeg_link);
            recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
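        // NOTE: RTMP normally carries an FLV container; setting the format to
        // "mp4" for an rtmp:// URL generally fails, which may be why Wowza shows nothing.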
            recorder.setFormat("mp4");
            recorder.setSampleRate(sampleAudioRateInHz);
            // Set in the surface changed method
            recorder.setFrameRate(frameRate);
    
            Log.i(LOG_TAG, "recorder initialize success");
    
            audioRecordRunnable = new AudioRecordRunnable();
            audioThread = new Thread(audioRecordRunnable);
            runAudioThread = true;
        }
    
        public void startRecording() {
    
            initRecorder();
    
            try {
                recorder.start();
                startTime = System.currentTimeMillis();
                recording = true;
                audioThread.start();
    
            } catch (FFmpegFrameRecorder.Exception e) {
                e.printStackTrace();
            }
        }
    
        public void stopRecording() {
    
            runAudioThread = false;
            try {
                audioThread.join();
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            audioRecordRunnable = null;
            audioThread = null;
    
            if (recorder != null && recording) {
                if (RECORD_LENGTH > 0) {
                    Log.v(LOG_TAG,"Writing frames");
                    try {
                    int firstIndex = imagesIndex % images.length; // oldest frame in the image ring buffer
                        int lastIndex = (imagesIndex - 1) % images.length;
                        if (imagesIndex <= images.length) {
                            firstIndex = 0;
                            lastIndex = imagesIndex - 1;
                        }
                        if ((startTime = timestamps[lastIndex] - RECORD_LENGTH * 1000000L) < 0) {
                            startTime = 0;
                        }
                        if (lastIndex < firstIndex) {
                            lastIndex += images.length;
                        }
                        for (int i = firstIndex; i <= lastIndex; i++) {
                            long t = timestamps[i % timestamps.length] - startTime;
                            if (t >= 0) {
                                if (t > recorder.getTimestamp()) {
                                    recorder.setTimestamp(t);
                                }
                                recorder.record(images[i % images.length]);
                            }
                        }
    
                        firstIndex = samplesIndex % samples.length;
                        lastIndex = (samplesIndex - 1) % samples.length;
                        if (samplesIndex <= samples.length) {
                            firstIndex = 0;
                            lastIndex = samplesIndex - 1;
                        }
                        if (lastIndex < firstIndex) {
                            lastIndex += samples.length;
                        }
                        for (int i = firstIndex; i <= lastIndex; i++) {
                            recorder.recordSamples(samples[i % samples.length]);
                        }
                    } catch (FFmpegFrameRecorder.Exception e) {
                        Log.v(LOG_TAG,e.getMessage());
                        e.printStackTrace();
                    }
                }
    
                recording = false;
                Log.v(LOG_TAG,"Finishing recording, calling stop and release on recorder");
                try {
                    recorder.stop();
                    recorder.release();
                } catch (FFmpegFrameRecorder.Exception e) {
                    e.printStackTrace();
                }
                recorder = null;
    
            }
        }
    
        @Override
        public boolean onKeyDown(int keyCode, KeyEvent event) {
    
            if (keyCode == KeyEvent.KEYCODE_BACK) {
                if (recording) {
                    stopRecording();
                }
    
                finish();
    
                return true;
            }
    
            return super.onKeyDown(keyCode, event);
        }
    
    
        //---------------------------------------------
        // audio thread, gets and encodes audio data
        //---------------------------------------------
        class AudioRecordRunnable implements Runnable {
    
            @Override
            public void run() {
                android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
    
                // Audio
                int bufferSize;
                ShortBuffer audioData;
                int bufferReadResult;
    
                bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz, 
                        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
                audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz, 
                        AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
    
                if (RECORD_LENGTH > 0) {
                    samplesIndex = 0;
                    samples = new ShortBuffer[RECORD_LENGTH * sampleAudioRateInHz * 2 / bufferSize + 1];
                    for (int i = 0; i < samples.length; i++) {
                        samples[i] = ShortBuffer.allocate(bufferSize);
                    }
                } else {
                    audioData = ShortBuffer.allocate(bufferSize);
                }
    
                Log.d(LOG_TAG, "audioRecord.startRecording()");
                audioRecord.startRecording();
    
                /* ffmpeg_audio encoding loop */
                while (runAudioThread) {
                    if (RECORD_LENGTH > 0) {
                        audioData = samples[samplesIndex++ % samples.length];
                        audioData.position(0).limit(0);
                    }
                    //Log.v(LOG_TAG,"recording? " + recording);
                    bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
                    audioData.limit(bufferReadResult);
                    if (bufferReadResult > 0) {
                        Log.v(LOG_TAG,"bufferReadResult: " + bufferReadResult);
                        // If "recording" isn't true when start this thread, it never get's set according to this if statement...!!!
                        // Why?  Good question...
                        if (recording) {
                            if (RECORD_LENGTH <= 0) try {
                                recorder.recordSamples(audioData);
                                //Log.v(LOG_TAG,"recording " + 1024*i + " to " + 1024*i+1024);
                            } catch (FFmpegFrameRecorder.Exception e) {
                                Log.v(LOG_TAG,e.getMessage());
                                e.printStackTrace();
                            }
                        }
                    }
                }
                Log.v(LOG_TAG,"AudioThread Finished, release audioRecord");
    
                /* encoding finish, release recorder */
                if (audioRecord != null) {
                    audioRecord.stop();
                    audioRecord.release();
                    audioRecord = null;
                    Log.v(LOG_TAG,"audioRecord released");
                }
            }
        }
    
        //---------------------------------------------
        // camera thread, gets and encodes video data
        //---------------------------------------------
        class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {
    
            private SurfaceHolder mHolder;
            private Camera mCamera;
    
            public CameraView(Context context, Camera camera) {
                super(context);
                Log.w("camera","camera view");
                mCamera = camera;
                mHolder = getHolder();
                mHolder.addCallback(CameraView.this);
                mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
                mCamera.setPreviewCallback(CameraView.this);
            }
    
            @Override
            public void surfaceCreated(SurfaceHolder holder) {
                try {
                    stopPreview();
                    mCamera.setPreviewDisplay(holder);
                } catch (IOException exception) {
                    mCamera.release();
                    mCamera = null;
                }
            }
    
            public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
                Camera.Parameters camParams = mCamera.getParameters();
            List<Camera.Size> sizes = camParams.getSupportedPreviewSizes();
            // Sort the list in ascending order by pixel area
            Collections.sort(sizes, new Comparator<Camera.Size>() {

                public int compare(final Camera.Size a, final Camera.Size b) {
                    return a.width * a.height - b.width * b.height;
                }
            });
    
                // Pick the first preview size that is equal or bigger, or pick the last (biggest) option if we cannot
                // reach the initial settings of imageWidth/imageHeight.
                for (int i = 0; i < sizes.size(); i++) {
                    if ((sizes.get(i).width >= imageWidth && sizes.get(i).height >= imageHeight) || i == sizes.size() - 1) {
                        imageWidth = sizes.get(i).width;
                        imageHeight = sizes.get(i).height;
                        Log.v(LOG_TAG, "Changed to supported resolution: " + imageWidth + "x" + imageHeight);
                        break;
                    }
                }
                camParams.setPreviewSize(imageWidth, imageHeight);
    
                Log.v(LOG_TAG,"Setting imageWidth: " + imageWidth + " imageHeight: " + imageHeight + " frameRate: " + frameRate);
    
                camParams.setPreviewFrameRate(frameRate);
                Log.v(LOG_TAG,"Preview Framerate: " + camParams.getPreviewFrameRate());
    
                mCamera.setParameters(camParams);
                startPreview();
            }
    
            @Override
            public void surfaceDestroyed(SurfaceHolder holder) {
                try {
                mHolder.removeCallback(this); // stop receiving surface callbacks
                    mCamera.setPreviewCallback(null);
                } catch (RuntimeException e) {
                    // The camera has probably just been released, ignore.
                }
            }
    
            public void startPreview() {
                if (!isPreviewOn && mCamera != null) {
                    isPreviewOn = true;
                    mCamera.startPreview();
                }
            }
    
            public void stopPreview() {
                if (isPreviewOn && mCamera != null) {
                    isPreviewOn = false;
                    mCamera.stopPreview();
                }
            }
    
            @Override
            public void onPreviewFrame(byte[] data, Camera camera) {
                if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
                    startTime = System.currentTimeMillis();
                    return;
                }
                if (RECORD_LENGTH > 0) {
                    int i = imagesIndex++ % images.length;
                    yuvImage = images[i];
                    timestamps[i] = 1000 * (System.currentTimeMillis() - startTime);
                }
                /* get video data */
                if (yuvImage != null && recording) {
                    ((ByteBuffer)yuvImage.image[0].position(0)).put(data);
    
                    if (RECORD_LENGTH <= 0) try {
                        Log.v(LOG_TAG,"Writing Frame");
                        long t = 1000 * (System.currentTimeMillis() - startTime);
                        if (t > recorder.getTimestamp()) {
                            recorder.setTimestamp(t);
                        }
                        recorder.record(yuvImage);
                    } catch (FFmpegFrameRecorder.Exception e) {
                        Log.v(LOG_TAG,e.getMessage());
                        e.printStackTrace();
                    }
                }
            }
        }
    
        @Override
        public void onClick(View v) {
            if (!recording) {
                startRecording();
                Log.w(LOG_TAG, "Start Button Pushed");
                btnRecorderControl.setText("Stop");
            } else {
                // This will trigger the audio recording loop to stop and then set isRecorderStart = false;
                stopRecording();
                Log.w(LOG_TAG, "Stop Button Pushed");
                btnRecorderControl.setText("Start");
            }
        }
    
    }
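
    For what it's worth, a quick way to tell whether the publish ever reaches Wowza is to play the same URL back, for example with ffplay (the URL is the one hard-coded above):

        ffplay rtmp://10.0.0.136:1935/live/myStream

    If that shows nothing either, the problem is on the publish side, and the container noted in initRecorder() is the first thing to check.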