Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • Problems linking to FFMPEG library in Visual Studio 2010

    11 December 2013, by Dídac Pérez

    I have cross-compiled FFMPEG on Debian using the mingw32 toolchain. The result of the compilation is a set of .a files. When I try to use them in my project, I get linker errors, specifically the following:

    1>RTSPCapture.obj : error LNK2019: unresolved external symbol _avformat_free_context referenced in function ...
    1>RTSPCapture.obj : error LNK2019: unresolved external symbol _avio_close referenced in function ...
    1>RTSPCapture.obj : error LNK2019: unresolved external symbol _avcodec_close referenced in function ...
    (and many more...)
    

    I have already included the header files like this:

    extern "C"
    {
        #include <avcodec.h>
        #include <avformat.h>
        #include <avio.h>
    }
    

    And I use the .a files like this:

    #pragma comment(lib, "libavcodec.a")
    #pragma comment(lib, "libavformat.a")
    #pragma comment(lib, "libavutil.a")
    

    May I know why I am still getting linker errors? Best regards,

    EDIT: I have realized that this is not possible. So, what should I do to use the FFMPEG library in my MSVC2010 project, taking into account that I can't compile ffmpeg on Windows? It seems to be REALLY difficult...
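
    (A note on what usually works here: MSVC generally cannot link MinGW-built static .a archives of FFmpeg, but the shared "dev" builds of FFmpeg ship MSVC-compatible import libraries. A sketch of that setup, assuming such a build's include and lib directories are on the project paths:)

    extern "C"
    {
        #include <libavcodec/avcodec.h>
        #include <libavformat/avformat.h>
        #include <libavformat/avio.h>
    }

    // Link the import libraries from a shared FFmpeg build; the matching
    // avcodec-xx.dll, avformat-xx.dll, avutil-xx.dll must be present at run time.
    #pragma comment(lib, "avcodec.lib")
    #pragma comment(lib, "avformat.lib")
    #pragma comment(lib, "avutil.lib")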

  • FFMPEG: Displaying a white screen using ffplay via a custom decoder

    11 December 2013, by Zax

    I have created a dummy decoder. In its decode function, I assign the output frame pointer YUV420 data filled with 255 (i.e., a white screen).

    I also have a corresponding probe function for my dummy decoder, which takes a dummy input file and, based on some checks, returns AVPROBE_SCORE_MAX. This probe section works perfectly fine and invokes my custom dummy decoder.
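
    (For reference, the convention the paragraph describes: a demuxer's read_probe callback inspects an AVProbeData and returns a score. A minimal sketch with a made-up four-byte signature, since the question elides the actual check:)

    /* Hypothetical probe along the lines described above: inspect the buffered
     * start of the input and claim it with the maximum score. */
    static int dummydec_probe(AVProbeData *p)
    {
        if (p->buf_size >= 4 && !memcmp(p->buf, "DUMY", 4)) /* "DUMY" is invented */
            return AVPROBE_SCORE_MAX;
        return 0;
    }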

    The AVCodec structure for my dummy decoder is shown below:

    AVCodec ff_dummyDec_decoder = {
        .name           = "dummyDec",
        .type           = AVMEDIA_TYPE_VIDEO,
        .id             = AV_CODEC_ID_MYDEC,
        .priv_data_size = sizeof(MYDECContext),
        .pix_fmts       = (const enum AVPixelFormat[]) {AV_PIX_FMT_YUV420P},
        .init           = dummyDec_decode_init,
        .close          = dummyDec_decode_close,
        .decode         = dummyDec_decode_frame,
    };
    

    Where,

    .init -> is a pointer to a function that performs my decoder-related initializations
    .close -> is a pointer to a function that frees all memory that was allocated during initialization
    .decode -> is a pointer to a function that decodes a frame.
    

    The definitions of the above functions are shown below:

    #include <stdio.h>
    #include <stdlib.h>
    #include "avcodec.h"
    
    unsigned char *yPtr=NULL;
    unsigned char *uPtr=NULL;
    unsigned char *vPtr=NULL;
    
    int memFlag=0;//If memFlag is zero then allocate memory for YUV data
    
    int width=416;//Picture width and height that i want to display in ffplay
    int height=240;
    
    static int dummyDec_decode_frame(AVCodecContext *avctx, void *data,
                                 int *got_frame_ptr, AVPacket *avpkt)
    {
        AVFrame *frame=data; //make frame point to the pointer on which output should be mapped
        printf("\nDecode function entered\n");
        frame->width=width;
        frame->height=height;
        frame->format=AV_PIX_FMT_YUV420P;
    
        //initialize frame->linesize[] array
        avpicture_fill((AVPicture*)frame, NULL, frame->format,frame->width,frame->height);
    
        frame->data[0]=yPtr;
        frame->data[1]=uPtr;
        frame->data[2]=vPtr;
    
        *got_frame_ptr = 1;
    
        printf("\nGotFramePtr set to 1\n");
    
        return width*height+(width/2)*(height/2)+(width/2)*(height/2);//returning the amount of bytes being used
    }
    
    static int dummyDec_decode_init(AVCodecContext *avctx)
    {
        printf("\nDummy Decoders init entered\n");
    
        //Allocate memory for YUV data
        yPtr=(unsigned char*)malloc(sizeof(unsigned char*)*width*height);
        uPtr=(unsigned char*)malloc(sizeof(unsigned char*)*width/2*height/2);
        vPtr=(unsigned char*)malloc(sizeof(unsigned char*)*width/2*height/2);
    
        if(yPtr == NULL || uPtr ==NULL ||vPtr==NULL)
            exit(0);
    
        //set allocated memory with 255 i.e white color
       memset(yPtr,255,width*height);
       memset(uPtr,255,width/2*height/2);
       memset(vPtr,255,width/2*height/2);
    }
    
    static int dummyDec_decode_close(AVCodecContext *avctx)
    {
        free(yPtr);
        free(uPtr);
        free(vPtr);
    }
    

    In dummyDec_decode_frame(), I'm returning the number of bytes used to display the white colour. Is this right? Secondly, I have no parser for my decoder, because I'm just mapping a YUV buffer containing white data to the AVFrame structure pointer.
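
    (For reference: in libavcodec, the decode callback's return value is the number of bytes it consumed from the input packet, or a negative error code, not the size of the picture it produced. A sketch of the usual tail of such a function, reusing the question's names:)

    static int dummyDec_decode_frame(AVCodecContext *avctx, void *data,
                                     int *got_frame_ptr, AVPacket *avpkt)
    {
        /* ... fill the AVFrame as in the question ... */
        *got_frame_ptr = 1;
        /* Report how much of avpkt was consumed; a decoder that never consumes
         * its input can leave the caller feeding it the same data forever. */
        return avpkt->size;
    }

    Under that contract, returning a computed picture size while never consuming the packet could produce exactly the kind of repeating decode loop logged below.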

    The command I use to run it is:

    ./ffplay -vcodec dummyDec -i input.bin
    

    The output is an infinite loop with the following messages:

    dummyDec probe entered

    Dummy Decoders init entered

    Decode function entered

    GotFramePtr set to 1

    Decode function entered

    GotFramePtr set to 1

    ... (the last two messages keep repeating)

    Where am I going wrong? Is it the absence of a parser, or something else? I'm unable to proceed because of this. Please provide your valuable answers. Thanks in advance.

    --Regards

  • cvWriteFrame can't write frames into an mp4 file when using OpenCV's ffmpeg_64.dll

    11 December 2013, by user3074013

    1. When I successfully create a video writer, I try to write some frames into an mp4 file.

    Then a crash occurs.

    2. Code:

    m_vw = cvCreateVideoWriter(fileName, codec, fps, m_size, bColor);
    // ... (a valid IplImage has been obtained here)
    if (0 == cvWriteFrame(m_vw, iplimg)) // crashes here, but only on 64-bit; 32-bit is OK
        break; // failed to write frame

    3. Some people said an ffmpeg codec plugin might need to be installed, like ffdshow or Xvid, but even after installing them it still doesn't work.

    So how can I write images into an mp4 file on 64-bit? I would appreciate your help. Thanks.
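
    (A minimal sketch of the OpenCV 2.x C API in question, with an invented file name, size, and frame rate. Two frequent causes of trouble with the ffmpeg backend are a FOURCC the container cannot hold and frames whose size or channel count differ from what the writer was created with; verifying that the opencv_ffmpeg*_64.dll exactly matches the OpenCV version in use is also worth doing first:)

    #include <opencv/cv.h>
    #include <opencv/highgui.h>

    int main(void)
    {
        // 'mp4v' is an MPEG-4 FOURCC that the ffmpeg backend can usually mux into .mp4
        CvVideoWriter *vw = cvCreateVideoWriter("out.mp4",
                                                CV_FOURCC('m', 'p', '4', 'v'),
                                                25.0, cvSize(640, 480), 1);
        if (!vw)
            return 1;

        // Every frame must match the size/colour mode given at creation:
        // here 640x480, 3-channel 8-bit, because is_color = 1.
        IplImage *img = cvCreateImage(cvSize(640, 480), IPL_DEPTH_8U, 3);
        cvSet(img, cvScalar(255, 255, 255, 0), NULL);  // a plain white frame
        cvWriteFrame(vw, img);

        cvReleaseImage(&img);
        cvReleaseVideoWriter(&vw);
        return 0;
    }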

  • Recording video on Android using JavaCV

    11 December 2013, by Fabio Bergmann

    I'm trying to record a video in Android using the JavaCV lib. I need to record the video in 640x360.

    I have installed everything as described in the README.txt file and followed this example: https://code.google.com/p/javacv/source/browse/samples/RecordActivity.java. In that example, the video size is: private int imageWidth = 320; private int imageHeight = 240;

    In my case, I need to record video in 640x360 H.264, but I either get an exception or the video records with an incorrect image.

    I think the fix lies in a correct combination of the depth in yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8S, 2); and the pixel format in recorder.record(yuvIplimage, AV_PIX_FMT_VDPAU_H264);.
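
    (For what it's worth: Android camera preview frames arrive as NV21, a YUV 4:2:0 layout, by default. A sketch against the question's own fields and the record(IplImage, int) overload it already uses, not a verified fix:)

    // Assumes: import static com.googlecode.javacv.cpp.avutil.AV_PIX_FMT_NV21;
    // NV21 preview data fits in an 8-bit, 2-channel IplImage of the same
    // width/height (the original RecordActivity sample allocates it this way).
    yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);

    // Declare the raw layout actually delivered by the camera; AV_PIX_FMT_VDPAU_H264
    // is a hardware-surface format, not a raw frame layout.
    recorder.record(yuvIplimage, AV_PIX_FMT_NV21);

    This also presumes the camera actually supports a 640x360 preview size, which Camera.Parameters.getSupportedPreviewSizes() can confirm.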

    Here is my code:

    import static com.googlecode.javacv.cpp.avutil.AV_PIX_FMT_NONE;
    import static com.googlecode.javacv.cpp.opencv_core.IPL_DEPTH_8U;
    
    import java.io.IOException;
    import java.nio.ShortBuffer;
    
    import android.app.Activity;
    import android.content.Context;
    import android.content.pm.ActivityInfo;
    import android.hardware.Camera;
    import android.hardware.Camera.PreviewCallback;
    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;
    import android.os.Bundle;
    import android.os.PowerManager;
    import android.util.Log;
    import android.view.Display;
    import android.view.KeyEvent;
    import android.view.LayoutInflater;
    import android.view.SurfaceHolder;
    import android.view.SurfaceView;
    import android.view.View;
    import android.view.View.OnClickListener;
    import android.view.WindowManager;
    import android.widget.Button;
    import android.widget.LinearLayout;
    import android.widget.RelativeLayout;
    
    import com.autosonvideo.logic.CameraHelpers;
    import com.googlecode.javacv.FFmpegFrameRecorder;
    import com.googlecode.javacv.cpp.opencv_core.IplImage;
    
    public class FFmpegRecordActivity extends Activity implements OnClickListener {
    
        private final static String CLASS_LABEL = "RecordActivity";
        private final static String LOG_TAG = CLASS_LABEL;
    
        private PowerManager.WakeLock mWakeLock;
    
        private String ffmpeg_link = "/mnt/sdcard/stream.flv";
    
        long startTime = 0;
        boolean recording = false;
    
        private volatile FFmpegFrameRecorder recorder;
    
        private boolean isPreviewOn = false;
    
        private int sampleAudioRateInHz = 44100;
        private int imageWidth = 640;
        private int imageHeight = 360;
        private int frameRate = 30;
    
        /* audio data getting thread */
        private AudioRecord audioRecord;
        private AudioRecordRunnable audioRecordRunnable;
        private Thread audioThread;
        volatile boolean runAudioThread = true;
    
        /* video data getting thread */
        private Camera cameraDevice;
        private CameraView cameraView;
    
        private IplImage yuvIplimage = null;
    
        /* layout setting */
        private final int bg_screen_bx = 232;
        private final int bg_screen_by = 128;
        private final int bg_screen_width = 700;
        private final int bg_screen_height = 500;
        private final int bg_width = 1123;
        private final int bg_height = 715;
        private final int live_width = 640;
        private final int live_height = 360;
        private int screenWidth, screenHeight;
        private Button btnRecorderControl;
    
        @Override
        public void onCreate(Bundle savedInstanceState) {
    
            super.onCreate(savedInstanceState);
            setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
    
            setContentView(R.layout.main);
    
            PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
            mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK,
                    CLASS_LABEL);
            mWakeLock.acquire();
    
            initLayout();
            initRecorder();
        }
    
        @Override
        protected void onResume() {
            super.onResume();
    
            if (mWakeLock == null) {
                PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
                mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK,
                        CLASS_LABEL);
                mWakeLock.acquire();
            }
        }
    
        @Override
        protected void onPause() {
            super.onPause();
    
            if (mWakeLock != null) {
                mWakeLock.release();
                mWakeLock = null;
            }
        }
    
        @Override
        protected void onDestroy() {
            super.onDestroy();
    
            recording = false;
    
            if (cameraView != null) {
                cameraView.stopPreview();
                cameraDevice.release();
                cameraDevice = null;
            }
    
            if (mWakeLock != null) {
                mWakeLock.release();
                mWakeLock = null;
            }
        }
    
        private void initLayout() {
    
            /* get size of screen */
            Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE))
                    .getDefaultDisplay();
            screenWidth = display.getWidth();
            screenHeight = display.getHeight();
            RelativeLayout.LayoutParams layoutParam = null;
            LayoutInflater myInflate = null;
            myInflate = (LayoutInflater) getSystemService(Context.LAYOUT_INFLATER_SERVICE);
            RelativeLayout topLayout = new RelativeLayout(this);
            setContentView(topLayout);
            LinearLayout preViewLayout = (LinearLayout) myInflate.inflate(
                    R.layout.main, null);
            layoutParam = new RelativeLayout.LayoutParams(screenWidth, screenHeight);
            topLayout.addView(preViewLayout, layoutParam);
    
            /* add control button: start and stop */
            btnRecorderControl = (Button) findViewById(R.id.recorder_control);
            btnRecorderControl.setText("Start");
            btnRecorderControl.setOnClickListener(this);
    
            /* add camera view */
            int display_width_d = (int) (1.0 * bg_screen_width * screenWidth / bg_width);
            int display_height_d = (int) (1.0 * bg_screen_height * screenHeight / bg_height);
            int prev_rw, prev_rh;
            if (1.0 * display_width_d / display_height_d > 1.0 * live_width
                    / live_height) {
                prev_rh = display_height_d;
                prev_rw = (int) (1.0 * display_height_d * live_width / live_height);
            } else {
                prev_rw = display_width_d;
                prev_rh = (int) (1.0 * display_width_d * live_height / live_width);
            }
            layoutParam = new RelativeLayout.LayoutParams(prev_rw, prev_rh);
            layoutParam.topMargin = (int) (1.0 * bg_screen_by * screenHeight / bg_height);
            layoutParam.leftMargin = (int) (1.0 * bg_screen_bx * screenWidth / bg_width);
    
            cameraDevice = Camera.open();
            Log.i(LOG_TAG, "cameara open");
            cameraView = new CameraView(this, cameraDevice);
            topLayout.addView(cameraView, layoutParam);
            Log.i(LOG_TAG, "cameara preview start: OK");
        }
    
        // ---------------------------------------
        // initialize ffmpeg_recorder
        // ---------------------------------------
        private void initRecorder() {
    
            Log.w(LOG_TAG, "init recorder");
    
            if (yuvIplimage == null) {
                yuvIplimage = IplImage.create(imageWidth, imageHeight,
                        IPL_DEPTH_8U, 2);
                Log.i(LOG_TAG, "create yuvIplimage");
            }
    
            ffmpeg_link = CameraHelpers.getOutputMediaFile(
                    CameraHelpers.MEDIA_TYPE_VIDEO).toString();
    
            Log.i(LOG_TAG, "ffmpeg_url: " + ffmpeg_link);
            recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth,
                    imageHeight, 1);
            recorder.setFormat("mp4");
            recorder.setSampleRate(sampleAudioRateInHz);
            // Set in the surface changed method
            recorder.setFrameRate(frameRate);
    
            Log.i(LOG_TAG, "recorder initialize success");
    
            audioRecordRunnable = new AudioRecordRunnable();
            audioThread = new Thread(audioRecordRunnable);
        }
    
        public void startRecording() {
    
            try {
                recorder.start();
                startTime = System.currentTimeMillis();
                recording = true;
                audioThread.start();
    
            } catch (FFmpegFrameRecorder.Exception e) {
                e.printStackTrace();
            }
        }
    
        public void stopRecording() {
    
            runAudioThread = false;
    
            if (recorder != null && recording) {
                recording = false;
                Log.v(LOG_TAG,
                        "Finishing recording, calling stop and release on recorder");
                try {
                    recorder.stop();
                    recorder.release();
                } catch (FFmpegFrameRecorder.Exception e) {
                    e.printStackTrace();
                }
                recorder = null;
    
            }
        }
    
        @Override
        public boolean onKeyDown(int keyCode, KeyEvent event) {
    
            if (keyCode == KeyEvent.KEYCODE_BACK) {
                if (recording) {
                    stopRecording();
                }
    
                finish();
    
                return true;
            }
    
            return super.onKeyDown(keyCode, event);
        }
    
        // ---------------------------------------------
        // audio thread, gets and encodes audio data
        // ---------------------------------------------
        class AudioRecordRunnable implements Runnable {
    
            @Override
            public void run() {
                android.os.Process
                        .setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
    
                // Audio
                int bufferSize;
                short[] audioData;
                int bufferReadResult;
    
                bufferSize = AudioRecord
                        .getMinBufferSize(sampleAudioRateInHz,
                                AudioFormat.CHANNEL_IN_MONO,
                                AudioFormat.ENCODING_PCM_16BIT);
                audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC,
                        sampleAudioRateInHz, AudioFormat.CHANNEL_IN_MONO,
                        AudioFormat.ENCODING_PCM_16BIT, bufferSize);
    
                audioData = new short[bufferSize];
    
                Log.d(LOG_TAG, "audioRecord.startRecording()");
                audioRecord.startRecording();
    
                /* ffmpeg_audio encoding loop */
                while (runAudioThread) {
                    // Log.v(LOG_TAG,"recording? " + recording);
                    bufferReadResult = audioRecord.read(audioData, 0,
                            audioData.length);
                    if (bufferReadResult > 0) {
                        // Log.v(LOG_TAG, "bufferReadResult: " + bufferReadResult);
                        // If "recording" isn't true when start this thread, it
                        // never get's set according to this if statement...!!!
                        // Why? Good question...
                        if (recording) {
                            try {
                                recorder.record(ShortBuffer.wrap(audioData, 0,
                                        bufferReadResult));
                                // Log.v(LOG_TAG,"recording " + 1024*i + " to " +
                                // 1024*i+1024);
                            } catch (FFmpegFrameRecorder.Exception e) {
                                Log.v(LOG_TAG, e.getMessage());
                                e.printStackTrace();
                            }
                        }
                    }
                }
                Log.v(LOG_TAG, "AudioThread Finished, release audioRecord");
    
                /* encoding finish, release recorder */
                if (audioRecord != null) {
                    audioRecord.stop();
                    audioRecord.release();
                    audioRecord = null;
                    Log.v(LOG_TAG, "audioRecord released");
                }
            }
        }
    
        // ---------------------------------------------
        // camera thread, gets and encodes video data
        // ---------------------------------------------
        class CameraView extends SurfaceView implements SurfaceHolder.Callback,
                PreviewCallback {
    
            private SurfaceHolder mHolder;
            private Camera mCamera;
    
            public CameraView(Context context, Camera camera) {
                super(context);
                Log.w("camera", "camera view");
                mCamera = camera;
                mHolder = getHolder();
                mHolder.addCallback(CameraView.this);
                mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
                mCamera.setPreviewCallback(CameraView.this);
            }
    
            @Override
            public void surfaceCreated(SurfaceHolder holder) {
                try {
                    stopPreview();
                    mCamera.setPreviewDisplay(holder);
                } catch (IOException exception) {
                    mCamera.release();
                    mCamera = null;
                }
            }
    
            public void surfaceChanged(SurfaceHolder holder, int format, int width,
                    int height) {
                Log.v(LOG_TAG, "Setting imageWidth: " + imageWidth
                        + " imageHeight: " + imageHeight + " frameRate: "
                        + frameRate);
                Camera.Parameters camParams = mCamera.getParameters();
                camParams.setPreviewSize(imageWidth, imageHeight);
    
                Log.v(LOG_TAG,
                        "Preview Framerate: " + camParams.getPreviewFrameRate());
    
                camParams.setPreviewFrameRate(frameRate);
                mCamera.setParameters(camParams);
                startPreview();
            }
    
            @Override
            public void surfaceDestroyed(SurfaceHolder holder) {
                try {
                    mHolder.addCallback(null);
                    mCamera.setPreviewCallback(null);
                } catch (RuntimeException e) {
                    // The camera has probably just been released, ignore.
                }
            }
    
            public void startPreview() {
                if (!isPreviewOn && mCamera != null) {
                    isPreviewOn = true;
                    mCamera.startPreview();
                }
            }
    
            public void stopPreview() {
                if (isPreviewOn && mCamera != null) {
                    isPreviewOn = false;
                    mCamera.stopPreview();
                }
            }
    
            @Override
            public void onPreviewFrame(byte[] data, Camera camera) {
                /* get video data */
                if (yuvIplimage != null && recording) {
                    int cap = yuvIplimage.getByteBuffer().capacity();
                    int dataLen = data.length;
                    Log.d("FFmpeg", "cap: " + cap + " data: " + dataLen);
                    yuvIplimage.getByteBuffer().put(data);
    
                    Log.v(LOG_TAG, "Writing Frame");
                    try {
                        long t = 1000 * (System.currentTimeMillis() - startTime);
                        if (t > recorder.getTimestamp()) {
                            recorder.setTimestamp(t);
                        }
                        recorder.record(yuvIplimage, AV_PIX_FMT_NONE);
                    } catch (FFmpegFrameRecorder.Exception e) {
                        Log.v(LOG_TAG, e.getMessage());
                        e.printStackTrace();
                    }
                }
            }
        }
    
        @Override
        public void onClick(View v) {
            if (!recording) {
                startRecording();
                Log.w(LOG_TAG, "Start Button Pushed");
                btnRecorderControl.setText("Stop");
            } else {
                // This will trigger the audio recording loop to stop and then set
                // isRecorderStart = false;
                stopRecording();
                Log.w(LOG_TAG, "Stop Button Pushed");
                btnRecorderControl.setText("Start");
            }
        }
    }
    

    Thanks in advance!

  • How to tell whether the colors in a video frame decoded by FFmpeg are pre-multiplied by alpha?

    11 décembre 2013, par smokris

    When I decode video frames with FFmpeg (avcodec_decode_video2(), sws_scale()), with some videos (e.g., ProRes4444), I get colors pre-multiplied by alpha, and with other videos (e.g., QuickTime PNG), I get colors that aren't pre-multiplied by alpha.

    How can I tell whether the colors are pre-multiplied? Alternatively, how can I tell FFmpeg to always provide either pre-multiplied or un-pre-multiplied ("straight alpha") colors?
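
    (For context: "pre-multiplied" means each color component has already been scaled by its alpha. A sketch of undoing that, assuming 8-bit components and that you have already established, e.g. by decoding known content, that the source is pre-multiplied:)

    #include <stdint.h>

    /* Convert one pre-multiplied 8-bit component back to straight alpha.
     * Only meaningful once the frame is known to be pre-multiplied. */
    static inline uint8_t unpremultiply(uint8_t c, uint8_t a)
    {
        if (a == 0)
            return 0;                        /* fully transparent: color undefined */
        unsigned v = (c * 255u + a / 2) / a; /* rounded division */
        return v > 255 ? 255 : (uint8_t)v;
    }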