Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • A Windows-compatible C IDE that can compile FFmpeg?

    6 December 2015, by Shimmy

    I want to set up an environment on my PC (Win7 x64) in which I can combine these projects and compile them into one executable:

    For this I need an environment that can compile and debug both of the above projects (producing a Windows-compatible exe).

    Any ideas?

  • OpenGL (GLEW) + FFMPEG + VS2010 release build

    6 December 2015, by rickhik

    Currently I am working on an application that uses OpenGL (GLEW) and FFmpeg (the libav* libraries) in Visual Studio 2010. The GLEW libs are included as Additional Libraries. The libav* libraries are pulled in via a header file with extern "C" and #pragma comment(lib, ...).
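
    For reference, such a linkage header usually looks something like the sketch below. This is a minimal sketch, not the asker's file; the import-library names are assumptions based on how FFmpeg's Windows dev packages are commonly laid out.

    // ffmpeg_link.h -- minimal sketch of a linkage header (assumed names).
    // Included from C++, hence the extern "C" around the C headers.
    #pragma once

    extern "C" {
    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>
    #include <libswscale/swscale.h>
    }

    // MSVC-specific: request the import libraries directly from source.
    #pragma comment(lib, "avcodec.lib")
    #pragma comment(lib, "avformat.lib")
    #pragma comment(lib, "avutil.lib")
    #pragma comment(lib, "swscale.lib")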

    The application opens an OpenGL window that draws some textured quads via FFmpeg. I want to use this application for my volunteer job at a TV station of a Children's Hospital.

    The problem I have is building a Release version of the project. When I build a Debug version, it all works fine: I get an OpenGL window with the quads. When I switch from Debug to Release and start the debugger, it gives me the message "entry point of procedure __GLEW_EXT_framebuffer_object not found in DLL avcodec-56.dll", which is strange to me, because GLEW is not part of avcodec-56.dll.

    I can work around this by going to Project Settings -> Linker -> Optimization and switching References from Yes (/OPT:REF) to No (/OPT:NOREF). With that change the solution builds and runs under the debugger. But when I go to the project's Release folder and run the built executable, the OpenGL window stays black without any error message.
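
    For what it's worth, a quick way to rule out a silent GLEW failure in the Release build is to check glewInit() explicitly before any rendering. A minimal sketch (the surrounding setup and error handling are assumptions):

    // Minimal sketch: verify GLEW initialises before rendering anything.
    // Must be called after an OpenGL context has been created.
    #include <GL/glew.h>
    #include <stdio.h>

    int init_glew(void)
    {
        GLenum err = glewInit();
        if (err != GLEW_OK) {
            fprintf(stderr, "glewInit failed: %s\n",
                    (const char *)glewGetErrorString(err));
            return -1;
        }
        // The extension named in the error message above.
        if (!GLEW_EXT_framebuffer_object)
            fprintf(stderr, "EXT_framebuffer_object not available\n");
        return 0;
    }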

    I hope you guys can help me out.

    @michael-iv I saw you had a similar issue

  • How to change the frame rate and captured image size from an IP camera

    6 December 2015, by rockycai

    Now I have an IP camera and I want to capture images from it over RTSP. I use the code below and it works well, but the camera's frame rate is 25 fps, so I get a lot of images per second, which I don't want; each image is also 6.2 MB, and I don't need such high-quality images. What can I do to lower the frame rate and shrink the images? One possible approach is sketched after the code below.

    #ifndef INT64_C
    #define INT64_C(c) (c ## LL)
    #define UINT64_C(c) (c ## ULL)
    #endif
    
    #ifdef __cplusplus
    extern "C" {
    #endif
        /*Include ffmpeg header file*/
    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>
    #include <libswscale/swscale.h>
    #include <libavutil/avutil.h>
    #ifdef __cplusplus
    }
    #endif
    
    #include <stdio.h>
    
    
    static void SaveFrame(AVFrame *pFrame, int width, int height, int iFrame);
    
    int main (int argc, const char * argv[])
    {
        AVFormatContext *pFormatCtx;
        int             i, videoStream;
        AVCodecContext  *pCodecCtx;
        AVCodec         *pCodec;
        AVFrame         *pFrame; 
        AVFrame         *pFrameRGB;
        AVPacket        packet;
        int             frameFinished;
        int             numBytes;
        uint8_t         *buffer;
    
        // Register all formats and codecs
        av_register_all();
    //  const char *filename="C:\libraries\gfjyp.avi";
        // Open video file
        //AVDictionary *options = NULL;
        //av_dict_set(&options,"rtsp_transport","tcp",0);
        if(av_open_input_file(&pFormatCtx, argv[1], NULL, 0, NULL)!=0)
            return -1; // Couldn't open file
    
        // Retrieve stream information
        if(av_find_stream_info(pFormatCtx)<0)
            return -1; // Couldn't find stream information
    
        // Dump information about file onto standard error
        dump_format(pFormatCtx, 0, argv[1], false);
    
        // Find the first video stream
        videoStream=-1;
        for(i=0; i<pFormatCtx->nb_streams; i++)
            if(pFormatCtx->streams[i]->codec->codec_type==AVMEDIA_TYPE_VIDEO)
            {
                videoStream=i;
                break;
            }
        if(videoStream==-1)
            return -1; // Didn't find a video stream

        // Get a pointer to the codec context for the video stream
        pCodecCtx=pFormatCtx->streams[videoStream]->codec;

        // Find the decoder for the video stream
        pCodec=avcodec_find_decoder(pCodecCtx->codec_id);
        if(pCodec==NULL)
            return -1; // Codec not found

        // Open codec
        if(avcodec_open(pCodecCtx, pCodec)<0)
            return -1; // Could not open codec

        // Hack to correct wrong frame rates that seem to be generated by some codecs
        if(pCodecCtx->time_base.num>1000 && pCodecCtx->time_base.den==1)
            pCodecCtx->time_base.den=1000;

        //pCodecCtx->time_base.den=1;
        //pCodecCtx->time_base.num=1;

        // Allocate video frame
        pFrame=avcodec_alloc_frame();

        // Allocate an AVFrame structure
        pFrameRGB=avcodec_alloc_frame();
        if(pFrameRGB==NULL)
            return -1;

        // Determine required buffer size and allocate buffer
        numBytes=avpicture_get_size(PIX_FMT_RGB24, pCodecCtx->width,
            pCodecCtx->height);

        //buffer=malloc(numBytes);
        buffer=(uint8_t *)av_malloc(numBytes*sizeof(uint8_t));

        // Assign appropriate parts of buffer to image planes in pFrameRGB
        avpicture_fill((AVPicture *)pFrameRGB, buffer, PIX_FMT_RGB24,
            pCodecCtx->width, pCodecCtx->height);

        // Read frames and save the first 1000 frames to disk
        i=0;
        while(av_read_frame(pFormatCtx, &packet)>=0)
        {
            // Is this a packet from the video stream?
            if(packet.stream_index==videoStream)
            {
                // Decode video frame
                avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished,
                    &packet);

                // Did we get a video frame?
                if(frameFinished)
                {
                    static struct SwsContext *img_convert_ctx;

    #if 0
                    // Older removed code
                    // Convert the image from its native format to RGB swscale
                    img_convert((AVPicture *)pFrameRGB, PIX_FMT_RGB24,
                        (AVPicture*)pFrame, pCodecCtx->pix_fmt, pCodecCtx->width,
                        pCodecCtx->height);

                    // function template, for reference
                    int sws_scale(struct SwsContext *context, uint8_t* src[], int srcStride[], int srcSliceY,
                        int srcSliceH, uint8_t* dst[], int dstStride[]);
    #endif
                    // Convert the image from its native format to RGB24 for SaveFrame
                    if(img_convert_ctx == NULL) {
                        int w = pCodecCtx->width;
                        int h = pCodecCtx->height;

                        img_convert_ctx = sws_getContext(w, h,
                            pCodecCtx->pix_fmt,
                            w, h, PIX_FMT_RGB24, SWS_BICUBIC,
                            NULL, NULL, NULL);
                        if(img_convert_ctx == NULL) {
                            fprintf(stderr, "Cannot initialize the conversion context!\n");
                            exit(1);
                        }
                    }
                    int ret = sws_scale(img_convert_ctx, pFrame->data, pFrame->linesize, 0,
                        pCodecCtx->height, pFrameRGB->data, pFrameRGB->linesize);
    #if 0
                    // this used to be true, as of 1/2009, but apparently it is no longer true in 3/2009
                    if(ret) {
                        fprintf(stderr, "SWS_Scale failed [%d]!\n", ret);
                        exit(-1);
                    }
    #endif

                    // Save the frame to disk
                    if(i++<=1000)
                        SaveFrame(pFrameRGB, pCodecCtx->width, pCodecCtx->height, i);
                }
            }

            // Free the packet that was allocated by av_read_frame
            av_free_packet(&packet);
            //sleep(1);
        }

        // Free the RGB image
        //free(buffer);
        av_free(buffer);
        av_free(pFrameRGB);

        // Free the YUV frame
        av_free(pFrame);

        // Close the codec
        avcodec_close(pCodecCtx);

        // Close the video file
        av_close_input_file(pFormatCtx);

        return 0;
    }
    
    static void SaveFrame(AVFrame *pFrame, int width, int height, int iFrame)
    {
        FILE *pFile;
        char szFilename[32];
        int  y;
    
        // Open file
        sprintf(szFilename, "frame%d.ppm", iFrame);
        pFile=fopen(szFilename, "wb");
        if(pFile==NULL)
            return;
    
        // Write header
        fprintf(pFile, "P6\n%d %d\n255\n", width, height);
    
        // Write pixel data
        for(y=0; y<height; y++)
            fwrite(pFrame->data[0]+y*pFrame->linesize[0], 1, width*3, pFile);
    
        // Close file
        fclose(pFile);
    }
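
    One way to tackle both issues in the question, sketched below, is to keep only every N-th decoded frame and to let swscale write into a smaller destination. This is a minimal sketch, not the asker's code: it reuses the legacy API from the code above (sws_getContext, sws_scale, PIX_FMT_RGB24), and SAVE_EVERY_N and the half-size output are illustrative assumptions. It also assumes pFrameRGB's buffer was sized with avpicture_get_size/avpicture_fill for the smaller outW x outH rather than the full camera resolution.

    #include <libavcodec/avcodec.h>
    #include <libswscale/swscale.h>

    #define SAVE_EVERY_N 25  /* at 25 fps, keeps roughly one image per second */

    /* SaveFrame() is the function defined in the question's code above. */
    static void SaveFrame(AVFrame *pFrame, int width, int height, int iFrame);

    /* Downscale a decoded frame to half size and save every SAVE_EVERY_N-th one.
       dstRGB must be avpicture_fill'ed for (outW, outH, PIX_FMT_RGB24). */
    static void save_decimated(AVCodecContext *cc, AVFrame *src, AVFrame *dstRGB)
    {
        static struct SwsContext *ctx = NULL;
        static int count = 0;
        int outW = cc->width / 2;   /* half the camera resolution */
        int outH = cc->height / 2;

        if (count++ % SAVE_EVERY_N != 0)
            return;                 /* drop this frame entirely */

        if (ctx == NULL)
            ctx = sws_getContext(cc->width, cc->height, cc->pix_fmt,
                                 outW, outH, PIX_FMT_RGB24,
                                 SWS_BILINEAR, NULL, NULL, NULL);

        sws_scale(ctx, src->data, src->linesize, 0, cc->height,
                  dstRGB->data, dstRGB->linesize);
        SaveFrame(dstRGB, outW, outH, count);
    }

    Inside the decode loop, the sws_scale/SaveFrame pair would then be replaced by a single call: save_decimated(pCodecCtx, pFrame, pFrameRGB);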
    
  • Tainted canvas may not be exported, when trying to send it via AJAX to a PHP file

    6 December 2015, by romikette F

    I am trying to capture an image from a local video so I can make a thumbnail out of it. I want to avoid video conversion (like ffmpeg) to save server resources and keep my videos in their original format. I've been reading about tainted canvases, and I understand that I am running up against the browser's security rules, but this article https://developer.mozilla.org/en-US/docs/Web/HTML/CORS_enabled_image#What_is_a_tainted_canvas says that I could pull an image from another origin under the right CORS conditions. So far I have a video element with id "video", a canvas with id "mYcanvas", a div with id "response", and this script:

    function capture() {
        var canvas = document.getElementById('mYcanvas');
        var video = document.getElementById('video');
        canvas.getContext('2d').drawImage(video, 0, 0, video.videoWidth, video.videoHeight);

        function getBase64() {
            var img = document.getElementById("mYcanvas");
            var canvas = document.createElement("canvas");
            canvas.width = img.width;
            canvas.height = img.height;
            var ctx = canvas.getContext("2d");
            ctx.drawImage(img, 0, 0);
            var dataURL = canvas.toDataURL("image/png");
            alert(dataURL.replace(/^data:image\/(png|jpg);base64,/, ""));

            $.ajax({
                url: 'submit_image.php',
                type: 'POST',
                data: {
                    imgdata: dataURL
                },
                success: function (res) {
                    document.getElementById("response").innerHTML = res;
                }
            });
        }
        getBase64();
    }
    

    Never mind the PHP; it does not receive anything.

  • arm-linux-androideabi-gcc is unable to create an executable - compile ffmpeg for android armeabi devices

    6 December 2015, by Chaitanya Chandurkar

    I am trying to compile ffmpeg for Android armeabi devices.
    I am following the tutorial from roman10.net.
    His build script builds ffmpeg for armv7-a devices; I want to build it for armeabi.

    My ultimate aim is to run ffmpeg commands on Android armeabi and armv7-a devices.

    So I changed the script's CPU part from

    #arm v7vfpv3
    CPU=armv7-a
    OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=vfpv3-d16 -marm -march=$CPU "
    PREFIX=./android/$CPU
    ADDITIONAL_CONFIGURE_FLAG=
    build_one 
    

    to

    #arm v6
    CPU=armv6
    OPTIMIZE_CFLAGS="-marm -march=$CPU"
    PREFIX=./android/$CPU 
    ADDITIONAL_CONFIGURE_FLAG=
    build_one
    

    (I thought changing it to armv6 would build an ffmpeg compatible with armeabi devices. Am I wrong?)

    Doubt 1:
    Do I also have to change the toolchain,
    i.e. from arm-linux-androideabi-4.4.3 to arm-eabi-4.4.0?

    Doubt 2:

    When I try running ./build.sh (without changing the toolchain), it gives me the following error:

    /home/chaitanya/android/android-ndk-r5b/toolchains/arm-linux-androideabi-4.4.3/prebuilt/linux-x86/bin/arm-linux-androideabi-gcc is unable to create an executable file.
    C compiler test failed.
    
    If you think configure made a mistake,.. blaah blahh blaah
    

    After that I got a lot of warnings saying:
    ‘sub_id’ is deprecated, and many other deprecated-function warnings.
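
    For what it's worth, "C compiler test failed" means configure could not compile and link a trivial test program with the cross-compiler; the underlying compiler error is recorded in config.log. The check can be reproduced by hand with something like the sketch below (the build command in the comment uses the script's own variables, so treat it as an assumption):

    /* check.c -- trivial program to verify that the cross-compiler can
       produce an executable at all; configure's sanity test does
       essentially this. Build it by hand with, e.g.:
         $PREBUILT/bin/arm-linux-androideabi-gcc --sysroot=$PLATFORM check.c -o check
       If this fails, look in config.log for the real error. */
    int main(void)
    {
        return 0;
    }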

    Platform: Ubuntu 11.10
    FFmpeg version: 0.11.2
    NDK: android-ndk-r5b

    Here is my build script:

    #!/bin/bash
    ######################################################
    # Usage:
    # put this script in top of FFmpeg source tree
    # ./build_android
    # It generates binary for following architectures:
    # ARMv6 
    # ARMv6+VFP 
    # ARMv7+VFPv3-d16 (Tegra 2)
    # ARMv7+Neon (Cortex-A8)
    # Customizing:
    # 1. Feel free to change ./configure parameters for more features
    # 2. To adapt other ARM variants
    # set $CPU and $OPTIMIZE_CFLAGS 
    # call build_one
    ######################################################
    NDK=~/android/android-ndk-r5b
    PLATFORM=$NDK/platforms/android-8/arch-arm/
    PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.4.3/prebuilt/linux-x86
    function build_one
    {
    ./configure --target-os=linux \
        --prefix=$PREFIX \
        --enable-cross-compile \
        --extra-libs="-lgcc" \
        --arch=arm \
        #--cc=$PREBUILT/bin/arm-linux-androideabi-gcc \
        --cross-prefix=$PREBUILT/bin/arm-linux-androideabi- \
        #--nm=$PREBUILT/bin/arm-linux-androideabi-nm \
        --sysroot=$PLATFORM \
        --extra-cflags=" -O3 -fpic -DANDROID -DHAVE_SYS_UIO_H=1 -Dipv6mr_interface=ipv6mr_ifindex -fasm -Wno-psabi -fno-short-enums -fno-strict-aliasing -finline-limit=300 $OPTIMIZE_CFLAGS " \
        --disable-shared \
        --enable-static \
        --extra-ldflags="-Wl,-rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib -nostdlib -lc -lm -ldl -llog" \
        --disable-everything \
        --enable-demuxer=mov \
        --enable-demuxer=h264 \
        --disable-ffplay \
        --enable-protocol=file \
        --enable-avformat \
        --enable-avcodec \
        --enable-decoder=rawvideo \
        --enable-decoder=mjpeg \
        --enable-decoder=h263 \
        --enable-decoder=mpeg4 \
        --enable-decoder=h264 \
        --enable-parser=h264 \
        --disable-network \
        --enable-zlib \
        --disable-avfilter \
        --disable-avdevice \
        $ADDITIONAL_CONFIGURE_FLAG
    
    make clean
    make  -j4 install
    $PREBUILT/bin/arm-linux-androideabi-ar d libavcodec/libavcodec.a inverse.o
    $PREBUILT/bin/arm-linux-androideabi-ld -rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib  -soname libffmpeg.so -shared -nostdlib  -z,noexecstack -Bsymbolic --whole-archive --no-undefined -o $PREFIX/libffmpeg.so libavcodec/libavcodec.a libavformat/libavformat.a libavutil/libavutil.a libswscale/libswscale.a -lc -lm -lz -ldl -llog  --warn-once  --dynamic-linker=/system/bin/linker $PREBUILT/lib/gcc/arm-linux-androideabi/4.4.3/libgcc.a
    }
    
    #arm v6
    CPU=armv6
    OPTIMIZE_CFLAGS="-marm -march=$CPU"
    PREFIX=./android/$CPU 
    ADDITIONAL_CONFIGURE_FLAG=
    build_one
    
    #arm v7vfpv3
    #CPU=armv7-a
    #OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=vfpv3-d16 -marm -march=$CPU "
    #PREFIX=./android/$CPU
    #ADDITIONAL_CONFIGURE_FLAG=
    #build_one
    
    #arm v7vfp
    #CPU=armv7-a
    #OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=vfp -marm -march=$CPU "
    #PREFIX=./android/$CPU-vfp
    #ADDITIONAL_CONFIGURE_FLAG=
    #build_one
    
    #arm v7n
    #CPU=armv7-a
    #OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=neon -marm -march=$CPU -mtune=cortex-a8"
    #PREFIX=./android/$CPU 
    #ADDITIONAL_CONFIGURE_FLAG=--enable-neon
    #build_one
    
    #arm v6+vfp
    #CPU=armv6
    #OPTIMIZE_CFLAGS="-DCMP_HAVE_VFP -mfloat-abi=softfp -mfpu=vfp -marm -march=$CPU"
    #PREFIX=./android/${CPU}_vfp 
    #ADDITIONAL_CONFIGURE_FLAG=
    #build_one
    

    How do I tackle this problem? What is going wrong?