Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • FFMPEG command to replace first 32 bytes of mp4 file

14 October 2016, by Tekriwal D

Is there any utility or command in ffmpeg to replace the first 32 bytes of an mp4 video file with the contents of another key file holding 32 bytes of data?
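
ffmpeg itself has no byte-patching option, but outside ffmpeg this is a one-liner with coreutils. A sketch, assuming hypothetical filenames key.bin (the 32-byte key file) and input.mp4:

```shell
# Not an ffmpeg feature: splice the 32-byte key file onto the rest of the mp4.
head -c 32 key.bin    >  output.mp4   # first 32 bytes taken from the key file
tail -c +33 input.mp4 >> output.mp4   # everything after byte 32 of the original
```

Note that the first 32 bytes of an mp4 normally contain the ftyp box, so the patched file will most likely no longer be recognized as a valid mp4 by players.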

  • Making half an image transparent from the command line

14 October 2016, by Alex Konetchy

I don't really know where to start with this one. I'm trying to do something that I thought would be relatively simple to accomplish with ImageMagick, but I don't know the exact command to start with. I need to draw a line through an image, make everything above the line transparent, and keep everything below the line as the original image. What would be the best way to accomplish this using ImageMagick?

So what I've come up with for now is to crop the image and then resize it to the original size, but with a transparent background. The command I use is this, but the result always comes out black. I don't understand why.

    convert -background none -gravity south out.png -resize 400x200 -extent 400x400 result.png

    Thanks for all of the help!
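
One possible approach, sketched below: crop out the bottom half, then pad back to full size with a transparent background. The filenames and the 400x400 geometry are assumptions; `PNG32:` forces an alpha-capable output format, and a non-alpha output is a common reason a "transparent" area renders as black.

```shell
# Keep the bottom half of in.png, pad the top half back with transparency.
convert in.png -gravity south -crop 100%x50%+0+0 +repage \
        -background none -extent 400x400 PNG32:result.png
```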

  • Decoding mp4/mkv using FFMPEG fails

14 October 2016, by StarShine

I'm using a project based on the latest FFmpeg git source tree, linking against the shared DLLs published by Zeranoe at https://ffmpeg.zeranoe.com/builds/

The playback code works and loops. It plays back h265 files (raw), mpeg, avi, and mpg files. However, as soon as an mp4 or mkv container is specified as the input file, regardless of what's inside, a lot of errors are dumped from the codec. It doesn't matter whether it's HEVC or h264.

    [h264 @ 00000000xyz] No start code is found
    [h264 @ 00000000xyz] Error splitting the input into NAL units.
    

To make things even stranger, ffplay.exe plays these files just fine.

I realize that I can probably fix this by converting the files into a raw format first, but I would like to be able to read and parse mp4 files as they are. Since I am using the pre-built libs from Zeranoe, my guess would be that something was not enabled during the build, but then I would expect ffplay to fail too. Do I need to set a flag in the format_context or codec_context, or provide some sort of filter identifier?
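
For the raw-conversion workaround mentioned above, a lossless sketch: the h264_mp4toannexb bitstream filter rewrites the mp4's length-prefixed NAL units into an Annex-B stream with start codes, the very thing the "No start code is found" error complains about. The filenames here are examples:

```shell
# Lossless remux, no re-encode: inserts the missing Annex-B start codes.
ffmpeg -i input.mp4 -map 0:v -c:v copy -bsf:v h264_mp4toannexb output.h264
```

In API code, a plausible cause of this exact symptom is that only width and height are copied into the codec context; avcodec_parameters_to_context() copies all stream parameters, including the extradata that carries SPS/PPS for mp4/mkv inputs, and is normally called before avcodec_open2().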

    Movies that play fine came from http://bbb3d.renderfarming.net/download.html, http://www.w6rz.net/ and http://www.sample-videos.com/

    These work:

    big_buck_bunny_480p_surround-fix.avi
    bigbuckbunny_480x272.h265
    

I'm a total noob at ffmpeg, so please help me understand what is wrong and how to fix it. If the pre-built libs are the culprit, then the second question is whether someone has a convenient CMake setup to build this for Windows x64 and x86, debug and release targets.

    Here's the source for initializing ffmpeg for reading

    avdevice_register_all();
    avfilter_register_all();
    av_register_all();
    avformat_network_init();
    

    The format is parsed as follows:

    m_FormatContext = avformat_alloc_context();
    if (avformat_open_input(&m_FormatContext, file.GetPath().ToString().c_str(), NULL, NULL) != 0) 
    {
        //std::cout << "failed to open input" << std::endl;
        success = false;
    }
    // find stream info
    if (success)
    {
        if (avformat_find_stream_info(m_FormatContext, NULL) < 0) 
        {
            //std::cout << "failed to get stream info" << std::endl;
            success = false;
        }
    }
    

    The stream is opened as follows:

    m_VideoStream = avstream;
    m_FormatContext = formatContext;
    if (m_VideoStream)
    {
        m_StreamIndex = m_VideoStream->stream_identifier;
        AVCodecParameters *codecpar = m_VideoStream->codecpar;      
        if (codecpar)
        {
            AVCodecID codec_id = codecpar->codec_id;
            m_Decoder = avcodec_find_decoder(codec_id);
            if (m_Decoder)
            {
                m_CodecContext = avcodec_alloc_context3(m_Decoder);
                if (m_CodecContext)
                {
                    m_CodecContext->width = codecpar->width;                    
                    m_CodecContext->height = codecpar->height;
                    m_VideoSize = i3(codecpar->width, codecpar->height,1);
                    success = 0 == avcodec_open2(m_CodecContext, m_Decoder, NULL);
                    if (success)
                    {
                        if(m_CodecContext)
                        {
                            int size = av_image_get_buffer_size(format, m_CodecContext->width, m_CodecContext->height, 1);      
                            if (size > 0)
                            {
                                av_frame = av_frame_alloc();
                                gl_frame = av_frame_alloc();        
                                uint8_t *internal_buffer = (uint8_t *)av_malloc(size * sizeof(uint8_t));
                                av_image_fill_arrays((uint8_t**)((AVPicture *)gl_frame->data), (int*) ((AVPicture *)gl_frame->linesize), internal_buffer, format, m_CodecContext->width, m_CodecContext->height,1);
                                m_Packet = (AVPacket *)av_malloc(sizeof(AVPacket));
                            }
                        }
                    }
                    if (!success) 
                    {
                        avcodec_close(m_CodecContext);
                        avcodec_free_context(&m_CodecContext);
                        m_CodecContext = NULL;
                        m_Decoder = NULL;
                        m_VideoStream = NULL;
                    }
                }
                else
                {
                    m_Decoder = NULL;
                    m_VideoStream = NULL;
                }
            }
        }
    }
    

And decoding on a single thread:

    do 
    {
        if (av_read_frame(m_FormatContext, m_Packet) < 0) 
        {
            av_packet_unref(m_Packet);
            m_AllPacketsSent = true;
        }
        else
        {
            if (m_Packet->stream_index == m_StreamIndex) 
            {                   
                avcodec_send_packet(m_CodecContext, m_Packet);
            }
        }
    
        int frame_finished = avcodec_receive_frame(m_CodecContext, av_frame);
        if (frame_finished == 0) 
        {
            if (!conv_ctx) 
            {
                conv_ctx = sws_getContext(m_CodecContext->width, 
                    m_CodecContext->height, m_CodecContext->pix_fmt, 
                    m_CodecContext->width, m_CodecContext->height, format, SWS_BICUBIC, NULL, NULL, NULL);
            }
    
            sws_scale(conv_ctx, av_frame->data, av_frame->linesize, 0, m_CodecContext->height, gl_frame->data, gl_frame->linesize);
    
            switch(format)
            {
                case AV_PIX_FMT_BGR32_1:
                case AV_PIX_FMT_RGB32_1:
                case AV_PIX_FMT_0BGR32:
                case AV_PIX_FMT_0RGB32:
                case AV_PIX_FMT_BGR32:  
                case AV_PIX_FMT_RGB32:              
                {
                    m_CodecContext->bits_per_raw_sample = 32; break;                    
                }
                default:
                {
                    FWASSERT(format == AV_PIX_FMT_RGB32, "The format changed, update the bits per raw sample!"); break;
                }
            }
    
    
            size_t bufferSize = m_CodecContext->width * m_CodecContext->height * m_CodecContext->bits_per_raw_sample / 8;
            m_Buffer.Realloc(bufferSize, false,  gl_frame->data[0]);
            m_VideoSize = i3(m_CodecContext->width, m_CodecContext->height,1);
            result = true;
            // sends the image buffer straight to the locked texture here..
            // glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, codec_ctx->width, codec_ctx->height, GL_RGB, GL_UNSIGNED_BYTE, gl_frame->data[0]);
        }
    
        av_packet_unref(m_Packet);
    } while (m_Packet->stream_index != m_StreamIndex);
    
    m_FrameDecoded = result;
    

    Any insight is appreciated!

Compile ffmpeg as a standalone binary

13 October 2016, by Guig

I'm trying to compile ffmpeg as a standalone binary (because I want to use it in AWS Lambda).

    I can get things to work fine on the server where I'm compiling, but if I copy the binary and run it from another server, I get:

    ./ffmpeg: error while loading shared libraries: libvdpau.so.1: cannot open shared object file: No such file or directory

So it sounds like something didn't make it into the binary. From what I've read, I have to compile ffmpeg with the flags --disable-shared and --enable-static, which I've done:

    PATH="$HOME/bin:$PATH" PKG_CONFIG_PATH="$HOME/ffmpeg_build/lib/pkgconfig" ./configure \
      --prefix="$HOME/ffmpeg_build" \
      --pkg-config-flags="--static" \
      --extra-cflags="-I$HOME/ffmpeg_build/include" \
      --extra-ldflags="-L$HOME/ffmpeg_build/lib" \
      --bindir="$HOME/bin" \
      --disable-shared \
      --enable-static \
      --enable-gpl \
      --enable-libass \
      --enable-libfreetype \
      --enable-libmp3lame \
      --enable-libvpx \
      --enable-libx264
    PATH="$HOME/bin:$PATH" make
    make install
    make distclean
    hash -r
    

    Is there something I'm missing?
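
One note, hedged: --disable-shared and --enable-static make FFmpeg's own libraries static, but external dependencies such as libvdpau can still be linked dynamically. A quick check on the build machine (the path is the --bindir from the configure line above):

```shell
# A fully static binary prints "not a dynamic executable"; any line like
# "libvdpau.so.1 => ..." means the target server needs that library too.
ldd "$HOME/bin/ffmpeg"
```

Forcing a fully static link is commonly attempted with --extra-ldexeflags="-static", but whether that links cleanly depends on static versions of every enabled external library being installed on the build machine.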

  • ffmpeg pipe:0 : could not find codec parameters

13 October 2016, by dre

When I run this command I get an error. If I run it without the pipe, it works.

    With Pipe

    cat mymovie.m4v | ffmpeg -i pipe:0 -an -analyzeduration 1000000 -f image2 -vf 
    "select='eq(pict_type,PICT_TYPE_I)'" -vsync vfr 'public/files/thumb%04d.png'
    

    Without Pipe (Works)

    ffmpeg -i mymovie.m4v -an -analyzeduration 2147483647 -probesize 2147483647 -f image2 -vf 
    "select='eq(pict_type,PICT_TYPE_I)'" -vsync vfr 'public/files/thumb%04d.png'
    

    Output

    ffmpeg version 2.2 Copyright (c) 2000-2014 the FFmpeg developers
    built on Apr 10 2014 17:50:46 with Apple LLVM version 5.1 (clang-503.0.38) 
    (based on LLVM 3.4svn)
    configuration: --prefix=/usr/local/Cellar/ffmpeg/2.2 
    --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-nonfree 
    --enable-hardcoded-tables --enable-avresample --enable-vda --cc=clang 
    --host-cflags= --host-ldflags= --enable-libx264 --enable-libfaac --enable-libmp3lame 
    --enable-libxvid --enable-libfreetype --enable-libtheora --enable-libvorbis 
    --enable-libvpx --enable-librtmp --enable-libopencore-amrnb --enable-libopencore-amrwb
    --enable-libvo-aacenc --enable-libass --enable-ffplay --enable-libspeex 
    --enable-libschroedinger --enable-libfdk-aac --enable-libopus --enable-frei0r 
--enable-libopenjpeg --extra-cflags='-I/usr/local/Cellar/openjpeg/1.5.1_1/include/openjpeg-1.5 '
      libavutil      52. 66.100 / 52. 66.100
      libavcodec     55. 52.102 / 55. 52.102
      libavformat    55. 33.100 / 55. 33.100
      libavdevice    55. 10.100 / 55. 10.100
      libavfilter     4.  2.100 /  4.  2.100
      libavresample   1.  2.  0 /  1.  2.  0
      libswscale      2.  5.102 /  2.  5.102
      libswresample   0. 18.100 /  0. 18.100
      libpostproc    52.  3.100 / 52.  3.100
    
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fd87b80f000] stream 0, offset 0x2c: partial file
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fd87b80f000] Could not find codec parameters for stream 0
    (Video: h264 (avc1 / 0x31637661), 1280x720, 3310 kb/s): unspecified pixel format
    

    Consider increasing the value for the 'analyzeduration' and 'probesize' options
    pipe:0: could not find codec parameters

    I have tried setting (link)

-analyzeduration 100 -probesize 10000000 
    -analyzeduration 2147483647 -probesize 2147483647
    

    Still didn't work.
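
A pipe is not seekable, and an mp4/m4v often stores its moov atom (the index holding the codec parameters) at the end of the file, which would explain why the same file works when ffmpeg can open and seek it directly. A hedged workaround is a one-time lossless remux that moves the moov atom to the front; mymovie.m4v is the filename from the question:

```shell
# Remux without re-encoding, moving the moov atom to the start of the file.
ffmpeg -i mymovie.m4v -c copy -movflags +faststart mymovie_fast.m4v
# The remuxed stream can now be parsed from a pipe:
cat mymovie_fast.m4v | ffmpeg -i pipe:0 -an -f image2 \
    -vf "select='eq(pict_type,PICT_TYPE_I)'" -vsync vfr 'public/files/thumb%04d.png'
```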