Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • Android media player using FFmpeg 2.3.3 and SDL2 2.0.3 gets an error in SDL_Init(); the error is about SDL_main.h

    18 November 2014, by Hanamaki

    I use FFmpeg 2.3.3 and SDL2 2.0.3 to develop an Android video player. The .apk builds successfully, but when I run it, SDL_Init() fails. The message I get from SDL_GetError() is:

    SDL_Init(14144): Application didn't initialize properly, did you include SDL_main.h in the file containing your main() function?

    but I do have #include "SDL_main.h" in the source. Does anyone have any suggestions? Thanks very much.
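
    In SDL2, SDL_main.h redefines main to SDL_main so that the Java-side SDLActivity can call into the native code, and the redefinition only takes effect when the file that includes it defines main with the exact int main(int argc, char *argv[]) signature. The sketch below only illustrates the entry point SDL expects; it is not the asker's actual code.

        /* Minimal sketch of the entry point SDL2 expects on Android.
         * SDL_main.h (also pulled in by SDL.h) renames main to SDL_main,
         * which SDLActivity invokes; the rename only matches this exact
         * signature, so e.g. "int main(void)" triggers the error above. */
        #include "SDL.h"
        #include "SDL_main.h"

        int main(int argc, char *argv[])
        {
            if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO) != 0) {
                SDL_Log("SDL_Init failed: %s", SDL_GetError());
                return 1;
            }
            /* ... FFmpeg/SDL player setup would go here ... */
            SDL_Quit();
            return 0;
        }

    If the entry point really cannot use that signature, calling SDL_SetMainReady() before SDL_Init() suppresses the "didn't initialize properly" check, at the cost of skipping SDL's own main-wrapper setup.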

  • How to extract frames using mencoder into a specific directory?

    18 November 2014, by slik

    I can't seem to extract images into a specific directory.

    The current command I am running is

    mplayer -vo jpeg CustomLogoAni.mp4
    

    Is it also possible to set a prefix for the output files, similar to ffmpeg's %d.jpg pattern?

    When combining the images I run the command:

    mencoder "mf://*.jpg" -ovc x264 -o output.avi
    

    But it does not work.

    Can anyone help?
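
    As a sketch of one possible approach: MPlayer's jpeg output driver accepts an outdir suboption, and ffmpeg can write numbered frames straight into a directory. The commands below are untested sketches that assume a frames/ directory already exists and that mencoder was built with x264 support.

        mplayer -vo jpeg:outdir=frames CustomLogoAni.mp4
        ffmpeg -i CustomLogoAni.mp4 frames/img_%d.jpg
        mencoder "mf://frames/*.jpg" -mf type=jpg:fps=25 -ovc x264 -o output.avi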

  • Create a demultiplexer for MPEG-2 TS on Android

    17 November 2014, by anz

    I need to extract ID3 tags from an MPEG-2 TS (HLS stream). Android has limited support for playing MPEG-2 TS, but my concern is only extracting the ID3 tags (playing the file is not necessary), so I am not concerned with the codecs (encoding and decoding).

    I have explored the following options:

    libstagefright and OpenMAX: the playback engine Google has shipped since Android 2.0. Its MediaExtractor is responsible for retrieving track data and the corresponding metadata from the underlying file system or HTTP stream. But according to the post "Adding video codec to Android", I would need to build my own firmware or my own media player, and I am hoping I don't have to go down that path. More information on Stagefright and OpenMAX can be found here:

    An overview of Stagefright player

    Android’s Stagefright Media Player Architecture

    Custom Wrapper Codec Integration into Android

    How to integrate a decoder to multimedia framework

    Compiling and using FFmpeg: a complete, cross-platform solution to record, convert and stream audio and video. TS files can be demultiplexed with this library, as mentioned here:

    FFmpeg - Extracting video and audio from transport stream file (.ts).

    But I am not sure whether I will be able to extract the ID3 tags from the HLS stream. libavformat might be able to do this, but I would still need a mechanism for signalling the extracted metadata to my application.

    Compiling VLC for Android: I have compiled VLC for Android and made some modifications inside the transport module of the demux component to extract the tags, but it is not able to play all the streams I supply to it.

    After looking through these options, I am still unsure how to achieve this. I don't want to create a media player, since I will not be playing the files, nor do I want to build my own firmware. Using FFmpeg seems to be the most viable option, but I would like to try this without using any third-party or open-source library. My questions are:

    Is it even possible to create a demultiplexer from scratch that will work on Android?

    If so, how should I go about it?

    Are there any options I have missed?

    I am new to this. Any help would be greatly appreciated. Thanks.
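
    To give an idea of what a from-scratch demultiplexer would involve: an MPEG-2 transport stream is a sequence of 188-byte packets, and in HLS the timed ID3 metadata travels in the PES payload of an elementary stream that the PMT marks with stream_type 0x15. The C sketch below (callable from the NDK) is deliberately naive: it only walks packets and looks for the "ID3" magic, without the PAT/PMT parsing and PES reassembly a real demuxer would need.

        #include <stdint.h>
        #include <stdio.h>
        #include <string.h>

        #define TS_PACKET_SIZE 188

        /* Walk a buffer of aligned TS packets and report payloads that
         * appear to start an ID3 tag. Naive: no PAT/PMT parsing, no PES
         * reassembly, no handling of tags split across packets. */
        void scan_for_id3(const uint8_t *buf, size_t len)
        {
            for (size_t off = 0; off + TS_PACKET_SIZE <= len; off += TS_PACKET_SIZE) {
                const uint8_t *pkt = buf + off;
                if (pkt[0] != 0x47)              /* lost sync, give up */
                    return;

                int pid            = ((pkt[1] & 0x1F) << 8) | pkt[2];
                int has_adaptation = pkt[3] & 0x20;
                int has_payload    = pkt[3] & 0x10;
                if (!has_payload)
                    continue;

                /* Payload starts after the 4-byte header and, if present,
                 * the adaptation field (1 length byte plus its contents). */
                int start = 4 + (has_adaptation ? 1 + pkt[4] : 0);
                if (start >= TS_PACKET_SIZE)
                    continue;

                for (int i = start; i + 3 <= TS_PACKET_SIZE; i++) {
                    if (memcmp(pkt + i, "ID3", 3) == 0) {
                        printf("possible ID3 tag on PID 0x%04x at offset %zu\n",
                               pid, off + (size_t)i);
                        break;
                    }
                }
            }
        }

    A real implementation would first read the PAT on PID 0x0000 to find the PMT, read the PMT to find the metadata PID, and then reassemble complete PES packets for that PID before handing the payload to an ID3 parser.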

  • FFmpeg recording of an IP camera stream plays back 3x faster

    17 November 2014, by Zohaib Khalid

    I'm recording a stream from an IP camera with FFmpeg, but when I play the recording back in VLC or Media Player afterwards, the video plays 3x faster. The ffmpeg invocation is:

    ffmpeg -i "http://10.11.16.127:90/livestream.cgi?user=admin&pwd=&streamid=0&audio=0&filename=" -c:v libx264 -t 300 -an output.mp4
    

    Any help would be appreciated. I am using FFmpeg on a Windows machine.
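
    One common cause is that a CGI/MJPEG feed carries no timestamps, so ffmpeg assumes a 25 fps input while the camera actually delivers far fewer frames, and the recording then plays back too fast. Two usual workarounds, sketched below, are to take timestamps from the wall clock or to force the real input rate; the 10 fps value is only a placeholder for the camera's actual rate.

        ffmpeg -use_wallclock_as_timestamps 1 -i "http://10.11.16.127:90/livestream.cgi?user=admin&pwd=&streamid=0&audio=0&filename=" -c:v libx264 -t 300 -an output.mp4
        ffmpeg -r 10 -i "http://10.11.16.127:90/livestream.cgi?user=admin&pwd=&streamid=0&audio=0&filename=" -c:v libx264 -t 300 -an output.mp4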

  • ffmpeg and hardware acceleration

    17 November 2014, by user1541069

    This code decodes an H.264 stream from a camera using FFmpeg, but I want to use hardware acceleration for the video decoding and I can't find a good example.

            string url = @"rtsp://admin:123456@192.168.0.201:554";
    
            FFmpegInvoke.av_register_all();
            FFmpegInvoke.avcodec_register_all();
            FFmpegInvoke.avformat_network_init();
    
    
            AVFormatContext* pFormatContext = FFmpegInvoke.avformat_alloc_context();
            if (FFmpegInvoke.avformat_open_input(&pFormatContext, url, null, null) != 0)
                throw new Exception("Could not open file");
    
            if (FFmpegInvoke.avformat_find_stream_info(pFormatContext, null) != 0)
                throw new Exception("Could not find stream info");
    
            AVStream* pStream = null;
            for (int i = 0; i < pFormatContext->nb_streams; i++)
            {
                if (pFormatContext->streams[i]->codec->codec_type == AVMediaType.AVMEDIA_TYPE_VIDEO)
                {
                    pStream = pFormatContext->streams[i];
                    break;
                }
            }
            if (pStream == null)
                throw new Exception("Could not found video stream");
    
            AVCodecContext codecContext = *(pStream->codec);
            int width = codecContext.width;
            int height = codecContext.height;
            AVPixelFormat sourcePixFmt = codecContext.pix_fmt;
            AVCodecID codecId = codecContext.codec_id;
            var convertToPixFmt = AVPixelFormat.PIX_FMT_BGR24;
            SwsContext* pConvertContext = FFmpegInvoke.sws_getContext(width, height, sourcePixFmt,
                                                                       width, height, convertToPixFmt,
                                                                       FFmpegInvoke.SWS_FAST_BILINEAR, null, null, null);
            if (pConvertContext == null)
                throw new Exception("Could not initialize the conversion context");
    
            var pConvertedFrame = (AVPicture*)FFmpegInvoke.avcodec_alloc_frame();
            int convertedFrameBufferSize = FFmpegInvoke.avpicture_get_size(convertToPixFmt, width, height);
            var pConvertedFrameBuffer = (byte*)FFmpegInvoke.av_malloc((uint) convertedFrameBufferSize);
            FFmpegInvoke.avpicture_fill(pConvertedFrame, pConvertedFrameBuffer, convertToPixFmt, width, height);
    
            AVCodec* pCodec = FFmpegInvoke.avcodec_find_decoder(codecId);
            if (pCodec == null)
                throw new Exception("Unsupported codec");
    
            // Reusing codec context from stream info, 
            // as an alternative way it could look like this: (but it works not for all kind of codecs)
            // AVCodecContext* pCodecContext = FFmpegInvoke.avcodec_alloc_context3(pCodec);
            AVCodecContext* pCodecContext = &codecContext;
    
            if ((pCodec->capabilities & FFmpegInvoke.CODEC_CAP_TRUNCATED) == FFmpegInvoke.CODEC_CAP_TRUNCATED)
                pCodecContext->flags |= FFmpegInvoke.CODEC_FLAG_TRUNCATED;
    
            if (FFmpegInvoke.avcodec_open2(pCodecContext, pCodec, null) < 0)
                throw new Exception("Could not open codec");
    
            AVFrame* pDecodedFrame = FFmpegInvoke.avcodec_alloc_frame();
    
            var packet = new AVPacket();
            AVPacket* pPacket = &packet;
            FFmpegInvoke.av_init_packet(pPacket);
    
            int frameNumber = 0;
            while (true)
            {
                Console.WriteLine("frame: {0}", frameNumber);
    
                if (FFmpegInvoke.av_read_frame(pFormatContext, pPacket) < 0)
                    throw new Exception("Could not read frame");
    
                if (pPacket->stream_index != pStream->index)
                    continue;
    
                int gotPicture = 0;
                int size = FFmpegInvoke.avcodec_decode_video2(pCodecContext, pDecodedFrame, &gotPicture, pPacket);
                if (size < 0)
                    throw new Exception(string.Format("Error while decoding frame {0}", frameNumber));
    
                if (gotPicture == 1)
                {
                    byte** src = &pDecodedFrame->data_0;
                    byte** dst = &pConvertedFrame->data_0;
                    FFmpegInvoke.sws_scale(pConvertContext, src, pDecodedFrame->linesize, 0,
                                            height, dst, pConvertedFrame->linesize);
    
                    byte* convertedFrameAddress = pConvertedFrame->data_0;
    
                    var imageBufferPtr = new IntPtr(convertedFrameAddress);
    
                    using (var bitmap = new Bitmap(width, height, pConvertedFrame->linesize[0], PixelFormat.Format24bppRgb, imageBufferPtr))
                    {
                        bitmap.Save(@"frame.buffer.jpg", ImageFormat.Jpeg);
                    }
                }
                frameNumber++;
            }
    
            FFmpegInvoke.av_free(pConvertedFrame);
            FFmpegInvoke.av_free(pConvertedFrameBuffer);
            FFmpegInvoke.sws_freeContext(pConvertContext);
    
            FFmpegInvoke.av_free(pDecodedFrame);
            FFmpegInvoke.avcodec_close(pCodecContext);
            FFmpegInvoke.avformat_close_input(&pFormatContext);
    

    How can I use hardware acceleration (hwaccel) to decode the stream?
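
    For illustration, the underlying C API offers a hardware decoding path via the hw_device_ctx mechanism. Note that this API was added well after FFmpeg 2.3.3 (the version the FFmpegInvoke bindings above target), so the sketch below shows the general approach in plain C rather than a drop-in change to the C# code; AV_HWDEVICE_TYPE_DXVA2 is just one example device type for Windows.

        #include <libavcodec/avcodec.h>
        #include <libavutil/hwcontext.h>

        static enum AVPixelFormat hw_pix_fmt;

        /* libavcodec calls this to choose an output format; returning the
         * hardware pixel format tells the decoder to produce GPU surfaces. */
        static enum AVPixelFormat get_hw_format(AVCodecContext *ctx,
                                                const enum AVPixelFormat *fmts)
        {
            for (const enum AVPixelFormat *p = fmts; *p != AV_PIX_FMT_NONE; p++)
                if (*p == hw_pix_fmt)
                    return *p;
            return AV_PIX_FMT_NONE;               /* hwaccel unavailable: fail */
        }

        /* Attach a hardware device (e.g. AV_HWDEVICE_TYPE_DXVA2 on Windows)
         * to the decoder context; call this before avcodec_open2(). */
        int enable_hwaccel(AVCodecContext *dec_ctx, const AVCodec *decoder,
                           enum AVHWDeviceType type)
        {
            AVBufferRef *hw_device_ctx = NULL;

            /* Find the pixel format this decoder uses with the chosen device. */
            for (int i = 0;; i++) {
                const AVCodecHWConfig *cfg = avcodec_get_hw_config(decoder, i);
                if (!cfg)
                    return -1;                    /* no hwaccel for this pair */
                if ((cfg->methods & AV_CODEC_HW_CONFIG_METHOD_HW_DEVICE_CTX) &&
                    cfg->device_type == type) {
                    hw_pix_fmt = cfg->pix_fmt;
                    break;
                }
            }

            /* Open the hardware device and hand it to the codec context. */
            if (av_hwdevice_ctx_create(&hw_device_ctx, type, NULL, NULL, 0) < 0)
                return -1;
            dec_ctx->hw_device_ctx = av_buffer_ref(hw_device_ctx);
            dec_ctx->get_format    = get_hw_format;
            av_buffer_unref(&hw_device_ctx);
            return 0;
        }

    Decoded frames then arrive in hw_pix_fmt (a GPU surface format); they can be copied back to system memory with av_hwframe_transfer_data(sw_frame, hw_frame, 0) before being passed to sws_scale() as in the code above.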