Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • How to avoid a growing delay with ffmpeg between sound and raw video data?

    13 November 2011, by roland

    Hello everybody, and thank you for reading!

    Here is my problem: I have a program piping raw video frames to the standard output. The program uses OpenCV to capture and process the video and writes the processed frames straight to stdout, with the loop synced to the frame rate I chose. I'm using ffmpeg to read from the standard input, and everything works fine for the video alone. But now that I have added sound I have a big problem: a growing delay keeps occurring, and I really need to get rid of it. So here is my idea, but I really need your help:

    I have to find a way to include timestamp information in the raw video. To be understandable by ffmpeg, it needs to be a known container that is compatible with raw video, and then I will need to use the container API in my program and pipe the result to the standard output. I really do not know what to pick in the jungle of video formats and codecs, and I don't even know how to enable timestamp synchronization in ffmpeg... (A rough sketch of this muxing idea is appended at the end of this question.)

    If anyone has an idea, I am really interested. For information, here is the command line I use to pipe the raw video:

    ./myprogram | ffmpeg -y -f alsa -i pulse -ac 2  -f rawvideo -vcodec rawvideo -r 24 -s 640x480 -pix_fmt bgr24 -i - -vcodec libx264 -pix_fmt yuv420p -r 24 -f flv -ar 44100 out.flv;
    

    Thank you very much,

    Roland
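
    A sketch of the idea above (not the asker's code), assuming a reasonably recent libavformat (codecpar and avformat_alloc_output_context2 postdate 2011-era FFmpeg): mux each BGR24 frame into a NUT container written to stdout, stamping every packet with a wall-clock PTS, so ffmpeg reads real timestamps instead of assuming a perfectly constant frame rate. NUT is FFmpeg-native, carries per-packet timestamps and accepts raw video. Error checks are omitted for brevity.

        #include <libavformat/avformat.h>
        #include <libavutil/mathematics.h>
        #include <libavutil/time.h>

        int main(void)
        {
            const int w = 640, h = 480, fps = 24;
            AVFormatContext *oc = NULL;

            /* NUT muxer writing to standard output ("pipe:1") */
            avformat_alloc_output_context2(&oc, NULL, "nut", "pipe:1");

            AVStream *st = avformat_new_stream(oc, NULL);
            st->time_base            = (AVRational){1, 1000};   /* millisecond timestamps */
            st->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
            st->codecpar->codec_id   = AV_CODEC_ID_RAWVIDEO;
            st->codecpar->format     = AV_PIX_FMT_BGR24;
            st->codecpar->width      = w;
            st->codecpar->height     = h;

            avio_open(&oc->pb, "pipe:1", AVIO_FLAG_WRITE);
            avformat_write_header(oc, NULL);

            uint8_t *bgr = av_malloc(w * h * 3);
            int64_t start = av_gettime();                        /* microseconds */

            for (int frame = 0; frame < fps * 10; frame++) {     /* ~10 s as an example */
                /* ...copy the processed OpenCV frame into 'bgr' here... */

                AVPacket pkt;
                av_init_packet(&pkt);                            /* 2011-era init; deprecated today */
                pkt.data         = bgr;
                pkt.size         = w * h * 3;
                pkt.stream_index = st->index;
                /* wall-clock elapsed time, rescaled to the stream time base,
                   so audio/video stay aligned even if this loop runs slow */
                pkt.pts = pkt.dts = av_rescale_q(av_gettime() - start,
                                                 (AVRational){1, 1000000},
                                                 st->time_base);
                av_write_frame(oc, &pkt);
            }

            av_write_trailer(oc);
            avio_closep(&oc->pb);
            avformat_free_context(oc);
            av_free(bgr);
            return 0;
        }

    On the ffmpeg side, the video input options would then change from "-f rawvideo -vcodec rawvideo -r 24 -s 640x480 -pix_fmt bgr24 -i -" to simply "-f nut -i -", and ffmpeg can use the stream's own timestamps when it muxes the piped video with the ALSA audio.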

  • Get single buffer from AVFrame data and display it on Android Bitmap/Surface/SurfaceView

    13 November 2011, by Ivan

    I have an AVFrame decoded by avcodec_decode_video2 (FFmpeg), which is then passed to the SWS library and converted from YUV420P to RGB565. How do I combine all of the color and linesize information, i.e. frame->data[0..3] and frame->linesize[0..3], into one buffer, and how do I then display it on an Android device, say using an Android Bitmap or a SurfaceView/View? I don't want to use SurfaceFlinger because it is not an official part of the NDK and is subject to change with every minor release. (One way to do this is sketched below.)
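
    A sketch (not from the question) of one way to do this, assuming the Bitmap was created with Config.RGB_565 on the Java side: let swscale write directly into the Bitmap's pixel buffer through the NDK jnigraphics API (android/bitmap.h, link with -ljnigraphics), so there is no intermediate buffer to assemble by hand. Pixel-format names follow current FFmpeg; 2011-era headers spell them PIX_FMT_*.

        #include <libavcodec/avcodec.h>
        #include <libswscale/swscale.h>
        #include <android/bitmap.h>
        #include <jni.h>

        /* Convert one decoded YUV420P frame straight into an RGB565 Android Bitmap. */
        static void render_frame_to_bitmap(JNIEnv *env, jobject bitmap,
                                           AVFrame *frame, int src_w, int src_h)
        {
            AndroidBitmapInfo info;
            void *pixels;
            AndroidBitmap_getInfo(env, bitmap, &info);      /* expect RGB_565 */
            AndroidBitmap_lockPixels(env, bitmap, &pixels);

            /* Source: planar YUV420P (frame->data[0..2] + frame->linesize[0..2]).
             * Destination: a single packed RGB565 plane, i.e. the bitmap itself. */
            struct SwsContext *sws = sws_getContext(src_w, src_h, AV_PIX_FMT_YUV420P,
                                                    info.width, info.height,
                                                    AV_PIX_FMT_RGB565LE,
                                                    SWS_BILINEAR, NULL, NULL, NULL);

            uint8_t *dst_data[4]   = { pixels, NULL, NULL, NULL };
            int      dst_stride[4] = { (int)info.stride, 0, 0, 0 };
            sws_scale(sws, (const uint8_t * const *)frame->data, frame->linesize,
                      0, src_h, dst_data, dst_stride);

            sws_freeContext(sws);
            AndroidBitmap_unlockPixels(env, bitmap);
        }

    The filled Bitmap can then be drawn from Java with Canvas.drawBitmap, either onto a SurfaceView (via lockCanvas/unlockCanvasAndPost) or in a View's onDraw, which stays entirely within public SDK/NDK APIs.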

  • How to capture the current screen with FFMPEG and broadcast it on a webpage? [closed]

    12 November 2011, by user893856

    I'm asking because I have no idea how to approach it. I know there is even a protocol for it (RTSP). I know PHP and socket programming, so that won't be an obstacle. (A capture-side sketch is appended below.)
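
    For the capture half of the question only, a hedged sketch: on Linux, FFmpeg's x11grab input device reads the screen like any other demuxer, and the captured frames can then be encoded and pushed to whatever streaming setup (for example an FLV/RTMP ingest) the web page plays from. The display string and option values below are illustrative, and the API names follow current FFmpeg; error handling is trimmed.

        #include <libavdevice/avdevice.h>
        #include <libavformat/avformat.h>
        #include <libavutil/dict.h>

        int main(void)
        {
            AVFormatContext *ic = NULL;
            AVDictionary *opts = NULL;

            avdevice_register_all();                        /* registers x11grab */

            av_dict_set(&opts, "video_size", "1024x768", 0);
            av_dict_set(&opts, "framerate", "25", 0);

            /* ":0.0" is the X11 display to grab; the device behaves like a demuxer. */
            const AVInputFormat *grab = av_find_input_format("x11grab");
            if (avformat_open_input(&ic, ":0.0", grab, &opts) < 0)
                return 1;

            AVPacket *pkt = av_packet_alloc();
            while (av_read_frame(ic, pkt) >= 0) {
                /* ...encode the raw screen frames and send them to the
                   streaming server the web page plays from... */
                av_packet_unref(pkt);
            }

            av_packet_free(&pkt);
            avformat_close_input(&ic);
            return 0;
        }

    If no custom processing is needed, the same capture is available from the command line, along the lines of: ffmpeg -f x11grab -s 1024x768 -r 25 -i :0.0 ... , which may be the shorter path.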

  • Decoding a Stream using FFMpeg

    12 November 2011, by Ali

    Is there any way to decode an audio stream directly with ffmpeg, instead of storing it in a buffer first and then decoding it? I heard that libavcodec/tcp.h is for that purpose, but I am not able to find that file in the ffmpeg source. (A sketch of opening a stream by URL is appended below.)
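
    A sketch of the usual way to do this, not tied to tcp.h: libavformat's protocol layer (http, tcp, rtmp, mmsh, ...) already handles the network I/O, so the stream can be opened by URL and decoded packet by packet without staging it in a buffer of your own. The URL is a placeholder, and the send/receive calls are from the current decode API; the 2011-era equivalent was avcodec_decode_audio3. Error checks are omitted.

        #include <libavformat/avformat.h>
        #include <libavcodec/avcodec.h>

        int main(void)
        {
            const char *url = "http://example.com/stream.mp3";   /* placeholder */
            AVFormatContext *fmt = NULL;
            const AVCodec *dec = NULL;

            avformat_network_init();
            if (avformat_open_input(&fmt, url, NULL, NULL) < 0)  /* network I/O handled here */
                return 1;
            avformat_find_stream_info(fmt, NULL);

            int aidx = av_find_best_stream(fmt, AVMEDIA_TYPE_AUDIO, -1, -1, &dec, 0);
            AVCodecContext *ctx = avcodec_alloc_context3(dec);
            avcodec_parameters_to_context(ctx, fmt->streams[aidx]->codecpar);
            avcodec_open2(ctx, dec, NULL);

            AVPacket *pkt = av_packet_alloc();
            AVFrame  *frm = av_frame_alloc();
            while (av_read_frame(fmt, pkt) >= 0) {               /* reads straight off the socket */
                if (pkt->stream_index == aidx) {
                    avcodec_send_packet(ctx, pkt);
                    while (avcodec_receive_frame(ctx, frm) == 0) {
                        /* ...use the decoded samples in frm->data / frm->nb_samples... */
                    }
                }
                av_packet_unref(pkt);
            }

            av_frame_free(&frm);
            av_packet_free(&pkt);
            avcodec_free_context(&ctx);
            avformat_close_input(&fmt);
            return 0;
        }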

  • Libmms, iOS, and ASF encode to MP3 using FFmpeg library

    11 November 2011, by zomerc

    I was wondering if someone could point me in the right direction on using the FFmpeg library to convert an ASF stream to MP3 format so that it can be played on iOS. I have been unsuccessful in my attempts to do this. Any help would be appreciated.

    I am using libmms to open and read the stream on iOS. (A sketch of wiring libmms data into FFmpeg's I/O is appended below.)

    Thanks,
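
    A sketch of one way to wire the two libraries together, using libavformat's generic custom-I/O mechanism rather than anything stated in the question: wrap the libmms read call in an AVIOContext so FFmpeg can probe and demux the ASF data that libmms delivers; decoding the audio and converting or playing it is then an ordinary libavcodec step. The libmms call signatures (mmsx_connect, mmsx_read) are quoted from memory and should be checked against the libmms headers; error handling is trimmed.

        #include <libavformat/avformat.h>
        #include <libmms/mmsx.h>

        /* Read callback: libavformat pulls data, we hand it bytes from libmms. */
        static int mms_read_cb(void *opaque, uint8_t *buf, int buf_size)
        {
            mmsx_t *mms = opaque;
            int n = mmsx_read(NULL, mms, (char *)buf, buf_size);
            return n > 0 ? n : AVERROR_EOF;
        }

        /* Open an mms:// URL through libmms and return a demuxer that is
           ready for av_read_frame(); returns NULL on failure. */
        static AVFormatContext *open_mms_stream(const char *url)
        {
            mmsx_t *mms = mmsx_connect(NULL, NULL, url, 128 * 1024);
            if (!mms)
                return NULL;

            unsigned char *iobuf = av_malloc(32 * 1024);
            AVIOContext *avio = avio_alloc_context(iobuf, 32 * 1024, 0 /* read-only */,
                                                   mms, mms_read_cb, NULL, NULL);

            AVFormatContext *fmt = avformat_alloc_context();
            fmt->pb = avio;                     /* use our I/O instead of a file/URL */
            if (avformat_open_input(&fmt, url, NULL, NULL) < 0)  /* probing finds ASF */
                return NULL;
            return fmt;
        }

    From there the audio stream can be decoded with libavcodec and either re-encoded to MP3 (for example via libmp3lame, if FFmpeg was built with it) or, often simpler on iOS, handed to Core Audio as PCM, which skips the MP3 step entirely.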