Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • How to extract images from webm video using ffmpeg

    23 October 2013, by Angelo N.

    I need to extract a set of images from a webm video using ffmpeg (version 0.9.2). I currently use this script:

    ./ffmpeg -i video.webm -r 1 -f image2 image-%04d.jpg
    

    but I get this error:

    ffmpeg version 0.9.2, Copyright (c) 2000-2012 the FFmpeg developers
      built on Oct 23 2013 12:18:47 with gcc 4.6.3
      configuration: --extra-libs=-ldl --disable-shared --enable-static --disable-everything --disable-ffplay --disable-ffprobe --disable-ffserver --disable-avfilter --disable-avdevice --enable-libvpx --enable-protocol=file --enable-encoder=mjpeg --enable-encoder=libvpx --enable-decoder=libvpx --enable-decoder=mjpeg --enable-parser=mjpeg --enable-muxer=webm --enable-muxer=image2 --enable-muxer=mjpeg --enable-muxer=avi --enable-demuxer=image2 --enable-demuxer=mjpeg --enable-demuxer=avi
      libavutil    51. 32. 0 / 51. 32. 0
      libavcodec   53. 42. 4 / 53. 42. 4
      libavformat  53. 24. 2 / 53. 24. 2
      libswscale    2.  1. 0 /  2.  1. 0
    /home/username/Desktop/out.webm: Invalid data found when processing input
    

    I found that this build of ffmpeg doesn't have a webm demuxer; you can check that by typing

    ./configure --list-demuxers
    

    So I'm asking why ffmpeg does not support a webm demuxer, since webm (based on libvpx) is used, for example, by Google on YouTube.

    It sounds strange... doesn't it?
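
    For context, FFmpeg reads WebM files through its Matroska demuxer (the format shows up as "matroska,webm"), so there is no separate "webm" demuxer to enable; the configure line in the error output passes --disable-everything and never turns that demuxer back on. A rough sketch of a rebuild with it enabled, assuming the same source tree and the same options as the build shown above:

    # same options as the build above, plus the Matroska/WebM demuxer
    ./configure --extra-libs=-ldl --disable-shared --enable-static --disable-everything \
        --disable-ffplay --disable-ffprobe --disable-ffserver --disable-avfilter --disable-avdevice \
        --enable-libvpx --enable-protocol=file \
        --enable-encoder=mjpeg --enable-encoder=libvpx --enable-decoder=libvpx --enable-decoder=mjpeg \
        --enable-parser=mjpeg --enable-muxer=webm --enable-muxer=image2 --enable-muxer=mjpeg --enable-muxer=avi \
        --enable-demuxer=image2 --enable-demuxer=mjpeg --enable-demuxer=avi \
        --enable-demuxer=matroska
    make

    # check that the matroska,webm format is now listed
    ./ffmpeg -formats | grep -i matroska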

  • live stream with ffmpeg and IP camera

    23 October 2013, by hamilton.lima

    I'm reading a live stream from a DKSEG IP camera, but the results are not good at all ...

    Here is the command line used for this:

    ./ffmpeg -i "http://192.168.0.29/videostream.asf?user=admin&pwd=&resolution=64&rate=0" -vcodec libx264 -r 30 -s 640x480 -threads 2 -vprofile baseline -vpre zoom -strict experimental -acodec aac -ab 96000 -ar 48000 -ac 1  -f rtsp rtsp://xxx.xxx.xxx.xxx:1935/live/_definst_/c3
    

    The zoom.ffpreset file:

    coder=1
    level=31
    maxrate=1200000
    bufsize=200000
    wpredp=0
    g=60
    refs=1
    subq=3
    trellis=0
    bf=0
    rc_lookahead=0
    

    Even when I save to an mp4 file the results are not good, with some gaps in the video.

    Any thoughts? Suggestions?
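
    One thing worth trying (a sketch, not a confirmed fix, and it assumes the ffmpeg build is recent enough to expose the built-in x264 presets): replace the custom -vpre file with -preset and -tune zerolatency, which disables lookahead and B-frames much like the preset above, and may help if the gaps come from the encoder falling behind the live stream:

    ./ffmpeg -i "http://192.168.0.29/videostream.asf?user=admin&pwd=&resolution=64&rate=0" \
        -vcodec libx264 -preset veryfast -tune zerolatency -profile:v baseline \
        -r 30 -s 640x480 -threads 2 \
        -acodec aac -strict experimental -ab 96000 -ar 48000 -ac 1 \
        -f rtsp rtsp://xxx.xxx.xxx.xxx:1935/live/_definst_/c3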

  • Recording video and creation like Talking Tom on Android

    23 October 2013, by coder000001

    How can I record video (within my own app, not a screen-cast) like "Talking Tom" does on Android?

    Stitching video together from images with FFmpeg is too CPU-intensive and doesn't take advantage of hardware acceleration.

  • Android-How to pass back frames from FFmpeg back to Android

    23 October 2013, by yarin

    This is an architecture question; I am really interested in the answer.

    I am building an app with the following goals:

    1. Record video with effects in real time (using FFmpeg).

    2. Display the customized video in real time for the user while they are recording.

    So, after a month of work... I decided it was time to remember that goal number 2 is worth thinking about :) I have a working skeleton app that records video with effects in real time, but I have to preview these customized frames back to the user.

    My options (and this is my question):

    1. Each frame that passes from onPreviewFrame(byte[] video_frame_data, Camera camera) to ffmpeg through JNI for encoding would be sent back to Android through the same JNI after I apply the effects (i.e.: onPreviewFrame -> JNI to FFmpeg -> immediately apply the effect -> send the customized frame back to the Android side for display -> encode the customized frame).

    Advantages: it looks like the easiest option to use.

    Disadvantages: it uses JNI twice, and passing the frame back could consume time (I really don't know if that is a big price to pay, since it is only a byte array or int array per frame sent back to the Android side).

    2. I have heard about OpenGL in the NDK, but I think the surface itself is created on the Android side, so is it really going to be better? I would prefer to keep using the surface I am using now in Java.

    3. Create a video player with FFmpeg to preview each customized frame in real time.

    Thanks for your help; I hope the first solution is feasible and does not consume too much time in terms of real-time processing.

  • ffmpeg finding settings from encoded file

    23 October 2013, by Johan

    I am trying to encode 720p video files down to 480p. I found an example of this that has very good compression without losing much quality; even the audio is good. So I am interested in how it was done, so I can replicate it on my own video files.

    Here is the output of ffmpeg -i on the encoded file:

    Input #0, matroska,webm, from 'reencoded.mkv':
    Duration: 00:41:37.07, start: 0.000000, bitrate: 534 kb/s
    Stream #0:0: Audio: aac, 48000 Hz, stereo, fltp (default) (forced)
    Stream #0:1(eng): Video: h264 (High), yuv420p, 852x480, SAR 1:1 DAR 71:40, 23.98 fps, 23.98 tbr, 1k tbn, 47.95 tbc (default)
    

    Here is the output of ffmpeg -i on the source file:

    Input #0, matroska,webm, from 'source.mkv':
    Duration: 00:41:37.02, start: 0.000000, bitrate: 2920 kb/s
    Stream #0:0(eng): Video: h264 (High), yuv420p, 1280x720, SAR 1:1 DAR 16:9, 23.98 fps, 23.98 tbr, 1k tbn, 47.95 tbc
    Stream #0:1: Audio: ac3, 48000 Hz, 5.1(side), fltp, 384 kb/s (default)
    

    How do I encode a file with the same settings?


    This is what I have tried so far:

    ffmpeg -i source.mkv -vcodec libx264 -acodec libfaac -ar 48k -ac 2 -s 852x480 output.mkv
    

    My output file comes out much bigger (201 MB vs 159 MB) and with a higher bitrate:

    Input #0, matroska,webm, from 'output.mkv'
    Duration: 00:41:37.11, start: 0.000000, bitrate: 675 kb/s
    Stream #0:0(eng): Video: h264 (High), yuv420p, 852x480 [SAR 640:639 DAR 16:9], SAR 1:1 DAR 71:40, 23.98 fps, 23.98 tbr, 1k tbn, 47.95 tbc
    Stream #0:1: Audio: aac, 48000 Hz, stereo, fltp (default)
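
    A note on where the size difference likely comes from (a sketch, not a verified recipe): with no rate-control option given, ffmpeg's libx264 wrapper falls back to its default -crf 23, which here produced about 675 kb/s, while the reference file sits at 534 kb/s total. You can either raise the CRF until the size matches, or target the video bitrate directly with a two-pass encode; the 430k figure below is only an estimate (534 kb/s total minus roughly 100 kb/s of AAC audio):

    # constant-quality route: raise -crf (default 23) until size and quality match the reference
    ffmpeg -i source.mkv -vcodec libx264 -crf 26 -s 852x480 -acodec libfaac -ac 2 -ar 48k output.mkv

    # bitrate-targeted route: two-pass encode aiming near the reference's video bitrate
    ffmpeg -i source.mkv -vcodec libx264 -b:v 430k -pass 1 -s 852x480 -an -f null /dev/null
    ffmpeg -i source.mkv -vcodec libx264 -b:v 430k -pass 2 -s 852x480 -acodec libfaac -ac 2 -ar 48k output.mkv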