Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • How to remove/reduce pixelation of mjpeg stream in FFMPEG

    10 January 2017, by Luis Ruiz

    I'm working on a server-side re-streaming service for some IP cameras. I'm using ffserver on Linux to serve the streams and ffmpeg on Windows to feed it. I'm getting the cameras' video (H.264) over RTSP.

    My ffserver config example is as follows:

    <Feed test.ffm>
    File ./test.ffm
    </Feed>

    <Stream test.mjpg>
    Feed test.ffm
    Format mpjpeg
    VideoFrameRate 3
    VideoSize 704x480
    NoAudio
    FileMaxSize 100k
    VideoHighQuality
    </Stream>

    and this is how ffmpeg feeds it:

    ffmpeg -rtsp_transport tcp -i "rtsp://admin:admin@192.168.1.12:554/cam/realmonitor?channel=1&subtype=0" -vcodec copy -acodec copy "http://192.168.1.101:8090/test.ffm"
    

    The resulting video is very pixelated and differs noticeably from the camera's real image.

    The camera's configuration is as follows:

    Resolution: D1 (704*480)
    FrameRate: 3
    BitRate: 256k
    BitRateType: Variable
    

    Is there anything I'm missing or doing wrong?

    Thanks in advance for any help
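A likely culprit is the Stream block's rate control: `FileMaxSize 100k` combined with ffserver's default MJPEG bitrate forces heavy compression. As a sketch, the bitrate and quantizer range could be pinned explicitly inside the `<Stream>` block (the values below are illustrative assumptions, not tuned settings):

```
VideoBitRate 1024   # target bitrate in kbit/s
VideoQMin 1         # best quantizer (highest quality)
VideoQMax 10        # worst quantizer allowed
```

A larger `FileMaxSize` may also be needed, since it caps the feed buffer.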

  • FFmpeg seeking to the successive key frame

    10 January 2017, by user1447257

    I can cut a video using ffmpeg, which results in a video that starts at the key frame before the timestamp ss:

    ffmpeg -ss 10 -i input.mp4 -t 2 output.mp4
    

    This cutting step is part of an image-processing pipeline that removes dirty parts from a video. Because seeking snaps to the preceding key frame, the above command also includes some dirty parts from the first chunk in the output.

    How can I tell ffmpeg to start the video on the successive key frame after ss? Bonus question: How can I tell ffmpeg to end the video on the key frame before my dirty parts in the input start again?

    Thanks in advance!
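Snapping the cut to key frames can be sketched in Python: first list the key-frame timestamps (e.g. via `ffprobe -select_streams v -skip_frame nokey -show_entries frame=pts_time`), then pick the first key frame at or after ss as the start, and the last key frame before the dirty region as the end. The timestamps below are made up for illustration:

```python
import bisect

def clean_cut(keyframes, ss, dirty_start):
    """Return (start, end) snapped to key frames: the first key frame
    at or after ss, and the last key frame strictly before dirty_start."""
    keyframes = sorted(keyframes)
    i = bisect.bisect_left(keyframes, ss)               # first key frame >= ss
    j = bisect.bisect_left(keyframes, dirty_start) - 1  # last key frame < dirty_start
    if i >= len(keyframes) or j < i:
        raise ValueError("no clean key-frame range in this interval")
    return keyframes[i], keyframes[j]

# Hypothetical key-frame timestamps in seconds
kf = [0.0, 4.0, 8.0, 12.0, 16.0, 20.0]
start, end = clean_cut(kf, ss=10, dirty_start=19)
print(start, end)  # 12.0 16.0
```

The resulting pair can then be fed back to ffmpeg, e.g. `ffmpeg -ss 12 -i input.mp4 -t 4 -c copy output.mp4`, which with `-c copy` begins exactly on the chosen key frame.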

  • C++: Extracting an H.264 Subsequence from a Byte Stream

    10 January 2017, by Simon

    I have a raw h.264 byte stream coming from an RTSP network camera. In order to get the byte stream, I catch the piped output from ffmpeg using popen():

    ffmpeg -i rtsp://address -c:v copy -an -f h264 pipe:1
    

    At some point in time, I would like to start recording from the stream for a while (and save everything to an mp4 file). I want to achieve this without decoding the stream to an intermediate format (e.g., yuv420p) and encoding it back. As a first test, I just started writing the output buffer to disk after a couple of seconds. Then, I can encode the video again using

    ffmpeg -i cam.h264 -c:v h264 -an -f mp4 cam_out.mp4
    

    Here, ffmpeg complains that the first part of the data is corrupted (it still seems able to recover, as it simply throws away the corrupted parts). This of course makes sense, since I start recording without looking for the start of a frame, etc. Ideally, I would like to start and stop recording at the correct parts of the stream. I have taken a brief look at the H.264 format and its NAL units. Is there some simple way of detecting "good" positions in the stream to start recording?
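"Good" positions can be located by scanning the Annex-B byte stream for NAL start codes (00 00 01 or 00 00 00 01) and reading the nal_unit_type from the low five bits of the byte that follows: type 7 (SPS), 8 (PPS) and 5 (IDR slice) together mark a point a decoder can start from cleanly. A minimal sketch (synthetic bytes for illustration; real payloads are of course much longer):

```python
def find_nal_units(buf: bytes):
    """Yield (offset, nal_unit_type) for each Annex-B start code in buf."""
    i = 0
    while True:
        i = buf.find(b"\x00\x00\x01", i)
        if i == -1:
            return
        header = i + 3
        if header < len(buf):
            yield header, buf[header] & 0x1F  # low 5 bits = nal_unit_type
        i = header

def first_idr_offset(buf: bytes):
    """Offset of the NAL header byte of the first IDR slice (type 5)."""
    for off, ntype in find_nal_units(buf):
        if ntype == 5:
            return off
    return None

# Synthetic stream: SPS (0x67), PPS (0x68), non-IDR slice (0x41), IDR slice (0x65)
stream = (b"\x00\x00\x00\x01\x67AA" b"\x00\x00\x00\x01\x68B"
          b"\x00\x00\x01\x41CC" b"\x00\x00\x00\x01\x65DD")
print(first_idr_offset(stream))  # 23
```

Recording could then begin at the SPS immediately preceding the first IDR slice, so the decoder receives its parameter sets before the first picture.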

  • Cannot load JNI ffmpeg library in IntelliJ IDEA

    10 January 2017, by user3292775

    After a lot of trial and error I managed to make the whole program work, except for loading one library. The library is located in the JNI folder. Can someone help me with this error?

    02-11 21:38:32.940 26771-26771/com.example.untitled E/AndroidRuntime﹕ FATAL EXCEPTION: main
    Process: com.example.untitled, PID: 26771
    java.lang.UnsatisfiedLinkError: Couldn't load nativetester-jni from loader dalvik.system.PathClassLoader[DexPathList[[zip file "/data/app/com.example.untitled-2.apk"],nativeLibraryDirectories=[/data/app-lib/com.example.untitled-2, /vendor/lib, /system/lib]]]: findLibrary returned null
        at java.lang.Runtime.loadLibrary(Runtime.java:358)
        at java.lang.System.loadLibrary(System.java:526)
        at com.example.untitled.NativeTester.<init>(NativeTester.java:23)
        at com.example.untitled.FFmpegPlayer.<init>(FFmpegPlayer.java:184)
        at com.example.untitled.VideoActivity.onCreate(VideoActivity.java:134)
        at android.app.Activity.performCreate(Activity.java:5231)
        at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1087)
        at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2169)
        at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2265)
        at android.app.ActivityThread.access$800(ActivityThread.java:145)
        at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1206)
        at android.os.Handler.dispatchMessage(Handler.java:102)
        at android.os.Looper.loop(Looper.java:136)
        at android.app.ActivityThread.main(ActivityThread.java:5081)
        at java.lang.reflect.Method.invokeNative(Native Method)
        at java.lang.reflect.Method.invoke(Method.java:515)
        at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:781)
        at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:597)
        at dalvik.system.NativeStart.main(Native Method)

  • ffmpeg concatenate 3 videos with fade in/out

    10 January 2017, by Crikey Mikey

    I'm trying to join 3 videos together with a fade in/out effect.

    I can get this working for 2 videos (sourced from Stack Overflow, but I can't find the link):

    ffmpeg -y -i part1.mp4 -i part2.mp4 -f lavfi -i color=black:s=1920x1080 -filter_complex \
    "[0:v]format=pix_fmts=yuva420p,fade=t=out:st=10:d=1:alpha=1,setpts=PTS-STARTPTS[va0]; \
     [1:v]format=pix_fmts=yuva420p,fade=t=in:st=0:d=1:alpha=1,setpts=PTS-STARTPTS+10/TB[va1]; \
     [2:v]trim=duration=20[over]; \
     [over][va0]overlay[over1]; \
     [over1][va1]overlay=format=yuv420[outv]" \
    -vcodec libx264 -map [outv] merged.mp4
    

    But I can't work out how to make this work for 3 videos.

    I don't need any audio. Any ideas?

    Cheers,
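The two-input filter graph above generalizes: each clip fades out with alpha, the next clip fades in delayed by the accumulated offset, and everything is overlaid onto the black color source in turn. A Python sketch that builds the `-filter_complex` string for N clips (the clip durations and 1-second fade are assumptions, and the black source is taken to be the last input, as in the two-video command):

```python
def crossfade_filter(durations, fade=1.0):
    """Build an ffmpeg -filter_complex string that crossfades len(durations)
    clips (inputs 0..N-1) over a black color source (input N)."""
    n = len(durations)
    parts = []
    offset = 0.0
    for i, d in enumerate(durations):
        chain = f"[{i}:v]format=pix_fmts=yuva420p"
        if i < n - 1:  # fade out at the end of every clip but the last
            chain += f",fade=t=out:st={d - fade}:d={fade}:alpha=1"
        if i > 0:      # fade in at the start of every clip but the first
            chain += f",fade=t=in:st=0:d={fade}:alpha=1"
        chain += ",setpts=PTS-STARTPTS"
        if i > 0:      # shift each clip by the accumulated offset
            chain += f"+{offset}/TB"
        parts.append(chain + f"[va{i}]")
        offset += d - fade
    total = offset + fade
    parts.append(f"[{n}:v]trim=duration={total}[over0]")
    for i in range(n):  # stack the faded clips onto the black base
        out = "[outv]" if i == n - 1 else f"[over{i + 1}]"
        fmt = "=format=yuv420" if i == n - 1 else ""
        parts.append(f"[over{i}][va{i}]overlay{fmt}{out}")
    return "; ".join(parts)

print(crossfade_filter([11.0, 11.0, 11.0]))
```

With three 11-second clips, ffmpeg would then be invoked with four inputs (the three files plus `-f lavfi -i color=black:s=1920x1080`) and `-map [outv]`, as in the two-video command.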