Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • RTSP streaming on Android client using FFMpeg

    10 August 2013, by rurtle

    I am working on a hobby project whose goal is to develop an Android application capable of streaming live feeds captured through webcams on a LAN, using FFmpeg as the underlying engine. So far, I have done the following:

    A. Compiling and generating FFMpeg related libraries for the following releases -

    FFMpeg version: 2.0
    NDK version: r8e & r9
    Android Platform version: android-16 & android-18
    Toolchain version: 4.6 & 4.8
    Platform built on: Fedora 18 (x86_64)

    B. Creating the files Android.mk & Application.mk in the appropriate paths.

    However, when it came to writing the native code for accessing the appropriate FFmpeg functionality from the application layer using Java, I got stuck on the following questions:

    a) Which of FFmpeg's features do I need to make available from the native layer to the app layer for streaming real-time feeds?
    b) In order to compile FFmpeg for Android, I followed this link. Are the compilation options sufficient for handling *.sdp streams, or do I need to modify them?
    c) Do I need to make use of live555?

    I am totally new to FFmpeg and Android application development, and this is going to be my first serious project for the Android platform. I have been searching for relevant tutorials dealing with RTSP streaming using FFmpeg for a while now, without much success. Moreover, I tried the latest development build of VLC player and found it great for streaming real-time feeds. However, it's a complex beast, and the goal of my project is quite limited in nature: mostly learning, in a short time span.

    Could you suggest some pointers (e.g. links, documents or sample code) on how I can write the native code to utilize the FFmpeg libraries, and subsequently use that functionality from the app layer for streaming real-time feeds? Moreover, I would really appreciate it if you could let me know what kind of background knowledge this project requires from a functional standpoint (in a language-agnostic sense).
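    For orientation (an editorial sketch, not part of the question): the demuxing the app layer needs is what libavformat's RTSP demuxer provides, and libavcodec handles the H.264 decoding, so if the ffmpeg binary built from the same libraries can pull the feed, those are the calls a JNI wrapper must expose. A minimal sanity check, with a hypothetical LAN camera URL and output name:

```shell
# Sanity check (URL is hypothetical -- substitute your camera's):
# if this pulls ten seconds of video, the native layer only needs to
# expose the same libavformat open/read path, plus libavcodec decoding,
# through JNI.
ffmpeg -rtsp_transport tcp \
  -i rtsp://192.168.1.50:554/stream1 \
  -c copy -t 10 probe.mp4
```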

  • Trouble generating an RTSP stream as output with ffmpeg from static images as input

    10 August 2013, by Ilya Yevlampiev

    I'm trying to start an RTSP stream by feeding ffmpeg with static images, and feeding ffserver with the ffmpeg output.

    The first problem comes from ffserver.conf:

    Port 12345
    RTSPPort 8544
    BindAddress 0.0.0.0
    MaxHTTPConnections 2000
    MaxClients 1000
    MaxBandwidth 1000
    CustomLog /var/log/ffserver-access.log

    <Feed videofeed.ffm>
    File /tmp/videofeed.ffm
    FileMaxSize 3M
    #Launch ffmpeg -s 640x480 -f video4linux2 -i /dev/video0
    #Launch ffmpeg http://localhost:8090/videofeed.ffm
    Launch ffmpeg -loop 1 -f image2 -r 20 -b 9600 -i Janalif.jpg -t 30 http://127.0.0.1:8090/videofeed.ffm -report
    ACL allow 127.0.0.1
    </Feed>

    <Stream test1-rtsp.mpg>
    Format rtsp
    #rtsp://localhost:5454/test1-rtsp.mpg
    Feed videofeed.ffm
    #webcam.ffm
    Format flv
    VideoCodec flv
    VideoFrameRate 30
    VideoBufferSize 80000
    VideoBitRate 200
    VideoQMin 1
    VideoQMax 5
    VideoSize 640x480
    PreRoll 1
    NoAudio
    </Stream>

    <Stream stat.html>
    Format status
    </Stream>

    Please ignore the codecs etc. in the stream part. The problem concerns RTSPPort: after starting the server, nmap shows no binding on 8544; only the HTTP ports are in use.

    8090/tcp  open  unknown
    12345/tcp open  netbus
    

    I can download the MPEG stream over HTTP from http://localhost:12345/test1-rtsp.mpg. How can I get port 8544 working?

    My other question is about the Launch part of the stream. Am I right that ffserver executes the content of the Launch line? If so, how can I configure ffserver to wait for the stream on some particular port, but start streaming at the moment I choose?

    P.S. The solution looks like Säkkijärven polkka; however, the idea behind this construct is to provide a controlled RTSP stream to emulate a camera output. In the future I plan to replace the ffmpeg command line with Java bindings for it, to produce program-controlled images as camera input for testing computer vision. That's why I need a way to launch ffmpeg independently of ffserver.
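    One avenue worth checking (an editorial assumption based on ffserver's sample configuration, not anything stated in the question): ffserver announces over RTSP only streams declared with `Format rtp`, so an additional stream block along these lines, beside the existing FLV one, may be what makes RTSPPort 8544 bind. The stream name below is illustrative:

```
<Stream test1.sdp>
Format rtp
Feed videofeed.ffm
VideoFrameRate 30
VideoSize 640x480
NoAudio
</Stream>
```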

  • ffmpeg live RTMP stream does not start processing for a long time

    10 August 2013, by user1492502

    I have an RTMP stream created by a Flash player, in H.264. When I convert it to a video or a thumbnail using ffmpeg, it sometimes works after a very, very long time, and sometimes does not work at all. But if I create a stream with Flash Media Live Encoder on the same FMS server, the command below works fine. At the same time, if I try the stream in a player, it plays well.

    I am using an IP address, so I don't think a DNS resolution issue is possible either.

    ffmpeg -i rtmp://xxx.xxx.xx.xx/live/bdeef2c065509361e78fa8cac90aac741cc5ee29 -r 1 -an -updatefirst 1 -y thumbnail.jpg

    The following is from when it worked after 15-20 minutes:
    
    ffmpeg -i "rtmp://xxx.xxx.xx.xx/live/bdeef2c065509361e78fa8cac90aac741cc5ee29 live=1" -r 1 -an -updatefirst 1 -y thumb.jpg
    [root@test ~]# ffmpeg -i rtmp://38.125.41.20/live/bdeef2c065509361e78fa8cac90aac741cc5ee29 -r 1 -an -updatefirst 1 -y thumbnail.jpg
    ffmpeg version N-49953-g7d0e3b1-syslint Copyright (c) 2000-2013 the FFmpeg developers
      built on Feb 14 2013 15:29:40 with gcc 4.4.6 (GCC) 20120305 (Red Hat 4.4.6-4)
      configuration: --prefix=/usr/local/cpffmpeg --enable-shared --enable-nonfree --enable-gpl --enable-pthreads --enable-libopencore-amrnb --enable-decoder=liba52 --enable-libopencore-amrwb --enable-libfaac --enable-libmp3lame --enable-libtheora --enable-libvorbis --enable-libx264 --enable-libxvid --extra-cflags=-I/usr/local/cpffmpeg/include/ --extra-ldflags=-L/usr/local/cpffmpeg/lib --enable-version3 --extra-version=syslint
      libavutil      52. 17.101 / 52. 17.101
      libavcodec     54. 91.103 / 54. 91.103
      libavformat    54. 63.100 / 54. 63.100
      libavdevice    54.  3.103 / 54.  3.103
      libavfilter     3. 37.101 /  3. 37.101
      libswscale      2.  2.100 /  2.  2.100
      libswresample   0. 17.102 /  0. 17.102
      libpostproc    52.  2.100 / 52.  2.100
    [flv @ 0x14c0100] Stream #1: not enough frames to estimate rate; consider increasing probesize
    [flv @ 0x14c0100] Could not find codec parameters for stream 1 (Audio: none, 0 channels): unspecified sample format
    Consider increasing the value for the 'analyzeduration' and 'probesize' options
    [flv @ 0x14c0100] Estimating duration from bitrate, this may be inaccurate
    Input #0, flv, from 'rtmp://xxx.xxx.xx.xx/bdeef2c065509361e78fa8cac90aac741cc5ee29':
      Metadata:
        keyFrameInterval: 15
        quality         : 90
        level           : 3.1
        bandwith        : 0
        codec           : H264Avc
        fps             : 15
        profile         : baseline
      Duration: N/A, start: 0.000000, bitrate: N/A
        Stream #0:0: Video: h264 (Baseline), yuv420p, 640x480 [SAR 1:1 DAR 4:3], 15 tbr, 1k tbn, 30 tbc
        Stream #0:1: Audio: none, 0 channels
    Output #0, image2, to 'thumbnail.jpg':
      Metadata:
        keyFrameInterval: 15
        quality         : 90
        level           : 3.1
        bandwith        : 0
        codec           : H264Avc
        fps             : 15
        profile         : baseline
        encoder         : Lavf54.63.100
        Stream #0:0: Video: mjpeg, yuvj420p, 640x480 [SAR 1:1 DAR 4:3], q=2-31, 200 kb/s, 90k tbn, 1 tbc
    Stream mapping:
      Stream #0:0 -> #0:0 (h264 -> mjpeg)
    Press [q] to stop, [?] for help
    frame= 2723 fps=1.3 q=1.6 size=N/A time=00:45:23.00 bitrate=N/A dup=8 drop=12044
    

    On stopping the stream, by closing the browser running the Flash player that is publishing the video, I get the following:

    [flv @ 0x23684e0] Could not find codec parameters for stream 1 (Audio: none, 0 channels): unspecified sample format
    Consider increasing the value for the 'analyzeduration' and 'probesize' options
    [flv @ 0x23684e0] Estimating duration from bitrate, this may be inaccurate
    Input #0, flv, from 'rtmp://xxx.xxx.xx.xx/live/bdeef2c065509361e78fa8cac90aac741cc5ee29':
      Metadata:
        keyFrameInterval: 15
        quality         : 90
        bandwith        : 0
        level           : 3.1
        codec           : H264Avc
        fps             : 15
        profile         : baseline
      Duration: N/A, start: 0.000000, bitrate: N/A
        Stream #0:0: Video: h264 (Baseline), yuv420p, 640x480 [SAR 1:1 DAR 4:3], 15 tbr, 1k tbn, 30 tbc
        Stream #0:1: Audio: none, 0 channels
    

    Whereas if I stop the stream, it quickly creates a thumbnail file; it's the running stream that is the issue.
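    For what it's worth, the log above points at a direction itself: ffmpeg stalls while probing the stream, because the declared audio stream never delivers frames. A hedged sketch that combines the `live=1` librtmp option already tried in the question with explicit probe limits (the values are guesses to tune; smaller values make probing give up sooner on the frameless audio stream):

```shell
# Cap the probing phase with the 'probesize'/'analyzeduration' input
# options the log itself mentions, and keep live=1 for the RTMP source.
ffmpeg -probesize 100000 -analyzeduration 1000000 \
  -i "rtmp://xxx.xxx.xx.xx/live/bdeef2c065509361e78fa8cac90aac741cc5ee29 live=1" \
  -an -r 1 -updatefirst 1 -y thumbnail.jpg
```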

  • convert yuv to mp4 by ffmpeg on android

    10 August 2013, by worldask

    I have to convert YUV to MP4 with ffmpeg on Android. When I convert WAV to MP4 it works well, but when I convert YUV, or YUV + WAV, to MP4, I get an error message saying

    Error decoding AAC frame header
    

    Does anybody know what happened?

    Following is the full debug log:

    transferYUV2MP4() enter
    __transfer_yuv_to_mp4() enter
    __transfer_yuv_to_mp4() argv[00/17] = ffmpeg
    __transfer_yuv_to_mp4() argv[01/17] = -loglevel
    __transfer_yuv_to_mp4() argv[02/17] = debug
    __transfer_yuv_to_mp4() argv[03/17] = -y
    __transfer_yuv_to_mp4() argv[04/17] = -i
    __transfer_yuv_to_mp4() argv[05/17] = /sdcard/111.yuv
    __transfer_yuv_to_mp4() argv[06/17] = -i
    __transfer_yuv_to_mp4() argv[07/17] = /sdcard/3.wav
    __transfer_yuv_to_mp4() argv[08/17] = -c:a
    __transfer_yuv_to_mp4() argv[09/17] = aac
    __transfer_yuv_to_mp4() argv[10/17] = -strict
    __transfer_yuv_to_mp4() argv[11/17] = experimental
    __transfer_yuv_to_mp4() argv[12/17] = -b:a
    __transfer_yuv_to_mp4() argv[13/17] = 56k
    __transfer_yuv_to_mp4() argv[14/17] = -preset
    __transfer_yuv_to_mp4() argv[15/17] = ultrafast
    __transfer_yuv_to_mp4() argv[16/17] = /sdcard/111.mp4
    __run_ffmpeg_main() enter
    __run_ffmpeg_main() handle=0xb000f7f8
    __run_ffmpeg_main() dlfunc=0x4b5a2728
    ffmpeg version 1.2.2
     Copyright (c) 2000-2013 the FFmpeg developers
      built on Aug 10 2013 16:34:45 with gcc 4.6 (GCC) 20120106 (prerelease)
      configuration: --target-os=linux --prefix=./android/armv7-a --sysroot=/Users/pht/android/ndks/android-ndk-r9/platforms/android-8/arch-arm/ --enable-gpl --enable-version3 --disable-shared --enable-static --disable-ffprobe --disable-ffplay --disable-ffserver --disable-network --enable-avformat --enable-avcodec --enable-cross-compile --arch=arm --cc=/Users/pht/android-standalone-toolchain/bin/arm-linux-androideabi-gcc --nm=/Users/pht/android-standalone-toolchain/bin/arm-linux-androideabi-nm --cross-prefix=/Users/pht/android-standalone-toolchain/bin/arm-linux-androideabi- --extra-cflags=' -I../fdk-aac/include -I../x264 -O3 -fpic -DANDROID -DHAVE_SYS_UIO_H=1 -Dipv6mr_interface=ipv6mr_ifindex -fasm -Wno-psabi -fno-short-enums -fno-strict-aliasing -finline-limit=300 -mfloat-abi=softfp -mfpu=vfpv3-d16 -marm -march=armv7-a ' --extra-ldflags=' -L../fdk-aac/lib -L../x264 -Wl,-rpath-link=/Users/pht/android/ndks/android-ndk-r9/platforms/android-8/arch-arm//usr/lib -L/Users/pht/android/ndks/android-ndk-r9/platforms/andr
      libavutil      52. 18.100 / 52. 18.100
      libavcodec     54. 92.100 / 54. 92.100
      libavformat    54. 63.104 / 54. 63.104
      libavdevice    54.  3.103 / 54.  3.103
      libavfilter     3. 42.103 /  3. 42.103
      libswscale      2.  2.100 /  2.  2.100
      libswresample   0. 17.102 /  0. 17.102
      libpostproc    52.  2.100 / 52.  2.100
    Splitting the commandline.
    Reading option '-loglevel' ...
     matched as option 'loglevel' (set libav* logging level) with argument 'debug'.
    Reading option '-y' ...
     matched as option 'y' (overwrite output files) with argument '1'.
    Reading option '-i' ...
     matched as input file with argument '/sdcard/111.yuv'.
    Reading option '-i' ...
     matched as input file with argument '/sdcard/3.wav'.
    Reading option '-c:a' ...
     matched as option 'c' (codec name) with argument 'aac'.
    Reading option '-strict' ...
     matched as AVOption 'strict' with argument 'experimental'.
    Reading option '-b:a' ...
     matched as option 'b' (video bitrate (please use -b:v)) with argument '56k'.
    Reading option '-preset' ...
     matched as AVOption 'preset' with argument 'ultrafast'.
    Reading option '/sdcard/111.mp4' ...
     matched as output file.
    Finished splitting the commandline.
    Parsing a group of options: global .
    Applying option loglevel (set libav* logging level) with argument debug.
    Applying option y (overwrite output files) with argument 1.
    Successfully parsed a group of options.
    Parsing a group of options: input file /sdcard/111.yuv.
    Successfully parsed a group of options.
    Opening an input file: /sdcard/111.yuv.
    Format aac detected only with low score of 1, misdetection possible!
    File position before avformat_find_stream_info() is 0
    get_buffer() failed
    Error decoding AAC frame header.
    channel element 2.12 is not allocated
    More than one AAC RDB per ADTS frame is not implemented. Update your FFmpeg version to the newest one from Git. If the problem still occurs, it means that your file has a feature which has not been implemented.
    channel element 3.4 is not allocated
    channel element 2.2 is not allocated
    Number of scalefactor bands in group (44) exceeds limit (40).
    channel element 2.10 is not allocated
    channel element 1.15 is not allocated
    channel element 3.6 is not allocated
    channel element 2.0 is not allocated
    channel element 3.3 is not allocated
    Sample rate index in program config element does not match the sample rate index configured by the container.
    channel element 2.8 is not allocated
    Sample rate index in program config element does not match the sample rate index configured by the container.
    channel element 3.2 is not allocated
    Reserved bit set.
    channel element 2.6 is not allocated
    channel element 2.1 is not allocated
    Dependent coupling is not supported together with LTP
    Dependent coupling is not supported together with LTP
    Dependent coupling is not supported together with LTP
    Dependent coupling is not supported together with LTP
    Dependent coupling is not supported together with LTP
    

    and the "Dependent coupling..." line repeats thousands of times
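    A hedged reading of the log, not a confirmed diagnosis: the line `Format aac detected only with low score of 1, misdetection possible!` shows ffmpeg guessing at the format of `/sdcard/111.yuv`, because raw YUV has no container header to probe, and the "AAC frame header" errors follow from that misdetection. Raw video input needs its format, pixel format, frame size and frame rate stated explicitly before `-i`; the 640x480 / 25 fps values below are placeholders for the real capture parameters:

```shell
# Same command as the argv dump above, but with the raw YUV input
# described explicitly so ffmpeg does not have to guess its format.
ffmpeg -y \
  -f rawvideo -pix_fmt yuv420p -s 640x480 -r 25 -i /sdcard/111.yuv \
  -i /sdcard/3.wav \
  -c:a aac -strict experimental -b:a 56k \
  -preset ultrafast \
  /sdcard/111.mp4
```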

  • FFMPEG to remote server: Live streaming with segments [migrated]

    10 August 2013, by Brianjs

    I have looked around and found many good articles on how to use ffmpeg to segment live video for HLS streaming. However, I need to be able to use an encoder at a remote location (which is receiving live video), and then somehow send the resulting m3u8 playlist and ts segment files to a web server in a different location, in real time.

    So: REMOTE COMPUTER(camera->ffmpeg->segmenter) -> WEBSERVER(receives files -> users connect for "live" stream)

    My question is: Has anyone seen something similar to this? Or is there a setting on ffmpeg/ffserver that will let me do this?
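    Not an authoritative answer, but one common shape for this setup (all hosts and paths below are hypothetical): let ffmpeg's hls muxer do the segmenting on the remote machine, and push the playlist and segments to the web server as they are rewritten:

```shell
# Segment the incoming feed locally with the hls muxer...
ffmpeg -i rtsp://camera.local/stream \
  -c:v libx264 -preset veryfast -an \
  -f hls -hls_time 10 -hls_list_size 6 /tmp/hls/live.m3u8 &

# ...and naively sync the playlist and .ts segments to the web server;
# rsync only transfers files that changed since the last pass.
while true; do
  rsync -a /tmp/hls/ user@webserver:/var/www/hls/
  sleep 2
done
```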