Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • How to find the presentation timestamp of a given frame number for FFmpeg decoding?

    21 August 2017, by Deepankar Arya

    I am using the C APIs of FFmpeg for some video processing. My aim is to extract a screenshot of a given frame number. I understand that FFmpeg has an av_seek_frame function to seek to a given timestamp (expressed in the appropriate time-base units of the video stream). I assume that I will have to go to the most recent I-frame before the given frame (using the AVSEEK_FLAG_BACKWARD flag) and read onwards until I reach the required frame. For that I need to give a seek timestamp to the av_seek_frame function. My main issue is: given a frame number, how do I find an associated presentation timestamp to seek to?
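
    A minimal sketch of the conversion being described, in C, assuming a constant-frame-rate stream (a variable-frame-rate file would need a packet-by-packet scan instead); fmt_ctx, stream_index and the seek_to_frame name are illustrative and assumed to come from the usual avformat_open_input / avformat_find_stream_info setup:

    #include <libavformat/avformat.h>
    #include <libavutil/mathematics.h>

    /* Convert a zero-based frame number into a timestamp in the stream's
     * time base and seek to the closest preceding keyframe. Decoding then
     * continues forward until the wanted frame is reached. */
    static int seek_to_frame(AVFormatContext *fmt_ctx, int stream_index,
                             int64_t frame_number)
    {
        AVStream *st = fmt_ctx->streams[stream_index];
        AVRational frame_rate = st->avg_frame_rate;

        if (frame_rate.num == 0 || frame_rate.den == 0)
            return AVERROR(EINVAL); /* unknown or variable frame rate */

        /* frame_number * (1 / fps) seconds, rescaled into time_base units. */
        int64_t target_pts = av_rescale_q(frame_number,
                                          av_inv_q(frame_rate),
                                          st->time_base);
        if (st->start_time != AV_NOPTS_VALUE)
            target_pts += st->start_time;

        /* AVSEEK_FLAG_BACKWARD lands on the I-frame at or before target_pts. */
        return av_seek_frame(fmt_ctx, stream_index, target_pts, AVSEEK_FLAG_BACKWARD);
    }

    The caller then decodes packets from that point and discards frames whose pts is still below target_pts.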

  • How to convert one image to a video using ffmpeg?

    21 August 2017, by Tutu

    I downloaded and installed ffmpeg on my Mac; however, I have a problem using it. I tried to use

    ffmpeg -loop 1 -i rho1.png -c:v libx264 -t 15 -pix_fmt yuv420p -vf scale=320:240 out.mp4
    

    However, the output is

    ld: warning: ignoring file video, file was built for unsupported file format ( 0x66 0x66 0x6D 0x70 0x65 0x67 0x20 0x2D 0x6C 0x6F 0x6F 0x70 0x20 0x31 0x20 0x2D ) which is not the architecture being linked (x86_64): video
    Undefined symbols for architecture x86_64:
    "_main", referenced from:
     implicit entry/start for main executable
    ld: symbol(s) not found for architecture x86_64
    clang: error: linker command failed with exit code 1 (use -v to see invocation)
    

    I need help creating a video from a single PNG picture.

  • Android stream to RTSP server and convert to RTMP using nginx

    21 August 2017, by TeKilla

    I am trying to get my Android device to stream its camera live to a web browser. I read a great tutorial at androidhive about using Wowza to do so; however, I am looking for a free solution. I decided to set up an nginx server with the RTMP module on my local machine, with access from outside the LAN. I am able to stream my screen using OBS without any problem.


    MY SETTINGS:

    In nginx, I'm using the following conf:

    rtmp {
        server {
            listen 1935;
            allow play all;
            chunk_size 4000;
    
            application live {
                live on;
                allow publish all;
                allow play all;
    
                exec_pull c:/nginx/ffmpeg -i "rtsp://127.0.0.1:1935/live/test" -f flv -r -s -an "rtmp://127.0.0.1:1935/live/pc"
    
                #enable HLS
                hls on;
                hls_path "c:/nginx/www/hls";
                hls_fragment 3;
                hls_playlist_length 60;
            }
        }
    }
    

    I'm really uncertain about the ffmpeg line...
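
    As written, the -r and -s options in that exec_pull line are missing their arguments (frame rate and frame size), and -an would drop the audio entirely. A minimal sketch of a complete relay line, assuming an RTSP source is actually reachable at that address and that its codecs are already RTMP-compatible so they can simply be copied:

    # Pull the RTSP feed and repackage it as FLV for the RTMP application,
    # copying the existing video/audio streams instead of re-encoding them.
    # -rtsp_transport tcp forces TCP transport for the RTSP pull (optional).
    exec_pull c:/nginx/ffmpeg -rtsp_transport tcp -i "rtsp://127.0.0.1:1935/live/test"
        -c copy -f flv "rtmp://127.0.0.1:1935/live/pc";

    Note that an nginx directive may span several lines, but it has to end with a semicolon.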

    In my Android application it's simple: I took the class from the androidhive tutorial here: http://www.androidhive.info/2014/06/android-streaming-live-camera-video-to-web-page/ and I'm using libstreaming for Android: https://github.com/fyhertz/libstreaming

    So, I simply create an RTSP client using the following line:

        Matcher m = uri.matcher("rtsp://127.0.0.1:1935/live/test");
    

    It should create an RTSP client that connects to 127.0.0.1:1935 over the RTSP protocol and streams the camera to the "live" application on the "test" channel. So my nginx should receive something on port 1935 and convert the RTSP to RTMP using ffmpeg, right?

    What am I missing to make the whole thing work? I'm getting really stuck and running out of ideas to try.

    Thank you!

  • FFmpeg - feeding output of encode operation to filter

    21 August 2017, by nsp

    I wanted to know if we can feed the output of an encode operation to a "filter_complex" with a command like:

    ffmpeg -i input.mp4 -c:v libx264 -s:v 1920x1080 -b:v 10000k "[encoder-output-1]" \
    -c:v libx264 -s:v 1280x720 -b:v 5000k "[encoder-output-2]" \
    -c:v libx264 -s:v 640x360 -b:v 2000k "[encoder-output-3]" \
    -filter_complex "[encoder-output-1][0:v]psnr" -f null - \
    -filter_complex "[encoder-output-2][0:v]psnr" -f null -\
    -filter_complex "[encoder-output-3][0:v]psnr" -f null - 
    

    If we can do something like this, how should one name the output pad of the encoder so that one can reference/map it in the filter_complex? If not, please let me know the easiest way to achieve something like this.

    Note:

    1. I would be using third-party encoders that don't have the capability to calculate PSNR scores internally. Thus, I would like to compute the PSNR within an FFmpeg filter; a separate-pass approach is sketched below.
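
    A minimal sketch of that separate-pass idea, with placeholder file names (output-1080p.mp4 standing in for one of the third-party encoder's outputs, psnr-1080p.log for the stats file): once an encode exists as a file, the psnr filter can compare it against the source in its own ffmpeg run, with the distorted stream listed first and the reference second:

    # Compare an already-encoded rendition against the original source.
    # The unlabeled filter output is discarded via "-f null -"; the PSNR summary
    # is printed to stderr and per-frame values go to the stats file.
    ffmpeg -i output-1080p.mp4 -i input.mp4 \
           -filter_complex "[0:v][1:v]psnr=stats_file=psnr-1080p.log" \
           -f null -
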
  • FFmpeg - Filter volume has an unconnected output

    21 August 2017, by Jason Small

    I have the following FFmpeg command:

     ffmpeg -i ./master_video.mp4 -i ./temp/temp1.mp4 -i ./temp/temp2.mp4 -y -filter_complex [0:v]setpts=PTS-STARTPTS[v0];[1:a]asetpts=PTS-STARTPTS,volume=0.1[aud1];[1:v]setpts=PTS-STARTPTS+5/TB,fade=t=in:st=5:d=1:alpha=1,fade=t=out:st=14:d=1:alpha=1[v1];[2:a]asetpts=PTS-STARTPTS,volume=0.1[aud2];[2:v]setpts=PTS-STARTPTS+10/TB,fade=t=in:st=10:d=1:alpha=1,fade=t=out:st=19:d=1:alpha=1[v2];[v0][v1]overlay=eof_action=pass[out1];[out1][v2]overlay=eof_action=pass[out2] -map [out2] -map [aud1][aud2] temp.mp4
    

    But when I run it, I receive the following error:

    error: ffmpeg exited with code 1: Filter volume has an unconnected output

    Any ideas why that error is occurring?