Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • use ffmpeg to set start_time equal in audio and video elementary streams

    13 August 2018, by sajad

    I am using the ffmpeg tool for offline transcoding of some input files to MPEG-TS format, and ffprobe to analyze the output. I need the output to have equal start_time values for both the video and audio elementary streams; this is required by the Perception streaming server. My desired output looks like this:

            
        ..../ codec_type="video" /.../ start_pts="YYY" start_time="XXX" /..../
        ..../ codec_type="audio" /.../ start_pts="YYY" start_time="XXX" /..../

    I use this profile for transcoding:

    -ss 0 -y -vcodec libx264 -vb 3404k -acodec libfdk_aac -profile:a aac_he -strict experimental -ar 48k -f adts -ab 96k -r 25 -g 50 -force_key_frames 'expr:gte(t,n_forced*2)' -x264-params keyint=50:min-keyint=50:scenecut=-1:force-cfr=1:nal-hrd=cbr -vsync 1 -async 1 -profile:v main -level 4.0 -s 1920x1080 -aspect 16:9 -avoid_negative_ts make_zero -strict experimental -muxdelay 0 -muxpreload 0 -output_ts_offset 0 -initial_offset 0 -start_at_zero -bufsize 3500K -minrate 3500K -maxrate 3500K -f mpegts
    

    How can I set start_time and start_pts as described above?
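
    One approach worth trying (a sketch, not verified against the Perception server): reset both streams' timestamps to zero before encoding with setpts/asetpts, keep the zero mux offsets from the profile, and then confirm the result with ffprobe:

    # Sketch: input.mp4 stands in for your actual source file
    ffmpeg -i input.mp4 -vf setpts=PTS-STARTPTS -af asetpts=PTS-STARTPTS \
        -muxdelay 0 -muxpreload 0 -avoid_negative_ts make_zero \
        ... (rest of the profile above) ... -f mpegts output.ts

    # Verify that both elementary streams report the same start values
    ffprobe -v error -show_entries stream=codec_type,start_pts,start_time output.ts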

  • FFMPEG : Fill/Change (part of) audio waveform color as per actual progress with respect to time progress

    13 August 2018, by Software Development Consultan

    I am trying to build a command that generates a waveform from an mp3 file, shows it over a background image, and plays the audio. Together with this, I want to change the waveform color from left to right (something like a progress bar) as the overall video time elapses.

    I have created the following command, which shows a progress bar by using drawbox to fill a box according to the current time position.

    ffmpeg -y -loop 1 -threads 0 -i sample_background.png -i input.mp3 -filter_complex "color=red@0.5:s=1280x100[Color];[0:v]drawbox=0:155:1280:100:gray@1:t=fill[baserect];[1:a]aformat=channel_layouts=mono,showwaves=s=1280x100:rate=7:mode=cline:scale=sqrt:colors=0xffffff[waveform]; [baserect][waveform] overlay=0:155 [v1];[v1][Color] overlay=x='if(gte(t,0), -W+(t)*64, NAN)':y=155:format=yuv444[v2]" -map "[v2]" -map 1:a -c:v libx264 -crf 35 -ss 0 -t 20 -c:a copy -shortest -pix_fmt yuv420p -threads 0 output_withwave_and_progresbar.mp4


    But I want to show the progress inside the generated audio waveform itself instead of filling a rectangle with drawbox.

    So I have tried rendering two waveforms in two different colors and overlaying them on each other, so that the top waveform displays only the part left of the x position corresponding to the current time.

    ffmpeg -y -loop 1 -threads 0 -i sample_background.png -i input.mp3 -filter_complex "[0:v]drawbox=0:155:1280:100:gray@1:t=fill[baserect];[1:a]aformat=channel_layouts=mono,showwaves=s=1280x100:rate=7:mode=cline:scale=sqrt:colors=0xff0000[waveform];[1:a]aformat=channel_layouts=mono,showwaves=s=1280x100:rate=7:mode=cline:scale=sqrt:colors=0xffffff[waveform2]; [baserect][waveform] overlay=0:155 [v1];[v1][waveform2] overlay=x='if(gte(t,0), -W+(t)*64, NAN)':y=155:format=yuv444[v2]" -map "[v2]" -map 1:a -c:v libx264 -crf 35 -ss 0 -t 20 -c:a copy -shortest -pix_fmt yuv420p -threads 0 test.mp4

    But I am not able to find a way to do a wipe effect from left to right; currently the waveform slides (as I am changing the x of the overlay). It might be done with alphamerge, making every pixel transparent except those left of the current x position, but I cannot find how to do this.
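
    One way to realize that alphamerge idea (a sketch, untested; the sizes and the 64 px/s speed are taken from the commands above): generate a grayscale mask whose white region grows with t via drawbox, and use it as the alpha of the "played" waveform, so it is revealed in place rather than slid:

    ffmpeg -y -loop 1 -i sample_background.png -i input.mp3 -filter_complex \
    "[0:v]drawbox=0:155:1280:100:gray@1:t=fill[baserect]; \
     [1:a]aformat=channel_layouts=mono,showwaves=s=1280x100:rate=7:mode=cline:scale=sqrt:colors=0xff0000[played]; \
     [1:a]aformat=channel_layouts=mono,showwaves=s=1280x100:rate=7:mode=cline:scale=sqrt:colors=0xffffff[rest]; \
     color=black:s=1280x100:r=7[m];[m]drawbox=x=0:y=0:w='min(t*64,1280)':h=100:color=white:t=fill[mask]; \
     [played]format=yuva444p[pl];[pl][mask]alphamerge[wiped]; \
     [baserect][rest]overlay=0:155[v1];[v1][wiped]overlay=0:155:format=yuv444[v2]" \
    -map "[v2]" -map 1:a -c:v libx264 -crf 35 -t 20 -c:a copy -shortest -pix_fmt yuv420p wipe_test.mp4

    The drawbox width expression is re-evaluated per frame, so the opaque region of the mask grows at the same 64 px/s as your sliding overlay, while both waveforms stay put.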

    Background image: (image omitted)

    Any mp3 file can be used; currently I have set a 20-second duration.

    Can someone please guide me on how to do this?

    Thanks.

  • Error in FFmpeg build with cmake

    13 August 2018, by hogan

    When I build FFmpeg with CMake, I get the following errors:

    libavfilter/avf_showcqt.c:147: error: undefined reference to 'av_fft_end'
    libavfilter/avf_showcqt.c:718: error: undefined reference to 'avpriv_vga16_font'
    libavfilter/avf_showcqt.c:1383: error: undefined reference to 'av_fft_init'
    libavfilter/avf_showcqt.c:1151: error: undefined reference to 'av_fft_permute'
    libavfilter/avf_showcqt.c:1152: error: undefined reference to 'av_fft_calc'
    libavfilter/avf_showfreqs.c:183: error: undefined reference to 'av_audio_fifo_free'
    libavfilter/avf_showfreqs.c:184: error: undefined reference to 'av_fft_end'
    libavfilter/avf_showfreqs.c:185: error: undefined reference to 'av_fft_init'
    

    And this is my link configuration:

    lib_avformat
    lib_avcodec
    lib_swscale
    lib_avutil
    lib_avfilter
    lib_swresample
    lib_postproc
    lib_avdevice
    lib_mp3lame
    lib_fdk-aac
    lib_x264
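
    These are classic static link-order errors: the av_fft_* symbols live in libavcodec, and avpriv_vga16_font / av_audio_fifo_free live in libavutil, while a single-pass linker resolves symbols left to right, so each library must be listed before the libraries it depends on. A reordering worth trying (a sketch; the names are kept exactly as in the configuration above):

    lib_avdevice
    lib_avfilter
    lib_avformat
    lib_avcodec
    lib_postproc
    lib_swresample
    lib_swscale
    lib_avutil
    lib_mp3lame
    lib_fdk-aac
    lib_x264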
    
  • FFMPEG. Combine rawAudio and rawVideo from named pipe

    13 August 2018, by user3882856

    I have two NamedPipeServerStreams: one for audio (Stereo Mix), using WavIn from NAudio, and one for video (screen capture), where I simply convert the BitBlt'd screen to a byte array.

    I'm able to create an mp4 video from the raw video and, with a code change, a separate wav from the raw audio, but I am unable to combine/merge both into a single mp4 video.

    These are the ffmpeg commands I'm using.

    To create the audio wav file:

    string args = @"-f s32le -channels 2 -sample_rate 44100 -i \\.\pipe\ffpipea -c copy output.wav";
    

    The audio plays back much too fast, but at least it is captured.
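
    As an aside, audio that plays too fast often indicates a sample-format mismatch: NAudio's WavIn typically delivers 16-bit PCM, in which case -f s16le would match the data (an assumption about the capture settings, worth verifying):

    // Assumes the pipe actually carries 16-bit samples; keep s32le if it carries 32-bit
    string args = @"-f s16le -channels 2 -sample_rate 44100 -i \\.\pipe\ffpipea -c copy output.wav";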

    To create the video mp4 file:

    string inputArgs = @"-framerate 8 -f rawvideo -pix_fmt bgr24 -video_size 1920x1080 -i \\.\pipe\ffpipev";
    string outputArgs = "-vcodec libx264 -crf 23 -pix_fmt yuv420p -preset ultrafast -r 8 output.mp4";
    

    My attempt to combine/merge both:

    string args = @"-framerate 8 -f rawvideo -pix_fmt bgr24 -video_size 1920x1080 -i \\.\pipe\ffpipev " +
            @"-f s32le -channels 2 -sample_rate 44100 -i \\.\pipe\ffpipea " +
            "-map 0:0 -map 1:0 -vcodec libx264 -crf 23 -pix_fmt yuv420p -preset ultrafast -r 8 -c:a copy output.mp4";
    

    Depending on what I change in the args, either WaitForConnection() never fires or the audio pipe fails with a "pipe is broken" error.
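
    For what it's worth, a sketch of the merge command with a few adjustments (untested against these pipes): ffmpeg opens its inputs one at a time, so the server should accept the video pipe connection before the audio one; each input gets a larger -thread_queue_size so neither pipe stalls the other; and the audio is encoded to AAC, because raw PCM is generally not accepted in an mp4 container:

    string args = @"-thread_queue_size 1024 -framerate 8 -f rawvideo -pix_fmt bgr24 -video_size 1920x1080 -i \\.\pipe\ffpipev " +
            @"-thread_queue_size 1024 -f s16le -channels 2 -sample_rate 44100 -i \\.\pipe\ffpipea " +
            "-map 0:0 -map 1:0 -vcodec libx264 -crf 23 -pix_fmt yuv420p -preset ultrafast -r 8 -c:a aac output.mp4";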

    Please let me know if I can provide further information. Any help is greatly appreciated.

  • Programmatically convert flv to mp4 on iOS

    13 August 2018, by Amitg2k12

    In our iOS app, we receive an FLV (container) file with video and audio streams, something like this:

    Input #0, flv, from 'test.flv':
    
      Metadata:
        streamName      : flvLiveStream
        encoder         : Lavf55.12.100
      Duration: 00:00:48.00, start: 66064.401000, bitrate: 632 kb/s
    Stream #0:0, 41, 1/1000: Video: h264 (Baseline), 1 reference frame, yuv420p(progressive, left), 1280x720, 0/1, 15 fps, 1k tbr, 1k tbn
    Stream #0:1, 14, 1/1000: Audio: pcm_alaw, 8000 Hz, mono, s16, 64 kb/s
    

    This needs to be converted to an mp4 container (and compatible formats as well). I am trying to do it with ffmpeg, which I believe is the only way, using the transcoding.c example, but I failed at this stage:

    Impossible to convert between the formats supported by the filter 'in' and the filter 'auto_scaler_0'
    

    I am experimenting on macOS with a command like ffmpeg -i test.flv test.mp4.

    Is it feasible to port that to iOS, and will it work in all the different scenarios?

    In Summary

    --- What is the best possible way to convert flv to mp4 on an iOS device, where the video is h264 and the audio is in the swf codec?
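
    For reference, since the probe above shows h264 video and pcm_alaw audio, remuxing with a copied video stream and transcoded audio may be enough (a sketch of the desktop command; mp4 does not generally accept pcm_alaw, while the h264 stream can be copied as-is):

    # Copy the h264 video stream unchanged, re-encode the a-law audio to AAC
    ffmpeg -i test.flv -c:v copy -c:a aac test.mp4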