Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • Convert from 30 to 60fps by increasing speed, not duplicating frames, using FFmpeg

    12 August 2018, by Matt Pellegrini

    I have a video that is incorrectly labelled as 30fps; it is actually 60fps, so it looks like it is being played at half speed. The audio is fine, that is, the soundtrack finishes halfway through the video clip. I'd like to know how, if possible, to fix this, i.e. double the video speed so that it plays at 60fps and the audio and video are in sync.

    The file is H.264 and the audio is MPEG-4 AAC.

    File details as given by ffmpeg, as requested:

    ffmpeg version 0.8.9-6:0.8.9-0ubuntu0.13.10.1, Copyright (c) 2000-2013 the Libav developers
    built on Nov  9 2013 19:09:46 with gcc 4.8.1
    *** THIS PROGRAM IS DEPRECATED ***
    This program is only provided for compatibility and will be removed in a future release. Please use avconv instead.
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from './Tignes60fps.mp4':
      Metadata:
        major_brand     : mp42
        minor_version   : 0
        compatible_brands: isommp42
        creation_time   : 2014-01-13 02:23:09
        Duration: 00:08:33.21, start: 0.000000, bitrate: 5690 kb/s
        Stream #0.0(eng): Video: h264 (High), yuv420p, 1920x1080 [PAR 1:1 DAR 16:9], 5609 kb/s, 29.97 fps, 29.97 tbr, 30k tbn, 59.94 tbc
    Metadata:
        creation_time   : 2014-01-13 02:23:09
        Stream #0.1(eng): Audio: aac, 48000 Hz, stereo, s16, 156 kb/s
    Metadata:
          creation_time   : 2014-01-13 02:23:09
    At least one output file must be specified
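
    For reference, one common way to do this with a current ffmpeg build is to halve the video timestamps with the setpts filter while leaving the audio untouched; this is only a sketch (the output file name is a placeholder) and it re-encodes the video stream:

    ffmpeg -i Tignes60fps.mp4 -vf "setpts=0.5*PTS" -r 60 -c:v libx264 -crf 18 -c:a copy output-60fps.mp4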
    
  • How to stream all videos in a folder?

    12 August 2018, by Krishnendu

    Hi, I want to stream videos over the web using ffserver. I got this link as a reference.

    Now, what I am not able to figure out is how to pass a folder (which contains all the videos I want to stream) as input so that all of those videos are streamed. I also want to add more videos to this folder dynamically from time to time, and streaming should keep working (like it does in Darwin). I can't use Darwin here because it doesn't support iOS.

    Please give me a suggestion.

    Is there any other open source tool with which I can do this?
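
    For what it's worth, a hedged way to approximate this with plain ffmpeg and ffserver is to generate a playlist of the folder's contents and push it through the concat demuxer to an existing ffserver feed; the folder path, feed name and port below are placeholders, and the feed must already be defined in ffserver.conf:

    # build a playlist from every mp4 in the folder (path is a placeholder)
    for f in /path/to/videos/*.mp4; do printf "file '%s'\n" "$f"; done > playlist.txt
    # read the playlist at its native rate and push it to the ffserver feed
    ffmpeg -re -f concat -safe 0 -i playlist.txt http://localhost:8090/feed1.ffm

    Note that a static playlist will not pick up files added later, so it would have to be regenerated and the command restarted when the folder changes.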

  • How to use ffmpeg to segment an RTSP stream into MOV files

    12 August 2018, by user331244

    I have a webcam that gives an RTSP stream, and I want to save it into MOV chunks of, say, 5 seconds each. The stream is:

    rtsp://user:pwd@192.168.1.90:10554/tcp/av0_0
    

    I can open the stream and play it in VLC. I run ffmpeg as below, which looks sane, but the output is black video only. Where did I go wrong?

    ffmpeg -rtsp_transport tcp -i "rtsp://user:pwd@192.168.1.90:10554/tcp/av0_0" -f segment -segment_time 5 -segment_format mov -c copy -map 0 video%d.mov
    ffmpeg version 3.1.1 Copyright (c) 2000-2016 the FFmpeg developers
      built with Apple LLVM version 7.3.0 (clang-703.0.31)
      configuration: --prefix=/usr/local/Cellar/ffmpeg/3.1.1 --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-opencl --enable-libx264 --enable-libmp3lame --enable-libxvid --enable-ffplay --disable-lzma --enable-vda
      libavutil      55. 28.100 / 55. 28.100
      libavcodec     57. 48.101 / 57. 48.101
      libavformat    57. 41.100 / 57. 41.100
      libavdevice    57.  0.101 / 57.  0.101
      libavfilter     6. 47.100 /  6. 47.100
      libavresample   3.  0.  0 /  3.  0.  0
      libswscale      4.  1.100 /  4.  1.100
      libswresample   2.  1.100 /  2.  1.100
      libpostproc    54.  0.100 / 54.  0.100
    Guessed Channel Layout for Input Stream #0.1 : mono
    Input #0, rtsp, from 'rtsp://user:pwd@192.168.1.90:10554/tcp/av0_0':
      Metadata:
        title           : streamed by the RTSP server
      Duration: N/A, start: 0.230000, bitrate: N/A
        Stream #0:0: Video: h264 (High), yuv420p, 1280x720, 20 fps, 25 tbr, 90k tbn, 40 tbc
        Stream #0:1: Audio: pcm_alaw, 8000 Hz, 1 channels, s16, 64 kb/s
    [segment @ 0x7fda5e012400] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead.
        Last message repeated 1 times
    Output #0, segment, to 'video%d.mov':
      Metadata:
        title           : streamed by the RTSP server
        encoder         : Lavf57.41.100
        Stream #0:0: Video: h264, yuv420p, 1280x720, q=2-31, 20 fps, 25 tbr, 10240 tbn, 40 tbc
        Stream #0:1: Audio: pcm_alaw, 8000 Hz, mono, 64 kb/s
    Stream mapping:
      Stream #0:0 -> #0:0 (copy)
      Stream #0:1 -> #0:1 (copy)
    Press [q] to stop, [?] for help
    [segment @ 0x7fda5e012400] Non-monotonous DTS in output stream 0:0; previous: 0, current: -1946; changing to 1. This may result in incorrect timestamps in the output file.
    [segment @ 0x7fda5e012400] Non-monotonous DTS in output stream 0:0; previous: 1, current: -1536; changing to 2. This may result in incorrect timestamps in the output file.
    [segment @ 0x7fda5e012400] Non-monotonous DTS in output stream 0:0; previous: 2, current: -1126; changing to 3. This may result in incorrect timestamps in the output file.
    [segment @ 0x7fda5e012400] Non-monotonous DTS in output stream 0:0; previous: 3, current: -717; changing to 4. This may result in incorrect timestamps in the output file.
    [segment @ 0x7fda5e012400] Non-monotonous DTS in output stream 0:0; previous: 4, current: -307; changing to 5. This may result in incorrect timestamps in the output file.
    frame=  204 fps= 24 q=-1.0 Lsize=N/A time=00:00:07.89 bitrate=N/A speed=0.937x
    video:2057kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
    Exiting normally, received signal 2.
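
    Given the non-monotonous DTS warnings above, a hedged variant of the same command asks the segment muxer to restart timestamps at zero for each chunk; this is only a sketch of stock ffmpeg options, not a confirmed fix for this particular camera:

    ffmpeg -rtsp_transport tcp -i "rtsp://user:pwd@192.168.1.90:10554/tcp/av0_0" \
      -f segment -segment_time 5 -segment_format mov -reset_timestamps 1 \
      -c copy -map 0 video%d.mov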
    
  • make video thumbnail firebase cloud function [duplicate]

    11 August 2018, by abhishek

    I am trying to generate a thumbnail image when a video is uploaded to storage. That doesn't seem to be working; it gives me an error.

    const path = require('path');
    const os = require('os');
    const spawn = require('child-process-promise').spawn;
    // Temp paths: tempThumbnail holds the downloaded video (file name assumed), tempThumbnailFilePath the extracted frame
    const tempThumbnail = path.join(os.tmpdir(), 'source-video.mp4');
    const tempThumbnailFilePath = path.join(os.tmpdir(), 'neew.jpg');
    return bucket.file(thumbnailName).download({
         destination: tempThumbnail,
       }).then(() => {
         // Grab a single frame at t=0 and scale it to 512px wide
         return spawn('ffmpeg', ['-ss', '0', '-i', tempThumbnail, '-f', 'image2', '-vframes', '1',
           '-vf', 'scale=512:-1', tempThumbnailFilePath], { capture: ['stdout', 'stderr'] });
       }).then((writeResult) => {
         console.log('thumbnail created');
         console.log('[spawn] stdout: ', writeResult.stdout.toString());
       }).catch((err) => {
         console.log('[spawn] error: ', err);
       });
    

    I get an error like this:

    [spawn] stdout: { Error: spawn ffmpeg ENOENT at exports._errnoException (util.js:1020:11)
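
    For reference, the spawn call above amounts to roughly the following command line, which can be tried directly on a machine where ffmpeg is installed (the file names stand in for the temp paths):

    ffmpeg -ss 0 -i source-video.mp4 -f image2 -vframes 1 -vf scale=512:-1 neew.jpg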

  • FFMPEG Overlay one video on top of another video at specific location

    11 August 2018, by Ewan Sou

    I have two videos (in mp4 format). I would like to put one video on top of the other video:

    • Both videos have the same duration
    • The bottom video is of resolution 640px by 640px
    • The top video is also of resolution 640px by 640px

    I need to scale down the top video resolution to 580px by 580px. Then I need to position it at a specific location on top of the bottom video.

    I tried the command below:

    ffmpeg -i bottom.mp4 -i top.mp4 -filter_complex "[0:0][1:0]overlay=enable='between(t\,0,50)'[out]" -shortest -map [out] -map 0:1 -pix_fmt yuv420p -c:a copy -dn -c:v libx264 -crf 18 output.mp4
    

    It does allow me to put the top video on top of the bottom video, I think. But the top video was not scaled down in size. Also, the top video is placed by default in the top-left corner of the bottom video.

    In addition, somehow, the very first frame does not show the top video at all. The top video only appears at around the 0.5 second mark. Is there any way to make the top video show from the very first frame as well?
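
    For reference, here is a hedged sketch of a filtergraph that both scales the overlay to 580x580 and places it at an explicit offset; the 30:30 offset is only a placeholder for the actual target position:

    ffmpeg -i bottom.mp4 -i top.mp4 -filter_complex \
      "[1:v]scale=580:580[top];[0:v][top]overlay=x=30:y=30:enable='between(t,0,50)'[out]" \
      -shortest -map "[out]" -map 0:1 -pix_fmt yuv420p -c:a copy -c:v libx264 -crf 18 output.mp4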

    Thank you all in advance!