Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • Thumb image from video using streamio-ffmpeg gem without losing aspect ratio at a given size while uploading using carrierwave

    6 March 2013, by Krishna

    I am working on a project which requires users to upload videos. When they upload a video I convert it to FLV and MP4 format, and I need to take two screenshots for thumb images.

    I am using the streamio-ffmpeg gem and CarrierWave; the code for video conversion is below.

    movie = FFMPEG::Movie.new(file_name)
    movie.transcode("output_mp4.mp4", "-sameq -s 640x480 -deinterlace -b 512000 -vol 320 -ar 44100 -r 25")           
    movie.transcode("output_flv.flv", "-deinterlace -qscale 1 -b 512000 -vol 320 -ar 44100 -r 25")  
    

    This converts the video to MP4 and FLV formats.

    The code below is used for creating the thumb images, but they look stretched, since the aspect ratio is not maintained.

      movie.screenshot("output_thumb1.jpg", :resolution => '40x40')         
      movie.screenshot("output_thumb2.jpg", :resolution => '80x80')       
    

    So after looking at the streamio-ffmpeg gem documentation, I came up with this solution.

    movie.screenshot("output_thumb.jpg", { seek_time: 2, resolution: '80x80' }, preserve_aspect_ratio: :width, preserve_aspect_ratio: :height)
    

    This will preserve the aspect ratio, but the image size changes; it's not 80x80.

    My requirement is a thumb image of size 80x80 that does not look stretched.

    I am actually looking for something similar to resize_and_pad, which CarrierWave::RMagick has.
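
    For reference, the usual way to get an exact 80x80 thumb without stretching is to scale to fit the box and then pad the remainder. A minimal sketch of the arithmetic in shell, assuming a hypothetical 640x480 source; the resulting string is a standard ffmpeg scale/pad filter chain that could be passed via -vf:

    ```shell
    # Scale-to-fit then pad: compute the largest WxH that fits in the
    # 80x80 box while keeping the source aspect ratio.
    SRC_W=640; SRC_H=480; BOX=80

    if [ $((SRC_W * BOX / SRC_H)) -le "$BOX" ]; then
      FIT_W=$((SRC_W * BOX / SRC_H)); FIT_H=$BOX   # tall source: width shrinks
    else
      FIT_W=$BOX; FIT_H=$((SRC_H * BOX / SRC_W))   # wide source: height shrinks
    fi

    # Offsets that center the scaled image on the 80x80 canvas.
    PAD_X=$(( (BOX - FIT_W) / 2 ))
    PAD_Y=$(( (BOX - FIT_H) / 2 ))

    echo "scale=${FIT_W}:${FIT_H},pad=${BOX}:${BOX}:${PAD_X}:${PAD_Y}"
    # -> scale=80:60,pad=80:80:0:10 for a 640x480 source
    ```

    Whether this filter string can be threaded through the gem's screenshot call is an assumption; transcode does accept raw ffmpeg option strings, so running ffmpeg directly with -vf and the chain above is the fallback.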

  • Change the audio sample rate of a MOV, WAV or AIFF WITHOUT changing the number of samples

    6 March 2013, by John Pilgrim

    I need a very precise way to speed up audio. I am preparing films for OpenDCP, an open-source tool for making Digital Cinema Packages, for screening in theaters. My source files are usually QuickTime MOV files at 23.976fps with 48.000kHz audio. Sometimes my audio is a separate 48.000kHz WAV. (FWIW, the video frame rate of the source is actually 24/1.001 frames per second, which is a repeating decimal.)

    The DCP standard is based around a 24.000fps and 48.000kHz program, so the audio and video of the source need to be sped up. The image processing workflow inherently involves converting the MOV to a TIF sequence, frame by frame, which is then assumed to be 24.000fps, so I don't have to get involved in the internals of the QuickTime video media handler.

    But speeding up the audio to match is proving to be difficult. Most audio programs cannot get the number of audio samples to line up with the retimed image frames. A 0.1% speed increase in Audacity results in the wrong number of samples. The only pathway I have found that works is to use Apple Cinema Tools to conform the 23.976fps/48.000kHz MOV to 24.000fps/48.048kHz (which it does by changing the QuickTime headers) and then to use QuickTime Player to export the audio from that file at 48.000kHz, resampling it. This is frame accurate.

    So my question is: are there settings in ffmpeg or sox that will precisely speed up the audio in a MOV, WAV or AIFF? I would like a cross-platform solution, so I am not dependent on Cinema Tools, which is MacOS-only.

    I know this is a LOT of background. Feel free to ask clarifying questions!
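
    For what it's worth, the retiming ratio is exact: 23.976fps is 24000/1001 fps, so conforming to 24.000fps is a speed-up of precisely 1001/1000, and the required sample count follows directly. A quick sanity check in shell, using a hypothetical one-hour source:

    ```shell
    # One hour of 48.000 kHz audio accompanying 24000/1001 fps video.
    SRC_SAMPLES=$((48000 * 3600))             # 172800000 source samples
    # Speeding up to 24.000 fps is an exact 1001:1000 ratio, so the
    # retimed 48.000 kHz track must hold 1000/1001 as many samples:
    DST_SAMPLES=$((SRC_SAMPLES * 1000 / 1001))
    echo "$DST_SAMPLES"                       # 172627372 samples
    ```

    This is also why the Cinema Tools route is frame accurate: relabelling the 48.000kHz track as 48.048kHz and resampling back to 48.000kHz applies exactly this 1000/1001 factor. sox's speed effect (e.g. sox in.wav out.wav speed 1.001) is the commonly suggested cross-platform equivalent, though whether its internal rounding lands on exactly the sample count above is something to verify against your material.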

  • Run ffmpeg on Terminal MacOS [closed]

    6 March 2013, by user2139175

    I'm rather new to MacOS, and I am not finding it easy to work with Terminal to get ffmpeg running properly, as I had it on Windows.

    I got an ffmpeg binary from http://ffmpegmac.net, but when I try running the executable in Terminal it says the command is not found. The only way I can run it so far is with the command "open ffmpeg", but that opens another window, which is not what I'm expecting.

    Is there any way to call ffmpeg directly in Terminal (just like running it in the Windows cmd), or am I missing something?
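
    The usual cause is that the shell only searches the directories listed in $PATH, and a downloaded binary often also lacks the execute bit. A sketch of the mechanics, using a stand-in script in place of the real binary (all paths here are hypothetical):

    ```shell
    # Create a stand-in "ffmpeg" for demonstration; with the real binary
    # the steps are identical: chmod +x it, then run it by explicit path
    # or put its directory on $PATH.
    mkdir -p /tmp/ffmpeg-demo
    printf '#!/bin/sh\necho fake-ffmpeg\n' > /tmp/ffmpeg-demo/ffmpeg
    chmod +x /tmp/ffmpeg-demo/ffmpeg       # downloads often lack this bit

    cd /tmp/ffmpeg-demo
    ./ffmpeg                               # explicit path: found and run

    PATH="/tmp/ffmpeg-demo:$PATH"          # or add the directory to PATH
    ffmpeg                                 # now plain "ffmpeg" works too
    ```

    With the real binary, moving it into /usr/local/bin (which is on the default PATH) is the common permanent fix, after which plain ffmpeg works in any Terminal window.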

  • Capture Android screen as a video file using FFmpeg

    6 March 2013, by jacob

    I am trying to capture an Android device's screen as a video file using FFmpeg with this command:

    /data/local/ffmpeg -y -vcodec rawvideo -f rawvideo -pix_fmt rgb32 -s 320x480 -i /dev/graphics/fb0 /sdcard/output2.avi 2> /sdcard/out.txt
    

    This creates a file with a single (unclear) frame and stops.

    ffmpeg version 0.11.1 Copyright (c) 2000-2012 the FFmpeg developers
      built on Sep 20 2012 13:28:38 with gcc 4.6.x-google 20120106 (prerelease)
      configuration: --arch=arm --cpu=cortex-a8 --target-os=linux --enable-runtime-cpudetect --prefix=/data/data/org.witness.sscvideoproto --enable-pic --disable-shared --enable-static --cross-prefix=/opt/android-ndk/toolchains/arm-linux-androideabi-4.6/prebuilt/linux-x86/bin/arm-linux-androideabi- --sysroot=/opt/android-ndk/platforms/android-3/arch-arm --extra-cflags='-I../x264 -mfloat-abi=softfp -mfpu=neon' --extra-ldflags=-L../x264 --enable-version3 --enable-gpl --disable-doc --enable-yasm --enable-decoders --enable-encoders --enable-muxers --enable-demuxers --enable-parsers --enable-protocols --enable-filters --enable-avresample --enable-libfreetype --disable-indevs --enable-indev=lavfi --disable-outdevs --enable-hwaccels --enable-ffmpeg --disable-ffplay --disable-ffprobe --disable-ffserver --disable-network --enable-libx264 --enable-zlib
      libavutil      51. 54.100 / 51. 54.100
      libavcodec     54. 23.100 / 54. 23.100
      libavformat    54.  6.100 / 54.  6.100
      libavdevice    54.  0.100 / 54.  0.100
      libavfilter     2. 77.100 /  2. 77.100
      libswscale      2.  1.100 /  2.  1.100
      libswresample   0. 15.100 /  0. 15.100
      libpostproc    52.  0.100 / 52.  0.100
    [rawvideo @ 0xee5540] Estimating duration from bitrate, this may be inaccurate
    Input #0, rawvideo, from '/dev/graphics/fb0':
      Duration: N/A, start: 0.000000, bitrate: N/A
        Stream #0:0: Video: rawvideo (BGRA / 0x41524742), bgra, 320x480, 25 tbr, 25 tbn, 25 tbc
    [buffer @ 0xef16e0] w:320 h:480 pixfmt:bgra tb:1/25 sar:0/1 sws_param:flags=2
    [buffersink @ 0xef1950] No opaque field provided
    [format @ 0xef1a70] auto-inserting filter 'auto-inserted scaler 0' between the filter 'src' and the filter 'format'
    [scale @ 0xef2c10] w:320 h:480 fmt:bgra sar:0/1 -> w:320 h:480 fmt:yuv420p sar:0/1 flags:0x4
    Output #0, avi, to '/sdcard/output2.avi':
      Metadata:
        ISFT            : Lavf54.6.100
        Stream #0:0: Video: mpeg4 (FMP4 / 0x34504D46), yuv420p, 320x480, q=2-31, 200 kb/s, 25 tbn, 25 tbc
    Stream mapping:
      Stream #0:0 -> #0:0 (rawvideo -> mpeg4)
    Press [q] to stop, [?] for help
    frame=    5 fps=0.0 q=5.4 Lsize=     199kB time=00:00:00.20 bitrate=8156.6kbits/s    
    video:193kB audio:0kB global headers:0kB muxing overhead 2.929166%
    

    Any idea what I am doing wrong?
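
    One thing worth checking against the log: ffmpeg reads /dev/graphics/fb0 like an ordinary rawvideo file, so it hits end-of-file after the device's few mapped buffers instead of sampling the screen continuously, which matches the mere five frames reported. The per-frame byte count is fixed and easy to verify:

    ```shell
    # Each rgb32 frame from a 320x480 framebuffer is a fixed-size block:
    W=320; H=480; BPP=4                  # rgb32 (bgra here) = 4 bytes/pixel
    FRAME_BYTES=$((W * H * BPP))
    echo "$FRAME_BYTES"                  # 614400 bytes per frame
    ```

    A commonly suggested workaround (an assumption here, not something from the question, and device-dependent) is to re-read the framebuffer in a loop and pipe it into ffmpeg's stdin, e.g. while true; do cat /dev/graphics/fb0; done | ffmpeg -f rawvideo -pix_fmt rgb32 -s 320x480 -r 25 -i - /sdcard/output.avi, so ffmpeg never sees EOF.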

  • rawvideo and rgb32 values passed to FFmpeg

    6 March 2013, by Brian

    I'm converting a file to PNG format using this call:

    ffmpeg.exe -vframes 1 -vcodec rawvideo -f rawvideo -pix_fmt rgb32 -s x -i infile -f image2 -vcodec png out.png
    

    I want to use a converter that can be linked or compiled into a closed-source commercial product, unlike FFmpeg, so I need to understand the format of the input file I'm passing in.

    So, what does rawvideo mean to FFmpeg?

    Is FFmpeg determining what type of raw format the input file has, or does rawvideo denote something distinct?

    What does rgb32 mean here?

    The size of the input file is a little more than (width * height * 8) bytes.
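
    On the format question: rawvideo to FFmpeg means no container at all, just packed pixel data with no header, which is why the size (-s) and pixel format (-pix_fmt) must be supplied on the command line. rgb32 is 4 bytes per pixel (R, G, B plus a filler/alpha byte, with the in-memory order depending on endianness, which is why an ffmpeg log may report it as bgra). A single raw frame is therefore exactly width * height * 4 bytes, so a file "a little more than width * height * 8" holds roughly two frames plus change. A quick check with hypothetical dimensions:

    ```shell
    W=640; H=480                       # hypothetical frame dimensions
    FRAME_BYTES=$((W * H * 4))         # rgb32: 4 bytes per pixel
    echo "$FRAME_BYTES"                # 1228800 bytes for one raw frame
    echo $((2 * FRAME_BYTES))          # 2457600: about "width * height * 8"
    ```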