Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • Cutting a video to the last 10 seconds

1 April 2014, by ShiftedDesign

So I am looking for a way to cut a video file down to its last 10 seconds. Normally I would use FFmpeg or MEncoder, but the file to be cut will be a different length every time. I found examples for cutting a given duration with ffmpeg, but I would need the duration in seconds rather than in 00:00:00 format, so that I could pass duration-10 to -ss.

I don't know if I explained that well enough, but in short:

Different length of video every time it runs; I only need the last 10 seconds in a new file. It needs to be light on resources but still fast.

Thanks in advance to anyone who answers!
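One way to sketch this (assuming ffprobe and ffmpeg are on the PATH; the function names here are made up for illustration): ffprobe can report the container duration directly as a decimal number of seconds, so you can compute duration - 10 yourself and hand it to -ss.

```python
import subprocess

def tail_start(duration_seconds, tail=10.0):
    # Clamp so a clip shorter than `tail` starts at 0 instead of a negative time.
    return max(duration_seconds - tail, 0.0)

def cut_last_10_seconds(src, dst):
    # ffprobe prints the duration as a plain decimal number of seconds.
    out = subprocess.check_output([
        "ffprobe", "-v", "error",
        "-show_entries", "format=duration",
        "-of", "default=noprint_wrappers=1:nokey=1",
        src,
    ])
    start = tail_start(float(out))
    # Stream copy (-c copy) avoids re-encoding, so this stays light on CPU.
    subprocess.check_call([
        "ffmpeg", "-y", "-ss", str(start), "-i", src, "-c", "copy", dst,
    ])
```

With -ss before -i, ffmpeg seeks on the input; combined with -c copy the cut snaps to the nearest keyframe, which is what keeps it fast.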

No libffmpeg.so for command-line arguments after building FFmpeg for Android

1 April 2014, by user3280561

I have followed the roman10 tutorial and the build finishes just fine. But I have .so files for everything except ffmpeg.so (I see it as libffmpeg.so in some examples), and I don't know how to get it. Do I need to change a configure option? I am using the following configure options:

    #!/bin/bash
    NDK=$HOME/Desktop/adt/android-ndk-r9
    SYSROOT=$NDK/platforms/android-9/arch-arm/
    TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-x86_64
    function build_one
    {
    ./configure \
        --prefix=$PREFIX \
        --enable-shared \
        --disable-static \
        --disable-doc \
        --disable-ffmpeg \
        --disable-ffplay \
        --disable-ffprobe \
        --disable-ffserver \
        --disable-avdevice \
        --disable-symver \
        --cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
        --target-os=linux \
        --arch=arm \
        --enable-cross-compile \
        --sysroot=$SYSROOT \
        --extra-cflags="-Os -fpic $ADDI_CFLAGS" \
        --extra-ldflags="$ADDI_LDFLAGS" \
        $ADDITIONAL_CONFIGURE_FLAG
    make clean
    make
    make install
    }
    CPU=arm
    PREFIX=$(pwd)/android/$CPU 
    ADDI_CFLAGS="-marm"
    build_one
    

Do I even need libffmpeg.so in order to invoke FFmpeg with command-line arguments?

Why are iPhone videos upside down?

1 April 2014, by hendry

I have a video file captured using the standard Camera app, copied off my iPhone 5 using btsync:

    orig.mov: ISO Media, Apple QuickTime movie
    

It plays upside down in MPlayer and VLC.

When I transcode it with AWS's Elastic Transcoder using a standard preset:

    elastictranscoder.mpg: ISO Media, MPEG v4 system, version 1
    

    That transcoded video file plays back the right way up.

When transcoding myself with ffmpeg, I need to explicitly rotate the video, otherwise the resulting {mp4,webm} files are upside down.

So I am guessing AWS's Elastic Transcoder auto-rotates the video somehow. But why is the video upside down to begin with?
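The likely explanation: the iPhone records frames in the camera sensor's native orientation and stores a rotation hint in the container metadata; tools that honor the hint (apparently Elastic Transcoder here) display the video the right way up, while players that ignore it show the raw frames. A hedged sketch to inspect the hint via ffprobe's JSON output (the helper names are made up; assumes ffprobe is on the PATH):

```python
import json
import subprocess

def rotation_from_probe(probe):
    # The rotate hint, when present, lives in the first video stream's tags.
    tags = probe["streams"][0].get("tags", {})
    return int(tags.get("rotate", 0))

def video_rotation(path):
    # Ask ffprobe for the first video stream as JSON and pull out the hint.
    out = subprocess.check_output([
        "ffprobe", "-v", "error", "-select_streams", "v:0",
        "-show_streams", "-of", "json", path,
    ])
    return rotation_from_probe(json.loads(out))
```

If this prints a non-zero value for orig.mov, that would confirm the file carries a rotation hint that MPlayer and VLC are ignoring.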

Can you split a file using ffmpeg directly from a DB?

31 March 2014, by Jeff

I am splitting a file using ffmpeg and Python. I can split a file if I pass in a file path, but if I pass in the file itself from my MongoDB, it errors out. Is there a way to get ffmpeg to run on the file contents instead of a path to the file?

Python code:

args = [ffmpeg_binary,
        "-v", "quiet",
        "-y", "-i", video_file,
        "-vcodec", "copy", "-acodec", "copy",
        "-ss", "00:00:00", "-t", "00:00:10", "-sn",
        output_file_name]
    
    pipe = sp.Popen(args)
    

If video_file is a gridfs.grid_file.GridOut object, or the result of calling .read() on one, I get:

    Traceback (most recent call last):
      File "C:/dev/youniversity/test.py", line 21, in 
        split_vid_from_file(vid_file)
      File "C:\dev\youniversity\src\lib\ffmpeg_lib.py", line 107, in split_vid_from_file
        pipe = sp.Popen(args)
      File "C:\Python27\lib\subprocess.py", line 709, in __init__
        errread, errwrite)
      File "C:\Python27\lib\subprocess.py", line 957, in _execute_child
        startupinfo)
    TypeError: must be string without null bytes or None, not str
    

    How do I split the video directly from the DB?

    EDIT: I found that you can do this:

    ... "-y", "-", ... 
    
    pipe = sp.Popen(args, stdin=sp.PIPE)
    

but I don't want the user to type the input at a prompt; I want to pipe the variable in.
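For what it's worth, stdin=sp.PIPE does not mean keyboard input; it means the parent process supplies stdin, so you can feed the GridOut bytes to the child via communicate(). A sketch under that assumption (build_args and split_from_bytes are made-up names; note that -ss seeking on a piped MOV can fail if its moov atom sits at the end of the file):

```python
import subprocess as sp

def build_args(ffmpeg_binary, output_file_name):
    # "-i -" tells ffmpeg to read its input from stdin rather than a path.
    return [ffmpeg_binary, "-v", "quiet", "-y",
            "-i", "-",
            "-vcodec", "copy", "-acodec", "copy",
            "-ss", "00:00:00", "-t", "00:00:10", "-sn",
            output_file_name]

def split_from_bytes(ffmpeg_binary, video_bytes, output_file_name):
    pipe = sp.Popen(build_args(ffmpeg_binary, output_file_name), stdin=sp.PIPE)
    # communicate() writes the bytes to ffmpeg's stdin and then closes it;
    # no user typing is involved at any point.
    pipe.communicate(input=video_bytes)
    return pipe.returncode
```

Usage would look like split_from_bytes(ffmpeg_binary, grid_out.read(), output_file_name).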

  • FFmpeg hangs when using -vsync 1 for libvpx

31 March 2014, by user3466413

I'm trying to transcode a .mov, ultimately generated by GoToMeeting, to a .webm. I've found that, to get the audio and video in sync for other formats, I need -vsync 1 and a constant framerate via -r:v 30. This makes FFmpeg take the variable-framerate input .mov and convert it to a constant 30 frames per second. In doing so it duplicates frames, but it appears to emit them as cheap reference frames, so the file size doesn't get massive.

This approach works perfectly fine when encoding with libx264 and aac into an .mp4 container, producing a file playable in all browsers that support MPEG-4. But when I encode with libvpx and libvorbis into a .webm container for other browsers, particularly Chrome, FFmpeg just hangs or runs ridiculously slowly. It doesn't respond to pressing Q to abort, either; I have to kill the process.

    So, I get output that looks like this:

    user@virtualbox:~/Desktop$ ffmpeg -y -i original.mov -vcodec libvpx -acodec libvorbis -r:v 30 -vsync 1 omgwork.webm
    ffmpeg version N-61874-g57e939d Copyright (c) 2000-2014 the FFmpeg developers
      built on Mar 27 2014 11:00:31 with gcc 4.8 (Ubuntu/Linaro 4.8.1-10ubuntu9)
      configuration: --extra-libs=-ldl --enable-gpl --enable-libass --enable-libfdk-aac --enable-libmp3lame --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-nonfree --enable-x11grab
      libavutil      52. 69.100 / 52. 69.100
      libavcodec     55. 55.100 / 55. 55.100
      libavformat    55. 35.101 / 55. 35.101
      libavdevice    55. 11.100 / 55. 11.100
      libavfilter     4.  3.100 /  4.  3.100
      libswscale      2.  5.102 /  2.  5.102
      libswresample   0. 18.100 /  0. 18.100
      libpostproc    52.  3.100 / 52.  3.100
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x254b940] decoding for stream 0 failed
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'original.mov':
      Metadata:
        major_brand     : qt  
        minor_version   : 537199360
        compatible_brands: qt  
        creation_time   : 2014-03-20 02:09:22
      Duration: 03:08:26.72, start: 0.000000, bitrate: 75 kb/s
        Stream #0:0(und): Video: h264 (Main) (avc1 / 0x31637661), yuv420p(tv), 1368x728 [SAR 1:1 DAR 171:91], 20 kb/s, 0.60 fps, 300 tbr, 600 tbn, 1200 tbc (default)
        Metadata:
          creation_time   : 2014-03-20 02:09:22
          handler_name    : Apple Alias Data Handler
        Stream #0:1(eng): Audio: aac (mp4a / 0x6134706D), 16000 Hz, mono, fltp, 47 kb/s (default)
        Metadata:
          creation_time   : 2014-03-20 02:09:22
          handler_name    : Apple Alias Data Handler
        Stream #0:2(eng): Data: none (rtp  / 0x20707472), 6 kb/s
        Metadata:
          creation_time   : 2014-03-20 02:09:22
          handler_name    : Apple Alias Data Handler
        Stream #0:3(und): Data: none (rtp  / 0x20707472), 0 kb/s
        Metadata:
          creation_time   : 2014-03-20 02:09:22
          handler_name    : Apple Alias Data Handler
    [libvpx @ 0x255e460] v1.2.0
    Output #0, webm, to 'omgwork.webm':
      Metadata:
        major_brand     : qt  
        minor_version   : 537199360
        compatible_brands: qt  
        encoder         : Lavf55.35.101
        Stream #0:0(und): Video: vp8 (libvpx), yuv420p, 1368x728 [SAR 1:1 DAR 171:91], q=-1--1, 200 kb/s, 1k tbn, 30 tbc (default)
        Metadata:
          creation_time   : 2014-03-20 02:09:22
          handler_name    : Apple Alias Data Handler
        Stream #0:1(eng): Audio: vorbis (libvorbis), 16000 Hz, mono, fltp (default)
        Metadata:
          creation_time   : 2014-03-20 02:09:22
          handler_name    : Apple Alias Data Handler
    Stream mapping:
      Stream #0:0 -> #0:0 (h264 -> libvpx)
      Stream #0:1 -> #0:1 (aac -> libvorbis)
    Press [q] to stop, [?] for help
    frame=  117 fps= 10 q=0.0 size=       4kB time=00:00:03.90 bitrate=   8.0kbits/s dup=111 drop=0
    

Note how the progress line says it's less than four seconds into the transcode... and it stays there for minutes on end. This video takes only about 10 minutes to transcode completely to MPEG-4, with no delay whatsoever between the Press [q] to stop, [?] for help message and the first progress line. For WebM, the first progress line only appears after a few minutes, and then it crawls along so slowly that it would take literally days to complete.

    When this is happening, FFmpeg maxes out all the CPU cores I give it (for example -threads 4 uses all four CPU cores). Aside from making my CPU burst into flames, it doesn't speed things up at all.

    I've tried:

• Using Windows and Linux. Both do the same thing.
• Using the latest available Windows binary and compiling FFmpeg from the Git repository. No improvement or change.
• Using -vsync 0 and -vsync 2. The file does transcode then, but the audio and video stay out of sync, just as they did with the other formats I tried. -vsync 1 seems absolutely essential for this to work.
• Searching; the closest thing I could find was bug #3208. But I'm not specifying -fflags at all; you can see the exact command I'm running.

    Ideas?
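One variable worth eliminating (an assumption, not something verified against this clip): libvpx is far slower than x264 at default settings, and its -deadline / -cpu-used options trade quality for encode speed. A sketch of the same command with those knobs added (the helper name is made up):

```python
import subprocess

def webm_command(src, dst, threads=4):
    # Same flags as the hanging command, plus libvpx speed knobs:
    # -deadline good with -cpu-used 4 trades some quality for encode speed.
    return ["ffmpeg", "-y", "-i", src,
            "-vcodec", "libvpx", "-acodec", "libvorbis",
            "-r:v", "30", "-vsync", "1",
            "-deadline", "good", "-cpu-used", "4",
            "-threads", str(threads),
            dst]

# Run it, assuming ffmpeg with libvpx support is on the PATH:
# subprocess.check_call(webm_command("original.mov", "omgwork.webm"))
```

If the encode still stalls with these flags, that would point away from encoder speed and toward the -vsync 1 frame duplication interacting badly with libvpx on this input.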