Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • ffmpeg audio convert not working

11 January 2012, by Andrew

I cannot get ffmpeg audio conversion working on my site. The idea is that the file should be converted when it's uploaded to the site.

    I've got this on my upload form determining the audio file's format:

    if(isset($_POST['audio']) && $_POST['audio'] != ''){
        $ext1 = substr($_POST['audio'], -4);
    }
    

    This is the best I've come up with for converting m4a to mp3:

    if(isset($_POST['audio']) && $_POST['audio'] != ''){
        $file = $_POST['audio'];
        if($ext1 == ".m4a"){
            $call = "/usr/local/bin/ffmpeg -i ".$file." -ab 192k -f -acodec mp3";
        }
        $convert = popen("start /b ".$call, "r");
        pclose($convert);
    }
    

    The problem is, it won't convert. The path to ffmpeg is correct.

    Now I may be way over my head with this one, but if there's a simple solution for this, I'd love to hear it.

    EDIT.

    With this:

    if(isset($_POST['audio']) && $_POST['audio'] != ''){
        $file = $_POST['audio'];
        $ext1 = substr($_POST['audio'], -4); /* get the last 4 chars */
        $mp3 = 'mp3';
        if($ext1 == ".m4a"){
            $call = "/usr/local/bin/ffmpeg -i \"".$file."\" -ab 192k -y -f mp3 \"".$ext1.$mp3."\"";
        }
    }
    

I think I'm right on the money with the conversion itself, but the form just loads infinitely when submitted. So I'm guessing the conversion is happening, but the form doesn't know when it's done. Any ideas on that?
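For reference, a corrected invocation might look like the following. This is an untested sketch with placeholder file names, assuming an ffmpeg build with libmp3lame: the original command gives `-f` no value and names no output file, and `start /b` is a Windows construct; on a Unix host a command is backgrounded with a trailing `&` (which would also explain the form hanging until the conversion finishes if the command is run in the foreground).

```shell
# Sketch of a corrected command line (assumes ffmpeg built with libmp3lame;
# file names are placeholders). -f takes a value and an output file is
# required; backgrounding on Unix uses a trailing '&', not 'start /b'.
in="upload.m4a"
out="${in%.m4a}.mp3"
cmd="/usr/local/bin/ffmpeg -y -i $in -acodec libmp3lame -ab 192k -f mp3 $out"
echo "$cmd"
```

The string built here would be handed to `popen($call." &", "r")` so the page can return while ffmpeg keeps running.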

  • Encoding Raw PCM data to AAC using ffmpeg in Android

10 January 2012, by NISHAnT

Now I'm using libfaac directly to convert raw PCM data to AAC in JNI. Frames are encoded successfully, but bytesWritten is always 0, so there must be some problem in the code. Here it is:

    JNIEXPORT
    jbyteArray JNICALL Java_com_encodePCMFrame(JNIEnv * env,
        jclass clazz,short* data,jint bitRate,jint sampleSize,jint channelConfig)
    {
    
    
      faacEncHandle hEncoder;
        unsigned long samplesInput, maxBytesOutput, totalBytesWritten;
        faacEncConfigurationPtr faacPtr;
        char *faac_id_string;
        char *faac_copyright_string;
        unsigned long inputsamples;
        unsigned long maxoutputbytes;
        unsigned char* bitbuf;
        int bytesWritten;
        jbyteArray AACframe;
        jsize size;
    
    
        if(faacEncGetVersion(&faac_id_string, &faac_copyright_string) == FAAC_CFG_VERSION)
        {
          __android_log_print(ANDROID_LOG_VERBOSE, APPNAME, "\nFAAC_ID_STRING %s\n\n ", faac_id_string);
        }
    
    
        hEncoder = faacEncOpen(sampleSize, channelConfig,&inputsamples, &maxoutputbytes);
    
        if(hEncoder)
        {
           __android_log_print(ANDROID_LOG_VERBOSE, APPNAME, "AAC Codec Open (samplesize = %d)\n (channelConfig = %d)\n (input Samples = %d)\n(Max OUTPUT BYTES = %d)\n (bitRate = %d)...",sampleSize,channelConfig,inputsamples,maxoutputbytes,bitRate);
        }
    
    
        faacPtr = faacEncGetCurrentConfiguration(hEncoder);
    
        faacPtr->aacObjectType = MAIN;
        faacPtr->mpegVersion = MPEG2;
        faacPtr->outputFormat = 1; //ADTS
        faacPtr->bitRate = bitRate;
        faacPtr->inputFormat = FAAC_INPUT_16BIT;
    
    
    
        if (faacEncSetConfiguration(hEncoder, faacPtr)==0)
        {
             __android_log_print(ANDROID_LOG_VERBOSE, APPNAME,"fail to set");
             faacEncClose ( hEncoder );
             hEncoder =0;
        }
    
    
    
    bitbuf = (unsigned char*)malloc(maxoutputbytes*sizeof(unsigned char));

    bytesWritten = faacEncEncode(hEncoder,(int32_t *)data,inputsamples,bitbuf,maxoutputbytes);
    
    
        if(bytesWritten<=0)
        {
                __android_log_print(ANDROID_LOG_VERBOSE, APPNAME, "Can Not Encode Frame... bytesWritten = %d ",bytesWritten);
                faacEncClose(hEncoder);
        }
        else
        {
               __android_log_print(ANDROID_LOG_VERBOSE, APPNAME, "Bytes Written = %d ",bytesWritten);
    
               __android_log_print(ANDROID_LOG_VERBOSE, APPNAME, "Encoding frame %d ",bitbuf);
        }
    
    
        AACframe = (*env)->NewByteArray(env,maxoutputbytes);
    
        (*env)->SetByteArrayRegion(env,AACframe, 0,maxoutputbytes, bitbuf);
    
        __android_log_print(ANDROID_LOG_VERBOSE, APPNAME, "Buffer AAC Frame == %d    Allocated...  Size == %d ",AACframe,maxoutputbytes);
    
    
    /* Cleanup must happen before the return; in the original these lines
       came after it and never ran. bitbuf was allocated with malloc(),
       so free() is the matching call (not av_free), and data is a JNI
       parameter that should not be freed here. */
    free(bitbuf);

    return AACframe;
}
    

    Thanks in Advance.

  • Combine multiple videos into one

10 January 2012, by StackedCrooked

    I have three videos:

    • a lecture that was filmed with a video camera
    • a video of the desktop capture of the computer used in the lecture
    • and the video of the whiteboard

    I want to create a final video with those three components taking up a certain region of the screen.

Is there open-source software that would allow me to do this (mencoder, ffmpeg, VirtualDub...)? Which do you recommend?

    Or is there a C/C++ API that would enable me to create something like that programmatically?

    Edit
    There will be multiple recorded lectures in the future. This means that I need a generic/automated solution.

    I'm currently checking out if I could write an application with GStreamer to do this job. Any comments on that?

    Solved!
    I succeeded in doing this with GStreamer's videomixer element. I use the gst-launch syntax to create a pipeline and then load it with gst_parse_launch. It's a really productive way to implement complex pipelines.

Here's a pipeline that takes two incoming video streams and a logo image, blends them into one stream, and then duplicates it so that it is simultaneously displayed and saved to disk.

      desktop. ! queue
               ! ffmpegcolorspace
               ! videoscale
               ! video/x-raw-yuv,width=640,height=480
               ! videobox right=-320
               ! ffmpegcolorspace
               ! vmix.sink_0
      webcam. ! queue
              ! ffmpegcolorspace
              ! videoscale
              ! video/x-raw-yuv,width=320,height=240
              ! vmix.sink_1
      logo. ! queue
            ! jpegdec
            ! ffmpegcolorspace
            ! videoscale
            ! video/x-raw-yuv,width=320,height=240
            ! vmix.sink_2
      vmix. ! t.
      t. ! queue
         ! ffmpegcolorspace
         ! ffenc_mpeg2video
         ! filesink location="recording.mpg"
      t. ! queue
         ! ffmpegcolorspace
         ! dshowvideosink
      videotestsrc name="desktop"
      videotestsrc name="webcam"
      multifilesrc name="logo" location="logo.jpg"
      videomixer name=vmix
                 sink_0::xpos=0 sink_0::ypos=0 sink_0::zorder=0
                 sink_1::xpos=640 sink_1::ypos=0 sink_1::zorder=1
                 sink_2::xpos=640 sink_2::ypos=240 sink_2::zorder=2
      tee name="t"
    
  • How do I encode a single image into multiple frames using ffmpeg?

10 January 2012, by cubabit

I want to encode a single image for a set number of frames into an mpeg2 movie. I want it in mpeg2 so I can concatenate it later. I have tried:

    ffmpeg -b 800000 -loop_input -i input.jpg -vcodec mpeg2video -vframes 30 output.mpg
    

But it seems to end up just one frame long, and I'm not sure it even has the correct codec.
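For what it's worth, a sketch of an alternative invocation follows. It is untested and the file names are placeholders: newer ffmpeg builds replaced `-loop_input` with the image2 demuxer option `-loop 1`, and pinning an explicit frame rate and duration fixes the output frame count (here 15 fps × 2 s = 30 frames).

```shell
# Sketch (untested): '-loop 1' is the image2 replacement for -loop_input;
# an explicit rate and duration pin the frame count (15 fps * 2 s = 30).
# File names are placeholders.
img="input.jpg"
cmd="ffmpeg -loop 1 -i $img -r 15 -t 2 -vcodec mpeg2video -b 800k output.mpg"
echo "$cmd"
```

Checking the result with `ffprobe output.mpg` would confirm both the frame count and that the codec is really mpeg2video.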

  • Using ffmpeg to convert sound files for use in an android app

10 January 2012, by stefs

Short: I'm trying to simply play a sound file converted with ffmpeg in my Android app, but I'm having trouble getting it to work.

Long: we have an iPhone app and an Android app doing the same thing, and I have to port the feature that plays a sound on a user interaction. I have the source file in AIFF format and tried to convert it to mp3 for Android, but the app keeps crashing when it tries to load the file:

    AssetFileDescriptor fileDescriptor = context.getResources().openRawResourceFd(resid);
    final MediaPlayer mp = new MediaPlayer();
    mp.setDataSource(fileDescriptor.getFileDescriptor(), fileDescriptor.getStartOffset(), fileDescriptor.getLength());
    fileDescriptor.close();
    mp.prepare();
    

More specifically, mp.setDataSource crashes. Some digging around led me to believe that something's wrong with the encoding. The sound file itself resides in res/raw.

    11-29 17:11:48.012: ERROR/SoundManager(15580): java.io.IOException: setDataSourceFD failed.: status=0x80000000
    11-29 17:11:48.012: ERROR/SoundManager(15580):     at android.media.MediaPlayer.setDataSource(Native Method)
    ...
    

What I tried:

• Using a different mp3 that's already used with the same code in a different place: this works.
• Converting it to a wav file: this didn't crash the app, but it didn't play any sound either. That might be a different problem.
• Converting it to ogg: crashed.

So, the ffmpeg conversion parameters are as follows:

    $ ffmpeg -i click_24db.aif -f mp3 ~/foobar/wheel_click.mp3
    ffmpeg version 0.7.8, Copyright (c) 2000-2011 the FFmpeg developers
      built on Nov 24 2011 14:31:00 with gcc 4.2.1 (Apple Inc. build 5666) (dot 3)
      configuration: --prefix=/opt/local --enable-gpl --enable-postproc --enable-swscale --enable-avfilter --enable-libmp3lame --enable-libvorbis --enable-libtheora --enable-libdirac --enable-libschroedinger --enable-libopenjpeg --enable-libxvid --enable-libx264 --enable-libvpx --enable-libspeex --mandir=/opt/local/share/man --enable-shared --enable-pthreads --cc=/usr/bin/gcc-4.2 --arch=x86_64 --enable-yasm
      libavutil    50. 43. 0 / 50. 43. 0
      libavcodec   52.123. 0 / 52.123. 0
      libavformat  52.111. 0 / 52.111. 0
      libavdevice  52.  5. 0 / 52.  5. 0
      libavfilter   1. 80. 0 /  1. 80. 0
      libswscale    0. 14. 1 /  0. 14. 1
      libpostproc  51.  2. 0 / 51.  2. 0
    Input #0, aiff, from 'click_24db.aif':
      Duration: 00:00:00.01, start: 0.000000, bitrate: 1570 kb/s
        Stream #0.0: Audio: pcm_s16be, 44100 Hz, 2 channels, s16, 1411 kb/s
    Output #0, mp3, to '/Users/xyz/foobar/wheel_click.mp3':
      Metadata:
        TSSE            : Lavf52.111.0
        Stream #0.0: Audio: libmp3lame, 44100 Hz, 2 channels, s16, 64 kb/s
    Stream mapping:
      Stream #0.0 -> #0.0
    Press [q] to stop, [?] for help
    size=       1kB time=00:00:00.05 bitrate=  92.9kbits/s    
    video:0kB audio:0kB global headers:0kB muxing overhead 45.563549%
    

The resulting file plays fine in iTunes, does not play in VLC, and crashes when loaded with android.media.MediaPlayer (note: I first tried it with the SoundPool lib, with both mp3 and ogg, but that didn't work either).

I also tried the following parameters, which didn't work:

    ffmpeg -i inputfile.aif -f mp3 -acodec libmp3lame -ab 192000 -ar 44100 outputfile.mp3
    

I'm working on OS X and built ffmpeg with MacPorts today; the Android API level is 7 (Google API, 2.1-update1). Looking at the "supported formats" table on dev.android didn't indicate my file to be out of spec, but I may be mistaken about that.

I don't have the slightest clue regarding bitrates and so on, so could anybody please point me to the right combination of ffmpeg parameters to get a working mp3 for Android? I don't care whether the resulting file is mp3, ogg, 3gp, or whatever.
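One thing worth trying is pinning the output to a combination the Android supported-media-formats table lists explicitly. The sketch below is untested and assumes an ffmpeg build with libmp3lame; it forces a constant bitrate, sample rate, and channel count rather than letting the muxer pick defaults. Note also that the console log above shows the input duration as only 00:00:00.01, so the output is roughly 1 kB; a file that short may itself be part of the problem, independent of the encoding parameters.

```shell
# Sketch (untested; assumes ffmpeg with libmp3lame): force CBR 128 kb/s,
# 44.1 kHz stereo, values from the Android supported-formats table.
in="click_24db.aif"
cmd="ffmpeg -i $in -acodec libmp3lame -ab 128k -ar 44100 -ac 2 wheel_click.mp3"
echo "$cmd"
```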