Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • Adding a bitmap over video using ffmpeg or MediaMuxer for android

    12 September 2014, by Alin

    I am stuck on a task in an area I am not at all comfortable working in.

    Here is what I did so far:

    • Made an Ubuntu VirtualBox machine
    • Downloaded latest ffmpeg version which is 2.3.3
    • Compiled ffmpeg to be compatible with armv7-a, so in the end I get two folders, include and lib: include holds the headers and lib holds the *.so files (just as in http://www.roman10.net/how-to-build-ffmpeg-with-ndk-r9/)


    I have created a new Android project and made a jni folder, and this is how far I have got... Even this, with all the struggle of being new to Linux and to compiling, took me almost a week.

    Adding a watermark in ffmpeg is, I believe, done with libavfilter? I still have to dig into this, but the original ffmpeg command I need to translate into my project is:

    ffmpeg -i input.avi -i logo.png -filter_complex 'overlay=10:main_h-overlay_h-10' output.avi
    

    As far as I can tell from what I have studied so far, inside jni I need to:

    • create an add-watermark.c file in which I somehow call the function that applies the overlay filter
    • create an Android.mk to build it and link the needed ffmpeg libraries

      LOCAL_PATH := $(call my-dir)

      include $(CLEAR_VARS)

      LOCAL_MODULE := add-watermark
      LOCAL_SRC_FILES := add-watermark.c
      LOCAL_LDLIBS := -llog -ljnigraphics -lz
      LOCAL_SHARED_LIBRARIES := libavformat libavcodec libswscale libavutil

      include $(BUILD_SHARED_LIBRARY)

      $(call import-module,ffmpeg-2.3.3/android/armv7-a)

    • create Application.mk

      APP_ABI := armeabi-v7a
      APP_PLATFORM := android-8

    • run ndk-build and use the generated libraries in my android project.
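
    For that last step, here is a minimal sketch of how the generated libraries might be loaded from the Android side. The class and method names are hypothetical, and it assumes the build produced the separate libav*.so files listed in the Android.mk above:

        public class AddWatermark {
            static {
                // Dependencies have to be loaded before the libraries that use them.
                System.loadLibrary("avutil");
                System.loadLibrary("avcodec");
                System.loadLibrary("avformat");
                System.loadLibrary("swscale");
                // The module built by the Android.mk above (libadd-watermark.so).
                System.loadLibrary("add-watermark");
            }

            // Implemented in add-watermark.c; the signature is only an assumption.
            public static native int addWatermark(String inputPath, String logoPath, String outputPath);
        }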

    I really need help to continue, so every answer will be received with great attention and pleasure.

    Later edit: Would it be possible to somehow build ffmpeg.exe as a library and call its main with the exact same parameters as the original exe? I do not want to run ffmpeg as a standalone executable, but have it integrated within the project. Something like http://www.roman10.net/how-to-port-ffmpeg-the-program-to-androidideas-and-thoughts/ What downsides would this approach have?
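
    If that route is taken, the Java side could look roughly like the sketch below. It assumes ffmpeg.c has been compiled into a shared library with its main() renamed and exposed through a JNI function, which does not exist out of the box, and every name here is hypothetical:

        public class FfmpegWrapper {
            static {
                System.loadLibrary("ffmpegmain"); // hypothetical library wrapping ffmpeg.c
            }

            // JNI entry point that forwards argv to the renamed ffmpeg main().
            private static native int nativeRun(String[] args);

            public static int addOverlay(String input, String logo, String output) {
                // Exactly the same parameters as the command line invocation above.
                return nativeRun(new String[] {
                        "ffmpeg", "-i", input, "-i", logo,
                        "-filter_complex", "overlay=10:main_h-overlay_h-10",
                        output
                });
            }
        }

    A known downside of this approach is that ffmpeg's main() calls exit() on errors and relies on global state, so it typically cannot be invoked more than once per process without additional cleanup work.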

    Later edit 2: If this is possible using MediaMuxer or other APIs added in Android 4.3, I am open to it if sample code is provided. I did look over the MediaCodec and MediaMuxer samples, and also Grafika, and have not found a proper way to do what I wanted. I still prefer the ffmpeg approach if it works.

  • Android + ffmpeg + AudioTrack produces bad audio output

    12 September 2014, by Goddchen

    Here is what I am trying to do: use an AudioRecord and "pipe" the output of AudioRecord.read(byte[], ...) to an ffmpeg process's stdin, which will convert it to a 3gp (AAC) file.

    The ffmpeg call is as follows:

            ProcessBuilder processBuilder = new ProcessBuilder(BINARY.getAbsolutePath(),
                    "-y",
                    "-ar", "44100", "-c:a", "pcm_s16le", "-ac", "1","-f","s16le",
                    "-i", "-",
                    "-strict", "-2", "-c:a", "aac",
                    outFile.getAbsolutePath());
    

    The AudioRecord is setup as follows:

     AudioRecord record = new AudioRecord(/*AudioSource.VOICE_RECOGNITION,*/ AudioSource.MIC,
                SAMPLING_RATE,
                AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT,
                bufferSize);
    

    SAMPLING_RATE is 44100, and bufferSize is the value returned by AudioRecord.getMinBufferSize(...).
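
    For reference, that value would be obtained with parameters matching the AudioRecord above (a standard Android API call, shown only for completeness):

        int bufferSize = AudioRecord.getMinBufferSize(SAMPLING_RATE,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);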

    I am writing the data to ffmpeg like this:

    try {
        IOUtils.write(data, getFFmpegHelper().getCurrentProcessOutputStream());
    } catch (Exception e) {
        Log.e(Application.LOG_TAG, "Error writing data to ffmpeg process", e);
        // TODO notify user, stop the recording, etc...
    }
    

    So far so good: ffmpeg runs and creates a proper 3gp file. But the audio in the file is totally off. It sounds "choppy" (not sure if that is the correct English word ;) ), and the pace is also wrong: it plays too fast.
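
    For illustration, here is a minimal sketch of the kind of capture loop that feeds the process (helper names are taken from the snippets above, the isRecording flag is hypothetical, and exception handling is omitted). Note that AudioRecord.read returns the number of bytes actually read, and only that many should be forwarded:

        byte[] buffer = new byte[bufferSize];
        OutputStream ffmpegStdin = getFFmpegHelper().getCurrentProcessOutputStream();
        while (isRecording) {
            int bytesRead = record.read(buffer, 0, buffer.length);
            if (bytesRead > 0) {
                // Forward only the bytes that were actually captured.
                ffmpegStdin.write(buffer, 0, bytesRead);
            }
        }
        ffmpegStdin.flush();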

    Check out this sample: http://goddchen.de/android/tmp/tmp.3gp

    This is the output of the ffmpeg process:

        [s16le @ 0x23634d0] Estimating duration from bitrate, this may be inaccurate
        Guessed Channel Layout for  Input Stream #0.0 : mono
        Input #0, s16le, from 'pipe:':
        Duration: N/A, start: 0.000000, bitrate: 705 kb/s
        Stream #0:0: Audio: pcm_s16le, 44100 Hz, mono, s16, 705 kb/s
        [aformat @ 0x2363100] auto-inserting filter 'auto-inserted resampler 0' between the filter 'src' and the filter 'aformat'
        [aresample @ 0x235b0a0] chl:mono fmt:s16 r:44100Hz -> chl:mono fmt:flt r:44100Hz
        Output #0, 3gp, to '/data/data/com.test.audio/files/tmp.3gp':
        Metadata:
        encoder         : Lavf54.6.100
        Stream #0:0: Audio: aac (mp4a / 0x6134706D), 44100 Hz, mono, flt, 128 kb/s
        Stream mapping:
        Stream #0:0 -> #0:0 (pcm_s16le -> aac)
        size=       3kB time=00:00:00.18 bitrate= 132.5kbits/s    
        size=       8kB time=00:00:00.55 bitrate= 120.9kbits/s
        size=      12kB time=00:00:00.83 bitrate= 121.8kbits/s
        size=      16kB time=00:00:01.04 bitrate= 122.8kbits/s
        size=      20kB time=00:00:01.32 bitrate= 122.5kbits/s
        size=      23kB time=00:00:01.53 bitrate= 121.6kbits/s
        size=      27kB time=00:00:01.81 bitrate= 121.0kbits/s
        size=      31kB time=00:00:02.11 bitrate= 120.7kbits/s
        size=      35kB time=00:00:02.32 bitrate= 123.4kbits/s
        video:0kB audio:34kB global headers:0kB muxing overhead 3.031610%
    
  • Too large apk file when using .so files

    12 September 2014, by Arslan

    I am using a 3rd-party camera library API that uses the ffmpeg frame recorder and .so files along with javacv.

    I am also using a 3rd-party library for extracting metadata from a video, which also has some .so files.

    When I merged all these files into only one folder, "armeabi", my application did not work, so I had to copy all of them to the other folders as well: "armeabi-v7a", "mips", "x86". That works perfectly fine, but obviously the size of the apk is now too large.

    The total size of these files is 20MB per folder, which makes 80MB for all four folders, and my .apk size is 41MB. Please suggest what I can do to remove duplicate files/folders or reduce the .apk size.

    Why do I need all these folders: armeabi, armeabi-v7a, mips, x86?
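
    For what it is worth, the ABIs a given device actually reports can be checked at runtime with the standard android.os.Build fields (a small diagnostic sketch, not a fix by itself):

        // Log the primary and secondary ABI the device supports,
        // which determines which lib folder the installer picks.
        Log.d("ABI", "Primary: " + Build.CPU_ABI + ", secondary: " + Build.CPU_ABI2);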

    (Screenshot: the .so files in the folder.)

  • How to stream live from the iPhone camera to a server using RTSP?

    12 September 2014, by Vishal Gupta

    I am using AVCaptureSession to record video and audio of the user. I am getting the real-time video and audio streams independently, and I am able to encode them with an H.264 encoder and an AAC encoder respectively. What I am not getting is how to multiplex the two together and make a stream, and how to send it to a specific server URL that is protected by a username and password. If it can be done using RTMP instead, that is also fine.

    I have taken a reference from here, but I am not getting much out of it.

    Is there any RTSP library project that could help me?

    I have been struggling with this for a long time.

    Is there any solution to my problem ?

    Thanks in advance.

  • Video rotated using Android ffmpeg Library

    12 September 2014, by Adroid Freak

    I'm using the Android ffmpeg library. The library is working fine; the only issue I'm having is that the output video is rotated. The original video is rotated by 90 degrees, and I tried many options by adding them to the library function below. The commented-out code is a sample of what I tried; I tried many other options as well, but none of them works.

    Have any of you guys got it to work?

        public void processVideo(Clip in, Clip out, boolean enableExperimental, ShellCallback sc) throws Exception {
    
            ArrayList<String> cmd = new ArrayList<String>();
    
            cmd.add(mFfmpegBin);
            cmd.add("-y");
    
            if (in.format != null)
            {
                cmd.add(Argument.FORMAT);
                cmd.add(in.format);
            }
    
            if (in.videoCodec != null)
            {
                cmd.add(Argument.VIDEOCODEC);
                cmd.add(in.videoCodec);
            }
    
            if (in.audioCodec != null)
            {
                cmd.add(Argument.AUDIOCODEC);
                cmd.add(in.audioCodec);
            }
    
            cmd.add("-i");
            cmd.add(new File(in.path).getCanonicalPath());
    
            if (out.videoBitrate > 0)
            {
                cmd.add(Argument.BITRATE_VIDEO);
                cmd.add(out.videoBitrate + "k");
            }
    
            if (out.width > 0)
            {
                cmd.add(Argument.SIZE);
                cmd.add(out.width + "x" + out.height);
    
            }
            if (out.videoFps != null)
            {
                cmd.add(Argument.FRAMERATE);
                cmd.add(out.videoFps);
            }
    
            if (out.videoCodec != null)
            {
                cmd.add(Argument.VIDEOCODEC);
                cmd.add(out.videoCodec);
            }
    
            if (out.videoBitStreamFilter != null)
            {
                cmd.add(Argument.VIDEOBITSTREAMFILTER);
                cmd.add(out.videoBitStreamFilter);
            }
    
    
            if (out.videoFilter != null)
            {
                cmd.add("-vf");
                cmd.add(out.videoFilter);
            }
    
            if (out.audioCodec != null)
            {
                cmd.add(Argument.AUDIOCODEC);
                cmd.add(out.audioCodec);
            }
    
            if (out.audioBitStreamFilter != null)
            {
                cmd.add(Argument.AUDIOBITSTREAMFILTER);
                cmd.add(out.audioBitStreamFilter);
            }
            if (out.audioChannels > 0)
            {
                cmd.add(Argument.CHANNELS_AUDIO);
                cmd.add(out.audioChannels+"");
            }
    
            if (out.audioBitrate > 0)
            {
                cmd.add(Argument.BITRATE_AUDIO);
                cmd.add(out.audioBitrate + "k");
            }
    
            if (out.format != null)
            {
                cmd.add("-f");
                cmd.add(out.format);
            }
    
            if (enableExperimental)
            {
                cmd.add("-strict");
                cmd.add("-2");//experimental
            }
    //      cmd.add("-metadata:s:v rotate=\"0\"");
    //      cmd.add("rotate=\"0\"");
    
            cmd.add(new File(out.path).getCanonicalPath());
    
            execFFMPEG(cmd, sc);
    
        }
    

    This is how I'm using it, by the way:

    FfmpegController fc = new FfmpegController(this, fileTmp);
    
    clip_out.videoFps = "30";
    clip_out.width = 480;
    clip_out.height = 320;
    clip_out.videoCodec = "mpeg4";
    clip_out.audioCodec = "copy";
    clip_out.videoBitrate = 450;
    
         fc.processVideo(clip_in, clip_out, false, new ShellUtils.ShellCallback()
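
    Since processVideo already forwards out.videoFilter to ffmpeg's -vf option, one sketch that may be worth trying (assuming the Clip class exposes the videoFilter field the function above reads) is to physically rotate the frames with a transpose filter instead of touching the rotate metadata tag:

        // Sketch only: transpose=1 rotates 90 degrees clockwise, transpose=2 counter-clockwise.
        clip_out.videoFilter = "transpose=1";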