Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • MediaPlayer miscalculates audio duration. getCurrentPosition() goes back when audio reaches the end

    26 March 2017, by Kacy

    I'm streaming .ogg files from my server. The MediaPlayer plays the song to completion, but when it reaches the end of the song, getCurrentPosition() returns a time less than the actual duration of the song. For 1 song it's only off by a second, but for another song it's off by 13 seconds.

    I have a SeekBar that's updated every second by calling getCurrentPosition(), and once the bar reaches the end, it actually jumps back a few seconds. I also can no longer seekTo() any point in the segment of the song past the miscalculated duration, because I receive the error: Attempt to seek to past end of file.

    Chrome's HTML video element likewise displays the incorrect durations for these songs.

    Other tools, such as ffmpeg and a Windows 10 app called PowerMediaPlayer, both calculate the songs' durations correctly.

    I believe this problem is related to this answer, but the solution assumes ffmpeg also miscalculates the duration, which is not the case here. Using the -vn flag when converting audio files to ogg didn't change anything.

    What are my options? Is there a way to get the MediaPlayer to calculate the correct duration?

    Update:

    Converting to mp3 has no issues (although I used audioBitrate('96k') instead of audioQuality(2)), but I require a solution that gets ogg files working. Below is how I'm using ffmpeg to perform the conversion. Another thing worth noting: when I used 'vorbis' instead of 'libvorbis' as the codec, the durations were off by less than a second, if at all. However, I must use 'libvorbis', because 'vorbis' is experimental and produces significantly inconsistent bitrates across different files given the same audio quality parameter. (It also ignores audioBitrate() entirely.)

    ffmpeg( filepath )
        .toFormat( 'ogg' )
        .audioCodec( 'libvorbis' )
        .audioQuality( 2 )
        .output( destination )
        .run();
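    A player-side workaround worth considering (a sketch only, with illustrative names; it does not repair the container metadata): clamp both the SeekBar position and any seekTo() target to the duration MediaPlayer itself reports, so the bar never jumps backwards at the end and seeks never land past the reported end of file. The helper below is plain Java; the MediaPlayer calls are left out so the clamping logic stands alone.

    ```java
    // Sketch of a clamp for SeekBar updates and seek targets, assuming the
    // reported duration may be shorter than the true playback position near
    // the end of a stream (as described above). Names are illustrative.
    public final class PositionClamp {
        private PositionClamp() {}

        // Never let the displayed position exceed the reported duration,
        // and never let the bar move backwards once playback has passed a point.
        public static int clampPosition(int currentPositionMs, int durationMs, int lastShownMs) {
            int clamped = Math.min(currentPositionMs, durationMs);
            return Math.max(clamped, Math.min(lastShownMs, durationMs));
        }

        // Keep seek targets strictly inside the range MediaPlayer accepts,
        // to avoid "Attempt to seek to past end of file".
        public static int clampSeekTarget(int targetMs, int durationMs) {
            return Math.max(0, Math.min(targetMs, durationMs));
        }
    }
    ```

    This only hides the symptom; the last few seconds past the miscalculated duration still cannot be reached by seeking.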
    
  • How can I find video rotation and rotate the clip accordingly using moviepy?

    26 March 2017, by Gloin

    I'm using moviepy to import some videos, but the videos that should be in portrait mode are imported in landscape. I need to check whether the rotation has been changed, and if it has, rotate it back.

    Is this functionality built into moviepy? If not, how else can I check it?

  • avformat_write_header is not working properly in FFmpeg

    26 March 2017, by ganesh kumar

    I was working on an MP4 file creation project using FFmpeg. I tried to convert the stream information of the video packets based on FFmpeg's muxing/remuxing example code, but the header gets damaged during the conversion, so the resulting file is corrupted.

    /* this code sets the stream information */

    AVFormatContext *input_context, *output_context;
    AVDictionary *opt = NULL;
    AVStream *out_stream;
    AVCodecContext *newcontext = NULL;
    double time_base, duration;

    out_stream = avformat_new_stream(output_context, NULL);
    if (!out_stream) {
        fprintf(stderr, "Failed allocating output stream\n");
        ret = AVERROR_UNKNOWN;
        return;
    }

    /* fill a fresh codec context from the input stream's parameters */
    newcontext = avcodec_alloc_context3(codec);
    newcontext->codec_id             = Output_fmt->video_codec;
    newcontext->bit_rate             = in_stream->codec->bit_rate;
    newcontext->width                = in_stream->codec->width;
    newcontext->height               = in_stream->codec->height;
    newcontext->timecode_frame_start = in_stream->codec->timecode_frame_start;
    newcontext->gop_size             = in_stream->codec->gop_size;
    newcontext->profile              = in_stream->codec->profile;
    newcontext->level                = in_stream->codec->level;
    newcontext->pix_fmt              = PIX_FMT_YUV420P;
    newcontext->frame_size           = in_stream->codec->frame_size;
    newcontext->sample_fmt           = in_stream->codec->sample_fmt;
    newcontext->sample_rate          = in_stream->codec->sample_rate;

    time_base = (double)in_stream->time_base.num / (double)in_stream->time_base.den;
    duration  = (double)in_stream->duration * time_base * 1000.0;

    ret = avcodec_copy_context(out_stream->codec, newcontext);
    if (ret < 0) {
        fprintf(stderr, "Failed to copy context from input to output stream codec context\n");
        goto end;
    }
    out_stream->codec->codec_tag = 0;
    if (output_context->oformat->flags & AVFMT_GLOBALHEADER)
        out_stream->codec->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;
    

    Changed the header information using: /* this code sets the metadata */

    av_dict_set(&opt, "major_brand", "mp42", 0);
    av_dict_set(&opt, "minor_version", "512", 0);
    av_dict_set(&opt, "compatible_brands", "isomiso2avc1mp41", 0);
    av_dict_set(&opt, "comment", "Hash=855738390", 0);
    output_context->metadata = opt;
    ret = avformat_write_header(output_context, NULL);
    

    After creating the mp4 file, I checked it with ffmpeg in a terminal and got an error like this:

    /* the error message */

    [mpeg4 @ 0x7ff2b9811c00] header damaged
        Last message repeated 39 times
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7ff2ba800000] decoding for stream 0 failed
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7ff2ba800000] Could not find codec parameters for stream 0 (Video: mpeg4 (mp4v / 0x7634706D), none, 376 kb/s): unspecified size
    Consider increasing the value for the 'analyzeduration' and 'probesize' options

  • FFmpeg gives an error while extracting MP3 from a video

    26 March 2017, by Urvashi Patel

    I want to execute an FFmpeg command to extract audio in MP3 format from a video that is in the app's assets folder. For more details, see http://writingminds.github.io/ffmpeg-android-java/

    String sourceFilePath = "file:///android_asset/SampleVideo.mp4";
    destFilePath = mp3File.getAbsolutePath();
    
    FFmpeg ffmpeg = FFmpeg.getInstance(MainActivity.this);
        try {
    
            String cmd[] = new String[]{"/data/data/com.example.bansi_vasundhara.videotomp3_demo/files/ffmpeg", "-i", sourceFilePath, "-vn", "-acodec copy", destFilePath};
    
            ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {
    
                @Override
                public void onStart() {
                    Log.e("gc", "Command Started");
                }
    
                @Override
                public void onProgress(String message) {
                    Log.e("gc", "onProgress" + message);
                }
    
                @Override
                public void onFailure(String message) {
                    Log.e("gc", "onFailure command" + message);
                }
    
                @Override
                public void onSuccess(String message) {
                    Log.e("gc", "onSuccess command" + message);
                }
    
                @Override
                public void onFinish() {
                    Log.e("gc", "onFinish command");
                }
            });
    
        } catch (FFmpegCommandAlreadyRunningException e) {
            // Handle if FFmpeg is already running
            e.printStackTrace();
        }
    

    But I get the error below. Please suggest a solution.

    java.io.IOException: Error running exec(). Command: [/data/data/com.example.bansi_vasundhara.videotomp3_demo/files/ffmpeg, /data/data/com.example.bansi_vasundhara.videotomp3_demo/files/ffmpeg, -i, file:///android_asset/SampleVideo.mp4, -vn, -acodec copy, /storage/sdcard0/VideoToMP3/SampleMP3.mp3]
    Working Directory: null Environment: null
        at java.lang.ProcessManager.exec(ProcessManager.java:211)
        at java.lang.Runtime.exec(Runtime.java:173)
        at java.lang.Runtime.exec(Runtime.java:128)
        at com.github.hiteshsondhi88.libffmpeg.ShellCommand.run(ShellCommand.java:10)
        at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:38)
        at com.github.hiteshsondhi88.libffmpeg.FFmpegExecuteAsyncTask.doInBackground(FFmpegExecuteAsyncTask.java:10)
        at android.os.AsyncTask$2.call(AsyncTask.java:288)
        at java.util.concurrent.FutureTask.run(FutureTask.java:237)
        at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:231)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587)
        at java.lang.Thread.run(Thread.java:818)
    Caused by: java.io.IOException: No such file or directory
        at java.lang.ProcessManager.exec(Native Method)
        at java.lang.ProcessManager.exec(ProcessManager.java:209)
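    Two details stand out in the logged command: the ffmpeg binary path appears twice (suggesting the library prepends it itself, so it should not be passed again in the argument array), and "-acodec copy" is a single token, whereas exec() needs the flag and its value as separate array elements. Below is a sketch of a corrected argument array under those two assumptions; it is not a verified fix for this exact library version.

    ```java
    // Sketch of a corrected argument array for the call above, assuming
    // the ffmpeg-android-java library prepends the binary path itself and
    // that each flag/value must be its own element ("-acodec copy" as one
    // token is not split for you).
    public class FfmpegCommand {
        public static String[] buildExtractAudioCommand(String src, String dest) {
            return new String[]{
                "-i", src,           // input video
                "-vn",               // drop the video stream
                "-acodec", "copy",   // separate elements, not "-acodec copy"
                dest                 // output file
            };
        }
    }
    ```

    Two further caveats: a native ffmpeg binary cannot read file:///android_asset/ URLs, so the asset would first have to be copied to the filesystem, and "-acodec copy" into an .mp3 file only works when the source audio stream is already MP3.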

  • Android JavaCV FFmpeg webstream to local static website

    26 March 2017, by Thomas Devoogdt

    For my integration test I'm working on an application that needs to provide a live stream to a locally hosted website. I've already built a working site that runs on nanohttpd. The application also performs special image processing, for which I use JavaCV. The library is working perfectly, and all the cpp bindings are working too.

    My question: how can I set up a live stream that can be played directly in a static site hosted by nanohttpd? Am I on the right track?

    My code:

    init:

    private void initLiveStream() throws FrameRecorder.Exception {
        /* ~~~ https://github.com/bytedeco/javacv/issues/598 ~~~ */
        frameRecorder = new FFmpegFrameRecorder("http://localhost:9090", imageWidth, imageHeight, 0);
        frameRecorder.setVideoOption("preset", "ultrafast");
        frameRecorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
        frameRecorder.setAudioCodec(0);
        frameRecorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
        frameRecorder.setFormat("webm");
        frameRecorder.setGopSize(10);
        frameRecorder.setFrameRate(frameRate);
        frameRecorder.setVideoBitrate(5000);
        frameRecorder.setOption("content_type","video/webm");
        frameRecorder.setOption("listen", "1");
        frameRecorder.start();
    }
    

    In my CameraView:

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        Camera.Size size = camera.getParameters().getPreviewSize();
        Frame frame = new AndroidFrameConverter().convert(data, size.width, size.height);
        try {
             if(frameRecorder!=null){
                 frameRecorder.record(frame);
             }
         } catch (FrameRecorder.Exception e) {
             e.printStackTrace();
         }
     }
    

    Here is one of the stack traces that appeared frequently during my search for a solution:

    org.bytedeco.javacv.FrameRecorder$Exception: avio_open error() error -111: Could not open 'http://localhost:9090'
    

    I couldn't find any other thread addressing this specific issue.

    Thanks in advance

    EDIT

    Thanks to Chester Cobus, here is the code I used:

    Websocket:

    //Constructor
    AsyncHttpServer serverStream = new AsyncHttpServer();
    List<WebSocket> sockets = new ArrayList<>();
    
    //http://stackoverflow.com/a/33021907/5500092
    //I'm planning to use more sockets. This is the only uniform expression I found.
    serverStream.websocket("/((?:[^/]*/)*)(.*)", new AsyncHttpServer.WebSocketRequestCallback() {
         @Override
         public void onConnected(final WebSocket webSocket, AsyncHttpServerRequest request) {
             String uri = request.getPath();
             if (uri.equals("/live")) {
                 sockets.add(webSocket);
    
                 //Use this to clean up any references to your websocket
                 webSocket.setClosedCallback(new CompletedCallback() {
                     @Override
                     public void onCompleted(Exception ex) {
                         try {
                             if (ex != null)
                                 Log.e("WebSocket", "Error");
                         } finally {
                             sockets.remove(webSocket);
                         }
                     }
                 });
             }
         }
    });
    
    //Updater (Observer pattern)
    @Override
    public void updated(byte[] data) {
        for (WebSocket socket : sockets) {
             socket.write(new ByteBufferList(data));
        }
    }
    

    Record Activity

    private long start_time = System.currentTimeMillis();
    
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        long now_time = System.currentTimeMillis();
        if ((now_time - start_time) > 250) {
            start_time = now_time;
            //https://forums.xamarin.com/discussion/40991/onpreviewframe-issue-converting-preview-byte-to-android-graphics-bitmap
            Camera.Size size = camera.getParameters().getPreviewSize();
            YuvImage image = new YuvImage(data, ImageFormat.NV21, size.width, size.height, null);
            ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
            image.compressToJpeg(new Rect(0, 0, size.width, size.height), 60, byteArrayOutputStream);
            MainActivity.getWebStreamer().updated(byteArrayOutputStream.toByteArray());
        }
    }
    

    JavaScript

    var socket;
    var imageElement;
    
    /**
     * path - String.Format("ws://{0}:8090/live", Window.Location.HostName)
     * image - HTMLImageElement
     */
    function imageStreamer(path, image) {
        imageElement = image;
        socket = new WebSocket(path);
    
        socket.onmessage = function(msg) {
            var arrayBuffer = msg.data;
            var reader = new FileReader();
            reader.onload = function(e) {
                imageElement.src = e.target.result;
            };
            reader.readAsDataURL(arrayBuffer);
        };
    }