Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • Cropping video with FFmpeg increases the tbn value too much

5 April 2016, by TOP

    Here is the information of original video:

     Metadata:
        major_brand     : mp42
        minor_version   : 0
        compatible_brands: isommp42
        creation_time   : 2016-04-05 03:00:09
      Duration: 00:01:50.09, start: 0.000000, bitrate: 8131 kb/s
        Stream #0:0(eng): Video: h264 (Baseline) (avc1 / 0x31637661), yuv420p(tv, bt470bg/bt470bg/smpte170m), 1920x1080, 7995 kb/s, SAR 1:1 DAR 16:9, 44.49 fps, 90k tbr, 90k tbn, 180k tbc (default)
    

    Then I used this ffmpeg command to crop video:

    ffmpeg -i file.mp4 -vf "crop=480:480:0:0" -b:v 2048k -preset ultrafast cropped.mp4
    

    Here is the information of cropped video:

     Metadata:
        major_brand     : isom
        minor_version   : 512
        compatible_brands: isomiso2avc1mp41
        encoder         : Lavf57.28.101
      Duration: 00:01:50.16, start: 0.023220, bitrate: 1078 kb/s
        Stream #0:0(eng): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p, 1282x716 [SAR 1:1 DAR 641:358], 1002 kb/s, 44.49 fps, 44.49 tbr, 220455000.00 tbn, 88.98 tbc (default)
    

    The default video player of my phone cannot play this video. If I use MX Player I have to change the decoder to Software decoder (instead of Hardware) to open it.

    I noticed that the tbn value increased after re-encoding: the old value is 90k, the new value is 220455k. Maybe that is why the default video player doesn't work.

    Question: why is the tbn value so large, and how can I avoid it?
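    One commonly suggested workaround (not part of the original question) is to pin the MP4 track timescale explicitly with the mov/mp4 muxer option `-video_track_timescale`, so the muxer does not pick an oversized tbn on its own. A minimal sketch, reusing the filenames and filter from the question and assuming 90000 to match the source file's 90k tbn:

    ```shell
    # Sketch: same crop as in the question, but force the MP4 track timescale
    # (tbn) to 90000 instead of letting the muxer choose a very large value.
    ffmpeg -i file.mp4 -vf "crop=480:480:0:0" -b:v 2048k -preset ultrafast \
           -video_track_timescale 90000 cropped.mp4
    ```

    Whether this also fixes the phone's hardware decoder is device-dependent, but a sane timescale removes one common source of player incompatibility.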

  • PHP imagick equivalent of -define png:color-type=6

    5 April 2016, by user1661677

    I need to save my PNG files with a different color type so that ffmpeg can process them correctly. I'm using the imagick PHP extension for ImageMagick, and I'm trying to figure out how to implement the following command-line option in imagick PHP:

    -define png:color-type=6
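    For reference, this is the full command-line form the question is abbreviating; `convert` is the ImageMagick CLI, and the filenames here are placeholders:

    ```shell
    # Write the PNG with color type 6 (truecolor with alpha) so downstream
    # tools such as ffmpeg read a consistent pixel format.
    convert in.png -define png:color-type=6 out.png
    ```

    On the PHP side, the usual analogue is setting the option on the Imagick object before writing (e.g. via `Imagick::setOption()` with key `png:color-type` and value `6`), though how a given imagick version honors PNG defines is worth verifying against its documentation.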

  • Cross-fading 2 video segments using ffmpeg when the number of frames is not known beforehand

    5 April 2016, by Hero Roma

    I am a newbie trying to cross-fade 2 videos using ffmpeg, following a Stack Overflow answer. But in my case I do not know the length of the videos or the number of frames. I want the first video to fade out over its last 5 frames and the second video to fade in over its first 5 frames. I can do the fade-in on the second video, but I cannot figure out how to do the fade-out over the last 5 frames when I don't know the duration or frame count.

    The "-i" option is supposed to be able to extract that information, but I can't seem to pipe it to the next block there.

    ffmpeg -i 1.mp4 -i 2.mp4 -f lavfi -i color=black -filter_complex \
    "[0:v]format=pix_fmts=yuva420p,fade=t=out:st=4:d=1:alpha=1,setpts=PTS-STARTPTS[va0];\
    [1:v]format=pix_fmts=yuva420p,fade=t=in:st=0:d=1:alpha=1,setpts=PTS-STARTPTS+4/TB[va1];\
    [2:v]scale=960x720,trim=duration=9[over];\
    [over][va0]overlay[over1];\
    [over1][va1]overlay=format=yuv420[outv]" \
    -vcodec libx264 -map [outv] out.mp4
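    One way to attack the unknown-duration problem (a sketch, not from the question) is to probe the clip's duration with ffprobe first, then compute the fade-out start in the shell before building the filter graph. Here `1.mp4` is the input name from the question, and `FPS=25` is an assumed frame rate; substitute the real one:

    ```shell
    # Probe the clip length, then start a 5-frame fade-out that many seconds in.
    DUR=$(ffprobe -v error -show_entries format=duration \
          -of default=noprint_wrappers=1:nokey=1 1.mp4)
    FPS=25                                   # assumed frame rate
    FADE=$(echo "5 / $FPS" | bc -l)          # fade length = 5 frames, in seconds
    ST=$(echo "$DUR - $FADE" | bc -l)        # fade-out start time
    ffmpeg -i 1.mp4 \
           -vf "format=yuva420p,fade=t=out:st=${ST}:d=${FADE}:alpha=1" out1.mp4
    ```

    The computed `ST` and `FADE` values can then be substituted into the larger filter_complex graph from the question in place of the hard-coded `st=4:d=1`.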
    
  • FFmpeg cannot open video file after adding a GLSurfaceView to render frames

    4 April 2016, by Kyle Lo

    The source code works perfectly without any modification.

    I successfully use the below function to play the specified video.

    playview.openVideoFile("/sdcard/Test/mv.mp4");
    

    For research purposes I need to display the frames using OpenGL ES, so I removed the original rendering method below.

    ANativeWindow* window = ANativeWindow_fromSurface(env, javaSurface);
    
    ANativeWindow_Buffer buffer;
    if (ANativeWindow_lock(window, &buffer, NULL) == 0) {
      memcpy(buffer.bits, pixels,  w * h * 2);
      ANativeWindow_unlockAndPost(window);
    }
    
    ANativeWindow_release(window);
    

    And I added a FrameRenderer class to my project:

    public class FrameRenderer implements GLSurfaceView.Renderer {
    
        public long time = 0;
        public short framerate = 0;
        public long fpsTime = 0;
        public long frameTime = 0;
        public float avgFPS = 0;
        private PlayNative mNative = null;
    
        @Override
        public void onSurfaceCreated(GL10 gl, EGLConfig config) {/*do nothing*/}
    
        @Override
        public void onSurfaceChanged(GL10 gl, int width, int height) {
    
        }
    
        @Override
        public void onDrawFrame(GL10 gl) {
            mNative.render();
        }
    

    On the native side I created a corresponding method in VideoPlay.cpp, and I only use glClearColor to test whether the OpenGL calls work.

    void VideoPlay::render() {
        glClearColor(1.0f, 0.0f, 0.0f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
    }
    

    And my onCreate is as follows:

    protected void onCreate(Bundle savedInstanceState) {
            // TODO Auto-generated method stub
            super.onCreate(savedInstanceState);
            setContentView(R.layout.main_layout);
    
            playview = new PlayView(this);
    
            playview.openVideoFile("/sdcard/test_tt_racing.mp4");
            //playview.openVideoFile("/sdcard/big_buck_bunny.mp4");
    
            GLSurfaceView surface = (GLSurfaceView)findViewById(R.id.surfaceviewclass);
            surface.setRenderer(new FrameRenderer());
            ...
    

    Then I tested it on the phone: the screen turns red, which means the GLSurfaceView and OpenGL work fine.

    But after I press the play button, the whole app gets stuck, and errors show in the log.

    My question is: why can I no longer open the video, whose path is exactly the same as before, just after adding the GLSurfaceView renderer, and how can I fix it?

  • Adding two audio clips and an image to a video file using ffmpeg

    4 April 2016, by Hadi F.

    I'm trying to add two audio clips and a PNG image to my video clip with ffmpeg. I also want to increase the volume of the two audio files in the same command. Here is the command I wrote:

        ffmpeg -i videoclip.mp4 -i logo.png -i audio1.mp3 -i audio2.m4a -filter_complex "[0:v][1:v]overlay=${x}:${y}:enable='between(t,${logoStartTime},${logoEndTime})'; [2:0]volume=${vol}[s3];[s3]adelay=${audioDelay}[s1];[3:0]volume=${vol2}[s4];[s4]adelay=${audioDelay2}[s2];[0:a][s1][s2] amix=inputs=3:duration=longest " -c:v libx264 -shortest out.mp4
    

    Everything works fine in the output video, except that when I play it, audio1.mp3 plays once right at the start and again at the time I specify in adelay. I really don't have any idea how to fix this, and it's driving me crazy! Can anyone help, please?
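    A likely cause (an observation about the adelay filter, not something stated in the question): with a single value, `adelay=N` delays only the first channel of the stream, so the remaining channel of a stereo clip still plays at t=0, which sounds exactly like the clip playing twice. Repeating the value once per channel delays all of them. A simplified sketch with one audio input and placeholder values (3000 ms delay, 2.0 volume):

    ```shell
    # Sketch: adelay=3000|3000 delays BOTH stereo channels by 3 s, instead of
    # only the first channel as adelay=3000 would.
    ffmpeg -i videoclip.mp4 -i audio1.mp3 -filter_complex \
      "[1:a]volume=2.0,adelay=3000|3000[s1];\
       [0:a][s1]amix=inputs=2:duration=longest" \
      -c:v copy out.mp4
    ```

    Applied to the command in the question, that means writing `adelay=${audioDelay}|${audioDelay}` (and likewise for the second clip) so no channel escapes the delay.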