Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • Why doesn't this FFmpeg code create a video from a series of images?

    20 October 2011, by user551117

    I have successfully compiled the FFmpeg library for use in an iOS application. I would like to use it for encoding a video from a series of images, but I can't seem to make it work.

    The following is the code that I am using to encode this video:

    AVCodec *codec;
    AVCodecContext *c= NULL;
    int i, out_size, size, outbuf_size;
    FILE *f;
    AVFrame *picture;
    uint8_t *outbuf;
    
    printf("Video encoding\n");
    
    // find the MPEG-4 video encoder
    codec = avcodec_find_encoder(CODEC_ID_MPEG4);
    if (!codec) {
        fprintf(stderr, "codec not found\n");
        exit(1);
    }
    
    c= avcodec_alloc_context();
    picture= avcodec_alloc_frame();
    
    // put sample parameters
    c->bit_rate = 400000;
    // resolution must be a multiple of two
    c->width = 320;
    c->height = 480;
    // frames per second
    c->time_base = (AVRational){1,25};
    c->gop_size = 10; // emit one intra frame every ten frames
    c->max_b_frames = 1;
    c->pix_fmt = PIX_FMT_YUV420P;
    
    //open it
    if (avcodec_open(c, codec) < 0) {
        fprintf(stderr, "could not open codec\n");
        exit(1);
    }
    
    f = fopen([[NSTemporaryDirectory() stringByAppendingPathComponent:filename] UTF8String], "w");
    if (!f) {
        fprintf(stderr, "could not open %s\n",[filename UTF8String]);
        exit(1);
    }
    
    // alloc image and output buffer
    outbuf_size = 100000;
    outbuf = malloc(outbuf_size);
    size = c->width * c->height;
    
    #pragma mark -
    AVFrame* outpic = avcodec_alloc_frame();
    int nbytes = avpicture_get_size(PIX_FMT_YUV420P, c->width, c->height);
    
    //create buffer for the output image
    uint8_t* outbuffer = (uint8_t*)av_malloc(nbytes);
    
    #pragma mark -  
    for(i=1;i<48;i++) {
        fflush(stdout);
    
        int numBytes = avpicture_get_size(PIX_FMT_YUV420P, c->width, c->height);
        uint8_t *buffer = (uint8_t *)av_malloc(numBytes*sizeof(uint8_t));
    
        UIImage *image = [UIImage imageNamed:[NSString stringWithFormat:@"%d.png", i]];
        CGImageRef newCgImage = [image CGImage];
    
        CGDataProviderRef dataProvider = CGImageGetDataProvider(newCgImage);
        CFDataRef bitmapData = CGDataProviderCopyData(dataProvider);
        buffer = (uint8_t *)CFDataGetBytePtr(bitmapData);   
    
        avpicture_fill((AVPicture*)picture, buffer, PIX_FMT_RGB24, c->width, c->height);
        avpicture_fill((AVPicture*)outpic, outbuffer, PIX_FMT_YUV420P, c->width, c->height);
    
        struct SwsContext* fooContext = sws_getContext(c->width, c->height, 
                                                       PIX_FMT_RGB24, 
                                                       c->width, c->height, 
                                                       PIX_FMT_YUV420P, 
                                                       SWS_FAST_BILINEAR, NULL, NULL, NULL);
    
        // perform the RGB -> YUV conversion (this is where I try to convert to YUV)
        sws_scale(fooContext, picture->data, picture->linesize, 0, c->height, outpic->data, outpic->linesize);
    
        // encode the image
    
        out_size = avcodec_encode_video(c, outbuf, outbuf_size, outpic);
        printf("encoding frame %3d (size=%5d)\n", i, out_size);
        fwrite(outbuf, 1, out_size, f);
    
        free(buffer);
        buffer = NULL;      
    
    }
    
    // get the delayed frames
    for(; out_size; i++) {
        fflush(stdout);
    
        out_size = avcodec_encode_video(c, outbuf, outbuf_size, NULL);
        printf("write frame %3d (size=%5d)\n", i, out_size);
        fwrite(outbuf, 1, outbuf_size, f);      
    }
    
    // add sequence end code to have a real mpeg file
    outbuf[0] = 0x00;
    outbuf[1] = 0x00;
    outbuf[2] = 0x01;
    outbuf[3] = 0xb7;
    fwrite(outbuf, 1, 4, f);
    fclose(f);
    free(outbuf);
    
    avcodec_close(c);
    av_free(c);
    av_free(picture);
    printf("\n");
    

    What could be wrong with this code?
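
    Not so much an answer as a comparison point: below is a minimal per-frame sketch using the same deprecated API as the code above (avcodec_encode_video, avpicture_fill), under the assumption that the CGImage data is 4-byte RGBA rather than the 3-byte RGB24 the original passes to swscale. The source stride, the release of the CFData, and the reuse of a single SwsContext are the details most likely to matter; PIX_FMT_RGBA and CGImageGetBytesPerRow are assumptions about the build and the source images, not something stated in the question.

    // Sketch only: assumes 32-bit RGBA CGImage data and the same old-style FFmpeg API as above.
    // Create the scaler once, outside the loop, and reuse it for every frame.
    struct SwsContext *sws = sws_getContext(c->width, c->height, PIX_FMT_RGBA,
                                            c->width, c->height, PIX_FMT_YUV420P,
                                            SWS_FAST_BILINEAR, NULL, NULL, NULL);

    for (i = 1; i < 48; i++) {
        UIImage *image = [UIImage imageNamed:[NSString stringWithFormat:@"%d.png", i]];
        CGImageRef cgImage = [image CGImage];
        CFDataRef bitmapData = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));

        // Describe the source frame explicitly: RGBA pixels, with the stride taken from
        // the CGImage rather than the width * 3 that avpicture_fill() computes for RGB24.
        const uint8_t *const src[4] = { CFDataGetBytePtr(bitmapData), NULL, NULL, NULL };
        int srcStride[4]            = { (int)CGImageGetBytesPerRow(cgImage), 0, 0, 0 };

        avpicture_fill((AVPicture *)outpic, outbuffer, PIX_FMT_YUV420P, c->width, c->height);
        sws_scale(sws, src, srcStride, 0, c->height, outpic->data, outpic->linesize);
        outpic->pts = i; // give each frame a presentation timestamp

        out_size = avcodec_encode_video(c, outbuf, outbuf_size, outpic);
        fwrite(outbuf, 1, out_size, f);

        CFRelease(bitmapData); // release the copied pixel data; don't free() CoreFoundation memory
    }

    sws_freeContext(sws);

    Note also that the delayed-frame loop in the original writes outbuf_size bytes instead of out_size per iteration, which by itself would corrupt the output file.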

  • Why does FFMPEG report the wrong duration?

    20 October 2011, by Adrian Lynch

    I have an oldish build of FFMPEG that I can't easily change.

    We use FFMPEG to find the duration of video and sound files. So far it has been working wonderfully.

    Recently on an uploaded file, FFMPEG has reported a 30 second file as being 5 minutes 30 seconds in length.

    Could it be something wrong with the file rather than FFMPEG?

    If I use FFMPEG to convert to another file, the duration is restored.

    In case it matters, ffmpeg -i 'path to the file' produces:

        FFmpeg version Sherpya-r15618, Copyright (c) 2000-2008 Fabrice Bellard, et al.
          libavutil     49.11. 0 / 49.11. 0
          libavcodec    52. 0. 0 / 52. 0. 0
          libavformat   52.22. 1 / 52.22. 1
          libavdevice   52. 1. 0 / 52. 1. 0
          libswscale     0. 6. 1 /  0. 6. 1
          libpostproc   51. 2. 0 / 51. 2. 0
          built on Oct 14 2008 23:43:47, gcc: 4.2.5 20080919 (prerelease) [Sherpya]
        Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'H:\path\to\file.mov':
          Duration: 00:05:35.00, start: 0.000000, bitrate: 1223 kb/s
            Stream #0.0(eng): Audio: aac, 44100 Hz, stereo, s16
            Stream #0.1(eng): Video: h264, yuv420p, 720x576, 25.00 tb(r)
        Must supply at least one output file
    

    It's that very command whose output I then parse with a regex to extract the duration.

    Does anyone have a nice application that can do what I'm trying above but get it right 100% of the time?
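
    If you can link against libavformat directly, a small check along these lines (a sketch only, written against the old av_open_input_file/av_find_stream_info API that builds of that vintage expose; newer releases spell these calls differently) prints both the container-level duration and each stream's own duration, which makes it obvious when a MOV header is simply carrying bogus metadata:

    #include <stdio.h>
    #include <libavformat/avformat.h>

    int main(int argc, char **argv)
    {
        AVFormatContext *fmt = NULL;

        av_register_all();
        if (argc < 2 || av_open_input_file(&fmt, argv[1], NULL, 0, NULL) != 0) {
            fprintf(stderr, "could not open input\n");
            return 1;
        }
        av_find_stream_info(fmt);

        // Duration as reported by the container header, in AV_TIME_BASE units.
        printf("container: %.2f s\n", (double)fmt->duration / AV_TIME_BASE);

        // Per-stream durations, each in the stream's own time base.
        for (unsigned i = 0; i < fmt->nb_streams; i++) {
            AVStream *st = fmt->streams[i];
            printf("stream %u: %.2f s\n", i, st->duration * av_q2d(st->time_base));
        }

        av_close_input_file(fmt);
        return 0;
    }

    If the container figure disagrees with the per-stream figures, the 5:35 is coming from the file's header rather than from FFMPEG miscounting frames, which would also explain why converting the file rewrites the header and restores the correct duration.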

  • Strange issue with FFMPEG

    19 October 2011, by Raphael Milani

    I've been facing a strange issue with ffmpeg. I've tried converting MP4 videos to FLV, but ffmpeg doesn't convert the whole video. For example, if the video is 10 minutes long, ffmpeg only converts 9 minutes and 30 seconds of it. I've been using this command to convert:

     -y -i "<$InFilePath$>" -ab 56 -ar 44100 -b 900000 -r 30 -s 832X468 -aspect 16:9 -f flv -qscale 2 "<$OutFilePath$>"
    

    Has anyone faced this problem?

  • ffmpeg from erlang using open_port

    19 October 2011, by user1002473

    I've got a problem with ffmpeg when I try to use it in pipe-to-pipe mode in Erlang. This is my code:

    fun(Data) ->
        Port = open_port(
            {spawn, "ffmpeg -i -  -acodec copy -vcodec copy -f flv - "},
            [binary,stream,use_stdio,exit_status]
        ),
        Port ! {self(), {command, <<Data/binary>>}},
        receive_data(Port).
    

    and I've got this error from std error:

    av_interleaved_write_frame(): Broken pipe

  • How to convert a .flv file to .3gp using ffmpeg? [migrated]

    19 October 2011, by Chetana

    I have been converting various video formats to 3gp with ffmpeg on one server, but on another server it does not work.

    Following is my script:

    exec("ffmpeg -i test.flv -sameq -acodec libmp3lame -ar 22050 -ab 96000
        -deinterlace -nr 500 -s 320x240 -aspect 4:3 -r 20 -g 500 -me_range 20
        -b 270k -deinterlace -f flv -y test.3gp ");
    

    Can anyone tell me what is wrong with the script?

    The following is my ffmpeg build information:

    root@ninja [~]# ffmpeg -formats
    ffmpeg version CVS, build 3277056, Copyright (c) 2000-2004 Fabrice Bellard
      configuration: --enable-mp3lame --enable-libogg --enable-gpl --disable-mmx --enable-shared
      built on Jun 17 2009 10:51:43, gcc: 4.1.2 20080704 (Red Hat 4.1.2-44)