Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • FFmpeg split loses one second

    4 December 2012, by Ahmet Ka

    I have a problem with ffmpeg video cutting: the videos are not cut properly, and one second between the first and the second video is lost. Split command for part1.avi:

    ffmpeg -i input.avi -ss 00:00:00 -t 00:01:00 -vcodec copy -acodec copy part1.avi
    

    Split command for part2.avi:

    ffmpeg -i input.avi -ss 00:01:00 -t 00:01:00 -vcodec copy -acodec copy part2.avi
    

    ffmpeg -i part1.avi reports a duration of 00:01:00.02, and ffmpeg -i part2.avi also reports 00:01:00.02.
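
    The lost second is expected with stream copy: -vcodec copy can only cut on keyframes, so each part actually starts at the nearest keyframe after the requested time. A sketch of a frame-accurate alternative is to re-encode instead of copying (the codec choices here are illustrative assumptions, not from the question):

    ```shell
    # Re-encoding allows cutting at arbitrary frames instead of only at keyframes.
    ffmpeg -i input.avi -ss 00:00:00 -t 00:01:00 -vcodec mpeg4 -acodec libmp3lame part1.avi
    ffmpeg -i input.avi -ss 00:01:00 -t 00:01:00 -vcodec mpeg4 -acodec libmp3lame part2.avi
    ```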

  • Retrieve album art using FFmpeg

    4 December 2012, by William Seemann

    I'm developing an Android application that relies on FFmpeg to retrieve audio metadata. I know it's possible to retrieve album art programmatically using FFmpeg. However, once you have decoded the art (a video frame within an MP3), how do you generate an image file (a PNG) for use within an application? I've searched all over but can't seem to find a working example.

    Edit, here is the solution:

    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>
    
    void retrieve_album_art(const char *path, const char *album_art_file) {
        int i, ret = 0;
    
        if (!path) {
            printf("Path is NULL\n");
            return;
        }
    
        AVFormatContext *pFormatCtx = avformat_alloc_context();
    
        printf("Opening %s\n", path);
    
        // open the specified path
        if (avformat_open_input(&pFormatCtx, path, NULL, NULL) != 0) {
            printf("avformat_open_input() failed\n");
            goto fail;
        }
    
        // read the format headers
        // retrieve stream information (read_header is internal; use the public API)
        if (avformat_find_stream_info(pFormatCtx, NULL) < 0) {
            printf("could not read the stream information\n");
            goto fail;
        }
    
        // find the first attached picture, if available
        for (i = 0; i < pFormatCtx->nb_streams; i++)
            if (pFormatCtx->streams[i]->disposition & AV_DISPOSITION_ATTACHED_PIC) {
                AVPacket pkt = pFormatCtx->streams[i]->attached_pic;
                FILE *album_art = fopen(album_art_file, "wb");
                if (album_art) {
                    ret = fwrite(pkt.data, pkt.size, 1, album_art);
                    fclose(album_art);
                }
                // do not free pkt here: attached_pic is owned by the format
                // context, and freeing it causes a double free on cleanup
                break;
            }
    
        if (ret) {
            printf("Wrote album art to %s\n", album_art_file);
        }
    
        fail:
            // closes the input and frees the format context in one call
            avformat_close_input(&pFormatCtx);
    }
    
    int main() {
        avformat_network_init();
        av_register_all();
    
        const char *path = "some url";
        const char *album_art_file = "some path";
    
        retrieve_album_art(path, album_art_file);
    
        return 0;
    }
    
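    A build command for the snippet above might look like the following (the source file name is hypothetical, and the pkg-config package names assume a standard FFmpeg development install):

    ```shell
    # Link against libavformat, libavcodec and libavutil via pkg-config.
    gcc retrieve_album_art.c -o retrieve_album_art \
        $(pkg-config --cflags --libs libavformat libavcodec libavutil)
    ```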
  • How to change frame rate of live stream using FFmpeg

    4 December 2012, by chiv

    I have created a streaming application that takes video data from a live source. Is there a transcoding tool that can receive the stream from my application, change its frame rate by transcoding, and re-stream it to another location?

    Currently I'm struggling to use the FFmpeg code in Visual Studio 2010, and I want to integrate FFmpeg into my application.

    Using FFmpeg.exe I'm able to transcode static files, but I can't find a proper example of how to transcode or change the frame rate of a live stream with FFmpeg. I tried the following command to re-stream to another IP:

          ffmpeg -re -i "rtp://my_ip:1234" -r 60 -f rtp "rtp://my_ip:4321"
    
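    As a sketch (the addresses and codec are illustrative assumptions): changing the frame rate requires re-encoding, since -r combined with stream copy has no effect, so a minimal transcode-and-restream command might look like this:

    ```shell
    # Re-encode the live RTP input at 30 fps and re-stream it over RTP.
    ffmpeg -re -i "rtp://my_ip:1234" -r 30 -vcodec mpeg4 -an -f rtp "rtp://my_ip:4321"
    ```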
  • Trying to execute a SPLICE effect with libsox

    4 December 2012, by cube

    When two input files and one output file are required, as for the splice effect, how should the sox_open_read() method be called?

    A typical libsox routine looks something like this:

    in = sox_open_read(file_path_in, NULL, NULL, NULL);
    out = sox_open_write(file_path_out, &in->signal, NULL, NULL, NULL, NULL);
    chain = sox_create_effects_chain(&in->encoding, &out->encoding);
    
    e = sox_create_effect(sox_find_effect("input"));
    args[0] = (char *)in;
    assert(sox_effect_options(e, 1, args) == SOX_SUCCESS);
    assert(sox_add_effect(chain, e, &in->signal, &in->signal) == SOX_SUCCESS);
    
    e = sox_create_effect(sox_find_effect("speed"));
    args[0] = "5";  /* effect options are strings, not integers */
    assert(sox_effect_options(e, 1, args) == SOX_SUCCESS);
    assert(sox_add_effect(chain, e, &in->signal, &in->signal) == SOX_SUCCESS);
    

    However, I want to use the splice effect which requires 2 input files to combine into one output.

    Any help would be appreciated.
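
    For reference, the command-line equivalent of what the libsox code would need to do concatenates the two inputs and smooths the join at a given position (the file names and splice position below are illustrative assumptions):

    ```shell
    # Concatenate first.wav and second.wav, splicing at the 1.0-second boundary
    # (the position is normally the length of the first input).
    sox first.wav second.wav out.wav splice 1.0
    ```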

  • FFmpeg and Android encoding issue

    4 December 2012, by brux

    I compiled ffmpeg for Android. The executable works from the device terminal, and I can perform normal video operations. I am trying to join two MPEG files captured with the camera, from the command line on the device.

    First I capture two videos with the camera and save them to the SD card as one.mpeg and two.mpeg. Then I run:

     ffmpeg -i one.mpeg onenew.mpeg
     ffmpeg -i two.mpeg twonew.mpeg
    

    (If I don't run the two commands above first, it doesn't work at all.)

     cat onenew.mpeg twonew.mpeg > joined.mpeg
     ffmpeg -i joined.mpeg -acodec copy -vcodec copy final.mpeg
    

    The output (final.mpeg) doesn't play on the device, but if I copy it to my Linux desktop it opens and plays fine. I tested final.mpeg on a 2.3.3 device and a 2.3.6 device.

    Anyone know why the device would fail to play the video file?

    UPDATE: My friend tested the video on a device running 3.0; the default player never played it, but 'moboplayer' did. I need it to play in the default player, though.
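
    The cat-then-remux steps above can also be expressed with FFmpeg's concat protocol, which joins the intermediate files in one step (file names follow the question):

    ```shell
    # Join the two remuxed MPEG files and copy the streams without re-encoding.
    ffmpeg -i "concat:onenew.mpeg|twonew.mpeg" -acodec copy -vcodec copy final.mpeg
    ```

    If the default player still refuses the file, note that stock Android players generally support MP4/3GP containers (H.263, H.264, MPEG-4 SP) rather than raw .mpeg program streams, so re-encoding to MP4 may be what actually fixes playback.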