Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • How to generate images from an mp4 file?

    23 May 2013, by FlyingCat

    I am trying to generate images from an mp4 video file with ffmpeg.

    I only want the first 3 seconds of each video.

    I have the following command:

     $videoFile = $mp4File; //I have many mp4 files
    
     $ffmpeg ="/usr/ffmpeg";
    
     $image_path = $videoFile;
    
     $ALL_PLACE_WIDTH = 300;
     $ALL_PLACE_HEIGHT = 220;
    
     $image_cmd = " -r 1 -r 0.03 -s ".$ALL_PLACE_WIDTH."x".$ALL_PLACE_HEIGHT." -f image2 ";
     $dest_path = 'project/image-%01d.png';
    
     $str_command= $ffmpeg  ." -i " . $image_path . $image_cmd .$dest_path;
     shell_exec($str_command);
    

    I am trying to generate images for the first 3 seconds of every mp4 file, but my script generates a lot of images, covering the entire mp4 file.

    Can I restrict the script to the first 3 seconds only? Thanks a lot!
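    One way to bound the duration is the `-t 3` option (and dropping the duplicate `-r`). A minimal sketch with placeholder file names, which only builds and prints the command rather than running it:

```shell
# -t 3 caps extraction at the first 3 seconds; -r 1 emits one frame per
# second, so at most about 3 images per video. Paths are placeholders.
INPUT="video.mp4"
DEST="project/image-%01d.png"
CMD="ffmpeg -i $INPUT -t 3 -r 1 -s 300x220 -f image2 $DEST"
echo "$CMD"
```

    In the PHP above, the equivalent change is prepending `-t 3` to `$image_cmd`.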

  • transcode and segment with ffmpeg

    22 May 2013, by alphablender

    It appears that ffmpeg now has a segmenter in it, or at least there is a command line option

    -f segment

    in the documentation.

    Does this mean I can use ffmpeg alone to transcode a video to H.264 in real time and deliver segmented, iOS-compatible .m3u8 streams? If so, what would the command be to transcode an arbitrary video file into a segmented, iOS-compatible 640x480 H.264/AAC stream?
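    In recent builds, yes in principle: libx264 handles the encode and the segment muxer writes the .ts pieces plus an .m3u8 playlist. A hedged sketch (segment length, file names, and exact option spellings vary by ffmpeg version; the command is only printed here, not run):

```shell
# Transcode to 640x480 H.264/AAC and cut into 10-second MPEG-TS segments
# listed in an m3u8 playlist; all file names and times are placeholders.
CMD="ffmpeg -i input.mov -c:v libx264 -c:a aac -s 640x480 -f segment -segment_time 10 -segment_format mpegts -segment_list stream.m3u8 seg%03d.ts"
echo "$CMD"
```

    Older builds may also need `-strict experimental` for the built-in AAC encoder.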

  • FFMPEG SCREENSHOT ERROR: No such filter: 'tile' [closed]

    22 May 2013, by itseasy21

    I have been trying to make multiple screenshots from a video file using ffmpeg, and I got the command right, but while executing it I get this error:

    No such filter: 'tile'
    Error opening filters!
    

    The command i execute is:

    ffmpeg -ss 00:00:10 -i './tmp/try.avi' -vcodec mjpeg -vframes 1 -vf 'select=not(mod(n\,1000)),scale=320:240,tile=2x3' './tmp/try.jpg'
    

    Any solution for this?
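    `No such filter: 'tile'` usually means the installed ffmpeg build predates the tile filter, so upgrading ffmpeg is the direct fix. A workaround sketch (not verified against this build; paths are the question's own placeholders) dumps the selected frames individually and tiles them with ImageMagick's montage instead:

```shell
# Step 1: write every 1000th frame as its own 320x240 JPEG.
GRAB="ffmpeg -ss 00:00:10 -i ./tmp/try.avi -vf select=not(mod(n\,1000)),scale=320:240 -vsync vfr ./tmp/thumb-%02d.jpg"
# Step 2: combine the JPEGs into one 2x3 sheet, replacing tile=2x3.
SHEET="montage ./tmp/thumb-*.jpg -tile 2x3 -geometry +0+0 ./tmp/try.jpg"
echo "$GRAB"
echo "$SHEET"
```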

  • Is it possible to limit the input file size in FFmpeg?

    22 May 2013, by Ryan James

    I’m having trouble finding a way to limit the input file size in FFmpeg.

    I’ve got a site that accepts uploaded videos of 60 seconds in length, but I need a safeguard in case someone tries to upload a 50GB file and paralyses my server’s CPU and storage.
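    ffmpeg itself has no input-size cap (`-fs` limits only the output file), so the usual safeguard is to reject oversized uploads before ffmpeg ever runs. A sketch, with an assumed 100 MB limit:

```shell
# Reject files over MAX_BYTES before handing them to ffmpeg.
# The 100 MB limit is an arbitrary placeholder.
MAX_BYTES=$((100 * 1024 * 1024))

size_ok() {
    # $1: path to the uploaded file; succeeds only if it is small enough
    [ "$(wc -c < "$1")" -le "$MAX_BYTES" ]
}

# usage: size_ok "$upload" && ffmpeg -i "$upload" -t 60 out.mp4
```

    Capping the output side with `-t 60` additionally enforces the 60-second rule even if a long file slips through.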

  • Can't write RGB frames to mp4 video with FFMPEG

    22 May 2013, by Michael IV

    I am writing RGB frames to mp4 (h264) using FFMPEG. The resulting video is created with the correct length, but the visuals are completely garbled.

    My whole setup is based on this tutorial, but there is one difference: in the tutorial the YUV frame is written explicitly with some random pixels, while in my case I need to convert the RGB input frame to a YUV output frame before encoding.

    I am doing it like this. The following method is called on each frame:

    void  write_video_frame(AVFormatContext *oc, AVStream *st ,uint8_t* data)
    {
        int ret;
    
        AVCodecContext *c = st->codec;
        if (frame_count >= STREAM_NB_FRAMES) {
            /* No more frames to compress. The codec has a latency of a few
            * frames if using B-frames, so we get the last frames by
            * passing the same picture again. */
        } else {
    
    
            sws_ctx = sws_getContext(c->width, c->height, AV_PIX_FMT_RGB24,
                                     c->width, c->height, c->pix_fmt,
                                     SWS_BICUBIC, NULL, NULL, NULL);
            avpicture_fill((AVPicture*)&src_picture, data,
                           AV_PIX_FMT_RGB24, c->width, c->height);
            avpicture_fill((AVPicture*)&dst_picture, outPicture,
                           AV_PIX_FMT_YUV420P, c->width, c->height);
    
            sws_scale(sws_ctx, src_picture.data, src_picture.linesize, 0,
                      c->height, dst_picture.data, dst_picture.linesize);
        }
    
        if (oc->oformat->flags & AVFMT_RAWPICTURE) {
            //// Raw video case - directly store the picture in the packet///
            AVPacket pkt;
            av_init_packet(&pkt);
            pkt.flags        |= AV_PKT_FLAG_KEY;
            pkt.stream_index  = st->index;
            pkt.data          = dst_picture.data[0];
            pkt.size          = sizeof(AVPicture);
            ret = av_interleaved_write_frame(oc, &pkt);
        } else {
            AVPacket pkt = { 0 };
            int got_packet;
            av_init_packet(&pkt);
            // encode the image //
            ret = avcodec_encode_video2(c, &pkt, frame, &got_packet);
            if (ret < 0) {
                //   fprintf(stderr, "Error encoding video frame: %s\n", av_err2str(ret));
                exit(1);
            }
            // If size is zero, it means the image was buffered.//
            if (!ret && got_packet && pkt.size) {
                pkt.stream_index = st->index;
                // Write the compressed frame to the media file.//
                ret = av_interleaved_write_frame(oc, &pkt);
            } else {
                ret = 0;
            }
        }
        if (ret != 0) {
            //  fprintf(stderr, "Error while writing video frame: %s\n", av_err2str(ret));
            exit(1);
        }
    
        frame_count++;
    }
    

    So what I am doing is getting a conversion context, filling one picture with the RGB24 pixels, and setting up an empty destination picture (dst_picture) to receive the converted frame in AV_PIX_FMT_YUV420P format. Then I scale it. FFMPEG throws no errors.

    There is a workaround which works but creates very large output (that is why I don't want to use it): converting the RGB frame explicitly and writing it directly into the destination picture, like this:

    Bitmap2Yuv420p_calc2Fast(dst_picture.data[0] , data , c->width ,c ->height);
    

    Where Bitmap2Yuv420p_calc2Fast() looks like this:

    void Bitmap2Yuv420p_calc2Fast(uint8_t *destination, uint8_t *rgb, size_t width, size_t height)
    {
        size_t image_size = width * height;
        size_t upos = image_size;
        size_t vpos = upos + upos / 4;
        size_t i = 0;

        for( size_t line = 0; line < height; ++line )
        {
            if( !(line % 2) )
            {
                for( size_t x = 0; x < width; x += 2 )
                {
                    uint8_t r = rgb[3 * i];
                    uint8_t g = rgb[3 * i + 1];
                    uint8_t b = rgb[3 * i + 2];
    
                    destination[i++] = ((66*r + 129*g + 25*b) >> 8) + 16;
    
                    destination[upos++] = ((-38*r + -74*g + 112*b) >> 8) + 128;
                    destination[vpos++] = ((112*r + -94*g + -18*b) >> 8) + 128;
    
                    r = rgb[3 * i];
                    g = rgb[3 * i + 1];
                    b = rgb[3 * i + 2];
    
                    destination[i++] = ((66*r + 129*g + 25*b) >> 8) + 16;
                }
            }
            else
            {
                for( size_t x = 0; x < width; x += 1 )
                {
                    uint8_t r = rgb[3 * i];
                    uint8_t g = rgb[3 * i + 1];
                    uint8_t b = rgb[3 * i + 2];
    
                    destination[i++] = ((66*r + 129*g + 25*b) >> 8) + 16;
                }
            }
        }
    }
    

    Where is my mistake?