Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • Urgent: Merge several mp4 videos using ffmpeg and PHP

    8 November 2013, by user2969041

    I would like to merge several videos using ffmpeg and PHP. I want to retrieve the video names dynamically, but I can't merge all the videos together; I only get i-1 merged videos.

    Can anyone help me?

    (I can't post the code; each time I try to post it, an error occurs and I can't submit it.)
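    Since the asker's code isn't shown, here is a minimal sketch of one common approach (an assumption, not the asker's method): ffmpeg's concat demuxer, where the file names are written to a list file and a single ffmpeg invocation joins them. File names here are placeholders, and `-c copy` only works when all inputs share the same codecs and parameters:

```python
import subprocess
import tempfile

def build_concat_command(video_paths, output_path):
    """Write a concat list file and return the ffmpeg argument list.

    Uses ffmpeg's concat demuxer; "-c copy" joins the inputs
    without re-encoding, which requires matching codec parameters.
    """
    list_file = tempfile.NamedTemporaryFile(
        mode="w", suffix=".txt", delete=False)
    for path in video_paths:
        # concat demuxer syntax: one "file '<path>'" line per input
        list_file.write("file '%s'\n" % path)
    list_file.close()
    return ["ffmpeg", "-f", "concat", "-safe", "0",
            "-i", list_file.name, "-c", "copy", output_path]

cmd = build_concat_command(["a.mp4", "b.mp4", "c.mp4"], "merged.mp4")
# subprocess.run(cmd, check=True)  # uncomment to actually invoke ffmpeg
```

    From PHP the same command line can be assembled and handed to `exec()`; the key point is that one list file replaces the i-1 pairwise merges.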

  • Restreaming a file saved with rtmpsuck

    8 November 2013, by xester

    I want to use rtmpsuck to save one file and then stream that file to my home network.

    Because the saved file has a fixed duration, streaming it with ffmpeg stops once that duration has elapsed, even with -re.

    Does anyone know a way I can do this?
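    One workaround, sketched here as an assumption rather than a tested recipe: wrap the ffmpeg invocation in a supervising loop, so that when the process exits at end-of-file it is restarted and the stream keeps going. The RTMP URL and file name below are placeholders:

```python
import subprocess

def build_restream_command(input_file, rtmp_url):
    """ffmpeg command that pushes a local file to an RTMP endpoint.

    "-re" reads the input at its native frame rate, "-c copy"
    avoids re-encoding, and the flv muxer is what RTMP expects.
    """
    return ["ffmpeg", "-re", "-i", input_file,
            "-c", "copy", "-f", "flv", rtmp_url]

def restream_forever(input_file, rtmp_url, max_restarts=None):
    # Restart ffmpeg every time it exits at end-of-file, so the
    # outgoing stream outlives the file's duration.
    restarts = 0
    while max_restarts is None or restarts < max_restarts:
        subprocess.run(build_restream_command(input_file, rtmp_url))
        restarts += 1

# Placeholder URL; substitute your own media server:
# restream_forever("capture.flv", "rtmp://192.168.1.10/live/stream")
```

    The same loop can of course be a one-line `while true; do ffmpeg ...; done` in a shell.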

  • ffmpeg drawtext filter with Chinese characters

    8 November 2013, by user2968187

    When I use the command below (the ffmpeg drawtext filter), the output window displays garbled placeholder boxes (□□) instead of the text.

    How can I solve this problem?

       ffplay -f lavfi -i color=c=white -vf drawtext=fontfile=arial.ttf:text=中文
    

    Thanks very much.
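    The boxes usually mean the chosen font has no glyphs for the text: arial.ttf does not cover CJK characters. Pointing fontfile at a font that does should fix it; the font file name below is only an example, substitute any CJK-capable TTF on your system. A small sketch that builds the filter string:

```python
def build_drawtext_filter(fontfile, text):
    """Build an ffmpeg drawtext filter string.

    The font named by `fontfile` must contain glyphs for every
    character in `text`; otherwise ffmpeg draws placeholder boxes.
    """
    return "drawtext=fontfile=%s:text=%s" % (fontfile, text)

# "msyh.ttf" (Microsoft YaHei) is just one example of a CJK-capable
# font; any font that covers Chinese will do.
vf = build_drawtext_filter("msyh.ttf", "中文")
# ffplay -f lavfi -i color=c=white -vf "drawtext=fontfile=msyh.ttf:text=中文"
```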

  • What are the correct args for an ffmpeg conversion?

    8 November 2013, by Andrew Simpson

    I have a C# app. I am using ffmpeg to convert JPGs to an OGG file. I use the Process class to write to a stream buffer within my app, piping the output so that I never have to write the OGG file to disk. I then upload those bytes to my server, where the file is written to the hard drive.

    On the server side I want to convert it to, say, MP4. Again, I want to pipe the output to a stream buffer and let my user download the file.

    This is my code:

    Process _serverBuild = null;

    public byte[] EncodeAndUploadImages(string _args1)
    {
        byte[] _data = null;
        try
        {
            _args1 = "ffmpeg.exe -i \"c:\\Cloud\\Zipped\\000EC902F17F\\f3e45e44-b61c-472b-8dc1-a8e9b9511497_00007.ogg\" avi -";
            _serverBuild = new Process();
    
            _serverBuild.StartInfo.WorkingDirectory = Environment.CurrentDirectory;
            _serverBuild.StartInfo.Arguments = _args1;
            _serverBuild.StartInfo.FileName =  @"C:\inetpub\wwwroot\bin\ffmpeg.exe";
            _serverBuild.StartInfo.UseShellExecute = false;
            _serverBuild.StartInfo.RedirectStandardOutput = true;
            _serverBuild.StartInfo.RedirectStandardError = true;
            _serverBuild.StartInfo.RedirectStandardInput = true;
            _serverBuild.StartInfo.CreateNoWindow = true;
            _serverBuild.StartInfo.LoadUserProfile = false;
            _serverBuild.ErrorDataReceived += _server_ErrorDataReceived;
            _serverBuild.OutputDataReceived += _server_OutputDataReceived;
            _serverBuild.Exited += new EventHandler(_server_Exited);
            _serverBuild.EnableRaisingEvents = true;
            _serverProcessHasFinished = false;
            _serverBuild.Start();
    
            mStandardOutput = _serverBuild.StandardOutput.BaseStream;
            mStandardOutput.BeginRead(mReadBuffer, 0, mReadBuffer.Length, StandardOutputReadCallback, null);            
        _serverBuild.WaitForExit();
    }
        catch (Exception _ex)
        {
    
        }
        finally
        {
            _serverBuild.ErrorDataReceived -= _server_ErrorDataReceived;
            _serverBuild.OutputDataReceived -= _server_OutputDataReceived;
            _serverBuild.Dispose();
        }
        return _data;
    }
    
    byte[] mReadBuffer = new byte[4096];
    MemoryStream mStandardOutputMs = new MemoryStream();
    Stream mStandardOutput;
    
    private void StandardOutputReadCallback(IAsyncResult ar)
    {
        try
        {
            int numberOfBytesRead = mStandardOutput.EndRead(ar);
            {
                mStandardOutputMs.Write(mReadBuffer, 0, numberOfBytesRead);
                mStandardOutput.BeginRead(mReadBuffer, 0, mReadBuffer.Length, StandardOutputReadCallback, null);
            }
        }
        catch (Exception _ex)
        {
    
        }
    }
    

    The error I get (when I test with a bat file and run within a DOS prompt) is:

    [screenshot of the ffmpeg error output was attached here]

    Obviously my parameters are at fault. I want to keep it simple, as I may want to use other formats besides MP4.

    Thanks
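    Two things stand out as likely culprits (assumptions, since the exact error text isn't shown): the Arguments string should not repeat the executable name when StartInfo.FileName already points at ffmpeg.exe, and when writing to stdout ffmpeg cannot infer the container from a file extension, so the format must be forced with -f. A sketch of how the corrected argument string would be built (the input path is hypothetical):

```python
def build_pipe_args(input_path, out_format):
    """Build the ffmpeg argument string for piping output to stdout.

    The executable name is NOT part of the arguments, and
    "-f <format>" is mandatory because the "-" output gives ffmpeg
    no file extension to infer a container from.
    """
    return '-i "%s" -f %s -' % (input_path, out_format)

# Hypothetical input path for illustration:
args = build_pipe_args(r"c:\Cloud\Zipped\input.ogg", "avi")
```

    Note that some containers need extra flags when written to a non-seekable pipe; MP4, for instance, typically requires `-movflags frag_keyframe+empty_moov`.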

  • FFMpeg encoded video will only play in FFPlay

    8 November 2013, by mohM

    I've been debugging my program for a couple of weeks now, with the output video only showing a blank screen (I was testing with VLC, WMP and WMP Classic). I happened to try FFPlay and, lo and behold, the video plays perfectly. I've read that this is usually caused by an incorrect pixel format, and that switching to PIX_FMT_YUV420P will make it work universally... but I'm already using that pixel format in the encoding process. Is there anything else that could be causing this?

    AVCodec* codec;
    AVCodecContext* c = NULL;
    uint8_t* outbuf;
    int i, out_size, outbuf_size;
    
    avcodec_register_all();
    
    printf("Video encoding\n");
    
    // Find the mpeg1 video encoder
    codec = avcodec_find_encoder(CODEC_ID_H264);
    if (!codec) {
        fprintf(stderr, "Codec not found\n");
        exit(1);
    }
    else printf("H264 codec found\n");
    
    c = avcodec_alloc_context3(codec);
    
    c->bit_rate = 400000;
    c->width = 1920;                                        // resolution must be a multiple of two (1280x720),(1900x1080),(720x480)
    c->height = 1200;
    c->time_base.num = 1;                                   // framerate numerator
    c->time_base.den = 25;                                  // framerate denominator
    c->gop_size = 10;                                       // emit one intra frame every ten frames
    c->max_b_frames = 1;                                    // maximum number of b-frames between non b-frames
    //c->keyint_min = 1;                                        // minimum GOP size
    //c->i_quant_factor = (float)0.71;                      // qscale factor between P and I frames
    //c->b_frame_strategy = 20;
    //c->qcompress = (float)0.6;
    //c->qmin = 20;                                         // minimum quantizer
    //c->qmax = 51;                                         // maximum quantizer
    //c->max_qdiff = 4;                                     // maximum quantizer difference between frames
    //c->refs = 4;                                          // number of reference frames
    //c->trellis = 1;                                           // trellis RD Quantization
    c->pix_fmt = PIX_FMT_YUV420P;
    c->codec_id = CODEC_ID_H264;
    //c->codec_type = AVMEDIA_TYPE_VIDEO;
    
    // Open the encoder
    if (avcodec_open2(c, codec,NULL) < 0) {
        fprintf(stderr, "Could not open codec\n");
        exit(1);
    }
    else printf("H264 codec opened\n");
    
    outbuf_size = 100000 + c->width*c->height*(32>>3);//*(32>>3);           // alloc image and output buffer
    outbuf = static_cast<uint8_t*>(malloc(outbuf_size));
    printf("Setting buffer size to: %d\n",outbuf_size);
    
    FILE* f = fopen("example.mpg","wb");
    if(!f) printf("x  -  Cannot open video file for writing\n");
    else printf("Opened video file for writing\n");
    
    // encode 5 seconds of video
    for(i = 0; i < 25*5; i++) {                                 // 5 seconds of video at 25 fps
        int nbytes = avpicture_get_size(PIX_FMT_YUV420P, c->width, c->height);
        uint8_t* outbuffer = (uint8_t*)av_malloc(nbytes*sizeof(uint8_t));
    
        AVFrame* inpic = avcodec_alloc_frame();
        AVFrame* outpic = avcodec_alloc_frame();
    
        outpic->pts = (int64_t)((float)i * (1000.0/((float)(c->time_base.den))) * 90);
        avpicture_fill((AVPicture*)inpic, (uint8_t*)pPixels, PIX_FMT_RGB32, c->width, c->height);                   // Fill picture with image
        avpicture_fill((AVPicture*)outpic, outbuffer, PIX_FMT_YUV420P, c->width, c->height);
        av_image_alloc(outpic->data, outpic->linesize, c->width, c->height, c->pix_fmt, 1); 
    
        inpic->data[0] += inpic->linesize[0]*(screenHeight-1);                                                      // Flipping frame
        inpic->linesize[0] = -inpic->linesize[0];                                                                   // Flipping frame
    
        struct SwsContext* fooContext = sws_getContext(screenWidth, screenHeight, PIX_FMT_RGB32, c->width, c->height, PIX_FMT_YUV420P, SWS_FAST_BILINEAR, NULL, NULL, NULL);
        sws_scale(fooContext, inpic->data, inpic->linesize, 0, c->height, outpic->data, outpic->linesize);
    
        // encode the image
        out_size = avcodec_encode_video(c, outbuf, outbuf_size, outpic);
        printf("Encoding frame %3d (size=%5d)\n", i, out_size);
        fwrite(outbuf, 1, out_size, f);
        delete [] pPixels;
        av_free(outbuffer);     
        av_free(inpic);
        av_free(outpic);
    }
    
    // get the delayed frames
    for(; out_size; i++) {
        fflush(stdout);
    
        out_size = avcodec_encode_video(c, outbuf, outbuf_size, NULL);
        printf("Writing frame %3d (size=%5d)\n", i, out_size);
        fwrite(outbuf, 1, out_size, f);
    }
    
    // add sequence end code to have a real mpeg file
    outbuf[0] = 0x00;
    outbuf[1] = 0x00;
    outbuf[2] = 0x01;
    outbuf[3] = 0xb7;
    fwrite(outbuf, 1, 4, f);
    fclose(f);
    
    avcodec_close(c);
    free(outbuf);
    av_free(c);
    printf("Closed codec and Freed\n");
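    A likely explanation for exactly this symptom (again an assumption, since only ffplay was tested against the raw output): the code writes a bare H.264 elementary stream to disk, plus an MPEG-1/2 sequence end code (0x000001B7) that doesn't belong to H.264 at all. ffplay happily probes and plays raw streams, but VLC and WMP generally expect a real container. Remuxing the raw stream into MP4 usually makes it play everywhere; a sketch of the remux command:

```python
def build_remux_command(raw_stream, output_mp4):
    """Wrap a raw H.264 elementary stream in an MP4 container.

    "-f h264" forces the raw-H.264 demuxer (the stream has no
    container header to probe); "-c copy" remuxes without
    re-encoding.
    """
    return ["ffmpeg", "-f", "h264", "-i", raw_stream,
            "-c", "copy", output_mp4]

cmd = build_remux_command("example.mpg", "example.mp4")
```

    Doing the muxing inside the program would instead mean using libavformat (avformat_alloc_output_context2, av_interleaved_write_frame, etc.) rather than fwrite-ing raw packets.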