Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • How to play ogg/ape/wv/wma using ffmpeg

    25 November 2013, by user2361120

    I'm trying to decode the ogg, ape, wma and wv file formats. I have posted the code below, but I'm getting too much noise in the output.

        av_init_packet(&packet);
        fmt_ctx = avformat_alloc_context();

        if ((ret = avformat_open_input(&fmt_ctx, szfile, NULL, NULL)) < 0)
        {
            LOGE("Cannot open input file\n");
        }
        if ((ret = avformat_find_stream_info(fmt_ctx, NULL)) < 0)
        {
            LOGE("Cannot find stream information\n");
        }

        /* select the audio stream */
        ret = av_find_best_stream(fmt_ctx, AVMEDIA_TYPE_AUDIO, -1, -1, &dec, 0);
        if (ret < 0)
        {
            LOGE("Cannot find a audio stream in the input file\n");
        }

        audio_stream_index = ret;
        dec_ctx = fmt_ctx->streams[audio_stream_index]->codec;

        LOGE(" ogg code %d codec id%d\n", AV_CODEC_ID_VORBIS, dec_ctx->codec_id);
        LOGE("avcodec_find_decoder\n");

        dec = avcodec_find_decoder(dec_ctx->codec_id);
        if (!dec)
        {
            __android_log_print(ANDROID_LOG_INFO, "BroovPlayer",
                                "avcodec_find_decoder failed %d Name:%s\n",
                                dec_ctx->codec_id, dec_ctx->codec_name);
        }

        if ((ret = avcodec_open2(dec_ctx, dec, NULL)) < 0)
        {
            LOGE("Cannot open audio decoder\n");
        }

        //dec_ctx->sample_fmt = AV_SAMPLE_FMT_S16P;
        LOGS("Stage 5 sample fmt %d", dec_ctx->sample_fmt);
        LOGE("Stage 5");
        LOGD("........%d", packet.size);

        while (1)
        {
            if ((ret = av_read_frame(fmt_ctx, &packet)) < 0)
            {
                //LOGE("Stage........... %d", ret);
                break;
            }
            if (packet.stream_index == audio_stream_index)
            {
                avcodec_get_frame_defaults(frame);
                got_frame = 0;
                // LOGE("file size=%d packet_index=%d", packet.size, packet.dts);

                ret = avcodec_decode_audio4(dec_ctx, frame, &got_frame, &packet);
                // LOGE("len=%d", ret);
                if (ret < 0)
                {
                    LOGE("Error decoding audio\n");
                    continue;
                }

                if (!got_frame)
                {
                    /* stop sending empty packets if the decoder is finished */
                    if (!packet.data && dec->capabilities & CODEC_CAP_DELAY)
                        //flush_complete = 1;
                        continue;
                }

                if (got_frame)
                {
                    // LOGE("begin frame decode\n");
                    int data_size = av_samples_get_buffer_size(NULL, dec_ctx->channels,
                                                               frame->nb_samples,
                                                               dec_ctx->sample_fmt, 1);
                    // LOGE("after frame decode\n");

                    jbyte *bytes = (*env)->GetByteArrayElements(env, array, NULL);
                    memcpy(bytes, frame->data[0], data_size);
                    (*env)->ReleaseByteArrayElements(env, array, bytes, 0);
                    (*env)->CallVoidMethod(env, obj, play, array, data_size);
                }
                packet.size -= ret;
                packet.data += ret;
                packet.pts = AV_NOPTS_VALUE;
            }
        }

        av_free_packet(&packet);
    
    
    
    This happens when I am playing the ogg/ape/wv audio file formats. Please help me to reduce the noise as much as possible.
    

    Or, if there is any other approach to decode these file formats, please let me know (see the sketch below).

    thanks
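
    A note on the likely cause: recent FFmpeg decoders for Vorbis, WMA and similar codecs typically output planar sample formats (e.g. AV_SAMPLE_FMT_FLTP), so frame->data[0] holds only one channel plane of floats while the code above copies data_size bytes from it and plays the result as interleaved 16-bit PCM. Converting each decoded frame to AV_SAMPLE_FMT_S16 with libswresample (swr_convert) before the memcpy usually removes exactly this kind of noise. As a quick check, or simply as another way to decode these formats, the ffmpeg command-line tool can do the whole conversion itself; the file names and the 2-channel / 44100 Hz settings below are only placeholders:

        ffmpeg -i input.ogg -acodec pcm_s16le -f s16le -ac 2 -ar 44100 output.pcm

    The same command also works for .ape, .wv and .wma inputs, producing raw interleaved signed 16-bit PCM that can be handed to the Java playback path unchanged.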

  • ffmpeg - slideshow from set of images ignoring first image

    24 November 2013, by Vishnu

    I can create a slideshow with the command below:

    ffmpeg -y -f image2 -r 1/15 -i image%d.jpeg -y -r 45 video.mp4
    

    The video is created successfully, but it ignores the first image.

    For example, in the current folder I have:

    image1.jpeg
    image2.jpeg
    image3.jpeg
    

    But the slideshow is created with image2.jpeg and image3.jpeg only (i.e. a 30-second video with 2 images is created, whereas I expected a 45-second video with 3 images, each shown for 15 seconds).
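
    A variation that is often suggested for this kind of dropped-image problem is to declare the input rate with the image2 demuxer's -framerate option and let the fps filter handle the duplication to 45 fps, instead of relying on the output -r alone. Whether it cures this exact case depends on the ffmpeg version, so treat it as something to try rather than a confirmed fix:

        ffmpeg -y -framerate 1/15 -i image%d.jpeg -vf fps=45 -pix_fmt yuv420p video.mp4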

  • OpenCV 2.4.7 with FFMpeg support build with VS 2010 (x86)

    24 November 2013, by GregPhil

    I've been fighting with the new version of OpenCV 2.4.7 for almost one week (I previously used 2.4.2). My motivation for the new setup is that I ran into problems using the VideoWriter for uncompressed grayscale videos (http://answers.opencv.org/question/3603/videowriter-videocapture-and-uncompressed-avi/). In https://github.com/Itseez/opencv/pull/1516, "akarsakov" recommends building OpenCV without VfW support and opening uncompressed video through ffmpeg. That's what I tried to do. I basically followed the instructions mentioned here: http://docs.opencv.org/trunk/doc/tutorials/introduction/windows_install/windows_install.html

    I had successfully built OpenCV with Qt 5.1.1 (http://answers.opencv.org/question/24184/how-to-compile-opencv-247-with-qt-and-ffmpeg/), but I'm not able to get VideoWriter & VideoCapture to work for me. So here is what I did:

    1. I downloaded the newest ffmpeg build from http://ffmpeg.zeranoe.com/builds/

    2. I generated the VS2010 project, disabling VfW and enabling ffmpeg instead:

      Video I/O:
      
      Video for Windows:           NO
      DC1394 1.x:                  NO
      DC1394 2.x:                  NO
      FFMPEG:                      YES (prebuilt binaries)
        codec:                     YES (ver 55.18.102)
        format:                    YES (ver 55.12.100)
        util:                      YES (ver 52.38.100)
        swscale:                   YES (ver 2.3.100)
        gentoo-style:              YES
      OpenNI:                      NO
      OpenNI PrimeSensor Modules:  NO
      PvAPI:                       NO
      GigEVisionSDK:               NO
      DirectShow:                  YES
      Media Foundation:            NO
      XIMEA:                       NO
      
    3. I added the include directories and libs for ffmpeg to the highgui module.

    I was now able to compile OpenCV without any errors, but the behaviour I see is not what I expected:

    1. Build OpenCV 2.4.7 without VfW -> VideoWriter.open() always returns false!
    2. Build OpenCV 2.4.7 with VfW -> VideoWriter.open() returns true the first time, but calling release() and then open() on the same object always returns false.
    3. With or without ffmpeg -> VideoCapture.open() gives an error message asking for libiconv-2.dll!

    Has anyone had the same problems and can point me to my mistake?

    Thank you in advance. cheers
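
    One way to trace where the libiconv-2.dll requirement comes from is to dump the import tables of the DLLs involved with dumpbin from a Visual Studio command prompt. The DLL names below are examples for a 32-bit 2.4.7 build plus the Zeranoe avcodec DLL matching the version shown above; adjust them to whatever your build actually produced:

        dumpbin /dependents opencv_highgui247.dll
        dumpbin /dependents opencv_ffmpeg247.dll
        dumpbin /dependents avcodec-55.dll

    Whichever DLL lists libiconv-2.dll as a dependency is the one pulling it in, which narrows down whether the problem sits in the OpenCV build or in the prebuilt ffmpeg binaries.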

  • How can I make a slideshow of images into a video with each image having a specific length of frames to be shown?

    24 November 2013, by jett

    Right now I am using something like this to create a video from a selection of images:

    avconv -i input%05d.png -r 24 -vsync cfr -c:v libx264 output.mp4

    But I want each picture to have a separate amount of time spent on it, so if I have 5 images I might want:

    [10 frames, 40 frames, 5 frames, 80 frames, 10 frames]
    

    to be specified. The only thing I could find that might be a solution is to create five separate videos, set the length of each, and then merge them together. Is this my only choice here?
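
    An alternative that avoids producing and merging five separate videos is ffmpeg's concat demuxer, which accepts a per-file duration directive (the avconv used above may not ship this demuxer, so this assumes a reasonably recent ffmpeg). The file names and durations below are only illustrative, converting 10, 40, 5, 80 and 10 frames at 24 fps into seconds; the last image is listed twice because many versions otherwise ignore the final duration:

        # list.txt
        file 'input00001.png'
        duration 0.417
        file 'input00002.png'
        duration 1.667
        file 'input00003.png'
        duration 0.208
        file 'input00004.png'
        duration 3.333
        file 'input00005.png'
        duration 0.417
        file 'input00005.png'

        ffmpeg -f concat -i list.txt -vsync cfr -r 24 -c:v libx264 output.mp4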

  • Stream as a standard input for ffmpeg in C#

    24 November 2013, by Andrew Simpson

    I have a C# WinForms application.

    I want to stream images as standard input to my web server.

    i.e.:

    I would use the Process class in C# to execute FFmpeg with these arguments:

    ffmpeg -i input -c:v libx264 -preset ultrafast -f rawvideo -
    

    Whilst I know how to capture the output memory stream within my C# app, I do not know how to supply a memory stream from within my C# app as standard input.

    I guess the immediate question is: can it be done?

    Thanks
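
    It can be done. On the C# side, start ffmpeg through System.Diagnostics.Process with UseShellExecute = false and RedirectStandardInput = true, then write each image's bytes to process.StandardInput.BaseStream and close the stream when finished. On the ffmpeg side, replace the input file with stdin ("-" or "pipe:0") and declare what is being piped. A sketch of the command, assuming the piped data is a sequence of JPEG frames (the image2pipe/mjpeg choice is an assumption about the images):

        ffmpeg -f image2pipe -vcodec mjpeg -i - -c:v libx264 -preset ultrafast -f rawvideo -

    Standard output can still be captured as before; stdin and stdout are independent pipes on the same process.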