Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • FFmpeg RTSP Recording : Video Timestamp Does not Match With Recorded MP4 File Timestamp [closed]

    26 April, by lastpeony4

    I'm currently testing by streaming a 30 fps sample FLV video from a local Happy-Time RTSP server.

    This is the FLV file I am streaming over RTSP:

    (screenshot of the source FLV file)

    I recorded the video with the following ffmpeg command:

    ffmpeg -i rtsp://127.0.0.1:6555/test30fps.flv -c copy test30fps.mp4
    

    The resulting video looks fine visually, but the time displayed in the video does not match the playback position in the recorded file. Although the MP4 file duration is correct (endRecordingTimeMs - startRecordingTimeMs = MP4 file duration), the time overlay shown in the video does not stay in sync with the file's time, and the discrepancy grows as the video progresses.

    I expect the time text overlaid on the video and the file's playback time to line up. Instead, a few seconds of divergence are noticeable, and the gap gradually widens over the video's duration.

    (screenshot showing the timestamp drift)

    Why does this occur and is there any way to fix this?
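
    One variant worth testing (an assumption on my part, not a confirmed fix): a drift that grows over time often means the recording keeps the sender's RTSP timestamps rather than the local receive times, so re-stamping packets with the wall clock at the input is a quick way to check which side the discrepancy comes from:

    ffmpeg -use_wallclock_as_timestamps 1 -i rtsp://127.0.0.1:6555/test30fps.flv -c copy test30fps.mp4

    If the drift disappears with this flag, the original discrepancy comes from the source timestamps rather than from the recording step.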

  • HLS audio stream choppy in iOS and Mac Safari - but ok in Chrome, VLC

    26 April, by Elliot

    I have an HLS stream whose audio randomly cuts out and then returns. This only occurs on iOS devices and in Safari on macOS. It works fine on desktop VLC, but not on iOS VLC (as in, the audio never cuts out).

    More confusing still: if multiple developers listen to the same HLS stream on different devices at the same time, we all hear the audio cut out and then return at the same moments. So something is clearly wrong with the stream itself.

    We create a DASH stream with FFmpeg and generate an HLS manifest in parallel.

       transcoder
                // Right now this is the max we'll get
                .audioBitrate('64k')
                // We only have 1 channel
                .audioChannels(1)
                // Enable experimental features ldash/lhls
                .outputOption('-strict', 'experimental')
                // Set url for syncing UTC timestamps with client
                .outputOption('-utc_timing_url', 'https://time.akamai.com/?iso')
                .outputOption('-write_prft', '1')
                // Streaming mode
                .outputOption('-streaming', '1')
                // Use a segment file template so we don't list all segments in the manifest
                .outputOption('-use_template', '1')
                // Must be disabled for index_correction which is suggested for streaming use case
                .outputOption('-use_timeline', '0')
                // Fix up segment indexes if they drift due to bandwidth changes
                .outputOption('-index_correction', '1')
                //  Use 6 second segments, for live this shouldn't matter but this is Apple's suggested
                //  value for HLS
                .outputOption('-seg_duration', '6')
                // Push the frames out as fast as they come in
                .outputOption('-frag_type', 'every_frame')
                // Use mp4 segments
                .outputOption('-dash_segment_type', 'mp4')
                // Enable low-latency mode
                .outputOption('-ldash', '1')
                // Output an HLS playlist too
                .outputOption('-hls_playlist', this.options.hlsEnabled ? '1' : '0')
                // HLS manifest name
                .outputOption('-hls_master_name', 'index.m3u8')
                // Use persistent HTTP connection
                .outputOption('-http_persistent', '1')
                // Do PUTs to the upload endpoint
                .outputOption('-method', 'PUT')
                // Don't let an output error hold up the stream
                .outputOption('-ignore_io_errors', '1')
                // Set a timeout for output
                .outputOption('-timeout', '1')
                // DASH format
                .format('dash');
    

    Does anyone have any suggestions for what we should dig into next?
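
    One diagnostic worth trying (a sketch; the playlist URL below is a placeholder, not from the question) is to dump the audio packet timestamps straight from the HLS playlist and look for gaps or resets at segment boundaries, since Safari tends to be far less tolerant of timestamp discontinuities than Chrome or VLC:

    ffprobe -select_streams a -show_entries packet=pts_time,duration_time -of csv https://example.com/live/index.m3u8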

  • Multithreading decoding A/V streams via FFmpeg + JNI + Android NDK causes artifacts

    26 April, by tretdm

    It seems that if I mute the callback that reads the audio buffer, the artifacts disappear.

    (screenshot: FFmpeg decoding artifacts)

    I'll attach the code below to make it easier for you to understand what the problem is:

    ffwrap.cpp

    static void *audioDecoderThread(void *arg) {
        AudioDecoder *audioDec = (AudioDecoder*) arg;
        audioDec->prepare();
        audioDec->start();
        return NULL;
    }
    
    static void *videoDecoderThread(void *arg) {
        VideoDecoder *videoDec = (VideoDecoder*) arg;
        videoDec->prepare();
        videoDec->start();
        return NULL;
    }
    
    // ...
    
    void FFmpegWrapper::startDecoding(int pStreamIndex) {
        if(pStreamIndex == gAudioStreamIndex) {
            AudioDecoder *audioDec = new AudioDecoder(
                gFormatCtx,
                gAudioCodecCtx,
                getStream(gAudioStreamIndex),
                gAudioStreamIndex,
                gInterface
            );
    
            pthread_t audioDecThread;
            pthread_create(&audioDecThread, NULL, &audioDecoderThread, (void*)audioDec);
        } else if(pStreamIndex == gVideoStreamIndex) {
            VideoDecoder *videoDec = new VideoDecoder(
                gFormatCtx,
                gVideoCodecCtx,
                getStream(gVideoStreamIndex),
                gVideoStreamIndex,
                gInterface
            );
    
            pthread_t videoDecThread;
            pthread_create(&videoDecThread, NULL, &videoDecoderThread, (void*)videoDec);
        }
    }
    

    audiodec.cpp

    AudioDecoder::AudioDecoder(AVFormatContext *pFormatCtx,
                               AVCodecContext *pCodecCtx,
                               AVStream* pStream,
                               int pStreamIndex,
                               IFFmpegWrapper *pInterface) {
        gFormatCtx = pFormatCtx;
        gCodecCtx = pCodecCtx;
        gStream = pStream;
        gStreamIndex = pStreamIndex;
        gInterface = pInterface;
    }
    
    bool AudioDecoder::prepare() {
        gBufferSize = AV_MAX_AUDIO_FRAME_SIZE; 
        // ^ this constant is equal to 192000 and replaces the deprecated 
        //   AVCODEC_MAX_AUDIO_FRAME_SIZE
    
        gBuffer = (short*) av_malloc(gBufferSize);
        gSwrCtx = swr_alloc();
        gSwrCtx = swr_alloc_set_opts(
            gSwrCtx, (int64_t) gCodecCtx->channel_layout, AV_SAMPLE_FMT_S16,
            gCodecCtx->sample_rate, gCodecCtx->channel_layout,
            gCodecCtx->sample_fmt, gCodecCtx->sample_rate, 0, NULL
        );
        swr_init(gSwrCtx);
        return gBuffer != NULL;
    }
    
    void *AudioDecoder::decodeInThread() {
        int         status, dataSize, len;
        AVPacket    avPkt;
        AVFrame     *pFrame     = av_frame_alloc();
    
        while(av_read_frame(gFormatCtx, &avPkt)>=0) {
            // Is it from the audio stream?
            if(avPkt.stream_index == gStreamIndex) {
                len = avcodec_decode_audio4(gStream->codec, pFrame, &status, &avPkt);
                if(len < 0) {
                    break;
                }
                if (status) {
                    dataSize = av_samples_get_buffer_size(
                        NULL, gCodecCtx->channels, pFrame->nb_samples,
                        gCodecCtx->sample_fmt, 1
                    );
                    uint8_t* buffer = (uint8_t*)av_malloc(sizeof(uint8_t) * dataSize);
                    swr_convert(
                        gSwrCtx, &buffer, dataSize,
                        (const uint8_t **) pFrame->data,
                        pFrame->nb_samples
                    );
                    memcpy(gBuffer, buffer, dataSize);
                    av_free(buffer);
    
                    // Calling callback for JNI bridging
                    gInterface->onStreamDecoding((uint8_t*)gBuffer, dataSize, gStreamIndex);
                }
            }
    
            // Free the packet that was allocated by av_read_frame
            av_free_packet(&avPkt);
            av_packet_unref(&avPkt);
        }
        av_free(pFrame);
        stop();
    }
    
    bool AudioDecoder::start() {
        // Problematic call: even when run via pthread the artifacts remain
        // unless this line is commented out
        decodeInThread();
        return true;
    }
    
    bool AudioDecoder::stop() {
        free(gBuffer);
        swr_free(&gSwrCtx);
        avcodec_close(gCodecCtx);
        return true;
    }
    

    videodec.cpp

    VideoDecoder::VideoDecoder(AVFormatContext *pFormatCtx,
                               AVCodecContext *pCodecCtx,
                               AVStream* pStream,
                               int pStreamIndex,
                               IFFmpegWrapper *pInterface) {
        gFormatCtx = pFormatCtx;
        gCodecCtx = pCodecCtx;
        gStream = pStream;
        gStreamIndex = pStreamIndex;
        gInterface = pInterface;
    }
    
    bool VideoDecoder::prepare() {
        return true; // dummy function yet
    }
    
    void *VideoDecoder::decodeInThread() {
        AVPacket            avPkt;
        int                 vWidth = gCodecCtx->width,
                            vHeight = gCodecCtx->height,
                            status, len,
                            dataSize = avpicture_get_size(AV_PIX_FMT_RGB32, vWidth, vHeight),
                            packetSize, tVideoFrames;
        struct SwsContext   *img_convert_ctx = NULL;
    
        gBuffer = (short*) av_mallocz((size_t)dataSize);
    
        while(av_read_frame(gFormatCtx, &avPkt)>=0) {
            gFrame = avcodec_alloc_frame();
            // Is it from the video stream?
            if(avPkt.stream_index == gStreamIndex) {
                packetSize = avPkt.size;
                struct SwsContext *img_convert_ctx = NULL;
                avpicture_fill((AVPicture*) gFrame,
                    (const uint8_t*) gBuffer,
                    gCodecCtx->pix_fmt,
                    gCodecCtx->width,
                    gCodecCtx->height
                );
    
                avcodec_decode_video2(gCodecCtx, gFrame, &status, &avPkt);
                if(!status || gFrame == NULL || packetSize == 0) {
                    tVideoFrames++;
                    continue;
                }
                AVPixelFormat pxf;
    
                pxf = AV_PIX_FMT_BGR32;
    
                convertYuv2Rgb(pxf, gFrame, dataSize);
                tVideoFrames++;
                gInterface->onStreamDecoding((uint8_t*)gBuffer, dataSize, gStreamIndex);
            }
            av_free(gFrame);
    
            // Free the packet that was allocated by av_read_frame
            av_free_packet(&avPkt);
            av_packet_unref(&avPkt);
        }
    
        stop();
    }
    
    short* VideoDecoder::convertYuv2Rgb(AVPixelFormat pxf, AVFrame* frame, int length) {
        // Convert video frame from YUV to RGB24
        AVFrame         *frameRGB   = av_frame_alloc();
        AVPixelFormat   output_pxf  = pxf;
    
        avpicture_fill((AVPicture *)frameRGB, (uint8_t*)gBuffer, output_pxf,
                       gCodecCtx->width, gCodecCtx->height);
        const int width = gCodecCtx->width, height = gCodecCtx->height;
        SwsContext* img_convert_ctx = sws_getContext(width, height,
                                         gCodecCtx->pix_fmt,
                                         width, height, output_pxf, SWS_BICUBIC,
                                         NULL, NULL, NULL);
    
    
        if(img_convert_ctx == NULL) {
            LOGE(10, "[ERROR] Cannot initialize the conversion context!");
            sws_freeContext(img_convert_ctx);
            return NULL;
        }
    
        int ret = sws_scale(img_convert_ctx, (const uint8_t* const*)frame->data, frame->linesize, 0,
                            gCodecCtx->height, frameRGB->data, frameRGB->linesize);
        if(frameRGB->data[0] == NULL) {
            LOGE(10, "[ERROR] SWS_Scale failed");
        }
        av_free(frameRGB);
        av_frame_unref(frameRGB);
        sws_freeContext(img_convert_ctx);
        return gBuffer;
    }
    
    bool VideoDecoder::start() {
        decodeInThread();
        // ^ Problematic call: even when run via pthread the artifacts remain
        //   unless this line is commented out
        return true;
    }
    
    bool VideoDecoder::stop() {
        av_free(gFrame);
        avcodec_close(gCodecCtx);
        av_free(gBuffer);
        return true;
    }
    

    mplayer.cpp

    void IPlayerWrapper::onResult(int cmdId, int resultCode) {
        JNIEnv* env;
        int attachResult = attachEnv(&env);
        if(attachResult < 2) {
            jclass jmPlay = env->GetObjectClass(instance);
            if(cmdId == FFMPEG_COMMAND_FIND_STREAMS) {
                gWrapper->openCodecs();
            } else if(cmdId == FFMPEG_COMMAND_OPEN_CODECS) {
                jmethodID onResultMid = env->GetMethodID(jmPlay, "onResult", "(II)V");
                env->CallVoidMethod(instance, onResultMid, (jint)cmdId, (jint)resultCode);
            }
            if(attachResult == 1) {
                gVM->DetachCurrentThread();
            }
        }
    }
    
    void IPlayerWrapper::onStreamDecoding(uint8_t* buffer, int bufferLen, int streamIndex) {
        JNIEnv* env;
        int attachResult = attachEnv(&env);
        if(attachResult < 2) {
            jclass jmPlay = env->GetObjectClass(instance);
            jBuffer = env->NewByteArray((jsize) bufferLen);
            env->SetByteArrayRegion(jBuffer, 0, (jsize) bufferLen, (jbyte *) buffer);
            if(streamIndex == gWrapper->gAudioStreamIndex) {
                jmethodID renderAudioMid = env->GetMethodID(jmPlay, "renderAudio", "([BI)V");
                env->CallVoidMethod(instance, renderAudioMid, jBuffer, bufferLen);
            } else if(streamIndex == gWrapper->gVideoStreamIndex) {
                jmethodID renderVideoMid = env->GetMethodID(jmPlay, "renderVideo", "([BI)V");
                env->CallVoidMethod(instance, renderVideoMid, jBuffer, bufferLen);
            }
            env->ReleaseByteArrayElements(jBuffer, (jbyte *)env->GetByteArrayElements(jBuffer, NULL), JNI_ABORT);
            env->DeleteLocalRef(jBuffer);
            env->DeleteLocalRef(jmPlay);
            if(attachResult == 1) {
                gVM->DetachCurrentThread();
            }
        }
    }
    
    
    JNIEXPORT void JNICALL naInit(JNIEnv *env, jobject instance) {
        gInterface = new IPlayerWrapper();
        gInterface->instance = env->NewGlobalRef(instance);
        gWrapper = new FFmpegWrapper(gDebugMode, gInterface);
    }
    
    JNIEXPORT void JNICALL naPlay(JNIEnv *env, jobject instance, int streamType) {
        gVMArgs.version = JNI_VERSION_1_6;
        gVMArgs.name = NULL;
        gVMArgs.group = NULL;
        gWrapper->setPlaybackState(FFMPEG_PLAYBACK_PLAYING);
        //gWrapper->startDecoding();
    }
    
    JNIEXPORT void JNICALL naStartAudioDecoding(JNIEnv *env, jobject instance) {
        gWrapper->startDecoding(gWrapper->gAudioStreamIndex);
    }
    
    JNIEXPORT void JNICALL naStartVideoDecoding(JNIEnv *env, jobject instance) {
        gWrapper->startDecoding(gWrapper->gVideoStreamIndex);
    }
    

    FFmpegPlayer.java

        @Override
        public void start() throws IllegalStateException {
            if(tracks != null) {
                naPlay();
                Log.d(MPLAY_TAG, "Playing...");
                FFAudioTrack audio_track = null;
                FFVideoTrack video_track = null;
                for(int tracks_index = 0; tracks_index < tracks.size(); tracks_index++) {
                    if(tracks.get(tracks_index) instanceof FFAudioTrack) {
                        audio_track = (FFAudioTrack) tracks.get(tracks_index);
                    } else if(tracks.get(tracks_index) instanceof FFVideoTrack) {
                        video_track = (FFVideoTrack) tracks.get(tracks_index);
                    }
                }
    
                naStartAudioDecoding();
                naStartVideoDecoding();
            }
        }
    
        private void renderAudio(final byte[] buffer, final int length) {
            AudioTrack track = null;
            if (buffer == null) {
                Log.e(MPLAY_TAG, "Audio buffer is empty");
                return;
            }
    
            if (!prepared_audio_buffer) {
                for (int tracks_index = 0; tracks_index < tracks.size(); tracks_index++) {
                    if (tracks.get(tracks_index) instanceof FFAudioTrack) {
                        track = (FFAudioTrack) tracks.get(tracks_index);
                    }
                }
                if (track == null) {
                    Log.e(MPLAY_TAG, "Audio track not found");
                    return;
                }
                int ch_config = track.channels == 2 ?
                        AudioFormat.CHANNEL_CONFIGURATION_STEREO :
                        AudioFormat.CHANNEL_CONFIGURATION_MONO;
    
                audio_track = new AudioTrack(AudioManager.STREAM_MUSIC, (int) track.sample_rate,
                        ch_config,
                        AudioFormat.ENCODING_PCM_16BIT, length, AudioTrack.MODE_STREAM);
    
                minAudioBufferSize = AudioRecord.getMinBufferSize(
                        (int) (track.sample_rate),
                        ch_config,
                        AudioFormat.ENCODING_PCM_16BIT);
    
                audio_track.play();
                prepared_audio_buffer = true;
            }
    
            try {
                audio_track.write(buffer, 0, length);
            } catch (Exception ignored) {
            }
        }
    
        private void renderVideo(final byte[] buffer, final int length) {
            Canvas c;
            videoBuffer = buffer;
        FFVideoTrack track = null;
            for (int tracks_index = 0; tracks_index < tracks.size(); tracks_index++) {
                if (tracks.get(tracks_index) instanceof FFVideoTrack) {
                    track = (FFVideoTrack) tracks.get(tracks_index);
                }
            }
            if (track != null) {
                int frame_width = track.frame_size[0];
                int frame_height = track.frame_size[1];
                if (frame_width > 0 && frame_height > 0) {
                    minVideoBufferSize = frame_width * frame_height * 4;
                    try {
                        // RGB_565  == 65K colours (16 bit)
                        // RGB_8888 == 16.7M colours (24 bit w/ alpha ch.)
                        int bpp = Build.VERSION.SDK_INT > 9 ? 16 : 24;
                        Bitmap.Config bmp_config =
                                bpp == 24 ? Bitmap.Config.RGB_565 : Bitmap.Config.ARGB_8888;
                        if(videoBuffer != null && holder != null) {
                            holder.setType(SurfaceHolder.SURFACE_TYPE_NORMAL);
                            if((c = holder.lockCanvas()) == null) {
                                Log.d(MPLAY_TAG, "Lock canvas failed");
                                return;
                            }
                            ByteBuffer bbuf =
                                    ByteBuffer.allocateDirect(minVideoBufferSize);
                            bbuf.rewind();
                            for(int i = 0; i < videoBuffer.length; i++) {
                                bbuf.put(i, videoBuffer[i]);
                            }
                            bbuf.rewind();
                            Bitmap bmp = Bitmap.createBitmap(frame_width, frame_height, bmp_config);
                            bmp.copyPixelsFromBuffer(bbuf);
                            float aspect_ratio = (float) frame_width / (float) frame_height;
                            int scaled_width = (int)(aspect_ratio * (c.getHeight()));
                            videoBuffer = null;
                            c.drawBitmap(bmp,
                                    null,
                                    new RectF(
                                            ((c.getWidth() - scaled_width) / 2), 0,
                                            ((c.getWidth() - scaled_width) / 2) + scaled_width,
                                            c.getHeight()),
                                    null);
                            holder.unlockCanvasAndPost(c);
                            bmp.recycle();
                            bbuf.clear();
                        } else {
                            Log.d(MPLAY_TAG, "Video frame buffer is null");
                        }
                    } catch (Exception ex) {
                        ex.printStackTrace();
                    } catch (OutOfMemoryError oom) {
                        oom.printStackTrace();
                        stop();
                    }
                }
            }
        }
    
    

    I used FFmpeg version 2.8.11 and wrote the wrapper specifically for an application that supports Android 2.1 and higher. After switching to OOP and multithreading, I started noticing these decoder artifacts, and the audio buffer often times out, as reported by Android's AudioFlinger in the logs. It does not matter whether it runs on an AVD or a real device.

    04-26 20:56:51.215 1333-1372/? I/AudioFlinger: BUFFER TIMEOUT: remove(4097) from active list on thread 0xaf183540
    

    Playing files encoded with other codecs does not solve the problem, so the issue lies in the implementation of the wrapper itself, and I cannot work out where.
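
    One pattern that may be relevant here (a sketch under my own assumptions, not part of the original wrapper): in the code above both decoder threads call av_read_frame() on the same AVFormatContext, which is not safe for concurrent use, and each thread discards the packets that belong to the other stream. A common alternative is a single demux thread that owns the format context and feeds per-stream packet queues; the names PacketQueue, DemuxState and demuxThread below are hypothetical:

    #include <pthread.h>
    #include <queue>
    extern "C" {
    #include <libavformat/avformat.h>
    }
    
    // Thread-safe FIFO of demuxed packets for one stream.
    struct PacketQueue {
        std::queue<AVPacket> packets;
        pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
        pthread_cond_t  cond = PTHREAD_COND_INITIALIZER;
        bool finished = false;
    
        void push(const AVPacket &pkt) {
            pthread_mutex_lock(&lock);
            packets.push(pkt);
            pthread_cond_signal(&cond);
            pthread_mutex_unlock(&lock);
        }
    
        // Returns false once the demuxer has finished and the queue is drained.
        bool pop(AVPacket &out) {
            pthread_mutex_lock(&lock);
            while (packets.empty() && !finished)
                pthread_cond_wait(&cond, &lock);
            bool ok = !packets.empty();
            if (ok) { out = packets.front(); packets.pop(); }
            pthread_mutex_unlock(&lock);
            return ok;
        }
    
        void close() {
            pthread_mutex_lock(&lock);
            finished = true;
            pthread_cond_broadcast(&cond);
            pthread_mutex_unlock(&lock);
        }
    };
    
    struct DemuxState {
        AVFormatContext *fmtCtx;
        int audioIndex, videoIndex;
        PacketQueue audioQueue, videoQueue;
    };
    
    // The only thread that touches the shared AVFormatContext; the audio and
    // video decoder threads pop() from their own queues instead of calling
    // av_read_frame() themselves.
    static void *demuxThread(void *arg) {
        DemuxState *st = (DemuxState *) arg;
        AVPacket pkt;
        while (av_read_frame(st->fmtCtx, &pkt) >= 0) {
            AVPacket ref;
            av_packet_ref(&ref, &pkt);   // the queue owns its own reference
            if (pkt.stream_index == st->audioIndex)
                st->audioQueue.push(ref);
            else if (pkt.stream_index == st->videoIndex)
                st->videoQueue.push(ref);
            else
                av_packet_unref(&ref);
            av_packet_unref(&pkt);
        }
        st->audioQueue.close();
        st->videoQueue.close();
        return NULL;
    }
    

    With this split, each decoder thread loops on pop() for its own queue instead of running the while(av_read_frame(...)) loops shown above, and each can own its output buffer rather than sharing one.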

  • How to Terminate a Process Normally Created using ProcessBuilder

    26 April, by Bilal Ahmed Yaseen

    I am creating processes using ProcessBuilder in my Java application. The created process executes an FFmpeg command which copies an RTSP stream into the specified destination media file.

    ProcessBuilder builder = new ProcessBuilder("ffmpeg", "-i", RTSP_URL, "-f", fileFormat, destFilePath);
    Process processToExecute = builder.start();
    

    I want to close the process before it completes its execution. If I run this FFmpeg command directly in Windows CMD and press CTRL+C after 5 seconds, the process terminates with status '2', and I can play the media file created so far.

    However, if I do the same operation in my Java application using:

     processToExecute.destroy(); // I call this method after 5 seconds
    

    I get the status code '1', which means abnormal termination. I get the status in the following way:

     processToExecute.destroy();
     processToExecute.exitValue(); //This return me status '1'
    

    I can't play the media file, and I think this is due to the abnormal termination of the process.

    So how can I terminate a process created using ProcessBuilder the same way CTRL+C does in CMD, so that I can play the created media file?

    I want to terminate the process (created using ProcessBuilder) from my Java application with the same status code '2' that I get when I terminate the process from CMD.

    EDIT#01: Sharing Findings

    When I try to delete that file after the app terminates, I get the following error:

    The Action Can't be Performed Because File is Opened in FFMPEG.exe
    

    This means that the process did not terminate the command it was executing. That command still holds the file open, which is why I can't play it. The process itself terminates when I call:

     processToExecute.destroy();
    

    But the task it was performing (the execution of the command) is still active. Strange!

    EDIT#02: Sharing the Root Cause

    If I press 'CTRL+C' or 'q' directly in CMD while the process is running, it terminates successfully and no longer appears in the list of running processes.

    But when I call the method programmatically:

    processToExecute.destroy();
    

    It terminates the process, but when I look at the list of running processes I can still see it there.

    The same thing happens if I try to terminate the process with the 'taskkill' or 'kill' command in another CMD window, by name or by PID: the process still terminates abnormally.

    P.S. I use the following command to see the running processes:

    tasklist
    

    So this shows that the destroy() method from the application, and the 'taskkill'/'kill' commands from another CMD, do not terminate the process in the normal way that pressing CTRL+C or 'q' does.
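
    A sketch of one commonly used workaround (my assumption, not something stated in the question): since ffmpeg stops cleanly when it reads 'q' from its console, the same can be requested programmatically by writing 'q' to the process's standard input before falling back to destroy(). This relies on ffmpeg polling its standard input, so it will not work if the process was started with -nostdin; the helper name stopFfmpegGracefully is hypothetical:

    import java.io.OutputStream;
    import java.util.concurrent.TimeUnit;
    
    final class FfmpegShutdown {
        // Hypothetical helper: ask ffmpeg to quit the way 'q' does in an interactive console.
        static int stopFfmpegGracefully(Process processToExecute) throws Exception {
            try (OutputStream stdin = processToExecute.getOutputStream()) {
                stdin.write('q');   // ffmpeg treats 'q' on stdin as a request to stop and finalize
                stdin.flush();
            }
            // Give ffmpeg a few seconds to flush and write the MP4 index,
            // then fall back to forceful termination.
            if (!processToExecute.waitFor(10, TimeUnit.SECONDS)) {
                processToExecute.destroy();
                processToExecute.waitFor();
            }
            return processToExecute.exitValue();
        }
    }
    

    The likely reason the destroyed recording does not play is that an MP4 needs its index written when recording ends; killing ffmpeg abruptly with destroy() or taskkill skips that step, whereas a 'q' (or CTRL+C) shutdown lets ffmpeg finish the file.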

  • How to play streaming media of rtmp/rtsp protocol in .NET MAUI Android/iOS ? [closed]

    26 April, by han zhu

    As far as I know, there are many solutions on Windows (WinForms/WPF), the most typical being to call the libvlcsharp and ffmpeg libraries. However, for .NET MAUI on Android/iOS I have not found a corresponding support library so far. My goal is simple: I want to use MAUI to implement a real-time streaming media player that supports the RTMP/RTSP protocols. Does anyone know of any good solutions? Thanks.

    I tried the MediaElement from the Community Toolkit, but it doesn't support RTMP, so I'm hoping to find a solution that supports RTMP live streaming in .NET MAUI Android/iOS.