Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • ffmpeg - convert image sequence to video with reversed order [closed]

    12 August, by 0__

    Looking at the docs, it is not apparent to me whether ffmpeg would allow me to convert an image sequence to a video in reverse order, for example using this sequence:

    frame-1000.jpg
    frame-999.jpg
    frame-998.jpg
    ...
    frame-1.jpg
    

    Is it possible to give a "step direction" for the frame indices?
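
    As far as I can tell the image2 demuxer has no "negative step" option, so here is a sketch of two workarounds (assuming frames named frame-1.jpg through frame-1000.jpg and a 25 fps target):

    # Option 1: read the sequence normally and reverse it with the reverse filter.
    # Simple, but it buffers every decoded frame in memory.
    ffmpeg -framerate 25 -i frame-%d.jpg -vf reverse -c:v libx264 reversed.mp4

    # Option 2: build a concat-demuxer list in descending order (constant memory).
    for i in $(seq 1000 -1 1); do
        printf "file 'frame-%d.jpg'\nduration 0.04\n" "$i"
    done > reversed_list.txt
    ffmpeg -f concat -safe 0 -i reversed_list.txt -vf fps=25 -c:v libx264 reversed.mp4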

  • How to finalize LL-DASH stream MPD file using Shaka Packager [closed]

    11 August, by Arjit

    I'm using Shaka Packager with FFmpeg piping to produce LL-DASH live streams from an NGINX-RTMP input. Streaming works fine, but when the publisher stops streaming, Shaka Player keeps trying to fetch segment files that don't exist and throws 404 errors.

    For example, if the last segment is 29.m4s, the player still tries to fetch:

    GET http://localhost:8080/dash/stream/480p_30.m4s 404 (Not Found)
    GET http://localhost:8080/dash/stream/720p_31.m4s 404 (Not Found)
    

    Even though the stream ended at segment 29.

    Shaka Packager command:

    "$SHAKA_PACKAGER_BIN" \
      "in=${VIDEO_720P_FIFO},stream=video,format=mp4,init_segment=${STREAM_OUTPUT_DIR}/720p_init.mp4,segment_template=${STREAM_OUTPUT_DIR}/720p_\$Number\$.m4s,bandwidth=3000000,hls_name=720p" \
      "in=${VIDEO_480P_FIFO},stream=video,format=mp4,init_segment=${STREAM_OUTPUT_DIR}/480p_init.mp4,segment_template=${STREAM_OUTPUT_DIR}/480p_\$Number\$.m4s,bandwidth=1500000,hls_name=480p" \
      "in=${AUDIO_FIFO},stream=audio,format=mp4,init_segment=${STREAM_OUTPUT_DIR}/audio_init.mp4,segment_template=${STREAM_OUTPUT_DIR}/audio_\$Number\$.m4s,bandwidth=128000,hls_name=audio" \
      --hls_master_playlist_output "${STREAM_OUTPUT_DIR}/master.m3u8" \
      --hls_playlist_type LIVE \
      --mpd_output "${STREAM_OUTPUT_DIR}/manifest.mpd" \
      --io_block_size 65536 \
      --segment_duration 2 \
      --low_latency_dash_mode=true \
      --utc_timings "urn:mpeg:dash:utc:http-xsdate:2014=http://time.akamai.com/?iso" \
      --min_buffer_time 1 \
      --time_shift_buffer_depth 60 \
      --hls_start_time_offset -2 \
      --preserved_segments_outside_live_window 10
    

    FFmpeg command:

    ffmpeg -y -re -i "rtmp://localhost:1935/live/${STREAM_KEY}" \
    -filter_complex \
    "[0:v]split=2[v720][v480]; \
    [v720]scale=w=1280:h=720,setsar=1[v720out]; \
    [v480]scale=w=854:h=480,setsar=1[v480out]; \
    [0:a]aresample=44100[aout]" \
    \
    -map "[v720out]" -c:v libx264 -b:v 3000k -maxrate 3000k -bufsize 6000k -preset veryfast -tune zerolatency -g 60 -keyint_min 60 -sc_threshold 0 \
    -movflags empty_moov+default_base_moof -frag_duration 2000000 -f mp4 ${VIDEO_720P_FIFO} \
    -map "[v480out]" -c:v libx264 -b:v 1500k -maxrate 1500k -bufsize 3000k -preset veryfast -tune zerolatency -g 60 -keyint_min 60 -sc_threshold 0 \
    -movflags empty_moov+default_base_moof -frag_duration 2000000 -f mp4 ${VIDEO_480P_FIFO} \
    -map "[aout]" -c:a aac -b:a 128k \
    -movflags empty_moov+default_base_moof -frag_duration 2000000 -f mp4 ${AUDIO_FIFO} \
    -loglevel info -stats
    

    To finalize the stream, I have a script that:

    • Waits for segments to stop being created
    • Sends SIGINT to FFmpeg and Shaka Packager
    • Manually appends #EXT-X-ENDLIST to HLS playlists

    Here’s a simplified version of the finalization logic:

    #!/bin/bash
    exec >> /output/debug.log 2>&1
    
    STREAM_KEY="$1"
    
    FFMPEG_PID_FILE="/output/${STREAM_KEY}_transcoder.pid"
    SHAKA_PID_FILE="/output/${STREAM_KEY}_packager.pid"
    
    VIDEO_720P_FIFO="/output/video_720p.pipe"
    VIDEO_480P_FIFO="/output/video_480p.pipe"
    AUDIO_FIFO="/output/audio.pipe"
    
    HLS_PLAYLISTS_DIR="/output/${STREAM_KEY}"
    
    LAST_COUNT=-1
    STABLE_COUNT=0
    MAX_STABLE=5  # 5 seconds of no new segments
    
    while true; do
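        # NOTE: STREAM_OUTPUT_DIR is assumed to be exported by the launch script
        # (it is not set here). If it is unset, the glob below matches nothing,
        # CURRENT_COUNT stays 0, and the loop breaks after MAX_STABLE seconds.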
        CURRENT_COUNT=$(ls ${STREAM_OUTPUT_DIR}/*.m4s 2>/dev/null | wc -l)
        if [ "$CURRENT_COUNT" -eq "$LAST_COUNT" ]; then
            STABLE_COUNT=$((STABLE_COUNT + 1))
        else
            STABLE_COUNT=0
        fi
    
        if [ "$STABLE_COUNT" -ge "$MAX_STABLE" ]; then
            echo "No new segments for $MAX_STABLE seconds, safe to finalize."
            break
        fi
    
        LAST_COUNT=$CURRENT_COUNT
        sleep 1
    done
    
    # Stop Shaka Packager
    if [ -f "$SHAKA_PID_FILE" ]; then
        SHAKA_PID=$(cat "$SHAKA_PID_FILE")
        kill -INT $SHAKA_PID
        echo "Waiting for Shaka Packager process $SHAKA_PID to finish..."
        while kill -0 $SHAKA_PID > /dev/null 2>&1; do
            sleep 1
        done
        echo "$(date): Killed Shaka Packager PID $SHAKA_PID"
        rm -f "$SHAKA_PID_FILE"
    fi
    
    # Stop FFmpeg transcoder if still running
    if [ -f "$FFMPEG_PID_FILE" ]; then
        FFMPEG_PID=$(cat "$FFMPEG_PID_FILE")
        # kill -SIGINT $FFMPEG_PID
        kill -9 $FFMPEG_PID
        echo "Waiting for FFmpeg process $FFMPEG_PID to finish..."
        while kill -0 $FFMPEG_PID > /dev/null 2>&1; do
            sleep 1
        done
    
        echo "$(date): Killed FFmpeg PID $FFMPEG_PID"
        rm -f "$FFMPEG_PID_FILE"
    fi
    
    
    
    
    for playlist in "$HLS_PLAYLISTS_DIR"/*.m3u8; do
        if [[ "$(basename "$playlist")" != "master.m3u8" ]]; then
            echo "#EXT-X-ENDLIST" >> "$playlist"
        fi
    done
    
    rm -f "$VIDEO_720P_FIFO" "$VIDEO_480P_FIFO" "$AUDIO_FIFO"
    
    echo "$(date): Finalization complete for stream $STREAM_KEY."
    
    

    Shaka Packager waits indefinitely when I send it SIGINT/SIGTERM.

    Questions:

    • Why doesn't Shaka Packager finalize the LL-DASH MPD manifest when receiving SIGINT?
    • Is there a way to tell Shaka Packager to write the final segments and set availabilityEndTime without switching to VOD mode?
    • Should I be doing something manually to flush the last segments or finalize the MPD, similar to #EXT-X-ENDLIST in HLS?

    I only want to gracefully end the stream so Shaka Player doesn't try to fetch non-existent segments, without switching the stream to VOD mode.
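
    One shutdown ordering that seems worth trying (a sketch, assuming Shaka Packager flushes its last segments and rewrites the MPD on its own once every FIFO input reaches EOF): interrupt FFmpeg first so the pipe write ends close cleanly, then simply wait for the packager, and only signal it if it never exits:

    # stop the transcoder gracefully; the FIFOs then deliver EOF to the packager
    kill -INT "$(cat "$FFMPEG_PID_FILE")"

    # give the packager a grace period to flush segments and finalize the manifest
    SHAKA_PID=$(cat "$SHAKA_PID_FILE")
    for i in $(seq 1 30); do
        kill -0 "$SHAKA_PID" 2>/dev/null || break
        sleep 1
    done

    # only if it is still running after the grace period, interrupt it as a last resort
    kill -0 "$SHAKA_PID" 2>/dev/null && kill -INT "$SHAKA_PID"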

  • How to insert background video using ffmpeg [closed]

    10 August, by user3714098

    I have a video.mp4 (640x360) and an image.jpg (1280x720). I want to use image.jpg as the background of video.mp4 with ffmpeg.

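    One common way to do this (a sketch, assuming the video should be centered on the image and the output should end with the video) is to loop the image and composite the video on top with the overlay filter:

    ffmpeg -loop 1 -i image.jpg -i video.mp4 \
      -filter_complex "[0:v][1:v]overlay=(W-w)/2:(H-h)/2:shortest=1[out]" \
      -map "[out]" -map 1:a? -c:v libx264 -c:a copy output.mp4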

  • Manually generate "empty" h264 p-frame

    9 August, by leavittx

    Let's call a P-frame empty if it doesn't change any pixels in the currently decoded video (i.e. no motion vectors, nothing).

    What I need is to be able to manually insert an empty P-frame into the video stream on request (I need to send frames to the streaming client at a constant framerate, but the frame source on the streaming server can supply frames at a different/lower one).

    So I need to be able to correctly form the byte sequence that represents an empty P-frame for the current resolution (I wonder which other parameters would be needed?).

    Ideally, I prefer to have an encoder-independent solution, since I use two different ones: nvenc via Nvidia Video SDK and x264 via FFmpeg.

    Where should I look?
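
    Not the hand-crafted NAL you're asking about, but if duplicating frames before the encoder is acceptable, FFmpeg's fps filter will pad a lower or variable input rate up to a constant output rate by repeating the previous frame, which x264 then encodes as very small P-frames (mostly skipped macroblocks). A sketch with placeholder file names:

    # repeat frames as needed so the output is a strict 30 fps
    ffmpeg -i input.mp4 -vf fps=30 -c:v libx264 -tune zerolatency -f mpegts output.ts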

  • How to handle DHAV (.dav) video streams?

    7 August, by Mateus Henrique

    I have a WPF app where I need to handle a DHAV (.dav) video stream at runtime from a Digital Video Recorder (DVR). I'm using an SDK that can be found here: Dahua SDK search

    SDK: General_NetSDK_Eng_Win64_IS_V3.052.0000002.0.R.201103
    

    I need to handle every single frame from the video stream, convert it to a BitmapImage and then display it in a WPF Image control. Something like: MJPEG Decoder

    The problem is that I can't find any documentation on how to handle that data, and the samples from the SDK don't show it either; instead they are built with WinForms and only pass the window handle of a PictureBox control to an exported DLL function, which 'magically' shows the video stream:

    [DllImport(LIBRARYNETSDK)]
    public static extern IntPtr CLIENT_RealPlayEx(IntPtr lLoginID, int nChannelID, IntPtr hWnd, EM_RealPlayType rType);
    

    Note: the 'hWnd' param is the window handle to display the video in.

    The problem with this approach is that I don't have any control over the video stream.

    I have tried many FFmpeg wrappers for .NET, but they only parse the data if I first write it to disk; only then can I convert it to some type I can handle.

    This is the callback function that is called constantly during the application's runtime with the data I need to handle:

        private void RealDataCallback(IntPtr lRealHandle, uint dwDataType, IntPtr pBuffer, uint dwBufSize, IntPtr param, IntPtr dwUser)
        {
            switch (dwDataType)
            {
                case 0: // original data
                    break;
                case 1: // frame data
                    HandleFrameData(lRealHandle, dwDataType, pBuffer, dwBufSize, param, dwUser);
                    break;
                case 2: // yuv data
                    break;
                case 3: // pcm audio data
                    break;
            }
        }
    
        private void HandleFrameData(IntPtr lRealHandle, uint dwDataType, IntPtr pBuffer, uint dwBufSize, IntPtr param, IntPtr dwUser)
        {
            // The pBuffer parameter format is DHAV (.dav) 
            byte[] buff = new byte[dwBufSize];
            Marshal.Copy(pBuffer, buff, 0, (int)dwBufSize);
    
            using (var ms = new MemoryStream(buff))
            {
                // this is where I need to decode the DHAV frame and turn it into a bitmap
            }
        }
    

    UPDATE

    I'm able to convert the YUV data provided in the callback function to RGB, but that is not the ideal solution. It would be so much better (and faster) if I could convert the original (.dav) data.

    The RealDataCallback, in fact, only returns 1 frame per callback, but I don't know how to convert that frame to a Bitmap. Any help would be appreciated.
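
    A note that might help avoid the disk round trip: recent FFmpeg releases ship a DHAV demuxer, so one option (a sketch, assuming your FFmpeg build supports -f dhav) is to keep a long-running ffmpeg child process, write the pBuffer bytes to its stdin, and read decoded raw frames from its stdout:

    # read a DHAV stream from stdin, emit raw BGRA frames on stdout
    ffmpeg -loglevel error -f dhav -i pipe:0 -f rawvideo -pix_fmt bgra pipe:1

    Each output frame is then width*height*4 bytes, which can be copied straight into a WPF WriteableBitmap (you do need to know the stream's resolution, e.g. from ffprobe or the SDK, to slice stdout into frames).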