Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • Decoding HEVC with Alpha Channel via NVDEC: Monochrome (4:0:0) Workarounds? [closed]

    3 March, by Holy_diver

    I’m working on decoding HEVC streams that include an alpha (transparency) channel using NVIDIA’s NVDEC. The alpha channel is encoded in monochrome (YUV 4:0:0), but NVDEC’s HEVC decoder appears to lack support for monochrome formats. How can I work around this limitation?

    Problem Details:

    HEVC Profile: Stream uses HEVC_Rext (Range Extensions) 
    with a monochrome alpha layer (4:0:0 chroma subsampling).
    
    NVDEC Limitations: The SDK documentation states support 
    for 4:0:0 (8-bit) only for specific codecs like JPEG, not HEVC.
    Attempting to decode returns 
    cudaError_InvalidValue or NVCUDACB_STATUS_INVALID_PARAM.
    
    Alpha Storage: The alpha is either a separate stream 
    or a dual-layer HEVC bitstream (e.g., DUAL_LAYER_DEPTH_SEPARATE).
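
    A possible direction, sketched below, assumes the alpha plane is carried as a second HEVC video stream in the container (the file name input.mov is hypothetical): let NVDEC decode the 4:2:0 base layer while FFmpeg's software hevc decoder handles the 4:0:0 alpha layer, then recombine the two with the alphamerge filter. This is only a sketch of the split-decode idea, not a confirmed fix for the dual-layer bitstream case.

    # Hypothetical split-decode sketch: hardware-decode only the base layer
    # (stream v:0) via CUDA/NVDEC, software-decode the monochrome alpha layer
    # (stream v:1), merge the alpha back in, and dump RGBA PNGs to verify.
    ffmpeg -hwaccel:v:0 cuda -i input.mov \
        -filter_complex "[0:v:0][0:v:1]alphamerge,format=rgba[out]" \
        -map "[out]" -c:v png frame_%05d.png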
    
  • How can I create a video from images using FFmpeg in different scenarios?

    3 March, by Meet Developer

    I am trying to create a video from images using FFmpeg in different scenarios, and my commands work on Windows but not in Android Studio. I am using this library in my Android project:

    implementation("com.arthenica:ffmpeg-kit-full:6.0-2")
    

    When I run the commands, I get this error in Logcat:

    [vost#0:0 @ 0x73746c8800] Unknown encoder 'libx264'.
    

    Here are the commands I tried:

    • When I have a list of images and a total duration, so each image's duration is automatically calculated based on the total duration. (For example, 3 images and a total duration of 9 seconds, so each image is shown for 3 seconds).
    ffmpeg -framerate 1/3 -i image%d.jpg -c:v libx264 -r 30 -pix_fmt yuv420p output.mp4
    
    • When each image has its own specific duration. (For example, image1 = 3 sec, image2 = 5 sec, image3 = 10 sec, so the total video length is 18 seconds).
    ffmpeg -loop 1 -t 3 -i image1.jpg -loop 1 -t 5 -i image2.jpg -loop 1 -t 10 -i image3.jpg \
    -filter_complex "[0:v][1:v][2:v]concat=n=3:v=1:a=0[outv]" -map "[outv]" -c:v libx264 -r 30 -pix_fmt yuv420p output.mp4
    
    • When I have a list of images along with an audio file, and the video duration matches the audio length. (For example, 5 images and an audio file of 30 seconds, so each image is displayed for 6 seconds).
    ffmpeg -framerate 1/6 -i image%d.jpg -i audio.mp3 -c:v libx264 -r 30 -pix_fmt yuv420p -c:a aac -b:a 192k -shortest output.mp4
    

    Since libx264 is not available in my Android FFmpeg build, what alternative codec or approach should I use to generate the video correctly?
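
    Since libx264 is GPL-licensed, it is normally bundled only in the -gpl variants of ffmpeg-kit (for example com.arthenica:ffmpeg-kit-full-gpl), so switching the dependency is one option if the GPL license is acceptable. Otherwise, a minimal sketch of the first scenario using FFmpeg's built-in mpeg4 encoder, which needs no external library, could look like this (the -q:v value is an illustrative choice):

    # Hypothetical fallback: replace libx264 with FFmpeg's native mpeg4 encoder,
    # which is available in non-GPL builds; -q:v 3 selects a fairly high quality.
    ffmpeg -framerate 1/3 -i image%d.jpg -c:v mpeg4 -q:v 3 -r 30 -pix_fmt yuv420p output.mp4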

  • ffmpeg video recording hangs in Emacs

    3 March, by Nutritioustim

    A. Using ffmpeg, both of these audio and video captures work correctly in a shell.

    # Audio
    ffmpeg -y -f avfoundation -t 5 -i :0 -c:a aac audio-recording.m4a
    
    # Video
    # i. video only
    # ii. audio & video
    ffmpeg -y -f avfoundation -framerate 30 -i 0 -t 10 -c:v libx264 -preset fast -pix_fmt yuv420p video-recording.mp4
    ffmpeg -y -f avfoundation -framerate 30 -pixel_format "uyvy422" -i 0:0 -t 10 -c:v libx264 -preset fast -pix_fmt yuv420p audio-video-recording.mp4
    

    B. In Emacs (29.4, "Emacs For Mac OS X"), this audio code block completes and prints the successful callback message "ffmpeg-audio , finished".

    ;; Audio, completes successfully
    (setq ffmpeg-record--process
          (apply 'start-process "ffmpeg-audio" "*ffmpeg-recording*"
                 (list "ffmpeg" "-y" "-f" "avfoundation" "-i" ":0" "-t" "10" "-c:a" "aac" "audio-recording.m4a")))
    
    (set-process-sentinel
     ffmpeg-record--process
     (lambda (process event)
       (message "%s , %s" process event)))
    

    But the video code block hangs, and does not complete with any output message.

    ;; Video, hangs after an initial message
    (setq ffmpeg-record--process
          (apply 'start-process "ffmpeg-video" "*ffmpeg-recording*"
                 (list "ffmpeg" "-y" "-f" "avfoundation" "-framerate" "30" "-pixel_format" "uyvy422" "-i" "0:none" "-t" "10" "-c:v"
                       "libx264" "-preset" "fast" "-pix_fmt" "yuv420p" "video-recording.mp4")))
    (set-process-sentinel
     ffmpeg-record--process
     (lambda (process event)
       (message "%s , %s" process event)))
    

    C. Running diagnostics on the ffmpeg-record--process variable shows the process is still running even after the 10 second duration (-t 10) is long over.

    (message "Process: %s, PID: %s, Status: %s"
             (process-name ffmpeg-record--process)
             (process-id ffmpeg-record--process)
             (process-status ffmpeg-record--process))
    

    D. Presumably, it is start-process that is not properly handling the invocation.

    But why then is the ffmpeg video invocation hanging, only in Emacs (not in the shell)... when the (Emacs) audio invocation completes fine?
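
    One commonly suggested suspect (unconfirmed here) is ffmpeg's interactive console handling: start-process gives the child process a pty by default, and ffmpeg may then block on terminal input. Passing -nostdin (and, on the Emacs side, binding process-connection-type to nil so a pipe is used instead of a pty) is the usual workaround; the shell form below only illustrates where the flag goes in the existing command.

    # Hypothetical tweak: -nostdin disables ffmpeg's interactive stdin handling
    # so it cannot stall waiting for console input when no real terminal exists.
    ffmpeg -nostdin -y -f avfoundation -framerate 30 -pixel_format uyvy422 -i "0:none" \
        -t 10 -c:v libx264 -preset fast -pix_fmt yuv420p video-recording.mp4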

  • Why does my ffmpeg command work for only the first several minutes? [closed]

    2 March, by О. Войтенко

    I have my own ffmpeg command running in Docker that records an RTSP stream on motion detection and writes segments every 60 seconds. But the command detects motion correctly only for the first 1-2 minutes; after that, ffmpeg stops detecting motion without reporting any errors. I'm using the latest ffmpeg version.

    docker run -d -v /etc/localtime:/etc/localtime:ro \
    -v /etc/timezone:/etc/timezone:ro \
    --tmpfs /tmpfs:rw,noexec,nosuid,size=100M,mode=1777 \
    -v /data/ARCHIVE:/ARCHIVE \
    -e ARCHIVE_PATH="/ARCHIVE/Cam1" \
    --restart=unless-stopped \
    -d --name rec-Cam1 \
    ffmpeg-recorder -loglevel debug -rtsp_transport udp \
    -i rtsp://user:pass@1.2.3.4:554 \
    -pix_fmt yuv420p -threads 4 -fflags +igndts+discardcorrupt \
    -use_wallclock_as_timestamps 1 -fps_mode cfr \
    -filter_complex "[0:v]split=2[full][motion];[motion]crop=1270:710:0:10,select='gt(scene, 0.01)'[detect];[full][detect]overlay=0:10[output];" \
    -map "[output]" \
    -r 10 \
    -c:v h264 \
    -b:v 500k \
    -maxrate 1000k \
    -crf 20 \
    -preset veryslow \
    -f segment \
    -segment_time 60 \
    -reset_timestamps 1 \
    -strftime 1 \
    -reconnect_at_eof 1 \
    -reconnect_streamed 1 \
    -reconnect_delay_max 120 \
    -movflags +faststart+write_colr+frag_keyframe \
     /tmpfs/Cam1_%Y%m%d%H%M%S.mp4
    

    The command writes to tmpfs, and inotifywait then copies each finished file to the storage; the image is built from ffmpeg + inotify. This is part of my program, but the main issue is that this configuration does not work correctly.
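
    One way to narrow this down is to run only the detection branch against the same camera and print per-frame scene scores, to see whether select stops firing after a couple of minutes or the RTSP input itself stalls. (Note that the reconnect_* flags are documented as HTTP protocol options, so they likely have no effect on an RTSP input.) The sketch below assumes the same camera URL:

    # Hypothetical isolation test: decode the stream, run only the crop + scene
    # detection chain, and print per-frame metadata (including scene scores)
    # instead of encoding, so the detection behaviour can be watched over time.
    ffmpeg -loglevel info -rtsp_transport udp -i rtsp://user:pass@1.2.3.4:554 \
        -vf "crop=1270:710:0:10,select='gt(scene,0.01)',metadata=print" \
        -f null -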

  • FFmpeg HLS stream for Android and iOS

    2 March, by Poda

    I'm trying to stream to mobile devices with ffmpeg and Apache 2.2, but I haven't been successful.

    I used this command to create the segments and the playlist:

    ffmpeg -i http://x.x.x.x:8080 -codec:v libx264 -r 25 -pix_fmt yuv420p -profile:v baseline -level 3 -b:v 500k -s 640x480 -codec:a aac -strict experimental -ac 2 -b:a 128k -movflags faststart -flags -global_header -map 0 -f hls  -hls_time 10 -hls_list_size 5 -hls_allow_cache 0 -sc_threshold 0 -hls_flags delete_segments -hls_segment_filename out%05d.ts list.m3u8
    

    The source is an HTTP stream served by VLC media player.

    Example content of the list.m3u8 file:

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-ALLOW-CACHE:NO
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:89
    #EXTINF:10.000000,
    out00089.ts
    #EXTINF:10.000000,
    out00090.ts
    #EXTINF:10.000000,
    out00091.ts
    #EXTINF:10.000000,
    out00092.ts
    #EXTINF:9.000000,
    out00093.ts
    #EXT-X-ENDLIST
    

    I created another playlist file - playlist.m3u8:

    #EXTM3U
    #EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=512000
    http://x.x.x.x/list.m3u8
    

    If I open this playlist.m3u8 file in VLC media player, it plays. It also works in desktop Chrome and Firefox with the Video.js plugin's Flash fallback.

    I set the correct MIME types for the .ts and .m3u8 files in the .htaccess file:

    AddType application/x-mpegURL .m3u8
    AddType video/MP2T .ts
    

    FFprobe output for playlist.m3u8:

    Input #0, hls,applehttp, from 'playlist.m3u8':
        Duration: N/A, start: 1.400000, bitrate: N/A
        Program 0
        Metadata: variant_bitrate : 512000
    Stream #0:0: Video: h264 (Constrained Baseline) ([27][0][0][0] / 0x001B), yuv420p, 640x480 [SAR 1:1 DAR 4:3], 25 fps, 25 tbr, 90k tbn, 50 tbc
    Metadata: variant_bitrate : 512000
    Stream #0:1: Audio: aac (LC) ([15][0][0][0] / 0x000F), 44100 Hz, stereo, fltp, 128 kb/s
    Metadata: variant_bitrate : 512000
    

    What should I do to make it work?

    UPDATE

    It works if I link directly to the list.m3u8 file (created by ffmpeg).
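
    Since the stream plays when list.m3u8 is opened directly, the hand-written master playlist is the likely culprit: it carries only a BANDWIDTH attribute and no CODECS or RESOLUTION information. One option worth trying is to let the hls muxer generate the master playlist itself via -master_pl_name; the sketch below keeps the original input and encoder settings:

    # Hypothetical variant of the original command: ffmpeg writes both the media
    # playlist (list.m3u8) and a generated master playlist (playlist.m3u8) that
    # carries bandwidth and codec information for the variant stream.
    ffmpeg -i http://x.x.x.x:8080 \
        -codec:v libx264 -profile:v baseline -level 3 -b:v 500k -s 640x480 -r 25 -pix_fmt yuv420p \
        -codec:a aac -ac 2 -b:a 128k \
        -f hls -hls_time 10 -hls_list_size 5 -hls_flags delete_segments \
        -hls_segment_filename out%05d.ts \
        -master_pl_name playlist.m3u8 list.m3u8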