Newest 'ffmpeg' Questions - Stack Overflow
Articles published on the site
-
Force FFmpeg to use a non-standard codec in container/extension (opus in .m4b)
13 March, by Craig Stevenson
For my audiobooks I use a non-standard combination: the Opus audio codec inside .m4b files (this works without issue on most of my devices, since they support Opus audio streams in MP4 containers).
Currently I have to instruct FFmpeg to encode into .mp4 and then manually change the extension to .m4b, because it throws the following error if I try to do it directly, despite .m4b being the same container format:
[ipod @ 0x64872b2de9c0] Could not find tag for codec opus in stream #1, codec not currently supported in container
Is there any command or option that lets me override this and write an Opus stream directly into a .m4b file?
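One common workaround, sketched below under the assumption of a reasonably recent ffmpeg build, is to override the extension-based muxer selection rather than renaming the file afterwards:

```shell
# Sketch: remux an existing Opus stream straight into .m4b without
# re-encoding. The .m4b extension normally selects ffmpeg's "ipod"
# muxer, which rejects Opus; "-f mp4" overrides that choice and uses
# the generic MP4 muxer while keeping the .m4b filename. Older builds
# may additionally need "-strict experimental" for Opus in MP4.
ffmpeg -i input.opus -c:a copy -f mp4 output.m4b
```

The filenames here are placeholders; the point is that `-f` decides the muxer, so the output extension no longer matters.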
-
Why does Apple support H.265 playback, but not when using HLS? [closed]
13 March, by clark wu
Apple devices (iPhone, iPad, macOS Safari) can natively play H.265 (HEVC) videos in .mp4 or .mov format. However, when I try to convert the same video to HLS, it fails to play. Here is what I tested:
Test Cases
✅ Directly playing H.265 .mp4 → Works fine
❌ H.265 + Opus → HLS (fMP4) → Fails
❌ H.265 + AAC → HLS (TS) → Fails
✅ H.264 + Opus → HLS fMP4 → Works fine (video + audio)
✅ H.264 + Opus → HLS TS → Video plays, but no audio
❌ H.265 + AAC → HLS fMP4 → Doesn’t play at all
FFmpeg Commands Used
1️⃣ H.265 + Opus (fMP4 segments)
ffmpeg -i input.mp4 -c:v copy -c:a copy -hls_segment_type fmp4 -hls_time 10 -hls_list_size 0 -hls_flags independent_segments output.m3u8
2️⃣ H.265 + AAC (TS segments)
ffmpeg -i input.mp4 -c:v copy -c:a copy -hls_time 10 -hls_list_size 0 -hls_flags independent_segments output.m3u8
Playback Attempts
Vidstack.js player (Fails)
Several online M3U8 players (All fail)
Key Questions
Why does Apple support H.265 in MP4 but not in HLS?
Is this an HLS specification limitation?
Does the audio codec (Opus / AAC) affect playback?
Does HLS require a specific HEVC profile/level?
Is HLS only compatible with AVC, not HEVC?
Looking for Answers
Has anyone successfully played H.265 + HLS on Apple devices?
Are there any official Apple HLS specifications regarding HEVC support?
Possible workarounds or alternative solutions?
Would appreciate any insights or help from the community! Thanks!
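For reference, Apple's HLS authoring guidelines do allow HEVC, but with constraints that the failing cases above run into: HEVC must be packaged in fMP4 segments (not TS), the sample entry must be tagged 'hvc1' rather than 'hev1', and Opus is not on Apple's supported HLS audio codec list. A packaging sketch along those lines (assuming input.mp4 already carries HEVC video):

```shell
# Sketch of an HLS packaging command aimed at Apple playback.
# - fMP4 segments: Apple requires fMP4 (not TS) for HEVC in HLS.
# - "-tag:v hvc1": Apple players reject the default 'hev1' sample entry.
# - AAC audio: Opus is not a supported HLS audio codec on Apple devices,
#   so the audio is re-encoded here instead of copied.
ffmpeg -i input.mp4 \
  -c:v copy -tag:v hvc1 \
  -c:a aac -b:a 128k \
  -hls_segment_type fmp4 -hls_time 6 -hls_list_size 0 \
  -hls_flags independent_segments \
  output.m3u8
```

This is a sketch, not a verified answer; bitrate and segment duration are arbitrary placeholder values.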
-
flutter_ffmpeg is discontinued with no alternatives?
13 March, by Rageh Azzazy
Based on this article by Taner Sener https://tanersener.medium.com/saying-goodbye-to-ffmpegkit-33ae939767e1
and following the discontinuation of the flutter_ffmpeg and ffmpeg_kit_flutter packages,
most packages for executing video-editing commands were built on top of them, and so will stop working after April 1, 2025, as mentioned in the package documentation and Taner's article.
Examples of packages that depend on flutter_ffmpeg or ffmpeg_kit_flutter:
- video_trimmer
- zero_video_trimmer
- flutter_video_trimmer
- video_trim
- bemeli_compress
- video_trimmer_pro
- ... others
Editing video with video_editor, video_editor_2, or video_editor_pits has also become a problem,
and I believe downloading the binaries and doing the whole thing locally is not just tedious but illegal as well.
So I broke down exactly what I need to edit videos in my Flutter app,
and I found these packages, though I haven't tested them yet:
Trimming: flutter_native_video_trimmer
Compression: video_compress or video_compress_plus
Muting: video_compress handles this
Encoding: not sure; I just need simple MP4 video files
Cropping: I can't find a solution for video cropping
I don't need to rotate, reverse or do any fancy stuff to the videos, just those five functions.
So the only remaining gap in this approach is a simple video-cropping function and an encoder that work in Flutter on iOS and Android.
My question is therefore not a request for external library recommendations: do you have any ideas how to crop a video natively in Flutter?
-
No accelerated colorspace conversion found from yuv420p to argb
13 March, by Zac Chan
I am a novice at FFmpeg and have recently taken over a code base built by a previous engineer. The FFmpeg code runs on an App Engine service that edits videos when they are uploaded.
This code generates a title animation that is later used as an overlay.
exports.generateTitleAnimation = function(metadata, destPath, options = {}) {
  const peeqLogoPath = "/app/assets/peeq-logo.png";
  const whiteBarMovPath = "/app/assets/whiteBar.mov";
  const titleFontPath = "/app/assets/Sofia-Pro-Black.otf";
  const dateStrFontPath = "/app/assets/Sofia-Pro-Medium.otf";
  const outputDuration = 5.52;
  const src01 = "color=c=white:s=1920x1080:duration=" + outputDuration;
  const src02 = "color=c=white@0.0:s=1920x1080:r=120:duration=" + outputDuration;
  var dateStrXOffset = "(92";
  var filterComplexStr = "[1]";
  if (metadata.title) {
    const title = metadata.title.toUpperCase();
    filterComplexStr += "drawtext=fontfile=" + titleFontPath + ":text='" + title +
      "':x='floor(92*(min((t-1.75)^29,0)+max((t-3.75)^29,0)+1))':y=622+30+2:fontsize=70:fontcolor=black:ft_load_flags=render,";
  }
  if (metadata.subTitle) {
    const subTitle = metadata.subTitle.toUpperCase();
    filterComplexStr += "drawtext=fontfile=" + titleFontPath + ":text='" + subTitle +
      "':x='floor(92*(min((t-2.0)^29,0.0)+max((t-3.8)^29,0.0)+1.0))':y=622+184-20-60+9:fontsize=46:fontcolor=black:ft_load_flags=render,";
    dateStrXOffset += "+30*" + (subTitle.length + 1);
  }
  if (metadata.dateStr) {
    filterComplexStr += "drawtext=fontfile=" + dateStrFontPath + ":text='" + metadata.dateStr +
      "':x='floor(" + dateStrXOffset + ")*(min((t-2.0)^29,0.0)+max((t-3.8)^29,0.0)+1.0))':y=622+184-20-60+9:fontsize=46:fontcolor=black:ft_load_flags=render,";
  }
  console.log("generateTitleAnimation generating");
  filterComplexStr += "split=10[t01][t02][t03][t04][t05][t06][t07][t08][t09][t10];" +
    "[t02]setpts=PTS+0.0166/TB[d02];[t03]setpts=PTS+0.033/TB[d03];[t04]setpts=PTS+0.05/TB[d04];" +
    "[t05]setpts=PTS+0.0666/TB[d05];[t06]setpts=PTS+0.083/TB[d06];[t07]setpts=PTS+0.1/TB[d07];" +
    "[t08]setpts=PTS+0.1166/TB[d08];[t09]setpts=PTS+0.133/TB[d09];[t10]setpts=PTS+0.15/TB[d10];" +
    "[d10][d09]blend=average,[d08]blend=darken,[d07]blend=average,[d06]blend=darken," +
    "[d05]blend=average,[d04]blend=darken,[d03]blend=average,[d02]blend=darken,[t01]blend=average," +
    "colorkey=white:0.2:0.0,perspective=y1=W*0.176327:y3=H+W*0.176327[text01];" +
    "[2][3]overlay=x=(W-w)*0.5:y=(H-h)*0.5:enable='between(t,0,3.0)'[logo01];" +
    "[logo01][text01]overlay[outv]";
  var args = ["-y", "-f", "lavfi", "-i", src01, "-f", "lavfi", "-i", src02,
    "-i", whiteBarMovPath, "-i", peeqLogoPath,
    "-filter_complex", filterComplexStr,
    "-vcodec", "qtrle", "-crf:v", "28",
    "-codec:a", "aac", "-ac", "2", "-ar", "44100", "-ab", "128k",
    "-map", "[outv]", destPath];
  //console.log("args", args);
  return childProcess.spawn('ffmpeg', args).then((ffResult) => {
    return destPath;
  }, (err) => {
    //console.error(new Error("generateTitleAnimation:" + err));
    console.error(err);
    return Promise.reject(err);
  });
};
destPath is a .mov file
A few days ago, the backend started throwing this error:
stderr: 'ffmpeg version 3.4.2-1~16.04.york0.2 Copyright (c) 2000-2018 the FFmpeg developers\n built with gcc 5.4.0 (Ubuntu 5.4.0- 6ubuntu1~16.04.9) 20160609\n configuration: --prefix=/usr --extra- version=\'1~16.04.york0.2\' --toolchain=hardened -- libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu -- enable-gpl --disable-stripping --enable-avresample --enable-avisynth -- enable-gnutls --enable-ladspa --enable-libass --enable-libbluray -- enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite -- enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable- libgme --enable-libgsm --enable-libmp3lame --enable-libmysofa --enable- libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse -- enable-librubberband --enable-librsvg --enable-libshine --enable- libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable- libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx -- enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 - -enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable- openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm -- enable-libiec61883 --enable-chromaprint --enable-frei0r --enable- libopencv --enable-libx264 --enable-shared\n libavutil 55. 78.100 / 55. 78.100\n libavcodec 57.107.100 / 57.107.100\n libavformat 57. 83.100 / 57. 83.100\n libavdevice 57. 10.100 / 57. 10.100\n libavfilter 6.107.100 / 6.107.100\n libavresample 3. 7. 0 / 3. 7. 0\n libswscale 4. 8.100 / 4. 8.100\n libswresample 2. 9.100 / 2. 9.100\n libpostproc 54. 7.100 / 54. 
7.100\nInput #0, lavfi, from \'color=c=white:s=1920x1080:duration=5.52\':\n Duration: N/A, start: 0.000000, bitrate: N/A\n Stream #0:0: Video: rawvideo (I420 / 0x30323449), yuv420p, 1920x1080 [SAR 1:1 DAR 16:9], 25 tbr, 25 tbn, 25 tbc\nInput #1, lavfi, from \'color=c=white@0.0:s=1920x1080:r=120:duration=5.52\':\n Duration: N/A, start: 0.000000, bitrate: N/A\n Stream #1:0: Video: rawvideo (I420 / 0x30323449), yuv420p, 1920x1080 [SAR 1:1 DAR 16:9], 120 fps, 120 tbr, 120 tbn, 120 tbc\nInput #2, mov,mp4,m4a,3gp,3g2,mj2, from \'/app/assets/whiteBar.mov\':\n Metadata:\n major_brand : qt \n minor_version : 537199360\n compatible_brands: qt \n creation_time : 2018-04-27T15:55:18.000000Z\n Duration: 00:00:05.52, start: 0.000000, bitrate: 54847 kb/s\n Stream #2:0(eng): Video: qtrle (rle / 0x20656C72), bgra(progressive), 1920x1080, 53326 kb/s, SAR 1:1 DAR 16:9, 60 fps, 60 tbr, 60 tbn, 60 tbc (default)\n Metadata:\n creation_time : 2018-04-27T15:55:18.000000Z\n handler_name : Apple Alias Data Handler\n encoder : Animation\n timecode : 00:00:00:00\n Stream #2:1(eng): Data: none (tmcd / 0x64636D74), 0 kb/s (default)\n Metadata:\n creation_time : 2018-04-27T15:55:18.000000Z\n handler_name : Apple Alias Data Handler\n timecode : 00:00:00:00\nInput #3, png_pipe, from \'/app/assets/peeq-logo.png\':\n Duration: N/A, bitrate: N/A\n Stream #3:0: Video: png, rgba(pc), 452x207 [SAR 2834:2834 DAR 452:207], 25 tbr, 25 tbn, 25 tbc\nCodec AVOption crf (Select the quality for constant quality mode) specified for output file #0 (/tmp/972967.mov) has not been used for any stream. The most likely reason is either wrong type (e.g. a video option with no video streams) or that it is a private option of some encoder which was not actually used for any stream.\nCodec AVOption b (set bitrate (in bits/s)) specified for output file #0 (/tmp/972967.mov) has not been used for any stream. The most likely reason is either wrong type (e.g. 
a video option with no video streams) or that it is a private option of some encoder which was not actually used for any stream.\nStream mapping:\n Stream #1:0 (rawvideo) -> drawtext\n Stream #2:0 (qtrle) -> overlay:main\n Stream #3:0 (png) -> overlay:overlay\n overlay -> Stream #0:0 (qtrle)\nPress [q] to stop, [?] for help\n[swscaler @ 0x56080b828180] No accelerated colorspace conversion found from yuv420p to argb.\n[swscaler @ 0x56080b8b5f40] No accelerated colorspace conversion found from yuva420p to argb.\n',
However, this error only occurs on App Engine. Running npm test on my Mac generates the title perfectly.
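For what it's worth, the "No accelerated colorspace conversion found" line is an informational notice from swscale, not a fatal error: it appears whenever swscale falls back to its generic (unaccelerated) pixel-format conversion path on that build. A minimal sketch that exercises the same kind of fallback, assuming ffmpeg is on PATH:

```shell
# Sketch: feed yuv420p frames (from the lavfi color source) to the
# qtrle encoder, which wants an RGB pixel format. The implicit
# yuv420p -> RGB conversion is where swscale may print the
# "No accelerated colorspace conversion found" notice on builds
# without an accelerated routine for that pair. The encode itself
# still succeeds; the notice alone does not explain a failed job.
ffmpeg -v verbose -f lavfi -i color=c=white:s=64x64:d=1 -c:v qtrle -y /tmp/test.mov
```

So the App Engine failure is more likely elsewhere in the run (for instance, the log also shows the crf and b AVOptions being ignored for this output).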
-
Extracting audio from video using fluent-ffmpeg
12 March, by Idi Favour
I'm trying to extract the audio from a video, but an error occurs:
Error converting file: Error: ffmpeg exited with code 234: Error opening output file ./src/videos/output-audio.mp3. Error opening output files: Invalid argument
I use this same directory for my video compression, which runs before this step, and there it works.
ffmpeg()
  .input(url)
  .audioChannels(0)
  .format("mp3")
  .output("./src/videos/output-audio.mp3")
  .on("error", (err) => console.error(`Error converting file: ${err}`))
  .on("end", async () => {
    console.log("audio transcripts");
    const stream = fs.createReadStream("./src/videos/output-audio.mp3");
    const transcription = await openai.audio.transcriptions.create({
      file: stream,
      model: "whisper-1",
      response_format: "verbose_json",
      timestamp_granularities: ["word"],
    });
    transcripts = transcription.text;
    console.log(transcription.text);
  })
  .run();
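Two plausible causes of that "Invalid argument" error, hedged as guesses rather than a confirmed diagnosis: fluent-ffmpeg's .audioChannels(0) emits "-ac 0", which is an invalid channel count (the usual way to drop video and keep audio is -vn), and the output directory must exist before ffmpeg opens the file. A CLI sketch of the intended extraction (input.mp4 is a placeholder):

```shell
# Sketch of the intended audio extraction as a plain CLI call.
# mkdir -p guards against the output directory not existing at runtime;
# -vn drops the video stream instead of zeroing the audio channel count,
# which is what .audioChannels(0) would otherwise do via "-ac 0".
mkdir -p ./src/videos
ffmpeg -i input.mp4 -vn -f mp3 ./src/videos/output-audio.mp3
```

In fluent-ffmpeg terms, the equivalent change would be replacing .audioChannels(0) with .noVideo().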