13:11
I'm building an iOS app in Swift that needs to convert local video files to HLS format (.m3u8). Initially, I used the ffmpeg-kit-ios-full-gpl package from FFmpegKit, which works well. However, since this build includes GPL-licensed components (such as libx264), I'm concerned that using it would require my app to be released under the GPL, which is not compatible with App Store distribution.
That said, my needs are fairly basic: I only need to convert H.264 .mp4 video files into HLS format.
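For reference, since the source is already H.264, I believe the whole conversion could be a plain remux with stream copy, something like the command below (just a sketch; input.mp4, out.m3u8, and the segment name pattern are placeholders, and it assumes the audio track is already AAC so nothing needs re-encoding):
$ ffmpeg -i input.mp4 -c copy -f hls -hls_time 6 -hls_playlist_type vod -hls_segment_filename seg_%03d.ts out.m3u8
As far as I can tell, with -c copy no encoder is invoked, so libx264 would never actually run for this case.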
My Questions:
Is there a safe way to use FFmpegKit—such as the (...)
13:30
TL;DR: I have a large video that requires transcoding to play in the browser. Rather than using -f hls to generate every segment to disk up front, I want to listen for segment requests and generate segments on demand. However, cutting with -ss and -to is not frame-accurate across segments, causing the video to glitch at segment borders.
Here are the details.
First, find the video's keyframe timestamps, like below:
$ ffprobe -fflags +genpts -v error -skip_frame nokey -show_entries format=duration -show_entries stream=duration -show_entries packet=pts_time,flags -select_streams v -of csv (...)
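Then, for each requested segment, the idea is to cut between two consecutive keyframe timestamps and re-encode, roughly like this (a sketch only; START and END stand for two adjacent pts_time values from the probe above, and the codec choices and the 0042.ts segment name are just illustrative):
$ ffmpeg -ss START -to END -i input.mp4 -c:v libx264 -preset veryfast -c:a aac -f mpegts 0042.ts
It is the borders of cuts like this that don't line up exactly from one segment to the next.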
09:15
I'm working on a media processing SDK with a substantial C++ codebase that currently uses FFmpeg for video decoding on native platforms (Windows/Linux). I need to port this to browsers while preserving both the existing C++ architecture and performance characteristics. The WASM approach is critical for us because it allows leveraging our existing optimized C++ media processing pipeline without a complete JavaScript rewrite, while maintaining the performance benefits of compiled native code.
The Challenge:
WebAssembly runs in a browser sandbox (...)