Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • Re-encode a video keeping the GOP structure of the original video [closed]

    5 April, by Baltar27

    Is there a way to re-encode a video with ffmpeg while keeping the GOP structure of the original file? That is, if a frame is IDR, I, B or P in the input file, keep the same type in the output file, even if the GOP is variable and/or adaptive (it dynamically changes the GOP type Open/Closed, the I period N, or the P period M). Encoding this way preserves the maximum quality, since the Open/Closed, I/B/P and GOP-length decisions have already been made by the original encoder.
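
    A hedged sketch of one common workaround (not part of the question): extract the timestamps of the original I/IDR frames with ffprobe and force keyframes at the same positions in the re-encode. This only pins the I-frame positions; B/P decisions are still left to the new encoder, so it approximates rather than exactly copies the original GOP structure. File names are placeholders.

    # list the I-frame timestamps (seconds) as a comma-separated string
    ffprobe -v error -select_streams v:0 \
      -show_entries frame=pts_time,pict_type -of csv=p=0 input.mp4 \
      | awk -F, '$2 == "I" { printf "%s%s", sep, $1; sep = "," }' > iframes.txt

    # re-encode, forcing keyframes at those times and disabling scene-cut keyframes
    ffmpeg -i input.mp4 -c:v libx264 -force_key_frames "$(cat iframes.txt)" \
      -sc_threshold 0 output.mp4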

  • I am building an editor app in Flutter and want to use the ffmpeg_kit_flutter_full_gpl package or something similar to build a video editor [closed]

    5 April, by Almas Tai
    Users/xyz/.pub-cache/hosted/pub.dev/ffmpeg_kit_flutter_full_gpl-6.0.3/android/src/main/java/com/arthenica/ffmpegkit/flutter/FFmpegKitFlutterPlugin.java:192: error: method onAttachedToEngine(FlutterPluginBinding) is already defined in class FFmpegKitFlutterPlugin
        public void onAttachedToEngine(@NonNull final FlutterPluginBinding flutterPluginBinding) {
                    ^
    /Users/xyz/.pub-cache/hosted/pub.dev/ffmpeg_kit_flutter_full_gpl-6.0.3/android/src/main/java/com/arthenica/ffmpegkit/flutter/FFmpegKitFlutterPlugin.java:657: error: cannot find symbol
        protected void init(final BinaryMessenger messenger, final Context context, final Activity activity, final io.flutter.plugin.common.PluginRegistry.Registrar registrar, final ActivityPluginBinding activityBinding) {
                                                                                              ^
      symbol:   class Registrar
      location: interface PluginRegistry
    2 errors
    3 warnings
    
    FAILURE: Build failed with an exception.
    
    * What went wrong:
    Execution failed for task ':ffmpeg_kit_flutter_full_gpl:compileDebugJavaWithJavac'.
    > Compilation failed; see the compiler error output for details.
    
    * Try:
    > Run with --info option to get more log output.
    > Run with --scan to get full insights.
    
    BUILD FAILED in 18s
    
    ┌─ Flutter Fix ───────────────────────────────────────────────────────────────────────────────────┐
    │ [!] Consult the error logs above to identify any broken plugins, specifically those containing  │
    │ "error: cannot find symbol..."                                                                  │
    │ This issue is likely caused by v1 embedding removal and the plugin's continued usage of removed │
    │ references to the v1 embedding.                                                                 │
    │ To fix this error, please upgrade your current package's dependencies to latest versions by     │
    │ running `flutter pub upgrade`.                                                                  │
    │ If that does not work, please file an issue for the problematic plugin(s) here:                 │
    │ https://github.com/flutter/flutter/issues                                                       │
    └─────────────────────────────────────────────────────────────────────────────────────────────────┘
    Error: Gradle task assembleDebug failed with exit code 1, 
    

    Can anyone suggest other dependencies or help me fix this error?
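
    Not from the thread, but a hedged note: the compile errors show the published plugin source itself still referencing the removed Flutter v1 embedding (PluginRegistry.Registrar), so `flutter pub upgrade` alone may not help if no newer release exists. A common workaround is to depend on a fork that has been migrated to the v2 embedding; the repository URL below is a placeholder, not a real fork.

    # replace the published package with a (hypothetical) migrated fork
    flutter pub remove ffmpeg_kit_flutter_full_gpl
    flutter pub add ffmpeg_kit_flutter_full_gpl \
      --git-url=https://github.com/your-org/ffmpeg_kit_flutter_full_gpl.git
    flutter clean && flutter pub get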

  • How to stream from dynamically created Selenium Moon browser pods to RTMP using ffmpeg?

    5 April, by Ahmed Seddik Bouchiba

    I'm running Selenium Moon (from Aerokube) in my minikube cluster for automated browser testing, and I need to stream the browser sessions to an RTMP endpoint. Here's my current setup:

    $ kubectl get all -n my-namespace
    NAME                                    READY   STATUS    RESTARTS   AGE
    pod/minio-5b85cc6cd6-nznwl              1/1     Running   1 (48m ago)   2d8h
    pod/moon-7db8b9c76f-8jzzm               4/4     Running   0             34m
    pod/moon-7db8b9c76f-kp78j               4/4     Running   0             34m
    
    NAME                     TYPE        CLUSTER-IP       EXTERNAL-IP   PORT(S)                                        AGE
    service/minio            NodePort    10.99.231.25     <none>        9000:31147/TCP,9001:31125/TCP                  2d8h
    service/moon             NodePort    10.100.244.255   <none>        4444:30683/TCP,9090:32243/TCP,8080:32571/TCP   34m
    
    NAME                               READY   UP-TO-DATE   AVAILABLE   AGE
    deployment.apps/minio              1/1     1            1           2d8h
    deployment.apps/moon               2/2     2            2           34m
    
    NAME                                          DESIRED   CURRENT   READY   AGE
    replicaset.apps/minio-5b85cc6cd6              1         1         1       2d8h
    replicaset.apps/moon-7db8b9c76f               2         2         2       34m
    

    The challenge:

    1. Moon dynamically creates browser pods when tests run
    2. I need to stream the browser activity to an RTMP endpoint using ffmpeg
    3. Moon was installed with Helm

    What I've tried to understand:

    • How to integrate ffmpeg with browser pods that are created on-demand
    • How to modify the Moon Helm chart to inject ffmpeg capabilities
    • How to capture display from the browser pods
    • How to stream to RTMP endpoints

    Has anyone implemented something similar? I'm looking for a solution that works with the dynamic nature of Moon's browser pod creation. Any examples, Helm chart modifications, or approaches would be greatly appreciated.

    Thanks in advance!
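
    A hedged sketch of the usual approach (not part of the question): run ffmpeg as a sidecar container in each browser pod, grab the virtual X display the browser renders to, and push it to the RTMP endpoint. The display number, resolution and RTMP URL below are placeholders, and injecting the sidecar into Moon's dynamically created pods would still have to go through whatever pod-template customization the Helm chart exposes.

    # capture the browser's Xvfb display and publish it as an RTMP stream
    ffmpeg -f x11grab -video_size 1920x1080 -framerate 25 -i :99 \
      -c:v libx264 -preset veryfast -tune zerolatency -pix_fmt yuv420p \
      -f flv rtmp://rtmp-server.example/live/stream-key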

  • Very low latency streaming with ffmpeg using a webcam

    5 April, by userDtrm

    I'm trying to configure ffmpeg to do a real-time video streaming using a webcam. The ffmpeg encoder command I use is as follows.

    ffmpeg -f v4l2 -input_format yuyv422 -s 640x480 -i /dev/video0 -c:v libx264 -profile:v baseline -trellis 0 -subq 1 -level 32 -preset superfast -tune zerolatency -me_method epzs -crf 30 -threads 0 -bufsize 1 -refs 4 -coder 0 -b_strategy 0 -bf 0 -sc_threshold 0 -x264-params vbv-maxrate=2000:slice-max-size=1500:keyint=30:min-keyint=10: -pix_fmt yuv420p -an -f mpegts udp://192.168.1.8:5001
    

    The ffplay command used to display the video feed is,

    ffplay -analyzeduration 1 -fflags -nobuffer -i udp://192.168.1.8:5001
    

    However, I'm experiencing a latency of 0.5-1.0 s in the video stream. Is there a way to reduce this to under 100 ms? Also, when I replace the v4l2 camera capture with a screen capture using x11grab, the stream is almost real-time and I experience no noticeable delays. Moreover, changing the encoder from x264 to mpeg2 had no effect on the latency. In addition, the statistics from ffmpeg show that the encoder is performing at 30 fps, which I believe indicates that the encoding is real-time. This leaves me with only one reason for the experienced delay.

    • Is there a significant buffering delay when capturing from a webcam with v4l2?
    • I don't think transmission delay is a factor here, as I see no latency when screen capture is used under the same conditions.
    • Can this latency be reduced further? Can someone think of a different encoder configuration to use instead of the one I've been using?
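
    Not part of the question, but a hedged sketch of player-side flags that commonly reduce startup buffering (note that `-fflags nobuffer` takes no dash before the flag name; `-fflags -nobuffer` actually clears the flag):

    # minimise demuxer probing and playback buffering on the receiving side
    ffplay -fflags nobuffer -flags low_delay -probesize 32 -analyzeduration 0 \
      -framedrop -i udp://192.168.1.8:5001
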
  • FFmpeg Arthenica always encodes with x265 Main10 profile [closed]

    5 April, by Jabed Dhali

    I'm using FFmpeg-Kit (Arthenica) for iOS to compress videos using the libx265 encoder. I'm building FFmpeg with a custom configuration script:

    export CUSTOM_CONFIG="--disable-indevs \
      --disable-outdevs \
      --disable-hwaccels \
      --enable-videotoolbox \
      --disable-protocols \
      --enable-protocol=file,fd,saf,async \
      --disable-decoders \
      --enable-decoder=${ENABLE_DECODER_LIST} \
      --disable-encoders \
      --enable-encoder=${ENABLE_ENCODER_LIST}"
    
    export CUSTOM_CONFIG_IOS=${CUSTOM_CONFIG}
      ./ios.sh --enable-ios-videotoolbox --enable-lame --enable-libaom  --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libxml2 --enable-opencore-amr --enable-openh264 --enable-opus --enable-twolame --enable-x264 --enable-x265 --enable-gpl
    

    Then I run FFmpeg with the following commands:

    -y -hide_banner -i input.MOV -vcodec libx265 -x265-params profile=main output.mov
    -y -hide_banner -i input.MOV -vcodec libx265 -x265-params profile=main10 output.mov
    

    However, regardless of the profile I set, the output always ends up being encoded with the Main10 profile.

    Here’s the log output:

    x265 [info]: HEVC encoder version 3.4
    x265 [info]: build info [Mac OS X][clang 16.0.0][32 bit][noasm] 10bit
    x265 [info]: using cpu capabilities: none!
    x265 [info]: Main 10 profile, Level-5 (Main tier)
    

    It seems that this FFmpeg build only supports 10-bit encoding. I want to be able to encode with different profiles (main, main10, etc.) and to build libx265 so that it supports 8-bit + 10-bit + 12-bit encoding (a unified 64-bit build), just as the same command produces on macOS:

    x265 [info]: HEVC encoder version 4.1+1-1d117be
    x265 [info]: build info [Mac OS X][clang 16.0.0][64 bit] 8bit+10bit+12bit
    x265 [info]: using cpu capabilities: NEON Neon_DotProd Neon_I8MM
    x265 [info]: Main profile, Level-2.1 (Main tier)
    

    How can I correctly configure and build FFmpeg and libx265 with Mobile FFmpeg to support multiple bit depths and allow setting different profiles via -x265-params?
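
    Not part of the question, but a hedged sketch of the standard x265 "multilib" build (what x265's own build/linux/multilib.sh does): the 10-bit and 12-bit libraries are built without the exported C API and then linked into the 8-bit library, so a single libx265 exposes all three depths. The exact cross-compile flags for the ffmpeg-kit iOS toolchain are omitted and would have to be added; a 64-bit target is required for the unified build.

    # 12-bit and 10-bit internal libraries (no exported C API, no CLI)
    cmake -S x265/source -B build-12bit -DHIGH_BIT_DEPTH=ON -DMAIN12=ON \
      -DEXPORT_C_API=OFF -DENABLE_SHARED=OFF -DENABLE_CLI=OFF
    cmake --build build-12bit
    cmake -S x265/source -B build-10bit -DHIGH_BIT_DEPTH=ON \
      -DEXPORT_C_API=OFF -DENABLE_SHARED=OFF -DENABLE_CLI=OFF
    cmake --build build-10bit

    # 8-bit library with the other two depths linked in
    mkdir -p extra-libs
    cp build-10bit/libx265.a extra-libs/libx265_main10.a
    cp build-12bit/libx265.a extra-libs/libx265_main12.a
    cmake -S x265/source -B build-8bit -DENABLE_SHARED=OFF \
      -DEXTRA_LIB="x265_main10.a;x265_main12.a" \
      -DEXTRA_LINK_FLAGS=-L$(pwd)/extra-libs \
      -DLINKED_10BIT=ON -DLINKED_12BIT=ON
    cmake --build build-8bit

    # merge into a single static library (libtool on macOS/iOS, ar elsewhere)
    libtool -static -o libx265.a build-8bit/libx265.a \
      extra-libs/libx265_main10.a extra-libs/libx265_main12.a

    With a multi-depth libx265 linked into FFmpeg, the bit depth is then normally selected through the output pixel format (for example -pix_fmt yuv420p for Main, -pix_fmt yuv420p10le for Main 10) rather than through -x265-params profile=... alone.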