Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • FFMPEG send RTP audio at 8k bytes/sec [closed]

    May 10, by Muzza

    I'm trying to use FFmpeg to mimic a device that transmits G711U audio over UDP/RTP at 8k bytes per second. The device I'm mimicking sends RTP packets every 20 ms with a 160-byte payload.

    I've had limited success using the following command

    ffmpeg -f dshow -i audio="Microphone (Realtek(R) Audio)" -ac 1 -ar 8000 -ab 8 -acodec pcm_mulaw -f rtp rtp://127.0.0.1:12345?pkt_size=160
    

    This sends G711U-encoded audio in 160-byte chunks, but it streams at 64 kB/s rather than the 8 kB/s my device expects, so the device errors out.

    Any ideas would be massively appreciated!

    Thank you

    Log from FFMPEG

    >ffmpeg -f dshow -i audio="Microphone (Realtek(R) Audio)" -ac 1 -ar 8000 -ab 8 -acodec pcm_mulaw -f rtp rtp://127.0.0.1:12345?pkt_size=160
    ffmpeg version 2025-04-23-git-25b0a8e295-essentials_build-www.gyan.dev Copyright (c) 2000-2025 the FFmpeg developers
      built with gcc 14.2.0 (Rev3, Built by MSYS2 project)
      configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-zlib --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-sdl2 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-libaom --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-dxva2 --enable-d3d11va --enable-d3d12va --enable-ffnvcodec --enable-libvpl --enable-nvdec --enable-nvenc --enable-vaapi --enable-libgme --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libtheora --enable-libvo-amrwbenc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-librubberband
      libavutil      60.  2.100 / 60.  2.100
      libavcodec     62.  0.101 / 62.  0.101
      libavformat    62.  0.100 / 62.  0.100
      libavdevice    62.  0.100 / 62.  0.100
      libavfilter    11.  0.100 / 11.  0.100
      libswscale      9.  0.100 /  9.  0.100
      libswresample   6.  0.100 /  6.  0.100
      libpostproc    59.  1.100 / 59.  1.100
    [aist#0:0/pcm_s16le @ 00000198256b73c0] Guessed Channel Layout: stereo
    Input #0, dshow, from 'audio=Microphone (Realtek(R) Audio)':
      Duration: N/A, start: 135470.702000, bitrate: 1411 kb/s
      Stream #0:0: Audio: pcm_s16le, 44100 Hz, stereo, s16, 1411 kb/s, Start-Time 135470.702s
    Stream mapping:
      Stream #0:0 -> #0:0 (pcm_s16le (native) -> pcm_mulaw (native))
    Press [q] to stop, [?] for help
    [pcm_mulaw @ 00000198256cf240] Bitrate 8 is extremely low, maybe you mean 8k
    Output #0, rtp, to 'rtp://127.0.0.1:12345?pkt_size=160':
      Metadata:
        encoder         : Lavf62.0.100
      Stream #0:0: Audio: pcm_mulaw, 8000 Hz, mono, s16 (8 bit), 64 kb/s
        Metadata:
          encoder         : Lavc62.0.101 pcm_mulaw
    SDP:
    v=0
    o=- 0 0 IN IP4 127.0.0.1
    s=No Name
    c=IN IP4 127.0.0.1
    t=0 0
    a=tool:libavformat 62.0.100
    m=audio 12345 RTP/AVP 0
    b=AS:64
    
    [out#0/rtp @ 00000198256cdd00] video:0KiB audio:973KiB subtitle:0KiB other streams:0KiB global headers:0KiB muxing overhead: 8.467470%
    size=    1055KiB time=00:02:04.51 bitrate=  69.4kbits/s speed=   1x
    Exiting normally, received signal 2.
    

    Wireshark log:

    It shows packets being sent every ~0.20 ms.
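    For what it's worth, the rate arithmetic may be the whole story here: G.711 µ-law at 8 kHz mono is exactly one byte per sample, so 64 kbit/s and 8 kbytes/s are the same rate, and a 20 ms packet carries exactly 160 bytes. A quick sketch in plain shell arithmetic (no FFmpeg involved):

```shell
# G.711 (pcm_mulaw) at 8 kHz mono: one byte per sample.
bytes_per_sec=$((8000 * 1))                      # 8000 B/s
kbits_per_sec=$((bytes_per_sec * 8 / 1000))      # 64 kbit/s -- what the log prints
payload_per_20ms=$((bytes_per_sec * 20 / 1000))  # 160 B per 20 ms RTP packet
echo "${bytes_per_sec} B/s = ${kbits_per_sec} kbit/s, ${payload_per_20ms} B per 20 ms"
```

    So the "64 kb/s" in the FFmpeg log is kilobits per second, which is the same as the 8 kbytes/s the device expects; if the device still rejects the stream, packet pacing (the inter-packet interval seen in Wireshark) seems a more likely culprit than bitrate.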

  • Could someone please explain this filter graph?

    May 9, by Wynell

    https://www.ffmpeg.org/doxygen/trunk/filtering_video_8c-example.html

        filter_graph = avfilter_graph_alloc();
        // ...
        const AVFilter *buffersrc  = avfilter_get_by_name("buffer");
        const AVFilter *buffersink = avfilter_get_by_name("buffersink");
        // ...
        /* buffer video source: the decoded frames from the decoder will be inserted here. */
        snprintf(args, sizeof(args),
                "video_size=%dx%d:pix_fmt=%d:time_base=%d/%d:pixel_aspect=%d/%d",
                dec_ctx->width, dec_ctx->height, dec_ctx->pix_fmt,
                time_base.num, time_base.den,
                dec_ctx->sample_aspect_ratio.num, dec_ctx->sample_aspect_ratio.den);
     
        ret = avfilter_graph_create_filter(&buffersrc_ctx, buffersrc, "in",
                                           args, NULL, filter_graph);
        /* buffer video sink: to terminate the filter chain. */
        ret = avfilter_graph_create_filter(&buffersink_ctx, buffersink, "out",
                                           NULL, NULL, filter_graph);
        // ...
        /*
         * Set the endpoints for the filter graph. The filter_graph will
         * be linked to the graph described by filters_descr.
         */
     
        /*
         * The buffer source output must be connected to the input pad of
         * the first filter described by filters_descr; since the first
         * filter input label is not specified, it is set to "in" by
         * default.
         */
        outputs->name       = av_strdup("in");
        outputs->filter_ctx = buffersrc_ctx;
        outputs->pad_idx    = 0;
        outputs->next       = NULL;
     
        /*
         * The buffer sink input must be connected to the output pad of
         * the last filter described by filters_descr; since the last
         * filter output label is not specified, it is set to "out" by
         * default.
         */
        inputs->name       = av_strdup("out");
        inputs->filter_ctx = buffersink_ctx;
        inputs->pad_idx    = 0;
        inputs->next       = NULL;
     
        if ((ret = avfilter_graph_parse_ptr(filter_graph, filters_descr,
                                        &inputs, &outputs, NULL)) < 0)
    

    I can see what each of these parts does separately, but I can't combine them in my head.
    Are these "in" and "out" filters the same everywhere?
    Which filter or graph do these inputs and outputs belong to? If the parsed graph takes its input from the "in" filter, why is that filter stored in the outputs variable (and vice versa)? What is even the role of these variables?
    Could you please explain, step by step, how this code works?

  • RN 0.74.7 can't find repo for ffmpeg-kit-react-native 6.0.2

    May 9, by user938363

    My React Native 0.74.7 app (on macOS) has a hard time finding the ffmpeg-kit-react-native artifact when running run-android. It constantly complains that the package cannot be found in any Maven repository. Here is the error from react-native run-android:

    * What went wrong:
    Could not determine the dependencies of task ':app:processDebugResources'.
    > Could not resolve all task dependencies for configuration ':app:debugRuntimeClasspath'.
       > Could not find com.arthenica:ffmpeg-kit-full-gpl:6.0.
         Searched in the following locations:
           - https://oss.sonatype.org/content/repositories/snapshots/com/arthenica/ffmpeg-kit-full-gpl/6.0/ffmpeg-kit-full-gpl-6.0.pom
           - https://repo.maven.apache.org/maven2/com/arthenica/ffmpeg-kit-full-gpl/6.0/ffmpeg-kit-full-gpl-6.0.pom
           - file:/Users/macbook/Documents/code/js/VmonFront/node_modules/jsc-android/dist/com/arthenica/ffmpeg-kit-full-gpl/6.0/ffmpeg-kit-full-gpl-6.0.pom
           - https://dl.google.com/dl/android/maven2/com/arthenica/ffmpeg-kit-full-gpl/6.0/ffmpeg-kit-full-gpl-6.0.pom
           - https://www.jitpack.io/com/arthenica/ffmpeg-kit-full-gpl/6.0/ffmpeg-kit-full-gpl-6.0.pom
         Required by:
             project :app > project :ffmpeg-kit-react-native
    

    In android/build.gradle, the repo was specified:

    buildscript {
        ext {
            buildToolsVersion = "34.0.0"
            minSdkVersion = 23
            compileSdkVersion = 34
            targetSdkVersion = 34
            ndkVersion = "26.1.10909125"
            kotlinVersion = "1.9.22"
    
            ffmpegKitPackage = "full-gpl"
        }
    
        repositories {
            google()
            mavenCentral()
            maven { url "https://jitpack.io" }
            // You can comment out the below line if it causes issues:
            maven { url 'https://maven.arthenica.com/public/' } //<<<===repo here
        }
    
        dependencies {
            classpath("com.android.tools.build:gradle")
            classpath("com.facebook.react:react-native-gradle-plugin")
            classpath("org.jetbrains.kotlin:kotlin-gradle-plugin")
        }
    }
    
    
    apply plugin: "com.facebook.react.rootproject"
    

    Also the following code is added to android/app/build.gradle:

    // 🔥 Add this block to override the wrong version in ffmpeg's gradle.properties
    configurations.all {
        resolutionStrategy {
            force "com.arthenica:ffmpeg-kit-https:6.0"    // <<<=== 6.0.2 or 6 didn't work
            force "com.arthenica:ffmpeg-kit-full-gpl:6.0" // Include if using other packages
        }
    }
    

    The problem seems specific to RN 0.74.7: my RN 0.78.0 project with ffmpeg-kit-react-native installed was able to download the full-gpl version and build successfully. What is missing to install ffmpeg-kit on RN 0.74.7?
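    Since the com.arthenica ffmpeg-kit binaries were retired and removed from the public repositories, a resolutionStrategy force cannot succeed when no repository serves the POM at all. One workaround sketch, assuming a previously downloaded copy of the AAR and POM is still available (the local-maven directory below is hypothetical):

```groovy
// Hypothetical local mirror: lay the archived artifact out as
//   local-maven/com/arthenica/ffmpeg-kit-full-gpl/6.0/ffmpeg-kit-full-gpl-6.0.pom
//   local-maven/com/arthenica/ffmpeg-kit-full-gpl/6.0/ffmpeg-kit-full-gpl-6.0.aar
// and register it in android/build.gradle so every project can resolve it.
allprojects {
    repositories {
        maven { url uri("${rootDir}/local-maven") }
    }
}
```

    This only changes where Gradle looks; the version requested by ffmpeg-kit-react-native's own build script still has to match the files placed in the local directory.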

  • Android FFmpegKit-Full dependencies missing

    May 9, by Adrián Primo

    I've tried to compile the app on Android, and now it won't run because it can't find these files. I knew the FFmpegKit project was going to remove its binaries, so I downloaded the package locally, but I completely forgot about Android's specific files.

    Launching lib\main.dart on sdk gphone64 x86 64 in debug mode...
    Running Gradle task 'assembleDebug'...
    
    FAILURE: Build failed with an exception.
    
    * What went wrong:
    Execution failed for task ':app:checkDebugAarMetadata'.
    > Could not resolve all files for configuration ':app:debugRuntimeClasspath'.
       > Could not find com.arthenica:ffmpeg-kit-full:6.0-2.
         Searched in the following locations:
           - https://dl.google.com/dl/android/maven2/com/arthenica/ffmpeg-kit-full/6.0-2/ffmpeg-kit-full-6.0-2.pom
           - https://repo.maven.apache.org/maven2/com/arthenica/ffmpeg-kit-full/6.0-2/ffmpeg-kit-full-6.0-2.pom
           - https://jcenter.bintray.com/com/arthenica/ffmpeg-kit-full/6.0-2/ffmpeg-kit-full-6.0-2.pom
           - https://storage.googleapis.com/download.flutter.io/com/arthenica/ffmpeg-kit-full/6.0-2/ffmpeg-kit-full-6.0-2.pom
         Required by:
             project :app > project :ffmpeg_kit_flutter_full
    

    I've tried to download the dependencies manually from the repository, but the owner has removed the files.

    Does anybody have the required files for the ffmpeg-kit-full package? I found a forked repository with the https package installed locally, but I need the full package.

    Dependency in Maven Repository

    Owner's files

  • The flutter_ffmpeg package is discontinued. Is there still a way to use it in a Flutter project?

    May 9, by Saad Mushtaq

    I have been using the flutter_ffmpeg package in a Flutter project. But now, when I run the project, I get a build failure exception. Is there a way to still use it in a Flutter project?

    FAILURE: Build failed with an exception.

    * What went wrong:
    Execution failed for task ':app:checkDebugAarMetadata'.
    > Could not resolve all files for configuration ':app:debugRuntimeClasspath'.
       > Could not find com.arthenica:mobile-ffmpeg-https:4.4.
         Searched in the following locations:
           - https://dl.google.com/dl/android/maven2/com/arthenica/mobile-ffmpeg-https/4.4/mobile-ffmpeg-https-4.4.pom
           - https://repo.maven.apache.org/maven2/com/arthenica/mobile-ffmpeg-https/4.4/mobile-ffmpeg-https-4.4.pom
           - https://storage.googleapis.com/download.flutter.io/com/arthenica/mobile-ffmpeg-https/4.4/mobile-ffmpeg-https-4.4.pom
           - https://jcenter.bintray.com/com/arthenica/mobile-ffmpeg-https/4.4/mobile-ffmpeg-https-4.4.pom
         Required by:
             project :app > project :flutter_ffmpeg