Other articles (53)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or later. If needed, contact the administrator of your MediaSPIP to find out.

  • Apache-specific configuration

    4 February 2011

    Specific modules
    For the Apache configuration, it is recommended to enable certain modules that are not specific to MediaSPIP but that improve performance: mod_deflate and mod_headers, so that Apache compresses pages automatically (see this tutorial); mod_expires, to handle the expiration of hits correctly (see this tutorial).
    It is also recommended to add Apache support for the WebM MIME type, as described in this tutorial.
    Creating a (...)
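
    As a rough illustration, the module setup described above might look like the following in an Apache configuration file. This is a sketch only; the specific content types and cache lifetime shown are assumptions, not taken from the tutorials referenced above:

    ```apache
    # Compress text-based responses automatically (mod_deflate)
    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>

    # Handle the expiration of hits (mod_expires); lifetime is an example value
    <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType image/png "access plus 1 month"
    </IfModule>

    # MIME type for WebM files
    AddType video/webm .webm
    ```

    On Debian-based systems, the modules can typically be enabled with `a2enmod deflate headers expires`, followed by an Apache restart.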

  • Contribute to documentation

    13 April 2011

    Documentation is vital to the development of improved technical capabilities.
    MediaSPIP welcomes documentation by users as well as developers, including:
    - critiques of existing features and functions
    - articles contributed by developers, administrators, content producers and editors
    - screenshots to illustrate the above
    - translations of existing documentation into other languages
    To contribute, register to the project users’ mailing (...)

On other sites (6467)

  • What's the most desirable way to capture system display and audio in the form of individual encoded audio and video packets in Go (the language)? [closed]

    11 January 2023, by Tiger Yang

    Question (read the context below first):

    For those of you familiar with the capabilities of Go: is there a better way to go about all this? Since ffmpeg is so ubiquitous, I'm sure it's been optimized to perfection, but what's the best way to capture system display and audio in the form of individual encoded audio and video packets in Go, so that they can then be sent via webtransport-go? I want it to prioritize efficiency and low latency, and ideally to capture and encode the framebuffer directly like ffmpeg does.

    Thanks! I have many other questions about this, but I think it's best to ask as I go.

    Context and what I've done so far:

    I'm writing remote desktop software for my personal use because of grievances with the current solutions out there. At the moment, it consists of a web app that uses the WebTransport API to send input datagrams and receive AV packets on two dedicated unidirectional streams, and the WebCodecs API to decode these packets. On the server side, I originally planned to use Python with the aioquic library as a WebTransport server. Upon connection and authentication, the server would start ffmpeg as a subprocess with this command:

    ffmpeg -init_hw_device d3d11va -filter_complex ddagrab=video_size=1920x1080:framerate=60 -vcodec hevc_nvenc -tune ll -preset p7 -spatial_aq 1 -temporal_aq 1 -forced-idr 1 -rc cbr -b:v 400K -no-scenecut 1 -g 216000 -f hevc -

    What I really appreciate about this is that it uses Windows' Desktop Duplication API to copy the framebuffer of my GPU and hand it directly to the on-die hardware encoder, with zero round trips to the CPU. I think it's about as efficient and elegant a solution as I can manage. It then writes the encoded stream to stdout, which Python can read and send to the client.

    As for the audio, there is another ffmpeg instance:

    ffmpeg -f dshow -channels 2 -sample_rate 48000 -sample_size 16 -audio_buffer_size 15 -i audio="RD Audio (High Definition Audio Device)" -acodec libopus -vbr on -application audio -mapping_family 0 -apply_phase_inv true -b:a 25K -fec false -packet_loss 0 -map 0 -f data -

    which listens to a physical loopback interface, which is literally just a short wire bridging the front-panel headphone and microphone jacks (I'm aware of the quality loss of converting to analog and back, but the audio is then crushed down to 25 kbps, so it's fine).

    Unfortunately, aioquic was not easy to work with IMO, and I found webtransport-go (https://github.com/adriancable/webtransport-go), which was a hell of a lot better in both simplicity and documentation. However, now I'm dealing with a whole new language, which brings me to the question above.

    EDIT: Here's the code for my server so far:

    package main

import (
    "bytes"
    "context"
    "fmt"
    "log"
    "net/http"
    "os/exec"
    "time"

    "github.com/adriancable/webtransport-go"
)

func warn(str string) {
    fmt.Printf("\n===== WARNING ===================================================================================================\n   %s\n=================================================================================================================\n", str)
}

func main() {

    password := []byte("abc")

    videoString := []string{
        "ffmpeg",
        "-init_hw_device", "d3d11va",
        "-filter_complex", "ddagrab=video_size=1920x1080:framerate=60",
        "-vcodec", "hevc_nvenc",
        "-tune", "ll",
        "-preset", "p7",
        "-spatial_aq", "1",
        "-temporal_aq", "1",
        "-forced-idr", "1",
        "-rc", "cbr",
        "-b:v", "500K",
        "-no-scenecut", "1",
        "-g", "216000",
        "-f", "hevc", "-",
    }

    audioString := []string{
        "ffmpeg",
        "-f", "dshow",
        "-channels", "2",
        "-sample_rate", "48000",
        "-sample_size", "16",
        "-audio_buffer_size", "15",
        "-i", "audio=RD Audio (High Definition Audio Device)",
        "-acodec", "libopus",
        "-mapping_family", "0",
        "-b:a", "25K",
        "-map", "0",
        "-f", "data", "-",
    }

    connected := false

    http.HandleFunc("/", func(writer http.ResponseWriter, request *http.Request) {
        session := request.Body.(*webtransport.Session)

        session.AcceptSession()
        fmt.Println("\nAccepted incoming WebTransport connection.")
        fmt.Println("Awaiting authentication...")

        authData, err := session.ReceiveMessage(session.Context()) // Waits here till first datagram
        if err != nil {                                            // if client closes connection before sending anything
            fmt.Println("\nConnection closed:", err)
            return
        }

        if len(authData) >= 2 && bytes.Equal(authData[2:], password) {
            if connected {
                session.CloseSession()
                warn("Client has authenticated, but a session is already taking place! Connection closed.")
                return
            } else {
                connected = true
                fmt.Println("Client has authenticated!\n")
            }
        } else {
            session.CloseSession()
            warn("Client has failed authentication! Connection closed. (" + string(authData[2:]) + ")")
            return
        }

        videoStream, _ := session.OpenUniStreamSync(session.Context())

        videoCmd := exec.Command(videoString[0], videoString[1:]...)
        go func() {
            videoOut, _ := videoCmd.StdoutPipe()
            videoCmd.Start()

            buffer := make([]byte, 15000)
            for {
                n, err := videoOut.Read(buffer) // "n", not "len", to avoid shadowing the builtin
                if err != nil {
                    break
                }
                if n > 0 {
                    videoStream.Write(buffer[:n])
                }
            }
        }()

        time.Sleep(50 * time.Millisecond)

        audioStream, _ := session.OpenUniStreamSync(session.Context())

        audioCmd := exec.Command(audioString[0], audioString[1:]...)
        go func() {
            audioOut, _ := audioCmd.StdoutPipe()
            audioCmd.Start()

            buffer := make([]byte, 15000)
            for {
                n, err := audioOut.Read(buffer) // "n", not "len", to avoid shadowing the builtin
                if err != nil {
                    break
                }
                if n > 0 {
                    audioStream.Write(buffer[:n])
                }
            }
        }()

        for {
            data, err := session.ReceiveMessage(session.Context())
            if err != nil {
                videoCmd.Process.Kill()
                audioCmd.Process.Kill()

                connected = false

                fmt.Println("\nConnection closed:", err)
                break
            }

            if len(data) > 0 && data[0] == byte(0) {
                fmt.Printf("Received mouse datagram: %s\n", data)
            }
        }

    })

    server := &webtransport.Server{
        ListenAddr: ":1024",
        TLSCert:    webtransport.CertFile{Path: "SSL/fullchain.pem"},
        TLSKey:     webtransport.CertFile{Path: "SSL/privkey.pem"},
        QuicConfig: &webtransport.QuicConfig{
            KeepAlive:      false,
            MaxIdleTimeout: 3 * time.Second,
        },
    }

    fmt.Println("Launching WebTransport server at", server.ListenAddr)
    ctx, cancel := context.WithCancel(context.Background())
    defer cancel() // release the context on all exit paths
    if err := server.Run(ctx); err != nil {
        log.Fatal(err)
    }

}

  • How to get a snapshot from a video MemoryStream or byte[] using FFMpegCore instead of a file path?

    12 July 2024, by Akash Kadia

    I am trying to get snapshots from video data that is in a MemoryStream or byte[], not at a physical file path. FFMpegCore provides an option to use Arguments with a PipeSource, but I am not sure how to use it. I have updated the code for taking a snapshot from a Stream as below, but it fails.

    


public static async Task<Bitmap> SnapshotAsync(Stream input, IMediaAnalysis source, Size? size = null, TimeSpan? captureTime = null, int? streamIndex = null, int inputFileIndex = 0)
{
    input.Seek(0, SeekOrigin.Begin);
    FFMpegCore.Pipes.StreamPipeSource streamPipeSource = new FFMpegCore.Pipes.StreamPipeSource(input);
    var (arguments, outputOptions) = BuildSnapshotArguments(streamPipeSource, source, size, captureTime, streamIndex, inputFileIndex);
    using var ms = new MemoryStream();

    await arguments
        .OutputToPipe(new StreamPipeSink(ms), options => outputOptions(options
            .ForceFormat("rawvideo")))
        .ProcessAsynchronously().ConfigureAwait(false);

    ms.Position = 0;
    return new Bitmap(ms);
}

private static (FFMpegArguments, Action<FFMpegArgumentOptions> outputOptions) BuildSnapshotArguments(
    IPipeSource input,
    IMediaAnalysis source,
    Size? size = null,
    TimeSpan? captureTime = null,
    int? streamIndex = null,
    int inputFileIndex = 0)
{
    captureTime ??= TimeSpan.FromSeconds(source.Duration.TotalSeconds / 3);
    size = PrepareSnapshotSize(source, size);
    streamIndex ??= source.PrimaryVideoStream?.Index
                    ?? source.VideoStreams.FirstOrDefault()?.Index
                    ?? 0;

    return (FFMpegArguments
        .FromPipeInput(input, options => options
             .Seek(captureTime)),
        options => options
            .SelectStream((int)streamIndex, inputFileIndex)
            .WithVideoCodec(VideoCodec.Png)
            .WithFrameOutputCount(1)
            .Resize(size));
}

private static Size? PrepareSnapshotSize(IMediaAnalysis source, Size? wantedSize)
{
    if (wantedSize == null || (wantedSize.Value.Height <= 0 && wantedSize.Value.Width <= 0) || source.PrimaryVideoStream == null)
        return null;

    var currentSize = new Size(source.PrimaryVideoStream.Width, source.PrimaryVideoStream.Height);
    if (source.PrimaryVideoStream.Rotation == 90 || source.PrimaryVideoStream.Rotation == 180)
        currentSize = new Size(source.PrimaryVideoStream.Height, source.PrimaryVideoStream.Width);

    if (wantedSize.Value.Width != currentSize.Width || wantedSize.Value.Height != currentSize.Height)
    {
        if (wantedSize.Value.Width <= 0 && wantedSize.Value.Height > 0)
        {
            var ratio = (double)wantedSize.Value.Height / currentSize.Height;
            return new Size((int)(currentSize.Width * ratio), (int)(currentSize.Height * ratio));
        }
        if (wantedSize.Value.Height <= 0 && wantedSize.Value.Width > 0)
        {
            var ratio = (double)wantedSize.Value.Width / currentSize.Width;
            return new Size((int)(currentSize.Width * ratio), (int)(currentSize.Height * ratio));
        }
        return wantedSize;
    }

    return null;
}


    It gives an error in the SnapshotAsync function at this line:


await arguments
    .OutputToPipe(new StreamPipeSink(ms), options => outputOptions(options
        .ForceFormat("rawvideo")))
    .ProcessAsynchronously().ConfigureAwait(false);


    Here is the full error message:


ffmpeg exited with non-zero exit-code (1 - ffmpeg version 2021-04-04-git-b1b7cc698b-full_build-www.gyan.dev Copyright (c) 2000-2021 the FFmpeg developers
built with gcc 10.2.0 (Rev6, Built by MSYS2 project)
configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-lzma --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libdav1d --enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-libaom --enable-libopenjpeg --enable-libvpx --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-ffnvcodec --enable-nvdec --enable-nvenc --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-libglslang --enable-vulkan --enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame --enable-libvo-amrwbenc --enable-libilbc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite --enable-libmysofa --enable-librubberband --enable-libsoxr --enable-chromaprint
libavutil 56. 72.100 / 56. 72.100
libavcodec 58.135.100 / 58.135.100
libavformat 58. 77.100 / 58. 77.100
libavdevice 58. 14.100 / 58. 14.100
libavfilter 7.111.100 / 7.111.100
libswscale 5. 10.100 / 5. 10.100
libswresample 3. 10.100 / 3. 10.100
libpostproc 55. 10.100 / 55. 10.100
[mov,mp4,m4a,3gp,3g2,mj2 @ 000001c78845f040] Could not find codec parameters for stream 0 (Video: h264 (avc1 / 0x31637661), none, 1280x720, 4716 kb/s): unspecified pixel format
Consider increasing the value for the 'analyzeduration' (0) and 'probesize' (5000000) options
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '\.\pipe\FFMpegCore_4599336d-fbf8-430e-ab89-19082c7d3693':
Metadata:
major_brand : mp42
minor_version : 0
compatible_brands: mp41isom
creation_time : 2021-11-17T11:53:33.000000Z
Duration: 00:00:03.62, start: 0.000000, bitrate: N/A
Stream #0:0(und): Video: h264 (avc1 / 0x31637661), none, 1280x720, 4716 kb/s, 15.20 fps, 15.08 tbr, 30k tbn, 60k tbc (default)
Metadata:
creation_time : 2021-11-17T11:53:33.000000Z
handler_name : VideoHandler
vendor_id : [0][0][0][0]
encoder : AVC Coding
Stream #0:1(und): Audio: aac (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 170 kb/s (default)
Metadata:
creation_time : 2021-11-17T11:53:33.000000Z
handler_name : SoundHandler
vendor_id : [0][0][0][0]
Stream mapping:
Stream #0:0 -> #0:0 (h264 (native) -> png (native))
Press [q] to stop, [?] for help
\.\pipe\FFMpegCore_4599336d-fbf8-430e-ab89-19082c7d3693: Invalid argument
Cannot determine format of input stream 0:0 after EOF
Error marking filters as finished
Conversion failed!)


    I am just passing a video file:


using (Stream fms = File.OpenRead(txtVideoPath.Text))
{
    fms.Seek(0, SeekOrigin.Begin);
    using (MemoryStream vms = new MemoryStream())
    {
        fms.CopyTo(vms);
        vms.Seek(0, SeekOrigin.Begin);
        // pass vms to snapshot function
        ....
    }
}


    Is there any way to get a snapshot from a stream without storing the file physically?


  • FFmpeg Video Overlay Function Not Working (Flutter)

    30 August 2022, by Dennis Ashford

    I am simply trying to get a watermarked video from the Flutter FFmpeg package and cannot seem to get it to work. For right now, I am downloading the video from Firebase, storing it in cache, and overlaying a watermark image using the Flutter FFmpegKit package. The original video file, image file, and output file are all created properly. However, the overlay function is not working. Can someone help me with the FFmpegKit.execute function string? I am not seeing any obvious errors in the function, but it isn't working properly. I tried writing as a new File and asBytes, but neither worked.


    Edit: I want to take the originalVideo and overlay it with the watermark image, with the expected outcome being a new file that contains the new overlaid video. I am not concerned with where the overlay is yet.


    Pubspec.yaml


ffmpeg_kit_flutter: ^4.5.1


    The function is:


import 'package:ffmpeg_kit_flutter/ffmpeg_kit.dart';

Future<File> waterMarkVideo(String videoPath, String watermarkPath) async {
    //these calls are to load the video into temporary directory
    final response = await http.get(Uri.parse(videoPath));
    final originalVideo = File('${(await getTemporaryDirectory()).path}/video.mp4');
    await originalVideo.create(recursive: true);
    await originalVideo.writeAsBytes(response.bodyBytes);
    print('video path' + originalVideo.path);

    //this grabs the watermark image from assets and decodes it
    final byteData = await rootBundle.load(watermarkPath);
    final watermark = File('${(await getTemporaryDirectory()).path}/image.png');
    await watermark.create(recursive: true);
    await watermark.writeAsBytes(byteData.buffer.asUint8List(byteData.offsetInBytes, byteData.lengthInBytes));
    print('watermark path' + watermark.path);

    //this creates temporary directory for new watermarked video
    var tempDir = await getTemporaryDirectory();
    final newVideoPath = '${tempDir.path}/${DateTime.now().microsecondsSinceEpoch}result.mp4';
    final videoFile = await File(newVideoPath).create();

    //overlaying video using FFmpegKit where I need some help
    await FFmpegKit.executeAsync("-i $originalVideo -i $watermark -filter_complex 'overlay=(W-w)/2:(H-h)/2' $videoFile")
    .then((session) async {
      final state = FFmpegKitConfig.sessionStateToString(await session.getState());
      final returnCode = await session.getReturnCode();
      final failStackTrace = await session.getFailStackTrace();
      final output = await session.getOutput();
      print('FFmpeg process exited with state ${state} and rs ${returnCode}.If failed ${failStackTrace}');
      print('Last output $output');
    });
    print('new video path' + newVideoPath);
    Uint8List videoByteData = await videoFile.readAsBytes();
    return videoFile.writeAsBytes(videoByteData);
}


    The output of the function above is:


flutter: video path/Users/dennisashford/Library/Developer/CoreSimulator/Devices/FBFB4D51-EC31-47DF-8FE0-66B114806EA4/data/Containers/Data/Application/0EA3BB35-D364-4B98-A104-15CDB17AAD54/Library/Caches/video.mp4
flutter: watermark path/Users/dennisashford/Library/Developer/CoreSimulator/Devices/FBFB4D51-EC31-47DF-8FE0-66B114806EA4/data/Containers/Data/Application/0EA3BB35-D364-4B98-A104-15CDB17AAD54/Library/Caches/image.png
ffmpeg version v4.5-dev-3393-g30322ebe3c Copyright (c) 2000-2021 the FFmpeg developers
  built with Apple clang version 13.0.0 (clang-1300.0.29.30)
  configuration: --cross-prefix=x86_64-ios-darwin- --sysroot=/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator15.2.sdk --prefix=/Users/taner/Projects/ffmpeg-kit/prebuilt/apple-ios-x86_64/ffmpeg --pkg-config=/opt/homebrew/bin/pkg-config --enable-version3 --arch=x86_64 --cpu=x86_64 --target-os=darwin --disable-neon --disable-asm --ar=/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ar --cc=clang --cxx=clang++ --as='clang -arch x86_64 -target x86_64-apple-ios12.1-simulator -march=x86-64 -msse4.2 -mpopcnt -m64 -DFFMPEG_KIT_X86_64 -Wno-unused-function -Wno-deprecated-declarations -fstrict-aliasing -DIOS -DFFMPEG_KIT_BUILD_DATE=20220114 -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator15.2.sdk -O2 -mios-simulator-version-min=12.1 -I/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator15.2.sdk/usr/include' --ranlib=/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ranlib --strip=/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/strip --nm=/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/nm --extra-ldflags='-mios-simulator-version-min=12.1' --disable-autodetect --enable-cross-compile --enable-pic --enable-inline-asm --enable-optimizations --enable-swscale --enable-shared --disable-static --install-name-dir='@rpath' --enable-pthreads --disable-v4l2-m2m --disable-outdev=v4l2 --disable-outdev=fbdev --disable-indev=v4l2 --disable-indev=fbdev --enable-small --disable-xmm-clobber-test --disable-debug --disable-neon-clobber-test --disable-programs --disable-postproc --disable-doc --disable-htmlpages --disable-manpages --disable-podpages --disable-txtpages --disable-sndio --disable-schannel --disable-securetransport --disable-xlib --disable-cuda --disable-cuvid --disable-nvenc --disable-vaapi --disable-vdpau --disable-alsa --disable-cuda --disable-cuvid --disable-nvenc --disable-vaapi --disable-vdpau --enable-gmp --enable-gnutls --disable-sdl2 --disable-openssl --enable-zlib --enable-audiotoolbox --disable-outdev=audiotoolbox --enable-bzlib --enable-videotoolbox --enable-avfoundation --enable-iconv --disable-coreimage --disable-appkit --disable-opencl --disable-opengl
  libavutil      57. 13.100 / 57. 13.100
  libavcodec     59. 15.102 / 59. 15.102
  libavformat    59. 10.100 / 59. 10.100
  libavdevice    59.  1.100 / 59.  1.100
  libavfilter     8. 21.100 /  8. 21.100
  libswscale      6.  1.102 /  6.  1.102
  libswresample   4.  0.100 /  4.  0.100
File:: Protocol not found
Did you mean file:File:?
flutter: FFmpeg process exited with state RUNNING and rs 1.If failed null
flutter: Last output ffmpeg version v4.5-dev-3393-g30322ebe3c Copyright (c) 2000-2021 the FFmpeg developers
flutter: File:: Protocol not found
flutter: Did you mean file:File:?
flutter:
flutter: new video path/Users/dennisashford/Library/Developer/CoreSimulator/Devices/FBFB4D51-EC31-47DF-8FE0-66B114806EA4/data/Containers/Data/Application/0EA3BB35-D364-4B98-A104-15CDB17AAD54/Library/Caches/1661892011213545result.mp4


    Update: I built the code on a physical device and was able to get a little more information about what may be the problem. The error may be coming from


File:: Protocol not found
Did you mean file:File:?


    Complete logs from the physical device are given below:


flutter: video path/var/mobile/Containers/Data/Application/70EA92CD-DF5D-440E-BBE1-C130BC7F0CC7/Library/Caches/video.mp4
flutter: watermark path/var/mobile/Containers/Data/Application/70EA92CD-DF5D-440E-BBE1-C130BC7F0CC7/Library/Caches/image.png
ffmpeg version v4.5-dev-3393-g30322ebe3c Copyright (c) 2000-2021 the FFmpeg developers
  built with Apple clang version 13.0.0 (clang-1300.0.29.30)
  configuration: --cross-prefix=arm64-ios-darwin- --sysroot=/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS15.2.sdk --prefix=/Users/taner/Projects/ffmpeg-kit/prebuilt/apple-ios-arm64/ffmpeg --pkg-config=/opt/homebrew/bin/pkg-config --enable-version3 --arch=aarch64 --cpu=armv8 --target-os=darwin --enable-neon --enable-asm --ar=/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ar --cc=clang --cxx=clang++ --as='/Users/taner/Projects/ffmpeg-kit/.tmp/gas-preprocessor.pl -arch aarch64 -- clang -arch arm64 -target arm64-apple-ios12.1 -march=armv8-a+crc+crypto -mcpu=generic -DFFMPEG_KIT_ARM64 -Wno-unused-function -Wno-deprecated-declarations -fstrict-aliasing -fembed-bitcode -DIOS -DFFMPEG_KIT_BUILD_DATE=20220114 -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS15.2.sdk -Oz -miphoneos-version-min=12.1 -I/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS15.2.sdk/usr/include' --ranlib=/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ranlib --strip=/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/strip --nm=/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/nm --extra-ldflags='-miphoneos-version-min=12.1' --disable-autodetect --enable-cross-compile --enable-pic --enable-inline-asm --enable-optimizations --enable-swscale --enable-shared --disable-static --install-name-dir='@rpath' --enable-pthreads --disable-v4l2-m2m --disable-outdev=v4l2 --disable-outdev=fbdev --disable-indev=v4l2 --disable-indev=fbdev --enable-small --disable-xmm-clobber-test --disable-debug --disable-neon-clobber-test --disable-programs --disable-postproc --disable-doc --disable-htmlpages --disable-manpages --disable-podpages --disable-txtpages --disable-sndio --disable-schannel --disable-securetransport --disable-xlib --disable-cuda --disable-cuvid --disable-nvenc --disable-vaapi --disable-vdpau --disable-alsa --disable-cuda --disable-cuvid --disable-nvenc --disable-vaapi --disable-vdpau --enable-gmp --enable-gnutls --disable-sdl2 --disable-openssl --enable-zlib --enable-audiotoolbox --disable-outdev=audiotoolbox --enable-bzlib --enable-videotoolbox --enable-avfoundation --enable-iconv --disable-coreimage --disable-appkit --disable-opencl --disable-opengl
  libavutil      57. 13.100 / 57. 13.100
  libavcodec     59. 15.102 / 59. 15.102
  libavformat    59. 10.100 / 59. 10.100
  libavdevice    59.  1.100 / 59.  1.100
  libavfilter     8. 21.100 /  8. 21.100
  libswscale      6.  1.102 /  6.  1.102
  libswresample   4.  0.100 /  4.  0.100
File:: Protocol not found
Did you mean file:File:?
flutter: FFmpeg process exited with state RUNNING and rs 1.If failed null
flutter: Last output ffmpeg version v4.5-dev-3393-g30322ebe3c Copyright (c) 2000-2021 the FFmpeg developers
File:: Protocol not found
Did you mean file:File:?
flutter: new video path/var/mobile/Containers/Data/Application/70EA92CD-DF5D-440E-BBE1-C130BC7F0CC7/Library/Caches/1661891272856433result.mp4


    Update #2: after a lot of help from @kesh below, this function was actually able to overlay the image on the video.


await FFmpegKit.executeAsync('-i $videoPath -i ${watermark.path} -filter_complex "overlay=10:10" -y ${videoFile.path}')
