Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • FFMPEG File Output is Still in Use By a Process

    6 May, by Tyler Bacon

    I am trying to complete this part of my program. In this section, I am trying to speed up or slow down a video based on a factor variable. Once it's done, I use moviepy to turn it into a VideoFileClip, then I delete the file.

        if factor <= 2:
            system("ffmpeg -i " + paths[dex] + " -vf setpts=" + str(vfactor) + "*PTS -an ./Media/Videos/temp.mp4")
            system("ffmpeg -i " + paths[dex] + " -filter:a atempo=" + str(factor) + " -vn ./Media/ShortSounds/temp.mp3")
        elif 2 < factor < 4:
            factor = round(sqrt(factor), 1)
            system("ffmpeg -i " + paths[dex] + " -vf setpts=" + str(vfactor) + "*PTS,setpts=" + str(vfactor) + "*PTS  -an ./Media/Videos/temp.mp4")
            system("ffmpeg -i " + paths[dex] + " -filter:a atempo=" + str(factor) + ",atempo=" + str(factor) + " -vn ./Media/ShortSounds/temp.mp3")
        elif factor > 4:
            raise Exception("File " + paths[dex] + " is too long.")
        t = VideoFileClip("./Media/Videos/temp.mp4")
        t.audio = AudioFileClip("./Media/Videos/temp.mp3")
        templist.append(t)
        remove("./Media/Videos/temp.mp4")
    

    However, when the code reaches the deletion call, it raises the following error:

    PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: './Media/Videos/temp.mp4'
    

    What's strange is that I can see the temp.mp4 file, and it plays just fine. I never get this error when opening temp.mp4 manually.

    I have tried the following:

    • Waiting 5, 10, and 20 seconds before deleting the file.
    • Running "taskkill -f -im ffmpeg.exe" before deleting the file
    • I went through the debugger, and right before the deletion, I checked in task manager to see if ffmpeg was still running, and it wasn't.

    Does anyone have any idea what could be holding this up? My code worked previously when I was only handling audio, but now that I have added video this is happening.
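    A minimal sketch of the likely culprit, assuming moviepy's usual behaviour: VideoFileClip reads frames lazily through an ffmpeg reader subprocess, so each clip still sitting in templist keeps an open handle on temp.mp4, and Windows will not delete a file that any process holds open (hence WinError 32 even though ffmpeg.exe itself has exited). That also means a single reused temp.mp4 cannot be deleted per iteration while the clips are still needed; one file per clip, deleted only after the clip is closed, sidesteps both problems:

        from moviepy.editor import VideoFileClip, AudioFileClip
        from os import remove

        # Hypothetical rework: give each iteration its own temp file instead of
        # reusing temp.mp4, and defer deletion until the clip is closed.
        temp_video = "./Media/Videos/temp_" + str(dex) + ".mp4"  # dex as in the question
        t = VideoFileClip(temp_video)
        t.audio = AudioFileClip("./Media/ShortSounds/temp_" + str(dex) + ".mp3")
        templist.append(t)

        # ...after the clips have been used (e.g. concatenated and written out)...
        for clip in templist:
            path = clip.filename  # VideoFileClip remembers its source path
            clip.close()          # terminates the ffmpeg reader, releasing the handle
            remove(path)          # now succeeds: nothing holds the file open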

  • ffmpeg pipe process ends right after writing first buffer data to input stream and does not keep running

    6 May, by Taketo Matsunaga

    I have been trying to convert 16-bit PCM (s16le) audio data to WebM using ffmpeg in C#. But the process ends right after the first buffer of data is written to standard input. It exits with status 0, meaning success, but I do not know why. Could anyone tell me why?

    I would appreciate it if you could help me.

        public class SpeechService : ISpeechService
        {
            
            /// <summary>
            /// Defines the _audioInputStream
            /// </summary>
            private readonly MemoryStream _audioInputStream = new MemoryStream();
    
            public async Task SendPcmAsWebmViaWebSocketAsync(
                MemoryStream pcmAudioStream,
                int sampleRate,
                int channels) 
            {
                string inputFormat = "s16le";
    
                var ffmpegProcessInfo = new ProcessStartInfo
                {
                    FileName = _ffmpegPath,
                    Arguments =
                        $"-f {inputFormat} -ar {sampleRate} -ac {channels} -i pipe:0 " +
                        $"-f webm pipe:1",
                    RedirectStandardInput = true,
                    RedirectStandardOutput = true,
                    RedirectStandardError = true,
                    UseShellExecute = false,
                    CreateNoWindow = true,
                };
    
                _ffmpegProcess = new Process { StartInfo = ffmpegProcessInfo };
    
                Console.WriteLine("Starting FFmpeg process...");
                try
                {
    
                    if (!await Task.Run(() => _ffmpegProcess.Start()))
                    {
                        Console.Error.WriteLine("Failed to start FFmpeg process.");
                        return;
                    }
                    Console.WriteLine("FFmpeg process started.");
    
                }
                catch (Exception ex)
                {
                    Console.Error.WriteLine($"Error starting FFmpeg process: {ex.Message}");
                    throw;
                }
    
                var encodeAndSendTask = Task.Run(async () =>
                {
                    try
                    {
                        using var ffmpegOutputStream = _ffmpegProcess.StandardOutput.BaseStream;
                        byte[] buffer = new byte[8192]; // Temporary buffer to read data
                        byte[] sendBuffer = new byte[8192]; // Buffer to accumulate data for sending
                        int sendBufferIndex = 0; // Tracks the current size of sendBuffer
                        int bytesRead;
    
                        Console.WriteLine("Reading WebM output from FFmpeg and sending via WebSocket...");
                        while (true)
                        {
                            if ((bytesRead = await ffmpegOutputStream.ReadAsync(buffer, 0, buffer.Length)) > 0)
                            {
                                // Copy data to sendBuffer
                                Array.Copy(buffer, 0, sendBuffer, sendBufferIndex, bytesRead);
                                sendBufferIndex += bytesRead;
    
                                // If sendBuffer is full, send it via WebSocket
                                if (sendBufferIndex >= sendBuffer.Length)
                                {
                                    var segment = new ArraySegment<byte>(sendBuffer, 0, sendBuffer.Length);
                                    _ws.SendMessage(segment);
                                    sendBufferIndex = 0; // Reset the index after sending
                                }
                            }
                        }
                    }
                    catch (OperationCanceledException)
                    {
                        Console.WriteLine("Encode/Send operation cancelled.");
                    }
                    catch (IOException ex) when (ex.InnerException is ObjectDisposedException)
                    {
                        Console.WriteLine("Stream was closed, likely due to process exit or cancellation.");
                    }
                    catch (Exception ex)
                    {
                        Console.Error.WriteLine($"Error during encoding/sending: {ex}");
                    }
                });
    
                var errorReadTask = Task.Run(async () =>
                {
                    Console.WriteLine("Starting to read FFmpeg stderr...");
                    using var errorReader = _ffmpegProcess.StandardError;
                    try
                    {
                        string? line;
                        while ((line = await errorReader.ReadLineAsync()) != null) 
                        {
                            Console.WriteLine($"[FFmpeg stderr] {line}");
                        }
                    }
                    catch (OperationCanceledException) { Console.WriteLine("FFmpeg stderr reading cancelled."); }
                    catch (TimeoutException) { Console.WriteLine("FFmpeg stderr reading timed out (due to cancellation)."); }
                    catch (Exception ex) { Console.Error.WriteLine($"Error reading FFmpeg stderr: {ex.Message}"); }
                    Console.WriteLine("Finished reading FFmpeg stderr.");
                });
    
            }
    
            public async Task AppendAudioBuffer(AudioMediaBuffer audioBuffer)
            {
                try
                {
                    // audio for a 1:1 call
                    var bufferLength = audioBuffer.Length;
                    if (bufferLength > 0)
                    {
                        var buffer = new byte[bufferLength];
                        Marshal.Copy(audioBuffer.Data, buffer, 0, (int)bufferLength);
    
                        _logger.Info("_ffmpegProcess.HasExited:" + _ffmpegProcess.HasExited);
                        using var ffmpegInputStream = _ffmpegProcess.StandardInput.BaseStream;
                        await ffmpegInputStream.WriteAsync(buffer, 0, buffer.Length);
                    await ffmpegInputStream.FlushAsync(); // Flush the buffer
                        _logger.Info("Wrote buffer data.");
    
                    }
                }
                catch (Exception e)
                {
                    _logger.Error(e, "Exception happend writing to input stream");
                }
            }
    
    
    Starting FFmpeg process...
    FFmpeg process started.
    Starting to read FFmpeg stderr...
    Reading WebM output from FFmpeg and sending via WebSocket...
    [FFmpeg stderr] ffmpeg version 7.1.1-essentials_build-www.gyan.dev Copyright (c) 2000-2025 the FFmpeg developers
    [FFmpeg stderr]   built with gcc 14.2.0 (Rev1, Built by MSYS2 project)
    [FFmpeg stderr]   configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-zlib --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-sdl2 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-libaom --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-dxva2 --enable-d3d11va --enable-d3d12va --enable-ffnvcodec --enable-libvpl --enable-nvdec --enable-nvenc --enable-vaapi --enable-libgme --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libtheora --enable-libvo-amrwbenc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-librubberband
    [FFmpeg stderr]   libavutil      59. 39.100 / 59. 39.100
    [FFmpeg stderr]   libavcodec     61. 19.101 / 61. 19.101
    [FFmpeg stderr]   libavformat    61.  7.100 / 61.  7.100
    [FFmpeg stderr]   libavdevice    61.  3.100 / 61.  3.100
    [FFmpeg stderr]   libavfilter    10.  4.100 / 10.  4.100
    [FFmpeg stderr]   libswscale      8.  3.100 /  8.  3.100
    [FFmpeg stderr]   libswresample   5.  3.100 /  5.  3.100
    [FFmpeg stderr]   libpostproc    58.  3.100 / 58.  3.100
    
    [2025-05-06 15:44:43,598][INFO][XbLogger.cs:85] _ffmpegProcess.HasExited:False
    [2025-05-06 15:44:43,613][INFO][XbLogger.cs:85] Wrote buffer data.
    [2025-05-06 15:44:43,613][INFO][XbLogger.cs:85] Wrote buffer data.
    [FFmpeg stderr] [aist#0:0/pcm_s16le @ 0000025ec8d36040] Guessed Channel Layout: mono
    [FFmpeg stderr] Input #0, s16le, from 'pipe:0':
    [FFmpeg stderr]   Duration: N/A, bitrate: 256 kb/s
    [FFmpeg stderr]   Stream #0:0: Audio: pcm_s16le, 16000 Hz, mono, s16, 256 kb/s
    [FFmpeg stderr] Stream mapping:
    [FFmpeg stderr]   Stream #0:0 -> #0:0 (pcm_s16le (native) -> opus (libopus))
    [FFmpeg stderr] [libopus @ 0000025ec8d317c0] No bit rate set. Defaulting to 64000 bps.
    [FFmpeg stderr] Output #0, webm, to 'pipe:1':
    [FFmpeg stderr]   Metadata:
    [FFmpeg stderr]     encoder         : Lavf61.7.100
    [FFmpeg stderr]   Stream #0:0: Audio: opus, 16000 Hz, mono, s16, 64 kb/s
    [FFmpeg stderr]       Metadata:
    [FFmpeg stderr]         encoder         : Lavc61.19.101 libopus
    [FFmpeg stderr] [out#0/webm @ 0000025ec8d36200] video:0KiB audio:1KiB subtitle:0KiB other streams:0KiB global headers:0KiB muxing overhead: 67.493113%
    [FFmpeg stderr] size=       1KiB time=00:00:00.04 bitrate= 243.2kbits/s speed=2.81x
    Finished reading FFmpeg stderr.
    [2025-05-06 15:44:44,101][INFO][XbLogger.cs:85] _ffmpegProcess.HasExited:True
    [2025-05-06 15:44:44,132][ERROR][XbLogger.cs:67] Exception happend writing to input stream
    System.ObjectDisposedException: Cannot access a closed file.
       at System.IO.FileStream.WriteAsync(Byte[] buffer, Int32 offset, Int32 count, CancellationToken cancellationToken)
       at System.IO.Stream.WriteAsync(Byte[] buffer, Int32 offset, Int32 count)
       at EchoBot.Media.SpeechService.AppendAudioBuffer(AudioMediaBuffer audioBuffer) in C:\Users\tm068\Documents\workspace\myprj\xbridge-teams-bot\src\EchoBot\Media\SpeechService.cs:line 242
    

    I am expecting the ffmpeg process to keep running.
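    The log strongly suggests the cause: AppendAudioBuffer wraps _ffmpegProcess.StandardInput.BaseStream in a `using var`, so the stream (and with it FFmpeg's stdin) is disposed as soon as the first call returns. FFmpeg sees EOF, finalises the WebM output, exits with status 0, and the next write hits the ObjectDisposedException shown above. A minimal sketch of the fix, keeping the question's field names (CompleteAudioInput is a hypothetical helper, not from the original code): write without disposing, and close stdin exactly once when all audio has been sent.

        public async Task AppendAudioBuffer(AudioMediaBuffer audioBuffer)
        {
            var bufferLength = audioBuffer.Length;
            if (bufferLength <= 0)
                return;

            var buffer = new byte[bufferLength];
            Marshal.Copy(audioBuffer.Data, buffer, 0, (int)bufferLength);

            // Do NOT wrap this in `using`: disposing the base stream closes
            // FFmpeg's stdin, which signals EOF and makes FFmpeg finish.
            var stdin = _ffmpegProcess.StandardInput.BaseStream;
            await stdin.WriteAsync(buffer, 0, buffer.Length);
            await stdin.FlushAsync();
        }

        // Hypothetical helper: call once, after the last buffer, so FFmpeg
        // finalises the WebM stream deliberately rather than accidentally.
        public void CompleteAudioInput()
        {
            _ffmpegProcess.StandardInput.BaseStream.Close();
        }

    The `using var` on StandardOutput.BaseStream at the top of encodeAndSendTask deserves the same review, for the same reason.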

  • MoviePY write_videofile using GPU for faster encoding [closed]

    5 May, by kaushal

    I'm creating a video from scratch using moviePY. I am generating all the required frames, adding required audio (including a voiceover and background music), adding a logo and finally writing the video file in 4K.

    Everything works fine, except the write_videofile takes a lot of time.

    I have read many related posts, which mention using the right codec, etc. I have an NVIDIA card, so I tried both h264_nvenc and hevc_nvenc. The output quality dropped with the first one, so I'm sticking with hevc_nvenc. I'm using the line below to write the file.

            video_clip.write_videofile(targetfile, codec="hevc_nvenc", threads=32, fps=24)
    

    What I have noticed is that it does seem to be using the GPU, but only a little. By comparison, when I run Stable Diffusion or Vegas rendering, the GPU is used far more heavily.

    That's why I think there is definitely room for improvement here. As the screenshot below shows, GPU utilisation increases a tiny bit when the video file write starts, but surely it could handle a lot more?

    I can try various parameters that I've seen in other threads, like logger=None, progress_bar=False, ffmpeg_params=['-b:v','10000k'], etc., but they are not going to improve GPU utilisation in any shape or form. I've been wondering what I am missing.

    Any ideas or suggestions please?

    (Screenshot: GPU utilisation rises only slightly when write_videofile starts.)
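    For what it's worth, low GPU usage here is expected rather than a misconfiguration: moviepy composes every frame in Python and pipes raw frames to ffmpeg, so a hardware encoder spends most of its time waiting for frames. Task Manager also graphs NVENC under the separate "Video Encode" engine rather than the 3D/CUDA load that Stable Diffusion produces, which makes the main GPU graph look idle even while NVENC is busy. Encoder-side tuning can still be passed through ffmpeg_params; a hedged sketch using standard NVENC options from recent FFmpeg builds:

        # Sketch, assuming a recent FFmpeg build with NVENC support.
        video_clip.write_videofile(
            targetfile,
            codec="hevc_nvenc",
            fps=24,
            ffmpeg_params=[
                "-preset", "p5",  # NVENC presets p1 (fastest) .. p7 (best quality)
                "-rc", "vbr",     # variable-bitrate rate control
                "-cq", "23",      # quality target for VBR mode
            ],
        )

    If the goal is wall-clock speed, profiling usually shows the frame-generation side (compositing, the logo overlay, 4K frame buffers) dominating; threads=32 only affects ffmpeg's side of the pipe.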

  • Live streaming webvtt subtitles with HLS protocol

    5 May, by Victor Ruiz

    I need a tool to generate HLS subtitles in live mode. Specifically, I want to create an HTTP server that serves .m3u8 and .webvtt files which are continuously updated over time. Such an HLS stream could then be consumed over HTTP by hls.js or ffplay players.

    The .webvtt files will be generated by an automatic transcriber, so the program must update the .m3u8 playlist accordingly whenever a new subtitle is produced.

    I only want to stream subtitle tracks, no audio. The video track can simply display a plain black background.

    I attempted to use FFmpeg with a Linux pipe as input for the streaming .webvtt subtitles, along with a video file for the video stream. The output .webvtt and .m3u8 files were written to a folder and served via an NGINX server. However, FFmpeg fails after it reads the initial content of the .webvtt input from the pipe. If I inject more content afterward, it gets skipped.

    How can I achieve HLS subtitle streaming in live mode? Can FFmpeg be used for this purpose, or do I need a different tool?
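    FFmpeg can at least produce the packaging side of this. A hedged sketch (untested, recent FFmpeg assumed): the lavfi color source synthesises a black video track, the WebVTT input is mapped as a subtitle stream, and the hls muxer should write the video playlist plus a companion WebVTT playlist and segments, with -hls_playlist_type event keeping the playlists append-only as live players expect:

        ffmpeg -re -f lavfi -i color=c=black:s=320x240:r=25 \
               -i subtitles.webvtt \
               -map 0:v -map 1:s \
               -c:v libx264 -preset veryfast -c:s webvtt \
               -f hls -hls_time 4 -hls_playlist_type event \
               stream.m3u8

    The live-ingest half is the weak point you already hit: the webvtt demuxer expects a finite input and is not designed to tail a growing pipe. For truly live captions, a common workaround is to keep FFmpeg for the video/playlist side only and have the transcriber itself write the .webvtt segment files and append the matching entries to the subtitle .m3u8; both formats are plain text, so no special tooling is needed.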

  • Send image and audio data to FFmpeg via named pipes

    5 mai, par Nicke Manarin

    I'm able to send frames one by one to FFmpeg via a named pipe to create a video out of them, but if I also try sending audio through a second named pipe, FFmpeg accepts only one frame on the video pipe and soon afterwards starts reading from the audio pipe.

    ffmpeg.exe -loglevel debug -hwaccel auto 
    -f:v rawvideo -r 25 -pix_fmt bgra -video_size 782x601 -i \\.\pipe\video_to_ffmpeg 
    -f:a s16le -ac 2 -ar 48000 -i \\.\pipe\audio_to_ffmpeg 
    -c:v libx264 -preset fast -pix_fmt yuv420p 
    -vf "scale=trunc(iw/2)*2:trunc(ih/2)*2" -crf 23 -f:v mp4 -vsync vfr 
    -c:a aac -b:a 128k -ar 48000 -ac 2 
    -y "C:\Users\user\Desktop\video.mp4"
    

    I start both pipes like so:

    _imagePipeServer = new NamedPipeServerStream(ImagePipeName, PipeDirection.Out, 1, PipeTransmissionMode.Byte, PipeOptions.Asynchronous);
    _imagePipeStreamWriter = new StreamWriter(_imagePipeServer);
    _imagePipeServer.BeginWaitForConnection(null, null);
    
    _audioPipeServer = new NamedPipeServerStream(AudioPipeName, PipeDirection.Out, 1, PipeTransmissionMode.Byte, PipeOptions.Asynchronous);
    _audioPipeStreamWriter = new StreamWriter(_audioPipeServer);
    _audioPipeServer.BeginWaitForConnection(null, null);
    

    And send the data to the pipes using these methods:

    public void EncodeFrame(byte[] data)
    {
        if (_imagePipeServer?.IsConnected != true)
            throw new FFmpegException("Pipe not connected", Arguments, Output);
    
        _imagePipeStreamWriter?.BaseStream.Write(data, 0, data.Length);
    }
    
    public void EncodeAudio(ISampleProvider provider, long length)
    {
        if (_audioPipeServer?.IsConnected != true)
            throw new FFmpegException("Pipe not connected", Arguments, Output);
    
        var buffer = new byte[provider.WaveFormat.AverageBytesPerSecond * length / TimeSpan.TicksPerSecond];
        var bytesRead = provider.ToWaveProvider().Read(buffer, 0, buffer.Length);
    
        if (bytesRead < 1)
            return;
    
        _audioPipeStreamWriter?.BaseStream.Write(buffer, 0, bytesRead);
        _audioPipeStreamWriter?.BaseStream.Flush();
    }
    

    Not sending the audio (and thus not creating the audio pipe) works, with FFmpeg taking one frame at a time and creating the video normally.

    But if I try sending the audio via a secondary pipe, I can only send one frame. This is the output when that happens (by the way, this is FFmpeg v7.1):

    Splitting the commandline.
    Reading option '-loglevel' ... matched as option 'loglevel' (set logging level) with argument 'debug'.
    Reading option '-hwaccel' ... matched as option 'hwaccel' (use HW accelerated decoding) with argument 'auto'.
    Reading option '-f:v' ... matched as option 'f' (force container format (auto-detected otherwise)) with argument 'rawvideo'.
    Reading option '-r' ... matched as option 'r' (override input framerate/convert to given output framerate (Hz value, fraction or abbreviation)) with argument '25'.
    Reading option '-pix_fmt' ... matched as option 'pix_fmt' (set pixel format) with argument 'bgra'.
    Reading option '-video_size' ... matched as AVOption 'video_size' with argument '782x601'.
    Reading option '-i' ... matched as input url with argument '\\.\pipe\video_to_ffmpeg'.
    Reading option '-f:a' ... matched as option 'f' (force container format (auto-detected otherwise)) with argument 's16le'.
    Reading option '-ac' ... matched as option 'ac' (set number of audio channels) with argument '2'.
    Reading option '-ar' ... matched as option 'ar' (set audio sampling rate (in Hz)) with argument '48000'.
    Reading option '-i' ... matched as input url with argument '\\.\pipe\audio_to_ffmpeg'.
    Reading option '-c:v' ... matched as option 'c' (select encoder/decoder ('copy' to copy stream without reencoding)) with argument 'libx264'.
    Reading option '-preset' ... matched as AVOption 'preset' with argument 'fast'.
    Reading option '-pix_fmt' ... matched as option 'pix_fmt' (set pixel format) with argument 'yuv420p'.
    Reading option '-vf' ... matched as option 'vf' (alias for -filter:v (apply filters to video streams)) with argument 'scale=trunc(iw/2)*2:trunc(ih/2)*2'.
    Reading option '-crf' ... matched as AVOption 'crf' with argument '23'.
    Reading option '-f:v' ... matched as option 'f' (force container format (auto-detected otherwise)) with argument 'mp4'.
    Reading option '-fps_mode' ... matched as option 'fps_mode' (set framerate mode for matching video streams; overrides vsync) with argument 'vfr'.
    Reading option '-c:a' ... matched as option 'c' (select encoder/decoder ('copy' to copy stream without reencoding)) with argument 'aac'.
    Reading option '-b:a' ... matched as option 'b' (video bitrate (please use -b:v)) with argument '128k'.
    Reading option '-ar' ... matched as option 'ar' (set audio sampling rate (in Hz)) with argument '48000'.
    Reading option '-ac' ... matched as option 'ac' (set number of audio channels) with argument '2'.
    Reading option '-y' ... matched as option 'y' (overwrite output files) with argument '1'.
    Reading option 'C:\Users\user\Desktop\video.mp4' ... matched as output url.
    Finished splitting the commandline.
    
    Parsing a group of options: global.
    Applying option loglevel (set logging level) with argument debug.
    Applying option y (overwrite output files) with argument 1.
    Successfully parsed a group of options.
    
    Parsing a group of options: input url \\.\pipe\video_to_ffmpeg.
    Applying option hwaccel (use HW accelerated decoding) with argument auto.
    Applying option f:v (force container format (auto-detected otherwise)) with argument rawvideo.
    Applying option r (override input framerate/convert to given output framerate (Hz value, fraction or abbreviation)) with argument 25.
    Applying option pix_fmt (set pixel format) with argument bgra.
    Successfully parsed a group of options.
    
    Opening an input file: \\.\pipe\video_to_ffmpeg.
    [rawvideo @ 000001c302ee08c0] Opening '\\.\pipe\video_to_ffmpeg' for reading
    [file @ 000001c302ee1000] Setting default whitelist 'file,crypto,data'
    [rawvideo @ 000001c302ee08c0] Before avformat_find_stream_info() pos: 0 bytes read:65536 seeks:0 nb_streams:1
    [rawvideo @ 000001c302ee08c0] All info found
    [rawvideo @ 000001c302ee08c0] After avformat_find_stream_info() pos: 1879928 bytes read:1879928 seeks:0 frames:1
    Input #0, rawvideo, from '\\.\pipe\video_to_ffmpeg':
      Duration: N/A, start: 0.000000, bitrate: 375985 kb/s
      Stream #0:0, 1, 1/25: Video: rawvideo, 1 reference frame (BGRA / 0x41524742), bgra, 782x601, 0/1, 375985 kb/s, 25 tbr, 25 tbn
    Successfully opened the file.
    
    Parsing a group of options: input url \\.\pipe\audio_to_ffmpeg.
    Applying option f:a (force container format (auto-detected otherwise)) with argument s16le.
    Applying option ac (set number of audio channels) with argument 2.
    Applying option ar (set audio sampling rate (in Hz)) with argument 48000.
    Successfully parsed a group of options.
    
    Opening an input file: \\.\pipe\audio_to_ffmpeg.
    [s16le @ 000001c302ef5380] Opening '\\.\pipe\audio_to_ffmpeg' for reading
    [file @ 000001c302ef58c0] Setting default whitelist 'file,crypto,data'
    

    If I instead send one frame and then some bytes of audio (an arbitrary length based on fps), the only difference is this extra line at the end:

    [s16le @ 0000025948c96d00] Before avformat_find_stream_info() pos: 0 bytes read:15360 seeks:0 nb_streams:1
    

    Extra calls to EncodeFrame() hang forever at the BaseStream.Write(frameBytes, 0, frameBytes.Length) call, suggesting that FFmpeg is no longer reading the data.

    Something is causing FFmpeg to close or stop reading the first pipe and only accept data from the second one. Judging by the debug log, FFmpeg finished probing the video input (the 1,879,928 bytes it read are exactly one 782x601 BGRA frame) and is now blocked inside avformat_find_stream_info() on the audio input, waiting for enough data to complete probing; until that returns, it never goes back to drain the video pipe, so the frame writes block and the two sides deadlock.

    Perhaps the command is missing something?


    🏆 Working solution

    I started using two BlockingCollection<byte[]> objects, with the consumers running in separate tasks.

    Start the process, setting up the pipes:

    private Process? _process;
    private NamedPipeServerStream? _imagePipeServer;
    private NamedPipeServerStream? _audioPipeServer;
    private StreamWriter? _imagePipeStreamWriter;
    private StreamWriter? _audioPipeStreamWriter;
    private readonly BlockingCollection<byte[]> _videoCollection = new();
    private readonly BlockingCollection<byte[]> _audioCollection = new();
    
    private const string ImagePipeName = "video_to_ffmpeg";
    private const string AudioPipeName = "audio_to_ffmpeg";
    private const string PipeStructure = @"\\.\pipe\"; //This prefix is only sent to FFmpeg; the .NET pipes are created with the bare names.
    
    public void StartEncoding(string arguments)
    {
        _process = new Process
        {
            StartInfo = new ProcessStartInfo
            {
                FileName = "path to ffmpeg",
                Arguments = arguments.Replace("{image}", PipeStructure + ImagePipeName).Replace("{audio}", PipeStructure + AudioPipeName),
                RedirectStandardInput = false,
                RedirectStandardOutput = true,
                RedirectStandardError = true,
                UseShellExecute = false,
                CreateNoWindow = true
            }
        };
    
        StartFramePipeConnection();
        StartAudioPipeConnection();
    
        _process.Start();
        _process.BeginErrorReadLine();
        _process.BeginOutputReadLine();
    }
    
    private void StartFramePipeConnection()
    {
        if (_imagePipeServer != null)
        {
            if (_imagePipeServer.IsConnected)
                _imagePipeServer.Disconnect();
    
            _imagePipeServer.Dispose();
        }
    
        _imagePipeServer = new NamedPipeServerStream(ImagePipeName, PipeDirection.Out, 1, PipeTransmissionMode.Byte, PipeOptions.Asynchronous);
        _imagePipeStreamWriter = new StreamWriter(_imagePipeServer);
        _imagePipeServer.BeginWaitForConnection(VideoPipe_Connected, null);
    }
    
    private void StartAudioPipeConnection()
    {
        if (_audioPipeServer != null)
        {
            if (_audioPipeServer.IsConnected)
                _audioPipeServer.Disconnect();
    
            _audioPipeServer.Dispose();
        }
    
        _audioPipeServer = new NamedPipeServerStream(AudioPipeName, PipeDirection.Out, 1, PipeTransmissionMode.Byte, PipeOptions.Asynchronous);
        _audioPipeStreamWriter = new StreamWriter(_audioPipeServer);
        _audioPipeServer.BeginWaitForConnection(AudioPipe_Connected, null);
    }
    

    Start sending the data as soon as the pipe gets connected. Once the BlockingCollection is signalled that no more data will be added, the consumer leaves the foreach block, waits for the pipe to drain, and closes it.

    private void VideoPipe_Connected(IAsyncResult ar)
    {
        Task.Run(() =>
        {
            try
            {
                foreach (var frameBytes in _videoCollection.GetConsumingEnumerable())
                {                    
                    _imagePipeStreamWriter?.BaseStream.Write(frameBytes, 0, frameBytes.Length);
                }
    
                _imagePipeServer?.WaitForPipeDrain();
                _imagePipeStreamWriter?.Close();
            }
            catch (Exception e)
            {
                //Logging
                throw;
            }
        });
    }
    
    private void AudioPipe_Connected(IAsyncResult ar)
    {
        Task.Run(() =>
        {
            try
            {
                foreach (var audioChunk in _audioCollection.GetConsumingEnumerable())
                {
                    _audioPipeStreamWriter?.BaseStream.Write(audioChunk, 0, audioChunk.Length);
                }
    
                _audioPipeServer?.WaitForPipeDrain();
                _audioPipeStreamWriter?.Close();
            }
            catch (Exception e)
            {
                //Logging
                throw;
            }
        });
    }
    

    You can start queueing image and audio data as soon as the BlockingCollections are initialized; there is no need to wait for the pipes to connect.

    public void EncodeImage(byte[] data)
    {
        _videoCollection.Add(data);
    }
    
    public void EncodeAudio(ISampleProvider provider, long length)
    {
        var sampleCount = (int)(provider.WaveFormat.SampleRate * ((double)length / TimeSpan.TicksPerSecond) * provider.WaveFormat.Channels);
        var floatBuffer = new float[sampleCount];
    
        var samplesRead = provider.Read(floatBuffer, 0, sampleCount);
    
        if (samplesRead < 1)
            return;
    
        var byteBuffer = new byte[samplesRead * 4]; //4 bytes per float, f32le.
        Buffer.BlockCopy(floatBuffer, 0, byteBuffer, 0, byteBuffer.Length);
    
        
        _audioCollection.Add(byteBuffer);
    }
    

    Once you have finished producing data, make sure to signal the BlockingCollections:

    public void FinishEncoding()
    {
        //Signal the end of video/audio producer.
        _videoCollection.CompleteAdding();
        _audioCollection.CompleteAdding();
    
        //Wait up to 20 seconds for the encoding to finish.
        _process?.WaitForExit(20_000);
    }
    

    The FFmpeg arguments were also changed slightly; in particular, -probesize 32 on both inputs stops FFmpeg from buffering a large amount of data while probing each pipe, so it finishes stream detection and starts draining both pipes almost immediately:

    -loglevel trace -hwaccel auto 
    -f:v rawvideo -probesize 32 -r 25 -pix_fmt bgra -video_size 1109x627 -i {image} 
    -f:a f32le -ac 2 -ar 48000 -probesize 32 -i {audio} 
    -c:v libx264 -preset fast -pix_fmt yuv420p 
    -vf "scale=trunc(iw/2)*2:trunc(ih/2)*2" -crf 23 -f:v mp4 -fps_mode vfr 
    -c:a aac -b:a 128k -ar 48000 -ac 2 
    -y "C:\Users\user\Desktop\Video.mp4"