Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • FFmpeg streaming stuck few minutes after start [closed]

    19 February, by josue ramirez

    Sorry for my English! :D

    I'm trying to make a live video stream from my webcam with ffmpeg using Node.

    I get the images (frames) from ffmpeg via pipe1 on the stdout event and send them through a socket.io client to my Angular app, which plays the stream.

    So far the stream starts fine and my Angular app plays it without problems, but after a certain time (about 8 minutes) ffmpeg stops sending data and the stream in my Angular app stops. If I then try to start the webcam again, it is no longer possible because the device appears to be occupied by another process. While trying to figure out what is going on, I noticed that the ffmpeg process simply stops and is no longer visible with the top command (Ubuntu), yet the child process in Node still shows an ffmpeg entry, as if it were a zombie process.

    I've tried running ffmpeg from the terminal to save an mp4 file from the stream, and in that case there is no problem; the recording doesn't stop even when using pipe1. The error occurs only when I run ffmpeg from my Node app.

    Does anyone have any idea what could be going on?

    This is the command that sends the data through pipe1, both from Node and from the terminal, but it only stalls when run from Node:

    ffmpeg -f v4l2 -input_format mjpeg -video_size 800x600 -i /dev/video0 -q:v 4 -f image2pipe pipe1
    

    The same thing happens when saving an mp4 video; it only fails in Node:

    ffmpeg -f v4l2 -input_format mjpeg -video_size 800x600 -i /dev/video0 -q:v 4 video.mp4
    

    This is my function that starts the streaming:

    export const startCamera = () => {
        logger.info('Starting camera ', cameraName);
    
        const inputFormat = process.platform === 'win32' ? 'dshow' : 'v4l2';
    
        const ffmpegArgs = [
            '-f', inputFormat, // Input format depending on the platform
        ];
    
        if (process.platform === 'linux') {
            ffmpegArgs.push(
                '-input_format', 'mjpeg',  // Specify the format before video_size
                '-video_size', '800x600',  // Image size
            );
        }
        
        ffmpegArgs.push(
            '-i', cameraName,    // Input device
            '-vf', 'fps=15',     // Frames per second
            '-q:v', '10',        // Video quality
            '-f', 'image2pipe',  // Output format
            'pipe:1'             // Output to stdout
        );
    
        try {
            logger.info('FFmpeg command: ffmpeg', ffmpegArgs.join(' '));
            ffmpeg = spawn('ffmpeg', ffmpegArgs);
            let cameraOpen = false;
            // Send the frame data over the WebSocket
            ffmpeg.stdout.on('data', (chunk) => {
                logger.debug(`Sending chunk of ${chunk.length} bytes`);
                if (!cameraOpen) {
                    cameraOpen = true;
                    serialConnection.connect();
                    logger.info('Camera started!!');
                }
    
                if (streamWS.isActive()) {
                    streamWS.send(chunk);
                }
            });
            // Error handling
            ffmpeg.stderr.on('error', (data) => {
                logger.error('FFmpeg Error:', data.toString());
            });
    
            ffmpeg.on('close', async () => {
                logger.info('Camera closed....', cameraName);
                if (!cameraOpen) {
                    await commandWS.send(
                        {
                            command: COMMANDS.START_STREAM,
                            isError: true,
                            response: `Error trying to start camera ${cameraName}; check that the camera is properly connected.`,
                        },
                        COMMANDS_EVENTS.ROBOT_RESPONSE
                    );
                }
                cameraOpen = false;
                await serialConnection.close();
                stopCamera();
                await streamWS.disconnectWS();
            });
        } catch (e) {
            logger.error('Error trying to start the camera...', e);
        }
    };
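
    For comparison, a hedged sketch (not part of the code above) of how the child process could be observed more closely. If ffmpeg's stderr is never consumed, its log/progress output can eventually fill the pipe buffer and stall the process, so draining stderr and logging how the process exits at least makes the failure visible. The handler bodies below are assumptions, not the asker's code:

    // Hedged sketch: drain stderr so a full pipe buffer cannot stall ffmpeg.
    // Assumes the same `ffmpeg = spawn('ffmpeg', ffmpegArgs)` child as above.
    ffmpeg.stderr.on('data', (data) => {
        logger.debug(`ffmpeg: ${data.toString()}`); // consume ffmpeg's log output
    });

    ffmpeg.on('error', (err) => {
        logger.error('Failed to spawn ffmpeg:', err); // spawn-level failures (e.g. binary not found)
    });

    // In the existing 'close' handler, the exit code and signal show why ffmpeg stopped:
    // ffmpeg.on('close', (code, signal) => logger.info(`ffmpeg exited: code=${code} signal=${signal}`));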
    
  • FFmpeg: high quality animated GIF?

    19 February, by David Hope

    I'm generating an animated GIF from a video on my server.

    The generated GIF is not really high quality and it looks like the pixels are huge.

    Example:

    example GIF

    This is how I generate the GIF:

    shell_exec("/usr/bin/ffmpeg -i video.mkv -vf scale=500:-1 -t 10 -r 10 image.gif");
    

    I did a search on Google and came across this:

    shell_exec("/usr/bin/ffmpeg -i video.mkv -r 20 -f image2pipe -vcodec ppm - | convert -delay 5 - output.gif");
    

    But the command above doesn't do anything and no output.gif is being generated at all.

    There are some tutorials that I came across, but none of them worked for me, and some of them involve using ImageMagick, which I don't have access to.

    Could someone please let me know if there is a clear way to generate a high-quality GIF using FFmpeg?
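
    For reference, a hedged sketch of the usual two-pass approach with ffmpeg's palettegen/paletteuse filters, which needs no ImageMagick; the input name, duration, width and frame rate mirror the original command and are placeholders:

    ffmpeg -t 10 -i video.mkv -vf "fps=10,scale=500:-1:flags=lanczos,palettegen" palette.png
    ffmpeg -t 10 -i video.mkv -i palette.png -filter_complex "[0:v]fps=10,scale=500:-1:flags=lanczos[x];[x][1:v]paletteuse" image.gif

    The first pass computes a 256-colour palette optimised for the clip; the second pass encodes the GIF with that palette instead of ffmpeg's generic default palette, which is usually what makes GIFs look blocky.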

  • How to watermark with overlay fadeout with ffmpeg [closed]

    19 February, by Kevin Waterson

    I wish to put a watermark on my video, and a text overlay. The text overlay should fade out after the first three seconds. The watermark is permanent.

    This is what I have come up with; however, the watermark seems to push the overlay down.

    ffmpeg -y -i in.mp4 -i images/watermark.png -filter_complex "[0:v][1:v]overlay=10:10[bg];[bg]drawtext=fontfile=fonts/SAMAN___.ttf:text='Testing':box=1:boxborderw=800|800:boxcolor=black:fontsize=40:fontcolor=white:alpha='if(lt(t,2),0,if(lt(t,3),(t-2)/1,if(lt(t,6),1,if(lt(t,7),(1-(t-6))/1,0))))':x=(w-text_w)/2:y=(h-text_h)/2:y=h-th-10[out]" -map "[out]" -map 0:a out.mp4
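
    Not a definitive fix, but a hedged sketch of one way to keep the two effects independent: draw the text on the base video with a single y expression (the command above sets y twice), fade it purely through alpha, and overlay the watermark afterwards. The font, watermark path and file names come from the question; the text stays fully visible for the first three seconds and then fades out over one second, which is an assumption about the intended timing:

    ffmpeg -y -i in.mp4 -i images/watermark.png -filter_complex "[0:v]drawtext=fontfile=fonts/SAMAN___.ttf:text='Testing':fontsize=40:fontcolor=white:x=(w-text_w)/2:y=(h-text_h)/2:alpha='if(lt(t,3),1,if(lt(t,4),1-(t-3),0))'[txt];[txt][1:v]overlay=10:10[out]" -map "[out]" -map 0:a out.mp4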

  • Download HLS (HTTP) Stream video using python

    19 February, by LeDerp

    I need to download a streaming video (from a URL) using Python. The command-line invocation would be:

    ffmpeg -i URL stream.mp4
    

    I know I can use the subprocess module:

    subprocess.call('ffmpeg -i '+ URL +' stream.mp4', shell=True)
    

    Is there any alternative, like an API, that I can use instead of the subprocess call?
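
    For what it is worth, a hedged sketch of the same call without shell=True, passing the arguments as a list so the URL never goes through the shell; the URL and output name are placeholders:

    import subprocess

    url = "https://example.com/playlist.m3u8"  # placeholder HLS URL

    # Build the ffmpeg invocation as an argument list: no shell, no quoting issues.
    cmd = ["ffmpeg", "-i", url, "stream.mp4"]

    # check=True raises CalledProcessError if ffmpeg exits with a non-zero status.
    subprocess.run(cmd, check=True)

    Wrapper libraries around the ffmpeg binary also exist (for example the ffmpeg-python package), but they still spawn an ffmpeg process under the hood, so the list-based subprocess call is the dependency-free option.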

  • How to split an audio / video file based upon multiple timestamps by using FFMPEG (preferably)

    19 February, by badr2001

    I would like to be able to split my audio (mp3 or equivalent) or video file based upon multiple timestamps, the same way that, in any editing software, you can crop a file by selecting a start time and an end time.

    What I have done so far:

    [HttpPost]
    public async Task<string> ProcessFullAudio([FromBody] ProcessFullAudioRequest processFullAudioRequest)
    {
        if (processFullAudioRequest == null || processFullAudioRequest.StartEndTimes == null || processFullAudioRequest.StartEndTimes.Length == 0)
        {
            throw new ArgumentException("Invalid request: StartEndTimes cannot be null or empty.");
        }
    
        var timestamp = DateTime.Now.ToString("yyyyMMdd_HHmmss");
        var fileName = $"{Guid.NewGuid()}_{timestamp}.mp3";
        var outputFilePath = Path.Combine(customFolder, fileName);
    
        int amntOfTimeStamps = processFullAudioRequest.StartEndTimes.Length;
        int step = 0;
        string lineOfCodeForFirstTime = " \"aselect='not(between(t,,)";
        string lineOfCodeForRestOfTimes = "+between(t,,)";
        string argument = "";
        string result = null;
        string filterComplex = GenerateAtrims(processFullAudioRequest.StartEndTimes);
    
        while (step != amntOfTimeStamps)
        {
            step++;
            if (step == 1)
            {
                result = ReplacePlaceholders(lineOfCodeForFirstTime, HhMmSsToSeconds(processFullAudioRequest.StartEndTimes[step - 1].Start).ToString() , HhMmSsToSeconds(processFullAudioRequest.StartEndTimes[step - 1].End).ToString());
                argument += result;
            }
            else
            {
                result = ReplacePlaceholders(lineOfCodeForRestOfTimes, HhMmSsToSeconds(processFullAudioRequest.StartEndTimes[step - 1].Start).ToString() , HhMmSsToSeconds(processFullAudioRequest.StartEndTimes[step - 1].End).ToString());
                argument += result;
            }
    
        }
        var FFMPEG_PATH = Path.Combine(Directory.GetCurrentDirectory(), "wwwroot", "ffmpeg.exe");
        if (!System.IO.File.Exists(FFMPEG_PATH))
        {
            return "ffmpeg.exe IS NULL";
        }
        var arguments = $"-i {processFullAudioRequest.InputFilePath} -af {argument})' ,asetpts=N/SR/TB\" -acodec libmp3lame {outputFilePath}";
    
        await ProcessAsyncHelper.ExecuteShellCommand(FFMPEG_PATH, arguments, Timeout.Infinite);
    
        return outputFilePath;
    }
    

    The code, I hope, is self-explanatory. The arguments variable should look something like this:

    -i C:\Users\User\Desktop\AudioEditorBackEnmd\AudioEditorAPI\wwwroot\mp3\TestAudio_123.mp3 -af  "aselect='not(between(t,120,240))' ,asetpts=N/SR/TB" -acodec libmp3lame C:\Users\User\Desktop\AudioEditorBackEnmd\AudioEditorAPI\wwwroot\mp3\aa887f21-0a90-4ec5-80ba-2b265cb445b4_20250219_123804.mp3
    

    After returning the output path for the newly processed edited audio, I pass it to my DownloadFile function:

            [HttpGet]
            public async Task<IActionResult> DownloadProcessedFile([FromQuery] string fileName)
            {
                if (string.IsNullOrWhiteSpace(fileName))
                {
                    return BadRequest("File name is required.");
                }
    
                var filePath = Path.Combine(customFolder, fileName);
    
                if (!System.IO.File.Exists(filePath))
                {
                    return NotFound(new { fileName });
                }
    
                try
                {
                    var fileBytes = await System.IO.File.ReadAllBytesAsync(filePath);
    
                    return File(fileBytes, "audio/mpeg", fileName);
    
                }
                catch (Exception ex)
                {
                    throw;
                }
    
            }
    
    

    Everything works, in the sense that I am able to get a processed file which is shorter than the original audio. However, the issue is that the timestamps are off. For example, in the argument that I provided above, I am cropping the audio from 120 to 240, which is from the 2 minute mark to the 4 minute mark. The audio I am passing in is 01:06:53 long, yet the processed audio comes back as 1:05:22, which is not expected, as I should be getting 01:04:53.
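
    A quick way to sanity-check what the filter actually removed is to compare durations with ffprobe; a hedged sketch, where the file names are placeholders:

    ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 TestAudio_123.mp3
    ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 output.mp3

    Removing 120-240 s from a 01:06:53 (4013 s) file should leave 3893 s, i.e. 01:04:53, so any other result means either the filter expression or the file actually being processed is not the one expected.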

    The annoying thing is that I was getting the desired output at one point. I'm not sure what changes caused the timestamps to become off. The only change I made was moving the file locations to my wwwroot folder.

    I have tried many different variations of the command to get the desired output but can't seem to get anything close; I always get 1:05:22 back. Among the commands that I tried was:

                var arguments = $"-i {processFullAudioRequest.InputFilePath} -af {argument})' , asetpts=N/SR/TB\" -c:a libmp3lame -q:a 2  {outputFilePath}";
    

    I tried many more but I simply can't remember them all. By now I feel like I have hit a wall in coming up with a solution for this.

    I would much appreciate any help. I have tried to give as much detail as I can, but if anything is unclear please let me know.
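
    For completeness, a hedged sketch (with placeholder times and file names, not the asker's exact command) of what the generated command should look like once more than one range is removed, each extra range being another between() term inside the not(...):

    ffmpeg -i input.mp3 -af "aselect='not(between(t,120,240)+between(t,600,660))',asetpts=N/SR/TB" -acodec libmp3lame output.mp3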