Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • Rewrite video with ffmpeg with given non-equidistant times for each frame

    27 March, by Aleph0

    In one of my projects I need to synchronize a video with a different data source. I already have the video as an .mts file.

    I also know exactly which frame will be displayed at which time. These times are not necessarily equidistant.

    For simplicity, assume that my video consists of just 5 frames:

    Frame No   Time
    1          1s
    2          3s
    3          5s
    4          8s
    5          12s

    What ffmpeg command, driven by a list of times given in a simple text file, can be used to create a video stream whose frames appear at these non-equidistant times?
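
    One way to express per-frame display times is ffmpeg's concat demuxer, whose duration entries need not be equal. A minimal sketch, assuming the five frames have first been extracted as PNGs (the file names are assumptions):

        # frames.txt -- durations are the gaps between the given times (1s, 3s, 5s, 8s, 12s)
        file 'frame1.png'
        duration 2
        file 'frame2.png'
        duration 2
        file 'frame3.png'
        duration 3
        file 'frame4.png'
        duration 4
        file 'frame5.png'
        # common workaround: list the last file twice so its duration is honored
        file 'frame5.png'

        # extract the frames, then re-time them as variable-frame-rate video:
        ffmpeg -i input.mts frame%d.png
        ffmpeg -f concat -safe 0 -i frames.txt -vsync vfr -pix_fmt yuv420p output.mp4

    Note that this puts the first frame at t=0; if it must appear at exactly 1s, a padding frame of duration 1 would be needed at the front.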

  • How can I display an image from Uint8List with the format AVFormat.RGBA8888 more quickly in Flutter?

    27 March, by Salawesh

    I'm looking to display images as quickly as possible. The input data is a Uint8List (from dart:typed_data), encoded as AVFormat.RGBA8888 by ffmpeg.

    I'm looking for a way to improve the performance of my graphics rendering code, and to see whether the work can be moved off the main thread (isolate or compute).

    Here's my current working code.

        final buffer = await ui.ImmutableBuffer.fromUint8List(data.data);
        final descriptor = ui.ImageDescriptor.raw(
          buffer,
          width: data.width,
          height: data.height,
          pixelFormat: ui.PixelFormat.rgba8888,
        );
        final codec = await descriptor.instantiateCodec(); // native codec of ui.Image
        final frameInfo = await codec.getNextFrame();
    

    This is currently done on my main thread.
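
    For raw RGBA8888 pixels, one commonly suggested alternative is dart:ui's decodeImageFromPixels, which skips the ImmutableBuffer/ImageDescriptor/codec round trip. A minimal sketch (the wrapper name rgbaToImage is mine; the call still has to run on the UI isolate, since ui.Image handles are tied to the engine and can't be created in a plain compute isolate):

        import 'dart:async';
        import 'dart:typed_data';
        import 'dart:ui' as ui;

        // Decode tightly packed RGBA8888 bytes straight into a ui.Image,
        // without the ImageDescriptor/instantiateCodec round trip.
        Future<ui.Image> rgbaToImage(Uint8List pixels, int width, int height) {
          final completer = Completer<ui.Image>();
          ui.decodeImageFromPixels(
            pixels, width, height, ui.PixelFormat.rgba8888,
            completer.complete,
          );
          return completer.future;
        }

    Whether this beats the descriptor path varies by engine version, so it is worth profiling both.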

  • How to manage HLS in the Nginx RTMP module

    27 March, by syrkandonut

    I would like to manage the HLS broadcast on request (stop/start or some other mechanism) in the Nginx RTMP module. My RTMP server needs to support many cameras, but when it runs an ffmpeg exec for 200-300 RTMP streams the processor struggles badly, so I would like to launch the ffmpeg command for a given stream only on request. How could this be done?

    RTMP server

    rtmp {
        server {
            listen 1935;
            chunk_size 8192;
    
            application live {
                live on;
                record off;
                drop_idle_publisher 10s;
                allow publish all;
    
                on_publish rtmp-router:8082/on_publish;
    
                exec ffmpeg -i rtmp://localhost:1935/live/$name
                    -f lavfi -i anullsrc -c:v copy -c:a aac -shortest -f flv rtmp://localhost:1935/hls/$name_main;
            }
    
    
            application hls {
                live on;
                hls on;
                hls_fragment_naming system;
                hls_fragment 2;
                hls_playlist_length 4;
                hls_path /opt/data/hls;
                hls_nested on;
    
                hls_variant _main BANDWIDTH=878000,RESOLUTION=640x360;
            }
        }
    }
    

    I would like to solve this through nginx or Python itself, since the accompanying server is written in FastAPI; a sketch of what I mean follows.
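
    One hedged sketch, given that the companion server is FastAPI: drop the exec from the live application and have Python spawn or kill the relay per stream on request (the endpoint paths and the relays registry are illustrative, not an existing API):

        # on_demand_relay.py -- start/stop per-stream ffmpeg relays from FastAPI.
        import subprocess
        from fastapi import FastAPI, HTTPException

        app = FastAPI()
        relays: dict[str, subprocess.Popen] = {}

        @app.post("/relay/{name}/start")
        def start_relay(name: str):
            # Reuse a live process if one is already relaying this stream.
            if name in relays and relays[name].poll() is None:
                return {"status": "already running"}
            relays[name] = subprocess.Popen([
                "ffmpeg", "-i", f"rtmp://localhost:1935/live/{name}",
                "-f", "lavfi", "-i", "anullsrc",
                "-c:v", "copy", "-c:a", "aac", "-shortest",
                "-f", "flv", f"rtmp://localhost:1935/hls/{name}_main",
            ])
            return {"status": "started"}

        @app.post("/relay/{name}/stop")
        def stop_relay(name: str):
            proc = relays.pop(name, None)
            if proc is None or proc.poll() is not None:
                raise HTTPException(status_code=404, detail="no running relay")
            proc.terminate()
            return {"status": "stopped"}

    This keeps only the requested streams transcoding; the hls application block stays as it is.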

  • VLC dead input for RTP stream

    27 March, by CaptainCheese

    I'm working on creating an RTP stream that's meant to display live waveform data from Pioneer prolink players; the motivation for sending this video out is to be able to receive it in a Flutter frontend. I initially just sent a base-24 encoding of the raw ARGB packed ints per frame across a Kafka topic, but processing that data in Flutter proved untenable and was bogging down the main UI thread. I'm not sure this is the most optimal way of going about it, but I'm trying to get anything to work if it means some speedup on the frontend.

    The issue with the following implementation is what happens when I run vlc --rtsp-timeout=120000 --network-caching=30000 -vvvv stream_1.sdp, where:

    % cat stream_1.sdp
    v=0
    o=- 0 1 IN IP4 127.0.0.1
    s=RTP Stream
    c=IN IP4 127.0.0.1
    t=0 0
    a=tool:libavformat
    m=video 5007 RTP/AVP 96
    a=rtpmap:96 H264/90000
    

    I see (among other questionable logs) the following:

    [0000000144c44d10] live555 demux error: no data received in 10s, aborting
    [00000001430ee2f0] main input debug: EOF reached
    [0000000144b160c0] main decoder debug: killing decoder fourcc `h264'
    [0000000144b160c0] main decoder debug: removing module "videotoolbox"
    [0000000144b164a0] main packetizer debug: removing module "h264"
    [0000000144c44d10] main demux debug: removing module "live555"
    [0000000144c45bb0] main stream debug: removing module "record"
    [0000000144a64960] main stream debug: removing module "cache_read"
    [0000000144c29c00] main stream debug: removing module "filesystem"
    [00000001430ee2f0] main input debug: Program doesn't contain anymore ES
    [0000000144806260] main playlist debug: dead input
    [0000000144806260] main playlist debug: changing item without a request (current 0/1)
    [0000000144806260] main playlist debug: nothing to play
    [0000000142e083c0] macosx interface debug: Playback has been ended
    [0000000142e083c0] macosx interface debug: Releasing IOKit system sleep blocker (37463)
    

    This is sort of confusing, because when I run ffmpeg -protocol_whitelist file,crypto,data,rtp,udp -i stream_1.sdp -vcodec libx264 -f null - I see a number of logs like

    [h264 @ 0x139304080] non-existing PPS 0 referenced
        Last message repeated 1 times
    [h264 @ 0x139304080] decode_slice_header error
    [h264 @ 0x139304080] no frame!
    

    After that, the stream is received and I start getting telemetry on it:

    Input #0, sdp, from 'stream_1.sdp':
      Metadata:
        title           : RTP Stream
      Duration: N/A, start: 0.016667, bitrate: N/A
      Stream #0:0: Video: h264 (Constrained Baseline), yuv420p(progressive), 1200x200, 60 fps, 60 tbr, 90k tbn
    Stream mapping:
      Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
    Press [q] to stop, [?] for help
    [libx264 @ 0x107f04f40] using cpu capabilities: ARMv8 NEON
    [libx264 @ 0x107f04f40] profile High, level 3.1, 4:2:0, 8-bit
    Output #0, null, to 'pipe:':
      Metadata:
        title           : RTP Stream
        encoder         : Lavf61.7.100
      Stream #0:0: Video: h264, yuv420p(tv, progressive), 1200x200, q=2-31, 60 fps, 60 tbn
          Metadata:
            encoder         : Lavc61.19.101 libx264
          Side data:
            cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
    [out#0/null @ 0x60000069c000] video:144KiB audio:0KiB subtitle:0KiB other streams:0KiB global headers:0KiB muxing overhead: unknown
    frame= 1404 fps= 49 q=-1.0 Lsize=N/A time=00:00:23.88 bitrate=N/A speed=0.834x
    

    Not sure why VLC is turning me down like some kind of Berghain bouncer that lets nobody in the entire night.
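
    For what it's worth, the two tools seem to be failing differently: ffmpeg complains about missing SPS/PPS but simply waits for the next keyframe, while VLC's live555 reports receiving no RTP data at all within its 10s window, which may point at the SDP (port 5007 implies player 3 under the 5004 + player scheme below) rather than the codec. One way to isolate the receiver side is to let ffmpeg originate a known-good stream and write the SDP itself; an SDP generated with -sdp_file includes sprop-parameter-sets, so VLC can decode without waiting for in-band headers. A diagnostic sketch (the testsrc input is synthetic, not this project's pipeline):

        ffmpeg -re -f lavfi -i testsrc=size=1200x200:rate=60 \
            -c:v libx264 -preset ultrafast -tune zerolatency -pix_fmt yuv420p \
            -f rtp rtp://127.0.0.1:5007 -sdp_file stream_1.sdp

    If VLC plays this, the problem is on the sending side of the Java code rather than in VLC's SDP handling.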

    I initially tried just converting the ARGB ints to a YUV420p buffer and using that to create the Frame objects, but I couldn't for the life of me figure out how to initialize it properly; my attempts kept spitting out garbled junk.

    Please go easy on me; I've made an unhealthy habit of resolving nearly all of my coding questions by lurking the internet for answers, but that's not really helping me solve this one.

    Here's the Java I'm working on (the meat of the rtp comms occurs within updateWaveformForPlayer()):

    package com.bugbytz.prolink;
    
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.Producer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.bytedeco.ffmpeg.global.avcodec;
    import org.bytedeco.ffmpeg.global.avutil;
    import org.bytedeco.javacv.FFmpegFrameGrabber;
    import org.bytedeco.javacv.FFmpegFrameRecorder;
    import org.bytedeco.javacv.FFmpegLogCallback;
    import org.bytedeco.javacv.Frame;
    import org.bytedeco.javacv.FrameGrabber;
    import org.deepsymmetry.beatlink.CdjStatus;
    import org.deepsymmetry.beatlink.DeviceAnnouncement;
    import org.deepsymmetry.beatlink.DeviceAnnouncementAdapter;
    import org.deepsymmetry.beatlink.DeviceFinder;
    import org.deepsymmetry.beatlink.Util;
    import org.deepsymmetry.beatlink.VirtualCdj;
    import org.deepsymmetry.beatlink.data.BeatGridFinder;
    import org.deepsymmetry.beatlink.data.CrateDigger;
    import org.deepsymmetry.beatlink.data.MetadataFinder;
    import org.deepsymmetry.beatlink.data.TimeFinder;
    import org.deepsymmetry.beatlink.data.WaveformDetail;
    import org.deepsymmetry.beatlink.data.WaveformDetailComponent;
    import org.deepsymmetry.beatlink.data.WaveformFinder;
    
    import java.awt.*;
    import java.awt.image.BufferedImage;
    import java.io.File;
    import java.nio.ByteBuffer;
    import java.text.DecimalFormat;
    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.HashSet;
    import java.util.Map;
    import java.util.Properties;
    import java.util.Set;
    import java.util.concurrent.ExecutionException;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.ScheduledFuture;
    import java.util.concurrent.TimeUnit;
    
    import static org.bytedeco.ffmpeg.global.avutil.AV_PIX_FMT_RGB24;
    
    public class App {
        public static ArrayList tracks = new ArrayList<>();
        public static boolean dbRead = false;
        public static Properties props = new Properties();
        private static Map<Integer, FFmpegFrameRecorder> recorders = new HashMap<>();
        private static Map<Integer, Integer> frameCount = new HashMap<>();
    
        private static final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);
        private static final int FPS = 60;
        private static final int FRAME_INTERVAL_MS = 1000 / FPS;
    
        private static Map<Integer, ScheduledFuture<?>> schedules = new HashMap<>();
    
        private static Set<Integer> streamingPlayers = new HashSet<>();
    
        public static String byteArrayToMacString(byte[] macBytes) {
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < macBytes.length; i++) {
                sb.append(String.format("%02X%s", macBytes[i], (i < macBytes.length - 1) ? ":" : ""));
            }
            return sb.toString();
        }
    
        private static void updateWaveformForPlayer(int player) throws Exception {
            Integer frame_for_player = frameCount.get(player);
            if (frame_for_player == null) {
                frame_for_player = 0;
                frameCount.putIfAbsent(player, frame_for_player);
            }
    
            if (!WaveformFinder.getInstance().isRunning()) {
                WaveformFinder.getInstance().start();
            }
            WaveformDetail detail = WaveformFinder.getInstance().getLatestDetailFor(player);
    
            if (detail != null) {
                WaveformDetailComponent component = (WaveformDetailComponent) detail.createViewComponent(
                        MetadataFinder.getInstance().getLatestMetadataFor(player),
                        BeatGridFinder.getInstance().getLatestBeatGridFor(player)
                );
                component.setMonitoredPlayer(player);
                component.setPlaybackState(player, TimeFinder.getInstance().getTimeFor(player), true);
                component.setAutoScroll(true);
                int width = 1200;
                int height = 200;
                Dimension dimension = new Dimension(width, height);
                component.setPreferredSize(dimension);
                component.setSize(dimension);
                component.setScale(1);
                component.doLayout();
    
                // Create a fresh BufferedImage and clear it before rendering
                BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
                Graphics2D g = image.createGraphics();
                g.clearRect(0, 0, width, height);  // Clear any old content
    
                // Draw waveform into the BufferedImage
                component.paint(g);
                g.dispose();
    
                int port = 5004 + player;
                String inputFile = port + "_" + frame_for_player + ".mp4";
                // Initialize the FFmpegFrameRecorder for YUV420P
                FFmpegFrameRecorder recorder_file = new FFmpegFrameRecorder(inputFile, width, height);
                FFmpegLogCallback.set();  // Enable FFmpeg logging for debugging
                recorder_file.setFormat("mp4");
                recorder_file.setVideoCodec(avcodec.AV_CODEC_ID_H264);
                recorder_file.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);  // Use YUV420P format directly
                recorder_file.setFrameRate(FPS);
    
                // Set video options
                recorder_file.setVideoOption("preset", "ultrafast");
                recorder_file.setVideoOption("tune", "zerolatency");
                recorder_file.setVideoOption("x264-params", "repeat-headers=1");
                recorder_file.setGopSize(FPS);
                try {
                    recorder_file.start();  // Ensure this is called before recording any frames
                    System.out.println("Recorder started successfully for player: " + player);
                } catch (org.bytedeco.javacv.FFmpegFrameRecorder.Exception e) {
                    e.printStackTrace();
                }
    
                // Get all pixels in one call
                int[] pixels = new int[width * height];
                image.getRGB(0, 0, width, height, pixels, 0, width);
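                // recordImage is handed packed RGB24 (stride 3 * width); JavaCV's
                // recorder converts it to the configured AV_PIX_FMT_YUV420P internally.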
                recorder_file.recordImage(width,height,Frame.DEPTH_UBYTE,1,3 * width, AV_PIX_FMT_RGB24, ByteBuffer.wrap(argbToByteArray(pixels, width, height)));
                recorder_file.stop();
                recorder_file.release();
                final FFmpegFrameRecorder recorder = recorders.get(player);
                FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(inputFile);
    
    
                try {
                    grabber.start();
                } catch (Exception e) {
                    e.printStackTrace();
                }
                if (recorder == null) {
                    try {
                        String outputStream = "rtp://127.0.0.1:" + port;
                        FFmpegFrameRecorder initial_recorder = new FFmpegFrameRecorder(outputStream, grabber.getImageWidth(), grabber.getImageHeight());
                        initial_recorder.setFormat("rtp");
                        initial_recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
                        initial_recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
                        initial_recorder.setFrameRate(grabber.getFrameRate());
                        initial_recorder.setGopSize(FPS);
                        initial_recorder.setVideoOption("x264-params", "keyint=60");
                        initial_recorder.setVideoOption("rtsp_transport", "tcp");
                        initial_recorder.start();
                        recorders.putIfAbsent(player, initial_recorder);
                        frameCount.putIfAbsent(player, 0);
                        putToRTP(player, grabber, initial_recorder);
                    }
                    catch (Exception e) {
                        e.printStackTrace();
                    }
                }
                else {
                    putToRTP(player, grabber, recorder);
                }
                File file = new File(inputFile);
                if (file.exists() && file.delete()) {
                    System.out.println("Successfully deleted file: " + inputFile);
                } else {
                    System.out.println("Failed to delete file: " + inputFile);
                }
            }
        }
    
        public static void putToRTP(int player, FFmpegFrameGrabber grabber, FFmpegFrameRecorder recorder) throws FrameGrabber.Exception {
            final Frame frame = grabber.grabFrame();
            int frameCount_local = frameCount.get(player);
            frame.keyFrame = frameCount_local++ % FPS == 0;
            frameCount.put(player, frameCount_local);
            try {
                recorder.record(frame);
            } catch (FFmpegFrameRecorder.Exception e) {
                throw new RuntimeException(e);
            }
        }
        public static byte[] argbToByteArray(int[] argb, int width, int height) {
            int totalPixels = width * height;
            byte[] byteArray = new byte[totalPixels * 3];  // 3 bytes per pixel (RGB24)
    
            for (int i = 0; i < totalPixels; i++) {
                int argbPixel = argb[i];
    
                byteArray[i * 3] = (byte) ((argbPixel >> 16) & 0xFF);  // Red
                byteArray[i * 3 + 1] = (byte) ((argbPixel >> 8) & 0xFF);   // Green
                byteArray[i * 3 + 2] = (byte) (argbPixel & 0xFF);  // Blue
            }
    
            return byteArray;
        }
    
    
        public static void main(String[] args) throws Exception {
            VirtualCdj.getInstance().setDeviceNumber((byte) 4);
            CrateDigger.getInstance().addDatabaseListener(new DBService());
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "com.bugbytz.prolink.CustomSerializer");
            props.put(ProducerConfig.MAX_REQUEST_SIZE_CONFIG, "20971520");
    
            VirtualCdj.getInstance().addUpdateListener(update -> {
                if (update instanceof CdjStatus) {
                try (Producer<String, DeviceStatus> producer = new KafkaProducer<>(props)) {
                        DecimalFormat df_obj = new DecimalFormat("#.##");
                        DeviceStatus deviceStatus = new DeviceStatus(
                                update.getDeviceNumber(),
                                ((CdjStatus) update).isPlaying() || !((CdjStatus) update).isPaused(),
                                ((CdjStatus) update).getBeatNumber(),
                                update.getBeatWithinBar(),
                                Double.parseDouble(df_obj.format(update.getEffectiveTempo())),
                                Double.parseDouble(df_obj.format(Util.pitchToPercentage(update.getPitch()))),
                                update.getAddress().getHostAddress(),
                                byteArrayToMacString(DeviceFinder.getInstance().getLatestAnnouncementFrom(update.getDeviceNumber()).getHardwareAddress()),
                                ((CdjStatus) update).getRekordboxId(),
                                update.getDeviceName()
                        );
                        ProducerRecord<String, DeviceStatus> record = new ProducerRecord<>("device-status", "device-" + update.getDeviceNumber(), deviceStatus);
                        try {
                            producer.send(record).get();
                        } catch (InterruptedException ex) {
                            throw new RuntimeException(ex);
                        } catch (ExecutionException ex) {
                            throw new RuntimeException(ex);
                        }
                        producer.flush();
                        if (!WaveformFinder.getInstance().isRunning()) {
                            try {
                                WaveformFinder.getInstance().start();
                            } catch (Exception ex) {
                                throw new RuntimeException(ex);
                            }
                        }
                    }
                }
            });
            DeviceFinder.getInstance().addDeviceAnnouncementListener(new DeviceAnnouncementAdapter() {
                @Override
                public void deviceFound(DeviceAnnouncement announcement) {
                    if (!streamingPlayers.contains(announcement.getDeviceNumber())) {
                        streamingPlayers.add(announcement.getDeviceNumber());
                        schedules.putIfAbsent(announcement.getDeviceNumber(), scheduler.scheduleAtFixedRate(() -> {
                            try {
                                Runnable task = () -> {
                                    try {
                                        updateWaveformForPlayer(announcement.getDeviceNumber());
                                    } catch (InterruptedException e) {
                                        System.out.println("Thread interrupted");
                                    } catch (Exception e) {
                                        throw new RuntimeException(e);
                                    }
                                    System.out.println("Lambda thread work completed!");
                                };
                                task.run();
                            } catch (Exception e) {
                                e.printStackTrace();
                            }
                        }, 0, FRAME_INTERVAL_MS, TimeUnit.MILLISECONDS));
                    }
                }
    
                @Override
                public void deviceLost(DeviceAnnouncement announcement) {
                    if (streamingPlayers.contains(announcement.getDeviceNumber())) {
                        schedules.get(announcement.getDeviceNumber()).cancel(true);
                        streamingPlayers.remove(announcement.getDeviceNumber());
                    }
                }
            });
            BeatGridFinder.getInstance().start();
            MetadataFinder.getInstance().start();
            VirtualCdj.getInstance().start();
            TimeFinder.getInstance().start();
            DeviceFinder.getInstance().start();
            CrateDigger.getInstance().start();
    
            try {
                LoadCommandConsumer consumer = new LoadCommandConsumer("localhost:9092", "load-command-group");
                Thread consumerThread = new Thread(consumer::startConsuming);
                consumerThread.start();
    
                Runtime.getRuntime().addShutdownHook(new Thread(() -> {
                    consumer.shutdown();
                    try {
                        consumerThread.join();
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }));
                Thread.sleep(60000);
            } catch (InterruptedException e) {
                System.out.println("Interrupted, exiting.");
            }
        }
    }
    
  • Can ffmpeg extract closed caption data [closed]

    26 March, by spinon

    I am currently using ffmpeg to convert videos in various formats to flv files. One request has also come up: to get closed caption info out of the file as well. Does anyone have any experience with this, or know if it can even be done? I don't see any options for it, but thought I would ask and see.
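
    If the source carries EIA-608/708 closed captions embedded in the video stream (as US broadcast transport streams do), reasonably recent ffmpeg builds can expose them through the lavfi movie source's +subcc output and write them out as subtitles. A sketch, with input.ts as a placeholder:

        ffmpeg -f lavfi -i "movie=input.ts[out0+subcc]" -map 0:s:0 captions.srt

    If a given build can't do this, CCExtractor is the usual dedicated tool for pulling 608/708 captions out of video files.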