Advanced search


Other articles (46)

  • Accepted formats

    28 January 2010

    The following commands provide information about the formats and codecs supported by the local ffmpeg installation:
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    To begin with, we (...)

  • Adding notes and captions to images

    7 February 2011

    To add notes and captions to images, the first step is to install the "Légendes" plugin.
    Once the plugin is activated, you can configure it in the configuration area to change the rights for creating, editing and deleting notes. By default, only site administrators can add notes to images.
    Changes when adding a media item
    When adding a media item of type "image", a new button appears above the preview (...)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

On other websites (9050)

  • How does FFmpeg determine the “attached pic” and “timed thumbnails” dispositions of an MP4 track?

    6 March, by obskyr

    The Issue

    


    FFmpeg has a concept of “dispositions” – a property that describes the purpose of a stream in a media file. For example, here are the streams in a file I have lying around, with the dispositions emphasized:

    


  Stream #0:0[0x1](und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo,
fltp, 251 kb/s (default)
      Metadata:
        creation_time   : 2021-11-10T20:14:06.000000Z
        handler_name    : Core Media Audio
        vendor_id       : [0][0][0][0]

  Stream #0:1[0x2](und): Video: mjpeg (Baseline) (jpeg / 0x6765706A),
yuvj420p(pc, bt470bg/unknown/unknown), 1024x1024, 0 kb/s, 0.0006 fps, 3.08 tbr,
600 tbn (default) (attached pic) (timed thumbnails)
      Metadata:
        creation_time   : 2021-11-10T20:14:06.000000Z
        handler_name    : Core Media Video
        vendor_id       : [0][0][0][0]

  Stream #0:2[0x3](und): Data: bin_data (text / 0x74786574)
      Metadata:
        creation_time   : 2021-11-10T20:14:06.000000Z
        handler_name    : Core Media Text

  Stream #0:3[0x0]: Video: mjpeg (Baseline), yuvj420p(pc, bt470bg/unknown/
unknown), 1024x1024 [SAR 144:144 DAR 1:1], 90k tbr, 90k tbn (attached pic)

    However, if I make any modification to this file’s chapter markers using the C++ library MP4v2 (even just re-saving the existing ones: auto f = MP4Modify("test.m4a"); MP4Chapter_t* chapterList; uint32_t chapterCount; MP4GetChapters(f, &chapterList, &chapterCount); MP4SetChapters(f, chapterList, chapterCount); MP4Close(f);), some of these dispositions are removed:

  Stream #0:0[0x1](und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo,
fltp, 251 kb/s (default)
      Metadata:
        creation_time   : 2021-11-10T20:14:06.000000Z
        handler_name    : Core Media Audio
        vendor_id       : [0][0][0][0]

  Stream #0:1[0x2](und): Video: mjpeg (Baseline) (jpeg / 0x6765706A),
yuvj420p(pc, bt470bg/unknown/unknown), 1024x1024, 0 kb/s, 0.0006 fps, 3.08 tbr,
600 tbn (default) ← “attached pic” and “timed thumbnails” removed!
      Metadata:
        creation_time   : 2021-11-10T20:14:06.000000Z
        handler_name    : Core Media Video
        vendor_id       : [0][0][0][0]

  Stream #0:2[0x0]: Video: mjpeg (Baseline), yuvj420p(pc, bt470bg/unknown/
unknown), 1024x1024 [SAR 144:144 DAR 1:1], 90k tbr, 90k tbn (attached pic)

  Stream #0:3[0x4](und): Data: bin_data (text / 0x74786574)
  (This stream was moved to the end, but that’s intended behavior. It contains chapter titles, and we just edited the chapters.)
      Metadata:
        creation_time   : 2025-03-05T09:56:31.000000Z
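
    For reference, here is the inline MP4v2 snippet from the paragraph above expanded into a self-contained sketch. The header path, the MP4_INVALID_FILE_HANDLE check and the MP4Free call are my assumptions about the MP4v2 API rather than something stated here, so treat it as a sketch:

    // Re-save the existing chapter markers with MP4v2, the operation after which
    // FFmpeg stops reporting "attached pic" / "timed thumbnails".
    #include <mp4v2/mp4v2.h>   // assumed header location
    #include <cstdint>

    int main() {
        MP4FileHandle f = MP4Modify("test.m4a");          // open the file for in-place editing
        if (f == MP4_INVALID_FILE_HANDLE) return 1;

        MP4Chapter_t *chapterList = nullptr;
        uint32_t chapterCount = 0;
        MP4GetChapters(f, &chapterList, &chapterCount);   // read the current chapter list
        MP4SetChapters(f, chapterList, chapterCount);     // write it back unchanged
        MP4Free(chapterList);                             // assumed matching deallocator
        MP4Close(f);
        return 0;
    }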

    It also renders the file unplayable in MPC-HC (but not in VLC!), which is apparently a bug in MP4v2. I’m currently investigating that bug to report and potentially fix it, but that’s a separate issue – in my journey there, I’m racking my brain trying to understand what it is that MP4v2 changes to make FFmpeg stop reporting the “attached pic” and “timed thumbnails” dispositions. I’ve explored the before-and-afters in MP4Box, and I can’t for the life of me find which atom differs in a relevant way.


    (I’d love to share the files, but unfortunately the contents are under copyright – if anyone knows of a way to remove the audio from an MP4 file without changing anything else, let me know and I’ll upload dummied-out versions. Without them, I can’t really ask about the issue directly. I can at least show you the files’ respective atom trees, but I’m not sure how relevant that is.)


    The Question


    I thought I’d read FFmpeg’s source code to find out how it determines dispositions for MP4 streams, but of course, FFmpeg is very complex. Could someone who’s more familiar with C and/or FFmpeg’s codebase help me sleuth out how FFmpeg determines dispositions for MP4 files (in particular, “attached pic” and “timed thumbnails”)?


    Some Thoughts…


    • I figure searching for “attached_pic” might be a good start? (See the sketch right after this list.)
    • Could the MP4 muxer movenc.c be helpful?
    • I’d imagine what we’d really like to look at is the MP4 demuxing process, as it’s during demuxing that FFmpeg determines dispositions from the data in the file. After poring over the code for hours, however, I’ve been utterly unable to find where that happens.
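
    Following up on that first thought: the flags ffprobe prints come from the disposition bitfield on each AVStream, and for .mp4/.m4a input the demuxer that fills it in is libavformat/mov.c, so grepping that file for AV_DISPOSITION_ATTACHED_PIC and AV_DISPOSITION_TIMED_THUMBNAILS is probably the quickest way in (worth verifying against the current tree). Below is a minimal sketch of mine, not from the post, that prints those flags through the C API; it can help confirm that the before/after difference really is decided at demux time:

    // Hedged sketch: list the disposition flags FFmpeg assigns to each stream.
    // Assumed build line: g++ probe_dispositions.cpp -lavformat -lavcodec -lavutil
    extern "C" {
    #include <libavformat/avformat.h>
    }
    #include <cstdio>

    int main(int argc, char **argv) {
        if (argc < 2) { std::fprintf(stderr, "usage: %s file.mp4\n", argv[0]); return 1; }

        AVFormatContext *fmt = nullptr;
        if (avformat_open_input(&fmt, argv[1], nullptr, nullptr) < 0) return 1;
        if (avformat_find_stream_info(fmt, nullptr) < 0) { avformat_close_input(&fmt); return 1; }

        for (unsigned i = 0; i < fmt->nb_streams; i++) {
            int d = fmt->streams[i]->disposition;  // bitfield filled in by the demuxer
            std::printf("stream %u:%s%s%s\n", i,
                        (d & AV_DISPOSITION_DEFAULT)          ? " (default)"          : "",
                        (d & AV_DISPOSITION_ATTACHED_PIC)     ? " (attached pic)"     : "",
                        (d & AV_DISPOSITION_TIMED_THUMBNAILS) ? " (timed thumbnails)" : "");
        }
        avformat_close_input(&fmt);
        return 0;
    }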

  • How to improve web camera streaming latency to v4l2loopback device with ffmpeg?

    11 March, by Made by Moses

    I'm trying to stream my iPhone camera to my PC on LAN.


    What I've done:


    1. HTTP server with HTML page and streaming script:

       I use WebSockets here; maybe WebRTC is a better choice, but the network latency seems good enough.

async function beginCameraStream() {
  const mediaStream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: "user" },
  });

  websocket = new WebSocket(SERVER_URL);

  websocket.onopen = () => {
    console.log("WS connected");

    const options = { mimeType: "video/mp4", videoBitsPerSecond: 1_000_000 };
    mediaRecorder = new MediaRecorder(mediaStream, options);

    mediaRecorder.ondataavailable = async (event) => {
      // to measure latency I prepend timestamp to the actual video bytes chunk
      const timestamp = Date.now();
      const timestampBuffer = new ArrayBuffer(8);
      const dataView = new DataView(timestampBuffer);
      dataView.setBigUint64(0, BigInt(timestamp), true);
      const data = await event.data.bytes();

      const result = new Uint8Array(data.byteLength + 8);
      result.set(new Uint8Array(timestampBuffer), 0);
      result.set(data, 8);

      websocket.send(result);
    };

    mediaRecorder.start(100); // Collect 100ms chunks
  };
}

    2. Server to process video chunks:

import { serve } from "bun";
import { Readable } from "stream";
import { spawn } from "child_process";

const V4L2LOOPBACK_DEVICE = "/dev/video10";

export const setupFFmpeg = (v4l2device) => {
  // prettier-ignore
  return spawn("ffmpeg", [
    '-i', 'pipe:0',           // Read from stdin
    '-pix_fmt', 'yuv420p',    // Pixel format
    '-r', '30',               // Target 30 fps
    '-f', 'v4l2',             // Output format
    v4l2device, // Output to v4l2loopback device
  ]);
};

export class FfmpegStream extends Readable {
  _read() {
    // This is called when the stream wants more data
    // We push data when we get chunks
  }
}

function main() {
  const ffmpeg = setupFFmpeg(V4L2LOOPBACK_DEVICE);
  serve({
    port: 8000,
    fetch(req, server) {
      if (server.upgrade(req)) {
        return; // Upgraded to WebSocket
      }
    },
    websocket: {
      open(ws) {
        console.log("Client connected");
        const stream = new FfmpegStream();
        stream.pipe(ffmpeg?.stdin);

        ws.data = {
          stream,
          received: 0,
        };
      },
      async message(ws, message) {
        const view = new DataView(message.buffer, 0, 8);
        const ts = Number(view.getBigUint64(0, true));
        ws.data.received += message.byteLength;
        const chunk = new Uint8Array(message.buffer, 8, message.byteLength - 8);

        ws.data.stream.push(chunk);

        console.log(
          [
            `latency: ${Date.now() - ts} ms`,
            `chunk: ${message.byteLength}`,
            `total: ${ws.data.received}`,
          ].join(" | "),
        );
      },
    },
  });
}

main();

    Then, when I open the v4l2loopback device

    cvlc v4l2:///dev/video10

    the picture is delayed by at least 1.5 seconds, which is unacceptable for my project.

    Thoughts:


    • The problem doesn't seem to be network latency:

latency: 140 ms | chunk: 661 Bytes | total: 661 Bytes
latency: 206 ms | chunk: 16.76 KB | total: 17.41 KB
latency: 141 ms | chunk: 11.28 KB | total: 28.68 KB
latency: 141 ms | chunk: 13.05 KB | total: 41.74 KB
latency: 199 ms | chunk: 11.39 KB | total: 53.13 KB
latency: 141 ms | chunk: 16.94 KB | total: 70.07 KB
latency: 139 ms | chunk: 12.67 KB | total: 82.74 KB
latency: 142 ms | chunk: 13.14 KB | total: 95.88 KB

    150 ms is actually too much for 15 KB on a LAN, but there could be some issue with my router.
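
    A rough check of that with assumed numbers (not from the post): on a ~100 Mbit/s LAN,

        15 KB × 8 bits/byte = 120 kbit
        120 kbit ÷ 100 Mbit/s ≈ 1.2 ms of raw transmission time

    so almost none of the measured ~140 ms is explained by throughput. Note also that the two Date.now() calls run on different machines (phone and PC), so any clock offset between them is folded into that figure.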


    • As far as I can tell, it isn't tied to ffmpeg throughput either:

Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'pipe:0':
  Metadata:
    major_brand     : iso5
    minor_version   : 1
    compatible_brands: isomiso5hlsf
    creation_time   : 2025-03-09T17:16:49.000000Z
  Duration: 00:00:01.38, start:
0.000000, bitrate: N/A
    Stream #0:0(und): Video: h264 (Baseline) (avc1 / 0x31637661), yuvj420p(pc), 1280x720, 4012 kb/s, 57.14 fps, 29.83 tbr, 600 tbn, 1200 tbc (default)
    Metadata:
      rotate          : 90
      creation_time   : 2025-03-09T17:16:49.000000Z
      handler_name    : Core Media Video
    Side data:
      displaymatrix: rotation of -90.00 degrees

Stream mapping:
  Stream #0:0 -> #0:0 (h264 (native) -> rawvideo (native))

[swscaler @ 0x55d8d0b83100] deprecated pixel format used, make sure you did set range correctly

Output #0, video4linux2,v4l2, to '/dev/video10':
  Metadata:
    major_brand     : iso5
    minor_version   : 1
    compatible_brands: isomiso5hlsf
    encoder         : Lavf58.45.100

Stream #0:0(und): Video: rawvideo (I420 / 0x30323449), yuv420p, 720x1280, q=2-31, 663552 kb/s, 60 fps, 60 tbn, 60 tbc (default)
    Metadata:
      encoder         : Lavc58.91.100 rawvideo
      creation_time   : 2025-03-09T17:16:49.000000Z
      handler_name    : Core Media Video
    Side data:
      displaymatrix: rotation of -0.00 degrees

frame=   99 fps=0.0 q=-0.0 size=N/A time=00:00:01.65 bitrate=N/A dup=50 drop=0 speed=2.77x
frame=  137 fps=114 q=-0.0 size=N/A time=00:00:02.28 bitrate=N/A dup=69 drop=0 speed=1.89x
frame=  173 fps= 98 q=-0.0 size=N/A time=00:00:02.88 bitrate=N/A dup=87 drop=0 speed=1.63x
frame=  210 fps= 86 q=-0.0 size=N/A time=00:00:03.50 bitrate=N/A dup=105 drop=0 speed=1.44x
frame=  249 fps= 81 q=-0.0 size=N/A time=00:00:04.15 bitrate=N/A dup=125 drop=0 speed=1.36
frame=  279 fps= 78 q=-0.0 size=N/A time=00:00:04.65 bitrate=N/A dup=139 drop=0 speed=1.31x

    • I also tried writing the video stream directly to a video.mp4 file and immediately opening it with VLC, but it can only be opened successfully after 1.5 seconds.
    • I've tried using the OBS v4l2 input source instead of VLC, but the latency is the same.

    Update №1


    When I stream an actual .mp4 file to ffmpeg, it works almost immediately, with a 0.2 s delay to spin up ffmpeg itself:

    cat video.mp4 | ffmpeg -re -i pipe:0 -pix_fmt yuv420p -f v4l2 /dev/video10 & ; sleep 0.2 && cvlc v4l2:///dev/video10


    So the problem is apparently in the streaming process.

  • How to quote a file name with a single quote in ffmpeg movie= filter notation? [closed]

    26 May, by PieterV

    I am trying to run ffmpeg using a file that contains a single quote ' in the filename.

    I tried to follow the docs that say I should replace a ' with '\''.
    And a ticket that says I should replace a ' with \\\\\'.

    I've tried both, and can't get it working.

    E.g. docs format:


./ffprobe -loglevel error -read_intervals %00:30 -select_streams s:0 -f lavfi -i "movie='D\:\\Test\\Interlaced - Dragons'\'' Den - S14E02 - Episode 2.mkv'[out0+subcc]" -show_packets -print_format json

{
[Parsed_movie_0 @ 00000222a2f82200] Failed to avformat_open_input 'D:\Test\Interlaced - Dragons Den - S14E02 - Episode 2.mkv'
[AVFilterGraph @ 00000222a2f76ec0] Error processing filtergraph: No such file or directory
movie='D\:\\Test\\Interlaced - Dragons'\'' Den - S14E02 - Episode 2.mkv'[out0+subcc]: No such file or directory

    E.g. ticket format:


./ffprobe -loglevel error -read_intervals %00:30 -select_streams s:0 -f lavfi -i "movie='D\:\\Test\\Interlaced - Dragons\\\\\' Den - S14E02 - Episode 2.mkv'[out0+subcc]" -show_packets -print_format json

{
[Parsed_movie_0 @ 00000158613d2080] Failed to avformat_open_input 'D:\Test\Interlaced - Dragons\\ Den - S14E02 - Episode 2.mkv[out0+subcc]'
[AVFilterGraph @ 00000158613c6ec0] Error processing filtergraph: No such file or directory
movie='D\:\\Test\\Interlaced - Dragons\\\\\' Den - S14E02 - Episode 2.mkv'[out0+subcc]: No such file or directory

    > dir "D:\Test\Interlaced - Dragons&#x27; Den - S14E02 - Episode 2.mkv"&#xA;&#xA;    Directory: D:\Test&#xA;&#xA;Mode                 LastWriteTime         Length Name&#xA;----                 -------------         ------ ----&#xA;-a---           4/20/2025 11:38 AM       18059051 Interlaced - Dragons&#x27; Den - S14E02 - Episode 2.mkv&#xA;

    &#xA;

    This is on Win11 using FFmpeg 7.
    Any ideas?

    [Update]
    I found a doc on escaping filtergraph strings; it did not help. I tried 0 to 7 backslashes (\).

    I also found and tried the ffescape utility; the output it produces just uses a single \' and does not work.

    > echo "D:\Test\Interlaced - Dragons&#x27; Den - S14E02 - Episode 2.mkv" | ./ffescape.exe&#xA;=> D:\\Test\\Interlaced - Dragons\&#x27; Den - S14E02 - Episode 2.mkv\&#xA;&#xA;> ./ffprobe -loglevel error -read_intervals %00:30 -select_streams s:0 -f lavfi -i "movie=&#x27;D:\\Test\\Interlaced - Dragons\&#x27; Den - S14E02 - Episode 2.mkv\&#x27;[out0&#x2B;subcc]" -show_packets -print_format json&#xA;{&#xA;[Parsed_movie_0 @ 0000021348f12200] Failed to avformat_open_input &#x27;D&#x27;&#xA;[AVFilterGraph @ 0000021348f06ec0] Error processing filtergraph: No such file or directory&#xA;movie=&#x27;D:\\Test\\Interlaced - Dragons\&#x27; Den - S14E02 - Episode 2.mkv\&#x27;[out0&#x2B;subcc]: No such file or directory&#xA;

    &#xA;

    [Update]
    I found docs for ffmpeg filter scripts, where I can place the commands in a file.

    I tried ./ffprobe -loglevel error -read_intervals %00:01 -select_streams s:0 -f lavfi -/i "d:\filtergraph.txt" -show_packets -print_format json, and it loads the script.

    Works: movie=test.mkv[out0+subcc]\ if test.mkv is in ffprobe dir.
    Works: movie=test\'.mkv[out0+subcc]\ if test'.mkv is in ffprobe dir.
    Not: movie=D:\test.mkv[out0+subcc]
    Not: movie=D\:\\test.mkv[out0+subcc]
    Not: movie=test space.mkv[out0+subcc]
    Not: movie='test space.mkv[out0+subcc]'
    Not: movie="test space.mkv[out0+subcc]"
    Not: 'movie=test space.mkv[out0+subcc]'
    Not: "movie=test space.mkv[out0+subcc]"

    :(


    Update with working answer
