
Media (2)
-
Example of action buttons for a collaborative collection
27 February 2013, by
Updated: March 2013
Language: French
Type: Image
-
Example of action buttons for a personal collection
27 February 2013, by
Updated: February 2013
Language: English
Type: Image
Other articles (41)
-
Contribute to a better visual interface
13 April 2011
MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community. -
Media-specific libraries and software
10 December 2010, by
For correct and optimal operation, several things need to be taken into consideration.
After installing apache2, mysql and php5, it is important to install the other required software, whose installation is described in the related links. A set of multimedia libraries (x264, libtheora, libvpx) is used for encoding and decoding video and audio, in order to support as many file types as possible. See: this tutorial; FFMpeg with the maximum number of decoders and (...) -
Uploading media and themes via FTP
31 May 2013, by
MédiaSPIP also handles media transferred via FTP. If you prefer to upload this way, retrieve the access credentials for your MédiaSPIP site and use your favourite FTP client.
From the start you will find the following folders in your FTP space: config/: the site's configuration folder; IMG/: media already processed and online on the site; local/: the web site's cache directory; themes/: custom themes or style sheets; tmp/: working folder (...)
Sur d’autres sites (5386)
-
How to improve web camera streaming latency to v4l2loopback device with ffmpeg?
11 March, by Made by Moses
I'm trying to stream my iPhone camera to my PC over the LAN.


What I've done:

- HTTP server with an HTML page and the streaming script:

I use WebSockets here; maybe WebRTC would be a better choice, but the network latency seems good enough.

// `SERVER_URL`, `websocket` and `mediaRecorder` are assumed to be declared elsewhere on the page
async function beginCameraStream() {
  const mediaStream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: "user" },
  });

  websocket = new WebSocket(SERVER_URL);

  websocket.onopen = () => {
    console.log("WS connected");

    const options = { mimeType: "video/mp4", videoBitsPerSecond: 1_000_000 };
    mediaRecorder = new MediaRecorder(mediaStream, options);

    mediaRecorder.ondataavailable = async (event) => {
      // to measure latency I prepend a timestamp to the actual video bytes chunk
      const timestamp = Date.now();
      const timestampBuffer = new ArrayBuffer(8);
      const dataView = new DataView(timestampBuffer);
      dataView.setBigUint64(0, BigInt(timestamp), true);
      const data = await event.data.bytes();

      const result = new Uint8Array(data.byteLength + 8);
      result.set(new Uint8Array(timestampBuffer), 0);
      result.set(data, 8);

      websocket.send(result);
    };

    mediaRecorder.start(100); // Collect 100ms chunks
  };
}



- Server to process the video chunks:

import { serve } from "bun";
import { spawn } from "child_process";
import { Readable } from "stream";

const V4L2LOOPBACK_DEVICE = "/dev/video10";

export const setupFFmpeg = (v4l2device) => {
  // prettier-ignore
  return spawn("ffmpeg", [
    '-i', 'pipe:0',        // Read from stdin
    '-pix_fmt', 'yuv420p', // Pixel format
    '-r', '30',            // Target 30 fps
    '-f', 'v4l2',          // Output format
    v4l2device,            // Output to v4l2loopback device
  ]);
};

export class FfmpegStream extends Readable {
  _read() {
    // This is called when the stream wants more data
    // We push data when we get chunks
  }
}

function main() {
  const ffmpeg = setupFFmpeg(V4L2LOOPBACK_DEVICE);

  serve({
    port: 8000,
    fetch(req, server) {
      if (server.upgrade(req)) {
        return; // Upgraded to WebSocket
      }
    },
    websocket: {
      open(ws) {
        console.log("Client connected");
        const stream = new FfmpegStream();
        stream.pipe(ffmpeg?.stdin);

        ws.data = {
          stream,
          received: 0,
        };
      },
      async message(ws, message) {
        // First 8 bytes carry the client timestamp, the rest is the video chunk
        // (assumes the binary message view starts at byte offset 0)
        const view = new DataView(message.buffer, 0, 8);
        const ts = Number(view.getBigUint64(0, true));
        ws.data.received += message.byteLength;
        const chunk = new Uint8Array(message.buffer, 8, message.byteLength - 8);

        ws.data.stream.push(chunk);

        console.log(
          [
            `latency: ${Date.now() - ts} ms`,
            `chunk: ${message.byteLength}`,
            `total: ${ws.data.received}`,
          ].join(" | "),
        );
      },
    },
  });
}

main();
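
For completeness, the setup this assumes is roughly the following (the module parameter and file name are illustrative):

sudo modprobe v4l2loopback video_nr=10
bun run server.js
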



Then I open the v4l2loopback device with:

cvlc v4l2:///dev/video10

The picture is delayed by at least 1.5 s, which is unacceptable for my project.
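
Part of that delay may simply be the player's own buffering: VLC applies a default live-caching of a few hundred milliseconds to capture devices, which can be lowered (a sketch, with an arbitrary 50 ms value):

cvlc v4l2:///dev/video10 --live-caching=50
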


Thoughts:

- The problem doesn't seem to be network latency:




latency: 140 ms | chunk: 661 Bytes | total: 661 Bytes
latency: 206 ms | chunk: 16.76 KB | total: 17.41 KB
latency: 141 ms | chunk: 11.28 KB | total: 28.68 KB
latency: 141 ms | chunk: 13.05 KB | total: 41.74 KB
latency: 199 ms | chunk: 11.39 KB | total: 53.13 KB
latency: 141 ms | chunk: 16.94 KB | total: 70.07 KB
latency: 139 ms | chunk: 12.67 KB | total: 82.74 KB
latency: 142 ms | chunk: 13.14 KB | total: 95.88 KB



150 ms is actually a lot for ~15 KB on a LAN, but that could be an issue with my router.


- As far as I can tell, it isn't tied to ffmpeg throughput either:




Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'pipe:0':
 Metadata:
 major_brand : iso5
 minor_version : 1
 compatible_brands: isomiso5hlsf
 creation_time : 2025-03-09T17:16:49.000000Z
 Duration: 00:00:01.38, start: 0.000000, bitrate: N/A
 Stream #0:0(und): Video: h264 (Baseline) (avc1 / 0x31637661), yuvj420p(pc), 1280x720, 4012 kb/s, 57.14 fps, 29.83 tbr, 600 tbn, 1200 tbc (default)
 Metadata:
 rotate : 90
 creation_time : 2025-03-09T17:16:49.000000Z
 handler_name : Core Media Video
 Side data:
 displaymatrix: rotation of -90.00 degrees

Stream mapping:
 Stream #0:0 -> #0:0 (h264 (native) -> rawvideo (native))

[swscaler @ 0x55d8d0b83100] deprecated pixel format used, make sure you did set range correctly

Output #0, video4linux2,v4l2, to '/dev/video10':
 Metadata:
 major_brand : iso5
 minor_version : 1
 compatible_brands: isomiso5hlsf
 encoder : Lavf58.45.100

Stream #0:0(und): Video: rawvideo (I420 / 0x30323449), yuv420p, 720x1280, q=2-31, 663552 kb/s, 60 fps, 60 tbn, 60 tbc (default)
 Metadata:
 encoder : Lavc58.91.100 rawvideo
 creation_time : 2025-03-09T17:16:49.000000Z
 handler_name : Core Media Video
 Side data:
 displaymatrix: rotation of -0.00 degrees

frame= 99 fps=0.0 q=-0.0 size=N/A time=00:00:01.65 bitrate=N/A dup=50 drop=0 speed=2.77x
frame= 137 fps=114 q=-0.0 size=N/A time=00:00:02.28 bitrate=N/A dup=69 drop=0 speed=1.89x
frame= 173 fps= 98 q=-0.0 size=N/A time=00:00:02.88 bitrate=N/A dup=87 drop=0 speed=1.63x
frame= 210 fps= 86 q=-0.0 size=N/A time=00:00:03.50 bitrate=N/A dup=105 drop=0 speed=1.44x
frame= 249 fps= 81 q=-0.0 size=N/A time=00:00:04.15 bitrate=N/A dup=125 drop=0 speed=1.36
frame= 279 fps= 78 q=-0.0 size=N/A time=00:00:04.65 bitrate=N/A dup=139 drop=0 speed=1.31x



- I also tried writing the video stream directly to a video.mp4 file and immediately opening it with vlc, but it can only be opened successfully after about 1.5 sec.

- I've tried using the OBS v4l2 input source instead of vlc, but the latency is the same.








Update №1


When I stream an actual .mp4 file to ffmpeg, it works almost immediately, with only a ~0.2 s delay to spin up ffmpeg itself:

cat video.mp4 | ffmpeg -re -i pipe:0 -pix_fmt yuv420p -f v4l2 /dev/video10 & ; sleep 0.2 && cvlc v4l2:///dev/video10



So the problem is apparently in the streaming process.
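
One thing that might be worth trying (a sketch, not verified against this exact setup) is reducing ffmpeg's input probing and buffering, which is a common source of startup delay when reading a live fragmented MP4 stream from a pipe:

ffmpeg -probesize 32 -analyzeduration 0 -fflags nobuffer -flags low_delay -i pipe:0 -pix_fmt yuv420p -f v4l2 /dev/video10

These options only affect the demuxer/decoder side; any remaining delay would come from the 100 ms MediaRecorder chunks and the player's own caching.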


-
How to quote a file name with a single quote in ffmpeg movie= filter notation? [closed]
21 April, by PieterV
I am trying to run ffmpeg on a file whose name contains a single quote ' .

I tried to follow the docs, which say I should replace a ' with '\'' .

And a ticket that says I should replace a ' with \\\\\' .

I've tried both, and can't get it working.


E.g. docs format:


./ffprobe -loglevel error -read_intervals %00:30 -select_streams s:0 -f lavfi -i "movie='D\:\\Test\\Interlaced - Dragons'\'' Den - S14E02 - Episode 2.mkv'[out0+subcc]" -show_packets -print_format json

{
[Parsed_movie_0 @ 00000222a2f82200] Failed to avformat_open_input 'D:\Test\Interlaced - Dragons Den - S14E02 - Episode 2.mkv'
[AVFilterGraph @ 00000222a2f76ec0] Error processing filtergraph: No such file or directory
movie='D\:\\Test\\Interlaced - Dragons'\'' Den - S14E02 - Episode 2.mkv'[out0+subcc]: No such file or directory



E.g. ticket format:


./ffprobe -loglevel error -read_intervals %00:30 -select_streams s:0 -f lavfi -i "movie='D\:\\Test\\Interlaced - Dragons\\\\\' Den - S14E02 - Episode 2.mkv'[out0+subcc]" -show_packets -print_format json

{
[Parsed_movie_0 @ 00000158613d2080] Failed to avformat_open_input 'D:\Test\Interlaced - Dragons\\ Den - S14E02 - Episode 2.mkv[out0+subcc]'
[AVFilterGraph @ 00000158613c6ec0] Error processing filtergraph: No such file or directory
movie='D\:\\Test\\Interlaced - Dragons\\\\\' Den - S14E02 - Episode 2.mkv'[out0+subcc]: No such file or directory



> dir "D:\Test\Interlaced - Dragons' Den - S14E02 - Episode 2.mkv"

 Directory: D:\Test

Mode LastWriteTime Length Name
---- ------------- ------ ----
-a--- 4/20/2025 11:38 AM 18059051 Interlaced - Dragons' Den - S14E02 - Episode 2.mkv



This is on Windows 11 using FFmpeg 7.

Any ideas?

[Update]

I found a doc on escaping filtergraph strings; it did not help. I tried 0 to 7 \ characters.

I also found and tried the ffescape utility; the output it produces just uses a single \' and does not work.

> echo "D:\Test\Interlaced - Dragons' Den - S14E02 - Episode 2.mkv" | ./ffescape.exe
=> D:\\Test\\Interlaced - Dragons\' Den - S14E02 - Episode 2.mkv\

> ./ffprobe -loglevel error -read_intervals %00:30 -select_streams s:0 -f lavfi -i "movie='D:\\Test\\Interlaced - Dragons\' Den - S14E02 - Episode 2.mkv\'[out0+subcc]" -show_packets -print_format json
{
[Parsed_movie_0 @ 0000021348f12200] Failed to avformat_open_input 'D'
[AVFilterGraph @ 0000021348f06ec0] Error processing filtergraph: No such file or directory
movie='D:\\Test\\Interlaced - Dragons\' Den - S14E02 - Episode 2.mkv\'[out0+subcc]: No such file or directory



[Update]

I found docs for the ffmpeg filter script option, where I can place the filtergraph in a file.

I tried ./ffprobe -loglevel error -read_intervals %00:01 -select_streams s:0 -f lavfi -/i "d:\filtergraph.txt" -show_packets -print_format json , and it loads the script.

Works: movie=test.mkv[out0+subcc] if test.mkv is in the ffprobe dir.
Works: movie=test\'.mkv[out0+subcc] if test'.mkv is in the ffprobe dir.

Not: movie=D:\test.mkv[out0+subcc]
Not: movie=D\:\\test.mkv[out0+subcc]
Not: movie=test space.mkv[out0+subcc]
Not: movie='test space.mkv[out0+subcc]'
Not: movie="test space.mkv[out0+subcc]"
Not: 'movie=test space.mkv[out0+subcc]'
Not: "movie=test space.mkv[out0+subcc]"


:(
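
For reference, a sketch of what the two escaping levels described in the filtergraph docs seem to imply for this path when the name is not wrapped in quotes (before any shell-level quoting; unverified):

Original name:            D:\Test\Interlaced - Dragons' Den - S14E02 - Episode 2.mkv
Option-level (\ ' :):     D\:\\Test\\Interlaced - Dragons\' Den - S14E02 - Episode 2.mkv
Filtergraph-level (\ '):  D\\:\\\\Test\\\\Interlaced - Dragons\\\' Den - S14E02 - Episode 2.mkv

The [out0+subcc] pad label stays outside the escaped value.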


-
FFMPEG send RTP audio at 8k bytes/sec [closed]
10 May, by Muzza
I'm trying to use FFmpeg to mimic a device that transmits G711U audio over UDP/RTP at 8k bytes per second.
The device I'm mimicking sends RTP packets every 20 ms with a 160-byte payload.


I've had limited success using the following command


ffmpeg -f dshow -i audio="Microphone (Realtek(R) Audio)" -ac 1 -ar 8000 -ab 8 -acodec pcm_mulaw -f rtp rtp://127.0.0.1:12345?pkt_size=160



This sends G711U-encoded audio in 160-byte chunks, but it streams at 64 kB/s, not the 8 kB/s that my device expects, so the device errors out.
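
For reference, the nominal G.711 µ-law numbers (assuming a standard 8 kHz, mono, 8-bit stream) work out as:

8000 samples/s × 1 byte/sample = 8000 bytes/s = 64 kbit/s
8000 bytes/s × 0.020 s = 160 bytes per 20 ms packet

(ffmpeg's "64 kb/s" in the log below is kilobits per second.)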


Any ideas would be massively appreciated!


Thank you


Log from FFMPEG


>ffmpeg -f dshow -i audio="Microphone (Realtek(R) Audio)" -ac 1 -ar 8000 -ab 8 -acodec pcm_mulaw -f rtp rtp://127.0.0.1:12345?pkt_size=160
ffmpeg version 2025-04-23-git-25b0a8e295-essentials_build-www.gyan.dev Copyright (c) 2000-2025 the FFmpeg developers
 built with gcc 14.2.0 (Rev3, Built by MSYS2 project)
 configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-zlib --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-sdl2 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-libaom --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-dxva2 --enable-d3d11va --enable-d3d12va --enable-ffnvcodec --enable-libvpl --enable-nvdec --enable-nvenc --enable-vaapi --enable-libgme --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libtheora --enable-libvo-amrwbenc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-librubberband
 libavutil 60. 2.100 / 60. 2.100
 libavcodec 62. 0.101 / 62. 0.101
 libavformat 62. 0.100 / 62. 0.100
 libavdevice 62. 0.100 / 62. 0.100
 libavfilter 11. 0.100 / 11. 0.100
 libswscale 9. 0.100 / 9. 0.100
 libswresample 6. 0.100 / 6. 0.100
 libpostproc 59. 1.100 / 59. 1.100
[aist#0:0/pcm_s16le @ 00000198256b73c0] Guessed Channel Layout: stereo
Input #0, dshow, from 'audio=Microphone (Realtek(R) Audio)':
 Duration: N/A, start: 135470.702000, bitrate: 1411 kb/s
 Stream #0:0: Audio: pcm_s16le, 44100 Hz, stereo, s16, 1411 kb/s, Start-Time 135470.702s
Stream mapping:
 Stream #0:0 -> #0:0 (pcm_s16le (native) -> pcm_mulaw (native))
Press [q] to stop, [?] for help
[pcm_mulaw @ 00000198256cf240] Bitrate 8 is extremely low, maybe you mean 8k
Output #0, rtp, to 'rtp://127.0.0.1:12345?pkt_size=160':
 Metadata:
 encoder : Lavf62.0.100
 Stream #0:0: Audio: pcm_mulaw, 8000 Hz, mono, s16 (8 bit), 64 kb/s
 Metadata:
 encoder : Lavc62.0.101 pcm_mulaw
SDP:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=No Name
c=IN IP4 127.0.0.1
t=0 0
a=tool:libavformat 62.0.100
m=audio 12345 RTP/AVP 0
b=AS:64

[out#0/rtp @ 00000198256cdd00] video:0KiB audio:973KiB subtitle:0KiB other streams:0KiB global headers:0KiB muxing overhead: 8.467470%
size= 1055KiB time=00:02:04.51 bitrate= 69.4kbits/s speed= 1x
Exiting normally, received signal 2.



Wireshark :
Wireshark Log


Shows packets being sent every 0.20ms