
Media (3)
-
Elephants Dream - Cover of the soundtrack
17 October 2011
Updated: October 2011
Language: English
Type: Image
-
Valkaama DVD Label
4 October 2011
Updated: February 2013
Language: English
Type: Image
-
Publishing an image simply
13 April 2011
Updated: February 2012
Language: French
Type: Video
Other articles (55)
-
Support for all media types
10 April 2011
Unlike many other modern programs and document-sharing platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); or textual content, code and other data (OpenOffice, Microsoft Office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)
-
Supporting all media types
13 April 2011
Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)
-
Contribute to documentation
13 April 2011
Documentation is vital to the development of improved technical capabilities.
MediaSPIP welcomes documentation by users as well as developers, including: critiques of existing features and functions; articles contributed by developers, administrators, content producers and editors; screenshots to illustrate the above; and translations of existing documentation into other languages.
To contribute, register to the project users' mailing (...)
On other sites (9018)
-
How to improve web camera streaming latency to v4l2loopback device with ffmpeg?
11 March, by Made by Moses

I'm trying to stream my iPhone camera to my PC on LAN.

What I've done:

- HTTP server with an HTML page and a streaming script:

I use WebSockets here; maybe WebRTC is a better choice, but the network latency seems to be good enough.

// SERVER_URL, websocket and mediaRecorder are assumed to be declared elsewhere in the page script
async function beginCameraStream() {
  const mediaStream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: "user" },
  });

  websocket = new WebSocket(SERVER_URL);

  websocket.onopen = () => {
    console.log("WS connected");

    const options = { mimeType: "video/mp4", videoBitsPerSecond: 1_000_000 };
    mediaRecorder = new MediaRecorder(mediaStream, options);

    mediaRecorder.ondataavailable = async (event) => {
      // to measure latency I prepend a timestamp to the actual video bytes chunk
      const timestamp = Date.now();
      const timestampBuffer = new ArrayBuffer(8);
      const dataView = new DataView(timestampBuffer);
      dataView.setBigUint64(0, BigInt(timestamp), true);
      const data = await event.data.bytes();

      const result = new Uint8Array(data.byteLength + 8);
      result.set(new Uint8Array(timestampBuffer), 0);
      result.set(data, 8);

      websocket.send(result);
    };

    mediaRecorder.start(100); // Collect 100 ms chunks
  };
}
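
A small robustness note on the MediaRecorder options above: container support varies by browser, so before constructing the recorder it can help to pick a mimeType the browser actually reports as recordable. A minimal sketch, where pickRecorderMimeType and its candidate list are hypothetical additions, not part of the original script:

// Hypothetical helper: choose the first mimeType this browser can record
function pickRecorderMimeType() {
  const candidates = ["video/mp4", "video/webm;codecs=vp8", "video/webm"];
  return candidates.find((type) => MediaRecorder.isTypeSupported(type));
}

const options = {
  mimeType: pickRecorderMimeType() ?? "", // empty string lets the browser choose its default
  videoBitsPerSecond: 1_000_000,
};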



- Server to process video chunks:

import { serve } from "bun";
import { spawn } from "child_process"; // needed for setupFFmpeg below
import { Readable } from "stream";

const V4L2LOOPBACK_DEVICE = "/dev/video10";

export const setupFFmpeg = (v4l2device) => {
  // prettier-ignore
  return spawn("ffmpeg", [
    "-i", "pipe:0",        // Read from stdin
    "-pix_fmt", "yuv420p", // Pixel format
    "-r", "30",            // Target 30 fps
    "-f", "v4l2",          // Output format
    v4l2device,            // Output to v4l2loopback device
  ]);
};

export class FfmpegStream extends Readable {
  _read() {
    // Called when the stream wants more data.
    // We push data as chunks arrive over the WebSocket.
  }
}

function main() {
  const ffmpeg = setupFFmpeg(V4L2LOOPBACK_DEVICE);
  serve({
    port: 8000,
    fetch(req, server) {
      if (server.upgrade(req)) {
        return; // Upgraded to WebSocket
      }
    },
    websocket: {
      open(ws) {
        console.log("Client connected");
        const stream = new FfmpegStream();
        stream.pipe(ffmpeg?.stdin);

        ws.data = {
          stream,
          received: 0,
        };
      },
      async message(ws, message) {
        // First 8 bytes are the client timestamp, the rest is the video chunk
        const view = new DataView(message.buffer, 0, 8);
        const ts = Number(view.getBigUint64(0, true));
        ws.data.received += message.byteLength;
        const chunk = new Uint8Array(message.buffer, 8, message.byteLength - 8);

        ws.data.stream.push(chunk);

        console.log(
          [
            `latency: ${Date.now() - ts} ms`,
            `chunk: ${message.byteLength}`,
            `total: ${ws.data.received}`,
          ].join(" | "),
        );
      },
    },
  });
}

main();



After that, I open the v4l2loopback device:

cvlc v4l2:///dev/video10

The picture is delayed by at least 1.5 sec, which is unacceptable for my project.


Thoughts:

- The problem doesn't seem to be network latency:

latency: 140 ms | chunk: 661 Bytes | total: 661 Bytes
latency: 206 ms | chunk: 16.76 KB | total: 17.41 KB
latency: 141 ms | chunk: 11.28 KB | total: 28.68 KB
latency: 141 ms | chunk: 13.05 KB | total: 41.74 KB
latency: 199 ms | chunk: 11.39 KB | total: 53.13 KB
latency: 141 ms | chunk: 16.94 KB | total: 70.07 KB
latency: 139 ms | chunk: 12.67 KB | total: 82.74 KB
latency: 142 ms | chunk: 13.14 KB | total: 95.88 KB



150 ms is actually too much for 15 KB on a LAN, but there may be some issue with my router.


- As far as I can tell, it isn't tied to ffmpeg throughput either:




Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'pipe:0':
 Metadata:
 major_brand : iso5
 minor_version : 1
 compatible_brands: isomiso5hlsf
 creation_time : 2025-03-09T17:16:49.000000Z
  Duration: 00:00:01.38, start: 0.000000, bitrate: N/A
 Stream #0:0(und): Video: h264 (Baseline) (avc1 / 0x31637661), yuvj420p(pc), 1280x720, 4012 kb/s, 57.14 fps, 29.83 tbr, 600 tbn, 1200 tbc (default)
 Metadata:
 rotate : 90
 creation_time : 2025-03-09T17:16:49.000000Z
 handler_name : Core Media Video
 Side data:
 displaymatrix: rotation of -90.00 degrees

Stream mapping:
 Stream #0:0 -> #0:0 (h264 (native) -> rawvideo (native))

[swscaler @ 0x55d8d0b83100] deprecated pixel format used, make sure you did set range correctly

Output #0, video4linux2,v4l2, to '/dev/video10':
 Metadata:
 major_brand : iso5
 minor_version : 1
 compatible_brands: isomiso5hlsf
 encoder : Lavf58.45.100

Stream #0:0(und): Video: rawvideo (I420 / 0x30323449), yuv420p, 720x1280, q=2-31, 663552 kb/s, 60 fps, 60 tbn, 60 tbc (default)
 Metadata:
 encoder : Lavc58.91.100 rawvideo
 creation_time : 2025-03-09T17:16:49.000000Z
 handler_name : Core Media Video
 Side data:
 displaymatrix: rotation of -0.00 degrees

frame= 99 fps=0.0 q=-0.0 size=N/A time=00:00:01.65 bitrate=N/A dup=50 drop=0 speed=2.77x
frame= 137 fps=114 q=-0.0 size=N/A time=00:00:02.28 bitrate=N/A dup=69 drop=0 speed=1.89x
frame= 173 fps= 98 q=-0.0 size=N/A time=00:00:02.88 bitrate=N/A dup=87 drop=0 speed=1.63x
frame= 210 fps= 86 q=-0.0 size=N/A time=00:00:03.50 bitrate=N/A dup=105 drop=0 speed=1.44x
frame= 249 fps= 81 q=-0.0 size=N/A time=00:00:04.15 bitrate=N/A dup=125 drop=0 speed=1.36
frame= 279 fps= 78 q=-0.0 size=N/A time=00:00:04.65 bitrate=N/A dup=139 drop=0 speed=1.31x



- I also tried writing the video stream directly to a video.mp4 file and immediately opening it with vlc, but it can only be opened successfully after about 1.5 sec.

- I've tried using the OBS v4l2 input source instead of vlc, but the latency is the same.

Update №1

When I try to stream an actual .mp4 file to ffmpeg, it works almost immediately, with a 0.2 sec delay to spin up ffmpeg itself:

cat video.mp4 | ffmpeg -re -i pipe:0 -pix_fmt yuv420p -f v4l2 /dev/video10 & ; sleep 0.2 && cvlc v4l2:///dev/video10

So the problem is apparently in the streaming process.
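
The 1.5 sec delay is in the same range as ffmpeg's default input probing and buffering, so one thing worth trying is telling ffmpeg to start decoding the fragmented MP4 stream with as little up-front analysis as possible. A minimal sketch of the spawn arguments for setupFFmpeg above, with low-latency input flags (the specific values are assumptions to experiment with, not tested settings):

export const setupFFmpeg = (v4l2device) => {
  return spawn("ffmpeg", [
    "-probesize", "32",      // probe as little of the input as allowed
    "-analyzeduration", "0", // skip the up-front stream analysis window
    "-fflags", "nobuffer",   // disable optional demuxer input buffering
    "-flags", "low_delay",   // ask the decoder to minimize its own delay
    "-i", "pipe:0",          // read the fragmented MP4 from stdin
    "-pix_fmt", "yuv420p",   // pixel format
    "-f", "v4l2",            // output format
    v4l2device,              // output to the v4l2loopback device
  ]);
};

The player adds its own caching on top of whatever ffmpeg introduces, so it is also worth comparing cvlc against a lower-buffering reader of /dev/video10 before attributing the whole delay to the streaming pipeline.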


-
flutter_ffmpeg is discontinued with no alternatives?
13 March, by Rageh Azzazy

Based on this article by Taner Sener
https://tanersener.medium.com/saying-goodbye-to-ffmpegkit-33ae939767e1

and the discontinuation of the flutter_ffmpeg and ffmpeg_kit_flutter packages: most packages for executing video editing commands were built on top of them, so they won't work after April 1, 2025, as mentioned in the package documentation and Taner's article.


Examples of packages depending on flutter_ffmpeg or ffmpeg_kit_flutter:

- video_trimmer
- zero_video_trimmer
- flutter_video_trimmer
- video_trim
- bemeli_compress
- video_trimmer_pro
- ... others

Editing video using video_editor, video_editor_2 or video_editor_pits has also become a problem.


I believe downloading the binaries and doing the whole thing locally is not just tedious but illegal as well.

So I broke down exactly what I need to edit videos in my Flutter app, and I found these packages (not yet tested):

Trimming: flutter_native_video_trimmer

Compression: video_compress or video_compress_plus

Muting: video_compress handles this

Encoding: not sure, I just need simple MP4 video files

Cropping: I can't find a solution for video cropping

I don't need to rotate, reverse or do any fancy stuff to the videos, just those five functions.

So now the only remaining piece of this approach is to find a simple video cropping function and an encoder that work with Flutter on iOS & Android.

So my question is not asking for an external library recommendation, but: do you have any ideas how to crop a video in Flutter natively?


-
ffmpeg throws error "Invalid file index 1 in filtergraph description" when used with concat and -/filter_complex [closed]
20 February, by Ernst V

I want to create mp4 videos from many small .jpg files (time-lapse). Each image is shown for 2 seconds and then fades to the next. This works fine when I issue the command with a few images (see below). But when the number of images reaches hundreds or thousands, I can no longer put them into a single command and have to put the filenames and the complex filter into .txt files.

Whatever I try, I always get the error "Invalid file index 1 in filtergraph description".

This works fine:


ffmpeg -loop 1 -i "image1.jpg" -loop 1 -i "image2.jpg" -loop 1 -i "image3.jpg" -loop 1 -i "image4.jpg" -loop 1 -i "image5.jpg" -filter_complex "[0:v]trim=0:2,setpts=PTS-STARTPTS[v0]; [1:v]trim=0:2,setpts=PTS-STARTPTS[v1]; [2:v]trim=0:2,setpts=PTS-STARTPTS[v2]; [3:v]trim=0:2,setpts=PTS-STARTPTS[v3]; [4:v]trim=0:2,setpts=PTS-STARTPTS[v4]; [v0][v1] xfade=transition=fade:duration=1:offset=1 [xf1]; [xf1][v2] xfade=transition=fade:duration=1:offset=2 [xf2]; [xf2][v3] xfade=transition=fade:duration=1:offset=3 [xf3]; [xf3][v4] xfade=transition=fade:duration=1:offset=4 [xf4]" -map "[xf4]" -c:v libx264 -crf 18 -pix_fmt yuv420p output.mp4



But when I put the input files and the filter in txt files, it fails with the error above:


ffmpeg -stream_loop 1 -f concat -safe 0 -i concat_list.txt -/filter_complex filter_script.txt -map "[xf4]" -c:v libx264 -crf 18 -pix_fmt yuv420p output.mp4



This is the concat_list.txt:

file 'image1.jpg'
file 'image2.jpg'
file 'image3.jpg'
file 'image4.jpg'
file 'image5.jpg'



This is the filter_script.txt:

[0:v]trim=0:2,setpts=PTS-STARTPTS[v0];[1:v]trim=0:2,setpts=PTS-STARTPTS[v1];[2:v]trim=0:2,setpts=PTS-STARTPTS[v2];[3:v]trim=0:2,setpts=PTS-STARTPTS[v3];[4:v]trim=0:2,setpts=PTS-STARTPTS[v4];[v0][v1] xfade=transition=fade:duration=1:offset=1 [xf1];[xf1][v2] xfade=transition=fade:duration=1:offset=2 [xf2];[xf2][v3] xfade=transition=fade:duration=1:offset=3 [xf3];[xf3][v4] xfade=transition=fade:duration=1:offset=4 [xf4]



I tried adding/removing line breaks, semicolons, etc. Nothing helped.
Any idea what is going wrong here? Thank you very much!


My ffmpeg version is the most recent one:


ffmpeg version 2025-02-17-git-b92577405b-full_build-www.gyan.dev
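
For hundreds or thousands of images, the two text files would normally be generated by a script rather than written by hand. A minimal JavaScript sketch that reproduces the files shown above for N images, assuming the image1.jpg, image2.jpg, ... naming pattern from the question (it only generates the files and does not address the filtergraph error itself):

// generate-inputs.js: writes concat_list.txt and filter_script.txt for N images
import { writeFileSync } from "node:fs";

const N = 5; // e.g. hundreds or thousands in the real use case

// One "file 'imageX.jpg'" line per image for the concat demuxer
const concatList = Array.from({ length: N }, (_, i) => `file 'image${i + 1}.jpg'`);

// Trim each input to 2 seconds and reset its timestamps
const trims = Array.from(
  { length: N },
  (_, i) => `[${i}:v]trim=0:2,setpts=PTS-STARTPTS[v${i}]`,
);

// Chain xfade transitions: [v0][v1] -> [xf1], [xf1][v2] -> [xf2], ...
const fades = [];
let prev = "v0";
for (let i = 1; i < N; i++) {
  fades.push(`[${prev}][v${i}] xfade=transition=fade:duration=1:offset=${i} [xf${i}]`);
  prev = `xf${i}`;
}

writeFileSync("concat_list.txt", concatList.join("\n") + "\n");
writeFileSync("filter_script.txt", [...trims, ...fades].join(";"));

With N = 5 this reproduces the concat_list.txt and filter_script.txt listed above, and the final label is [xf4], matching the -map "[xf4]" in the command.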