
Media (1)
-
GetID3 - File information block
9 April 2013, by
Updated: May 2013
Language: French
Type: Image
Other articles (63)
-
Sites built with MediaSPIP
2 May 2011, by
This page presents some of the sites running MediaSPIP. You can of course add yours using the form at the bottom of the page.
-
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer flash player is used.
The HTML5 player was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
These technologies make it possible to deliver video and sound both to conventional computers (...)
-
HTML5 audio and video support
13 April 2011, by
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (8134)
-
How to broadcast a video stream without reloading the page?
16 November 2024, by promo 69
I created a Node.js server to:

- receive images over UDP and transform them into a video, and
- display the video on a website.

I tried, but I don't understand how to broadcast the video live without having to reload the page.
Node.js server code:


const express = require('express');
const dgram = require('dgram');
const fs = require('fs');
const ffmpeg = require('fluent-ffmpeg');
const path = require('path');
const WebSocket = require('ws');

const app = express();
const httpPort = 3000;

// Directory where incoming UDP frames are stored as numbered JPEGs
const imageDir = path.join(__dirname, 'images');
if (!fs.existsSync(imageDir)) {
  fs.mkdirSync(imageDir);
}

let imageCount = 0;

const udpPort = 15002;
const udpHost = '127.0.0.1';
const server = dgram.createSocket('udp4');

const wss = new WebSocket.Server({ noServer: true });

// Encode the accumulated JPEGs into an MP4, then notify connected clients
const createVideo = () => {
  const outputVideo = path.join(__dirname, 'output_video.mp4');

  ffmpeg()
    .input(path.join(imageDir, '%d.jpg'))
    .inputOptions('-framerate 30')
    .output(outputVideo)
    .outputOptions('-c:v libx264')
    .on('end', () => {
      console.log('Video created successfully!');

      wss.clients.forEach(client => {
        if (client.readyState === WebSocket.OPEN) {
          client.send('new-video');
        }
      });
    })
    .on('error', (err) => {
      console.log('Error while creating the video:', err);
    })
    .run();
};

// Serve the latest rendered video file
app.get('/feed-video', (req, res) => {
  const videoPath = path.join(__dirname, 'output_video.mp4');
  res.sendFile(videoPath);
});

// Each UDP datagram is one JPEG frame
server.on('message', (msg, rinfo) => {
  console.log(`Message received from ${rinfo.address}:${rinfo.port}`);

  const imageFilePath = path.join(imageDir, `${imageCount}.jpg`);
  fs.writeFileSync(imageFilePath, msg);

  console.log(`Image ${imageCount}.jpg received and saved`);

  imageCount++;

  // Re-encode after ~100 frames and restart numbering (old frames are overwritten)
  if (imageCount > 100) {
    createVideo();
    imageCount = 0;
  }
});

server.on('listening', () => {
  const address = server.address();
  console.log(`UDP server listening on ${address.address}:${address.port}`);
});

app.server = app.listen(httpPort, () => {
  console.log(`HTTP and WebSocket server started on http://localhost:${httpPort}`);
});

// Upgrade HTTP connections to WebSocket on the same port
app.server.on('upgrade', (request, socket, head) => {
  wss.handleUpgrade(request, socket, head, (ws) => {
    wss.emit('connection', ws, request);
  });
});

server.bind(udpPort, udpHost);
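
For local testing, a hypothetical helper (the file name test.jpg is an assumption; port and host match the server above) can push a single frame to the UDP ingest:

// send_test.js - push one JPEG frame to the UDP ingest (hypothetical helper)
const dgram = require('dgram');
const fs = require('fs');

const sock = dgram.createSocket('udp4');
const img = fs.readFileSync('test.jpg'); // must fit in one datagram (< ~64 KB)

sock.send(img, 15002, '127.0.0.1', (err) => {
  if (err) console.error('Send failed:', err);
  sock.close();
});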




The HTML page:





 
 
 
 


<!DOCTYPE html>
<html>
<body>
  <h1>Drone Video Feed</h1>
  <video id="video" controls autoplay></video>

  <script>
    const video = document.getElementById('video');
    const ws = new WebSocket('ws://localhost:3000');

    ws.onmessage = (event) => {
      // Treats every incoming message as MP4 bytes, although the server
      // above sends the plain string 'new-video' as a notification
      const blob = new Blob([event.data], { type: 'video/mp4' });
      video.src = URL.createObjectURL(blob);
      video.play();
    };
  </script>
</body>
</html>






I tried with WebSocket but didn't succeed.
The video is created correctly, and when I reload the page the new video is played by the player.


However, I would like to be able to see the live stream without having to reload my page all the time.
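
A minimal sketch of one way to get this, assuming the 'new-video' WebSocket notification from the server code above: instead of treating the message payload as video bytes, use it purely as a signal to point the player at a fresh copy of /feed-video.

const video = document.getElementById('video');
const ws = new WebSocket('ws://localhost:3000');

ws.onmessage = (event) => {
  if (event.data === 'new-video') {
    // A cache-busting query string forces the browser to re-fetch the file
    video.src = '/feed-video?t=' + Date.now();
    video.play();
  }
};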


-
ffmpeg outputs a binary stream; the front end accepts it over WebSocket but cannot play it [closed]
17 November 2024, by KIMEOOK
Server-side push (Node.js)


A ws (WebSocket) service is used to pass the stream to the front end:


ffmpegs = spawn('ffmpeg', [
  '-f', 'gdigrab',      // Windows input format for screen capture
  '-framerate', '60',   // capture frame rate
  '-i', 'desktop',      // capture the desktop (i.e. the screen)
  '-c:v', 'vp8',        // video codec
  '-f', 'webm',         // output container format: WebM
  '-pix_fmt', 'yuv420p',
  'pipe:1',             // write output to stdout
]);
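
The forwarding code itself is not shown in the question; a minimal sketch of what it presumably looks like, assuming a WebSocket.Server instance named wss from the ws package in the same process:

// Fan each chunk of ffmpeg's stdout out to every connected client
ffmpegs.stdout.on('data', (chunk) => {
  wss.clients.forEach((client) => {
    if (client.readyState === WebSocket.OPEN) {
      client.send(chunk);
    }
  });
});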







Front-end rendering



let videoElement = document.getElementById('screenVideo');

let mediaSource = new MediaSource();
videoElement.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {

  let sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');

  let ws = new WebSocket(`ws://${ip}:3000/?device=${encodeURIComponent(selectedDevice)}`);

  ws.onmessage = (event) => {
    const data = new Uint8Array(event.data);
    if (!sourceBuffer.updating) {
      try {
        sourceBuffer.appendBuffer(data);
        console.log('ok');
      } catch (error) {
        console.error('Error appending buffer:', error);
      }
    } else {
      // The chunk is dropped while the SourceBuffer is busy
      console.log('SourceBuffer is busy');
    }
  };

  ws.onerror = (error) => {
    console.error('WebSocket error:', error);
  };

  ws.onclose = () => {
    console.log('WebSocket connection closed');
  };

  if (mediaSource.readyState === 'open') {
    videoElement.play().catch(err => {
      console.error('Error attempting to play the video:', err);
    });
  }
});



The video keeps showing a loading spinner and cannot be played normally.




Unable to play normally. What's the problem?
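
Two common causes fit these symptoms. First, without ws.binaryType = 'arraybuffer' the browser delivers each binary message as a Blob, so new Uint8Array(event.data) produces an empty array rather than the WebM bytes. Second, chunks that arrive while the SourceBuffer is updating are dropped instead of queued, which corrupts the stream. A defensive sketch of the consumer, assuming the WebM that ffmpeg produces is itself MSE-compatible (ip and selectedDevice as in the code above):

const videoElement = document.getElementById('screenVideo');
const mediaSource = new MediaSource();
videoElement.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  const sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
  const queue = [];

  // Append the next queued chunk as soon as the SourceBuffer is idle
  const flush = () => {
    if (!sourceBuffer.updating && queue.length > 0) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  };
  sourceBuffer.addEventListener('updateend', flush);

  const ws = new WebSocket(`ws://${ip}:3000/?device=${encodeURIComponent(selectedDevice)}`);
  ws.binaryType = 'arraybuffer'; // deliver binary frames as ArrayBuffer, not Blob

  ws.onmessage = (event) => {
    queue.push(new Uint8Array(event.data));
    flush();
  };

  videoElement.play().catch(err => {
    console.error('Error attempting to play the video:', err);
  });
});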


-
GStreamer video latency increases with decreased FPS
19 November 2024, by Ri Di
I am using a Raspberry Pi 5 to stream the video:


rpicam-vid -t 0 --camera 0 --nopreview --mode 2304:1296:10:P --codec yuv420 --width 640 --height 360 --framerate 10 --rotation 0 --autofocus-mode manual --inline --listen -o - | ffmpeg -f rawvideo -pix_fmt yuv420p -s:v 640x360 -r 10 -i /dev/stdin -c:v libx264 -preset ultrafast -tune zerolatency -maxrate 300k -bufsize 50k -g 30000 -f mpegts tcp://192.168.0.147:1234



View it with:


gst-launch-1.0 -v tcpserversrc host=0.0.0.0 port=1234 ! queue ! tsdemux ! h264parse ! avdec_h264 ! videorate ! video/x-raw,framerate=10/1 ! videoconvert ! autovideosink sync=false



The problem is that with 10 FPS I get around 2 s of latency, while 56 or 120 FPS results in below 300 ms latency.


Is the problem on the sender side or the reader side? Or both?


I am not planning to use 10 FPS; it is only to demonstrate the problem. But I would like to get lower latency at 56 FPS, just like at 120 FPS (around an 80-100 ms difference), or maybe even better, as latency seems to decrease with higher FPS.


Maybe there is some kind of buffering parameter which holds frames?


(Of course, when testing with higher FPS I change both numbers in the sender command and the one in the reader command. The camera is the official Raspberry Pi Camera Module 3.)


I'd also like to mention that the same thing happens with ffplay:


ffplay -probesize 3000 -i tcp://0.0.0.0:1234/?listen
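
One plausible explanation is buffering that is sized in frames rather than milliseconds: the demuxer and decoder want to see a fixed number of frames before playback starts, and at 10 FPS those frames take ten times longer to arrive than at 100 FPS. On the ffplay side, the standard low-latency flags make this easy to test (a sketch to rule out receiver-side probing, not a guaranteed fix):

ffplay -probesize 32 -analyzeduration 0 -fflags nobuffer -flags low_delay -framedrop -i "tcp://0.0.0.0:1234?listen"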