
Media (91)

Other articles (97)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
    Once it is activated, MediaSPIP init automatically puts a preconfiguration in place so that the new feature is immediately operational. No separate configuration step is therefore required.

  • List of compatible distributions

    26 April 2011

    The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.

    Distribution   Version name           Version number
    Debian         Squeeze                6.x.x
    Debian         Wheezy                 7.x.x
    Debian         Jessie                 8.x.x
    Ubuntu         The Precise Pangolin   12.04 LTS
    Ubuntu         The Trusty Tahr        14.04

    If you want to help us improve this list, you can provide us access to a machine whose distribution is not mentioned above or send the necessary fixes to add (...)

On other sites (13335)

  • How to broadcast a video stream without reloading the page?

    16 November 2024, by promo 69

    I created a Node.js server to:

      1. receive images over UDP and assemble them into a video, and
      2. display that video on a website.

    I tried, but I don't understand how to broadcast the video live without having to reload the page.

    Node.js server code:

    


    const express = require('express');
const dgram = require('dgram');
const fs = require('fs');
const ffmpeg = require('fluent-ffmpeg');
const path = require('path');
const WebSocket = require('ws');

const app = express();
const httpPort = 3000;

const imageDir = path.join(__dirname, 'images');
if (!fs.existsSync(imageDir)) {
    fs.mkdirSync(imageDir);
}

let imageCount = 0;

const udpPort = 15002;
const udpHost = '127.0.0.1';
const server = dgram.createSocket('udp4');

const wss = new WebSocket.Server({ noServer: true });


const createVideo = () => {
    const outputVideo = path.join(__dirname, 'output_video.mp4');

    ffmpeg()
        .input(path.join(imageDir, '%d.jpg'))
        .inputOptions('-framerate 30')
        .output(outputVideo)
        .outputOptions('-c:v libx264')
        .on('end', () => {
            console.log('Video created successfully!');

            wss.clients.forEach(client => {
                if (client.readyState === WebSocket.OPEN) {
                    client.send('new-video');
                }
            });
        })
        .on('error', (err) => {
            console.log('Error while creating the video:', err);
        })
        .run();
};


app.get('/feed-video', (req, res) => {
    const videoPath = path.join(__dirname, 'output_video.mp4');
    res.sendFile(videoPath);
});

server.on('message', (msg, rinfo) => {
    console.log(`Received message from ${rinfo.address}:${rinfo.port}`);

    const imageFilePath = path.join(imageDir, `${imageCount}.jpg`);
    fs.writeFileSync(imageFilePath, msg);

    console.log(`Image ${imageCount}.jpg received and saved`);


    imageCount++;


    if (imageCount > 100) {
        createVideo();
        imageCount = 0;
    }
});


server.on('listening', () => {
    const address = server.address();
    console.log(`UDP server listening on ${address.address}:${address.port}`);
});


app.server = app.listen(httpPort, () => {
    console.log(`HTTP and WebSocket server started on http://localhost:${httpPort}`);
});

app.server.on('upgrade', (request, socket, head) => {
    wss.handleUpgrade(request, socket, head, (ws) => {
        wss.emit('connection', ws, request);
    });
});


server.bind(udpPort, udpHost);



    


    The HTML page:

    


    <!DOCTYPE html>
    <html>
    <body>
        <h1>Drone Video Feed</h1>
        <video id="video" controls autoplay></video>

        <script>
            const video = document.getElementById('video');
            const ws = new WebSocket('ws://localhost:3000');

            ws.onmessage = (event) => {
                const blob = new Blob([event.data], { type: 'video/mp4' });
                video.src = URL.createObjectURL(blob);
                video.play();
            };
        </script>
    </body>
    </html>

    I tried with WebSocket but didn't succeed. The video is created correctly, and when I reload the page the new video is played by the player.

    However, I would like to see the live stream without having to reload the page every time.
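    For what it's worth, here is a sketch of one way to get the player to pick up each new video without a full page reload. It assumes the server keeps sending the `'new-video'` text message exactly as in the code above: treat that message as a notification and point the `<video>` element back at `/feed-video` with a cache-busting query, rather than wrapping the text message itself in a Blob. (The query parameter name `t` is an arbitrary choice here.)

```javascript
// Sketch, not a tested answer: reload the <video> source when the server
// announces that a fresh output_video.mp4 is ready.

// Pure helper: returns the new src for a 'new-video' notification,
// or null for any other message. Kept separate so it can be tested
// outside the browser.
function buildVideoSrc(message, timestamp) {
  return message === 'new-video' ? '/feed-video?t=' + timestamp : null;
}

// Browser-only wiring (guarded so the helper above also runs under Node):
if (typeof document !== 'undefined') {
  const video = document.getElementById('video');
  const ws = new WebSocket('ws://localhost:3000');

  ws.onmessage = (event) => {
    const src = buildVideoSrc(event.data, Date.now());
    if (src) {
      video.src = src;   // cache-busted re-fetch of the freshly encoded MP4
      video.play();
    }
  };
}
```

    This still swaps whole files rather than giving true frame-by-frame live video; for that, something like HLS or MSE would be needed, but it does remove the manual page reload.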


  • ffmpeg binary output sent to the front end over WebSocket cannot be played [closed]

    17 November 2024, by KIMEOOK

    Server-side push (Node.js), using a ws (WebSocket) service to pass the stream to the front end:

    &#xA;

      ffmpegs = spawn('ffmpeg', [
        '-f', 'gdigrab',        // Windows screen-capture input format
        '-framerate', '60',     // capture frame rate
        '-i', 'desktop',        // capture the desktop (i.e. the screen)
        '-c:v', 'vp8',          // video codec
        '-f', 'webm',           // output container: WebM
        '-pix_fmt', 'yuv420p',
        'pipe:1',               // write to stdout (pipe)
      ]);


    Front-end rendering


      let videoElement = document.getElementById('screenVideo');

      let mediaSource = new MediaSource();
      videoElement.src = URL.createObjectURL(mediaSource);

      mediaSource.addEventListener('sourceopen', () => {

        let sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');

        let ws = new WebSocket(`ws://${ip}:3000/?device=${encodeURIComponent(selectedDevice)}`);

        ws.onmessage = (event) => {
          const data = new Uint8Array(event.data);
          if (!sourceBuffer.updating) {
            try {
              sourceBuffer.appendBuffer(data);
              console.log('ok');
            } catch (error) {
              console.error('Error appending buffer:', error);
            }
          } else {
            console.log('SourceBuffer is busy');
          }
        };

        ws.onerror = (error) => {
          console.error('WebSocket error:', error);
        };

        ws.onclose = () => {
          console.log('WebSocket connection closed');
        };

        if (mediaSource.readyState === 'open') {
          videoElement.play().catch(err => {
            console.error('Error attempting to play the video:', err);
          });
        }
      });


    The player just keeps spinning and the video never plays normally.


    It cannot be played normally. What's the problem?
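    One common cause of an MSE player that never starts (offered as a guess, not a verified diagnosis): in the front-end code above, chunks that arrive while `sourceBuffer.updating` is true are simply dropped, which leaves holes in the WebM byte stream and can stall playback indefinitely. The usual fix is to queue every incoming chunk and flush the queue on `'updateend'`. A sketch, with the queue kept as plain JS so it can be tested outside the browser:

```javascript
// Sketch: FIFO queue so no WebSocket chunk is ever dropped while the
// SourceBuffer is busy appending the previous one.
class ChunkQueue {
  constructor() { this.items = []; }
  push(chunk) { this.items.push(chunk); }
  shift() { return this.items.shift(); }
  get length() { return this.items.length; }
}

// Browser-only wiring (guarded so the class above also runs under Node;
// sourceBuffer and ws refer to the variables in the code above):
if (typeof window !== 'undefined') {
  const queue = new ChunkQueue();

  function flush(sourceBuffer) {
    // Append the next queued chunk only when the buffer is idle.
    if (!sourceBuffer.updating && queue.length > 0) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  }

  // In ws.onmessage:  queue.push(new Uint8Array(event.data)); flush(sourceBuffer);
  // And once, after creating the buffer:
  //   sourceBuffer.addEventListener('updateend', () => flush(sourceBuffer));
}
```

    Even with the queue, the very first appended chunk must contain the WebM initialization segment, so a client that connects mid-stream may still need the server to resend the stream header.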


  • GStreamer video latency increases with decreased FPS

    19 November 2024, by Ri Di

    I am using a Raspberry Pi 5 to stream the video:


    rpicam-vid -t 0 --camera 0 --nopreview --mode 2304:1296:10:P --codec yuv420 --width 640 --height 360 --framerate 10 --rotation 0 --autofocus-mode manual --inline --listen -o - | ffmpeg -f rawvideo -pix_fmt yuv420p -s:v 640x360 -r 10 -i /dev/stdin -c:v libx264 -preset ultrafast -tune zerolatency -maxrate 300k -bufsize 50k -g 30000 -f mpegts tcp://192.168.0.147:1234

    View it with:

    gst-launch-1.0 -v tcpserversrc host=0.0.0.0 port=1234 ! queue ! tsdemux ! h264parse ! avdec_h264 ! videorate ! video/x-raw,framerate=10/1 ! videoconvert ! autovideosink sync=false

    The problem is that at 10 FPS I get around 2 s of latency, while 56 or 120 FPS results in under 300 ms of latency.

    Is the problem on the sender side or the reader side? Or both?

    I am not planning to use 10 FPS; it's only to demonstrate the problem. But I would like to get lower latency at 56 FPS, just like at 120 FPS (around 80-100 ms difference), or maybe even better, as latency seems to drop with higher FPS.

    Maybe there is some kind of buffering parameter that holds frames?

    (Of course, when testing with a higher FPS I change both numbers in the sender command and the one in the reader command. The camera is the official RPi v3 camera.)

    I'd also like to mention that the same thing happens with ffplay:

    ffplay -probesize 3000 -i tcp://0.0.0.0:1234/?listen

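    A guess at the cause: both ffplay and GStreamer's demuxers buffer a fixed amount of *data* to probe the stream before playback starts, and at a lower frame rate the same number of bytes spans more wall-clock time, which would explain why latency grows as FPS drops. On the ffplay side, shrinking the probe window and disabling input buffering usually cuts this startup latency. The flags below are standard FFmpeg options; whether the same idea fully fixes the GStreamer pipeline is untested:

```shell
# Reduced-latency ffplay invocation (sketch): probe less data and
# disable demuxer buffering before playback starts.
ffplay -fflags nobuffer -flags low_delay \
       -probesize 32 -analyzeduration 0 \
       -i 'tcp://0.0.0.0:1234/?listen'
```

    On the GStreamer side, the analogous experiment would be removing or shrinking the `queue` element's buffering, since `sync=false` alone does not affect how much the demuxer buffers up front.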