Advanced search

Media (0)

Keyword: - Tags -/optimisation

No media matching your criteria is available on this site.

Other articles (47)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Installation in farm mode

    4 February 2011

    Farm mode lets you host several MediaSPIP sites while installing the functional core only once.
    It is the method we use on this very platform.
    Using farm mode requires some familiarity with how SPIP works, unlike the standalone version, which does not really require any specific knowledge since SPIP's usual private area is no longer used.
    First of all, you must have installed the same files as the installation (...)

  • Automatic backup of SPIP channels

    1 April 2010

    When setting up an open platform, it is important for hosts to have fairly regular backups available in order to deal with any problem that might arise.
    This task relies on two SPIP plugins: Saveauto, which performs a regular backup of the database as a MySQL dump (usable in phpMyAdmin), and mes_fichiers_2, which builds a zip archive of the site's important data (documents, elements (...)

On other sites (8080)

  • Node.js, stream pipe output data to client with socket io-stream

    22 May 2018, by Empha

    Sorry for a repeating topic, but I've searched and experimented for two days now and I haven't been able to solve the problem.

    I am trying to live-stream pictures to a client every second via socket.io-stream, using the following code:

     var spawn = require("child_process").spawn;

     // avconv (= ffmpeg) arguments: grab the webcam, write one frame per second
     // to an image file and, via image2pipe, to stdout as well
     var args = [
        "-i", "/dev/video0",
        "-s", "1280x720",
        "-qscale", "1",          // quoted: spawn() argument arrays should contain strings
        "-vf", "fps=1",
        config.imagePath,
        "-s", config.imageStream.resolution[0],
        "-f", "image2pipe",
        "-qscale", "1",
        "-vf", "fps=1",
        "pipe:1"
     ];
     camera = spawn("avconv", args);    // avconv = ffmpeg

    The settings are good, and the process writes to stdout successfully. I capture all outgoing image data using this simplified code:

     var ss = require("socket.io-stream");

     // a new socket.io-stream stream is opened for every stdout "data" event,
     // but "data" is only a Buffer chunk, not necessarily a complete image
     camera.stdout.on("data", function(data) {
        var stream = ss.createStream();
        ss(socket).emit("img", stream, "newImg");
        // how do I write the data object to the stream?
        // fs.createReadStream(imagePath).pipe(stream);
     });

    "socket" comes from the client using the socket.io package, no problem there. So what I am doing is listening on the stdout pipe for the "data" event, and that data gets passed to the function above. That means that at this stage "data" is not a stream but a Buffer object, so I cannot stream it the way I previously could with the commented-out createReadStream call, where I read the image from disk. How do I stream the data (a Buffer at this stage) to the client? Can I do this differently, perhaps not using socket.io-stream? "data" is just one part of the whole image, so perhaps two or three "data" objects need to be put together to form the complete image.

    I tried using stream.write(data, "binary"), which did transfer the Buffer objects; the problem is that there is no end-of-stream event, so I do not know when an image is complete. I tried registering for "close", "end" and "finish" on stdout, but nothing triggers. Am I missing something? Am I making it overly complex? The reasoning behind my implementation is that I need a new stream for each complete image, is that right? (One possible approach to the framing problem is sketched at the end of this question.)

    Thanks a lot!
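
    A minimal sketch of one way to handle the framing question above, assuming the image2pipe output consists of JPEG frames (e.g. mjpeg): buffer the incoming chunks and split them on the JPEG end-of-image marker 0xFF 0xD9, then send one socket.io-stream stream per complete frame. The socket and camera variables are the ones from the snippets above; this is only an illustration of the idea, not a tested solution.

     var ss = require("socket.io-stream");

     var pending = Buffer.alloc(0);
     var EOI = Buffer.from([0xff, 0xd9]);   // JPEG end-of-image marker

     camera.stdout.on("data", function (chunk) {
        pending = Buffer.concat([pending, chunk]);
        var idx = pending.indexOf(EOI);
        while (idx !== -1) {
           var frame = pending.slice(0, idx + 2);   // one complete JPEG image
           pending = pending.slice(idx + 2);
           var stream = ss.createStream();
           ss(socket).emit("img", stream, "newImg");
           stream.end(frame);                       // write the frame and close this stream
           idx = pending.indexOf(EOI);
        }
     });

    If per-image streams turn out not to be necessary, an even simpler alternative is to open a single stream and pipe camera.stdout into it (camera.stdout.pipe(stream)), since stdout is itself a Readable stream.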

  • FFMPEG RTSP client sending unsolicited error response

    22 May 2018, by Jim Rhodes

    I have a server that pulls live video streams from IP cameras and makes those streams available to clients using RTSP. If I view one of the streams from a PC using ffplay.exe, the stream displays properly and I can pause and resume the stream without any issues.

    I have an iOS app originally written by someone else that uses ffmpeg as the RTSP client and I am seeing odd behavior when I try to pause a stream. ffmpeg is version 3.4 and is included in the iOS project as static libraries. I do not know what options were used to build the ffmpeg libraries.

    The problem is that after the iOS client sends a PAUSE command via av_read_pause(AVFormatContext*) and the server responds with "RTSP/1.0 200 OK", the iOS app sends "RTSP/1.0 501 Not Implemented" back to the server. An RTSP client should never be sending an RTSP response. I have to assume that this is a bug in ffmpeg. Are there known issues with ffmpeg's handling of PAUSE? Should I be using av_read_pause() to pause the stream?

  • Auto delete .ts and .m3u8 files once client receives all .ts files

    11 March 2019, by Abhishek Mehandiratta

    So I created an Express server that gets an mp3 file (which is stored locally right now, but will be taken from MongoDB later) and uses ffmpeg to make .m3u8 and .ts files. The files are successfully sent to the client and there are no errors while playing them on the client; I used hls.js to play these files in Chrome. But the server still has those files stored locally. Is there any way the server can know when to delete the files it stored locally? There are a lot of files generated by ffmpeg, so I can't just let them stay there forever (one possible cleanup approach is sketched after the server code below).

    I used the ffmpeg part of the code from the hls-server GitHub repo.

    my server file

    index.js

     // fluent-ffmpeg command, just used to run ffmpeg for the HLS conversion
     var ffmpeg = require('fluent-ffmpeg');

     var command = ffmpeg('inp.mp3')
      .on('start', function (commandLine) {
        console.log('command', commandLine);
      }).addOptions([
        '-c:a aac',
        '-b:a 64k',
        '-vn',
        '-hls_list_size 0',
        '-segment_time 10',
      ]).output('files\\output.m3u8');

    var express = require('express');
    var app = express();
    // express middleware to serve individual .ts and .m3u8 files when requested
    app.use(express.static('./files/'));

     app.get('/', function (req, res) {
       command.on('end', function () {
         console.log('done');
         res.write(`
           <script src="https://cdn.jsdelivr.net/hls.js/latest/hls.min.js"></script>
           <script>
             function onLevelLoaded (event, data) {
               var level_duration = data.details.totalduration;
               console.log(level_duration, data);
             }
             if (Hls.isSupported()) {
               var audio = new Audio();
               var hls = new Hls();
               // requesting files from here
               hls.loadSource('http://localhost:8000/output.m3u8');
               hls.attachMedia(audio);
               hls.on(Hls.Events.LEVEL_LOADED, onLevelLoaded);
               hls.on(Hls.Events.FRAG_BUFFERED, (e, d) => {
                 console.log(e, d);
               });
             }
           </script>
         `);
         res.end();
       }).run();
     });

     app.listen(8000);
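
    One possible answer to the cleanup question, sketched under the assumption that the client can tell the server when it is done (for example from an "ended" handler on the audio element): expose a cleanup route that removes the generated playlist and segments from the files directory. The /done route name is made up for illustration; only the files directory matches the code above, and this is a sketch rather than a tested solution.

     var fs = require('fs');
     var path = require('path');

     // hypothetical cleanup endpoint: the client calls this once playback has
     // finished, and the server deletes every generated .ts/.m3u8 file
     app.get('/done', function (req, res) {
       var dir = path.join(__dirname, 'files');
       fs.readdir(dir, function (err, names) {
         if (err) return res.sendStatus(500);
         names
           .filter(function (name) { return /\.(ts|m3u8)$/.test(name); })
           .forEach(function (name) {
             fs.unlink(path.join(dir, name), function () {});  // ignore unlink errors
           });
         res.sendStatus(200);
       });
     });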