
Other articles (33)

  • Sites built with MediaSPIP

    2 May 2011

    This page presents some of the sites running MediaSPIP.
    You can of course add your own via the form at the bottom of the page.

  • Uploading media and themes via FTP

    31 May 2013

    MediaSPIP also processes media transferred via FTP. If you prefer to upload this way, retrieve the access credentials for your MediaSPIP site and use your favourite FTP client.
    From the start you will find the following folders in your FTP space:
    config/ : the site's configuration folder
    IMG/ : media already processed and online on the site
    local/ : the site's cache directory
    themes/ : custom themes and stylesheets
    tmp/ : working folder (...)

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (5052)

  • How to create .m3u8 and .ts files which are compatible with HTTP Live Streaming using Java and OpenCV ?

    30 September 2021, by O Connor

    Using FFmpeg as follows, it creates a stream.m3u8 file and .ts segment files sequentially, which are suitable for live video streaming on a web page.

    


    ffmpeg -f dshow -i video="My Camera Name" stream.m3u8


    


    I am using Java and OpenCV as follows to capture video frames and create a .ts file.

    


videoCapture = new VideoCapture(0, Videoio.CAP_DSHOW); // CAP_DSHOW, not CV_CAP_DSHOW, in the official Java bindings

if (videoCapture.isOpened()) {
    Mat m = new Mat();
    videoCapture.read(m);

    // 'avc1' = H.264, the codec HLS segments need
    int fourcc = VideoWriter.fourcc('a', 'v', 'c', '1');
    double fps = videoCapture.get(Videoio.CAP_PROP_FPS);
    if (fps <= 0) {
        fps = 25; // some capture drivers report 0; fall back to a sane default
    }

    Size s = new Size((int) videoCapture.get(Videoio.CAP_PROP_FRAME_WIDTH),
                      (int) videoCapture.get(Videoio.CAP_PROP_FRAME_HEIGHT));
    videoWriter = new VideoWriter("test0.ts", fourcc, fps, s, true);

    while (videoCapture.read(m)) {
        if (videoWriter.isOpened()) {
            videoWriter.write(m);
        }
    }
}

videoCapture.release();
videoWriter.release();


    


    I can play the test0.ts video using VLC and ffplay, but this test0.ts file is not suitable for HTTP Live Streaming on a web page.

    


      

    1. How can I create .m3u8 and .ts files which are compatible with HTTP Live Streaming using Java and OpenCV ?


    


    I am not insisting on using OpenCV and Java to create these files, but I need a client that captures video frames and sends the raw video data to a server. The server is responsible for receiving the raw video data, creating the .ts video files, and handling the live streaming requests coming from the web page. The communication between the client and the server uses a credential.

    


    That's why I think using FFmpeg alone for this approach is not possible.

    


      

    1. If there is another way to make this client, server and HTTP Live Streaming setup work, I would like to learn that as well.
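
    Since the question explicitly leaves the door open to approaches other than OpenCV's VideoWriter, one common pattern is to have the server pipe the raw frames it receives into an ffmpeg child process running the HLS muxer, which then writes the playlist and the .ts segments itself. Below is a minimal sketch in Node.js, assuming raw BGR frames of a known size reach the server; the function names, flag values and file paths are illustrative assumptions, not code from the question.

    ```javascript
    // Sketch: pipe raw BGR frames into an ffmpeg child process that runs the HLS muxer.
    const { spawn } = require('child_process');

    // Build the ffmpeg argument list: raw frames in -> stream.m3u8 + .ts segments out.
    function hlsArgs(width, height, fps, playlist) {
      return [
        '-f', 'rawvideo',            // input: raw, headerless frames
        '-pix_fmt', 'bgr24',         // OpenCV Mats are BGR by default
        '-s', `${width}x${height}`,
        '-r', String(fps),
        '-i', 'pipe:0',              // read the frames from stdin
        '-c:v', 'libx264',           // encode to H.264
        '-f', 'hls',                 // HLS muxer: writes the playlist and segments
        '-hls_time', '4',            // target segment length in seconds
        '-hls_list_size', '5',       // keep the last 5 segments in the playlist
        playlist,
      ];
    }

    // Start ffmpeg and return a function that forwards one raw frame buffer to it.
    function startHls(width, height, fps, playlist) {
      const ffmpeg = spawn('ffmpeg', hlsArgs(width, height, fps, playlist));
      return (frame) => ffmpeg.stdin.write(frame);
    }
    ```

    With this split, the client only has to ship raw frames (over whatever credentialed channel it uses), and segmenting stays entirely ffmpeg's job on the server.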


    


  • Put vast or vpaid overlay on live stream rtmp url output

    8 February 2020, by yolov3

    How can I merge a VAST/VPAID overlay onto a live stream RTMP output URL using ffmpeg ?
    That is, how to get an RTMP or m3u8 output stream link with the VAST/VPAID overlay merged in at stream time, so that it cannot be changed.

  • Node.js Live Streaming : Avoid buffering

    27 October 2012, by Shirish Kamath

    I've written a small nodeJS server that outputs system audio captured by ffmpeg on Windows (using DirectShow) to the browser as a streaming MP3 file. The audio needs to be as live as possible, with minimum/no buffering, and a "skipping" effect in the audio is perfectly acceptable.

    When I play the audio in Chrome using the HTML5 audio tag, there's a delay of about 8-10 secs over a low-latency LAN connection. I suspected this to be a client-side buffer, and used a Flash MP3 player on the client-side, which brought down the delay to 2-3 secs.

    Now the buffering seems to be taking place on the server side. The documentation for Node.js's response.write mentions that the data is ultimately written to kernel buffers. How do I avoid any buffering altogether, or at least get around it, so that the client always receives the latest audio data ? Strategies for handling 'drain' events to always push live data ?

    On the request object, I've used setNoDelay(true) to avoid the use of Nagle's algorithm. Following is a snippet of how data is written when the spawned ffmpeg process emits data.

    var clients = []; //List of client connections currently being served
    ffmpeg.stdout.on('data', function(data) {
       for(var i = 0; i < clients.length; i++){
           clients[i].res.write(data);
       }
    });
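
    For a live feed where skipping is acceptable, one policy is to honour the boolean that res.write returns: it comes back false once data has to be buffered (ultimately in kernel buffers), and from then on chunks can simply be dropped until the 'drain' event fires. A small sketch of that idea, assuming one wrapper object per connected client; makeLiveClient and pushChunk are illustrative names, not part of the original snippet.

    ```javascript
    // Sketch: drop-on-backpressure policy for a live audio feed.
    // Wraps a writable response; when write() returns false, skip chunks until 'drain'.
    function makeLiveClient(res) {
      const client = { res, clogged: false, dropped: 0 };
      res.on('drain', () => { client.clogged = false; }); // buffer emptied, resume writing
      return client;
    }

    function pushChunk(client, data) {
      if (client.clogged) {
        client.dropped++;   // stale audio is worthless live: drop it instead of queueing
        return false;
      }
      // write() returns false once the chunk had to be buffered rather than sent
      client.clogged = !client.res.write(data);
      return true;
    }
    ```

    In the loop above, clients[i].res.write(data) would become pushChunk(clients[i], data), so a slow client skips ahead instead of accumulating a growing server-side backlog.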