
Other articles (71)

  • Sites built with MediaSPIP

    2 May 2011

    This page presents some of the sites running on MediaSPIP.
    You can of course add your own using the form at the bottom of the page.

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • From upload to the final video [standalone version]

    31 January 2010

    The journey of an audio or video document through SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information from the source video
    First of all, you need to create a SPIP article and attach the "source" video document to it.
    When this document is attached to the article, two actions are executed in addition to the normal behaviour: retrieval of the technical information of the file’s audio and video streams; generation of a thumbnail: extraction of a (...)
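
    The teaser stops short of the actual commands, but the two extra actions it describes map naturally onto standard ffprobe/ffmpeg calls. Purely as an illustration (the file names are placeholders, and this is not SPIPMotion's real code):

    # Retrieve the technical information of the file's audio and video streams
    ffprobe -v error -show_streams -print_format json source.mp4

    # Generate a thumbnail by extracting a single frame from the source video
    ffmpeg -i source.mp4 -ss 00:00:01 -vframes 1 -vf scale=320:-1 thumbnail.jpg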

On other sites (5218)

  • FFmpeg with Pipe - how can I periodically grab real-time frames out of live streams in C#?

    2 March 2020, by BBy

     I am new to FFmpeg and C#, and I want to grab frames from an IP camera to do image processing.

     I have made the following C# class, and with it I can get a single frame from the IP camera.

     // Requires: using System.Diagnostics; and using System.Drawing;
     class FFmpegHandler
     {
         public Process ffmpeg = new Process();
         public Image image;

         public Image init()
         {
             ffmpeg = new Process()
             {
                 StartInfo =
                 {
                     FileName = @"./ffmpeg/ffmpeg.exe",
                     //Arguments = "-i http://admin:@192.168.10.1/videostream.asf -an -f image2pipe -preset ultrafast -tune zerolatency -s 320x240 pipe:1", // Hangs
                     Arguments = "-i http://admin:@192.168.10.1/videostream.asf -vframes 1 -an -f image2pipe -preset ultrafast -tune zerolatency -s 320x240 pipe:1",
                     UseShellExecute = false,
                     RedirectStandardOutput = true,
                     RedirectStandardError = true,
                     CreateNoWindow = true,
                     WorkingDirectory = "./ffmpeg/"
                 }
             };

             ffmpeg.EnableRaisingEvents = true;
             ffmpeg.Start();

             // -vframes 1 makes ffmpeg write exactly one JPEG to stdout and exit,
             // so Image.FromStream() returns a single decoded frame.
             var stream = ffmpeg.StandardOutput.BaseStream;
             var img = Image.FromStream(stream);
             //ffmpeg.WaitForExit();

             return img;
         }
     }

     The problem is that I want to grab the latest (real-time) image whenever I request one.

     If I run FFmpegHandler.init(), it takes 2 seconds to give me an already-delayed image.

     I have tried removing the -vframes 1 argument, but then it hangs after image = Image.FromStream(stream);.

     When I check the ffmpeg output directly, it looks like ffmpeg keeps building up the stream:

    frame=    6 fps=0.0 q=2.2 size=      25kB time=00:00:00.24 bitrate= 861.9kbits/s dup=4 drop=0 speed=0.435x    
    frame=   65 fps= 60 q=24.8 size=     140kB time=00:00:02.60 bitrate= 440.9kbits/s dup=4 drop=0 speed=2.41x    
    frame=   77 fps= 49 q=24.8 size=     161kB time=00:00:03.08 bitrate= 428.0kbits/s dup=4 drop=0 speed=1.95x    
    frame=   89 fps= 43 q=24.8 size=     182kB time=00:00:03.56 bitrate= 418.6kbits/s dup=4 drop=0 speed= 1.7x    
    frame=  102 fps= 39 q=24.8 size=     205kB time=00:00:04.08 bitrate= 410.7kbits/s dup=4 drop=0 speed=1.57x    
    frame=  116 fps= 37 q=24.8 size=     229kB time=00:00:04.64 bitrate= 404.2kbits/s dup=4 drop=0 speed=1.49x    
    frame=  128 fps= 35 q=24.8 size=     250kB time=00:00:05.12 bitrate= 399.8kbits/s dup=4 drop=0 speed=1.41x    
    frame=  142 fps= 34 q=24.8 size=     274kB time=00:00:05.68 bitrate= 395.7kbits/s dup=4 drop=0 speed=1.36x    
    frame=  156 fps= 33 q=24.8 size=     299kB time=00:00:06.24 bitrate= 392.3kbits/s dup=4 drop=0 speed=1.32x    
    frame=  169 fps= 32 q=24.8 size=     322kB time=00:00:06.76 bitrate= 389.7kbits/s dup=4 drop=0 speed=1.29x    
    frame=  182 fps= 32 q=24.8 size=     344kB time=00:00:07.28 bitrate= 387.4kbits/s dup=4 drop=0 speed=1.26x    
    frame=  195 fps= 31 q=24.8 size=     367kB time=00:00:07.80 bitrate= 385.5kbits/s dup=4 drop=0 speed=1.24x    
    frame=  208 fps= 31 q=24.8 size=     390kB time=00:00:08.32 bitrate= 383.8kbits/s dup=4 drop=0 speed=1.22x    
    frame=  221 fps= 30 q=24.8 size=     413kB time=00:00:08.84 bitrate= 382.3kbits/s dup=4 drop=0 speed=1.21x  

     How can I grab the latest frames out of this live stream? (Or is there a thread-safe way to clear the stream and only get the latest frame when I request it?)
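
     One way to approach this (a rough, untested sketch rather than a definitive answer): keep a single ffmpeg process running without -vframes 1, read its stdout on a background thread, split the byte stream on the JPEG start/end markers (FF D8 / FF D9), and hand out only the most recent complete frame on request. The class name, the -c:v mjpeg flag and the marker-scanning loop below are my own additions, not something from the original code:

     using System;
     using System.Diagnostics;
     using System.Drawing;
     using System.IO;
     using System.Threading;

     class LatestFrameGrabber
     {
         private readonly object sync = new object();
         private byte[] latestJpeg;   // most recent complete frame read from the pipe
         private Process ffmpeg;

         public void Start()
         {
             ffmpeg = new Process()
             {
                 StartInfo =
                 {
                     FileName = @"./ffmpeg/ffmpeg.exe",
                     // Same camera URL and size as above, but no -vframes 1, and -c:v mjpeg
                     // so every frame written to stdout is a self-contained JPEG.
                     Arguments = "-i http://admin:@192.168.10.1/videostream.asf " +
                                 "-an -c:v mjpeg -f image2pipe -s 320x240 pipe:1",
                     UseShellExecute = false,
                     RedirectStandardOutput = true,
                     RedirectStandardError = true,   // stderr must be drained or ffmpeg can stall
                     CreateNoWindow = true,
                     WorkingDirectory = "./ffmpeg/"
                 }
             };
             ffmpeg.Start();
             ffmpeg.ErrorDataReceived += (s, e) => { };   // discard ffmpeg's progress log
             ffmpeg.BeginErrorReadLine();

             new Thread(ReadLoop) { IsBackground = true }.Start();
         }

         // Continuously scan stdout for JPEG SOI/EOI markers and remember the last frame.
         private void ReadLoop()
         {
             var stream = new BufferedStream(ffmpeg.StandardOutput.BaseStream);
             var frame = new MemoryStream();
             int prev = -1, cur;
             bool inFrame = false;

             while ((cur = stream.ReadByte()) != -1)
             {
                 if (!inFrame && prev == 0xFF && cur == 0xD8)   // FF D8 = start of image
                 {
                     frame.SetLength(0);
                     frame.WriteByte(0xFF);
                     inFrame = true;
                 }
                 if (inFrame)
                 {
                     frame.WriteByte((byte)cur);
                     if (prev == 0xFF && cur == 0xD9)           // FF D9 = end of image
                     {
                         lock (sync) latestJpeg = frame.ToArray();
                         inFrame = false;
                     }
                 }
                 prev = cur;
             }
         }

         // Call whenever a frame is needed; returns null until the first frame arrives.
         public Image GetLatestImage()
         {
             byte[] copy;
             lock (sync) copy = latestJpeg;
             return copy == null ? null : Image.FromStream(new MemoryStream(copy));
         }
     }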

  • How to HLS-live-stream incoming batches of individual frames, "appending" to an m3u8 playlist in real time, with ffmpeg?

    20 November 2024, by Rob

     My overall goal:

     Server-side:

    • I have batches of sequential, JPEG-encoded frames (8-16) arriving from time to time, generated at roughly 2 FPS.

    • I would like to host an HLS live stream, where, when a new batch of frames arrives, I encode those new frames as h264 .ts segments with ffmpeg, and have the new .ts segments automatically added to an HLS stream (e.g. .m3u8 file).

     Client/browser-side:

    • When the .m3u8 is updated, I would like the video stream being watched to simply "continue", advancing from the point where new .ts segments have been added.

    • I do not need the user to scrub backwards in time, the client just needs to support live observation of the stream.

     My current approach:

     Server-side:

     To generate the "first" few segments of the stream, I'm attempting the command below (just command-line for now to get ffmpeg working right; ultimately this will be automated via a Python script):

     For reference, I'm using ffmpeg version 3.4.6-0ubuntu0.18.04.1.

    ffmpeg -y -framerate 2 -i /frames/batch1/frame_%d.jpg \
       -c:v libx264 -crf 21 -preset veryfast -g 2 \
       -f hls -hls_time 4 -hls_list_size 4 -segment_wrap 4 -segment_list_flags +live video/stream.m3u8

     where the /frames/batch1/ folder contains a sequence of frames (e.g. frame_01.jpg, frame_02.jpg, etc.). This already doesn't appear to work correctly, because it keeps adding #EXT-X-ENDLIST to the end of the .m3u8 file, which as I understand it is not correct for a live HLS stream - here's what that generates:

     #EXTM3U
     #EXT-X-VERSION:3
     #EXT-X-TARGETDURATION:4
     #EXT-X-MEDIA-SEQUENCE:0
     #EXTINF:4.000000,
     stream0.ts
     #EXTINF:4.000000,
     stream1.ts
     #EXTINF:2.000000,
     stream2.ts
     #EXT-X-ENDLIST

    I can't figure out how to suppress #EXT-X-ENDLIST here - this is problem #1.
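
     For what it's worth (I can't verify against 3.4.6 here, so treat this as a hedged suggestion rather than a definitive fix): the hls muxer has an omit_endlist flag, and -hls_playlist_type event marks the playlist as append-only; together they should stop #EXT-X-ENDLIST from being written - check ffmpeg -h muxer=hls to confirm your build supports them. Also, as far as I can tell, -segment_wrap and -segment_list_flags are options of the separate segment muxer rather than the hls muxer, so they are dropped below:

     ffmpeg -y -framerate 2 -i /frames/batch1/frame_%d.jpg \
        -c:v libx264 -crf 21 -preset veryfast -g 2 \
        -f hls -hls_time 4 -hls_list_size 0 -hls_playlist_type event \
        -hls_flags omit_endlist video/stream.m3u8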

     Then, to generate subsequent segments (e.g. when new frames become available), I'm trying this:

    ffmpeg -y -framerate 2 -start_number 20 -i /frames/batch2/frame_%d.jpg \
       -c:v libx264 -crf 21 -preset veryfast -g 2 \
       -f hls -hls_time 4 -hls_list_size 4 -segment_wrap 4 -segment_list_flags +live video/stream.m3u8

     Unfortunately, this does not work the way I want it to. It simply overwrites stream.m3u8, does not advance #EXT-X-MEDIA-SEQUENCE, does not index the new .ts files correctly, and also includes the undesirable #EXT-X-ENDLIST - this is the output of that command:

     #EXTM3U
     #EXT-X-VERSION:3
     #EXT-X-TARGETDURATION:4
     #EXT-X-MEDIA-SEQUENCE:0
     #EXTINF:4.000000,
     stream0.ts
     #EXTINF:4.000000,
     stream1.ts
     #EXTINF:3.000000,
     stream2.ts
     #EXT-X-ENDLIST

    Fundamentally, I can't figure out how to "append" to an existing .m3u8 in a way that makes sense for HLS live streaming. That's essentially problem #2.
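
     Again hedged, since I haven't tested this on 3.4.6: the hls muxer also has an append_list flag, which is meant to parse the existing stream.m3u8 and append new segments to it instead of overwriting it, plus an output-side -start_number option to keep the new stream%d.ts file names from colliding with the segments already written. A sketch of what the second invocation might look like (the trailing -start_number 3 assumes three segments already exist; the leading one still refers to the input frame numbering):

     ffmpeg -y -framerate 2 -start_number 20 -i /frames/batch2/frame_%d.jpg \
        -c:v libx264 -crf 21 -preset veryfast -g 2 \
        -f hls -hls_time 4 -hls_list_size 0 -hls_playlist_type event \
        -hls_flags append_list+omit_endlist -start_number 3 video/stream.m3u8

     An alternative that avoids re-invoking ffmpeg for every batch is a single long-running ffmpeg process fed frames over a pipe (e.g. -f image2pipe -i -), so the hls muxer maintains the playlist itself.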

     For hosting the stream, I'm using a simple Flask app - which appears to be working the way I intend - here's what I'm doing for reference:

     @app.route('/video/<file_name>')
     def stream(file_name):
         video_dir = './video'
         return send_from_directory(directory=video_dir, filename=file_name)
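
     One small thing worth ruling out on this side (my own suggestion, not something from the question): HLS clients poll the playlist repeatedly, so the .m3u8 response should not be cached, otherwise the browser may keep re-reading a stale copy. A hypothetical tweak to the route above:

     @app.route('/video/<file_name>')
     def stream(file_name):
         video_dir = './video'
         response = send_from_directory(directory=video_dir, filename=file_name)
         # Hypothetical addition: force revalidation of the playlist so newly
         # appended segments are picked up on the next poll.
         if file_name.endswith('.m3u8'):
             response.headers['Cache-Control'] = 'no-cache'
         return response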

     Client-side:

     I'm trying HLS.js in Chrome - it basically boils down to this:

     <video id="video1"></video>

     ...

     <script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
     <script>
        var video = document.getElementById('video1');
        if (Hls.isSupported()) {
          var hls = new Hls();
          hls.loadSource('/video/stream.m3u8');
          hls.attachMedia(video);
          hls.on(Hls.Events.MANIFEST_PARSED, function() {
            video.play();
          });
        }
        else if (video.canPlayType('application/vnd.apple.mpegurl')) {
          video.src = '/video/stream.m3u8';
          video.addEventListener('loadedmetadata', function() {
            video.play();
          });
        }
     </script>


     I'd like to think that what I'm trying to do doesn't require a more complex approach than the one above, but since what I'm trying so far definitely isn't working, I'm starting to think I need to come at this from a different angle. Any ideas on what I'm missing?

     Edit:

    I've also attempted the same (again in Chrome) with video.js, and am seeing similar behavior - in particular, when I manually update the backing stream.m3u8 (with no #EXT-X-ENDLIST tag), videojs never picks up the new changes to the live stream, and just buffers/hangs indefinitely.


     <video id="video1" class="video-js vjs-default-skin" muted="muted" controls="controls">
         <source type="application/x-mpegURL" src="/video/stream.m3u8">
     </video>

     ...

     <script>
         var player = videojs('video1');
         player.play();
     </script>

     For example, if I start with this initial version of stream.m3u8:

     #EXTM3U
     #EXT-X-PLAYLIST-TYPE:EVENT
     #EXT-X-VERSION:3
     #EXT-X-TARGETDURATION:8
     #EXT-X-MEDIA-SEQUENCE:0
     #EXTINF:4.000000,
     stream0.ts
     #EXTINF:4.000000,
     stream1.ts
     #EXTINF:2.000000,
     stream2.ts

     and then manually update it server-side to this:

     #EXTM3U
     #EXT-X-PLAYLIST-TYPE:EVENT
     #EXT-X-VERSION:3
     #EXT-X-TARGETDURATION:8
     #EXT-X-MEDIA-SEQUENCE:3
     #EXTINF:4.000000,
     stream3.ts
     #EXTINF:4.000000,
     stream4.ts
     #EXTINF:3.000000,
     stream5.ts


    the video.js control just buffers indefinitely after only playing the first 3 segments (stream*.ts 0-2), which isn't what I'd expect to happen (I'd expect it to continue playing stream*.ts 3-5 once stream.m3u8 is updated and video.js makes a request for the latest version of the playlist).
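
     For what it's worth, my reading of the HLS spec is that with #EXT-X-PLAYLIST-TYPE:EVENT segments may only ever be appended and never removed, so #EXT-X-MEDIA-SEQUENCE has to stay at its initial value; replacing the three listed segments and jumping the sequence number in a single edit is exactly the kind of change an EVENT playlist is not allowed to make. An update that players should be able to follow would look more like this (illustrative):

     #EXTM3U
     #EXT-X-PLAYLIST-TYPE:EVENT
     #EXT-X-VERSION:3
     #EXT-X-TARGETDURATION:8
     #EXT-X-MEDIA-SEQUENCE:0
     #EXTINF:4.000000,
     stream0.ts
     #EXTINF:4.000000,
     stream1.ts
     #EXTINF:2.000000,
     stream2.ts
     #EXTINF:4.000000,
     stream3.ts
     #EXTINF:4.000000,
     stream4.ts
     #EXTINF:3.000000,
     stream5.ts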


  • nvdec: attach real hw_frames to post-processed frames

    20 March 2020, by Timo Rothenpieler
    nvdec: attach real hw_frames to post-processed frames
    
    • [DH] libavcodec/nvdec.c
    • [DH] libavcodec/version.h