
Media (1)

Keyword: - Tags - / berlin

Other articles (65)

  • Keeping control of your media in your hands

    13 April 2011

    The vocabulary used on this site and around MediaSPIP in general, aims to avoid reference to Web 2.0 and the companies that profit from media-sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

  • Participate in its translation

    10 April 2011

    You can help us improve the wording used in the software, or translate it into any new language, allowing it to reach new linguistic communities.
    To do so, we use the SPIP translation interface, where all of MediaSPIP's language modules are available. You just need to subscribe to the translators' mailing list to ask for more information.
    Currently MediaSPIP is only available in French and (...)

  • General document management

    13 May 2011

    MediaSPIP never modifies the original document that is uploaded.
    For each uploaded document, it performs two successive operations: creating an additional version that can easily be viewed online, while keeping the original available for download in case it cannot be read in a web browser; and retrieving the original document's metadata to describe the file textually.
    The tables below explain what MediaSPIP can do (...)

On other sites (12237)

  • How to HLS-live-stream incoming batches of individual frames, "appending" to an m3u8 playlist in real time, with ffmpeg?

    20 November 2024, by Rob

    My overall goal:

    Server-side:

    • I have batches of sequential, JPEG-encoded frames (8-16) arriving from time to time, generated at roughly 2 FPS.
    • I would like to host an HLS live stream, where, when a new batch of frames arrives, I encode those new frames as h264 .ts segments with ffmpeg, and have the new .ts segments automatically added to an HLS stream (e.g. .m3u8 file).

    Client/browser-side:

    • When the .m3u8 is updated, I would like the video stream being watched to simply "continue", advancing from the point where new .ts segments have been added.
    • I do not need the user to scrub backwards in time, the client just needs to support live observation of the stream.

    My current approach:

    Server-side:

    To generate the "first" few segments of the stream, I'm attempting the below (just command-line for now to get ffmpeg working right, but ultimately will be automated via a Python script):

    For reference, I'm using ffmpeg version 3.4.6-0ubuntu0.18.04.1.

    ffmpeg -y -framerate 2 -i /frames/batch1/frame_%d.jpg \
       -c:v libx264 -crf 21 -preset veryfast -g 2 \
       -f hls -hls_time 4 -hls_list_size 4 -segment_wrap 4 -segment_list_flags +live video/stream.m3u8

    where the /frames/batch1/ folder contains a sequence of frames (e.g. frame_01.jpg, frame_02.jpg, etc...). This already doesn't appear to work correctly, because it keeps adding #EXT-X-ENDLIST to the end of the .m3u8 file, which as I understand is not correct for a live HLS stream - here's what that generates:

    #EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:4
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:4.000000,
stream0.ts
#EXTINF:4.000000,
stream1.ts
#EXTINF:2.000000,
stream2.ts
#EXT-X-ENDLIST

    I can't figure out how to suppress #EXT-X-ENDLIST here - this is problem #1.
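
    A minimal sketch of what might suppress it, assuming this 3.4 build already has the hls muxer's omit_endlist flag (note also that, as far as I can tell, -segment_wrap and -segment_list_flags are options of the segment muxer rather than the hls muxer, so with -f hls they are most likely just ignored with a warning):

    ffmpeg -y -framerate 2 -i /frames/batch1/frame_%d.jpg \
       -c:v libx264 -crf 21 -preset veryfast -g 2 \
       -f hls -hls_time 4 -hls_list_size 4 \
       -hls_flags omit_endlist video/stream.m3u8

    (If the frame files really are zero-padded, e.g. frame_01.jpg, the frame_%d.jpg pattern would normally also need to be frame_%02d.jpg.)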

    Then, to generate subsequent segments (e.g. when new frames become available), I'm trying this:

    ffmpeg -y -framerate 2 -start_number 20 -i /frames/batch2/frame_%d.jpg \
       -c:v libx264 -crf 21 -preset veryfast -g 2 \
       -f hls -hls_time 4 -hls_list_size 4 -segment_wrap 4 -segment_list_flags +live video/stream.m3u8

    Unfortunately, this does not work the way I want it to. It simply overwrites stream.m3u8, it does not advance #EXT-X-MEDIA-SEQUENCE, it does not index the new .ts files correctly, and it also includes the undesirable #EXT-X-ENDLIST - this is the output of that command:

    #EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:4
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:4.000000,
stream0.ts
#EXTINF:4.000000,
stream1.ts
#EXTINF:3.000000,
stream2.ts
#EXT-X-ENDLIST

    Fundamentally, I can't figure out how to "append" to an existing .m3u8 in a way that makes sense for HLS live streaming. That's essentially problem #2.
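
    A sketch of the direction that seems intended here, assuming the hls muxer's append_list flag is available in this ffmpeg version: as I understand it, -hls_flags append_list makes the muxer read the existing stream.m3u8, continue the segment numbering from it, and append the new entries instead of starting over (combined with omit_endlist so the playlist stays open):

    ffmpeg -y -framerate 2 -start_number 20 -i /frames/batch2/frame_%d.jpg \
       -c:v libx264 -crf 21 -preset veryfast -g 2 \
       -f hls -hls_time 4 -hls_list_size 0 \
       -hls_flags append_list+omit_endlist video/stream.m3u8

    Here -hls_list_size 0 keeps every segment in the playlist; with a nonzero size the window slides and #EXT-X-MEDIA-SEQUENCE advances as old entries drop off.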

    For hosting the stream, I'm using a simple Flask app - which appears to be working the way I intend - here's what I'm doing for reference:

from flask import Flask, send_from_directory

app = Flask(__name__)

@app.route('/video/<file_name>')
def stream(file_name):
    video_dir = './video'
    return send_from_directory(directory=video_dir, filename=file_name)

    Client-side:

    I'm trying HLS.js in Chrome - it basically boils down to this:

<video id="video1"></video>

...

<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
<script>
   var video = document.getElementById('video1');
   if (Hls.isSupported()) {
     var hls = new Hls();
     hls.loadSource('/video/stream.m3u8');
     hls.attachMedia(video);
     hls.on(Hls.Events.MANIFEST_PARSED, function() {
       video.play();
     });
   }
   else if (video.canPlayType('application/vnd.apple.mpegurl')) {
     video.src = '/video/stream.m3u8';
     video.addEventListener('loadedmetadata', function() {
       video.play();
     });
   }
</script>

    I'd like to think that what I'm trying to do doesn't require a more complex approach than what I'm trying above, but since what I'm trying so far definitely isn't working, I'm starting to think I need to come at this from a different angle. Any ideas on what I'm missing?

    Edit:

    I've also attempted the same (again in Chrome) with video.js, and am seeing similar behavior - in particular, when I manually update the backing stream.m3u8 (with no #EXT-X-ENDLIST tag), videojs never picks up the new changes to the live stream, and just buffers/hangs indefinitely.


<video id="video1" class="video-js vjs-default-skin" muted="muted" controls="controls">
    <source type="application/x-mpegURL" src="/video/stream.m3u8">
</video>

...

<script>
    var player = videojs('video1');
    player.play();
</script>

    For example, if I start with this initial version of stream.m3u8:

    #EXTM3U
#EXT-X-PLAYLIST-TYPE:EVENT
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:8
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:4.000000,
stream0.ts
#EXTINF:4.000000,
stream1.ts
#EXTINF:2.000000,
stream2.ts

    and then manually update it server-side to this:

    #EXTM3U
#EXT-X-PLAYLIST-TYPE:EVENT
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:8
#EXT-X-MEDIA-SEQUENCE:3
#EXTINF:4.000000,
stream3.ts
#EXTINF:4.000000,
stream4.ts
#EXTINF:3.000000,
stream5.ts

    the video.js control just buffers indefinitely after only playing the first 3 segments (stream*.ts 0-2), which isn't what I'd expect to happen (I'd expect it to continue playing stream*.ts 3-5 once stream.m3u8 is updated and video.js makes a request for the latest version of the playlist).


  • I want to make a box completely unrecognizable in a video without any way of reconstructing its content

    26 January 2023, by Jenson vivish

    I've watched many videos but couldn't find out how to do this.

    I want to make a box completely unrecognizable in a video without any way of reconstructing its content. Is

    ffmpeg -y -i input_video -vf "drawbox=x=10:y=10:w=100:h=100:color=black@1.0:t=fill" output_video

    sufficient for that?


    The command creates a black box at the desired location. I am just not sure if this cannot be undone.
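
    One way to sanity-check that - a sketch, assuming the output file is named output_video.mp4 - is to crop the re-encoded output down to just the boxed region and dump it as images, then confirm every frame is uniformly black:

    # output_video.mp4 is a placeholder name for the drawbox output above
    ffmpeg -i output_video.mp4 -vf "crop=100:100:10:10" boxed_region_%04d.png

    As far as I understand, drawbox with t=fill and a fully opaque color overwrites those pixels before encoding, so the output file no longer carries the original content of that region; only an earlier copy of the unedited input could bring it back.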


  • How to add a background image for text with text size on a video in FFmpeg?

    24 October 2019, by manwtein

    Is there a way to add a background image under the text on the video in FFmpeg?

    I can add a black background using ’box’, but I need to use a specific image. I tried to use ’drawtext’ on an image and then overlay it on the video, but I don’t know the text’s width and height for scaling the image.

    "-i",
    "D:/AndroidProjects/Movies/VideoSamples/VideoShort1.mp4",
    "-i",
    "D:/AndroidProjects/Movies/ImageSamples/SampleImage.jpeg",
    "-filter_complex",
    "[0:v]drawtext=text='Example':fontcolor=#DB7093:fontsize=120:box=1:boxcolor=black@0.8:boxborderw=5:x='(w/2) - (tw/2)':y='h - ((h/2) - (th/2))",
    "-s",
    "1280x720",
    "-y",
    "D:/AndroidProjects/Movies/FFmpegSampleShort0.mp4"

    I expect to get a video with an image background under my text instead of the black color.
    Final video
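
    A sketch of one variation of the approach described above (overlay the image first, then drawtext on top), assuming a fixed-size background panel is acceptable; the file names and the 700x160 panel size are placeholders, since drawtext's rendered text size is not visible to the other filters. Scale the image to the chosen panel size, overlay it where the text will go, then center the text inside that panel:

    # placeholder names; 700x160 is an arbitrary panel size
    ffmpeg -i VideoShort1.mp4 -i SampleImage.jpeg \
       -filter_complex "[1:v]scale=700:160[panel];[0:v][panel]overlay=x=(W-w)/2:y=H-200[base];[base]drawtext=text='Example':fontcolor=#DB7093:fontsize=120:x=(w-tw)/2:y=h-200+(160-th)/2" \
       -s 1280x720 -y out.mp4

    This sidesteps measuring the text: the image is scaled to a fixed panel rather than to the text, so the panel just has to be wide enough for the longest expected string (and depending on the build, drawtext may also need an explicit fontfile=).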