
Advanced search
Media (9)
-
Stereo master soundtrack
17 October 2011, by
Updated: October 2011
Language: English
Type: Audio
-
Elephants Dream - Cover of the soundtrack
17 October 2011, by
Updated: October 2011
Language: English
Type: Image
-
#7 Ambience
16 October 2011, by
Updated: June 2015
Language: English
Type: Audio
-
#6 Teaser Music
16 October 2011, by
Updated: February 2013
Language: English
Type: Audio
-
#5 End Title
16 October 2011, by
Updated: February 2013
Language: English
Type: Audio
-
#3 The Safest Place
16 October 2011, by
Updated: February 2013
Language: English
Type: Audio
Other articles (85)
-
Images
15 May 2013 -
From upload to the final video [standalone version]
31 January 2010, by
The path of an audio or video document in SPIPMotion is divided into three distinct steps.
Upload and retrieval of information about the source video
First of all, an SPIP article has to be created and the "source" video document attached to it.
When this document is attached to the article, two actions are executed in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...) -
Videos
21 April 2011, by
Like "audio" documents, Mediaspip displays videos whenever possible using the HTML5 video tag.
One drawback of this tag is that it is not recognised correctly by some browsers (Internet Explorer, to name no names) and that each browser natively supports only certain video formats.
Its main advantage is that videos are handled natively by the browser, which avoids having to rely on Flash and (...)
On other sites (8340)
-
How to HLS-live-stream incoming batches of individual frames, "appending" to a m3u8 playlist in real time, with ffmpeg ?
20 November 2024, by Rob
My overall goal:



Server-side:



- I have batches of sequential, JPEG-encoded frames (8-16) arriving from time to time, generated at roughly 2 FPS.
- I would like to host an HLS live stream, where, when a new batch of frames arrives, I encode those new frames as h264 .ts segments with ffmpeg, and have the new .ts segments automatically added to an HLS stream (e.g. the .m3u8 file).







Client/browser-side:



- When the .m3u8 is updated, I would like the video stream being watched to simply "continue", advancing from the point where new .ts segments have been added.
- I do not need the user to scrub backwards in time; the client just needs to support live observation of the stream.










My current approach:



Server-side:



To generate the "first" few segments of the stream, I'm attempting the command below (just command-line for now to get ffmpeg working right; ultimately it will be automated via a Python script):



For reference, I'm using ffmpeg version 3.4.6-0ubuntu0.18.04.1.



ffmpeg -y -framerate 2 -i /frames/batch1/frame_%d.jpg \
 -c:v libx264 -crf 21 -preset veryfast -g 2 \
 -f hls -hls_time 4 -hls_list_size 4 -segment_wrap 4 -segment_list_flags +live video/stream.m3u8




where the /frames/batch1/ folder contains a sequence of frames (e.g. frame_01.jpg, frame_02.jpg, etc.). This already doesn't appear to work correctly, because it keeps adding #EXT-X-ENDLIST to the end of the .m3u8 file, which as I understand it is not correct for a live HLS stream - here's what that generates:


#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:4
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:4.000000,
stream0.ts
#EXTINF:4.000000,
stream1.ts
#EXTINF:2.000000,
stream2.ts
#EXT-X-ENDLIST




I can't figure out how to suppress #EXT-X-ENDLIST here - this is problem #1.
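Since the plan is to drive this from Python anyway, here is a minimal sketch of what that first invocation could look like as a subprocess call. The extra -hls_flags omit_endlist is an assumption on my part: the hls muxer documents it as suppressing #EXT-X-ENDLIST, but I haven't confirmed how it behaves on this exact 3.4.6 build (an -hls_playlist_type event variant might be another thing to try):

import subprocess

def encode_first_batch(frames_pattern, playlist_path):
    """Sketch: encode one batch of JPEG frames into HLS segments.

    Mirrors the command above, plus '-hls_flags omit_endlist', which is
    documented to keep the muxer from writing #EXT-X-ENDLIST (untested here
    on ffmpeg 3.4.6).
    """
    cmd = [
        "ffmpeg", "-y",
        "-framerate", "2",
        "-i", frames_pattern,                      # e.g. /frames/batch1/frame_%d.jpg
        "-c:v", "libx264", "-crf", "21", "-preset", "veryfast", "-g", "2",
        "-f", "hls",
        "-hls_time", "4",
        "-hls_list_size", "4",
        "-hls_flags", "omit_endlist",
        playlist_path,                             # e.g. video/stream.m3u8
    ]
    subprocess.run(cmd, check=True)

# encode_first_batch("/frames/batch1/frame_%d.jpg", "video/stream.m3u8")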


Then, to generate subsequent segments (e.g. when new frames become available), I'm trying this:



ffmpeg -y -framerate 2 -start_number 20 -i /frames/batch2/frame_%d.jpg \
 -c:v libx264 -crf 21 -preset veryfast -g 2 \
 -f hls -hls_time 4 -hls_list_size 4 -segment_wrap 4 -segment_list_flags +live video/stream.m3u8




Unfortunately, this does not work the way I want it to. It simply overwrites stream.m3u8, does not advance #EXT-X-MEDIA-SEQUENCE, does not index the new .ts files correctly, and it also includes the undesirable #EXT-X-ENDLIST - this is the output of that command:


#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:4
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:4.000000,
stream0.ts
#EXTINF:4.000000,
stream1.ts
#EXTINF:3.000000,
stream2.ts
#EXT-X-ENDLIST




Fundamentally, I can't figure out how to "append" to an existing .m3u8 in a way that makes sense for HLS live streaming. That's essentially problem #2.
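One candidate I'm aware of but haven't verified on this build is the hls muxer's append_list flag, which is documented as parsing the existing playlist and appending new segments instead of overwriting it. Whether it also carries #EXT-X-MEDIA-SEQUENCE and the segment numbering correctly across separate ffmpeg runs is an assumption; a sketch of such a follow-up invocation:

import subprocess

def append_batch(frames_pattern, playlist_path, first_frame_number):
    """Sketch: encode a new batch and (hopefully) append it to the playlist.

    '-hls_flags append_list+omit_endlist' is meant to reuse the existing
    stream.m3u8 rather than overwrite it; '-start_number' is the image2 input
    option selecting the first frame index of the new batch. Both behaviours
    are assumptions to verify against ffmpeg 3.4.6.
    """
    cmd = [
        "ffmpeg", "-y",
        "-framerate", "2",
        "-start_number", str(first_frame_number),
        "-i", frames_pattern,                      # e.g. /frames/batch2/frame_%d.jpg
        "-c:v", "libx264", "-crf", "21", "-preset", "veryfast", "-g", "2",
        "-f", "hls",
        "-hls_time", "4",
        "-hls_list_size", "4",
        "-hls_flags", "append_list+omit_endlist",
        playlist_path,                             # e.g. video/stream.m3u8
    ]
    subprocess.run(cmd, check=True)

# append_batch("/frames/batch2/frame_%d.jpg", "video/stream.m3u8", 20)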


For hosting the stream, I'm using a simple Flask app - which appears to be working the way I intend - here's what I'm doing for reference:



@app.route('/video/<file_name>')
def stream(file_name):
    video_dir = './video'
    return send_from_directory(directory=video_dir, filename=file_name)
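One serving detail I'm unsure about: live players re-request stream.m3u8 constantly, so a cached playlist response could hide updates. Here is a variant of the route above that marks the playlist as non-cacheable - just a guess at something that might matter, not something I know this setup requires:

@app.route('/video/<file_name>')
def stream(file_name):
    video_dir = './video'
    response = send_from_directory(directory=video_dir, filename=file_name)
    # Players poll the playlist repeatedly; make sure they never see a stale copy.
    if file_name.endswith('.m3u8'):
        response.headers['Cache-Control'] = 'no-cache, no-store, must-revalidate'
    return response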




Client-side:



I'm trying HLS.js in Chrome - it basically boils down to this:



<video id="video1"></video>

...

<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>

<script>
  var video = document.getElementById('video1');
  if (Hls.isSupported()) {
    var hls = new Hls();
    hls.loadSource('/video/stream.m3u8');
    hls.attachMedia(video);
    hls.on(Hls.Events.MANIFEST_PARSED, function() {
      video.play();
    });
  }
  else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    video.src = '/video/stream.m3u8';
    video.addEventListener('loadedmetadata', function() {
      video.play();
    });
  }
</script>




I'd like to think that what I'm trying to do doesn't require a more complex approach than the above, but since what I'm trying so far definitely isn't working, I'm starting to think I need to come at this from a different angle. Any ideas on what I'm missing?



Edit:



I've also attempted the same (again in Chrome) with video.js, and am seeing similar behavior - in particular, when I manually update the backing stream.m3u8 (with no #EXT-X-ENDLIST tag), videojs never picks up the new changes to the live stream, and just buffers/hangs indefinitely.


<video id="video1" class="video-js vjs-default-skin" muted="muted" controls="controls">
  <source type="application/x-mpegURL" src="/video/stream.m3u8">
</video>

...

<script>
  var player = videojs('video1');
  player.play();
</script>




For example, if I start with this initial version of stream.m3u8:


#EXTM3U
#EXT-X-PLAYLIST-TYPE:EVENT
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:8
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:4.000000,
stream0.ts
#EXTINF:4.000000,
stream1.ts
#EXTINF:2.000000,
stream2.ts




and then manually update it server-side to this:



#EXTM3U
#EXT-X-PLAYLIST-TYPE:EVENT
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:8
#EXT-X-MEDIA-SEQUENCE:3
#EXTINF:4.000000,
stream3.ts
#EXTINF:4.000000,
stream4.ts
#EXTINF:3.000000,
stream5.ts




the video.js control just buffers indefinitely after only playing the first 3 segments (stream*.ts 0-2), which isn't what I'd expect to happen (I'd expect it to continue playing stream*.ts 3-5 once stream.m3u8 is updated and video.js makes a request for the latest version of the playlist).
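For reference, the manual playlist rewrite described above is the step I'd eventually script; here is a minimal sketch of it, assuming the segment names and durations come from whatever produced the .ts files (no #EXT-X-ENDLIST is written, so the playlist stays live):

def write_live_playlist(path, media_sequence, segments, target_duration=8):
    """Write a live-style playlist for the given (filename, duration) pairs."""
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        "#EXT-X-TARGETDURATION:%d" % target_duration,
        "#EXT-X-MEDIA-SEQUENCE:%d" % media_sequence,
    ]
    for name, duration in segments:
        lines.append("#EXTINF:%.6f," % duration)
        lines.append(name)
    # Deliberately no #EXT-X-ENDLIST, so players keep treating this as a live stream.
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")

# write_live_playlist("video/stream.m3u8", 3,
#                     [("stream3.ts", 4.0), ("stream4.ts", 4.0), ("stream5.ts", 3.0)])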

-
libvpxenc: add static-thresh private option
8 October 2014, by Anton Khirnov
libvpxenc: add static-thresh private option
Currently, this option is accessed through AVCodecContext.mb_threshold,
which originally controlled reusing MB data when transcoding mpeg to
mpeg. Since the libvpx meaning is completely different from the original
mpegvideo meaning, it is better to use a separate private option for
this.
vaapi_h264 : Fix bit offset of slice data.
2 avril 2016, par Mark Thompsonvaapi_h264 : Fix bit offset of slice data.
Commit ca2f19b9cc37be509d85f05c8f902860475905f8 modified the meaning of
H264SliceContext.gb : it is now initialised at the start of the NAL unit
header, rather than at the start of the slice header. The VAAPI slice
decoder uses the offset after parsing to determine the offset of the
slice data in the bitstream, so with the changed meaning we no longer
need to add the extra byte to account for the NAL unit header because
it is now included directly.Signed-off-by : Derek Buitenhuis <derek.buitenhuis@gmail.com>