
Other articles (53)

-
Permissions overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors can edit their own information on the authors page
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet? Yes, if your installed MediaSPIP is at version 0.2 or higher; if needed, contact your MediaSPIP administrator to find out.
-
Support for all media types
10 April 2011
Unlike many modern document-sharing applications and platforms, MediaSPIP aims to handle as many different document formats as possible, whether image (png, gif, jpg, bmp and others), audio (MP3, Ogg, Wav and others), video (Avi, MP4, Ogv, mpg, mov, wmv and others), or textual content, code and more (OpenOffice, Microsoft Office (spreadsheets, presentations), web (HTML, CSS), LaTeX, Google Earth) (...)

On other sites (10509)
-
Difference between DirectShowSource() and FFmpegSource2() in AviSynth
29 March 2024, by MarianD
For non-.avi A/V sources (such as .mp3, .mp4, etc.) there are (at least) two possibilities for reading those media files in AviSynth (in Windows):

- The built-in media filter DirectShowSource(), which uses Microsoft's DirectShow media architecture.
- The AviSynth plugin FFmpegSource2(), alias FFMS2(), which uses FFmpeg and nothing else.

What are the advantages and disadvantages of each?

Which is more reliable, frame/sample accurate, etc.?

-
Why is chrominance lost when I OpenSharedResource from an ffmpeg AVFrame resource?
19 April 2022, by Logoro
// Describe a shareable NV12 texture on my_device
D3D11_TEXTURE2D_DESC texture_desc = {0};
texture_desc.Width = 640;
texture_desc.Height = 480;
texture_desc.MipLevels = 1;
texture_desc.Format = DXGI_FORMAT_NV12;
texture_desc.SampleDesc.Count = 1;
texture_desc.ArraySize = 1;
texture_desc.Usage = D3D11_USAGE_DEFAULT;
texture_desc.MiscFlags = D3D11_RESOURCE_MISC_SHARED;

Microsoft::WRL::ComPtr<ID3D11Texture2D> temp_texture_for_my_device{nullptr};
my_device->CreateTexture2D(&texture_desc, NULL, &temp_texture_for_my_device);

// Get a shared handle so the ffmpeg device can open the same texture
Microsoft::WRL::ComPtr<IDXGIResource> dxgi_resource{nullptr};
temp_texture_for_my_device.As(&dxgi_resource);
HANDLE shared_handle = NULL;
dxgi_resource->GetSharedHandle(&shared_handle);
dxgi_resource->Release();

// Open the shared texture on the ffmpeg device and copy the decoded frame into it
// (for D3D11VA frames, data[0] is the ID3D11Texture2D* and data[1] is the array slice index)
Microsoft::WRL::ComPtr<ID3D11Texture2D> temp_texture_for_ffmpeg_device{nullptr};
ffmpeg_device->OpenSharedResource(shared_handle, __uuidof(ID3D11Texture2D), (void**)temp_texture_for_ffmpeg_device.GetAddressOf());
ffmpeg_device_context->CopySubresourceRegion(temp_texture_for_ffmpeg_device.Get(), 0, 0, 0, 0, (ID3D11Texture2D*)ffmpeg_avframe->data[0], (int)ffmpeg_avframe->data[1], NULL);
ffmpeg_device_context->Flush();


When I copy temp_texture_for_ffmpeg_device to a D3D11_USAGE_STAGING texture, it looks normal, but when I copy temp_texture_for_my_device to a D3D11_USAGE_STAGING texture, the chrominance data is lost.


When I map the textures to the CPU via D3D11_USAGE_STAGING:


temp_texture_for_ffmpeg_device: RowPitch is 768, DepthPitch is 768 * 720.
temp_texture_for_my_device: RowPitch is 1024, DepthPitch is 1024 * 480.


I think some parameters differ between the two devices (or device contexts?), but I don't know which parameters would cause such a difference in behavior.


my_device and my_device_context are created by D3D11On12CreateDevice.

-
How to HLS-live-stream incoming batches of individual frames, "appending" to an m3u8 playlist in real time, with ffmpeg?
20 November 2024, by Rob
My overall goal:



Server-side:



- I have batches of sequential, JPEG-encoded frames (8-16) arriving from time to time, generated at roughly 2 FPS.
- I would like to host an HLS live stream, where, when a new batch of frames arrives, I encode those new frames as h264 .ts segments with ffmpeg, and have the new .ts segments automatically added to an HLS stream (e.g. .m3u8 file).







Client/browser-side:



- When the .m3u8 is updated, I would like the video stream being watched to simply "continue", advancing from the point where new .ts segments have been added.
- I do not need the user to scrub backwards in time; the client just needs to support live observation of the stream.










My current approach:



Server-side:



To generate the "first" few segments of the stream, I'm attempting the below (just command-line for now to get ffmpeg working right, but ultimately will be automated via a Python script) :



For reference, I'm using ffmpeg version 3.4.6-0ubuntu0.18.04.1.



ffmpeg -y -framerate 2 -i /frames/batch1/frame_%d.jpg \
 -c:v libx264 -crf 21 -preset veryfast -g 2 \
 -f hls -hls_time 4 -hls_list_size 4 -segment_wrap 4 -segment_list_flags +live video/stream.m3u8




where the /frames/batch1/ folder contains a sequence of frames (e.g. frame_01.jpg, frame_02.jpg, etc.). This already doesn't appear to work correctly, because it keeps adding #EXT-X-ENDLIST to the end of the .m3u8 file, which as I understand it is not correct for a live HLS stream. Here's what that generates:


#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:4
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:4.000000,
stream0.ts
#EXTINF:4.000000,
stream1.ts
#EXTINF:2.000000,
stream2.ts
#EXT-X-ENDLIST




I can't figure out how to suppress #EXT-X-ENDLIST here - this is problem #1.
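
As an aside on problem #1: ffmpeg's hls muxer has flags aimed at exactly this. -hls_flags omit_endlist suppresses the #EXT-X-ENDLIST tag, and -hls_flags append_list tells a later run to re-read an existing playlist and append to it instead of overwriting it. Whether a 3.4-era build exposes both is worth checking with ffmpeg -h muxer=hls, so treat the sketch below (in Python, since the plan is to script this eventually anyway; encode_batch is a hypothetical helper) as something to verify rather than a guaranteed fix:

# Sketch: encode one batch of JPEG frames and append the resulting .ts segments to the
# live playlist, assuming the hls muxer supports append_list/omit_endlist
# (check with: ffmpeg -h muxer=hls).
import subprocess

def encode_batch(frames_pattern, start_number, playlist="video/stream.m3u8"):
    cmd = [
        "ffmpeg", "-y",
        "-framerate", "2",
        "-start_number", str(start_number),   # first frame number of this batch
        "-i", frames_pattern,                 # e.g. "/frames/batch2/frame_%d.jpg"
        "-c:v", "libx264", "-crf", "21", "-preset", "veryfast", "-g", "2",
        "-f", "hls",
        "-hls_time", "4",
        "-hls_list_size", "0",                        # keep every segment listed
        "-hls_flags", "append_list+omit_endlist",     # append, and never write #EXT-X-ENDLIST
        "-hls_segment_filename", "video/stream%d.ts",
        playlist,
    ]
    subprocess.run(cmd, check=True)

# e.g. encode_batch("/frames/batch1/frame_%d.jpg", 1)
#      encode_batch("/frames/batch2/frame_%d.jpg", 20)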


Then, to generate subsequent segments (e.g. when new frames become available), I'm trying this:



ffmpeg -y -framerate 2 -start_number 20 -i /frames/batch2/frame_%d.jpg \
 -c:v libx264 -crf 21 -preset veryfast -g 2 \
 -f hls -hls_time 4 -hls_list_size 4 -segment_wrap 4 -segment_list_flags +live video/stream.m3u8




Unfortunately, this does not work the way I want it to. It simply overwrites stream.m3u8: it does not advance #EXT-X-MEDIA-SEQUENCE, it does not index the new .ts files correctly, and it also includes the undesirable #EXT-X-ENDLIST. This is the output of that command:


#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:4
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:4.000000,
stream0.ts
#EXTINF:4.000000,
stream1.ts
#EXTINF:3.000000,
stream2.ts
#EXT-X-ENDLIST




Fundamentally, I can't figure out how to "append" to an existing .m3u8 in a way that makes sense for HLS live streaming. That's essentially problem #2.
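
If the hls muxer flags turn out not to be available, another angle on problem #2 is to treat the playlist as what it is, a small text file, and maintain it from the controlling script: keep a sliding window of the newest segments, advance #EXT-X-MEDIA-SEQUENCE by however many segments dropped off the front, and simply never write #EXT-X-ENDLIST while the stream is live. A rough sketch (write_live_playlist is a hypothetical helper, not part of any library), assuming the script already tracks each segment's filename and duration:

# Hand-rolled live playlist writer (problem #2). `segments` is the full ordered list of
# (filename, duration) pairs produced so far; only the newest `window` entries are
# published, and #EXT-X-MEDIA-SEQUENCE advances as old entries age out of the window.
import math

def write_live_playlist(path, segments, window=4):
    visible = segments[-window:]
    media_sequence = len(segments) - len(visible)
    target_duration = max(1, math.ceil(max(d for _, d in visible)))
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{target_duration}",
        f"#EXT-X-MEDIA-SEQUENCE:{media_sequence}",
    ]
    for name, duration in visible:
        lines.append(f"#EXTINF:{duration:.6f},")
        lines.append(name)
    # No #EXT-X-ENDLIST while the stream is live; write it once only when the stream ends.
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")

# e.g. write_live_playlist("video/stream.m3u8",
#                          [("stream0.ts", 4.0), ("stream1.ts", 4.0), ("stream2.ts", 2.0)])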


For hosting the stream, I'm using a simple Flask app - which appears to be working the way I intend - here's what I'm doing for reference:



@app.route('/video/<file_name>')
def stream(file_name):
    video_dir = './video'
    return send_from_directory(directory=video_dir, filename=file_name)
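
One small serving detail worth ruling out (an assumption on my part, not something the question confirms is wrong): send_from_directory guesses the Content-Type from the file extension via Python's standard mimetypes module, and .m3u8/.ts are not registered on every system. Registering them once at startup is a cheap safeguard in case a player turns out to be picky about application/vnd.apple.mpegurl:

# Optional: make sure HLS files get the expected MIME types when served by Flask.
import mimetypes

mimetypes.add_type("application/vnd.apple.mpegurl", ".m3u8")
mimetypes.add_type("video/mp2t", ".ts")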




Client-side:



I'm trying HLS.js in Chrome - it basically boils down to this:



<video id="video1"></video>

...

<code class="echappe-js"><script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>

<script>&#xA; var video = document.getElementById(&#x27;video1&#x27;);&#xA; if (Hls.isSupported()) {&#xA; var hls = new Hls();&#xA; hls.loadSource(&#x27;/video/stream.m3u8&#x27;);&#xA; hls.attachMedia(video);&#xA; hls.on(Hls.Events.MANIFEST_PARSED, function() {&#xA; video.play();&#xA; });&#xA; }&#xA; else if (video.canPlayType(&#x27;application/vnd.apple.mpegurl&#x27;)) {&#xA; video.src = &#x27;/video/stream.m3u8&#x27;;&#xA; video.addEventListener(&#x27;loadedmetadata&#x27;, function() {&#xA; video.play();&#xA; });&#xA; }&#xA;</script>




I'd like to think that what I'm trying to do doesn't require a more complex approach than the above, but since what I've tried so far definitely isn't working, I'm starting to think I need to come at this from a different angle. Any ideas on what I'm missing?



Edit:



I've also attempted the same (again in Chrome) with video.js, and am seeing similar behavior - in particular, when I manually update the backing stream.m3u8 (with no #EXT-X-ENDLIST tag), video.js never picks up the new changes to the live stream, and just buffers/hangs indefinitely.


<video class="video-js vjs-default-skin" muted="muted" controls="controls">
 <source type="application/x-mpegURL" src="/video/stream.m3u8">
</source></video>

...

<code class="echappe-js"><script>&#xA; var player = videojs(&#x27;video1&#x27;);&#xA; player.play();&#xA;</script>




For example, if I start with this initial version of stream.m3u8:


#EXTM3U
#EXT-X-PLAYLIST-TYPE:EVENT
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:8
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:4.000000,
stream0.ts
#EXTINF:4.000000,
stream1.ts
#EXTINF:2.000000,
stream2.ts




and then manually update it server-side to this:



#EXTM3U
#EXT-X-PLAYLIST-TYPE:EVENT
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:8
#EXT-X-MEDIA-SEQUENCE:3
#EXTINF:4.000000,
stream3.ts
#EXTINF:4.000000,
stream4.ts
#EXTINF:3.000000,
stream5.ts




the video.js control just buffers indefinitely after only playing the first 3 segments (stream*.ts 0-2), which isn't what I'd expect to happen (I'd expect it to continue playing stream*.ts 3-5 once stream.m3u8 is updated and video.js makes a request for the latest version of the playlist).
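
One hedged reading of that manual test: with #EXT-X-PLAYLIST-TYPE:EVENT, the HLS spec only allows segments to be appended to the end of the playlist, so replacing stream0-2 with stream3-5 and jumping #EXT-X-MEDIA-SEQUENCE from 0 to 3 is exactly the kind of update an EVENT playlist must not make (a sliding window like that belongs to a plain live playlist with no playlist-type tag, as in the earlier sketch). An append-only update keeps the old entries and the original media sequence; a minimal sketch (append_to_event_playlist is again a hypothetical helper):

# Sketch: append-only update for an EVENT playlist. Existing segments are kept,
# #EXT-X-MEDIA-SEQUENCE stays put, and only new #EXTINF entries are added at the end.
def append_to_event_playlist(path, new_segments):
    # new_segments: list of (filename, duration) pairs, e.g. [("stream3.ts", 4.0)]
    with open(path) as f:
        lines = [line.rstrip("\n") for line in f if line.strip()]
    # An EVENT playlist only gains #EXT-X-ENDLIST once the event is over.
    lines = [line for line in lines if line != "#EXT-X-ENDLIST"]
    for name, duration in new_segments:
        lines.append(f"#EXTINF:{duration:.6f},")
        lines.append(name)
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")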