
Other articles (68)
-
MediaSPIP 0.1 Beta version
25 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...) -
Improving the base version
13 September 2013
Nice multiple selection
The Chosen plugin improves the ergonomics of multiple-selection fields. See the two images that follow to compare.
To benefit from it, simply activate the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to improve, for example select[multiple] for multiple-selection lists (...)
On other sites (10363)
-
MPEG-DASH - Multiplexed Representations Issue
26 April 2017, by Mike
I'm trying to learn ffmpeg, MP4Box, and MPEG-DASH, but I'm running into an issue with the .mp4 I'm using. I'm using ffmpeg to demux the mp4 with this command:
ffmpeg -i test.mp4 -c:v copy -g 72 -an video.mp4 -c:a copy audio.mp4
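Annotated, that command reads as follows (a gloss added for clarity; note that nothing here strips the video stream from the second output, which turns out to matter later):

# -c:v copy  copy the video stream without re-encoding
# -g 72      GOP setting (has no effect here, since stream copy skips the encoder)
# -an        drop audio from the first output (video.mp4)
# audio.mp4 copies the audio, but still inherits the video stream as well
ffmpeg -i test.mp4 -c:v copy -g 72 -an video.mp4 -c:a copy audio.mp4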
Once the two files are created, I use MP4Box to segment the files for the dash player using this command:
MP4Box -dash 4000 -frag 1000 -rap -segment-name segment_ output.mp4
Which does create all the files I think I need. Then I point the player to the output_dash.mpd and nothing happens except a ton of messages in the console:
[8] EME detected on this user agent! (ProtectionModel_21Jan2015)
[11] Playback Initialized
[21] [dash.js 2.3.0] MediaPlayer has been initialized
[64] Parsing complete: ( xml2json: 3.42ms, objectiron: 2.61ms, total: 0.00603s)
[65] Manifest has been refreshed at Wed Apr 12 2017 12:16:52 GMT-0600 (MDT)[1492021012.196]
[72] MediaSource attached to element. Waiting on open...
[77] MediaSource is open!
[77] Duration successfully set to: 148.34
[78] Added 0 inline events
[78] No video data.
[79] No audio data.
[79] No text data.
[79] No fragmentedText data.
[79] No embeddedText data.
[80] Multiplexed representations are intentionally not supported, as they are not compliant with the DASH-AVC/264 guidelines
[81] No streams to play.

Here is the MP4Box -info on the video I'm using:
* Movie Info *
Timescale 1000 - Duration 00:02:28.336
Fragmented File no - 2 track(s)
File suitable for progressive download (moov before mdat)
File Brand mp42 - version 512
Created: GMT Wed Feb 6 06:28:16 2036
File has root IOD (9 bytes)
Scene PL 0xff - Graphics PL 0xff - OD PL 0xff
Visual PL: Not part of MPEG-4 Visual profiles (0xfe)
Audio PL: Not part of MPEG-4 audio profiles (0xfe)
No streams included in root OD
iTunes Info:
Name: Rogue One - A Star Wars Story
Artist: Lucasfilm
Genre: Trailer
Created: 2016
Encoder Software: HandBrake 0.10.2 2015060900
Cover Art: JPEG File
Track # 1 Info - TrackID 1 - TimeScale 90000 - Duration 00:02:28.335
Media Info: Language "Undetermined" - Type "vide:avc1" - 3552 samples
Visual Track layout: x=0 y=0 width=1920 height=816
MPEG-4 Config: Visual Stream - ObjectTypeIndication 0x21
AVC/H264 Video - Visual Size 1920 x 816
AVC Info: 1 SPS - 1 PPS - Profile High @ Level 4.1
NAL Unit length bits: 32
Pixel Aspect Ratio 1:1 - Indicated track size 1920 x 816
Self-synchronized
Track # 2 Info - TrackID 2 - TimeScale 44100 - Duration 00:02:28.305
Media Info: Language "English" - Type "soun:mp4a" - 6387 samples
MPEG-4 Config: Audio Stream - ObjectTypeIndication 0x40
MPEG-4 Audio MPEG-4 Audio AAC LC - 2 Channel(s) - SampleRate 44100
Synchronized on stream 1
Alternate Group ID 1

I know I need to separate the video and audio and I think that's where my issue is. The command I'm using probably isn't doing the right thing.
Is there a better command to demux my mp4?
Is the MP4Box command I'm using best for segmenting the files?
If I use different files, will they always need to be demuxed?

One thing to mention: if I use the following commands everything works fine, but there is no audio because of the -an flag, which means it's video only:

ffmpeg -i test.mp4 -c:v copy -g 72 -an output.mp4
MP4Box -dash 4000 -frag 1000 -rap -segment-name segment_ output.mp4

UPDATE
I noticed that the video had no audio stream, but the audio had the video stream, which is why I got the mux error. I thought that might be the issue, so I ran this command to keep the unwanted streams out of the outputs:
ffmpeg -i test.mp4 -c:v copy -g 72 -an video.mp4 -c:a copy -vn audio.mp4
Then I run:
MP4Box -dash 4000 -frag 1000 -rap -segment-name segment_ video.mp4 audio.mp4
Now I no longer get the "Multiplexed representations are intentionally not supported..." message, but instead I get:
[122] Video Element Error: MEDIA_ERR_SRC_NOT_SUPPORTED
[123] [object MediaError]
[125] Schedule controller stopping for audio
[126] Caught pending play exception - continuing (NotSupportedError: Failed to load because no supported source was found.)

I tried playing the video and audio independently through Chrome and they both work, just not through the dash player. Ugh, this is painful to learn, but I feel like I'm making progress.
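As an aside, newer MP4Box builds can select individual tracks straight from the muxed source, which would avoid the intermediate demux step entirely. A minimal sketch, assuming the installed GPAC version supports the #video/#audio track selectors:

# dash the video track and the audio track of test.mp4 as separate representations
MP4Box -dash 4000 -frag 1000 -rap -segment-name segment_ test.mp4#video test.mp4#audio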
-
Dash.js not playing mpd files made with ffmpeg
31 December 2022, by Macster
I'm using ffmpeg to create chunks and a manifest of a webm file which I want to live stream with Dash.js. Unfortunately, Dash.js won't play the mpd file, no matter which way I create the chunks and manifest. However, the sample mpd URL from Dash.js works.


Commands


ffmpeg -re -r 25 -i Dash/strm.webm \
 -map 0:v:0 \
 -pix_fmt yuv420p \
 -c:v libvpx \
 -s 640x480 -keyint_min 60 -g 60 -speed 6 -tile-columns 4 -frame-parallel 1 -threads 8 -static-thresh 0 -max-intra-rate 300 -deadline realtime -lag-in-frames 0 -error-resilient 1 \
 -b:v 3000k \
 -f webm_chunk \
 -header "Dash/glass_360.hdr" \
 -chunk_start_index 1 Dash/glass_360_%d.chk \
 -map 0:a:0 \
 -c:a libvorbis \
 -b:a 128k -ar 44100 \
 -f webm_chunk \
 -audio_chunk_duration 2000 \
 -header Dash/glass_171.hdr \
 -chunk_start_index 1 Dash/glass_171_%d.chk


# Manifest
ffmpeg \
 -f webm_dash_manifest -live 1 \
 -i Dash/glass_360.hdr \
 -f webm_dash_manifest -live 1 \
 -i Dash/glass_171.hdr \
 -c copy \
 -map 0 -map 1 \
 -f webm_dash_manifest -live 1 \
 -adaptation_sets "id=0,streams=0 id=1,streams=1" \
 -chunk_start_index 1 \
 -chunk_duration_ms 2000 \
 -time_shift_buffer_depth 7200 \
 -minimum_update_period 7200 Dash/glass_video_manifest.mpd
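As a side check, the header files that feed this manifest step can be inspected with ffprobe via the same demuxer (a sketch under the assumption that ffprobe is built with it, which the log below suggests):

# should list the vp8 stream written by the chunking command above
ffprobe -f webm_dash_manifest Dash/glass_360.hdr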



Manifest output


ffmpeg version git-2020-05-27-8b5ffae Copyright (c) 2000-2020 the FFmpeg developers
 built with gcc 9.3.1 (GCC) 20200523
 configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libdav1d --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libsrt --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa --enable-libspeex --enable-libxvid --enable-libaom --disable-w32threads --enable-libmfx --enable-ffnvcodec --enable-cuda-llvm --enable-cuvid --enable-d3d11va --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth --enable-libopenmpt --enable-amf
 libavutil 56. 49.100 / 56. 49.100
 libavcodec 58. 87.101 / 58. 87.101
 libavformat 58. 43.100 / 58. 43.100
 libavdevice 58. 9.103 / 58. 9.103
 libavfilter 7. 83.100 / 7. 83.100
 libswscale 5. 6.101 / 5. 6.101
 libswresample 3. 6.100 / 3. 6.100
 libpostproc 55. 6.100 / 55. 6.100
Input #0, webm_dash_manifest, from 'Dash/glass_360.hdr':
 Metadata:
 ENCODER : Lavf58.43.100
 Duration: N/A, bitrate: N/A
 Stream #0:0(eng): Video: vp8, yuv420p, 640x480, SAR 1:1 DAR 4:3, 1k tbr, 1k tbn, 1k tbc (default)
 Metadata:
 ALPHA_MODE : 1
 ENCODER : Lavc58.87.101 libvpx
 webm_dash_manifest_file_name: glass_360.hdr
 webm_dash_manifest_track_number: 1
Input #1, webm_dash_manifest, from 'Dash/glass_171.hdr':
 Metadata:
 ENCODER : Lavf58.43.100
 Duration: N/A, bitrate: N/A
 Stream #1:0(eng): Audio: vorbis, 44100 Hz, mono, fltp (default)
 Metadata:
 ENCODER : Lavc58.87.101 libvorbis
 webm_dash_manifest_file_name: glass_171.hdr
 webm_dash_manifest_track_number: 1
Output #0, webm_dash_manifest, to 'Dash/glass_video_manifest.mpd':
 Metadata:
 encoder : Lavf58.43.100
 Stream #0:0(eng): Video: vp8, yuv420p, 640x480 [SAR 1:1 DAR 4:3], q=2-31, 1k tbr, 1k tbn, 1k tbc (default)
 Metadata:
 ALPHA_MODE : 1
 ENCODER : Lavc58.87.101 libvpx
 webm_dash_manifest_file_name: glass_360.hdr
 webm_dash_manifest_track_number: 1
 Stream #0:1(eng): Audio: vorbis, 44100 Hz, mono, fltp (default)
 Metadata:
 ENCODER : Lavc58.87.101 libvorbis
 webm_dash_manifest_file_name: glass_171.hdr
 webm_dash_manifest_track_number: 1
Stream mapping:
 Stream #0:0 -> #0:0 (copy)
 Stream #1:0 -> #0:1 (copy)
Press [q] to stop, [?] for help
frame= 0 fps=0.0 q=-1.0 Lsize= 1kB time=00:00:00.00 bitrate=N/A speed= 0x
video:0kB audio:0kB subtitle:0kB other streams:0kB global headers:4kB muxing overhead: unknown



Manifest file (glass_video_manifest.mpd)

I tried to delete the ContentComponent element, as suggested in other questions, but it didn't work.

<?xml version="1.0" encoding="UTF-8"?>

<Period start="PT0S">
<AdaptationSet mimeType="video/webm" codecs="vp8" lang="eng" bitstreamSwitching="true" subsegmentAlignment="true" subsegmentStartsWithSAP="1">
<ContentComponent type="video"/>
<SegmentTemplate timescale="1000" duration="2000" media="glass_$RepresentationID$_$Number$.chk" startNumber="1" initialization="glass_$RepresentationID$.hdr"/>
<Representation id="360" bandwidth="1000000" width="640" height="480" codecs="vp8" mimeType="video/webm" startsWithSAP="1"/>
</AdaptationSet>
<AdaptationSet mimeType="audio/webm" codecs="vorbis" lang="eng" bitstreamSwitching="true" subsegmentAlignment="true" subsegmentStartsWithSAP="1">
<ContentComponent type="audio"/>
<SegmentTemplate timescale="1000" duration="2000" media="glass_$RepresentationID$_$Number$.chk" startNumber="1" initialization="glass_$RepresentationID$.hdr"/>
<Representation id="171" bandwidth="128000" audioSamplingRate="44100" codecs="vorbis" mimeType="audio/webm" startsWithSAP="1"/>
</AdaptationSet>
</Period>




Dash.js-Player


(function() {
  // var url = "https://dash.akamaized.net/envivio/EnvivioDash3/manifest.mpd";
  var url = "http://localhost:8081/videos/Dash/glass_live_manifest.mpd";
  var player = dashjs.MediaPlayer().create();

  // config
  targetLatency = 2.0;        // Lowering this value will lower latency but may decrease the player's ability to build a stable buffer.
  minDrift = 0.05;            // Minimum latency deviation allowed before activating catch-up mechanism.
  catchupPlaybackRate = 0.5;  // Maximum catch-up rate, as a percentage, for low latency live streams.
  stableBuffer = 2;           // The time that the internal buffer target will be set to post startup/seeks (NOT top quality).
  bufferAtTopQuality = 2;     // The time that the internal buffer target will be set to once playing the top quality.

  player.updateSettings({
    'streaming': {
      'liveDelay': 2,
      'liveCatchUpMinDrift': 0.05,
      'liveCatchUpPlaybackRate': 0.5,
      'stableBufferTime': 2,
      'bufferTimeAtTopQuality': 2,
      'bufferTimeAtTopQualityLongForm': 2,
      'bufferToKeep': 2,
      'bufferAheadToKeep': 2,
      'lowLatencyEnabled': true,
      'fastSwitchEnabled': true,
      'abr': {
        'limitBitrateByPortal': true
      },
    }
  });

  console.log(player.getSettings());

  setInterval(() => {
    console.log('Live latency= ', player.getCurrentLiveLatency());
    console.log('Buffer length= ', player.getBufferLength('video'));
  }, 3000);

  player.initialize(document.querySelector("#videoPlayer"), url, true);
})();



Chrome


{debug: {…}, streaming: {…}}
dash.all.min.js:2 XHR finished loading: GET "http://localhost:8081/videos/Dash/glass_live_manifest.mpd".
load @ dash.all.min.js:2
C @ dash.all.min.js:2
load @ dash.all.min.js:2
load @ dash.all.min.js:2
load @ dash.all.min.js:2
load @ dash.all.min.js:2
se @ dash.all.min.js:2
te @ dash.all.min.js:2
initialize @ dash.all.min.js:2
(anonymous) @ Dash:92
(anonymous) @ Dash:94
DevTools failed to load SourceMap: Could not parse content for http://localhost:8081/js/dash.all.min.js.map: Cannot read property 'length' of undefined
Dash:88 Live latency= NaN
Dash:89 Buffer length= NaN
Dash:88 Live latency= NaN
Dash:89 Buffer length= NaN
Dash:88 Live latency= NaN
Dash:89 Buffer length= NaN
Dash:88 Live latency= NaN
Dash:89 Buffer length= NaN
Dash:88 Live latency= NaN
Dash:89 Buffer length= NaN



UPDATE


Well, it seems like the problem in general was that the mpd files wouldn't play from that /dash folder. So I took a look into the code and found a bad route. Anyway, the mpd wouldn't start with the command I was using, probably because it creates a dynamic manifest, as @Markus Schumann says. So I'm going with a new one, which seems to be working for now, but not very well.

ffmpeg -y -re -i strm.webm \
 -c:v libx264 -x264opts "keyint=24:min-keyint=24:no-scenecut" \
 -r 24 -c:a aac -b:a 128k -bf 1 -b_strategy 0 -sc_threshold 0 -pix_fmt yuv420p \
 -map 0:v:0 -map 0:a:0 -map 0:v:0 -map 0:a:0 -map 0:v:0 -map 0:a:0 -b:v:0 250k \
 -filter:v:0 "scale=-2:240" -profile:v:0 baseline -b:v:1 750k \
 -filter:v:1 "scale=-2:480" -profile:v:1 main -b:v:2 1500k \
 -filter:v:2 "scale=-2:720" -profile:v:2 high \
 -use_timeline 1 -use_template 1 -window_size 5 -adaptation_sets "id=0,streams=v id=1,streams=a" \
 -f dash glass_video_manifest.mpd
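If the dynamic manifest really was the blocker, another option would have been to keep the original VP8/Vorbis pipeline and simply drop the -live 1 flags, so the muxer writes a static manifest instead. A sketch under that assumption, reusing the header files from above:

ffmpeg \
 -f webm_dash_manifest -i Dash/glass_360.hdr \
 -f webm_dash_manifest -i Dash/glass_171.hdr \
 -c copy -map 0 -map 1 \
 -f webm_dash_manifest \
 -adaptation_sets "id=0,streams=0 id=1,streams=1" \
 Dash/glass_video_manifest.mpd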



-
Serving static video content directly vs. via adaptive streaming protocols (HLS, DASH)
7 May 2020, by astralmaster
Is there an advantage, in terms of speed, to serving static video content (not a live stream) via adaptive streaming protocols such as HLS or DASH, over serving the files directly from an HTTP server?



Example case: you have a 500MB mp4 (H.264 + AAC) video that you have to serve on a website via an HTML5 video element. Would you rather serve it directly, since most popular browsers can seek without downloading the whole file first; or would you rather use ffmpeg or a similar tool to create HLS chunks from the mp4 file and provide an .m3u8 playlist as the source of the HTML5 video element? Is there a real advantage in terms of speed?
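For concreteness, the repackaging step weighed here needs no re-encoding, since the source is already H.264 + AAC; a minimal sketch (file names are placeholders):

# split input.mp4 into ~6-second MPEG-TS segments plus a VOD playlist, copying both streams
ffmpeg -i input.mp4 -c copy -hls_time 6 -hls_playlist_type vod -hls_segment_filename "seg_%03d.ts" playlist.m3u8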



Which one would you implement if you had hundreds of video files, all served as static content?