
Media (2)
-
Granite de l’Aber Ildut
9 September 2011, by
Updated: September 2011
Language: French
Type: Text
-
Geodiversity
9 September 2011, by
Updated: August 2018
Language: French
Type: Text
Other articles (42)
-
Permissions overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page
-
Customizing by adding your logo, banner or background image
5 September 2013, by
Some themes take three customization elements into account: adding a logo; adding a banner; adding a background image.
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed MédiaSpip is at version 0.2 or higher. If needed, contact your MédiaSpip administrator to find out.
On other sites (7783)
-
Is there a way to preload low quality in MPEG-DASH or HLS so there is always something to play?
5 March 2018, by Daniel Benedykt
I created HLS manifests and DASH manifests with different resolutions and bitrates.
The videos almost always start at low quality, then pick up in quality/resolution.
But later, if the Internet speed drops, the video stops because it is playing the high-resolution video.
Is there a way to preload low quality in MPEG-DASH or HLS so there is always something to play?
Edit: I have the video preloaded. It's not live streaming, but once it starts playing I am looking for a consistent stream, even if that means sacrificing quality.
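One packaging-side idea, offered only as a sketch and not taken from the question: put the lowest-bitrate rendition first in the HLS master playlist, since some players (Apple's in particular) start playback with the first variant listed before adaptive switching kicks in. With ffmpeg's hls muxer that ordering can be controlled through var_stream_map; the input name, bitrates, sizes and file names below are illustrative assumptions.

# Sketch: two HLS renditions, with the low-bitrate variant listed first in the
# master playlist so it is what gets buffered before the player switches up.
# input.mp4, the bitrates and the file names are assumptions, not from the question.
ffmpeg -i input.mp4 \
  -map 0:v -map 0:a -map 0:v -map 0:a \
  -c:v libx264 -c:a aac \
  -b:v:0 400k -s:v:0 640x360 \
  -b:v:1 3000k -s:v:1 1920x1080 \
  -f hls -hls_time 4 -hls_playlist_type vod \
  -hls_segment_filename "stream_%v_%03d.ts" \
  -master_pl_name master.m3u8 \
  -var_stream_map "v:0,a:0 v:1,a:1" \
  stream_%v.m3u8

The other half of the problem (not stalling when throughput drops) lives in the player: a larger buffer target or a bitrate cap is configured there, and the knobs differ between dash.js, hls.js and native players.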
-
MPEG-DASH create initialization segment
3 November 2014, by static
I'm segmenting the desktop video capture using ffmpeg -segment and sending the segments over the network so they can be served to clients and played with dash.js. The problem is that the player looks for an initialization segment, and I can't figure out how to create it.
I create the segments using this ffmpeg command:
ffmpeg -rtbufsize 1500M -f dshow -r 15 -i video="UScreenCapture" \
  -flags +global_header -vcodec libvpx -crf 10 -quality good -keyint_min 15 -g 15 \
  -cpu-used 3 -b:v 1000k -qmin 10 -qmax 42 -threads 2 -vf scale=-1:480 -bufsize 1500 \
  -map 0 -f stream_segment -segment_time 2 -segment_format webm http://localhost:3000/stream/22/%03d
The manifest that I create for the stream looks something like this:
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="dynamic" availabilityStartTime="2014-06-19T07:47:40.079Z" minBufferTime="PT0S" profiles="urn:mpeg:dash:profile:isoff-live:2011" suggestedPresentationDelay="PT40S" maxSegmentDuration="PT2.000S" minimumUpdatePeriod="PT1000M">
  <Period bitstreamSwitching="true" start="PT0S">
    <AdaptationSet mimeType="video/webm" segmentAlignment="true" startWithSAP="1" maxWidth="1280" maxHeight="720" maxFrameRate="15">
      <ContentComponent contentType="video"/>
      <SegmentTemplate presentationTimeOffset="0" timescale="90000" media="$Number$/" duration="180000" startNumber="0"/>
      <Representation width="853" height="480" frameRate="15" bandwidth="1000000" codecs="vp8"/>
    </AdaptationSet>
  </Period>
</MPD>
The player in debugging mode prints the following:
Getting the request for time: 0 dash.all.js:2073
Index for time 0 is 0 dash.all.js:2073
Waiting for more video buffer before starting playback. dash.all.js:2073
BufferController video seek: 0 dash.all.js:2073
Marking a special seek for initial video playback. dash.all.js:2073
Start searching for initialization. dash.all.js:2073
Perform init search: stream/22/ dash.all.js:2073
Getting the request for time: 0 dash.all.js:2073
Index for time 0 is 0 dash.all.js:2073
Data changed - loading the video fragment for time: 0 dash.all.js:2073
Getting the request for time: 0
How can I create the initialization segment for the generated segments? I can't seem to get it to work.
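For WebM segments like these, one way to get an initialization segment, sketched here rather than taken from the question, is ffmpeg's webm_chunk muxer: it writes the initialization data to a separate header file and the media to numbered chunk files. The capture and encoding options below are copied from the question; the output file names are assumptions.

# Sketch: same capture as above, but muxed with webm_chunk so that an
# initialization chunk (header) is written separately from the media chunks.
# stream_init.webm and stream_%d.chk are illustrative names.
ffmpeg -rtbufsize 1500M -f dshow -r 15 -i video="UScreenCapture" \
  -flags +global_header -vcodec libvpx -crf 10 -quality good \
  -keyint_min 15 -g 15 -cpu-used 3 -b:v 1000k -qmin 10 -qmax 42 \
  -threads 2 -vf scale=-1:480 \
  -f webm_chunk -header stream_init.webm -chunk_start_index 1 stream_%d.chk

The MPD would then need to point at that header, for example by adding an initialization attribute to the SegmentTemplate, so dash.js stops searching for an init segment it cannot find.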
-
Live WebRTC streams (getUserMedia) to DASH using WebM
4 September 2015, by cyp
I'm trying to understand the feasibility of a live streaming solution.
I want to grab WebRTC streams (audio and video), send them to a server, and turn them into chunks to feed an HTML5 video tag or a DASH player, using the WebM container (VP8 and Opus codecs). I also looked into ffmpeg, ffserver and gstreamer but...
My question is how to feed in the WebRTC streams (live) and transform them into HTTP chunks (live DASH compatible)?
Has anyone achieved something like this?
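As a rough sketch only, and assuming the WebRTC tracks have already been bridged out of the browser to plain RTP by a media gateway (ffmpeg does not terminate WebRTC itself): a recent ffmpeg can read that RTP session from an SDP description and package it live into DASH with WebM segments. The SDP file name, bitrates and output path below are assumptions, and the dash muxer option names vary between ffmpeg versions.

# Sketch: webrtc.sdp is assumed to describe RTP streams produced by a
# WebRTC-to-RTP gateway; ffmpeg then re-encodes to VP8/Opus and writes a
# live DASH manifest plus WebM segments for dash.js or an HTML5 player.
ffmpeg -protocol_whitelist file,udp,rtp -i webrtc.sdp \
  -c:v libvpx -b:v 1000k -g 30 -c:a libopus -b:a 64k \
  -f dash -dash_segment_type webm \
  -seg_duration 2 -window_size 5 -streaming 1 \
  /var/www/live/manifest.mpd

The hard part in practice is the WebRTC-to-RTP bridge itself; ffmpeg (or gstreamer) only covers the re-encoding and DASH packaging step here.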