Advanced search

Media (91)

Other articles (54)

  • Help translate it

    10 April 2011

    You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
    To do so, we use SPIP’s translation interface, where all of MediaSPIP’s language modules are available. You just need to subscribe to the translators’ mailing list to ask for more information.
    MediaSPIP is currently only available in French and (...)

  • Use it, talk about it, critique it

    10 April 2011

    The first thing to do is to talk about it, either directly with the people involved in its development or with those around you, to convince new people to use it.
    The larger the community, the faster it will evolve...
    A mailing list is available for any exchange between users.

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player has been created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (11314)

  • MPEG-DASH create initialization segment

    3 November 2014, by static

    I’m segmenting a desktop video capture using ffmpeg’s segment muxer and sending the segments over the network so that they can be served to clients and played with dash.js. The problem is that the player searches for an initialization segment, and I can’t figure out how to create it.
    I create the segments with this ffmpeg command:

     ffmpeg -rtbufsize 1500M -f dshow -r 15 -i video="UScreenCapture" \
         -flags +global_header -vcodec libvpx -crf 10 -quality good -keyint_min 15 -g 15 \
         -cpu-used 3 -b:v 1000k -qmin 10 -qmax 42 -threads 2 -vf scale=-1:480 -bufsize 1500 \
         -map 0 -f stream_segment -segment_time 2 -segment_format webm \
         http://localhost:3000/stream/22/%03d

    The manifest that I create for the stream looks something like this:

     <MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="dynamic" availabilityStartTime="2014-06-19T07:47:40.079Z" minBufferTime="PT0S" profiles="urn:mpeg:dash:profile:isoff-live:2011" suggestedPresentationDelay="PT40S" maxSegmentDuration="PT2.000S" minimumUpdatePeriod="PT1000M">
       <Period bitstreamSwitching="true" start="PT0S">
         <AdaptationSet mimeType="video/webm" segmentAlignment="true" startWithSAP="1" maxWidth="1280" maxHeight="720" maxFrameRate="15">
           <ContentComponent contentType="video"/>
           <SegmentTemplate presentationTimeOffset="0" timescale="90000" media="$Number$/" duration="180000" startNumber="0"/>
           <Representation width="853" height="480" frameRate="15" bandwidth="1000000" codecs="vp8"/>
         </AdaptationSet>
       </Period>
     </MPD>

    The player’s debug output prints the following:

    Getting the request for time: 0 dash.all.js:2073
    Index for time 0 is 0 dash.all.js:2073
    Waiting for more video buffer before starting playback. dash.all.js:2073
    BufferController video seek: 0 dash.all.js:2073
    Marking a special seek for initial video playback. dash.all.js:2073
    Start searching for initialization. dash.all.js:2073
    Perform init search: stream/22/ dash.all.js:2073
    Getting the request for time: 0 dash.all.js:2073
    Index for time 0 is 0 dash.all.js:2073
    Data changed - loading the video fragment for time: 0 dash.all.js:2073
    Getting the request for time: 0

    How can I create the initialization segment for the generated segments? I can’t seem to get it to work.
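
    A possible approach (not from the original post): for WebM output, ffmpeg’s webm_chunk muxer writes the initialization data (the EBML header and track entries) to a separate header file, and then emits one chunk file per WebM cluster. A minimal sketch, reusing the poster’s capture source and VP8 settings but trimming some encoder flags; the filenames init.hdr and chunk_%d.chk are placeholders:

     # webm_chunk writes the WebM header (the initialization segment) as its own
     # file, then one numbered chunk file per cluster.
     ffmpeg -rtbufsize 1500M -f dshow -r 15 -i video="UScreenCapture" \
         -flags +global_header -c:v libvpx -crf 10 -quality good -keyint_min 15 -g 15 \
         -b:v 1000k -vf scale=-1:480 \
         -f webm_chunk -header init.hdr -chunk_start_index 1 \
         chunk_%d.chk

    The header file can then be referenced from the manifest, for example via an initialization attribute on the SegmentTemplate, so that dash.js fetches it instead of running its init search. Newer ffmpeg versions also ship a dash muxer that can write both the initialization segments and the MPD itself.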

  • Live WebRTC streams (getUserMedia) to DASH using WebM

    4 September 2015, by cyp

    I’m trying to understand the feasibility of a live streaming solution.
    I want to grab WebRTC streams (audio and video), send them to a server, and transform them into chunks to send to an HTML5 video tag or a DASH player, using the WebM container (VP8 and Opus codecs).

    I also looked into ffmpeg, ffserver and gstreamer but...

    My question is: how do I feed the live WebRTC streams and transform them into HTTP chunks that are live-DASH compatible?

    Has anyone achieved something like this?
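
    One commonly suggested pipeline (not from the original post, and only a sketch): terminate WebRTC on the server with a gateway that can forward the decrypted RTP (Janus and mediasoup are typical choices), describe the forwarded streams in an SDP file, and let ffmpeg repackage them as live DASH with WebM segments. The file stream.sdp, the output path, and the assumption that the gateway forwards VP8 and Opus over RTP are all hypothetical here:

     # Read the RTP streams described by stream.sdp and repackage them as live DASH.
     # -c copy assumes the gateway already delivers VP8/Opus; re-encode with
     # -c:v libvpx -c:a libopus if it does not.
     ffmpeg -protocol_whitelist file,udp,rtp -i stream.sdp \
         -c:v copy -c:a copy \
         -f dash -dash_segment_type webm \
         -seg_duration 2 -window_size 5 \
         out/manifest.mpd

    The resulting manifest.mpd and segments can be served over plain HTTP to a DASH player such as dash.js; a bare HTML5 video tag would still need Media Source Extensions (i.e. such a player) to consume them.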
