
Other articles (72)

  • Sites built with MediaSPIP

    2 May 2011

    This page presents some of the sites running MediaSPIP.
    You can of course add your own via the form at the bottom of the page.

  • Taking part in its translation

    10 April 2011

    You can help us improve the wording used in the software, or translate it into any new language, helping it reach new linguistic communities.
    Translation is done through the SPIP translation interface, where all of MediaSPIP's language modules are available. You just need to sign up to the translators' mailing list to ask for more information.
    MediaSPIP is currently available only in French and (...)

  • MediaSPIP v0.2

    21 June 2013

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013, as announced here.
    The zip file provided here contains only the MediaSPIP sources, in standalone form.
    As with the previous version, all software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, further changes are also required (...)

On other sites (12132)

  • Serving static video content directly vs. via adaptive streaming protocols (HLS, DASH)

    7 May 2020, by astralmaster

    Is there a speed advantage to serving static video content (not a live stream) via adaptive streaming protocols such as HLS or DASH, rather than serving it directly as files from an HTTP server?

    An example case: you have a 500MB H.264+AAC mp4 video that you have to serve on a website via an HTML5 video element. Would you serve it directly, since most popular browsers support seeking without downloading the whole file first? Or would you use ffmpeg or a similar tool to create HLS chunks from the mp4 and point the HTML5 video element at the resulting .m3u8 playlist instead? Is there a real speed advantage to doing this?
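
    (Not from the original question, but to make the HLS option concrete: since the file is already H.264+AAC, the chunks can be produced by remuxing alone, with no re-encode. A minimal sketch; the file names are illustrative.)

    # Repackage an existing H.264+AAC mp4 into HLS segments without re-encoding.
    ffmpeg -i input.mp4 -c copy \
           -hls_time 6 -hls_playlist_type vod \
           -hls_segment_filename 'seg_%03d.ts' \
           playlist.m3u8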

    Which one would you implement if you had hundreds of video files, all served as static content?
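
    (Again an illustrative sketch, not from the question: for hundreds of files the same remux can be scripted per file, assuming a flat videos/ directory of H.264+AAC mp4s.)

    for f in videos/*.mp4; do
      name=$(basename "$f" .mp4)
      mkdir -p "hls/$name"
      # Remux only: -c copy means no re-encoding, so the loop is I/O-bound.
      ffmpeg -i "$f" -c copy -hls_time 6 -hls_playlist_type vod \
             -hls_segment_filename "hls/$name/seg_%03d.ts" "hls/$name/index.m3u8"
    done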

  • Dash.js not playing mpd files made with ffmpeg

    31 December 2022, by Macster

    I'm using ffmpeg to create chunks and a manifest from a webm file that I want to live stream with Dash.js. Unfortunately Dash.js won't play the mpd file, no matter how I create the chunks and manifest. However, the sample mpd URL from Dash.js plays fine.

    Commands

ffmpeg -re -r 25 -i Dash/strm.webm \
  -map 0:v:0 \
  -pix_fmt yuv420p \
  -c:v libvpx \
  -s 640x480 -keyint_min 60 -g 60 -speed 6 -tile-columns 4 -frame-parallel 1 \
  -threads 8 -static-thresh 0 -max-intra-rate 300 -deadline realtime \
  -lag-in-frames 0 -error-resilient 1 \
  -b:v 3000k \
  -f webm_chunk \
  -header "Dash/glass_360.hdr" \
  -chunk_start_index 1 Dash/glass_360_%d.chk \
  -map 0:a:0 \
  -c:a libvorbis \
  -b:a 128k -ar 44100 \
  -f webm_chunk \
  -audio_chunk_duration 2000 \
  -header Dash/glass_171.hdr \
  -chunk_start_index 1 Dash/glass_171_%d.chk

# Manifest
ffmpeg \
  -f webm_dash_manifest -live 1 \
  -i Dash/glass_360.hdr \
  -f webm_dash_manifest -live 1 \
  -i Dash/glass_171.hdr \
  -c copy \
  -map 0 -map 1 \
  -f webm_dash_manifest -live 1 \
  -adaptation_sets "id=0,streams=0 id=1,streams=1" \
  -chunk_start_index 1 \
  -chunk_duration_ms 2000 \
  -time_shift_buffer_depth 7200 \
  -minimum_update_period 7200 \
  Dash/glass_video_manifest.mpd
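
    (A quick sanity check, not part of the original post: before blaming the player, one can verify that the chunking command is actually producing the files the manifest refers to while it runs. File names as in the commands above.)

    # List the header files and the first chunks; they must exist before dash.js can play.
    ls -l Dash/glass_360.hdr Dash/glass_171.hdr \
          Dash/glass_360_1.chk Dash/glass_171_1.chk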

    Manifest output

    ffmpeg version git-2020-05-27-8b5ffae Copyright (c) 2000-2020 the FFmpeg developers
  built with gcc 9.3.1 (GCC) 20200523
  configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libdav1d --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libsrt --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa --enable-libspeex --enable-libxvid --enable-libaom --disable-w32threads --enable-libmfx --enable-ffnvcodec --enable-cuda-llvm --enable-cuvid --enable-d3d11va --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth --enable-libopenmpt --enable-amf
  libavutil      56. 49.100 / 56. 49.100
  libavcodec     58. 87.101 / 58. 87.101
  libavformat    58. 43.100 / 58. 43.100
  libavdevice    58.  9.103 / 58.  9.103
  libavfilter     7. 83.100 /  7. 83.100
  libswscale      5.  6.101 /  5.  6.101
  libswresample   3.  6.100 /  3.  6.100
  libpostproc    55.  6.100 / 55.  6.100
Input #0, webm_dash_manifest, from 'Dash/glass_360.hdr':
  Metadata:
    ENCODER         : Lavf58.43.100
  Duration: N/A, bitrate: N/A
    Stream #0:0(eng): Video: vp8, yuv420p, 640x480, SAR 1:1 DAR 4:3, 1k tbr, 1k tbn, 1k tbc (default)
    Metadata:
      ALPHA_MODE      : 1
      ENCODER         : Lavc58.87.101 libvpx
      webm_dash_manifest_file_name: glass_360.hdr
      webm_dash_manifest_track_number: 1
Input #1, webm_dash_manifest, from 'Dash/glass_171.hdr':
  Metadata:
    ENCODER         : Lavf58.43.100
  Duration: N/A, bitrate: N/A
    Stream #1:0(eng): Audio: vorbis, 44100 Hz, mono, fltp (default)
    Metadata:
      ENCODER         : Lavc58.87.101 libvorbis
      webm_dash_manifest_file_name: glass_171.hdr
      webm_dash_manifest_track_number: 1
Output #0, webm_dash_manifest, to 'Dash/glass_video_manifest.mpd':
  Metadata:
    encoder         : Lavf58.43.100
    Stream #0:0(eng): Video: vp8, yuv420p, 640x480 [SAR 1:1 DAR 4:3], q=2-31, 1k tbr, 1k tbn, 1k tbc (default)
    Metadata:
      ALPHA_MODE      : 1
      ENCODER         : Lavc58.87.101 libvpx
      webm_dash_manifest_file_name: glass_360.hdr
      webm_dash_manifest_track_number: 1
    Stream #0:1(eng): Audio: vorbis, 44100 Hz, mono, fltp (default)
    Metadata:
      ENCODER         : Lavc58.87.101 libvorbis
      webm_dash_manifest_file_name: glass_171.hdr
      webm_dash_manifest_track_number: 1
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
  Stream #1:0 -> #0:1 (copy)
Press [q] to stop, [?] for help
frame=    0 fps=0.0 q=-1.0 Lsize=       1kB time=00:00:00.00 bitrate=N/A speed=   0x
video:0kB audio:0kB subtitle:0kB other streams:0kB global headers:4kB muxing overhead: unknown

    Manifest file
    (glass_video_manifest.mpd)
    
    I tried deleting the ContentComponent element, as suggested in other questions, but it didn't work.

    


    <?xml version="1.0" encoding="UTF-8"?>
    <Period start="PT0S">
      <AdaptationSet mimeType="video/webm" codecs="vp8" lang="eng" bitstreamSwitching="true" subsegmentAlignment="true" subsegmentStartsWithSAP="1">
        <ContentComponent type="video"/>
        <SegmentTemplate timescale="1000" duration="2000" media="glass_$RepresentationID$_$Number$.chk" startNumber="1" initialization="glass_$RepresentationID$.hdr"/>
        <Representation bandwidth="1000000" width="640" height="480" codecs="vp8" mimeType="video/webm" startsWithSAP="1"/>
      </AdaptationSet>
      <AdaptationSet mimeType="audio/webm" codecs="vorbis" lang="eng" bitstreamSwitching="true" subsegmentAlignment="true" subsegmentStartsWithSAP="1">
        <ContentComponent type="audio"/>
        <SegmentTemplate timescale="1000" duration="2000" media="glass_$RepresentationID$_$Number$.chk" startNumber="1" initialization="glass_$RepresentationID$.hdr"/>
        <Representation bandwidth="128000" audioSamplingRate="44100" codecs="vorbis" mimeType="audio/webm" startsWithSAP="1"/>
      </AdaptationSet>
    </Period>


    Dash.js player


    <script>
    (function(){
      // var url = "https://dash.akamaized.net/envivio/EnvivioDash3/manifest.mpd";
      var url = "http://localhost:8081/videos/Dash/glass_live_manifest.mpd";
      var player = dashjs.MediaPlayer().create();

      // config
      targetLatency = 2.0;        // Lowering this value will lower latency but may decrease the player's ability to build a stable buffer.
      minDrift = 0.05;            // Minimum latency deviation allowed before activating catch-up mechanism.
      catchupPlaybackRate = 0.5;  // Maximum catch-up rate, as a percentage, for low latency live streams.
      stableBuffer = 2;           // The time that the internal buffer target will be set to post startup/seeks (NOT top quality).
      bufferAtTopQuality = 2;     // The time that the internal buffer target will be set to once playing the top quality.

      player.updateSettings({
        'streaming': {
          'liveDelay': 2,
          'liveCatchUpMinDrift': 0.05,
          'liveCatchUpPlaybackRate': 0.5,
          'stableBufferTime': 2,
          'bufferTimeAtTopQuality': 2,
          'bufferTimeAtTopQualityLongForm': 2,
          'bufferToKeep': 2,
          'bufferAheadToKeep': 2,
          'lowLatencyEnabled': true,
          'fastSwitchEnabled': true,
          'abr': {
            'limitBitrateByPortal': true
          },
        }
      });

      console.log(player.getSettings());

      setInterval(() => {
        console.log('Live latency= ', player.getCurrentLiveLatency());
        console.log('Buffer length= ', player.getBufferLength('video'));
      }, 3000);

      player.initialize(document.querySelector("#videoPlayer"), url, true);
    })();
    </script>


    Chrome


    {debug: {…}, streaming: {…}}
    dash.all.min.js:2 XHR finished loading: GET "http://localhost:8081/videos/Dash/glass_live_manifest.mpd".
    load @ dash.all.min.js:2
    C @ dash.all.min.js:2
    load @ dash.all.min.js:2
    load @ dash.all.min.js:2
    load @ dash.all.min.js:2
    load @ dash.all.min.js:2
    se @ dash.all.min.js:2
    te @ dash.all.min.js:2
    initialize @ dash.all.min.js:2
    (anonymous) @ Dash:92
    (anonymous) @ Dash:94
    DevTools failed to load SourceMap: Could not parse content for http://localhost:8081/js/dash.all.min.js.map: Cannot read property 'length' of undefined
    Dash:88 Live latency=  NaN
    Dash:89 Buffer length=  NaN
    Dash:88 Live latency=  NaN
    Dash:89 Buffer length=  NaN
    Dash:88 Live latency=  NaN
    Dash:89 Buffer length=  NaN
    Dash:88 Live latency=  NaN
    Dash:89 Buffer length=  NaN
    Dash:88 Live latency=  NaN
    Dash:89 Buffer length=  NaN


    UPDATE


    Well, it seems the problem in general was that the mpd files wouldn't play from that /dash folder. I took a look into the code and found a bad route. In any case, the mpd wouldn't start with the command I originally used, probably because it creates a dynamic manifest, as @Markus Schumann suggested. So I'm going with a new one, which seems to be working for now, but not very well.


    ffmpeg -y -re -i strm.webm \
      -c:v libx264 -x264opts "keyint=24:min-keyint=24:no-scenecut" \
      -r 24 -c:a aac -b:a 128k -bf 1 -b_strategy 0 -sc_threshold 0 -pix_fmt yuv420p \
      -map 0:v:0 -map 0:a:0 -map 0:v:0 -map 0:a:0 -map 0:v:0 -map 0:a:0 \
      -b:v:0 250k -filter:v:0 "scale=-2:240" -profile:v:0 baseline \
      -b:v:1 750k -filter:v:1 "scale=-2:480" -profile:v:1 main \
      -b:v:2 1500k -filter:v:2 "scale=-2:720" -profile:v:2 high \
      -use_timeline 1 -use_template 1 -window_size 5 \
      -adaptation_sets "id=0,streams=v id=1,streams=a" \
      -f dash glass_video_manifest.mpd
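
    (A quick way to test that diagnosis, not from the original post: inspect the type attribute on the MPD root element of each generated manifest; a live manifest is marked "dynamic", an on-demand one "static".)

    grep -o 'type="[a-z]*"' glass_video_manifest.mpd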


  • MPEG-DASH - Multiplexed Representations Issue

    26 April 2017, by Mike

    I’m trying to learn ffmpeg, MP4Box, and MPEG-DASH, but I’m running into an issue with the .mp4 I’m using. I demux the mp4 with ffmpeg using this command:

    ffmpeg -i test.mp4 -c:v copy -g 72 -an video.mp4 -c:a copy audio.mp4

    Once the two files are created, I use MP4Box to segment them for the dash player with this command:

    MP4Box -dash 4000 -frag 1000 -rap -segment-name segment_ output.mp4

    This does create all the files I think I need. But when I point the player at output_dash.mpd, nothing happens except a ton of messages in the console:

    [8] EME detected on this user agent! (ProtectionModel_21Jan2015)
    [11] Playback Initialized
    [21] [dash.js 2.3.0] MediaPlayer has been initialized
    [64] Parsing complete: ( xml2json: 3.42ms, objectiron: 2.61ms, total: 0.00603s)
    [65] Manifest has been refreshed at Wed Apr 12 2017 12:16:52 GMT-0600 (MDT)[1492021012.196]  
    [72] MediaSource attached to element.  Waiting on open...
    [77] MediaSource is open!
    [77] Duration successfully set to: 148.34
    [78] Added 0 inline events
    [78] No video data.
    [79] No audio data.
    [79] No text data.
    [79] No fragmentedText data.
    [79] No embeddedText data.
    [80] Multiplexed representations are intentionally not supported, as they are not compliant with the DASH-AVC/264 guidelines
    [81] No streams to play.

    Here is the MP4Box -info output for the video I’m using:

    * Movie Info *
       Timescale 1000 - Duration 00:02:28.336
       Fragmented File no - 2 track(s)
       File suitable for progressive download (moov before mdat)
       File Brand mp42 - version 512
       Created: GMT Wed Feb  6 06:28:16 2036

    File has root IOD (9 bytes)
    Scene PL 0xff - Graphics PL 0xff - OD PL 0xff
    Visual PL: Not part of MPEG-4 Visual profiles (0xfe)
    Audio PL: Not part of MPEG-4 audio profiles (0xfe)
    No streams included in root OD

    iTunes Info:
       Name: Rogue One - A Star Wars Story
       Artist: Lucasfilm
       Genre: Trailer
       Created: 2016
       Encoder Software: HandBrake 0.10.2 2015060900
       Cover Art: JPEG File

    Track # 1 Info - TrackID 1 - TimeScale 90000 - Duration 00:02:28.335
    Media Info: Language "Undetermined" - Type "vide:avc1" - 3552 samples
    Visual Track layout: x=0 y=0 width=1920 height=816
    MPEG-4 Config: Visual Stream - ObjectTypeIndication 0x21
    AVC/H264 Video - Visual Size 1920 x 816
       AVC Info: 1 SPS - 1 PPS - Profile High @ Level 4.1
       NAL Unit length bits: 32
       Pixel Aspect Ratio 1:1 - Indicated track size 1920 x 816
    Self-synchronized

    Track # 2 Info - TrackID 2 - TimeScale 44100 - Duration 00:02:28.305
    Media Info: Language "English" - Type "soun:mp4a" - 6387 samples
    MPEG-4 Config: Audio Stream - ObjectTypeIndication 0x40
    MPEG-4 Audio MPEG-4 Audio AAC LC - 2 Channel(s) - SampleRate 44100
    Synchronized on stream 1
    Alternate Group ID 1

    I know I need to separate the video and audio and I think that’s where my issue is. The command I’m using probably isn’t doing the right thing.

    Is there a better command to demux my mp4?
    Is the MP4Box command I’m using best for segmenting the files?
    If I use different files, will they always need to be demuxed?

    One thing to mention: if I use the following commands, everything works fine, but there is no audio, because the -an flag strips it, leaving video only:

    ffmpeg -i test.mp4 -c:v copy -g 72 -an output.mp4

    MP4Box -dash 4000 -frag 1000 -rap -segment-name segment_ output.mp4

    UPDATE

    I noticed that the video file had no audio stream, but the audio file still contained the video stream, which is why I got the multiplexed representations error. I thought that might be the issue, so I ran this command to keep the unwanted streams out of the outputs:

    ffmpeg -i test.mp4 -c:v copy -g 72 -an video.mp4 -c:a copy -vn audio.mp4
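
    (A quick check, not part of the original post: ffprobe can confirm that each demuxed file now carries exactly one stream.)

    ffprobe -v error -show_entries stream=index,codec_type -of csv video.mp4   # expect a single video stream
    ffprobe -v error -show_entries stream=index,codec_type -of csv audio.mp4   # expect a single audio stream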

    Then I run:

    MP4Box -dash 4000 -frag 1000 -rap -segment-name segment_ video.mp4 audio.mp4

    Now I no longer get the "Multiplexed representations are intentionally not supported..." message, but I get this instead:

    [122] Video Element Error: MEDIA_ERR_SRC_NOT_SUPPORTED
    [123] [object MediaError]
    [125] Schedule controller stopping for audio
    [126] Caught pending play exception - continuing (NotSupportedError: Failed to load because no supported source was found.)

    I tried playing the video and audio independently through Chrome and they both work, just not through the dash player. Ugh, this is painful to learn, but I feel like I’m making progress.
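
    (A hedged next step, not from the original post: MP4Box can be told explicitly which track to take from each input and which DASH profile to target. The #video/#audio selectors, -profile, and -out are standard MP4Box options, but whether this resolves this particular MEDIA_ERR_SRC_NOT_SUPPORTED error is an assumption.)

    MP4Box -dash 4000 -frag 1000 -rap \
           -profile dashavc264:onDemand \
           -segment-name segment_ \
           -out output.mpd \
           video.mp4#video audio.mp4#audio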