
Other articles (38)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, provided your MediaSPIP installation is at version 0.2 or later. If needed, contact the administrator of your MédiaSpip instance to find out.

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used as a fallback.
    The HTML5 player was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
    These technologies make it possible to deliver video and audio both on conventional computers (...)

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (4667)

  • How to specify MPEG-DASH adaptation sets in FFmpeg when using optional output audio and subtitle streams?

    15 July 2021, by Sebsii

    I am using the script below to transform videos into MPEG-DASH compliant ones. I also want to be able to process videos that have no audio and/or no subtitles. Because of that, I marked the audio and subtitle streams with a ? at the end of the stream specifier, e.g. -map a:0?.

    


    Now, when I specify the adaptation sets, I can select the video and audio streams with id=0,streams=v and id=1,streams=a, but I cannot select the subtitle streams the same way with id=2,streams=s, even though the docs state: "To map all video (or audio) streams to an AdaptationSet, "v" (or "a") can be used as stream identifier instead of IDs."

    The only thing I could do is specify an ID, like id=2,streams=8, but for that to work an audio and a subtitle stream would be mandatory, because otherwise there is no stream with id 8 and FFmpeg would fail. I thought of putting -map s:0? in the first position; then I could be sure it is always the stream with id 0, but if the input had no subtitle stream, there would be an additional adaptation set containing the first video rendition. I also thought about using ffprobe to get info about the available streams, but did not look into that in much depth.

    


    Does anybody know how I could tell FFmpeg that an adaptation set should be created when a subtitle stream is present, independent of whether there are any audio streams?

    


    Any help is much appreciated.

    


    #!/bin/bash

INPUT_PATH=$1
OUTPUT_PATH=$2

FPS=25
GOP_SIZE=100
PRESET_P=veryfast
V_SIZE_1=256x144
V_SIZE_2=426x240
V_SIZE_3=640x360
V_SIZE_4=854x480
V_SIZE_5=1280x720
V_SIZE_6=1920x1080

ffmpeg -i "$INPUT_PATH" \
    -preset $PRESET_P -keyint_min $GOP_SIZE -g $GOP_SIZE -sc_threshold 0 \
    -r $FPS -c:v libx264 -pix_fmt yuv420p -c:a aac -c:s copy \
    -map v:0 -s:0 $V_SIZE_1 -b:v:0 160k -maxrate:0 160k -bufsize:0 320k \
    -map v:0 -s:1 $V_SIZE_2 -b:v:1 450k -maxrate:1 450k -bufsize:1 900k \
    -map v:0 -s:2 $V_SIZE_3 -b:v:2 1M -maxrate:2 1M -bufsize:2 2M \
    -map v:0 -s:3 $V_SIZE_4 -b:v:3 2.5M -maxrate:3 2.5M -bufsize:3 5M \
    -map v:0 -s:4 $V_SIZE_5 -b:v:4 5M -maxrate:4 5M -bufsize:4 10M \
    -map v:0 -s:5 $V_SIZE_6 -b:v:5 8M -maxrate:5 8M -bufsize:5 16M \
    -map a:0? -b:a:0 128k -ac:a:0 1 \
    -map a:0? -b:a:1 384k -ac:a:1 2 \
    -map s:0? \
    -use_template 1 -use_timeline 1  \
    -seg_duration 4 -adaptation_sets "id=0,streams=v id=1,streams=a id=2,streams=s" \
    -f dash "$OUTPUT_PATH"
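
One hedged way to flesh out the asker's own ffprobe idea — a sketch in which the id=2 adaptation set is only requested when the input actually has subtitles (function names are illustrative, not from the post; only count_subs touches ffprobe):

```shell
#!/bin/sh
# Sketch: decide the -adaptation_sets value from the input's actual streams.
count_subs() {
    # Print one line per subtitle stream in the input, then count them.
    ffprobe -v error -select_streams s \
        -show_entries stream=index -of csv=p=0 "$1" | wc -l
}

build_adaptation_sets() {
    # $1 = number of subtitle streams found in the input
    sets="id=0,streams=v id=1,streams=a"
    if [ "$1" -gt 0 ]; then
        sets="$sets id=2,streams=s"
    fi
    printf '%s\n' "$sets"
}

build_adaptation_sets 1   # prints: id=0,streams=v id=1,streams=a id=2,streams=s
build_adaptation_sets 0   # prints: id=0,streams=v id=1,streams=a
```

The -map s:0? mapping can stay in place either way; only the -adaptation_sets string needs to be built conditionally, e.g. -adaptation_sets "$(build_adaptation_sets "$(count_subs "$INPUT_PATH")")".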


    


  • Serving static video content directly vs. via adaptive streaming protocols (HLS, DASH)

    7 May 2020, by astralmaster

    Is there an advantage, in terms of speed, to serving static video content (not a live stream) via adaptive streaming protocols such as HLS or DASH, over serving the files directly from an HTTP server?

    



    An example case: you have a 500 MB mp4 (h264 + AAC) video that you have to serve on a website via an HTML5 video element. Would you serve it directly, since most popular browsers implement features such as seeking without downloading the whole file first; or would you use ffmpeg (or a similar tool) to create HLS chunks from the mp4 file and instead provide an .m3u8 playlist as the source of the HTML5 video element? Is there a real advantage in terms of speed?

    



    Which one would you implement if you had hundreds of video files all served as static content ?
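
For the HLS route mentioned above, a minimal hedged sketch of the ffmpeg side (segment length and filenames are illustrative; -c copy assumes the existing h264/AAC tracks are HLS-compatible, avoiding a re-encode):

```shell
# Segment an existing mp4 into HLS without re-encoding.
# -hls_time: target segment duration in seconds.
# -hls_playlist_type vod: marks the playlist as complete so players can seek.
ffmpeg -i input.mp4 \
    -c copy \
    -hls_time 6 \
    -hls_playlist_type vod \
    -hls_segment_filename 'seg_%04d.ts' \
    playlist.m3u8
```

The HTML5 video element would then point at playlist.m3u8 (played natively in Safari; via hls.js or a similar library in other browsers).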

    


  • Dash.js not playing mpd files made with ffmpeg

    31 December 2022, by Macster

    I'm using ffmpeg to create chunks and a manifest from a webm file that I want to live-stream with Dash.js. Unfortunately, Dash.js won't play the mpd file, no matter which way I create the chunks and manifest. The sample mpd URL from Dash.js works, however.

    


    Commands

    


    ffmpeg -re -r 25 -i Dash/strm.webm \
-map 0:v:0 \
-pix_fmt yuv420p \
-c:v libvpx \
-s 640x480 -keyint_min 60 -g 60 -speed 6 -tile-columns 4 -frame-parallel 1 -threads 8 -static-thresh 0 -max-intra-rate 300 -deadline realtime -lag-in-frames 0 -error-resilient 1 \
-b:v 3000k \
-f webm_chunk \
-header "Dash/glass_360.hdr" \
-chunk_start_index 1 Dash/glass_360_%d.chk \
-map 0:a:0 \
-c:a libvorbis \
-b:a 128k -ar 44100 \
-f webm_chunk \
-audio_chunk_duration 2000 \
-header Dash/glass_171.hdr \
-chunk_start_index 1 Dash/glass_171_%d.chk


# Manifest
ffmpeg \
-f webm_dash_manifest -live 1 \
-i Dash/glass_360.hdr \
-f webm_dash_manifest -live 1 \
-i Dash/glass_171.hdr \
-c copy \
-map 0 -map 1 \
-f webm_dash_manifest -live 1 \
-adaptation_sets "id=0,streams=0 id=1,streams=1" \
-chunk_start_index 1 \
-chunk_duration_ms 2000 \
-time_shift_buffer_depth 7200 \
-minimum_update_period 7200 Dash/glass_video_manifest.mpd


    


    Manifest output

    


    ffmpeg version git-2020-05-27-8b5ffae Copyright (c) 2000-2020 the FFmpeg developers
  built with gcc 9.3.1 (GCC) 20200523
  configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libdav1d --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libsrt --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa --enable-libspeex --enable-libxvid --enable-libaom --disable-w32threads --enable-libmfx --enable-ffnvcodec --enable-cuda-llvm --enable-cuvid --enable-d3d11va --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth --enable-libopenmpt --enable-amf
  libavutil      56. 49.100 / 56. 49.100
  libavcodec     58. 87.101 / 58. 87.101
  libavformat    58. 43.100 / 58. 43.100
  libavdevice    58.  9.103 / 58.  9.103
  libavfilter     7. 83.100 /  7. 83.100
  libswscale      5.  6.101 /  5.  6.101
  libswresample   3.  6.100 /  3.  6.100
  libpostproc    55.  6.100 / 55.  6.100
Input #0, webm_dash_manifest, from 'Dash/glass_360.hdr':
  Metadata:
    ENCODER         : Lavf58.43.100
  Duration: N/A, bitrate: N/A
    Stream #0:0(eng): Video: vp8, yuv420p, 640x480, SAR 1:1 DAR 4:3, 1k tbr, 1k tbn, 1k tbc (default)
    Metadata:
      ALPHA_MODE      : 1
      ENCODER         : Lavc58.87.101 libvpx
      webm_dash_manifest_file_name: glass_360.hdr
      webm_dash_manifest_track_number: 1
Input #1, webm_dash_manifest, from 'Dash/glass_171.hdr':
  Metadata:
    ENCODER         : Lavf58.43.100
  Duration: N/A, bitrate: N/A
    Stream #1:0(eng): Audio: vorbis, 44100 Hz, mono, fltp (default)
    Metadata:
      ENCODER         : Lavc58.87.101 libvorbis
      webm_dash_manifest_file_name: glass_171.hdr
      webm_dash_manifest_track_number: 1
Output #0, webm_dash_manifest, to 'Dash/glass_video_manifest.mpd':
  Metadata:
    encoder         : Lavf58.43.100
    Stream #0:0(eng): Video: vp8, yuv420p, 640x480 [SAR 1:1 DAR 4:3], q=2-31, 1k tbr, 1k tbn, 1k tbc (default)
    Metadata:
      ALPHA_MODE      : 1
      ENCODER         : Lavc58.87.101 libvpx
      webm_dash_manifest_file_name: glass_360.hdr
      webm_dash_manifest_track_number: 1
    Stream #0:1(eng): Audio: vorbis, 44100 Hz, mono, fltp (default)
    Metadata:
      ENCODER         : Lavc58.87.101 libvorbis
      webm_dash_manifest_file_name: glass_171.hdr
      webm_dash_manifest_track_number: 1
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
  Stream #1:0 -> #0:1 (copy)
Press [q] to stop, [?] for help
frame=    0 fps=0.0 q=-1.0 Lsize=       1kB time=00:00:00.00 bitrate=N/A speed=   0x
video:0kB audio:0kB subtitle:0kB other streams:0kB global headers:4kB muxing overhead: unknown


    


    Manifest file
    (glass_video_manifest.mpd)

    I tried to delete the ContentComponent as suggested in other questions, but it didn't work.

    


<?xml version="1.0" encoding="UTF-8"?>
<Period start="PT0S">
  <AdaptationSet mimeType="video/webm" codecs="vp8" lang="eng" bitstreamSwitching="true" subsegmentAlignment="true" subsegmentStartsWithSAP="1">
    <ContentComponent type="video"/>
    <SegmentTemplate timescale="1000" duration="2000" media="glass_$RepresentationID$_$Number$.chk" startNumber="1" initialization="glass_$RepresentationID$.hdr"/>
    <Representation bandwidth="1000000" width="640" height="480" codecs="vp8" mimeType="video/webm" startsWithSAP="1"/>
  </AdaptationSet>
  <AdaptationSet mimeType="audio/webm" codecs="vorbis" lang="eng" bitstreamSwitching="true" subsegmentAlignment="true" subsegmentStartsWithSAP="1">
    <ContentComponent type="audio"/>
    <SegmentTemplate timescale="1000" duration="2000" media="glass_$RepresentationID$_$Number$.chk" startNumber="1" initialization="glass_$RepresentationID$.hdr"/>
    <Representation bandwidth="128000" audioSamplingRate="44100" codecs="vorbis" mimeType="audio/webm" startsWithSAP="1"/>
  </AdaptationSet>
</Period>


    Dash.js player


<script>
(function(){
  // var url = "https://dash.akamaized.net/envivio/EnvivioDash3/manifest.mpd";
  var url = "http://localhost:8081/videos/Dash/glass_live_manifest.mpd";
  var player = dashjs.MediaPlayer().create();

  // config
  targetLatency = 2.0;        // Lowering this value will lower latency but may decrease the player's ability to build a stable buffer.
  minDrift = 0.05;            // Minimum latency deviation allowed before activating catch-up mechanism.
  catchupPlaybackRate = 0.5;  // Maximum catch-up rate, as a percentage, for low latency live streams.
  stableBuffer = 2;           // The time that the internal buffer target will be set to post startup/seeks (NOT top quality).
  bufferAtTopQuality = 2;     // The time that the internal buffer target will be set to once playing the top quality.

  player.updateSettings({
      'streaming': {
          'liveDelay': 2,
          'liveCatchUpMinDrift': 0.05,
          'liveCatchUpPlaybackRate': 0.5,
          'stableBufferTime': 2,
          'bufferTimeAtTopQuality': 2,
          'bufferTimeAtTopQualityLongForm': 2,
          'bufferToKeep': 2,
          'bufferAheadToKeep': 2,
          'lowLatencyEnabled': true,
          'fastSwitchEnabled': true,
          'abr': {
               'limitBitrateByPortal': true
          },
      }
  });

  console.log(player.getSettings());

  setInterval(() => {
      console.log('Live latency= ', player.getCurrentLiveLatency());
      console.log('Buffer length= ', player.getBufferLength('video'));
  }, 3000);

  player.initialize(document.querySelector("#videoPlayer"), url, true);
})();
</script>


    Chrome console output


{debug: {…}, streaming: {…}}
dash.all.min.js:2 XHR finished loading: GET "http://localhost:8081/videos/Dash/glass_live_manifest.mpd".
load @ dash.all.min.js:2
C @ dash.all.min.js:2
load @ dash.all.min.js:2
load @ dash.all.min.js:2
load @ dash.all.min.js:2
load @ dash.all.min.js:2
se @ dash.all.min.js:2
te @ dash.all.min.js:2
initialize @ dash.all.min.js:2
(anonymous) @ Dash:92
(anonymous) @ Dash:94
DevTools failed to load SourceMap: Could not parse content for http://localhost:8081/js/dash.all.min.js.map: Cannot read property 'length' of undefined
Dash:88 Live latency=  NaN
Dash:89 Buffer length=  NaN
(the latency/buffer lines repeat every 3 seconds)


    UPDATE


    Well, it seems the problem in general was that the mpd files wouldn't play from that /dash folder. So I took a look into the code and found a bad routing. Anyway, the mpd wouldn't start with the original command I used, probably because it creates a dynamic manifest, as @Markus Schumann says. So I'm going with a new one, which seems to be working for now, but not very well.


ffmpeg -y -re -i strm.webm \
-c:v libx264 -x264opts "keyint=24:min-keyint=24:no-scenecut" \
-r 24 -c:a aac -b:a 128k -bf 1 -b_strategy 0 -sc_threshold 0 -pix_fmt yuv420p \
-map 0:v:0 -map 0:a:0 -map 0:v:0 -map 0:a:0 -map 0:v:0 -map 0:a:0 -b:v:0 250k \
-filter:v:0 "scale=-2:240" -profile:v:0 baseline -b:v:1 750k \
-filter:v:1 "scale=-2:480" -profile:v:1 main -b:v:2 1500k \
-filter:v:2 "scale=-2:720" -profile:v:2 high \
-use_timeline 1 -use_template 1 -window_size 5 -adaptation_sets "id=0,streams=v id=1,streams=a" \
-f dash glass_video_manifest.mpd
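
As a quick check of the static-vs-dynamic explanation above, a hypothetical helper (not from the post) that reads the type attribute off the MPD root element — DASH on-demand manifests carry type="static", live ones type="dynamic":

```shell
#!/bin/sh
# Sketch: print "static" or "dynamic" from an MPD's root element.
mpd_type() {
    # Extract type="..." from the <MPD ...> tag; first match wins.
    sed -n 's/.*<MPD[^>]*type="\([a-z]*\)".*/\1/p' "$1" | head -n 1
}

# Tiny demo manifest (illustrative only, not a playable MPD).
printf '<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="static"></MPD>' > demo.mpd
mpd_type demo.mpd   # prints: static
```

If the output is "dynamic" while the player is configured for on-demand playback (or vice versa), that mismatch alone can keep Dash.js from starting.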
