
Media (2)
-
Core Media Video
4 April 2013, by
Updated: June 2013
Language: French
Type: Video
-
Bee video in portrait orientation
14 May 2011, by
Updated: February 2012
Language: French
Type: Video
Other articles (55)
-
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used.
The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
These technologies make it possible to deliver video and sound both on conventional computers (...)
-
HTML5 audio and video support
13 April 2011, by
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
-
From upload to the final video [standalone version]
31 January 2010, by
The path of an audio or video document through SPIPMotion is divided into three distinct stages.
Upload and retrieval of information about the source video
First, a SPIP article must be created and the "source" video document attached to it.
When this document is attached to the article, two actions are performed in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)
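A minimal sketch of those two extra actions, assuming ffprobe/ffmpeg are the tools behind them and using placeholder file names and frame position:
# Retrieve the technical information of the file's audio and video streams (placeholder input name)
ffprobe -v quiet -print_format json -show_format -show_streams source.mp4
# Generate a thumbnail by extracting a single frame from the source video
ffmpeg -i source.mp4 -ss 00:00:05 -frames:v 1 thumbnail.jpg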
On other sites (11437)
-
How to prepare a media stream to play using the dash.js web player?
7 April 2016, by Paweł Tobiszewski
I want to stream media from an nginx server to an Android device and play it using a web player embedded in a web page. The player I want to use is dash.js.
I also play the same media using other methods (MediaPlayer and ExoPlayer) and they work great. But when I try to use dash.js, I face a problem with the codecs: they are not supported.
I prepare my streams using ffmpeg and MP4Box (roughly as sketched below); I have also tried different codecs, such as libx264, x264 and x265, always with the same result.
My source media are video in Y4M format and audio in WAV.
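A rough sketch of that preparation (file names, bitrate, profile and segment length here are placeholders, not the exact values used):
# Encode the Y4M video to H.264 and the WAV audio to AAC (placeholder settings)
ffmpeg -i source.y4m -c:v libx264 -profile:v main -b:v 2000k -keyint_min 50 -g 50 -sc_threshold 0 video.mp4
ffmpeg -i audio.wav -c:a aac -b:a 128k audio.mp4
# Package both tracks into DASH segments plus an MPD manifest with MP4Box
MP4Box -dash 4000 -frag 4000 -rap -profile dashavc264:onDemand -out manifest.mpd video.mp4 audio.mp4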
How should I encode it for playback in the dash.js player?
EDIT:
I get the error "Video Element Error: MEDIA_ERR_DECODE" while trying to decode the video stream. Here is the full log:
[16] EME detected on this user agent! (ProtectionModel_21Jan2015)
[19] Playback Initialized
[28] [dash.js 2.0.0] MediaPlayer has been initialized
[102] Parsing complete: ( xml2json: 3ms, objectiron: 3ms, total: 0.006s)
[103] Manifest has been refreshed at Thu Apr 07 2016 22:02:52 GMT+0200 (CEST)[1460059372.696]
[107] SegmentTimeline detected using calculated Live Edge Time
[118] MediaSource is open!
[118] [object Event]
[119] Duration successfully set to: 18.58
[119] Added 0 inline events
[120] video codec: video/mp4;codecs="avc1.640032"
[132] Schedule controller stopping for video
[137] No audio data.
[137] No text data.
[137] No fragmentedText data.
[137] No embeddedText data.
[138] No muxed data.
[139] Start Event Controller
[141] Schedule controller starting for video
[143] Native video element event: play
[144] Schedule controller starting for video
[148] loaded video:InitializationSegment:NaN (200, 0ms, 7ms)
[149] Initialization finished loading
[154] Getting the request for video time : 0
[155] SegmentList: 0 / 18.58
[164] loaded video:MediaSegment:0 (200, 7ms, 1ms)
[169] Native video element event: loadedmetadata
[171] Starting playback at offset: 0
[175] Got enough buffer to start.
[175] Buffered Range: 0 - 0.999999
[179] Requesting seek to time: 0
[181] Prior to making a request for time, NextFragmentRequestRule is aligning index handler's currentTime with bufferedRange.end. 0 was changed to 0.999999
[182] Getting the request for video time : 0.999999
[183] SegmentList: 0 / 18.58
[183] Getting the next request at index: 1
[184] SegmentList: 1 / 18.58
[190] loaded video:MediaSegment:1 (200, 5ms, 0ms)
[192] Buffered Range: 0 - 0.999999
[195] Getting the request for video time : 2
[196] Index for video time 2 is 1
[197] SegmentList: 1 / 18.58
[197] Getting the next request at index: 2
[198] SegmentList: 2 / 18.58
[205] loaded video:MediaSegment:2 (200, 4ms, 1ms)
[207] Buffered Range: 0 - 0.999999
[207] Getting the request for video time : 3
[208] Index for video time 3 is 2
[208] SegmentList: 2 / 18.58
[209] Getting the next request at index: 3
[209] SegmentList: 3 / 18.58
[212] Video Element Error: MEDIA_ERR_DECODE
[212] [object MediaError]
[215] Schedule controller stopping for video
[219] Native video element event: pause
-
Live streaming through video tag
1 February 2018, by ElmiS
I am currently trying to make a video streaming service without using plugins.
My server uses ffmpeg to transmux an RTSP stream into a fragmented MP4 so that the HTML5 video tag can play it.
The ffmpeg command is:
> ffmpeg -rtsp_transport tcp -i rtsp_address -movflags frag_keyframe+empty_moov -c:v copy -f mp4 -
(My RTSP stream has no audio.)
Then I feed it to the client page through a WebSocket; the client then uses the Media Source Extensions API to pass the data to the video tag.
This all works on Firefox but does not work on Chrome or Edge/IE, and I have searched all over but could not find an answer as to why it won't play.
The initial data, such as the width and height of the video, seems to go through, since the size of the video frame changes. However, the video won't start, and when I call video.play() so that it autoplays once loaded, I get an error saying:
Uncaught (in promise) DOMException: The play() request was interrupted by a call to pause().
It seems like the video automatically pauses itself since I did not use pause() anywhere in my code.
My client-side code is:
var webSocket;
var video = document.getElementById("rtspPlayer");
var mediaSource;
var sourceBuffer;
var strCodec = 'video/mp4; codecs="avc1.420029"';
var buffer = [];

window.onload = function () {
    // Pick the stream URL from the select element, then set up the WebSocket and MediaSource
    var selector = document.getElementById("select");
    var url = selector.options[selector.selectedIndex].value;
    setUpWebSocket(url);
    setUpMediaSource();
};

function setUpWebSocket(url) {
    webSocket = new WebSocket(url);
    webSocket.onopen = onWebSocketOpen;
    webSocket.onmessage = onWebSocketData;
}

function setUpMediaSource() {
    mediaSource = new window.MediaSource();
    var uri = window.URL.createObjectURL(mediaSource);
    video.setAttribute('src', uri);
    video.setAttribute('type', 'video/mp4');
    mediaSource.onsourceopen = function () {
        // Live stream: no fixed duration
        mediaSource.duration = Infinity;
        sourceBuffer = mediaSource.addSourceBuffer(strCodec);
        // Append the next queued chunk once the previous append has finished
        sourceBuffer.onupdateend = readFromBuffer;
    };
}

function onWebSocketOpen() {
    video.play();
}

function onWebSocketData(data) {
    // Each WebSocket message is a Blob containing a piece of the fragmented MP4
    var blob = data.data;
    var fileReader = new FileReader();
    fileReader.onload = function () {
        buffer.push(this.result);
        readFromBuffer();
    };
    fileReader.readAsArrayBuffer(blob);
}

function readFromBuffer() {
    if (buffer.length === 0) {
        //console.log("Buffer length 0");
        return;
    } else if (!sourceBuffer) {
        //console.log("No Source Buffer");
        return;
    } else if (sourceBuffer.updating) {
        //console.log("SourceBuffer Updating");
        return;
    }
    try {
        var data = buffer.shift();
        sourceBuffer.appendBuffer(data);
    } catch (e) {
        return;
    }
}
I'm certain that this is the right codec for the video I want to play, as I saved 30 seconds of the video using ffmpeg and used Bento4's mp4info to check the codec. That 30-second clip also played well through the Media Source. However, playing live seems impossible for me to solve.
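A sketch of that verification step (the output file name and capture duration are placeholders):
# Save roughly 30 seconds of the fragmented MP4 that is otherwise sent to the client
ffmpeg -rtsp_transport tcp -i rtsp_address -t 30 -c:v copy -movflags frag_keyframe+empty_moov -f mp4 sample.mp4
# Inspect the tracks and codec string with Bento4
mp4info sample.mp4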
Help me please! I've been trying to solve this problem for days.
-
mpegvideo: drop support for real (non-emulated) edges
20 December 2013, by Anton Khirnov
mpegvideo: drop support for real (non-emulated) edges
Several decoders disable those anyway and they are not measurably faster
on x86. They might be somewhat faster on other platforms due to missing
emu edge SIMD, but the gain is not large enough (and those decoders
relevant enough) to justify the added complexity.