
Other articles (109)
-
Multilang: improving the interface for multilingual blocks
18 February 2011, by
Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
Once activated, a preconfiguration is set up automatically by MediaSPIP init so that the new feature is immediately operational. There is therefore no need to go through a configuration step for this.
-
Participating in its translation
10 April 2011
You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
To do so, we use SPIP's translation interface, where all of MediaSPIP's language modules are available. You just need to subscribe to the translators' mailing list to ask for more information.
Currently MediaSPIP is only available in French and (...)
-
APPENDIX: Plugins used specifically for the farm
5 March 2010, by
The central/master site of the farm needs several additional plugins, beyond those used by the channels, to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a shared instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)
On other sites (18973)
-
Vitamio Media Player License Requirements
27 May 2015, by Amanda Fernandez
I have successfully integrated the Vitamio project in my app and it works perfectly, but I'm not clear about its licence. I found the project on GitHub. It uses the ffmpeg library and some of the files carry the Apache license. So can I use Vitamio for an enterprise app without a license, or is a license mandatory?
-
How to generate an MP4 for use with Media Source Extensions using ffmpeg?
21 November 2014, by Pablo Montilla
I'm using ffmpeg to generate a fragmented MP4 stream that will be consumed by an HTML5 player using MSE. I think I have a correctly formatted MP4 stream, with an empty moov at the start and a series of moof+mdat boxes for the video fragments. What I'm stuck on is the codec that's being received by Chrome.
chrome://media-internals has this to say:
00:00:00 00 pipeline_state kCreated
00:00:00 00 EVENT PIPELINE_CREATED
00:00:00 00 EVENT WEBMEDIAPLAYER_CREATED
00:00:00 00 url blob:http%3A//localhost/3388b7f1-5567-4d76-a585-0b247affd16a
00:00:00 00 pipeline_state kInitDemuxer
00:00:07 68 error Video codec: avc1.4d401e

From which I gather the problem is with the encoded data, which should be in avc1.4d0020, as it's specified in the javascript. I've seen the answer to "html5 video tag codecs attribute", which tells me I need to reset profile_compatibility and AVCLevelIndication, but I couldn't find an option to do that.

Here's the command line I'm using (the pipe is read and sent to the web page using web sockets):
ffmpeg -re -i Input.mp4 -map 0:0 -y -c:v libx264 -preset veryslow -x264opts intra-refresh=1 -tune psnr -refs 1 -b:v 1500k -bf 0 -g 8 -b-pyramid none -f mp4 -movflags empty_moov+omit_tfhd_offset+frag_keyframe -frag_duration 1000000 -profile:v main -iods_video_profile 0 -flags +loop \\.\pipe\FeedPipe
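
One way to reconcile the two sides, sketched below under the assumption that the player code can be changed rather than the encoder settings, is to declare the codec string that the ffmpeg output actually carries (Main profile, level 3.0, i.e. avc1.4d401e) when creating the SourceBuffer. The video element lookup and the wiring around it are illustrative, not taken from the original question.

// Sketch: match the MSE codec string to what the encoder produced (assumption:
// the stream really is avc1.4d401e, as reported by chrome://media-internals).
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  const mime = 'video/mp4; codecs="avc1.4d401e"';
  if (!MediaSource.isTypeSupported(mime)) {
    console.error('Codec string not supported:', mime);
    return;
  }
  const sourceBuffer = mediaSource.addSourceBuffer(mime);
  // ...append the initial empty moov and the moof+mdat fragments here as they
  // arrive over the web socket.
}, { once: true });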
-
Interesting behavior in Media Source Extensions
28 May 2020, by newtonian_fig
I'm trying to build a fairly standard video player using Media Source Extensions; however, I want the user to be able to control when the player moves on to a new video segment. For example, we might see the following behavior:

1. Video player plays the 1st segment
2. The Source Buffer runs out of data, causing the video to appear paused
3. When the user is ready, they click a button that adds the 2nd segment to the Source Buffer
4. The video continues by playing the 2nd segment

This works well, except that when the video appears paused during step 2, it doesn't stop at the last frame of the 1st segment. Instead, it stops two frames before the end of the 1st segment. Those last two frames aren't being dropped; they just get played after the user clicks the button to advance the video. This is an issue for my application, and I'm trying to figure out a way to make sure all of the frames from the 1st segment get played before the end of step 2.



I suspect that these last two frames are getting held up in the video decoder buffer, especially since calling endOfStream() on my Media Source after adding the 1st segment to the Source Buffer causes the 1st segment to play all the way through with no frames left behind.
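
A minimal sketch of that observation, assuming an ArrayBuffer named firstSegmentData and a sourceBuffer set up as in the player code further down (both names are illustrative): calling endOfStream() once the append completes tells the demuxer that no more data is coming, so the decoder emits its final frames instead of holding them back.

// Sketch: flush the tail of the 1st segment by signalling end-of-stream.
// 'mediaSource', 'sourceBuffer' and 'firstSegmentData' are assumed to exist.
sourceBuffer.addEventListener('updateend', () => {
  if (mediaSource.readyState === 'open') {
    mediaSource.endOfStream(); // last buffered frames now reach the screen
  }
}, { once: true });

sourceBuffer.appendBuffer(firstSegmentData);

Whether that fits the interactive flow is another question, since the MediaSource then has to come back out of its 'ended' state when the next segment is appended.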



Additional Info



- I created each video segment file from a series of PNGs using the following ffmpeg command:

ffmpeg -i %04d.png -movflags frag_keyframe+empty_moov+default_base_moof video_segment.mp4

- Maybe this is a clue? End of stream situations not handled correctly (last frames are dropped)
- Another interesting thing to note is that if the video only has 2 frames or fewer, MSE doesn't play it at all.
- The browser I'm using is Chrome. The code for my MSE player is just taken from the Google Developers example, but I'll post it here for completeness. This code only covers up to step 2, since that's where the issue is; a sketch of steps 3 and 4 follows after it.

<script>
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);
mediaSource.addEventListener('sourceopen', sourceOpen, { once: true });

function sourceOpen() {
  URL.revokeObjectURL(video.src);
  const sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.64001f"');
  sourceBuffer.mode = 'sequence';

  // Fetch the video and add it to the Source Buffer
  fetch('https://s3.amazonaws.com/bucket_name/video_file.mp4')
    .then(response => response.arrayBuffer())
    .then(data => sourceBuffer.appendBuffer(data));
}
</script>
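
For completeness, a rough sketch of steps 3 and 4 might look like the following; the button id, the second segment's URL, and the assumption that sourceBuffer has been moved out of sourceOpen() into a shared scope are all illustrative and not part of the original post.

// Hypothetical continuation (steps 3 and 4): append the 2nd segment on click.
// 'nextButton', the segment URL, and an in-scope 'sourceBuffer' are assumptions.
document.getElementById('nextButton').addEventListener('click', () => {
  fetch('https://s3.amazonaws.com/bucket_name/video_file_2.mp4')
    .then(response => response.arrayBuffer())
    .then(data => {
      if (!sourceBuffer.updating) {
        sourceBuffer.appendBuffer(data); // playback resumes once data is buffered
      }
    });
});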