
Media (1)
-
Richard Stallman and free software
19 October 2011
Updated: May 2013
Language: French
Type: Text
Other articles (50)
-
Keeping control of your media in your hands
13 April 2011 — The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media sharing.
While using MediaSPIP, you are invited to avoid words like "Brand", "Cloud" and "Market".
MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
MediaSPIP aims to be accessible to as many people as possible, and development is based on expanding the (...)
-
Publishing on MédiaSpip
13 June 2013 — Can I post content from an iPad tablet?
Yes, if your installed MédiaSpip is at version 0.2 or higher. If necessary, contact the administrator of your MédiaSpip to find out.
-
Supported formats
28 January 2010 — The following commands give information about the formats and codecs supported by the local ffmpeg installation:
ffmpeg -codecs
ffmpeg -formats
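For example (a usage sketch, not part of the original article), to check whether a specific codec such as h264 is available, the list can be filtered:
ffmpeg -codecs | grep h264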
Supported input video formats
This list is not exhaustive; it highlights the main formats used: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
Possible output video formats
Initially, we (...)
On other sites (9285)
-
Best way to stream live video under webRTC development [on hold]
5 April 2016, by Ankit Dhanna — I have implemented WebRTC live video chat with multiple presenters (using Licode).
However, how can I broadcast this video stream so that multiple users who are not on WebRTC can be served? Where (to which server) should I broadcast this stream to get the least lag time?
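A common approach (a sketch, not from the original question) is to republish the stream in a format ordinary players understand, such as HLS. Assuming the presenter mix can be pushed to a local RTMP endpoint (the URL and output path here are hypothetical), ffmpeg can repackage it without re-encoding:
ffmpeg -i rtmp://localhost/live/presenter -c copy -f hls -hls_time 2 -hls_list_size 5 /var/www/stream/playlist.m3u8
Note that HLS typically adds several seconds of latency; keeping viewers on WebRTC (e.g., via an SFU) is the usual route when sub-second lag is required.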
-
Is it possible to create a video from a live stream of bitmaps?
30 March 2015, by user3088260 — I have an application that gets a single frame from the webcam roughly every 200 ms, as a bitmap. I have used AForge.Video.FFMPEG.VideoFileWriter to create a video from a few of them, but the problem is that the stream must be closed for the bitmaps to be written to the file, and I want to keep adding bitmaps to the same file for days, like a CCTV camera. Is it possible to transcode live video from bitmaps using AForge?
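As an alternative to AForge (a sketch under the assumption that frames can be piped to ffmpeg as BMP images on stdin; the output name pattern is hypothetical), a single long-running ffmpeg process can encode frames as they arrive and roll the recording into hourly files, so no writer ever has to stay open for days:
ffmpeg -f image2pipe -vcodec bmp -framerate 5 -i - -c:v libx264 -f segment -segment_time 3600 -reset_timestamps 1 cctv_%05d.mp4
Here -framerate 5 matches one frame roughly every 200 ms, and the segment muxer closes each hourly file so it is immediately playable.
-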
Independent MP4 fragments for live stream
20 April 2020, by J Doe — My goal is to stream an H.264 live-generated infinite feed (webcam) to the browser.

Specifically, I want to stream a Raspberry Pi camera to the browser.

(Disclaimer: I'm streaming an H.264 file right now, but it is to be replaced with the Pi webcam.)

So I have an H.264 stream, which I pipe to ffmpeg to mux into MP4 fragments (movflags is set to frag_keyframe+empty_moov+default_base_moof). I then send the fragments over a WebSocket: on the server I extract the fragments and send each one as a message containing moof+mdat. (I didn't figure out how to do this with progressive HTTP, because the stream is live-generated.) The fragments are then played in the browser using the MediaSource API.

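The muxing step described above might look like this (a sketch, assuming raw H.264 arrives on stdin and the fragmented MP4 is written to stdout for the WebSocket server to split into moof+mdat messages):
ffmpeg -f h264 -i pipe:0 -c:v copy -movflags frag_keyframe+empty_moov+default_base_moof -f mp4 pipe:1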
It works, but there's one problem: the video doesn't play unless it has ALL the fragments.

I tried sending the initialization fragment and then only the newly generated fragments, but it errors out and doesn't play.

How can I make it possible to join in the middle of the livestream, without having been there from the very start of the recording (or, in my testing, from the start of the file stream)?

In other words, how do I make the fragments independent of each other?

P.S. Any extra info will help; I'm new to this.
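One way to approach this (a suggestion, not from the original post): with frag_keyframe, every fragment starts on a keyframe, so a client can begin at any fragment boundary provided it first receives the initialization segment (the ftyp+moov written once at the start). Rather than making each fragment self-contained, cache that init segment on the server and send it to every client that joins, followed by fragments from the most recent keyframe onward. A muxer that writes the init segment to a separate file makes this explicit, for example ffmpeg's DASH muxer (a sketch):
ffmpeg -f h264 -i pipe:0 -c:v copy -f dash -seg_duration 2 -streaming 1 stream.mpd
This emits an init segment (init-stream0.m4s by default) plus numbered media segments, so a late joiner can be served the init segment followed by the latest segments.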