
Other articles (112)
-
HTML5 audio and video support
10 April 2011. MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer flash player is used.
The HTML5 player was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
These technologies make it possible to deliver video and audio both to conventional computers (...) -
HTML5 audio and video support
13 April 2011. MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The HTML5 player has been created specifically for MediaSPIP and can easily be adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...) -
From upload to the final video [standalone version]
31 January 2010. The path of an audio or video document through SPIPMotion is divided into three distinct stages.
Upload and retrieval of information about the source video
First, a SPIP article must be created and the "source" video document attached to it.
When this document is attached to the article, two actions are executed in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)
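The excerpt stops short, but the two extra actions it names (probing the source file's audio and video streams, and extracting a thumbnail frame) map naturally onto ffprobe and ffmpeg. The sketch below shows one plausible way to perform them; SPIPMotion's actual implementation may differ, and the file names are placeholders.

```python
# Sketch of the two actions described above: read the technical information
# of the source file's audio/video streams, and extract one frame as a
# thumbnail. Requires ffprobe/ffmpeg on PATH; file names are illustrative.
import json
import subprocess

def probe_streams(source):
    """Return the list of stream descriptions (codec, resolution, duration, ...)."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json", "-show_streams", source],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)["streams"]

def extract_thumbnail(source, thumb="thumbnail.jpg"):
    """Grab a single frame five seconds into the video as a JPEG thumbnail."""
    subprocess.run(
        ["ffmpeg", "-y", "-ss", "5", "-i", source, "-frames:v", "1", thumb],
        check=True,
    )

streams = probe_streams("source.mp4")   # e.g. inspect streams[0]["codec_name"]
extract_thumbnail("source.mp4")
```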
On other sites (10873)
-
Need to record and stream live video in ffmpeg
16 August 2020, by ankurgoel. I need to record from a live web camera and live stream it, while also recording it on the server at the same time.


I was trying to implement this using FFmpeg in Vue, but I am not sure whether this is the best way to do it.


Please advise whether this can be done more efficiently.
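One common way to do this (not taken from the question itself) is to let a single ffmpeg process read the webcam and use the tee muxer to write a local recording and push an RTMP stream at the same time. The sketch below launches that process from Python; the capture device, encoder settings and RTMP URL are placeholder assumptions.

```python
# Sketch: capture a webcam once with ffmpeg and, via the tee muxer, both
# record it to a local file and push it to an RTMP endpoint. The device path
# (Linux v4l2), encoder settings and RTMP URL are placeholder assumptions.
import subprocess

cmd = [
    "ffmpeg",
    "-f", "v4l2", "-i", "/dev/video0",         # webcam input (use dshow/avfoundation on Windows/macOS)
    "-c:v", "libx264", "-preset", "veryfast",  # encode once, shared by both outputs
    "-map", "0:v",
    "-f", "tee",
    "recording.mkv|[f=flv]rtmp://example.com/live/stream-key",
]
subprocess.run(cmd, check=True)
```

Matroska is used for the local copy so the recording stays readable even if the process is interrupted.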


-
Does Facebook multicast or unicast the live videos to its viewers?
14 April 2018, by Kamran Zahoor. This is how a live stream goes from one broadcaster to millions of viewers:
1. A broadcaster starts a live video on their phone.
2. The phone sends an RTMP stream to a Live Stream server.
3. The Live Stream server decodes the video and transcodes it to multiple bit rates.
4. For each bit rate, a set of one-second MPEG-DASH segments is continuously produced.
5. Segments are stored in a datacenter cache.
6. From the datacenter cache, segments are sent to caches located in the points of presence (PoP caches).
7. On the viewing side, the viewer receives a Live Story.
8. The player on their device starts fetching segments from a PoP cache at a rate of one per second.
My question is simple. After step 6, does the PoP cache / edge cache / eNB (edge layer node) multicast the live video content to multiple viewers, or does it unicast to each user separately (opening a separate stream for each user)?
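For context on step 8, the per-viewer fetching is ordinary HTTP: each player repeatedly requests the latest one-second segment from its PoP cache. A minimal sketch of that polling loop is shown below; the segment URL pattern and numbering are hypothetical (a real player derives them from the MPEG-DASH manifest), and the sketch deliberately says nothing about whether the hop below the PoP is multicast or unicast.

```python
# Sketch of the per-player segment polling described in step 8. The URL
# pattern and segment numbering are hypothetical; a real DASH player reads
# them from the manifest (MPD) it fetched first.
import time
import urllib.request

POP_CACHE = "https://pop-cache.example.com/live/stream123"  # hypothetical PoP endpoint

def play(start_number):
    number = start_number
    while True:
        url = f"{POP_CACHE}/seg-{number}.m4s"      # one-second MPEG-DASH media segment
        with urllib.request.urlopen(url) as resp:  # plain HTTP GET issued by this player
            segment = resp.read()
        append_to_buffer(segment)                  # hand the segment to the decoder
        number += 1
        time.sleep(1.0)                            # a new segment appears about once a second

def append_to_buffer(data):
    pass  # stand-in for the player's media source buffer
```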
-
Can we detect whether a live UDP stream packet contains a key frame or not?
12 January 2023, by Miuru Shalinda Rajapaksha. Is there a way to detect whether there is a key frame within a video packet or not?


What I want to do is cut the live UDP stream into separate 30-second .ts files, then concatenate those files together with several other .ts files using ffmpeg and restream the result to RTMP (YouTube). I can achieve this, but the problem is that after concatenating the files there is a dropped frame every 30 seconds, because I cut the live stream without considering key frames. What I want, therefore, is to cut the live stream at its key frames. I'm doing this with Python and ffmpeg. Is there a way to detect whether a video packet contains a key frame or not?
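One way to check this from Python (PyAV is not mentioned in the question, so treat it as an assumption) is to demux the UDP input with the PyAV bindings for FFmpeg's libraries and look at each packet's keyframe flag; the multicast address below is a placeholder. Note also that ffmpeg's segment muxer (-f segment -segment_time 30) only cuts at key frames when stream-copying, which may avoid the problem without inspecting packets yourself.

```python
# Sketch: inspect the packets of a live UDP (MPEG-TS) stream and report which
# ones carry a key frame. Uses PyAV ("pip install av"), which wraps FFmpeg's
# libraries; the UDP address is a placeholder assumption.
import av

def watch_keyframes(url="udp://239.0.0.1:1234"):
    container = av.open(url, format="mpegts")
    video = container.streams.video[0]
    for packet in container.demux(video):
        if packet.pts is None:      # flush packets at end of stream carry no data
            continue
        if packet.is_keyframe:      # True when the packet holds a key (IDR) frame
            seconds = float(packet.pts * video.time_base)
            print(f"key frame at pts={packet.pts} (~{seconds:.2f}s)")
            # starting a new 30-second .ts segment here avoids the dropped frames

if __name__ == "__main__":
    watch_keyframes()
```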