
Media (1)
-
Bee video in portrait format
14 May 2011, by
Updated: February 2012
Language: French
Type: Video
Other articles (62)
-
The farm's regular Cron tasks
1 December 2010, by
Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled to run every minute, simply calls the Cron of every instance in the shared-hosting farm on a regular basis. Combined with a system Cron on the farm's central site, this generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...)
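The system-Cron coupling described above can be sketched as a crontab entry. Since SPIP triggers its scheduled tasks on page visits, a periodic request to the central site is enough; the host name and user below are placeholders, not taken from the article:

```
# Hypothetical /etc/crontab entry on the farm's central server:
# request the central site every minute so its scheduled tasks run
# even when no visitor happens to trigger them.
* * * * * www-data curl -fsS 'https://central.example.org/' >/dev/null 2>&1
```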
-
Contribute to documentation
13 April 2011
Documentation is vital to the development of improved technical capabilities.
MediaSPIP welcomes documentation by users as well as developers, including: critiques of existing features and functions, articles contributed by developers, administrators, content producers and editors, screenshots to illustrate the above, and translations of existing documentation into other languages.
To contribute, register to the project users' mailing (...)
-
HTML5 audio and video support
13 April 2011, by
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (9777)
-
WebRTC Multi-Stream recording
11 January 2021, by Tim Specht
I'm currently trying to build a WebRTC streaming architecture in which multiple users stream content from their cameras into the same "room", while an SFU / MCU on the server side "records" the incoming video packets, merges them into one image, and re-distributes the result to the viewers as either RTMP or HLS for added scalability.

Upon doing some initial research, Janus Gateway seems like a good fit, given its wide adoption across the space and its (seemingly) extensible plugin architecture. I'm therefore trying to figure out what a recommended architecture for my use case would look like.
I looked at the following plugins:

- Janus Streaming
- Janus Recordings

While Janus and the Streaming plugin seem like a good starting point for broadcasting within the group of casters in a room, I'm trying to piece together how I could combine the different video sources into one (split horizontally, for example, when two casters are active) and retransmit the final result as something optimized for broadcast consumption, such as HLS. Some of the ways I could imagine doing that:


- Implement a custom Janus plugin that transcodes the incoming buffers on the gateway itself
- Forward the incoming packets via RTP to a transcoding server
  - In this specific case I'm not sure what the best way to implement that would be. Are the video frames different tracks? Could I stream all of them to the same port and have ffmpeg or something similar take care of the merging for me?
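One way the RTP-forwarding option could look: Janus forwards each publisher's video to its own local port, described by its own SDP file, and a single ffmpeg process stacks the streams and packages HLS. This is an assumption-laden sketch, not a verified setup; the SDP file names, codec settings, and output path are placeholders. Note that ffmpeg cannot easily demultiplex several independent RTP video streams from one shared port, which answers the same-port question above in the negative.

```shell
# Sketch: two RTP-forwarded casters, stacked side by side, packaged as HLS.
# caster1.sdp / caster2.sdp are placeholder SDP files describing each RTP input.
ffmpeg -protocol_whitelist file,udp,rtp -i caster1.sdp \
       -protocol_whitelist file,udp,rtp -i caster2.sdp \
       -filter_complex "[0:v][1:v]hstack=inputs=2[v]" \
       -map "[v]" -c:v libx264 -preset veryfast -g 50 \
       -f hls -hls_time 4 -hls_list_size 6 live.m3u8
```

hstack requires all inputs to have the same height, so each caster's stream may need a scale filter first if resolutions differ.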
-
How to use the yadif filter in ffmpeg?
2 June 2015, by Pavel
I'm writing a C++ application with ffmpeg. I read interlaced frames with OpenCV's VideoCapture and convert them to AVFrame in order to use the yadif filter.
Can you give me a simple example of using the yadif filter? filtering_video.c from the ffmpeg docs is too broad.
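For the yadif question above, the filter is easiest to try from the command line before wiring it into C++ code; the file names below are placeholders:

```shell
# One output frame per input frame (yadif's default mode), audio copied through:
ffmpeg -i interlaced.mp4 -vf yadif -c:a copy progressive.mp4

# One output frame per field, doubling the frame rate:
ffmpeg -i interlaced.mp4 -vf yadif=mode=send_field -c:a copy bobbed.mp4
```

In the C API, the same filter string ("yadif" or "yadif=mode=send_field") is what gets passed to avfilter_graph_parse_ptr when building the filter graph, exactly as filtering_video.c does with its own filter description.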
-
Ubuntu ffmpeg command -i not found
12 April 2016, by Gerwin
I'm a beginner with Ubuntu and I need to set up a streaming server for my company. I've installed NGINX and FFmpeg, but when I call ffmpeg -i in the ~/ffmpeg/ffmpeg directory it returns the error command not found, whereas when I call ./ffmpeg -i it returns a list of configurations. Why does ffmpeg -i not work, and how can I fix it? I think the PATH environment variable is incorrect, but how would I go about changing that variable? I'm not that experienced with Ubuntu / command-line usage / FFmpeg configuration.
Sources:
https://trac.ffmpeg.org/wiki/CompilationGuide/Ubuntu
https://github.com/arut/nginx-rtmp-module/wiki/Control-module
http://docs.unified-streaming.com/tutorials/live/broadcast-247.html#continuous-timestamps
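The likely fix for the question above, sketched under the assumption that the static build really lives in ~/ffmpeg: the shell only searches directories listed in PATH, which is why ./ffmpeg works but plain ffmpeg does not.

```shell
# Prepend the build directory to PATH so `ffmpeg` resolves without ./
# (~/ffmpeg is an assumption taken from the question, not a standard location).
export PATH="$HOME/ffmpeg:$PATH"

# Persist the change for future login shells:
echo 'export PATH="$HOME/ffmpeg:$PATH"' >> "$HOME/.profile"
```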