
Media (1)
-
Spitfire Parade - Crisis
15 May 2011, by
Updated: September 2011
Language: English
Type: Audio
Other articles (85)
-
Websites made with MediaSPIP
2 May 2011, by
This page lists some websites based on MediaSPIP.
-
Adding user-specific information and other author-related behaviour changes
12 April 2011, by
The simplest way to add information to authors is to install the Inscription3 plugin. It also lets you change certain user-related behaviours (refer to its documentation for more information).
It is also possible to add fields to authors by installing the champs extras 2 and Interface pour champs extras plugins.
-
Farm deployment capability
12 April 2011, by
MediaSPIP can be installed as a farm, with a single "core" hosted on a dedicated server and used by a multitude of different sites.
This makes it possible, for example: to share setup costs between several projects/individuals; to quickly deploy a multitude of unique sites; to avoid having to put all creations into a digital catch-all, as is the case with the large general-public platforms scattered across the (...)
On other sites (18436)
-
Use ffmpeg as external tool to stream 2 or more different sources via pipeline
25 February 2021, by StackDOX
I have an application running on an embedded system. This application has 2 video sources (and, theoretically, 1 audio source). Focusing on the video sources: I have 2 subprocesses that compute different frame sets (unrelated to each other). I want to send these frames to 2 different streams.


I would like to avoid writing a lot of ffmpeg/libav code. I have ffmpeg compiled for the embedded system and I can use it as a tool. For example, I can write the first frame set to stdout and pipe it to ffmpeg like this:


./my_app | ffmpeg -an -i - -vcodec copy -f rtp rtp://<remote_ip>


This basically works. But now I would like to send the other frame set. How can I do that? Theoretically, I need another ffmpeg instance that reads from another source, which can't be the stdout of "my_app" because it is already busy.



I'm thinking of using 2 video files as intermediaries. I can record the 2 frame sets into 2 video files and then run 2 ffmpeg instances reading from those files. In that case I think I need a way to limit the size of the video files (like a circular buffer), because the 2 streams can become really huge over time. Is that possible?
This sounds "weird" to me: I would need to record a video source in real time and stream it via ffmpeg (also in real time). I don't know if it is a good idea; there are real-time problems for sure:


loop:
my_app --write_into--> video_stream1.mp4 
ffmpeg <--read_from-- video_stream1.mp4

my_app --write_into--> video_stream2.mp4 
ffmpeg <--read_from-- video_stream2.mp4



Do you have any suggestions for addressing this kind of situation?


Many thanks, bye.
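One way to feed two independent ffmpeg instances without going through stdout, and without growing intermediate files, is to use named pipes (FIFOs): my_app writes each frame set to its own pipe, and one ffmpeg instance reads from each. A minimal sketch, assuming my_app can be pointed at arbitrary output paths (the --out1/--out2 flags, the paths and the ports are hypothetical placeholders):

# Create two named pipes; ffmpeg reads them like files, but nothing is stored on disk
mkfifo /tmp/frames1 /tmp/frames2

# One ffmpeg instance per pipe, each sending to its own RTP destination
ffmpeg -an -i /tmp/frames1 -vcodec copy -f rtp rtp://<remote_ip>:5004 &
ffmpeg -an -i /tmp/frames2 -vcodec copy -f rtp rtp://<remote_ip>:5006 &

# my_app writes frame set 1 to the first pipe and frame set 2 to the second
# (--out1/--out2 are hypothetical; open the FIFOs however my_app allows)
./my_app --out1 /tmp/frames1 --out2 /tmp/frames2

Because nothing is written to disk, there is no file size to limit, and each ffmpeg instance simply blocks until my_app starts writing to its pipe.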


-
Overlay of multiple sources with layer order like CSS z-index
6 March 2021, by Basj
Note: I've already read a few questions/answers about [ffmpeg] overlay, but they did not fully answer this.

Let's say we have five inputs [0] ... [4] that can be images, videos, etc.

I'd like to overlay them in a precise order, a bit like "Photoshop layers" or CSS z-index.

Currently I use ffmpeg with a "recursive" use of overlay, like this:

-filter_complex "
 [0]setpts=PTS-STARTPTS[s0];
 [1]setpts=PTS-STARTPTS+3/TB[s1];
 [2]setpts=PTS-STARTPTS+7/TB[s2];
 [3]setpts=PTS-STARTPTS+8/TB[s3];
 [4]setpts=PTS-STARTPTS+9/TB[s4];
 [s0][s1]overlay=enable='between(t,3,13)'[o1];
 [o1][s2]overlay=enable='between(t,7,12)'[o2];
 [o2][s3]overlay=enable='between(t,8,15)'[o3];
 [o3][s4]overlay=enable='between(t,9,12)'[o4];
 "



Question: with ffmpeg, is there a way to mix different inputs / "layers":

- in a shorter way than this recursive use of overlay
- and such that we can specify the "stack order" of each layer, like with CSS z-index?


To give a concrete example, how would you modify this command such that:

- from 0'00" to 0'10", the layer order (from background to foreground) is 0, 1, 2, 3, 4
- from 0'10" to 0'15", the layer order (from background to foreground) is 2, 3, 1, 4, 0 (i.e. [0] goes to the foreground)?
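One way to handle the time-varying stacking in this concrete example (a sketch, not necessarily the shortest form) is to duplicate the input whose layer must change using the split filter, keep one copy at the bottom of the chain, and overlay the other copy at the top with an enable window. The sketch below only covers moving [0] to the foreground between 0'10" and 0'15"; fully reproducing the 2, 3, 1, 4, 0 order would need the same trick on the other reordered inputs:

-filter_complex "
 [0]split[bg0][fg0];
 [bg0][1]overlay[o1];
 [o1][2]overlay[o2];
 [o2][3]overlay[o3];
 [o3][4]overlay[o4];
 [o4][fg0]overlay=enable='between(t,10,15)'[out]
"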


-
avfilter/scale_cuda : combine separate CUDA sources
22 June 2021, by Timo Rothenpieler