
Other articles (97)
-
MediaSPIP 0.1 Beta version
25 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all the software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...) -
APPENDIX: The plugins used specifically for the farm
5 March 2010
The central/master site of the farm needs several additional plugins, beyond those of the channels, to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a mutualisation instance as soon as users sign up; the verifier plugin, which provides a field-verification API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MédiaSpip installation is at version 0.2 or higher. If necessary, contact the administrator of your MédiaSpip to find out.
On other sites (8738)
-
How to stream live h.264 (IP camera) video to the browser? (bonus: low bandwidth and latency)
4 October 2018, by Ryan Griggs
I need to stream live h.264-encoded video from an IP camera to the browser, supporting all common browsers and mobile devices (Android, Firefox, Chrome, IE, and Safari on macOS and iOS), while keeping bandwidth requirements and latency to a minimum.
MPEG-DASH requires browser support for Media Source Extensions, which are NOT supported by iOS. So that’s out.
HLS is only supported by Safari and Edge.
DASH also seems to impose a latency of several seconds, which is undesirable.
I would like to be able to chunk the incoming h.264 data (i.e. fragmented MP4), pass the chunked data to the browser via Websockets, then dump the chunks into some sort of player as they arrive.
Broadway and its forks provide a JavaScript h.264 decoder, and there is a Broadway-stream project that supports streams instead of files, but the docs are poor and I can only find examples where the source is not live.
The most pressing question is: how do I hand the "chunked data" to a player or HTML video element as it arrives at the browser?
I think the ideal setup would be to
- Use ffmpeg to transcode the original video to a chunked format (fMP4)
- Pipe the chunked output to a Node JS app which emits each chunk out through a Websocket to all connected viewers
- Viewers’ browsers dump each incoming chunk into some sort of decoder which renders the video.
I'm clear up to the point of handing the received chunks to a video decoder. How can that be done without depending on Media Source Extensions, while still allowing viewers to join the stream at random times?
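As a rough sketch of the first two steps only (not an answer to the decoder question), here is how the ffmpeg side of such a pipeline might look. The rtsp://camera/stream URL and the relay.js script are placeholders of mine, not anything from the question; the idea is simply to remux the camera's h.264 into fragmented MP4 on stdout so a Node process can read it and broadcast each fragment over a WebSocket.

# rtsp://camera/stream and relay.js are hypothetical placeholders.
# -c:v copy remuxes without re-encoding; the movflags produce an init
# segment followed by one self-contained fragment per keyframe.
ffmpeg -rtsp_transport tcp -i rtsp://camera/stream \
  -an -c:v copy \
  -f mp4 -movflags frag_keyframe+empty_moov+default_base_moof \
  pipe:1 | node relay.js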
-
In FFmpeg, using -filter_complex, how can I overlay and blend at the same time?
20 April 2023, by Kristian Boruff
My code combines three videos, resizes them to 1920x1080, resizes the watermark, overlays the watermark in the lower left-hand corner, and then converts to the intended codecs for YouTube.
My question is: how do I blend at the same time using the -filter_complex workflow?
Currently, I have the following workflow, which does everything but set the opacity:

ffmpeg -y -i "fancy movie.mov" -i logo.png -i in.mov -i out.mov -c:v libx264 -crf 18 -b:v 50000k -preset veryfast -tune film -profile:v high -x264opts cabac=1:keyint=16:bframes=2:keyint_min=15 -c:a libvo_aacenc -ab 128K -ar 48000 -filter_complex "[0:0] scale=1920X1080 [main]; [2:0] scale=1920X1080 [start]; [3:0] scale=1920X1080 [end]; [start] [2:1] [main] [0:1] [end] [3:1] concat=n=3:v=1:a=1 [prog]; [1:0] scale=480:90 [wm]; [prog] [wm] overlay=36:main_h-overlay_h-45" fart.mp4
I'm trying to add "blend=all_opacity=0.7" in the last step so the watermark will screen against the background. If I just add

[prog] [wm] overlay=36:main_h-overlay_h-45, blend=all_mode='overlay':all_opacity=0.7" fart.mp4
I get the error "Cannot find a matching stream for unlabeled input pad 1 on filter Parsed_blend_6".
If I try a semicolon instead,

[prog] [wm] overlay=36:main_h-overlay_h-45; blend=all_mode='overlay':all_opacity=0.7
I get the error "Cannot find a matching stream for unlabeled input pad 0 on filter Parsed_blend_6", which makes me think it expects something like [input 1] [input 2] blend [output]. The problem is that I need it to overlay and blend.
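For what it's worth, blend does take exactly two labelled inputs, and they must have the same frame size and sample aspect ratio. One way to satisfy that, purely a sketch of mine and not something from the question, is to pad the scaled logo onto a full 1920x1080 black canvas at the overlay position and blend that against the programme. With all_mode='screen', black is neutral, so the background outside the logo should be left alone, and keeping [prog] as the first ("top") input means all_opacity fades the watermark towards the programme rather than the other way round (this assumes the logo's transparent pixels decode to black; if not, the PNG would need flattening first).

# Hypothetical replacement for the last two steps of the original filtergraph.
[1:0] scale=480:90, pad=w=1920:h=1080:x=36:y=oh-ih-45:color=black, setsar=1 [wm]; [prog] [wm] blend=all_mode='screen':all_opacity=0.7

If blend still complains about mismatched SAR, adding setsar=1 after the 1920x1080 scales as well should bring the programme side into line.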



I tried simplifying the code to just test if the blend operation was working properly.

ffmpeg -i test.mp4 -i logo.png -filter_complex "[0:0] scale=1920x1080 [wm]; [1:0] scale=1920x1080 [prog], [wm][prog] blend=all_mode='overlay':all_opacity=0.7" fart.mp4
I got the error "First input link top parameters (size 1920x1080, SAR 1:1) do not match the corresponding second input link bottom parameters (1920x1080, SAR 243:80)", followed by "Failed to configure output pad on Parsed_blend_2".
So, in addition to the trouble with combining filters, I'm also having an issue resizing the watermark: FFmpeg sees it as having a different aspect ratio than the other videos.
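That SAR mismatch is exactly what the error is reporting: scaling the logo up to 1920x1080 leaves it with a non-square sample aspect ratio (243:80), because scale preserves the picture's display aspect ratio by adjusting SAR, while test.mp4 ends up at SAR 1:1. A minimal sketch of a fix, assuming nothing else is wrong with the inputs (labels renamed for clarity, output name kept from the question), is to force both branches to SAR 1:1 with setsar before blending:

# setsar=1 overrides the sample aspect ratio computed by scale so both blend inputs match.
ffmpeg -i test.mp4 -i logo.png -filter_complex "[0:0] scale=1920x1080, setsar=1 [base]; [1:0] scale=1920x1080, setsar=1 [logo]; [base][logo] blend=all_mode='overlay':all_opacity=0.7" fart.mp4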



This is my second day with FFmpeg, so any help would be appreciated.



I'm currently working with FFmpeg version N-61061-gf34cceb.
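One more possibility, offered only as a sketch and not something from the question: if the goal is simply a semi-transparent watermark rather than a true screen blend, blend can be avoided entirely by lowering the logo's alpha before the existing overlay, leaving the rest of the original command untouched.

# Hypothetical replacement for the watermark branch: format=rgba guarantees an alpha
# plane and colorchannelmixer=aa=0.7 scales that alpha down before overlay uses it.
[1:0] scale=480:90, format=rgba, colorchannelmixer=aa=0.7 [wm]; [prog] [wm] overlay=36:main_h-overlay_h-45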

