
Other articles (12)
-
Installation in standalone mode
4 February 2011. Installing the MediaSPIP distribution is done in several steps: retrieving the necessary files, for which two methods are possible: installing the ZIP archive containing the whole distribution, or fetching the sources of each module separately via SVN; the preconfiguration; the final installation.
Installing the MediaSPIP ZIP archive
This installation mode is the simplest way to install the whole distribution (...) -
Keeping control of your media in your hands
13 April 2011. The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media-sharing.
While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...) -
Possible deployments
31 January 2010. Two types of deployment can be considered, depending on two aspects: the installation method chosen (standalone or farm); the expected number of daily encodings and the expected traffic.
Video encoding is a heavy process that consumes a great deal of system resources (CPU and RAM), and this must be taken into account. Such a setup is therefore only possible on one or more dedicated servers.
Single-server version
The single-server version consists of using only one (...)
On other sites (3329)
-
Filter useless white frames at the beginning and duplicated frames at the end of captured video
21 September 2020, by gumkins. I'm capturing an HTML animation using Puppeteer and the MediaRecorder API.


Before starting the capture I wait for the networkidle event (I tried networkidle0-2, but the result is identical):

await page.goto(url, { waitUntil: 'networkidle0' })



For some reason, the animation starts to play 2-3 seconds after the capturing starts, and thus white frames are captured.
Similarly, at the end of the video there are identical frames, because the capture runs a bit longer than the animation plays.


Thus I want to detect and cut off those repeating white frames at the beginning and repeating non-white frames at the end of the video (mp4/webm).


I tried some solutions, like the one described here, for instance:


ffmpeg -i input.mp4 -vf mpdecimate,setpts=N/FRAME_RATE/TB out.mp4



It does remove duplicates, but the problem is that it cuts off dups in the middle as well.


Thus, if I have a 15-second animation, it removes all dups at the beginning, all dups at the end and all dups in the middle, and what is left is just a few identical frames packed into less than one second of video.
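
One possible direction, not from the original post, is to locate only the static leading and trailing segments with ffmpeg's freezedetect filter and then trim to the reported boundaries, leaving the middle of the clip untouched. A minimal sketch, assuming input.mp4; the timestamps 2.8 and 14.5 are placeholders to be replaced with the freeze_end / freeze_start values printed in the log:

# 1) report static (frozen) segments; the boundaries appear in the log
#    as lavfi.freezedetect freeze_start / freeze_end timestamps
ffmpeg -i input.mp4 -vf "freezedetect=n=0.003:d=0.5" -an -f null -

# 2) keep only the animated middle part, between the leading freeze_end
#    and the trailing freeze_start (placeholder values shown)
ffmpeg -i input.mp4 -ss 2.8 -to 14.5 -c:v libx264 trimmed.mp4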


-
ffmpeg convert white background to transparent
7 January 2021, by tony. Hello, I took a picture of a framed sheet with my camera and I'm trying to convert the white background to transparent without using the colorkey filter. It is meant for a video with a number of frames overlaid on a static background, but I started with a single PNG to see if it works.


So I started by making one palette with only two colors (black and white, generated with -f lavfi) and one palette to add the transparency:


ffmpeg -i blackwhite.png -filter_complex "[0:v]split[a][b];[a]palettegen=max_colors=4[out1];\
 [b]palettegen=max_colors=4:reserve_transparent=on:\
 transparency_color=#FFFFFF[out2]" \
 -c:v png \
 -map [out1] paletteNormal.png \
 -map [out2] paletteTransparent.png 



Then I mapped the PNG that I want to convert to the first palette to make the colors uniform, then to the second to add the transparency, and I overlaid the result on a background image that should be red:


ffmpeg -i image.png -i paletteNormal.png -i paletteTransparent.png -i background.png \
 -filter_complex "[0:v][1:v]paletteuse=dither=bayer[a],\
 [a]split[a1][a2]; \
 [a1][2:v]paletteuse=alpha_threshold=128[c];[3:v][c]overlay[d]" \
 -map [a2] -c:v png out.png \
 -map [d] -c:v png out1.png



The PNG mapped to the first palette comes out as it should, pure black and white.
The second comes out with no transparency at all and covers the background.
I tried different combinations, such as making the second palette 255 colors... nothing.


What am I doing wrong? I know it can be done.
I went to this site and made an alpha channel in 5 seconds, with a lot of grief and rage. linkOfPain


P.S. For a video made from PNG frames, what codec should I use?
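
One alternative that avoids both colorkey and the palette round-trip, not taken from the original post, is to compute the alpha channel directly from the pixel values with the geq filter. A minimal sketch, assuming a near-white background; the threshold 720 (out of a 765 maximum for r+g+b) is an arbitrary value to tune:

# make pixels whose r+g+b sum is above the threshold fully transparent
ffmpeg -i image.png -vf "format=gbrap,geq=r='r(X,Y)':g='g(X,Y)':b='b(X,Y)':a='if(gt(r(X,Y)+g(X,Y)+b(X,Y),720),0,255)'" -frames:v 1 transparent.png

# quick check: the near-white pixels should now let the red background show through
ffmpeg -i background.png -i transparent.png -filter_complex "[0][1]overlay" -frames:v 1 check.png

Regarding the P.S.: codecs that preserve an alpha channel include qtrle, prores_ks (profile 4444 with pixel format yuva444p10le) in a .mov container, or libvpx-vp9 with yuva420p in WebM.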


-
ffmpeg convert white color to alpha
11 January 2021, by tony. Hello, I took a picture with my camera: a drawing of a square frame on white paper. I'm trying to convert the white to transparent and keep the black frame.


So I started by making one palette with only two colors to make the colors uniform and one palette to add the transparency:


ffmpeg -f lavfi -i "color=color=white:100x100" -f lavfi -i "color=color=black:100x100" -filter_complex "[0][1]hstack" -frames:v 1 blackwhite.png
ffmpeg -i blackwhite.png -filter_complex "[0]split[a][b];[a]palettegen[pal1];[b]palettegen=reserve_transparent=on:transparency_color=white[pal2]" -map [pal1] palette1.png -map [pal2] palette2.png



Then I mapped the PNG image of the frame to convert the white to transparent and overlaid the result on a red background:


ffmpeg -i image.png -i palette1.png -i palette2.png -i background.png -filter_complex "[0:v][1:v]paletteuse=dither=bayer[a],[a]split[a1][a2];[a1][2:v]paletteuse=alpha_threshold=128[c];[3:v][c]overlay[d]" -map [a2] -c:v png out.png -map [d] -c:v png out1.png



The PNG mapped to the first palette (as a test) comes out as it should, pure black and white; the second comes out with no transparency at all and covers the background.
What am I doing wrong?
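
A different route that also avoids colorkey, offered here only as a hedged sketch: build a grayscale mask from the image itself (white becomes transparent, black stays opaque) and attach it as the alpha channel with alphamerge. It assumes image.png is the black-on-white drawing and background.png is the red background:

# derive the alpha from the picture itself: grayscale then invert,
# so white -> 0 (transparent) and black -> 255 (opaque)
ffmpeg -i image.png -filter_complex "[0]split[base][tmp];[tmp]format=gray,negate[mask];[base][mask]alphamerge" -frames:v 1 keyed.png

# quick check against the red background
ffmpeg -i background.png -i keyed.png -filter_complex overlay -frames:v 1 check.png

Because the mask is grayscale rather than a hard threshold, the anti-aliased edges of the black frame keep a smooth, semi-transparent border instead of a jagged cut-out.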