
Other articles (64)
-
Keeping control of your media in your hands
13 April 2011. The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media-sharing.
While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...) -
Participate in its translation
10 April 2011. You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
To do so, we use SPIP's translation interface, where all of MediaSPIP's language modules are available. You simply need to subscribe to the translators' mailing list to ask for more information.
Currently MediaSPIP is only available in French and (...) -
Websites made with MediaSPIP
2 May 2011. This page lists some websites based on MediaSPIP.
On other sites (11541)
-
How to create an encoding ladder for any aspect ratio?
18 April 2024, by volume one. For a given video uploaded by a user, I need to create four versions of it to cover standard definition (SD), high definition (HD), full high definition (FHD), and ultra high definition (UHD, e.g. 4K). There are well-known "resolution/encoding ladders" for standard aspect ratios like 16:9 and 4:3.


For 4:3 we might have:


640 x 480
960 x 720
1440 x 1080
2880 x 2160



For 16:9 we might have:


854 x 480
1280 x 720
1920 x 1080
3840 x 2160



If a user uploads a file in either of those aspect ratios, we can create the four different versions because the resolution standards are known.


However, if a user uploads a video with an unforeseen aspect ratio, say 23:19, then how would you go about formatting that video into SD, HD, FHD, and UHD versions?


If a 23:19 video is indeed uploaded, I am not looking to resize it to fit a different 'standard' aspect ratio. It must keep the same aspect ratio, but still have four quality versions. The problem is deciding what width and height to use for non-standard aspect ratios.


I have already come across many aspect ratios like 16:10, 21:9, 1.85:1, 2.39:1. How could I take care of making quality variations of those?

I am using Node.js and FFmpeg for video processing.
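One possible approach, as a sketch rather than a definitive answer: keep the standard rung heights (480, 720, 1080, 2160) and let ffmpeg derive the matching width from the source aspect ratio. The file names below are hypothetical, and the x264/AAC settings are just placeholders.

# scale=-2:<height> preserves the input aspect ratio and rounds the
# computed width to an even number, whatever the ratio is
ffmpeg -i input.mp4 -vf "scale=-2:480" -c:v libx264 -c:a aac out_sd.mp4
ffmpeg -i input.mp4 -vf "scale=-2:720" -c:v libx264 -c:a aac out_hd.mp4
ffmpeg -i input.mp4 -vf "scale=-2:1080" -c:v libx264 -c:a aac out_fhd.mp4
ffmpeg -i input.mp4 -vf "scale=-2:2160" -c:v libx264 -c:a aac out_uhd.mp4

For a 23:19 source, for example, the 480 rung would come out at roughly 582 x 480 (23/19 × 480 ≈ 581, adjusted to an even width).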

-
ffmpeg filter_complex multiple overlays - why are they not syncing properly
28 December 2023, by howdood. I'm setting up live streaming using three overlaid inputs in ffmpeg. 0:v is a webcam directly connected to the streaming box (a headless Linux box), while 2:v and 3:v are ultra-low-latency UDP streams from two remote R-Pis.


The filter_complex argument I'm using (that works, with all three video inputs perfectly in sync) is
[0:v]fifo[v1];[2:v]fifo[v2];[3:v]fifo[v3];[v1]setpts=PTS-STARTPTS[sync1];[v2]setpts=PTS-STARTPTS[sync2];[v3]setpts=PTS-STARTPTS+5/TB[sync3];[sync1][sync2]overlay=x=W-w:y=H-h[out1];[out1][sync3]overlay=x=0:y=H-h[vfinal]


Two questions about this:


- Why am I having to add +5/TB to [sync3] via setpts (shown above)? If I omit that, [3:v] is synced into the stream noticeably ahead of the other two inputs. My best guess is that this compensates for the processing time of the first overlay [sync1][sync2], which introduces extra latency into that part of the stream. Is that right?


- Importantly, the value of the extra pts compensation has to be dialed in by hand and will be specific to the underlying hardware used. Is there a way to get ffmpeg to calculate this automatically, so I don't have to worry about keeping it dialed in, in production?








Thanks for reading! I'm at the limit of the ffmpeg docs here...
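I don't know of a way to have ffmpeg derive that compensation by itself, but as a minimal sketch (same filtergraph as above, with a hypothetical shell variable), the hand-tuned value can at least be hoisted out of the graph so it is easy to re-dial per machine:

# SYNC_OFFSET is the hand-tuned compensation, in seconds (hypothetical name)
SYNC_OFFSET=5
ffmpeg ... -filter_complex "[0:v]fifo[v1];[2:v]fifo[v2];[3:v]fifo[v3];[v1]setpts=PTS-STARTPTS[sync1];[v2]setpts=PTS-STARTPTS[sync2];[v3]setpts=PTS-STARTPTS+${SYNC_OFFSET}/TB[sync3];[sync1][sync2]overlay=x=W-w:y=H-h[out1];[out1][sync3]overlay=x=0:y=H-h[vfinal]" ...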


-
How to use hardware acceleration for ffmpeg on m1-max?
10 February 2023, by ThoughtfulHacking. Since there aren't M1 builds available from ffmpeg.org, I had to compile my own. Obviously, I'd like to get the best possible performance.


- Does ffmpeg use the "Hardware-accelerated H.264" on the M1 Max?
- Is there anything I need to do, like compiler flags, to get it?
- Any switch at run time?
- How can I verify that it's being used?










To compile ffmpeg, I just did the basics:


./configure --prefix=/tmp/ff --enable-gpl --enable-nonfree --enable-libx264
make
make install



For x264, I just did:
./configure --prefix=/tmp/ff
make
make install


To run:


ffmpeg -i random.wmv -c:v libx264 -preset ultrafast -c:a aac output-ultra.mp4 



Anything else I should be doing?
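One thing worth checking, as a sketch rather than a definitive answer: libx264 is a software (CPU-only) encoder, so the command above does not use the Apple hardware encoder at all. On macOS builds, ffmpeg normally picks up Apple's hardware H.264 encoder through VideoToolbox at configure time (auto-detected, no extra flag usually needed), and it can be verified and used roughly like this:

# List the VideoToolbox encoders that were compiled in
# (should include h264_videotoolbox)
ffmpeg -encoders | grep videotoolbox

# Encode with the hardware encoder instead of libx264; the videotoolbox
# encoder is normally driven by a target bitrate rather than a preset
ffmpeg -i random.wmv -c:v h264_videotoolbox -b:v 4M -c:a aac output-hw.mp4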