
Media (1)
-
The Great Big Beautiful Tomorrow
28 October 2011
Updated: October 2011
Language: English
Type: Text
Other articles (34)
-
Sites built with MediaSPIP
2 May 2011. This page presents some of the sites running MediaSPIP.
You can of course add your own using the form at the bottom of the page. -
List of compatible distributions
26 April 2011. The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.
Distribution name | Version name | Version number
Debian | Squeeze | 6.x.x
Debian | Wheezy | 7.x.x
Debian | Jessie | 8.x.x
Ubuntu | The Precise Pangolin | 12.04 LTS
Ubuntu | The Trusty Tahr | 14.04
If you want to help us improve this list, you can give us access to a machine whose distribution is not mentioned above or send the necessary fixes to add (...) -
Submit enhancements and plugins
13 April 2011. If you have developed a new extension that adds one or more useful features to MediaSPIP, let us know and its integration into the core MediaSPIP functionality will be considered.
You can use the development discussion list to ask for help with creating a plugin. As MediaSPIP is based on SPIP, you can also use the SPIP discussion list, SPIP-Zone.
On other sites (5780)
-
How to create an encoding ladder for any aspect ratio?
18 April 2024, by volume one. For a given video uploaded by a user, I need to create four versions of it to cover standard definition (SD), high definition (HD), full high definition (FHD), and ultra high definition (UHD, e.g. 4K). There are known "resolution/encoding ladders" for standard aspect ratios like 16:9 and 4:3.


For 4:3 we might have:


640 x 480
960 x 720
1440 x 1080
2880 x 2160



For 16:9 we might have:


854 x 480
1280 x 720
1920 x 1080
3840 x 2160



If a user uploads a file in either of those aspect ratios, we can create the four different versions because the resolution standards are known.


However, if a user uploads a video with an unforeseen aspect ratio, say 23:19, how would you go about formatting that video into SD, HD, FHD, and UHD versions?


If a 23:19 video is indeed uploaded, I am not looking to resize it to fit a different 'standard' aspect ratio. It must keep the same aspect ratio but be available in four quality versions. The problem is deciding what width and height to use for these non-standard aspect ratios.


I have already come across many aspect ratios like 16:10, 21:9, 1.85:1, and 2.39:1. How could I take care of making quality variations of those?

I am using Node.js and FFmpeg for video processing.
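
One way to approach this (not part of the original question; the tier names and numbers below are assumptions) is to keep the familiar ladder heights of 480, 720, 1080 and 2160, derive each width from the source's own aspect ratio, and round to an even number so that encoders such as libx264 accept the dimensions. A minimal Node.js sketch:

// Sketch: derive an encoding ladder for an arbitrary aspect ratio.
// Assumption: standard tier heights are kept and the width is computed
// from the source dimensions, rounded to an even number (H.264 with
// yuv420p needs even width and height).
const TIERS = [
  { name: 'SD',  height: 480 },
  { name: 'HD',  height: 720 },
  { name: 'FHD', height: 1080 },
  { name: 'UHD', height: 2160 },
];

function buildLadder(sourceWidth, sourceHeight) {
  const aspect = sourceWidth / sourceHeight;
  return TIERS
    .filter(t => t.height <= sourceHeight) // never upscale past the source
    .map(t => ({ ...t, width: 2 * Math.round((t.height * aspect) / 2) }));
}

// Example: a 23:19 source uploaded at 2760x2280.
console.log(buildLadder(2760, 2280));
// -> [ { name: 'SD', height: 480, width: 582 }, ... ]
// Each rung maps to an FFmpeg scale filter, e.g. -vf scale=582:480

FFmpeg can also do the width arithmetic itself: scale=-2:720 keeps the input aspect ratio and picks an even width, which may be simpler than computing it in Node.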

-
Chromecast HLS: Unable to derive timescale
31 July 2020, by Scott. I'm trying to get fmp4 HLS playing back on a new Chromecast (3rd gen I believe, not Ultra).


I've tried encoding the content with ffmpeg using both x264 and h264 libraries.
The main profile initially gives me a codec-not-supported error; removing the codec list from the HLS manifest fixes this issue.


Switching to baseline (which is not ideal) doesn't give the codec error.


Both then (after removing the codec definitions or using baseline) give the following error:


Uncaught Error: Unable to derive timescale
 at Xl (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:344)
 at Y.$e (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:337)
 at Y.k.processSegment (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:340)
 at Am.k.processSegment (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:384)
 at Mj.$e (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:238)
 at Wj (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:236)
 at Oj (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:240)
 at Mj.fd (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:239)
 at Nc (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:39)
 at wi.Mc.dispatchEvent (www.gstatic.com/cast/sdk/libs/mediaplayer/1.0.0/media_player.js:38)
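
For context (this command is not taken from the question, and it is not offered as a fix for the timescale error), a minimal sketch of the kind of ffmpeg invocation that produces fMP4 HLS output like the asker describes, spawned from Node.js; file names, segment length and the choice of libx264 main profile are assumptions.

// Sketch: package a source file as fMP4 HLS (init segment + .m4s fragments).
const { spawn } = require('child_process');

const args = [
  '-i', 'input.mp4',
  '-c:v', 'libx264', '-profile:v', 'main',
  '-c:a', 'aac',
  '-f', 'hls',
  '-hls_time', '6',
  '-hls_playlist_type', 'vod',
  '-hls_segment_type', 'fmp4',           // fragmented MP4 segments instead of MPEG-TS
  '-hls_fmp4_init_filename', 'init.mp4',
  '-hls_segment_filename', 'seg_%03d.m4s',
  'stream.m3u8',
];

const ff = spawn('ffmpeg', args, { stdio: 'inherit' });
ff.on('close', code => console.log(`ffmpeg exited with code ${code}`));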



-
ffmpeg filter_complex multiple overlays - why are they not syncing properly
28 December 2023, by howdood. I'm setting up live streaming using three overlaid inputs in ffmpeg. 0:v is a webcam directly connected to the streaming box (a headless Linux box), while 2:v and 3:v are ultra-low-latency UDP streams from two remote R-Pis.


The filter_complex argument I'm using (that works, with all three video inputs perfectly in sync) is
[0:v]fifo[v1]; [2:v]fifo[v2]; [3:v]fifo[v3]; [v1]setpts=PTS-STARTPTS[sync1]; [v2]setpts=PTS-STARTPTS[sync2]; [v3]setpts=PTS-STARTPTS+5/TB[sync3]; [sync1][sync2]overlay=x=W-w:y=H-h[out1]; [out1][sync3]overlay=x=0:y=H-h[vfinal]


Two questions about this:


- Why am I having to add +5/TB to [sync3] via setpts? If I omit it, [3:v] is synced into the stream noticeably ahead of the other two inputs. My best guess is that this compensates for the processing time of the first overlay [sync1][sync2], which introduces extra latency into that part of the stream. Is that right?

- Importantly, the value of the extra PTS compensation has to be dialed in by hand and will be specific to the underlying hardware used. Is there a way of getting ffmpeg to calculate this automatically, so I don't have to worry about keeping it dialed in, in production?








Thanks for reading! I'm at the limit of the ffmpeg docs here...
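
Not part of the question: the hand-tuned offset can at least be isolated as a single configuration value when the filter graph is generated from Node.js, so it is easy to re-dial per deployment. The variable names and the environment-variable lookup below are assumptions; this does not answer the "calculate it automatically" part, it only keeps the per-hardware number in one obvious place.

// Sketch: build the filter_complex string with the sync offset (seconds)
// as one configurable value. The default of 5 mirrors the +5/TB used in
// the question; it still has to be measured for each hardware setup.
const SYNC3_OFFSET_SECONDS = Number(process.env.SYNC3_OFFSET ?? 5);

function buildFilterComplex(offsetSeconds) {
  return [
    '[0:v]fifo[v1]',
    '[2:v]fifo[v2]',
    '[3:v]fifo[v3]',
    '[v1]setpts=PTS-STARTPTS[sync1]',
    '[v2]setpts=PTS-STARTPTS[sync2]',
    `[v3]setpts=PTS-STARTPTS+${offsetSeconds}/TB[sync3]`,
    '[sync1][sync2]overlay=x=W-w:y=H-h[out1]',
    '[out1][sync3]overlay=x=0:y=H-h[vfinal]',
  ].join(';');
}

// Passed to ffmpeg as: -filter_complex <graph> -map "[vfinal]" ...
console.log(buildFilterComplex(SYNC3_OFFSET_SECONDS));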


-