
Other articles (34)
-
What is an editorial?
21 June 2013. Write your point of view in an article. It will be filed in a section set up for this purpose.
An editorial is a text-only article. Its purpose is to gather points of view in a dedicated section. A single editorial is featured on the home page. To read the previous ones, see the dedicated section.
You can customise the editorial creation form.
Editorial creation form: in the case of an editorial-type document, the (...) -
Contribute to translation
13 April 2011. You can help us improve the language used in the software interface to make MediaSPIP more accessible and user-friendly. You can also translate the interface into any language, which allows it to spread to new linguistic communities.
To do this, we use the translation interface of SPIP, where all the language modules of MediaSPIP are available. Just subscribe to the mailing list and request further information on translation.
MediaSPIP is currently available in French and English (...) -
Accepted formats
28 January 2010. The following commands give information about the formats and codecs supported by the local ffmpeg installation:
ffmpeg -codecs
ffmpeg -formats
Accepted input video formats
This list is not exhaustive; it highlights the main formats used:
h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10
m4v: raw MPEG-4 video format
flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263
Theora
wmv:
Possible output video formats
To begin with, we (...)
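For example, assuming a Unix shell with grep is available, the output of the two commands above can be filtered to check whether a given codec or container from this list is supported by the local build (h264 and flv are just sample queries):
ffmpeg -hide_banner -codecs | grep -i h264
ffmpeg -hide_banner -formats | grep -i flv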
On other sites (3848)
-
FFMPEG zoompan for continuous zoom in/zoom out from start duration
10 July 2020, by Nikhil Solanki. I am creating a continuous zoom-in/zoom-out effect for an input image using this command:


ffmpeg -i combine.mp4 -i image1.jpg -filter_complex "[0]split=2[color][alpha]; 
[color]crop=iw/2:ih:0:0[color]; [alpha]crop=iw/2:ih:iw/2:ih[alpha]; [color][alpha]alphamerge[v1];
[1]scale=540*2:960*2, setsar=1, zoompan=z='if(lte(zoom,1.0),1.02,max(1.001,zoom-0.0015))':d=25*0.25:x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)':s=540x960[v2]; 
[v2]zoompan=z='if(gte(zoom,1.1),1.0,min(zoom+0.0015,1.1))':d=25*0.25:x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)':s=540x960[v2];
[v2]curves=vintage, format=yuv444p[v2];
[v2][v1] overlay=1" -y output_video.mp4



This command continuously zooms in and zooms out on the input image1.jpg for 1 second and then stops, which is fine. The problem is that I want the zoom-in/zoom-out effect to start only after 5 seconds of video. The video duration is 20 s. So, how can I perform the zoompan only after some duration?
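One possible direction, sketched from the command above rather than a tested answer: generate the zoompan frames from image1.jpg as before, shift their timestamps by 5 seconds with setpts, and let overlay composite them only in that window. The single zoompan pass, d=25 (one second at 25 fps), the 5 to 6 second enable range and the [vzoom] label are assumptions:
ffmpeg -i combine.mp4 -i image1.jpg -filter_complex "[1]scale=540*2:960*2, setsar=1,
 zoompan=z='if(lte(zoom,1.0),1.02,max(1.001,zoom-0.0015))':d=25:x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)':s=540x960,
 setpts=PTS+5/TB[vzoom];
 [0][vzoom]overlay=eof_action=pass:enable='between(t,5,6)'" -y output_video.mp4
Here eof_action=pass keeps the main video going once the shifted image frames run out, and the enable expression restricts the overlay to the 5 to 6 second window.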

-
FFMPEG performance on remote server
9 April 2022, by Haider Ali. I am experiencing completely different performance with FFMPEG on the server compared to the development machine. The parameters used are:


ffmpeg -i input.mp4 -ss 00:00 -to 02:20 -codec:v libx264 -preset ultrafast -force_key_frames 'expr:gte(t,n_forced*4)' -hls_time 4 -hls_playlist_type vod -hls_segment_type mpegts output.mp4



Development Machine Specs


MacBook Pro (15-inch, 2019)

Processor 2.6 GHz 6-Core Intel Core i7

Memory 16 GB 2400 MHz DDR4

Graphics Intel UHD Graphics 630 1536 MB

Remote Server Specs

8 vCPUs

32 GB RAM

Does ffmpeg need special specs to run on a server?


The same command above can take 15 to 20 minutes on the server, while on the development machine it only takes 2 minutes.
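A hedged way to compare the two environments, assuming both binaries are on the PATH and input.mp4 is the same file: check the build configuration and available hardware accelerators, then time the encode with an explicit thread count and a null output (the thread count of 8 is only an example):
ffmpeg -hide_banner -buildconf
ffmpeg -hide_banner -hwaccels
ffmpeg -benchmark -i input.mp4 -ss 00:00 -to 02:20 -codec:v libx264 -preset ultrafast -threads 8 -f null -
The -benchmark flag prints elapsed and CPU time at the end, which helps show whether the server run is CPU-bound or limited elsewhere.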


-
How do I buffer and capture an RTSP stream to disk based on a trigger ?
2 September 2022, by SJoshi. I think what I'm asking about is similar to this ffmpeg wiki post about how to capture a lightning strike (https://trac.ffmpeg.org/wiki/Capture/Lightning).


I have a Raspberry Pi with an IP cam over RTSP, and what I'm wondering is how to maintain a continuous 5-second live video buffer, until I trigger a "save" command which will pipe that 5-second buffer to disk and then continue streaming the live video to disk until I turn it off.


Essentially, the Pi boots up, this magic black box process starts and keeps saving live video into a fixed-size, 5-second buffer, and then, let's say an hour later, I click a button: it flushes that 5-second buffer to a file on disk and continues to pipe the video to disk until I click cancel.


In my environment, I'm able to use ffmpeg, gstreamer, or openRTSP. For each of these, I can connect to my RTSP stream and save it to disk, but I'm not sure how to create this ever-present 5 second cache.


I feel like the gstreamer docs are alluding to it here (https://gstreamer.freedesktop.org/documentation/application-development/advanced/buffering.html?gi-language=c), but I guess I'm just not grokking how the buffering fits in with a triggered save. From that article, I get the impression that the end-time of the video is known in advance (I could artificially limit mine, I guess).


I'm not in a great position to post-process the file, so using something like openRTSP, saving a whole bunch of video segments, and then merging them isn't really an option.


Note: after a successful save, I wouldn't need to save another video for a minute or so, so that 5-second cache has plenty of time to fill back up before the next save.


This is the closest similar question that I've found: https://video.stackexchange.com/questions/18514/ffmpeg-buffered-recording
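One sketch of the buffering half, assuming ffmpeg's segment muxer, stream copy, and a RAM-backed path like /dev/shm are acceptable; the camera URL and file names are placeholders, and the "keep recording until cancel" part would be a second, plain recording started at trigger time:
# continuously overwrite a ring of six 1-second MPEG-TS segments (~5 s of history plus the segment being written)
ffmpeg -rtsp_transport tcp -i rtsp://camera.local/stream -c copy -f segment \
  -segment_time 1 -segment_wrap 6 /dev/shm/buf%d.ts
# on trigger: join whatever is in the ring with the concat protocol
# (a real script must order the segments oldest-first, e.g. by mtime, before building this list)
ffmpeg -i "concat:/dev/shm/buf0.ts|/dev/shm/buf1.ts|/dev/shm/buf2.ts|/dev/shm/buf3.ts|/dev/shm/buf4.ts|/dev/shm/buf5.ts" -c copy prebuffer.ts
This is the approach the linked Capture/Lightning wiki page is built around: -segment_wrap keeps the on-disk footprint fixed, and because the segments are MPEG-TS with continuous timestamps they can be joined without re-encoding.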