
Media (2)
-
Core Media Video
4 April 2013
Updated: June 2013
Language: French
Type: Video
-
Video d’abeille en portrait
14 May 2011
Updated: February 2012
Language: French
Type: Video
Other articles (107)
-
Diogene: creating specific templates for content editing forms
26 October 2010
Diogene is one of the SPIP plugins enabled by default (as an extension) when MediaSPIP is initialised.
What this plugin is for
Creating form templates
The Diogène plugin lets you create section-specific form templates for the three SPIP objects: articles, sections (rubriques) and sites.
It thus makes it possible to define, for a given section, a form template per object, adding or removing fields so as to make the form (...)
-
Managing object creation and editing rights
8 February 2011
By default, many features are limited to administrators but can each be configured independently to change the minimum status required to use them, notably: writing content on the site, adjustable in the form-template management; adding notes to articles; adding captions and annotations to images;
-
MediaSPIP 0.1 Beta version
25 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)
On other sites (8563)
-
ffmpeg - Understand images to video output, players show different images and lengths
8 July 2023, by Matt C
I'm using multiple images to create a video, with each image shown for one second. This is the command I'm using:


ffmpeg -framerate 1 -i 'image%d.jpg' -c:v libx264 -r 1 -pix_fmt yuv420p out.mp4



This seems straightforward and is exactly what many others have done with success. However, the output I get is an MP4 which Windows File Explorer says is 4 seconds long, and which plays differently in VLC and Windows Media Player; neither is the desired output.


In Windows: the video plays a black screen for 4 seconds, at which point the timeline at the bottom is full, indicating the video is over. But it keeps playing for another 4 seconds, and those last 4 seconds (from 0:04 to 0:07) are actually the desired output.







Frame | Image
----- | ------------
1     | black screen
2     | black screen
3     | black screen
4     | black screen
5     | image1.jpg
6     | image2.jpg
7     | image3.jpg
8     | image4.jpg









In VLC: the video shows the last input image for 3 seconds, followed by the second-to-last image for 1 second.







Frame | Image
----- | ----------
1     | image4.jpg
2     | image4.jpg
3     | image4.jpg
4     | image3.jpg









Questions:

- How/why are these different in different players?
- Why, in VLC, are only two images showing up, and why would one of them last for 3 seconds?
- In Windows, why/how is the video 8 seconds long but shows up as 4 seconds both in File Explorer and in the actual media player?
- How do I get the desired output, and what caused my case not to work as it seemingly did for most others?
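
On that last point, one commonly suggested variant, offered here only as a sketch and not taken from the original post, is to keep the 1 fps input rate but encode the output at a standard playback rate, so that each image is duplicated across ordinary frames; very low-rate streams are a known source of inconsistent player behaviour, though whether this fixes this exact file is an assumption:


ffmpeg -framerate 1 -i 'image%d.jpg' -c:v libx264 -r 30 -pix_fmt yuv420p out.mp4


Each source image still occupies one second of output; the only change is that the stream is 30 fps constant frame rate instead of 1 fps.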










-
How can I show that a frame has been duplicated to extend the video framerate using ffprobe?
14 July 2023, by Brandon J
As the title suggests, I have a video.mp4 which I can see visually has been extended from 5 fps to 20 fps. I know this because there are 256 frames, and when I run ffprobe it reports 20 fps and a 12.8-second duration. I also run


ffprobe -v 0 -select_streams v -show_entries stream=duration_ts,time_base,nb_frames video.mp4


which reports 256 frames, a 1/20 time base and a duration_ts of 256. This matches the expected 12.8 s duration. When I manually sort through the extracted frames I can see that each frame is held for 4 ticks, so it should really be 5 fps.


I then run the command below to view the packets, and the equivalent for the frames (command not typed out):


ffprobe -show_packets -select_streams v:0 video.mp4


and neither the packets nor the frames give me any strong indication that frames have been duplicated.


With -show_packets, the only possible indication of duplication I can see is that every 0.2 seconds (consistent with 5 fps) the packet size jumps from a consistent 150-300 bytes to around 16000. Is there a way I can better articulate what I am seeing with this packet-size change? Why did their encoder (forgive any error in verbiage) decide to duplicate frames to reach 20 fps rather than just extending the pts/duration of each packet to 0.2 seconds? It seems like simply giving each packet a longer duration would reduce the overall file size anyway.
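
One way to make that pattern concrete, sketched here with a field selection that is simply a convenient choice rather than anything from the original post, is to dump only the timestamp and size of each video packet in CSV form; the large packet at every 0.2 s boundary followed by runs of tiny packets then lines up directly with the suspected duplication:


ffprobe -v error -select_streams v:0 -show_entries packet=pts_time,size -of csv=p=0 video.mp4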


All that said, is there something in ffprobe or another tool I can use to more efficiently confirm what I am seeing visually, i.e. that these frames were simply duplicated by another program? Thanks!
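
One further sketch worth noting, under an assumption the original post does not state: if the encoder coded the duplicates as skip frames that decode bit-identically to the previous picture (which would also explain the 150-300 byte packets), hashing every decoded frame will show runs of identical hashes, four per run for a 5 fps source padded to 20 fps:


ffmpeg -i video.mp4 -map 0:v -f framemd5 -


Each output line is one decoded frame with its MD5, so counting consecutive identical hashes gives the duplication factor. If the duplicates were lossily re-encoded rather than skip-coded, the hashes will differ slightly and this check will not catch them.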


-
ffmpeg command to scale, show images at exactly 130bpm [closed]
17 August 2023, by S. Imp
I have a sequence of images which I would like to display in time with music that plays at 130 BPM. I'll also need to scale the images, which are a rather odd 2673x2151 pixels each, down to something that would fit without stretching inside a 1080p frame, e.g. 1342x1080.


130 BPM yields awkward frame rates: there are 2.1666... (13/6) beats per second. Because of that, I can't figure out how many frames to show each image for at the usual frame rates (24, 25, 30 fps). If I could make a movie with a frame rate of 13/6 frames per second, I could simply show each image for one frame. This seems like it might actually be optimal; it would probably make a very compact video file, right?
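
ffmpeg does accept rational frame rates for image input, so a minimal sketch along those lines (assuming the j1.jpg, j2.jpg, ... naming mentioned later in the post; player support for a 13/6 fps stream is the main open question) would be:


ffmpeg -framerate 13/6 -pattern_type glob -i 'j*.jpg' -vf "scale=-2:1080" -c:v libx264 -pix_fmt yuv420p out.mp4


Here scale=-2:1080 keeps the aspect ratio and rounds the width to an even value (1342 for these images), which libx264 needs with yuv420p. Note that glob sorts j10.jpg before j2.jpg, so zero-padded names or the sequence pattern -i 'j%d.jpg' may be safer.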


Alternatively, if we must set the frame rate to a positive integer value, 13 frames per second works if we just display each image for six frames: 13 fps means 780 frames per minute, and 780 frames divided by 130 beats is 6 frames per beat.
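
A sketch of that integer-rate alternative, with the same naming assumption as above: read the images at 13/6 fps and write constant 13 fps, so ffmpeg repeats each image for exactly six output frames, i.e. one beat:


ffmpeg -framerate 13/6 -pattern_type glob -i 'j*.jpg' -vf "scale=-2:1080" -r 13 -c:v libx264 -pix_fmt yuv420p out_13fps.mp4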


Finally, my images are named j1.jpg, j2.jpg, j3.jpg, etc.


Can someone help me concoct an ffmpeg command to assemble these images into a video with each image lasting one beat at 130 BPM? I've been trying to massage the command below, which does assemble the images into a movie, but my attempts to specify a frame rate have had weird effects; e.g., adding -r results in strange videos that change image very erratically. I think it's because there's a setpts=N/25/TB bit in there.

ffmpeg -pattern_type glob -i "j*.jpg" -filter_complex "[0]reverse[r];[0][r]concat,loop=2:250,setpts=N/25/TB,scale=1342:1080" -pix_fmt yuv420p -c:v libx264 -preset slow -b:v 3500k output_looped.mp4



Also, I don't understand what the 250 means in
loop=2:250
. If someone could explain that to me, I'd be grateful.
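
For reference on that last point, offered as a sketch rather than a definitive reading of this exact pipeline: the loop filter's positional options are loop, size and start, so in loop=2:250 the 250 is the size (the maximum number of frames buffered and replayed) and the 2 is the loop count. The post's own command can be rewritten with named options to make that explicit:


# same filter as in the post, with the loop options spelled out:
# loop = how many times the buffered block is repeated, size = frames in the block, start = first frame (default 0)
ffmpeg -pattern_type glob -i "j*.jpg" -filter_complex "[0]reverse[r];[0][r]concat,loop=loop=2:size=250:start=0,setpts=N/25/TB,scale=1342:1080" -pix_fmt yuv420p -c:v libx264 -preset slow -b:v 3500k output_looped.mp4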