
Media (2)
-
SPIP - plugins - embed code - Exemple
2 September 2013
Updated: September 2013
Language: French
Type: Image
-
Publier une image simplement
13 April 2011
Updated: February 2012
Language: French
Type: Video
Other articles (84)
-
Les autorisations surchargées par les plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors can edit their own information on the authors page
-
Des sites réalisés avec MediaSPIP
2 May 2011
This page presents some of the sites running MediaSPIP.
You can of course add your own using the form at the bottom of the page.
-
Configurer la prise en compte des langues
15 November 2010
Accessing the configuration and adding supported languages
To enable support for new languages, go to the "Administer" section of the site.
From there, in the navigation menu, you can reach a "Language management" section that lets you enable support for new languages.
Each newly added language can still be deactivated as long as no object has been created in that language. In that case, it becomes greyed out in the configuration and (...)
On other sites (9017)
-
Humble Video take snapshot of given time
23 May 2021, by Boldbayar
Hello, I'm using https://github.com/artclarke/humble-video to take a thumbnail from a video.
So far I have successfully managed to take a snapshot from a video at the start with the following method.


import java.awt.Graphics2D;
import java.awt.Image;
import java.awt.geom.AffineTransform;
import java.awt.image.AffineTransformOp;
import java.awt.image.BufferedImage;
import java.io.IOException;
import java.nio.file.Path;

import javax.imageio.ImageIO;

import io.humble.video.Decoder;
import io.humble.video.Demuxer;
import io.humble.video.DemuxerStream;
import io.humble.video.KeyValueBag;
import io.humble.video.MediaDescriptor;
import io.humble.video.MediaPacket;
import io.humble.video.MediaPicture;
import io.humble.video.awt.MediaPictureConverter;
import io.humble.video.awt.MediaPictureConverterFactory;

private static Path generateThumbnail(final Path videoFile)
    throws InterruptedException, IOException {

  final Demuxer demuxer = Demuxer.make();
  demuxer.open(videoFile.toString(), null, false, true, null, null);

  // Find the first video stream and remember its "rotate" metadata, if any.
  int streamIndex = -1;
  Decoder videoDecoder = null;
  String rotate = null;
  final int numStreams = demuxer.getNumStreams();
  for (int i = 0; i < numStreams; ++i) {
    final DemuxerStream stream = demuxer.getStream(i);
    final KeyValueBag metaData = stream.getMetaData();
    final Decoder decoder = stream.getDecoder();
    if (decoder != null
        && decoder.getCodecType() == MediaDescriptor.Type.MEDIA_VIDEO) {
      videoDecoder = decoder;
      streamIndex = i;
      rotate = metaData.getValue("rotate", KeyValueBag.Flags.KVB_NONE);
      break;
    }
  }

  if (videoDecoder == null) {
    throw new IOException("Not a valid video file");
  }
  videoDecoder.open(null, null);

  final MediaPicture picture = MediaPicture.make(videoDecoder.getWidth(),
      videoDecoder.getHeight(), videoDecoder.getPixelFormat());

  final MediaPictureConverter converter = MediaPictureConverterFactory
      .createConverter(MediaPictureConverterFactory.HUMBLE_BGR_24, picture);

  // Decode packets from the video stream until the first complete picture,
  // then convert it to a BufferedImage.
  final MediaPacket packet = MediaPacket.make();
  BufferedImage image = null;
  MUX: while (demuxer.read(packet) >= 0) {
    if (packet.getStreamIndex() != streamIndex) {
      continue;
    }
    int offset = 0;
    do {
      final int bytesRead = videoDecoder.decode(picture, packet, offset);
      if (picture.isComplete()) {
        image = converter.toImage(null, picture);
        break MUX;
      }
      offset += bytesRead;
    } while (offset < packet.getSize());
  }
  demuxer.close();

  if (image == null) {
    throw new IOException("Unable to find a complete video frame");
  }

  // Apply the container's rotation metadata (this translate pattern assumes a
  // 90/270 degree rotation, where width and height swap).
  if (rotate != null) {
    final AffineTransform transform = new AffineTransform();
    transform.translate(0.5 * image.getHeight(), 0.5 * image.getWidth());
    transform.rotate(Math.toRadians(Double.parseDouble(rotate)));
    transform.translate(-0.5 * image.getWidth(), -0.5 * image.getHeight());
    final AffineTransformOp op = new AffineTransformOp(transform,
        AffineTransformOp.TYPE_BILINEAR);
    image = op.filter(image, null);
  }

  final Path target = videoFile.getParent()
      .resolve(videoFile.getFileName() + ".thumb.jpg");

  // Scale so the longer side is 216 px, keeping the aspect ratio.
  final double mul;
  if (image.getWidth() > image.getHeight()) {
    mul = 216 / (double) image.getWidth();
  } else {
    mul = 216 / (double) image.getHeight();
  }

  final int newW = (int) (image.getWidth() * mul);
  final int newH = (int) (image.getHeight() * mul);
  final Image thumbnailImage = image.getScaledInstance(newW, newH,
      Image.SCALE_SMOOTH);
  image = new BufferedImage(newW, newH, BufferedImage.TYPE_INT_BGR);

  final Graphics2D g2d = image.createGraphics();
  g2d.drawImage(thumbnailImage, 0, 0, null);
  g2d.dispose();

  ImageIO.write(image, "jpeg", target.toFile());
  return target.toAbsolutePath();
}



Now, what I want to do is take a snapshot 2 seconds after the video starts. Is that possible? I have tried using the Demuxer's seek method, but with no luck.
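
One direction that may work, sketched below under the assumption that MediaPicture exposes getTimeStamp() and getTimeBase() the way the rest of the humble-video API used above suggests (worth verifying against the library version in use): keep demuxing and decoding exactly as in the method above, but skip every complete frame whose presentation time is still below the target, and only convert the first frame at or after 2 seconds. Even if Demuxer's seek is used first, seeking typically lands on a keyframe before the target, so a forward-decoding check like this is still needed. The helper name grabFrameAt is hypothetical.

// Hypothetical helper (sketch): decode forward until the first complete frame
// whose presentation time is >= targetSeconds, then convert it to an image.
// Assumes picture.getTimeStamp()/getTimeBase() report the presentation time.
private static BufferedImage grabFrameAt(final Demuxer demuxer,
    final Decoder videoDecoder, final MediaPictureConverter converter,
    final MediaPicture picture, final int streamIndex,
    final double targetSeconds) throws InterruptedException, IOException {
  final MediaPacket packet = MediaPacket.make();
  while (demuxer.read(packet) >= 0) {
    if (packet.getStreamIndex() != streamIndex) {
      continue;
    }
    int offset = 0;
    do {
      final int bytesRead = videoDecoder.decode(picture, packet, offset);
      if (picture.isComplete()) {
        // Convert the raw timestamp to seconds using the picture's time base.
        final double seconds =
            picture.getTimeStamp() * picture.getTimeBase().getDouble();
        if (seconds >= targetSeconds) {
          return converter.toImage(null, picture);
        }
      }
      offset += bytesRead;
    } while (offset < packet.getSize());
  }
  throw new IOException("No complete frame at or after " + targetSeconds + "s");
}

In generateThumbnail, the existing decode loop could then be replaced by something like image = grabFrameAt(demuxer, videoDecoder, converter, picture, streamIndex, 2.0). Decoding a couple of seconds of video this way is slower than an accurate seek, but it avoids depending on seek flags and keyframe placement.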

-
Generate and concatenate videos from images with ffmpeg in single command
17 August 2022, by YulkyTulky
My goal is to generate a video from images. Let's say I have 2 images, 1.png and 2.png.

I can do


ffmpeg -loop 1 -i 1.png -t 3 1.mp4



ffmpeg -loop 1 -i 2.png -t 5 2.mp4



to create a 3-second video from the first image and a 5-second video from the second image.


Then, I merge the two videos using


ffmpeg -i 1.mp4 -i 2.mp4 -filter_complex "concat" final.mp4



to create my final 8-second video.


This process seems extremely inefficient, and I feel I should not have to spend all this processing power and disk I/O on creating two intermediate video files when I only want the single final video.


Is there a way to execute this entire process efficiently in one ffmpeg command?
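
For what it's worth, one single-command approach (a sketch only, assuming both PNGs have the same dimensions; the 25 fps frame rate and the yuv420p conversion are extra assumptions added for player compatibility) is to loop each image as its own input with its own -t, then concatenate inside one filtergraph:

ffmpeg -loop 1 -t 3 -framerate 25 -i 1.png -loop 1 -t 5 -framerate 25 -i 2.png -filter_complex "[0:v][1:v]concat=n=2:v=1:a=0,format=yuv420p[v]" -map "[v]" final.mp4

The concat filter requires its inputs to share the same resolution, so if the images differ in size a per-input scale/pad step would be needed before concat.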


-
How can I play an audio file (.mp3, .flac, .wav) and then loop over it (mix in every few seconds) another audio file (wav) using ffmpeg
11 March 2019, by lukistar
I got two different commands.
ffmpeg -i input.mp3 -i second.mp3 -filter_complex "[0:a]atrim=end=10,asetpts=N/SR/TB[begin];[0:a]atrim=start=10,asetpts=N/SR/TB[end];[begin][1:a][end]concat=n=3:v=0:a=1[a]" -map "[a]" output
This command inserts second.mp3 into input.mp3. It seems to always keep the parameters of input.mp3, and it inserts second.mp3 at exactly 10 seconds into input.mp3.
Here is the second command:
ffmpeg -i input.mp3 -i second.mp3 -filter_complex "[1:a]adelay=10000|10000[1a];[0:a][1a]amix=duration=first" output
This command is closer to my final goal. It plays input.mp3 and, at exactly 10 seconds, plays second.mp3 along with it without stopping input.mp3's sound (I think that's called mixing?).
My final goal is to create final.mp3.
Its duration must always equal the duration of input.mp3, and it must keep the sample rate, channel count, etc. of input.mp3.
When playing final.mp3, it must play the whole of input.mp3.
But every 10-15 seconds it must also play second.mp3 without stopping input.mp3 (mix).
You could say I need the second command, but in a loop.
It would be great if there were a one-line ffmpeg command for that.
I am working with flac, mp3 and wav, and both of the commands were suitable for that. For example:
input.mp3 could be 40 seconds long.
second.mp3 could be 2 seconds long.
When I play final.mp3 it will be 40 seconds long, but every 10-15 seconds (at random) it will play second.mp3 at the same time as input.mp3.
Sadly I have no experience with ffmpeg; both of the commands I got are answers to questions here on Stack Overflow. Hope somebody can help me. Thank you!
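
Not a definitive answer, but one possible single-command direction, assuming an ffmpeg build where apad supports pad_dur and the aloop filter is available, and assuming second.mp3 is about 2 seconds long at 44.1 kHz: pad second.mp3 with silence up to the repeat interval, loop that padded clip forever, and mix it over input.mp3 with duration=first so the output keeps input.mp3's length:

ffmpeg -i input.mp3 -i second.mp3 -filter_complex "[1:a]apad=pad_dur=8,asetpts=N/SR/TB,aloop=loop=-1:size=441000[rep];[0:a][rep]amix=inputs=2:duration=first[a]" -map "[a]" final.mp3

Here pad_dur=8 plus the roughly 2-second clip gives a repeat about every 10 seconds, and size must cover the padded clip in samples (10 s x 44100 Hz = 441000 in this assumed case). Two caveats: amix lowers the input levels by default, so a volume filter on the mixed output may be wanted, and a genuinely random 10-15 second gap cannot be expressed in a single static filtergraph; that part would need a small script that generates an adelay/amix chain per repetition.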