
Other articles (65)
-
Sites built with MediaSPIP
2 May 2011 — This page presents some of the sites running MediaSPIP.
You can of course add your own using the form at the bottom of the page. -
Changing the publication date
21 June 2013 — How do you change the publication date of a media item?
You first need to add a "Date de publication" field in the relevant form mask:
Administrer > Configuration des masques de formulaires > select "Un média"
In the "Champs à ajouter" section, check "Date de publication"
Click Enregistrer at the bottom of the page -
Customizing by adding your own logo, banner, or background image
5 September 2013 — Some themes support three customization elements: adding a logo; adding a banner; adding a background image.
On other sites (9095)
-
Generate and concatenate videos from images with ffmpeg in single command
17 August 2022, by YulkyTulky
My goal is to generate a video from images. Let's say I have two images, 1.png and 2.png.

I can do


ffmpeg -loop 1 -i 1.png -t 3 1.mp4



ffmpeg -loop 1 -i 2.png -t 5 2.mp4



to create a 3 second video from the first image and a 5 second video from the second image.


Then, I merge the two videos using


ffmpeg -i 1.mp4 -i 2.mp4 -filter_complex "concat=n=2:v=1:a=0" final.mp4



to create my final 8 second video.


This process seems extremely inefficient: all of that processing and disk I/O goes into producing two intermediate video files when I only want the one final video.


Is there a way to execute this entire process efficiently in a single ffmpeg command?
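One way to answer this (a sketch, assuming ffmpeg's `concat` filter and two same-sized PNGs): loop each image as its own input with its own `-t`, then concatenate the two video streams inside a single filtergraph, so no intermediate files are written. The first two commands only generate stand-in inputs for the poster's `1.png` and `2.png`:

```shell
# Stand-in inputs (substitutes for the poster's 1.png and 2.png):
ffmpeg -y -f lavfi -i color=red:size=320x240 -frames:v 1 1.png
ffmpeg -y -f lavfi -i color=blue:size=320x240 -frames:v 1 2.png

# Single command, no intermediate files: loop each image for its
# duration, then concatenate the two video streams in one filtergraph.
ffmpeg -y -loop 1 -t 3 -i 1.png \
       -loop 1 -t 5 -i 2.png \
       -filter_complex "[0:v][1:v]concat=n=2:v=1:a=0[v]" \
       -map "[v]" -pix_fmt yuv420p final.mp4
```

The `-t` before each `-i` limits how long that looped input runs, so the concat filter sees a 3 s stream followed by a 5 s stream and emits 8 s total.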


-
How can I play an audio file (.mp3, .flac, .wav) and repeatedly mix another audio file (.wav) over it every few seconds using ffmpeg?
11 March 2019, by lukistar
I got two different commands.
ffmpeg -i input.mp3 -i second.mp3 -filter_complex "[0:a]atrim=end=10,asetpts=N/SR/TB[begin];[0:a]atrim=start=10,asetpts=N/SR/TB[end];[begin][1:a][end]concat=n=3:v=0:a=1[a]" -map "[a]" output
This command inserts second.mp3 into input.mp3 at exactly 10 seconds, and it seems to always keep input.mp3's parameters.
Here is the second command:
ffmpeg -i input.mp3 -i second.mp3 -filter_complex "[1:a]adelay=10000|10000[1a];[0:a][1a]amix=duration=first" output
This command is closer to my final goal: it plays input.mp3 and, at exactly 10 seconds, plays second.mp3 on top without stopping input.mp3's sound (I think that's called mixing?).
My final goal is to create final.mp3.
Its duration must always equal input.mp3's duration, and it must keep input.mp3's sample rate, channel count, and so on.
When played, final.mp3 must contain the whole of input.mp3,
but every 10-15 seconds it must also play second.mp3 without stopping input.mp3 (a mix).
In other words, I need the second command, but in a loop.
It would be great if there were a one-line ffmpeg command for that.
I am working with flac, mp3, and wav, and both commands handle those. For example:
input.mp3 could be 40 seconds long.
second.mp3 could be 2 seconds long.
When I play final.mp3 it will be 40 seconds long, but every 10-15 seconds (at random) it will play second.mp3 at the same time as input.mp3.
Sadly I have no experience with ffmpeg; both of the commands I have are answers to questions here on Stack Overflow. I hope somebody can help me. Thank you!
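One hedged sketch of "the second command in a loop": instead of a random 10-15 s gap, assume a fixed 12 s interval. Pad the short file with silence to a 12 s slot (`apad`), repeat that slot forever (`aloop`), and mix it over the main file; `amix=duration=first` keeps the output exactly as long as the main input. The sketch uses wav stand-ins generated on the spot; the filenames mirror the question's:

```shell
# Stand-in inputs: a 40 s tone as input and a 2 s tone as the overlay.
ffmpeg -y -f lavfi -i sine=frequency=220:duration=40 input.wav
ffmpeg -y -f lavfi -i sine=frequency=880:duration=2 second.wav

# Pad second.wav with silence to a fixed 12 s slot, loop that slot forever,
# then mix it over input.wav; duration=first trims the result to input.wav's
# length. 529200 samples = 12 s at the sine source's 44.1 kHz sample rate.
ffmpeg -y -i input.wav -i second.wav -filter_complex "[1:a]apad=whole_dur=12,aloop=loop=-1:size=529200[rep];[0:a][rep]amix=inputs=2:duration=first:dropout_transition=0[a]" -map "[a]" final.wav
```

The `size` value in `aloop` is in samples, so it must match the slot length times the actual sample rate of the overlay file; for a random 10-15 s gap rather than a fixed one, a small script generating several `adelay` entries would be needed instead.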
-
JavaCV Video Player with FFmpeg and JavaFX
2 June 2021, by Yazdan Naderi
I want to create a media player using JavaCV, but I cannot adjust the frame rate,
so some videos play too fast and some too slow.


Is it possible to read a video's information through one FFmpegFrameGrabber object and set it on another object to solve the problem?
I did try this without success; if there is another way, please help.


playThread = new Thread(new Runnable() {
    public void run() {
        try {
            final String videoFilename = "E:\\Java\\s1\\3.mp4";
            final String videoFilename2 = "E:AudioVideo.mp4";
            final String videoFilename3 = "E:\\1.mp4";
            final String videoFilename4 = "E:\\Java\\s1\\3.mp4";

            FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(videoFilename);

            grabber.start();

            grabber.setFrameRate(30.00);

            primaryStage.setWidth(grabber.getImageWidth());
            primaryStage.setHeight(grabber.getImageHeight());

            final AudioFormat audioFormat = new AudioFormat(grabber.getSampleRate(), 16, grabber.getAudioChannels(), true, true);

            final DataLine.Info info = new DataLine.Info(SourceDataLine.class, audioFormat);
            final SourceDataLine soundLine = (SourceDataLine) AudioSystem.getLine(info);
            soundLine.open(audioFormat);
            soundLine.start();

            JavaFXFrameConverter converter = new JavaFXFrameConverter();

            ExecutorService executor = Executors.newSingleThreadExecutor();

            while (!Thread.interrupted()) {
                Frame frame = grabber.grab();
                if (frame == null) {
                    break;
                }
                if (frame.image != null) {
                    final Image image = converter.convert(frame);
                    Platform.runLater(() -> {
                        imageView.setImage(image);
                    });
                } else if (frame.samples != null) {
                    final ShortBuffer channelSamplesShortBuffer = (ShortBuffer) frame.samples[0];
                    channelSamplesShortBuffer.rewind();

                    final ByteBuffer outBuffer = ByteBuffer.allocate(channelSamplesShortBuffer.capacity() * 2);

                    for (int i = 0; i < channelSamplesShortBuffer.capacity(); i++) {
                        short val = channelSamplesShortBuffer.get(i);
                        outBuffer.putShort(val);
                    }

                    try {
                        executor.execute(() -> {
                            soundLine.write(outBuffer.array(), 0, outBuffer.capacity());
                            outBuffer.clear();
                        });
                    } catch (Exception interruptedException) {
                        Thread.currentThread().interrupt();
                    }
                }
            }
            executor.shutdownNow();
            executor.awaitTermination(10, TimeUnit.SECONDS);

            soundLine.stop();
            grabber.stop();
            grabber.release();
            Platform.exit();
        } catch (Exception exception) {
            LOG.log(Level.SEVERE, null, exception);
            System.exit(1);
        }
    }
});
playThread.start();
}
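A common fix for the fast/slow playback in code like the above: the grab loop renders frames as fast as `grab()` returns them, and hardcoding `setFrameRate(30.00)` does not slow the loop down. One sketch of a remedy is to pace the loop from the file's own rate (in JavaCV that would come from `grabber.getFrameRate()` after `start()`). The class below is a minimal, library-free pacer; the class name and structure are illustrative, not part of JavaCV:

```java
// Minimal sketch: sleep the grab loop so frames are shown at the source's
// own frame rate instead of as fast as they can be decoded.
public class FramePacer {
    private final long frameIntervalNanos;  // time budget per frame
    private long nextDeadline = -1;         // absolute deadline of next frame

    public FramePacer(double framesPerSecond) {
        // e.g. pass grabber.getFrameRate() here after grabber.start()
        this.frameIntervalNanos = (long) (1_000_000_000L / framesPerSecond);
    }

    /** Call once per grabbed video frame; sleeps until that frame is due. */
    public void awaitNextFrame() throws InterruptedException {
        long now = System.nanoTime();
        if (nextDeadline < 0) {
            nextDeadline = now;  // first frame shows immediately
        }
        long sleepNanos = nextDeadline - now;
        if (sleepNanos > 0) {
            Thread.sleep(sleepNanos / 1_000_000, (int) (sleepNanos % 1_000_000));
        }
        nextDeadline += frameIntervalNanos;
    }
}
```

In the loop above, one would construct `new FramePacer(grabber.getFrameRate())` after `grabber.start()` and call `awaitNextFrame()` just before `Platform.runLater(...)` for each image frame, leaving the audio path untouched.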