
Other articles (67)
-
The SPIPmotion queue
28 November 2010
A queue stored in the database
When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to be encoded; id_objet, the unique identifier of the object to which the encoded document will automatically be attached; objet, the type of object to which (...)
-
User profiles
12 April 2011
Each user has a profile page where they can edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialised, visible only when the visitor is logged in to the site.
Users can reach the profile editor from their author page; a "Modifier votre profil" link in the navigation is (...)
-
Configuring language support
15 November 2010
Accessing the configuration and adding supported languages
To configure support for new languages, you need to go to the "Administrer" section of the site.
From there, the navigation menu gives access to a "Gestion des langues" section where support for new languages can be enabled.
Each newly added language can still be deactivated as long as no object has been created in that language. In that case, it is greyed out in the configuration and (...)
On other sites (8758)
-
How to concat videos with FFmpeg without a black frame in the middle
21 July 2020, by david k
I need to join two videos together programmatically, and FFmpeg seems to be working perfectly except it leaves a single black frame between the two joined videos.


Based on this post, I'm guessing it's because the video and audio streams aren't exactly the same length (not sure why). Since I'm seeing a black flash, I'm guessing that the audio is slightly longer than the video, but I can't figure out how to remedy that. I'm fine losing a tiny piece of the audio at the end of the file.


Can anyone help me "trim" the end of my audio stream so that it matches my video stream? I need to be able to do this for many files, so I can't just hard-code a value into my command.


I've attempted both of the methods that FFmpeg provides for concatenation, and both of them have this issue. I'd be fine with either approach, if I can get rid of the black frame. (A possible trimming approach is sketched after the console output below.)


Transcode:


ffmpeg -i video1.mp4 -i video2.mp4 -filter_complex \
 "[0:v][0:a][1:v][1:a] concat=n=2:v=1:a=1 [outv] [outa]" \
 -map "[outv]" -map "[outa]" output.mp4



Re-wrap:


ffmpeg -f concat -i files.txt -c copy output.mp4



Here is the output of ffmpeg -i video1.mp4 -i video2.mp4:


ffmpeg version 4.2.2 Copyright (c) 2000-2019 the FFmpeg developers
 built with Apple clang version 11.0.0 (clang-1100.0.33.16)
 configuration: --prefix=/usr/local/Cellar/ffmpeg/4.2.2_1 --enable-shared --enable-pthreads --enable-version3 --enable-avresample --cc=clang --host-cflags='-I/Library/Java/JavaVirtualMachines/adoptopenjdk-13.0.1.jdk/Contents/Home/include -I/Library/Java/JavaVirtualMachines/adoptopenjdk-13.0.1.jdk/Contents/Home/include/darwin -fno-stack-check' --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libmp3lame --enable-libopus --enable-librubberband --enable-libsnappy --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librtmp --enable-libspeex --enable-libsoxr --enable-videotoolbox --disable-libjack --disable-indev=jack
 libavutil 56. 31.100 / 56. 31.100
 libavcodec 58. 54.100 / 58. 54.100
 libavformat 58. 29.100 / 58. 29.100
 libavdevice 58. 8.100 / 58. 8.100
 libavfilter 7. 57.100 / 7. 57.100
 libavresample 4. 0. 0 / 4. 0. 0
 libswscale 5. 5.100 / 5. 5.100
 libswresample 3. 5.100 / 3. 5.100
 libpostproc 55. 5.100 / 55. 5.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'video1.mp4':
 Metadata:
 major_brand : mp42
 minor_version : 0
 compatible_brands: isommp42
 creation_time : 2020-07-20T20:43:05.000000Z
 Duration: 00:00:09.88, start: 0.000000, bitrate: 1330 kb/s
 Stream #0:0(und): Audio: aac (LC) (mp4a / 0x6134706D), 32000 Hz, mono, fltp, 53 kb/s (default)
 Metadata:
 creation_time : 2020-07-20T20:43:05.000000Z
 handler_name : AAC audio
 Stream #0:1(und): Video: h264 (Baseline) (avc1 / 0x31637661), yuv420p, 1280x720, 1274 kb/s, 25 fps, 25 tbr, 30k tbn, 60k tbc (default)
 Metadata:
 creation_time : 2020-07-20T20:43:05.000000Z
 handler_name : H.264/AVC video
 encoder : AVC Coding
Input #1, mov,mp4,m4a,3gp,3g2,mj2, from 'video2.mp4':
 Metadata:
 major_brand : mp42
 minor_version : 0
 compatible_brands: isommp42
 creation_time : 2020-07-20T20:43:06.000000Z
 Duration: 00:00:11.68, start: 0.000000, bitrate: 1346 kb/s
 Stream #1:0(und): Audio: aac (LC) (mp4a / 0x6134706D), 32000 Hz, mono, fltp, 53 kb/s (default)
 Metadata:
 creation_time : 2020-07-20T20:43:06.000000Z
 handler_name : AAC audio
 Stream #1:1(und): Video: h264 (Baseline) (avc1 / 0x31637661), yuv420p, 1280x720, 1289 kb/s, 25 fps, 25 tbr, 30k tbn, 60k tbc (default)
 Metadata:
 creation_time : 2020-07-20T20:43:06.000000Z
 handler_name : H.264/AVC video
 encoder : AVC Coding
At least one output file must be specified
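
One possible workaround, offered only as a hedged sketch rather than a confirmed fix for this question: since -shortest ends an output once its shortest stream runs out, each input could first be re-muxed with -shortest so the trailing audio is dropped, and the trimmed copies concatenated afterwards. The files_trimmed.txt list is a hypothetical counterpart to the files.txt used above, and stream-copy trimming is packet-accurate rather than frame-accurate.

# Hedged sketch: trim each input so the audio ends with the video, then concatenate.
ffmpeg -i video1.mp4 -c copy -shortest video1_trimmed.mp4
ffmpeg -i video2.mp4 -c copy -shortest video2_trimmed.mp4
ffmpeg -f concat -i files_trimmed.txt -c copy output.mp4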



-
ffmpeg: Infinite length output when overlaying subtitles onto black image
18 July 2020, by rosuav
I'm trying to do some analysis of image-based subtitles by outputting them as a sequence of PNGs to a pipe. My command line looks like this:


ffmpeg -y -i $INPUTFILE -f lavfi -i color=c=black:s=1920x1080 -filter_complex "[1:v][0:s:5]overlay[v]" -shortest -map "[v]" -c:v png -f image2pipe - | pike subspng.pike



In theory, -shortest should mean that the stream stops at the shortest input, which would be roughly seven minutes of input file. Instead, my script receives an infinite sequence of black frames after the last frame of subtitles, until I send FFmpeg a SIGINT. Placing -shortest before -filter_complex has the same effect.

Is there a different way to force the filtering to stop at the end of the input file?


EDIT: Using the shortest=1 flag on the overlay filter also doesn't help, even in combination with -shortest.
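
A hedged workaround, assuming the goal is simply that the output should not outlive the subtitle source (the ffprobe lookup and the -t cap below are a suggestion, not something from the original post): read the input file's duration with ffprobe and pass it to -t as an explicit output limit.

# Hypothetical workaround: cap the output at the input file's duration.
DURATION=$(ffprobe -v error -show_entries format=duration -of csv=p=0 "$INPUTFILE")
ffmpeg -y -i "$INPUTFILE" -f lavfi -i color=c=black:s=1920x1080 \
 -filter_complex "[1:v][0:s:5]overlay[v]" \
 -map "[v]" -t "$DURATION" -c:v png -f image2pipe - | pike subspng.pike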

-
Android Studio - Appending two videos shows a black screen
6 September 2020, by Darkhmar
First of all, hello.


When I append another mp4 file to the end of an mp4 file, the resulting video shows only a black screen, but the audio plays fine.


If I append a copy of a video to itself, there is no problem, but that is not the scenario I want.


I get a "text relocations" error when merging with the FFmpeg library. (That problem goes away when I change the library version, but then I run into other problems, so mp4parser is the alternative I am looking at.)


The piece of code that merges the videos is as follows; see also the diagnostic sketch after it. Thanks in advance.


private void appendTwoVideos(String firstVideoPath, String secondVideoPath) {
    try {
        // Parse both input files with mp4parser.
        Movie[] inMovies = new Movie[2];
        inMovies[0] = MovieCreator.build(firstVideoPath);
        inMovies[1] = MovieCreator.build(secondVideoPath);

        // Collect the audio ("soun") and video ("vide") tracks of both movies.
        List<Track> videoTracks = new LinkedList<>();
        List<Track> audioTracks = new LinkedList<>();

        for (Movie m : inMovies) {
            for (Track t : m.getTracks()) {
                if (t.getHandler().equals("soun")) {
                    audioTracks.add(t);
                }
                if (t.getHandler().equals("vide")) {
                    videoTracks.add(t);
                }
            }
        }

        // Append the collected tracks back to back into a new movie.
        Movie result = new Movie();

        if (audioTracks.size() > 0) {
            result.addTrack(new AppendTrack(audioTracks
                    .toArray(new Track[audioTracks.size()])));
        }
        if (videoTracks.size() > 0) {
            result.addTrack(new AppendTrack(videoTracks
                    .toArray(new Track[videoTracks.size()])));
        }

        // Mux the result and write it to output.mp4.
        BasicContainer out = (BasicContainer) new DefaultMp4Builder().build(result);

        @SuppressWarnings("resource")
        FileChannel fc = new RandomAccessFile(DIRECTORY_PATH + "output.mp4", "rw").getChannel();
        out.writeContainer(fc);
        fc.close();
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
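
One diagnostic worth trying, offered as an assumption rather than as part of the original question: AppendTrack joins the raw samples without re-encoding, so a black picture with working sound frequently points to the two inputs having different video codec parameters (profile, resolution, pixel format and so on). Comparing the inputs with ffprobe can make such a mismatch visible; first.mp4 and second.mp4 below are placeholders for the two files being appended.

# Hedged diagnostic sketch: compare the video stream parameters of both inputs.
ffprobe -v error -select_streams v:0 \
 -show_entries stream=codec_name,profile,level,width,height,pix_fmt \
 -of default=noprint_wrappers=1 first.mp4
ffprobe -v error -select_streams v:0 \
 -show_entries stream=codec_name,profile,level,width,height,pix_fmt \
 -of default=noprint_wrappers=1 second.mp4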