
Media (9)
-
Stereo master soundtrack
17 October 2011, by
Updated: October 2011
Language: English
Type: Audio
-
Elephants Dream - Cover of the soundtrack
17 October 2011, by
Updated: October 2011
Language: English
Type: Image
-
#7 Ambience
16 October 2011, by
Updated: June 2015
Language: English
Type: Audio
-
#6 Teaser Music
16 October 2011, by
Updated: February 2013
Language: English
Type: Audio
-
#5 End Title
16 October 2011, by
Updated: February 2013
Language: English
Type: Audio
-
#3 The Safest Place
16 October 2011, by
Updated: February 2013
Language: English
Type: Audio
Other articles (105)
-
Improving the visual appearance
10 April 2011
MediaSPIP is based on a system of themes and templates ("squelettes"). Templates define the placement of information on the page, defining a specific use of the platform, while themes define the overall graphic design.
Anyone can propose a new graphic theme or template and make it available to the community.
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed MediaSPIP is at version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out.
-
Adding user-specific information and other author-related behaviour changes
12 April 2011, by
The simplest way to add information to authors is to install the Inscription3 plugin. It also makes it possible to modify certain user-related behaviours (refer to its documentation for more information).
It is also possible to add fields to authors by installing the plugins "champs extras 2" and "Interface pour champs extras".
Sur d’autres sites (8215)
-
Combining three or more videos with ffmpeg and the xfade filter
25 February 2021, by sil
As of 2020, ffmpeg has the xfade filter, which can combine videos with a transition. Combining two videos is easy enough:


ffmpeg -i vid1.mp4 -i vid2.mp4 \
 -filter_complex "[0][1]xfade=transition=pixelize:duration=1:offset=4,format=yuv420p" \
 out.mp4



But I don't understand how to combine three videos (so that v1 fades into v2, and v2 then fades into v3). I tried something like this:


ffmpeg -i vid1.mp4 -i vid2.mp4 -i vid3.mp4 \
 -filter_complex "[0][1]xfade=transition=pixelize:duration=1:offset=4,format=yuv420p[0n1];[0n1][2]xfade=transition=pixelize:duration=1:offset=9,format=yuv420p" \
 out.mp4



but that doesn't work. My idea was that 0 and 1 (vid1 and vid2) would be combined into a [0n1] stream with a transition by xfade, and that the 0n1 stream could then be combined with vid3 by another xfade. As far as I can tell, this includes the first two videos but not the third. What this of course means is that I don't understand how to specify a filtergraph correctly!


How should I use xfade to combine 3 or more videos with transitions between them?


A full example is as follows. Here I'll use three images (because then issues with combining videos at different frame rates are avoided), and smash them all to 500x500 for ease (in the final version they would be letterboxed to keep resolution and so on).


ffmpeg \
 -loop 1 -t 5 -i tests/p1.jpg \
 -loop 1 -t 5 -i tests/p2.jpg \
 -loop 1 -t 5 -i tests/p3.jpg \
 -filter_complex "[0]scale=500:500[s0];[1]scale=500:500[s1];[2]scale=500:500[s2];[s0][s1]xfade=transition=pixelize:duration=1:offset=4,format=yuv420p[s01];[s01][s2]xfade=transition=pixelize:duration=1:offset=9,format=yuv420p" out.mp4
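One thing worth checking in a chain like the one above is the offset arithmetic: xfade's offset is measured on the timeline of the filter's first input, and after the first xfade that input is the already-joined stream, which is shorter than the sum of the clips because each transition overlaps by one second. Under that reading, for equal clips of d seconds and t-second fades, fade k should start at k*(d - t), which here puts the second fade at 8 rather than 9 (an offset of 9 would land exactly at the end of the joined stream, which may be why p3 never appears). A small shell sketch of the bookkeeping; treat the formula as an assumption to verify:

```shell
# Chained-xfade offset bookkeeping for equal-length clips:
#   offset_k = k * (d - t)
# d = clip duration, t = transition duration (values from the example).
d=5
t=1
off1=$(( 1 * (d - t) ))   # start of fade clip1 -> clip2
off2=$(( 2 * (d - t) ))   # start of fade clip2 -> clip3
echo "offset1=$off1 offset2=$off2"   # prints: offset1=4 offset2=8
```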



I would expect this to create a video which was:

- 4 seconds of p1.jpg
- a pixelise transition into p2.jpg lasting 1 second
- 4 seconds of p2.jpg
- a pixelise transition into p3.jpg lasting 1 second
- 4 seconds of p3.jpg
but what I actually get is

- 4 seconds of p1.jpg
- a pixelise transition into p2.jpg lasting 1 second
- 4 seconds of p2.jpg

and then the video ends. p3 is not included at all.


The output is as follows:


ffmpeg version N-53546-g5eb4405fc5-static https://johnvansickle.com/ffmpeg/ Copyright (c) 2000-2020 the FFmpeg developers
 built with gcc 8 (Debian 8.3.0-6)
 configuration: --enable-gpl --enable-version3 --enable-static --disable-debug --disable-ffplay --disable-indev=sndio --disable-outdev=sndio --cc=gcc --enable-fontconfig --enable-frei0r --enable-gnutls --enable-gmp --enable-libgme --enable-gray --enable-libaom --enable-libfribidi --enable-libass --enable-libvmaf --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librubberband --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libvorbis --enable-libopus --enable-libtheora --enable-libvidstab --enable-libvo-amrwbenc --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libdav1d --enable-libxvid --enable-libzvbi --enable-libzimg
 libavutil 56. 56.100 / 56. 56.100
 libavcodec 58. 97.100 / 58. 97.100
 libavformat 58. 49.100 / 58. 49.100
 libavdevice 58. 11.101 / 58. 11.101
 libavfilter 7. 87.100 / 7. 87.100
 libswscale 5. 8.100 / 5. 8.100
 libswresample 3. 8.100 / 3. 8.100
 libpostproc 55. 8.100 / 55. 8.100
Input #0, image2, from 'tests/p1.jpg':
 Duration: 00:00:00.04, start: 0.000000, bitrate: 44845 kb/s
 Stream #0:0: Video: mjpeg (Baseline), yuvj444p(pc, bt470bg/unknown/unknown), 820x1270 [SAR 150:150 DAR 82:127], 25 fps, 25 tbr, 25 tbn, 25 tbc
Input #1, image2, from 'tests/p2.jpg':
 Duration: 00:00:00.04, start: 0.000000, bitrate: 22325 kb/s
 Stream #1:0: Video: mjpeg (Baseline), yuvj420p(pc, bt470bg/unknown/unknown), 960x600 [SAR 1:1 DAR 8:5], 25 fps, 25 tbr, 25 tbn, 25 tbc
Input #2, image2, from 'tests/p3.jpg':
 Duration: 00:00:00.04, start: 0.000000, bitrate: 15266 kb/s
 Stream #2:0: Video: mjpeg (Baseline), yuvj420p(pc, bt470bg/unknown/unknown), 728x669 [SAR 96:96 DAR 728:669], 25 fps, 25 tbr, 25 tbn, 25 tbc
File 'out.mp4' already exists. Overwrite? [y/N] y
Stream mapping:
 Stream #0:0 (mjpeg) -> scale
 Stream #1:0 (mjpeg) -> scale
 Stream #2:0 (mjpeg) -> scale
 format -> Stream #0:0 (libx264)
Press [q] to stop, [?] for help
[swscaler @ 0x8228040] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 0x8258640] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 0x827df40] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 0x829f800] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 0x82c13c0] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 0x82e8340] deprecated pixel format used, make sure you did set range correctly
[libx264 @ 0x77b7600] using SAR=82/127
[libx264 @ 0x77b7600] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
[libx264 @ 0x77b7600] profile High, level 3.0, 4:2:0, 8-bit
[libx264 @ 0x77b7600] 264 - core 161 r3018 db0d417 - H.264/MPEG-4 AVC codec - Copyleft 2003-2020 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to 'out.mp4':
 Metadata:
 encoder : Lavf58.49.100
 Stream #0:0: Video: h264 (libx264) (avc1 / 0x31637661), yuv420p, 500x500 [SAR 82:127 DAR 82:127], q=-1--1, 25 fps, 12800 tbn, 25 tbc (default)
 Metadata:
 encoder : Lavc58.97.100 libx264
 Side data:
 cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
frame= 101 fps=100 q=28.0 size= 0kB time=00:00:01.92 bitrate= 0.2kbits/s speed=[swscaler @ 0x8291d80] deprecated pixel format used, make sure you did set range correctly
 Last message repeated 2 times
[swscaler @ 0x82b3000] deprecated pixel format used, make sure you did set range correctly
 Last message repeated 2 times
[swscaler @ 0x82fc200] deprecated pixel format used, make sure you did set range correctly
 Last message repeated 2 times
frame= 153 fps=101 q=28.0 size= 0kB time=00:00:04.00 bitrate= 0.1kbits/s speed=[swscaler @ 0x82fc200] deprecated pixel format used, make sure you did set range correctly
 Last message repeated 2 times
frame= 225 fps=111 q=28.0 size= 256kB time=00:00:06.88 bitrate= 304.9kbits/s dup=0 frame= 225 fps= 95 q=-1.0 Lsize= 267kB time=00:00:08.88 bitrate= 245.9kbits/s dup=0 drop=125 speed=3.74x 
video:263kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.293847%
[libx264 @ 0x77b7600] frame I:2 Avg QP:21.03 size: 19928
[libx264 @ 0x77b7600] frame P:66 Avg QP:21.18 size: 2453
[libx264 @ 0x77b7600] frame B:157 Avg QP:29.72 size: 427
[libx264 @ 0x77b7600] consecutive B-frames: 5.3% 4.4% 1.3% 88.9%
[libx264 @ 0x77b7600] mb I I16..4: 29.0% 27.8% 43.2%
[libx264 @ 0x77b7600] mb P I16..4: 10.8% 3.2% 7.5% P16..4: 2.6% 0.9% 0.3% 0.0% 0.0% skip:74.8%
[libx264 @ 0x77b7600] mb B I16..4: 1.1% 0.5% 1.6% B16..8: 1.0% 0.5% 0.1% direct: 0.3% skip:94.9% L0:37.4% L1:43.5% BI:19.1%
[libx264 @ 0x77b7600] 8x8 transform intra:16.5% inter:55.3%
[libx264 @ 0x77b7600] coded y,uvDC,uvAC intra: 31.4% 71.7% 31.8% inter: 1.1% 1.4% 0.1%
[libx264 @ 0x77b7600] i16 v,h,dc,p: 45% 46% 9% 1%
[libx264 @ 0x77b7600] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 25% 30% 27% 6% 1% 1% 2% 1% 7%
[libx264 @ 0x77b7600] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 56% 31% 9% 1% 1% 1% 1% 1% 1%
[libx264 @ 0x77b7600] i8c dc,h,v,p: 46% 27% 23% 4%
[libx264 @ 0x77b7600] Weighted P-Frames: Y:10.6% UV:7.6%
[libx264 @ 0x77b7600] ref P L0: 61.9% 17.1% 13.9% 6.8% 0.3%
[libx264 @ 0x77b7600] ref B L0: 76.3% 22.8% 0.9%
[libx264 @ 0x77b7600] ref B L1: 97.6% 2.4%
[libx264 @ 0x77b7600] kb/s:238.88



-
When I record video with JavaCV, I get "java.lang.NoClassDefFoundError: org.bytedeco.javacpp.avutil"
9 April 2020, by Pradeep Simba
I am making a video recorder Android app with JavaCV. But when I run the app, this error occurs: "java.lang.NoClassDefFoundError: org.bytedeco.javacpp.avutil".



How can I solve this error?



build.gradle file



android {
    ..............
    packagingOptions {
        exclude 'META-INF/services/javax.annotation.processing.Processor'
        pickFirst 'META-INF/maven/org.bytedeco.javacpp-presets/opencv/pom.properties'
        pickFirst 'META-INF/maven/org.bytedeco.javacpp-presets/opencv/pom.xml'
        pickFirst 'META-INF/maven/org.bytedeco.javacpp-presets/ffmpeg/pom.properties'
        pickFirst 'META-INF/maven/org.bytedeco.javacpp-presets/ffmpeg/pom.xml'
    }
}

dependencies {
    implementation group: 'org.bytedeco', name: 'javacv', version: '1.1'
    implementation group: 'org.bytedeco.javacpp-presets', name: 'opencv', version: '3.0.0-1.1', classifier: 'android-arm'
    implementation group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '2.8.1-1.1', classifier: 'android-arm'
    implementation group: 'org.bytedeco.javacpp-presets', name: 'opencv', version: '3.0.0-1.1', classifier: 'android-x86'
    implementation group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '2.8.1-1.1', classifier: 'android-x86'
}




My demo code: VideoService, which is invoked in MainActivity



package com.fs.fs.api;

import com.fs.fs.App;
import com.fs.fs.utils.DateUtils;
import com.fs.fs.utils.FileUtils;

import org.bytedeco.javacpp.avcodec;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.FrameRecorder;

import java.util.Date;

/**
 * Created by wyx on 2017/1/11.
 */
public class VideoService {
    private FFmpegFrameRecorder mFrameRecorder;
    private String path;

    private VideoService() {
    }

    private static class SingletonHolder {
        private static final VideoService INSTANCE = new VideoService();
    }

    public static VideoService getInstance() {
        return SingletonHolder.INSTANCE;
    }

    public void startRecordVideo() {
        String fileName = String.format("%s.%s", DateUtils.date2String(new Date(), "yyyyMMdd_HHmmss"), "mp4");
        path = FileUtils.getExternalFullPath(App.getInstance(), fileName);
        mFrameRecorder = new FFmpegFrameRecorder(path, 640, 480, 1);
        mFrameRecorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
        mFrameRecorder.setVideoOption("tune", "zerolatency");
        mFrameRecorder.setVideoOption("preset", "ultrafast");
        mFrameRecorder.setVideoOption("crf", "28");
        mFrameRecorder.setVideoBitrate(300 * 1000);
        mFrameRecorder.setFormat("mp4");

        mFrameRecorder.setFrameRate(30);
        mFrameRecorder.setAudioOption("crf", "0");
        mFrameRecorder.setSampleRate(48 * 1000);
        mFrameRecorder.setAudioBitrate(960 * 1000);
        mFrameRecorder.setAudioCodec(avcodec.AV_CODEC_ID_AAC);
        try {
            mFrameRecorder.start();
        } catch (FrameRecorder.Exception e) {
            e.printStackTrace();
        }
    }

    public void stop() {
        if (mFrameRecorder != null) {
            try {
                mFrameRecorder.stop();
                mFrameRecorder.release();
            } catch (FrameRecorder.Exception e) {
                e.printStackTrace();
            }
            mFrameRecorder = null;
        }
    }
}




MainActivity



package com.fs.fs.activity;

import android.app.Activity;
import android.os.Bundle;

import com.fs.fs.R;
import com.fs.fs.api.VideoService;

import static java.lang.Thread.sleep;

public class MainActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        VideoService.getInstance().startRecordVideo();
        try {
            sleep(10 * 1000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
        VideoService.getInstance().stop();
    }
}




Error



E/AndroidRuntime: FATAL EXCEPTION: main
 Process: com.example.usb, PID: 660
 java.lang.NoClassDefFoundError: org.bytedeco.javacpp.avutil
 at org.bytedeco.javacpp.Loader.load(Loader.java:590)
 at org.bytedeco.javacpp.Loader.load(Loader.java:530)
 at org.bytedeco.javacpp.avcodec$AVPacket.<clinit>(avcodec.java:1694)
 at org.bytedeco.javacv.FFmpegFrameRecorder.<init>(FFmpegFrameRecorder.java:149)
 at com.fs.fs.api.VideoService.startRecordVideo(VideoService.java:34)
 at com.fs.fs.activity.MainActivity.onCreate(MainActivity.java:75)
 at android.app.Activity.performCreate(Activity.java:5304)
 at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1090)
 at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2245)
 at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2331)
 at android.app.ActivityThread.access$1000(ActivityThread.java:143)
 at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1244)
 at android.os.Handler.dispatchMessage(Handler.java:102)
 at android.os.Looper.loop(Looper.java:136)
 at android.app.ActivityThread.main(ActivityThread.java:5291)
 at java.lang.reflect.Method.invokeNative(Native Method)
 at java.lang.reflect.Method.invoke(Method.java:515)
 at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:849)
 at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:665)
 at dalvik.system.NativeStart.main(Native Method)
 Caused by: java.lang.ClassNotFoundException: org.bytedeco.javacpp.avutil
 at java.lang.Class.classForName(Native Method)
 at java.lang.Class.forName(Class.java:251)
 at org.bytedeco.javacpp.Loader.load(Loader.java:585)
 at org.bytedeco.javacpp.Loader.load(Loader.java:530) 
 at org.bytedeco.javacpp.avcodec$AVPacket.<clinit>(avcodec.java:1694) 
 at org.bytedeco.javacv.FFmpegFrameRecorder.<init>(FFmpegFrameRecorder.java:149) 
 at com.fs.fs.api.VideoService.startRecordVideo(VideoService.java:34) 
 at com.fs.fs.activity.MainActivity.onCreate(MainActivity.java:75) 
 at android.app.Activity.performCreate(Activity.java:5304) 
 at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1090) 
 at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2245) 
 at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2331) 
 at android.app.ActivityThread.access$1000(ActivityThread.java:143) 
 at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1244) 
 at android.os.Handler.dispatchMessage(Handler.java:102) 
 at android.os.Looper.loop(Looper.java:136) 
 at android.app.ActivityThread.main(ActivityThread.java:5291) 
 at java.lang.reflect.Method.invokeNative(Native Method) 
 at java.lang.reflect.Method.invoke(Method.java:515) 
 at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:849) 
 at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:665) 
 at dalvik.system.NativeStart.main(Native Method) 
 Caused by: java.lang.NoClassDefFoundError: org/bytedeco/javacpp/avutil
 at java.lang.Class.classForName(Native Method) 
 at java.lang.Class.forName(Class.java:251) 
 at org.bytedeco.javacpp.Loader.load(Loader.java:585) 
 at org.bytedeco.javacpp.Loader.load(Loader.java:530) 
 at org.bytedeco.javacpp.avcodec$AVPacket.<clinit>(avcodec.java:1694) 
 at org.bytedeco.javacv.FFmpegFrameRecorder.<init>(FFmpegFrameRecorder.java:149) 
 at com.fs.fs.api.VideoService.startRecordVideo(VideoService.java:34) 
 at com.fs.fs.activity.MainActivity.onCreate(MainActivity.java:75) 
 at android.app.Activity.performCreate(Activity.java:5304) 
 at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1090) 
 at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2245) 
 at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2331) 
 at android.app.ActivityThread.access$1000(ActivityThread.java:143) 
 at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1244) 
 at android.os.Handler.dispatchMessage(Handler.java:102) 
 at android.os.Looper.loop(Looper.java:136) 
 at android.app.ActivityThread.main(ActivityThread.java:5291) 
 at java.lang.reflect.Method.invokeNative(Native Method) 
 at java.lang.reflect.Method.invoke(Method.java:515) 
 at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:849) 
 at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:665) 
 at dalvik.system.NativeStart.main(Native Method) 
 Caused by: java.lang.ClassNotFoundException: Didn't find class "org.bytedeco.javacpp.avutil" on path: DexPathList[[zip file "/data/app/com.fs.fs-2.apk"],nativeLibraryDirectories=[/data/app-lib/com.fs.fs-2, /vendor/lib, /system/lib, /data/datalib]]
 at dalvik.system.BaseDexClassLoader.findClass(BaseDexClassLoader.java:56)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:497)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:457)
 at java.lang.Class.classForName(Native Method) 
 at java.lang.Class.forName(Class.java:251) 
 at org.bytedeco.javacpp.Loader.load(Loader.java:585) 
 at org.bytedeco.javacpp.Loader.load(Loader.java:530) 
 at org.bytedeco.javacpp.avcodec$AVPacket.<clinit>(avcodec.java:1694) 
 at org.bytedeco.javacv.FFmpegFrameRecorder.<init>(FFmpegFrameRecorder.java:149) 
 at com.fs.fs.api.VideoService.startRecordVideo(VideoService.java:34) 
 at com.fs.fs.activity.MainActivity.onCreate(MainActivity.java:75) 
 at android.app.Activity.performCreate(Activity.java:5304) 
 at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1090) 
 at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2245) 
 at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2331) 
 at android.app.ActivityThread.access$1000(ActivityThread.java:143) 
 at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1244) 
 at android.os.Handler.dispatchMessage(Handler.java:102) 
 at android.os.Looper.loop(Looper.java:136) 
 at android.app.ActivityThread.main(ActivityThread.java:5291) 
 at java.lang.reflect.Method.invokeNative(Native Method) 
 at java.lang.reflect.Method.invoke(Method.java:515) 
 at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:849) 
 at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:665) 
 at dalvik.system.NativeStart.main(Native Method) 



How can I solve this error?



Why does this error occur?


-
FFmpeg: Add background image during the process of video creation
10 July 2018, by Killer
In reference to my previous question "FFmpeg: merge images with audio for specific duration", I have successfully merged a few images with audio for a specific duration using the following command.
ffmpeg \
-y \
-f concat \
-safe 0 \
-r 1/5 \
-i concat.txt \
-i audio.ogg \
-c:v libx264 \
-profile:v high \
-crf 17 \
-preset ultrafast \
-strict experimental \
-t 15 \
output.mp4

In order to add a background image, I tried https://superuser.com/a/876275/299733 and other solutions that exist on the web, but the given solution did not overlay my images properly and I was getting a black video for the whole duration. Therefore, I encoded the resulting video from the earlier command once again via:
ffmpeg \
-y \
-loop 1 \
-i bg.jpg \
-i output.mp4 \
-filter_complex "overlay=(W-w)/2:(H-h)/2:shortest=1,format=yuv420p" \
-c:v libx264 \
-profile:v high \
-crf 17 \
-preset ultrafast \
-strict experimental \
-t 15 \
output2.mp4

Now I am able to get the desired result. Is there any way to merge both steps into a single pass? Or a better way, without any loss of performance?
Additional details:
concat.txt
file '/home/shubham/Desktop/FFMpeg/image_1.jpg'
file '/home/shubham/Desktop/FFMpeg/image_2.jpg'
file '/home/shubham/Desktop/FFMpeg/image_3.jpg'

Based on @Gyan's response:
Updated concat.txt:
file '/home/shubham/Desktop/FFMpeg/image_4.jpg'
duration 5
file '/home/shubham/Desktop/FFMpeg/image_5.jpg'
duration 5
file '/home/shubham/Desktop/FFMpeg/image_6.jpg'
duration 5
file '/home/shubham/Desktop/FFMpeg/image_6.jpg'

Updated command:
ffmpeg \
-y \
-loop 1 \
-i bg.jpg \
-f concat \
-safe 0 \
-i concat.txt \
-i audio.ogg \
-filter_complex "[1]fps=25[v];[0][v]overlay=(W-w)/2:(H-h)/2:shortest=1,format=yuv420p" \
-c:v libx264 \
-profile:v high \
-crf 17 \
-preset ultrafast \
-strict experimental \
-t 15 \
output.mp4

The problem is that images are skipped, whether they have different resolutions or even the same resolution, and most of the time only the last image is shown. There seem to be no criteria for which image is selected and which one is skipped.
Sample:
https://drive.google.com/file/d/1JxCsV2eudKzdgWWuefXqdaWPaBf9Dzzd/view?usp=sharing

However, if I repeatedly use the same image, or copy the image and rename it, then in both cases I get the proper images on the background without any skipping.
EDIT: 09 July 2018
As @Gyan stated in the comments: convert each image to the same resolution and type.
I checked the info of the images via ffmpeg -i image_X.jpg and found that two images have different pixel formats:
Image 1: Stream #0:0: Video: mjpeg, yuvj420p(pc, bt470bg/unknown/unknown), 640x480 [SAR 72:72 DAR 4:3], 25 tbr, 25 tbn, 25 tbc
Image 2: Stream #0:0: Video: mjpeg, yuvj444p(pc, bt470bg/unknown/unknown), 640x480 [SAR 72:72 DAR 4:3], 25 tbr, 25 tbn, 25 tbc
This is the likely cause of the failure when merging.
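If mixed pixel formats are indeed the culprit, one option is to re-encode every input image to a single size and pixel format before building concat.txt. A minimal dry-run sketch under that assumption (the filenames and the choice of yuvj420p are illustrative; the script only prints the ffmpeg commands rather than running them):

```shell
# Dry-run sketch: emit, for each image, the ffmpeg command that would
# rescale it to 640x480 and force yuvj420p (the format Image 1 already
# uses), writing the result to a norm_ prefixed copy.
normalize_cmd() {
  printf 'ffmpeg -y -i %s -vf scale=640:480 -pix_fmt yuvj420p norm_%s' "$1" "$1"
}

for f in image_1.jpg image_2.jpg image_3.jpg; do
  normalize_cmd "$f"
  echo
done
```

The norm_*.jpg files would then replace the originals in concat.txt, so the concat demuxer sees frames of a uniform size and format.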