
Other articles (55)
-
Writing a news item
21 June 2013
Present the changes in your MediaSPIP, or news about your projects, in the news section of your MediaSPIP site.
In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customise the form used to create a news item.
News item creation form: in the case of a document of the news type, the fields offered by default are: publication date (customise the publication date) (...)
-
MediaSPIP 0.1 Beta version
25 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all the software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)
-
Submit bugs and patches
13 April 2011
Unfortunately, software is never perfect.
If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information: the browser you are using, including the exact version; as precise an explanation of the problem as possible; if possible, the steps taken that led to the problem; and a link to the site/page in question.
If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
You may also (...)
On other sites (6200)
-
Combining three or more videos with ffmpeg and the xfade filter
25 February 2021, by sil
As of 2020, ffmpeg has the xfade filter, which can combine videos with a transition. Combining two videos is easy enough:


ffmpeg -i vid1.mp4 -i vid2.mp4 \
 -filter_complex "[0][1]xfade=transition=pixelize:duration=1:offset=4,format=yuv420p" \
 out.mp4



But I don't understand how to combine three videos (so that v1 fades into v2, and v2 then fades into v3). I tried something like this:


ffmpeg -i vid1.mp4 -i vid2.mp4 -i vid3.mp4 \
 -filter_complex "[0][1]xfade=transition=pixelize:duration=1:offset=4,format=yuv420p[0n1];[0n1][2]xfade=transition=pixelize:duration=1:offset=9,format=yuv420p" \
 out.mp4



but that doesn't work. My idea was that 0 and 1, or vid1 and vid2, would be combined into a [0n1] stream with a transition by xfade, and that the 0n1 stream could then be combined with vid3 by another xfade. As far as I can tell, this includes the first two videos but not the third, which of course means I don't understand how to specify a filtergraph correctly!


How should I use xfade to combine 3 or more videos with transitions between them?


A full example is as follows. Here I'll use three images (because then issues with combining videos at different frame rates are avoided), and smash them all to 500x500 for ease (in the final version they would be letterboxed to keep resolution and so on).


ffmpeg \
 -loop 1 -t 5 -i tests/p1.jpg \
 -loop 1 -t 5 -i tests/p2.jpg \
 -loop 1 -t 5 -i tests/p3.jpg \
 -filter_complex "[0]scale=500:500[s0];[1]scale=500:500[s1];[2]scale=500:500[s2];[s0][s1]xfade=transition=pixelize:duration=1:offset=4,format=yuv420p[s01];[s01][s2]xfade=transition=pixelize:duration=1:offset=9,format=yuv420p" out.mp4



I would expect this to create a video which was:

- 4 seconds of p1.jpg
- a pixelise transition into p2.jpg lasting 1 second
- 4 seconds of p2.jpg
- a pixelise transition into p3.jpg lasting 1 second
- 4 seconds of p3.jpg


but what I actually get is:

- 4 seconds of p1.jpg
- a pixelise transition into p2.jpg lasting 1 second
- 4 seconds of p2.jpg

and then the video ends. p3 is not included at all.


The output is as follows:


ffmpeg version N-53546-g5eb4405fc5-static https://johnvansickle.com/ffmpeg/ Copyright (c) 2000-2020 the FFmpeg developers
 built with gcc 8 (Debian 8.3.0-6)
 configuration: --enable-gpl --enable-version3 --enable-static --disable-debug --disable-ffplay --disable-indev=sndio --disable-outdev=sndio --cc=gcc --enable-fontconfig --enable-frei0r --enable-gnutls --enable-gmp --enable-libgme --enable-gray --enable-libaom --enable-libfribidi --enable-libass --enable-libvmaf --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librubberband --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libvorbis --enable-libopus --enable-libtheora --enable-libvidstab --enable-libvo-amrwbenc --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libdav1d --enable-libxvid --enable-libzvbi --enable-libzimg
 libavutil 56. 56.100 / 56. 56.100
 libavcodec 58. 97.100 / 58. 97.100
 libavformat 58. 49.100 / 58. 49.100
 libavdevice 58. 11.101 / 58. 11.101
 libavfilter 7. 87.100 / 7. 87.100
 libswscale 5. 8.100 / 5. 8.100
 libswresample 3. 8.100 / 3. 8.100
 libpostproc 55. 8.100 / 55. 8.100
Input #0, image2, from 'tests/p1.jpg':
 Duration: 00:00:00.04, start: 0.000000, bitrate: 44845 kb/s
 Stream #0:0: Video: mjpeg (Baseline), yuvj444p(pc, bt470bg/unknown/unknown), 820x1270 [SAR 150:150 DAR 82:127], 25 fps, 25 tbr, 25 tbn, 25 tbc
Input #1, image2, from 'tests/p2.jpg':
 Duration: 00:00:00.04, start: 0.000000, bitrate: 22325 kb/s
 Stream #1:0: Video: mjpeg (Baseline), yuvj420p(pc, bt470bg/unknown/unknown), 960x600 [SAR 1:1 DAR 8:5], 25 fps, 25 tbr, 25 tbn, 25 tbc
Input #2, image2, from 'tests/p3.jpg':
 Duration: 00:00:00.04, start: 0.000000, bitrate: 15266 kb/s
 Stream #2:0: Video: mjpeg (Baseline), yuvj420p(pc, bt470bg/unknown/unknown), 728x669 [SAR 96:96 DAR 728:669], 25 fps, 25 tbr, 25 tbn, 25 tbc
File 'out.mp4' already exists. Overwrite? [y/N] y
Stream mapping:
 Stream #0:0 (mjpeg) -> scale
 Stream #1:0 (mjpeg) -> scale
 Stream #2:0 (mjpeg) -> scale
 format -> Stream #0:0 (libx264)
Press [q] to stop, [?] for help
[swscaler @ 0x8228040] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 0x8258640] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 0x827df40] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 0x829f800] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 0x82c13c0] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 0x82e8340] deprecated pixel format used, make sure you did set range correctly
[libx264 @ 0x77b7600] using SAR=82/127
[libx264 @ 0x77b7600] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
[libx264 @ 0x77b7600] profile High, level 3.0, 4:2:0, 8-bit
[libx264 @ 0x77b7600] 264 - core 161 r3018 db0d417 - H.264/MPEG-4 AVC codec - Copyleft 2003-2020 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to 'out.mp4':
 Metadata:
 encoder : Lavf58.49.100
 Stream #0:0: Video: h264 (libx264) (avc1 / 0x31637661), yuv420p, 500x500 [SAR 82:127 DAR 82:127], q=-1--1, 25 fps, 12800 tbn, 25 tbc (default)
 Metadata:
 encoder : Lavc58.97.100 libx264
 Side data:
 cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
frame= 101 fps=100 q=28.0 size= 0kB time=00:00:01.92 bitrate= 0.2kbits/s speed=[swscaler @ 0x8291d80] deprecated pixel format used, make sure you did set range correctly
 Last message repeated 2 times
[swscaler @ 0x82b3000] deprecated pixel format used, make sure you did set range correctly
 Last message repeated 2 times
[swscaler @ 0x82fc200] deprecated pixel format used, make sure you did set range correctly
 Last message repeated 2 times
frame= 153 fps=101 q=28.0 size= 0kB time=00:00:04.00 bitrate= 0.1kbits/s speed=[swscaler @ 0x82fc200] deprecated pixel format used, make sure you did set range correctly
 Last message repeated 2 times
frame= 225 fps=111 q=28.0 size= 256kB time=00:00:06.88 bitrate= 304.9kbits/s dup=0 frame= 225 fps= 95 q=-1.0 Lsize= 267kB time=00:00:08.88 bitrate= 245.9kbits/s dup=0 drop=125 speed=3.74x 
video:263kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.293847%
[libx264 @ 0x77b7600] frame I:2 Avg QP:21.03 size: 19928
[libx264 @ 0x77b7600] frame P:66 Avg QP:21.18 size: 2453
[libx264 @ 0x77b7600] frame B:157 Avg QP:29.72 size: 427
[libx264 @ 0x77b7600] consecutive B-frames: 5.3% 4.4% 1.3% 88.9%
[libx264 @ 0x77b7600] mb I I16..4: 29.0% 27.8% 43.2%
[libx264 @ 0x77b7600] mb P I16..4: 10.8% 3.2% 7.5% P16..4: 2.6% 0.9% 0.3% 0.0% 0.0% skip:74.8%
[libx264 @ 0x77b7600] mb B I16..4: 1.1% 0.5% 1.6% B16..8: 1.0% 0.5% 0.1% direct: 0.3% skip:94.9% L0:37.4% L1:43.5% BI:19.1%
[libx264 @ 0x77b7600] 8x8 transform intra:16.5% inter:55.3%
[libx264 @ 0x77b7600] coded y,uvDC,uvAC intra: 31.4% 71.7% 31.8% inter: 1.1% 1.4% 0.1%
[libx264 @ 0x77b7600] i16 v,h,dc,p: 45% 46% 9% 1%
[libx264 @ 0x77b7600] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 25% 30% 27% 6% 1% 1% 2% 1% 7%
[libx264 @ 0x77b7600] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 56% 31% 9% 1% 1% 1% 1% 1% 1%
[libx264 @ 0x77b7600] i8c dc,h,v,p: 46% 27% 23% 4%
[libx264 @ 0x77b7600] Weighted P-Frames: Y:10.6% UV:7.6%
[libx264 @ 0x77b7600] ref P L0: 61.9% 17.1% 13.9% 6.8% 0.3%
[libx264 @ 0x77b7600] ref B L0: 76.3% 22.8% 0.9%
[libx264 @ 0x77b7600] ref B L1: 97.6% 2.4%
[libx264 @ 0x77b7600] kb/s:238.88
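
A hedged note on the likely cause: the second xfade's offset is measured on its first input, which here is the output of the first xfade rather than the original clip. That combined stream is only 9 seconds long (offset 4 plus the 5-second second input), so offset=9 falls at its very end and leaves nothing to transition into p3. With 5-second inputs and 1-second transitions, the rule of thumb appears to be that each new offset equals the previous offset plus the next clip's length minus the transition duration, i.e. 4 + 5 - 1 = 8 here. A sketch of the example above with only that offset changed (untested, and assuming the 5-second -t values shown):


ffmpeg \
 -loop 1 -t 5 -i tests/p1.jpg \
 -loop 1 -t 5 -i tests/p2.jpg \
 -loop 1 -t 5 -i tests/p3.jpg \
 -filter_complex "[0]scale=500:500[s0];[1]scale=500:500[s1];[2]scale=500:500[s2];[s0][s1]xfade=transition=pixelize:duration=1:offset=4,format=yuv420p[s01];[s01][s2]xfade=transition=pixelize:duration=1:offset=8,format=yuv420p" out.mp4


With those numbers the result should be roughly 4 s of p1, a 1 s transition, 3 s of clean p2, a 1 s transition, then 4 s of p3 (about 13 s in total), since each 5-second input can only cover 4 clean seconds plus one transition.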



-
How to overlay 2 videos (one main, one overlaid on it) and play their sound simultaneously, using FFmpeg in Android Studio
6 August 2020, by Dusan Lilic
As the title says, I'm trying to overlay 2 videos and play sound from both simultaneously. So far I have managed to put one video over another using this command:


String[] command = {"-i", mainVideoPath, "-vf",
 "movie=" + overlayVideo + ", scale=300:-1[inner]; [in][inner]overlay=10:10[out]" ,combinedVideoOutput};



This works, but I have 3 problems here.
First, the overlay video is rotated by 90 degrees. Second, audio is played only from the main video (I want to play sound from both videos simultaneously). Third, the overlay video is longer than the main video (for example: overlayVideo lasts 10 seconds and mainVideo lasts 7 seconds), so I want the final video to last only as long as mainVideo; as soon as mainVideo finishes, overlayVideo should also stop (it probably needs to be cut?).


String[] command = {"-i", mainVideoPath, "-i", overlayVideo ,
 "-filter_complex", "[1:v][0:v]scale2ref=(256/256)*ih/8/sar:ih/8[wm][base];[base][wm]overlay=10:10" ,combinedVideoOutput};



Using this command I have the same 2 remaining problems as above, except that the video is not rotated here.
I have to say that I'm not very familiar with ffmpeg commands. I was trying to figure it out from the documentation but without any success.
I know that I'm missing something like -map or an audio merge filter, but I can't figure it out.
Thanks in advance!


EDIT 1:
This is the logcat from the second command, as requested:


D/LISKO: ffmpeg version n4.0-39-gda39990 Copyright (c) 2000-2018 the FFmpeg developers
 built with gcc 4.9.x (GCC) 20150123 (prerelease)
D/LISKO: configuration: --target-os=linux --cross-prefix=/root/bravobit/ffmpeg-android/toolchain-android/bin/arm-linux-androideabi- --arch=arm --cpu=cortex-a8 --enable-runtime-cpudetect --sysroot=/root/bravobit/ffmpeg-android/toolchain-android/sysroot --enable-pic --enable-libx264 --enable-ffprobe --enable-libopus --enable-libvorbis --enable-libfdk-aac --enable-libfreetype --enable-libfribidi --enable-libmp3lame --enable-fontconfig --enable-libvpx --enable-libass --enable-yasm --enable-pthreads --disable-debug --enable-version3 --enable-hardcoded-tables --disable-ffplay --disable-linux-perf --disable-doc --disable-shared --enable-static --enable-runtime-cpudetect --enable-nonfree --enable-network --enable-avresample --enable-avformat --enable-avcodec --enable-indev=lavfi --enable-hwaccels --enable-ffmpeg --enable-zlib --enable-gpl --enable-small --enable-nonfree --pkg-config=pkg-config --pkg-config-flags=--static --prefix=/root/bravobit/ffmpeg-android/build/armeabi-v7a --extra-cflags='-I/root/bravobit/ffmpeg-android/toolchain-android/include -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -fno-strict-overflow -fstack-protector-all' --extra-ldflags='-L/root/bravobit/ffmpeg-android/toolchain-android/lib -Wl,-z,relro -Wl,-z,now -pie' --extra-cxxflags=
D/LISKO: libavutil 56. 14.100 / 56. 14.100
 libavcodec 58. 18.100 / 58. 18.100
 libavformat 58. 12.100 / 58. 12.100
 libavdevice 58. 3.100 / 58. 3.100
 libavfilter 7. 16.100 / 7. 16.100
D/LISKO: libavresample 4. 0. 0 / 4. 0. 0
 libswscale 5. 1.100 / 5. 1.100
D/LISKO: libswresample 3. 1.100 / 3. 1.100
 libpostproc 55. 1.100 / 55. 1.100
D/LISKO: Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/storage/emulated/0/mainVideo.mp4':
 Metadata:
 major_brand : iso6
 minor_version : 1
 compatible_brands: mp42iso6avc1isom
 creation_time : 2020-08-03T13:20:11.000000Z
 Duration: 00:00:07.04, start: 0.000000, bitrate: 1380 kb/s
 Stream #0:0(und): Audio: aac (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 140 kb/s (default)
 Metadata:
 creation_time : 2020-07-28T08:11:36.000000Z
 Stream #0:1(und): Video: h264 (avc1 / 0x31637661), yuv420p(tv, bt709), 1280x720 [SAR 81:256 DAR 9:16], 1264 kb/s, 30 fps, 30 tbr, 90k tbn, 180k tbc (default)
 Metadata:
 creation_time : 2020-07-28T08:11:36.000000Z
D/LISKO: Input #1, mov,mp4,m4a,3gp,3g2,mj2, from '/storage/emulated/0/overlayVideo.mp4':
 Metadata:
 major_brand : mp42
 minor_version : 0
 compatible_brands: isommp42
 creation_time : 2020-08-04T07:27:47.000000Z
 com.android.version: 10
 Duration: 00:00:11.19, start: 0.000000, bitrate: 9993 kb/s
D/LISKO: Stream #1:0(eng): Video: h264 (avc1 / 0x31637661), yuv420p(tv, bt709), 1280x720, 9238 kb/s, SAR 1:1 DAR 16:9, 28.38 fps, 29.75 tbr, 90k tbn, 180k tbc (default)
 Metadata:
 rotate : 270
 creation_time : 2020-08-04T07:27:47.000000Z
 handler_name : VideoHandle
 Side data:
 displaymatrix: rotation of 90.00 degrees
 Stream #1:1(eng): Audio: aac (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 192 kb/s (default)
 Metadata:
 creation_time : 2020-08-04T07:27:47.000000Z
 handler_name : SoundHandle
 Stream mapping:
 Stream #0:1 (h264) -> scale2ref:ref (graph 0)
 Stream #1:0 (h264) -> scale2ref:default (graph 0)
 overlay (graph 0) -> Stream #0:0 (libx264)
 Stream #0:0 -> #0:1 (aac (native) -> aac (native))
 Press [q] to stop, [?] for help
D/LISKO: [libx264 @ 0xee986100] using SAR=81/256
D/LISKO: [libx264 @ 0xee986100] using cpu capabilities: ARMv6 NEON
 [libx264 @ 0xee986100] profile High, level 3.1
D/LISKO: [libx264 @ 0xee986100] 264 - core 152 r2851M ba24899 - H.264/MPEG-4 AVC codec - Copyleft 2003-2017 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=12 lookahead_threads=2 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
D/LISKO: Output #0, mp4, to '/storage/emulated/0/outputVideo.mp4':
D/LISKO: Metadata:
 major_brand : iso6
D/LISKO: minor_version : 1
D/LISKO: compatible_brands: mp42iso6avc1isom
D/LISKO: encoder : Lavf58.12.100
 Stream #0:0: Video: h264 (libx264) (avc1 / 0x31637661), yuv420p, 1280x720 [SAR 81:256 DAR 9:16], q=-1--1, 30 fps, 15360 tbn, 30 tbc (default)
D/LISKO: Metadata:
 encoder : Lavc58.18.100 libx264
 Side data:
 cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
D/LISKO: Stream #0:1(und): Audio: aac (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
 Metadata:
 creation_time : 2020-07-28T08:11:36.000000Z
D/LISKO: encoder : Lavc58.18.100 aac
D/LISKO: frame= 26 fps=0.0 q=0.0 size= 0kB time=00:00:00.09 bitrate= 4.1kbits/s dup=2 drop=0 speed=0.185x 
D/LISKO: frame= 41 fps= 41 q=0.0 size= 0kB time=00:00:00.58 bitrate= 0.7kbits/s dup=2 drop=0 speed=0.574x 
D/LISKO: frame= 49 fps= 32 q=0.0 size= 0kB time=00:00:00.92 bitrate= 0.4kbits/s dup=2 drop=0 speed=0.613x 
D/LISKO: frame= 59 fps= 29 q=29.0 size= 0kB time=00:00:01.97 bitrate= 0.2kbits/s dup=2 drop=0 speed=0.974x 
D/LISKO: frame= 75 fps= 29 q=29.0 size= 0kB time=00:00:01.97 bitrate= 0.2kbits/s dup=2 drop=0 speed=0.762x 



EDIT 2:
After adding "-shortest" to the command, I managed to cut the overlay video to the same length as the main video (because the overlay video is always longer than mainVideo, "-shortest" takes the shorter one's duration). So now the command looks like this:


String[] command = {"-i", mainVideoPath, "-i", overlayVideo ,"-filter_complex", 
"[1:v][0:v]scale2ref=(256/256)*ih/8/sar:ih/8[wm][base];[base][wm]overlay=10:10", "-shortest", combinedVideoOutput};



The rotation is good, so I only need to merge their audio now. At the moment only mainVideo's audio is playing; the overlay video's audio isn't.


EDIT 3:


String[] command = {"-i", mainVideoPath, "-i", overlayVideo ,
 "-strict", "experimental",
 "-filter_complex",
 "[1:v][0:v]scale2ref=(256/256)*ih/8/sar:ih/8[wm][base];" +
 "[base][wm]overlay=10:10; " +
 "pan=stereo|c0=2*c0|c1=3*c0[a0];[1:a]pan=stereo|c0=1*c0|c1=4*c0[a1];[a0][a1]amix=inputs=2:duration=first:dropout_transition=2",
 "-shortest" ,combinedVideoOutput};



With this command I managed to overlay the videos and play sound from both of them, and the rotation is good, but -shortest doesn't work now. The only remaining problem is to make the output last as long as the shorter one (mainVideo is always shorter).


EDIT 4:


This is the final working command:


String[] command = {"-i", mainVideoPath, "-i", overlayVideo,
 "-filter_complex",
 "[1:v][0:v]scale2ref=(256/256)*ih/8/sar:ih/8[wm][base];" +
 "[base][wm]overlay=10:10:shortest=1;" +
 "pan=stereo|c0=2*c0|c1=3*c0[a0];[1:a]pan=stereo|c0=1*c0|c1=4*c0[a1];" +
 "[a0][a1]amix=inputs=2:duration=first:dropout_transition=2",
 combinedVideoOutput};



Thanks
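
For reference, a sketch of the same approach as a standalone ffmpeg command line, with explicit pad labels and -map instead of relying on implicit stream selection. The file names main.mp4 and overlay.mp4 are placeholders and the scaling is simplified, so treat this as an assumption to adapt rather than a drop-in replacement for the command above:


ffmpeg -i main.mp4 -i overlay.mp4 \
 -filter_complex "[1:v]scale=300:-1[ovr];[0:v][ovr]overlay=10:10:shortest=1[v];[0:a][1:a]amix=inputs=2:duration=first[a]" \
 -map "[v]" -map "[a]" \
 out.mp4


Here overlay's shortest=1 ends the video with the shorter (main) input, and amix with duration=first ends the mixed audio with the first audio stream fed to it (the main video's audio), which is what the pan/amix chain in the final command achieves.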


-
Errors converting EXRs to mp4 with ffmpeg
3 February 2021, by GaryO
I'm using ffmpeg 4.3.1 to convert some EXRs to mp4, with tone mapping for the extended range in the source and a rec709 conversion for the mp4.


I'm using this command:


ffmpeg -pattern_type glob -framerate 24 -i 'frames/*.exr' -vf zscale=transfer=linear,tonemap=hable,zscale=transfer=bt709,format=yuv420p -c:v libx264 -r 24 -preset slow -crf 18 -pix_fmt yuv420p foo-tonemap-01.mp4



but it gives me this error:


Input #0, image2, from 'frames/*.exr':
 Duration: 00:00:00.08, start: 0.000000, bitrate: N/A
 Stream #0:0: Video: exr, gbrapf32le, 1920x1080 [SAR 1:1 DAR 16:9], 24 tbr, 24 tbn, 24 tbc
Stream mapping:
 Stream #0:0 -> #0:0 (exr (native) -> h264 (libx264))
code 3074: no path between colorspaces
Error while filtering: Generic error in an external library



What does that error "no path between colorspaces" mean? Do I need to tell it what colorspace the EXRs are in (scene-linear RGB) or something?


Complete log:


ffmpeg started on 2020-08-05 at 22:22:33
Report written to "ffmpeg-20200805-222233.log"
Log level: 48
Command line:
ffmpeg -report -y -i frames/0240.exr -vf "zscale=transfer=linear,tonemap=clip,zscale=transfer=bt709,format=yuv420p" -c:v libx264 -r 24 -preset slow -crf 18 -pix_fmt yuv420p foo-tonemap-01.mp4
ffmpeg version 4.3.1-static https://johnvansickle.com/ffmpeg/ Copyright (c) 2000-2020 the FFmpeg developers
 built with gcc 8 (Debian 8.3.0-6)
 configuration: --enable-gpl --enable-version3 --enable-static --disable-debug --disable-ffplay --disable-indev=sndio --disable-outdev=sndio --cc=gcc --enable-fontconfig --enable-frei0r --enable-gnutls --enable-gmp --enable-libgme --enable-gray --enable-libaom --enable-libfribidi --enable-libass --enable-libvmaf --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librubberband --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libvorbis --enable-libopus --enable-libtheora --enable-libvidstab --enable-libvo-amrwbenc --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libdav1d --enable-libxvid --enable-libzvbi --enable-libzimg
 libavutil 56. 51.100 / 56. 51.100
 libavcodec 58. 91.100 / 58. 91.100
 libavformat 58. 45.100 / 58. 45.100
 libavdevice 58. 10.100 / 58. 10.100
 libavfilter 7. 85.100 / 7. 85.100
 libswscale 5. 7.100 / 5. 7.100
 libswresample 3. 7.100 / 3. 7.100
 libpostproc 55. 7.100 / 55. 7.100
Splitting the commandline.
Reading option '-report' ... matched as option 'report' (generate a report) with argument '1'.
Reading option '-y' ... matched as option 'y' (overwrite output files) with argument '1'.
Reading option '-i' ... matched as input url with argument 'frames/0240.exr'.
Reading option '-vf' ... matched as option 'vf' (set video filters) with argument 'zscale=transfer=linear,tonemap=clip,zscale=transfer=bt709,format=yuv420p'.
Reading option '-c:v' ... matched as option 'c' (codec name) with argument 'libx264'.
Reading option '-r' ... matched as option 'r' (set frame rate (Hz value, fraction or abbreviation)) with argument '24'.
Reading option '-preset' ... matched as AVOption 'preset' with argument 'slow'.
Reading option '-crf' ... matched as AVOption 'crf' with argument '18'.
Reading option '-pix_fmt' ... matched as option 'pix_fmt' (set pixel format) with argument 'yuv420p'.
Reading option 'foo-tonemap-01.mp4' ... matched as output url.
Finished splitting the commandline.
Parsing a group of options: global .
Applying option report (generate a report) with argument 1.
Applying option y (overwrite output files) with argument 1.
Successfully parsed a group of options.
Parsing a group of options: input url frames/0240.exr.
Successfully parsed a group of options.
Opening an input file: frames/0240.exr.
[NULL @ 0x6db71c0] Opening 'frames/0240.exr' for reading
[file @ 0x6db7c80] Setting default whitelist 'file,crypto,data'
[exr_pipe @ 0x6db71c0] Format exr_pipe probed with size=2048 and score=51
[exr_pipe @ 0x6db71c0] Before avformat_find_stream_info() pos: 0 bytes read:32768 seeks:0 nb_streams:1
[exr_pipe @ 0x6db71c0] parser not found for codec exr, packets or times may be invalid.
[exr_pipe @ 0x6db71c0] parser not found for codec exr, packets or times may be invalid.
[exr @ 0x6db92c0] line order: 0.
[exr_pipe @ 0x6db71c0] After avformat_find_stream_info() pos: 4970635 bytes read:4970635 seeks:0 frames:1
Input #0, exr_pipe, from 'frames/0240.exr':
 Duration: N/A, bitrate: N/A
 Stream #0:0, 1, 1/25: Video: exr, gbrapf32le, 1920x1080 [SAR 1:1 DAR 16:9], 25 tbr, 25 tbn, 25 tbc
Successfully opened the file.
Parsing a group of options: output url foo-tonemap-01.mp4.
Applying option vf (set video filters) with argument zscale=transfer=linear,tonemap=clip,zscale=transfer=bt709,format=yuv420p.
Applying option c:v (codec name) with argument libx264.
Applying option r (set frame rate (Hz value, fraction or abbreviation)) with argument 24.
Applying option pix_fmt (set pixel format) with argument yuv420p.
Successfully parsed a group of options.
Opening an output file: foo-tonemap-01.mp4.
[file @ 0x6e0a0c0] Setting default whitelist 'file,crypto,data'
Successfully opened the file.
detected 32 logical cores
Stream mapping:
 Stream #0:0 -> #0:0 (exr (native) -> h264 (libx264))
Press [q] to stop, [?] for help
cur_dts is invalid st:0 (0) [init:0 i_done:0 finish:0] (this is harmless if it occurs once at the start per stream)
cur_dts is invalid st:0 (0) [init:0 i_done:0 finish:0] (this is harmless if it occurs once at the start per stream)
[exr @ 0x6dbe700] line order: 0.
[Parsed_zscale_0 @ 0x725c4c0] Setting 'transfer' to value 'linear'
[Parsed_tonemap_1 @ 0x725c940] Setting 'tonemap' to value 'clip'
[Parsed_zscale_2 @ 0x725d380] Setting 'transfer' to value 'bt709'
[Parsed_format_3 @ 0x725e100] Setting 'pix_fmts' to value 'yuv420p'
[graph 0 input from stream 0:0 @ 0x725eb40] Setting 'video_size' to value '1920x1080'
[graph 0 input from stream 0:0 @ 0x725eb40] Setting 'pix_fmt' to value '180'
[graph 0 input from stream 0:0 @ 0x725eb40] Setting 'time_base' to value '1/25'
[graph 0 input from stream 0:0 @ 0x725eb40] Setting 'pixel_aspect' to value '1/1'
[graph 0 input from stream 0:0 @ 0x725eb40] Setting 'frame_rate' to value '25/1'
[graph 0 input from stream 0:0 @ 0x725eb40] w:1920 h:1080 pixfmt:gbrapf32le tb:1/25 fr:25/1 sar:1/1
[format @ 0x725fc00] Setting 'pix_fmts' to value 'yuv420p'
[AVFilterGraph @ 0x6dc0000] query_formats: 7 queried, 6 merged, 0 already done, 0 delayed
[Parsed_zscale_0 @ 0x725c4c0] w:1920 h:1080 fmt:gbrapf32le sar:1/1 -> w:1920 h:1080 fmt:gbrapf32le sar:1/1
[Parsed_zscale_2 @ 0x725d380] w:1920 h:1080 fmt:gbrapf32le sar:1/1 -> w:1920 h:1080 fmt:yuv420p sar:1/1
[Parsed_zscale_0 @ 0x725c4c0] w:1920 h:1080 fmt:gbrapf32le sar:1/1 -> w:1920 h:1080 fmt:gbrapf32le sar:1/1
code 3074: no path between colorspaces
Error while filtering: Generic error in an external library
Failed to inject frame into filter network: Generic error in an external library
Error while processing the decoded data for stream #0:0
[AVIOContext @ 0x6e08a80] Statistics: 0 seeks, 0 writeouts
[AVIOContext @ 0x6dc00c0] Statistics: 4970635 bytes read, 0 seeks
Conversion failed!
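
A hedged note on the error: zimg reports "no path between colorspaces" when it cannot work out a complete conversion, typically because colour parameters such as primaries, matrix or peak luminance are left unspecified for a float RGB input like this gbrapf32le EXR stream. A commonly circulated variant of the tonemapping chain spells these parameters out explicitly; the option values below are assumptions to check against this build's zscale and tonemap documentation, and the format=gbrpf32le step also drops the EXR's alpha channel:


ffmpeg -pattern_type glob -framerate 24 -i 'frames/*.exr' \
 -vf "zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p" \
 -c:v libx264 -r 24 -preset slow -crf 18 -pix_fmt yuv420p foo-tonemap-01.mp4


For scene-linear EXR sources, the npl (nominal peak luminance) value may need adjusting to get a sensible overall exposure.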