
Media (1)
-
Sintel MP4 Surround 5.1 Full
13 May 2011, by
Updated: February 2012
Language: English
Type: Video
Other articles (111)
-
MediaSPIP 0.1 Beta version
25 April 2011, by
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all the software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...) -
HTML5 audio and video support
13 April 2011, by
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...) -
APPENDIX: The plugins used specifically for the farm
5 March 2010, by
The central/master site of the farm needs several additional plugins, beyond those used by the channels, to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a shared-hosting instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)
On other sites (14023)
-
ffmpeg : #EXT-X-INDEPENDENT-SEGMENTS tag not added
26 November 2022, by xav-stargate
I try to add #EXT-X-INDEPENDENT-SEGMENTS to my master playlist file with -hls_flags independent_segments, but apart from #EXT-X-VERSION changing from 4 to 6, the tag isn't added.


Do you have any idea why?


Command:


ffmpeg -i ./2022-11-26_09_00_00_radiojournal_0000_128.mp4 \
 -codec copy -f hls -hls_time 11 \
 -hls_flags independent_segments+single_file -hls_segment_filename "output.aac" \
 -master_pl_name master.m3u8 -hls_segment_type mpegts -hls_playlist_type vod \
 -hls_start_number_source generic -start_number 1 "index_128.m3u8"
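As a quick check (a sketch; the playlist names are the ones the command above produces), you can grep both generated playlists for the tag, since the ffmpeg HLS muxer documents -hls_flags independent_segments as adding #EXT-X-INDEPENDENT-SEGMENTS to the playlists it writes:

```shell
# Look for the tag in both generated playlists; grep -H prefixes each
# match with its filename, so you can see which playlist carries it.
grep -H '#EXT-X-INDEPENDENT-SEGMENTS' master.m3u8 index_128.m3u8
```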



-
How to expand bash array into arguments correctly
12 April 2022, by alvgarci
I'm working on a script in Bash.


The objective is to pass 36 arguments to an ffmpeg command.


Steps:


I verified the command. This one works fine:


ffmpeg -i "$f" -y -acodec aac -ab 128k -vcodec copy -f mp4 -movflags use_metadata_tags -metadata MtsExifToolVersionNumber="12.30" -metadata MtsFileName="00017.MTS" -metadata MtsDirectory="." -metadata MtsFileSize="59 MiB" -metadata MtsFileModificationDate/Time="2020" -metadata MtsFileAccessDate/Time="2021" -metadata MtsFileInodeChangeDate/Time="2022" -metadata MtsFilePermissions="-rwxr-xr-x" -metadata MtsFileType="M2TS" -metadata MtsFileTypeExtension="mts" -metadata MtsMIMEType="video/m2ts" -metadata MtsVideoStreamType="H.264 (AVC) Video" -metadata MtsAudioBitrate="256 kbps" -metadata MtsSurroundMode="Not indicated" -metadata MtsAudioChannels="2" -metadata MtsAudioStreamType="PGS Audio" -metadata MtsImageWidth="1920" -metadata MtsImageHeight="1080" -metadata MtsDate/TimeOriginal="2020" -metadata MtsApertureSetting="Auto" -metadata MtsGain="0 dB" -metadata MtsExposureProgram="Program AE" -metadata MtsWhiteBalance="Auto" -metadata MtsFocus="Auto (0.155)" -metadata MtsImageStabilization="On (0x3f)" -metadata MtsExposureTime="1/60" -metadata MtsFNumber="3.4" -metadata MtsMake="Sony" -metadata MtsCameraModelName="HDR-CX405" -metadata MtsWarning="[minor] The ExtractEmbedded option may find more tags in the video data" -metadata MtsAudioSampleRate="48000" -metadata MtsDuration="18.71 s" -metadata MtsAperture="3.4" -metadata MtsImageSize="1920x1080" -metadata MtsMegapixels="2.1" -metadata MtsShutterSpeed="1/60" -metadata Offset="$Offset" -metadata creation_time="$DATE" "./$output/convert_exif_ok/$MP4_NAME"







Now, I created a bash array and tried to expand it directly in the command:


ffmpeg -i "$f" -y -acodec aac -ab 128k -vcodec copy -f mp4 -movflags use_metadata_tags $(echo "${exif_2[@]}" ) -metadata Offset="$Offset" -metadata creation_time="$DATE" "./$output/convert_exif_ok/$MP4_NAME"



And this doesn't work.


It seems to be passed as one long argument instead of 36 separate arguments.


This is the output:


ffmpeg version 5.0.1 Copyright (c) 2000-2022 the FFmpeg developers
 built with Apple clang version 12.0.0 (clang-1200.0.32.29)
 configuration: --prefix=/usr/local/Cellar/ffmpeg/5.0.1 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libdav1d --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox
 libavutil 57. 17.100 / 57. 17.100
 libavcodec 59. 18.100 / 59. 18.100
 libavformat 59. 16.100 / 59. 16.100
 libavdevice 59. 4.100 / 59. 4.100
 libavfilter 8. 24.100 / 8. 24.100
 libswscale 6. 4.100 / 6. 4.100
 libswresample 4. 3.100 / 4. 3.100
 libpostproc 56. 3.100 / 56. 3.100
Input #0, mpegts, from '00017.MTS':
 Duration: 00:00:18.72, start: 1.020000, bitrate: 26466 kb/s
 Program 1
 Stream #0:0[0x1011]: Video: h264 (High) (HDPR / 0x52504448), yuv420p(progressive), 1920x1080 [SAR 1:1 DAR 16:9], 50 fps, 50 tbr, 90k tbn
 Stream #0:1[0x1100]: Audio: ac3 (AC-3 / 0x332D4341), 48000 Hz, stereo, fltp, 256 kb/s
 Stream #0:2[0x1200]: Subtitle: hdmv_pgs_subtitle ([144][0][0][0] / 0x0090), 1920x1080
 -metadata MtsExifToolVersionNumber="12.30" -metadata MtsFileName="00017.MTS" -metadata MtsDirectory="." -metadata MtsFileSize="59 MiB" -metadata MtsFileModificationDate/Time="2020" -metadata MtsFileAccessDate/Time="2021" -metadata MtsFileInodeChangeDate/Time="2022" -metadata MtsFilePermissions="-rwxr-xr-x" -metadata MtsFileType="M2TS" -metadata MtsFileTypeExtension="mts" -metadata MtsMIMEType="video/m2ts" -metadata MtsVideoStreamType="H.264 (AVC) Video" -metadata MtsAudioBitrate="256 kbps" -metadata MtsSurroundMode="Not indicated" -metadata MtsAudioChannels="2" -metadata MtsAudioStreamType="PGS Audio" -metadata MtsImageWidth="1920" -metadata MtsImageHeight="1080" -metadata MtsDate/TimeOriginal="2020" -metadata MtsApertureSetting="Auto" -metadata MtsGain="0 dB" -metadata MtsExposureProgram="Program AE" -metadata MtsWhiteBalance="Auto" -metadata MtsFocus="Auto (0.155)" -metadata MtsImageStabilization="On (0x3f)" -metadata MtsExposureTime="1/60" -metadata MtsFNumber="3.4" -metadata MtsMake="Sony" -metadata MtsCameraModelName="HDR-CX405" -metadata MtsWarning="[minor] The ExtractEmbedded option may find more tags in the video data" -metadata MtsAudioSampleRate="48000" -metadata MtsDuration="18.71 s" -metadata MtsAperture="3.4" -metadata MtsImageSize="1920x1080" -metadata MtsMegapixels="2.1" -metadata MtsShutterSpeed="1/60": File name too long



Well, I'm sure I'm doing something wrong in the way I pass the contents of the array as arguments.


Any help?


Thanks
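For what it's worth, the usual pattern (a minimal sketch; the array name exif_2 and two of the -metadata pairs are borrowed from the question) is to store one array element per argument and expand the array as a quoted "${exif_2[@]}". Wrapping the expansion in $(echo ...) flattens the array into a single string which the shell then re-splits on whitespace, so a value containing a space, such as "59 MiB", breaks apart and all quoting is lost:

```shell
#!/bin/bash
# One array element per argument: option names and their values are
# separate elements, and a value may safely contain spaces.
exif_2=(
  -metadata "MtsExifToolVersionNumber=12.30"
  -metadata "MtsFileSize=59 MiB"
)

# Correct: quoted expansion yields exactly one word per element.
printf '<%s>\n' "${exif_2[@]}"

# Broken: $(echo ...) joins the elements into one string, which the
# shell then word-splits, so "59 MiB" becomes two separate arguments.
printf '<%s>\n' $(echo "${exif_2[@]}")
```

With the array built this way, the ffmpeg invocation would use the quoted expansion directly, with no $(echo ...): ffmpeg -i "$f" ... "${exif_2[@]}" -metadata Offset="$Offset" ...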


-
-
FFMPEG blending screen two libvpx-vp9 webm yuva420p video files comes out looking incorrect
30 November 2022, by OneWorld
I'm trying to screen-blend two libvpx-vp9 webm files so that the blend comes out looking correct in FFmpeg. The example below takes two rgba png input files, loops each of them for a couple of seconds into a libvpx-vp9 webm file with the pixel format yuva420p, then tries to blend them with FFmpeg. I then output frames to visualise how the result looks here in this Stack Overflow post.


I have these two input rgba pngs (circle and Pikachu)




I create two libvpx-vp9 webm files from them like this:


ffmpeg -loop 1 -i circle_50_rgba.png -c:v libvpx-vp9 -t 2 -pix_fmt yuva420p circle_libvpx-vp9_yuva420p.webm

ffmpeg -loop 1 -i pikachu_rgba.png -c:v libvpx-vp9 -t 2 -pix_fmt yuva420p pikachu_libvpx-vp9_yuva420p.webm



I then try to blend these two libvpx-vp9 webm files like this:


ffmpeg -y -c:v libvpx-vp9 -i circle_libvpx-vp9_yuva420p.webm -c:v libvpx-vp9 -i pikachu_libvpx-vp9_yuva420p.webm -filter_complex "[1:v][0:v]blend=all_mode=screen" pikachu_reverse_all_mode_screened_onto_circle_both_yuva420p.webm



and extract a frame from that like this:


ffmpeg -c:v libvpx-vp9 -i pikachu_reverse_all_mode_screened_onto_circle_both_yuva420p.webm -frames:v 1 pikachu_reverse_all_mode_screened_onto_circle_from_yuva420p.png





If I do this without all_mode, like this:


ffmpeg -y -c:v libvpx-vp9 -i circle_libvpx-vp9_yuva420p.webm -c:v libvpx-vp9 -i pikachu_libvpx-vp9_yuva420p.webm -filter_complex "[1:v][0:v]blend=screen" pikachu_reverse_screened_onto_circle_both_yuva420p.webm



and then extract the png so we can visualise it, like this:


ffmpeg -c:v libvpx-vp9 -i pikachu_reverse_screened_onto_circle_both_yuva420p.webm -frames:v 1 pikachu_reverse_screened_onto_circle_from_yuva420p.png





which is also incorrect because the white part of the circle should be completely white in the screen blend. We shouldn't see a faint yellow outline of Pikachu inside the white part.
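One plausible explanation (my assumption, not something stated in the question) is that blend applies the screen formula to each plane's stored Y'CbCr values, and the inputs are flagged yuva420p(tv), i.e. limited range, where white is stored as Y=235 rather than 255. Screening against a "white" of 235/255 no longer saturates to full white, which would leave exactly this kind of faint residue. A numeric sketch of the formula:

```python
# Screen blend on values normalised to [0, 1]:
#   screen(a, b) = 1 - (1 - a) * (1 - b)
def screen(a: float, b: float) -> float:
    return 1 - (1 - a) * (1 - b)

# In full-range RGB, anything screened with white (1.0) is exactly white.
print(screen(1.0, 0.5))        # 1.0

# In limited-range (tv) Y'CbCr, white is stored as 235/255, so the same
# per-plane formula falls visibly short of full white.
print(screen(235 / 255, 0.5))  # ~0.961
```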


It should look like this:



Here is the full log:


ffmpeg -y -c:v libvpx-vp9 -i circle_libvpx-vp9_yuva420p.webm -c:v libvpx-vp9 -i pikachu_libvpx-vp9_yuva420p.webm -filter_complex "[1:v][0:v]blend=screen" pikachu_reverse_screened_onto_circle_both_yuva420p.webm
ffmpeg version 4.2.7-0ubuntu0.1 Copyright (c) 2000-2022 the FFmpeg developers
 built with gcc 9 (Ubuntu 9.4.0-1ubuntu1~20.04.1)
 configuration: --prefix=/usr --extra-version=0ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-nvenc --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
 libavutil 56. 31.100 / 56. 31.100
 libavcodec 58. 54.100 / 58. 54.100
 libavformat 58. 29.100 / 58. 29.100
 libavdevice 58. 8.100 / 58. 8.100
 libavfilter 7. 57.100 / 7. 57.100
 libavresample 4. 0. 0 / 4. 0. 0
 libswscale 5. 5.100 / 5. 5.100
 libswresample 3. 5.100 / 3. 5.100
 libpostproc 55. 5.100 / 55. 5.100
[libvpx-vp9 @ 0x55d5b1f34680] v1.8.2
 Last message repeated 1 times
Input #0, matroska,webm, from 'circle_libvpx-vp9_yuva420p.webm':
 Metadata:
 ENCODER : Lavf58.29.100
 Duration: 00:00:02.00, start: 0.000000, bitrate: 19 kb/s
 Stream #0:0: Video: vp9 (Profile 0), yuva420p(tv), 50x50, SAR 1:1 DAR 1:1, 25 fps, 25 tbr, 1k tbn, 1k tbc (default)
 Metadata:
 alpha_mode : 1
 ENCODER : Lavc58.54.100 libvpx-vp9
 DURATION : 00:00:02.000000000
[libvpx-vp9 @ 0x55d5b1f854c0] v1.8.2
 Last message repeated 1 times
Input #1, matroska,webm, from 'pikachu_libvpx-vp9_yuva420p.webm':
 Metadata:
 ENCODER : Lavf58.29.100
 Duration: 00:00:02.00, start: 0.000000, bitrate: 29 kb/s
 Stream #1:0: Video: vp9 (Profile 0), yuva420p(tv), 50x50, SAR 1:1 DAR 1:1, 25 fps, 25 tbr, 1k tbn, 1k tbc (default)
 Metadata:
 alpha_mode : 1
 ENCODER : Lavc58.54.100 libvpx-vp9
 DURATION : 00:00:02.000000000
[libvpx-vp9 @ 0x55d5b1f38940] v1.8.2
[libvpx-vp9 @ 0x55d5b1f49440] v1.8.2
Stream mapping:
 Stream #0:0 (libvpx-vp9) -> blend:bottom
 Stream #1:0 (libvpx-vp9) -> blend:top
 blend -> Stream #0:0 (libvpx-vp9)
Press [q] to stop, [?] for help
[libvpx-vp9 @ 0x55d5b1f49440] v1.8.2
[libvpx-vp9 @ 0x55d5b1f38940] v1.8.2
[libvpx-vp9 @ 0x55d5b1f80c40] v1.8.2
Output #0, webm, to 'pikachu_reverse_screened_onto_circle_both_yuva420p.webm':
 Metadata:
 encoder : Lavf58.29.100
 Stream #0:0: Video: vp9 (libvpx-vp9), yuva420p, 50x50 [SAR 1:1 DAR 1:1], q=-1--1, 200 kb/s, 25 fps, 1k tbn, 25 tbc (default)
 Metadata:
 encoder : Lavc58.54.100 libvpx-vp9
 Side data:
 cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
frame= 50 fps=0.0 q=0.0 Lsize= 7kB time=00:00:01.96 bitrate= 29.3kbits/s speed=33.2x 
video:4kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 96.711426%




I also tried doing a conversion to rgba, like this:


ffmpeg -y -c:v libvpx-vp9 -i circle_libvpx-vp9_yuva420p.webm -c:v libvpx-vp9 -i pikachu_libvpx-vp9_yuva420p.webm -filter_complex "[0:v]format=pix_fmts=rgba[zero];[1:v]format=pix_fmts=rgba[one];[one][zero]blend=screen" pikachu_reverse_screened_all_mode_onto_circle_after_rgba_conversion_webm.webm



However, the result of this also comes out with yellow inside the white circle, which should be white.


I was wondering what I need to do so that the blend of these two webm libvpx-vp9 video files looks correct, like it does above.


Note: I need to retain the alpha channels, because some assets have transparent alpha channels. In the examples above the assets happen to have opaque alpha channels.
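When debugging this kind of mismatch, it may help to confirm what pixel format and colour range the decoder actually reports for the intermediate files (a sketch; the filename is one of those created above, and note that ffprobe's native vp9 decoder may report the format differently from libvpx-vp9):

```shell
# Print the pixel format and colour range of the first video stream;
# "tv" (limited) range means luma is stored in 16-235, not 0-255.
ffprobe -v error -select_streams v:0 \
  -show_entries stream=pix_fmt,color_range \
  -of default=noprint_wrappers=1 \
  circle_libvpx-vp9_yuva420p.webm
```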