
Other articles (112)
-
Customising by adding your logo, banner or background image
5 September 2013 — Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.
-
Websites made with MediaSPIP
2 May 2011 — This page lists some websites based on MediaSPIP.
-
Creating farms of unique websites
13 April 2011 — MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; and creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)
On other sites (14640)
-
Missing packets when transcoding using ffmpeg
23 February 2021, by Adam Szmyd — I have a very weird issue with transcoding opus to any other format using ffmpeg. I'll illustrate it with transcoding to flac, as this is what I'm currently using.

So I have this one example opus file that, after processing via ffmpeg, gets somewhat shortened, as if ffmpeg dropped some packets/data out of it.

I wouldn't mind if ffmpeg were cleaning up some redundant data, but in practice this puts my multi-stream output files out of sync, so at some point the audio tracks end up overlapping each other.


So I have this input file, input.opus, whose length is 00:00:05.78, and I pass it through ffmpeg like this:

$ ffmpeg -i input.opus -c flac output.flac
ffmpeg version 4.3.1 Copyright (c) 2000-2020 the FFmpeg developers
 built with gcc 10 (GCC)
 configuration: --prefix=/usr --bindir=/usr/bin --datadir=/usr/share/ffmpeg --docdir=/usr/share/doc/ffmpeg --incdir=/usr/include/ffmpeg --libdir=/usr/lib64 --mandir=/usr/share/man --arch=x86_64 --optflags='-O2 -flto=auto -ffat-lto-objects -fexceptions -g -grecord-gcc-switches -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -Wp,-D_GLIBCXX_ASSERTIONS -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -fstack-protector-strong -specs=/usr/lib/rpm/redhat/redhat-annobin-cc1 -m64 -mtune=generic -fasynchronous-unwind-tables -fstack-clash-protection' --extra-ldflags='-Wl,-z,relro -Wl,--as-needed -Wl,-z,now -specs=/usr/lib/rpm/redhat/redhat-hardened-ld ' --extra-cflags=' ' --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libvo-amrwbenc --enable-version3 --enable-bzlib --disable-crystalhd --enable-fontconfig --enable-frei0r --enable-gcrypt --enable-gnutls --enable-ladspa --enable-libaom --enable-libdav1d --enable-libass --enable-libbluray --enable-libcdio --enable-libdrm --enable-libjack --enable-libfreetype --enable-libfribidi --enable-libgsm --enable-liblensfun --enable-libmp3lame --enable-libmysofa --enable-nvenc --enable-openal --enable-opencl --enable-opengl --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librav1e --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libvorbis --enable-libv4l2 --enable-libvidstab --enable-libvmaf --enable-version3 --enable-vapoursynth --enable-libvpx --enable-vulkan --enable-libglslang --enable-libx264 --enable-libx265 --enable-libxvid --enable-libzimg --enable-libzvbi --enable-avfilter --enable-avresample --enable-libmodplug --enable-postproc --enable-pthreads --disable-static --enable-shared --enable-gpl --disable-debug --disable-stripping --shlibdir=/usr/lib64 --enable-lto --enable-libmfx --enable-runtime-cpudetect
 libavutil 56. 51.100 / 56. 51.100
 libavcodec 58. 91.100 / 58. 91.100
 libavformat 58. 45.100 / 58. 45.100
 libavdevice 58. 10.100 / 58. 10.100
 libavfilter 7. 85.100 / 7. 85.100
 libavresample 4. 0. 0 / 4. 0. 0
 libswscale 5. 7.100 / 5. 7.100
 libswresample 3. 7.100 / 3. 7.100
 libpostproc 55. 7.100 / 55. 7.100
Input #0, ogg, from 'input.opus':
 Duration: 00:00:05.78, start: -0.017500, bitrate: 48 kb/s
 Stream #0:0: Audio: opus, 48000 Hz, mono, fltp
 Metadata:
 DURATION : 00:00:05.836000000
 encoder : Lavc58.91.100 opus
Stream mapping:
 Stream #0:0 -> #0:0 (opus (native) -> flac (native))
Press [q] to stop, [?] for help
[flac @ 0x55f94c9c2f40] encoding as 24 bits-per-sample
Output #0, flac, to 'output.flac':
 Metadata:
 encoder : Lavf58.45.100
 Stream #0:0: Audio: flac, 48000 Hz, mono, s32 (24 bit), 128 kb/s
 Metadata:
 DURATION : 00:00:05.836000000
 encoder : Lavc58.91.100 flac
size= 393kB time=00:00:05.79 bitrate= 555.7kbits/s speed= 373x 
video:0kB audio:385kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 2.102753%



Interestingly, ffmpeg above reports that the output should have a duration of 00:00:05.79, but when checking it with ffprobe, it is shorter by 50 ms:

$ ffprobe -hide_banner output.flac
Input #0, flac, from 'output.flac':
 Metadata:
 encoder : Lavf58.45.100
 Duration: 00:00:05.73, start: 0.000000, bitrate: 561 kb/s
 Stream #0:0: Audio: flac, 48000 Hz, mono, s32 (24 bit)



It may seem like a trivially small difference, but I intentionally cut the file down to 5 s to make troubleshooting easier. The source file is 30 minutes long and I lose a full minute of it during this process, so the problem is real.


ffmpeg version 4.3.1


How can I troubleshoot this? When I run it with -loglevel trace, I notice a few log lines that look "suspicious":

[opus @ 0x555a5abe6c40] skip 0 / discard 192 samples due to side data
[opus @ 0x555a5abe6c40] discard 192/960 samples



But I haven't found a way to stop it from discarding these samples (I'm not even sure this is what causes it...).


I would appreciate any help, or a pointer in the right troubleshooting direction.
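A minimal sketch for comparing the two reported durations programmatically (assuming ffprobe is on PATH and reusing the file names from above); as a data point, the missing 0.05 s corresponds to 2400 samples at 48 kHz, far more than the 192 samples mentioned in the trace:

import subprocess

# Sketch: read the container duration that ffprobe reports for each file.
def duration_seconds(path):
    out = subprocess.run(
        ['ffprobe', '-v', 'error',
         '-show_entries', 'format=duration',
         '-of', 'default=noprint_wrappers=1:nokey=1',
         path],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    return float(out)

diff = duration_seconds('input.opus') - duration_seconds('output.flac')
print(f"lost {diff:.3f} s = {round(diff * 48000)} samples at 48 kHz")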


-
How to overlay a sequence of frames on a video using ffmpeg-python?
19 November 2022, by Yogesh Yadav — I tried the code below, but it only shows the background video.


background_video = ffmpeg.input("input.mp4")
overlay_video = ffmpeg.input(f'{frames_folder}*.png', pattern_type='glob', framerate=25)
subprocess = ffmpeg.overlay(
    background_video,
    overlay_video,
).filter("setsar", sar=1)



I also tried assembling the sequence of frames into a .webm/.mov video, but the transparency is lost; the video uses black as the background.


P.S. The frame size is the same as the background video size, so no scaling is needed.
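For completeness, a minimal sketch of what the full pipeline would look like with the missing output/run step added (assuming frames_folder is defined as in the question; out.mp4 and the yuv420p pixel format are placeholders, not part of the original code):

import ffmpeg

# Sketch only: same graph as above, with an output and run step added.
background_video = ffmpeg.input('input.mp4')
overlay_video = ffmpeg.input(f'{frames_folder}*.png', pattern_type='glob', framerate=25)
(
    ffmpeg
    .overlay(background_video, overlay_video)
    .filter('setsar', sar=1)
    .output('out.mp4', pix_fmt='yuv420p')   # placeholder output name
    .overwrite_output()
    .run()
)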


Edit


I tried @Rotem's suggestions:




Try using single PNG image first




overlay_video = ffmpeg.input('test-frame.png')



It's not working for frames generated by OpenCV, but it works for any other PNG image. This is weird: when I manually view the frames folder, it shows blank images (link to my frames folder).
But if I convert these frames into a video (see below), it correctly shows what I drew on each frame.


output_options = {
    'crf': 20,
    'preset': 'slower',
    'movflags': 'faststart',
    'pix_fmt': 'yuv420p'
}
ffmpeg.input(f'{frames_folder}*.png', pattern_type='glob', framerate=25, reinit_filter=0).output(
    'movie.avi',
    **output_options
).global_args('-report').run()





try creating a video from all the PNG images without overlay




It works as expected; the only issue is transparency. Is there a way to create a video with a transparent background? I tried .webm/.mov/.avi but no luck.
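A minimal sketch of one way this is commonly done (not verified against these exact frames): keep the alpha channel by encoding the PNG sequence to WebM with libvpx-vp9 and the yuva420p pixel format; overlay.webm is a placeholder name and frames_folder is assumed to be defined as above:

import ffmpeg

# Sketch: encode the PNG sequence to a WebM that preserves the alpha channel.
(
    ffmpeg
    .input(f'{frames_folder}*.png', pattern_type='glob', framerate=25)
    .output('overlay.webm', vcodec='libvpx-vp9', pix_fmt='yuva420p')
    .overwrite_output()
    .run()
)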




Add .global_args('-report') and check the log file




Report written to "ffmpeg-20221119-110731.log"
Log level: 48
ffmpeg version 5.1 Copyright (c) 2000-2022 the FFmpeg developers
 built with Apple clang version 13.1.6 (clang-1316.0.21.2.5)
 configuration: --prefix=/opt/homebrew/Cellar/ffmpeg/5.1 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libdav1d --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox --enable-neon
 libavutil 57. 28.100 / 57. 28.100
 libavcodec 59. 37.100 / 59. 37.100
 libavformat 59. 27.100 / 59. 27.100
 libavdevice 59. 7.100 / 59. 7.100
 libavfilter 8. 44.100 / 8. 44.100
 libswscale 6. 7.100 / 6. 7.100
 libswresample 4. 7.100 / 4. 7.100
 libpostproc 56. 6.100 / 56. 6.100
Input #0, image2, from './frames/*.png':
 Duration: 00:00:05.00, start: 0.000000, bitrate: N/A
 Stream #0:0: Video: png, rgba(pc), 1920x1080, 25 fps, 25 tbr, 25 tbn
Codec AVOption crf (Select the quality for constant quality mode) specified for output file #0 (movie.avi) has not been used for any stream. The most likely reason is either wrong type (e.g. a video option with no video streams) or that it is a private option of some encoder which was not actually used for any stream.
Codec AVOption preset (Configuration preset) specified for output file #0 (movie.avi) has not been used for any stream. The most likely reason is either wrong type (e.g. a video option with no video streams) or that it is a private option of some encoder which was not actually used for any stream.
Stream mapping:
 Stream #0:0 -> #0:0 (png (native) -> mpeg4 (native))
Press [q] to stop, [?] for help
Output #0, avi, to 'movie.avi':
 Metadata:
 ISFT : Lavf59.27.100
 Stream #0:0: Video: mpeg4 (FMP4 / 0x34504D46), yuv420p(tv, progressive), 1920x1080, q=2-31, 200 kb/s, 25 fps, 25 tbn
 Metadata:
 encoder : Lavc59.37.100 mpeg4
 Side data:
 cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: N/A
frame= 125 fps= 85 q=31.0 Lsize= 491kB time=00:00:05.00 bitrate= 804.3kbits/s speed=3.39x 
video:482kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.772174%
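Incidentally, the two "Codec AVOption ... has not been used" warnings in the report appear because the .avi output falls back to the native mpeg4 encoder, which has no crf or preset options; a hypothetical variant (not the command actually run) that selects libx264 so those options apply could look like this:

import ffmpeg

# Sketch: pick an encoder that actually understands crf/preset.
output_options = {
    'vcodec': 'libx264',
    'crf': 20,
    'preset': 'slower',
    'movflags': 'faststart',
    'pix_fmt': 'yuv420p',
}
(
    ffmpeg
    .input(f'{frames_folder}*.png', pattern_type='glob', framerate=25)
    .output('movie.mp4', **output_options)   # placeholder output name
    .run()
)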



To draw the frames I used the code below.


for i in range(num_frames):
    transparent_img = np.zeros((height, width, 4), dtype=np.uint8)
    cv2.line(transparent_img, (x1, y1), (x2, y2), (255, 255, 255), thickness=1, lineType=cv2.LINE_AA)
    self.frames.append(transparent_img)

# To save each frame of the video in the given folder
for i, f in enumerate(frames):
    cv2.imwrite("{}/{:0{n}d}.png".format(path_to_frames, i, n=num_digits), f)
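One thing that may explain the "blank" PNGs: the colour passed to cv2.line has only three components, and on a 4-channel BGRA image OpenCV fills the missing alpha with 0, so the drawn line is fully transparent when the PNG is viewed, while it reappears once the video is converted to yuv420p and alpha is dropped. A small sketch with an explicit alpha value, using placeholder sizes and coordinates:

import cv2
import numpy as np

# Sketch: pass a 4-component colour so the drawn line keeps alpha = 255.
height, width = 1080, 1920             # placeholder frame size
x1, y1, x2, y2 = 100, 100, 800, 500    # placeholder line endpoints

transparent_img = np.zeros((height, width, 4), dtype=np.uint8)
cv2.line(transparent_img, (x1, y1), (x2, y2),
         (255, 255, 255, 255),         # B, G, R, A instead of just (255, 255, 255)
         thickness=1, lineType=cv2.LINE_AA)
cv2.imwrite('frame_with_alpha.png', transparent_img)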






-
Using C# to convert APNG to WebM with ffmpeg from pipe input and output
30 November 2022, by martin wang — I was using ffmpeg to convert LINE stickers from APNG files to WebM files. The result is weird: some of them converted successfully and some of them failed, and I'm not sure what happened with the failed conversions.


Here is my C# code to convert the LINE stickers to WebM; I use CliWrap to run the ffmpeg command line.


async Task Main()
{

 var downloadUrl = @"http://dl.stickershop.LINE.naver.jp/products/0/0/1/23303/iphone/stickerpack@2x.zip";
 var arg = @$"-i pipe:.png -vf scale=512:512:force_original_aspect_ratio=decrease:flags=lanczos -pix_fmt yuva420p -c:v libvpx-vp9 -cpu-used 5 -minrate 50k -b:v 350k -maxrate 450k -to 00:00:02.900 -an -y -f webm pipe:1";

 var errorCount = 0;
 try
 {
 using (var hc = new HttpClient())
 {
 var imgsZip = await hc.GetStreamAsync(downloadUrl);

 using (ZipArchive zipFile = new ZipArchive(imgsZip))
 {
 var files = zipFile.Entries.Where(entry => Regex.IsMatch(entry.FullName, @"animation@2x\/\d+\@2x.png"));
 foreach (var entry in files)
 {
 try
 {
 using (var fileStream = File.Create(Path.Combine("D:", "Projects", "ffmpeg", "Temp", $"{Path.GetFileNameWithoutExtension(entry.Name)}.webm")))
 using (var pngFileStream = File.Create(Path.Combine("D:", "Projects", "ffmpeg", "Temp", $"{entry.Name}")))
 using (var entryStream = entry.Open())
 using (MemoryStream ms = new MemoryStream())
 {
 entry.Open().CopyTo(pngFileStream);

 var result = await Cli.Wrap("ffmpeg")
 .WithArguments(arg)
 .WithStandardInputPipe(PipeSource.FromStream(entryStream))
 .WithStandardOutputPipe(PipeTarget.ToStream(ms))
 .WithStandardErrorPipe(PipeTarget.ToFile(Path.Combine("D:", "Projects", "ffmpeg", "Temp", $"{Path.GetFileNameWithoutExtension(entry.Name)}Info.txt")))
 .WithValidation(CommandResultValidation.ZeroExitCode)
 .ExecuteAsync();
 ms.Seek(0, SeekOrigin.Begin);
 ms.WriteTo(fileStream);
 }
 }
 catch (Exception ex)
 {
 entry.FullName.Dump();
 ex.Dump();
 errorCount++;
 }
 }
 }

 }
 }
 catch (Exception ex)
 {
 ex.Dump();
 }
 $"Error Count:{errorCount.Dump()}".Dump();

}



This is the ffmpeg error information for a file that failed to convert:




And the ffmpeg information for a file that converted successfully:



The strange thing is that when I convert these failed files manually from the command line, they convert successfully.



The source images are all APNG files from the same pack, so I just can't understand why some of the files fail to convert from my C# code but convert successfully when I run the command line manually.



I have written the same example in Python as well; here is the Python code:


from io import BytesIO
import os
import re
import subprocess
import zipfile

import requests


downloadUrl = "http://dl.stickershop.LINE.naver.jp/products/0/0/1/23303/iphone/stickerpack@2x.zip"
args = [
 'ffmpeg',
 '-i', 'pipe:',
 '-vf', 'scale=512:512:force_original_aspect_ratio=decrease:flags=lanczos',
 '-pix_fmt', 'yuva420p',
 '-c:v', 'libvpx-vp9',
 '-cpu-used', '5',
 '-minrate', '50k',
 '-b:v', '350k',
 '-maxrate', '450k', '-to', '00:00:02.900', '-an', '-y', '-f', 'webm', 'pipe:1'
]


imgsZip = requests.get(downloadUrl)
with zipfile.ZipFile(BytesIO(imgsZip.content)) as archive:
    files = [file for file in archive.infolist() if re.match(
        r"animation@2x\/\d+\@2x.png", file.filename)]
    for entry in files:
        fileName = entry.filename.replace(
            "animation@2x/", "").replace(".png", "")
        rootPath = 'D:\\' + os.path.join("Projects", "ffmpeg", "Temp")
        # original file
        apngFile = os.path.join(rootPath, fileName + '.png')
        # output file
        webmFile = os.path.join(rootPath, fileName + '.webm')
        # output info
        infoFile = os.path.join(rootPath, fileName + 'info.txt')

        with archive.open(entry) as file, open(apngFile, 'wb') as output_apng, open(webmFile, 'wb') as output_webm, open(infoFile, 'wb') as output_info:
            # pipe the APNG bytes into ffmpeg and collect the WebM bytes from stdout
            p = subprocess.Popen(args, stdin=subprocess.PIPE,
                                 stdout=subprocess.PIPE, stderr=output_info)
            outputBytes = p.communicate(input=file.read())[0]

            output_webm.write(outputBytes)
            file.seek(0)
            output_apng.write(file.read())




And you can try it; the result is the same as with C#.
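Since a pipe is not seekable, ffmpeg has to guess the input format by probing the first chunk of piped data, and it is at least plausible that this probing is what differs between the piped and the manual runs. A hypothetical variant of the Python args (an untested guess, not a confirmed fix) that forces the APNG demuxer instead of relying on probing:

# Sketch: declare the piped input format up front instead of letting ffmpeg probe the pipe.
args = [
    'ffmpeg',
    '-f', 'apng',            # force the APNG demuxer for the piped input
    '-i', 'pipe:0',
    '-vf', 'scale=512:512:force_original_aspect_ratio=decrease:flags=lanczos',
    '-pix_fmt', 'yuva420p',
    '-c:v', 'libvpx-vp9',
    '-cpu-used', '5',
    '-minrate', '50k', '-b:v', '350k', '-maxrate', '450k',
    '-to', '00:00:02.900', '-an', '-y', '-f', 'webm', 'pipe:1'
]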