
Media (1)
-
SPIP - plugins - embed code - Example
2 September 2013, by
Updated: September 2013
Language: French
Type: Image
Other articles (75)
-
Keeping control of your media in your hands
13 April 2011, by
The vocabulary used on this site and around MediaSPIP in general aims to avoid reference to Web 2.0 and the companies that profit from media-sharing.
While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed MédiaSpip is at version 0.2 or higher. If needed, contact the administrator of your MédiaSpip to find out.
-
List of compatible distributions
26 April 2011, by
The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.
Distribution name | Version name | Version number
Debian | Squeeze | 6.x.x
Debian | Wheezy | 7.x.x
Debian | Jessie | 8.x.x
Ubuntu | The Precise Pangolin | 12.04 LTS
Ubuntu | The Trusty Tahr | 14.04
If you want to help us improve this list, you can provide us access to a machine whose distribution is not mentioned above or send the necessary fixes to add (...)
On other sites (9257)
-
Running ffmpeg via PHP
11 March 2020, by FionaF
So the end goal is to have a PHP script that, given a list of images, will automatically create a slideshow video with transitions, text overlays and maybe an audio track as well.
I’m competent in coding PHP but not experienced in command line stuff.
We have a dedicated server and I got the hosting company to install ffmpeg for me.
So I thought I’d start simple and slowly build up to ultimate goal. But I’m really struggling. I’ve spent a couple of days trying all sorts of things with very little success.
I found this post with some excellent examples https://superuser.com/questions/833232/create-video-with-5-images-with-fadein-out-effect-in-ffmpeg but I can't get any of them to work, which makes me think that I'm doing something fundamentally wrong.
So this does work for me - I get a nice little 30-second video slideshow of 6 images displaying for 5 seconds each:
$ffmpeg="/usr/bin/ffmpeg";
exec($ffmpeg.' -f concat -safe 0 -i input.txt -c:v libx264 -r 30 -pix_fmt yuv420p -y out.mp4 2>&1', $output);
var_dump($output);
and this is input.txt:
file /home/webvivre/public_html/videos/test/i1.jpg
duration 5
file /home/webvivre/public_html/videos/test/i2.jpg
duration 5
file /home/webvivre/public_html/videos/test/i3.jpg
duration 5
file /home/webvivre/public_html/videos/test/i4.jpg
duration 5
file /home/webvivre/public_html/videos/test/i5.jpg
duration 5
file /home/webvivre/public_html/videos/test/i6.jpg
duration 5
But this (basically taken from an example in the link above - I have only changed the image names and the ffmpeg location) doesn't work for me:
$code="/usr/bin/ffmpeg -y -loop 1 -i i1.jpg -loop 1 -i i2.jpg -loop 1 -i i3.jpg -filter_complex \" [0:v]zoompan=z='min(zoom+0.0015,1.5)':d=125,trim=duration=5,blend=all_expr='A*(if(gte(T,0.5),1,T/0.5))+B*(1-(if(gte(T,0.5),1,T/0.5)))',setpts=PTS-STARTPTS[v0]; [1:v]zoompan=z='min(zoom+0.0015,1.5)':d=125,trim=duration=5,blend=all_expr='A*(if(gte(T,0.5),1,T/0.5))+B*(1-(if(gte(T,0.5),1,T/0.5)))',setpts=PTS-STARTPTS[v1]; [2:v]zoompan=z='min(zoom+0.0015,1.5)':d=125,trim=duration=5,blend=all_expr='A*(if(gte(T,0.5),1,T/0.5))+B*(1-(if(gte(T,0.5),1,T/0.5)))',setpts=PTS-STARTPTS[v2]; [v0][v1][v2] concat=n=3:v=1:a=0, format=yuv420p[v]\" -map '[v]' -c:v libx264 -pix_fmt yuvj420p -q:v 1 out.mp4 2>&1";
exec($code,$output);
var_dump($output);
This is the output:
array(25) { [0]=> string(67) "ffmpeg version 2.8.15 Copyright (c) 2000-2018 the FFmpeg developers" [1]=> string(56) " built with gcc 4.8.5 (GCC) 20150623 (Red Hat 4.8.5-36)" [2]=> string(1147) " configuration: --prefix=/usr --bindir=/usr/bin --datadir=/usr/share/ffmpeg --incdir=/usr/include/ffmpeg --libdir=/usr/lib64 --mandir=/usr/share/man --arch=x86_64 --optflags='-O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic' --extra-ldflags='-Wl,-z,relro ' --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libvo-amrwbenc --enable-version3 --enable-bzlib --disable-crystalhd --enable-gnutls --enable-ladspa --enable-libass --enable-libcdio --enable-libdc1394 --enable-libfdk-aac --enable-nonfree --disable-indev=jack --enable-libfreetype --enable-libgsm --enable-libmp3lame --enable-openal --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libvorbis --enable-libv4l2 --enable-libx264 --enable-libx265 --enable-libxvid --enable-x11grab --enable-avfilter --enable-avresample --enable-postproc --enable-pthreads --disable-static --enable-shared --enable-gpl --disable-debug --disable-stripping --shlibdir=/usr/lib64 --enable-runtime-cpudetect" [3]=> string(40) " libavutil 54. 31.100 / 54. 31.100" [4]=> string(40) " libavcodec 56. 60.100 / 56. 60.100" [5]=> string(40) " libavformat 56. 40.101 / 56. 40.101" [6]=> string(40) " libavdevice 56. 4.100 / 56. 4.100" [7]=> string(40) " libavfilter 5. 40.101 / 5. 40.101" [8]=> string(40) " libavresample 2. 1. 0 / 2. 1. 0" [9]=> string(40) " libswscale 3. 1.101 / 3. 1.101" [10]=> string(40) " libswresample 1. 2.101 / 1. 2.101" [11]=> string(40) " libpostproc 53. 3.100 / 53. 3.100" [12]=> string(38) "[mjpeg @ 0x183c720] Changeing bps to 8" [13]=> string(32) "Input #0, image2, from 'i1.jpg':" [14]=> string(61) " Duration: 00:00:00.04, start: 0.000000, bitrate: 33777 kb/s" [15]=> string(128) " Stream #0:0: Video: mjpeg, yuvj420p(pc, bt470bg/unknown/unknown), 1000x750 [SAR 1:1 DAR 4:3], 25 fps, 25 tbr, 25 tbn, 25 tbc" [16]=> string(38) "[mjpeg @ 0x18427c0] Changeing bps to 8" [17]=> string(32) "Input #1, image2, from 'i2.jpg':" [18]=> string(61) " Duration: 00:00:00.04, start: 0.000000, bitrate: 41896 kb/s" [19]=> string(132) " Stream #1:0: Video: mjpeg, yuvj444p(pc, bt470bg/unknown/unknown), 1000x750 [SAR 300:300 DAR 4:3], 25 fps, 25 tbr, 25 tbn, 25 tbc" [20]=> string(38) "[mjpeg @ 0x1849fa0] Changeing bps to 8" [21]=> string(32) "Input #2, image2, from 'i3.jpg':" [22]=> string(61) " Duration: 00:00:00.04, start: 0.000000, bitrate: 34776 kb/s" [23]=> string(132) " Stream #2:0: Video: mjpeg, yuvj444p(pc, bt470bg/unknown/unknown), 1000x750 [SAR 300:300 DAR 4:3], 25 fps, 25 tbr, 25 tbn, 25 tbc" [24]=> string(81) "Cannot find a matching stream for unlabeled input pad 1 on filter Parsed_blend_10" }
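The last entry in that dump points at the actual failure: blend is a two-input filter, and each chain in the filtergraph above feeds it only one stream, so its second input pad is left unconnected. A minimal sketch of a variant that drops blend (and the zoompan) entirely and uses per-segment fades instead - untested here, and the image names, the 5-second segment length and the 1-second fades are only placeholders:
$ffmpeg = "/usr/bin/ffmpeg";
// Assumes i1.jpg, i2.jpg and i3.jpg all have the same dimensions (concat requires it).
$filter = "[0:v]fade=t=in:st=0:d=1,fade=t=out:st=4:d=1,setpts=PTS-STARTPTS[v0];"
        . "[1:v]fade=t=in:st=0:d=1,fade=t=out:st=4:d=1,setpts=PTS-STARTPTS[v1];"
        . "[2:v]fade=t=in:st=0:d=1,fade=t=out:st=4:d=1,setpts=PTS-STARTPTS[v2];"
        . "[v0][v1][v2]concat=n=3:v=1:a=0,format=yuv420p[v]";
// Each image is looped for 5 seconds, faded in and out, then the three segments are concatenated.
$cmd = $ffmpeg." -y -loop 1 -t 5 -i i1.jpg -loop 1 -t 5 -i i2.jpg -loop 1 -t 5 -i i3.jpg"
     ." -filter_complex \"".$filter."\" -map \"[v]\" -c:v libx264 -r 25 out.mp4 2>&1";
exec($cmd, $output, $status);
var_dump($output);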
And for reference, this is the info provided by the hosting company after installing ffmpeg for me:
svr01~ # ffmpeg
ffmpeg version 2.8.15 Copyright (c) 2000-2018 the FFmpeg developers
built with gcc 4.8.5 (GCC) 20150623 (Red Hat 4.8.5-36)
configuration: --prefix=/usr --bindir=/usr/bin --datadir=/usr/share/ffmpeg --incdir=/usr/include/ffmpeg --libdir=/usr/lib64 --mandir=/usr/share/man --arch=x86_64 --optflags='-O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -m64 -mtune=generic' --extra-ldflags='-Wl,-z,relro ' --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libvo-amrwbenc --enable-version3 --enable-bzlib --disable-crystalhd --enable-gnutls --enable-ladspa --enable-libass --enable-libcdio --enable-libdc1394 --enable-libfdk-aac --enable-nonfree --disable-indev=jack --enable-libfreetype --enable-libgsm --enable-libmp3lame --enable-openal --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libvorbis --enable-libv4l2 --enable-libx264 --enable-libx265 --enable-libxvid --enable-x11grab --enable-avfilter --enable-avresample --enable-postproc --enable-pthreads --disable-static --enable-shared --enable-gpl --disable-debug --disable-stripping --shlibdir=/usr/lib64 --enable-runtime-cpudetect
libavutil 54. 31.100 / 54. 31.100
libavcodec 56. 60.100 / 56. 60.100
libavformat 56. 40.101 / 56. 40.101
libavdevice 56. 4.100 / 56. 4.100
libavfilter 5. 40.101 / 5. 40.101
libavresample 2. 1. 0 / 2. 1. 0
libswscale 3. 1.101 / 3. 1.101
libswresample 1. 2.101 / 1. 2.101
libpostproc 53. 3.100 / 53. 3.100
Hyper fast Audio and Video encoder
If someone could just shove me in the right direction to get the above working I'd be very grateful. I just know it's going to end up being something really simple.
-
How can I capture real-time command-line output of x265.exe with Python?
29 February 2020, by LeeRoermond
I would like to write a GUI for x265.exe which presents a better (more human-friendly) real-time progress display.
Here's the code I used to capture the subprocess's output:
import subprocess

cmd = r'ping www.baidu.com -n 4'
popen = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, shell=True)
while True:
    next_line = popen.stdout.readline()
    if next_line == b'' and popen.poll() is not None:
        break
    else:
        print(next_line.decode('ascii').replace('\r\n', '\n'), end='')
It performs perfectly with 'ping'.
However, when I switched to the 'x265' command, things got weird.
For example, if I replace the string variable 'cmd' with "x265 --y4m --crf 21 --output output.hevc input.y4m" in the preceding code, then theoretically it should give the following output, in lines arranged in order of time:
y4m [info]: 1920x1080 fps 24000/1001 i420p10 frames 0 - 100 of 101
x265 [info]: Using preset ultrafast & tune none
raw [info]: output file: C:\temp\output.hevc
x265 [info]: Main 10 profile, Level-4 (Main tier)
x265 [info]: Thread pool created using 16 threads
x265 [info]: Slices : 1
x265 [info]: frame threads / pool features : 4 / wpp(34 rows)
x265 [info]: Coding QT: max CU size, min CU size : 32 / 16
x265 [info]: Residual QT: max TU size, max depth : 32 / 1 inter / 1 intra
x265 [info]: ME / range / subpel / merge : dia / 57 / 0 / 2
x265 [info]: Keyframe min / max / scenecut / bias: 23 / 250 / 0 / 5.00
x265 [info]: Lookahead / bframes / badapt : 5 / 3 / 0
x265 [info]: AQ: mode / str / qg-size / cu-tree : 1 / 0.0 / 32 / 1
x265 [info]: Rate Control / qCompress : CRF-21.0 / 0.60
x265 [info]: tools: strong-intra-smoothing lslices=6 deblock
[1.0%] 1/101 frames, 6.289 fps, 7217.8 kb/s
[25.7%] 26/101 frames, 59.23 fps, 299.23 kb/s
[45.5%] 46/101 frames, 66.76 fps, 322.81 kb/s
[69.3%] 70/101 frames, 73.30 fps, 224.53 kb/s
[93.1%] 94/101 frames, 77.05 fps, 173.67 kb/s
x265 [info]: frame I: 1, Avg QP:23.45 kb/s: 7098.44
x265 [info]: frame P: 25, Avg QP:25.71 kb/s: 311.24
x265 [info]: frame B: 75, Avg QP:28.33 kb/s: 23.89
x265 [info]: consecutive B-frames: 3.8% 0.0% 0.0% 96.2%
encoded 101 frames in 1.22s (82.58 fps), 165.06 kb/s, Avg QP:27.64
But in reality, the output block in the middle, the part that indicates the real-time progress, is not captured each time it updates.
The popen.stdout.readline() call blocks until the progress goes to 100% and then everything is output at once. Obviously that's not what I want. (↓ this is the part I mean)
[1.0%] 1/101 frames, 6.289 fps, 7217.8 kb/s
[25.7%] 26/101 frames, 59.23 fps, 299.23 kb/s
[45.5%] 46/101 frames, 66.76 fps, 322.81 kb/s
[69.3%] 70/101 frames, 73.30 fps, 224.53 kb/s
[93.1%] 94/101 frames, 77.05 fps, 173.67 kb/s
Could anyone help me figure out what's going on and how to fix it to achieve my goal?
Thanks a lot.
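A likely explanation, for reference: x265 redraws its progress line in place using carriage returns (\r) rather than newlines, so readline() has nothing to return until the line is finally terminated at the very end. A minimal sketch that reads the stream one byte at a time and treats \r as a line break as well - untested, with the x265 arguments taken from the question:
import subprocess

cmd = 'x265 --y4m --crf 21 --output output.hevc input.y4m'
popen = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, shell=True)
buf = b''
while True:
    chunk = popen.stdout.read(1)   # one byte at a time, so a \r-terminated update is not held back
    if chunk == b'' and popen.poll() is not None:
        break
    if chunk in (b'\r', b'\n'):
        if buf:
            print(buf.decode('ascii', errors='replace'))   # each progress update becomes its own line
        buf = b''
    else:
        buf += chunk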
-
Concatenating video clip with static image causes buffer errors
16 February 2020, by jgaeb
I'm trying to concatenate a 15-second clip of a video (MOVIE.mp4) with 5 seconds (no audio) of an image (IMAGE.jpg) using FFmpeg.
Something seems to be wrong with my filtergraph, although I'm unable to determine what. The command I've put together is the following:
ffmpeg \
-loop 1 -t 5 -i IMAGE.jpg \
-t 15 -i MOVIE.mp4 \
-filter_complex "[0:v]scale=480:640[1_v];anullsrc[1_a];[1:v][1:a][1_v][1_a]concat=n=2:v=1:a=1[out]" \
-map "[out]" \
-strict experimental tst_full.mp4
Unfortunately, this seems to be creating some strange results:
-
On my personal computer (FFmpeg 4.2.1) it correctly concatenates the movie with the static image; however, the static image lasts for an unbounded length of time. (After entering ctrl-C, the movie is still viewable, but is extremely long - e.g., 35 min - depending on when I interrupt the process.)
-
On a remote machine where I need to do the final video processing (FFmpeg 2.8.15-0ubuntu0.16.04.1), the command does not terminate, and instead I get cascading errors of the following form:
Past duration 0.611458 too large
...
[output stream 0:0 @ 0x21135a0] 100 buffers queued in output stream 0:0, something may be wrong.
...
[output stream 0:0 @ 0x21135a0] 100000 buffers queued in output stream 0:0, something may be wrong.
I haven't been able to find much documentation that elucidates what these errors mean, so I don't know what's going wrong.
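One plausible culprit, for reference: anullsrc is an infinite source, and the concat filter keeps a segment open until every stream in it has ended, so the silent audio attached to the image segment never lets the output finish. A minimal sketch that simply trims the generated silence to the image's 5 seconds - untested, with everything else kept from the command above, and assuming MOVIE.mp4 is already 480x640 so the two video segments match:
ffmpeg \
-loop 1 -t 5 -i IMAGE.jpg \
-t 15 -i MOVIE.mp4 \
-filter_complex "[0:v]scale=480:640[1_v];anullsrc,atrim=duration=5[1_a];[1:v][1:a][1_v][1_a]concat=n=2:v=1:a=1[out]" \
-map "[out]" \
-strict experimental tst_full.mp4
If concat then complains about mismatched audio parameters, anullsrc's sample rate and channel layout may also need to be set to match MOVIE.mp4's audio track.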