
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011
Updated: July 2013
Language: French
Type: Text
Other articles (65)
-
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out.
-
MediaSPIP Core: Configuration
9 November 2010
By default, MediaSPIP Core provides three different configuration pages (these pages rely on the CFG configuration plugin to work): a page for the general configuration of the templates (squelettes); a page for the configuration of the site's home page; a page for the configuration of the sections (secteurs).
It also provides an additional page, which only appears when certain plugins are enabled, for controlling their display and specific features (...)
On other sites (8661)
-
How to give a random name to the output file in ffmpeg
18 March 2020, by amit9867
How can I give a random name to an output file in ffmpeg?
I want the filename to be the current date and time (e.g. 2020-3-18-10-13-4.mkv).
I don't want to use a fixed name such as output.mkv.
import os
import subprocess
import tkinter as tk
import datetime

root = tk.Tk()
os.chdir(f'C://Users/{os.getlogin()}/desktop/')

def recording_voice():
    global p
    p = subprocess.Popen(
        'ffmpeg -i video.avi -i audio.wav -c:v copy -c:a aac -strict experimental '
        '-strftime 1 "%Y-%m-%d_%H-%M-%S.mkv"',
        stdin=subprocess.PIPE)

rec_btn = tk.Button(text='Start merging', width=20, command=recording_voice)
rec_btn.pack()
root.mainloop()
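A minimal sketch of an alternative (assuming the same video.avi and audio.wav inputs as above): since -strftime is, as far as I can tell, only honoured by muxers such as segment, hls and image2 and not by a plain single-file .mkv output, it is simpler to build the timestamped name in Python with datetime.strftime and pass it to ffmpeg as the output path. Recent ffmpeg builds also no longer need -strict experimental for the native AAC encoder.

import datetime
import subprocess

def merge_with_timestamp():
    # Build the output name in Python, e.g. 2020-03-18_10-13-04.mkv
    out_name = datetime.datetime.now().strftime('%Y-%m-%d_%H-%M-%S') + '.mkv'
    cmd = ['ffmpeg', '-i', 'video.avi', '-i', 'audio.wav',
           '-c:v', 'copy', '-c:a', 'aac', out_name]
    subprocess.run(cmd, check=True)  # raises CalledProcessError if ffmpeg fails

merge_with_timestamp()

-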
(osx) ffmpeg combining mp3 and png to mp4 resulting in mp4 with no audio
18 June 2016, by Ian H
I'm writing a Python script that uses Unix commands to do some file conversions/renderings. I'm trying to join MP3 files with PNG files to get MP4s that show the picture with the MP3 playing over it. However, I've tried this with lots of different codecs and settings, and the output MP4 never seems to have any audio in it. I've looked at every answer to every question even remotely related to ffmpeg and haven't found a solution.
Some commands I'm currently trying to get working:
ffmpeg -loop 1 -i slide_shot%d.png -i %s -c:v libx264 -pix_fmt yuv420p
-s 720x540 -t %.3f -c:a aac -b:a 192k -shortest out%d.mp4"
% (i, aud, slideTime, i)

ffmpeg -loop 1 -i slide_shot%d.png -i %s -shortest -t %.3f -write_xing
0 -c:v libx264 -c:a libmp3lame -pix_fmt yuv420p -tune stillimage out%d.mp4"
% (i, aud, slideTime, i)

ffmpeg -loop 1 -i slide_shot%d.png -i %s -shortest -t %.3f -write_xing
0 -c:v libx264 -c:a copy -pix_fmt yuv420p -tune stillimage out%d.mp4"
% (i, aud, slideTime, i)

I'm currently using the third one. However, none of them are giving me any audio. For reference, i is a loop iterator for naming consistency, aud is the audio file path, and slideTime is the number of seconds the video should take.
Using this command, I'm currently getting this output in the Terminal:
ffmpeg version 3.0.2 Copyright (c) 2000-2016 the FFmpeg developers
built with Apple LLVM version 7.0.2 (clang-700.1.81)
configuration: --prefix=/usr/local/Cellar/ffmpeg/3.0.2 --enable-shared
--enable-pthreads --enable-gpl --enable-version3 --enable-hardcoded-
tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --
enable-opencl --enable-libx264 --enable-libmp3lame --enable-libxvid --
enable-vda
libavutil 55. 17.103 / 55. 17.103
libavcodec 57. 24.102 / 57. 24.102
libavformat 57. 25.100 / 57. 25.100
libavdevice 57. 0.101 / 57. 0.101
libavfilter 6. 31.100 / 6. 31.100
libavresample 3. 0. 0 / 3. 0. 0
libswscale 4. 0.100 / 4. 0.100
libswresample 2. 0.101 / 2. 0.101
libpostproc 54. 0.100 / 54. 0.100
Input #0, png_pipe, from 'slide_shot16.png':
Duration: N/A, bitrate: N/A
Stream #0:0: Video: png, rgba(pc), 720x540, 25 fps, 25 tbr, 25 tbn, 25
tbc
[mp3 @ 0x7fe4f1817e00] Skipping 0 bytes of junk at 0.
[mp3 @ 0x7fe4f1817e00] Estimating duration from bitrate, this may be inaccurate
Input #1, mp3, from 'pres_projects/Cytokine sepsis 13/data/a24x43.mp3':
Duration: 00:02:04.11, start: 0.000000, bitrate: 23 kb/s
Stream #1:0: Audio: mp3, 22050 Hz, mono, s16p, 24 kb/s
[libx264 @ 0x7fe4f1808000] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
[libx264 @ 0x7fe4f1808000] profile High, level 3.0
[libx264 @ 0x7fe4f1808000] 264 - core 148 r2668 fd2c324 - H.264/MPEG-4 AVC codec - Copyleft 2003-2016 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:-3:-3 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=2.00:0.70 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-4 threads=12 lookahead_threads=2 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.20
Output #0, mp4, to 'out16.mp4':
Metadata:
encoder : Lavf57.25.100
Stream #0:0: Video: h264 (libx264) ([33][0][0][0] / 0x0021), yuv420p, 720x540, q=-1--1, 25 fps, 12800 tbn, 25 tbc
Metadata:
encoder : Lavc57.24.102 libx264
Side data:
unknown side data type 10 (24 bytes)
Stream #0:1: Audio: mp3 (i[0][0][0] / 0x0069), 22050 Hz, mono, 24 kb/s
Stream mapping:
Stream #0:0 -> #0:0 (png (native) -> h264 (libx264))
Stream #1:0 -> #0:1 (copy)
Press [q] to stop, [?] for help
frame= 132 fps=0.0 q=28.0 size=40kB time=00:00:02.96 bitrate=111.8kbits/
frame= 272 fps=271 q=28.0 size= 61kB time=00:00:08.56 bitrate= 58.2kbits/
frame= 404 fps=269 q=28.0 size= 113kB time=00:00:13.84 bitrate= 66.6kbits/
frame= 537 fps=268 q=28.0 size= 132kB time=00:00:19.16 bitrate= 56.2kbits/
frame= 672 fps=268 q=28.0 size= 184kB time=00:00:24.56 bitrate= 61.3kbits/
frame= 808 fps=268 q=28.0 size= 236kB time=00:00:30.00 bitrate= 64.5kbits/
frame= 943 fps=268 q=28.0 size= 255kB time=00:00:35.40 bitrate= 59.1kbits/
frame= 1087 fps=271 q=28.0 size= 309kB time=00:00:41.16 bitrate= 61.5kbits/
frame= 1219 fps=270 q=28.0 size= 328kB time=00:00:46.44 bitrate= 57.8kbits/
frame= 1355 fps=270 q=28.0 size= 380kB time=00:00:51.88 bitrate= 60.0kbits/frame= 1494 fps=271 q=28.0 size= 400kB time=00:00:57.44 bitrate= 57.1kbits/
frame= 1632 fps=271 q=28.0 size= 453kB time=00:01:02.96 bitrate= 58.9kbits/
frame= 1767 fps=271 q=28.0 size= 472kB time=00:01:08.36 bitrate= 56.6kbits/
frame= 1893 fps=269 q=28.0 size= 523kB time=00:01:13.40 bitrate= 58.4kbits/
frame= 2020 fps=268 q=28.0 size= 541kB time=00:01:18.48 bitrate= 56.5kbits/
frame= 2147 fps=267 q=28.0 size= 592kB time=00:01:23.56 bitrate= 58.1kbits/
frame= 2275 fps=267 q=28.0 size= 611kB time=00:01:28.68 bitrate= 56.4kbits/
frame= 2401 fps=266 q=28.0 size= 661kB time=00:01:33.72 bitrate= 57.8kbits/
frame= 2528 fps=265 q=28.0 size= 680kB time=00:01:38.80 bitrate= 56.4kbits/
frame= 2654 fps=264 q=28.0 size= 731kB time=00:01:43.84 bitrate= 57.6kbits/
frame= 2781 fps=264 q=28.0 size= 749kB time=00:01:48.92 bitrate= 56.3kbits/
frame= 2906 fps=263 q=28.0 size= 799kB time=00:01:53.92 bitrate= 57.5kbits/
frame= 3033 fps=263 q=28.0 size= 818kB time=00:01:59.00 bitrate= 56.3kbits/
frame= 3102 fps=261 q=-1.0 Lsize= 983kB time=00:02:04.08 bitrate= 64.9kbits/s speed=10.5x
video:505kB audio:364kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 13.169518%
[libx264 @ 0x7fe4f1808000] frame I:13 Avg QP:14.07 size: 33159
[libx264 @ 0x7fe4f1808000] frame P:782 Avg QP: 6.24 size: 36
[libx264 @ 0x7fe4f1808000] frame B:2307 Avg QP: 9.67 size: 25
[libx264 @ 0x7fe4f1808000] consecutive B-frames: 0.8% 0.0% 0.0% 99.2%
[libx264 @ 0x7fe4f1808000] mb I I16..4: 44.1% 26.2% 29.6%
[libx264 @ 0x7fe4f1808000] mb P I16..4: 0.0% 0.0% 0.0% P16..4: 0.0% 0.0% 0.0% 0.0% 0.0% skip:100.0%
[libx264 @ 0x7fe4f1808000] mb B I16..4: 0.0% 0.0% 0.0% B16..8: 0.1% 0.0% 0.0% direct: 0.0% skip:99.9% L0:40.4% L1:59.6% BI: 0.0%
[libx264 @ 0x7fe4f1808000] 8x8 transform intra:26.1% inter:77.7%
[libx264 @ 0x7fe4f1808000] coded y,uvDC,uvAC intra: 23.8% 9.6% 8.1% inter: 0.0% 0.0% 0.0%
[libx264 @ 0x7fe4f1808000] i16 v,h,dc,p: 60% 33% 7% 0%
[libx264 @ 0x7fe4f1808000] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 57% 12% 29% 0% 0% 0% 0% 0% 2%
[libx264 @ 0x7fe4f1808000] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 36% 29% 14% 2% 3% 4% 4% 3% 4%
[libx264 @ 0x7fe4f1808000] i8c dc,h,v,p: 74% 21% 5% 0%
[libx264 @ 0x7fe4f1808000] Weighted P-Frames: Y:0.0% UV:0.0%
[libx264 @ 0x7fe4f1808000] ref P L0: 95.4% 1.1% 3.5%
[libx264 @ 0x7fe4f1808000] ref B L0: 8.5% 90.2% 1.3%
[libx264 @ 0x7fe4f1808000] kb/s:33.31
Has anyone run into a similar problem, and if so, how did you go about fixing it? Thanks in advance for looking at my question.
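For what it is worth, the log above already shows the MP3 stream being mapped and written (Stream #1:0 -> #0:1 (copy), audio:364kB), so the file most likely does contain an audio track; running ffprobe out16.mp4 would confirm that, and the symptom may simply be a player that cannot decode MP3 inside MP4. Below is a sketch of one variant worth trying, not a confirmed fix: map both streams explicitly and re-encode the audio to AAC, which MP4 players support universally. The file names and the i / aud / slideTime values are hypothetical stand-ins mirroring the question.

import subprocess

i = 16                 # placeholder loop index
aud = 'audio.mp3'      # placeholder audio file path
slideTime = 124.11     # placeholder duration in seconds

cmd = (
    'ffmpeg -loop 1 -i slide_shot%d.png -i %s '
    '-map 0:v -map 1:a '                               # video from input 0, audio from input 1
    '-c:v libx264 -tune stillimage -pix_fmt yuv420p -s 720x540 '
    '-c:a aac -b:a 192k '                              # re-encode to AAC instead of copying the MP3
    '-t %.3f -shortest out%d.mp4' % (i, aud, slideTime, i)
)
subprocess.check_call(cmd, shell=True)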
-
How to get frames from an HDR video in the scRGB color space?
5 March 2018, by Виталий Синявский
I want to create a simple video player that will show HDR video on an HDR TV, for example the "LG Chess HDR" video. It is encoded with HEVC, its bit depth is 10 bits, the pixel format is YUV420P10LE, and it has metadata indicating the BT.2020 color space and the PQ transfer function.
In this NVIDIA article I found the following:
The display driver takes the scRGB back buffer, and converts it to the
standard expected by the display presently connected. In general, this
means converting the color space from sRGB primaries to BT. 2020
primaries, scaling to an appropriate level, and encoding with a
mechanism like PQ. Also, possibly performing conversions like RGB to
YCC if that display connection requires it.
This means that my player should render pixels in the scRGB color space (linear encoding, sRGB primaries, full range from -0.5 up to just under +7.5). So I need to somehow get frames from the source video in this color space, preferably in an FP16 pixel format (half float, 16 bits per color channel). I came up with the following simple pipeline for rendering HDR video:
source HDR video in BT2020 color space with applied PQ -> [some video library] ->
-> video frames with colors in scRGB color space -> [my program] ->
-> rendered video on the HDR TV, with conversions applied by the display driver
I'm trying to use FFmpeg as this library, but I don't understand how to get frames from the source HDR video in the scRGB color space.
I currently use FFmpeg's sws_scale to get frames and I know about the filters API, but I haven't found any information or help on how to transparently get scRGB frames with this functionality, without parsing the metadata of every source video and creating custom video filters for each of them.
Please tell me what I can do to get frames in the scRGB color space using FFmpeg. Can anyone suggest other libraries with which I could do this?
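A rough sketch of one possible route, not a verified scRGB pipeline: the zscale filter (which wraps libzimg) can undo the PQ transfer and convert the BT.2020 primaries to BT.709/sRGB, giving linear-light 32-bit float planar RGB frames; the application would then narrow them to FP16 itself, since FFmpeg of that era exposes no half-float pixel format. The input name, the frame size and the exact filter chain below are assumptions; the chain mirrors the zscale chain commonly shown for HDR tone-mapping, just without the tonemap step, so levels and out-of-gamut values would still need checking against what the renderer expects.

import subprocess
import numpy as np

WIDTH, HEIGHT = 3840, 2160   # assumed frame size of the source video
VF = 'zscale=transfer=linear:npl=100,format=gbrpf32le,zscale=primaries=bt709'

proc = subprocess.Popen(
    ['ffmpeg', '-i', 'input_hdr.mkv', '-vf', VF,
     '-f', 'rawvideo', '-pix_fmt', 'gbrpf32le', '-'],
    stdout=subprocess.PIPE)

frame_bytes = WIDTH * HEIGHT * 3 * 4                  # three float32 planes per frame
while True:
    raw = proc.stdout.read(frame_bytes)
    if len(raw) < frame_bytes:
        break
    planes = np.frombuffer(raw, dtype=np.float32).reshape(3, HEIGHT, WIDTH)
    g, b, r = planes                                  # gbrpf32le plane order is G, B, R
    rgb = np.stack([r, g, b], axis=-1)                # linear light, BT.709/sRGB primaries
    half = rgb.astype(np.float16)                     # hand this FP16 buffer to the renderer

proc.stdout.close()
proc.wait()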