
Other articles (37)
-
Permissions overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors can edit their own information on the authors page
-
Support for all media types
10 April 2011
Unlike many other modern software packages and document-sharing platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (AVI, MP4, OGV, mpg, mov, wmv and others...); or textual content, code and other data (OpenOffice, Microsoft Office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)
-
Supporting all media types
13 April 2011
Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)
On other sites (6466)
-
MacOS - how to choose audio device from terminal
9 October 2024, by jon_two
I've been working on a Python program to create audio and also play back existing sound files. I can spawn multiple processes and have them all play to the laptop speakers, but I was wondering if it is possible to send each signal to a separate sound device. This is so I can apply effects to some processes but not to all of them together.


I'm using a MacBook and Python's simpleaudio, which calls AudioToolbox to connect to the output device. I also have ffmpeg installed, so I could use ffplay if that is easier. The pydub library takes that route: it exports the current wave to a temp file, then uses subprocess and ffplay to play it back.
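For reference, the call pydub ends up making is roughly the following (a sketch only; the exact flags vary between pydub versions, and sound.wav stands in for the temp file):

# Roughly what pydub's ffplay backend does: play a WAV with no video window
# and exit as soon as playback finishes.
ffplay -nodisp -autoexit -hide_banner sound.wav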

I can get a list of devices, but am not sure how to use this list to choose a device.


% ffplay -devices
Devices:
 D. = Demuxing supported
 .E = Muxing supported
 --
 E audiotoolbox AudioToolbox output device
 D avfoundation AVFoundation input device
 D lavfi Libavfilter virtual input device
 E sdl,sdl2 SDL2 output device
 D x11grab X11 screen capture, using XCB



I did see a post that suggested using ffmpeg to list devices, but again I can't figure out how to use this list.

% ffmpeg -f lavfi -i sine=r=44100 -f audiotoolbox -list_devices true -
Input #0, lavfi, from 'sine=r=44100':
 Duration: N/A, start: 0.000000, bitrate: 705 kb/s
 Stream #0:0: Audio: pcm_s16le, 44100 Hz, mono, s16, 705 kb/s
Stream mapping:
 Stream #0:0 -> #0:0 (pcm_s16le (native) -> pcm_s16le (native))
Press [q] to stop, [?] for help
[AudioToolbox @ 0x135e3f230] CoreAudio devices:
[AudioToolbox @ 0x135e3f230] [0] Background Music, (null)
[AudioToolbox @ 0x135e3f230] [1] Background Music (UI Sounds), BGMDevice_UISounds
[AudioToolbox @ 0x135e3f230] [2] MacBook Air Microphone, BuiltInMicrophoneDevice
[AudioToolbox @ 0x135e3f230] [3] MacBook Air Speakers, BuiltInSpeakerDevice
[AudioToolbox @ 0x135e3f230] [4] Aggregate Device, ~:AMS2_Aggregate:0
Output #0, audiotoolbox, to 'pipe:':
 Metadata:
 encoder : Lavf59.27.100
 Stream #0:0: Audio: pcm_s16le, 44100 Hz, mono, s16, 705 kb/s
 Metadata:
 encoder : Lavc59.37.100 pcm_s16le
size=N/A time=00:00:05.06 bitrate=N/A speed=0.984x 
video:0kB audio:436kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
Exiting normally, received signal 2.



This does at least give me a recognisable list of devices. If I add more Aggregate Devices, can I play back different files to each device?
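One route that might be worth trying, as a sketch rather than something verified on this exact setup: ffmpeg's audiotoolbox output device accepts an -audio_device_index option, so each process could target a different index from the list above (first.wav and second.wav are placeholder file names):

# Play one file on the built-in speakers (index 3 in the list above)
# and another on the aggregate device (index 4), in separate processes.
ffmpeg -re -i first.wav -f audiotoolbox -audio_device_index 3 - &
ffmpeg -re -i second.wav -f audiotoolbox -audio_device_index 4 - &
wait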


-
How do I get ffprobe to parse 'TAG: timecode' into an ffmpeg 'drawtext' command? (Bash Terminal)
4 June 2019, by Myles
I have a .mov file that contains original source timecode metadata, but I can't figure out a way to get ffmpeg to burn the original timecode into the picture.
If I open the original file in QuickTime Player, it displays the true timecode on the far left.
I can also see that ffprobe is able to see the metadata when I run the following:
Command:
ffprobe -i test.mov -show_streams
Abbreviated result:
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'test.mov':
 Metadata:
 major_brand : qt
 minor_version : 537199360
 compatible_brands: qt
 creation_time : 2018-11-05T14:20:51.000000Z
 timecode : 09:59:53:00
 Duration: 00:16:37.64, start: 0.000000, bitrate: 1680 kb/s

So I can see that ffprobe is able to determine the start timecode of the file from its metadata. The question is: how do I pass that information into an ffmpeg command, so that the timecode seen by ffprobe is what gets used when I convert the file for timecode burn-in?
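As an aside, a narrower ffprobe query can return just that tag. This is a sketch that assumes the timecode sits in the container-level metadata, as in the output above; if it is attached to a stream instead, stream_tags=timecode is the analogous query:

# Print only the container-level timecode tag, with no keys or wrappers.
ffprobe -v error -show_entries format_tags=timecode \
 -of default=noprint_wrappers=1:nokey=1 test.mov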
An example of a standard burnt-in timecode command would be this:
ffmpeg -i test.mov -vcodec libx264 -cmp 22 -vf
"drawtext=fontfile=DroidSansMono.ttf : timecode='09\:59\:53\:00' : r=25 :
x=(w-tw)/2 : y=h-(2*lh) : fontcolor=white : box=1 : boxcolor=0x00000099"
-y test_bitc.mov

The only problem there, though, is that I've had to put the timecode in manually. I want the command to use the existing timecode metadata as the timecode input value, so that the same command can be used on multiple files.
Does anyone know how to do this?
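A sketch of one way to wire the two together in bash, assuming the tag is at the container level as shown above and reusing the file name and drawtext options from the question (the colons inside the timecode have to be escaped so drawtext does not read them as option separators):

# Read the source timecode from the container metadata.
tc=$(ffprobe -v error -show_entries format_tags=timecode \
 -of default=noprint_wrappers=1:nokey=1 test.mov)

# Escape the colons for drawtext, e.g. 09:59:53:00 -> 09\:59\:53\:00.
tc_escaped=$(printf '%s' "$tc" | sed 's/:/\\:/g')

# Burn it in using the same options as the manual command.
ffmpeg -i test.mov -vcodec libx264 -cmp 22 \
 -vf "drawtext=fontfile=DroidSansMono.ttf:timecode='${tc_escaped}':r=25:x=(w-tw)/2:y=h-(2*lh):fontcolor=white:box=1:boxcolor=0x00000099" \
 -y test_bitc.mov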
-
Terminal: How to extract image-based subtitles from mp4?
4 August 2022, by DevonDahon
How to extract image-based subtitles from an mp4 video?


$ ffmpeg -i myvideo.mp4
ffmpeg version 5.0.1 Copyright (c) 2000-2022 the FFmpeg developers
 built with Apple clang version 13.1.6 (clang-1316.0.21.2.5)
 configuration: --prefix=/usr/local/Cellar/ffmpeg/5.0.1_3 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libdav1d --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox
 libavutil 57. 17.100 / 57. 17.100
 libavcodec 59. 18.100 / 59. 18.100
 libavformat 59. 16.100 / 59. 16.100
 libavdevice 59. 4.100 / 59. 4.100
 libavfilter 8. 24.100 / 8. 24.100
 libswscale 6. 4.100 / 6. 4.100
 libswresample 4. 3.100 / 4. 3.100
 libpostproc 56. 3.100 / 56. 3.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'myvideo.mp4':
 Metadata:
 major_brand : mp42
 minor_version : 512
 compatible_brands: mp42iso2avc1mp41
 creation_time : 2022-08-03T21:58:07.000000Z
 encoder : HandBrake 1.5.1 2022011000
 Duration: 00:11:05.40, start: 0.000000, bitrate: 1635 kb/s
 Stream #0:0[0x1](und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, smpte170m/bt470bg/bt470bg, progressive), 712x576 [SAR 16:15 DAR 178:135], 1454 kb/s, 25 fps, 25 tbr, 90k tbn (default)
 Metadata:
 creation_time : 2022-08-03T21:58:07.000000Z
 handler_name : VideoHandler
 vendor_id : [0][0][0][0]
 Stream #0:1[0x2](bre): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 162 kb/s (default)
 Metadata:
 creation_time : 2022-08-03T21:58:07.000000Z
 handler_name : Stereo
 vendor_id : [0][0][0][0]
 Stream #0:2[0x3](bre): Subtitle: dvd_subtitle (mp4s / 0x7334706D), 720x576, 6 kb/s (default)
 Metadata:
 creation_time : 2022-08-03T21:58:07.000000Z
 handler_name : Brezhoneg
 Stream #0:3[0x4](fra): Subtitle: dvd_subtitle (mp4s / 0x7334706D), 720x576, 6 kb/s
 Metadata:
 creation_time : 2022-08-03T21:58:07.000000Z
 handler_name : Français
At least one output file must be specified
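The two dvd_subtitle streams (#0:2 and #0:3) are bitmap subtitles, so they cannot simply be copied out as .srt text. One possible route, sketched here with illustrative output names, is to copy them into a container that supports VobSub tracks, such as Matroska, and then pull out .idx/.sub pairs with a tool like mkvextract from MKVToolNix:

# Copy the two image-based subtitle streams into an MKV without re-encoding.
ffmpeg -i myvideo.mp4 -map 0:2 -map 0:3 -c copy subs.mkv

# Extract them as VobSub pairs (track IDs 0 and 1 in the new file).
mkvextract subs.mkv tracks 0:breton.idx 1:french.idx

For VobSub tracks, mkvextract writes the matching .sub file alongside each .idx.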