
Media (1)
-
Publish an image simply
13 April 2011, by ,
Updated: February 2012
Language: French
Type: Video
Other articles (47)
-
XMP PHP
13 May 2011, by
As Wikipedia puts it, XMP stands for:
Extensible Metadata Platform, or XMP, is an XML-based metadata format used in PDF, photography and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
Being based on XML, it manages a set of dynamic tags for use within the Semantic Web.
XMP makes it possible to record, as an XML document, information about a file: title, author, history (...) (a minimal extraction sketch follows after this list)
-
Use, discuss, criticize
13 April 2011, by
Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
A discussion list is available for all exchanges between users.
-
Installation in farm mode
4 February 2011, by
Farm mode makes it possible to host several MediaSPIP-type sites while installing its functional core only once.
This is the method we use on this very platform.
Using farm mode requires some familiarity with how SPIP works, unlike the standalone version, which does not really require any specific knowledge, since SPIP’s usual private area is no longer used.
To begin with, you must have installed the same files as the installation (...)
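Since an XMP packet is just an XML document stored inside the file, it can be inspected without any dedicated library. A minimal sketch in Python (the article above targets PHP), assuming a file named photo.jpg that carries an embedded XMP packet:

import re

# XMP is embedded as plain XML text between <x:xmpmeta ...> and </x:xmpmeta>,
# so a byte-level search is enough for a quick look at title, author, history, etc.
with open("photo.jpg", "rb") as f:
    data = f.read()

packet = re.search(rb"<x:xmpmeta.*?</x:xmpmeta>", data, re.DOTALL)
if packet:
    print(packet.group(0).decode("utf-8", errors="replace"))
else:
    print("no XMP packet found")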
On other sites (5672)
-
FFMPEG or FFPLAY, catch FFT signal in real time as floats
25 April 2021, by NVRM
Looking to extract, in real time, an FFT snapshot of waveform data with ffplay, with a view to creating animations.

This is exactly what I am looking to catch, but this demo uses JavaScript in a browser. (Source: own post)




const audio = document.getElementById('music');
audio.load();
audio.play();

const ctx = new AudioContext();
const audioSrc = ctx.createMediaElementSource(audio);
const analyser = ctx.createAnalyser();

audioSrc.connect(analyser);
analyser.connect(ctx.destination);

analyser.fftSize = 256;                           // 256-sample FFT window
const bufferLength = analyser.frequencyBinCount;  // fftSize / 2 = 128 bins
const frequencyData = new Uint8Array(bufferLength);

// Poll the analyser once per second and dump the byte-scaled (0-255) spectrum.
setInterval(() => {
 analyser.getByteFrequencyData(frequencyData);
 console.log(frequencyData);
}, 1000);


<audio id="music" src="http://strm112.1.fm/reggae_mobile_mp3" crossorigin="use-credentials" controls></audio>








I tried many variations on the method posted at https://trac.ffmpeg.org/wiki/Waveform.




The problem is that the output format for the FFT is PCM (pulse-code modulation), and it is not real time.


More generally, is there a simple way to retrieve this data while the sound is playing?


ffplay -fft file.mp3 > fft.json




Using C, same stuff: Apply FFT on pcm data and convert to a spectrogram


FFMPEG waveform filter documentation
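One workaround, absent an ffplay option that dumps FFT frames directly, is to have ffmpeg decode to raw 32-bit float PCM on stdout and run the FFT in the consumer while the stream is read. A minimal sketch, assuming ffmpeg is on the PATH, file.mp3 is the input from the example above, and numpy is installed (the block size of 256 mirrors analyser.fftSize in the JavaScript snippet):

import subprocess
import numpy as np

SAMPLE_RATE = 44100
BLOCK = 256  # samples per FFT, same as analyser.fftSize above

# Ask ffmpeg for mono, 32-bit float, little-endian PCM on stdout.
proc = subprocess.Popen(
    ["ffmpeg", "-i", "file.mp3",
     "-f", "f32le", "-ac", "1", "-ar", str(SAMPLE_RATE), "pipe:1"],
    stdout=subprocess.PIPE, stderr=subprocess.DEVNULL)

while True:
    raw = proc.stdout.read(BLOCK * 4)      # 4 bytes per float32 sample
    if len(raw) < BLOCK * 4:
        break
    samples = np.frombuffer(raw, dtype=np.float32)
    bins = np.abs(np.fft.rfft(samples))    # BLOCK/2 + 1 magnitude bins
    print(bins.tolist())                   # one FFT frame per block

This reads as fast as ffmpeg can decode; to stay in step with audible playback you would either pace the loop by BLOCK / SAMPLE_RATE seconds per iteration or tap the player's output instead.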


-
How do I know ffmpeg-php is installed?
18 July 2014, by Rob Avery IV
I just followed the instructions from this link on how to install ffmpeg-php on my dedicated server: http://www.ndchost.com/wiki/server-administration/install-ffmpeg
At the bottom, it says to run the command
php -i|grep ffmpeg
and if it outputs the following lines, then it is installed:
ffmpeg
ffmpeg support (ffmpeg-php) => enabled
ffmpeg-php version => 0.6.0
ffmpeg.allow_persistent => 0 => 0
When I run it, it gives me this:
ffmpeg
ffmpeg-php version => 0.6.0-svn
ffmpeg-php built on => Jul 18 2014 08:46:12
ffmpeg-php gd support => enabled
ffmpeg libavcodec version => Lavc52.108.0
ffmpeg libavformat version => Lavf52.93.0
ffmpeg swscaler version => SwS0.12.0
ffmpeg.allow_persistent => 0 => 0
ffmpeg.show_warnings => 0 => 0
PWD => /usr/local/src/ffmpeg-php-0.6.0
_SERVER["PWD"] => /usr/local/src/ffmpeg-php-0.6.0
_ENV["PWD"] => /usr/local/src/ffmpeg-php-0.6.0
I got 2 of the 3 lines, but one of them is not character-for-character the same.
Is
ffmpeg
ffmpeg support (ffmpeg-php) => enabled
the same as
ffmpeg
ffmpeg support (ffmpeg-php) => enabled
in this context?
EDIT:
Running this command
ffmpeg -version
gives me this result:
FFmpeg version SVN-r26402, Copyright (c) 2000-2011 the FFmpeg developers
built on Jul 18 2014 08:41:45 with gcc 4.4.7 20120313 (Red Hat 4.4.7-3)
configuration: --enable-libmp3lame --disable-mmx --enable-shared
libavutil 50.36. 0 / 50.36. 0
libavcore 0.16. 1 / 0.16. 1
libavcodec 52.108. 0 / 52.108. 0
libavformat 52.93. 0 / 52.93. 0
libavdevice 52. 2. 3 / 52. 2. 3
libavfilter 1.74. 0 / 1.74. 0
libswscale 0.12. 0 / 0.12. 0
FFmpeg SVN-r26402
libavutil 50.36. 0 / 50.36. 0
libavcore 0.16. 1 / 0.16. 1
libavcodec 52.108. 0 / 52.108. 0
libavformat 52.93. 0 / 52.93. 0
libavdevice 52. 2. 3 / 52. 2. 3
libavfilter 1.74. 0 / 1.74. 0
libswscale 0.12. 0 / 0.12. 0
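For reference, a check that does not depend on matching grep output character for character is to ask PHP itself whether the extension is registered, via extension_loaded(). A minimal sketch (Python is used here only as a wrapper around the php CLI; the embedded one-liner is the actual check), assuming the php binary on the PATH reads the same php.ini as the SAPI you care about, and that the extension registers under the name "ffmpeg", which the ffmpeg.* ini lines above suggest:

import subprocess

# Ask the PHP CLI whether the "ffmpeg" extension is registered.
check = "echo extension_loaded('ffmpeg') ? 'loaded' : 'missing';"
result = subprocess.run(["php", "-r", check], capture_output=True, text=True)
print(result.stdout.strip())   # "loaded" -> ffmpeg-php is visible to this PHP binary

As for the version line, a -svn suffix such as 0.6.0-svn normally just marks a build taken from the project's SVN tree rather than a packaged release.
-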
Cannot get JACK Audio/Netjack working over LAN
23 June 2020, by James
I'm trying to stream low-latency audio between two Raspberry Pis. Both GStreamer and FFmpeg introduce 2+ second delays for me.



I've played around with Jack Audio and locally on a single pi it seems promising. I can route mic input to a speaker locally and it is almost instantaneous.



However, I have been having trouble getting it to route between devices using Netjack.



# ON SERVER
jackd -P70 -p16 -t2000 -dalsa -dhw:1 -p128 -n3 -r44100 -s 

# ON CLIENT
jackd -v -R -P70 -dnetone -i1 -o1 -I0 -O0 -r44100 -p128 -n3

# ON SERVER
jack_netsource -H <ip address of client>
jack_lsp # list available connection ports

>system:capture_1
>system:playback_1
>system:playback_2
>netjack:capture_1
>netjack:capture_2
>netjack:capture_3
>netjack:playback_1
>netjack:playback_2
>netjack:playback_3

jack_connect system:capture_1 system:playback_1 # this works
jack_connect system:capture_1 netjack:playback_1 # this doesn't work :(




Most of the launch options I pulled from here: http://wiki.linuxaudio.org/wiki/raspberrypi#using_jack. I'll be honest, I don't really know what they do.



The client jackd output shows messages like



Jack: data not valid
Jack: data not valid
Jack: JackSocketServerChannel::Execute : fPollTable i = 1 fd = 6
Jack: JackRequest::Notification
Jack: JackEngine::ClientNotify: no callback for notification = 3
Jack: JackEngine::ClientNotify: no callback for notification = 3
netxruns... duration: 139ms
Jack: JackSocketServerChannel::Execute : fPollTable i = 1 fd = 6
Jack: JackRequest::Notification
Jack: JackEngine::ClientNotify: no callback for notification = 3
Jack: JackEngine::ClientNotify: no callback for notification = 3




And the server jack_netsource output looks like



current latency 114
current latency 20
current latency 27
current latency 29
current latency 48
current latency 23
current latency 33
current latency 28
current latency 41
current latency 84
current latency 44




and the server jackd output looks like



JackAudioDriver::ProcessGraphAsyncMaster: Process error
JackAudioDriver::ProcessGraphAsyncMaster: Process error
JackAudioDriver::ProcessGraphAsyncMaster: Process error
JackAudioDriver::ProcessGraphAsyncMaster: Process error
JackEngine::XRun: client = netjack was not finished, state = Triggered
JackAudioDriver::ProcessGraphAsyncMaster: Process error
JackAudioDriver::ProcessGraphAsyncMaster: Process error
JackEngine::XRun: client = netjack was not finished, state = Triggered
JackEngine::XRun: client = netjack was not finished, state = Triggered




I believe the -dnetone flag indicates the use of Netjack2. Netjack 1, which I've tried with the -dnet flag, results in a single Not Connected message from jack_netsource and:


Jack: CatchHost fd = 5 err = Resource temporarily unavailable
Jack: CatchHost fd = 5 err = Resource temporarily unavailable
Jack: CatchHost fd = 5 err = Resource temporarily unavailable
Jack: CatchHost fd = 5 err = Resource temporarily unavailable
Jack: CatchHost fd = 5 err = Resource temporarily unavailable
Jack: JackSocketServerChannel::Execute : fPollTable i = 1 fd = 6




from the client jackd.
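One extra debugging step is to make the failing connection from a small script, which surfaces the exact error string the JACK server returns instead of jack_connect just refusing. A minimal sketch, assuming the JACK-Client Python package (pip install JACK-Client) is available on the machine running jackd; the port names are the ones from jack_lsp above:

import jack

# Attach to the running JACK server as a throwaway client.
client = jack.Client("netjack-probe")

# List every port the server knows about, to confirm the netjack ports exist.
for port in client.get_ports():
    print(port.name)

# Try the connection that jack_connect refuses and print the server's error.
try:
    client.connect("system:capture_1", "netjack:playback_1")
    print("connected")
except jack.JackError as err:
    print("connection failed:", err)

Nothing here changes the routing itself; it only makes the failure mode visible.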