
Other articles (86)

  • Encoding and conversion into formats readable on the web

    10 April 2011

    MediaSPIP converts and re-encodes uploaded documents so that they are readable on the web and automatically usable with no intervention from the content creator.
    Videos are automatically encoded into the formats supported by HTML5: MP4, Ogv and WebM. The "MP4" version is also used by the fallback Flash player needed for older browsers.
    Audio documents are likewise re-encoded into the two formats usable with HTML5: MP3 and Ogg. The "MP3" version (...)
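
    For illustration, re-encoding a source file into these HTML5 formats can be done with generic ffmpeg commands along the following lines (a sketch of the idea, not MediaSPIP's actual encoding profiles):

# Video: one source, three HTML5-compatible outputs
ffmpeg -i source.mov -c:v libx264 -c:a aac -movflags +faststart video.mp4
ffmpeg -i source.mov -c:v libvpx -c:a libvorbis video.webm
ffmpeg -i source.mov -c:v libtheora -c:a libvorbis video.ogv

# Audio: the two HTML5-compatible formats
ffmpeg -i source.wav -c:a libmp3lame -q:a 4 audio.mp3
ffmpeg -i source.wav -c:a libvorbis -q:a 5 audio.ogg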

  • Monitoring MediaSPIP farms (and SPIP farms while we're at it)

    31 May 2013, by

    When you manage several (or even several dozen) MediaSPIP sites on the same installation, it can be very handy to get certain information at a glance.
    The purpose of this article is to document the Munin monitoring scripts developed with the help of Infini.
    These scripts are installed automatically by the automatic installation script if a Munin installation is detected.
    Description of the scripts
    Three Munin scripts have been developed:
    1. mediaspip_medias
    A script for (...)
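
    As a rough idea of what such a plugin looks like, here is a minimal Munin plugin skeleton in shell (a generic sketch following the Munin plugin protocol, not the actual mediaspip_medias script):

#!/bin/sh
# Hypothetical sketch: report a single "number of media documents" value.
case "$1" in
    config)
        # Munin calls the plugin with "config" to learn how to draw the graph
        echo "graph_title MediaSPIP media documents"
        echo "graph_vlabel documents"
        echo "medias.label media documents"
        exit 0
        ;;
esac
# On a normal run the plugin prints "field.value <number>".
# A real script would query the SPIP database here; this value is a placeholder.
echo "medias.value 42"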

  • Taking part in its translation

    10 April 2011

    You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
    To do this, we use SPIP's translation interface, where all of MediaSPIP's language modules are available. You simply need to subscribe to the translators' mailing list to ask for more information.
    At the moment MediaSPIP is only available in French and (...)

On other sites (14064)

  • Cannot get JACK Audio/Netjack working over LAN

    23 June 2020, by James

    I'm trying to stream low-latency audio between two Raspberry Pis. Both GStreamer and FFmpeg give me delays of 2+ seconds.

    I've played around with Jack Audio and locally on a single pi it seems promising. I can route mic input to a speaker locally and it is almost instantaneous.

    However, I have been having trouble getting it to route between devices using Netjack.

    # ON SERVER
jackd -P70 -p16 -t2000 -dalsa -dhw:1 -p128 -n3 -r44100 -s 

# ON CLIENT
jackd -v -R -P70 -dnetone -i1 -o1 -I0 -O0  -r44100 -p128 -n3

# ON SERVER
jack_netsource -H < ip address of client >
jack_lsp # list available connection ports

>system:capture_1
>system:playback_1
>system:playback_2
>netjack:capture_1
>netjack:capture_2
>netjack:capture_3
>netjack:playback_1
>netjack:playback_2
>netjack:playback_3

jack_connect system:capture_1 system:playback_1 # this works
jack_connect system:capture_1 netjack:playback_1 # this doesn't work :(

    Most of the launch options I pulled from here: http://wiki.linuxaudio.org/wiki/raspberrypi#using_jack. I'll be honest, I don't really know what they do.
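
    A rough annotation of those flags, based on the jackd documentation (worth double-checking against the man page):

# Server side (ALSA backend):
#   -P70     run the JACK server at realtime priority 70
#   -p16     limit the server to 16 ports
#   -t2000   client timeout of 2000 ms
#   -dalsa   use the ALSA backend ...
#   -dhw:1   ... on ALSA device hw:1 (the second sound card)
#   -p128    128 frames per period
#   -n3      3 periods per buffer
#   -r44100  44100 Hz sample rate
#   -s       softmode (do not kick clients on ALSA xruns)
#
# Client side (netone backend):
#   -dnetone use the netone network audio backend
#   -i1 -o1  1 audio capture / 1 audio playback channel
#   -I0 -O0  no MIDI capture / playback channels
#   -r44100 -p128  sample rate and period size, matching the server
#   -n3      network latency in periods (netone's meaning of -n)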

    The client jackd output shows messages like

    Jack: data not valid
Jack: data not valid
Jack: JackSocketServerChannel::Execute : fPollTable i = 1 fd = 6
Jack: JackRequest::Notification
Jack: JackEngine::ClientNotify: no callback for notification = 3
Jack: JackEngine::ClientNotify: no callback for notification = 3
netxruns... duration: 139ms
Jack: JackSocketServerChannel::Execute : fPollTable i = 1 fd = 6
Jack: JackRequest::Notification
Jack: JackEngine::ClientNotify: no callback for notification = 3
Jack: JackEngine::ClientNotify: no callback for notification = 3

    And the server jack_netsource output looks like

    current latency 114
current latency 20
current latency 27
current latency 29
current latency 48
current latency 23
current latency 33
current latency 28
current latency 41
current latency 84
current latency 44

    and the server jackd output looks like

    JackAudioDriver::ProcessGraphAsyncMaster: Process error
JackAudioDriver::ProcessGraphAsyncMaster: Process error
JackAudioDriver::ProcessGraphAsyncMaster: Process error
JackAudioDriver::ProcessGraphAsyncMaster: Process error
JackEngine::XRun: client = netjack was not finished, state = Triggered
JackAudioDriver::ProcessGraphAsyncMaster: Process error
JackAudioDriver::ProcessGraphAsyncMaster: Process error
JackEngine::XRun: client = netjack was not finished, state = Triggered
JackEngine::XRun: client = netjack was not finished, state = Triggered

    I believe the -dnetone flag indicates the use of NetJack2. NetJack1, which I've tried with the -dnet flag, results in a single Not Connected message from jack_netsource and:

    Jack: CatchHost fd = 5 err = Resource temporarily unavailable
Jack: CatchHost fd = 5 err = Resource temporarily unavailable
Jack: CatchHost fd = 5 err = Resource temporarily unavailable
Jack: CatchHost fd = 5 err = Resource temporarily unavailable
Jack: CatchHost fd = 5 err = Resource temporarily unavailable
Jack: JackSocketServerChannel::Execute : fPollTable i = 1 fd = 6

    from the client jackd.
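
    As far as I understand the JACK documentation, the netone backend is the NetJack1-compatible driver that pairs with jack_netsource, while NetJack2 uses the net backend together with the netmanager internal client on the master. A NetJack2-style setup would look roughly like this (an untested sketch, assuming a stock jack2 install on both machines):

# ON SERVER (master): load the NetJack2 manager as an internal client
jack_load netmanager

# ON CLIENT (slave): start jackd with the NetJack2 backend; by default it
# discovers the master via multicast and shows up there as a new client
jackd -R -d net

# Once the slave appears on the master, its ports can be wired with
# jack_connect just like local ones.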

  • How do I know ffmpeg-php is installed?

    18 July 2014, by Rob Avery IV

    I just followed the instructions from this link on how to install ffmpeg-php on my dedicated server: http://www.ndchost.com/wiki/server-administration/install-ffmpeg

    At the bottom, it says to run the command php -i|grep ffmpeg, and if it outputs the following lines then it is installed:

    ffmpegffmpeg support (ffmpeg-php) => enabled
    ffmpeg-php version => 0.6.0
    ffmpeg.allow_persistent => 0 => 0

    When I run it, it gives me this:

    ffmpeg
    ffmpeg-php version => 0.6.0-svn
    ffmpeg-php built on => Jul 18 2014 08:46:12
    ffmpeg-php gd support  => enabled
    ffmpeg libavcodec version => Lavc52.108.0
    ffmpeg libavformat version => Lavf52.93.0
    ffmpeg swscaler version => SwS0.12.0
    ffmpeg.allow_persistent => 0 => 0
    ffmpeg.show_warnings => 0 => 0
    PWD => /usr/local/src/ffmpeg-php-0.6.0
    _SERVER["PWD"] => /usr/local/src/ffmpeg-php-0.6.0
    _ENV["PWD"] => /usr/local/src/ffmpeg-php-0.6.0

    I got 2 of the 3 expected lines, but one of them is not character-for-character the same.

    Is ffmpeg the same as ffmpegffmpeg support (ffmpeg-php) => enabled in this context?

    EDIT:
    Running the command ffmpeg -version gives me this result:

    FFmpeg version SVN-r26402, Copyright (c) 2000-2011 the FFmpeg developers
     built on Jul 18 2014 08:41:45 with gcc 4.4.7 20120313 (Red Hat 4.4.7-3)
     configuration: --enable-libmp3lame --disable-mmx --enable-shared
     libavutil     50.36. 0 / 50.36. 0
     libavcore      0.16. 1 /  0.16. 1
     libavcodec    52.108. 0 / 52.108. 0
     libavformat   52.93. 0 / 52.93. 0
     libavdevice   52. 2. 3 / 52. 2. 3
     libavfilter    1.74. 0 /  1.74. 0
     libswscale     0.12. 0 /  0.12. 0
    FFmpeg SVN-r26402
    libavutil     50.36. 0 / 50.36. 0
    libavcore      0.16. 1 /  0.16. 1
    libavcodec    52.108. 0 / 52.108. 0
    libavformat   52.93. 0 / 52.93. 0
    libavdevice   52. 2. 3 / 52. 2. 3
    libavfilter    1.74. 0 /  1.74. 0
    libswscale     0.12. 0 /  0.12. 0
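
    A way to cross-check, independent of the exact grep output formatting, is to ask PHP directly whether the extension is loaded (generic commands, not from the linked guide):

# Prints bool(true) if the ffmpeg-php extension is loaded for the CLI
php -r 'var_dump(extension_loaded("ffmpeg"));'

# Lists the module name if it is enabled
php -m | grep -i ffmpeg
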
  • FFMPEG or FFPLAY, catch FFT signal in real time as floats

    25 April 2021, by NVRM

    Looking to extract a real-time FFT snapshot of waveform data with ffplay, with a view to creating animations.

    
    This is exactly what I am looking to capture, but this demo uses JavaScript in a browser. (Source: own post)

    const audio = document.getElementById('music');
audio.load();
audio.play();

const ctx = new AudioContext();
const audioSrc = ctx.createMediaElementSource(audio);
const analyser = ctx.createAnalyser();

audioSrc.connect(analyser);
analyser.connect(ctx.destination);

analyser.fftSize = 256;
const bufferLength = analyser.frequencyBinCount;
const frequencyData = new Uint8Array(bufferLength);

setInterval(() => {
   analyser.getByteFrequencyData(frequencyData);
   console.log(frequencyData);
}, 1000);

    <audio id="music" src="http://strm112.1.fm/reggae_mobile_mp3" crossorigin="use-credentials" controls></audio>

    I tried many variations around the method posted on https://trac.ffmpeg.org/wiki/Waveform .

    The problem is that the output format for the FFT is PCM (Pulse Code Modulation), and it is not real time.

    In a generic way, is there a simple way to retrieve this data while the sound is playing? Something like:

    ffplay -fft file.mp3 > fft.json

    Using C, same idea: Apply FFT on pcm data and convert to a spectrogram

    FFMPEG waveform filter documentation
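
    Lacking a built-in "FFT to floats" output in ffplay, one workaround is to let ffmpeg decode to raw float PCM on stdout and compute the FFT in a separate process while the stream plays (a sketch; fft_consumer is a hypothetical program that would window the samples and run an FFT per block):

# Decode to mono 32-bit float PCM at 44100 Hz and pipe it to an FFT consumer
ffmpeg -i file.mp3 -f f32le -ac 1 -ar 44100 pipe:1 | ./fft_consumer

# Display only (no float export): ffmpeg's showfreqs filter draws a live
# frequency view of the decoded audio
ffplay -f lavfi "amovie=file.mp3,showfreqs=s=1280x480"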
