
Other articles (95)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to perform other manual (...)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011

    Multilang is an additional plugin that is not activated by default when MediaSPIP is initialized.
    After it is activated, a preconfiguration is automatically put in place by MediaSPIP init so that the new functionality is immediately operational. It is therefore not necessary to go through a configuration step for this.

  • MediaSPIP Core: Configuration

    9 November 2010

    By default, MediaSPIP Core provides three different configuration pages (these pages rely on the CFG configuration plugin to work): a page for the general configuration of the skeleton; a page for the configuration of the site's home page; a page for the configuration of sections.
    It also provides an additional page that only appears when certain plugins are activated, allowing their display and specific features to be controlled (...)

On other sites (6412)

  • Revision effd974b16: Limit arf interval for low fpf clips

    16 April 2015, by paulwilkins

    Changed Paths:
     Modify /vp9/encoder/vp9_firstpass.c
     Modify /vp9/encoder/vp9_ratectrl.c

    Limit arf interval for low fpf clips.

    This patch limits the maximum arf interval length to
    approximately half a second. In some low fps animations in
    particular the existing code was selecting an overly long interval
    which was hurting visual quality. For a sample problem test clip
    (360P animation, 15fps, 200Kbit/s) this change also improved
    metrics by >0.5 db.

    There may be some clips where this hurts metrics a little, but the
    worst case impact visually is likely to be less than having an
    interval that is much too long. On more normal material at 24
    fps or higher, the impact is likely to be nil/minimal.

    Change-Id: Id8b57413931a670c861213ea91d7cc596375a297

  • Displaying video on audio timeline [on hold]

    9 December 2016, by ark1974

    I need to add a 'clean cutter' feature, i.e. the ability to trim video graphically based on the audio signature shown on the audio timeline. But I don't know the general approach for displaying the audio (from a video, say a mov file) on the timeline.

    Do I have to parse the video and record the audio to a separate file before editing? I want to drop a video/audio file on the timeline and show it graphically almost in real time. Any input will be appreciated.
    (Question has been revised for clarity.)
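    As a possible starting point (a minimal sketch, assuming plain ffmpeg is available; input.mov, audio.wav and waveform.png are placeholder names, not from the question), ffmpeg can either demux the audio track into a separate file for analysis or render a static waveform image with its showwavespic filter:

    # Hypothetical sketch: pull the audio track out of the video so it can be
    # analysed or drawn separately on the timeline
    ffmpeg -i input.mov -vn -acodec pcm_s16le -ar 44100 audio.wav

    # Or render a waveform image directly with the showwavespic filter
    ffmpeg -i input.mov -filter_complex "showwavespic=s=1280x240" -frames:v 1 waveform.png

    For near-real-time display the decoding would normally happen inside the editing application itself rather than via an intermediate file, but the commands above show that no manual re-recording step is required.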

  • Using FFMPEG for Bidirectional Voice Communication with Symmetric RTP

    3 February 2024, by Batuhan Öksüz

    I am trying to set up voice communication between two peers. The first peer is my local machine, which is behind a symmetric NAT. The second peer is my server running on an AWS EC2 instance, which has a public IP address. I want to use FFMPEG to send the audio stream over RTP while at the same time listening on a known port to receive the audio stream the remote peer sends to my device. In order not to deal with NAT traversal issues, I want to be able to use the same IP address and port I use for sending on my local device for receiving. Is this plausible? I'm starting to think that this is not possible, and here is my rationale:

    • If my understanding is correct, UDP doesn't allow full-duplex communication; that is, one cannot use one IP:port pair for both sending and receiving data packets at the same time. Is that correct? (A quick check is sketched right after this list.)
    • If one wants to bind a socket that has already been bound, the function throws a bind error saying "Address already in use." I figured there is a UDP option to allow binding to the same port or address even if they are in use. These options are namely SO_REUSEADDR and SO_REUSEPORT. However, this page states that this is only possible if the port is in the TIME_WAIT state. So this also supports my suspicion.
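    As an illustrative aside (not part of the original question), one quick way to test the first bullet is with netcat, assuming an OpenBSD-style nc is available on both machines; the IP addresses below are the LAN addresses used later in this question:

    # Hypothetical check: a single UDP socket can send and receive on one local port.
    # On the Mac (192.168.1.64): bind local UDP port 16386 and talk to the Pi
    nc -u -p 16386 192.168.1.72 16386

    # On the Raspberry Pi (192.168.1.72): bind local UDP port 16386 and talk back
    nc -u -p 16386 192.168.1.64 16386

    # Text typed in either terminal appears in the other: the same IP:port pair
    # carries both directions, which is what symmetric RTP relies on.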
    On the other hand, there is symmetric RTP, whose RFC clearly states that a device can receive RTP streams on the same address and port that it simultaneously uses to transmit RTP streams. In the exact words of the RFC:

        A device supports symmetric RTP if it selects, communicates, and uses
        IP addresses and port numbers such that, when receiving a
        bidirectional RTP media stream on UDP port "A" and IP address "a", it
        also transmits RTP media for that stream from the same source UDP
        port "A" and IP address "a". That is, it uses the same UDP port to
        transmit and receive one RTP stream.

        A device that doesn't support symmetric RTP would transmit RTP from a
        different port, or from a different IP address, than the port and IP
        address used to receive RTP for that bidirectional media stream.

    So this is where I get confused. Does symmetric RTP somehow work around the limitations of UDP? How am I getting this wrong?

    Now going back to FFMPEG and the use of symmetric RTP, I see there is an rtp option we can use to set it up, the so-called localrtpport=n. I can find almost no explanation of what it does and how it's useful, though. Can anyone help me with that? As far as I can tell, this option tells FFMPEG to use port "n" as the outbound port when transmitting an RTP stream. So if the receiver transmitted its stream to this port, then the symmetric NAT requirement would no longer be a problem. Or so I thought...
    To draw you a better picture, here are my FFMPEG commands (I'm running everything on my local network in these commands):

    # My Mac with en0 IP of 192.168.1.64
ffmpeg \
-hide_banner \
-re \
-fflags +genpts \
-f lavfi -i aevalsrc="sin(400*2*PI*t)" \
-ar 8000 \
-f mulaw \
-f rtp -reuse 1 "rtp://192.168.1.72:9193?reuse=1&localrtpport=16386&localrtcpport=16387" \
-protocol_whitelist udp,rtp \
-i rtp://192.168.1.64:16386 \
audio_signal_with_symmetric_rtp.mp3

    Here I am simply generating a fixed sound signal and outputting it in mulaw format over RTP. I am using the 'localrtpport' option to set my outbound port, and I am expecting to receive the remote peer's stream on the same port. This command starts running and waits for the incoming stream. As soon as I start transmitting the stream from my Raspberry Pi, which is on the same wireless network, I get the dreaded "Address already in use." error and the process terminates. Here is the command I use on the Raspberry:

    # My Raspberry Pi with wlan0 IP of 192.168.1.72
ffmpeg \
-hide_banner \
-re \
-f lavfi -i aevalsrc="sin(400*2*PI*t)" \
-ar 8000 \
-f mulaw \
-f rtp "rtp://192.168.1.64:16386?reuse=1&localrtpport=16386"

    TLDR

    The short form of the question comes down to this: how can I make use of symmetric RTP with FFMPEG, i.e. receive a stream on the same port from which I am actively transmitting another stream? Is what I'm asking impossible? Should I go for an alternative route and try to set up a TURN server for my system? Any help would be appreciated.