Advanced search

Media (0)

Word: - Tags -/media

No media matching your criteria is available on the site.

Other articles (38)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MediaSPIP is at version 0.2 or higher. If necessary, contact the administrator of your MediaSPIP to find out.

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer flash player is used.
    The HTML5 player used was created specifically for MediaSPIP: it is fully customizable graphically to match a chosen theme.
    These technologies make it possible to deliver video and sound both on conventional computers (...)

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)
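
    In generic terms, the pattern described in these two excerpts (an HTML5 player first, Flash only as a fallback) looks like the sketch below. This is illustrative markup, not MediaSPIP's actual player code; file names and player parameters are placeholders.

    <!-- Modern browsers pick a <source> they can play -->
    <video controls width="640" height="360">
      <source src="clip.webm" type="video/webm" />
      <source src="clip.mp4" type="video/mp4" />
      <!-- Browsers without <video> support render the nested content instead -->
      <object type="application/x-shockwave-flash" data="flowplayer.swf" width="640" height="360">
        <param name="movie" value="flowplayer.swf" />
      </object>
    </video>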

On other sites (4667)

  • Problem with ffmpeg rtp stream to janus webrtc

    13 June 2020, by XPModder

    I am trying to use ffmpeg and janus-gateway to stream video on the local network. I am piping the H.264 video directly into ffmpeg, and from there it is sent to Janus as an RTP stream. Janus then does the rest.

    



    The problem is that when I try to open the stream using the streamingtest HTML page included with Janus, I can select the stream, but I never get to see anything. On the console where I started Janus, it throws multiple errors starting with: "SDP missing mandatory information"

    



    After starting ffmpeg, it prints the SDP to the console:

    



    v=0
    o=- 0 0 IN IP4 127.0.0.1
    s=No Name
    c=IN IP4 127.0.0.1
    t=0 0
    a=tool:libavformat 58.20.100
    m=video 8004 RTP/AVP 96
    a=rtpmap:96 H264/90000
    a=fmtp:96 packetization-mode=1; sprop-parameter-sets=J2QAKKwrQDwBE/LAPEiagA==,KO4BNywA; profile-level-id=640028


    



    The command I use to start ffmpeg is the following:

    



    ffmpeg -loglevel quiet -y -i - -c:v copy -preset veryfast -f rtp rtp://127.0.0.1:8004
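
    As an aside, -preset is an encoder option and has no effect together with -c:v copy. Also, ffmpeg's -sdp_file option can write the generated SDP to a file in addition to printing it, which is convenient when another tool needs it; a sketch:

    ffmpeg -y -i - -c:v copy -f rtp rtp://127.0.0.1:8004 -sdp_file stream.sdp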


    



    I assume that I am missing something in the ffmpeg command and therefore there is some kind of problem with the SDP.
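
    For context, Janus's streaming plugin does not consume the SDP that ffmpeg prints; it learns the stream parameters from its own mountpoint configuration (janus.plugin.streaming.jcfg in recent versions). A minimal mountpoint matching the port and payload type above might look like the sketch below; the mountpoint name, id, and description are illustrative:

    h264-test: {
            type = "rtp"
            id = 1
            description = "ffmpeg H.264 test stream"
            audio = false
            video = true
            videoport = 8004
            videopt = 96
            videortpmap = "H264/90000"
            videofmtp = "profile-level-id=640028;packetization-mode=1"
    }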

    



    Edit: I looked a little further into this and read the Janus code at the lines that throw the error, then traced that back to the origin of the error.

    



    Apparently the SDP is supposed to contain some authorization in the form of this:

    



    a=... ice-ufrag=?; ice-pwd=?


    



    So how do I find out what this username and password are, and how do I get them into the SDP from ffmpeg?

    Or how do I disable this entirely?

    


  • Decoding H.264 stream using FFmpeg on Java

    25 May 2020, by maru2213

    I'm trying to decode an H.264 stream, which is sent over a socket from an Android application to a computer. I also want to show the decoded stream using JavaFX. I searched for many hours, and decided to use JavaCV / FFmpeg. However, I get an error from FFmpeg. (I was inspired by this code)

    



    Questions:

      • Why does FFmpeg produce an error?
      • Is this a correct way to convert an AVFrame to javafx.scene.image.Image?



    I'm using:

      • javacv-platform 1.4.4
      • ffmpeg-platform 4.1-1.4.4



    Code:

    



    This is part of the imports and class fields, plus the method which runs once at startup. (In the actual code, the body of initialize() is wrapped in a try/catch.)

    



    import javafx.scene.image.Image;

    private avcodec.AVCodec avCodec;
    private avcodec.AVCodecContext avCodecContext;
    private avutil.AVDictionary avDictionary;
    private avutil.AVFrame avFrame;

    public void initialize() {
        avCodec = avcodec_find_decoder(AV_CODEC_ID_H264);
        if (avCodec == null) {
            throw new RuntimeException("Can't find decoder");
        }
        avCodecContext = avcodec_alloc_context3(avCodec);
        if (avCodecContext == null) {
            throw new RuntimeException("Can't allocate decoder context");
        }
        int result = avcodec_open2(avCodecContext, avCodec, (AVDictionary) null);
        if (result < 0) {
            throw new RuntimeException("Can't open decoder");
        }
        avFrame = av_frame_alloc();
        if (avFrame == null) {
            throw new RuntimeException("Can't allocate frame");
        }
    }


    



    And this is a method which is called every time I receive a packet from Android. byte[] data is the packet data, starting with 0x00, 0x00, 0x00, 0x01.

    



    The place where I get the error is number_of_written_bytes: it always ends up < 0.

    private void decode(byte[] data) {
        AVPacket avPacket = new AVPacket();
        av_init_packet(avPacket);
        avPacket.pts(AV_NOPTS_VALUE);
        avPacket.dts(AV_NOPTS_VALUE);
        BytePointer bytePointer = new BytePointer(data);
        bytePointer.capacity(data.length);
        avPacket.data(bytePointer);
        avPacket.size(data.length);
        avPacket.pos(-1);

        avcodec_send_packet(avCodecContext, avPacket);
        int result = avcodec_receive_frame(avCodecContext, avFrame);
        if (result >= 0) {
            int bufferOutputSize = av_image_get_buffer_size(avFrame.format(), avFrame.width(), avFrame.height(), 16);
            Pointer pointer = av_malloc(bufferOutputSize);
            BytePointer outputPointer = new BytePointer(pointer);
            int number_of_written_bytes = av_image_copy_to_buffer(outputPointer, bufferOutputSize, avFrame.data(), avFrame.linesize(), avFrame.chroma_location(), avFrame.width(), avFrame.height(), 1);
            if (number_of_written_bytes < 0) {
                // The process always comes here.
                throw new RuntimeException("Can't copy image to buffer");
            }

            System.out.println("decode success");
            Image image = new Image(new ByteArrayInputStream(outputPointer.asBuffer().array()));
        } else {
            System.out.println("decode failed");
        }
    }
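
    One detail worth checking here: in FFmpeg's API, av_image_copy_to_buffer() takes the frame's pixel format as its fifth argument, while the call above passes avFrame.chroma_location(). A minimal sketch of the same call using the pixel format, in case that is the mismatch:

    int written = av_image_copy_to_buffer(
            outputPointer, bufferOutputSize,
            avFrame.data(), avFrame.linesize(),
            avFrame.format(), // the frame's pixel format (e.g. AV_PIX_FMT_YUV420P), not its chroma location
            avFrame.width(), avFrame.height(), 1);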

  • (FFmpeg) How to play live audio in the browser from received UDP packets using FFmpeg?

    26 October 2022, by Yousef Alaqra

    I have a .NET Core console application which acts as a UDP server and a UDP client:

      • a UDP client, receiving audio packets;
      • a UDP server, sending on each packet it receives.

    Here's sample code from the console app:

    static UdpClient udpListener = new UdpClient();
    static IPEndPoint endPoint = new IPEndPoint(IPAddress.Parse("192.168.1.230"), 6980);
    static IAudioSender audioSender = new UdpAudioSender(new IPEndPoint(IPAddress.Parse("192.168.1.230"), 65535));

    static void Main(string[] args)
    {
        udpListener.Client.SetSocketOption(SocketOptionLevel.Socket, SocketOptionName.ReuseAddress, true);
        udpListener.Client.Bind(endPoint);

        try
        {
            udpListener.BeginReceive(new AsyncCallback(recv), null);
        }
        catch (Exception e)
        {
            throw e;
        }

        Console.WriteLine("Press enter to dispose the running service");
        Console.ReadLine();
    }

    private async static void recv(IAsyncResult res)
    {
        byte[] received = udpListener.EndReceive(res, ref endPoint);
        OnAudioCaptured(received);
        udpListener.BeginReceive(new AsyncCallback(recv), null);
    }

    On the other side, I have a Node.js API application, which is supposed to execute an FFmpeg command as a child process and do the following:

      • receive the audio packets as input from the console app's UDP server;
      • convert the received bytes into WebM;
      • pipe the result out into the response.

    Finally, on the client side, I should have an audio element whose source value equals http://localhost:3000

    For now, I can only execute this FFmpeg command:

    ffmpeg -f s16le -ar 48000 -ac 2 -i 'udp://192.168.1.230:65535' output.wav

    which does the following:

      • receives UDP packets as input;
      • converts the received bytes into the output.wav audio file.
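
    The same input flags can produce WebM on standard output instead of a .wav file, which is the form a piped child process needs. A sketch (the libopus codec choice is an assumption):

    ffmpeg -f s16le -ar 48000 -ac 2 -i 'udp://192.168.1.230:65535' -c:a libopus -f webm pipe:1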

    How would I execute a child process in the Node.js server which receives the UDP packets and pipes the result out into the response as WebM?
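
    A minimal sketch of one way to wire this up, assuming ffmpeg is on the PATH and reusing the input flags from the working command above (the Opus/WebM flags are carried over from the previous sketch; it also assumes a single listener at a time, since each request spawns its own UDP-bound ffmpeg):

    const http = require('http');
    const { spawn } = require('child_process');

    http.createServer((req, res) => {
        res.writeHead(200, { 'Content-Type': 'audio/webm' });

        // Same input as the working command line, but encode to Opus/WebM
        // and write to stdout ('pipe:1') instead of output.wav.
        const ffmpeg = spawn('ffmpeg', [
            '-f', 's16le', '-ar', '48000', '-ac', '2',
            '-i', 'udp://192.168.1.230:65535',
            '-c:a', 'libopus', '-f', 'webm', 'pipe:1',
        ]);

        ffmpeg.stdout.pipe(res); // WebM bytes flow straight into the HTTP response
        req.on('close', () => ffmpeg.kill('SIGKILL')); // stop encoding when the client disconnects
    }).listen(3000);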
