
Other articles (77)

  • Media quality after processing

    21 June 2013, by

    Correctly configuring the software that processes media is important for striking a balance between the parties involved (the host's bandwidth, media quality for the editor and the visitor, accessibility for the visitor). How should you set the quality of your media?
    The higher the media quality, the more bandwidth is used, and visitors with a low-speed internet connection will have to wait longer. Conversely, the lower the quality, the more degraded the media becomes, or even (...)
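
    As an illustration of this trade-off (outside MediaSPIP's own settings screens), the balance can be sketched directly with ffmpeg, which MediaSPIP uses for transcoding; the file names below are placeholders, and a higher CRF value means lower quality and lower bandwidth:

    ffmpeg -i source.mov -c:v libx264 -crf 23 -c:a aac -b:a 128k web_quality.mp4
    ffmpeg -i source.mov -c:v libx264 -crf 35 -c:a aac -b:a 96k low_bandwidth.mp4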

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

  • Websites built with MediaSPIP

    2 May 2011, by

    This page presents some of the websites running MediaSPIP.
    You can of course add your own using the form at the bottom of the page.

On other sites (8326)

  • How to stream live videos with no latency (ffplay, mplayer) and what kind of wrapper could be used with ffplay?

    10 June 2015, by user573014

    I have been testing playback of multiple live streams with different players because I wanted to get the lowest latency value. I tried the gstreamer player (gst-launch-0.01), mplayer, totem and the ffmpeg player (ffplay), using different configuration values to get the lowest latency for each of them, for example:

    ffplay -fflags nobuffer
    mplayer -benchmark

    The protocol I am streaming over is UDP, and I am getting better values with ffplay than with mplayer or gst-launch. To be honest, I don't know what configuration gstreamer needs to achieve lower latency.
    Now, I need two things:

    1. I would like to know if someone has a better suggestion for streaming a live stream with latency below 100 ms. I am currently getting more than 100 ms, which is not workable for me.

    2. Since I am currently using ffplay, because it is the best so far, I would like to build a simple GUI with a play and a record button and 3 screens to stream from different video servers; I just don't know what kind of wrapper (which should be really fast) to use.
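
    On the latency side (point 1), a minimal low-latency playback sketch, assuming an MPEG-TS stream received over UDP on a hypothetical local port 1234; every flag below is a standard ffplay/FFmpeg option that cuts input buffering and probing:

    ffplay -fflags nobuffer -flags low_delay -framedrop -probesize 32 -analyzeduration 0 udp://127.0.0.1:1234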

  • Using ffmpeg to stream live video from a raspberry pi to a web server for distribution

    7 March 2019, by CNorlander

    I am trying to build a device that will encode H.264 video on a Raspberry Pi and stream it out to a separate web server in the cloud. The main issue I am having is that most implementations I find either run the web server directly on the Pi or have the embedded player pull video directly from the device.

    I would like it to be pretty much plug and play no matter what network I am on, i.e. no port forwarding of any sort: all I need to do is connect the device to the network and the stream becomes visible on a webpage.

    One possible solution is to simply encode frames as base64 JPEGs and send them to an endpoint on the web server; however, this is a huge waste of bandwidth and won't allow the frame rate that H.264 would.

    Any ideas on possible technologies that could be used to do this?

    I feel like it can be done somehow with WebSockets or ZeroMQ and ffmpeg, but I am not sure.
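
    One commonly used shape for this, sketched under stated assumptions rather than as a definitive setup: the Pi only opens an outbound connection (so no port forwarding is needed), pushing an RTMP feed that the cloud server repackages (for example as HLS) for the web page. The ingest URL below is a placeholder, /dev/video0 is assumed to be the camera, and h264_omx assumes an FFmpeg build with the Pi's hardware encoder enabled:

    ffmpeg -f v4l2 -framerate 30 -video_size 1280x720 -i /dev/video0 -c:v h264_omx -b:v 1500k -f flv rtmp://example.com/live/stream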

  • How to display a live video stream from ffmpeg in C#/WPF?

    9 April 2019, by Kevin McDaniel

    I am trying to get the output of FFmpeg.exe to display live in my WPF media player. The inputs to FFmpeg are an IP camera and an MP4 video file.

    The WPF media player needs a URI as its input, so I have tried hosting a UDP server for FFmpeg to send to (I can log the incoming data on the console). Then I try to open the stream with the media player, but without success.

    My UDP listener:

    // Listens on UDP port 5000 and relays every datagram it receives
    // to the player endpoint at 192.168.1.110:5001.
    UdpClient _udpServer = new UdpClient(5000);
    UdpClient _udpClient = new UdpClient(5001);
    private void UdpListen() {
       while (true)
       {
          // Block until a datagram arrives on port 5000.
          IPEndPoint RemoteIpEndPoint = new IPEndPoint(IPAddress.Any, 0);
          Byte[] receiveBytes = _udpServer.Receive(ref RemoteIpEndPoint);
          // Do not echo back packets that already came from the player endpoint.
          if (RemoteIpEndPoint.ToString() != "192.168.1.110:5001")
          {
             IPEndPoint ep = new IPEndPoint(IPAddress.Parse("192.168.1.110"), 5001);
             _udpClient.Send(receiveBytes, receiveBytes.Length, ep);
          }
          Console.WriteLine("receive data from " + RemoteIpEndPoint.ToString() + ": " + receiveBytes.Length);
       }
    }

    The ffmpeg output: ... -f mpegts udp://192.168.1.110:5000

    I pass this into the media player: udp://192.168.1.110:5001

    I am expecting to see the video displayed while ffmpeg is processing the output, but I just see ffmpeg doing its processing and the UDP listener writing to the console.
    Let me know where I am going wrong; thanks in advance.
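
    A quick way to narrow this down, assuming nothing beyond the commands already shown: with the C# relay stopped, point ffplay directly at the ffmpeg output URL. If ffplay displays the video, the encoded stream itself is fine and the limitation is on the player side; WPF's built-in MediaElement goes through the Windows Media stack and generally cannot open a raw udp:// MPEG-TS URI, so a different player component or an intermediate restream is usually needed.

    ffplay udp://192.168.1.110:5000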