Advanced search

Media (0)


No media matching your criteria is available on this site.

Other articles (112)

  • Videos

    21 April 2011

    As with "audio" documents, MediaSPIP displays videos wherever possible using the HTML5 <video> tag.
    One drawback of this tag is that it is not recognized correctly by some browsers (Internet Explorer, to name no names) and that each browser natively handles only certain video formats.
    Its main advantage, on the other hand, is native video support in the browser, which avoids the use of Flash and (...)

  • MediaSPIP version 0.1 Beta

    16 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the MediaSPIP sources, as a standalone version.
    For a working installation, all of the software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also need to make further modifications (...)

  • Encoding and processing into web-friendly formats

    13 April 2011

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded in Ogv and WebM (supported by HTML5) and MP4 (supported by Flash).
    Audio files are encoded in Ogg (supported by HTML5) and MP3 (supported by Flash).
    Where possible, text is analyzed to retrieve the data needed for search-engine indexing, and then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)
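
    As an illustration of that kind of conversion (a sketch only; the file names and encoder settings are assumptions, not MediaSPIP's actual commands), ffmpeg can produce the three web video formats from a single upload:

    # Illustrative only: encode one upload into the formats listed above.
    ffmpeg -i upload.avi -c:v libtheora -q:v 6 -c:a libvorbis out.ogv
    ffmpeg -i upload.avi -c:v libvpx -b:v 1M -c:a libvorbis out.webm
    ffmpeg -i upload.avi -c:v libx264 -crf 23 -c:a aac out.mp4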

On other sites (11463)

  • avcodec/get_bits: add cached bitstream reader

    7 July 2017, by Paul B Mahol

    Signed-off-by: Paul B Mahol <onemda@gmail.com>

    • [DH] libavcodec/get_bits.h
    • [DH] libavcodec/golomb.h
  • Live AAC and H264 data into live stream

    10 May 2024, by tzuleger

    I have a remote camera that captures H264-encoded video and AAC-encoded audio, places the data into a custom ring buffer, and sends it to a Node.js socket server, where each packet is detected as audio or video and handled accordingly (a sketch of that detection step appears below). That data should become a live stream; the protocol doesn't matter, but the delay has to be around 4 seconds, and it must be playable on iOS and Android devices.

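    Assuming the camera emits H264 as Annex B NAL units and AAC as ADTS frames (an assumption on my part; the question doesn't say how packets are framed), the audio/video detection can key off each packet's first bytes. A minimal sketch in C:

    #include <stdint.h>
    #include <stddef.h>

    /* Sketch only: assumes H264 arrives as Annex B NAL units
       (00 00 00 01 or 00 00 01 start codes) and AAC as ADTS frames
       (12-bit syncword 0xFFF). Real packet framing may differ. */
    static int is_h264_packet (const uint8_t *p, size_t len)
    {
      if (len >= 4 && p[0] == 0 && p[1] == 0 && p[2] == 0 && p[3] == 1)
        return 1;
      return len >= 3 && p[0] == 0 && p[1] == 0 && p[2] == 1;
    }

    static int is_adts_aac_packet (const uint8_t *p, size_t len)
    {
      /* An ADTS header begins with twelve 1-bits: 0xFF, then 0xFn. */
      return len >= 2 && p[0] == 0xFF && (p[1] & 0xF0) == 0xF0;
    }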

    After reading hundreds of pages of documentation, questions, and solutions on the internet, I can't seem to find anything about combining two separate streams of AAC and H264 data into one live stream.


    Despite attempting many different ways of achieving this goal, even having a working implementation of HLS, I want to revisit ALL options of live streaming, and I am hoping someone out there can give me advice or guidance to specific documentation on how to achieve this goal.


    To be specific, this is our goal:

    • Stream AAC and H264 data from a remote cellular camera to a server, which will do some work on that data to live stream to one user (possibly more users in the future) on a mobile iOS or Android device.

    • The delay of the live stream should be a maximum of 4 seconds; if the user has bad signal, then a longer delay is okay, as we obviously cannot do anything about that.

    • We should not have to re-encode our data. We've explored WebRTC, but that requires Opus audio packets and would therefore force us to re-encode, which would be expensive for our server to run.

    Any and all help, ranging from revisiting an old approach we took to exploring new ones, is appreciated.


    I can provide code snippets as well for our current implementation of LLHLS if it helps, but I figured this post is already long enough.


    I've tried FFmpeg with named pipes. I expected it to just work, but FFmpeg kept blocking on the first named pipe input (see the sketch below). I thought of just writing the data out to two files and then running FFmpeg on those, but the data is continuous, and I don't know enough about FFmpeg to turn that kind of input into one live stream.

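    For what it's worth, the blocking is inherent to FIFOs rather than to FFmpeg: open() on a named pipe blocks until both a reader and a writer are attached, and ffmpeg opens its inputs one after the other, so both pipes need active writers before it starts reading at all. A hedged sketch, assuming raw Annex B H264 and ADTS AAC (pipe names, frame rate, and HLS tuning are illustrative, not a tested recipe):

    mkfifo video.h264 audio.aac

    # Both fifos need writers attached, otherwise ffmpeg blocks while
    # opening the first input. The writer process here is hypothetical.
    node push-camera-data.js &

    # Remux without re-encoding (-c copy) into a rolling HLS playlist.
    # Raw H264 carries no timestamps, so the input frame rate is a guess.
    ffmpeg -framerate 30 -f h264 -i video.h264 \
           -f aac -i audio.aac \
           -c copy \
           -f hls -hls_time 2 -hls_list_size 6 -hls_flags delete_segments \
           stream.m3u8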

    I've tried implementing our own RTSP server on the camera using GStreamer (our camera had its RTSP server stripped out; wasn't my call), but the camera's flash storage cannot handle having GStreamer on it, so that wasn't an option.


    My latest attempt was using a derivation of hls-parser to create an HLS manifest and mux.js to create MP4 containers for .m4s fragmented-mp4 files and do an HLS live stream. This was my most successful attempt: we had a live stream going, but the delay was up to 16 seconds, as one would expect with HLS live streaming (players typically buffer about three segments before starting, so latency lands at roughly three times the target duration). We could drop the target duration down to 2 seconds and get about a 6-8 second delay, but this could be unreliable, as these cameras can have weak signal, making it relatively expensive to send so many IDR frames over such low bandwidth.


    With the delay being the only factor left, I attempted to upgrade the implementation to support Apple's Low-Latency HLS. It seems to work, in that the right partial segments are being requested and everything that makes it LLHLS is working as intended, but the delay isn't going down when played in iOS' native AVPlayer; as a matter of fact, it looks like it has worsened.

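    One LLHLS knob worth checking, offered as a guess since the playlist isn't shown: AVPlayer positions playback PART-HOLD-BACK seconds behind the live edge, and the spec ties that value to the part target duration (three part durations is the usual recommendation), so an over-generous PART-HOLD-BACK keeps the latency high no matter how small the parts are. A sketch of the relevant manifest lines, with illustrative names and durations:

    #EXTM3U
    #EXT-X-VERSION:9
    #EXT-X-TARGETDURATION:2
    #EXT-X-PART-INF:PART-TARGET=0.5
    #EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=1.5
    #EXT-X-MEDIA-SEQUENCE:100
    #EXTINF:2.0,
    seg100.m4s
    #EXT-X-PART:DURATION=0.5,URI="seg101.part0.m4s"
    #EXT-X-PART:DURATION=0.5,URI="seg101.part1.m4s"
    #EXT-X-PRELOAD-HINT:TYPE=PART,URI="seg101.part2.m4s"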

    I would also like to add a disclaimer: my knowledge of media streaming is fairly limited. I've learned most of what I speak of in this post over the past 3 months by reading RFCs, documentation, and StackOverflow/Reddit questions and answers. If anything appears confusing, it might just be my lack of understanding.


  • GStreamer basic tutorial gives blank screen

    11 November 2013, by user763410

    I have the following GStreamer basic tutorial. When I compile and run the program, I only get a blank screen. The same happens with mplayer, but there I can supply the -vo x11 option and it works well (for mplayer). So I am guessing the issue with GStreamer is the same. My question is: "What should I do to tell the program that my output driver is x11?" How do I get rid of the blank screen?

    Thanks

    #include <gst/gst.h>

    int main(int argc, char *argv[]) {
     GstElement *pipeline;
     GstBus *bus;
     GstMessage *msg;

     /* Initialize GStreamer */
     gst_init (&argc, &argv);

     /* Build the pipeline */
     /*
       pipeline = gst_parse_launch ("playbin2 uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
     */
       pipeline = gst_parse_launch ("playbin2 uri=file:///home/mylogin/gstreamerstuff/sintel_trailer-480p.webm", NULL);

     /* Start playing */
     gst_element_set_state (pipeline, GST_STATE_PLAYING);

     /* Wait until error or EOS */
     bus = gst_element_get_bus (pipeline);
     msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

     /* Free resources */
     if (msg != NULL)
       gst_message_unref (msg);
     gst_object_unref (bus);
     gst_element_set_state (pipeline, GST_STATE_NULL);
     gst_object_unref (pipeline);
     return 0;
    }
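
    Since the question is how to tell playbin2 to use X11 output: playbin2 exposes a "video-sink" property, so one option (a sketch, not a verified fix for this exact setup) is to hand it an explicit X11 sink before setting the pipeline to PLAYING:

     /* Force an X11 video sink instead of relying on autodetection
        (GStreamer 0.10-era playbin2, matching the code above). */
     GstElement *videosink = gst_element_factory_make ("ximagesink", "videosink");
     if (videosink != NULL)
       g_object_set (G_OBJECT (pipeline), "video-sink", videosink, NULL);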