
Other articles (59)

  • Sites built with MediaSPIP

    2 May 2011

    This page presents some of the sites running MediaSPIP.
    You can of course add yours using the form at the bottom of the page.

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • From upload to the final video [standalone version]

    31 January 2010

    The path an audio or video document follows through SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information about the source video
    First, a SPIP article must be created and the "source" video document attached to it.
    When this document is attached to the article, two actions are executed in addition to the normal behaviour: retrieval of the technical information about the file's audio and video streams; generation of a thumbnail: extraction of a (...)

On other sites (6105)

  • nutenc/write_index : warn if 2 consecutive keyframes have the same PTS and discard...

    22 December 2013, by Michael Niedermayer
    nutenc/write_index : warn if 2 consecutive keyframes have the same PTS and discard the 2nd
    

    This fixes an assertion failure and a regression, and restores the previous behaviour.
    Fixes Ticket3197

    An alternative would be to fail hard in this case and refuse to mux such data.

    Signed-off-by: Michael Niedermayer <michaelni@gmx.at>

    • [DH] libavformat/nutenc.c
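    The behaviour described above can be sketched as a small filter over the index entries: keep the first keyframe at a given PTS, warn about and discard any consecutive keyframe that repeats it. A hypothetical Go illustration of that rule, not the actual C code in libavformat/nutenc.c:

```go
package main

import "fmt"

// indexEntry loosely mirrors one keyframe entry in a NUT index:
// a presentation timestamp and a byte position in the file.
type indexEntry struct {
	pts int64
	pos int64
}

// dedupKeyframes keeps the first keyframe at each PTS and discards any
// consecutive keyframe with the same PTS, warning once per discarded
// entry, which is the rule the commit message describes for write_index.
func dedupKeyframes(entries []indexEntry) []indexEntry {
	var kept []indexEntry
	for i, e := range entries {
		if len(kept) > 0 && e.pts == kept[len(kept)-1].pts {
			fmt.Printf("warning: keyframe %d repeats pts %d, discarding\n", i, e.pts)
			continue
		}
		kept = append(kept, e)
	}
	return kept
}

func main() {
	in := []indexEntry{{pts: 0, pos: 100}, {pts: 0, pos: 220}, {pts: 40, pos: 400}}
	// The duplicate at pts 0 is dropped, leaving two entries.
	fmt.Println(len(dedupKeyframes(in)))
}
```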
  • How to implement live video streaming with ffmpeg without using WebRTC ?

    8 August 2024, by Artem

    Following up on my previous question, I'd like to inquire about alternative methods for live video streaming using ffmpeg (WebRTC is not an option due to certain constraints I prefer not to discuss here).

    Context:

    I have a Go application where a goroutine launches ffmpeg to process a video stream, which is then delivered to the main goroutine via a chan []byte. I tried using WebSocket, but encountered issues as described in the previous question. HLS also didn't work well due to significant latency and artifacts like green squares on the video.

    Based on a comment on the previous question, I attempted to stream the video via a simple GET request. Here's the Go handler I implemented:

    func stream(helperApp agent.Helper) func(rw http.ResponseWriter, rr *http.Request) {
       a := atomic.Bool{}
       return func(rw http.ResponseWriter, rr *http.Request) {
          if !a.CAS(false, true) {
             http.Error(rw, "already running", http.StatusInternalServerError)
             return
          }

          rw.Header().Set("Access-Control-Allow-Origin", "*")
          rw.Header().Set("Content-Type", "video/mp2t")

          out := make(chan []byte)

          // create StreamParam
          go ScreenCaptureForLiveStream(StreamParam, out) // ffmpeg process starts inside

          r, w := io.Pipe()
          go func() {
             for data := range out {
                w.Write(data)
                fmt.Println(len(data))
             }
          }()
          io.Copy(rw, r)
       }
    }

    On the client side (HTML):

    <video muted="muted" src="http://localhost:8080/stream" controls="controls"></video>

    In the browser console, I can see data being received, but the video doesn't play.

    FFmpeg is executed with these parameters:

    -loglevel error -f avfoundation -framerate 5 -capture_cursor 1 -capture_mouse_clicks 1 -i 1 -c:v libx264 -pix_fmt yuv420p -vf pad='ceil(iw/2)*2:ceil(ih/2)*2' -threads 0 -preset veryfast -bf 2 -f mpegts pipe:1

    For validation, I ran:

    ffmpeg -i http://localhost:8080/stream -c copy out.mp4

    The video was successfully saved and plays.

    Question:
    What alternative methods exist to implement live video streaming with ffmpeg, aside from WebRTC? Why does the current approach of streaming video via an HTTP GET request not work in the browser, and how can this be resolved?

  • libdash mpd parser integration with custom player

    19 November 2016, by Raj

    I have written a multimedia player using ffmpeg and the Qt framework, and it currently supports local-file playback and progressive-download playback. Now I want to add support for MPEG-DASH VOD stream playback. I thought of using the open-source libdash library, but after a lot of googling I couldn't find any API or developer documentation for it.

    Another issue is that libdash has an integrated network module, but I only want to integrate the MPD parser, since I have my own network module for downloading. I couldn't find any Doxygen or usage documentation for libdash.

    I need help with the following:

    1. Can anyone point me to any developer documentation available for
       libdash?
    2. Is there an alternative library to libdash?