
Other articles (71)

  • Permissions overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MédiaSpip installation is at version 0.2 or higher. If needed, contact your MédiaSpip administrator to find out.

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (10266)

  • Best way to stream live video under webRTC development [on hold]

    5 April 2016, by Ankit Dhanna

    I have implemented WebRTC live video chat with multiple presenters (using Licode).
    However, I want to ask how I can broadcast this video stream so that multiple users who are not on WebRTC can be served.

    Where/which server should I broadcast this stream to in order to get the least lag time?

  • Is it possible to create a video from a live stream of bitmaps?

    30 March 2015, by user3088260

    I have an application that gets a single frame from the webcam roughly every 200 ms as a bitmap. I have used AForge.Video.FFMPEG.VideoFileWriter to create a video from a few of them, but the problem is that the stream must be closed for the bitmaps to be written to the file, and I want to keep adding bitmaps to the same file for days, like a CCTV camera. Is it possible to transcode live video from bitmaps using AForge?
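AForge aside, a common workaround for this kind of open-ended recording is to pipe each bitmap's raw bytes into a long-running ffmpeg process, which keeps encoding and appending until the process is closed. A minimal sketch of the command line such a pipe would use — the frame size, rate, and output path here are hypothetical placeholders:

```python
def ffmpeg_pipe_command(width: int, height: int, fps: int, out_path: str):
    """Build an ffmpeg command that reads raw BGR frames from stdin and
    keeps appending encoded video to a file for as long as the process runs."""
    return [
        "ffmpeg",
        "-f", "rawvideo",        # frames arrive as raw pixel data, no container
        "-pix_fmt", "bgr24",     # typical Windows bitmap byte order
        "-s", f"{width}x{height}",
        "-r", str(fps),
        "-i", "-",               # read frames from stdin
        "-c:v", "libx264",
        out_path,
    ]
```

Each incoming bitmap would then be written to the subprocess's stdin as raw BGR bytes; the output file keeps growing until stdin is closed.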

  • Independent MP4 fragments for live stream

    20 April 2020, by J Doe

    My goal is to stream a live-generated, infinite H.264 feed (a webcam) to the browser.

    Specifically, I want to stream a Raspberry Pi camera to the browser.

    (Disclaimer: I'm streaming an H.264 file right now, but it is to be replaced with the Pi webcam.)

    So, I have an H.264 stream which I pipe to ffmpeg to mux into MP4 fragments (movflags is set to frag_keyframe+empty_moov+default_base_moof), and then I send the fragments over a WebSocket (on the server I extract the fragments and send each one as a message containing moof+mdat; I couldn't figure out how to do it with progressive HTTP, because the stream is live-generated). They are then played in the browser using the MediaSource API.
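The server-side fragment extraction described here can be sketched by walking the top-level MP4 box structure: every box starts with a 4-byte big-endian size and a 4-byte type, and each moof pairs with the mdat that follows it. A simplified sketch, assuming 32-bit box sizes only (no 64-bit `largesize` handling):

```python
import struct

def split_boxes(data: bytes):
    """Split a buffer into top-level MP4 boxes as (type, raw_bytes) pairs."""
    boxes = []
    offset = 0
    while offset + 8 <= len(data):
        size, = struct.unpack_from(">I", data, offset)   # 4-byte big-endian size
        box_type = data[offset + 4:offset + 8].decode("ascii")
        boxes.append((box_type, data[offset:offset + size]))
        offset += size
    return boxes

def group_fragments(boxes):
    """Pair each moof with the mdat that follows it: one WebSocket message each."""
    fragments = []
    for i, (box_type, raw) in enumerate(boxes):
        if box_type == "moof" and i + 1 < len(boxes) and boxes[i + 1][0] == "mdat":
            fragments.append(raw + boxes[i + 1][1])
    return fragments
```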

    It works, but there's one problem: the video doesn't play unless it has ALL the fragments.

    I tried sending the initialization fragment and then only the newly generated fragments, but it errors out and doesn't play.

    How can I make it possible to join in the middle of the livestream, without having been there from the very start of the recording (or, in my testing, from the middle of the file stream)?

    In other words, how do I make the fragments independent of each other?

    P.S. Any extra info will help; I'm new to this.
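For what it's worth, a common approach to the independence problem above (not something the post confirms, just a typical fix) is to cache the init segment (ftyp+moov) on the server and replay it to every newly connected client before its first live moof+mdat message; since `frag_keyframe` makes each fragment start at a keyframe, playback can then begin at any fragment boundary. A minimal sketch:

```python
def stream_for_late_joiner(init_segment: bytes, live_fragments):
    """Yield byte chunks a newly connected client can append to its
    SourceBuffer: the cached init segment first, then only the fragments
    produced from now on."""
    yield init_segment
    for fragment in live_fragments:
        # each live message is expected to begin with a moof box
        assert fragment[4:8] == b"moof", "fragment must begin with moof"
        yield fragment
```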