Advanced search

Media (1)

Keyword: - Tags -/bug

Other articles (89)

  • About documents

    21 June 2013, by

    What should you do when a document fails processing, or when its rendering does not match expectations?
    Document stuck in the processing queue?
    Here is an ordered, empirical list of actions you can try to unblock the situation:
    • Relaunch processing of the document that fails
    • Retry inserting the document on the MédiaSPIP site
    • For a video or audio media item, rework the produced media with an editor or a transcoder
    • Convert the document to a format (...)

  • User profiles

    12 April 2011, by

    Each user has a profile page where they can edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialized, visible only when the visitor is logged in to the site.
    The user can reach the profile editor from their author page; a link in the navigation, "Modifier votre profil" (Edit your profile), is (...)

  • XMP PHP

    13 May 2011, by

    According to Wikipedia, XMP stands for:
    Extensible Metadata Platform, an XML-based metadata format used in PDF, photography, and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
    Being based on XML, it manages a set of dynamic tags for use in the context of the Semantic Web.
    XMP makes it possible to record, as an XML document, information about a file: title, author, history (...)
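    For illustration, a minimal XMP packet recording a title and an author might look like the following (the property names come from the Dublin Core schema commonly embedded in XMP; the values are placeholders):

```xml
<x:xmpmeta xmlns:x="adobe:ns:meta/">
  <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
    <rdf:Description xmlns:dc="http://purl.org/dc/elements/1.1/">
      <dc:title>
        <rdf:Alt><rdf:li xml:lang="x-default">Example title</rdf:li></rdf:Alt>
      </dc:title>
      <dc:creator>
        <rdf:Seq><rdf:li>Example author</rdf:li></rdf:Seq>
      </dc:creator>
    </rdf:Description>
  </rdf:RDF>
</x:xmpmeta>
```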

On other sites (18723)

  • Is it possible to create a video from a live stream of bitmaps?

    30 March 2015, by user3088260

    I have an application that gets a single frame from the webcam roughly every 200 ms as a bitmap. I have used AForge.Video.FFMPEG.VideoFileWriter to create a video from a few of them, but the problem is that the stream must be closed for the bitmaps to be written to the file, and I want to keep appending bitmaps to the same file for days, like a CCTV camera. Is it possible to transcode live video from bitmaps using AForge?
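    One workaround for this limitation (a sketch, not part of the AForge API: it assumes the ffmpeg binary is on the PATH and that each captured bitmap can be saved as PNG bytes) is to pipe frames to a single long-running ffmpeg process, which keeps writing to the output until its stdin is closed:

```python
import subprocess

def ffmpeg_append_cmd(out_path, fps=5, ffmpeg="ffmpeg"):
    """Build an ffmpeg command line that reads PNG frames from stdin
    and encodes them into one ever-growing file (~200 ms per frame
    matches a framerate of 5)."""
    return [
        ffmpeg, "-y",
        "-f", "image2pipe",     # frames arrive back-to-back on stdin
        "-framerate", str(fps),
        "-i", "-",              # "-" means: read input from stdin
        "-c:v", "libx264",
        "-pix_fmt", "yuv420p",  # widely playable pixel format
        out_path,
    ]

# Hypothetical usage: start ffmpeg once, write one PNG per captured frame,
# and close stdin only when recording should end (possibly days later).
# proc = subprocess.Popen(ffmpeg_append_cmd("cctv.ts"), stdin=subprocess.PIPE)
# proc.stdin.write(png_bytes)       # repeat every ~200 ms
# proc.stdin.close(); proc.wait()   # finalizes the file
```

    Note that a plain MP4 only becomes playable once ffmpeg finalizes it on close, which mirrors the problem described above; for a CCTV-style recording that must survive crashes, a container such as MPEG-TS (e.g. `cctv.ts`) is a safer choice.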

  • Independent MP4 fragments for live stream

    20 April 2020, by J Doe

    My goal is to stream an H.264 live-generated infinite feed (webcam) to the browser.

    Specifically, I want to stream a Raspberry Pi camera to the browser.

    (Disclaimer: I'm streaming an H.264 file right now; it is to be replaced with the Pi webcam.)

    So, I have an H.264 stream which I pipe to ffmpeg to mux into MP4 fragments (movflags is set to frag_keyframe+empty_moov+default_base_moof). I then send the fragments over a websocket: on the server I extract the fragments and send each one as a message containing moof+mdat (I didn't figure out how to do it over progressive HTTP, because the stream is live-generated). The fragments are then played in the browser using the MediaSource API.
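    The server-side extraction step can be sketched as a small parser over MP4's length-prefixed box layout (a toy illustration of the ISO BMFF structure, not production code: 64-bit box sizes are ignored):

```python
import struct

def iter_boxes(data):
    """Yield (box_type, raw_bytes) for each top-level MP4 box; every box
    starts with a 4-byte big-endian size followed by a 4-byte type."""
    off = 0
    while off + 8 <= len(data):
        size, = struct.unpack(">I", data[off:off + 4])
        if size < 8:   # size 0 ("to end of file") or 1 (64-bit) not handled here
            break
        yield data[off + 4:off + 8], data[off:off + size]
        off += size

def split_fmp4(data):
    """Split a fragmented-MP4 byte stream into the init segment (ftyp+moov)
    and a list of fragments, each one a moof+mdat pair."""
    init, frags, pending_moof = b"", [], None
    for btype, raw in iter_boxes(data):
        if btype in (b"ftyp", b"moov"):
            init += raw
        elif btype == b"moof":
            pending_moof = raw
        elif btype == b"mdat" and pending_moof is not None:
            frags.append(pending_moof + raw)   # one websocket message
            pending_moof = None
    return init, frags
```

    Each element of `frags` then corresponds to one websocket message in the setup described above.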

    It works, but there's one problem: the video doesn't play unless it has ALL the fragments.

    I tried sending the initialization fragment first, and then only newly generated fragments, but it errors out and doesn't play.

    How can I make it so a viewer can join in the middle of the livestream, without having been there from the very start of the recording (or, in my testing, from the start of the file stream)?

    In other words, how do I make the fragments independent of each other?
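    In practice the fragments do not need to be fully self-contained: with frag_keyframe every fragment already starts on a keyframe, so the usual fix is to cache the init segment (ftyp+moov) and replay it to each late joiner before forwarding live fragments. A transport-agnostic sketch (class and method names are made up for illustration):

```python
class FragmentBroadcaster:
    """Fan out an fMP4 live stream so clients can join mid-stream."""

    def __init__(self):
        self.init_segment = None   # ftyp+moov, produced once by the muxer
        self.clients = []          # one send-callable per connected client

    def add_client(self, send):
        """Register a client: replay the cached init segment first, so the
        MediaSource in its browser can be initialized mid-stream."""
        if self.init_segment is not None:
            send(self.init_segment)
        self.clients.append(send)

    def on_init_segment(self, raw):
        self.init_segment = raw

    def on_fragment(self, raw):
        """Forward one moof+mdat fragment to everyone currently connected."""
        for send in self.clients:
            send(raw)
```

    With default_base_moof set, each fragment's addressing is relative to its own moof, which is what lets a decoder pick up from any fragment after the init segment.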

    P.S. Any extra info will help; I'm new to this.

  • Live streaming and simultaneous local/server video saving with Insta360/Theta 360 camera [closed]

    13 August 2023, by Fornow

    I'm currently working on a project that involves live streaming video from a 360 camera, specifically the Insta360 and Theta models, while also saving the streamed video either locally or on a remote server. I'm relatively new to both live streaming and working with 360 cameras, so I'm seeking guidance on the best approach to achieve this.

    My primary goals are as follows:

    1. Live streaming: I want to be able to stream the real-time video captured by the 360 camera to a web platform or application, allowing users to experience the immersive 360 content as it happens.

    2. Simultaneous video saving: In addition to live streaming, I also need to save the streamed video, either locally on the device running the streaming process or on a remote server. The saved video should ideally retain its 360 nature and high resolution.

    I've been researching various technologies and frameworks like WebRTC for live streaming, but I'm unsure about the compatibility and best practices when dealing specifically with 360 cameras like Insta360 and Theta. Additionally, I'm uncertain about the most efficient way to save the streamed video while maintaining its immersive properties.

    If anyone has experience with live streaming from 360 cameras and simultaneously saving the content, could you please provide insights into the following:

    • Recommended libraries, SDKs, or frameworks for live streaming 360 video from Insta360 or Theta cameras.
    • Tips for ensuring the streamed video retains its 360 attributes and high quality.
    • Best practices for saving the streamed video either locally or on a remote server while the live stream is ongoing.

    Any code examples, tutorials, or step-by-step guides would be greatly appreciated. Thank you in advance for your help!
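    As a starting point for the streaming-plus-saving requirement, one common pattern is ffmpeg's tee muxer, which encodes the feed once and duplicates the result to a local file and a streaming endpoint. A sketch (the input URL and RTMP address are placeholders; it assumes the camera exposes an ffmpeg-readable, already-stitched feed):

```python
def tee_command(input_url, local_path, rtmp_url, ffmpeg="ffmpeg"):
    """Build an ffmpeg command that encodes one input and writes the same
    encoded stream to a local file and an RTMP endpoint via the tee muxer."""
    tee_spec = f"{local_path}|[f=flv]{rtmp_url}"   # per-output format override
    return [
        ffmpeg,
        "-i", input_url,      # e.g. an RTSP/RTMP feed from the camera
        "-map", "0",          # keep all input streams in both outputs
        "-c:v", "libx264",
        "-c:a", "aac",
        "-f", "tee", tee_spec,
    ]

# Hypothetical usage:
# subprocess.run(tee_command("rtsp://camera.local/live",
#                            "recording.mp4",
#                            "rtmp://example.com/live/streamkey"))
```

    Because the tee muxer writes the same encoded bitstream to both outputs, the saved copy is no worse than the streamed one; whether it keeps its 360 nature depends on the camera delivering stitched equirectangular video in the first place.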