Other articles (51)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MédiaSpip is at version 0.2 or higher. If needed, contact the administrator of your MédiaSpip to find out.

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used.
    The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
    These technologies make it possible to deliver video and sound both to conventional computers (...)

  • From upload to the final video [standalone version]

    31 January 2010, by

    The path of an audio or video document in SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information about the source video
    First, a SPIP article must be created and the "source" video document attached to it.
    When this document is attached to the article, two actions are executed in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)
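    The two extra actions described here (probing the source file's streams and extracting a frame for the thumbnail) are the kind of work ffprobe and ffmpeg perform directly; a minimal sketch, with source.mp4 and the 2-second seek point as illustrative placeholders rather than values from the article:

     # Retrieve the technical information of the file's audio and video streams (JSON output)
     ffprobe -v quiet -print_format json -show_format -show_streams source.mp4

     # Generate a thumbnail by extracting a single frame, here 2 seconds into the video
     ffmpeg -ss 2 -i source.mp4 -frames:v 1 -q:v 2 thumbnail.jpg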

On other sites (8336)

  • FFmpeg HLS event always starts from the first segment on Android devices

    29 July 2020, by tidpe

    I have a live streaming server that encodes an RTMP stream into HLS. My server is doing something like this:

     ffmpeg -re -i rtmp://server:port/address/name -rw_timeout 500 -http_persistent 1 \
       -method PUT -http_user_agent myuseragent \
       -f hls -hls_list_size 5 -hls_flags discont_start+delete_segments+omit_endlist \
       -async 1 -vf "scale=640:trunc(ow/a/2)*2" -movflags +faststart \
       -c:a libfdk_aac -ar 48000 -c:v h264 -profile:v main -crf 24 \
       -sc_threshold 0 -g 48 -keyint_min 48 -hls_time 4 -hls_playlist_type event \
       -preset veryfast -b:v 350k -maxrate 800k -bufsize 1100k -b:a 96k \
       -vcodec libx264 -hls_segment_filename %03d.ts master.m3u8

    On desktop everything works fine; the problem is that on Android devices and in Android native players the live stream always starts from the beginning.

    I've tried removing segments from the playlist, but nothing works and playback always starts from the first TS segment.

    What I've tried:

    -hls_list_size 5
    -hls_flags delete_segments
    -hls_delete_threshold 2

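    A likely explanation is the playlist type: -hls_playlist_type event forces hls_list_size to 0 and writes #EXT-X-PLAYLIST-TYPE:EVENT into master.m3u8, so every segment stays listed and players (notably Android's native ones) may start playback at the first segment. A minimal sketch of a sliding-window variant, keeping the original encoding settings, omitting the HTTP upload options for brevity, collapsing the duplicate -c:v h264 / -vcodec libx264 into a single -c:v libx264, and dropping the event playlist type so the list size and delete_segments flags take effect:

     # Sliding-window live HLS: without -hls_playlist_type event, -hls_list_size
     # and delete_segments are honoured and clients join near the live edge.
     ffmpeg -re -i rtmp://server:port/address/name \
       -vf "scale=640:trunc(ow/a/2)*2" \
       -c:v libx264 -profile:v main -preset veryfast -crf 24 \
       -b:v 350k -maxrate 800k -bufsize 1100k \
       -g 48 -keyint_min 48 -sc_threshold 0 \
       -c:a libfdk_aac -ar 48000 -b:a 96k \
       -f hls -hls_time 4 -hls_list_size 5 \
       -hls_flags delete_segments+omit_endlist \
       -hls_segment_filename %03d.ts master.m3u8
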
  • avformat : add a stream event flag for new packets

    25 October 2020, by Anton Khirnov
    avformat : add a stream event flag for new packets
    
    • [DH] doc/APIchanges
    • [DH] libavformat/avformat.h
    • [DH] libavformat/utils.c
    • [DH] libavformat/version.h
  • Is it possible to send a temporary slate (image or video) into a running Azure Live Event RTMP-stream?

    15 November 2020, by Brian Frisch

    I'm currently building a video streaming app which leverages Azure Media Services Live Events.

    It consists of:

    1. a mobile app that can stream live video,
    2. a web client that plays the live event video,
    3. a producer screen with controls to start and stop the web client's access to the video,
    4. a server that handles various operations around the entire system.

    It's working very well, but I would like to add a feature that lets the producer add some elegance to the experience. I'm therefore trying to work out how to let the producer switch the incoming source of the stream to a pre-recorded video or even a still image at any point during the recording, and then switch back to live video: a kill switch of some kind that would cover waiting time if there are technical difficulties on set, and that could also be used for pre-/post-roll branding slates when introing and outroing a video event. I would like this source switch to be embedded in the video stream, so that it can also end up in the final video product if I need it in an archive for later playback.

    I'm trying to do it in a way where the producer can set a timestamp for when the video override should come in and when it should stop. Then I want my server to respond to these timestamps and send the instructions over RTMP to the Azure Live Event. Is it possible to send such an instruction ("Hey, play this video bit / show this image in the stream for x seconds") over the RTMP protocol? I've tried to figure it out, and I've read about SCTE-35 markers and such, but I have not been able to find any examples of how to do it, so I'm a bit stuck.

    My plan B is to make it possible to stream an image from the mobile application that already handles the live video stream, but I'm initially targeting an architecture where the mobile app is unaware of anything other than live streaming, and this override switch should preferably be handled by the server, which is a Firebase Functions setup.

    If you are able to see other ways of doing it, I'm all ears.

    I've already tried to build an ffmpeg method that listens for updates to the producer-set state and then streams an image to the same RTMP URL that the video goes to from the mobile app. But it only works when the live video isn't already streaming; it seems I cannot take over an RTMP stream when it's already running.
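    For reference, pushing a slate that way usually looks like the sketch below; slate.png and the rtmp:// URL are placeholders, not values from the post. Note that an RTMP ingest normally accepts only one publisher per stream key at a time, which matches the behaviour described above: a second ffmpeg publisher cannot simply take over while the mobile app is still connected, so the switch generally has to happen upstream of the ingest (in the encoder or a mixer) rather than at the ingest itself.

     # Loop a still image as a temporary slate and publish it over RTMP
     # (placeholder input file and URL; silent audio is added via anullsrc).
     ffmpeg -re -loop 1 -i slate.png \
       -f lavfi -i anullsrc=channel_layout=stereo:sample_rate=44100 \
       -c:v libx264 -tune stillimage -pix_fmt yuv420p -r 25 -g 50 \
       -c:a aac -b:a 128k \
       -f flv rtmp://ingest-host/live/stream-key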