Other articles (53)

  • The farm's regular Cron tasks

    1 December 2010, by

    Managing the farm involves running, at regular intervals, several repetitive tasks known as Cron tasks.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of every instance of the shared farm on a regular basis. Combined with a system Cron on the central site of the farm, this generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...)
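
    As a rough illustration of this mechanism (a sketch, not the actual gestion_mutu_super_cron task, which lives inside SPIP's PHP code): a loop that pings the Cron entry point of every hosted instance. The site list and the spip.php?action=cron URL below are assumptions.

      import urllib.request
      import urllib.error

      # Hypothetical list of hosted instances; the real farm would build
      # this from its own configuration.
      SITES = [
          "https://site1.example.org",
          "https://site2.example.org",
      ]

      def trigger_all_crons(timeout=10):
          """Hit the (assumed) Cron entry point of every instance."""
          for site in SITES:
              url = site + "/spip.php?action=cron"   # assumed entry point
              try:
                  with urllib.request.urlopen(url, timeout=timeout) as resp:
                      print(site, "->", resp.status)
              except urllib.error.URLError as exc:
                  print(site, "-> failed:", exc.reason)

      if __name__ == "__main__":
          # A system Cron on the central site would run this every minute.
          trigger_all_crons()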

  • Media-specific libraries and software

    10 December 2010, by

    For correct and optimal operation, several things need to be taken into consideration.
    It is important, after installing apache2, mysql and php5, to install other required software, whose installation is described in the related links: a set of multimedia libraries (x264, libtheora, libvpx) used to encode and decode video and audio so as to support as many file types as possible (cf. this tutorial); FFMpeg with the maximum number of decoders and (...)

  • Authorizations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page

On other sites (11145)

  • H264 decoder in OpenCV for real-time video transmission

    30 May 2019, by Narendra

    I am writing a client-server application that does real-time video transmission from an Android-based phone to a server. The video captured by the phone camera is encoded with the Android-provided H.264 encoder and transmitted over a UDP socket. The frames are not RTP-encapsulated; I need this to reduce the overhead and hence the delay.
    On the receiver side, I need to decode the incoming encoded frames. The data sent over the UDP socket contains not only the encoded frame but also some frame-related information as part of a header. Each frame is encoded as a NAL unit.

    I am able to retrieve the frames from the received packets as byte arrays. I can save such a byte array as a raw H.264 file and play it back with VLC, and everything works fine.
    However, I need to do some processing on these frames and therefore need to use them with OpenCV.
    Can anyone help me with decoding a raw H.264 byte array in OpenCV?
    Can FFmpeg be used for this?
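
    One possible approach (a sketch, not part of the original post): since the raw Annex-B H.264 file already plays in VLC, FFmpeg can decode it and pipe raw BGR frames that OpenCV consumes as NumPy arrays. The file name and the 640x480 resolution below are assumptions.

      import subprocess
      import numpy as np
      import cv2

      WIDTH, HEIGHT = 640, 480  # assumed resolution of the encoded stream

      def decode_h264(path):
          """Yield decoded BGR frames from a raw H.264 elementary stream."""
          proc = subprocess.Popen(
              ["ffmpeg", "-loglevel", "quiet",
               "-f", "h264", "-i", path,                  # raw Annex-B input
               "-f", "rawvideo", "-pix_fmt", "bgr24", "-"],
              stdout=subprocess.PIPE,
          )
          frame_size = WIDTH * HEIGHT * 3
          try:
              while True:
                  buf = proc.stdout.read(frame_size)
                  if len(buf) < frame_size:
                      break
                  # One decoded frame, usable as a regular OpenCV image.
                  yield np.frombuffer(buf, np.uint8).reshape(HEIGHT, WIDTH, 3)
          finally:
              proc.stdout.close()
              proc.wait()

      if __name__ == "__main__":
          for frame in decode_h264("capture.h264"):   # hypothetical file name
              cv2.imshow("decoded", frame)
              if cv2.waitKey(1) & 0xFF == ord("q"):
                  break
          cv2.destroyAllWindows()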

  • How to estimate bandwidth / speed requirements for real-time streaming video?

    19 June 2016, by Vivek Seth

    For a project I’m working on, I’m trying to stream video to an iPhone through its headphone jack. My estimated bitrate is about 200 kbps (if I’m wrong about this, please ignore that).

    I’d like to squeeze as much performance out of this bitrate as possible, and sound is not important to me, only video. My understanding is that to stream real-time video I will need to encode it with some codec on the fly and send compressed frames to the iPhone for it to decode and render. Based on my research, H.265 seems to be one of the most space-efficient codecs available, so I’m considering using it.

    Assuming my basic understanding of live streaming is correct, how would I estimate the FPS I could achieve for a given resolution using the H.265 codec?

    The best solution I can think of is to take a video file, encode it with H.265 and trim it to one minute to see how large the file is. The issue I see with this approach is that my calculations would include some overhead from the video container format (AVI, MKV, etc.) and from the audio channels that I don’t care about.
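
    One rough way to turn this into numbers (a sketch, not part of the original post): model the target bitrate as width × height × fps × bits-per-pixel and solve for fps. The bits-per-pixel value used below for H.265 is an assumed rule of thumb, not a measured figure; encoding to a raw, audio-free elementary stream would also avoid the container and audio overhead mentioned above.

      def estimate_fps(bitrate_bps, width, height, bits_per_pixel=0.04):
          """Solve bitrate = width * height * fps * bits_per_pixel for fps.

          bits_per_pixel is an assumed rule-of-thumb value for H.265 at
          moderate quality, not a measured constant.
          """
          return bitrate_bps / (width * height * bits_per_pixel)

      if __name__ == "__main__":
          target = 200_000  # the ~200 kbps mentioned above
          for w, h in [(160, 120), (320, 240), (640, 480)]:
              print(f"{w}x{h}: ~{estimate_fps(target, w, h):.1f} fps")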