
Media (91)

Other articles (66)

  • Managing creation and editing rights for objects

    8 February 2011

    By default, many features are restricted to administrators, but each can be configured independently to change the minimum status required to use it, in particular: writing content on the site, adjustable in the form-template management; adding notes to articles; adding captions and annotations to images;

  • Keeping control of your media in your hands

    13 April 2011

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player was created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)
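
    As a generic illustration of that fallback pattern (a sketch, not MediaSPIP's actual markup; file names and player paths are placeholders), the Flash object is nested inside the HTML5 tag so that browsers without <video> support render the Flash player instead:

    <video controls width="640" height="360">
      <source src="media/clip.mp4" type="video/mp4" />
      <source src="media/clip.ogv" type="video/ogg" />
      <!-- rendered only by browsers that ignore the video tag -->
      <object data="flowplayer.swf" type="application/x-shockwave-flash"
              width="640" height="360">
        <param name="movie" value="flowplayer.swf" />
        <param name="flashvars" value='config={"clip":{"url":"media/clip.mp4"}}' />
      </object>
    </video>

    Browsers that understand HTML5 pick the first playable <source>; everything else falls through to the Flash object.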

On other sites (10307)

  • Javacv: Decoding H.264 "live" stream coming from red5 server on android device

    26 March 2015, by Ichigo Kurosaki

    Here is my problem:
    I have implemented a server-side application using Red5 which sends an H.264-encoded live stream; on the client side the stream is received as a byte[].
    To decode it on the Android client side I have followed the JavaCV-FFmpeg library. The decoding code is as follows:

    public Frame decodeVideo(byte[] data, long timestamp) {
        frame.image = null;
        frame.samples = null;
        avcodec.av_init_packet(pkt);
        BytePointer video_data = new BytePointer(data);
        avcodec.AVCodec codec = avcodec.avcodec_find_decoder(codec_id);
        video_c = avcodec.avcodec_alloc_context3(codec);
        video_c.width(320);
        video_c.height(240);
        video_c.pix_fmt(0); // 0 == AV_PIX_FMT_YUV420P
        // tell the decoder the bitstream may arrive in arbitrary chunks
        video_c.flags2(video_c.flags2() | avcodec.CODEC_FLAG2_CHUNKS);
        avcodec.avcodec_open2(video_c, codec, null);
        picture = avcodec.avcodec_alloc_frame();
        pkt.data(video_data);
        pkt.size(data.length);
        int len = avcodec.avcodec_decode_video2(video_c, picture, got_frame, pkt);
        if ((len >= 0) && (got_frame[0] != 0)) {
            // ... process the decoded frame into a JavaCV IplImage and
            // render it with an Android ImageView
        }
        return frame;
    }

    The data received from the server is as follows.
    A few frames have the following pattern:
    17 01 00 00 00 00 00 00 02 09 10 00 00 00 0F 06 00 01 C0 01 07 09 08 04 9A 00 00 03 00 80 00 00 16 EF 65 88 80 07 00 05 6C 98 90 00...

    Many frames have the following pattern:
    27 01 00 00 00 00 00 00 02 09 30 00 00 00 0C 06 01 07 09 08 05 9A 00 00 03 00 80 00 00 0D 77 41 9A 02 04 15 B5 06 20 E3 11 E2 3C 46 ....

    With the H.264 codec for the decoder, the decoder outputs length > 0 but got_frame is always 0.
    With the MPEG1 codec, the decoder outputs length > 0 and got_frame > 0, but the output image is green or distorted.

    However, following the FFmpegFrameGrabber code of JavaCV, I can decode local H.264-encoded files with code similar to the above.

    I wonder what detail I am missing: some header-related data manipulation, or setting the codec appropriate for the decoder.

    Any suggestions or help appreciated.
    Thanks in advance.
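
    Judging by the byte patterns above, the buffers look like raw FLV/RTMP video tag bodies rather than an Annex B H.264 stream: 0x17/0x27 is the FLV frame-type and codec byte (keyframe/inter frame, AVC), 0x01 means "AVC NALU", the next three bytes are the composition time, and the rest is NAL units prefixed with 4-byte big-endian lengths (the 0x09 access-unit delimiter and 0x06 SEI NAL types are visible). A decoder fed this way typically also needs the SPS/PPS, which are sent separately in the AVCPacketType 0 sequence header, and start codes instead of length prefixes; that would explain got_frame staying 0. A minimal, hypothetical converter along those lines (class and method names are invented for illustration):

    import java.io.ByteArrayOutputStream;

    // Hypothetical helper: converts one FLV video tag body carrying
    // length-prefixed AVC NAL units into an Annex B byte stream.
    public class FlvAvcToAnnexB {
        private static final byte[] START_CODE = {0, 0, 0, 1};

        public static byte[] convert(byte[] flvBody) {
            // flvBody[1] == 1 means "AVC NALU"; type 0 is the sequence header
            // (AVCDecoderConfigurationRecord with SPS/PPS), handled elsewhere.
            if (flvBody.length < 5 || flvBody[1] != 1) return null;
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            int pos = 5; // skip frame-type byte, packet type, 3-byte composition time
            while (pos + 4 <= flvBody.length) {
                int nalLen = ((flvBody[pos] & 0xFF) << 24) | ((flvBody[pos + 1] & 0xFF) << 16)
                           | ((flvBody[pos + 2] & 0xFF) << 8) | (flvBody[pos + 3] & 0xFF);
                pos += 4;
                if (nalLen <= 0 || pos + nalLen > flvBody.length) break;
                out.write(START_CODE, 0, 4);     // start code replaces the length prefix
                out.write(flvBody, pos, nalLen); // the NAL unit itself is copied unchanged
                pos += nalLen;
            }
            return out.toByteArray();
        }
    }

    The SPS and PPS from the sequence-header packet would likewise need converting to Annex B and prepending to the first keyframe, or installing as codec extradata, before decoding.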

  • How to implement live camera plus WebGL overlay live streams

    12 April 2016, by zproxy

    The goal is to test a YouTube live stream configuration where:

    1. a Samsung Galaxy S6/S7 (Android, over wifi) streams its cameras (one or all)
    2. Chrome renders a WebGL 3D overlay (static or realtime; WebRTC? Flash?) on localhost
    3. ffmpeg mixes the available streams and upstreams them to the YouTube live event

    What's needed to implement this configuration? Yes, this may in the end require a custom implementation of an Android app and a Chrome app.
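
    For step 3, a rough sketch of the ffmpeg mixing command, assuming the phone publishes its camera to a local RTMP server at rtmp://localhost/live/camera, the captured WebGL render is published at rtmp://localhost/live/overlay, and STREAM_KEY is a placeholder for the YouTube Live stream key:

    ffmpeg -i rtmp://localhost/live/camera \
           -i rtmp://localhost/live/overlay \
           -filter_complex "[0:v][1:v]overlay=0:0[out]" \
           -map "[out]" -map 0:a? \
           -c:v libx264 -preset veryfast -g 60 \
           -c:a aac -b:a 128k \
           -f flv rtmp://a.rtmp.youtube.com/live2/STREAM_KEY

    One caveat: H.264 in FLV carries no alpha channel, so a transparent overlay would need chroma keying (for example ffmpeg's colorkey filter) applied to the WebGL stream before the overlay step.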

  • Best way to stream live video under webRTC development [on hold]

    5 April 2016, by Ankit Dhanna

    I have implemented WebRTC live video chat with multiple presenters (using Licode).
    However, I want to ask how I can broadcast this video stream so that it can be served to multiple users who are not on WebRTC.

    Which server should I broadcast this stream to in order to get the lowest lag?