Advanced search

Media (17)

Keyword: - Tags -/wired

Other articles (43)

  • Installation in farm mode

    February 4, 2011, by

    Farm mode makes it possible to host several MediaSPIP sites while installing the functional core only once.
    This is the method we use on this very platform.
    Using farm mode requires some familiarity with how SPIP works, unlike the standalone version, which does not really require any specific knowledge since SPIP’s usual private area is no longer used.
    To begin with, you must have installed the same files as the installation (...)

  • Adding user-specific information and other changes to author-related behaviour

    April 12, 2011, by

    The simplest way to add information to authors is to install the Inscription3 plugin. It also lets you modify certain user-related behaviours (refer to its documentation for more information).
    It is also possible to add fields to authors by installing the plugins champs extras 2 and Interface pour champs extras.

  • Automatic backup of SPIP channels

    April 1, 2010, by

    When setting up an open platform, it is important for hosts to have fairly regular backups available in order to guard against any potential problem.
    To carry out this task, we rely on two SPIP plugins: Saveauto, which performs a regular backup of the database in the form of a MySQL dump (usable in phpMyAdmin), and mes_fichiers_2, which creates a zip archive of the site’s important data (the documents, the elements (...)

On other sites (6527)

  • Video Conferencing in HTML5: WebRTC via Web Sockets

    June 14, 2012, by silvia

    A bit over a week ago I gave a presentation at Web Directions Code 2012 in Melbourne. Maxine and John asked me to speak about something related to HTML5 video, so I went for the new shiny: WebRTC – real-time communication in the browser.

    Presentation slides

    I only had 20 min, so I had to make it tight. I wanted to show off video conferencing without special plugins in Google Chrome in just a few lines of code, as is the promise of WebRTC. To a large extent, I achieved this. But I made some interesting discoveries along the way. Demos are in the slide deck.

    UPDATE: Opera 12 has been released with WebRTC support.

    Housekeeping: if you want to replicate what I have done, you need to install a Google Chrome Web Browser 19+. Then make sure you go to chrome://flags and activate the MediaStream and PeerConnection experiment(s). Restart your browser and now you can experiment with this feature. Big warning up-front: it’s not production-ready, since there are still changes happening to the spec and there is no compatible implementation by another browser yet.
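
    As a quick sanity check, a small feature-detection snippet tells you whether these APIs are actually exposed in your browser. This is only an illustrative sketch: in 2012-era builds the names were vendor-prefixed and they have changed since, so adjust to whatever your browser ships.

      // Illustrative feature check; the prefixed names below are the common
      // variants, not necessarily the exact ones used in the original demo.
      var getUserMedia = navigator.getUserMedia || navigator.webkitGetUserMedia;
      var PeerConnection = window.RTCPeerConnection || window.webkitRTCPeerConnection;

      if (!getUserMedia || !PeerConnection) {
        alert('MediaStream/PeerConnection support not found - check chrome://flags');
      }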

    Here is a brief summary of the steps involved to set up video conferencing in your browser (a minimal code sketch follows the list):

    1. Set up a video element each for the local and the remote video stream.
    2. Grab the local camera and stream it to the first video element.
    3. (*) Establish a connection to another person running the same Web page.
    4. Send the local camera stream on that peer connection.
    5. Accept the remote camera stream into the second video element.
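
    As promised above, here is a minimal sketch of steps 1, 2, 4 and 5 using today’s unprefixed API names (the 2012-era prefixed calls differed slightly). The element ids and the STUN server address are assumptions for illustration, not the demo’s own code:

      // Step 1: one video element each for the local and the remote stream
      // (the ids 'localVideo' and 'remoteVideo' are assumed for this sketch).
      var localVideo  = document.getElementById('localVideo');
      var remoteVideo = document.getElementById('remoteVideo');

      // A peer connection; the STUN server shown here is Google's commonly used
      // public one (the demo relies on Google's STUN server for NAT traversal).
      var pc = new RTCPeerConnection({
        iceServers: [{ urls: 'stun:stun.l.google.com:19302' }]
      });

      // Step 2: grab the local camera and preview it.
      navigator.mediaDevices.getUserMedia({ video: true, audio: true }).then(function (stream) {
        localVideo.srcObject = stream;
        // Step 4: send the local camera stream on the peer connection.
        stream.getTracks().forEach(function (track) {
          pc.addTrack(track, stream);
        });
      });

      // Step 5: accept the remote camera stream into the second video element.
      pc.ontrack = function (event) {
        remoteVideo.srcObject = event.streams[0];
      };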

    Now, the most difficult part of all of this – believe it or not – is the signalling part that is required to build the peer connection (marked with (*)). Initially I wanted to run completely without a server and just enter the remote’s IP address to establish the connection. This is, however, not a functionality that the PeerConnection object provides [might this be something to add to the spec?].

    So, you need a server known to both parties that can provide for the handshake to set up the connection. All the examples that I have seen, such as https://apprtc.appspot.com/, use a channel management server on Google’s App Engine. I wanted it all working with HTML5 technology, so I decided to use a Web Socket server instead.

    I implemented my Web Socket server using node.js (code of websocket server). The video conferencing demo is in the slide deck in an iframe – you can also use the stand-alone html page. Works like a treat.
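
    For a rough idea of what such a relay involves (this is not the server linked above, just a sketch built on the popular ws module for node.js), a minimal version can look like this; it simply forwards every message from one client to all the other connected clients, which is all the handshake needs:

      // Minimal relay sketch using the 'ws' npm module (npm install ws).
      // Not the demo's actual server: it just forwards each incoming message
      // from one client to every other connected client.
      var WebSocket = require('ws');
      var wss = new WebSocket.Server({ port: 8080 });

      wss.on('connection', function (socket) {
        socket.on('message', function (data) {
          wss.clients.forEach(function (client) {
            if (client !== socket && client.readyState === WebSocket.OPEN) {
              client.send(data.toString());
            }
          });
        });
      });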

    While it is still using Google’s STUN server to get through NAT, the messaging for setting up the connection is running completely through the Web Socket server. The messages that get exchanged are plain SDP message packets with a session ID. There are OFFER, ANSWER, and OK packets exchanged for each streaming direction. You can see some of it in the image below:

    [Image: WebRTC demo]
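
    To make the exchange concrete, the packets can be modelled as small JSON envelopes carrying a session ID and an SDP payload. The field names, the ws:// URL and the session handling below are assumptions for illustration; the demo’s actual wire format may differ:

      // Caller-side sketch of the OFFER/ANSWER/OK exchange over the Web Socket.
      // Packet fields, the ws:// URL and the session id are assumptions.
      var signalling = new WebSocket('ws://localhost:8080');
      var pc = new RTCPeerConnection();      // peer connection as in the earlier sketch
      var session = 'demo-session-1';        // both parties must agree on this id

      function sendOffer() {
        pc.createOffer()
          .then(function (offer) { return pc.setLocalDescription(offer); })
          .then(function () {
            signalling.send(JSON.stringify({
              type: 'OFFER', session: session, sdp: pc.localDescription.sdp
            }));
          });
      }

      signalling.onopen = sendOffer;         // start the handshake once the socket is open

      signalling.onmessage = function (event) {
        var msg = JSON.parse(event.data);
        if (msg.session !== session) { return; }
        if (msg.type === 'ANSWER') {
          // Apply the remote SDP, then acknowledge with an OK packet.
          pc.setRemoteDescription({ type: 'answer', sdp: msg.sdp }).then(function () {
            signalling.send(JSON.stringify({ type: 'OK', session: session }));
          });
        }
      };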

    I’m not running a public WebSocket server, so you won’t be able to see this part of the presentation working. But the local loopback video should work.

    At the conference, it all went without a hitch (while the wireless played along). I believe you have to host the WebSocket server on the same machine as the Web page, otherwise it won’t work for security reasons.

    A whole new world of opportunities lies out there when we get the ability to set up video conferencing on every Web page – scary and exciting at the same time!

  • How to get stream info from opened file in ffmpeg ?

    May 31, 2013, by Srv19

    I am trying to read a video file using ffmpeg. I had working code that corresponded to a somewhat old version of it, and started upgrading to the latest build, exchanging all the deprecated functions for their current equivalents.

    However, I have run into a problem: no streams seem to be retrieved, and loading the video stops dead in its tracks.

    Here is the code I am using:

      // Open video file
      if(avformat_open_input(&pFormatCtx, filename.toStdString().c_str(), NULL, NULL)!=0)
          return FILE_NOT_OPENED; // Couldn't open file

      // Retrieve stream information
      if(avformat_find_stream_info(pFormatCtx,NULL)<0)
          return NO_STREAM_INFO; // Couldn't find stream information

      // Dump information about file onto standard error
      av_dump_format(pFormatCtx, 0, filename.toStdString().c_str(), false);

      // Find the first video stream
      videoStream=-1;
      for(unsigned i=0; i<pFormatCtx->nb_streams; i++)
          if(pFormatCtx->streams[i]->codec->codec_type==ffmpeg::AVMEDIA_TYPE_VIDEO)
          {
              videoStream=i;
              break;
          }
      if(videoStream==-1)
          return OTHER; // Didn't find a video stream

      // Get a pointer to the codec context for the video stream
      pCodecCtx=pFormatCtx->streams[videoStream]->codec;

      // Find the decoder for the video stream
      pCodec=avcodec_find_decoder(pCodecCtx->codec_id);
      if(pCodec==NULL)
          return CODEC_NOT_FOUND; // Codec not found

      // Open codec
      if(avcodec_open2(pCodecCtx, pCodec,NULL)<0)
          return CODEC_NOT_OPENED; // Could not open codec

    The problem arises in the loop over the video streams in ffmpeg::AVFormatContext *pFormatCtx. The nb_streams field is 0, so I never actually enter the loop, the codec is not loaded, etc. The strange thing is, av_dump_format gives the following output:

    License: GPL version 3 or later
    AVCodec version 3606372
    AVFormat configuration: --disable-static --enable-shared --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libcaca --enable-libfreetype --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxavs --enable-libxvid --enable-zlib
    [asf @ 004e9540] Stream #0: not enough frames to estimate rate; consider increasing probesize
    Input #0, asf, from 'C:/Users/Public/Videos/Sample Videos/Wildlife.wmv':
     Metadata:
       SfOriginalFPS   : 299700
       WMFSDKVersion   : 11.0.6001.7000
       WMFSDKNeeded    : 0.0.0.0000
       comment         : Footage: Small World Productions, Inc; Tourism New Zealand | Producer: Gary F. Spradling | Music: Steve Ball
       title           : Wildlife in HD
       copyright         : © 2008 Microsoft Corporation
       IsVBR           : 0
       DeviceConformanceTemplate: AP@L3
     Duration: 00:00:30.09, start: 0.000000, bitrate: 6977 kb/s
       Stream #0:0(eng): Audio: wmav2 (a[1][0][0] / 0x0161), 44100 Hz, 2 channels, fltp, 192 kb/s
       Stream #0:1(eng): Video: vc1 (Advanced) (WVC1 / 0x31435657), yuv420p, 1280x720, 5942 kb/s, 29.97 tbr, 1k tbn, 1k tbc

    and there are 2 streams, clear as day.

    I am utterly baffled. Please help.