
Other articles (94)
-
Specific configuration for PHP5
4 February 2011
PHP5 is required; you can install it by following this dedicated tutorial.
It is recommended to disable safe_mode at first; however, if safe_mode is correctly configured and the required binaries are accessible, MediaSPIP should work correctly with it enabled.
Specific modules
Certain specific PHP modules must be installed, either through your distribution’s package manager or manually: php5-mysql for connectivity with the (...) -
Accepted formats
28 January 2010
The following commands give information about the formats and codecs supported by the local ffmpeg installation:
ffmpeg -codecs
ffmpeg -formats
Accepted input video formats
This list is not exhaustive; it highlights the main formats in use:
h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10
m4v: raw MPEG-4 video format
flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263
Theora
wmv:
Possible output video formats
To begin with, we (...) -
Authorizations overridden by plugins
27 April 2010
Mediaspip core
autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page
On other sites (8137)
-
Video Conferencing in HTML5: WebRTC via Web Sockets
14 June 2012, by silvia
A bit over a week ago I gave a presentation at Web Directions Code 2012 in Melbourne. Maxine and John asked me to speak about something related to HTML5 video, so I went for the new shiny: WebRTC – real-time communication in the browser.
I only had 20 min, so I had to make it tight. I wanted to show off video conferencing without special plugins in Google Chrome in just a few lines of code, as is the promise of WebRTC. To a large extent, I achieved this. But I made some interesting discoveries along the way. Demos are in the slide deck.
UPDATE: Opera 12 has been released with WebRTC support.
Housekeeping: if you want to replicate what I have done, you need to install Google Chrome 19 or later. Then make sure you go to chrome://flags and activate the MediaStream and PeerConnection experiment(s). Restart your browser and now you can experiment with this feature. Big warning up-front: it’s not production-ready, since there are still changes happening to the spec and there is no compatible implementation by another browser yet.
Here is a brief summary of the steps involved in setting up video conferencing in your browser (a minimal code sketch follows the list):
- Set up a video element each for the local and the remote video stream.
- Grab the local camera and stream it to the first video element.
- (*) Establish a connection to another person running the same Web page.
- Send the local camera stream on that peer connection.
- Accept the remote camera stream into the second video element.
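The demo used Chrome’s prefixed experimental APIs of the time, which have since been standardized. Purely for orientation, a minimal sketch of steps 1, 2, 4 and 5 with the modern standard API (navigator.mediaDevices.getUserMedia, RTCPeerConnection) might look like this; the element IDs and the signal() callback are placeholders, not part of the original demo:

async function startConferencing(signal) {
  // Step 1 assumes two video elements in the page, e.g.
  //   <video id="local" autoplay muted></video>
  //   <video id="remote" autoplay></video>
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: 'stun:stun.l.google.com:19302' }] // a public Google STUN server
  });

  // Step 2: grab the local camera and stream it to the first video element.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  document.getElementById('local').srcObject = stream;

  // Step 4: send the local camera stream on the peer connection.
  for (const track of stream.getTracks()) pc.addTrack(track, stream);

  // Step 5: accept the remote camera stream into the second video element.
  pc.ontrack = (event) => {
    document.getElementById('remote').srcObject = event.streams[0];
  };

  // Step 3 (*) is the signalling part: ICE candidates and the SDP offer/answer
  // must travel over a channel you provide yourself; `signal` stands in for
  // that transport (e.g. a WebSocket).
  pc.onicecandidate = (event) => {
    if (event.candidate) signal({ type: 'candidate', candidate: event.candidate });
  };

  return pc;
}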
Now, the most difficult part of all of this – believe it or not – is the signalling part that is required to build the peer connection (marked with (*)). Initially I wanted to run completely without a server and just enter the remote’s IP address to establish the connection. This is, however, not a functionality that the PeerConnection object provides [might this be something to add to the spec?].
So, you need a server known to both parties that can provide for the handshake to set up the connection. All the examples that I have seen, such as https://apprtc.appspot.com/, use a channel management server on Google’s appengine. I wanted it all working with HTML5 technology, so I decided to use a Web Socket server instead.
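To make the handshake concrete, here is a sketch of what the browser side of such Web Socket signalling can look like with the standardized API. The server URL, session ID and JSON packet shapes are illustrative assumptions, not the exact wire format of my demo:

// Browser side of the handshake: SDP offer/answer over a WebSocket (sketch).
const pc = new RTCPeerConnection();
const ws = new WebSocket('ws://localhost:8080'); // assumed server address
const session = 'demo';                          // illustrative session ID

// Caller: create an OFFER and push it through the socket.
async function makeOffer() {
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  ws.send(JSON.stringify({ type: 'OFFER', session, sdp: offer.sdp }));
}

// Both sides: react to incoming signalling packets.
ws.onmessage = async ({ data }) => {
  const packet = JSON.parse(data);
  if (packet.session !== session) return; // ignore other sessions
  if (packet.type === 'OFFER') {
    // Callee: accept the remote offer and reply with an ANSWER.
    await pc.setRemoteDescription({ type: 'offer', sdp: packet.sdp });
    const answer = await pc.createAnswer();
    await pc.setLocalDescription(answer);
    ws.send(JSON.stringify({ type: 'ANSWER', session, sdp: answer.sdp }));
  } else if (packet.type === 'ANSWER') {
    await pc.setRemoteDescription({ type: 'answer', sdp: packet.sdp });
  }
};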
I implemented my Web Socket server using node.js (code of websocket server). The video conferencing demo is in the slide deck in an iframe – you can also use the stand-alone html page. Works like a treat.
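The linked code is the real thing; as a sketch of the idea only, a tiny node.js relay that forwards each signalling packet to every other connected peer could be written with the third-party 'ws' package (my assumption here; the original server may be organized differently):

// Minimal node.js signalling relay (sketch; uses the 'ws' npm package).
// It forwards each packet to every other connected client and does no
// per-session bookkeeping, which a real server would likely add.
const WebSocket = require('ws');

const wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', (socket) => {
  socket.on('message', (data) => {
    for (const client of wss.clients) {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(data.toString());
      }
    }
  });
});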
While it is still using Google’s STUN server to get through NAT, the messaging for setting up the connection runs completely through the Web Socket server. The messages that get exchanged are plain SDP message packets with a session ID. There are OFFER, ANSWER, and OK packets exchanged for each streaming direction. You can see some of it in the image below:
I’m not running a public WebSocket server, so you won’t be able to see this part of the presentation working. But the local loopback video should work.
At the conference, it all went without a hitch (while the wireless played along). I believe you have to host the WebSocket server on the same machine as the Web page, otherwise it won’t work for security reasons.
A whole new world of opportunities lies out there when we get the ability to set up video conferencing on every Web page – scary and exciting at the same time!
-
AVFilterGraph error when uploading a video on MediaWiki
20 January 2018, by Jeremy Dicaire
I’m trying to implement videos on my wiki (a dedicated server running Debian) using the TimedMediaHandler extension.
I installed all the packages needed (I think) and chmodded ffmpeg and ffmpeg2theora to 777.
I’m running MediaWiki 1.31. I also put this in my LocalSettings.php:
$wgMaxShellMemory = 512000;
$wgMaxShellFileSize = 1024 * 512;
$wgMaxShellTime = 60 * 60;
I can upload files successfully and image thumbnails are generated properly. But when I upload a video I get this error:
Error creating thumbnail: '/usr/bin/avconv' -threads 1 -ss 33 -y -i '/var/www/wiki_games/images/a/a8/DBD_-_Game_Intro.webm' -ss 3 -s 854x480 -f mjpeg -an -vframes 1 '/tmp/transform_6c1de80a29b2.jpg' 2>&1
wgMaxShellMemory: 512000
ffmpeg version 3.2.9-1~deb9u1 Copyright (c) 2000-2017 the FFmpeg developers
built with gcc 6.3.0 (Debian 6.3.0-18) 20170516
configuration: --prefix=/usr --extra-version='1~deb9u1' --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libebur128 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
libavutil      55. 34.101 / 55. 34.101
libavcodec     57. 64.101 / 57. 64.101
libavformat    57. 56.101 / 57. 56.101
libavdevice    57.  1.100 / 57.  1.100
libavfilter     6. 65.100 /  6. 65.100
libavresample   3.  1.  0 /  3.  1.  0
libswscale      4.  2.100 /  4.  2.100
libswresample   2.  3.100 /  2.  3.100
libpostproc    54.  1.100 / 54.  1.100
Input #0, matroska,webm, from '/var/www/wiki_games/images/a/a8/DBD_-_Game_Intro.webm':
  Metadata:
    encoder: Lavf57.71.100
  Duration: 00:01:12.56, start: -0.007000, bitrate: 1114 kb/s
    Stream #0:0(eng): Video: vp8, yuv420p(progressive), 854x480, SAR 1:1 DAR 427:240, 29.97 fps, 29.97 tbr, 1k tbn, 1k tbc (default)
    Stream #0:1(eng): Audio: opus, 48000 Hz, stereo, fltp (default)
[AVFilterGraph @ 0x562cb5f6f6e0] Error initializing threading.
[AVFilterGraph @ 0x562cb5f6f6e0] Error creating filter 'null'
Error opening filters!
My debug file shows:
Creating video thumbnail at /tmp/transform_a558a5a34268.jpg
File::transform: Doing stat for mwstore://local-backend/local-thumb/a/a8/DBD_-_Game_Intro.webm/854px-seek=36-DBD_-_Game_Intro.webm.jpg
Creating video thumbnail at /tmp/transform_bc23be9e4649.jpg
[exec] MediaWiki\Shell\Command::execute: /bin/bash '/var/www/wiki_games/includes/shell/limit.sh' ''\''/usr/bin/avconv'\'' -threads 1 -ss 33 -y -i '\''/var/www/wiki_games/images/a/a8/DBD_-_Game_Intro.webm'\'' -ss 3 -s 854x480 -f mjpeg -an -vframes 1 '\''/tmp$
[thumbnail] Removing bad 0-byte thumbnail "/tmp/transform_bc23be9e4649.jpg". unlink() succeeded
What do I need to do to make the thumbnail feature work?
Thanks!