
Other articles (39)
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or higher. If in doubt, contact your MediaSPIP administrator to check. -
Contribute to a better visual interface
13 April 2011
MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community. -
Encoding and processing into web-friendly formats
13 April 2011
MediaSPIP automatically converts uploaded files to internet-compatible formats.
Video files are encoded in MP4, Ogv and WebM (supported by HTML5) and MP4 (supported by Flash).
Audio files are encoded in MP3 and Ogg (supported by HTML5) and MP3 (supported by Flash).
Where possible, text is analyzed in order to retrieve the data needed for indexing by search engines, and the document is then exported as a series of image files.
All uploaded files are stored online in their original format, so you can (...)
On other sites (8122)
-
Stream video and commands on the same connection, or split connections?
10 May 2013, by bizzehdee
Background
I am in the middle of writing a client/server app that I will install on every machine within my office (roughly 30–35 machines). The client currently connects to the server and can send mouse movements, mouse clicks and key strokes, and execute certain commands. The next step is to stream back a video of the screen output. I am using the GDI method from "Fastest method of screen capturing" to capture the entire screen, and will use the x264 encoder to compress the frames and transmit them back to the client, which will then decode and display the stream.
Question
Is it best (in terms of reducing lag, ensuring that all commands are delivered as fast as possible, and keeping the stream as live as possible) to transmit the video back along the same connection that I established for the commands, or should I establish a separate connection, on the same port or a different one, to stream back the video?
P.S.
I am aware that VNC, RDP and tools such as TeamViewer already exist and already do this sort of thing, but none of them supports all the requirements we have for this system.
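For illustration only, here is a minimal sketch of how the single-connection option could be framed, written in TypeScript/Node rather than the asker's native Windows stack; the message types and the 1-byte-type / 4-byte-length header are assumptions, not part of the question:

```typescript
import * as net from 'node:net';

// Hypothetical message types shared by client and server.
const MSG_COMMAND = 0x01;     // mouse / keyboard / command packets
const MSG_VIDEO_FRAME = 0x02; // x264-encoded screen frames

// Every message is [1-byte type][4-byte big-endian length][payload], so small command
// packets and large video frames can share one TCP socket and still be split apart
// unambiguously on the receiving end.
function sendMessage(sock: net.Socket, type: number, payload: Buffer): void {
  const header = Buffer.alloc(5);
  header.writeUInt8(type, 0);
  header.writeUInt32BE(payload.length, 1);
  sock.write(Buffer.concat([header, payload]));
}

// Receiver side: accumulate incoming bytes and peel off complete messages as they arrive.
function makeReader(onMessage: (type: number, payload: Buffer) => void) {
  let buf = Buffer.alloc(0);
  return (chunk: Buffer): void => {
    buf = Buffer.concat([buf, chunk]);
    while (buf.length >= 5) {
      const type = buf.readUInt8(0);
      const len = buf.readUInt32BE(1);
      if (buf.length < 5 + len) break; // wait for the rest of this message
      onMessage(type, buf.subarray(5, 5 + len));
      buf = buf.subarray(5 + len);
    }
  };
}
```

The trade-off the question is really about still applies either way: on a single connection a large encoded frame can delay a small command packet queued behind it, while a second connection (on the same port or a different one) avoids that head-of-line blocking at the cost of a second socket to manage.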
-
Video Conferencing in HTML5: WebRTC via Web Sockets
14 June 2012, by silvia
A bit over a week ago I gave a presentation at Web Directions Code 2012 in Melbourne. Maxine and John asked me to speak about something related to HTML5 video, so I went for the new shiny: WebRTC – real-time communication in the browser.
I only had 20 min, so I had to make it tight. I wanted to show off video conferencing without special plugins in Google Chrome in just a few lines of code, as is the promise of WebRTC. To a large extent, I achieved this. But I made some interesting discoveries along the way. Demos are in the slide deck.
UPDATE: Opera 12 has been released with WebRTC support.
Housekeeping: if you want to replicate what I have done, you need to install Google Chrome 19 or later. Then make sure you go to chrome://flags and activate the MediaStream and PeerConnection experiments. Restart your browser and now you can experiment with this feature. Big warning up-front: it's not production-ready, since there are still changes happening to the spec and there is no compatible implementation by another browser yet.
Here is a brief summary of the steps involved in setting up video conferencing in your browser (illustrative code sketches follow below):
- Set up a video element each for the local and the remote video stream.
- Grab the local camera and stream it to the first video element.
- (*) Establish a connection to another person running the same Web page.
- Send the local camera stream on that peer connection.
- Accept the remote camera stream into the second video element.
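To make steps 1 and 2 concrete, here is a minimal sketch using today's unprefixed APIs (navigator.mediaDevices.getUserMedia); at the time of the post the equivalent calls were vendor-prefixed experiments, so treat this as an illustration rather than the code used in the talk:

```typescript
// Step 1: one <video> element each for the local and the remote stream.
const localVideo = document.querySelector<HTMLVideoElement>('#local')!;
const remoteVideo = document.querySelector<HTMLVideoElement>('#remote')!;

// Step 2: grab the local camera and show it in the first video element.
async function startLocalPreview(): Promise<MediaStream> {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  localVideo.srcObject = stream;
  await localVideo.play();
  return stream;
}
```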
Now, the most difficult part of all of this – believe it or not – is the signalling part that is required to build the peer connection (marked with (*)). Initially I wanted to run completely without a server and just enter the remote’s IP address to establish the connection. This is, however, not a functionality that the PeerConnection object provides [might this be something to add to the spec?].
So, you need a server known to both parties that can provide for the handshake to set up the connection. All the examples that I have seen, such as https://apprtc.appspot.com/, use a channel management server on Google’s appengine. I wanted it all working with HTML5 technology, so I decided to use a Web Socket server instead.
I implemented my Web Socket server using node.js (code of websocket server). The video conferencing demo is in the slide deck in an iframe – you can also use the stand-alone html page. Works like a treat.
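The original post links to the actual server code; purely to illustrate the idea, a minimal relay along these lines (using the 'ws' package, with clients grouped by a session ID and every packet forwarded to the other members of that session) could look like this:

```typescript
import { WebSocketServer, WebSocket } from 'ws';

// Hypothetical minimal signalling relay, not the author's actual node.js server.
const wss = new WebSocketServer({ port: 8080 });
const sessions = new Map<string, Set<WebSocket>>();

wss.on('connection', (ws) => {
  ws.on('message', (data) => {
    const msg = JSON.parse(data.toString()); // expects { session, type, ... }
    const peers = sessions.get(msg.session) ?? new Set<WebSocket>();
    peers.add(ws);
    sessions.set(msg.session, peers);
    // Forward OFFER / ANSWER / candidate packets to everyone else in the same session.
    for (const peer of peers) {
      if (peer !== ws && peer.readyState === WebSocket.OPEN) {
        peer.send(JSON.stringify(msg));
      }
    }
  });
});
```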
While it is still using Google’s STUN server to get through NAT, the messaging for setting up the connection is running completely through the Web Socket server. The messages that get exchanged are plain SDP message packets with a session ID. There are OFFER, ANSWER, and OK packets exchanged for each streaming direction; the original post includes a screenshot of this exchange.
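Continuing the earlier sketch, the browser side of this handshake (steps 3 to 5, on the calling side) could look roughly like the following; the WebSocket URL and the { type, sdp, session } message shape are assumptions rather than the post's exact protocol, and modern unprefixed APIs stand in for the 2012 experimental PeerConnection object:

```typescript
const remoteVideo = document.querySelector<HTMLVideoElement>('#remote')!;
const signalling = new WebSocket('wss://signalling.example/ws'); // hypothetical server URL
const pc = new RTCPeerConnection({
  iceServers: [{ urls: 'stun:stun.l.google.com:19302' }], // Google's public STUN server
});

async function call(localStream: MediaStream, session: string): Promise<void> {
  // Step 4: send the local camera stream on the peer connection.
  localStream.getTracks().forEach((track) => pc.addTrack(track, localStream));
  // Step 5: accept the remote camera stream into the second video element.
  pc.ontrack = (e) => { remoteVideo.srcObject = e.streams[0]; };
  // ICE candidates travel over the same Web Socket as the SDP packets.
  pc.onicecandidate = (e) => {
    if (e.candidate) {
      signalling.send(JSON.stringify({ type: 'CANDIDATE', candidate: e.candidate, session }));
    }
  };
  // Step 3: the OFFER goes to the signalling server, which relays it to the other party.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  signalling.send(JSON.stringify({ type: 'OFFER', sdp: offer.sdp, session }));
}

signalling.onmessage = async (event) => {
  const msg = JSON.parse(event.data);
  if (msg.type === 'ANSWER') {
    await pc.setRemoteDescription({ type: 'answer', sdp: msg.sdp });
  } else if (msg.type === 'CANDIDATE') {
    await pc.addIceCandidate(msg.candidate);
  }
};
// The answering side (setRemoteDescription on the OFFER, then createAnswer) is symmetrical
// and omitted for brevity.
```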
I’m not running a public WebSocket server, so you won’t be able to see this part of the presentation working. But the local loopback video should work.
At the conference, it all went without a hitch (while the wireless played along). I believe you have to host the WebSocket server on the same machine as the Web page, otherwise it won’t work for security reasons.
A whole new world of opportunities lies out there when we get the ability to set up video conferencing on every Web page – scary and exciting at the same time!
-
FFmpeg - mapping 4 audio channels to 1 audio track
18 July 2016, by Avalon
I have two QuickTime MOVs that I want to concatenate using FFmpeg, but I am having trouble understanding how to map the audio channels.
First MOV has 2 channels, Front Left and Front Right. Second MOV has 4 channels, Front Left, Front Right, Side Left and Side Right.
How do I create one audio track with 4 channels mapped as FL, FR, SL and SR? (A possible reworked command is sketched after the console output below.)
MediaInfo reports the following (not the desired result):
Audio #1
ID : 2
Format : AAC
Format/Info : Advanced Audio Codec
Format profile : LC
Codec ID : 40
Duration : 4mn 35s
Bit rate mode : Variable
Bit rate : 126 Kbps
Maximum bit rate : 160 Kbps
Channel(s) : 2 channels
Channel(s)_Original : 4 channels
Channel positions : Front: L C R, Side: C
Sampling rate : 48.0 KHz
Frame rate : 46.875 fps (1024 spf)
Compression mode : Lossy
Stream size : 4.14 MiB (12%)
Default : Yes
Alternate group : 1
Audio #2
ID : 3
Format : AAC
Format/Info : Advanced Audio Codec
Format profile : LC
Codec ID : 40
Duration : 4mn 35s
Bit rate mode : Variable
Bit rate : 127 Kbps
Maximum bit rate : 160 Kbps
Channel(s) : 2 channels
Channel(s)_Original : 4 channels
Channel positions : Front: L C R, Side: C
Sampling rate : 48.0 KHz
Frame rate : 46.875 fps (1024 spf)
Compression mode : Lossy
Stream size : 4.18 MiB (12%)
Default : No
Alternate group : 1
Audio #3
ID : 4
Format : AAC
Format/Info : Advanced Audio Codec
Format profile : LC
Codec ID : 40
Duration : 4mn 35s
Bit rate mode : Variable
Bit rate : 110 Kbps
Maximum bit rate : 160 Kbps
Channel(s) : 2 channels
Channel(s)_Original : 4 channels
Channel positions : Front: L C R, Side: C
Sampling rate : 48.0 KHz
Frame rate : 46.875 fps (1024 spf)
Compression mode : Lossy
Stream size : 3.62 MiB (10%)
Default : No
Alternate group : 1
Audio #4
ID : 5
Format : AAC
Format/Info : Advanced Audio Codec
Format profile : LC
Codec ID : 40
Duration : 4mn 35s
Bit rate mode : Variable
Bit rate : 110 Kbps
Maximum bit rate : 160 Kbps
Channel(s) : 2 channels
Channel(s)_Original : 4 channels
Channel positions : Front: L C R, Side: C
Sampling rate : 48.0 KHz
Frame rate : 46.875 fps (1024 spf)
Compression mode : Lossy
Stream size : 3.61 MiB (10%)
Default : No
Alternate group : 1
The FFmpeg command is as follows:
`ffmpeg -i "2chan.mov" -i "4chan.mov" -filter_complex "[0:v] [0:a] [1:v] [1:a] concat=n=2:v=1:a=1 [v] [a]; [v]scale=-1:288[v2]; [a]channelsplit=channel_layout=quad(side)[FL][FR][SL][SR]" -map "[v2]" -map "[FL]" -map "[FR]" -map "[SL]" -map "[SR]" -c:v libx264 -pix_fmt yuv420p -b:v 700k -minrate 700k -maxrate 700k -bufsize 700k -r 25 -sc_threshold 25 -keyint_min 25 -g 25 -qmin 3 -qmax 51 -threads 8 -c:a aac -strict -2 -b:a 160k -ar 48000 -async 1 -ac 4 combined.mp4`
Console output:
ffmpeg version N-77883-gd7c75a5 Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 5.2.0 (GCC)
configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-av
isynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enab
le-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --
enable-libdcadec --enable-libfreetype --enable-libgme --enable-libgsm --enable-l
ibilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enab
le-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --en
able-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --ena
ble-libtwolame --enable-libvidstab --enable-libvo-aacenc --enable-libvo-amrwbenc
--enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enabl
e-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --
enable-lzma --enable-decklink --enable-zlib
libavutil 55. 13.100 / 55. 13.100
libavcodec 57. 22.100 / 57. 22.100
libavformat 57. 21.101 / 57. 21.101
libavdevice 57. 0.100 / 57. 0.100
libavfilter 6. 25.100 / 6. 25.100
libswscale 4. 0.100 / 4. 0.100
libswresample 2. 0.101 / 2. 0.101
libpostproc 54. 0.100 / 54. 0.100
[mov,mp4,m4a,3gp,3g2,mj2 @ 0000000000575c00] ignoring 'frma' atom of 'mp4a', str
eam format is 'mp4a'
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '2chan.mov':
Metadata:
major_brand : qt
minor_version : 537199360
compatible_brands: qt
creation_time : 2016-01-19 05:48:38
Duration: 00:00:45.00, start: 0.000000, bitrate: 364 kb/s
Stream #0:0(eng): Video: h264 (Main) (avc1 / 0x31637661), yuv420p(tv, smpte1
70m/smpte170m/bt709), 768x576, 196 kb/s, SAR 1:1 DAR 4:3, 25 fps, 25 tbr, 25 tbn
, 50 tbc (default)
Metadata:
creation_time : 2016-01-19 05:48:40
handler_name : Apple Alias Data Handler
encoder : H.264
timecode : 00:00:00:00
Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, flt
p, 159 kb/s (default)
Metadata:
creation_time : 2016-01-19 05:48:42
handler_name : Apple Alias Data Handler
timecode : 00:00:00:00
Stream #0:2(eng): Data: none (tmcd / 0x64636D74) (default)
Metadata:
creation_time : 2016-01-19 05:49:42
handler_name : Apple Alias Data Handler
timecode : 00:00:00:00
[mov,mp4,m4a,3gp,3g2,mj2 @ 00000000005da420] ignoring 'frma' atom of 'mp4a', str
eam format is 'mp4a'
Input #1, mov,mp4,m4a,3gp,3g2,mj2, from '4chan.mov':
Metadata:
major_brand : qt
minor_version : 537199360
compatible_brands: qt
creation_time : 2016-01-19 04:11:52
Duration: 00:19:58.00, start: 0.000000, bitrate: 5118 kb/s
Stream #1:0(eng): Video: h264 (Main) (avc1 / 0x31637661), yuv420p(tv, smpte1
70m/smpte170m/bt709), 768x576, 4955 kb/s, 25 fps, 25 tbr, 25k tbn, 50k tbc (defa
ult)
Metadata:
creation_time : 2016-01-19 04:11:52
handler_name : Apple Alias Data Handler
encoder : H.264
timecode : 00:28:33:21
Stream #1:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, quad, fltp,
157 kb/s (default)
Metadata:
creation_time : 2016-01-19 04:11:52
handler_name : Apple Alias Data Handler
Stream #1:2(eng): Data: none (tmcd / 0x64636D74), 0 kb/s
Metadata:
rotate : 0
creation_time : 2016-01-19 04:11:52
handler_name : Apple Alias Data Handler
timecode : 00:28:33:21
File 'Output_Complex_6.mp4' already exists. Overwrite ? [y/N] y
-async is forwarded to lavfi similarly to -af aresample=async=1:min_hard_comp=0.
100000:first_pts=0.
Last message repeated 1 times
[libx264 @ 000000000057d260] using SAR=1/1
[libx264 @ 000000000057d260] using cpu capabilities: MMX2 SSE2Fast SSSE3 Cache64
SlowShuffle
[libx264 @ 000000000057d260] profile High, level 2.1
[libx264 @ 000000000057d260] 264 - core 148 r2638 7599210 - H.264/MPEG-4 AVC cod
ec - Copyleft 2003-2015 - http://www.videolan.org/x264.html - options: cabac=1 r
ef=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed
_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pski
p=1 chroma_qp_offset=-2 threads=8 lookahead_threads=1 sliced_threads=0 nr=0 deci
mate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_
adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=25 keyint_min=13
scenecut=25 intra_refresh=0 rc_lookahead=25 rc=cbr mbtree=1 bitrate=700 ratetol
=1.0 qcomp=0.60 qpmin=3 qpmax=51 qpstep=4 vbv_maxrate=700 vbv_bufsize=700 nal_hr
d=none filler=0 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to 'Output_Complex_6.mp4':
Metadata:
major_brand : qt
minor_version : 537199360
compatible_brands: qt
title : TestTitle
encoder : Lavf57.21.101
Stream #0:0: Video: h264 (libx264) ([33][0][0][0] / 0x0021), yuv420p, 384x28
8 [SAR 1:1 DAR 4:3], q=3-51, 700 kb/s, 25 fps, 12800 tbn, 25 tbc (default)
Metadata:
encoder : Lavc57.22.100 libx264
Side data:
unknown side data type 10 (24 bytes)
Stream #0:1: Audio: aac (LC) ([64][0][0][0] / 0x0040), 48000 Hz, 4.0, fltp,
160 kb/s
Metadata:
encoder : Lavc57.22.100 aac
Stream #0:2: Audio: aac (LC) ([64][0][0][0] / 0x0040), 48000 Hz, 4.0, fltp,
160 kb/s
Metadata:
encoder : Lavc57.22.100 aac
Stream #0:3: Audio: aac (LC) ([64][0][0][0] / 0x0040), 48000 Hz, 4.0, fltp,
160 kb/s
Metadata:
encoder : Lavc57.22.100 aac
Stream #0:4: Audio: aac (LC) ([64][0][0][0] / 0x0040), 48000 Hz, 4.0, fltp,
160 kb/s
Metadata:
encoder : Lavc57.22.100 aac
Stream mapping:
Stream #0:0 (h264) -> concat:in0:v0
Stream #0:1 (aac) -> concat:in0:a0
Stream #1:0 (h264) -> concat:in1:v0
Stream #1:1 (aac) -> concat:in1:a0
scale -> Stream #0:0 (libx264)
channelsplit:FL -> Stream #0:1 (aac)
channelsplit:FR -> Stream #0:2 (aac)
channelsplit:SL -> Stream #0:3 (aac)
channelsplit:SR -> Stream #0:4 (aac)
Press [q] to stop, [?] for help
frame= 85 fps=0.0 q=3.0 size= 6kB time=00:00:02.68 bitrate= 18.8kbits/s
frame= 150 fps=148 q=3.0 size= 14kB time=00:00:05.24 bitrate= 21.3kbits/s
frame= 198 fps=130 q=3.0 size= 65kB time=00:00:07.21 bitrate= 74.4kbits/s
frame= 238 fps=118 q=3.0 size= 125kB time=00:00:08.78 bitrate= 116.2kbits/s
frame= 271 fps=108 q=3.0 size= 171kB time=00:00:10.09 bitrate= 138.9kbits/s
frame= 304 fps=100 q=3.0 size= 218kB time=00:00:11.45 bitrate= 156.0kbits/s
frame= 339 fps= 96 q=3.0 size= 268kB time=00:00:12.84 bitrate= 171.0kbits/s
frame= 372 fps= 93 q=-1.0 Lsize= 376kB time=00:00:14.80 bitrate= 208.0kbits
/s speed=3.69x
video:27kB audio:319kB subtitle:0kB other streams:0kB global headers:0kB muxing
overhead: 8.551113%
[libx264 @ 000000000057d260] frame I:15 Avg QP: 3.01 size: 1370
[libx264 @ 000000000057d260] frame P:90 Avg QP: 3.00 size: 24
[libx264 @ 000000000057d260] frame B:267 Avg QP: 3.00 size: 17
[libx264 @ 000000000057d260] consecutive B-frames: 4.3% 0.0% 0.0% 95.7%
[libx264 @ 000000000057d260] mb I I16..4: 95.8% 0.0% 4.2%
[libx264 @ 000000000057d260] mb P I16..4: 0.0% 0.0% 0.0% P16..4: 0.1% 0.0
% 0.0% 0.0% 0.0% skip:99.9%
[libx264 @ 000000000057d260] mb B I16..4: 0.0% 0.0% 0.0% B16..8: 0.0% 0.0
% 0.0% direct: 0.0% skip:100.0% L0: 0.0% L1:100.0% BI: 0.0%
[libx264 @ 000000000057d260] 8x8 transform intra:0.0% inter:50.0%
[libx264 @ 000000000057d260] coded y,uvDC,uvAC intra: 3.1% 8.8% 8.2% inter: 0.0%
0.0% 0.0%
[libx264 @ 000000000057d260] i16 v,h,dc,p: 90% 5% 5% 0%
[libx264 @ 000000000057d260] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 46% 29% 23% 0% 0%
0% 0% 0% 0%
[libx264 @ 000000000057d260] i8c dc,h,v,p: 76% 7% 17% 0%
[libx264 @ 000000000057d260] Weighted P-Frames: Y:0.0% UV:0.0%
[libx264 @ 000000000057d260] kb/s:14.64
[aac @ 0000000000578320] Qavg: 65394.652
[aac @ 0000000000593020] Qavg: 65473.359
[aac @ 0000000000593940] Qavg: 65536.000
[aac @ 0000000000594260] Qavg: 65536.000
Exiting normally, received signal 2.
Terminate batch job (Y/N)? Y
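Not a verified answer, just a sketch of one possible direction for getting a single 4-channel track: instead of splitting the concatenated audio into four streams, the stereo segment could first be lifted to the quad(side) layout with the pan filter, so that concat emits one quad stream which is then mapped once. File names and most encoder settings are carried over from the question's command, and the exact filter syntax is untested:
`ffmpeg -i "2chan.mov" -i "4chan.mov" -filter_complex "[0:a]pan=quad(side)|FL=FL|FR=FR[a0]; [0:v][a0][1:v][1:a]concat=n=2:v=1:a=1[v][a]; [v]scale=-1:288[v2]" -map "[v2]" -map "[a]" -c:v libx264 -pix_fmt yuv420p -b:v 700k -minrate 700k -maxrate 700k -bufsize 700k -r 25 -c:a aac -strict -2 -b:a 160k -ar 48000 combined.mp4`
With the pan step, both concat segments share the quad(side) layout, and the single output stream keeps the FL, FR, SL, SR ordering instead of being spread over four separate tracks.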