
Media (1)
-
The conservation of net art in the museum: the strategies at work
26 May 2011
Updated: July 2013
Language: French
Type: Text
Other articles (90)
-
Use it, talk about it, critique it
10 April 2011
The first thing to do is to talk about it, either directly with the people involved in its development, or with those around you in order to convince new people to use it.
The larger the community, the faster the software will evolve ...
A discussion mailing list is available for any exchange between users. -
Mediabox: opening images in the maximum space available to the user
8 February 2011
Image display is constrained by the width allowed by the site design (which depends on the theme in use), so images are shown at a reduced size. To take advantage of all the space available on the user's screen, a feature can be added that displays the image in a multimedia box overlaid on top of the rest of the content.
To do this, you need to install the "Mediabox" plugin.
Configuring the multimedia box
As soon as (...) -
Authorizations overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors are able to modify their own information on the authors page
On other sites (9831)
-
How should I add a transparent watermark.png over my RTMP h264 stream with ffmpeg?
16 June 2013, by RoelandP
I have a Raspberry Pi with the new camera module hooked up to (in this case) Bambuser. You can see the stream here; it's from a windmill in The Netherlands (the camera position will be better within a few weeks).
I successfully have the stream running, but now I want to add an image (an alpha-transparent PNG) on top of the input stream that is piped to ffmpeg and streamed to Bambuser.
I currently use the following command (user-specific details wiped out) to successfully stream the input from the Raspberry camera module (it's great, HD and all, hardware rendering) to Bambuser, following the great tutorial by Slickstreamer:
raspivid -t 999999999 -w 960 -h 540 -fps 25 -b 500000 -o - | ffmpeg -i - -vcodec copy -an -metadata title="STREAM NAME" -f flv rtmp://USER_X.fme.bambuser.com/b-fme/USER_STREAM_KEY_X
I followed the ffmpeg docs, and it seems to me I should use the '-vf' option to apply the 'movie' filter, like so:
raspivid -t 999999999 -w 960 -h 540 -fps 25 -b 500000 -o - | ffmpeg -i - -vf "movie='/home/USER/watermark.png' [logo]; [in][logo] overlay=main_w-overlay_w-10:10 [out]" -vcodec copy -an -metadata title="STREAM NAME" -f flv rtmp://USER_X.fme.bambuser.com/b-fme/USER_STREAM_KEY_X
and various other -vf filters, like '-vf vflip' or '-vf mandelbrot'. But it doesn't seem to work: the stream just shows the direct input from the Raspberry camera.
This is the output when started with the following -vf command:
raspivid -t 999999999 -w 960 -h 540 -fps 25 -b 500000 -o - | ffmpeg -i - -vcodec copy -vf 'movie=0:png:/home/USER/watermark.png [watermark];[in] [watermark]overlay=0:0:1[out]' -an -metadata title="STREAM NAME" -f flv rtmp://USER_X.fme.bambuser.com/b-fme/USER_STREAM_KEY_X
ffmpeg version N-54036-g6c4516d Copyright (c) 2000-2013 the FFmpeg developers
built on Jun 15 2013 XX:XX with gcc 4.6 (Debian 4.6.3-14+rpi1)
configuration:
libavutil 52. 35.101 / 52. 35.101
libavcodec 55. 16.100 / 55. 16.100
libavformat 55. 8.102 / 55. 8.102
libavdevice 55. 2.100 / 55. 2.100
libavfilter 3. 77.101 / 3. 77.101
libswscale 2. 3.100 / 2. 3.100
libswresample 0. 17.102 / 0. 17.102
[h264 @ 0x1917cc0] max_analyze_duration 5000000 reached at 5000000 microseconds
Input #0, h264, from 'pipe:':
Duration: N/A, bitrate: N/A
Stream #0:0: Video: h264 (High), yuv420p, 960x540, 25 fps, 25 tbr, 1200k tbn, 50 tbc
Output #0, flv, to 'rtmp://USER_X.fme.bambuser.com/b-fme/USER_STREAM_KEY_X':
Metadata:
title : STREAM NAME
encoder : Lavf55.8.102
Stream #0:0: Video: h264 ([7][0][0][0] / 0x0007), yuv420p, 960x540, q=2-31, 25 fps, 1k tbn, 1200k tbc
Stream mapping:
Stream #0:0 -> #0:0 (copy)
frame= 2344 fps= 27 q=-1.0 size= 4827kB time=00:01:33.72 bitrate= 421.9kbits/s
As mentioned above, other -vf filters also don't seem to be applied to the output stream on Bambuser, so I think I'm doing something fundamentally wrong here.
- Should I map the raspivid stream and overlay the image 'watermark.png' on top of that? Would that be the solution? Does anyone have experience with this? (A possible cause and fix are sketched after this question.)
Thank you very much for your thoughts in advance.
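For reference, a likely cause is that -vcodec copy passes the compressed H.264 stream through untouched, so any -vf or -filter_complex graph is never applied; overlaying a watermark requires re-encoding the video. A minimal sketch, assuming the Pi's ffmpeg build includes libx264 and that the CPU can keep up with software encoding at this resolution (both unverified assumptions; the watermark path and stream URL are reused from the question):
raspivid -t 999999999 -w 960 -h 540 -fps 25 -b 500000 -o - | ffmpeg -i - -i /home/USER/watermark.png -filter_complex "overlay=main_w-overlay_w-10:10" -vcodec libx264 -preset ultrafast -b:v 500k -an -metadata title="STREAM NAME" -f flv rtmp://USER_X.fme.bambuser.com/b-fme/USER_STREAM_KEY_X
The encoder settings here are only placeholders; the key point is that the overlay filter and stream copy cannot be combined.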
-
Extracting subclip from a .webm file with ffmpeg [migrated]
19 June 2013, by user815423426
I am trying to extract a subclip from a webm file and write it to an mp4 file. I have the following command line:
ffmpeg -ss 560 -i input.webm -ss 20 -t 46 -acodec copy -vcodec copy output.mp4
but I get the following error:
ffmpeg version 1.2 Copyright (c) 2000-2013 the FFmpeg developers
built on Mar 22 2013 10:42:11 with gcc 4.7.2 (GCC)
configuration: --prefix=/path/to/installations --enable-shared --enable-gpl --enable-nonfree --enable-version3 --enable-libx264
libavutil 52. 18.100 / 52. 18.100
libavcodec 54. 92.100 / 54. 92.100
libavformat 54. 63.104 / 54. 63.104
libavdevice 54. 3.103 / 54. 3.103
libavfilter 3. 42.103 / 3. 42.103
libswscale 2. 2.100 / 2. 2.100
libswresample 0. 17.102 / 0. 17.102
libpostproc 52. 2.100 / 52. 2.100
Input #0, matroska,webm, from '/path/to/input.webm':
Duration: 01:13:20.32, start: 0.000000, bitrate: 2006 kb/s
Stream #0:0(eng): Video: vp8, yuv420p, 640x480, SAR 1:1 DAR 4:3, 30 fps, 30 tbr, 1k tbn, 1k tbc (default)
Stream #0:1(eng): Audio: vorbis, 44100 Hz, stereo, fltp (default)
[mp4 @ 0x1493c340] track 0: could not find tag, codec not currently supported in container
Output #0, mp4, to '/path/to/output.mp4':
Metadata:
encoder : Lavf54.63.104
Stream #0:0(eng): Video: vp8, yuv420p, 640x480 [SAR 1:1 DAR 4:3], q=2-31, 30 fps, 90k tbn, 1k tbc (default)
Stream #0:1(eng): Audio: vorbis ([221][0][0][0] / 0x00DD), 44100 Hz, stereo (default)
Stream mapping:
Stream #0:0 -> #0:0 (copy)
Stream #0:1 -> #0:1 (copy)
Could not write header for output file #0 (incorrect codec parameters ?): Operation not permitted
The strange thing is that the following works well for extracting from mp4 to mp4:
ffmpeg -ss 560 -i input.mp4 -ss 20 -t 46 -acodec copy -vcodec copy output.mp4
(A possible workaround is sketched below.)
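For what it's worth, the error suggests that this ffmpeg build cannot mux VP8/Vorbis into an MP4 container, which would explain why the mp4-to-mp4 variant works while the webm-to-mp4 one fails. Two hedged sketches, reusing the filenames from the question (the encoder choices are assumptions, not taken from the original post): either keep the stream copy and let the output container match the codecs, or re-encode to MP4-friendly codecs.
ffmpeg -ss 560 -i input.webm -ss 20 -t 46 -acodec copy -vcodec copy output.webm
ffmpeg -ss 560 -i input.webm -ss 20 -t 46 -vcodec libx264 -acodec aac -strict experimental output.mp4
The first command keeps VP8/Vorbis untouched in a WebM container; the second assumes libx264 is available (the configuration line indicates it is) and uses the native AAC encoder, which still required -strict experimental in ffmpeg 1.2.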
-
Making Sure The PNG Gets There
14 June 2013, by Multimedia Mike — General
Rewind to 1999. I was developing an HTTP-based remote management interface for an embedded device. The device sat on an ethernet LAN and you could point a web browser at it. The pitch was to transmit an image of the device's touch screen, and the user could click on the picture to interact with the device. So we needed an image format. If you were computing at the time, you know that the web was insufferably limited back then. Our choice basically came down to GIF and JPEG. Being the office's annoying free software zealot, I was championing a little-known, up-and-coming format named PNG.
So the challenge was to create our own PNG encoder (incorporating a library like libpng wasn’t an option for this platform). I seem to remember being annoyed at having to implement an integrity check (CRC) for the PNG encoder. It’s part of the PNG spec, after all. It just seemed so redundant. At the time, I reasoned that there were 5 layers of integrity validation in play.
I don't know why, but I was reflecting on this episode recently and decided to revisit it. Here are all the encapsulation layers of a PNG file when flung over an ethernet network: the zlib/DEFLATE-compressed image data, the PNG chunk with its CRC, the TCP segment with its checksum, the IP packet, and finally the ethernet frame with its CRC.
So there are up to 5 encapsulations for the data in this situation. At the innermost level is the image data which is compressed with the zlib DEFLATE method. At first, I thought that this also had a CRC or checksum. However, in researching this post, I couldn’t find any evidence of such an integrity check. Further, I don’t think we bothered to compress the PNG data in this project long ago. It was a small image, monochrome, and transferring via LAN, so the encoder could get away with signaling uncompressed data.
The graphical data gets wrapped up in a PNG chunk and all PNG chunks have a CRC. To transmit via the network, it goes into a TCP frame, which also has a checksum. That goes into an IP packet. I previously believed that this represented another integrity check. While an IP frame does have a checksum, the checksum only covers the IP header and not the payload. So that doesn’t really count towards this goal.
Finally, the data gets encapsulated into an ethernet frame which has — you guessed it — a CRC.
I see that other link layer protocols like PPP and wireless ethernet (802.11) also feature frame CRCs. So I guess what I’m saying is that, if you transfer a PNG file over the network, you can be confident that the data will be free of any errors.