
Other articles (100)
-
The regular Cron tasks of the farm
1 December 2010
Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of every instance in the farm on a regular basis. Coupled with a system Cron on the central site of the farm, this generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...)
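As a purely illustrative sketch (the URL is a placeholder; the actual ping target for a farm is site-specific and not given in this teaser), such a "system Cron on the central site" can be a single crontab entry that visits the central site every minute, letting the super Cron then wake up each farm member:
* * * * * curl -s http://central-farm-site.example/ > /dev/null 2>&1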
-
MediaSPIP 0.1 Beta version
25 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all the software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)
-
Sites built with MediaSPIP
2 May 2011
This page presents some of the sites running MediaSPIP.
You can of course add your own using the form at the bottom of the page.
On other sites (8505)
-
How should I add a transparent watermark.png over my RTMP h264 stream with ffmpeg?
16 June 2013, by RoelandP
I have a Raspberry Pi with the new camera module hooked up to (in this case) Bambuser. You can see the stream here; it is from a windmill in The Netherlands (the camera position will be improved within a few weeks).
I successfully have the stream running, but now I want to add an image (an alpha-transparent PNG) on top of the input stream that is piped to ffmpeg and streamed to Bambuser.
I currently use the following command (user-specific details wiped out) to successfully stream the input from the Raspberry Pi camera module (it's great, HD and all, hardware encoding) to Bambuser, following the great tutorial by Slickstreamer:
raspivid -t 999999999 -w 960 -h 540 -fps 25 -b 500000 -o - | ffmpeg -i - -vcodec copy -an -metadata title="STREAM NAME" -f flv rtmp://USER_X.fme.bambuser.com/b-fme/USER_STREAM_KEY_X
I followed the ffmpeg docs, and it seems to me I should use the '-vf' option to apply the 'movie' filter, like so:
raspivid -t 999999999 -w 960 -h 540 -fps 25 -b 500000 -o - | ffmpeg -i - -vf "movie='/home/USER/watermark.png' [logo]; [in][logo] overlay=main_w-overlay_w-10:10 [out]" -vcodec copy -an -metadata title="STREAM NAME" -f flv rtmp://USER_X.fme.bambuser.com/b-fme/USER_STREAM_KEY_X
and various other -vf filters, like '-vf vflip' or '-vf mandelbrot'. But it doesn't seem to work: the stream just shows the direct input from the Raspberry Pi camera.
This is the output when started with the following -vf argument:
raspivid -t 999999999 -w 960 -h 540 -fps 25 -b 500000 -o - | ffmpeg -i - -vcodec copy -vf 'movie=0:png:/home/USER/watermark.png [watermark];[in] [watermark]overlay=0:0:1[out]' -an -metadata title="STREAM NAME" -f flv rtmp://USER_X.fme.bambuser.com/b-fme/USER_STREAM_KEY_X
ffmpeg version N-54036-g6c4516d Copyright (c) 2000-2013 the FFmpeg developers
  built on Jun 15 2013 XX:XX with gcc 4.6 (Debian 4.6.3-14+rpi1)
  configuration:
  libavutil      52. 35.101 / 52. 35.101
  libavcodec     55. 16.100 / 55. 16.100
  libavformat    55.  8.102 / 55.  8.102
  libavdevice    55.  2.100 / 55.  2.100
  libavfilter     3. 77.101 /  3. 77.101
  libswscale      2.  3.100 /  2.  3.100
  libswresample   0. 17.102 /  0. 17.102
[h264 @ 0x1917cc0] max_analyze_duration 5000000 reached at 5000000 microseconds
Input #0, h264, from 'pipe:':
  Duration: N/A, bitrate: N/A
    Stream #0:0: Video: h264 (High), yuv420p, 960x540, 25 fps, 25 tbr, 1200k tbn, 50 tbc
Output #0, flv, to 'rtmp://USER_X.fme.bambuser.com/b-fme/USER_STREAM_KEY_X':
  Metadata:
    title           : STREAM NAME
    encoder         : Lavf55.8.102
    Stream #0:0: Video: h264 ([7][0][0][0] / 0x0007), yuv420p, 960x540, q=2-31, 25 fps, 1k tbn, 1200k tbc
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
frame= 2344 fps= 27 q=-1.0 size=    4827kB time=00:01:33.72 bitrate= 421.9kbits/s
As mentioned above, other -vf filters also don't seem to apply to the output stream on Bambuser, so I think I am doing something fundamentally wrong here.
- Should I map the raspivid stream and then map the image 'watermark.png' on top of that? Would that be the solution? Does anyone have experience with this?
Thank you very much for your thoughts in advance.
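For what it's worth, the stream mapping in the log above shows '(copy)', and filters are never applied to a stream-copied video, so the overlay cannot take effect while -vcodec copy is used. A rough sketch of a re-encoding variant, assuming the Pi's ffmpeg build includes libx264 and that the Pi can keep up with software encoding at this resolution:
raspivid -t 999999999 -w 960 -h 540 -fps 25 -b 500000 -o - | ffmpeg -i - -i /home/USER/watermark.png -filter_complex "overlay=main_w-overlay_w-10:10" -vcodec libx264 -preset ultrafast -b:v 500k -an -metadata title="STREAM NAME" -f flv rtmp://USER_X.fme.bambuser.com/b-fme/USER_STREAM_KEY_X
Here the PNG is given as a second input, so -filter_complex replaces the 'movie=' source filter used above.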
-
Extracting subclip from a .webm file with ffmpeg [migrated]
19 June 2013, by user815423426
I am trying to extract a subclip from a webm file and write it to an mp4 file. I have the following command line:
ffmpeg -ss 560 -i input.webm -ss 20 -t 46 -acodec copy -vcodec copy output.mp4
but I get the following error:
ffmpeg version 1.2 Copyright (c) 2000-2013 the FFmpeg developers
  built on Mar 22 2013 10:42:11 with gcc 4.7.2 (GCC)
  configuration: --prefix=/path/to/installations --enable-shared --enable-gpl --enable-nonfree --enable-version3 --enable-libx264
  libavutil      52. 18.100 / 52. 18.100
  libavcodec     54. 92.100 / 54. 92.100
  libavformat    54. 63.104 / 54. 63.104
  libavdevice    54.  3.103 / 54.  3.103
  libavfilter     3. 42.103 /  3. 42.103
  libswscale      2.  2.100 /  2.  2.100
  libswresample   0. 17.102 /  0. 17.102
  libpostproc    52.  2.100 / 52.  2.100
Input #0, matroska,webm, from '/path/to/input.webm':
  Duration: 01:13:20.32, start: 0.000000, bitrate: 2006 kb/s
    Stream #0:0(eng): Video: vp8, yuv420p, 640x480, SAR 1:1 DAR 4:3, 30 fps, 30 tbr, 1k tbn, 1k tbc (default)
    Stream #0:1(eng): Audio: vorbis, 44100 Hz, stereo, fltp (default)
[mp4 @ 0x1493c340] track 0: could not find tag, codec not currently supported in container
Output #0, mp4, to '/path/to/output.mp4':
  Metadata:
    encoder         : Lavf54.63.104
    Stream #0:0(eng): Video: vp8, yuv420p, 640x480 [SAR 1:1 DAR 4:3], q=2-31, 30 fps, 90k tbn, 1k tbc (default)
    Stream #0:1(eng): Audio: vorbis ([221][0][0][0] / 0x00DD), 44100 Hz, stereo (default)
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
  Stream #0:1 -> #0:1 (copy)
Could not write header for output file #0 (incorrect codec parameters ?): Operation not permitted
The strange thing is that the following works well for extracting a clip from mp4 to mp4:
ffmpeg -ss 560 -i input.mp4 -ss 20 -t 46 -acodec copy -vcodec copy output.mp4
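For reference, the '[mp4 @ ...] track 0: could not find tag' line means this mp4 muxer has no tag for VP8/Vorbis, so those streams cannot simply be copied into an mp4 container. Two rough sketches of workarounds (the re-encode variant relies on the libx264 shown in the build configuration above, plus that era's experimental AAC encoder):
# cut without re-encoding, into a container that supports VP8/Vorbis:
ffmpeg -ss 560 -i input.webm -ss 20 -t 46 -acodec copy -vcodec copy output.webm
# or re-encode into codecs the mp4 container supports:
ffmpeg -ss 560 -i input.webm -ss 20 -t 46 -vcodec libx264 -acodec aac -strict experimental output.mp4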
-
Making Sure The PNG Gets There
14 June 2013, by Multimedia Mike — General
Rewind to 1999. I was developing an HTTP-based remote management interface for an embedded device. The device sat on an ethernet LAN and you could point a web browser at it. The pitch was to transmit an image of the device's touch screen, and the user could click on the picture to interact with the device. So we needed an image format. If you were computing at the time, you know that the web was insufferably limited back then. Our choice basically came down to GIF and JPEG. Being the office's annoying free-software zealot, I was championing a little-known, up-and-coming format named PNG.
So the challenge was to create our own PNG encoder (incorporating a library like libpng wasn’t an option for this platform). I seem to remember being annoyed at having to implement an integrity check (CRC) for the PNG encoder. It’s part of the PNG spec, after all. It just seemed so redundant. At the time, I reasoned that there were 5 layers of integrity validation in play.
I don't know why, but I was reflecting on this episode recently and decided to revisit it. Here are all the encapsulation layers of a PNG file when flung over an ethernet network:
So there are up to 5 encapsulations for the data in this situation. At the innermost level is the image data which is compressed with the zlib DEFLATE method. At first, I thought that this also had a CRC or checksum. However, in researching this post, I couldn’t find any evidence of such an integrity check. Further, I don’t think we bothered to compress the PNG data in this project long ago. It was a small image, monochrome, and transferring via LAN, so the encoder could get away with signaling uncompressed data.
The graphical data gets wrapped up in a PNG chunk and all PNG chunks have a CRC. To transmit via the network, it goes into a TCP frame, which also has a checksum. That goes into an IP packet. I previously believed that this represented another integrity check. While an IP frame does have a checksum, the checksum only covers the IP header and not the payload. So that doesn’t really count towards this goal.
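As a concrete illustration of that chunk-level check (a sketch, not the original 1999 encoder code), the per-chunk CRC the PNG spec mandates is the standard CRC-32, computed over the 4-byte chunk type plus the chunk data and stored big-endian right after the data:

#include <stdint.h>
#include <stddef.h>

/* CRC-32 as used by PNG chunks: reflected, polynomial 0xEDB88320,
 * initial value 0xFFFFFFFF, final XOR with 0xFFFFFFFF.
 * Pass in the chunk type bytes followed by the chunk data. */
static uint32_t png_chunk_crc(const uint8_t *type_and_data, size_t len)
{
    static uint32_t table[256];
    static int table_ready = 0;
    uint32_t crc = 0xFFFFFFFFu;

    if (!table_ready) {                      /* build the lookup table once */
        for (uint32_t n = 0; n < 256; n++) {
            uint32_t c = n;
            for (int k = 0; k < 8; k++)
                c = (c & 1) ? 0xEDB88320u ^ (c >> 1) : c >> 1;
            table[n] = c;
        }
        table_ready = 1;
    }
    for (size_t i = 0; i < len; i++)         /* fold in type + data bytes */
        crc = table[(crc ^ type_and_data[i]) & 0xFFu] ^ (crc >> 8);
    return crc ^ 0xFFFFFFFFu;
}

Fed the four bytes 'IEND', this returns 0xAE426082, which is why every PNG file ends with the bytes AE 42 60 82.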
Finally, the data gets encapsulated into an ethernet frame which has — you guessed it — a CRC.
I see that other link layer protocols like PPP and wireless ethernet (802.11) also feature frame CRCs. So I guess what I’m saying is that, if you transfer a PNG file over the network, you can be confident that the data will be free of any errors.