
Media (1)
-
1 000 000 (wav version)
26 September 2011, by
Updated: April 2013
Language: English
Type: Audio
Other articles (100)
-
MediaSPIP 0.1 Beta version
25 April 2011, by
MediaSPIP 0.1 beta is the first version of MediaSPIP declared as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...) -
Multilang: improving the interface for multilingual blocks
18 February 2011, by
Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialised.
After it is activated, MediaSPIP init automatically puts a preconfiguration in place so that the new feature is operational straight away. No separate configuration step is therefore required. -
Dressing it up visually
10 April 2011
MediaSPIP is based on a system of themes and skeletons. The skeletons define how information is placed on the page, defining a specific use of the platform, while the themes provide the overall graphic design.
Anyone can propose a new graphic theme or skeleton and make it available to the community.
On other sites (12395)
-
Pipe between two child processes in Node.js ?
4 May 2014, by Joey Morani
I'm trying to capture video using FFmpeg with Node.js and send it to a browser via WebSockets, to be played using the MediaSource API. What I have so far works in Firefox but doesn't decode properly in Chrome. Apparently, from reading this question, I need to use the sample_muxer program to ensure each 'cluster' starts with a keyframe.
Here's the code I'm using:
var ffmpeg = child_process.spawn("ffmpeg", [
    "-y",
    "-r", "30",
    "-f", "dshow",
    "-i", "video=FFsource:audio=Stereo Mix (Realtek High Definition Audio)",
    "-vcodec", "libvpx",
    "-acodec", "libvorbis",
    "-threads", "0",
    "-b:v", "3300k",
    "-keyint_min", "150",
    "-g", "150",
    "-f", "webm",
    "-" // Output to STDOUT
]);

ffmpeg.stdout.on('data', function(data) {
    //socket.send(data); // Just sending the FFmpeg clusters works with Firefox's
                         // implementation of the MediaSource API. No joy with Chrome.

    // - - - This is the part that doesn't work - - -
    var muxer = child_process.spawn("sample_muxer", [
        "-i", data, // This isn't correct...
        "-o", "-"   // Output to STDOUT
    ]);
    muxer.stdout.on('data', function(muxdata) {
        socket.send(muxdata); // Send the cluster
    });
});

ffmpeg.stderr.on('data', function (data) {
    console.log("" + data); // Output to console
});

Obviously I'm not piping it correctly, and I'm unsure how I would while also including the arguments. Appreciate any help getting this working. Thanks!
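For what it's worth, a minimal sketch of how the two processes could be connected with Node's stream piping instead of re-spawning sample_muxer for every chunk. It assumes sample_muxer accepts "-" to mean stdin/stdout, which is an assumption and may not hold:

var child_process = require('child_process');

// Sketch only, not the asker's code: spawn both processes once, then let
// the muxer read FFmpeg's WebM output directly from its own stdin.
var ffmpeg = child_process.spawn("ffmpeg", [ /* same arguments as above */ "-f", "webm", "-" ]);
var muxer  = child_process.spawn("sample_muxer", [ "-i", "-", "-o", "-" ]); // "-" for stdin/stdout is an assumption

// Pipe FFmpeg's stdout straight into the muxer's stdin.
ffmpeg.stdout.pipe(muxer.stdin);

// Forward the re-muxed clusters to the browser as they arrive.
muxer.stdout.on('data', function (muxdata) {
    socket.send(muxdata);
});

ffmpeg.stderr.on('data', function (data) {
    console.log("" + data);
});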
-
swscale : add two spatially stable dithering methods
23 March 2014, by Øyvind Kolås
Both of these dithering methods are from http://pippin.gimp.org/a_dither/. For GIF they can be considered better than Bayer (they provide more gray levels) and they add spatial stability - often giving more than twice as good compression and less visual flicker than error-diffusion methods (they also avoid the error-shadow artifacts of diffusion dithers).
These methods are similar to blue/green-noise type dither masks, but are simple enough to generate their mask on the fly. They are still research work in progress; more expensive-to-generate masks (which can be used in a LUT), like 'void and cluster' and similar methods, will however yield superior results.
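To illustrate the general idea of a dither mask computed on the fly from pixel coordinates (rather than read from a precomputed matrix), here is a rough sketch; the threshold formula and its constants are illustrative only and may not match the expressions used by the commit or by the linked page:

// Sketch only: ordered dithering where the per-pixel threshold ("mask") is
// derived from (x, y) with a little integer arithmetic, so no stored dither
// matrix is needed. The constants below are placeholders for illustration.
function ditherPixel(gray, x, y, levels) {
    // Threshold in [0, 1), computed on the fly from the coordinates.
    var mask = (((x ^ (y * 149)) * 1234) & 511) / 512;
    // Quantize to the requested number of levels, adding the mask before
    // truncating so the quantization error varies smoothly across space.
    var v = (gray / 255) * (levels - 1);
    return Math.min(levels - 1, Math.floor(v + mask));
}
-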
Node.js Webm live stream server: issues with the <video> tag
10 December 2013, by breathe0
I'm using Node.js as a stream server to stream real-time WebM video that is sent by FFmpeg (launched from another application; the stream is sent over HTTP) and received by a web app that uses the <video> tag.
This is what I'm doing: FFmpeg streams the received frames using the following command:
ffmpeg -r 30 -f rawvideo -pix_fmt bgra -s 640x480
       -i \\.\pipe\STREAM_PIPE -r 60
       -f segment -s 240x160 -codec:v libvpx -f webm
       http://my.domain.com/video_stream.webm

(the stream comes from an application that uses the Kinect as its source and communicates with FFmpeg through a pipe, sending one frame after another)
When the web app connects, it immediately receives this response from the server:
HTTP/1.1 200 OK
X-Powered-By: Express
content-type: video/webm
cache-control: private
connection: close
Date: Fri, 06 Dec 2013 14:36:31 GMT

and a WebM header (previously stored on the server, with the same resolution and frame rate as the source stream, and tested as working in VLC) is immediately appended. Then the web app starts to receive the data streamed by FFmpeg. Here is a screenshot of the Mkvinfo GUI showing the fields of the header:
However, even though the Network tab of the Chrome console shows that there is an actual stream of data (meaning that what is streamed is not complete garbage, otherwise the connection would be dropped), the player doesn't display anything. We tried manually prepending our header to the dumped video received by the web app and VLC plays it just fine, but this is not happening with the <video> tag.
What can be causing this problem? Are we missing something about the encoding on the FFmpeg side, or did we store wrong (or insufficient) values in the header?
PS: I cannot rely on an external stream server.
PPS: We tried the following experiments:
- substituting the video header with the one stored on the server makes the video playable in both VLC and the <video> tag;
- if we dump a video that has already started (without a header) and prepend the video header stored on the server, or even its original header, the video is playable in VLC but not in the <video> tag (we are carefully prepending the header just before the beginning of the cluster).
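For context, here is a minimal sketch of the server-side behaviour described above (stored WebM header sent first, then the live FFmpeg data relayed to every connected client). The route, the header file name and the assumption that FFmpeg POSTs its output to the same URL are illustrative, not the asker's actual code:

var express = require('express');
var fs = require('fs');
var app = express();

// Pre-stored EBML/Segment header matching the source's resolution and frame rate (assumed file name).
var headerBytes = fs.readFileSync('header.webm');
var clients = [];

// Browsers point a <video> element at this route.
app.get('/video_stream.webm', function (req, res) {
    res.writeHead(200, { 'Content-Type': 'video/webm', 'Cache-Control': 'private' });
    res.write(headerBytes);   // send the stored header first
    clients.push(res);        // then relay live clusters as they arrive
    req.on('close', function () {
        clients.splice(clients.indexOf(res), 1);
    });
});

// FFmpeg's http:// output arrives here as a POST; relay each chunk to all clients.
app.post('/video_stream.webm', function (req, res) {
    req.on('data', function (chunk) {
        clients.forEach(function (client) { client.write(chunk); });
    });
});

app.listen(80);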