
Media (1)
-
Ogg detection bug
22 March 2013, by
Updated: April 2013
Language: French
Type: Video
Other articles (101)
-
HTML5 audio and video support
13 April 2011, by
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
-
Contribute to translation
13 April 2011
You can help us improve the language used in the software interface to make MediaSPIP more accessible and user-friendly. You can also translate the interface into any language, which allows MediaSPIP to spread to new linguistic communities.
To do this, we use the translation interface of SPIP, where all the language modules of MediaSPIP are available. Just subscribe to the mailing list and request further information on translation.
MediaSPIP is currently available in French and English (...)
-
Installation prerequisites
31 January 2010, by
Preamble
This article is not meant to detail how to install these programs, but rather to provide information on their specific configuration.
First of all, SPIPMotion, like MediaSPIP, is designed to run on Debian-type Linux distributions or their derivatives (Ubuntu...). The documentation on this site therefore refers to these distributions. It can also be used on other Linux distributions, but correct operation cannot be guaranteed.
It (...)
On other sites (7304)
-
ffmpeg for x264 core 136 lib in iOS [on hold]
23 August 2013, by 官承翰
I want to develop a service (written in PHP) to stream videos to mobile platforms (iOS/Android/etc.) and desktop PCs.
I have tried a lot, and found that my iPhone 4 didn't support the core 136 lib based on x264: it would crash or freeze when I tried playing the video.
Here is my question:
Can FFmpeg decode the video so that iOS can play it?
Or is there another way to solve this?
I am new to these things... Thanks in advance
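The question ends there. Purely as an illustration (not an answer from the thread), one commonly suggested route is to re-encode the source with FFmpeg to an H.264 Baseline profile MP4, which older iOS devices such as the iPhone 4 can decode in hardware. A minimal Node.js sketch, where the file names and bitrate are placeholder assumptions:
var child = require('child_process');

// Hypothetical re-encode to an iPhone 4-friendly H.264 Baseline / AAC MP4.
// 'input.mp4' and 'output_ios.mp4' are placeholders, not from the question.
var args = [
  '-i', 'input.mp4',
  '-c:v', 'libx264',
  '-profile:v', 'baseline',   // Baseline profile for older iOS hardware decoders
  '-level', '3.0',
  '-pix_fmt', 'yuv420p',      // widest player compatibility
  '-c:a', 'aac',              // older ffmpeg builds may also need '-strict', 'experimental'
  '-b:a', '128k',
  '-movflags', '+faststart',  // move the index up front so playback can start while downloading
  'output_ios.mp4'
];

var ffmpeg = child.spawn('ffmpeg', args);
ffmpeg.stderr.on('data', function (data) {
  console.log(data.toString()); // ffmpeg reports progress on stderr
});
ffmpeg.on('close', function (code) {
  console.log('ffmpeg exited with code ' + code);
});
For true streaming rather than progressive download, the same encoding settings would normally feed an HLS segmenter, but that is outside this sketch.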
-
How do I close a Node.js FFMPEG child process that is actively streaming from a live capture source?
1 June 2013, by RickZ
I'm new to Node.js and have figured out how to use child.spawn to launch an instance of FFmpeg that captures live video and sends it over to Adobe Media Server via RTMP.
Every example I've seen of FFmpeg being used with Node.js has involved a time-limited sample, so the child process closes once FFmpeg reaches the end of the file it is converting.
In this case, there is no "end of file".
If I instantiate:
var ffmpeg = child.spawn('ffmpeg.exe', [args]);
it creates the live feed.
I have tried immediately shutting the child process down with:
setTimeout(function() {
  ffmpeg.stdin.resume();
  ffmpeg.stdin.write('insert command to echo q to close FFMPEG');
  ffmpeg.stdin.end();
});
However, that does not seem to work. I continue to see my RTMP feed on my test box.
Is there any way to pass FFmpeg a shutdown command via stdin in Node.js?
Thanks in advance!
Rick
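The thread excerpt stops here. Below is a minimal sketch (not from the original post) of two commonly used ways to stop a long-running ffmpeg capture from Node.js; the capture device name and RTMP URL are hypothetical placeholders standing in for the question's args:
var child = require('child_process');

// Placeholder capture arguments (device name and RTMP URL are hypothetical);
// in the question these come from the existing args array.
var args = [
  '-f', 'dshow',                         // Windows live-capture input
  '-i', 'video=USB Camera',
  '-c:v', 'libx264', '-preset', 'veryfast',
  '-f', 'flv', 'rtmp://example.com/live/stream'
];
var ffmpeg = child.spawn('ffmpeg.exe', args);

function stopFfmpeg() {
  // Option 1: ffmpeg polls stdin for keyboard commands; writing 'q'
  // asks it to stop encoding and close the output cleanly.
  ffmpeg.stdin.write('q');
  ffmpeg.stdin.end();

  // Option 2 (POSIX): SIGINT behaves like Ctrl+C and also lets ffmpeg
  // finish its output; on Windows it simply terminates the process.
  // ffmpeg.kill('SIGINT');
}

// Example: stop the live capture after ten seconds.
setTimeout(stopFfmpeg, 10000);
The stdin route only works when ffmpeg was started without -nostdin and with a piped stdin, which child.spawn provides by default; on Windows, where signal support is limited, writing 'q' is usually the more reliable option.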
-
Writing Live-Multimedia-Application using OpenGL & Co. saving output to disc [closed]
21 January 2013, by user1997286
I want to write an application that does the following:
- receive commands via ArtNET (DMX over Ethernet, a control protocol) for each object (called a Layer)
- each Layer can be one of the following: live camera stream, movie, image
- each Layer can be translated, rotated or stretched
- on each Layer I can set filters (like a kaleidoscope effect, blur, color correction, etc.)
- the resulting video stream lives in 3D space
- I want to display each part of the image on one projector (up to 3 in total) using a TripleHead2GO (the 3 projectors each display a different region of my DVI output); each projector image should have its own soft-edge and keystone parameters
- the resulting image will also be shown on a preview screen with an information overlay.
I think all of that should be possible with OpenGL and OpenAL (for the movie audio).
I plan to use C++, OpenGL for graphics, OpenAL for audio, FFmpeg for video conversion if needed, and Ubuntu/Debian as the OS.
The software will be used to run multimedia shows at concerts, including cameras and so on.
All of that should happen live (on a Full HD output) on an i7 3770, a GLX 670 and 16 GB of RAM, for at least 8 Layers (4 live images at once plus some overlays such as the actor's name and some logos).
But now comes the question.
Is it also possible, with this setup, to do the following:
- write the output image, with all the 3D translations, to a movie file (to master a DVD later) together with audio? (see the sketch after this list)
- mix audio from different inputs and files (ambience mics, the signal from the sound mixer, playback from my own application) into more than one mix (e.g. one mix for the recording, one mix for the live output)?
- stream that output, in whole or in part (e.g. the left part of the image), over the network? For example, Projector 1 is near the server, so I connect it via DVI; Projectors 2 and 3 are connected to a computer that receives the streams for those two projectors (with soft edge on each stream); and Screen 4 is outside the concert hall and shows the complete live stream.
- what GUI framework should I use for this?
- would Java perhaps even be performant enough for this?
- is it possible to use the same mechanism for offline rendering only (e.g. I have stored the cut points on disk and saved every single camera stream, so I can fix errors or cut out some parts later)?
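The excerpt ends there. As a purely illustrative sketch of the first point in the list above (not the author's design, and with placeholder sizes and file names), rendered frames can be piped as raw video into an ffmpeg child process that encodes them, together with an audio track, to a file on disk. The Node.js example below only demonstrates the piping pattern; a C++ renderer would do the same by writing its glReadPixels output to ffmpeg's stdin.
var child = require('child_process');

// Placeholder geometry; not taken from the question.
var width = 1920, height = 1080, fps = 30;

var ffmpeg = child.spawn('ffmpeg', [
  '-f', 'rawvideo',                 // uncompressed frames arrive on stdin
  '-pixel_format', 'rgb24',
  '-video_size', width + 'x' + height,
  '-framerate', String(fps),
  '-i', '-',                        // read the video stream from stdin
  '-i', 'mix_recording.wav',        // hypothetical pre-mixed audio track
  '-c:v', 'libx264',
  '-pix_fmt', 'yuv420p',
  '-c:a', 'aac',
  '-shortest',                      // stop when the shorter input ends
  'master.mp4'                      // placeholder output for later DVD authoring
]);

// One black frame as a stand-in for a rendered image (e.g. from glReadPixels).
var frame = Buffer.alloc(width * height * 3);
ffmpeg.stdin.write(frame);
ffmpeg.stdin.end();                 // closing stdin lets ffmpeg finalize the file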