
Media (1)
- Richard Stallman and free software
19 October 2011
Updated: May 2013
Language: French
Type: Text
Other articles (99)
- Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, contact your MediaSPIP administrator to find out.
- Sites built with MediaSPIP
2 May 2011
This page presents some of the sites running MediaSPIP.
You can of course add your own using the form at the bottom of the page.
- Encoding and processing into web-friendly formats
13 April 2011
MediaSPIP automatically converts uploaded files to internet-compatible formats.
Video files are encoded in MP4, Ogv and WebM (supported by HTML5) and MP4 (supported by Flash).
Audio files are encoded in MP3 and Ogg (supported by HTML5) and MP3 (supported by Flash).
Where possible, text is analyzed to retrieve the data needed for search-engine indexing, and then exported as a series of image files.
All uploaded files are stored online in their original format, so you can (...)
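For illustration only (these are not MediaSPIP's actual commands, and the filenames are placeholders), conversions of the kind described above can be done with ffmpeg along these lines:
ffmpeg -i upload.avi -c:v libtheora -q:v 6 -c:a libvorbis media.ogv
ffmpeg -i upload.avi -c:v libvpx -b:v 1M -c:a libvorbis media.webm
ffmpeg -i upload.avi -c:v libx264 -crf 23 -c:a aac media.mp4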
On other sites (9094)
- How to create a video from images with different time intervals?
14 May 2012, by Preet Sandhu
I am struggling with this command:
ffmpeg -loop 1 -r 5 -i img_0.jpg -c:v libx264 -preset slow -tune stillimage -crf 24 -t 20 $frame_target
This creates a 20-second video from a single image. I need to include more images, so I tried this:
ffmpeg -loop 1 -r 5 -i img_%d.jpg -c:v libx264 -preset slow -tune stillimage -crf 24 -t 20 $frame_target
This also creates a 20-second video, but the images cycle through multiple times. Basically I want to divide the time interval so that each image is shown only once.
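One way to make each image appear exactly once is to drop the -loop 1 and set the input frame rate to one frame per desired interval. A sketch, assuming five images img_0.jpg through img_4.jpg and a 4-second interval (giving the same 20-second video):
ffmpeg -framerate 1/4 -i img_%d.jpg -c:v libx264 -preset slow -tune stillimage -crf 24 -r 25 -pix_fmt yuv420p $frame_target
(On older builds, -r 1/4 placed before the -i does the same job; the -r 25 after the input just gives the output a normal playback frame rate.)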
- ffmpeg with OpenGL ES on Android
21 June 2012, by ksharp
This is driving me crazy. I know how to make ffmpeg decode videos and how to make OpenGL display content on Android, but I can never make them work together. FFmpeg keeps decoding frames while OpenGL has its own render loop. How do I make them cooperate and stay in sync? I mean, can OpenGL show every frame right after it is decoded by ffmpeg? Can I just call the decode methods (via JNI) inside OpenGL's onDrawFrame() method? And what about the AudioTrack then?
Thanks!
- Revisiting Nosefart and Discovering GME
30 May 2011, by Multimedia Mike — Game Hacking
I found the following screenshot buried deep in an old directory structure of mine:
I tried to recall how this screenshot came to exist. Had I actually created a functional KDE frontend to Nosefart yet neglected to release it? I think it's more likely that I used some designer tool (possibly KDevelop) to prototype a frontend. This would have been sometime in 2000.
However, this screenshot prompted me to revisit Nosefart.
Nosefart Background
Nosefart is a program that can play Nintendo Sound Format (NSF) files. NSF files contain components that were surgically separated from Nintendo Entertainment System (NES) ROM dumps; these components are the music playback engines for various games. An NSF player is a stripped-down emulation system that can simulate the NES's 6502 CPU along with the custom audio hardware (2 square waves, 1 triangle wave, 1 noise generator, and 1 limited digital channel).

Nosefart was written by Matt Conte and eventually imported into a Sourceforge project, though it has not seen any development since then. The distribution contains standalone command line players for Linux and DOS, a GTK frontend for the Linux command line version, and plugins for Winamp, XMMS, and CL-Amp.
The Sourceforge project page notes that Nosefart is also part of XBMC. Let the record show that Nosefart is also incorporated into xine (I did that in 2002, I think).
Upgrading the API
When I tried running the command line version of Nosefart under Linux, I hit hard against the legacy audio API: OSS. Remember that?

In fairly short order, I was able to upgrade the CL program to use PulseAudio. The program is not especially sophisticated. It's a single-threaded affair which checks for a keypress, processes an audio frame, and sends the frame out to the OSS file interface. All that was needed was to rewrite open_hardware() and close_hardware() for PA and then replace the write statement in play(). The only quirk that stood out is that including <pulse/pulseaudio.h> is insufficient for programming PA's simple API; <pulse/simple.h> must be included separately.
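As a rough illustration of that change (a minimal self-contained sketch, not the actual Nosefart patch; the square-wave generator merely stands in for the emulator's synthesis step):

/* Minimal PulseAudio simple-API playback loop: 16-bit mono at 44100 Hz.
 * The square-wave generator is a stand-in for the emulator's audio frame. */
#include <pulse/simple.h>
#include <pulse/error.h>
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    pa_sample_spec ss = { .format = PA_SAMPLE_S16LE, .rate = 44100, .channels = 1 };
    int error;

    /* open_hardware() equivalent: connect a playback stream to the default sink */
    pa_simple *pa = pa_simple_new(NULL, "nsf-player", PA_STREAM_PLAYBACK, NULL,
                                  "playback", &ss, NULL, NULL, &error);
    if (!pa) {
        fprintf(stderr, "pa_simple_new() failed: %s\n", pa_strerror(error));
        return 1;
    }

    int16_t frame[1024];
    for (int i = 0; i < 44100 * 3 / 1024; i++) {             /* roughly 3 seconds */
        for (int j = 0; j < 1024; j++)                        /* crude ~441 Hz square wave */
            frame[j] = ((i * 1024 + j) / 50) % 2 ? 8000 : -8000;
        /* this call replaces the old write() to the OSS device in play() */
        if (pa_simple_write(pa, frame, sizeof(frame), &error) < 0) {
            fprintf(stderr, "pa_simple_write() failed: %s\n", pa_strerror(error));
            break;
        }
    }

    /* close_hardware() equivalent */
    pa_simple_drain(pa, &error);
    pa_simple_free(pa);
    return 0;
}

Build with: gcc pa_test.c $(pkg-config --cflags --libs libpulse-simple)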
For extra credit, I adapted the program to ALSA. The program uses the most simplistic audio output API possible — just keep filling a buffer and sending it out to the DAC.
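The ALSA equivalent, again only a sketch using the high-level convenience calls rather than the actual patch, swaps the PulseAudio calls for snd_pcm_set_params() and snd_pcm_writei():

/* Minimal ALSA playback loop: same format and stand-in square wave as above. */
#include <alsa/asoundlib.h>
#include <stdint.h>

int main(void)
{
    snd_pcm_t *pcm;
    if (snd_pcm_open(&pcm, "default", SND_PCM_STREAM_PLAYBACK, 0) < 0)
        return 1;

    /* 16-bit mono, 44100 Hz, let ALSA pick the buffering (0.5 s latency) */
    if (snd_pcm_set_params(pcm, SND_PCM_FORMAT_S16_LE, SND_PCM_ACCESS_RW_INTERLEAVED,
                           1, 44100, 1, 500000) < 0)
        return 1;

    int16_t frame[1024];
    for (int i = 0; i < 44100 * 3 / 1024; i++) {
        for (int j = 0; j < 1024; j++)
            frame[j] = ((i * 1024 + j) / 50) % 2 ? 8000 : -8000;
        /* keep filling a buffer and sending it out to the DAC */
        if (snd_pcm_writei(pcm, frame, 1024) < 0)
            snd_pcm_prepare(pcm);                             /* recover from underrun */
    }

    snd_pcm_drain(pcm);
    snd_pcm_close(pcm);
    return 0;
}

Build with: gcc alsa_test.c $(pkg-config --cflags --libs alsa)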
Discovering GME
I'm not sure what to do with the program now since, during my research to bring Nosefart up to date, I became aware of a software library named Game Music Emu, or GME. It's a pure C++ library that can essentially play any classic video game music format you can possibly name. Wow. A lot can happen in 10 years when you're not paying attention.

It's such a well-written library that I didn't need any tutorial or documentation to come up to speed. Just a quick read of the main gme.h header file enabled me in short order to whip up a quick C program that could play NSF and SPC files. Path of least resistance: the client program asks the library to open a hardcoded file, synthesize 10 seconds of audio, and dump it into a file; ask the FLAC command line program to transcode the raw data to a .flac file; use ffplay to verify the results.
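Such a path-of-least-resistance client would look roughly like this (a sketch against GME's C interface in gme.h; the input filename is a hypothetical placeholder, error handling is minimal, and the header path may vary by installation):

/* Decode 10 seconds of track 0 from a hardcoded game-music file to raw PCM. */
#include <gme/gme.h>
#include <stdio.h>

int main(void)
{
    const int sample_rate = 44100;
    Music_Emu *emu;
    gme_err_t err = gme_open_file("game.nsf", &emu, sample_rate);   /* hardcoded input */
    if (err) { fprintf(stderr, "%s\n", err); return 1; }

    gme_start_track(emu, 0);

    FILE *out = fopen("out.raw", "wb");
    short buf[2048];                                /* GME emits interleaved stereo samples */
    long total = 0;
    while (total < (long)sample_rate * 2 * 10) {    /* 10 seconds of stereo samples */
        gme_play(emu, 2048, buf);
        fwrite(buf, sizeof(short), 2048, out);
        total += 2048;
    }

    fclose(out);
    gme_delete(emu);
    return 0;
}

Transcoding and checking the result would then go something along these lines:
flac --force-raw-format --endian=little --sign=signed --channels=2 --bps=16 --sample-rate=44100 out.raw -o out.flac
ffplay out.flac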
I might develop some other uses for this library.