Media (91)

Other articles (70)

  • Updating from version 0.1 to 0.2

    24 June 2013

    An explanation of the notable changes made when moving from MediaSPIP version 0.1 to version 0.2. What's new?
    Software dependencies: use of the latest FFMpeg releases (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)
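
    A quick way to sanity-check the new dependencies after upgrading is from a shell (a minimal sketch; the exact package names and install paths depend on your distribution):

    # confirm the expected tools are present and recent enough
    ffmpeg -version | head -n1      # should report >= 1.2.1
    ffprobe -version | head -n1
    mediainfo --Version
    which flvtool++                 # flvtool2 is no longer needed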

  • Customizing by adding your logo, banner, or background image

    5 September 2013

    Some themes support three customization options: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013

    Present changes to your MediaSPIP, or news about your projects, on your MediaSPIP site using the news section.
    In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customize the news creation form.
    News creation form: for a document of the news type, the fields offered by default are: Publication date (customize the publication date) (...)

On other sites (11354)

  • Decoding VP8 On A Sega Dreamcast

    20 February 2011, by Multimedia Mike — Sega Dreamcast, VP8

    I got Google’s libvpx VP8 codec library to compile and run on the Sega Dreamcast with its Hitachi/Renesas SH-4 200 MHz CPU. So give Google/On2 their due credit for writing portable software. I’m not sure how best to illustrate this, so please accept this still photo depicting my testbench Dreamcast console driving video to my monitor:



    Why? Because I wanted to try my hand at porting some existing software to this console and because I tend to be most comfortable working with assorted multimedia software components. This seemed like it would be a good exercise.

    You may have observed that the video is blue. Shortest, simplest answer: Pure laziness. Short, technical answer: Path of least resistance for getting through this exercise. Longer answer follows.

    Update: I did eventually realize that the Dreamcast can work with YUV textures. Read more in my followup post.

    Process and Pitfalls
    libvpx comes with a number of little utilities including decode_to_md5.c. The first order of business was porting over enough source files to make the VP8 decoder compile along with the MD5 testbench utility.

    Again, I used the KallistiOS (KOS) console RTOS (aside: I’m still working to get modern Linux kernels compiled for the Dreamcast). I started by configuring and compiling libvpx on a regular desktop Linux system. From there, I was able to modify a number of configuration options to make the build more amenable to the embedded RTOS.
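
    For reference, a cross-compile configuration along these lines is a plausible starting point (a sketch only; the post does not list the exact flags, and the sh-elf- toolchain prefix assumed here is the one shipped with KOS):

    # decode-only, single-threaded libvpx build for a generic target,
    # driven by the KOS SH-4 cross toolchain
    CROSS=sh-elf- ./configure --target=generic-gnu \
        --disable-vp8-encoder --disable-multithread --disable-examples
    make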

    I had to create a few shim header files that mapped various functions related to threading and synchronization to their KOS equivalents. For example, KOS has a threading library cleverly named kthreads which is mostly compatible with the more common pthread library functions. KOS apparently also predates stdint.h, so I had to contrive a file with those basic types.

    So I got everything compiled and then uploaded the binary along with a small VP8 IVF test vector. Imagine my surprise when an MD5 sum came out of the serial console. Further, visualize my utter speechlessness when I noticed that the MD5 sum matched what my desktop platform produced. It worked!

    Almost. When I tried to decode all frames in a test vector, the program would invariably crash. The problem was that the file that manages motion compensation (reconinter.c) needs to define MUST_BE_ALIGNED, which compiles byte-wise block copy functions. This is necessary for CPUs like the SH-4, which can’t load unaligned data. Apparently, even ARM CPUs these days can handle unaligned memory accesses, which is why this isn’t a configure-time option.
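
    One way to force that code path on strict-alignment CPUs, rather than editing reconinter.c, is to inject the define at configure time (a sketch, assuming libvpx's --extra-cflags option):

    # add the define to the configure line sketched earlier, so the
    # byte-wise, alignment-safe block copy functions get compiled in
    CROSS=sh-elf- ./configure --target=generic-gnu --extra-cflags="-DMUST_BE_ALIGNED"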

    Showing The Work
    I completed the first testbench application which ran the MD5 test on all 17 official IVF test vectors. The SH-4/Dreamcast version aces the whole suite.

    However, this is a video game console, so I had better be able to show the decoded video. The Dreamcast is strictly RGB; forget about displaying YUV data directly. I could take the performance hit to convert YUV -> RGB. Or, I could just display the intensity information (Y plane) rendered on a random color scale (I chose blue) on an RGB565 texture (the DC’s graphics hardware can also do paletted textures, but those need to be rearranged/twiddled/swizzled).

    Results
    So, can the Dreamcast decode VP8 video in realtime? Sure! Well, I really need to qualify that. In the test depicted in the picture, it seems to be realtime (though I wasn’t enforcing proper frame timings, just decoding and displaying as quickly as possible). Obviously, I wasn’t bothering to properly convert YUV -> RGB. Plus, that Big Buck Bunny test vector clip is only 176x144. Obviously, no audio decoding either.

    So, realtime playback, with a little fine print.

    On the plus side, it’s trivial to get the Dreamcast video hardware to upscale that little blue image to fullscreen.

    I was able to tally the total milliseconds’ worth of wall clock time required to decode the 17 VP8 test vectors. As you can probably work out from this list (a quick way to do so is sketched after it), when I try to play a 320x240 video, things start to break down.

    1. Processed 29 176x144 frames in 987 milliseconds.
    2. Processed 49 176x144 frames in 1809 milliseconds.
    3. Processed 49 176x144 frames in 704 milliseconds.
    4. Processed 29 176x144 frames in 255 milliseconds.
    5. Processed 49 176x144 frames in 339 milliseconds.
    6. Processed 48 175x143 frames in 2446 milliseconds.
    7. Processed 29 176x144 frames in 432 milliseconds.
    8. Processed 2 1432x888 frames in 2060 milliseconds.
    9. Processed 49 176x144 frames in 1884 milliseconds.
    10. Processed 57 320x240 frames in 5792 milliseconds.
    11. Processed 29 176x144 frames in 989 milliseconds.
    12. Processed 29 176x144 frames in 740 milliseconds.
    13. Processed 29 176x144 frames in 839 milliseconds.
    14. Processed 49 175x143 frames in 2849 milliseconds.
    15. Processed 260 320x240 frames in 29719 milliseconds.
    16. Processed 29 176x144 frames in 962 milliseconds.
    17. Processed 29 176x144 frames in 933 milliseconds.
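
    Working the frame rates out of that list (frames divided by seconds) makes the drop-off plain: the 176x144 and 175x143 vectors land between roughly 17 and 145 fps, the two 320x240 vectors (10 and 15) manage only about 9.8 and 8.7 fps, and the lone 1432x888 vector stays under 1 fps. A quick way to tally it, assuming the lines above are saved to a file such as timings.txt:

    # print the effective decode rate for each test vector
    awk '{ printf "%s %s: %.1f fps\n", $1, $4, $3 * 1000 / $7 }' timings.txt
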
  • ffmpeg creating multiple output videos, splitting on gt(scene,x)

    18 January 2013, by Ben Halpern

    I want to split one video up into multiple parts based on detecting the first frame of each shot, using scene detection with ffmpeg's select filter.

    The following command records the scene frames and creates a photo mosaic out of them. This indicates to me that the select portion is functional, but I want to use it to create many separate videos, with each scene in its own video file.

    ffmpeg -i video.mpg -vf "select='gt(scene,0.2331)',scale=320x240,tile=1x100" -frames:v 1 preview.png

    Thank you. I think I am close, and I am open to any solution.
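
    One plausible route from the working select expression to separate files, not taken from the original question, is a two-pass approach: list the scene-change timestamps with showinfo, then hand them to ffmpeg's segment muxer. The 0.2331 threshold is carried over from the command above, scenes.txt is a hypothetical intermediate file, and -c copy snaps the cuts to the nearest keyframes (re-encode instead if frame-exact cuts are needed):

    # pass 1: print the timestamp of the first frame of each detected scene
    ffmpeg -i video.mpg -vf "select='gt(scene,0.2331)',showinfo" -f null - 2>&1 \
      | grep -o 'pts_time:[0-9.]*' | cut -d: -f2 > scenes.txt

    # pass 2: split the source at those timestamps into numbered files
    ffmpeg -i video.mpg -f segment -segment_times "$(paste -sd, scenes.txt)" \
      -c copy -reset_timestamps 1 scene_%03d.mpg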

  • Defect #2521 (New): Images in upload and the portfolio

    3 February 2012, by Suske

    I have just added 150 images via /tmp/upload. The idea: an easy photo album. In practice it falls flat: you have to click "Mettre dans le portfolio" (add to the portfolio) on each image, one by one... One lead: a "put everything in the portfolio" button? Alternative: add them to the portfolio automatically (given that the dist templates and many (...)