Other articles (48)

  • XMP PHP

    13 May 2011

    According to Wikipedia, XMP stands for:
    Extensible Metadata Platform, or XMP, is an XML-based metadata format used in PDF, photography and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
    Being based on XML, it handles a set of dynamic tags for use in the context of the Semantic Web.
    XMP makes it possible to record information about a file as an XML document: title, author, history (...)

  • Use, discuss, criticize

    13 April 2011

    Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
    The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
    A discussion list is available for all exchanges between users.

  • Installation in farm mode

    4 February 2011

    Farm mode makes it possible to host several MediaSPIP-type sites while installing the functional core only once.
    This is the method we use on this very platform.
    Using farm mode requires some familiarity with how SPIP works, unlike the standalone version, which does not really require any specific knowledge since SPIP's usual private area is no longer used.
    To begin with, you must have installed the same files as the installation (...)

On other sites (5430)

  • Why can I not change the number of frames (nframes) in a gganimate animation?

    26 December 2022, by Gekin

    I have produced an animation with gganimate and rendered it with ffmpeg. It works just fine, but only if I do not change the number of frames. If I do set the number of frames, I get this error message:

    nframes and fps adjusted to match transition
Error parsing framerate 8,4.                           
Error: Rendering with ffmpeg failed

    I produced the gganim MonthlyAveragePrecipitationMap the following way:

    options(scipen = 999, OutDec  =  ",")

MonthlyAveragePrecipitationMap = ggplot(MonthlyAverageExtremePrecipitation) + 
  geom_path(data = map_data("world","Germany"),
            aes(x = long, y = lat, group = group)) +
  coord_fixed(xlim = c(6,15),
              ylim = c(47,55)) + 
  geom_point(aes(x=lon, y=lat, 
                 colour = ShareOfExtremePrecipitationEvents,
                 group = MonthOfYear),
             size = 3) + 
  scale_color_gradient(low="blue", high="yellow") + 
  xlab("Longitude (degree)") +
  ylab("Latitude (degree)") + 
  theme_bw() +
  transition_manual(frames = MonthOfYear) + 
  labs(title = '{unique(MonthlyAverageExtremePrecipitation$MonthOfYear)[as.integer(frame)]}', 
       color = paste0("Share of Extreme Precipitation Events \namong all Precipitation Events")) 

    I call the animation the following way:

    animate(MonthlyAveragePrecipitationMap,
        nframes = 300,
        renderer =
          ffmpeg_renderer(
            format = "auto",
            ffmpeg = NULL,
            options = list(pix_fmt = "yuv420p")))

    I used this exact code just a few days ago and it worked fine.

    Has anyone had similar experiences? Thanks in advance.

  • Decoding VP8 On A Sega Dreamcast

    20 February 2011, by Multimedia Mike — Sega Dreamcast, VP8

    I got Google’s libvpx VP8 codec library to compile and run on the Sega Dreamcast with its Hitachi/Renesas SH-4 200 MHz CPU. So give Google/On2 their due credit for writing portable software. I’m not sure how best to illustrate this so please accept this still photo depicting my testbench Dreamcast console driving video to my monitor:

    Why? Because I wanted to try my hand at porting some existing software to this console and because I tend to be most comfortable working with assorted multimedia software components. This seemed like it would be a good exercise.

    You may have observed that the video is blue. Shortest, simplest answer: Pure laziness. Short, technical answer: Path of least resistance for getting through this exercise. Longer answer follows.

    Update: I did eventually realize that the Dreamcast can work with YUV textures. Read more in my followup post.

    Process and Pitfalls
    libvpx comes with a number of little utilities including decode_to_md5.c. The first order of business was porting over enough source files to make the VP8 decoder compile along with the MD5 testbench utility.

    Again, I used the KallistiOS (KOS) console RTOS (aside: I’m still working to get modern Linux kernels compiled for the Dreamcast). I started by configuring and compiling libvpx on a regular desktop Linux system. From there, I was able to modify a number of configuration options to make the build more amenable to the embedded RTOS.

    I had to create a few shim header files that mapped various functions related to threading and synchronization to their KOS equivalents. For example, KOS has a threading library cleverly named kthreads which is mostly compatible with the more common pthread library functions. KOS apparently also predates stdint.h, so I had to contrive a file with those basic types.
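
    As a rough illustration (a sketch of my own, assuming the usual 32-bit SH-4/GCC type widths; this is not the actual header from the port), such a contrived stdint.h stand-in needs little more than a handful of exact-width typedefs, and the threading shim works the same way by mapping the pthread calls libvpx expects onto kthreads equivalents:

/* shim_stdint.h -- hypothetical stand-in for a missing <stdint.h>.
 * The widths below match the common 32-bit SH-4/GCC ABI (char = 8 bits,
 * short = 16, int = 32, long long = 64); adjust if the toolchain differs. */
#ifndef SHIM_STDINT_H
#define SHIM_STDINT_H

typedef signed char        int8_t;
typedef unsigned char      uint8_t;
typedef signed short       int16_t;
typedef unsigned short     uint16_t;
typedef signed int         int32_t;
typedef unsigned int       uint32_t;
typedef signed long long   int64_t;
typedef unsigned long long uint64_t;

typedef int32_t            intptr_t;   /* pointers are 32 bits on the SH-4 */
typedef uint32_t           uintptr_t;

#endif /* SHIM_STDINT_H */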

    So I got everything compiled and then uploaded the binary along with a small VP8 IVF test vector. Imagine my surprise when an MD5 sum came out of the serial console. Further, visualize my utter speechlessness when I noticed that the MD5 sum matched what my desktop platform produced. It worked!

    Almost. When I tried to decode all frames in a test vector, the program would invariably crash. The problem was that the file that manages motion compensation (reconinter.c) needs to define MUST_BE_ALIGNED, which compiles byte-wise block copy functions. This is necessary for CPUs like the SH-4 which can’t load unaligned data. Apparently, even ARM CPUs these days can handle unaligned memory accesses, which is why this isn’t a configure-time option.
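
    To illustrate the difference (my own sketch, not the real reconinter.c functions): a word-at-a-time block copy assumes 4-byte-aligned pointers, which the SH-4 will trap on when the motion-compensated source is unaligned, while the byte-wise variant that MUST_BE_ALIGNED selects has no alignment requirement at all:

/* Illustrative only -- not libvpx's actual copy routines. */
#include <stdint.h>

/* Fast path: one 32-bit load/store per row, but it faults on the SH-4
 * whenever src or dst is not 4-byte aligned. */
static void copy_block4_word(uint8_t *dst, const uint8_t *src,
                             int stride, int rows)
{
    while (rows-- > 0) {
        *(uint32_t *)dst = *(const uint32_t *)src;
        dst += stride;
        src += stride;
    }
}

/* Alignment-safe path: copy byte by byte, so it works no matter how the
 * motion-compensated source pointer happens to be aligned. */
static void copy_block4_bytes(uint8_t *dst, const uint8_t *src,
                              int stride, int rows)
{
    while (rows-- > 0) {
        dst[0] = src[0];
        dst[1] = src[1];
        dst[2] = src[2];
        dst[3] = src[3];
        dst += stride;
        src += stride;
    }
}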

    Showing The Work
    I completed the first testbench application which ran the MD5 test on all 17 official IVF test vectors. The SH-4/Dreamcast version aces the whole suite.

    However, this is a video game console, so I had better be able to show the decoded video. The Dreamcast is strictly RGB— forget about displaying YUV data directly. I could take the performance hit to convert YUV -> RGB. Or, I could just display the intensity information (Y plane) rendered on a random color scale (I chose blue) on an RGB565 texture (the DC’s graphics hardware can also do paletted textures but those need to be rearranged/twiddled/swizzled).
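
    A minimal sketch of that shortcut (my own code, assuming an RGB565 texture packed as 5 bits red, 6 bits green, 5 bits blue): each 8-bit luma sample is shifted down to 5 bits and written into the blue field, which is exactly what produces the all-blue picture:

#include <stdint.h>

/* Illustrative only: paint an 8-bit Y (luma) plane onto an RGB565 texture
 * using just the blue channel (the low 5 bits of each 16-bit pixel). */
static void y_plane_to_blue_rgb565(uint16_t *texture, int tex_stride,
                                   const uint8_t *y_plane, int y_stride,
                                   int width, int height)
{
    for (int row = 0; row < height; row++) {
        for (int col = 0; col < width; col++) {
            uint8_t luma = y_plane[row * y_stride + col];
            texture[row * tex_stride + col] = (uint16_t)(luma >> 3);
        }
    }
}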

    Results
    So, can the Dreamcast decode VP8 video in realtime? Sure! Well, I really need to qualify. In the test depicted in the picture, it seems to be realtime (though I wasn’t enforcing proper frame timings, just decoding and displaying as quickly as possible). Obviously, I wasn’t bothering to properly convert YUV -> RGB. Plus, that Big Buck Bunny test vector clip is only 176x144. Obviously, no audio decoding either.

    So, realtime playback, with a little fine print.

    On the plus side, it’s trivial to get the Dreamcast video hardware to upscale that little blue image to fullscreen.

    I was able to tally the total milliseconds’ worth of wall clock time required to decode the 17 VP8 test vectors. As you can probably work out from this list, when I try to play a 320x240 video, things start to break down (test 10, for instance, works out to roughly 57 frames in 5.8 seconds, or about 10 frames per second).

    1. Processed 29 176x144 frames in 987 milliseconds.
    2. Processed 49 176x144 frames in 1809 milliseconds.
    3. Processed 49 176x144 frames in 704 milliseconds.
    4. Processed 29 176x144 frames in 255 milliseconds.
    5. Processed 49 176x144 frames in 339 milliseconds.
    6. Processed 48 175x143 frames in 2446 milliseconds.
    7. Processed 29 176x144 frames in 432 milliseconds.
    8. Processed 2 1432x888 frames in 2060 milliseconds.
    9. Processed 49 176x144 frames in 1884 milliseconds.
    10. Processed 57 320x240 frames in 5792 milliseconds.
    11. Processed 29 176x144 frames in 989 milliseconds.
    12. Processed 29 176x144 frames in 740 milliseconds.
    13. Processed 29 176x144 frames in 839 milliseconds.
    14. Processed 49 175x143 frames in 2849 milliseconds.
    15. Processed 260 320x240 frames in 29719 milliseconds.
    16. Processed 29 176x144 frames in 962 milliseconds.
    17. Processed 29 176x144 frames in 933 milliseconds.
  • FFMPEG - why zoompan causes unexpected stretching?

    14 September 2020, by Sarmad S.

    I have two images as input, both 1600x1066. I am vertically stacking them. Then I am drawing a box and vertically stacking that box under both of the images. Inside the box I write text, then I output a video that is 1080x1920. Everything works well until I use zoompan to zoom in on the images; then I get weird behavior: basically all the input images, including the box, stretch (shrink) vertically and no longer fit the entire height of the video, which is 1920.

    The command (removed some drawtext commands from it):

    -filter_complex 
"color=s=1600x1066:color=blue, drawtext=fontfile=font.otf: text='My Text':fontcolor=white: fontsize=30: x=50: y=50[box]; 
[0]scale=4000x4000,zoompan=z='min(zoom+0.0015,1.5)':x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)':d=125:s=1600x1066[z0];
[1]scale=4000x4000,zoompan=z='min(zoom+0.0015,1.5)':x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)':d=125:s=1600x1066[z1];
[z0][z1][box]vstack=inputs=3"

    How do I fix this? I want to zoom in without stretching the images.

    Before using zoompan, this is how the video looks (I want to keep it this way while zooming in on the images):

    After using zoompan, this is how the video looks:

    With zoompan