

Other articles (106)

  • Keeping control of your media in your hands

    13 April 2011

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and to the companies that profit from media sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or later. If needed, contact the administrator of your MediaSPIP to find out.

  • Encoding and processing into web-friendly formats

    13 April 2011

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded in Ogv and WebM (supported by HTML5) and in MP4 (supported by Flash).
    Audio files are encoded in Ogg (supported by HTML5) and in MP3 (supported by Flash).
    Where possible, text is analyzed to retrieve the data needed for search-engine indexing, and then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)

On other sites (11761)

  • yasm is installed on my system but in some other folder

    22 October 2014, by janpal

    I am trying to install ffmpeg and x264. To do this, I followed the instructions given in "https://ffmpeg.org/trac/ffmpeg/wiki/UbuntuCompilationGuide". I installed yasm and checked it with the command yasm --version, which shows the successfully installed version. But when I try to install ffmpeg, it throws an error saying yasm is not found. Some sites I consulted suggested checking the path: yasm's default path is /usr/local/bin, but on my system it shows up as /home/janpal/bin/yasm. I tried as the root user as well. Can somebody help me install this in the correct path? I am using Ubuntu 10.04.

  • Playing H.264 video in an application through ffmpeg using DXVA2 acceleration

    28 April 2012, by cloudraven

    I am trying to output H.264 video in a Windows application. I am moderately familiar with FFMPEG and I have been successful at getting it to play H.264 in an SDL window without a problem. Still, I would really benefit from using hardware acceleration (probably through DXVA2).

    I am reading raw H264 video, no container, no audio ... just raw video (and no B-frames, just I and P). Also, I know that all the systems that will use this application have Nvidia GPUs supporting at least VP3.
    Given that set of assumptions, I was hoping to cut some corners and make it simple instead of general, just have it working for my particular scenario.

    So far, I know that I need to set up hardware acceleration in the codec context by filling the hwaccel member through a call to ff_find_hwaccel. My plan is to look at Media Player Classic Home Cinema, which does a pretty good job of supporting DXVA2 through FFMPEG when decoding H.264. However, the code base is quite large and I am not exactly sure where to look. I can find the place where ff_find_hwaccel is called in h264.c, but I was wondering where else I should be looking.

    More specifically, what is the minimum set of steps I have to code to get DXVA2 through FFMPEG working?

    EDIT: I am open to looking at VLC or anything else if someone knows where I can find the "important" piece of code that does the trick. I just mentioned MPC-HC because I think it is the easiest to get to compile on Windows.
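
    For reference, FFmpeg has since grown a public hardware-decode API that replaces the internal ff_find_hwaccel() route mentioned above. Here is a minimal sketch, assuming a libavcodec recent enough to ship libavutil/hwcontext.h; the FFmpeg calls are real public API, but the helper name open_h264_dxva2 and the overall wiring are illustrative, not code taken from MPC-HC or VLC.

    /* Minimal DXVA2 decoder setup via FFmpeg's public hwaccel API.
     * Illustrative sketch; error handling trimmed for brevity. */
    #include <libavcodec/avcodec.h>
    #include <libavutil/hwcontext.h>

    static enum AVPixelFormat pick_dxva2(AVCodecContext *ctx,
                                         const enum AVPixelFormat *fmts)
    {
        /* The decoder offers a list of formats; take the DXVA2 surface
         * format if present, otherwise signal failure. */
        for (const enum AVPixelFormat *p = fmts; *p != AV_PIX_FMT_NONE; p++)
            if (*p == AV_PIX_FMT_DXVA2_VLD)
                return *p;
        return AV_PIX_FMT_NONE;
    }

    int open_h264_dxva2(AVCodecContext **out)
    {
        const AVCodec *dec = avcodec_find_decoder(AV_CODEC_ID_H264);
        AVCodecContext *ctx = avcodec_alloc_context3(dec);
        AVBufferRef *hwdev = NULL;

        /* Create a DXVA2 device context on the default adapter. */
        if (av_hwdevice_ctx_create(&hwdev, AV_HWDEVICE_TYPE_DXVA2,
                                   NULL, NULL, 0) < 0)
            return -1;

        ctx->hw_device_ctx = av_buffer_ref(hwdev); /* decoder keeps a ref */
        ctx->get_format    = pick_dxva2;           /* negotiate hw format */
        av_buffer_unref(&hwdev);

        if (avcodec_open2(ctx, dec, NULL) < 0)
            return -1;

        *out = ctx;
        return 0;
    }

    Since the input is raw Annex-B H.264 with no container, packets can be split out of the byte stream with av_parser_parse2(); decoded frames then arrive as AV_PIX_FMT_DXVA2_VLD surfaces, which can be rendered directly or copied back to system memory with av_hwframe_transfer_data().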

  • How To Write An Oscilloscope

    29 April 2012, by Multimedia Mike — General, gme, oscilloscope, visualization

    I’m trying to figure out how to write a software oscilloscope audio visualization. It’s made more frustrating by the knowledge that I am certain that I have accomplished this task before.

    In this context, the oscilloscope is used to draw the time-domain samples of an audio waveform. I have written such a plugin as part of the xine project. However, for that project, I didn’t have to write the full playback pipeline; my plugin was just handed some PCM data and drew some graphical data in response. Now I’m trying to write the entire engine in a standalone program and I’m wondering how to get it just right.

    This is an SDL-based oscilloscope visualizer and audio player for the Game Music Emu library. My approach is to have an audio buffer that holds one second of audio (44100 stereo 16-bit samples). The player updates the visualization at 30 frames per second. The o-scope is 512 pixels wide. So, at every 1/30th-second interval, the player dips into the audio buffer at position ((frame_number % 30) * 44100 / 30) and takes the first 512 stereo frames for plotting on the graph.
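
    As a sanity check, here is that indexing scheme restated as a small C sketch; the constants come from the text, while the names (stereo_frame, scope_offset, scope_plot) are illustrative, not from any actual player code.

    /* Sketch of the described scheme: a one-second buffer of 44100
     * stereo int16 frames, a 30 Hz redraw, a 512-pixel-wide scope. */
    #include <stdint.h>
    #include <stddef.h>

    #define RATE  44100
    #define FPS   30
    #define WIDTH 512

    typedef struct { int16_t l, r; } stereo_frame;

    /* Offset into the one-second buffer for a given video frame:
     * ((frame_number % 30) * 44100 / 30), exactly as in the text. */
    static size_t scope_offset(unsigned frame_number)
    {
        return (size_t)(frame_number % FPS) * RATE / FPS;
    }

    /* Map the window's first WIDTH left-channel samples to y coordinates
     * for a scope h pixels tall (one dot per column). */
    static void scope_plot(const stereo_frame *buf, unsigned frame_number,
                           int h, int y[WIDTH])
    {
        const stereo_frame *w = buf + scope_offset(frame_number);
        for (int x = 0; x < WIDTH; x++)
            y[x] = h / 2 - (w[x].l * (h / 2)) / 32768;
    }

    One detail the arithmetic makes plain: 44100/30 is 1470, so each update draws only the first 512 of the 1470 frames that actually play during that interval; the other 958 frames are never shown.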

    It seems to be working okay, I guess. The only problem is that the A/V sync seems to be slightly misaligned. I am just wondering if this is the correct approach. Perhaps the player should be performing some slightly more complicated calculation over those (44100/30) audio frames during each update in order to obtain a more accurate graph? I described my process to an electrical engineer friend of mine and he insisted that I needed to apply something called hysteresis to the output or I would never get accurate A/V sync in this scenario.
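
    One common way to tighten sync in this kind of player, offered here purely as an assumption (none of these names appear in the post), is to derive the window from the audio clock, i.e. from the count of samples the audio callback has actually handed to the device, rather than from the video frame counter:

    /* Hypothetical alternative: let the audio callback advance a running
     * sample counter, and have the renderer window onto the last WIDTH
     * frames that were actually played. */
    #include <stdint.h>
    #include <stddef.h>

    #define RATE  44100
    #define WIDTH 512

    /* Advanced by the audio callback; with C11 this should be _Atomic,
     * since the callback and the renderer run on different threads. */
    static volatile uint64_t samples_played;

    /* Ring-buffer offset of the most recently played WIDTH-frame window.
     * (A real implementation must also handle the window wrapping past
     * the end of the one-second ring.) */
    static size_t scope_window_start(void)
    {
        uint64_t pos = samples_played;
        return pos < WIDTH ? 0 : (size_t)((pos - WIDTH) % RATE);
    }

    This ties the graph to what the speakers are emitting at draw time, so the visualization cannot drift relative to playback the way a frame-counter-driven window can.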

    Further, I know that some schools of thought on these matters require that the dots in those graphs be connected, that the scattered points simply won’t do. I guess it’s a stylistic choice.

    Still, I think I have a reasonable, workable approach here. I might just be starting the visualization 1/30th of a second too late.