Other articles (51)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out.

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used.
    The HTML5 player was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
    These technologies make it possible to deliver video and sound both on conventional computers (...)

  • From upload to final video [standalone version]

    31 January 2010, by

    The path of an audio or video document through SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information about the source video
    First, a SPIP article must be created and the "source" video document attached to it.
    When this document is attached to the article, two additional actions beyond the normal behavior are executed: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)
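
    As a rough illustration of the "retrieve technical information" step, here is a minimal C sketch using FFmpeg's libavformat API. It is not SPIPMotion's actual code; the file name and the fields printed are assumptions, and the thumbnail-extraction step is not shown.

    /* probe.c - print basic technical info about a media file's streams.
     * Build (assuming the FFmpeg development headers are installed):
     *   gcc probe.c -o probe $(pkg-config --cflags --libs libavformat libavutil)
     */
    #include <stdio.h>
    #include <libavformat/avformat.h>

    int main(int argc, char **argv)
    {
        if (argc < 2) {
            fprintf(stderr, "usage: %s <media file>\n", argv[0]);
            return 1;
        }

        AVFormatContext *fmt = NULL;

        /* Open the container and read enough of it to describe its streams. */
        if (avformat_open_input(&fmt, argv[1], NULL, NULL) < 0) {
            fprintf(stderr, "could not open %s\n", argv[1]);
            return 1;
        }
        if (avformat_find_stream_info(fmt, NULL) < 0) {
            fprintf(stderr, "could not read stream info\n");
            avformat_close_input(&fmt);
            return 1;
        }

        printf("container: %s, duration: %.2f s\n",
               fmt->iformat->name, fmt->duration / (double)AV_TIME_BASE);

        for (unsigned i = 0; i < fmt->nb_streams; i++) {
            AVCodecParameters *p = fmt->streams[i]->codecpar;
            if (p->codec_type == AVMEDIA_TYPE_VIDEO)
                printf("stream %u: video, %dx%d\n", i, p->width, p->height);
            else if (p->codec_type == AVMEDIA_TYPE_AUDIO)
                printf("stream %u: audio, %d Hz\n", i, p->sample_rate);
        }

        avformat_close_input(&fmt);
        return 0;
    }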

On other sites (8567)

  • Is there a faster video rendering solution than using AndroidBitmap_xxx functions?

    25 July 2012, by Alex

    In my native thread I have FFmpeg fetching and decoding frames, then putting them in a queue.

    On the Java side I have a GLSurfaceView, and from Renderer.onDrawFrame I call into the native code, passing a bitmap (I create that bitmap only once, then pass it in every time).

    In the native code I take the head of the queue, copy its data into the Java bitmap using the AndroidBitmap_xxx functions, then render that Bitmap on the Java side as a texture.

    I wonder: is there a faster way to render video? Should I do it entirely in native code, and if so, why would it be faster?

    Edit: I no longer copy RGB frame pixels into the locked bitmap pixels; instead I decode the YUV frame directly into the locked bitmap pixels. This makes rendering significantly faster (no unneeded memcpy anymore), but the question remains.
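
    For concreteness, here is a minimal JNI sketch of the AndroidBitmap_xxx path described above. It is not the poster's actual code: the Java class name, the dequeue_frame_rgba() helper and the RGBA_8888 assumption are all illustrative.

    /* render_frame.c - copy one decoded RGBA frame into the Java Bitmap
     * passed from Renderer.onDrawFrame(). */
    #include <jni.h>
    #include <stdint.h>
    #include <string.h>
    #include <android/bitmap.h>

    /* Assumed helper: the decoder thread hands out the next frame's pixels,
     * already converted to RGBA_8888 with the same dimensions as the Bitmap. */
    extern const uint8_t *dequeue_frame_rgba(void);

    JNIEXPORT jboolean JNICALL
    Java_com_example_player_NativeBridge_renderFrame(JNIEnv *env, jclass clazz,
                                                     jobject bitmap)
    {
        AndroidBitmapInfo info;
        void *pixels = NULL;

        if (AndroidBitmap_getInfo(env, bitmap, &info) != ANDROID_BITMAP_RESULT_SUCCESS)
            return JNI_FALSE;
        if (info.format != ANDROID_BITMAP_FORMAT_RGBA_8888)
            return JNI_FALSE;                  /* expect an RGBA_8888 Bitmap */

        const uint8_t *frame = dequeue_frame_rgba();
        if (!frame)
            return JNI_FALSE;                  /* queue empty: keep the last frame */

        if (AndroidBitmap_lockPixels(env, bitmap, &pixels) != ANDROID_BITMAP_RESULT_SUCCESS)
            return JNI_FALSE;

        /* Copy row by row in case the Bitmap rows are padded (stride > width * 4).
         * The poster's faster variant skips this memcpy by letting the decoder
         * (e.g. sws_scale) write straight into the locked pixels. */
        for (uint32_t y = 0; y < info.height; y++)
            memcpy((uint8_t *)pixels + y * info.stride,
                   frame + y * info.width * 4,
                   (size_t)info.width * 4);

        AndroidBitmap_unlockPixels(env, bitmap);
        return JNI_TRUE;
    }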

  • Need help understanding HTML 5 audio/video meta data use and placement

    27 June 2012, by nicoz

    I have spent a significant amount of time researching HTML5 audio and video, but I am stuck on understanding the encoding of the metadata. In my research, programs like Handbrake (ffmpeg) recommend checking the "web optimized" option (there is also a similar setting when exporting from other software, like iMovie). I have concluded (correct me if I am wrong) that this has to do with moving the file's metadata to the beginning of the file, which gives the player/browser the information it needs to start playing immediately, without having to wait for the entire file to load.

    So here is where I am confused, and this is the question I need help with:

    Does this only apply to the MP4 moov atom? Or is there something similar in .ogv and .webm files?
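
    One way to check this for MP4 is to list the file's top-level boxes ("atoms") in order: on a "web optimized" file the moov box appears before the large mdat box, so the browser gets the index before the media data. The C sketch below is an assumed helper, not part of the question; it only walks the top-level box headers.

    /* atoms.c - print the top-level MP4/MOV boxes in file order. */
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        if (argc < 2) {
            fprintf(stderr, "usage: %s <file.mp4>\n", argv[0]);
            return 1;
        }
        FILE *f = fopen(argv[1], "rb");
        if (!f) {
            perror("fopen");
            return 1;
        }

        unsigned char hdr[8];
        unsigned long long offset = 0;

        while (fread(hdr, 1, 8, f) == 8) {
            /* Box header: 32-bit big-endian size, then a 4-character type. */
            unsigned long long size = ((unsigned long long)hdr[0] << 24) |
                                      (hdr[1] << 16) | (hdr[2] << 8) | hdr[3];
            char type[5] = { hdr[4], hdr[5], hdr[6], hdr[7], 0 };

            if (size == 1) {                /* 64-bit size follows the type */
                unsigned char ext[8];
                if (fread(ext, 1, 8, f) != 8)
                    break;
                size = 0;
                for (int i = 0; i < 8; i++)
                    size = (size << 8) | ext[i];
            }
            if (size == 0) {                /* box extends to end of file */
                printf("%-4s at offset %llu (to end of file)\n", type, offset);
                break;
            }

            printf("%-4s at offset %llu, %llu bytes\n", type, offset, size);

            offset += size;                 /* seek to the next top-level box */
            if (fseek(f, (long)offset, SEEK_SET) != 0)
                break;
        }
        fclose(f);
        return 0;
    }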

  • Truly live streaming to Android/iPhone

    4 July 2012, by Tsaukpaetra

    I have spent quite a while (the past week) trying this, to little avail. However, what I want seems completely unheard of. So far, I have reviewed the recommendations available through Google, which include encoding a static file into multiple static files in different formats, and creating a playlist of static files in an m3u8 file (files which get added to the playlist as streaming continues).
    I have also seen ideas involving RTMP, RTSP, etc., which are completely out of the question because of their incompatibility.
    Ideally, I would have one webpage that links to the stream (http://server/video.mp4) and/or shows it in the page (via the video tag). With that in mind, the most likely format would be h264+aac in an mp4 container.

    Unfortunately (probably because the file has no duration metadata), it does not work. I can open and play the stream in a desktop player (such as VLC), but my iPhone and Android both give their respective "Can't be played" messages.

    I don't think the problem is with the devices' ability to stream, because I have made a streaming Shoutcast server work just fine (MP3 only).

    Currently, the closest I have come is with the following setup on my win32 machine:

    FFMPEG Command: ffmpeg -f dshow -i video="Logitech Webcam 200":audio="Microphone (Webcam 200)" -b:v 180k -bt 240k -vcodec libx264 -tune zerolatency -profile:v baseline -preset ultrafast -r 10 -strict -2 -acodec aac -ac 2 -ar 48000 -ab 32k -f flv "udp://127.0.0.1:1234"

    VLC: Stream from udp://127.0.0.1:1234 to http://:8080/video.mp4 (No Transcoding), basically just to convert the UDP stream into an HTTP-accessible stream.

    Any hints or suggestions would be warmly welcomed!