
Media (0)


No media matching your criteria is available on the site.

Other articles (52)

  • No question of market, cloud, etc...

    10 April 2011

    The vocabulary used on this site tries to avoid any reference to the buzzword fashions that
    flourish around Web 2.0 and in the companies that live off it.
    You are therefore invited to avoid the terms "Brand", "Cloud", "Market", etc.
    Our motivation is above all to create a simple tool, accessible to everyone, that encourages
    the sharing of creations on the Internet and lets authors keep as much autonomy as possible.
    No "Gold or Premium contract" is therefore planned, no (...)

  • List of compatible distributions

    26 April 2011, by

    The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.

        Distribution name   Version name           Version number
        Debian              Squeeze                6.x.x
        Debian              Wheezy                 7.x.x
        Debian              Jessie                 8.x.x
        Ubuntu              The Precise Pangolin   12.04 LTS
        Ubuntu              The Trusty Tahr        14.04

    If you want to help us improve this list, you can provide us access to a machine whose distribution is not mentioned above or send the necessary fixes to add (...)

  • Customisable form

    21 June 2013, by

    This page presents the fields available in the form used to publish a media item and indicates the additional fields that can be added.
    Media creation form
    For a media-type document, the default fields are: Text, Enable/Disable the forum (the comment prompt can be disabled for each article), Licence, Add/remove authors, Tags.
    This form can be modified under:
    Administration > Configuration des masques de formulaire. (...)

On other sites (8596)

  • FFMPEG 2 Videos transcoded and side by side in 1 frame?

    3 March 2016, by dcoffey3296

    I have 2 videos: HEADSHOT.MOV and SCREEN.MOV. Both are large files, and I am looking to shrink them (size, bitrate, etc.) and place the two side by side in the same, very wide, video frame. The end result would be that when you play output_video.mp4, you get a very wide frame with both videos in sync and playing at the same rate.

    Here is the syntactically incorrect version of what I am trying to do:

    ffmpeg -i HEADSHOT.MOV -t 00:02:00 -acodec libfaac -ab 64k -vcodec libx264 -r 30 -pass 1 -s 374x210 -vf "movie=SCREEN.MOV [small]; [in][small] -an -r 30 -pass 1 -s 374x210 overlay=10:10 -t 00:02:00 [out]" -threads 0 output_movie.mp4

    In the above example, I also tried to set a test movie duration of 2 minutes, which raises another question: what is the best way to handle 2 movies of varying length (if they are close)?

    The resources I have found helpful so far are:

    Multiple video sources combined into one and

    http://ffmpeg.org/ffmpeg.html#overlay-1

    Any help/advice is greatly appreciated. I am having trouble with the FFMPEG syntax! Thank you!
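
    For reference, a working version of that idea would typically scale each input and join them with the hstack filter in a single -filter_complex graph. The command below is only a sketch, not the poster's final settings: the sizes, codecs and -map choices are assumptions (it takes the audio from HEADSHOT.MOV, and older builds may need -strict experimental or libfdk_aac instead of the built-in aac encoder):

    ffmpeg -i HEADSHOT.MOV -i SCREEN.MOV -t 00:02:00 \
      -filter_complex "[0:v]scale=374:210[left];[1:v]scale=374:210[right];[left][right]hstack=inputs=2:shortest=1[v]" \
      -map "[v]" -map 0:a \
      -c:v libx264 -r 30 -c:a aac -b:a 64k \
      output_movie.mp4

    The shortest=1 option on hstack is one answer to the varying-length question: the combined video simply stops when the shorter input runs out; trimming both inputs with -t, as above, is the other common approach.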

  • VLC libx264 streaming muxed as FLV

    24 March 2012, by Jan Novák

    I have a question about streaming the output of libx264. My scenario is that I am capturing video from a webcam, encoding it with x264 and then streaming the data to Flash, muxed as FLV. For muxing, I am using output/flv_bitstream.h, included in the libx264 bundle. The only modification I made to the muxer is that instead of fwrite() I am using send() to transfer data via a socket... The encoding library is working fine. If I save the output (even muxed), VLC is able to play it. But when it comes to data transfer via the socket, VLC and Flash are not cooperating. The weird thing is that if I send the data to VLC through the socket, it waits until the transmission ends and then plays the video from its buffer. But what I need is to play a live stream.

    I also tried to read an FLV file and send it to VLC or Flash tag by tag, and that works fine.

    Any suggestions?
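
    One detail worth checking in that fwrite()-to-send() swap: unlike fwrite(), send() may transmit fewer bytes than requested, so each FLV tag has to be written in a loop or the stream gets corrupted mid-tag. A minimal sketch in C (the helper name send_all is hypothetical, not part of flv_bitstream.h):

    #include <sys/types.h>
    #include <sys/socket.h>
    #include <errno.h>

    /* Write the whole buffer to the socket, retrying on short writes and EINTR. */
    static int send_all(int sock, const void *buf, size_t len)
    {
        const char *p = buf;
        while (len > 0) {
            ssize_t n = send(sock, p, len, 0);
            if (n < 0) {
                if (errno == EINTR)
                    continue;   /* interrupted by a signal, retry */
                return -1;      /* real socket error */
            }
            p += n;
            len -= (size_t)n;
        }
        return 0;
    }

    Whether short writes are the actual cause here is only a guess; the buffering VLC shows can also come from how the stream is delivered (for live playback, VLC and Flash both expect the FLV header and the AVC sequence header at the start of each new connection).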

  • CVOpenGLESTextureCacheCreateTextureFromImage from uint8_t buffer

    6 November 2015, by resident_

    I’m developing a video player for iPhone. I’m using the ffmpeg libraries to decode video frames and OpenGL ES 2.0 to render them to the screen.

    But my render method is very slow.

    A user told me:
    iOS 5 includes a new way to do this fast. The trick is to use AVFoundation and link a Core Video pixel buffer directly to an OpenGL texture.

    My problem now is that my video player passes the render method a uint8_t* buffer, which I then use with glTexSubImage2D.

    But if I want to use CVOpenGLESTextureCacheCreateTextureFromImage, I need a CVImageBufferRef containing the frame.

    The question is: how can I create a CVImageBufferRef from a uint8_t buffer?
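
    For reference, Core Video can wrap an existing pointer with CVPixelBufferCreateWithBytes. The sketch below is not the poster's code: it assumes the decoder is switched to BGRA output (the render method below uploads RGB565), and the function name, width and height arguments are illustrative:

    #include <stdint.h>
    #import <CoreVideo/CoreVideo.h>

    // Wrap a raw decoded frame in a CVPixelBufferRef without copying it.
    // The frame memory must stay valid for as long as the buffer is in use.
    static CVPixelBufferRef pixelBufferFromFrame(uint8_t *frame, size_t width, size_t height)
    {
        CVPixelBufferRef pixelBuffer = NULL;
        CVReturn err = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                                    width, height,
                                                    kCVPixelFormatType_32BGRA, // must match the frame layout
                                                    frame,
                                                    width * 4,                 // bytes per row for BGRA
                                                    NULL, NULL,                // no release callback
                                                    NULL,                      // no attributes
                                                    &pixelBuffer);
        return (err == kCVReturnSuccess) ? pixelBuffer : NULL;
    }

    One caveat: buffers wrapped this way are not IOSurface-backed, so the texture-cache path may reject them or fall back to a copy; the more common pattern is to allocate with CVPixelBufferCreate, passing kCVPixelBufferIOSurfacePropertiesKey in the attributes, and have the decoder write straight into CVPixelBufferGetBaseAddress().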

    This is my render method:

    - (void) render: (uint8_t*) buffer
    {
        NSLog(@"render");

        [EAGLContext setCurrentContext:context];

        glBindFramebuffer(GL_FRAMEBUFFER, defaultFramebuffer);
        glViewport(0, 0, backingWidth, backingHeight);

        glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        // OpenGL loads textures lazily so accessing the buffer is deferred until draw; notify
        // the movie player that we're done with the texture after glDrawArrays.
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, mFrameW, mFrameH, GL_RGB, GL_UNSIGNED_SHORT_5_6_5, buffer);

        glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

        [moviePlayerDelegate bufferDone];

        glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);
        [context presentRenderbuffer:GL_RENDERBUFFER];
    }

    Thanks,