Other articles (35)

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player was created specifically for MediaSPIP and can easily be adapted to fit in with a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • From upload to the final video [standalone version]

    31 January 2010

    The path an audio or video document takes through SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information about the source video
    First, a SPIP article must be created and the "source" video document attached to it.
    When this document is attached to the article, two actions are carried out in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)
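
    As a side note, the "retrieval of technical information about the file's audio and video streams" step can be pictured with a small probe along the following lines; the use of FFmpeg's libavformat here is an assumption made purely for illustration, not a description of how SPIPMotion itself is implemented.

       // Illustrative probe: list the basic properties of a file's audio/video streams.
       extern "C" {
       #include <libavformat/avformat.h>
       }
       #include <cstdio>

       int main(int argc, char **argv)
       {
           if (argc < 2) return 1;

           AVFormatContext *fmt = nullptr;
           if (avformat_open_input(&fmt, argv[1], nullptr, nullptr) < 0) return 1;
           if (avformat_find_stream_info(fmt, nullptr) < 0) return 1;

           for (unsigned i = 0; i < fmt->nb_streams; ++i) {
               const AVCodecParameters *par = fmt->streams[i]->codecpar;
               if (par->codec_type == AVMEDIA_TYPE_VIDEO)
                   std::printf("stream %u: video %dx%d\n", i, par->width, par->height);
               else if (par->codec_type == AVMEDIA_TYPE_AUDIO)
                   std::printf("stream %u: audio %d Hz\n", i, par->sample_rate);
           }

           avformat_close_input(&fmt);
           return 0;
       }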

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used as a fallback.
    The HTML5 player used was created specifically for MediaSPIP: its appearance can be fully customised to match a chosen theme.
    These technologies make it possible to deliver video and sound both to conventional computers (...)

On other sites (4030)

  • Stream video from ffmpeg and capture with OpenCV

    10 December 2014, by chembrad

    I have a video stream coming in over RTP to ffmpeg, and I want to pipe it to my OpenCV tools for live stream processing. The RTP linkage is working, because I am able to send the incoming data to a file and play it (or play it via ffplay). My OpenCV implementation is functional as well, because I am able to capture video from a file and also from a webcam.

    The problem is streaming to OpenCV. I have heard that this may be done using a named pipe: first I could stream the ffmpeg output to the pipe, and then have OpenCV open this pipe and begin processing.

    What I've tried:

    I make a named pipe in my Cygwin bash with:

       $ mkfifo stream_pipe

    Next I use my ffmpeg command to pull the stream from RTP and send it to the pipe:

       $ ffmpeg -f avi -i rtp://xxx.xxx.xxx.xxx:1234 -f avi -y out.avi > stream_pipe

    I am not sure if this is the right way to go about sending the stream to the named pipe, but it seems to accept the command and work, because ffmpeg's output gives me bitrates, fps, and such.

    Next I use the named pipe in my OpenCV capture function:

       $ ./cvcap.exe stream_pipe

    where the code for cvcap.cpp boils down to this:

       cv::VideoCapture *pIns = new cv::VideoCapture(argv[1]);

    The program seems to hang when it reaches this one line, so I am wondering if this is the right way of going about this. I have never used named pipes before and I am not sure this is the correct usage. In addition, I don't know if I need to handle the named pipe differently in OpenCV, i.e. change the code around to accept this kind of input. Like I said, my code already accepts files and camera inputs; I am just hung up on a stream coming in. I have only heard that named pipes can be used with OpenCV; I haven't seen any actual code or commands!

    Any help or insights are appreciated!

    UPDATE:

    I believe named pipes may not be working in the way I intended. As seen in this Cygwin forum post:

    The problem is that Cygwin’s implementation of fifos is very buggy. I wouldn’t recommend using fifos for anything but the simplest of applications.

    I may need to find another way to do this. I have tried to pipe the ffmpeg output into a normal file and then have OpenCV read it at the same time. This works to some extent, but I imagine it can be dangerous to read from and write to a file concurrently; who knows what would happen!
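
    For reference, one workaround often suggested for the hang described above is to bypass cv::VideoCapture on the fifo entirely: have ffmpeg decode the RTP input to raw BGR frames, write them to a pipe, and read fixed-size frames straight into a cv::Mat. The following is only a minimal sketch, not a drop-in fix; the 640x480 resolution, the rawvideo/bgr24 output options, and reading from stdin are assumptions about the setup rather than details from the question.

       // Producer side (assumed invocation, piped into the consumer below):
       //   ffmpeg -i rtp://xxx.xxx.xxx.xxx:1234 -f rawvideo -pix_fmt bgr24 pipe:1 | ./cvcap.exe
       #include <opencv2/opencv.hpp>
       #include <cstdio>
       #include <vector>

       int main()
       {
           const int width = 640, height = 480;           // assumed stream size
           const size_t frame_bytes = width * height * 3; // bgr24: 3 bytes per pixel
           std::vector<unsigned char> buffer(frame_bytes);

           // Read one raw frame at a time from stdin and wrap it in a Mat header.
           while (std::fread(buffer.data(), 1, frame_bytes, stdin) == frame_bytes) {
               cv::Mat frame(height, width, CV_8UC3, buffer.data());
               cv::imshow("stream", frame);
               if (cv::waitKey(1) == 27) break;            // Esc to quit
           }
           return 0;
       }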

  • Which FFmpeg codec should be used for video streams with single byte pixel format?

    2 December 2011, by Gearoid Murphy

    I've got a black-and-white video stream coming off a FireWire astronomy camera, and I'd like to use FFmpeg to compress it, but FFmpeg will not accept single-byte pixel formats for the MPEG1VIDEO codec. I've been trying random codecs for the last hour without much success. Could anyone give me some sage advice on how to achieve my goal? :) thx
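
    For context, the usual reason the MPEG1VIDEO encoder rejects such a stream is that it expects planar YUV input (typically yuv420p) rather than a single-byte grayscale format, so the grayscale frames have to be converted first; on the ffmpeg command line, -pix_fmt yuv420p performs that conversion. The sketch below shows the same conversion in code with libswscale, under the assumption that raw 8-bit grayscale (GRAY8) frames of a known size are available; the function and how it is called are illustrative only.

       // Convert one GRAY8 frame to YUV420P so an MPEG-1 encoder will accept it.
       // Assumes the caller supplies the raw grayscale bytes and the frame size.
       extern "C" {
       #include <libswscale/swscale.h>
       #include <libavutil/imgutils.h>
       }
       #include <cstdint>

       void gray8_to_yuv420p(const uint8_t *gray, int width, int height,
                             uint8_t *dst_data[4], int dst_linesize[4])
       {
           // In real code, create this context once and reuse it for every frame.
           SwsContext *ctx = sws_getContext(width, height, AV_PIX_FMT_GRAY8,
                                            width, height, AV_PIX_FMT_YUV420P,
                                            SWS_BILINEAR, nullptr, nullptr, nullptr);

           const uint8_t *src_data[4] = { gray, nullptr, nullptr, nullptr };
           const int src_linesize[4]  = { width, 0, 0, 0 };

           // Allocate the Y, U and V planes, then convert.
           av_image_alloc(dst_data, dst_linesize, width, height, AV_PIX_FMT_YUV420P, 1);
           sws_scale(ctx, src_data, src_linesize, 0, height, dst_data, dst_linesize);

           sws_freeContext(ctx);
           // dst_data now holds yuv420p planes for the encoder;
           // release them later with av_freep(&dst_data[0]).
       }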

  • Video Transcoding/Conversion using Java

    16 January 2012, by MalTec

    I would like to convert a video from one format to another. Initially, I want to convert MP4 to AVI or any other more suitable video format.

    Came across the ffmpeg library. Is it the best solution?

    Came across Xubber and Jave for the same purpose. Any experiences?