
Media (91)

Other articles (53)

  • MediaSPIP v0.2

    21 June 2013, by

    MediaSPIP 0.2 is the first stable release of MediaSPIP.
    Its official release date is 21 June 2013, and it is announced here.
    The zip file provided here contains only the MediaSPIP sources, in standalone form.
    As with the previous release, all software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, further modifications will also be required (...)

  • MediaSPIP version 0.1 Beta

    16 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP deemed "usable".
    The zip file provided here contains only the MediaSPIP sources, in standalone form.
    To get a working installation, all software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, further modifications will also be required (...)

  • Encoding and processing into web-friendly formats

    13 April 2011, by

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded in MP4, Ogv and WebM (supported by HTML5) and MP4 (supported by Flash).
    Audio files are encoded in MP3 and Ogg (supported by HTML5) and MP3 (supported by Flash).
    Where possible, text is analyzed in order to retrieve the data needed for search engine detection, and then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)
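    As a rough illustration of the kind of conversion described above, here is a minimal Python sketch that shells out to ffmpeg; the codec choices, flags and file names are assumptions for the example, not MediaSPIP's actual encoding settings.

        # Hypothetical sketch: convert an uploaded video into the web-friendly
        # formats mentioned above (MP4, OGV, WebM). Codec choices are assumptions.
        import subprocess
        from pathlib import Path

        def encode_web_formats(source: str) -> None:
            stem = Path(source).stem
            jobs = {
                f"{stem}.mp4":  ["-c:v", "libx264", "-c:a", "aac"],         # MP4 (HTML5 / Flash)
                f"{stem}.ogv":  ["-c:v", "libtheora", "-c:a", "libvorbis"], # OGV (HTML5)
                f"{stem}.webm": ["-c:v", "libvpx", "-c:a", "libvorbis"],    # WebM (HTML5)
            }
            for target, codec_args in jobs.items():
                subprocess.run(["ffmpeg", "-y", "-i", source, *codec_args, target], check=True)

        if __name__ == "__main__":
            encode_web_formats("upload.mov")  # hypothetical input file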

On other sites (12230)

  • PipedInputStream / PipedOutputStream, ImageIO and ffmpeg

    19 April 2015, by jdevelop

    I have the following code in Scala:

         import java.io.{PipedInputStream, PipedOutputStream}
         import javax.imageio.ImageIO
         import org.apache.commons.io.IOUtils
         import scala.concurrent.Future
         import scala.concurrent.ExecutionContext.Implicits.global
         import scala.sys.process._
         import scala.util.{Failure, Success}

         // The writer (the Future below) pushes PNG frames into pos;
         // the read end pis is handed to the external ffmpeg process.
         val pos = new PipedOutputStream()
         val pis = new PipedInputStream(pos)

         Future {
           LOG.trace("Start rendering")
           generateFrames(videoRenderParams.length) {
             img ⇒ ImageIO.write(img, "PNG", pos)
           }
           pos.flush()
           IOUtils.closeQuietly(pos)
           LOG.trace("Finished rendering")
         } onComplete {
           case Success(_) ⇒
             LOG.trace("Complete successfully")
           case Failure(err) ⇒
             LOG.error("Can't render stuff", err)
             IOUtils.closeQuietly(pis)
             IOUtils.closeQuietly(pos)
         }

         // Feed the read end of the pipe to ffmpeg's stdin and wait for it to exit.
         val prc = (ffmpegCli #< pis).!(logger)

    The Future simply writes the generated images one by one to the OutputStream, while the ffmpeg process reads the images from its stdin and converts them to an MP4 file.
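    For reference, the same pattern can be sketched in Python: generate frames, write them as PNG onto the process's stdin, and let ffmpeg assemble the MP4. The ffmpeg arguments are only an assumption about what ffmpegCli roughly does, and Pillow stands in for ImageIO.

        # Sketch only: pipe PNG frames into ffmpeg's stdin (assumed image2pipe input).
        import subprocess
        from PIL import Image  # Pillow plays the role of ImageIO here

        def render_frames(count, size=(320, 240)):
            for i in range(count):
                yield Image.new("RGB", size, (i % 256, 0, 0))  # placeholder frames

        ffmpeg = subprocess.Popen(
            ["ffmpeg", "-y", "-f", "image2pipe", "-c:v", "png", "-i", "-",
             "-c:v", "libx264", "-pix_fmt", "yuv420p", "out.mp4"],
            stdin=subprocess.PIPE,
        )
        for frame in render_frames(100):
            frame.save(ffmpeg.stdin, format="PNG")  # one PNG after another on the pipe
        ffmpeg.stdin.close()  # EOF lets ffmpeg finalise the MP4
        ffmpeg.wait()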

    That works pretty well, but for some reason I sometimes get the following stack traces:

    I/O error Pipe closed for process: <input stream>
    java.io.IOException: Pipe closed
       at java.io.PipedInputStream.checkStateForReceive(PipedInputStream.java:260)
       at java.io.PipedInputStream.receive(PipedInputStream.java:226)
       at java.io.PipedOutputStream.write(PipedOutputStream.java:149)
       at scala.sys.process.BasicIO$.loop$1(BasicIO.scala:236)
       at scala.sys.process.BasicIO$.transferFullyImpl(BasicIO.scala:242)
       at scala.sys.process.BasicIO$.transferFully(BasicIO.scala:223)
       at scala.sys.process.ProcessImpl$PipeThread.runloop(ProcessImpl.scala:159)
       at scala.sys.process.ProcessImpl$PipeSource.run(ProcessImpl.scala:179)

    At the same time I'm getting the following error from another stream:

    javax.imageio.IIOException: I/O error writing PNG file!
       at com.sun.imageio.plugins.png.PNGImageWriter.write(PNGImageWriter.java:1168)
       at javax.imageio.ImageWriter.write(ImageWriter.java:615)
       at javax.imageio.ImageIO.doWrite(ImageIO.java:1612)
       at javax.imageio.ImageIO.write(ImageIO.java:1578)
       at

    So it seems that the streams get broken somewhere in between, so ffmpeg cannot read the data and ImageIO cannot write the data.

    What is even more interesting: the problem is reproducible only on a certain Linux server (Amazon). It works flawlessly on other Linux boxes. So I wonder if somebody could point me to the possible causes of this error.

    What I've tried so far:

    • use Oracle JDK 8 and OpenJDK
    • use different versions of FFMPEG

    Nothing has worked so far.

  • RTMP stream monitoring in python

    18 March 2015, by lasgun

    I don't have experience with Python, but I found this online:
    https://gist.github.com/sinkers/d647a80fdb180b4cc3a6
    Assuming it works with the current version of ffmpeg (ffprobe), I tried to modify the code a bit so that it doesn't log in to Amazon SNS to send a message. Simply opening an audio file when the stream goes down, with the following command (which I found on this site), would do just fine:

    os.system("start /sound/xyz.mp3")

    I tried this and that, but I can't seem to succeed. I have Python 3.x installed.

    I know it's probably silly, but do I need to enter relative or absolute file locations? For ffprobe and the sound file, is it C:\... or what is the correct format and path?

    Any help to solve this would be greatly appreciated.
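    A minimal sketch of that idea, assuming ffprobe is on the PATH; the stream URL and the sound file path below are placeholders, and absolute paths are the safest choice for both the ffprobe call and the MP3.

        # Hypothetical sketch: poll an RTMP stream with ffprobe and open a sound
        # file when the stream goes down. The URL and paths are placeholders.
        import os
        import subprocess
        import time

        STREAM_URL = "rtmp://example.com/live/stream"   # placeholder stream
        ALERT_SOUND = r"C:\sound\xyz.mp3"                # absolute path to the MP3

        def stream_is_up(url):
            # ffprobe exits with a non-zero code when it cannot probe the stream
            try:
                result = subprocess.run(
                    ["ffprobe", "-v", "error", "-show_streams", url],
                    stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, timeout=30)
            except subprocess.TimeoutExpired:
                return False
            return result.returncode == 0

        while True:
            if not stream_is_up(STREAM_URL):
                os.startfile(ALERT_SOUND)  # Windows only: opens with the default player
                break
            time.sleep(10)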

  • Muxing in audio to gstreamer RTMP stream kills both video and Audio

    1 April 2015, by Adam

    I need some genius help here - I’m trying to set up a live stream for my upcoming wedding... and I have it ALMOST working - audio seems to be the problem.

    This is my setup:

    • Raspberry Pi Model B+
    • Logitech C920 (with onboard h264 encoding that I am utilising)
    • on-camera (C920) microphone
    • USB wifi to iPhone 4G connection
    • gstreamer1.0
    • Amazon EC2 Wowza RTMP server

    I have it all set up, but as soon as I mux in the audio, the stream won't play in any player.

    What works:
    - my gstreamer pipeline WITHOUT the audio muxed in
    - Wowza receives a consistent stream, no failures
    - The various Flash players / iOS / Android and VLC all play back the video

    What doesn't:
    - enabling audio in the mux (using the pipeline below)
    - BUT gstreamer doesn't complain
    - BUT Wowza receives a consistent stream, no failures
    - The various Flash players fail to play both audio and video; some just display the first video frame
    - VLC plays one video frame and about 100 ms of audio, then stops

    Ideally I’d like the muxed audio/video FLV stored on the SD card too in case the network goes down - but if the ’tee’ needs to be sacrificed to make it work, so be it.

    This is my current FAILING pipeline. I assume there's something really stupid in it, because I know practically nothing about gstreamer... The first frame loads in all the players (except iOS, which never shows anything).

    # set camera resolution to 720p, and the data format to H264 (alternatives are YUV and JPG)
    v4l2-ctl --device=/dev/video0 --set-fmt-video=width=1280,height=720,pixelformat=1
    # set the frame rate
    v4l2-ctl --device=/dev/video0 --set-parm=10

    gst-launch-1.0 -v -e uvch264src initial-bitrate=300000 average-bitrate=300000 device=/dev/video0 name=src auto-start=true src.vidsrc \
                   ! queue \
                   ! video/x-h264,width=1280,height=720,framerate=10/1 \
                   ! h264parse \
                   ! flvmux streamable=true name=mux \
                   ! queue \
                   ! tee name=t \
                   ! queue \
                   ! filesink location=/home/pi/wedding.flv t. \
                   ! queue \
                   ! rtmpsink location='rtmp://wowzaserver/live/wedding live=1' >>/home/pi/wedding.log 2>&1

    Some of the things I can't really afford to change at this late stage are the encapsulation (FLV) and the Wowza RTMP server, because I've built everything around them...

    Please help!! Thanks!

    UPDATE

    Given that I am also saving the FLV file, I have found that if I use ffmpeg to send that FLV file (with audio copy, video copy) to the RTMP server, everything works (but obviously it's not live)! So I am now starting to believe this is a problem with the way gstreamer encapsulates RTMP, and putting ffmpeg in the middle fixes it... but then it's not live, of course.
    Is it possible to pipe my output to ffmpeg and use ffmpeg's RTMP?
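    A minimal sketch of that idea, wiring gst-launch's stdout into ffmpeg with Python's subprocess (a plain shell pipe would do the same). The pipeline mirrors the video branch above, fdsink fd=1 writes the muxed FLV to stdout, and the RTMP URL is the same placeholder used earlier; whether this actually cures the playback problem is untested.

        # Sketch only: let gstreamer capture/mux the FLV and let ffmpeg publish to RTMP.
        import subprocess

        GST_CMD = [
            "gst-launch-1.0", "-q", "-e",               # -q keeps status text off stdout
            "uvch264src", "device=/dev/video0", "auto-start=true", "name=src", "src.vidsrc",
            "!", "queue",
            "!", "video/x-h264,width=1280,height=720,framerate=10/1",
            "!", "h264parse",
            "!", "flvmux", "streamable=true",
            "!", "fdsink", "fd=1",                      # muxed FLV goes to stdout
        ]
        FFMPEG_CMD = [
            "ffmpeg", "-i", "-",                        # read the FLV from stdin
            "-c", "copy", "-f", "flv",
            "rtmp://wowzaserver/live/wedding",          # placeholder RTMP URL
        ]

        gst = subprocess.Popen(GST_CMD, stdout=subprocess.PIPE)
        ffm = subprocess.Popen(FFMPEG_CMD, stdin=gst.stdout)
        gst.stdout.close()  # hand the read end of the pipe over to ffmpeg
        ffm.wait()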