
Other articles (95)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, provided your MediaSPIP installation is at version 0.2 or later. If in doubt, contact the administrator of your MediaSPIP to check.

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • Videos

    21 April 2011

    As with "audio" documents, MediaSPIP displays videos wherever possible using the HTML5 video tag.
    One drawback of this tag is that it is not recognized correctly by some browsers (Internet Explorer, to name but one), and that each browser natively supports only certain video formats.
    Its main advantage is that video playback is handled natively by the browser, which avoids relying on Flash and (...)

On other sites (8686)

  • Gstreamer Multiple Source with Single Sink Implementation in PythonGST

    11 October 2018, by biswajitGhosh

    I’m new to GStreamer and to Python-GST. I need to collect raw video stream data from multiple sources into a single sink, but I don’t know whether that is possible.

    Let me explain my scenario in more detail:

    I have two video sources for now: one is my webcam and the other is just an MP4 file. I opened those sources using the following commands.

    Source 1:

    gst-launch-1.0 v4l2src device=/dev/video0 !
    ’video/x-raw,width=640,height=480’ ! x264enc pass=qual quantizer=20
    tune=zerolatency ! rtph264pay ! udpsink host=127.0.0.1 port=5000

    Source 2:

    gst-launch-1.0 filesrc location = file_name.mp4 !
    ’video/x-raw,width=640,height=480’ ! x264enc pass=qual quantizer=20
    tune=zerolatency ! rtph264pay ! udpsink host=127.0.0.1 port=5000

    I’m trying to send both streams to localhost port 5000 after encoding them with H264.

    On the receiving side I have a Python sink server like this:

    import gi
    gi.require_version('Gst', '1.0')
    gi.require_version('GstApp', '1.0')
    from gi.repository import GObject, Gst, GstApp

    GObject.threads_init()
    Gst.init(None)
    GST_DEBUG="6"

    class Example:
       def __init__(self):
           self.mainloop   = GObject.MainLoop()
           self.pipeline   = Gst.Pipeline.new("Pipeline")
           self.bus        = self.pipeline.get_bus()
           self.bus.add_signal_watch()
           self.bus.connect("message", self.on_message)
           # self.bus.connect('message::eos', self.on_eos)
           # self.bus.connect('message::error', self.on_error)
           # self.bus.connect("sync-message::element", self.on_sync_message)

           #Create QUEUE elements
           self.queue1     = Gst.ElementFactory.make("queue",          None)
           self.queue2     = Gst.ElementFactory.make("queue",          None)

           # Create the elements
           self.source     = Gst.ElementFactory.make("udpsrc",         None)
           self.depay      = Gst.ElementFactory.make("rtph264depay",   None)
           self.parser     = Gst.ElementFactory.make("h264parse",      None)
           self.decoder    = Gst.ElementFactory.make("avdec_h264",     None)
           self.sink       = Gst.ElementFactory.make("appsink",        None)

           # Add elements to pipeline
           self.pipeline.add(self.source)
           self.pipeline.add(self.queue1)
           self.pipeline.add(self.depay)
           self.pipeline.add(self.parser)
           self.pipeline.add(self.decoder)
           self.pipeline.add(self.queue2)
           self.pipeline.add(self.sink)

           # Set properties
           self.source.set_property('port', 5000)
           self.source.set_property('caps', Gst.caps_from_string("application/x-rtp, encoding-name=H264,payload=96"))
           self.sink.set_property('emit-signals', True)
           # turns off sync to make decoding as fast as possible
           self.sink.set_property('sync', False)
           self.sink.connect('new-sample', self.on_new_buffer, self.sink)




       def on_new_buffer(self, appsink, data):
           print("exec two...")
           appsink_sample = GstApp.AppSink.pull_sample(self.sink)
           # with open('example.h264', 'a+') as streamer:
           buff = appsink_sample.get_buffer()
           size, offset, maxsize = buff.get_sizes()
           frame_data = buff.extract_dup(offset, size)
           print(size)
           # flag, info = buff.map(Gst.MapFlags.READ)
           # streamer.write(info.data)
           # print(info.size)
           return False

       def run(self):
           ret = self.pipeline.set_state(Gst.State.PLAYING)        
           if ret == Gst.StateChangeReturn.FAILURE:
               print("Unable to set the pipeline to the playing state.")
               exit(-1)

           self.mainloop.run()

       def kill(self):
           self.pipeline.set_state(Gst.State.NULL)
           self.mainloop.quit()

       def on_eos(self, bus, msg):
           print('on_eos()')
           self.kill()

       def on_error(self, bus, msg):
           print('on_error():', msg.parse_error())
           self.kill()

       def on_message(self, bus, message):
           t = message.type
           if t == Gst.MessageType.EOS:
               print("End of Stream :(")
               self.kill()
           elif t == Gst.MessageType.ERROR:
               err, debug = message.parse_error()
               print("Error: %s" % err, debug)
               self.kill()


       def on_sync_message(self, bus, message):
           print(message.src)


    example = Example()
    example.run()

    Firstly, the AppSink callback is not firing and I don’t know why. I think some configuration in the Python code needs to be corrected. Can anyone please help me figure it out?

    Secondly, when I tried with FFmpeg and FFplay I got many H264 encoding issues like this:

    [screenshot of H264 encoding errors]

    My main confusion is: can GStreamer handle data from multiple sources in a single sink? (I need to distinguish each video frame.)

    Thanks a lot.
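    A likely cause of the silent appsink: the receiver above creates and add()s its elements but never link()s them, so no data can ever reach the sink. Below is a minimal sketch of an equivalent receiver, assuming GStreamer 1.0 with the PyGObject bindings installed; Gst.parse_launch() builds and links the whole chain in one step. Note also that pointing two independent RTP senders at the same UDP port interleaves their packets and will confuse the depayloader; the usual approach is one udpsrc (and one decode branch) per source.

    ```python
    # gst-launch-style description of the same receive chain as the code above.
    # Building it with Gst.parse_launch() links the elements automatically.
    PIPELINE_DESC = (
        'udpsrc port=5000 '
        'caps="application/x-rtp, media=video, encoding-name=H264, payload=96" '
        '! rtph264depay ! h264parse ! avdec_h264 '
        '! appsink name=sink emit-signals=true sync=false'
    )

    def run():
        """Build the pipeline, hook the appsink callback, and start the loop."""
        import gi
        gi.require_version("Gst", "1.0")
        from gi.repository import Gst, GLib

        Gst.init(None)
        pipeline = Gst.parse_launch(PIPELINE_DESC)

        def on_new_sample(appsink):
            # Pull one decoded sample; its buffer holds a raw video frame.
            sample = appsink.emit("pull-sample")
            print(sample.get_buffer().get_size())
            return Gst.FlowReturn.OK

        pipeline.get_by_name("sink").connect("new-sample", on_new_sample)
        pipeline.set_state(Gst.State.PLAYING)
        GLib.MainLoop().run()
    ```

    Calling run() should then print the size of each decoded frame buffer as it arrives.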

  • Python: movie stream into pygame

    4 March 2019, by user1533267

    Quick challenge:

    I’m streaming part of my desktop screen through the network using the following:

    Server

    $ ffmpeg -f x11grab -s 800x600 -r 30 -i :100 -an -q 10 -f mjpeg - | nc -lp 5000

    Client

    $ nc <computer address> 5000 | ffplay -

    I would like to display this stream inside my pygame OpenGL window on the client. I read some documentation on pygame.movie and found:

    "Load a new MPEG movie stream from a file or a python file object."

    Would it be possible to load the stream into pygame?
    I need it to be as responsive as possible, and right now I’m seeing about 500 ms of lag.

    Any ideas ?

    Thanks.
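    pygame.movie only handled MPEG-1 files and has been removed from recent pygame releases, so it will not accept this MJPEG stream directly. One workaround is to read the MJPEG bytes yourself, split them into individual JPEG frames on the FFD8 (start-of-image) and FFD9 (end-of-image) markers, and hand each frame to pygame.image.load. A rough sketch, under those assumptions (the play() helper and window size are illustrative):

    ```python
    import io

    SOI = b"\xff\xd8"  # JPEG start-of-image marker
    EOI = b"\xff\xd9"  # JPEG end-of-image marker

    def split_mjpeg(buf):
        """Split complete JPEG frames out of an MJPEG byte buffer.

        Returns (frames, tail): the complete frames found, plus any
        trailing bytes belonging to a not-yet-complete frame.
        """
        frames = []
        while True:
            start = buf.find(SOI)
            end = buf.find(EOI, start + 2)
            if start < 0 or end < 0:
                break
            frames.append(buf[start:end + 2])
            buf = buf[end + 2:]
        return frames, buf

    def play(stream, size=(800, 600)):
        # Illustrative display loop: stream is any binary file-like object,
        # e.g. sys.stdin.buffer when the nc output is piped into the script.
        import pygame
        pygame.init()
        screen = pygame.display.set_mode(size)
        tail = b""
        while True:
            chunk = stream.read(4096)
            if not chunk:
                break
            frames, tail = split_mjpeg(tail + chunk)
            for frame in frames:
                surface = pygame.image.load(io.BytesIO(frame))
                screen.blit(surface, (0, 0))
                pygame.display.flip()
        pygame.quit()
    ```

    With the server command unchanged, the client would become something like `nc <computer address> 5000 | python client.py`, where client.py calls play(sys.stdin.buffer).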

  • H264/MP4 live stream from ffmpeg gets delayed

    1 October 2018, by paunescuionica

    I’m transcoding an MJPEG stream to H264/MP4 using ffmpeg. As the client I’m using VLC.

    I use this command line:

    ffmpeg \
     -use_wallclock_as_timestamps 1 \
     -f mjpeg \
     -i "http://127.0.0.1:4100/getMjpegStream=L" \
     -f mp4 \
     -vcodec libx264 \
     -movflags frag_keyframe+empty_moov+faststart \
     -tune zerolatency \
     -crf 30 \
     -preset ultrafast \
     -g 1 \
     -reset_timestamps 1 \
     -vsync 1 \
     -flags global_header \
     -r 15 \
     -listen 1 \
     -fflags flush_packets \
     -avioflags direct \
     -flush_packets 1 \
     *output_URL*

    If I set the output_URL to this:

    udp://127.0.0.1:5000?pkt_size=1316

    it works just fine. I start ffmpeg, then after some time I start VLC with the URL udp://@127.0.0.1:5000, and it plays (almost) in real time. The delay is 1-2 seconds, which is acceptable, and it is constant: it does not depend on when I start VLC. I can stop and reopen VLC and it keeps playing in real time.

    But I also need this stream to be viewable in a browser (in an HTML5 video element), so I normally use this for output_URL:

    http://127.0.0.1:5000

    In VLC I use the URL http://127.0.0.1:5000 and it also works fine, but only if I start VLC immediately after starting ffmpeg. If there is a significant delay between the start of ffmpeg and the start of VLC, that delay shows up in the playback. Say I start ffmpeg at time T and then, 10 seconds later (at T+10), I start VLC. I see this behavior in VLC:

    • It starts, displays the frame from time T, and then the stream freezes
    • After 10 seconds, streaming resumes in VLC and playback starts, but the image is 10 seconds behind ‘realtime’
    • This delay is constant; playback never recovers from it

    Is there a way to solve this?

    • Instruct ffmpeg to start transcoding only when the client connects?
    • Instruct ffmpeg not to buffer the transcoded stream until the client connects? (With the current command line it clearly buffers, because when VLC starts at T+10 the first frame displayed is from time T.)

    Thank you!