Other articles (74)

  • User profiles

    12 April 2011, by

    Each user has a profile page that lets them edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialized, visible only when the visitor is logged in to the site.
    The user can reach the profile editor from their author page; a "Modifier votre profil" ("Edit your profile") link in the navigation is (...)

  • Configuring language support

    15 November 2010, by

    Accessing the configuration and adding supported languages
    To configure support for new languages, you need to go to the "Administrer" (administration) section of the site.
    From there, in the navigation menu, you can reach a "Gestion des langues" (language management) section that lets you enable support for new languages.
    Each newly added language can still be disabled as long as no object has been created in that language. In that case, it becomes greyed out in the configuration and (...)

  • Authorizations overridden by plugins

    27 April 2010, by

    MediaSPIP core
    autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page

On other sites (12473)

  • Gstreamer Multiple Source with Single Sink Implementation in PythonGST

    11 October 2018, by biswajitGhosh

    I’m new to GStreamer as well as Python-GST. I have to collect raw video stream data from multiple sources into a single sink, and I don’t know whether that is possible or not.

    Let me explain my scenario in more detail:

    I have two video sources for now: one is my webcam and the other is just an MP4 file. I opened those sources using the following commands.

    Source 1:

    gst-launch-1.0 v4l2src device=/dev/video0 !
    'video/x-raw,width=640,height=480' ! x264enc pass=qual quantizer=20
    tune=zerolatency ! rtph264pay ! udpsink host=127.0.0.1 port=5000

    Source 2:

    gst-launch-1.0 filesrc location=file_name.mp4 ! decodebin !
    videoconvert ! videoscale ! 'video/x-raw,width=640,height=480' !
    x264enc pass=qual quantizer=20 tune=zerolatency ! rtph264pay !
    udpsink host=127.0.0.1 port=5000

    I’m trying to send both streams to localhost port 5000 after encoding them with H264.
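
    Note that, as written, both pipelines push their RTP packets to the same UDP port, so the packets interleave on arrival and a single udpsrc cannot tell the two streams apart. One minimal way to keep them distinguishable (5001 below is an arbitrary choice) is to leave the webcam pipeline on port 5000 and point the file pipeline at its own port:

    gst-launch-1.0 filesrc location=file_name.mp4 ! decodebin !
    videoconvert ! videoscale ! 'video/x-raw,width=640,height=480' !
    x264enc pass=qual quantizer=20 tune=zerolatency ! rtph264pay !
    udpsink host=127.0.0.1 port=5001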

    For receiving, I have a Python sink server like this:

    import os
    os.environ['GST_DEBUG'] = '6'  # debug level must be set in the environment before Gst.init()

    import gi
    gi.require_version('Gst', '1.0')
    gi.require_version('GstApp', '1.0')
    from gi.repository import GObject, Gst, GstApp

    GObject.threads_init()
    Gst.init(None)

    class Example:
       def __init__(self):
           self.mainloop   = GObject.MainLoop()
           self.pipeline   = Gst.Pipeline.new("Pipeline")
           self.bus        = self.pipeline.get_bus()
           self.bus.add_signal_watch()
           self.bus.connect("message", self.on_message)
           # self.bus.connect('message::eos', self.on_eos)
           # self.bus.connect('message::error', self.on_error)
           # self.bus.connect("sync-message::element", self.on_sync_message)

           #Create QUEUE elements
           self.queue1     = Gst.ElementFactory.make("queue",          None)
           self.queue2     = Gst.ElementFactory.make("queue",          None)

           # Create the elements
           self.source     = Gst.ElementFactory.make("udpsrc",         None)
           self.depay      = Gst.ElementFactory.make("rtph264depay",   None)
           self.parser     = Gst.ElementFactory.make("h264parse",      None)
           self.decoder    = Gst.ElementFactory.make("avdec_h264",     None)
           self.sink       = Gst.ElementFactory.make("appsink",        None)

           # Add elements to pipeline
           self.pipeline.add(self.source)
           self.pipeline.add(self.queue1)
           self.pipeline.add(self.depay)
           self.pipeline.add(self.parser)
           self.pipeline.add(self.decoder)
           self.pipeline.add(self.queue2)
           self.pipeline.add(self.sink)

           # Set properties
           self.source.set_property('port', 5000)
           self.source.set_property('caps', Gst.caps_from_string("application/x-rtp, encoding-name=H264,payload=96"))
           self.sink.set_property('emit-signals', True)
           # turns off sync to make decoding as fast as possible
           self.sink.set_property('sync', False)
           self.sink.connect('new-sample', self.on_new_buffer, self.sink)




       def on_new_buffer(self, appsink, data):
           print "exec two..."
           appsink_sample = GstApp.AppSink.pull_sample(self.sink)
           # with open('example.h264', 'a+') as streamer:
           buff = appsink_sample.get_buffer()
           size, offset, maxsize = buff.get_sizes()
           frame_data = buff.extract_dup(offset, size)
           print(size)
           # flag, info = buff.map(Gst.MapFlags.READ)
           # streamer.write(info.data)
           # print(info.size)
           return False

       def run(self):
           ret = self.pipeline.set_state(Gst.State.PLAYING)        
           if ret == Gst.StateChangeReturn.FAILURE:
               print("Unable to set the pipeline to the playing state.")
               exit(-1)

           self.mainloop.run()

       def kill(self):
           self.pipeline.set_state(Gst.State.NULL)
           self.mainloop.quit()

       def on_eos(self, bus, msg):
           print('on_eos()')
           self.kill()

       def on_error(self, bus, msg):
           print('on_error():', msg.parse_error())
           self.kill()

       def on_message(self, bus, message):
           t = message.type
           if t == Gst.MessageType.EOS:
               print("End of Stream :(")
               self.kill()
           elif t == Gst.MessageType.ERROR:
               err, debug = message.parse_error()
               print("Error: %s" % err, debug)
               self.kill()


       def on_sync_message(self, bus, message):
           print(message.src)


    example = Example()
    example.run()

    Firstly, the AppSink callback is not working and I don’t know why. I think I have to get that configuration right in the Python code. Can anyone please help me figure it out?
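
    A hedged observation on the code above: the elements are added to the pipeline but never linked, so there is no data path and the appsink’s new-sample signal can never fire. A minimal sketch of the missing step, reusing the element names from __init__, would be:

           # hypothetical linking step, not present in the original code;
           # without these links no buffers ever reach the appsink
           self.source.link(self.queue1)
           self.queue1.link(self.depay)
           self.depay.link(self.parser)
           self.parser.link(self.decoder)
           self.decoder.link(self.queue2)
           self.queue2.link(self.sink)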

    Secondly, when I tried with FFmpeg and FFplay I got a lot of H264 encoding issues like this:

    (screenshot of the H264 error output, omitted here)

    My main confusion: can GStreamer handle data from multiple sources in a single sink? (I need to be able to distinguish each video frame.)
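
    For what it’s worth, GStreamer core does ship an N-to-1 element called funnel, which pushes buffers from several upstream branches into a single downstream branch. Whether it fits this H264 case is an assumption on my part (each encoded stream would normally need its own depay/decode branch first), but a minimal receive-side sketch, assuming the two-port variant above, would be:

    gst-launch-1.0 funnel name=f ! fakesink \
      udpsrc port=5000 caps="application/x-rtp,encoding-name=H264,payload=96" ! f. \
      udpsrc port=5001 caps="application/x-rtp,encoding-name=H264,payload=96" ! f.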

    Thanks a lot.

  • H264/MP4 live stream from ffmpeg gets delayed

    1 October 2018, by paunescuionica

    I’m transcoding an MJPEG stream to H264/MP4 using ffmpeg. As the client I’m using VLC.

    I use this command line:

    ffmpeg \
     -use_wallclock_as_timestamps 1 \
     -f mjpeg \
     -i "http://127.0.0.1:4100/getMjpegStream=L" \
     -f mp4 \
     -vcodec libx264 \
     -movflags frag_keyframe+empty_moov+faststart \
     -tune zerolatency \
     -crf 30 \
     -preset ultrafast \
     -g 1 \
     -reset_timestamps 1 \
     -vsync 1 \
     -flags global_header \
     -r 15 \
     -listen 1 \
     -fflags flush_packets \
     -avioflags direct \
     -flush_packets 1 \
     *output_URL*

    If I set the output_URL to this:

    udp://127.0.0.1:5000?pkt_size=1316

    it works just fine. I start ffmpeg, then after some time I start VLC with the URL udp://@127.0.0.1:5000, and it plays in (almost) real time. The delay is 1-2 seconds, which is acceptable, and it is constant: it does not depend on when I start VLC. I can stop and reopen VLC and it keeps playing in real time.

    But I need this stream to also be viewable in a browser (as an HTML5 video), so I normally use this for output_URL:

    http://127.0.0.1:5000

    In VLC I use the URL http://127.0.0.1:5000 and it also works fine, but only if I start VLC immediately after I start ffmpeg. If there is a significant delay between the start of ffmpeg and the start of VLC, that delay is noticeable in the playback. Let’s say I start ffmpeg at time T and then, after 10 seconds (at T+10), I start VLC. I see this behavior in VLC:

    • It starts, displays the frame from time T, and then the stream freezes
    • After 10 seconds, the stream resumes in VLC and starts playing, but the image is 10 seconds behind ‘realtime’
    • This delay is constant; the playback never recovers from it

    Is there a way to solve this?

    • Instruct ffmpeg to start transcoding only when the client connects?
    • Instruct ffmpeg not to buffer the transcoded stream until the client connects? (With the current command line it clearly buffers, because when VLC starts at T+10 the first frame displayed is from time T.)

    Thank you!

  • Stream real-time video flux in HTML video tag

    26 September 2018, by c.censier

    I want to stream a real-time video feed coming in over UDP into an HTML video tag.
    I did some research, but I got a lot of information and I am struggling to get a clear overview of what I can and cannot do.

    The video feed uses the H.264 and AAC codecs in an MP4 container and has a 3840x2160 (4K) resolution. I’d like to play it in Chrome (latest version).

    As I understand it so far, the HTML video tag can natively read H.264/AAC videos. I made it work with the video directly on my server (I’m using Meteor JS + React).

    I learned to use FFmpeg to stream a UDP feed that can be read by VLC player, and then I used FFserver (I know it’s deprecated) to create an HTTP stream, which is also read by VLC but not by the HTML video tag.

    So... my question is: can the HTML video tag natively read a video stream over HTTP?

    I’ve seen a lot of discussions about HLS and DASH, but I didn’t understand whether (and why) they’re mandatory.

    I read a post about someone creating an HLS m3u8 playlist using only FFmpeg. Is that a viable solution?
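
    For reference, a minimal HLS sketch using only ffmpeg could look like the following, where input.mp4, the 4-second segment length and the 6-entry playlist size are arbitrary placeholders:

    ffmpeg -re -i input.mp4 \
     -c:v libx264 -c:a aac \
     -f hls \
     -hls_time 4 \
     -hls_list_size 6 \
     -hls_flags delete_segments \
     stream.m3u8

    Note that Chrome does not play an HLS playlist natively in a video tag; a small JavaScript player such as hls.js is typically layered on top.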

    FFserver configuration

    HTTPPort                        8090
    HTTPBindAddress                 0.0.0.0
    MaxHTTPConnections              20
    MaxClients                      10
    MaxBandwidth                    100000

    <feed>
     File                          /tmp/feed.ffm
     FileMaxSize                   1g
     ACL allow                     127.0.0.1
    </feed>

    <stream>
     Feed                          feed.ffm
     Format                        mpeg
     AudioCodec                    aac
     AudioBitRate                  256
     AudioChannels                 1
     VideoCodec                    libx264
     VideoBitRate                  10000      # Total random here
     VideoBitRateRange             5000-15000 # And here...
     VideoFrameRate                30
     VideoQMin                     1
     VideoQMax                     50
     VideoSize                     3840x2160
     VideoBufferSize               20000      # Not sure either
     AVOptionVideo                 flags +global_header
    </stream>

    I had to specify QMin and QMax to avoid an error message, but I don’t really understand what they are.

    FFmpeg command line

    ffmpeg -re -i bbb_sunflower_2160p_30fps_normal.mp4 -strict -2 -r 30 -vcodec libx264 http://localhost:8090/feed.ffm

    This works with VLC. I’m testing with a file on my computer before moving to a UDP stream.