
Other articles (12)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is version 0.2 or later. If needed, contact your MediaSPIP administrator to find out.

  • Supporting all media types

    13 April 2011

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

On other sites (4618)

  • ffmpeg posting to nginx hangs after 90 minutes

    22 October 2018, by Harnek Gulati

    I’m using a very simple program to push an ffmpeg stream from my Raspberry Pi camera to an nginx server. However, it fails after 90 minutes on almost every Raspberry Pi I use (I run this code on 12 Raspberry Pis).
    Here is my ffmpeg command:

    raspivid -o - -t 0 -w 640 -h 480 -fps 25 | ffmpeg -re -i - -vcodec copy -hls_time 4 -f hls -hls_list_size 5 -hls_wrap 5 -threads 0  -timeout 5000 -max_reload 20000 -method PUT http://{}:{}/live/{}

    And here is my configuration for nginx.conf:

    #user nobody;
    load_module /usr/lib/nginx/modules/ndk_http_module.so;
    load_module /usr/lib/nginx/modules/ngx_http_lua_module.so;

    worker_processes auto;
    env CONTROL_SERVER_IP;


    #error_log /var/log/nginx/error.log;
    error_log /dev/stdout info;
    events {
       worker_connections 1024;
    }

    http {
        include mime.types;
        default_type application/octet-stream;
        access_log /dev/stdout;
        sendfile on;
        keepalive_requests 100000;
        keepalive_timeout 30;

        client_max_body_size 10M;

        server {
            listen 80;
            server_name localhost;

            location /live {
                root /var/static;
                client_body_temp_path /var/static/client_temp;

                dav_methods PUT;
                create_full_put_path on;
                dav_access user:rw group:r all:r;

                types {
                    application/vnd.apple.mpegurl m3u8;
                }

                # Disable cache
                add_header Cache-Control no-cache;
                include cors.conf;
            }
        }
    }

    daemon off;

    If anyone can help me, I would deeply appreciate it. I’ve been pulling my hair out trying to figure out this bug. On the Raspberry Pi, it hangs on this:

    [hls @ 0x25a8c90] Opening 'http://192.168.8.1:80/live/c35d8935-0a31-4d22-b71a-ad3f4f1d47631.ts' for writing
    frame=105609 fps= 25 q=-1.0 q=1.6 size=N/A time=01:11:40.00 bitrate=N/A dup=0 drop=105518 speed=1.02x
    frame=105623 fps= 25 q=-1.0 q=1.6 size=N/A time=01:11:40.00 bitrate=N/A dup=0 drop=105532 speed=1.02x
    frame=105636 fps= 25 q=-1.0 q=1.6 size=N/A time=01:11:40.00 bitrate=N/A dup=0 drop=105545 speed=1.02x
    frame=105648 fps= 25 q=-1.0 q=1.6 size=N/A time=01:11:40.00 bitrate=N/A dup=0 drop=105557 speed=1.02x
    frame=105662 fps= 25 q=-1.0 q=1.6 size=N/A time=01:11:40.00 bitrate=N/A dup=0 drop=105571 speed=1.02x
    frame=105674 fps= 25 q=-1.0 q=1.6 size=N/A time=01:11:40.00 bitrate=N/A dup=0 drop=105583 speed=1.02x
    frame=105688 fps= 25 q=-1.0 q=1.6 size=N/A time=01:11:40.00 bitrate=N/A dup=0 drop=105597 speed=1.02x
    frame=105700 fps= 25 q=-1.0 q=1.6 size=N/A time=01:11:40.00 bitrate=N/A dup=0 drop=105609 speed=1.02x
    frame=105714 fps= 25 q=-1.0 q=1.6 size=N/A time=01:11:40.00 bitrate=N/A dup=0 drop=105623 speed=1.02x
    [hls muxer @ 0x25a9200] Duplicated segment filename detected: c35d8935-0a31-4d22-b71a-ad3f4f1d47631.ts
    [hls @ 0x25a8c90] Opening 'http://192.168.8.1:80/live/c35d8935-0a31-4d22-b71a-ad3f4f1d47632.ts' for writing

    And in the nginx logs, I get:

    192.168.10.242 - - [21/Oct/2018:22:34:01 +0000] "PUT /live/c35d8935-0a31-4d22-b71a-ad3f4f1d4763.m3u8 HTTP/1.1" 204 0 "-" "Lavf/57.83.100"
    192.168.10.242 - - [21/Oct/2018:22:35:04 +0000] "PUT /live/c35d8935-0a31-4d22-b71a-ad3f4f1d47633.ts HTTP/1.1" 408 0 "-" "Lavf/57.83.100"
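
    The 408 in the second log line is nginx timing out while waiting to read more of a request, which points at one avenue: lengthening the relevant timeouts. This is a sketch, not a confirmed fix; the directives are standard nginx, but the values are guesses to be tuned:

```nginx
# Inside the server or location block of nginx.conf.
# client_body_timeout is the allowed gap between two successive reads
# of the request body (default 60s); a slow segment upload can trip it.
client_body_timeout   300s;
client_header_timeout 300s;
send_timeout          300s;
# The config above closes idle keepalive connections after 30s; a higher
# value keeps ffmpeg's reused PUT connection open between segments.
keepalive_timeout     1800s;
```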

    I need to set up a way to either a) keep the connection alive for longer than 90 minutes or b) detect when the 408 error happens and stop it from hanging.

    FFmpeg version: 3.4.1
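
    Failing that, option (b) can be handled outside ffmpeg with a small supervisor that kills and relaunches the pipeline whenever it stops producing progress output. A minimal sketch (Unix-only, since it select()s on a pipe; the 30-second stall threshold is an arbitrary choice):

```python
import select
import subprocess
import time

STALL_SECONDS = 30  # restart if ffmpeg prints no progress for this long

def is_stalled(last_output_time, now, limit=STALL_SECONDS):
    """True when there has been no output for more than `limit` seconds."""
    return (now - last_output_time) > limit

def run_forever(cmd):
    """Run `cmd` in a loop, restarting it whenever it exits or stalls."""
    while True:
        proc = subprocess.Popen(cmd, shell=True, stderr=subprocess.PIPE)
        last = time.time()
        while proc.poll() is None:
            # ffmpeg writes its progress line to stderr; wait up to 5s for it
            ready, _, _ = select.select([proc.stderr], [], [], 5.0)
            if ready and proc.stderr.readline():
                last = time.time()
            elif is_stalled(last, time.time()):
                proc.kill()  # hung mid-PUT: kill it, fall through to relaunch
                break
        time.sleep(2)  # brief pause so a crash loop does not spin hot
```

    run_forever would be called with the raspivid | ffmpeg command line from above.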

  • Gstreamer Multiple Source with Single Sink Implementation in PythonGST

    11 October 2018, by biswajitGhosh

    I’m new to GStreamer as well as Python-GST. I have to collect raw video streams from multiple sources into a single sink; I don’t know if that is possible or not.

    Let me explain my scenario:

    I have 2 video sources for now: one is my webcam and the other is just an mp4 file. I opened those sources using the following commands.

    Source 1:

    gst-launch-1.0 v4l2src device=/dev/video0 !
    'video/x-raw,width=640,height=480' ! x264enc pass=qual quantizer=20
    tune=zerolatency ! rtph264pay ! udpsink host=127.0.0.1 port=5000

    Source 2:

    gst-launch-1.0 filesrc location=file_name.mp4 !
    'video/x-raw,width=640,height=480' ! x264enc pass=qual quantizer=20
    tune=zerolatency ! rtph264pay ! udpsink host=127.0.0.1 port=5000

    I’m trying to send both streams to localhost port 5000 after encoding with H264.

    For receiving, I have a Python sink server like this:

    import gi
    gi.require_version('Gst', '1.0')
    gi.require_version('GstApp', '1.0')
    from gi.repository import GObject, Gst, GstApp

    GObject.threads_init()
    Gst.init(None)
    GST_DEBUG="6"

    class Example:
       def __init__(self):
           self.mainloop   = GObject.MainLoop()
           self.pipeline   = Gst.Pipeline.new("Pipeline")
           self.bus        = self.pipeline.get_bus()
           self.bus.add_signal_watch()
           self.bus.connect("message", self.on_message)
           # self.bus.connect('message::eos', self.on_eos)
           # self.bus.connect('message::error', self.on_error)
           # self.bus.connect("sync-message::element", self.on_sync_message)

           #Create QUEUE elements
           self.queue1     = Gst.ElementFactory.make("queue",          None)
           self.queue2     = Gst.ElementFactory.make("queue",          None)

           # Create the elements
           self.source     = Gst.ElementFactory.make("udpsrc",         None)
           self.depay      = Gst.ElementFactory.make("rtph264depay",   None)
           self.parser     = Gst.ElementFactory.make("h264parse",      None)
           self.decoder    = Gst.ElementFactory.make("avdec_h264",     None)
           self.sink       = Gst.ElementFactory.make("appsink",        None)

           # Add elements to pipeline
           self.pipeline.add(self.source)
           self.pipeline.add(self.queue1)
           self.pipeline.add(self.depay)
           self.pipeline.add(self.parser)
           self.pipeline.add(self.decoder)
           self.pipeline.add(self.queue2)
           self.pipeline.add(self.sink)

           # Set properties
           self.source.set_property('port', 5000)
           self.source.set_property('caps', Gst.caps_from_string("application/x-rtp, encoding-name=H264,payload=96"))
           self.sink.set_property('emit-signals', True)
           # turns off sync to make decoding as fast as possible
           self.sink.set_property('sync', False)
           self.sink.connect('new-sample', self.on_new_buffer, self.sink)




       def on_new_buffer(self, appsink, data):
           print("exec two...")
           appsink_sample = GstApp.AppSink.pull_sample(self.sink)
           # with open('example.h264', 'a+') as streamer:
           buff = appsink_sample.get_buffer()
           size, offset, maxsize = buff.get_sizes()
           frame_data = buff.extract_dup(offset, size)
           print(size)
           # flag, info = buff.map(Gst.MapFlags.READ)
           # streamer.write(info.data)
           # print(info.size)
           return False

       def run(self):
           ret = self.pipeline.set_state(Gst.State.PLAYING)        
           if ret == Gst.StateChangeReturn.FAILURE:
               print("Unable to set the pipeline to the playing state.")
               exit(-1)

           self.mainloop.run()

       def kill(self):
           self.pipeline.set_state(Gst.State.NULL)
           self.mainloop.quit()

       def on_eos(self, bus, msg):
           print('on_eos()')
           self.kill()

       def on_error(self, bus, msg):
           print('on_error():', msg.parse_error())
           self.kill()

       def on_message(self, bus, message):
           t = message.type
           if t == Gst.MessageType.EOS:
               print("End of Stream :(")
               self.kill()
           elif t == Gst.MessageType.ERROR:
               err, debug = message.parse_error()
               print("Error: %s" % err, debug)
               self.kill()


       def on_sync_message(self, bus, message):
           print(message.src)


    example = Example()
    example.run()

    Firstly, the AppSink callback is not firing, and I don’t know why. I think I have to get that configuration right in the Python code. Can anyone please help me figure it out?

    Secondly, when I tried with FFmpeg and FFplay I got many H264 decoding errors like this:

    (screenshot of the H264 errors; image not included)

    My main confusion is: can GStreamer handle data from multiple sources in a single sink? (I need to distinguish each video frame by its source.)

    Thanks a lot.
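
    One likely reason the appsink callback above never fires: the code adds the elements to the pipeline but never links them, so no buffers can flow from udpsrc to the appsink. A sketch of the missing wiring, assuming PyGObject (gi) and the same plugins as the question (untested against this exact setup). Note also that two RTP senders pushing to the same UDP port will interleave packets and confuse the depayloader, so the usual approach is one such receiver chain per source, on different ports:

```python
def build_receiver(port=5000):
    """Build udpsrc -> queue -> rtph264depay -> h264parse -> avdec_h264
    -> queue -> appsink, linking each element to the next (the step the
    question's code is missing). Assumes PyGObject (gi) is installed."""
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst
    Gst.init(None)

    pipeline = Gst.Pipeline.new("Pipeline")
    names = ["udpsrc", "queue", "rtph264depay", "h264parse",
             "avdec_h264", "queue", "appsink"]
    elems = [Gst.ElementFactory.make(name, None) for name in names]
    for elem in elems:
        pipeline.add(elem)
    # The crucial missing step: Gst.Pipeline.add() does not connect pads.
    for left, right in zip(elems, elems[1:]):
        left.link(right)

    src, sink = elems[0], elems[-1]
    src.set_property("port", port)
    src.set_property("caps", Gst.caps_from_string(
        "application/x-rtp, encoding-name=H264, payload=96"))
    sink.set_property("emit-signals", True)
    sink.set_property("sync", False)

    def on_new_sample(appsink):
        sample = appsink.emit("pull-sample")
        buf = sample.get_buffer()
        # ... process buf here; tag it with `port` to tell sources apart ...
        return Gst.FlowReturn.OK  # tell the sink it may keep delivering

    sink.connect("new-sample", on_new_sample)
    return pipeline
```

    Running build_receiver(5000) and build_receiver(5001) in the same process, one per sender, would keep the frames from each source separate by construction.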

  • Python : movie stream into pygame

    4 March 2019, by user1533267

    Quick challenge:

    I’m streaming part of my desktop screen through the network using the following:

    Server

    $ ffmpeg -f x11grab -s 800x600 -r 30 -i :100 -an -q 10 -f mjpeg - | nc -lp 5000

    Client

    $ nc <computer address> 5000 | ffplay -

    I would like to display this stream inside my pygame OpenGL window on the client. I read some documentation on pygame.movie and found:

    "Load a new MPEG movie stream from a file or a python file object."

    Would it be possible to load the stream into pygame?
    I need it to be as responsive as possible, and right now I’m seeing about 500 ms of lag.

    Any ideas ?

    Thanks.
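
    For what it’s worth, pygame.movie only ever handled MPEG-1 video files and was removed in pygame 2, so it is not a good fit for an MJPEG network stream. A common alternative is to read the socket directly, split the byte stream on the JPEG start/end-of-image markers, and load each frame as a pygame surface. A minimal sketch of the splitting step (the names in the trailing comment, `sock`, `screen`, `pending`, are hypothetical):

```python
import io  # used with pygame.image.load in the usage comment below

SOI, EOI = b"\xff\xd8", b"\xff\xd9"  # JPEG start/end-of-image markers

def split_mjpeg(buf):
    """Split an MJPEG byte buffer into complete JPEG frames.

    Returns (frames, leftover): `frames` is a list of complete JPEGs and
    `leftover` is the incomplete tail to prepend to the next recv()."""
    frames = []
    while True:
        start = buf.find(SOI)
        if start < 0:
            return frames, b""          # no frame started yet
        end = buf.find(EOI, start + 2)
        if end < 0:
            return frames, buf[start:]  # frame still arriving
        frames.append(buf[start:end + 2])
        buf = buf[end + 2:]

# Inside the pygame event loop (hypothetical `sock`, `screen`, `pending`):
#     pending += sock.recv(65536)
#     frames, pending = split_mjpeg(pending)
#     if frames:  # draw only the newest frame to minimise latency
#         surface = pygame.image.load(io.BytesIO(frames[-1]))
#         screen.blit(surface, (0, 0))
#         pygame.display.flip()
```

    Dropping all but the newest frame each tick is also the main lever against the 500 ms lag: the buffering that nc and ffplay add is exactly what a hand-rolled reader can skip.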