
Other articles (8)

  • Authorizations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to modify their information on the authors page

  • The farm's regular Cron tasks

    1 December 2010, by

    Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of all the instances of the shared hosting farm on a regular basis. Combined with a system Cron on the farm's central site, this generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...)
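    A minimal sketch of the kind of system Cron entry implied above (the URL and the use of curl are my assumptions, not from the article): since the mechanism described works by generating regular visits, fetching the central site once a minute is enough.

    # Hypothetical crontab entry on the central site (URL is an assumption):
    # every minute, fetch the central site so its super Cron gets triggered.
    * * * * * curl -s -o /dev/null https://farm-central.example.org/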

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other websites (5335)

  • Error while trying to stream higher resolution using ffmpeg and jsmpeg

    11 April 2017, by trojek

    I am trying to do a proof of concept of streaming a webcam in the browser using jsmpeg. It works perfectly when I use the code from the documentation, which is as follows:

    ffmpeg \
       -f v4l2 \
           -framerate 25 -video_size 640x480 -i /dev/video0 \
       -f mpegts \
           -codec:v mpeg1video -s 640x480 -b:v 1000k -bf 0 \
       http://localhost:8081/supersecret

    My webcam supports the resolutions, framerates and formats listed below:

    [video4linux2,v4l2 @ 0x2655360] Compressed:       mjpeg :          Motion-JPEG : 1920x1080 1280x720 1024x768 640x480 800x600 1280x1024 320x240
    [video4linux2,v4l2 @ 0x2655360] Raw       :     yuyv422 :           YUYV 4:2:2 : 1920x1080 1280x720 1024x768 640x480 800x600 1280x1024 320x240
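    (For reference, a listing of this form can be obtained from the v4l2 input device, roughly as in the sketch below; the post does not say how it was produced.)

    # Query the formats/resolutions the camera exposes (v4l2 input device option):
    ffmpeg -f v4l2 -list_formats all -i /dev/video0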

    When I change the resolution in the above code from 640x480 to e.g. 1024x768, I get an error:

    [mpeg1video @ 0x110f4e0] MPEG1/2 does not support 10/1 fps
    Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height

    I assume that the camera driver changes the number of fps and MPEG1/2 doesn’t support that number of fps.
    Probably the camera gives me raw data (yuyv422) and cannot deliver a higher framerate, e.g. only 10 fps at 1024x768.
    How can I modify the ffmpeg command in order to stream video in Full HD resolution?
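    One direction that may help (a sketch, untested on this hardware): the camera advertises a compressed MJPEG stream at the higher resolutions, so requesting that instead of raw yuyv422 should keep the capture framerate at a value mpeg1video accepts. The 1920x1080 size and 2500k bitrate below are my own choices, not values from the post.

    ffmpeg \
       -f v4l2 -input_format mjpeg \
           -framerate 25 -video_size 1920x1080 -i /dev/video0 \
       -f mpegts \
           -codec:v mpeg1video -s 1920x1080 -b:v 2500k -bf 0 \
       http://localhost:8081/supersecret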

  • ffmpeg colorspace conversion speed

    5 May 2017, by Mikhail Novikov

    I am running two ffmpeg commands on a fairly fast, GPU-enabled machine (an AWS g2.2xlarge instance):

    ffmpeg -i ./in.mp4 -s 1280x720 -r 30 -an -f rawvideo -pix_fmt yuv420p - | cat - >/dev/null

    gives 524 fps, while

    ffmpeg -i ./in.mp4 -s 1280x720 -r 30 -an -f rawvideo -pix_fmt argb - | cat - >/dev/null

    gives just 101... it simply shouldn’t take as much as 8 ms per frame on a modern CPU, let alone a GPU!

    What am I doing wrong, and how can I improve the speed of this?

    PS: Now this is truly ridiculous!

    ffmpeg -i ./in.mp4 -s 1280x720 -r 30 -an -f rawvideo -pix_fmt yuv420p - | ffmpeg -s 1280x720 -r 30 -an -f rawvideo -pix_fmt yuv420p -i - -s 1280x720 -r 30 -an -f rawvideo -pix_fmt argb - | cat - >/dev/null

    makes 275 fps! Which is far from perfect, but something I can live with.

    Why?

    Thanks!
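    A back-of-the-envelope check (my own reasoning, not from the post): an argb frame at 1280x720 is 1280 × 720 × 4 ≈ 3.7 MB, versus ≈ 1.4 MB for yuv420p, so matching the 524 fps of the yuv420p run would push roughly 1.9 GB/s through the pipe to cat; both the pipe itself and the single swscale conversion pass are plausible bottlenecks. One way to separate conversion cost from pipe I/O is to drop the pipe entirely (a sketch):

    # Sketch: -benchmark prints CPU/real time; writing to /dev/null removes the pipe,
    # so the figure reflects decode + scale + pixel-format conversion only.
    ffmpeg -benchmark -i ./in.mp4 -s 1280x720 -r 30 -an -f rawvideo -pix_fmt argb -y /dev/null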

  • FFMPEG / streameye - MJPEG streaming from Raspberry cam

    16 May 2017, by Emmanuel Brunet

    I’m trying to stream video from different Raspberry Pis: RPI 2B / 2B+ and RPI 3.
    I found out how to stream rtmp / flv video through NGINX and it works fine, but now I need to convert this stream to an MJPEG stream. Actually I need it as input for motion (detection).

    I’ve tried mjpg_streamer to publish an MJPEG stream, with FFMPEG / image2 in front to split the stream into image chunks, but it’s too slow and too resource-hungry.

    After many searches I’ve installed streameye, which seems perfect, but the process crashes about 10 s after a web client connects to the server hosted on the Raspberry Pi (2/3, same result, so not an armv6l/armv7l issue I think... same with FFMPEG or AVCONV).

    ... and moreover it crashes the SSH session we were using to access the Raspberry device. Can an FFMPEG crash cause an SSH session issue?
    Note that the Raspberry was not down after the connection loss, as all its services were still up and running; I just needed to connect via SSH again.

    I don’t know if it’s due to my ffmpeg feeding command or if it’s a streameye bug. I didn’t find any info about it on Google, except a web client issue that is supposed to have been fixed in 0.8 (the version I’m working with).

    From an FLV / h264 stream I run:

    avconv -r 15 -f flv -i rtmp://127.0.0.1/app/my_stream -c:v mjpeg  -crf 15 -f mjpeg  - | streameye -p 8080

    The source stream is defined as follows (no audio stream):

    Input #0, live_flv, from 'rtmp://boxnet-0-eth/live/boxnet-2':
    Metadata:
    Server          : NGINX RTMP (github.com/arut/nginx-rtmp-module)
    displayWidth    : 480
    displayHeight   : 320
    fps             : 0
    profile         :
    level           :
    Duration: 00:00:00.00, start: 0.000000, bitrate: N/A
    Stream #0:0: Video: h264 (High), yuv420p(progressive), 480x320, 1k tbr, 1k tbn
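    A variant that might be worth trying (a sketch under my own assumptions, not from the post): the mjpeg encoder ignores -crf and takes its quality from -q:v, and lowering the output framerate reduces the load on the Pi; whether this avoids the streameye crash is untested.

    # Sketch: -q:v sets MJPEG quality (lower is better, typical range 2-31); -crf has no effect here.
    # -q:v 5 and -r 10 are arbitrary starting points, not values from the post.
    ffmpeg -f flv -i rtmp://127.0.0.1/app/my_stream -an \
           -c:v mjpeg -q:v 5 -r 10 -f mjpeg - | streameye -p 8080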