Advanced search

Media (1)

Keyword: - Tags - / Rennes

Other articles (52)

  • Creating farms of unique websites

    13 April 2011, by

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects / individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

  • Possibility of farm deployment

    12 April 2011, by

    MediaSPIP can be installed as a farm, with a single "core" hosted on a dedicated server and used by a multitude of different sites.
    This allows, for example: implementation costs to be shared between several projects / individuals; rapid deployment of a multitude of unique sites; and creators to avoid having to put all their work into a digital catch-all, as is the case with the large general-public platforms scattered across the (...)

On other sites (5822)

  • How to solve an issue converting an ismv file with h264 video codec and aac audio codec?

    4 October 2013, by Priyal

    I want to transmux an ismv file to mp4 format. The ISMV file has the following ffmpeg details:

      Metadata:
        major_brand     : isml
        minor_version   : 1
        compatible_brands: piffiso2
        creation_time   : 2013-10-03 14:10:41
      Duration: 15:21:13.16, start: 0.000000, bitrate: 41 kb/s
        Stream #0:0(und): Audio: aac (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 64 kb/s (default)
        Metadata:
          creation_time   : 2013-10-03 14:10:41
          handler_name    : Audio
        Stream #0:1(und): Video: h264 (avc1 / 0x31637661), 1280x720, 3217 kb/s, 29.97 tbr, 10000k tbn, 20000k tbc (default)
        Metadata:
          creation_time   : 2013-10-03 14:10:41
          handler_name    : Video
        Stream #0:2(und): Data: none (dfxp / 0x70786664), 32 kb/s (default)
        Metadata:
          creation_time   : 2013-10-03 14:10:41
          handler_name    : Text

    I used the following ffmpeg command to transmux:

    ffmpeg -y -ss 00:00:10 -i Encoder1.ismv -vcodec libx264 -ar 44100 -t 40 -preset slow -qp 0 Encoder.mp4

    I got the following output:

    ffmpeg version N-56010-g54d628a Copyright (c) 2000-2013 the FFmpeg developers
    built on Sep  4 2013 00:44:27 with gcc 4.7.3 (GCC)
    configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libcaca --enable-libfreetype --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxavs --enable-libxvid --enable-zlib
        libavutil      52. 43.100 / 52. 43.100
        libavcodec     55. 31.100 / 55. 31.100
        libavformat    55. 16.100 / 55. 16.100
        libavdevice    55.  3.100 / 55.  3.100
        libavfilter     3. 83.102 /  3. 83.102
        libswscale      2.  5.100 /  2.  5.100
        libswresample   0. 17.103 /  0. 17.103
        libpostproc    52.  3.100 / 52.  3.100
    [aac @ 040f1ba0] TYPE_FIL: Input buffer exhausted before END element found
    [h264 @ 00353120] AVC: nal size -729776398
    [h264 @ 00353120] no frame!
    [h264 @ 00353120] AVC: nal size -570515285
    [h264 @ 00353120] no frame!
    [h264 @ 00353120] AVC: nal size -1477874754
    [h264 @ 00353120] no frame!
    [h264 @ 00353120] AVC: nal size -712314563
    [h264 @ 00353120] no frame!
    [h264 @ 00353120] AVC: nal size -23151524
    [h264 @ 00353120] no frame!
    [h264 @ 00353120] AVC: nal size -592499201
    [h264 @ 00353120] no frame!
    [h264 @ 00353120] AVC: nal size 225768173
    [h264 @ 00353120] no frame!
    [h264 @ 00353120] AVC: nal size 698187359
    [h264 @ 00353120] AVC: nal size 635127544
    [h264 @ 00353120] no frame!
    [h264 @ 00353120] AVC: nal size -1242688339
    [h264 @ 00353120] AVC: nal size 269543071
    [h264 @ 00353120] no frame!
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0035e840] decoding for stream 1 failed
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0035e840] Could not find codec parameters for stream 1 (Video: h264 (avc1 / 0x31637661), 1280x720, 3217 kb/s): unspecified pixel format
    Consider increasing the value for the 'analyzeduration' and 'probesize' options
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'C:\inetpub\media\archives\DEFAULT WEB SITE\PushToPUblishPoint\Eagan_12034_Soccer_3-10-2013_14_10_19-isml\2013-10-03-14-10-35-436\Segment001\Encoder1.ismv':
      Metadata:
      major_brand     : isml
      minor_version   : 1
      compatible_brands: piffiso2
      creation_time   : 2013-10-03 14:10:41
      Duration: 15:21:13.16, start: 0.000000, bitrate: 41 kb/s
      Stream #0:0(und): Audio: aac (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 64 kb/s (default)
       Metadata:
      creation_time   : 2013-10-03 14:10:41
      handler_name    : Audio
      Stream #0:1(und): Video: h264 (avc1 / 0x31637661), 1280x720, 3217 kb/s, 29.97 tbr, 10000k tbn, 20000k tbc (default)
      Metadata:
      creation_time   : 2013-10-03 14:10:41
      handler_name    : Video
      Stream #0:2(und): Data: none (dfxp / 0x70786664), 32 kb/s (default)
      Metadata:
      creation_time   : 2013-10-03 14:10:41
      handler_name    : Text

    [buffer @ 04679480] Unable to parse option value "-1" as pixel format
                  Last message repeated 1 times
    [buffer @ 04679480] Error setting option pix_fmt to value -1.
    [graph 0 input from stream 0:1 @ 040f7380] Error applying options to the filter.

    Error opening filters!

    Please suggest a solution for Windows.
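
    A hedged sketch, following only the log's own hint about the 'analyzeduration' and 'probesize' options (the values below are guesses, untested against this file); for a pure transmux, stream copy also avoids re-encoding:

    ffmpeg -analyzeduration 10000000 -probesize 50000000 -y -ss 00:00:10 -i Encoder1.ismv -t 40 -c:v copy -c:a copy Encoder.mp4

    If the video stream still cannot be probed (the negative "nal size" / "no frame!" messages), larger probing values, or a newer ffmpeg build that understands this isml/piff fragmented layout, may be needed.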

  • NextJS not working with jsmpeg/node-rtsp-stream. Trying to display an RTSP stream

    15 September 2022, by Vlad Crehul

    Following https://github.com/kyriesent/node-rtsp-stream and "How to display IP camera feed from an RTSP url onto reactjs app page?", I was trying to display the RTSP stream from a CCTV camera, but it gives me an error: ReferenceError: document is not defined at scripts\jsmpeg.min.js (1:701) @ eval

    


    I haven't found a single implementation of this module in NextJS, so I might be doing something wrong, but I can't tell what, and I didn't find a better solution for NextJS.

    


    Nothing in https://github.com/phoboslab/jsmpeg got me unstuck either, but I might be using it wrong here.

    


    The rabbit hole started from "How can I display an RTSP video stream in a web page?", but things are either outdated, don't apply, or I couldn't figure them out.

    


    The actual question:

    


    How can I fix the error I get? Is there an alternative to this in NextJS? I don't care how; all I need is to stream the RTSP feed from a CCTV camera.

    


    Folder structure:

    


    components
      - layout
        - Stream.js
    pages
      - api
      - stream
        - [streamId].js
        - app.js
      - index.js
    scripts
      - jsmpeg.min.js


    


    Stream.js is a component used in stream/app.js, and stream/app.js is used in stream/[streamId].js.

    


    Client-side: Stream.js

    


    import JSMpeg from "../../scripts/jsmpeg.min.js";

    const Stream = (props) => {
      const player = new JSMpeg.Player("ws://localhost:9999", {
        canvas: document.getElementById("video-canvas"), // Canvas should be a canvas DOM element
      });

      return (
        <fragment>
        </fragment>
      );
    };


    Server-side: [streamId].js


    export async function getStaticProps(context) {
      const StreamCCTV = require("node-rtsp-stream");
      const streamCCTV = new StreamCCTV({
        ffmpegPath: "C:\\Program Files\\ffmpeg\\bin\\ffmpeg.exe", //! remove on Ubuntu
        name: "name",
        streamUrl: "rtsp://someuser:somepassword@1.1.1.1",
        wsPort: 9999,
        ffmpegOptions: {
          // options ffmpeg flags
          "-stats": "", // an option with no necessary value uses a blank string
          "-r": 30, // options with required values specify the value after the key
        },
      });


    Edit:


    I have also tried https://www.npmjs.com/package/jsmpeg, where I changed Stream.js to:


    import jsmpeg from 'jsmpeg';

    const Stream = (props) => {
      const client = new WebSocket("ws://localhost:9999");
      const player = new jsmpeg(client, {
        canvas: document.getElementById("video-canvas"), // Canvas should be a canvas DOM element
      });

      return (
        <fragment>
        </fragment>
      );
    };


    Now the error is: ReferenceError: window is not defined
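
    Both errors point at server-side rendering: the jsmpeg scripts touch document/window as soon as they are evaluated, the component itself calls document.getElementById during render, and Next.js runs all of that on the server first. A rough client-only sketch (not the library's documented Next.js usage; it assumes jsmpeg.min.js exports the player object, as the static import in the question implies):

    import { useEffect, useRef } from "react";

    const Stream = (props) => {
      const canvasRef = useRef(null);

      useEffect(() => {
        let player;
        // Load the script only in the browser, never during SSR.
        import("../../scripts/jsmpeg.min.js").then((mod) => {
          const JSMpeg = mod.default ?? mod; // export shape is an assumption
          player = new JSMpeg.Player("ws://localhost:9999", {
            canvas: canvasRef.current, // a real <canvas> DOM node by now
          });
        });
        return () => {
          if (player && typeof player.destroy === "function") player.destroy();
        };
      }, []);

      return <canvas ref={canvasRef} id="video-canvas" />;
    };

    export default Stream;

    Alternatively, the whole Stream component could be loaded from [streamId].js with next/dynamic and ssr: false, which keeps the library out of the server bundle entirely.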


  • How to use FFmpeg properly in Python with OpenCV?

    9 May 2022, by GottaGaming

    I have a problem with my program: it produces no usable output. I am trying to make a webcam video stream on my Raspberry Pi 4 and to use ffmpeg to handle the frame grabbing in Python. My problem is that the output looks terrible; there is no real picture from my cam. Is it the wrong pixel format? It also doesn't seem to care what resolution I choose; it always uses 640x480. Here is the code:


    import os
    import tempfile
    import subprocess
    import cv2
    import numpy as np

    # To get this path execute:
    #    $ which ffmpeg
    FFMPEG_BIN = '/usr/bin/ffmpeg'

    def run_ffmpeg():
        ffmpg_cmd = [
            FFMPEG_BIN,
            '-i', '/dev/video0',
            '-f', 'image2pipe',
            '-framerate', '30',
            '-video_size', '1280x720',
            # opencv requires bgr24 pixel format
            '-input_format', 'mjpeg',
            # '-an', '-sn',             # disable audio processing
            # '-f', 'image2pipe',
            '-',                        # output to go to stdout
        ]
        return subprocess.Popen(ffmpg_cmd, stdout=subprocess.PIPE, bufsize=10**8)

    def run_cv_window(process):
        while True:
            # read frame-by-frame
            raw_image = process.stdout.read(640*480*3)
            if raw_image == b'':
                raise RuntimeError("Empty pipe")

            # transform the bytes read into a numpy array
            frame = np.frombuffer(raw_image, dtype='uint8')
            frame = frame.reshape((480, 640, 3))  # height, width, channels
            if frame is not None:
                cv2.imshow('Video', frame)

            if cv2.waitKey(1) & 0xFF == ord('q'):
                break
            process.stdout.flush()

        cv2.destroyAllWindows()
        process.terminate()
        print(process.poll())

    def run():
        ffmpeg_process = run_ffmpeg()
        run_cv_window(ffmpeg_process)

    run()


    This is the output on the console:


    libavutil      56. 51.100 / 56. 51.100
    libavcodec     58. 91.100 / 58. 91.100
    libavformat    58. 45.100 / 58. 45.100
    libavdevice    58. 10.100 / 58. 10.100
    libavfilter     7. 85.100 /  7. 85.100
    libavresample   4.  0.  0 /  4.  0.  0
    libswscale      5.  7.100 /  5.  7.100
    libswresample   3.  7.100 /  3.  7.100
    libpostproc    55.  7.100 / 55.  7.100
    Input #0, video4linux2,v4l2, from '/dev/video0':
      Duration: N/A, start: 2588.219343, bitrate: 147456 kb/s
        Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 640x480, 147456 kb/s, 30 fps, 30 tbr, 1000k tbn, 1000k tbc
    Stream mapping:
      Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
    Press [q] to stop, [?] for help
    [swscaler @ 0xb317a0] deprecated pixel format used, make sure you did set range correctly
    Output #0, image2pipe, to 'pipe:':
      Metadata:
        encoder         : Lavf58.45.100
        Stream #0:0: Video: mjpeg, yuvj422p(pc), 640x480, q=2-31, 200 kb/s, 30 fps, 30 tbn, 30 tbc
        Metadata:
          encoder         : Lavc58.91.100 mjpeg
        Side data:
          cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: N/A
    frame=   16 fps=0.0 q=24.8 size=     331kB time=00:00:00.53 bitrate=5082.7kbits/s speed=1.06x
    frame=   31 fps= 31 q=24.8 size=     510kB time=00:00:01.03 bitrate=4044.3kbits/s speed=1.02x
    frame=   46 fps= 30 q=24.8 size=     694kB time=00:00:01.53 bitrate=3706.8kbits/s speed=1.01x


    This is why I chose 640x480 in the main code; it won't use the 1280x720 resolution. There is no real output picture and I don't know why. Can someone please help me?


    Here is the output picture: [image omitted]
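
    Two things stand out from this log, and both are addressed in the rough, untested sketch below: the -video_size / -input_format / -framerate options are placed after -i, so they act as output options and the camera stays at its 640x480 yuyv422 default; and the pipe carries MJPEG-compressed images (image2pipe) while the reading loop expects fixed-size raw frames. Assuming the camera actually supports 1280x720 over MJPEG, requesting raw bgr24 output makes every read() return exactly one frame:

    import subprocess
    import numpy as np
    import cv2

    WIDTH, HEIGHT = 1280, 720
    cmd = [
        '/usr/bin/ffmpeg',
        '-f', 'v4l2',                    # input options must come before -i
        '-input_format', 'mjpeg',
        '-video_size', f'{WIDTH}x{HEIGHT}',
        '-framerate', '30',
        '-i', '/dev/video0',
        '-f', 'rawvideo',                # raw frames instead of JPEG images
        '-pix_fmt', 'bgr24',             # the layout OpenCV expects
        '-',                             # write to stdout
    ]
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, bufsize=10**8)

    while True:
        raw = proc.stdout.read(WIDTH * HEIGHT * 3)   # exactly one frame
        if len(raw) < WIDTH * HEIGHT * 3:
            break
        frame = np.frombuffer(raw, dtype=np.uint8).reshape((HEIGHT, WIDTH, 3))
        cv2.imshow('Video', frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    cv2.destroyAllWindows()
    proc.terminate()

    With -f rawvideo -pix_fmt bgr24 there is no JPEG data in the pipe, so the fixed-size read and reshape on the Python side line up with what ffmpeg writes.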
