Advanced search

Media (0)

No media matching your criteria is available on the site.

Other articles (20)

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011, by

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects/individuals, rapid deployment of multiple unique sites, and creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • Other interesting software

    13 April 2011, by

    We don’t claim to be the only ones doing what we do, and certainly not the best at it; we simply try to do it well and to keep getting better.
    The following list covers software that is more or less similar to MediaSPIP, or that MediaSPIP more or less tries to emulate.
    We don’t know them and haven’t tried them, but you can take a peek.
    Videopress
    Website: http://videopress.com/
    License: GNU/GPL v2
    Source code: (...)

On other sites (5381)

  • avfilter/avf_showcqt: cqt_calc optimization on x86

    4 June 2016, by Muhammad Faiz
    avfilter/avf_showcqt: cqt_calc optimization on x86
    

    on x86_64:
              time     PSNR
    plain     3.303    inf
    SSE       1.649    107.087535
    SSE3      1.632    107.087535
    AVX       1.409    106.986771
    FMA3      1.265    107.108437

    on x86_32 (PSNR compared to x86_64 plain):
              time     PSNR
    plain     7.225    103.951979
    SSE       1.827    105.859282
    SSE3      1.819    105.859282
    AVX       1.533    105.997661
    FMA3      1.384    105.885377

    FMA4 test is not available

    Reviewed-by: James Almer <jamrial@gmail.com>
    Signed-off-by: Muhammad Faiz <mfcc64@gmail.com>

    • [DH] libavfilter/avf_showcqt.c
    • [DH] libavfilter/avf_showcqt.h
    • [DH] libavfilter/x86/Makefile
    • [DH] libavfilter/x86/avf_showcqt.asm
    • [DH] libavfilter/x86/avf_showcqt_init.c
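
    For context, showcqt is the filter this patch speeds up: it takes an audio input and renders its constant-Q transform as video. A minimal sketch of driving it from Node.js, consistent with the other code on this page (file names are placeholders, not part of the commit):

        // Sketch only: render the CQT spectrum of an audio file with the showcqt filter.
        // "input.mp3" and "cqt.mp4" are placeholder names.
        import { spawn } from "node:child_process";

        const ffmpeg = spawn("ffmpeg", [
          "-i", "input.mp3",                            // any audio input
          "-filter_complex", "showcqt,format=yuv420p",  // CQT rendered as video frames
          "-c:v", "libx264",
          "cqt.mp4",
        ], { stdio: "inherit" });

        ffmpeg.on("exit", (code) => console.log("ffmpeg exited with code", code));
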
  • WebRTC to RTMP: send video from camera to an RTMP link

    14 April 2024, by Leo-Mahendra

    I can't send the video from WebRTC, which is converted to buffered data every 10 seconds and sent to server.js, where it is received via WebSockets and converted to FLV format using FFmpeg.

    I am trying to send it to an RTMP server named restreamer to start with. Here I tried to convert the buffer data and send it to the RTMP link using FFmpeg commands; initially I managed to successfully save the file from WebRTC to MP4 format for a duration of 2-3 minutes.
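
    As a rough sketch (not part of the question), that one-shot save amounts to feeding the recorded WebM data to ffmpeg and writing an MP4; file names here are placeholders:

        // Sketch only: convert a WebM file assembled from MediaRecorder chunks to MP4.
        import { spawn } from "node:child_process";

        const toMp4 = spawn("ffmpeg", [
          "-i", "recording.webm",   // placeholder: the saved WebM recording
          "-c:v", "libx264",        // re-encode VP8/VP9 video to H.264
          "-c:a", "aac",            // re-encode audio (if any) to AAC
          "recording.mp4",
        ], { stdio: "inherit" });

        toMp4.on("exit", (code) => console.log("ffmpeg exited with code", code));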

    After that I tried to use WebRTC to send video data every 10 seconds and, in the server, to send it on to RTMP, but I can't get it through. I can see that the connection between the RTMP URL and the server is taking place, but I can't see the video; the logs in the RTMP server read as follows:

    2024-04-14 12:35:45 ts=2024-04-14T07:05:45Z level=INFO component="RTMP" msg="no streams available" action="INVALID" address=":1935" client="172.17.0.1:37700" path="/3d30c5a9-2059-4843-8957-da963c7bc19b.stream" who="PUBLISH"
    2024-04-14 12:35:45 ts=2024-04-14T07:05:45Z level=INFO component="RTMP" msg="no streams available" action="INVALID" address=":1935" client="172.17.0.1:37716" path="/3d30c5a9-2059-4843-8957-da963c7bc19b.stream" who="PUBLISH"
    2024-04-14 12:35:45 ts=2024-04-14T07:05:45Z level=INFO component="RTMP" msg="no streams available" action="INVALID" address=":1935" client="172.17.0.1:37728" path="/3d30c5a9-2059-4843-8957-da963c7bc19b.stream" who="PUBLISH"
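
    One way to sanity-check the endpoint itself (not part of the question) is to publish a synthetic test stream to the same path with a plain ffmpeg command and see whether it shows up in restreamer; a rough sketch, reusing the path from the log above as a placeholder:

        // Sketch only: push a test pattern to the restreamer path to confirm it accepts publishers.
        import { spawn } from "node:child_process";

        const probe = spawn("ffmpeg", [
          "-re",
          "-f", "lavfi", "-i", "testsrc=size=640x480:rate=25", // synthetic video source
          "-c:v", "libx264", "-preset", "veryfast",
          "-f", "flv",
          "rtmp://localhost:1935/3d30c5a9-2059-4843-8957-da963c7bc19b.stream", // placeholder URL
        ], { stdio: "inherit" });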

    My frontend code:

    const handleSendVideo = async () => {
        console.log("start");

        if (!ws) {
            console.error('WebSocket connection not established.');
            return;
        }

        try {
            const videoStream = await navigator.mediaDevices.getUserMedia({ video: true });
            const mediaRecorder = new MediaRecorder(videoStream);

            const requiredFrameSize = 460800;
            const frameDuration = 10 * 1000; // 10 seconds in milliseconds

            mediaRecorder.ondataavailable = async (event) => {
                if (ws.readyState !== WebSocket.OPEN) {
                    console.error('WebSocket connection is not open.');
                    return;
                }

                if (event.data.size > 0) {
                    const arrayBuffer = await event.data.arrayBuffer();
                    const uint8Array = new Uint8Array(arrayBuffer);

                    const width = videoStream.getVideoTracks()[0].getSettings().width;
                    const height = videoStream.getVideoTracks()[0].getSettings().height;

                    const numFrames = Math.ceil(uint8Array.length / requiredFrameSize);

                    for (let i = 0; i < numFrames; i++) {
                        const start = i * requiredFrameSize;
                        const end = Math.min((i + 1) * requiredFrameSize, uint8Array.length);
                        let frameData = uint8Array.subarray(start, end);

                        // Pad or trim the frameData to match the required size
                        if (frameData.length < requiredFrameSize) {
                            // Pad with zeros to reach the required size
                            const paddedData = new Uint8Array(requiredFrameSize);
                            paddedData.set(frameData, 0);
                            frameData = paddedData;
                        } else if (frameData.length > requiredFrameSize) {
                            // Trim to match the required size
                            frameData = frameData.subarray(0, requiredFrameSize);
                        }

                        const dataToSend = {
                            buffer: Array.from(frameData), // Convert Uint8Array to array of numbers
                            width: width,
                            height: height,
                            pixelFormat: 'yuv420p',
                            mode: 'SendRtmp'
                        };

                        console.log("Sending frame:", i);
                        ws.send(JSON.stringify(dataToSend));
                    }
                }
            };

            // Start recording and send data every 10 seconds
            mediaRecorder.start(frameDuration);

            console.log("MediaRecorder started.");
        } catch (error) {
            console.error('Error accessing media devices or starting recorder:', error);
        }
    };

    And my backend:

    wss.on('connection', (ws) => {
        console.log('WebSocket connection established.');

        ws.on('message', async (data) => {
            try {
                const parsedData = JSON.parse(data);

                if (parsedData.mode === 'SendRtmp' && Array.isArray(parsedData.buffer)) {
                    const { buffer, pixelFormat, width, height } = parsedData;
                    const bufferArray = Buffer.from(buffer);

                    await sendRtmpVideo(bufferArray, pixelFormat, width, height);
                } else {
                    console.log('Received unknown or invalid mode or buffer data');
                }
            } catch (error) {
                console.error('Error parsing WebSocket message:', error);
            }
        });

        ws.on('close', () => {
            console.log('WebSocket connection closed.');
        });
    });

    const sendRtmpVideo = async (frameBuffer, pixelFormat, width, height) => {
        console.log("ffmpeg data", frameBuffer);
        try {
            const ratio = `${width}x${height}`;
            const ffmpegCommand = [
                '-re',
                '-f', 'rawvideo',
                '-pix_fmt', pixelFormat,
                '-s', ratio,
                '-i', 'pipe:0',
                '-c:v', 'libx264',
                '-preset', 'fast', // Specify the preset for libx264
                '-b:v', '3000k',   // Specify the video bitrate
                '-loglevel', 'debug',
                '-f', 'flv',
                // '-flvflags', 'no_duration_filesize',
                RTMPLINK
            ];

            const ffmpeg = spawn('ffmpeg', ffmpegCommand);

            ffmpeg.on('exit', (code, signal) => {
                if (code === 0) {
                    console.log('FFmpeg process exited successfully.');
                } else {
                    console.error(`FFmpeg process exited with code ${code} and signal ${signal}`);
                }
            });

            ffmpeg.on('error', (error) => {
                console.error('FFmpeg spawn error:', error);
            });

            ffmpeg.stderr.on('data', (data) => {
                console.error(`FFmpeg stderr: ${data}`);
            });

            ffmpeg.stdin.write(frameBuffer, (err) => {
                if (err) {
                    console.error('Error writing to FFmpeg stdin:', err);
                } else {
                    console.log('Data written to FFmpeg stdin successfully.');
                }
                ffmpeg.stdin.end(); // Close stdin after writing the buffer
            });
        } catch (error) {
            console.error('Error in sendRtmpVideo:', error);
        }
    };
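
    For contrast, a common variation (not what the question's code does) is to start one long-lived ffmpeg process per WebSocket connection, treat the incoming chunks as a WebM byte stream rather than fixed-size raw yuv420p frames, and keep writing to stdin without closing it after each message. A rough sketch under those assumptions, with the port and RTMP URL as placeholders:

        // Sketch only: one persistent ffmpeg per connection, fed binary WebM chunks.
        import { spawn } from "node:child_process";
        import { WebSocketServer } from "ws";

        const RTMP_URL = "rtmp://localhost:1935/stream-key"; // placeholder
        const wss = new WebSocketServer({ port: 8080 });     // placeholder port

        wss.on("connection", (ws) => {
          const ffmpeg = spawn("ffmpeg", [
            "-i", "pipe:0",                    // MediaRecorder output is WebM, not raw video
            "-c:v", "libx264", "-preset", "veryfast", "-tune", "zerolatency",
            "-c:a", "aac",
            "-f", "flv",
            RTMP_URL,
          ]);

          // The client would send event.data as binary (e.g. ws.send(await event.data.arrayBuffer())).
          ws.on("message", (chunk) => ffmpeg.stdin.write(chunk));
          ws.on("close", () => ffmpeg.stdin.end()); // let ffmpeg finish the stream
        });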

  • Transcode webcam blob to RTMP using ffmpeg.wasm

    29 November 2023, by hassan moradnezhad

    I'm trying to transcode webcam blob data to an RTMP server from the browser by using ffmpeg.wasm.
    First, I create a MediaStream.

        const stream = await navigator.mediaDevices.getUserMedia({
            video: true,
        });

    Then, I create a MediaRecorder.

        const recorder = new MediaRecorder(stream, {mimeType: "video/webm; codecs:vp9"});
        recorder.ondataavailable = handleDataAvailable;
        recorder.start(0);

    When data is available, I call a function called handleDataAvailable.
    Here is the function.

    const handleDataAvailable = (event: BlobEvent) => {
        console.log("data-available");
        if (event.data.size > 0) {
            recordedChunksRef.current.push(event.data);
            transcode(event.data);
        }
    };

    In the above code I use another function called transcode; its goal is to send the data to the RTMP server using ffmpeg.wasm.
    Here it is.

    const transcode = async (inputVideo: Blob | undefined) => {
        const ffmpeg = ffmpegRef.current;
        const fetchFileOutput = await fetchFile(inputVideo);
        ffmpeg?.writeFile('input.webm', fetchFileOutput);

        const data = await ffmpeg?.readFile('input.webm');
        if (videoRef.current) {
            videoRef.current.src =
                URL.createObjectURL(new Blob([(data as any)?.buffer], {type: 'video/webm'}));
        }

        // execute by node-media-server config 1
        await ffmpeg?.exec(['-re', '-i', 'input.webm', '-c', 'copy', '-f', 'flv', "rtmp://localhost:1935/live/ttt"]);

        // execute by node-media-server config 2
        // await ffmpeg?.exec(['-re', '-i', 'input.webm', '-c:v', 'libx264', '-preset', 'veryfast', '-tune', 'zerolatency', '-c:a', 'aac', '-ar', '44100', '-f', 'flv', 'rtmp://localhost:1935/live/ttt']);

        // execute by stack-over-flow config 1
        // await ffmpeg?.exec(['-re', '-i', 'input.webm', '-c:v', 'h264', '-c:a', 'aac', '-f', 'flv', "rtmp://localhost:1935/live/ttt"]);

        // execute by stack-over-flow config 2
        // await ffmpeg?.exec(['-i', 'input.webm', '-c:v', 'libx264', '-flags:v', '+global_header', '-c:a', 'aac', '-ac', '2', '-f', 'flv', "rtmp://localhost:1935/live/ttt"]);

        // execute by stack-over-flow config 3
        // await ffmpeg?.exec(['-i', 'input.webm', '-acodec', 'aac', '-ac', '2', '-strict', 'experimental', '-ab', '160k', '-vcodec', 'libx264', '-preset', 'slow', '-profile:v', 'baseline', '-level', '30', '-maxrate', '10000000', '-bufsize', '10000000', '-b', '1000k', '-f', 'flv', 'rtmp://localhost:1935/live/ttt']);
    };
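
    For comparison, the more usual ffmpeg.wasm pattern (not what the question attempts) is to transcode into a file in its in-memory filesystem and read the bytes back, leaving any actual RTMP publishing to a server-side process; a rough sketch reusing the question's variables:

        // Sketch only: transcode to an in-memory FLV file instead of an rtmp:// URL.
        const transcodeToFlvBytes = async (inputVideo: Blob) => {
            const ffmpeg = ffmpegRef.current;
            await ffmpeg?.writeFile('input.webm', await fetchFile(inputVideo));

            await ffmpeg?.exec(['-i', 'input.webm', '-c:v', 'libx264', '-c:a', 'aac', '-f', 'flv', 'out.flv']);

            // The returned bytes could then be shipped to a server (e.g. over a WebSocket)
            // that owns the real RTMP connection.
            return ffmpeg?.readFile('out.flv');
        };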

    After running the app and starting streaming, the console logs are as below.

    ffmpeg >>> ffmpeg version 5.1.3 Copyright (c) 2000-2022 the FFmpeg developers
    ffmpeg >>>   built with emcc (Emscripten gcc/clang-like replacement + linker emulating GNU ld) 3.1.40 (5c27e79dd0a9c4e27ef2326841698cdd4f6b5784)
    ffmpeg >>>   configuration: --target-os=none --arch=x86_32 --enable-cross-compile --disable-asm --disable-stripping --disable-programs --disable-doc --disable-debug --disable-runtime-cpudetect --disable-autodetect --nm=emnm --ar=emar --ranlib=emranlib --cc=emcc --cxx=em++ --objcc=emcc --dep-cc=emcc --extra-cflags='-I/opt/include -O3 -msimd128' --extra-cxxflags='-I/opt/include -O3 -msimd128' --disable-pthreads --disable-w32threads --disable-os2threads --enable-gpl --enable-libx264 --enable-libx265 --enable-libvpx --enable-libmp3lame --enable-libtheora --enable-libvorbis --enable-libopus --enable-zlib --enable-libwebp --enable-libfreetype --enable-libfribidi --enable-libass --enable-libzimg
    ffmpeg >>>   libavutil      57. 28.100 / 57. 28.100
    ffmpeg >>>   libavcodec     59. 37.100 / 59. 37.100
    ffmpeg >>>   libavformat    59. 27.100 / 59. 27.100
    ffmpeg >>>   libavdevice    59.  7.100 / 59.  7.100
    ffmpeg >>>   libavfilter     8. 44.100 /  8. 44.100
    ffmpeg >>>   libswscale      6.  7.100 /  6.  7.100
    ffmpeg >>>   libswresample   4.  7.100 /  4.  7.100
    ffmpeg >>>   libpostproc    56.  6.100 / 56.  6.100
    ffmpeg >>> Input #0, matroska,webm, from 'input.webm':
    ffmpeg >>>   Metadata:
    ffmpeg >>>     encoder         : QTmuxingAppLibWebM-0.0.1
    ffmpeg >>>   Duration: N/A, start: 0.000000, bitrate: N/A
    ffmpeg >>>   Stream #0:0(eng): Video: vp8, yuv420p(progressive), 640x480, SAR 1:1 DAR 4:3, 15.50 tbr, 1k tbn (default)

    The problem occurs when ffmpeg.wasm tries to execute the last command:
    await ffmpeg?.exec(['-re', '-i', 'input.webm', '-c', 'copy', '-f', 'flv', "rtmp://localhost:1935/live/ttt"])
    It just makes a GET request; I give further details about this request below.
    As you can see, I tried lots of argument variants with ffmpeg?.exec, but none of them works.

    The network tab in the browser, after ffmpeg.wasm executes the command, is as below.

    [screenshot of the browser network tab]

    It sends a GET request to ws://localhost:1935/ and nothing happens after that.

    For the backend I use node-media-server; here are my output logs when ffmpeg.wasm tries to execute the args:

    11/28/2023 19:33:18 55301 [INFO] [rtmp disconnect] id=JL569YOF
    [NodeEvent on doneConnect] id=JL569YOF args=undefined

    At last, here are my questions:

    • How can I achieve this?

    • Is it possible to send the webcam stream to an RTMP server?