Advanced search

Media (91)

Other articles (106)

  • Authorizations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to modify their own information on the authors page

  • Encoding and processing into web-friendly formats

    13 April 2011, by

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded to MP4, OGV and WebM (formats supported by HTML5), with MP4 also playable in Flash.
    Audio files are encoded to MP3 and Ogg (supported by HTML5), with MP3 also playable in Flash.
    Where possible, text is analyzed to extract the data needed for search-engine indexing, and the document is then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)
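
    Under the hood these conversions are typically done with ffmpeg; the following is only a rough sketch of the kind of commands involved (illustrative file names and settings, not MediaSPIP's actual configuration):

    # Hypothetical ffmpeg invocations for the web-friendly formats listed above.
    ffmpeg -i upload.mov -c:v libx264 -c:a aac -movflags +faststart video.mp4   # MP4 (HTML5 / Flash)
    ffmpeg -i upload.mov -c:v libtheora -q:v 6 -c:a libvorbis video.ogv         # OGV (HTML5)
    ffmpeg -i upload.mov -c:v libvpx -b:v 1M -c:a libvorbis video.webm          # WebM (HTML5)
    ffmpeg -i upload.wav -c:a libmp3lame -q:a 4 audio.mp3                       # MP3 (HTML5 / Flash)
    ffmpeg -i upload.wav -c:a libvorbis audio.ogg                               # Ogg (HTML5)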

  • Customisable form

    21 June 2013, by

    This page presents the fields available in the media publication form and lists the additional fields that can be added. Media creation form
    For a media-type document, the fields offered by default are: Text, Enable/disable the forum (the comment prompt can be disabled for each article), Licence, Add/remove authors, Tags.
    This form can be modified under:
    Administration > Configuration des masques de formulaire. (...)

On other sites (12744)

  • webrtc to rtmp send video from camera to rtmp link

    14 April 2024, by Leo-Mahendra

    I can't send the video from WebRTC, which is converted to buffered data every 10 seconds and sent to server.js, where it is received over WebSockets and converted to FLV format using ffmpeg.

    


    To start with, I am trying to send it to an RTMP server named Restreamer. Here I tried to convert the buffered data and push it to the RTMP link using ffmpeg commands; initially I did manage to save the WebRTC stream to an MP4 file for a duration of 2-3 minutes.

    


    Then I tried to have WebRTC send video data every 10 seconds, with the server forwarding it to RTMP, but this does not work: I can see that the connection between the server and the RTMP URL is established, yet no video appears. The RTMP server logs show:

    


    2024-04-14 12:35:45 ts=2024-04-14T07:05:45Z level=INFO component="RTMP" msg="no streams available" action="INVALID" address=":1935" client="172.17.0.1:37700" path="/3d30c5a9-2059-4843-8957-da963c7bc19b.stream" who="PUBLISH"
2024-04-14 12:35:45 ts=2024-04-14T07:05:45Z level=INFO component="RTMP" msg="no streams available" action="INVALID" address=":1935" client="172.17.0.1:37716" path="/3d30c5a9-2059-4843-8957-da963c7bc19b.stream" who="PUBLISH"
2024-04-14 12:35:45 ts=2024-04-14T07:05:45Z level=INFO component="RTMP" msg="no streams available" action="INVALID" address=":1935" client="172.17.0.1:37728" path="/3d30c5a9-2059-4843-8957-da963c7bc19b.stream" who="PUBLISH"   


    


    My frontend code:

    


         const handleSendVideo = async () => {
        console.log("start");
    
        if (!ws) {
            console.error('WebSocket connection not established.');
            return;
        }
    
        try {
            const videoStream = await navigator.mediaDevices.getUserMedia({ video: true });
            const mediaRecorder = new MediaRecorder(videoStream);
    
            const requiredFrameSize = 460800;
            const frameDuration = 10 * 1000; // 10 seconds in milliseconds
    
            mediaRecorder.ondataavailable = async (event) => {
                if (ws.readyState !== WebSocket.OPEN) {
                    console.error('WebSocket connection is not open.');
                    return;
                }
    
                if (event.data.size > 0) {
                    const arrayBuffer = await event.data.arrayBuffer();
                    const uint8Array = new Uint8Array(arrayBuffer);
    
                    const width = videoStream.getVideoTracks()[0].getSettings().width;
                    const height = videoStream.getVideoTracks()[0].getSettings().height;
    
                    const numFrames = Math.ceil(uint8Array.length / requiredFrameSize);
    
                    for (let i = 0; i < numFrames; i++) {
                        const start = i * requiredFrameSize;
                        const end = Math.min((i + 1) * requiredFrameSize, uint8Array.length);
                        let frameData = uint8Array.subarray(start, end);
    
                        // Pad or trim the frameData to match the required size
                        if (frameData.length < requiredFrameSize) {
                            // Pad with zeros to reach the required size
                            const paddedData = new Uint8Array(requiredFrameSize);
                            paddedData.set(frameData, 0);
                            frameData = paddedData;
                        } else if (frameData.length > requiredFrameSize) {
                            // Trim to match the required size
                            frameData = frameData.subarray(0, requiredFrameSize);
                        }
    
                        const dataToSend = {
                            buffer: Array.from(frameData), // Convert Uint8Array to array of numbers
                            width: width,
                            height: height,
                            pixelFormat: 'yuv420p',
                            mode: 'SendRtmp'
                        };
    
                        console.log("Sending frame:", i);
                        ws.send(JSON.stringify(dataToSend));
                    }
                }
            };
    
            // Start recording and send data every 10 seconds
            mediaRecorder.start(frameDuration);
    
            console.log("MediaRecorder started.");
        } catch (error) {
            console.error('Error accessing media devices or starting recorder:', error);
        }
      };


    


    And my backend:

    


    wss.on('connection', (ws) => {
        console.log('WebSocket connection established.');

        ws.on('message', async (data) => {
            try {
                const parsedData = JSON.parse(data);

                if (parsedData.mode === 'SendRtmp' && Array.isArray(parsedData.buffer)) {
                    const { buffer, pixelFormat, width, height } = parsedData;
                    const bufferArray = Buffer.from(buffer);

                    await sendRtmpVideo(bufferArray, pixelFormat, width, height);
                } else {
                    console.log('Received unknown or invalid mode or buffer data');
                }
            } catch (error) {
                console.error('Error parsing WebSocket message:', error);
            }
        });

        ws.on('close', () => {
            console.log('WebSocket connection closed.');
        });
    });

    const sendRtmpVideo = async (frameBuffer, pixelFormat, width, height) => {
        console.log("ffmpeg data", frameBuffer);
        try {
            const ratio = `${width}x${height}`;
            const ffmpegCommand = [
                '-re',
                '-f', 'rawvideo',
                '-pix_fmt', pixelFormat,
                '-s', ratio,
                '-i', 'pipe:0',
                '-c:v', 'libx264',
                '-preset', 'fast',   // Specify the preset for libx264
                '-b:v', '3000k',     // Specify the video bitrate
                '-loglevel', 'debug',
                '-f', 'flv',
                // '-flvflags', 'no_duration_filesize',
                RTMPLINK
            ];

            const ffmpeg = spawn('ffmpeg', ffmpegCommand);

            ffmpeg.on('exit', (code, signal) => {
                if (code === 0) {
                    console.log('FFmpeg process exited successfully.');
                } else {
                    console.error(`FFmpeg process exited with code ${code} and signal ${signal}`);
                }
            });

            ffmpeg.on('error', (error) => {
                console.error('FFmpeg spawn error:', error);
            });

            ffmpeg.stderr.on('data', (data) => {
                console.error(`FFmpeg stderr: ${data}`);
            });

            ffmpeg.stdin.write(frameBuffer, (err) => {
                if (err) {
                    console.error('Error writing to FFmpeg stdin:', err);
                } else {
                    console.log('Data written to FFmpeg stdin successfully.');
                }
                ffmpeg.stdin.end(); // Close stdin after writing the buffer
            });
        } catch (error) {
            console.error('Error in sendRtmpVideo:', error);
        }
    };
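
    For comparison, here is a minimal sketch of a pipeline that hands the MediaRecorder output to ffmpeg unchanged. It rests on assumptions rather than on the code above: the chunks are treated as WebM/Matroska data instead of raw yuv420p frames, they are fed to a single long-lived ffmpeg process rather than one process per chunk, and RTMP_LINK is a placeholder for the Restreamer publish URL.

    # Sketch only: one persistent ffmpeg process reading the recorded WebM
    # chunks from stdin (simulated here with cat) and publishing FLV/H.264.
    # -an is used because the snippet above captures video only.
    cat recorded-chunks.webm | ffmpeg -re -i pipe:0 \
        -c:v libx264 -preset veryfast -tune zerolatency -b:v 3000k \
        -an -f flv "$RTMP_LINK"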



    


  • HLS. FFmpeg: error when loading first segment [closed]

    30 April 2024, by rus_99_pk

    I'm trying to download a streaming video using ffmpeg. There is a file in *.m3u8 format. But if it were that simple, I would not have come here.

    


    There are a number of nuances:

    


      

    1. It cannot be downloaded by specifying a link to the file.

    2. If you download the file and look at its contents, there is: #EXT-X-KEY:METHOD=AES-128,URI="[KEY]",IV=[IV]


    


    With the URI, the task is easy to solve; just specify the value in list.m3u8.

    


    I end up getting:

    


    Error when loading first segment 'https://cdnv-m12.boomstream.com/vod/hash:21596def3216ed982660d609751b8078/id:35105.29443.1039983.85853232.150106.hls/time:0/data:eyJ2ZXJzaW9uIjoiMS4yLjk3IiwidXNlX2RpcmVjdF9saW5rcyI6InllcyIsImlzX2VuY3J5cHQiOiJ5ZXMifQ==/m61/2024/04/27/1Q0idCxb.mp4/media-1.ts'


    


    But with the IV it’s more difficult, because file processing is performed on the server side. Please help.

    


    I tried searching the JS for similar variable names and monitoring the network traffic with Wireshark, in the hope of catching a response from the server containing the IV.

    


    But it didn't help me.

    


    My download script:

    


    #!/bin/bash
clear

link="/home/user/Download/chunklist.m3u8"
filename="testfile"

ffmpeg \
-headers $'User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:124.0) Gecko/20100101 Firefox/124.0\r\nAccept: */*\r\nAccept-Language: ru-RU,ru;q=0.8,en-US;q=0.5,en;q=0.3\r\nAccept-Encoding: gzip, deflate, br\r\nOrigin: https://example.com\r\nConn>
-protocol_whitelist "file,http,https,tcp,tls,crypto" \
-allowed_extensions ALL \
-f hls \
-i "$link" \
-map p:2 \
-bsf:a aac_adtstoasc  -vcodec copy  -c copy -crf 50 /tmp/$filename.mp4 -v trace


    


    Trace log
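
    A possible direction, sketched under assumptions (the key file, IV value and segment name below are placeholders, not values recovered from this stream): per the HLS spec, when an EXT-X-KEY tag carries no IV attribute the segment's Media Sequence Number is used as the IV, and ffmpeg applies that automatically; when an explicit key/IV pair has to be forced, the local playlist can be rewritten to point at a local key file (the script above already whitelists the file and crypto protocols this needs), and a single segment can be decrypted by hand to verify the pair.

    # Sketch only: key.bin is the 16-byte key saved locally; the IV shown is a
    # placeholder for the value recovered from the player or network traffic.
    sed -i 's|#EXT-X-KEY:.*|#EXT-X-KEY:METHOD=AES-128,URI="file:///home/user/Download/key.bin",IV=0x00000000000000000000000000000001|' \
        /home/user/Download/chunklist.m3u8

    # Decrypt one segment manually to check the key/IV pair (openssl takes hex):
    openssl aes-128-cbc -d -K "$(xxd -p key.bin | tr -d '\n')" \
        -iv 00000000000000000000000000000001 \
        -in media-1.ts -out media-1-dec.ts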

    


  • ffmpeg: Unrecognized option 'alpha_quality'

    16 April 2024, by László Monda

    I want to make transparent videos work in Safari, which doesn't support WebM for this purpose and only accepts H.265 with alpha transparency.

    


    According to this post, I used Shutter Encoder, but only some of its versions work for this purpose on Mac.

    


    Instead of using Shutter Encoder on a Mac, I want to use ffmpeg on my Linux PC. Shutter Encoder uses the following command in the background:

    


    ffmpeg -threads 0 -hwaccel none -i input.mov -c:v hevc_videotoolbox -alpha_quality 1 -b:v 1000k -profile:v main -level 5.2 -map v:0 -an -pix_fmt yuva420p -sws_flags bicubic -tag:v hvc1 -metadata creation_time=2024-04-14T14:53:08.734684Z -y output.mp4

    


    which yields the following output on my PC:

    


    ffmpeg version 4.4.2-0ubuntu0.22.04.1 Copyright (c) 2000-2021 the FFmpeg developers
  built with gcc 11 (Ubuntu 11.2.0-19ubuntu1)
  configuration: --prefix=/usr --extra-version=0ubuntu0.22.04.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libdav1d --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librabbitmq --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzimg --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-pocketsphinx --enable-librsvg --enable-libmfx --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
  WARNING: library configuration mismatch
  avcodec     configuration: --prefix=/usr --extra-version=0ubuntu0.22.04.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libdav1d --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librabbitmq --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzimg --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-pocketsphinx --enable-librsvg --enable-libmfx --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared --enable-version3 --disable-doc --disable-programs --enable-libaribb24 --enable-libopencore_amrnb --enable-libopencore_amrwb --enable-libtesseract --enable-libvo_amrwbenc --enable-libsmbclient
  libavutil      56. 70.100 / 56. 70.100
  libavcodec     58.134.100 / 58.134.100
  libavformat    58. 76.100 / 58. 76.100
  libavdevice    58. 13.100 / 58. 13.100
  libavfilter     7.110.100 /  7.110.100
  libswscale      5.  9.100 /  5.  9.100
  libswresample   3.  9.100 /  3.  9.100
  libpostproc    55.  9.100 / 55.  9.100
Unrecognized option 'alpha_quality'.
Error splitting the argument list: Option not found


    


    When googling for "Unrecognized option 'alpha_quality'.", there are no results, which I find very odd.

    


    What's going on, and how can I make ffmpeg work for this purpose without Shutter Encoder?
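
    As far as I can tell, -alpha_quality is a private option of the Apple VideoToolbox encoders (such as hevc_videotoolbox), which are only available in macOS builds of ffmpeg; a stock Linux build contains no encoder exposing that option, so the command-line parser rejects it outright. A small diagnostic sketch (not a fix) to confirm what a given build provides:

    # Check whether this build ships the VideoToolbox encoders at all
    # (they are macOS-only; a stock Linux build prints nothing here):
    ffmpeg -hide_banner -encoders | grep videotoolbox

    # On a machine where the encoder exists, list its private options
    # to see whether alpha_quality is among them:
    ffmpeg -hide_banner -h encoder=hevc_videotoolbox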