
Other articles (23)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is version 0.2 or higher. If needed, contact your MediaSPIP administrator to find out.

  • Contribute to a better visual interface

    13 April 2011

    MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
    Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.

  • Sites built with MediaSPIP

    2 May 2011, by

    This page presents some of the sites running MediaSPIP.
    You can of course add your own using the form at the bottom of the page.

On other sites (4692)

  • WebRTC to RTMP: send video from camera to an RTMP link

    14 April 2024, by Leo-Mahendra

    I can't send the video from WebRTC. It is converted to buffered data every 10 seconds and sent to server.js, which receives it over WebSockets and converts it to FLV format using FFmpeg.

    


    To start with, I am trying to send it to an RTMP server named Restreamer. I tried to convert the buffered data and send it to the RTMP link using FFmpeg commands; initially I managed to successfully save the file from WebRTC to MP4 format for a duration of 2-3 minutes.

    


    Then I tried to use WebRTC to send video data every 10 seconds and, on the server, to forward it to RTMP, but it doesn't work. The connection between the RTMP URL and the server is established, yet I can't see the video; the RTMP server logs show:

    


    2024-04-14 12:35:45 ts=2024-04-14T07:05:45Z level=INFO component="RTMP" msg="no streams available" action="INVALID" address=":1935" client="172.17.0.1:37700" path="/3d30c5a9-2059-4843-8957-da963c7bc19b.stream" who="PUBLISH"
2024-04-14 12:35:45 ts=2024-04-14T07:05:45Z level=INFO component="RTMP" msg="no streams available" action="INVALID" address=":1935" client="172.17.0.1:37716" path="/3d30c5a9-2059-4843-8957-da963c7bc19b.stream" who="PUBLISH"
2024-04-14 12:35:45 ts=2024-04-14T07:05:45Z level=INFO component="RTMP" msg="no streams available" action="INVALID" address=":1935" client="172.17.0.1:37728" path="/3d30c5a9-2059-4843-8957-da963c7bc19b.stream" who="PUBLISH"   


    


    My frontend code:

    


    const handleSendVideo = async () => {
        console.log("start");
    
        if (!ws) {
            console.error('WebSocket connection not established.');
            return;
        }
    
        try {
            const videoStream = await navigator.mediaDevices.getUserMedia({ video: true });
            const mediaRecorder = new MediaRecorder(videoStream);
    
            const requiredFrameSize = 460800;
            const frameDuration = 10 * 1000; // 10 seconds in milliseconds
    
            mediaRecorder.ondataavailable = async (event) => {
                if (ws.readyState !== WebSocket.OPEN) {
                    console.error('WebSocket connection is not open.');
                    return;
                }
    
                if (event.data.size > 0) {
                    const arrayBuffer = await event.data.arrayBuffer();
                    const uint8Array = new Uint8Array(arrayBuffer);
    
                    const width = videoStream.getVideoTracks()[0].getSettings().width;
                    const height = videoStream.getVideoTracks()[0].getSettings().height;
    
                    const numFrames = Math.ceil(uint8Array.length / requiredFrameSize);
    
                    for (let i = 0; i < numFrames; i++) {
                        const start = i * requiredFrameSize;
                        const end = Math.min((i + 1) * requiredFrameSize, uint8Array.length);
                        let frameData = uint8Array.subarray(start, end);
    
                        // Pad or trim the frameData to match the required size
                        if (frameData.length < requiredFrameSize) {
                            // Pad with zeros to reach the required size
                            const paddedData = new Uint8Array(requiredFrameSize);
                            paddedData.set(frameData, 0);
                            frameData = paddedData;
                        } else if (frameData.length > requiredFrameSize) {
                            // Trim to match the required size
                            frameData = frameData.subarray(0, requiredFrameSize);
                        }
    
                        const dataToSend = {
                            buffer: Array.from(frameData), // Convert Uint8Array to array of numbers
                            width: width,
                            height: height,
                            pixelFormat: 'yuv420p',
                            mode: 'SendRtmp'
                        };
    
                        console.log("Sending frame:", i);
                        ws.send(JSON.stringify(dataToSend));
                    }
                }
            };
    
            // Start recording and send data every 10 seconds
            mediaRecorder.start(frameDuration);
    
            console.log("MediaRecorder started.");
        } catch (error) {
            console.error('Error accessing media devices or starting recorder:', error);
        }
    };
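
    A note on the snippet above: what MediaRecorder hands to ondataavailable is WebM/Matroska container data, not raw yuv420p frames, so slicing it into fixed 460800-byte pieces and zero-padding them does not produce valid video frames. As a minimal sketch (not the poster's code; handleSendVideoBinary is a hypothetical helper reusing the same ws as above), the chunks could instead be forwarded unchanged as binary WebSocket messages:

    // Hedged sketch, not the poster's code: send each MediaRecorder chunk as-is.
    // The chunk is WebM container data, so it is forwarded as a binary message
    // instead of being re-labelled as raw yuv420p frames.
    const handleSendVideoBinary = async () => {
        const stream = await navigator.mediaDevices.getUserMedia({ video: true });
        const recorder = new MediaRecorder(stream, { mimeType: 'video/webm;codecs=vp8' });

        recorder.ondataavailable = async (event) => {
            if (event.data.size > 0 && ws.readyState === WebSocket.OPEN) {
                // No JSON, no padding: the server pipes these bytes straight to ffmpeg.
                ws.send(await event.data.arrayBuffer());
            }
        };

        recorder.start(1000); // emit a chunk roughly every second
    };

    Only the first chunk contains the WebM header; every later chunk holds only clusters, so all chunks from one recording have to reach the same long-lived decoder process on the server (see the backend sketch further down).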


    


    And my backend code:

    


    wss.on('connection', (ws) => {
        console.log('WebSocket connection established.');

        ws.on('message', async (data) => {
            try {
                const parsedData = JSON.parse(data);

                if (parsedData.mode === 'SendRtmp' && Array.isArray(parsedData.buffer)) {
                    const { buffer, pixelFormat, width, height } = parsedData;
                    const bufferArray = Buffer.from(buffer);

                    await sendRtmpVideo(bufferArray, pixelFormat, width, height);
                } else {
                    console.log('Received unknown or invalid mode or buffer data');
                }
            } catch (error) {
                console.error('Error parsing WebSocket message:', error);
            }
        });

        ws.on('close', () => {
            console.log('WebSocket connection closed.');
        });
    });

    const sendRtmpVideo = async (frameBuffer, pixelFormat, width, height) => {
        console.log("ffmpeg data", frameBuffer);
        try {
            const ratio = `${width}x${height}`;
            const ffmpegCommand = [
                '-re',
                '-f', 'rawvideo',
                '-pix_fmt', pixelFormat,
                '-s', ratio,
                '-i', 'pipe:0',
                '-c:v', 'libx264',
                '-preset', 'fast',  // Specify the preset for libx264
                '-b:v', '3000k',    // Specify the video bitrate
                '-loglevel', 'debug',
                '-f', 'flv',
                // '-flvflags', 'no_duration_filesize',
                RTMPLINK
            ];

            const ffmpeg = spawn('ffmpeg', ffmpegCommand);

            ffmpeg.on('exit', (code, signal) => {
                if (code === 0) {
                    console.log('FFmpeg process exited successfully.');
                } else {
                    console.error(`FFmpeg process exited with code ${code} and signal ${signal}`);
                }
            });

            ffmpeg.on('error', (error) => {
                console.error('FFmpeg spawn error:', error);
            });

            ffmpeg.stderr.on('data', (data) => {
                console.error(`FFmpeg stderr: ${data}`);
            });

            ffmpeg.stdin.write(frameBuffer, (err) => {
                if (err) {
                    console.error('Error writing to FFmpeg stdin:', err);
                } else {
                    console.log('Data written to FFmpeg stdin successfully.');
                }
                ffmpeg.stdin.end(); // Close stdin after writing the buffer
            });
        } catch (error) {
            console.error('Error in sendRtmpVideo:', error);
        }
    };
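
    A likely reason for the "no streams available" messages is visible in the code above: a brand-new ffmpeg process is spawned for every WebSocket message and its stdin is closed after a single write, so the RTMP publish never carries a continuous stream, and the data it receives is WebM container bytes rather than the raw yuv420p frames that -f rawvideo expects. A minimal sketch under those assumptions (not the poster's code; RTMP_LINK is a placeholder for the Restreamer URL and wss is the same WebSocket server as above) keeps one long-lived ffmpeg per connection and lets it demux the WebM stream itself:

    // Hedged sketch, not the poster's code: one long-lived ffmpeg per WebSocket
    // connection, reading the WebM stream from stdin and publishing FLV over RTMP.
    // RTMP_LINK is a placeholder for the Restreamer URL.
    const { spawn } = require('child_process');

    wss.on('connection', (ws) => {
        const ffmpeg = spawn('ffmpeg', [
            '-i', 'pipe:0',          // WebM stream arrives on stdin
            '-c:v', 'libx264',
            '-preset', 'veryfast',
            '-tune', 'zerolatency',
            '-f', 'flv',
            RTMP_LINK
        ]);

        ffmpeg.stderr.on('data', (d) => console.error(`ffmpeg: ${d}`));
        ffmpeg.on('exit', (code) => console.log(`ffmpeg exited with code ${code}`));

        // Binary WebSocket messages arrive as Buffers; forward them unchanged.
        ws.on('message', (data) => ffmpeg.stdin.write(data));
        ws.on('close', () => ffmpeg.stdin.end());
    });

    Because the input is already a live stream, -re is unnecessary, and stdin stays open for as long as the browser keeps sending chunks, which is what gives the RTMP server a continuous publish.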



    


  • Transcoding/converting MPEG-4 to H.264 using ffmpeg h264_nvenc

    15 September 2024, by Mustafa

    I'm trying to transcode a video rendered by DaVinci Resolve from MPEG-4 to H.264 with ffmpeg on Ubuntu, using the following command.

    


    ffmpeg -y -vsync 0 -hwaccel cuda -hwaccel_output_format cuda -i input.mov -c:a copy -c:v h264_nvenc -b:v 8M -preset 7 output-h264_nvenc-p7.mp4 -v verbose


    


    My graphics card is an NVIDIA GeForce GTX 1650.

    


    I'm getting the following error:

    


    [mpeg4 @ 0x55a893ed3200] Video width 3840 not within range from 48 to 2032
[mpeg4 @ 0x55a893ed3200] Failed setup for format cuda: hwaccel initialisation returned error.


    


    And here's the complete output. The command produces an output, but its quality is significantly lower than the original.

    


      built with gcc 13 (Ubuntu 13.2.0-23ubuntu4)
  configuration: --enable-nonfree --enable-cuda-nvcc --enable-libnpp --extra-cflags=-I/usr/local/cuda/include --extra-ldflags=-L/usr/local/cuda/lib64 --disable-static --enable-shared --enable-gpl --enable-libx264
  libavutil      59. 36.100 / 59. 36.100
  libavcodec     61. 13.100 / 61. 13.100
  libavformat    61.  5.101 / 61.  5.101
  libavdevice    61.  2.101 / 61.  2.101
  libavfilter    10.  2.102 / 10.  2.102
  libswscale      8.  2.100 /  8.  2.100
  libswresample   5.  2.100 /  5.  2.100
  libpostproc    58.  2.100 / 58.  2.100
-vsync is deprecated. Use -fps_mode
Passing a number to -vsync is deprecated, use a string argument as described in the manual.
Selecting decoder 'mpeg4' because of requested hwaccel method cuda
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'input.mov':
  Metadata:
    major_brand     : qt  
    minor_version   : 512
    compatible_brands: qt  
    creation_time   : 2024-09-15T04:25:29.000000Z
    encoder         : Blackmagic Design DaVinci Resolve
  Duration: 00:20:34.87, start: 0.000000, bitrate: 347243 kb/s
  Stream #0:0[0x1]: Video: mpeg4 (Simple Profile), 1 reference frame (mp4v / 0x7634706D), yuv420p(bt709/bt709/unknown, left), 3840x2160 [SAR 1:1 DAR 16:9], 345703 kb/s, 60 fps, 60 tbr, 15360 tbn (default)
      Metadata:
        creation_time   : 2024-09-15T04:25:29.000000Z
        handler_name    : VideoHandler
        vendor_id       :     
        encoder         : MPEG4 Video
        timecode        : 01:00:00:00
  Stream #0:1[0x2]: Audio: pcm_s16le (lpcm / 0x6D63706C), 48000 Hz, stereo, s16, 1536 kb/s (default)
      Metadata:
        creation_time   : 2024-09-15T04:25:29.000000Z
        handler_name    : SoundHandler
        vendor_id       : [0][0][0][0]
  Stream #0:2[0x3](eng): Data: none (tmcd / 0x64636D74) (default)
      Metadata:
        creation_time   : 2024-09-15T04:25:29.000000Z
        handler_name    : TimeCodeHandler
        timecode        : 01:00:00:00
[out#0/mp4 @ 0x55a8929e8d80] No explicit maps, mapping streams automatically...
[vost#0:0/h264_nvenc @ 0x55a8929f5880] Created video stream from input stream 0:0
[aost#0:1/copy @ 0x55a893867080] Created audio stream from input stream 0:1
Stream mapping:
  Stream #0:0 -> #0:0 (mpeg4 (native) -> h264 (h264_nvenc))
  Stream #0:1 -> #0:1 (copy)
[vost#0:0/h264_nvenc @ 0x55a8929f5880] Starting thread...
[vf#0:0 @ 0x55a8929f17c0] Starting thread...
[vist#0:0/mpeg4 @ 0x55a89293a100] [dec:mpeg4 @ 0x55a8929f4440] Starting thread...
[in#0/mov,mp4,m4a,3gp,3g2,mj2 @ 0x55a8929e0600] Starting thread...
Press [q] to stop, [?] for help
[mpeg4 @ 0x55a893ed3200] NVDEC capabilities:
[mpeg4 @ 0x55a893ed3200] format supported: yes, max_mb_count: 8192
[mpeg4 @ 0x55a893ed3200] min_width: 48, max_width: 2032
[mpeg4 @ 0x55a893ed3200] min_height: 16, max_height: 2032
[mpeg4 @ 0x55a893ed3200] Video width 3840 not within range from 48 to 2032
[mpeg4 @ 0x55a893ed3200] Failed setup for format cuda: hwaccel initialisation returned error.
[graph -1 input from stream 0:0 @ 0x75c970002e40] w:3840 h:2160 pixfmt:yuv420p tb:1/15360 fr:60/1 sar:1/1 csp:bt709 range:unknown
[graph -1 input from stream 0:0 @ 0x75c970002e40] video frame properties congruent with link at pts_time: 0
[h264_nvenc @ 0x55a8929e62c0] Using device cuda0 (type cuda) with h264_nvenc encoder.
[h264_nvenc @ 0x55a8929e62c0] Loaded Nvenc version 12.2
[h264_nvenc @ 0x55a8929e62c0] Nvenc initialized successfully
[h264_nvenc @ 0x55a8929e62c0] The selected preset is deprecated. Use p1 to p7 + -tune or fast/medium/slow.
Output #0, mp4, to 'output-h264_nvenc-p7.mp4':
  Metadata:
    major_brand     : qt  
    minor_version   : 512
    compatible_brands: qt  
    encoder         : Lavf61.5.101
  Stream #0:0: Video: h264 (Main), 1 reference frame (avc1 / 0x31637661), yuv420p(tv, bt709/bt709/unknown, progressive, left), 3840x2160 [SAR 1:1 DAR 16:9], q=2-31, 8000 kb/s, 60 fps, 15360 tbn (default)
      Metadata:
        creation_time   : 2024-09-15T04:25:29.000000Z
        handler_name    : VideoHandler
        vendor_id       :     
        timecode        : 01:00:00:00
        encoder         : Lavc61.13.100 h264_nvenc
      Side data:
        cpb: bitrate max/min/avg: 0/0/8000000 buffer size: 16000000 vbv_delay: N/A
  Stream #0:1: Audio: pcm_s16le (ipcm / 0x6D637069), 48000 Hz, stereo, s16, 1536 kb/s (default)
      Metadata:
        creation_time   : 2024-09-15T04:25:29.000000Z
        handler_name    : SoundHandler
        vendor_id       : [0][0][0][0]


    


    How can I find the cause of the error and resolve it?

    


    Also, is there a way to improve the quality of the output video, as it's much lower than the original? I tried doing the same conversion using libx264 (on the CPU) and that produced a far better video.
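
    The verbose log above already points at the cause: the NVDEC MPEG-4 decoder on this GPU only accepts widths from 48 to 2032, so CUDA-accelerated decoding of a 3840x2160 source cannot be initialised; ffmpeg then falls back to the software decoder, which is why an output is still produced. A hedged variant of the command (not verified on this machine) drops the CUDA hwaccel flags so decoding happens on the CPU, keeps NVENC for encoding only, and uses constant-quality rate control instead of the fixed 8 Mb/s, which is quite low for 4K 60 fps material:

    ffmpeg -y -i input.mov -c:a copy -c:v h264_nvenc -preset p7 -tune hq -rc vbr -cq 19 -b:v 0 output-h264_nvenc-p7.mp4

    Lower -cq values give higher quality and larger files; the 8 Mb/s ceiling, rather than the encoder itself, is the most likely reason the libx264 run looked better.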

    


  • How not to remove transparency when using the negate filter on PNG files with ffmpeg

    27 March 2024, by Reilbas

    I'm new to FFmpeg.
To keep it simple: I've created a program that applies a negative filter to small PNG images. For some of the images everything is fine and the transparency is preserved, but for others FFmpeg converts the transparency to either black or white pixels.

    


    I'm just using the command:
ffmpeg -y -i input.png -vf negate output.png

    


    I downloaded the latest binaries for Windows from https://www.gyan.dev/ffmpeg/builds/ (2024/03/25).
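
    The negate filter itself does not touch the alpha plane unless its negate_alpha option is set, so the transparency that disappears is most likely being lost in an automatic pixel-format conversion before the filter (palette or grayscale PNGs, for example, can end up in a format without alpha). A hedged workaround, not verified against these particular files, is to force an alpha-capable format first:

    ffmpeg -y -i input.png -vf "format=rgba,negate" output.png

    If a file still comes out flattened, inspecting it with ffprobe should show whether the source pixel format carried alpha in the first place.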