Advanced search

Media (0)

Keyword: - Tags -/acrobat

No media matching your criteria is available on this site.

Other articles (81)

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011, by

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • (De)Activating features (plugins)

    18 February 2011, by

    To manage adding and removing extra features (plugins), MediaSPIP uses SVP as of version 0.2.
    SVP makes it easy to enable plugins from the MediaSPIP configuration area.
    To access it, go to the configuration area and open the "Plugin management" page.
    By default, MediaSPIP ships with the full set of so-called "compatible" plugins, which have been tested and integrated to work perfectly with each (...)

On other sites (9154)

  • Add a side data type for audio service type.

    14 November 2014, by Anton Khirnov
    Add a side data type for audio service type.
    

    Currently, audio service type is a field in AVCodecContext. However,
    side data is more appropriate for this kind of information.

    • doc/APIchanges
    • libavcodec/avcodec.h
    • libavcodec/utils.c
    • libavcodec/version.h
    • libavfilter/af_ashowinfo.c
    • libavutil/frame.h
    • libavutil/version.h
  • Is it possible to use something other than a circular buffer when using ffmpeg to pass RTSP video to a WebRTC client?

    3 November 2024, by Doctico

    I have a Node application that uses ffmpeg to receive the RTSP stream from an IP camera (H.264) and then send it to a remote client using node-webrtc. However, in my current implementation ffmpeg only outputs 8192-byte chunks, which must be buffered to assemble the full frame that my current node-webrtc flow expects. This circular buffer results in a 1-2 second delay for the client viewing the video. Is there any way to pass the stream through node-webrtc as the chunks come in, or at least to extract complete frames so that circular buffering is not necessary?

    So far I have tried the following, which works but has a 1-2 second delay, and an even higher delay with higher-resolution cameras:

    // Assumed context (not shown in the question): fluent-ffmpeg, wrtc
    // and Node's stream module are imported elsewhere, e.g.
    //   const ffmpeg = require('fluent-ffmpeg');
    //   const wrtc = require('wrtc');
    //   const { Writable } = require('stream');

    async startStream() {
        const rtspUrl = 'rtsp://my-rtsp-url';

        const videoSource = new wrtc.nonstandard.RTCVideoSource();
        const videoTrack = videoSource.createTrack();

        const width = 640;
        const height = 480;
        const frameSize = width * height * 1.5; // YUV420p format

        // circular buffer:
        let frameBuffer = Buffer.alloc(0);
        const frameStream = new Writable({
            write: (chunk, encoding, callback) => {
                frameBuffer = Buffer.concat([frameBuffer, chunk]);

                while (frameBuffer.length >= frameSize) {
                    const frame = frameBuffer.slice(0, frameSize);
                    frameBuffer = frameBuffer.slice(frameSize);

                    videoSource.onFrame({
                        width: width,
                        height: height,
                        data: new Uint8ClampedArray(frame)
                    });
                }
                callback();
            }
        });

        const ffmpegProcess = ffmpeg(rtspUrl)
            .inputOptions([
                '-fflags nobuffer',
                '-flags low_delay',
                '-rtsp_transport tcp',
                '-strict experimental',
                '-analyzeduration 0',
                '-threads 0',
                '-hwaccel auto'
            ])
            .outputOptions([
                '-f rawvideo',
                '-c:v rawvideo',
                // streamId comes from the surrounding class
                '-b:v', streamId === 1 ? '2000k' : '1000k',
                '-bf', '0',
                `-s ${width}x${height}`,
                '-pix_fmt yuv420p',
                '-tune zerolatency'
            ])
            .on('start', (cmd) => console.log('FFmpeg started:', cmd))
            .on('error', error => {
                console.error('FFmpeg error:', error);
                this.stopStream();
            })
            .on('end', () => {
                console.log('FFmpeg stream ended');
                this.stopStream();
            });

        ffmpegProcess.pipe(frameStream);

        return videoTrack;
    }


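    One way to avoid the growing concat/slice buffer (a sketch, not from the question; `makeFrameStream` and its parameters are hypothetical names of mine) is to copy each incoming chunk into a preallocated frame-sized buffer and emit a frame the instant it fills. Since rawvideo frames have a fixed size, no searching or extra buffering is needed, and each frame is delivered as soon as its last byte arrives:

    ```javascript
    const { Writable } = require('stream');

    // Sketch: frame-aligned sink for fixed-size rawvideo frames. Incoming
    // chunks are copied into one preallocated Buffer; onFrame fires the
    // moment the buffer fills, so no frames of extra latency are added and
    // no repeated Buffer.concat/slice allocations occur.
    function makeFrameStream(width, height, onFrame) {
        const frameSize = width * height * 1.5; // YUV420p: Y + U/4 + V/4
        const frame = Buffer.alloc(frameSize);
        let filled = 0; // bytes of the current frame received so far

        return new Writable({
            write(chunk, encoding, callback) {
                let offset = 0;
                while (offset < chunk.length) {
                    const toCopy = Math.min(frameSize - filled, chunk.length - offset);
                    chunk.copy(frame, filled, offset, offset + toCopy);
                    filled += toCopy;
                    offset += toCopy;
                    if (filled === frameSize) {
                        // copies the bytes, so reusing `frame` afterwards is safe
                        onFrame({ width, height, data: new Uint8ClampedArray(frame) });
                        filled = 0;
                    }
                }
                callback();
            }
        });
    }
    ```

    That said, much of a 1-2 second delay typically comes from ffmpeg's own RTSP/decoder buffering rather than from the JavaScript side, so input options such as `-fflags nobuffer` and a small `-probesize` tend to matter at least as much as the buffering strategy.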
  • How to use ffmpeg.wasm client side

    9 April 2022, by Yordan Radev

    I am looking to use ffmpeg.wasm in a serverless Vue app, but have no idea how to integrate it. Ideally I want to use it to stitch Canvas HTMLElement frames together into a video that the user can download. If anyone can make a JSFiddle, CodePen or any other kind of demo, that would be great. Working off of this example, I am not sure how to add a Canvas element as a frame to a file and then create a URL for the resulting file.
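    For what it's worth, a rough sketch of how this might look with the 0.11-era `@ffmpeg/ffmpeg` API (`createFFmpeg`/`FS`/`run`; the helper names `encodeCanvasFrames` and `frameName` are my own, and newer ffmpeg.wasm releases use a different API): render each canvas to a PNG, write the PNGs into ffmpeg.wasm's in-memory filesystem, run an image2-to-H.264 encode, then hand the result to `URL.createObjectURL`:

    ```javascript
    // zero-padded frame filename matching ffmpeg's image2 pattern below
    function frameName(i) {
        return `frame${String(i).padStart(4, '0')}.png`;
    }

    async function encodeCanvasFrames(canvases, fps = 30) {
        // in a bundled Vue app this would be an ES import at the top of the file
        const { createFFmpeg, fetchFile } = require('@ffmpeg/ffmpeg');
        const ffmpeg = createFFmpeg({ log: true });
        await ffmpeg.load();

        // write each canvas as a PNG into ffmpeg.wasm's in-memory filesystem
        for (let i = 0; i < canvases.length; i++) {
            const blob = await new Promise(res => canvases[i].toBlob(res, 'image/png'));
            ffmpeg.FS('writeFile', frameName(i), await fetchFile(blob));
        }

        // stitch the numbered PNGs into an H.264 MP4
        await ffmpeg.run(
            '-framerate', String(fps),
            '-i', 'frame%04d.png',
            '-c:v', 'libx264', '-pix_fmt', 'yuv420p',
            'out.mp4'
        );

        // read the result back and give the browser a downloadable URL
        const data = ffmpeg.FS('readFile', 'out.mp4');
        return URL.createObjectURL(new Blob([data.buffer], { type: 'video/mp4' }));
    }
    ```

    The returned URL can then be set as the `href` of an `<a download="out.mp4">` element for the user to save. Treat this as a sketch only: the canvas-to-Blob step and object-URL step are standard browser APIs, but the ffmpeg.wasm surface has changed between major versions.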