
Media (91)

Other articles (100)

  • MediaSPIP Core: Configuration

    9 November 2010

    By default, MediaSPIP Core provides three different configuration pages (these pages rely on the CFG configuration plugin to work): a page for the general configuration of the template; a page for the configuration of the site's home page; a page for the configuration of the sectors.
    It also provides an additional page that only appears when certain plugins are enabled, allowing their display and specific features to be controlled (...)

  • Sounds

    15 May 2013
  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

On other sites (14218)

  • Emulate YUV420 stream for OpenCV

    11 April 2024, by Ivan

    I'm working on a project that will process images in YUV420 format coming from a camera connected over Ethernet.

    During development we can't have constant access to the camera, so I wanted to emulate its behaviour.

    My idea is to take a video and convert it from MP4 to raw YUV using ffmpeg:

    ffmpeg -i input.mp4 -pix_fmt yuv420p output.yuv

    Then, stream the video in an infinite loop. This works:

    ffmpeg -stream_loop -1 -f rawvideo -s 960x540 -r 30 -pix_fmt yuv420p -i output.yuv -f mpegts udp://127.0.0.1:23000

    I managed to read (and display) the images with:

    # In Python
    import cv2
    cap = cv2.VideoCapture('udp://127.0.0.1:23000', cv2.CAP_FFMPEG)
    ...

    However, the shape of the images I get is (540, 960, 3), whereas I expected YUV420 data: either a single (540*3/2, 960) plane, or three planes of 540x960, 270x480 and 270x480.

    How can I get the "raw" format? I'm not sure whether it is ffmpeg or OpenCV that is converting the stream.

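    The pipeline above can be checked without OpenCV's automatic colour conversion by reading raw YUV420p bytes straight from ffmpeg's stdout and reinterpreting them with NumPy. A minimal sketch, assuming the 960x540 output.yuv from the commands above; the helper name is an invention for illustration:

```python
# Sketch: reinterpret one raw YUV420p frame as the (h*3/2, w) array the
# question expects, instead of letting OpenCV convert it to BGR.
import numpy as np

def yuv420_frame(raw, w, h):
    expected = w * h * 3 // 2  # full-size Y plane + quarter-size U and V
    if len(raw) != expected:
        raise ValueError(f'need {expected} bytes, got {len(raw)}')
    return np.frombuffer(raw, dtype=np.uint8).reshape(h * 3 // 2, w)

# With a real stream, feed ffmpeg's stdout instead of the UDP socket:
#   proc = subprocess.Popen(
#       ['ffmpeg', '-stream_loop', '-1', '-f', 'rawvideo', '-s', '960x540',
#        '-r', '30', '-pix_fmt', 'yuv420p', '-i', 'output.yuv',
#        '-f', 'rawvideo', '-'],
#       stdout=subprocess.PIPE)
#   frame = yuv420_frame(proc.stdout.read(960 * 540 * 3 // 2), 960, 540)
# Demonstration with a dummy all-zero frame:
frame = yuv420_frame(bytes(960 * 540 * 3 // 2), 960, 540)
print(frame.shape)  # (810, 960), i.e. (540*3/2, 960)
```

    When a BGR image is needed later, cv2.cvtColor(frame, cv2.COLOR_YUV2BGR_I420) converts the planar frame back.
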
  • Fluent-ffmpeg : merging video and audio = wrong frames

    5 June 2015, by rhanb

    I'm trying to merge a video (MP4) with no audio stream with an audio file (MP3). I'm developing a video application under node-webkit, which means I have to use OGG files, so whatever format the user uploads a video or audio file in, it is converted to OGG. When the user wants to export the video, I export frames from a canvas as PNG images. Once that is done, I create a 30 fps video from the frames with the following code:

    var videoMaker = function () {

       console.log('videoMaker');

       var deffered = Q.defer();
       if (!FS.existsSync($rootScope.project.path + '/video')) {
           filestorageService.createFolder($rootScope.project.path + '/video');
       }
       audioMaker().then(function () {
           var commandVideo = new Ffmpeg({
               source: $rootScope.project.path + '/frames/%d.png'
           });
           commandVideo.setFfmpegPath(ffmpegPath);
           commandVideo.addOptions(['-c:v libx264', '-r 30']).withFpsInput(30).format('mp4').on('error', function (err) {
               console.log('video', err);
           }).on('end', function () {
               console.log('video win');
               deffered.resolve();
           }).save($rootScope.project.path + '/video/rendu.mp4');
       });
       return deffered.promise;
    };

    Then I reconvert the audio which was uploaded by the user to MP3:

    var audioMaker = function () {

       console.log('audioMaker');
       var deffered = Q.defer();
       if ($rootScope.project.settings.music.path !== '') {
           FS.writeFileSync($rootScope.project.path + '/music/finalMusic.mp3', null);
           var commandAudio = new Ffmpeg({
               source: $rootScope.project.settings.music.path
           });
           commandAudio.setFfmpegPath(ffmpegPath);
           if ($rootScope.project.settings.music.fadeIn) {
               commandAudio.audioFilters('afade=t=in:ss=0:d=0.5');
           }
           console.log($rootScope.project.settings.music.fadeOut, $rootScope.project.settings.music.fadeIn);
           if ($rootScope.project.settings.music.fadeOut) {
               var time = sceneService.getTotalDuration() - 0.5;
               commandAudio.audioFilters('afade=t=out:st=' + time + ':d=0.5');
           }
           commandAudio.toFormat('mp3').on('end', function () {
               console.log('audio win');
               deffered.resolve();
           }).on('error', function (err) {
               console.log('audio', err);
           }).save($rootScope.project.path + '/music/finalMusic.mp3');
       } else {
           deffered.resolve();
       }
       return deffered.promise;
    };

    Up to this point everything is fine and those files work well, but then I do this:

    var command = new Ffmpeg({
       source: $rootScope.project.path + '/video/rendu.mp4'
    });
    command.setFfmpegPath(ffmpegPath);
    console.log($rootScope.project.settings.music.path !== '');
    if ($rootScope.project.settings.music.path !== '') {
       command.addInput($rootScope.project.path + '/music/finalMusic.mp3');
       command.addOptions(['-c:v copy', '-c:a copy']);
       if ($rootScope.project.settings.music.duration > sceneService.getTotalDuration()) {
           command.addOptions(['-shortest']);
       }
       command.on('error', function (err) {
           console.log(err);
       }).on('end', function () {
           console.log("win");
           //filestorageService.rmFolder($rootScope.project.path + '/frames');
       }).save($rootScope.project.path + '/video/rendu.mp4');
    } else {
       filestorageService.rmFolder($rootScope.project.path + '/frames');
    }

    The final file has the music and the right duration, but the frames aren't right. Any ideas?
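
    One detail worth noting in the last snippet: the merge reads from and saves to the same path ('video/rendu.mp4'), and ffmpeg cannot safely overwrite a file it is still reading, which can produce exactly this kind of frame corruption. A hedged sketch of the equivalent remux driven through the ffmpeg CLI, writing to a temporary file first (paths taken from the question; the temporary file name is an assumption):

```python
# Sketch: build the equivalent ffmpeg remux command, writing to a
# temporary file instead of overwriting the input in place.
import os
import subprocess

def merge_cmd(video, audio, out):
    # Stream-copy both inputs (no re-encode); -shortest stops at the
    # shorter stream when the music outlasts the video.
    return ['ffmpeg', '-i', video, '-i', audio,
            '-c:v', 'copy', '-c:a', 'copy', '-shortest', out]

cmd = merge_cmd('video/rendu.mp4', 'music/finalMusic.mp3',
                'video/rendu_tmp.mp4')
# subprocess.run(cmd, check=True)
# os.replace('video/rendu_tmp.mp4', 'video/rendu.mp4')
print(' '.join(cmd))
```
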

  • RTSP streaming to video file using FFMPEG library

    17 August 2017, by Gona

    I want to save the video from a network (IP) camera as a video file such as .avi.

    At first I tried OpenCV, but I ran into a codec problem, so I'm now trying FFmpeg.
    This is my first time using FFmpeg, so I'm looking for some sample code.

    The overall project structure is C# with a C++ DLL, so I want to save (or write, or record) the camera stream as a video file in C++.

    The camera stream is received over RTSP, and the RTSP URL is known.
    How can I save an RTSP stream as a video file using the FFmpeg libraries? The codec is H.264.

    I would appreciate it if you could show me some sample code.

    My development environment is 64-bit Windows 10 with Visual Studio 2015.
    I downloaded FFmpeg build 20170817-92da230, 64-bit, linking against both the Shared and Dev packages, from here: https://ffmpeg.zeranoe.com/builds/
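
    Before committing to the C++/libav route, the capture itself can be sanity-checked by driving the ffmpeg CLI, which remuxes the H.264 RTSP stream without re-encoding. A minimal sketch; the RTSP URL and output name are placeholders, and MP4 is usually a friendlier container for H.264 than AVI:

```python
# Sketch: record an RTSP stream to a file by stream-copying it with the
# ffmpeg CLI (no re-encode, so the H.264 data is written as-is).
import subprocess

def record_cmd(rtsp_url, out_path, seconds=10):
    # -rtsp_transport tcp avoids packet loss over UDP;
    # -t limits the recording length; -c copy remuxes without re-encoding.
    return ['ffmpeg', '-rtsp_transport', 'tcp', '-i', rtsp_url,
            '-t', str(seconds), '-c', 'copy', out_path]

cmd = record_cmd('rtsp://camera-address/stream', 'capture.mp4')
# subprocess.run(cmd, check=True)
print(' '.join(cmd))
```
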