Media (91)

Other articles (111)

  • Automatic installation script for MediaSPIP

    25 April 2011, by

    To work around installation difficulties caused mainly by server-side software dependencies, an "all-in-one" bash installation script was created to make this step easier on a server running a compatible Linux distribution.
    You must have SSH access to your server and a "root" account in order to use it, which is needed to install the dependencies. Contact your hosting provider if you do not have this.
    The documentation on using the installation script (...)

  • Sites built with MediaSPIP

    2 May 2011, by

    This page presents some of the sites running MediaSPIP.
    You can of course add yours using the form at the bottom of the page.

  • General document management

    13 May 2011, by

    MediaSPIP never modifies the original document that is uploaded.
    For each uploaded document it performs two successive operations: creating an additional version that can easily be viewed online, while keeping the original available for download in case it cannot be read in a web browser; and retrieving the metadata of the original document to describe the file textually.
    The tables below explain what MediaSPIP can do (...)

On other sites (11141)

  • In the using directive "using Accord.Video.FFMPEG", FFMPEG does not exist in the namespace

    15 September 2019, by StewartMetcalfe

    I am building an Azure Function in Visual Studio to convert a video's frames to images. I'm using the VideoFileReader class from the Accord.Video.FFMPEG namespace. The code works on my machine, but when I try to build it as an Azure Functions project, the using directive Accord.Video.FFMPEG errors, and subsequently the type VideoFileReader cannot be found.

    I have tried re-installing the Accord, Accord.Video and Accord.Video.FFMPEG NuGet packages (a command-line sketch of this reinstall is included after the code below).

    using System.IO;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Extensions.Logging;
    using Accord;
    using Accord.Video;
    using Accord.Video.FFMPEG;

    namespace ConvertVideo
    {
       public static class Function1
       {
           [FunctionName("Function1")]
           public static void Run([BlobTrigger("videos/{name}", Connection = "AzureWebJobsStorage")]Stream myBlob, string name, ILogger log)
           {
               log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");

               //start a new videoFileReader
               using (var vFReader = new VideoFileReader())
               {
                   //open the video
                   vFReader.Open(name);
                   //get the framerate
                   double frameRate = vFReader.FrameRate.ToDouble();
                   //more code which converts a frame to jpg
               }
           }  
       }
    }
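
    For reference, a minimal command-line sketch of the reinstall step described above, using the package IDs named in the question; the project file name ConvertVideo.csproj is an assumption, and versions are left to NuGet's defaults:

    # hypothetical project file name; re-add the Accord packages to it
    dotnet add ConvertVideo.csproj package Accord
    dotnet add ConvertVideo.csproj package Accord.Video
    dotnet add ConvertVideo.csproj package Accord.Video.FFMPEG

    # restore and rebuild to see whether the namespace now resolves
    dotnet restore
    dotnet build

    If the Functions project targets .NET Core while the Accord packages only ship .NET Framework assemblies, the namespace may still fail to resolve after a clean restore, so comparing the project's target framework against the frameworks the packages support is worth doing.
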
  • Encode and stream from Xbox 360 Kinect using ffmpeg

    17 June 2015, by user3288346

    I want to live-stream content obtained from the Kinect to my internal network.

    I have one physical machine, which is my server; it runs Ubuntu 14.04 and I connect to it remotely. I have installed ffmpeg and ffserver and can encode and stream stored video files from the server. However, I have a few problems when using the Xbox Kinect.

    I have an Xbox 360 Kinect attached through USB. I have followed this guide: https://bitbucket.org/samirmenon/scl-manips-v2/wiki/vision/kinect, however I couldn't get past the OpenCV part. When I run

    $ cmake-gui ..

    I get

    cmake-gui: cannot connect to X server

    I don't have physical access to the machine; probably it's due to accessing it remotely (a terminal-only cmake alternative is sketched after this question).

    When I do

    test@cloud-node-2:~/kinnect$ lsusb
    Bus 002 Device 006: ID 045e:02ae Microsoft Corp. Xbox NUI Camera
    Bus 002 Device 004: ID 045e:02b0 Microsoft Corp. Xbox NUI Motor
    Bus 002 Device 005: ID 045e:02ad Microsoft Corp. Xbox NUI Audio
    Bus 002 Device 003: ID 0409:005a NEC Corp. HighSpeed Hub
    Bus 002 Device 002: ID 0bda:0181 Realtek Semiconductor Corp.
    Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
    Bus 008 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
    Bus 007 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
    Bus 006 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
    Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
    Bus 005 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
    Bus 004 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
    Bus 003 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub

    When I do

    test@cloud-node-2:~/kinnect$ ls -ltrh /dev/video*
    ls: cannot access /dev/video*: No such file or directory

    Therefore, I am not able to capture the video using ffmpeg. I have only just begun working on this. Can someone help guide me through this?
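
    Since cmake-gui needs an X display, one workaround for the error above is to configure the build from the same SSH session with the non-GUI cmake tools; a minimal sketch (the out-of-source build directory is just a convention, and the project's specific CMake options are not shown):

    # configure and build in a terminal, no X server required
    mkdir build && cd build
    cmake ..      # or "ccmake .." for a curses-based configuration screen in the terminal
    make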

  • ffplay: how does it calculate the fps for playback?

    21 October 2020, by Daniel

    I'm trying to play back a live stream (h264) produced by a hardware encoder.

    The desired FPS on the encoder is set to 20, and when checking the encoder's logs it prints "FPS statistics" every minute:

    2020-10-21 17:26:54.787 [  info] video_stream_thread(),video chn 0, fps: 19.989270
    2020-10-21 17:27:54.836 [  info] video_stream_thread(),video chn 0, fps: 19.989270
    2020-10-21 17:28:54.837 [  info] video_stream_thread(),video chn 0, fps: 20.005924
    2020-10-21 17:29:54.837 [  info] video_stream_thread(),video chn 0, fps: 19.989270
    2020-10-21 17:30:54.888 [  info] video_stream_thread(),video chn 0, fps: 19.989274
    2020-10-21 17:31:54.918 [  info] video_stream_thread(),video chn 0, fps: 19.989264

    You can see it varies, but not by much, around 20.

    Question 1: Is this normal, or should it be exactly 20 every time? To avoid confusion: I'd like to know whether, by the H.264 standard, this can be accepted as a valid stream or whether it violates some rule.

    I'm trying to play back this stream with ffplay:

    $ ffplay rtsp://this_stream
    Input #0, rtsp, from 'xyz'
      Metadata:
        title           : 
        comment         : substream
      Duration: N/A, start: 0.040000, bitrate: N/A
        Stream #0:0: Video: h264 (Constrained Baseline), yuv420p(progressive), 640x360, 25 fps, 25 tbr, 90k tbn, 180k tbc

    The thing is that ffplay thinks this is a 25 fps stream. It also plays 25 frames each second, causing the stream to stall and buffer every few seconds.

    I believe the fps is calculated from some pts/dts values in the stream itself and is not hardcoded. Am I wrong here?

    If I'm not wrong, why does ffplay think this stream runs at 25 fps, whereas it only runs at (around) 20?
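
    One way to see where the 25 fps figure comes from is to ask ffprobe which frame-rate fields it derives for the video stream; a minimal sketch, keeping the same placeholder RTSP URL used above:

    # report the demuxer's declared and averaged frame rates for the video stream
    ffprobe -v error -select_streams v:0 \
            -show_entries stream=r_frame_rate,avg_frame_rate,time_base \
            rtsp://this_stream

    If these come back as 25/1 even though the encoder sends roughly 20 fps, that would suggest the stream is not advertising its frame rate (for example via H.264 VUI timing info) and the 25 is a fallback guess rather than a value computed from the actual pts/dts spacing.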