
Other articles (60)

  • Customising categories

    21 June 2013, by

    The category creation form
    For those who know SPIP well, a category can be thought of as a section ("rubrique").
    For a document of type "category", the fields offered by default are: Text
    This form can be modified under:
    Administration > Configuration of form masks.
    For a document of type "media", the fields not displayed by default are: Short description
    It is also in this configuration area that you can specify the (...)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MediaSPIP is at version 0.2 or higher. If necessary, contact the administrator of your MediaSPIP to find out.

  • List of compatible distributions

    26 April 2011, by

    The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.

    Distribution   Version name           Version number
    Debian         Squeeze                6.x.x
    Debian         Wheezy                 7.x.x
    Debian         Jessie                 8.x.x
    Ubuntu         The Precise Pangolin   12.04 LTS
    Ubuntu         The Trusty Tahr        14.04

    If you want to help us improve this list, you can give us access to a machine whose distribution is not mentioned above, or send us the fixes needed to add (...)

On other sites (8798)

  • How to convert real pixel values to Flutter logical pixels (density-independent pixels)

    28 March 2020, by prem pattnaik

    I am using the flutter-ffmpeg package to overlay an image onto a video, and after the overlay I draw a rectangle over that image. The issue is that ffmpeg positions the overlay using real (physical) pixels, while Flutter draws the rectangle using logical pixels. How can I convert ffmpeg's real pixels to Flutter's logical pixels, so that the overlaid image's dimensions match the rectangle?
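    In Flutter, this conversion is a division by the device pixel ratio (available as MediaQuery.of(context).devicePixelRatio). A minimal sketch of the arithmetic, written in Java to match the code elsewhere on this page; the class name and the example ratio of 2.625 are hypothetical:

```java
// Sketch: converting between real (physical) pixels, as used by ffmpeg,
// and logical (density-independent) pixels, as used by Flutter.
// devicePixelRatio comes from Flutter at runtime; 2.625 is only an example.
public class PixelConversion {
    static double toLogical(double physicalPx, double devicePixelRatio) {
        return physicalPx / devicePixelRatio;
    }

    static double toPhysical(double logicalPx, double devicePixelRatio) {
        return logicalPx * devicePixelRatio;
    }

    public static void main(String[] args) {
        double dpr = 2.625; // e.g. a 1080 px wide screen that is ~411 logical px wide
        System.out.println(toLogical(1080, dpr)); // roughly 411.43 logical px
    }
}
```

    So an overlay placed at x = 100 real pixels by ffmpeg corresponds to 100 / devicePixelRatio logical pixels when drawing the rectangle in Flutter.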

  • How to grab frames in real time from an RTSP stream?

    30 June 2018, by guijob

    I'm trying to grab frames with no delay using javacv, and I'm somewhat confused about how to do it and how javacv and the underlying libraries actually work under the hood.

    In my example, I have an RTSP stream running with the following configuration:

    Codec: H.264
    Frame Size: 1280x720
    Maximum Frame Rate: 60 fps

    To make this happen, I've written a thread like the following:

     public class TakeFrameFromStreamingThread implements Runnable {
       private CircularFifoQueue<Frame> queue; // shared with the consumer thread
       private Camera camera;
       private FFmpegFrameGrabber grabber = null;

       public TakeFrameFromStreamingThread(CircularFifoQueue<Frame> queue, Camera camera) {
         try {
           this.queue = queue;
           this.camera = camera;
           this.initGrabber(camera);
         } catch (Exception e) {
           e.printStackTrace();
         }
       }

       @Override
       public void run() {
         try {
           while (true) {
             if (grabber == null) {
               initGrabber(camera); // connect
             }

             Frame frame = grabber.grabImage();

             if (frame != null) {
               this.queue.add(frame);
             } else { // frame == null means the connection has been lost
               initGrabber(camera); // reconnect
             }
           }
         } catch (Exception e) {
           e.printStackTrace();
         }
       }

       private void initGrabber(Camera camera) throws Exception {
         grabber = new FFmpegFrameGrabber(camera.getURL()); // rtsp url
         grabber.setVideoCodec(avcodec.AV_CODEC_ID_H264);
         grabber.setOption("rtsp_transport", "tcp");
         grabber.setFrameRate(60);
         grabber.setImageWidth(camera.getResolution().getWidth());
         grabber.setImageHeight(camera.getResolution().getHeight());
         grabber.start();
       }
     }

    And it seems to work. Whenever I need a frame, I poll this queue from my main thread.

    I ended up with this solution while fixing another issue: I was stuck on why calling grabImage() every time I needed a frame returned just the next buffered frame instead of a real-time frame from the stream.

    Given this behaviour, I'm guessing there is a buffer that javacv (or ffmpeg, I don't know which) fills with frames, and grabImage() just takes a frame from that buffer. So here's my first question:

    1) Is that right? Does ffmpeg rely on a buffer to store frames, with grab() just taking them from there?

    Well, if that is true, then this buffer must be filled at some rate, and of course, if that rate is greater than the rate of my grabImage() calls, I'll eventually lose the real-time property, and soon I'll even start losing frames once the buffer is completely full.

    In this scenario, a grabImage() call takes about 50 ms, which gives me a rate of 20 fps for getting frames from this buffer. Hence, I need to make sure ffmpeg receives frames at no more than 20 fps. So here's my second question:

    2) How can I find out and change the rate at which the ffmpeg buffer fills? I'm guessing it fills at the same rate as the stream (60 fps), or at the rate set via grabber.setFrameRate(). In any case, I'm not sure whether I should call the grabber setters with the same values as the source stream.
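    One common way to keep the consumer real-time regardless of how fast the buffer fills is to drain the queue on every read and keep only the newest frame, deliberately discarding the backlog. A minimal sketch; the queue type and all names here are illustrative, not part of the original code:

```java
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

public class LatestFrameConsumer {
    // Drains the queue and returns the most recent element, or null if empty.
    // Older queued elements (the backlog) are intentionally discarded.
    static <T> T latest(Queue<T> q) {
        T newest = null;
        T next;
        while ((next = q.poll()) != null) {
            newest = next;
        }
        return newest;
    }

    public static void main(String[] args) {
        Queue<Integer> frames = new ConcurrentLinkedQueue<>();
        frames.add(1);
        frames.add(2);
        frames.add(3); // stand-ins for queued Frame objects
        System.out.println(latest(frames)); // prints 3: only the newest survives
    }
}
```

    This trades dropped frames for latency: the producer thread keeps calling grabImage() at the stream rate, while the consumer always sees the freshest frame instead of falling behind.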

  • avcodec/ccaption_dec: Make real-time latency configurable

    19 June 2021, by Pavel Koshevoy

    Un-hardcode the 200 ms minimum latency between emitting subtitle events,
    so that those who wish to receive a subtitle event for every screen
    change can do so.

    The problem with delaying realtime output by any amount is that it is
    unknown when the next byte pair that would trigger output will arrive.
    It may come within 200 ms, or it may come several seconds later, which
    is not realtime at all.

    • [DH] libavcodec/ccaption_dec.c
    • [DH] libavcodec/version.h