
Other articles (75)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

  • Customising by adding your logo, banner or background image

    5 September 2013, by

    Some themes take three customisation elements into account: adding a logo; adding a banner; adding a background image.

  • Updating from version 0.1 to 0.2

    24 June 2013, by

    Explanation of the notable changes made when moving from MediaSPIP version 0.1 to version 0.3. What's new:
    Regarding software dependencies: use of the latest versions of FFmpeg (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)

On other sites (9072)

  • FFMPEG problems with real-time buffer

    27 June 2024, by Charles Kiel

    I'm trying to use FFmpeg (Windows) to encode a stream from a video capture card via dshow and send it to an RTMP server. This is my command line:

    



        ffmpeg -f dshow -i video="AVerMedia BDA Analog Capture Secondary":audio="Microphone (6- C-Media USB Audi" -vf scale=1280:720 -vcodec libx264 -r 30 -rtbufsize 702000k -acodec mp3 -ac 2 -ar 44100 -ab 128k -pix_fmt yuv420p -tune zerolatency -preset ultrafast -f flv "rtmp://xxx.xxx.xxx.xxx/stream/key"

    ffmpeg version N-86950-g1bef008 Copyright (c) 2000-2017 the FFmpeg developers
    built with gcc 7.1.0 (GCC)
    configuration: --enable-gpl --enable-version3 --enable-cuda --enable-cuvid --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-nvenc --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenh264 --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-zlib
    libavutil      55. 70.100 / 55. 70.100
    libavcodec     57.102.100 / 57.102.100
    libavformat    57. 76.100 / 57. 76.100
    libavdevice    57.  7.100 / 57.  7.100
    libavfilter     6. 98.100 /  6. 98.100
    libswscale      4.  7.102 /  4.  7.102
    libswresample   2.  8.100 /  2.  8.100
    libpostproc    54.  6.100 / 54.  6.100
    Guessed Channel Layout for Input Stream #0.1 : stereo
    Input #0, dshow, from 'video=AVerMedia BDA Analog Capture Secondary:audio=Microphone (6- C-Media USB Audi':
    Duration: N/A, start: 2035.202000, bitrate: N/A
    Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 720x480, 29.97 fps, 29.97 tbr, 10000k tbn, 10000k tbc
    Stream #0:1: Audio: pcm_s16le, 44100 Hz, stereo, s16, 1411 kb/s
    [dshow @ 00000000005f90e0] real-time buffer [AVerMedia BDA Analog Capture Secondary] [video input] too full or near too full (68% of size: 3041280 [rtbufsize parameter])! frame dropped!
    [dshow @ 00000000005f90e0] real-time buffer [AVerMedia BDA Analog Capture Secondary] [video input] too full or near too full (90% of size: 3041280 [rtbufsize parameter])! frame dropped!
    [dshow @ 00000000005f90e0] real-time buffer [AVerMedia BDA Analog Capture Secondary] [video input] too full or near too full (113% of size: 3041280 [rtbufsize parameter])! frame dropped!
    Last message repeated 46 times
    Stream mapping:
    Stream #0:0 -> #0:0 (rawvideo (native) -> h264 (libx264))
    Stream #0:1 -> #0:1 (pcm_s16le (native) -> mp3 (libmp3lame))
    Press [q] to stop, [?] for help
    [dshow @ 00000000005f90e0] real-time buffer [AVerMedia BDA Analog Capture Secondary] [video input] too full or near too full (113% of size: 3041280 [rtbufsize parameter])! frame dropped!
    [libx264 @ 0000000005b16640] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
    [libx264 @ 0000000005b16640] profile Constrained Baseline, level 3.1
    [libx264 @ 0000000005b16640] 264 - core 152 r2851 ba24899 - H.264/MPEG-4 AVC codec - Copyleft 2003-2017 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=0:0:0 analyse=0:0 me=dia subme=0 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=11 lookahead_threads=11 sliced_threads=1 slices=11 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=250 keyint_min=25 scenecut=0 intra_refresh=0 rc=crf mbtree=0 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=0
    [dshow @ 00000000005f90e0] real-time buffer [AVerMedia BDA Analog Capture Secondary] [video input] too full or near too full (113% of size: 3041280 [rtbufsize parameter])! frame dropped!
    Past duration 0.999992 too large


    



    The buffer-too-full messages are non-stop. I can use Open Broadcaster Software (OBS) and stream with no problem (I'm pretty sure it also uses ffmpeg), so I must be doing something wrong.
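    One likely cause, judging from the log rather than anything confirmed in the thread: ffmpeg options apply to the file that follows them, so `-rtbufsize` placed after `-i` becomes an output option and is ignored, which is why the log still reports the dshow default of 3041280 bytes. A sketch of the same command with the buffer size moved in front of `-i` (device names and RTMP URL kept exactly as in the question; `^` is the cmd.exe line continuation):

```shell
ffmpeg -f dshow -rtbufsize 702000k ^
  -i video="AVerMedia BDA Analog Capture Secondary":audio="Microphone (6- C-Media USB Audi" ^
  -vf scale=1280:720 -vcodec libx264 -r 30 ^
  -acodec mp3 -ac 2 -ar 44100 -ab 128k ^
  -pix_fmt yuv420p -tune zerolatency -preset ultrafast ^
  -f flv "rtmp://xxx.xxx.xxx.xxx/stream/key"
```

    Whether 702000k is an appropriate size is a separate question; the point of the sketch is only the option placement.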

    


  • How to convert real pixel values to flutter logical pixel (Density Independent Pixel)

    28 March 2020, by prem pattnaik

    I am using the flutter-ffmpeg package to overlay an image onto a video, and after the overlay I draw a rectangle over that image. The issue is that ffmpeg positions the overlay using real (physical) pixels, while Flutter draws the rectangle using logical pixels. How can I convert ffmpeg's real pixels to Flutter's logical pixels, so that I can make the image overlay dimensions match the rectangle?
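    For what it's worth, Flutter's logical pixels relate to physical pixels through the device pixel ratio (exposed in Dart as MediaQuery.of(context).devicePixelRatio). A minimal sketch of the arithmetic, written in Java for illustration; the 3.0 ratio below is an assumed example value, not something from the question:

```java
public class PixelConversion {
    // logical px = physical px / devicePixelRatio
    static double toLogical(double physicalPx, double devicePixelRatio) {
        return physicalPx / devicePixelRatio;
    }

    // physical px = logical px * devicePixelRatio
    static double toPhysical(double logicalPx, double devicePixelRatio) {
        return logicalPx * devicePixelRatio;
    }

    public static void main(String[] args) {
        double dpr = 3.0; // assumed device pixel ratio for illustration
        // A 300-px-wide ffmpeg overlay corresponds to 100 logical px in Flutter.
        System.out.println(toLogical(300.0, dpr)); // 100.0
    }
}
```

    The same ratio works in both directions, so the rectangle's logical size can also be scaled up to physical pixels before being passed to the ffmpeg overlay filter.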

  • How to take frames in real-time in a RTSP streaming ?

    30 June 2018, by guijob

    I’m trying to grab frames with no delay using javacv, and I’m kind of confused about how to do it and how javacv and the other pieces actually work under the hood.

    In my example, I have an RTSP stream running with the following configuration:

    Codec: H.264
    Frame Size: 1280x720
    Maximum Frame Rate: 60 fps

    In order to make it happen, I’ve made a thread like the following:

    public class TakeFrameFromStreamingThread implements Runnable {
      private CircularFifoQueue<Frame> queue;
      private Camera camera;
      private FFmpegFrameGrabber grabber = null;

      public TakeFrameFromStreamingThread(CircularFifoQueue<Frame> queue, Camera camera) {
        try {
          this.queue = queue;
          this.camera = camera;
          this.initGrabber(camera);
        } catch (Exception e) {
          e.printStackTrace();
        }
      }

      @Override
      public void run() {
        try {
          while (true) {
            if (grabber == null) {
              initGrabber(camera); // connect
            }

            Frame frame = grabber.grabImage();

            if (frame != null) {
              this.queue.add(frame);
            } else { // when frame == null, the connection has been lost
              initGrabber(camera); // reconnect
            }
          }
        } catch (Exception e) {
          e.printStackTrace();
        }
      }

      private void initGrabber(Camera camera) throws Exception {
        grabber = new FFmpegFrameGrabber(camera.getURL()); // rtsp url
        grabber.setVideoCodec(avcodec.AV_CODEC_ID_H264);
        grabber.setOption("rtsp_transport", "tcp");
        grabber.setFrameRate(60);
        grabber.setImageWidth(camera.getResolution().getWidth());
        grabber.setImageHeight(camera.getResolution().getHeight());
        grabber.start();
      }
    }

    And it seems to work. Anytime I need a frame, I poll this queue from my main thread.
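    One caveat worth flagging (my observation, not something raised in the question): Apache Commons' CircularFifoQueue is not synchronized, so sharing it between the grabber thread and the main thread needs external locking. A sketch of a thread-safe alternative built on the JDK's ArrayBlockingQueue, where the producer drops the oldest entry when the buffer is full; the generic type T stands in for javacv's Frame:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Bounded, thread-safe frame buffer: when full, the oldest entry is
// discarded so the consumer always sees the most recent frames.
public class LatestFrameBuffer<T> {
    private final BlockingQueue<T> queue;

    public LatestFrameBuffer(int capacity) {
        this.queue = new ArrayBlockingQueue<>(capacity);
    }

    // Called from the grabber thread for every frame received.
    public void offerLatest(T frame) {
        while (!queue.offer(frame)) {
            queue.poll(); // drop the oldest frame to make room
        }
    }

    // Called from the main thread; returns null if nothing is buffered.
    public T poll() {
        return queue.poll();
    }
}
```

    With a single producer thread, as in the snippet above, the drop-then-offer loop is safe; with several producers it would only mean occasionally dropping one extra frame.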

    I ended up with this solution while solving another issue: I was stuck on why calling grabImage() every time I needed a frame returned just the next buffered frame instead of a real-time frame from the stream.

    Based on this, I’m guessing there is a buffer which javacv (or ffmpeg, I don’t know) fills with frames, and grabImage() just takes a frame from that buffer. So here’s my first question:

    1) Is that right? Does ffmpeg rely on a buffer to store frames, with grab() just fetching from there?

    Well, if that is true, then this buffer must be filled at some rate, and of course, if that rate is greater than the rate of my grabImage() calls, I’ll eventually lose the real-time property and will soon start losing frames once the buffer is completely full.
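    To make the accumulation concrete (using the question's own numbers as assumptions): frames arriving at 60 fps and consumed at 20 fps means the backlog grows by 40 frames every second, and the frame being displayed drifts further behind real time the longer the stream runs:

```java
public class BufferLag {
    public static void main(String[] args) {
        double inFps = 60.0;   // stream frame rate, from the question
        double outFps = 20.0;  // 1 frame per 50 ms grabImage() call
        double growthPerSec = inFps - outFps; // frames accumulating per second
        // After 10 s, the 400-frame backlog corresponds to this many
        // seconds of video lag at the 60 fps source rate:
        double lagAfter10s = growthPerSec * 10.0 / inFps;
        System.out.println(growthPerSec); // 40.0
        System.out.println(lagAfter10s);  // about 6.67 s
    }
}
```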

    In this scenario, my grabImage() takes about 50 ms, which gives me a rate of 20 fps when fetching frames from this buffer. Hence I need to make sure ffmpeg receives frames at no more than 20 fps. So here’s my second question:

    2) How can I determine and change the ffmpeg buffer rate? I’m guessing it fills at the same rate as the stream (60 fps), or according to grabber.setFrameRate(). In general, I’m not sure whether I should set the grabber’s properties to the same values as the source stream.
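    One common workaround for this kind of rate mismatch (a general pattern, not something from the question): let the producer thread grab in a tight loop and keep only the newest frame in an AtomicReference, so the consumer always reads the most recent frame no matter how slowly it polls. A sketch with a generic type T standing in for javacv's Frame:

```java
import java.util.concurrent.atomic.AtomicReference;

// Holds only the most recent frame; older frames are simply overwritten,
// which trades completeness for real-time behaviour.
public class LatestFrameHolder<T> {
    private final AtomicReference<T> latest = new AtomicReference<>();

    // Producer: call for every frame grabbed.
    public void publish(T frame) {
        latest.set(frame);
    }

    // Consumer: take the newest frame (clearing it), or null if none arrived
    // since the last call.
    public T take() {
        return latest.getAndSet(null);
    }
}
```

    With this design the buffer never grows, so the 60 fps source rate stops mattering; the consumer just skips whatever frames it was too slow to fetch.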