
Other articles (63)
-
Configuring language support
15 November 2010 — Accessing the configuration and adding supported languages
To configure support for new languages, you need to go to the "Administer" section of the site.
From there, the navigation menu gives you access to a "Language management" section where support for new languages can be enabled.
Each newly added language can still be deactivated as long as no object has been created in that language. Once that happens, it becomes greyed out in the configuration and (...) -
Requesting the creation of a channel
12 March 2010 — Depending on the platform's configuration, the user may have two different methods available for requesting the creation of a channel. The first is at the time of registration; the second, after registration, by filling in a request form.
Both methods ask for the same information and work in much the same way: the future user must fill in a series of form fields which, first of all, give the administrators information about (...) -
The SPIPmotion queue
28 November 2010 — A queue stored in the database
When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document is to be attached automatically; objet, the type of object to which (...)
On other sites (6924)
-
FFMPEG problems with real-time buffer
27 June 2024, by Charles Kiel — I'm trying to use FFmpeg (on Windows) to encode a stream from a video capture card via dshow and send it to an RTMP server. This is my command line:



ffmpeg -f dshow -i video="AVerMedia BDA Analog Capture Secondary":audio="Microphone (6- C-Media USB Audi" -vf scale=1280:720 -vcodec libx264 -r 30 -rtbufsize 702000k -acodec mp3 -ac 2 -ar 44100 -ab 128k -pix_fmt yuv420p -tune zerolatency -preset ultrafast -f flv "rtmp://xxx.xxx.xxx.xxx/stream/key"

ffmpeg version N-86950-g1bef008 Copyright (c) 2000-2017 the FFmpeg developers
 built with gcc 7.1.0 (GCC)
 configuration: --enable-gpl --enable-version3 --enable-cuda --enable-cuvid --enable-d3d11va --enable-dxva2 --enable-libmfx --enable-nvenc --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenh264 --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-zlib
 libavutil 55. 70.100 / 55. 70.100
 libavcodec 57.102.100 / 57.102.100
 libavformat 57. 76.100 / 57. 76.100
 libavdevice 57. 7.100 / 57. 7.100
 libavfilter 6. 98.100 / 6. 98.100
 libswscale 4. 7.102 / 4. 7.102
 libswresample 2. 8.100 / 2. 8.100
 libpostproc 54. 6.100 / 54. 6.100
 Guessed Channel Layout for Input Stream #0.1 : stereo
 Input #0, dshow, from 'video=AVerMedia BDA Analog Capture Secondary:audio=Microphone (6- C-Media USB Audi':
 Duration: N/A, start: 2035.202000, bitrate: N/A
 Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 720x480, 29.97 fps, 29.97 tbr, 10000k tbn, 10000k tbc
 Stream #0:1: Audio: pcm_s16le, 44100 Hz, stereo, s16, 1411 kb/s
 [dshow @ 00000000005f90e0] real-time buffer [AVerMedia BDA Analog Capture Secondary] [video input] too full or near too full (68% of size: 3041280 [rtbufsize parameter])! frame dropped!
 [dshow @ 00000000005f90e0] real-time buffer [AVerMedia BDA Analog Capture Secondary] [video input] too full or near too full (90% of size: 3041280 [rtbufsize parameter])! frame dropped!
 [dshow @ 00000000005f90e0] real-time buffer [AVerMedia BDA Analog Capture Secondary] [video input] too full or near too full (113% of size: 3041280 [rtbufsize parameter])! frame dropped!
 Last message repeated 46 times
 Stream mapping:
 Stream #0:0 -> #0:0 (rawvideo (native) -> h264 (libx264))
 Stream #0:1 -> #0:1 (pcm_s16le (native) -> mp3 (libmp3lame))
 Press [q] to stop, [?] for help
 [dshow @ 00000000005f90e0] real-time buffer [AVerMedia BDA Analog Capture Secondary] [video input] too full or near too full (113% of size: 3041280 [rtbufsize parameter])! frame dropped!
 [libx264 @ 0000000005b16640] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
 [libx264 @ 0000000005b16640] profile Constrained Baseline, level 3.1
 [libx264 @ 0000000005b16640] 264 - core 152 r2851 ba24899 - H.264/MPEG-4 AVC codec - Copyleft 2003-2017 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=0:0:0 analyse=0:0 me=dia subme=0 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=11 lookahead_threads=11 sliced_threads=1 slices=11 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=250 keyint_min=25 scenecut=0 intra_refresh=0 rc=crf mbtree=0 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=0
 [dshow @ 00000000005f90e0] real-time buffer [AVerMedia BDA Analog Capture Secondary] [video input] too full or near too full (113% of size: 3041280 [rtbufsize parameter])! frame dropped!
 Past duration 0.999992 too large




The "buffer too full" messages are non-stop. I can use Open Broadcaster Software (OBS) and stream with no problem (I'm pretty sure it also uses ffmpeg), so I must be doing something wrong.
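One likely culprit (an assumption based on the log above, not a confirmed fix): in ffmpeg, input options must be placed before the `-i` they apply to. In the posted command, `-rtbufsize 702000k` appears after `-i`, so it is parsed as an output option and the dshow capture buffer stays at its default — 3041280 bytes, exactly the size the warnings report. Moving it before `-i` should apply it to the capture device:

```shell
# -rtbufsize is an option of the dshow *input*, so it must come before -i;
# placed after -i it silently applies to the output and the capture buffer
# keeps its small default (3041280 bytes, the size shown in the warnings).
ffmpeg -f dshow -rtbufsize 702000k \
  -i video="AVerMedia BDA Analog Capture Secondary":audio="Microphone (6- C-Media USB Audi" \
  -vf scale=1280:720 -vcodec libx264 -r 30 \
  -acodec mp3 -ac 2 -ar 44100 -ab 128k \
  -pix_fmt yuv420p -tune zerolatency -preset ultrafast \
  -f flv "rtmp://xxx.xxx.xxx.xxx/stream/key"
```

If frames are still dropped with a larger buffer, the encoder simply isn't keeping up with the 29.97 fps raw YUY2 input, and a faster preset or hardware encoder would be the next thing to try.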


-
How to convert real pixel values to Flutter logical pixels (density-independent pixels)
28 March 2020, by prem pattnaik — I am using the flutter-ffmpeg package to overlay an image onto a video, and after the overlay I draw a rectangle over that image. The issue is that ffmpeg overlays the image using real pixel values while Flutter draws the rectangle using logical pixels. How can I convert ffmpeg's real pixels to Flutter's logical pixels so that I can change the image overlay dimensions to match the rectangle?
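The conversion itself is plain division: logical pixels = physical pixels / device pixel ratio (in Flutter, `MediaQuery.of(context).devicePixelRatio`). A minimal Java sketch of the arithmetic, with the ratio value assumed for illustration:

```java
// Converts between physical (real) pixels, as used by ffmpeg's overlay
// filter, and logical (density-independent) pixels, as used by Flutter.
public class PixelConversion {
    // logical = physical / devicePixelRatio
    static double toLogical(double physicalPx, double devicePixelRatio) {
        return physicalPx / devicePixelRatio;
    }

    // physical = logical * devicePixelRatio
    static double toPhysical(double logicalPx, double devicePixelRatio) {
        return logicalPx * devicePixelRatio;
    }

    public static void main(String[] args) {
        // 3.0 is an assumed ratio; in Flutter you would read
        // MediaQuery.of(context).devicePixelRatio at runtime.
        double ratio = 3.0;
        System.out.println(toLogical(1080, ratio)); // 1080 physical px -> 360.0 logical px
    }
}
```

Note that the video may additionally be scaled to fit its widget, in which case that display scale factor has to be applied on top of the device pixel ratio.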
-
How to take frames in real time from an RTSP stream?
30 June 2018, by guijob — I'm trying to grab frames with no delay using javacv, and I'm somewhat confused about how to do it and how javacv and the rest of the stack actually work under the hood. In my example, I have an RTSP stream running with the following configuration:

Codec: H.264
Frame size: 1280x720
Maximum frame rate: 60 fps

To make this happen, I've written a thread like the following:
public class TakeFrameFromStreamingThread implements Runnable {
  private CircularFifoQueue queue;
  private Camera camera;
  private FFmpegFrameGrabber grabber = null;

  public TakeFrameFromStreamingThread(CircularFifoQueue queue, Camera camera) {
    try {
      this.queue = queue;
      this.camera = camera;
      this.initGrabber(camera);
    } catch (Exception e) {
      e.printStackTrace();
    }
  }

  @Override
  public void run() {
    try {
      while (true) {
        if (grabber == null) {
          initGrabber(camera); // connect
        }

        Frame frame = grabber.grabImage();

        if (frame != null) {
          this.queue.add(frame);
        } else { // when frame == null the connection has been lost
          initGrabber(camera); // reconnect
        }
      }
    } catch (Exception e) {
      e.printStackTrace();
    }
  }

  private void initGrabber(Camera camera) throws Exception {
    grabber = new FFmpegFrameGrabber(camera.getURL()); // rtsp url
    grabber.setVideoCodec(avcodec.AV_CODEC_ID_H264);
    grabber.setOption("rtsp_transport", "tcp");
    grabber.setFrameRate(60);
    grabber.setImageWidth(camera.getResolution().getWidth());
    grabber.setImageHeight(camera.getResolution().getHeight());
    grabber.start();
  }
}

And it seems to work. Anytime I need a frame, I poll this queue from my main thread.

I ended up with this solution while solving another issue: I was stuck on why calling grabImage() every time I needed a frame returned just the next frame instead of a real-time frame from the stream.

Given this behavior, I'm guessing there is a buffer which javacv (or ffmpeg, I don't know) fills with frames, and grabImage() just takes a frame from this buffer. So here is my first question:

1) Is that right? Does ffmpeg rely on a buffer to store frames, with grab() just taking them from there?

Well, if that is true, then this buffer must be filled at some rate and, of course, if this rate is greater than the rate of my grabImage() calls, I'll eventually lose the real-time property and soon start losing frames once the buffer is completely filled.

In this scenario, my grabImage() takes about 50 ms, which gives me a rate of 20 fps for getting frames from this buffer. Hence, I need to make sure ffmpeg is receiving frames at no more than 20 fps. So here's my second question:

2) How can I know and change the rate at which ffmpeg's buffer fills? I'm guessing it fills at the same rate as the stream (60 fps), or according to grabber.setFrameRate(). In general, I'm not sure whether I should use the grabber setters with the same values as the source stream.
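On the "consumer slower than producer" problem, one common workaround (a sketch of the general technique, not javacv API; plain Java with integers standing in for javacv Frame objects) is a "latest frame wins" handoff: a capacity-1 queue where the grabber thread evicts the stale frame before offering the new one, so the consumer always sees the most recent frame instead of a growing backlog:

```java
import java.util.concurrent.ArrayBlockingQueue;

// "Latest frame wins" handoff between a fast producer (grabber thread)
// and a slow consumer: the producer drops any stale frame before offering
// the new one, so the consumer never drains an old backlog.
public class LatestFrameQueue {
    private final ArrayBlockingQueue<Integer> slot = new ArrayBlockingQueue<>(1);

    // Called from the grabber thread for every grabbed frame.
    // (Assumes a single producer; with several producers the poll/offer
    // pair would need to be an atomic swap instead.)
    public void publish(int frame) {
        slot.poll();        // drop the stale frame, if any
        slot.offer(frame);  // capacity is 1, so this now succeeds
    }

    // Called from the consumer; returns null when no new frame has arrived.
    public Integer latest() {
        return slot.poll();
    }

    public static void main(String[] args) {
        LatestFrameQueue q = new LatestFrameQueue();
        for (int i = 1; i <= 60; i++) q.publish(i); // producer at "60 fps"
        System.out.println(q.latest());             // only the newest frame survives
    }
}
```

With this shape, the grab rate and the consume rate no longer need to match: frames grabbed while the consumer is busy are simply discarded, which is usually what "real-time" means in practice.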