
Media (1)
-
Bee video in portrait
14 May 2011, by
Updated: February 2012
Language: French
Type: Video
Other articles (61)
-
APPENDIX: Extensions, the channels' SPIP plugins
11 February 2010, by
A plugin is a functional addition to the main SPIP core. MediaSPIP consists of a deliberate selection of plugins, whether or not they previously existed in the SPIP community; some had to be created from scratch, while others needed new features added.
The extensions MediaSPIP needs in order to work
Since version 2.1.0, SPIP allows plugins to be added in the extensions/ directory.
"Extensions" are nothing more nor less than plugins whose particularity is that they (...)
-
Managing the farm
2 March 2010, by
The farm as a whole is managed by "super admins".
Certain settings can be adjusted in order to regulate the needs of the different channels.
To begin with, it uses the "Gestion de mutualisation" plugin
-
Possibility of deployment as a farm
12 April 2011, by
MediaSPIP can be installed as a farm, with a single "core" hosted on a dedicated server and used by a multitude of different sites.
This makes it possible, for example: to share the setup costs between several projects / individuals; to deploy a multitude of unique sites quickly; to avoid having to dump all the creations into a digital catch-all, as is the case with the big general-public platforms scattered across the (...)
On other sites (7503)
-
What's the difference between H265, x265, x264 and H264, which of these uses the GPU instead of the CPU, and how do I use H265 NVIDIA GPU acceleration in ffmpeg?
30 July 2022, by Fab98
I'm not sure whether x264/x265 use the CPU, whether h264/h265 use the GPU, and whether h265 is basically HEVC_NVENC for NVIDIA GPU acceleration. If you could give me more information about these encoding types, that would be great. What I understood, summing up a lot, is that the x26* encoders use the CPU and are slower but more accurate, while the h26* ones are the opposite, and that h265 is the most recent and the best trade-off.
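For reference, H.264 and H.265 (HEVC) are codec standards, x264/x265 are software (CPU) encoder implementations of them, and h264_nvenc/hevc_nvenc are NVIDIA GPU (NVENC) implementations. A quick way to check which of these a given ffmpeg build actually ships (the findstr filter assumes a Windows shell, matching the ffmpeg.exe usage below):

ffmpeg.exe -hide_banner -encoders | findstr /i "264 265"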
Furthermore, I was trying to convert a video using GPU acceleration, and my question is:


Does the following command tell the GPU to use h265 to encode the video, keeping the same audio and producing the maximum video quality? And are there other ways to express the same command?


ffmpeg.exe -hwaccel_output_format cuda -i "input" -c:v hevc_nvenc -preset medium -rc constqp -qp 0 -c:a copy "output"
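One possible equivalent form, offered as a sketch rather than a verified answer: adding -hwaccel cuda (an assumption, it is not in the original command) also moves decoding to the GPU, while hevc_nvenc alone already handles the GPU encode; note that -rc constqp -qp 0 asks NVENC for (near-)lossless quality rather than "maximum upgrade", so output files can become very large:

ffmpeg.exe -hwaccel cuda -hwaccel_output_format cuda -i "input" -c:v hevc_nvenc -preset medium -rc constqp -qp 0 -c:a copy "output"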



-
How do I send a buffer to a virtual camera driver?
15 June 2022, by rvega01
I'm currently working on a project in a Windows environment where I have a Node.js application that accepts an RTSP video stream from a button click, and I would like to stream the video data to a virtual camera driver and have it displayed within Microsoft Teams and Skype. The driver that I'm using and have installed (https://github.com/robot9706/VirtualCameraDriver) mentions that the filter implemented in the driver contains a property with a GUID that I can send buffers to.


I was wondering if anyone knows of a JavaScript/Node.js library, an ffmpeg/GStreamer command, or any other solution that would allow me to send video data to that GUID. I have tried outputting the RTSP stream to a video sink with GStreamer, but I was unable to specify the display name the output should be directed to.


With ffmpeg, I was able to find the driver info with DirectShow, but I can only use the driver as a video input and I'm unable to select it as an output through DirectShow. If there is a GStreamer/ffmpeg solution, please let me know; I'm inexperienced with video streaming and, although I have looked through many threads here, I'm unable to find an answer.
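One generic sketch, not specific to this driver (the rtsp://camera/stream URL is a placeholder): ffmpeg can decode the RTSP stream and write raw frames to stdout, which a small native helper that knows the driver's property GUID could then forward; how that helper actually talks to the driver is outside ffmpeg's scope.

ffmpeg.exe -rtsp_transport tcp -i "rtsp://camera/stream" -an -f rawvideo -pix_fmt yuv420p pipe:1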


-
How to set NVIDIA video bitrate in a system service?
12 September 2021, by qycx
I use an NVIDIA card for compression and decompression in video-conferencing software. The program uses ffmpeg and runs as a system service. I need to control the bit rate, but I found that I couldn't: the configured rate was 700k, while the actual value was 10M. What went wrong?


/* put sample parameters */
 pUnit->encV_var.c->bit_rate = bitrate;//400000;
 /* resolution must be a multiple of two */
 pUnit->encV_var.c->width = in_w;//352;
 pUnit->encV_var.c->height = in_h;//288;
 pUnit->encV_var.c->qmin=10;
 pUnit->encV_var.c->qmax=30;
 /* frames per second */
 AVRational t1={1,25};
 t1.den=fps;
 pUnit->encV_var.c->time_base = t1;//(AVRational){1, 25};
 AVRational t2={25,1};
 t2.num=fps;
 pUnit->encV_var.c->framerate = t2;//(AVRational){25, 1};

 /* emit one intra frame every ten frames
 * check frame pict_type before passing frame
 * to encoder, if frame->pict_type is AV_PICTURE_TYPE_I
 * then gop_size is ignored and the output of encoder
 * will always be I frame irrespective to gop_size
 */
 pUnit->encV_var.c->gop_size = 256;//10;
 pUnit->encV_var.c->max_b_frames = 0;//1;
 pUnit->encV_var.c->pix_fmt = AV_PIX_FMT_YUV420P;
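A plausible cause, not verified against this exact setup: with the NVENC encoders the private rate-control mode ultimately decides the output rate, and depending on the ffmpeg version, setting qmin/qmax can push the encoder into a QP-clamped VBR mode in which the 700k target is easily exceeded. A minimal sketch, assuming c is the AVCodecContext allocated for h264_nvenc or hevc_nvenc (the helper name set_nvenc_cbr is made up for illustration):

#include <libavcodec/avcodec.h>
#include <libavutil/opt.h>

/* Explicitly select constant-bitrate rate control on the NVENC encoder
 * and give it a matching peak rate and a ~1-second VBV buffer, so that
 * bit_rate is honoured instead of being overridden by another mode. */
static void set_nvenc_cbr(AVCodecContext *c, int64_t bitrate)
{
    c->bit_rate       = bitrate;      /* e.g. 700000 for 700k */
    c->rc_max_rate    = bitrate;
    c->rc_buffer_size = (int)bitrate;

    /* "rc" is a private option of the nvenc encoders; "cbr" forces
     * constant bitrate. Leave qmin/qmax unset in this mode. */
    av_opt_set(c->priv_data, "rc", "cbr", 0);
}

In the snippet above, this would take the place of the qmin/qmax lines and must run before avcodec_open2() is called.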