
Other articles (111)
-
The SPIPmotion queue
28 November 2010
A queue stored in the database
When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to be encoded; id_objet, the unique identifier of the object to which the encoded document is to be attached automatically; objet, the type of object to which (...)
-
The farm's regular Cron tasks
1 December 2010
Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of all the instances of the shared-hosting farm on a regular basis. Combined with a system Cron on the farm's central site, it makes it possible to generate regular visits to the various sites and to prevent the tasks of rarely visited sites from being too (...)
-
Authorizations overridden by plugins
27 April 2010
Mediaspip core
autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page
On other sites (8971)
-
How to add custom external video filter to ffmpeg at runtime?
9 December 2019, by alex
Trying to build an ffplay app that uses an ffmpeg LGPL build with a custom filter that I want to plug in at runtime. Unlike adding a custom filter at compile time, this scenario is not documented, so I tried to experiment. I tried executing the avfilter_register and avfilter_graph_alloc_filter functions before avfilter_graph_parse_ptr, with avfilter_graph_parse_ptr returning -22 (Invalid argument).
Filter graph string:
[in] split [T1], fifo, [T2] overlay=0:H/2 [out]; [T1] fifo, crop=iw:ih/2:0:ih/2, my_edgedetect=low=0.1:high=0.4 [T2]
Lines added to the configure_filtergraph function in ffplay.c:
if ((ret = avfilter_register(&ff_vf_edgedetect)) < 0)
    goto fail;
if ((ret = avfilter_graph_alloc_filter(graph, &ff_vf_edgedetect, ff_vf_edgedetect.name)) < 0)
    goto fail;
The custom edge detect filter's name (AVFilter.name) was changed to my_edgedetect.
What is the proper way to register a custom filter at runtime?
P.S.
Same question on the Zeranoa forum
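One possible approach, sketched here under the assumption of an FFmpeg release that still ships the (now removed) avfilter_register() API, i.e. anything before FFmpeg 5.0: register the custom filter once before any graph is built, and let avfilter_graph_parse_ptr() instantiate it by name from the graph string; the separate avfilter_graph_alloc_filter() call is then unnecessary, since the parser allocates filter instances itself. The function name below is hypothetical.
/* Sketch only, not a verified fix: assumes FFmpeg < 5.0 and a filter
 * definition ff_vf_edgedetect whose .name field is "my_edgedetect". */
#include <libavfilter/avfilter.h>

extern AVFilter ff_vf_edgedetect;

static int register_my_edgedetect(void)
{
    /* Make the filter discoverable by name so that
     * avfilter_graph_parse_ptr() can instantiate "my_edgedetect"
     * from the graph description string. */
    return avfilter_register(&ff_vf_edgedetect);
}
Calling this once at startup, before configure_filtergraph() runs, would keep the rest of the ffplay filter setup unchanged.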
-
avdevice/decklink_enc: don't take for granted that first frame to decklink output will be PTS 0
3 March 2023, by Devin Heitmueller
The existing code assumed that the first frame received by the decklink
output would always be PTS zero. However, if running in timing
modes other than the default CBR, items such as frame dropping at the
beginning may result in starting at a non-zero PTS.
For example, in our setup, because we discard probing data and run
with "-vsync 2" the first video frame scheduled to the decklink
output will have a PTS around 170. Scheduling frames too far into
the future will either fail or cause a backlog of frames scheduled
far enough into the future that the entire pipeline will stall.
Issue can be reproduced with the following command-line:
./ffmpeg -copyts -i foo.ts -f decklink -vcodec v210 -ac 2 'DeckLink Duo (4)'
Keep track of the PTS of the first frame received, so that when
we enable start playback we can provide that value to the decklink
driver.
Thanks to Marton Balint for review and suggestion to use
AV_NOPTS_VALUE rather than zero for the initial value.
Signed-off-by: Devin Heitmueller <dheitmueller@ltnglobal.com>
Signed-off-by: Marton Balint <cus@passwd.hu>
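The change described in the commit message boils down to recording the PTS of the first frame handed to the output and scheduling everything relative to it, rather than assuming the stream starts at zero. A rough sketch of that pattern follows; the struct and function names are hypothetical and this is not the actual decklink_enc code.
/* Hypothetical illustration of the pattern, not the FFmpeg patch itself. */
#include <libavutil/avutil.h>
#include <libavutil/frame.h>

struct output_state {
    int64_t first_pts;              /* AV_NOPTS_VALUE until a frame arrives */
};

static int64_t scheduled_pts(struct output_state *s, const AVFrame *frame)
{
    if (s->first_pts == AV_NOPTS_VALUE)
        s->first_pts = frame->pts;  /* remember the PTS of the first frame */
    /* Schedule relative to the first PTS, so a stream starting at a
     * non-zero PTS (e.g. after dropped probing frames) is not pushed
     * far into the future. */
    return frame->pts - s->first_pts;
}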
-
Confused about x264 and encoding video frames
26 February 2015, by spartygw
I built a test driver for encoding a series of images I have captured. I am using libx264 and based my driver on this guy's answer:
In my case I start out by reading in a JPG image, converting it to YUV, and passing that same frame over and over in a loop to the x264 encoder.
My expectation was that, since the frame is the same, the output from the encoder would be very small and constant.
Instead I find that the NAL payload varies from a few bytes to a few KB, and it also varies highly depending on the frame rate I specify in the encoder parameters.
Obviously I don't understand video encoding. Why does the output size vary so much?
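Before the full driver listing below, a minimal sketch (not the poster's code) of an encode loop that feeds the same picture repeatedly and prints each payload size. Two observations may help: x264's f_rf_constant is a CRF quality level (roughly 0-51, lower means better quality), not a value derived from the frame rate, and even identical input frames produce varying payloads because the first frame is an IDR, later frames are mostly skip blocks, and b_intra_refresh re-codes a column of intra macroblocks in every frame. The function assumes an encoder and picture already set up as in the listing.
/* Sketch: re-encode the same x264_picture_t and print each payload size.
 * Assumes "encoder" and "pic_in" were initialized as in the driver below. */
#include <stdio.h>
#include <x264.h>

static void encode_same_picture(x264_t *encoder, x264_picture_t *pic_in, int count)
{
    x264_picture_t pic_out;
    x264_nal_t *nals;
    int i_nals;

    for (int i = 0; i < count; i++) {
        pic_in->i_pts = i;  /* keep the presentation timestamps increasing */
        int frame_size = x264_encoder_encode(encoder, &nals, &i_nals, pic_in, &pic_out);
        if (frame_size < 0)
            break;
        /* Payload size varies even for identical input: IDR first,
         * mostly-skip P-frames afterwards, plus the intra-refresh columns. */
        printf("frame %d: %d bytes in %d NALs\n", i, frame_size, i_nals);
    }
}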
int main()
{
    Image image(WIDTH, HEIGHT);
    image.FromJpeg("frame-1.jpg");
    unsigned char *data = image.GetRGB();

    x264_param_t param;
    x264_param_default_preset(&param, "fast", "zerolatency");
    param.i_threads = 1;
    param.i_width = WIDTH;
    param.i_height = HEIGHT;
    param.i_fps_num = FPS;
    param.i_fps_den = 1;

    // Intra refresh:
    param.i_keyint_max = FPS;
    param.b_intra_refresh = 1;

    // Rate control:
    param.rc.i_rc_method = X264_RC_CRF;
    param.rc.f_rf_constant = FPS-5;
    param.rc.f_rf_constant_max = FPS+5;

    // For streaming:
    param.b_repeat_headers = 1;
    param.b_annexb = 1;
    x264_param_apply_profile(&param, "baseline");

    // initialize the encoder
    x264_t* encoder = x264_encoder_open(&param);
    x264_picture_t pic_in, pic_out;
    x264_picture_alloc(&pic_in, X264_CSP_I420, WIDTH, HEIGHT);

    // X264 expects YUV420P data; use libswscale
    // (from ffmpeg) to convert images to the right format
    struct SwsContext* convertCtx =
        sws_getContext(WIDTH, HEIGHT, PIX_FMT_RGB24, WIDTH, HEIGHT,
                       PIX_FMT_YUV420P, SWS_FAST_BILINEAR,
                       NULL, NULL, NULL);

    // encoding is as simple as this then, for each frame do:
    // data is a pointer to your RGB structure
    int srcstride = WIDTH*3; // RGB stride is just 3*width
    sws_scale(convertCtx, &data, &srcstride, 0, HEIGHT,
              pic_in.img.plane, pic_in.img.i_stride);

    x264_nal_t* nals;
    int i_nals;
    int frame_size =
        x264_encoder_encode(encoder, &nals, &i_nals, &pic_in, &pic_out);

    int max_loop=15;
    int this_loop=1;

    while (frame_size >= 0 && --max_loop)
    {
        cout << "------------" << this_loop++ << "-----------------\n";
        cout << "Frame size = " << frame_size << endl;
        cout << "output has " << pic_out.img.i_csp << " colorspace\n";
        cout << "output has " << pic_out.img.i_plane << " # img planes\n";
        cout << "i_nals = " << i_nals << endl;
        for (int n=0; n