
Other articles (62)

  • User profiles

    12 April 2011, by

    Each user has a profile page where they can edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialised, visible only when the visitor is logged in to the site.
    The user can reach the profile editor from their author page; a "Modifier votre profil" (edit your profile) link in the navigation is (...)

  • Configuring language support

    15 November 2010, by

    Accessing the configuration and adding supported languages
    To configure support for new languages, you need to go to the "Administrer" (administration) section of the site.
    From there, in the navigation menu, you can reach a "Gestion des langues" (language management) section that lets you enable support for new languages.
    Each newly added language can still be disabled as long as no object has been created in that language. In that case, it becomes greyed out in the configuration and (...)

  • XMP PHP

    13 May 2011, by

    According to Wikipedia, XMP stands for:
    Extensible Metadata Platform (XMP), an XML-based metadata format used in PDF, photography and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
    Being based on XML, it manages a set of dynamic tags for use in the Semantic Web.
    XMP makes it possible to record, as an XML document, information about a file: title, author, history (...)

On other sites (7324)

  • FFMPEG - adding a nullsrc causes my script to report "1000 duplicate frames"

    14 June 2020, by rossmcm

    I'm trying to add coloured rectangle highlights to a video, appearing at different locations and times. The highlights are on a 6x6 grid of 320x180 rectangles.

    Originally I didn't have the nullsrc=size=1920x1080, thinking the chain would start from an empty image, but without it ffmpeg seems to make its own assumptions about where the input is coming from. So I added nullsrc=size=1920x1080 to start each chain from a transparent 1920x1080 image, but this command warns that 1000 duplicate frames have been produced and keeps going past the end of the input video with no sign of stopping (a possible workaround is sketched after the log below).

    ffmpeg -y \
  -i "Input.mp4" \
  -filter_complex \
 "nullsrc=size=1920x1080, drawbox=x=(3-1)*320:y=(3-1)*180:w=320:h=180:t=7:c=cyan,   fade=in:st=10:d=1:alpha=1, fade=out:st=40:d=1:alpha=1[tmp1]; \
  nullsrc=size=1920x1080, drawbox=x=(4-1)*320:y=(4-1)*180:w=320:h=180:t=7:c=blue,   fade=in:st=20:d=1:alpha=1, fade=out:st=50:d=1:alpha=1[tmp2]; \
  nullsrc=size=1920x1080, drawbox=x=(5-1)*320:y=(5-1)*180:w=320:h=180:t=7:c=green,  fade=in:st=30:d=1:alpha=1, fade=out:st=60:d=1:alpha=1[tmp3]; \
  nullsrc=size=1920x1080, drawbox=x=(6-1)*320:y=(6-1)*180:w=320:h=180:t=7:c=yellow, fade=in:st=40:d=1:alpha=1, fade=out:st=70:d=1:alpha=1[tmp4]; \
  [tmp1][tmp2] overlay=0:0[ovr1]; \
  [tmp3][tmp4] overlay=0:0[ovr2]; \
  [ovr1][ovr2] overlay=0:0[boxes]; \
  [0:v][boxes] overlay=0:0" \
  "Output.mp4"

    The input video is around 01:45 long. Log of the run:

    ffmpeg -y        -loglevel verbose       -i "Input.mp4"          -filter_complex " nullsrc=size=1920x1080, drawbox=x=(3-1)*320:y=(3-1)*180:w=320:h=180:t=7:c=cyan,   fade=in:st=10:d=1:alpha=1, fade=out:st=40:d=1:alpha=1[tmp1]; nullsrc=size=1920x1080, drawbox=x=(4-1)*320:y=(4-1)*180:w=320:h=180:t=7:c=blue,   fade=in:st=20:d=1:alpha=1, fade=out:st=50:d=1:alpha=1[tmp2]; nullsrc=size=1920x1080, drawbox=x=(5-1)*320:y=(5-1)*180:w=320:h=180:t=7:c=green,  fade=in:st=30:d=1:alpha=1, fade=out:st=60:d=1:alpha=1[tmp3]; nullsrc=size=1920x1080, drawbox=x=(6-1)*320:y=(6-1)*180:w=320:h=180:t=7:c=yellow, fade=in:st=40:d=1:alpha=1, fade=out:st=70:d=1:alpha=1[tmp4]; [tmp1][tmp2] overlay=0:0[ovr1]; [tmp3][tmp4] overlay=0:0[ovr2]; [ovr1][ovr2] overlay=0:0[boxes]; [0:v][boxes] overlay=0:0"      "Output.mp4"
ffmpeg version 3.4 Copyright (c) 2000-2017 the FFmpeg developers
  built with gcc 7.2.0 (GCC)
  configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-bzlib --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvorbis --enable-cuda --enable-cuvid --enable-d3d11va --enable-nvenc --enable-dxva2 --enable-avisynth --enable-libmfx
  libavutil      55. 78.100 / 55. 78.100
  libavcodec     57.107.100 / 57.107.100
  libavformat    57. 83.100 / 57. 83.100
  libavdevice    57. 10.100 / 57. 10.100
  libavfilter     6.107.100 /  6.107.100
  libswscale      4.  8.100 /  4.  8.100
  libswresample   2.  9.100 /  2.  9.100
  libpostproc    54.  7.100 / 54.  7.100
[h264 @ 000001df2b623ea0] Reinit context to 1920x1088, pix_fmt: yuv420p
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'Input.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    encoder         : Lavf58.29.100
  Duration: 00:01:48.67, start: 0.000000, bitrate: 1825 kb/s
    Stream #0:0(und): Video: h264 (High), 1 reference frame (avc1 / 0x31637661), yuv420p(left), 1920x1080 (1920x1088) [SAR 1:1 DAR 16:9], 1693 kb/s, 30 fps, 30 tbr, 15360 tbn, 60 tbc (default)
    Metadata:
      handler_name    : VideoHandler
    Stream #0:1(und): Audio: mp3 (mp4a / 0x6134706D), 44100 Hz, stereo, s16p, 127 kb/s (default)
    Metadata:
      handler_name    : SoundHandler
[Parsed_nullsrc_0 @ 000001df2b61af00] size:1920x1080 rate:25/1 duration:-1.000000 sar:1/1
[Parsed_fade_2 @ 000001df2b6a6860] type:in start_time:10.000000 duration:1.000000 alpha:1
[Parsed_fade_3 @ 000001df2b9ddec0] type:out start_time:40.000000 duration:1.000000 alpha:1
[Parsed_nullsrc_4 @ 000001df2bc00560] size:1920x1080 rate:25/1 duration:-1.000000 sar:1/1
[Parsed_fade_6 @ 000001df29fe6780] type:in start_time:20.000000 duration:1.000000 alpha:1
[Parsed_fade_7 @ 000001df29fe6840] type:out start_time:50.000000 duration:1.000000 alpha:1
[Parsed_nullsrc_8 @ 000001df2b642dc0] size:1920x1080 rate:25/1 duration:-1.000000 sar:1/1
[Parsed_fade_10 @ 000001df2b6442a0] type:in start_time:30.000000 duration:1.000000 alpha:1
[Parsed_fade_11 @ 000001df2b6444e0] type:out start_time:60.000000 duration:1.000000 alpha:1
[Parsed_nullsrc_12 @ 000001df2b62d000] size:1920x1080 rate:25/1 duration:-1.000000 sar:1/1
[Parsed_fade_14 @ 000001df2b62d580] type:in start_time:40.000000 duration:1.000000 alpha:1
[Parsed_fade_15 @ 000001df2b62ddc0] type:out start_time:70.000000 duration:1.000000 alpha:1
Stream mapping:
  Stream #0:0 (h264) -> overlay:main (graph 0)
  overlay (graph 0) -> Stream #0:0 (libx264)
  Stream #0:1 -> #0:1 (mp3 (native) -> aac (native))
Press [q] to stop, [?] for help
[h264 @ 000001df2b8e5040] Reinit context to 1920x1088, pix_fmt: yuv420p
[graph_1_in_0_1 @ 000001df2b62e100] tb:1/44100 samplefmt:s16p samplerate:44100 chlayout:0x3
[format_out_0_1 @ 000001df2b62e5e0] auto-inserting filter 'auto_resampler_0' between the filter 'Parsed_anull_0' and the filter 'format_out_0_1'
[auto_resampler_0 @ 000001df2b62dd00] ch:2 chl:stereo fmt:s16p r:44100Hz -> ch:2 chl:stereo fmt:fltp r:44100Hz
[Parsed_nullsrc_0 @ 000001df2b62e440] size:1920x1080 rate:25/1 duration:-1.000000 sar:1/1
[Parsed_fade_2 @ 000001df2b62df60] type:in start_time:10.000000 duration:1.000000 alpha:1
[Parsed_fade_3 @ 000001df2b62e6c0] type:out start_time:40.000000 duration:1.000000 alpha:1
[Parsed_nullsrc_4 @ 000001df2b62d9c0] size:1920x1080 rate:25/1 duration:-1.000000 sar:1/1
[Parsed_fade_6 @ 000001df2b62e520] type:in start_time:20.000000 duration:1.000000 alpha:1
[Parsed_fade_7 @ 000001df2b62d8e0] type:out start_time:50.000000 duration:1.000000 alpha:1
[Parsed_nullsrc_8 @ 000001df2b62e1e0] size:1920x1080 rate:25/1 duration:-1.000000 sar:1/1
[Parsed_fade_10 @ 000001df2b62e2a0] type:in start_time:30.000000 duration:1.000000 alpha:1
[Parsed_fade_11 @ 000001df2b62e380] type:out start_time:60.000000 duration:1.000000 alpha:1
[Parsed_nullsrc_12 @ 000001df2b62dc20] size:1920x1080 rate:25/1 duration:-1.000000 sar:1/1
[Parsed_fade_14 @ 000001df2b816f60] type:in start_time:40.000000 duration:1.000000 alpha:1
[Parsed_fade_15 @ 000001df2b817ac0] type:out start_time:70.000000 duration:1.000000 alpha:1
[graph 0 input from stream 0:0 @ 000001df2b817ba0] w:1920 h:1080 pixfmt:yuv420p tb:1/15360 fr:30/1 sar:1/1 sws_param:flags=2
[Parsed_drawbox_1 @ 000001df2b62da80] x:640 y:360 w:320 h:180 color:0xA9A610FF
[Parsed_drawbox_5 @ 000001df2b62e780] x:960 y:540 w:320 h:180 color:0x29F06EFF
[Parsed_overlay_16 @ 000001df2b817860] main w:1920 h:1080 fmt:yuva420p overlay w:1920 h:1080 fmt:yuva420p
[Parsed_overlay_16 @ 000001df2b817860] [framesync @ 000001df2b815348] Selected 1/25 time base
[Parsed_overlay_16 @ 000001df2b817860] [framesync @ 000001df2b815348] Sync level 2
[Parsed_drawbox_9 @ 000001df2b62db60] x:1280 y:720 w:320 h:180 color:0x515B51FF
[Parsed_drawbox_13 @ 000001df2b818be0] x:1600 y:900 w:320 h:180 color:0xD21092FF
[Parsed_overlay_17 @ 000001df2b816d00] main w:1920 h:1080 fmt:yuva420p overlay w:1920 h:1080 fmt:yuva420p
[Parsed_overlay_17 @ 000001df2b816d00] [framesync @ 000001df2b815e48] Selected 1/25 time base
[Parsed_overlay_17 @ 000001df2b816d00] [framesync @ 000001df2b815e48] Sync level 2
[Parsed_overlay_18 @ 000001df2b8183c0] main w:1920 h:1080 fmt:yuva420p overlay w:1920 h:1080 fmt:yuva420p
[Parsed_overlay_18 @ 000001df2b8183c0] [framesync @ 000001df2b815668] Selected 1/25 time base
[Parsed_overlay_18 @ 000001df2b8183c0] [framesync @ 000001df2b815668] Sync level 2
[Parsed_overlay_19 @ 000001df2b817d40] main w:1920 h:1080 fmt:yuv420p overlay w:1920 h:1080 fmt:yuva420p
[Parsed_overlay_19 @ 000001df2b817d40] [framesync @ 000001df2b816488] Selected 1/76800 time base
[Parsed_overlay_19 @ 000001df2b817d40] [framesync @ 000001df2b816488] Sync level 2
[libx264 @ 000001df2b62cd40] using SAR=1/1
[libx264 @ 000001df2b62cd40] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
[libx264 @ 000001df2b62cd40] profile High, level 4.0
[libx264 @ 000001df2b62cd40] 264 - core 152 r2851 ba24899 - H.264/MPEG-4 AVC codec - Copyleft 2003-2017 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=12 lookahead_threads=2 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to 'Output.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    encoder         : Lavf57.83.100
    Stream #0:0: Video: h264 (libx264), 1 reference frame (avc1 / 0x31637661), yuv420p(left), 1920x1080 [SAR 1:1 DAR 16:9], q=-1--1, 30 fps, 15360 tbn, 30 tbc (default)
    Metadata:
      encoder         : Lavc57.107.100 libx264
    Side data:
      cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
    Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, delay 1024, 128 kb/s (default)
    Metadata:
      handler_name    : SoundHandler
      encoder         : Lavc57.107.100 aac
[Parsed_overlay_19 @ 000001df2b817d40] [framesync @ 000001df2b816488] Sync level 1speed=0.78x
Past duration 0.800774 too large
*** 1 dup!5 fps= 24 q=29.0 size=   21760kB time=00:01:46.86 bitrate=1668.0kbits/s speed=0.78x
    Last message repeated 1 times
*** 1 dup!8 fps= 24 q=29.0 size=   21760kB time=00:01:47.30 bitrate=1661.3kbits/s dup=2 drop=0 speed=0.78x
    Last message repeated 2 times
*** 1 dup!3 fps= 24 q=29.0 size=   21760kB time=00:01:47.80 bitrate=1653.6kbits/s dup=5 drop=0 speed=0.781x
    Last message repeated 1 times

...

    Last message repeated 2 times
*** 1 dup!3 fps= 25 q=29.0 size=   27392kB time=00:05:05.80 bitrate= 733.8kbits/s dup=995 drop=0 speed=0.832x
    Last message repeated 1 times
*** 1 dup!6 fps= 25 q=29.0 size=   27392kB time=00:05:06.23 bitrate= 732.8kbits/s dup=997 drop=0 speed=0.832x
    Last message repeated 1 times
*** 1 dup!0 fps= 25 q=29.0 size=   27392kB time=00:05:06.70 bitrate= 731.6kbits/s dup=999 drop=0 speed=0.832x
    Last message repeated 1 times
More than 1000 frames duplicated
*** 1 dup!3 fps= 25 q=29.0 size=   27392kB time=00:05:07.13 bitrate= 730.6kbits/s dup=1001 drop=0 speed=0.832x
    Last message repeated 2 times
*** 1 dup!8 fps= 25 q=29.0 size=   27392kB time=00:05:07.63 bitrate= 729.4kbits/s dup=1004 drop=0 speed=0.832x
    Last message repeated 1 times
... 
*** 1 dup!9 fps= 25 q=29.0 size=   27904kB time=00:05:18.66 bitrate= 717.3kbits/s dup=1059 drop=0 speed=0.834x
    Last message repeated 1 times
*** 1 dup!3 fps= 25 q=29.0 size=   27904kB time=00:05:19.13 bitrate= 716.3kbits/s dup=1061 drop=0 speed=0.834x
frame= 9635 fps= 25 q=-1.0 Lsize=   28364kB time=00:05:21.06 bitrate= 723.7kbits/s dup=1062 drop=0 speed=0.837x
video:26539kB audio:1637kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.669889%
Input file #0 (Input.mp4):
  Input stream #0:0 (video): 3260 packets read (23006797 bytes); 3260 frames decoded;
  Input stream #0:1 (audio): 4025 packets read (1682285 bytes); 4025 frames decoded (4636800 samples);
  Total: 7285 packets (24689082 bytes) demuxed
Output file #0 (Output.mp4):
  Output stream #0:0 (video): 9635 frames encoded; 9635 packets muxed (27175566 bytes);
  Output stream #0:1 (audio): 4529 frames encoded (4636800 samples); 4530 packets muxed (1676214 bytes);
  Total: 14165 packets (28851780 bytes) muxed
[libx264 @ 000001df2b62cd40] frame I:39    Avg QP:16.49  size:213954
[libx264 @ 000001df2b62cd40] frame P:2446  Avg QP:17.77  size:  6277
[libx264 @ 000001df2b62cd40] frame B:7150  Avg QP:30.38  size:   486
[libx264 @ 000001df2b62cd40] consecutive B-frames:  0.8%  0.8%  0.1% 98.3%
[libx264 @ 000001df2b62cd40] mb I  I16..4: 13.2% 43.0% 43.8%
[libx264 @ 000001df2b62cd40] mb P  I16..4:  0.2%  0.2%  0.1%  P16..4:  7.1%  3.3%  1.7%  0.0%  0.0%    skip:87.5%
[libx264 @ 000001df2b62cd40] mb B  I16..4:  0.0%  0.0%  0.0%  B16..8:  4.3%  0.1%  0.0%  direct: 0.0%  skip:95.5%  L0:41.9% L1:56.1% BI: 2.0%
[libx264 @ 000001df2b62cd40] 8x8 transform intra:44.1% inter:65.8%
[libx264 @ 000001df2b62cd40] coded y,uvDC,uvAC intra: 71.3% 83.5% 52.5% inter: 1.1% 1.5% 0.0%
[libx264 @ 000001df2b62cd40] i16 v,h,dc,p: 31% 28%  4% 38%
[libx264 @ 000001df2b62cd40] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 28% 20% 10%  6%  6%  7%  7%  8%  8%
[libx264 @ 000001df2b62cd40] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 33% 28%  7%  4%  6%  6%  6%  5%  5%
[libx264 @ 000001df2b62cd40] i8c dc,h,v,p: 34% 27% 28% 11%
[libx264 @ 000001df2b62cd40] Weighted P-Frames: Y:0.0% UV:0.0%
[libx264 @ 000001df2b62cd40] ref P L0: 71.1% 14.9% 10.6%  3.4%
[libx264 @ 000001df2b62cd40] ref B L0: 93.3%  5.9%  0.8%
[libx264 @ 000001df2b62cd40] ref B L1: 97.8%  2.2%
[libx264 @ 000001df2b62cd40] kb/s:676.90
[aac @ 000001df2b6a5620] Qavg: 1165.766
Exiting normally, received signal 2.
Terminate batch job (Y/N)? y
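
    A possible workaround, sketched here with illustrative values and not part of the original question: the nullsrc sources are unbounded generators (the log shows them with rate 25/1 and duration:-1), so the filter graph keeps producing frames after Input.mp4 ends, which is what the growing dup counter reflects. Two ways to bound the graph (the 105 s duration is an assumption based on the stated ~01:45 input):

    # (a) give each generated source a finite duration
    nullsrc=size=1920x1080:duration=105, drawbox=x=(3-1)*320:y=(3-1)*180:w=320:h=180:t=7:c=cyan, ...

    # (b) or make the final overlay stop with its shortest input
    [ovr1][ovr2] overlay=0:0[boxes]; \
    [0:v][boxes] overlay=0:0:shortest=1" \
    "Output.mp4"

    Both duration (on nullsrc) and shortest (on overlay) are standard options. Separately, nullsrc is documented as returning unprocessed frames rather than a guaranteed transparent canvas, so the transparency assumption may be worth checking as well.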


    


  • FFMPEG Debug "Invalid data found when processing input"

    29 September 2020, by dgcipp

    How can I debug this error? I am trying to display a VP8-encoded video stream (over TCP) from Android (x86) on my PC, but I get this message from ffplay: "tcp://10.8.0.3:49152?listen: Invalid data found when processing input"

    


    With H.264 it works, and the same code on an Android device works with VP8 too.

    


    Command on the server:

    


    ffplay -i tcp://10.8.0.3:49152?listen


    


    I have tried adding verbosity (log-level debug), but it shows the same message. How can I get a more detailed error?
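
    A few ways to get a more detailed error, sketched here with the URL from the question reused as-is (the report file name is chosen by ffplay automatically):

    # raise the log level beyond debug
    ffplay -v trace -i "tcp://10.8.0.3:49152?listen"

    # or dump the full log to a file named ffplay-YYYYMMDD-HHMMSS.log
    ffplay -report -i "tcp://10.8.0.3:49152?listen"

    # probe the same stream with ffprobe to see what format detection finds
    ffprobe -v trace "tcp://10.8.0.3:49152?listen"

    If even the trace output only shows probing failing, one guess (not confirmed by anything shown here) is that the raw VP8 stream carries no container that ffplay can detect, unlike the H.264 case.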

    



    


  • "invalid argument" for av_buffersrc_write_frame / av_buffersrc_add_frame

    1 February 2017, by TheSHEEEP

    I am trying to use the FFmpeg C API to create a filter that merges two audio streams, following the code from here: Implementing a multiple input filter graph with the Libavfilter library in Android NDK

    Everything seems to be alright.

    However, as soon as I call av_buffersrc_write_frame (or av_buffersrc_add_frame or av_buffersrc_add_frame_flags; it doesn't matter), FFmpeg just reports "invalid argument" and nothing else: an utterly useless error message, as it could mean anything.
    Which argument is invalid? What is wrong with it? Nobody knows.

    I am initializing the graph and "grabbing" the contexts of the buffer sources for later use like this:

    // Alloc filter graph
    *filter_graph = avfilter_graph_alloc();
    if ((*filter_graph) == NULL) {
       os::log("Error: Cannot allocate filter graph.");
       return AVERROR(ENOMEM);
    }

    // Building the filter string, omitted

    int result = avfilter_graph_parse2(*filter_graph, filterString.c_str(), &gis, &gos, NULL);
    if (result < 0)
    {
       char errorBuf[1024];
       av_make_error_string(errorBuf, 1024, result);
       log("Error: Parsing filter string: %s", errorBuf);
       return AVERROR_EXIT;
    }

    // Configure the graph
    result = avfilter_graph_config(*filter_graph, NULL);
    if (result < 0)
    {
       char errorBuf[1024];
       av_make_error_string(errorBuf, 1024, result);
       log("Error: Configuring filter graph: %s", errorBuf);
       return AVERROR_EXIT;
    }

    // Get the buffer source and buffer sink contexts
    for (unsigned int i = 0; i < (*filter_graph)->nb_filters; ++i) {
       AVFilterContext* filterContext = (*filter_graph)->filters[i];

       // The first two filters should be the abuffers
       std::string name = filterContext->name;
       if (name.find("abuffer") != name.npos && i < 2) {
           inputs[i].buffer_source_context = filterContext;
       }

       // abuffersink is the one we need to get the converted frames from
       if (name.find("abuffersink") != name.npos) {
           *buffer_sink_context = filterContext;
       }
    }

    There are absolutely no errors in the initialization. At least FFmpeg has only this to say about it, which I think looks good:

    FFMPEG: [Parsed_abuffer_0 @ 0ddfe840] Setting 'time_base' to value '1/48000'
    FFMPEG: [Parsed_abuffer_0 @ 0ddfe840] Setting 'sample_rate' to value '48000'
    FFMPEG: [Parsed_abuffer_0 @ 0ddfe840] Setting 'sample_fmt' to value '1'
    FFMPEG: [Parsed_abuffer_0 @ 0ddfe840] Setting 'channel_layout' to value '3'
    FFMPEG: [Parsed_abuffer_0 @ 0ddfe840] Setting 'channels' to value '2'
    FFMPEG: [Parsed_abuffer_0 @ 0ddfe840] tb:1/48000 samplefmt:s16 samplerate:48000 chlayout:3
    FFMPEG: [Parsed_abuffer_1 @ 0ddfe500] Setting 'time_base' to value '1/44100'
    FFMPEG: [Parsed_abuffer_1 @ 0ddfe500] Setting 'sample_rate' to value '44100'
    FFMPEG: [Parsed_abuffer_1 @ 0ddfe500] Setting 'sample_fmt' to value '1'
    FFMPEG: [Parsed_abuffer_1 @ 0ddfe500] Setting 'channel_layout' to value '3'
    FFMPEG: [Parsed_abuffer_1 @ 0ddfe500] Setting 'channels' to value '2'
    FFMPEG: [Parsed_abuffer_1 @ 0ddfe500] tb:1/44100 samplefmt:s16 samplerate:44100 chlayout:3
    FFMPEG: [Parsed_volume_3 @ 0ddfe580] Setting 'volume' to value '2'
    FFMPEG: [Parsed_aresample_4 @ 0ddfe660] Setting 'sample_rate' to value '48000'
    FFMPEG: [Parsed_aformat_5 @ 0ddfe940] Setting 'sample_fmts' to value 'fltp'
    FFMPEG: [Parsed_aformat_5 @ 0ddfe940] Setting 'channel_layouts' to value '3'

    Then, I am trying to add a frame (that has been decoded beforehand) like this:

    // "buffer_source_context" is one of the "inputs[i].buffer_source_context" from the code above
    int result = av_buffersrc_write_frame(buffer_source_context, input_frame);
    if (result < 0) {
       char errorBuf[1024];
       av_make_error_string(errorBuf, 1024, result);
       log("Error: While adding to buffer source: %s", errorBuf);
       return AVERROR_EXIT;
    }

    And the result is the mentioned "invalid argument".

    The buffer_source_context is one of the contexts grabbed in the code above, and the input_frame is perfectly fine as well.
    Before the filtering code was added, the same frame was passed to an encoder instead without a problem.

    I am at a loss as to what the error could be here. I log FFmpeg messages at the lowest possible level, but not a single error is shown. I am using FFmpeg 3.1.1.
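
    A diagnostic sketch rather than a known fix: av_buffersrc_write_frame() returns AVERROR(EINVAL) when, among other things, the frame's audio parameters do not match those the abuffer was created with (here s16 at 48000 Hz / 44100 Hz with channel_layout 3, per the log above). Dumping the frame right before the call makes such a mismatch visible; buffer_source_context, input_frame and log() are the names used in the code above, while the dump helper itself is an added assumption:

    #include <libavfilter/buffersrc.h>   // av_buffersrc_write_frame
    #include <libavutil/frame.h>         // AVFrame
    #include <libavutil/log.h>           // av_log
    #include <libavutil/samplefmt.h>     // av_get_sample_fmt_name
    #include <libavutil/error.h>         // av_make_error_string
    // (wrap the FFmpeg headers in extern "C" when compiling as C++)

    // Print the frame exactly as the buffer source will see it.
    static void dump_audio_frame(const AVFrame* frame)
    {
       av_log(NULL, AV_LOG_INFO,
              "frame: fmt=%s rate=%d layout=0x%llx nb_samples=%d pts=%lld\n",
              av_get_sample_fmt_name((enum AVSampleFormat)frame->format),
              frame->sample_rate,
              (unsigned long long)frame->channel_layout,
              frame->nb_samples,
              (long long)frame->pts);
    }

    // ... right before handing the frame to the filter graph:
    dump_audio_frame(input_frame);
    int result = av_buffersrc_write_frame(buffer_source_context, input_frame);
    if (result < 0) {
       char errorBuf[1024];
       av_make_error_string(errorBuf, 1024, result);
       log("Error: While adding to buffer source: %s", errorBuf);
    }

    One field worth checking in particular (still an assumption): decoders sometimes leave channel_layout at 0, which would not match the layout 3 the abuffer filters were configured with and could by itself explain the "invalid argument" result.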