Advanced search

Media (0)


No media matching your criteria is available on the site.

Other articles (52)

  • The plugin: Podcasts.

    14 July 2010, by

    The podcasting problem is, once again, one that reveals the state of standardisation of data transport on the Internet.
    Two interesting formats exist: the one developed by Apple, strongly geared towards the use of iTunes, whose SPEC is here; and the "Media RSS Module" format, which is more "open" and is notably backed by Yahoo and the Miro software.
    File types supported in the feeds
    Apple's format only allows the following formats in its feeds: .mp3 audio/mpeg .m4a audio/x-m4a .mp4 (...)

  • Authorisations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to modify their own information on the authors page

  • Installation prerequisites

    31 January 2010, by

    Preamble
    This article is not intended to detail how to install these programs, but rather to give information about their specific configuration.
    First of all, SPIPMotion, like MediaSPIP, is designed to run on Debian-type Linux distributions or derivatives (Ubuntu...). The documentation on this site therefore refers to these distributions. It is also possible to use it on other Linux distributions, but no guarantee of correct operation can be given.
    It (...)

On other sites (6652)

  • FFMPEG atempo introducing phasing for multichannel mono audio tracks

    27 September 2021, by BrainNoWerk

    Is this a bug, or expected behaviour? When converting material from PAL to NTSC, I invoke atempo as follows:

    


    -map 0:a:? -af atempo=24000/25025 ^
-c:a pcm_s24le


    


    I use this in a Windows batch file (hence the caret) as a catch-all for all files that need to be converted, so I don't have to deal with how many audio channels might be present or in what order.

    


    However, when my input was a broadcast MXF with 10-channel mono audio (one mono track per stream), it introduced wild phasing between the tracks.

    


    Merging the tracks into a single stream to be processed by atempo resulted in no phasing.

    


    -filter_complex "[0:a:0][0:a:1][0:a:2][0:a:3][0:a:4][0:a:5][0:a:6][0:a:7][0:a:8][0:a:9] amerge=inputs=10, atempo=24000/25025[FRC]" ^
-map "[FRC]" -c:a pcm_s24le


    


    Is this expected behaviour? I can't see any documentation detailing the need to use amerge before invoking atempo.

    


    If this step is indeed necessary, is there a way to "wildcard" the amerge operation so that I don't have to manually enter all the audio channel pads and the "inputs=" count? That would let me make the command more universal.
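
    For illustration, a rough, untested sketch of one way this could be made more generic (file names are placeholders): ffprobe can report how many audio streams the input has, and unlabeled filter inputs in -filter_complex are connected to the first unused streams of the matching type, so only the inputs= count would need substituting.

    rem Untested sketch: list the audio streams (one line per stream) to get the count
    ffprobe -v error -select_streams a -show_entries stream=index -of csv=p=0 "input.mov"

    rem With unlabeled filter inputs, the [0:a:N] pads can be dropped; only inputs= changes
    ffmpeg -i "input.mov" -filter_complex "amerge=inputs=10, atempo=24000/25025[FRC]" ^
    -map "[FRC]" -c:a pcm_s24le "output.mxf"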

    


    This is my first question on Stack Overflow, so please be gentle. I've found answers to so many of my FFmpeg questions here in the past, but this seems to be an edge case I can't get much detail on.

    


    Thanks!

    


    EDIT:

    


    This is the output using the wildcard, which produces phasing:

    


    C:\Windows>ffmpeg -ss 00:05:13.0 -r 24000/1001 -i "\\bdfs11\array21\Eps101_1920x1080_20_51_DV_CC_25fps_20210622.mov" -t 00:00:22.0 -map 0:v:0 -c:v mpeg2video -profile:v 0 -level:v 2 -b:v 50000k -minrate 50000k -maxrate 50000k -pix_fmt yuv422p -vtag xd5d -force_key_frames "expr:gte(t,n_forced*1)" -streamid 0:481 -streamid 1:129 -map 0:a:? -af atempo=24000/25025 -c:a pcm_s24le "R:\2_SERIES\%~n1_25to23976_works.%Container%" -y
ffmpeg version N-94566-gddd92ba2c6 Copyright (c) 2000-2019 the FFmpeg developers
      built with gcc 9.1.1 (GCC) 20190807
      configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libdav1d --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa --enable-libspeex --enable-libxvid --enable-libaom --enable-libmfx --enable-amf --enable-ffnvcodec --enable-cuvid --enable-d3d11va --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth --enable-libopenmpt
      libavutil      56. 33.100 / 56. 33.100
      libavcodec     58. 55.100 / 58. 55.100
      libavformat    58. 30.100 / 58. 30.100
      libavdevice    58.  9.100 / 58.  9.100
      libavfilter     7. 58.100 /  7. 58.100
      libswscale      5.  6.100 /  5.  6.100
      libswresample   3.  6.100 /  3.  6.100
      libpostproc    55.  6.100 / 55.  6.100
    [mov,mp4,m4a,3gp,3g2,mj2 @ 06ea4cc0] Could not find codec parameters for stream 12 (Subtitle: none (c708 / 0x38303763), 1920x1080, 21 kb/s): unknown codec
    Consider increasing the value for the 'analyzeduration' and 'probesize' options
    Guessed Channel Layout for Input Stream #0.2 : mono
    Guessed Channel Layout for Input Stream #0.3 : mono
    Guessed Channel Layout for Input Stream #0.4 : mono
    Guessed Channel Layout for Input Stream #0.5 : mono
    Guessed Channel Layout for Input Stream #0.6 : mono
    Guessed Channel Layout for Input Stream #0.7 : mono
    Guessed Channel Layout for Input Stream #0.8 : mono
    Guessed Channel Layout for Input Stream #0.9 : mono
    Guessed Channel Layout for Input Stream #0.10 : mono
    Guessed Channel Layout for Input Stream #0.11 : mono
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '\\bdfs11\array21\Eps101_1920x1080_20_51_DV_CC_25fps_20210622.mov':
      Metadata:
        major_brand     : qt
        minor_version   : 537199360
        compatible_brands: qt
        creation_time   : 2021-06-22T17:39:50.000000Z
      Duration: 00:59:08.16, start: 0.000000, bitrate: 217983 kb/s
        Stream #0:0(eng): Video: prores (HQ) (apch / 0x68637061), yuv422p10le(tv, bt709, progressive), 1920x1080, 206438 kb/s, SAR 1:1 DAR 16:9, 25 fps, 25 tbr, 25 tbn, 25 tbc (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Video Media Handler
          encoder         : Apple ProRes 422 HQ
          timecode        : 00:59:59:00
        Stream #0:1(eng): Data: none (tmcd / 0x64636D74) (default)
        Metadata:
          rotate          : 0
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Time Code Media Handler
          reel_name       : untitled
          timecode        : 00:59:59:00
        Stream #0:2(eng): Audio: pcm_s24le (lpcm / 0x6D63706C), 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Sound Media Handler
        Stream #0:3(eng): Audio: pcm_s24le (lpcm / 0x6D63706C), 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Sound Media Handler
        Stream #0:4(eng): Audio: pcm_s24le (lpcm / 0x6D63706C), 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Sound Media Handler
        Stream #0:5(eng): Audio: pcm_s24le (lpcm / 0x6D63706C), 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Sound Media Handler
        Stream #0:6(eng): Audio: pcm_s24le (lpcm / 0x6D63706C), 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Sound Media Handler
        Stream #0:7(eng): Audio: pcm_s24le (lpcm / 0x6D63706C), 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Sound Media Handler
        Stream #0:8(eng): Audio: pcm_s24le (lpcm / 0x6D63706C), 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Sound Media Handler
        Stream #0:9(eng): Audio: pcm_s24le (lpcm / 0x6D63706C), 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Sound Media Handler
        Stream #0:10(eng): Audio: pcm_s24le (lpcm / 0x6D63706C), 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Sound Media Handler
        Stream #0:11(eng): Audio: pcm_s24le (lpcm / 0x6D63706C), 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Sound Media Handler
        Stream #0:12(eng): Subtitle: none (c708 / 0x38303763), 1920x1080, 21 kb/s (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Closed Caption Media Handler
    Stream mapping:
      Stream #0:0 -> #0:0 (prores (native) -> mpeg2video (native))
      Stream #0:2 -> #0:1 (pcm_s24le (native) -> pcm_s24le (native))
      Stream #0:3 -> #0:2 (pcm_s24le (native) -> pcm_s24le (native))
      Stream #0:4 -> #0:3 (pcm_s24le (native) -> pcm_s24le (native))
      Stream #0:5 -> #0:4 (pcm_s24le (native) -> pcm_s24le (native))
      Stream #0:6 -> #0:5 (pcm_s24le (native) -> pcm_s24le (native))
      Stream #0:7 -> #0:6 (pcm_s24le (native) -> pcm_s24le (native))
      Stream #0:8 -> #0:7 (pcm_s24le (native) -> pcm_s24le (native))
      Stream #0:9 -> #0:8 (pcm_s24le (native) -> pcm_s24le (native))
      Stream #0:10 -> #0:9 (pcm_s24le (native) -> pcm_s24le (native))
      Stream #0:11 -> #0:10 (pcm_s24le (native) -> pcm_s24le (native))
    Press [q] to stop, [?] for help
    [mpeg2video @ 06f8aa40] Automatically choosing VBV buffer size of 746 kbyte
    Output #0, mxf, to 'R:\2_SERIES\Eps101_1920x1080_20_51_DV_CC_25fps_20210622_25to23976_works.mxf':
      Metadata:
        major_brand     : qt
        minor_version   : 537199360
        compatible_brands: qt
        encoder         : Lavf58.30.100
        Stream #0:0(eng): Video: mpeg2video (4:2:2) (xd5d / 0x64356478), yuv422p, 1920x1080 [SAR 1:1 DAR 16:9], q=2-31, 50000 kb/s, 23.98 fps, 23.98 tbn, 23.98 tbc (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Video Media Handler
          timecode        : 00:59:59:00
          encoder         : Lavc58.55.100 mpeg2video
        Side data:
          cpb: bitrate max/min/avg: 50000000/50000000/50000000 buffer size: 6111232 vbv_delay: 18446744073709551615
        Stream #0:1(eng): Audio: pcm_s24le, 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Sound Media Handler
          encoder         : Lavc58.55.100 pcm_s24le
        Stream #0:2(eng): Audio: pcm_s24le, 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Sound Media Handler
          encoder         : Lavc58.55.100 pcm_s24le
        Stream #0:3(eng): Audio: pcm_s24le, 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Sound Media Handler
          encoder         : Lavc58.55.100 pcm_s24le
        Stream #0:4(eng): Audio: pcm_s24le, 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Sound Media Handler
          encoder         : Lavc58.55.100 pcm_s24le
        Stream #0:5(eng): Audio: pcm_s24le, 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Sound Media Handler
          encoder         : Lavc58.55.100 pcm_s24le
        Stream #0:6(eng): Audio: pcm_s24le, 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Sound Media Handler
          encoder         : Lavc58.55.100 pcm_s24le
        Stream #0:7(eng): Audio: pcm_s24le, 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Sound Media Handler
          encoder         : Lavc58.55.100 pcm_s24le
        Stream #0:8(eng): Audio: pcm_s24le, 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Sound Media Handler
          encoder         : Lavc58.55.100 pcm_s24le
        Stream #0:9(eng): Audio: pcm_s24le, 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Sound Media Handler
          encoder         : Lavc58.55.100 pcm_s24le
        Stream #0:10(eng): Audio: pcm_s24le, 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
        Metadata:
          creation_time   : 2021-06-22T17:39:50.000000Z
          handler_name    : Apple Sound Media Handler
          encoder         : Lavc58.55.100 pcm_s24le
    frame=  527 fps= 52 q=2.0 Lsize=  166106kB time=00:00:22.00 bitrate=61851.7kbits/s speed=2.19x
    video:133971kB audio:30938kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.726204%


    


    This is the output that produces no phasing:

    


    C:\Windows>ffmpeg -ss 00:05:13.0 -r 24000/1001 -i "\\bdfs11\array21\Eps101_1920x1080_20_51_DV_CC_25fps_20210622.mov" -t 00:00:22.0 -map 0:v:0 -c:v mpeg2video -profile:v 0 -level:v 2 -b:v 50000k -minrate 50000k -maxrate 50000k -pix_fmt yuv422p -vtag xd5d -force_key_frames "expr:gte(t,n_forced*1)" -streamid 0:481 -streamid 1:129 -filter_complex "[0:a:0][0:a:1][0:a:2][0:a:3][0:a:4][0:a:5][0:a:6][0:a:7][0:a:8][0:a:9] amerge=inputs=10, atempo=24000/25025[FRC]" -map "[FRC]" -c:a pcm_s24le "R:\2_SERIES\Eps101_1920x1080_20_51_DV_CC_25fps_20210622_25to23976_works.mxf" -y
ffmpeg version N-94566-gddd92ba2c6 Copyright (c) 2000-2019 the FFmpeg developers
  built with gcc 9.1.1 (GCC) 20190807
  configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libdav1d --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa --enable-libspeex --enable-libxvid --enable-libaom --enable-libmfx --enable-amf --enable-ffnvcodec --enable-cuvid --enable-d3d11va --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth --enable-libopenmpt
  libavutil      56. 33.100 / 56. 33.100
  libavcodec     58. 55.100 / 58. 55.100
  libavformat    58. 30.100 / 58. 30.100
  libavdevice    58.  9.100 / 58.  9.100
  libavfilter     7. 58.100 /  7. 58.100
  libswscale      5.  6.100 /  5.  6.100
  libswresample   3.  6.100 /  3.  6.100
  libpostproc    55.  6.100 / 55.  6.100
[mov,mp4,m4a,3gp,3g2,mj2 @ 064f5580] Could not find codec parameters for stream 12 (Subtitle: none (c708 / 0x38303763), 1920x1080, 21 kb/s): unknown codec
Consider increasing the value for the 'analyzeduration' and 'probesize' options
Guessed Channel Layout for Input Stream #0.2 : mono
Guessed Channel Layout for Input Stream #0.3 : mono
Guessed Channel Layout for Input Stream #0.4 : mono
Guessed Channel Layout for Input Stream #0.5 : mono
Guessed Channel Layout for Input Stream #0.6 : mono
Guessed Channel Layout for Input Stream #0.7 : mono
Guessed Channel Layout for Input Stream #0.8 : mono
Guessed Channel Layout for Input Stream #0.9 : mono
Guessed Channel Layout for Input Stream #0.10 : mono
Guessed Channel Layout for Input Stream #0.11 : mono
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '\\bdfs11\array21\Eps101_1920x1080_20_51_DV_CC_25fps_20210622.mov':
  Metadata:
    major_brand     : qt
    minor_version   : 537199360
    compatible_brands: qt
    creation_time   : 2021-06-22T17:39:50.000000Z
  Duration: 00:59:08.16, start: 0.000000, bitrate: 217983 kb/s
    Stream #0:0(eng): Video: prores (HQ) (apch / 0x68637061), yuv422p10le(tv, bt709, progressive), 1920x1080, 206438 kb/s, SAR 1:1 DAR 16:9, 25 fps, 25 tbr, 25 tbn, 25 tbc (default)
    Metadata:
      creation_time   : 2021-06-22T17:39:50.000000Z
      handler_name    : Apple Video Media Handler
      encoder         : Apple ProRes 422 HQ
      timecode        : 00:59:59:00
    Stream #0:1(eng): Data: none (tmcd / 0x64636D74) (default)
    Metadata:
      rotate          : 0
      creation_time   : 2021-06-22T17:39:50.000000Z
      handler_name    : Time Code Media Handler
      reel_name       : untitled
      timecode        : 00:59:59:00
    Stream #0:2(eng): Audio: pcm_s24le (lpcm / 0x6D63706C), 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
    Metadata:
      creation_time   : 2021-06-22T17:39:50.000000Z
      handler_name    : Apple Sound Media Handler
    Stream #0:3(eng): Audio: pcm_s24le (lpcm / 0x6D63706C), 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
    Metadata:
      creation_time   : 2021-06-22T17:39:50.000000Z
      handler_name    : Apple Sound Media Handler
    Stream #0:4(eng): Audio: pcm_s24le (lpcm / 0x6D63706C), 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
    Metadata:
      creation_time   : 2021-06-22T17:39:50.000000Z
      handler_name    : Apple Sound Media Handler
    Stream #0:5(eng): Audio: pcm_s24le (lpcm / 0x6D63706C), 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
    Metadata:
      creation_time   : 2021-06-22T17:39:50.000000Z
      handler_name    : Apple Sound Media Handler
    Stream #0:6(eng): Audio: pcm_s24le (lpcm / 0x6D63706C), 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
    Metadata:
      creation_time   : 2021-06-22T17:39:50.000000Z
      handler_name    : Apple Sound Media Handler
    Stream #0:7(eng): Audio: pcm_s24le (lpcm / 0x6D63706C), 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
    Metadata:
      creation_time   : 2021-06-22T17:39:50.000000Z
      handler_name    : Apple Sound Media Handler
    Stream #0:8(eng): Audio: pcm_s24le (lpcm / 0x6D63706C), 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
    Metadata:
      creation_time   : 2021-06-22T17:39:50.000000Z
      handler_name    : Apple Sound Media Handler
    Stream #0:9(eng): Audio: pcm_s24le (lpcm / 0x6D63706C), 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
    Metadata:
      creation_time   : 2021-06-22T17:39:50.000000Z
      handler_name    : Apple Sound Media Handler
    Stream #0:10(eng): Audio: pcm_s24le (lpcm / 0x6D63706C), 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
    Metadata:
      creation_time   : 2021-06-22T17:39:50.000000Z
      handler_name    : Apple Sound Media Handler
    Stream #0:11(eng): Audio: pcm_s24le (lpcm / 0x6D63706C), 48000 Hz, mono, s32 (24 bit), 1152 kb/s (default)
    Metadata:
      creation_time   : 2021-06-22T17:39:50.000000Z
      handler_name    : Apple Sound Media Handler
    Stream #0:12(eng): Subtitle: none (c708 / 0x38303763), 1920x1080, 21 kb/s (default)
    Metadata:
      creation_time   : 2021-06-22T17:39:50.000000Z
      handler_name    : Apple Closed Caption Media Handler
Stream mapping:
  Stream #0:2 (pcm_s24le) -> amerge:in0 (graph 0)
  Stream #0:3 (pcm_s24le) -> amerge:in1 (graph 0)
  Stream #0:4 (pcm_s24le) -> amerge:in2 (graph 0)
  Stream #0:5 (pcm_s24le) -> amerge:in3 (graph 0)
  Stream #0:6 (pcm_s24le) -> amerge:in4 (graph 0)
  Stream #0:7 (pcm_s24le) -> amerge:in5 (graph 0)
  Stream #0:8 (pcm_s24le) -> amerge:in6 (graph 0)
  Stream #0:9 (pcm_s24le) -> amerge:in7 (graph 0)
  Stream #0:10 (pcm_s24le) -> amerge:in8 (graph 0)
  Stream #0:11 (pcm_s24le) -> amerge:in9 (graph 0)
  Stream #0:0 -> #0:0 (prores (native) -> mpeg2video (native))
  atempo (graph 0) -> Stream #0:1 (pcm_s24le)
Press [q] to stop, [?] for help
[Parsed_amerge_0 @ 06e18dc0] No channel layout for input 1
[Parsed_amerge_0 @ 06e18dc0] Input channel layouts overlap: output layout will be determined by the number of distinct input channels
[mpeg2video @ 06dea000] Automatically choosing VBV buffer size of 746 kbyte
Output #0, mxf, to 'R:\2_SERIES\Eps101_1920x1080_20_51_DV_CC_25fps_20210622_25to23976_works.mxf':
  Metadata:
    major_brand     : qt
    minor_version   : 537199360
    compatible_brands: qt
    encoder         : Lavf58.30.100
    Stream #0:0(eng): Video: mpeg2video (4:2:2) (xd5d / 0x64356478), yuv422p, 1920x1080 [SAR 1:1 DAR 16:9], q=2-31, 50000 kb/s, 23.98 fps, 23.98 tbn, 23.98 tbc (default)
    Metadata:
      creation_time   : 2021-06-22T17:39:50.000000Z
      handler_name    : Apple Video Media Handler
      timecode        : 00:59:59:00
      encoder         : Lavc58.55.100 mpeg2video
    Side data:
      cpb: bitrate max/min/avg: 50000000/50000000/50000000 buffer size: 6111232 vbv_delay: 18446744073709551615
    Stream #0:1: Audio: pcm_s24le, 48000 Hz, 10 channels (FL+FR+FC+LFE+BL+BR+FLC+FRC+BC+SL), s32, 11520 kb/s (default)
    Metadata:
      encoder         : Lavc58.55.100 pcm_s24le
frame=  527 fps= 61 q=2.0 Lsize=  165571kB time=00:00:22.00 bitrate=61652.6kbits/s speed=2.56x
video:133971kB audio:30938kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.402084%


    


    Let me know if you need more detail than what I've provided.

    


    Tangential questions related to this job, potentially not worth their own threads, even though I've looked extensively and not found the answers (happy to post them individually if necessary):

    


      

    1. I can't seem to split any portion of the filter_complex above with a caret (^) within a Windows batch file (no number of spaces before or after resolves this). It breaks the chain, and the filter graphs complain of no input. One possible workaround is sketched after this list.

    2. Is FFMBC still the only way to include broadcast closed captioning, or does this functionality simply not exist within FFmpeg?
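
    One possible workaround for the first question (untested here): since a caret inside a quoted string is taken literally by cmd, the quoted -filter_complex argument itself can't be split, but the filtergraph can be assembled in an environment variable over several set lines and expanded once:

    rem Untested sketch: build the filtergraph across lines, then expand it in one argument
    set "FRC_FILTER=[0:a:0][0:a:1]amerge=inputs=2,"
    set "FRC_FILTER=%FRC_FILTER% atempo=24000/25025[FRC]"
    ffmpeg -i "%~1" -filter_complex "%FRC_FILTER%" -map "[FRC]" -c:a pcm_s24le "out.mxf"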


    


  • record mediasoup RTP stream using FFmpeg for Firefox

    30 July 2024, by Hadi Aghandeh

    I am trying to record a WebRTC stream using mediasoup. I can record successfully on Chrome and on Safari 13/14/15. However, on Firefox it does not work.

    


    The client-side code is a Vue.js component which gets the rtp-capabilities using socket.io and creates producers after the server creates the transports. This works well on Chrome and Safari.

    


    const { connect , createLocalTracks } = require('twilio-video');
const SocketClient = require("socket.io-client");
const SocketPromise = require("socket.io-promise").default;
const MediasoupClient = require("mediasoup-client");

export default {
    data() {
        return {
            errors: [],
            isReady: false,
            isRecording: false,
            loading: false,
            sapio: {
                token: null,
                connectionId: 0
            },
            server: {
                host: 'https://rtc.test',
                ws: '/server',
                socket: null,
            },
            peer: {},
        }
    },
    mounted() {
        this.init();
    },
    methods: {
        async init() {
            await this.startCamera();

            if (this.takeId) {
                await this.recordBySapioServer();
            }
        },
        startCamera() {
            return new Promise( (resolve, reject) => {
                if (window.videoMediaStreamObject) {
                    this.setVideoElementStream(window.videoMediaStreamObject);
                    resolve();
                } else {
                    // Get user media as required
                    try {
                        this.localeStream = navigator.mediaDevices.getUserMedia({
                            audio: true,
                            video: true,
                        }).then((stream) => {
                            this.setVideoElementStream(stream);
                            resolve();
                        })
                    } catch (err) {
                        console.error(err);
                        reject();
                    }
                }
            })
        },
        setVideoElementStream(stream) {
            this.localStream = stream;
            this.$refs.video.srcObject = stream;
            this.$refs.video.muted = true;
            this.$refs.video.play().then((video) => {
                this.isStreaming = true;
                this.height = this.$refs.video.videoHeight;
                this.width = this.$refs.video.videoWidth;
            });
        },
        // first thing we need is connecting to websocket
        connectToSocket() {
            const serverUrl = this.server.host;
            console.log("Connect with sapio rtc server:", serverUrl);

            const socket = SocketClient(serverUrl, {
                path:  this.server.ws,
                transports: ["websocket"],
            });
            this.socket = socket;

            socket.on("connect", () => {
                console.log("WebSocket connected");
                // we ask for rtp-capabilities from server to send to us
                socket.emit('send-rtp-capabilities');
            });

            socket.on("error", (err) => {
                this.loading = true;
                console.error("WebSocket error:", err);
            });

            socket.on("router-rtp-capabilities", async (msg) => {
                const { routerRtpCapabilities, sessionId, externalId } = msg;
                console.log('[rtpCapabilities:%o]', routerRtpCapabilities);
                this.routerRtpCapabilities = routerRtpCapabilities;

                try {
                    const device = new MediasoupClient.Device();
                    // Load the mediasoup device with the router rtp capabilities gotten from the server
                    await device.load({ routerRtpCapabilities });

                    this.peer.sessionId = sessionId;
                    this.peer.externalId = externalId;
                    this.peer.device = device;

                    this.createTransport();
                } catch (error) {
                    console.error('failed to init device [error:%o]', error);
                    socket.disconnect();
                }
            });

            socket.on("create-transport", async (msg) => {
                console.log('handleCreateTransportRequest() [data:%o]', msg);

                try {
                    // Create the local mediasoup send transport
                    this.peer.sendTransport = await this.peer.device.createSendTransport(msg);
                    console.log('send transport created [id:%s]', this.peer.sendTransport.id);

                    // Set the transport listeners and get the users media stream
                    this.handleSendTransportListeners();
                    this.setTracks();
                    this.loading = false;
                } catch (error) {
                    console.error('failed to create transport [error:%o]', error);
                    socket.disconnect();
                }
            });

            socket.on("connect-transport", async (msg) => {
                console.log('handleTransportConnectRequest()');
                try {
                    const action = this.connectTransport;

                    if (!action) {
                        throw new Error('transport-connect action was not found');
                    }

                    await action(msg);
                } catch (error) {
                    console.error('failed [error:%o]', error);
                }
            });

            socket.on("produce", async (msg) => {
                console.log('handleProduceRequest()');
                try {
                    if (!this.produce) {
                        throw new Error('produce action was not found');
                    }
                    await this.produce(msg);
                } catch (error) {
                    console.error('failed [error:%o]', error);
                }
            });

            socket.on("recording", async (msg) => {
                this.isRecording = true;
            });

            socket.on("recording-error", async (msg) => {
                this.isRecording = false;
                console.error(msg);
            });

            socket.on("recording-closed", async (msg) => {
                this.isRecording = false;
                console.warn(msg)
            });

        },
        createTransport() {
            console.log('createTransport()');

            if (!this.peer || !this.peer.device.loaded) {
                throw new Error('Peer or device is not initialized');
            }

            // First we must create the mediasoup transport on the server side
            this.socket.emit('create-transport',{
                sessionId: this.peer.sessionId
            });
        },
        handleSendTransportListeners() {
            this.peer.sendTransport.on('connect', this.handleTransportConnectEvent);
            this.peer.sendTransport.on('produce', this.handleTransportProduceEvent);
            this.peer.sendTransport.on('connectionstatechange', connectionState => {
                console.log('send transport connection state change [state:%s]', connectionState);
            });
        },
        handleTransportConnectEvent({ dtlsParameters }, callback, errback) {
            console.log('handleTransportConnectEvent()');
            try {
                this.connectTransport = (msg) => {
                    console.log('connect-transport action');
                    callback();
                    this.connectTransport = null;
                };

                this.socket.emit('connect-transport',{
                    sessionId: this.peer.sessionId,
                    transportId: this.peer.sendTransport.id,
                    dtlsParameters
                });

            } catch (error) {
                console.error('handleTransportConnectEvent() failed [error:%o]', error);
                errback(error);
            }
        },
        handleTransportProduceEvent({ kind, rtpParameters }, callback, errback)  {
            console.log('handleTransportProduceEvent()');
            try {
                this.produce = jsonMessage => {
                    console.log('handleTransportProduceEvent callback [data:%o]', jsonMessage);
                    callback({ id: jsonMessage.id });
                    this.produce = null;
                };

                this.socket.emit('produce', {
                    sessionId: this.peer.sessionId,
                    transportId: this.peer.sendTransport.id,
                    kind,
                    rtpParameters
                });
            } catch (error) {
                console.error('handleTransportProduceEvent() failed [error:%o]', error);
                errback(error);
            }
        },
        async recordBySapioServer() {
            this.loading = true;
            this.connectToSocket();
        },
        async setTracks() {
            // Start mediasoup-client's WebRTC producers
            const audioTrack = this.localStream.getAudioTracks()[0];
            this.peer.audioProducer = await this.peer.sendTransport.produce({
                track: audioTrack,
                codecOptions :
                    {
                        opusStereo : 1,
                        opusDtx    : 1
                    }
            });


            let encodings;
            let codec;
            const codecOptions = {videoGoogleStartBitrate : 1000};

            codec = this.peer.device.rtpCapabilities.codecs.find((c) => c.kind.toLowerCase() === 'video');
            if (codec.mimeType.toLowerCase() === 'video/vp9') {
                encodings = { scalabilityMode: 'S3T3_KEY' };
            } else {
                encodings = [
                    { scaleResolutionDownBy: 4, maxBitrate: 500000 },
                    { scaleResolutionDownBy: 2, maxBitrate: 1000000 },
                    { scaleResolutionDownBy: 1, maxBitrate: 5000000 }
                ];
            }
            const videoTrack = this.localStream.getVideoTracks()[0];
            this.peer.videoProducer =await this.peer.sendTransport.produce({
                track: videoTrack,
                encodings,
                codecOptions,
                codec
            });

        },
        startRecording() {
            this.Q.answer.recordingId = this.peer.externalId;
            this.socket.emit("start-record", {
                sessionId: this.peer.sessionId
            });
        },
        stopRecording() {
            this.socket.emit("stop-record" , {
                sessionId: this.peer.sessionId
            });
        },
    },

}





    


    console.log output of my FFmpeg process:

    


    // sdp string
[sdpString:v=0
  o=- 0 0 IN IP4 127.0.0.1
  s=FFmpeg
  c=IN IP4 127.0.0.1
  t=0 0
  m=video 25549 RTP/AVP 101 
  a=rtpmap:101 VP8/90000
  a=sendonly
  m=audio 26934 RTP/AVP 100 
  a=rtpmap:100 opus/48000/2
  a=sendonly
  ]

// ffmpeg args
commandArgs:[
  '-loglevel',
  'debug',
  '-protocol_whitelist',
  'pipe,udp,rtp',
  '-fflags',
  '+genpts',
  '-f',
  'sdp',
  '-i',
  'pipe:0',
  '-map',
  '0:v:0',
  '-c:v',
  'copy',
  '-map',
  '0:a:0',
  '-strict',
  '-2',
  '-c:a',
  'copy',
  '-f',
  'webm',
  '-flags',
  '+global_header',
  '-y',
  'storage/recordings/26e63cb3-4f81-499e-941a-c0bb7f7f52ce.webm',
  [length]: 26
]
// ffmpeg log
ffmpeg::process::data [data:'ffmpeg version n4.4']
ffmpeg::process::data [data:' Copyright (c) 2000-2021 the FFmpeg developers']
ffmpeg::process::data [data:'\n']
ffmpeg::process::data [data:'  built with gcc 11.1.0 (GCC)\n']
ffmpeg::process::data [data:'  configuration: --prefix=/usr --disable-debug --disable-static --disable-stripping --enable-amf --enable-avisynth --enable-cuda-llvm --enable-lto --enable-fontconfig --enable-gmp --enable-gnutls --enable-gpl --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libdav1d --enable-libdrm --enable-libfreetype --enable-libfribidi --enable-libgsm --enable-libiec61883 --enable-libjack --enable-libmfx --enable-libmodplug --enable-libmp3lame --enable-libopencore_amrnb --enable-libopencore_amrwb --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librav1e --enable-librsvg --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libssh --enable-libsvtav1 --enable-libtheora --enable-libv4l2 --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxcb --enable-libxml2 --enable-libxvid --enable-libzimg --enable-nvdec --enable-nvenc --enable-shared --enable-version3\n']
ffmpeg::process::data [data:'  libavutil      56. 70.100 / 56. 70.100\n' +
  '  libavcodec     58.134.100 / 58.134.100\n' +
  '  libavformat    58. 76.100 / 58. 76.100\n' +
  '  libavdevice    58. 13.100 / 58. 13.100\n' +
  '  libavfilter     7.110.100 /  7.110.100\n' +
  '  libswscale      5.  9.100 /  5.  9.100\n' +
  '  libswresample   3.  9.100 /  3.  9.100\n' +
  '  libpostproc    55.  9.100 / 55.  9.100\n' +
  'Splitting the commandline.\n' +
  "Reading option '-loglevel' ... matched as option 'loglevel' (set logging level) with argument 'debug'.\n" +
  "Reading option '-protocol_whitelist' ..."]
ffmpeg::process::data [data:" matched as AVOption 'protocol_whitelist' with argument 'pipe,udp,rtp'.\n" +
  "Reading option '-fflags' ..."]
ffmpeg::process::data [data:" matched as AVOption 'fflags' with argument '+genpts'.\n" +
  "Reading option '-f' ... matched as option 'f' (force format) with argument 'sdp'.\n" +
  "Reading option '-i' ... matched as input url with argument 'pipe:0'.\n" +
  "Reading option '-map' ... matched as option 'map' (set input stream mapping) with argument '0:v:0'.\n" +
  "Reading option '-c:v' ... matched as option 'c' (codec name) with argument 'copy'.\n" +
  "Reading option '-map' ... matched as option 'map' (set input stream mapping) with argument '0:a:0'.\n" +
  "Reading option '-strict' ...Routing option strict to both codec and muxer layer\n" +
  " matched as AVOption 'strict' with argument '-2'.\n" +
  "Reading option '-c:a' ... matched as option 'c' (codec name) with argument 'copy'.\n" +
  "Reading option '-f' ... matched as option 'f' (force format) with argument 'webm'.\n" +
  "Reading option '-flags' ... matched as AVOption 'flags' with argument '+global_header'.\n" +
  "Reading option '-y' ... matched as option 'y' (overwrite output files) with argument '1'.\n" +
  "Reading option 'storage/recordings/26e63cb3-4f81-499e-941a-c0bb7f7f52ce.webm' ... matched as output url.\n" +
  'Finished splitting the commandline.\n' +
  'Parsing a group of options: global .\n' +
  'Applying option loglevel (set logging level) with argument debug.\n' +
  'Applying option y (overwrite output files) with argument 1.\n' +
  'Successfully parsed a group of options.\n' +
  'Parsing a group of options: input url pipe:0.\n' +
  'Applying option f (force format) with argument sdp.\n' +
  'Successfully parsed a group of options.\n' +
  'Opening an input file: pipe:0.\n' +
  "[sdp @ 0x55604dc58400] Opening 'pipe:0' for reading\n" +
  '[sdp @ 0x55604dc58400] video codec set to: vp8\n' +
  '[sdp @ 0x55604dc58400] audio codec set to: opus\n' +
  '[sdp @ 0x55604dc58400] audio samplerate set to: 48000\n' +
  '[sdp @ 0x55604dc58400] audio channels set to: 2\n' +
  '[udp @ 0x55604dc6c500] end receive buffer size reported is 425984\n' +
  '[udp @ 0x55604dc6c7c0] end receive buffer size reported is 425984\n' +
  '[sdp @ 0x55604dc58400] setting jitter buffer size to 500\n' +
  '[udp @ 0x55604dc6d900] end receive buffer size reported is 425984\n' +
  '[udp @ 0x55604dc6d2c0] end receive buffer size reported is 425984\n' +
  '[sdp @ 0x55604dc58400] setting jitter buffer size to 500\n']
ffmpeg::process::data [data:'[sdp @ 0x55604dc58400] Before avformat_find_stream_info() pos: 210 bytes read:210 seeks:0 nb_streams:2\n']
  **mediasoup:Consumer resume() +1s**
  **mediasoup:Channel request() [method:consumer.resume, id:12] +1s**
  **mediasoup:Channel request succeeded [method:consumer.resume, id:12] +0ms**
  **mediasoup:Consumer resume() +1ms**
  **mediasoup:Channel request() [method:consumer.resume, id:13] +0ms**
  **mediasoup:Channel request succeeded [method:consumer.resume, id:13] +0ms**
ffmpeg::process::data [data:'[sdp @ 0x55604dc58400] Could not find codec parameters for stream 0 (Video: vp8, 1 reference frame, yuv420p): unspecified size\n' +
  "Consider increasing the value for the 'analyzeduration' (0) and 'probesize' (5000000) options\n"]
ffmpeg::process::data [data:'[sdp @ 0x55604dc58400] After avformat_find_stream_info() pos: 210 bytes read:210 seeks:0 frames:0\n' +
  "Input #0, sdp, from 'pipe:0':\n" +
  '  Metadata:\n' +
  '    title           : FFmpeg\n' +
  '  Duration: N/A, bitrate: N/A\n' +
  '  Stream #0:0, 0, 1/90000: Video: vp8, 1 reference frame, yuv420p, 90k tbr, 90k tbn, 90k tbc\n' +
  '  Stream #0:1, 0, 1/48000: Audio: opus, 48000 Hz, stereo, fltp\n' +
  'Successfully opened the file.\n' +
  'Parsing a group of options: output url storage/recordings/26e63cb3-4f81-499e-941a-c0bb7f7f52ce.webm.\n' +
  'Applying option map (set input stream mapping) with argument 0:v:0.\n' +
  'Applying option c:v (codec name) with argument copy.\n' +
  'Applying option map (set input stream mapping) with argument 0:a:0.\n' +
  'Applying option c:a (codec name) with argument copy.\n' +
  'Applying option f (force format) with argument webm.\n' +
  'Successfully parsed a group of options.\n' +
  'Opening an output file: storage/recordings/26e63cb3-4f81-499e-941a-c0bb7f7f52ce.webm.\n' +
  "[file @ 0x55604dce5bc0] Setting default whitelist 'file,crypto,data'\n"]
ffmpeg::process::data [data:'Successfully opened the file.\n' +
  '[webm @ 0x55604dce0fc0] dimensions not set\n' +
  'Could not write header for output file #0 (incorrect codec parameters ?): Invalid argument\n' +
  'Error initializing output stream 0:1 -- \n' +
  'Stream mapping:\n' +
  '  Stream #0:0 -> #0:0 (copy)\n' +
  '  Stream #0:1 -> #0:1 (copy)\n' +
  '    Last message repeated 1 times\n' +
  '[AVIOContext @ 0x55604dc6dcc0] Statistics: 0 seeks, 0 writeouts\n' +
  '[AVIOContext @ 0x55604dc69380] Statistics: 210 bytes read, 0 seeks\n']
ffmpeg::process::close



    


    FFmpeg says "dimensions not set" and "Could not write header for output file" when I use Firefox. This might be enough to understand the problem, but if you need more information you can read how the server side works.
The server side, in summary, looks something like this:
let's say we initialized the worker and the router at run time using the following functions.

    


        // Start the mediasoup workers
module.exports.initializeWorkers = async () => {
  const { logLevel, logTags, rtcMinPort, rtcMaxPort } = config.worker;

  console.log('initializeWorkers() creating %d mediasoup workers', config.numWorkers);

  for (let i = 0; i < config.numWorkers; ++i) {
    const worker = await mediasoup.createWorker({
      logLevel, logTags, rtcMinPort, rtcMaxPort
    });

    worker.once('died', () => {
      console.error('worker::died worker has died exiting in 2 seconds... [pid:%d]', worker.pid);
      setTimeout(() => process.exit(1), 2000);
    });

    workers.push(worker);
  }
};


    


    module.exports.createRouter = async () => {
  const worker = getNextWorker();

  console.log('createRouter() creating new router [worker.pid:%d]', worker.pid);

  console.log(`config.router.mediaCodecs:${JSON.stringify(config.router.mediaCodecs)}`)

  return await worker.createRouter({ mediaCodecs: config.router.mediaCodecs });
};
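
    config.router.mediaCodecs is not shown in the post; based on the VP8 and opus payloads that appear in the SDP further down, it presumably contains entries along these lines (illustrative values only, not taken from the post):

// Hypothetical mediaCodecs config (assumed): the SDP printed later in the post
// (VP8/90000 and opus/48000/2) suggests entries like these.
const mediaCodecs = [
  { kind: 'audio', mimeType: 'audio/opus', clockRate: 48000, channels: 2 },
  { kind: 'video', mimeType: 'video/VP8', clockRate: 90000 }
];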


    


    We pass router.rtpCapabilities to the client. The client gets the rtpCapabilities, creates a device, and loads it. After that, a transport must be created on the server side.

    


        const handleCreateTransportRequest = async (jsonMessage) => {

  const transport = await createTransport('webRtc', router);

  var peer;
  try {peer = peers.get(jsonMessage.sessionId);}
  catch{console.log('peer not found')}
  
  peer.addTransport(transport);

  peer.socket.emit('create-transport',{
    id: transport.id,
    iceParameters: transport.iceParameters,
    iceCandidates: transport.iceCandidates,
    dtlsParameters: transport.dtlsParameters
  });
};


    


    Then, after the client side has also created its transport, we listen for the connect event, and when it fires we ask the server to connect the transport.

    


    const handleTransportConnectRequest = async (jsonMessage) => {
  var peer;
  try {peer = peers.get(jsonMessage.sessionId);}
  catch{console.log('peer not found')}

  if (!peer) {
    throw new Error(`Peer with id ${jsonMessage.sessionId} was not found`);
  }

  const transport = peer.getTransport(jsonMessage.transportId);

  if (!transport) {
    throw new Error(`Transport with id ${jsonMessage.transportId} was not found`);
  }

  await transport.connect({ dtlsParameters: jsonMessage.dtlsParameters });
  console.log('handleTransportConnectRequest() transport connected');
  peer.socket.emit('connect-transport');
};


    


    A similar thing happens on the produce event.

    


    const handleProduceRequest = async (jsonMessage) => {
  console.log('handleProduceRequest [data:%o]', jsonMessage);

  var peer;
  try {peer = peers.get(jsonMessage.sessionId);}
  catch{console.log('peer not found')}

  if (!peer) {
    throw new Error(`Peer with id ${jsonMessage.sessionId} was not found`);
  }

  const transport = peer.getTransport(jsonMessage.transportId);

  if (!transport) {
    throw new Error(`Transport with id ${jsonMessage.transportId} was not found`);
  }

  const producer = await transport.produce({
    kind: jsonMessage.kind,
    rtpParameters: jsonMessage.rtpParameters
  });

  peer.addProducer(producer);

  console.log('handleProducerRequest() new producer added [id:%s, kind:%s]', producer.id, producer.kind);

  peer.socket.emit('produce',{
    id: producer.id,
    kind: producer.kind
  });
};


    


    For recording, I first create plain transports for the audio and video producers.

    


    const rtpTransport = router.createPlainTransport(config.plainRtpTransport);
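
    config.plainRtpTransport is not included in the post either; when FFmpeg runs on the same host, a typical value might look roughly like the following (an assumption, adjust to your setup):

// Hypothetical config.plainRtpTransport (assumed, not from the post).
// rtcpMux is off because the connect() call below passes a separate rtcpPort,
// and comedia is off because we connect() to explicit ports rather than
// learning them from incoming packets.
const plainRtpTransport = {
  listenIp: { ip: '127.0.0.1', announcedIp: null },
  rtcpMux: false,
  comedia: false
};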


    


    Then the RTP transport must be connected to the ports (the same ports that appear in the SDP given to FFmpeg):

    


      await rtpTransport.connect({
    ip: '127.0.0.1',
    port: remoteRtpPort,
    rtcpPort: remoteRtcpPort
  });


    


    Then the consumer must also be created.

    


      const rtpConsumer = await rtpTransport.consume({
    producerId: producer.id,
    rtpCapabilities,
    paused: true
  });
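
    One step that is easy to miss (and which the "mediasoup:Consumer resume()" lines in the FFmpeg log above correspond to): the consumer is created with paused: true, so it has to be resumed once FFmpeg has read the SDP, otherwise no RTP flows at all. A minimal sketch, assuming the consumers are kept on the peer object:

// Sketch only (peer.consumers is an assumed field): resume the paused consumers,
// and request a key frame for video so the recording starts on a decodable frame.
setTimeout(async () => {
  for (const consumer of peer.consumers) {
    await consumer.resume();
    if (consumer.kind === 'video') {
      await consumer.requestKeyFrame();
    }
  }
}, 1000);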


    


    After that, we can start recording using the following code:

    


     this._rtpParameters = args;
    this._process = undefined;
    this._observer = new EventEmitter();
    this._peer = args.peer;

    this._sdpString = createSdpText(this._rtpParameters);
    this._sdpStream = convertStringToStream(this._sdpString);
    // create dir
    const dir = process.env.REOCRDING_PATH ?? 'storage/recordings';
    if (!fs.existsSync(dir)) shelljs.mkdir('-p', dir);
  
    this._extension = 'webm';
    // create file path
    this._path = `${dir}/${args.peer.sessionId}.${this._extension}`
    let loop = 0;
    while(fs.existsSync(this._path)) {
      this._path = `${dir}/${args.peer.sessionId}-${++loop}.${this._extension}`
    }

this._recordingnModel = await Recording.findOne({sessionIds: { $in: [this._peer.sessionId] }})
    this._recordingnModel.files.push(this._path);
    this._recordingnModel.save();

let proc  = ffmpeg(this._sdpStream)
    .inputOptions([
      '-protocol_whitelist','pipe,udp,rtp',
      '-f','sdp',
    ])
    .format(this._extension)
    .output(this._path)
    .size('720x?')
    .on('start', ()=>{
      this._peer.socket.emit('recording');
    })
    .on('end', ()=>{
      let path = this._path.replace('storage/recordings/', '');
      this._peer.socket.emit('recording-closed', {
        url: `${process.env.APP_URL}/recording/file/${path}`
      });
    });

    proc.run();
    this._process =  proc;
  }
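
    createSdpText() and convertStringToStream() are not shown in the post. Judging from the SDP that the FFmpeg log prints, the helper presumably looks roughly like the sketch below; the field names on the argument (video/audio consumers plus the remote ports) are assumptions, not taken from the original code:

// Rough sketch of a createSdpText() helper (assumed, not from the post). It builds
// the SDP that is piped to FFmpeg's stdin from the plain-transport ports and the
// consumers' rtpParameters; the layout matches the SDP shown in the log above.
const createSdpText = ({ video, audio }) => {
  const videoCodec = video.rtpParameters.codecs[0];
  const audioCodec = audio.rtpParameters.codecs[0];

  return [
    'v=0',
    'o=- 0 0 IN IP4 127.0.0.1',
    's=FFmpeg',
    'c=IN IP4 127.0.0.1',
    't=0 0',
    `m=video ${video.remoteRtpPort} RTP/AVP ${videoCodec.payloadType}`,
    `a=rtpmap:${videoCodec.payloadType} ${videoCodec.mimeType.replace('video/', '')}/${videoCodec.clockRate}`,
    'a=sendonly',
    `m=audio ${audio.remoteRtpPort} RTP/AVP ${audioCodec.payloadType}`,
    `a=rtpmap:${audioCodec.payloadType} ${audioCodec.mimeType.replace('audio/', '')}/${audioCodec.clockRate}/${audioCodec.channels}`,
    'a=sendonly'
  ].join('\n') + '\n';
};

    If Firefox negotiates codec parameters that differ from Chrome's, comparing the SDP generated in a working run against the one from the failing run would be a reasonable first check.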



    


  • Introducing Matomo SEO Web Vitals

    13 September 2021, by Ben Erskine — About, Analytics Tips, Plugins

    SEO Web Vitals track your critical website performance metrics and are a core element of SEO best practice. 

    Start using Matomo SEO Web Vitals to monitor your website performance, optimise your visitor experience, improve your search result rankings, and see how your site compares to your competitors.

    SEO Web Vitals

    What are SEO Web Vitals?

    Web Vitals are made up of a number of important metrics, such as your website’s page speed and loading performance; these metrics all play an important role in search engine optimisation. 

    The more technical terms for these metrics are Page Speed Score, First Contentful Paint (FCP), First Input Delay (FID), Largest Contentful Paint (LCP) and Cumulative Layout Shift (CLS).

    Why should you use SEO Web Vitals?

    SEO Web Vitals are being used more and more by search engines such as Google to rank websites so they help ensure a great page experience for users who arrive via links from their search results. 

    By monitoring your SEO Web Vitals you can see how good or bad a single page performs and then prioritise the optimisation of strategically important pages to help improve the ranking position within search engine results.

    For ease of use you can receive regular reports in your email inbox and you can configure custom alerts to automatically notify you when a page score changes significantly. This saves time by not having to check page performance scores manually while ensuring you will be notified should there be any important change that needs to be actioned.

    You should use SEO Web Vitals to understand how your site performance is impacting your overall visitor experience.

    Four key benefits of using SEO Web Vitals:

    Improve your search result rankings

    • SEO Web Vitals are a core element of SEO best practice and directly impact your search rankings.
    • Pages that load quickly and are more stable deliver a better user experience, so they’re ranked higher by search engines.

    Optimise your website visitor experience

    • Know how quickly pages on your website load to ensure you deliver an optimal visitor experience.
    • Identify page stability issues and implement the changes needed to enhance your visitor experience.

    Automate your website performance monitoring

    • Have peace of mind knowing if your metrics decrease, you can find and fix the root cause quickly.
    • Configure performance alerts and get automated reports sent to you.

    Incorporate website performance into your competitor analysis

    • These performance metrics are essentially open for anyone to inspect, so you can measure and benchmark your site against competitors. 

    How can I improve my SEO Web Vitals?

    There are many ways to improve these performance metrics; here are five of the most common contributing factors.

    1. Your page speed score is a weighted average of your other performance metrics, so focus on improving the underlying metrics that contribute to this score.
    2. Ensure you use a high quality web host with an appropriate plan for your level of traffic to help improve your FCP time.
    3. Try removing large elements that aren’t required on your page to improve your LCP time.
    4. Optimise against Total Blocking Time to improve your FID score.
    5. Consider using a Layout Shift Debugger to improve your CLS score.

    Guide to Matomo SEO Web Vitals

    For more information and to learn how to configure SEO Web Vitals in Matomo, check out our full guide to SEO Web Vitals.


    Need more resources?

    Matomo Plugin SEO Web Vitals

    Matomo SEO Web Vitals FAQs