
Media (21)

Keyword: Tags / Nine Inch Nails

Other articles (89)

  • Improving the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the usability of multiple-select fields. See the following two images to compare.
    To do this, simply activate the Chosen plugin (Configuration générale du site > Gestion des plugins), then configure it (Les squelettes > Chosen) by enabling Chosen on the public site and specifying the form elements to enhance, for example select[multiple] for multiple-select lists (...)

  • Customising the site by adding your logo, banner or background image

    5 September 2013

    Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013

    Present changes to your MediaSPIP, or news about your projects, via the news section.
    In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customise the form used to create a news item.
    News item creation form: for a document of the "news item" type, the default fields are: publication date (customise the publication date) (...)

Sur d’autres sites (9474)

  • How to make words in ASS subtitles appear one at a time?

    21 September 2022, by ThaDon

    I am trying to burn subtitles into a video such that they appear in a word-by-word fashion instead of all at once.

    What I mean by this is: a word will appear, then another word will appear next to it, and so on. Eventually the line clears, then the process repeats.

    Example:

    A video shows a person speaking about chess. Subtitles at the bottom of the screen say “winning”. Moments later, the subtitles change to “winning a”, then “winning a piece”, later “winning a piece now”, and so on. Each consecutive word appears instantly and in whole, but only once the speaker says it.

    I thought I could create an Advanced Substation Alpha file in which the subtitles share the same end time but have differing start times; however, FFmpeg doesn't seem to cope very well when rendering the file:

    [Script Info]
; Script generated by FFmpeg/Lavc57.107.100
ScriptType: v4.00+
PlayResX: 384
PlayResY: 288

[V4+ Styles]
Format: Name, Fontname, Fontsize, PrimaryColour, SecondaryColour, OutlineColour, BackColour, Bold, Italic, Underline, StrikeOut, ScaleX, ScaleY, Spacing, Angle, BorderStyle, Outline, Shadow, Alignment, MarginL, MarginR, MarginV, Encoding
Style: Default,Arial,16,&Hffffff,&Hffffff,&H0,&H0,0,0,0,0,100,100,0,0,1,1,0,2,10,10,10,0

[Events]
Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text
Dialogue: 0,0:00:00.00,0:00:03.46,Default,,0,0,0,,I'm
Dialogue: 0,0:00:01.00,0:00:03.46,Default,,0,0,0,,a
Dialogue: 0,0:00:01.50,0:00:03.46,Default,,0,0,0,,subtitle


    



    The idea being that “I'm” would appear, then one second later “a” would show up next to it, followed by “subtitle” half a second after that.
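
    For what it's worth, libass (which FFmpeg's subtitles and ass filters rely on) normally treats simultaneous Dialogue events as collisions and stacks them on separate lines rather than joining them into one line, which could explain the odd rendering. A hedged alternative sketch that keeps everything on a single line gives each event the cumulative text so far, with consecutive, non-overlapping times (only the [Events] section changes; the rest of the file stays as posted):

    [Events]
    Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text
    Dialogue: 0,0:00:00.00,0:00:01.00,Default,,0,0,0,,I'm
    Dialogue: 0,0:00:01.00,0:00:01.50,Default,,0,0,0,,I'm a
    Dialogue: 0,0:00:01.50,0:00:03.46,Default,,0,0,0,,I'm a subtitle

    Burning it in would then be a standard subtitle-filter invocation; the file names here are placeholders:

    ffmpeg -i input.mp4 -vf "ass=subtitle.ass" -c:a copy output.mp4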

    


  • How to Match ASS Subtitle Font Size with Flutter Text Size for a 1080p Video?

    16 December 2024, by Mostafa Fathi

    I'm working on a project where I need to synchronize the font size of ASS subtitles with the text size a user sees in a Flutter application. The goal is to ensure that the text size in a 1080p video matches what the user sees in the app.

    What I've tried:

    1. Calculating the font size using the height ratio (PlayResY / DeviceHeight):

       • I used the formula:

         FontSize_ASS = FontSize_Flutter * (PlayResY / DeviceHeight)

       • While the result seemed logical, the final output in the video was smaller than expected (a numeric sketch just after this list illustrates one possible cause).

    2. Adding a scaling factor:

       • I introduced a scaling factor (around 3.0) to address the size discrepancy.
       • This improved the result but still felt inconsistent and lacked precision.

    3. Using force_style in FFmpeg:

       • I applied the force_style parameter to control the font size in FFmpeg directly:

         ffmpeg -i input.mp4 -vf "subtitles=subtitle.ass:force_style='FontSize=90'" -c:a copy output.mp4

       • While it produced better results, it's not an ideal solution as it bypasses the calculations in the ASS file.

    4. Aligning PlayResX and PlayResY in the ASS file:

       • I ensured that these parameters matched the target video resolution (1920×1080):

         PlayResX: 1920
         PlayResY: 1080

       • Despite this adjustment, the font size didn't align perfectly with the Flutter app text size.

    5. Reading font metrics from the font file dynamically:

       • To improve precision, I wrote a function in Flutter that reads font metrics (units per EM, ascender, and descender) from the TTF font file and calculates a more accurate scaling factor:

// Assumed imports and big-endian helpers: the original snippet relies on
// readUInt16BE / readInt16BE / readUInt32BE being defined elsewhere.
import 'dart:convert';
import 'dart:io';
import 'dart:typed_data';

int readUInt16BE(ByteData data, int offset) => data.getUint16(offset, Endian.big);
int readInt16BE(ByteData data, int offset) => data.getInt16(offset, Endian.big);
int readUInt32BE(ByteData data, int offset) => data.getUint32(offset, Endian.big);

Future<double?> readFontMetrics(
  String fontFilePath,
  double originalFontSize,
) async {
  final fontData = await File(fontFilePath).readAsBytes();
  final fontBytes = fontData.buffer.asUint8List();
  final byteData = ByteData.sublistView(fontBytes);

  // The SFNT table directory starts at byte 12; numTables sits at offset 4.
  int numTables = readUInt16BE(byteData, 4);
  int offsetTableStart = 12;
  Map<String, Map<String, int>> tables = {};

  for (int i = 0; i < numTables; i++) {
    // Each 16-byte record holds a 4-byte tag, checksum, offset and length.
    int recordOffset = offsetTableStart + i * 16;
    String tag =
        utf8.decode(fontBytes.sublist(recordOffset, recordOffset + 4));
    int offset = readUInt32BE(byteData, recordOffset + 8);
    int length = readUInt32BE(byteData, recordOffset + 12);

    tables[tag] = {
      'offset': offset,
      'length': length,
    };
  }

  if (!tables.containsKey('head') || !tables.containsKey('hhea')) {
    print('Required tables not found in the font file.');
    return null;
  }

  // unitsPerEm lives at offset 18 inside the 'head' table.
  int headOffset = tables['head']!['offset']!;
  int unitsPerEm = readUInt16BE(byteData, headOffset + 18);

  // Ascender and descender live at offsets 4 and 6 inside the 'hhea' table.
  int hheaOffset = tables['hhea']!['offset']!;
  int ascender = readInt16BE(byteData, hheaOffset + 4);
  int descender = readInt16BE(byteData, hheaOffset + 6);

  print('unitsPerEm: $unitsPerEm');
  print('ascender: $ascender');
  print('descender: $descender');

  // Scale by the glyph extent (ascender - descender; the descender is
  // typically negative) relative to the nominal em size.
  int nominalSize = unitsPerEm;
  int realDimensionSize = ascender - descender;
  double scaleFactor = realDimensionSize / nominalSize;
  double realFontSize = originalFontSize * scaleFactor;

  print('Scale Factor: $scaleFactor');
  print('Real Font Size: $realFontSize');

  return realFontSize;
}

       • This function dynamically reads the font properties (ascender, descender, and unitsPerEm) and calculates a scale factor to get the real font size. Despite this effort, discrepancies persist when mapping it to the ASS font size (a minimal call site is sketched just below).
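
    For completeness, a minimal call site for the readFontMetrics function above might look like the sketch below; the font path and base size are placeholders, not values from the project:

    Future<void> main() async {
      // Placeholder path and size; in the app these would come from the
      // bundled font file and the TextStyle actually used on screen.
      final realFontSize = await readFontMetrics('fonts/Roboto-Regular.ttf', 48.0);
      if (realFontSize != null) {
        print('Candidate ASS FontSize: ${realFontSize.toStringAsFixed(1)}');
      }
    }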
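
    Regarding step 1, one possible cause of the undershoot, offered as an assumption rather than something stated above: Flutter font sizes are expressed in logical pixels, so if DeviceHeight was taken in physical pixels the ratio also divides by the device pixel ratio, which on many phones is close to the ~3.0 correction found in step 2. A minimal Dart illustration with made-up numbers:

    void main() {
      // Made-up numbers: a 1080p render target and a phone with a 2400 px
      // physical height and devicePixelRatio 3.0 (so 800 logical pixels).
      const playResY = 1080.0;
      const physicalHeight = 2400.0;
      const devicePixelRatio = 3.0;
      const logicalHeight = physicalHeight / devicePixelRatio;

      const flutterFontSize = 16.0; // what the user sees, in logical pixels

      // Dividing by the physical height folds the pixel ratio into the result
      // and undershoots by roughly a factor of devicePixelRatio:
      final fromPhysical = flutterFontSize * (playResY / physicalHeight); // 7.2
      // Dividing by the logical height keeps the on-screen proportion:
      final fromLogical = flutterFontSize * (playResY / logicalHeight); // 21.6
      print('physical-height mapping: $fromPhysical');
      print('logical-height mapping: $fromLogical');
    }

    In Flutter, the logical height is what MediaQuery.of(context).size.height reports, and the physical height is that value multiplied by MediaQuery.of(context).devicePixelRatio, so the distinction is easy to check.
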
    Question:
    How can I ensure that the font size in the ASS file accurately reflects the size the user sees in Flutter? Is there a reliable method to calculate or align the sizes correctly across both systems? Any insights or suggestions would be greatly appreciated.

    Thank you! 🙏

  • FFmpeg generated video length doesn't match expected length [on hold]

    21 June 2018, by BentCoder

    I am using the command below to generate a video from images. The problem is the length of the generated video: it is always 5 seconds shorter than expected. If I have 4 images, I expect to see 20 seconds of video, but I get 15 seconds instead. The very last image, img-03.jpg, just appears at the end of the video as if it were a cover image. Any idea why, and is there a solution? (A possible workaround is sketched after the output below.)

    Command

    ffmpeg -y -framerate 1/5 -f image2 \
    -i img-%2d.jpg \
    -c:v libvpx-vp9 \
    -r 25 \
    -crf 30 -b:v 0 \
    video.webm

    Images

    img-00.jpg
    img-01.jpg
    img-02.jpg
    img-03.jpg

    Output

    ffmpeg version 3.2.10-1~deb9u1~bpo8+1 Copyright (c) 2000-2018 the FFmpeg developers
     built with gcc 4.9.2 (Debian 4.9.2-10)
     configuration: --prefix=/usr --extra-version='1~deb9u1~bpo8+1' --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --disable-libebur128 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
     libavutil      55. 34.101 / 55. 34.101
     libavcodec     57. 64.101 / 57. 64.101
     libavformat    57. 56.101 / 57. 56.101
     libavdevice    57.  1.100 / 57.  1.100
     libavfilter     6. 65.100 /  6. 65.100
     libavresample   3.  1.  0 /  3.  1.  0
     libswscale      4.  2.100 /  4.  2.100
     libswresample   2.  3.100 /  2.  3.100
     libpostproc    54.  1.100 / 54.  1.100
    Input #0, image2, from 'img-%2d.jpg':
     Duration: 00:00:20.00, start: 0.000000, bitrate: N/A
       Stream #0:0: Video: mjpeg, yuvj444p(pc, bt470bg/unknown/unknown), 854x480 [SAR 1:1 DAR 427:240], 0.20 fps, 0.20 tbr, 0.20 tbn, 0.20 tbc
    [swscaler @ 0x55f2af60e800] deprecated pixel format used, make sure you did set range correctly
    [libvpx-vp9 @ 0x55f2af6276a0] v1.3.0
    Output #0, webm, to 'video.webm':
     Metadata:
       encoder         : Lavf57.56.101
       Stream #0:0: Video: vp9 (libvpx-vp9), yuv420p, 854x480 [SAR 1:1 DAR 427:240], q=-1--1, 25 fps, 1k tbn, 25 tbc
       Metadata:
         encoder         : Lavc57.64.101 libvpx-vp9
       Side data:
         cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
    Stream mapping:
     Stream #0:0 -> #0:0 (mjpeg (native) -> vp9 (libvpx-vp9))
    Press [q] to stop, [?] for help
    Input stream #0:0 frame changed from size:854x480 fmt:yuvj444p to size:854x480 fmt:yuvj420p
    [swscaler @ 0x55f2af602b60] deprecated pixel format used, make sure you did set range correctly
    frame=    4 fps=0.0 q=0.0 Lsize=     214kB time=00:00:15.00 bitrate= 117.0kbits/s speed=  16x    
    video:214kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.234810%
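
    With the image2 demuxer, each frame's display time is effectively derived from the gap to the next frame, so the last image does not get its own 5-second hold, which would account for the missing 5 seconds. A commonly suggested workaround, offered here as a sketch rather than a verified fix for this exact setup, is the concat demuxer with explicit durations, listing the final image twice as FFmpeg's slideshow documentation recommends; list.txt is a placeholder name:

    # list.txt: each image held for 5 seconds; the last image is listed a
    # second time so its hold is preserved.
    file 'img-00.jpg'
    duration 5
    file 'img-01.jpg'
    duration 5
    file 'img-02.jpg'
    duration 5
    file 'img-03.jpg'
    duration 5
    file 'img-03.jpg'

    ffmpeg -y -f concat -i list.txt \
    -c:v libvpx-vp9 \
    -r 25 \
    -crf 30 -b:v 0 \
    video.webm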