Other articles (85)

  • MediaSPIP version 0.1 Beta

    16 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for a "farm mode" installation, you will also need to make other modifications (...)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • Improving the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the ergonomics of multiple-selection fields. Compare the two following images.
    To use it, simply activate the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying the form elements to improve, for example select[multiple] for multiple-selection lists (...)

On other sites (8113)

  • ffmpeg banner information for scale, preset and unsharp mask

    12 December 2019, by Brajesh

    I’m using ffmpeg with these three settings (scale, unsharp, preset) with different values; for example, the scale below could be 1920, the preset could be veryfast, etc.

    -vf scale=1280:720,unsharp=9:9:0.75:9:9:0.75 -preset superfast

    My question is: how can I look at the banner (the initial message ffmpeg displays when it’s running) to know what values of these three settings were used?

    I’m using a script to pass in the values, and I want to make sure that during execution the right values are being used and that there is no error in my code.

    Thank you.
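
    For what it’s worth, the banner shown in the logs later on this page contains only the version, build configuration and library versions, not per-run options. A minimal sketch of two alternatives, assuming a wrapper script like the one described (the variable names here are hypothetical): echo the assembled arguments yourself, or pass ffmpeg’s -report flag, which writes the complete command line to an ffmpeg-*.log file.

```shell
#!/bin/sh
# Hypothetical wrapper: variable names and values are examples only.
scale="1280:720"
unsharp="9:9:0.75:9:9:0.75"
preset="superfast"

args="-vf scale=${scale},unsharp=${unsharp} -preset ${preset}"

# 1) Print the exact arguments before running, so they appear in the
#    terminal (or in any log the script output is redirected to):
echo "ffmpeg ${args}"

# 2) Or let ffmpeg record them itself: -report writes the full command
#    line, including the filter string and preset, to ffmpeg-*.log:
# ffmpeg -report -i input.mp4 ${args} output.mp4
```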

  • ffmpeg failing to add png mask to video: Requested planes not available

    23 August 2022, by Alexandr Sugak

    I am trying to add a PNG mask to make a WebM video round (cut off its corners).

    The command I am using:

    video="./dist/tmp/19_2.webm"
mask="./dist/tmp/mask.png"
output="./dist/tmp/circle.webm"

ffmpeg -report -c:v libvpx-vp9 -i "${video}" -loop 1 -i "${mask}" -filter_complex " \
[1:v]alphaextract[alf];\
[0:v][alf]alphamerge" \
-c:a copy -c:v libvpx-vp9 "${output}"

    The command output:

    sh ./scripts/video_mask.sh 
ffmpeg started on 2022-08-23 at 17:27:48
Report written to "ffmpeg-20220823-172748.log"
Log level: 48
ffmpeg version 5.1-tessus Copyright (c) 2000-2022 the FFmpeg developers
  built with Apple clang version 11.0.0 (clang-1100.0.33.17)
  configuration: --cc=/usr/bin/clang --prefix=/opt/ffmpeg --extra-version=tessus --enable-avisynth --enable-fontconfig --enable-gpl --enable-libaom --enable-libass --enable-libbluray --enable-libdav1d --enable-libfreetype --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libmysofa --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenh264 --enable-libopenjpeg --enable-libopus --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvmaf --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-libzmq --enable-libzvbi --enable-version3 --pkg-config-flags=--static --disable-ffplay
  libavutil      57. 28.100 / 57. 28.100
  libavcodec     59. 37.100 / 59. 37.100
  libavformat    59. 27.100 / 59. 27.100
  libavdevice    59.  7.100 / 59.  7.100
  libavfilter     8. 44.100 /  8. 44.100
  libswscale      6.  7.100 /  6.  7.100
  libswresample   4.  7.100 /  4.  7.100
  libpostproc    56.  6.100 / 56.  6.100
[libvpx-vp9 @ 0x7fa072f05140] v1.11.0-30-g888bafc78
    Last message repeated 1 times
Input #0, matroska,webm, from './dist/tmp/19_2.webm':
  Metadata:
    ENCODER         : Lavf59.27.100
  Duration: 00:00:02.77, start: -0.007000, bitrate: 308 kb/s
  Stream #0:0(eng): Video: vp9 (Profile 0), yuva420p(tv, unknown/bt709/iec61966-2-1, progressive), 640x480, SAR 1:1 DAR 4:3, 1k tbr, 1k tbn (default)
    Metadata:
      ALPHA_MODE      : 1
      ENCODER         : Lavc59.37.100 libvpx-vp9
      DURATION        : 00:00:02.744000000
  Stream #0:1(eng): Audio: opus, 48000 Hz, mono, fltp (default)
    Metadata:
      ENCODER         : Lavc59.37.100 libopus
      DURATION        : 00:00:02.767000000
Input #1, png_pipe, from './dist/tmp/mask.png':
  Duration: N/A, bitrate: N/A
  Stream #1:0: Video: png, pal8(pc), 640x480 [SAR 2835:2835 DAR 4:3], 25 fps, 25 tbr, 25 tbn
[libvpx-vp9 @ 0x7fa082f04880] v1.11.0-30-g888bafc78
Stream mapping:
  Stream #0:0 (libvpx-vp9) -> alphamerge
  Stream #1:0 (png) -> alphaextract:default
  alphamerge:default -> Stream #0:0 (libvpx-vp9)
  Stream #0:1 -> #0:1 (copy)
Press [q] to stop, [?] for help
[libvpx-vp9 @ 0x7fa082f04880] v1.11.0-30-g888bafc78
[Parsed_alphaextract_0 @ 0x7fa083906e80] Requested planes not available.
[Parsed_alphaextract_0 @ 0x7fa083906e80] Failed to configure input pad on Parsed_alphaextract_0
Error reinitializing filters!
Failed to inject frame into filter network: Invalid argument
Error while processing the decoded data for stream #0:0
Conversion failed!

    I've tried different combinations of codecs and pixel formats, but I still get the same error. My initial understanding was that ffmpeg fails to find the alpha channel in the input video. With the -c:v libvpx-vp9 option set, ffmpeg seems to correctly pick up the yuva420p pixel format, but it still gives the same error.

    What am I doing wrong?

    Update: if I remove the alphaextract step as suggested in the comments, ffmpeg starts processing the video indefinitely (the video I use to test is only 2 seconds long). If I specify the number of frames manually, the output is generated, but the mask does not seem to have any effect:

    ffmpeg -c:v libvpx-vp9 -i "${video}" -loop 1 -i "${mask}" -filter_complex " \
[0:v][1:v]alphamerge" \
-c:a copy -b:v 2000k -vframes 60 "${output}"

     sh ./scripts/video_mask.sh 
ffmpeg version 5.1-tessus Copyright (c) 2000-2022 the FFmpeg developers
  built with Apple clang version 11.0.0 (clang-1100.0.33.17)
  configuration: --cc=/usr/bin/clang --prefix=/opt/ffmpeg --extra-version=tessus --enable-avisynth --enable-fontconfig --enable-gpl --enable-libaom --enable-libass --enable-libbluray --enable-libdav1d --enable-libfreetype --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libmysofa --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenh264 --enable-libopenjpeg --enable-libopus --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvmaf --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-libzmq --enable-libzvbi --enable-version3 --pkg-config-flags=--static --disable-ffplay
  libavutil      57. 28.100 / 57. 28.100
  libavcodec     59. 37.100 / 59. 37.100
  libavformat    59. 27.100 / 59. 27.100
  libavdevice    59.  7.100 / 59.  7.100
  libavfilter     8. 44.100 /  8. 44.100
  libswscale      6.  7.100 /  6.  7.100
  libswresample   4.  7.100 /  4.  7.100
  libpostproc    56.  6.100 / 56.  6.100
[libvpx-vp9 @ 0x7fdd6b005f00] v1.11.0-30-g888bafc78
    Last message repeated 1 times
Input #0, matroska,webm, from './dist/tmp/19_2.webm':
  Metadata:
    ENCODER         : Lavf59.27.100
  Duration: 00:00:02.77, start: -0.007000, bitrate: 308 kb/s
  Stream #0:0(eng): Video: vp9 (Profile 0), yuva420p(tv, unknown/bt709/iec61966-2-1, progressive), 640x480, SAR 1:1 DAR 4:3, 1k tbr, 1k tbn (default)
    Metadata:
      ALPHA_MODE      : 1
      ENCODER         : Lavc59.37.100 libvpx-vp9
      DURATION        : 00:00:02.744000000
  Stream #0:1(eng): Audio: opus, 48000 Hz, mono, fltp (default)
    Metadata:
      ENCODER         : Lavc59.37.100 libopus
      DURATION        : 00:00:02.767000000
Input #1, png_pipe, from './dist/tmp/mask.png':
  Duration: N/A, bitrate: N/A
  Stream #1:0: Video: png, pal8(pc), 640x480 [SAR 2835:2835 DAR 4:3], 25 fps, 25 tbr, 25 tbn
File './dist/tmp/circle.webm' already exists. Overwrite? [y/N] y
[libvpx-vp9 @ 0x7fdd6b007ec0] v1.11.0-30-g888bafc78
Stream mapping:
  Stream #0:0 (libvpx-vp9) -> alphamerge
  Stream #1:0 (png) -> alphamerge
  alphamerge:default -> Stream #0:0 (libvpx-vp9)
  Stream #0:1 -> #0:1 (copy)
Press [q] to stop, [?] for help
[libvpx-vp9 @ 0x7fdd6b007ec0] v1.11.0-30-g888bafc78
[libvpx-vp9 @ 0x7fdd6b024580] v1.11.0-30-g888bafc78
Output #0, webm, to './dist/tmp/circle.webm':
  Metadata:
    encoder         : Lavf59.27.100
  Stream #0:0: Video: vp9, yuva420p(tv, unknown/bt709/iec61966-2-1, progressive), 640x480 [SAR 1:1 DAR 4:3], q=2-31, 2000 kb/s, 1k fps, 1k tbn
    Metadata:
      encoder         : Lavc59.37.100 libvpx-vp9
    Side data:
      cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
  Stream #0:1(eng): Audio: opus, 48000 Hz, mono, fltp (default)
    Metadata:
      ENCODER         : Lavc59.37.100 libopus
      DURATION        : 00:00:02.767000000
frame=   60 fps= 16 q=2.0 Lsize=     285kB time=00:00:01.98 bitrate=1175.5kbits/s speed=0.526x    
video:270kB audio:11kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.529399%
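
    One possible reading of the error (an assumption, not verified against this exact mask file): alphaextract needs an input with an alpha or grayscale plane, but the mask decodes as pal8 (palettized), which has neither, hence "Requested planes not available". A hedged sketch, reusing the paths from the question: convert the mask to gray and feed it to alphamerge directly, with -shortest so the looped mask does not make the encode run forever. The command is only echoed here as a dry run; drop the echo to execute it.

```shell
#!/bin/sh
# Sketch under assumptions: the pal8 mask has no alpha plane, so convert
# it to gray, which alphamerge accepts directly as the alpha source.
video="./dist/tmp/19_2.webm"
mask="./dist/tmp/mask.png"
output="./dist/tmp/circle.webm"

filter="[1:v]format=gray[alf];[0:v][alf]alphamerge"

# Dry run: print the command; remove 'echo' to actually run it.
echo ffmpeg -c:v libvpx-vp9 -i "${video}" -loop 1 -i "${mask}" \
  -filter_complex "${filter}" \
  -c:a copy -c:v libvpx-vp9 -shortest "${output}"
```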

  • How does one encode a series of images into H264 using the x264 C API?

    20 August 2019, by Rella

    How does one use the x264 C API to encode RGB images into H264 frames? I have already created a sequence of RGB images; how can I now transform it into a sequence of H264 frames? In particular, how do I encode this sequence of RGB images into a sequence of H264 frames consisting of a single initial H264 keyframe followed by dependent H264 frames?