Advanced search

Media (91)

Other articles (71)

  • Contribute to a better visual interface

    13 April 2011

    MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
    Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.

  • APPENDIX: Plugins used specifically for the farm

    5 March 2010, by

    The central/master site of the farm needs several additional plugins, beyond those of the channels, to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to handle registrations and requests to create a shared-hosting instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)

  • Retrieving information from the master site when installing an instance

    26 November 2010, by

    Purpose
    On the main site, a shared-hosting instance is defined by several things: the data in the spip_mutus table; its logo; its main author (id_admin in the spip_mutus table, matching an id_auteur in the spip_auteurs table), who will be the only one able to finalize the creation of the shared-hosting instance;
    It can therefore be quite useful to retrieve some of this information in order to complete the installation of an instance, for example to retrieve the (...)

On other sites (8674)

  • Revision 2b8dc065d1: google style guide include guards Change-Id: I2c252f3ddcc99e96c1f5d3dab8bcb25a2

    30 November 2012, by Jim Bankoski

    Changed Paths: Modify /vp9/common/arm/vp9_bilinearfilter_arm.h Modify /vp9/common/arm/vp9_idct_arm.h Modify /vp9/common/arm/vp9_loopfilter_arm.h Modify /vp9/common/arm/vp9_recon_arm.h Modify /vp9/common/arm/vp9_subpixel_arm.h Modify /vp9/common/vp9_alloccommon.h (...)

  • Node 18 or Node 20 breaks ffmpeg (in Google Cloud Functions -> ffprobe was killed with signal SIGSEGV)

    10 January 2024, by user20206929

    Please see below; the code works on Node.js 16, but not after upgrading to Node 18 or 20.

    


    const ffmpeg = require("fluent-ffmpeg");

    // The following runs inside a .https.onRequest Google Cloud Function with enough memory.

    let videoDuration;
    try {
      const duration = new Promise((resolve, reject) => {
        // Probe the remote video URL for its metadata.
        ffmpeg.ffprobe(videoUrl, (err, metadata) => {
          if (err) {
            if (res.headersSent) {
              console.error("Response already sent");
            } else {
              console.log("Metadata:", metadata);
              console.log("err: " + err);
              res.status(400).send("Error getting video metadata");
            }
            reject(err); // reject so the await below does not hang on error
            return;
          }
          const durationInSeconds = metadata.format.duration;
          console.log("video duration in seconds: " + durationInSeconds);
          resolve(durationInSeconds);
        });
      });
      videoDuration = await duration;
    } catch (err) {
      console.log(err);
      throw err;
    }


    


    When upgrading to Node 18/20 (no change other than upgrading Node), the error "ffprobe not found" appears.

    


    But setting the path manually using ffmpeg.setFfprobePath(ffprobePath)
triggers the error: Error: ffprobe was killed with signal SIGSEGV
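
    For reference, a minimal sketch of this kind of path override (using ffprobe-static as one example of a package that bundles a binary; videoUrl is assumed to be defined elsewhere):

    // Point fluent-ffmpeg at a bundled ffprobe binary instead of the system one.
    // ffprobe-static exposes the path of the binary it ships with.
    const ffmpeg = require("fluent-ffmpeg");
    const ffprobeStatic = require("ffprobe-static");

    ffmpeg.setFfprobePath(ffprobeStatic.path);

    ffmpeg.ffprobe(videoUrl, (err, metadata) => {
      if (err) throw err;
      console.log("duration (s):", metadata.format.duration);
    });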

    


    So it seems it's a permissions issue.

    


    However, I tried a lot of different solutions, and none of them made this work.
For instance, I tried manually downloading ffprobe from the official website https://ffbinaries.com/downloads and then manually adding it to the code.

    


    I tried to use https://www.npmjs.com/package/@ffprobe-installer/ffprobe and other packages like https://www.npmjs.com/package/ffprobe-static.

    


    I also tried downloading the ffprobe file to the temporary folder of Google Cloud and changing the permissions of that folder.

    


    All of those produced the same error.

    


    None of what I could think of made any difference.

    


    Please help, because I need to update from Node 16 to 18 or 20 before Google removes Node 16 on January 31, 2024, and for now I don't see a solution.

    


    I also looked for other solutions to get the duration from a video file URL, but using ffmpeg seems to be the only one that should work out of the box, as it works on Node 16.

    


    Thank you,

    


    UPDATE - 11/26/2023

    


    The GCP Functions Node.js 16 runtime uses Ubuntu 18.04 with FFmpeg installed.
The Node.js 18/20 runtimes use Ubuntu 22.04, and Google decided not to include FFmpeg.

    


    https://cloud.google.com/functions/docs/runtime-support#node.js
https://cloud.google.com/functions/docs/reference/system-packages

    


    No workaround or solution found as of now.

    


    UPDATE - 01/10/2024

    


    Google added ffmpeg back to the latest runtime version; this is working as before now.

    


  • FFmpeg: What re-encoding settings can be used to achieve results similar to Google Drive's video processing?

    4 August 2023, by Mycroft_47

    Context:

    


    I have a large collection of videos recorded by my phone's camera, which is taking up a significant amount of space. Recently, I noticed that when I uploaded a video to Google Drive and then downloaded it again using IDM (by clicking on the pop-up that IDM displays when it detects something that can be downloaded; here's what I mean), the downloaded video retained the same visual quality but occupied much less space. Upon further research, I discovered that Google re-encodes uploaded videos using H.264 video encoding, and I believe I can achieve similar compression using FFmpeg.

    


    Problem:

    


    Despite experimenting with various FFmpeg commands, I haven't been able to replicate Google Drive's compression. Every attempt using the -codec:v libx264 option alone resulted in videos larger than the original files.

    


    While adjusting the -crf parameter to a higher value and opting for a faster -preset option did yield smaller file sizes, it unfortunately came at the cost of a noticeable degradation in visual quality and the appearance of some visible artifacts in the video.

    


    Google Drive's processing, on the other hand, strikes a commendable balance, achieving a satisfactory file size without compromising visual clarity (I should note that upon zooming in on this video, I observed some minor blurring, but it was acceptable to me).

    


    Note:

    


    I'm aware that using the H.265 video encoder instead of H.264 may give better results. However, to ensure fairness and avoid any potential bias, I think the optimal approach is first to find the best command using the H.264 video encoder. Once identified, I can then replace -codec:v libx264 with -codec:v libx265. This approach will ensure that the chosen command is really the best that FFmpeg can achieve, and that it is not solely influenced by the superior performance of H.265 when used from the outset.

    


    Here's the FFmpeg command I am currently using:

    


    ffmpeg -hide_banner -loglevel verbose ^
    -i input.mp4 ^
    -codec:v libx264 ^
    -crf 36 -preset ultrafast ^
    -codec:a libopus -b:a 112k ^
    -movflags use_metadata_tags+faststart -map_metadata 0 ^
    output.mp4

    Video file                      Size (bytes)   Bit rate (bps)   Encoder        FFPROBE - JSON
    Original (named 'raw 1.mp4')    31,666,777     10,314,710       !!!            link
    Without crf                     36,251,852     11,805,216       Lavf60.3.100   link
    With crf                        10,179,113      3,314,772       Lavf60.3.100   link
    Gdrive                           6,726,189      2,190,342       Google         link

    


    


    Those files can be found here.

    


    Update:

    


    I continued my experiments with the video "raw_1.mp4" and found some interesting results that resemble those shown in this blog post (I recommend consulting this answer).

    


    In the following figure, I observed that using -preset veryfast provided the most advantageous results, striking the best balance between compression ratio and compression time (note that a negative percentage in the compression variable indicates an increase in file size after processing):
[figure: compression ratio and compression time for different -preset values]
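
    To make the compression variable concrete, here is a small sketch of how such a percentage can be computed (the exact definition is assumed from the note above; the sizes come from the table):

    // Assumed definition: compression % = (1 - newSize / originalSize) * 100,
    // so a negative value means the output file is larger than the input.
    const compressionPercent = (originalSize, newSize) =>
      (1 - newSize / originalSize) * 100;

    // With the sizes (in bytes) from the table above:
    console.log(compressionPercent(31666777, 36251852).toFixed(1)); // -14.5  (without crf)
    console.log(compressionPercent(31666777, 10179113).toFixed(1)); // 67.9   (with crf 36)
    console.log(compressionPercent(31666777, 6726189).toFixed(1));  // 78.8   (Google Drive)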

    


    In this figure, I used the H.264 encoder and compared the compression ratio of the output files produced by seven different values of the -crf parameter (CRF values used: 25, 27, 29, 31, 33, 35, 37):
[figure: compression ratio for H.264 (libx264) across the seven CRF values]

    


    For this figure, I've switched the encoder to H.265 while keeping the same CRF values used in the previous figure:
[figure: compression ratio for H.265 (libx265) across the same CRF values]

    


    Based on these results, the -preset veryfast and a -crf value of 31 are my current preferred settings for FFmpeg, until they are proven to be suboptimal choices.
As a result, the FFmpeg command I'll use is as follows:

    


    ffmpeg -hide_banner -loglevel verbose ^
    -i input.mp4 ^
    -codec:v libx264 ^
    -crf 31 -preset veryfast ^
    -codec:a libopus -b:a 112k ^
    -movflags use_metadata_tags+faststart -map_metadata 0 ^
    output.mp4


    


    Note that these choices are based solely on the compression results obtained so far, and they do not take into account the visual quality of the outputted files.