
Media (91)

Other articles (97)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed “usable”.
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in “farm mode”, you will also need to carry out other manual (...)

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • APPENDIX: The plugins used specifically for the farm

    5 March 2010

    In order to work properly, the central/master site of the farm needs several additional plugins compared to the channel sites: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a shared instance as soon as users sign up; the verifier plugin, which provides a field-verification API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)

On other sites (8308)

  • ffmpeg kit flutter iOS "No such filter: 'drawtext'" error

    6 August 2023, by Patel Milan

    I am using ffmpeg_kit_flutter to apply a drawtext filter to a video, but I am getting the error "No such filter: 'drawtext'".

    The input video is input.mp4.

    This command works:

    ffmpeg -y -i input.mp4 -filter_complex '[0]scale=540:-1[s];[s]drawtext=text='your_text_here':fontsize=24:fontcolor=white:x=(w-text_w)/2:y=(h-text_h)/2' output.mp4

    The code sample is below:

    /// Create Video With Text
    Future<void> createVideoWithText() async {
      final file = File('${(await getTemporaryDirectory()).path}/output.mp4');
      String outPut = file.path;

      String command = "-y -i $inputFilePath -filter_complex '[0]scale=540:-1[s];[s]drawtext=text='MY_TEXT':fontsize=24:fontcolor=white:x=(w-text_w)/2:y=(h-text_h)/2' $outPut";

      FFmpegKit.executeAsync(
        command,
        (session) async {
          final returnCode = await session.getReturnCode();

          if (ReturnCode.isSuccess(returnCode)) {
            print('Success full add text on video');
          } else {
            print('Error to adding text on video');
          }
        },
      );
    }
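
    Not confirmed from the post, but two things commonly cause this exact error: the ffmpeg build bundled with the package may simply not include the drawtext filter (it needs font support that reportedly only the fuller ffmpeg-kit flavors ship), and the nested single quotes around MY_TEXT split the quoted filtergraph when the command string is parsed. Below is a minimal sketch that only removes the inner quotes and otherwise reuses the call from the question; turning inputFilePath into a parameter and the surrounding imports are my additions.

    import 'dart:io';

    import 'package:ffmpeg_kit_flutter/ffmpeg_kit.dart';
    import 'package:ffmpeg_kit_flutter/return_code.dart';
    import 'package:path_provider/path_provider.dart';

    /// Sketch: assumes an ffmpeg-kit flavor whose ffmpeg build includes drawtext,
    /// and that [inputFilePath] points at an existing video file.
    Future<void> createVideoWithText(String inputFilePath) async {
      final file = File('${(await getTemporaryDirectory()).path}/output.mp4');
      final output = file.path;

      // Keep the whole filtergraph as one quoted token and do not nest quotes
      // around the text value.
      final command =
          "-y -i $inputFilePath -filter_complex "
          "'[0]scale=540:-1[s];[s]drawtext=text=MY_TEXT:fontsize=24:"
          "fontcolor=white:x=(w-text_w)/2:y=(h-text_h)/2' "
          "$output";

      FFmpegKit.executeAsync(command, (session) async {
        final returnCode = await session.getReturnCode();
        if (ReturnCode.isSuccess(returnCode)) {
          print('Text drawn on video: $output');
        } else {
          print('Failed to draw text on video');
        }
      });
    }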


  • Fluent ffmpeg not running synchronously

    14 May 2022, by sciencaholic

    I am writing a program where I need to process a video multiple times using ffmpeg. The ffmpeg code (below) is inside a 'then' statement of a promise.

    ffmpeg(path)
      .size('640x?')
      .aspect('1:1')
      .autopad('#682BAB')
      .saveToFile(`${userDirPath}/11-${userFileName}`)
      .on('end', () => {
        ffmpeg()
          .input('test-11-start.mp4')
          .mergeAdd(`${userDirPath}/11-${userFileName}`)
          .mergeAdd('test-11-end.mp4')
          .mergeToFile(`${userDirPath}/11-final-${userFileName}`, 'temp/')
          .on('end', () => console.log('FFmpeg done!'));
      });

    There is another ffmpeg function after this (the same, but with a different aspect ratio) and then a 'then' statement with some other functions.

    The problem is that this ffmpeg function runs asynchronously, and the next statements (which use the file produced by it) execute before it has finished, so I want it to run synchronously. I've tried async/await (below), but it still runs asynchronously. What is wrong with the code?

    async function ffmpegMerge() {
      try {
        await ffmpeg(path)
          .size('640x?')
          .aspect('1:1')
          .autopad('#682BAB')
          .saveToFile(`${userDirPath}/11-${userFileName}`)
          .on('end', () => {
            ffmpeg()
              .input(`test-11-start.mp4`)
              .mergeAdd(`${userDirPath}/11-${userFileName}`)
              .mergeAdd(`test-11-end.mp4`)
              .mergeToFile(`${userDirPath}/11-final-${userFileName}.mp4`, 'temp/')
              .on('end', () => console.log('FFmpeg done!'));
          });
      } catch (err) {
        return Promise.reject(new Error(err));
      }
    }
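
    A likely explanation (an assumption based on fluent-ffmpeg's event-driven API, not something stated in the post) is that the fluent-ffmpeg command object is not a promise, so await has nothing to wait on: saveToFile() and mergeToFile() only start the job, and completion is signalled through the 'end' event. Below is a minimal sketch that wraps each step in an explicit Promise so the steps really run one after another; padVideo and mergeVideo are names made up for illustration, while path, userDirPath and userFileName are the variables from the question.

    const ffmpeg = require('fluent-ffmpeg');

    // Resolves once the padded intermediate file has been written.
    function padVideo() {
      return new Promise((resolve, reject) => {
        ffmpeg(path)
          .size('640x?')
          .aspect('1:1')
          .autopad('#682BAB')
          .on('end', resolve)
          .on('error', reject)
          .saveToFile(`${userDirPath}/11-${userFileName}`);
      });
    }

    // Resolves once the merged output file has been written.
    function mergeVideo() {
      return new Promise((resolve, reject) => {
        ffmpeg()
          .input('test-11-start.mp4')
          .mergeAdd(`${userDirPath}/11-${userFileName}`)
          .mergeAdd('test-11-end.mp4')
          .on('end', resolve)
          .on('error', reject)
          .mergeToFile(`${userDirPath}/11-final-${userFileName}.mp4`, 'temp/');
      });
    }

    async function ffmpegMerge() {
      await padVideo();   // the padded file exists before the merge starts
      await mergeVideo(); // the final file exists before anything that uses it
      console.log('FFmpeg done!');
    }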


  • Overlay a video and an image over a background video and shift that background video's position to the right

    4 August 2023, by sybr

    I'm currently working on a way to improve my production process for the videos that I'm making. I usually edit videos using two folders full of clips. Being able to automate putting these together in Premiere would save me a lot of time.

    I'm using the FFMpegCore C# library, as well as Xabe.FFmpeg, to achieve what I'm trying to do, and I've currently reached the point of creating two separate videos using the clips I mentioned earlier.

    The final step that I need to solve is to overlay the overlay video (overlay.mp4) over the background video (background.mp4), add an overlay image (overlay.png) as a clean divider between the edges of the videos, and to somehow shift the background video's position 33% to the right.

    This is the script I've come up with so far:

    ffmpeg -i overlay.mp4 -i background.mp4 -i overlay.png \
      -filter_complex "[1:v]scale=608:1080[a]; [0:v]pad=1920:1080[b]; [b][a]overlay[out]; [out][2:v]overlay[out]; [0][1]amix[a];" \
      -map [out] -map [a] -c:v libx264 -crf 30 -pix_fmt yuv420p -y output.mp4

    Is there a way to shift the background video to the right? Or should I manually edit the video files?

    Would love some help here; I'm also eager to learn more about FFmpeg, so some explanation would be highly appreciated!

    Edit:

    Just found the solution: padding the background video by 33% and then cropping the final output back to 1920x1080 seemed to do the trick!

    ffmpeg -i overlay.mp4 -i background.mp4 -i overlay.png -filter_complex "[1:v]scale=608:1080[a]; [0:v]pad=2560:0:x=1920:y=1080[b]; [b][a]overlay[out]; [out][2:v]overlay[out]; [out]crop=1920:1080:0:0[out]; [0][1]amix[a];" -map [out] -map [a] -c:v libx264 -crf 30 -pix_fmt yuv420p -y output.mp4
