Advanced search

Media (1)

Keyword: - Tags -/framasoft

Other articles (106)

  • Improving the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the ergonomics of multiple-selection fields. Compare the two following images.
    To use it, simply activate the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...)

  • Managing the farm

    2 March 2010, by

    The farm as a whole is managed by "super admins".
    Certain settings can be made in order to regulate the needs of the various channels.
    Initially it uses the "Gestion de mutualisation" plugin.

  • MediaSPIP Player: potential problems

    22 February 2011, by

    The player does not work on Internet Explorer
    On Internet Explorer (8 and 7 at least), the plugin uses the Flash player flowplayer to play video and sound. If the player does not seem to work, the cause may be the configuration of Apache's mod_deflate.
    If the configuration of this Apache module contains a line resembling the following, try removing or commenting it out to see whether the player then works correctly: (...)

On other sites (9905)

  • How can I save the downloaded and converted audio files to a specific folder?

    19 December 2022, by Retire Young

    I'm trying to download videos from a specific playlist and convert them to m4a files for downloading later. The process works, but the files are saved in the same folder as my Python script. I chose a specific output folder, but the files don't appear there.
    Any way to save the files to the desired folder?

    An error message reads as follows: "input.mp4: No such file or directory"
    In my code block, I censored the playlist link and the user name in the path. Just imagine a working playlist link instead. Thanks in advance!

    import os
    import subprocess

    # Replace with the URL of the YouTube playlist
    playlist_url = "PLAYLISTLINK"

    # Set the output directory for the downloaded files
    output_dir = "/Users/USER/Documents/VideoBot/Output"

    # Use youtube-dl to download the latest video in the playlist
    subprocess.run(["youtube-dl", "-f", "bestaudio[ext=m4a]", "--playlist-start", "1", "--playlist-end", "1", playlist_url])

    # Extract the audio from the downloaded video using ffmpeg
    subprocess.run(["ffmpeg", "-i", "input.mp4", "-vn", "-acodec", "copy", "output.m4a"])

    # Move the extracted audio file to the output directory
    os.rename("output.m4a", os.path.join(output_dir, "output.m4a"))


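    A likely cause of both symptoms in the question: youtube-dl writes into the current working directory by default, and the ffmpeg step then looks for a hard-coded "input.mp4" that was never created under that name. A minimal sketch of one way around this (not a tested drop-in; the paths mirror the placeholders above) is to skip the extraction and rename entirely and let youtube-dl's -o/--output template write the m4a straight into the output folder:

```python
import os

def build_download_cmd(playlist_url, output_dir):
    """argv for youtube-dl that saves the m4a directly into output_dir.

    The -o output template controls where youtube-dl writes the file,
    so no separate ffmpeg extraction or os.rename step is needed when
    the requested format is already bestaudio[ext=m4a].
    """
    return [
        "youtube-dl",
        "-f", "bestaudio[ext=m4a]",
        "--playlist-start", "1",
        "--playlist-end", "1",
        "-o", os.path.join(output_dir, "%(title)s.%(ext)s"),
        playlist_url,
    ]

if __name__ == "__main__":
    import subprocess
    output_dir = "/Users/USER/Documents/VideoBot/Output"  # placeholder path
    os.makedirs(output_dir, exist_ok=True)
    subprocess.run(build_download_cmd("PLAYLISTLINK", output_dir), check=True)
```

    The %(title)s.%(ext)s template names each file after the video title, which also avoids every download overwriting a single output.m4a.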
  • How to save a video with a text widget on top that changes every couple of seconds in Flutter?

    23 September 2023, by abdallah mostafa

    I've been working on an auto-subtitle tool for videos, but I don't know how to save the final video.
    Should I record the video, or take screenshots of all the frames and combine them into a video?

    I've used FFmpegKit, but it's very hard to position the text correctly.

Future<void> saveSubtitle(
    {required double leftPosition,
    required double topPosition,
    required double opacityOfBackground,
    required String backgroundColor,
    required String subtitleColor}) async {
  emit(ExportSubtitleLoading());

  String fontDirectoryPath =
      await _exportSubtitle.writeFontToFile('assets/fonts/arial.ttf');
  if (backgroundColor == 'Transparent') {
    opacityOfBackground = 0.0;
    backgroundColor = 'black';
  }
  String subtitleFilter = "";
  for (var subtitle in subtitles!.fotmatedSubtitle!) {
    double startTime = _exportSubtitle.timeToSeconds(subtitle.interval![0]);
    double endTime = _exportSubtitle.timeToSeconds(subtitle.interval![1]);
    String text = subtitle.displayText!.replaceComma;
    int fontSize = controller!.value.aspectRatio > 0.5625 ? 24 * 3 : 24;
    if (countWords(text) > 9 && countWords(text) <= 15) {
      // Add line breaks ("\n") to the text
      text = _exportSubtitle.addLineBreaks(
        text,
      );
    } else {
      text = _exportSubtitle.addLineBreaks(text, true);
    }
    final centeredNumber = text.split('\n');
    // centeredNumber[2].split(' ').logger;
    // return;
    for (var i = 0; i < centeredNumber.length; i++) {
      if (i == 0) {
        if (centeredNumber.length > 1 &&
            centeredNumber[i].split(' ').join().length >
                centeredNumber[i + 1].split(' ').join().length) {
          subtitleFilter +=
              " drawtext=text='${centeredNumber[i]}':enable='between(t,$startTime,$endTime)':x=$leftPosition-30:y=$topPosition:fontsize=$fontSize:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
        } else {
          subtitleFilter +=
              " drawtext=text='${centeredNumber[i]}':enable='between(t,$startTime,$endTime)':x=$leftPosition+20:y=$topPosition:fontsize=$fontSize:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
        }
      } else if (i == 1) {
        subtitleFilter +=
            " drawtext=text='${centeredNumber[i]}':enable='between(t,$startTime,$endTime)':x=$leftPosition:y=$topPosition+25:fontsize=$fontSize:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
      } else {
        if (centeredNumber.length > 1 &&
            centeredNumber[i - 1].split(' ').join().length >
                centeredNumber[i].split(' ').join().length) {
          subtitleFilter +=
              " drawtext=text='${centeredNumber[i]}':enable='between(t,$startTime,$endTime)':x=$leftPosition+text_w/16:y=$topPosition+50:fontsize=$fontSize:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
        } else {
          subtitleFilter +=
              " drawtext=text='${centeredNumber[i]}':enable='between(t,$startTime,$endTime)':x=$leftPosition-text_w/16:y=$topPosition+50:fontsize=$fontSize:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
        }
      }
    }

    // subtitleFilter +=
    //     " drawtext=text='$text':enable='between(t,$startTime,$endTime)':x=$leftPosition:y=$topPosition:fontsize=24:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
  }

  final finalFilter = "\"$subtitleFilter\"";
  // final finalFilter =
  //     "\"$subtitleFilter split[s1][s2];[s1]crop=w=576:h=1024,scale=576:1024[p];[s2][p]overlay=x=(main_w-overlay_w)/2:y=(main_h-overlay_h)/2[v]\"";
  final dir = await getTemporaryDirectory();
  String outputPath = '${dir.path}/ex_vid.mp4';
  final arguments = [
    '-y',
    '-i',
    inputFile,
    '-vf',
    finalFilter,
    '-c:v',
    'libx264',
    '-c:a',
    'copy',
    outputPath
  ];
  arguments.join(' ').logger;
  // return;
  // String command =
  //     "-y -i $inputFile -vf \" drawtext=text='You know those cat are memes that everybody uses\nin their videos and the TV movie clips that people use.':enable='between(t,0,4.000)':x=(w-text_w)/2:y=(h-text_h)/2:fontsize=24:fontcolor=white:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@0.5, drawtext=text='Well, who are the four best free\nwebsites to find a move?':enable='between(t,4.000,6.240)':x=(w-text_w)/2:y=(h-text_h)/2+30:fontsize=24:fontcolor=white:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@0.5, split[s1][s2];[s1]crop=w=576:h=1024,scale=576:1024[p];[s2][p]overlay=x=(main_w-overlay_w)/2:y=(main_h-overlay_h)/2[v]\" -c:v libx264 -c:a copy $outputPath";

  String command =
      "-y -i $inputFile -vf \" drawtext=text='You know those cat are memes that everybody uses\nin their videos and the TV movie clips that people use.':enable='between(t,0,4.000)':x=(w-text_w)/2:y=(h-text_h)/2:fontsize=24:fontcolor=white:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@0.5, drawtext=text='Well, who are the four best free\nwebsites to find a move?':enable='between(t,4.000,6.240)':x=$leftPosition:y=$topPosition:fontsize=24:fontcolor=white:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@0.5 \" -c:v libx264 -c:a copy $outputPath";

  '================='.logger;
  // FFmpegKitConfig.enableUTF8Charset();
  command.logger;
  await FFmpegKit.execute(arguments.join(' ')).then((session) async {
    final returnCode = await session.getReturnCode();

    if (ReturnCode.isSuccess(returnCode)) {
      ('The Converstion is Success').logger;
      final path = await _exportSubtitle.exportFile(File(outputPath));
      emit(ExportSubtitleSuccess(path));
    } else if (ReturnCode.isCancel(returnCode)) {
      // CANCEL
      ('The Converstion is Cancelled').logger;
    } else {
      emit(ExportSubtitleerror());
      ('The Converstion Have an Error').logger;
    }
  });
}

    This function, saveSubtitle, is responsible for applying subtitles to a video using FFmpeg. Here's a breakdown of what it does:

    It starts by emitting an event to indicate that the subtitle export process is loading.

    It obtains the file path of a font (arial.ttf) from assets and stores it in fontDirectoryPath.

    It checks if the background color for subtitles is set to "Transparent". If so, it sets opacityOfBackground to 0.0 and changes backgroundColor to black.

    It initializes an empty subtitleFilter string, which will store FFmpeg filter commands for each subtitle.

    It iterates through the subtitles and calculates the start and end time, text, and font size for each subtitle.

    For each subtitle, it calculates the position (x and y coordinates) based on leftPosition and topPosition. It also sets the font color, font file path, and background color with opacity for the subtitle.

    It appends the FFmpeg drawtext filter command for each subtitle to the subtitleFilter string.

    After processing all subtitles, it wraps the subtitleFilter string in double quotes and prepares to use it as an argument for the FFmpeg command.

    It specifies the output path for the video with subtitles.

    It constructs the FFmpeg command from various arguments, including the input video file, the subtitle filter, the video and audio codecs, and the output path.

    It executes the FFmpeg command using FFmpegKit and waits for the conversion process to complete.

    Once the conversion is finished, it checks the return code to determine whether it succeeded. If successful, it emits a success event with the path to the exported video with subtitles. If cancelled, or if an error occurred, it emits the corresponding events to indicate the status.

    In summary, this function adds subtitles to a video by overlaying text at specific positions with specified styles. It uses FFmpeg for video processing and emits events to notify the application about the export status.

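    Hand-tuned pixel offsets per line, as in the Dart code above, are fragile because the rendered text width varies. drawtext can do the centering itself through its built-in variables (w/h for the frame, text_w/text_h for the rendered text). As a rough illustration of that idea in plain Python rather than FFmpegKit (subtitle data and file names here are made up), the same filter chain can be built with expression-based coordinates:

```python
def build_subtitle_filter(subtitles, fontfile, fontsize=24,
                          fontcolor="white", boxcolor="black@0.5"):
    """One drawtext per subtitle, joined into a single -vf chain.

    x=(w-text_w)/2 centers each line horizontally and y=h-text_h-40
    pins it near the bottom, so no per-line pixel offsets are needed.
    `subtitles` is a list of (start_seconds, end_seconds, text) tuples.
    """
    parts = []
    for start, end, text in subtitles:
        # Minimal escaping for drawtext's quoting rules.
        safe = text.replace("\\", "\\\\").replace(":", "\\:").replace("'", "\\'")
        parts.append(
            f"drawtext=text='{safe}'"
            f":enable='between(t,{start},{end})'"
            ":x=(w-text_w)/2:y=h-text_h-40"
            f":fontsize={fontsize}:fontcolor={fontcolor}"
            f":fontfile={fontfile}:box=1:boxcolor={boxcolor}"
        )
    return ",".join(parts)

if __name__ == "__main__":
    import subprocess
    vf = build_subtitle_filter(
        [(0, 4.0, "first subtitle"), (4.0, 6.24, "second subtitle")],
        fontfile="arial.ttf",  # placeholder font path
    )
    subprocess.run(["ffmpeg", "-y", "-i", "input.mp4", "-vf", vf,
                    "-c:v", "libx264", "-c:a", "copy", "out.mp4"])
```

    The same x/y expressions work unchanged inside the Dart string interpolation, since they are evaluated by ffmpeg at render time, not by the caller.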

  • How to mix videos from two participants whose recordings are not in sync into a video call layout, with Janus videoroom and ffmpeg?

    5 March 2024, by Adrien3001

    I am using Janus Gateway to implement a video call app with the videoroom plugin. If, for example, there are two participants in the call, there will be four recorded MJR files (one audio and one video MJR for each user). I use ffmpeg to first convert the audio to Opus and the video to WebM for each user, then combine the video with the audio for each user. Finally, I want to mix the resulting two WebM videos into a video call layout where one video overlays the other at the bottom right corner as a smaller screen (picture-in-picture mode). The problem is that sometimes one of the WebM videos has a longer duration than the other, and the final MP4 is out of sync. This is the bash script I'm using to process the recordings into the final MP4:


#!/bin/bash
set -x  # Enable debugging

# Check if the correct number of arguments are provided
if [ "$#" -ne 3 ]; then
    echo "Usage: $0 videoroomid bossid minionid"
    exit 1
fi

videoroomid=$1
bossid=$2
minionid=$3

# Define paths and tools
MJR_DIR="/opt/janus/recordings-folder"
OUTPUT_DIR="/opt/janus/recordings-folder"
JANUS_PP_REC="/usr/bin/janus-pp-rec"
FFMPEG="/usr/bin/ffmpeg"
THUMBNAIL_DIR="/home/adrienubuntu/okok.spassolab-ubuntu.com/okok-recordings-thumbnail"

# Function to convert MJR to WebM (for video) and Opus (for audio)
convert_mjr() {
    local mjr_file=$1
    local output_file=$2
    local type=$(basename "$mjr_file" | cut -d '-' -f 6)

    echo "Attempting to convert file: $mjr_file"

    if [ ! -f "$mjr_file" ]; then
        echo "MJR file not found: $mjr_file"
        return 1
    fi

    if [ "$type" == "audio" ]; then
        # Convert audio to Opus format
        echo "Converting audio to Opus: $mjr_file"
        $JANUS_PP_REC "$mjr_file" "$output_file"
        if [ $? -ne 0 ]; then
            echo "Conversion failed for file: $mjr_file"
            return 1
        fi
        # Check and adjust audio sample rate
        adjust_audio_sample_rate "$output_file"
    elif [ "$type" == "video" ]; then
        # Convert video to WebM format with VP8 video codec
        echo "Converting video to WebM: $mjr_file"
        $JANUS_PP_REC "$mjr_file" "$output_file"
        if [ $? -ne 0 ]; then
            echo "Conversion failed for file: $mjr_file"
            return 1
        fi
        # Check and convert to constant frame rate
        convert_to_constant_frame_rate "$output_file"
    fi

    echo "Conversion successful: $output_file"
    return 0
}

# Function to merge audio (Opus) and video (WebM) files
merge_audio_video() {
    local audio_file=$1
    local video_file=$2
    local merged_file="${video_file%.*}_merged.webm"

    echo "Merging audio and video files into: $merged_file"

    # Merge audio and video
    $FFMPEG -y -i "$video_file" -i "$audio_file" -c:v copy -c:a libopus -map 0:v:0 -map 1:a:0 "$merged_file"

    if [ $? -eq 0 ]; then
        echo "Merging successful: $merged_file"
        return 0
    else
        echo "Error during merging."
        return 1
    fi
}

# Function to check if MJR files exist
check_mjr_files_exist() {
    local videoroomid=$1
    local bossid=$2
    local minionid=$3

    if ! ls ${MJR_DIR}/videoroom-${videoroomid}-user-${bossid}-*-video-*.mjr &>/dev/null ||
       ! ls ${MJR_DIR}/videoroom-${videoroomid}-user-${minionid}-*-video-*.mjr &>/dev/null; then
        echo "Error: MJR files not found for videoroomid: $videoroomid, bossid: $bossid, minionid: $minionid"
        exit 1
    fi
}

# Function to calculate delay
calculate_delay() {
    local video1=$1
    local video2=$2

    # Get the start time of the first video
    local start_time1=$(ffprobe -v error -show_entries format=start_time -of default=noprint_wrappers=1:nokey=1 "$video1")

    # Get the start time of the second video
    local start_time2=$(ffprobe -v error -show_entries format=start_time -of default=noprint_wrappers=1:nokey=1 "$video2")

    # Calculate the delay (in seconds)
    local delay=$(echo "$start_time2 - $start_time1" | bc)

    # If the delay is negative, make it positive
    if [ $(echo "$delay < 0" | bc) -eq 1 ]; then
        delay=$(echo "-1 * $delay" | bc)
    fi

    echo "$delay"
}

# Function to adjust audio sample rate
adjust_audio_sample_rate() {
    local audio_file=$1
    local desired_sample_rate=48000  # Set the desired sample rate

    # Get the current sample rate of the audio file
    local current_sample_rate=$(ffprobe -v error -show_entries stream=sample_rate -of default=noprint_wrappers=1:nokey=1 "$audio_file")

    # Check if the sample rate needs to be adjusted
    if [ "$current_sample_rate" -ne "$desired_sample_rate" ]; then
        echo "Adjusting audio sample rate from $current_sample_rate to $desired_sample_rate"
        local temp_file="${audio_file%.*}_temp.opus"
        $FFMPEG -y -i "$audio_file" -ar "$desired_sample_rate" "$temp_file"
        mv "$temp_file" "$audio_file"
    fi
}

# Function to convert video to a constant frame rate
convert_to_constant_frame_rate() {
    local video_file=$1
    local desired_frame_rate=30  # Set the desired frame rate

    # Check if the video has a variable frame rate
    local has_vfr=$(ffprobe -v error -select_streams v -show_entries stream=r_frame_rate -of default=noprint_wrappers=1:nokey=1 "$video_file")

    if [ "$has_vfr" == "0/0" ]; then
        echo "Video has a variable frame rate. Converting to a constant frame rate of $desired_frame_rate fps."
        local temp_file="${video_file%.*}_temp.webm"
        $FFMPEG -y -i "$video_file" -r "$desired_frame_rate" -c:v libvpx -b:v 1M -c:a copy "$temp_file"
        mv "$temp_file" "$video_file"
    fi
}

# Main processing function
process_videos() {
    # Check if MJR files exist
    check_mjr_files_exist "$videoroomid" "$bossid" "$minionid"

    # Output a message indicating the start of processing
    echo "Processing started for videoroomid: $videoroomid, bossid: $bossid, minionid: $minionid"

    # Process boss's files
    local boss_audio_files=($(ls ${MJR_DIR}/videoroom-${videoroomid}-user-${bossid}-*-audio-*.mjr))
    local boss_video_files=($(ls ${MJR_DIR}/videoroom-${videoroomid}-user-${bossid}-*-video-*.mjr))
    local boss_merged_files=()

    for i in "${!boss_audio_files[@]}"; do
        local audio_file=${boss_audio_files[$i]}
        local video_file=${boss_video_files[$i]}
        convert_mjr "$audio_file" "${audio_file%.*}.opus"
        convert_mjr "$video_file" "${video_file%.*}.webm"
        if merge_audio_video "${audio_file%.*}.opus" "${video_file%.*}.webm"; then
            boss_merged_files+=("${video_file%.*}_merged.webm")
        fi
    done

    # Concatenate boss's merged files
    if [ ${#boss_merged_files[@]} -gt 0 ]; then
        local boss_concat_list=$(mktemp)
        for file in "${boss_merged_files[@]}"; do
            echo "file '$file'" >> "$boss_concat_list"
        done
        $FFMPEG -y -f concat -safe 0 -i "$boss_concat_list" -c copy "${OUTPUT_DIR}/${bossid}_final.webm"
        rm "$boss_concat_list"
    fi

    # Process minion's files
    local minion_audio_files=($(ls ${MJR_DIR}/videoroom-${videoroomid}-user-${minionid}-*-audio-*.mjr))
    local minion_video_files=($(ls ${MJR_DIR}/videoroom-${videoroomid}-user-${minionid}-*-video-*.mjr))
    local minion_merged_files=()

    for i in "${!minion_audio_files[@]}"; do
        local audio_file=${minion_audio_files[$i]}
        local video_file=${minion_video_files[$i]}
        convert_mjr "$audio_file" "${audio_file%.*}.opus"
        convert_mjr "$video_file" "${video_file%.*}.webm"
        if merge_audio_video "${audio_file%.*}.opus" "${video_file%.*}.webm"; then
            minion_merged_files+=("${video_file%.*}_merged.webm")
        fi
    done

    # Concatenate minion's merged files
    if [ ${#minion_merged_files[@]} -gt 0 ]; then
        local minion_concat_list=$(mktemp)
        for file in "${minion_merged_files[@]}"; do
            echo "file '$file'" >> "$minion_concat_list"
        done
        $FFMPEG -y -f concat -safe 0 -i "$minion_concat_list" -c copy "${OUTPUT_DIR}/${minionid}_final.webm"
        rm "$minion_concat_list"
    fi

    if [ -f "${OUTPUT_DIR}/${bossid}_final.webm" ] && [ -f "${OUTPUT_DIR}/${minionid}_final.webm" ]; then
        final_mp4="${OUTPUT_DIR}/final-output-${videoroomid}-${bossid}-${minionid}.mp4"
        echo "Combining boss and minion videos into: $final_mp4"

        # Calculate the delay between the boss and minion videos
        delay=$(calculate_delay "${OUTPUT_DIR}/${bossid}_final.webm" "${OUTPUT_DIR}/${minionid}_final.webm")

        # Convert the delay to milliseconds for the adelay filter
        delay_ms=$(echo "$delay * 1000" | bc)

        $FFMPEG -i "${OUTPUT_DIR}/${bossid}_final.webm" -i "${OUTPUT_DIR}/${minionid}_final.webm" -filter_complex \
        "[0:v]transpose=1,scale=160:-1[boss_clip]; \
        [0:a]volume=2.0[boss_audio]; \
        [1:a]volume=2.0,adelay=${delay_ms}|${delay_ms}[minion_audio]; \
        [1:v][boss_clip]overlay=W-w-10:H-h-10:shortest=0[output]; \
        [boss_audio][minion_audio]amix=inputs=2:duration=longest[audio]" \
        -map "[output]" -map "[audio]" -c:v libx264 -crf 20 -preset veryfast -c:a aac -strict experimental "$final_mp4"

        if [ $? -ne 0 ]; then
            echo "Error combining boss and minion videos"
            exit 1
        else
            echo "Combining boss and minion videos successful"
            # Generate a thumbnail at 5 seconds into the video
            thumbnail="${OUTPUT_DIR}/$(basename "$final_mp4" .mp4).png"
            echo "Generating thumbnail for: $final_mp4"
            $FFMPEG -ss 00:00:05 -i "$final_mp4" -vframes 1 -q:v 2 "$thumbnail"
            echo "Thumbnail generated: $thumbnail"
            sudo mv -f "$thumbnail" "/home/adrienubuntu/okok.spassolab-ubuntu.com/okok-recordings-thumbnail/"

            sudo mv -f "$final_mp4" "/home/adrienubuntu/okok.spassolab-ubuntu.com/okok-live-recordings/"
            rm -f "${OUTPUT_DIR}"/*.opus "${OUTPUT_DIR}"/*.webm "${OUTPUT_DIR}"/*.mp4
        fi
    else
        echo "Error: One or both final videos are missing"
        exit 1
    fi

    # Output a message indicating the end of processing
    echo "Processing completed for videoroomid: $videoroomid, bossid: $bossid, minionid: $minionid"
}

process_videos

    I then test by calling ./name-of-file.sh videoroomid bossid minionid. Is there a way to solve this while still keeping the whole process dynamic? Thanks in advance.
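    One way to attack the desync, sketched here in Python rather than bash (an assumption-laden sketch, not a verified fix: it assumes both recordings end at roughly the same wall-clock moment, so the difference in container durations approximates how much later the shorter clip started; file names are placeholders): measure both durations with ffprobe, then pad the later-starting input's video with tpad and its audio with adelay, so picture and sound shift together before the overlay instead of only the audio being delayed as in the script above.

```python
import json
import subprocess

def media_duration(path):
    """Container duration in seconds, read from ffprobe's JSON output."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-show_entries", "format=duration",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    )
    return float(json.loads(out.stdout)["format"]["duration"])

def build_mix_args(boss, minion, out_path, gap_seconds):
    """ffmpeg argv that delays the boss clip by gap_seconds on both
    streams: tpad prepends black video and adelay prepends audio
    silence, keeping the two streams aligned through the PiP overlay."""
    gap_ms = int(gap_seconds * 1000)
    filter_complex = (
        f"[0:v]tpad=start_duration={gap_seconds}:start_mode=add:color=black,"
        "transpose=1,scale=160:-1[pip];"
        f"[0:a]adelay={gap_ms}|{gap_ms}[boss_a];"
        "[1:v][pip]overlay=W-w-10:H-h-10[v];"
        "[boss_a][1:a]amix=inputs=2:duration=longest[a]"
    )
    return ["ffmpeg", "-y", "-i", boss, "-i", minion,
            "-filter_complex", filter_complex,
            "-map", "[v]", "-map", "[a]",
            "-c:v", "libx264", "-crf", "20", "-c:a", "aac", out_path]

if __name__ == "__main__":
    boss, minion = "boss_final.webm", "minion_final.webm"  # placeholder names
    gap = max(0.0, media_duration(minion) - media_duration(boss))
    subprocess.run(build_mix_args(boss, minion, "final.mp4", gap), check=True)
```

    When the boss recording turns out to be the longer one, the same padding would instead be applied to the minion input.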