
Media (1)
-
Richard Stallman and free software
19 October 2011, by
Updated: May 2013
Language: French
Type: Text
Other articles (106)
-
Improving the base version
13 September 2013
A nicer multiple-select
The Chosen plugin improves the ergonomics of multiple-select fields. Compare the two images below.
To use it, simply activate the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying the form elements to enhance, for example select[multiple] for multiple-select lists (...)
-
Farm management
2 March 2010, by
The farm as a whole is managed by "super admins".
Certain settings can be adjusted to balance the needs of the different channels.
Initially it uses the "Gestion de mutualisation" plugin.
-
MediaSPIP Player: potential problems
22 February 2011, by
The player does not work on Internet Explorer
On Internet Explorer (8 and 7 at least), the plugin uses the Flash player Flowplayer to play video and audio. If the player does not seem to work, the cause may lie in the configuration of Apache's mod_deflate module.
If the configuration of that Apache module contains a line resembling the following, try removing or commenting it out to see whether the player then works correctly: (...)
On other sites (9905)
-
How can I save the downloaded and converted audio files to a specific folder?
19 December 2022, by Retire Young
I'm trying to download videos from a specific playlist and convert each one to an m4a file. The process works perfectly, but the files are saved in the same folder as my Python script. I chose a specific output folder, but the files don't appear there.
Is there a way to save the files in the desired folder?


An error message reads as follows :
"input.mp4: No such file or directory"

In my code block, I censored the playlist link and the username in the path. Just imagine a working playlist link instead. Thanks in advance!

import os
import subprocess

# Replace with the URL of the YouTube playlist
playlist_url = "PLAYLISTLINK"

# Set the output directory for the downloaded files
output_dir = "/Users/USER/Documents/VideoBot/Output"

# Use youtube-dl to download the latest video in the playlist
subprocess.run(["youtube-dl", "-f", "bestaudio[ext=m4a]", "--playlist-start", "1", "--playlist-end", "1", playlist_url])

# Extract the audio from the downloaded video using ffmpeg
subprocess.run(["ffmpeg", "-i", "input.mp4", "-vn", "-acodec", "copy", "output.m4a"])

# Move the extracted audio file to the output directory
os.rename("output.m4a", os.path.join(output_dir, "output.m4a"))
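A likely direction for this (sketched under the assumption that youtube-dl's `-o` output template is available, which it is in standard builds): let youtube-dl write straight into the output directory instead of renaming afterwards, which also avoids the hard-coded `input.mp4` that triggers the "No such file or directory" error. `PLAYLISTLINK` and the output path are the question's own placeholders.

```python
import os
import subprocess

def build_download_cmd(playlist_url, output_dir):
    """Build a youtube-dl command whose -o output template points at
    output_dir, so the m4a lands there directly and no rename is needed."""
    return [
        "youtube-dl",
        "-f", "bestaudio[ext=m4a]",
        "--playlist-start", "1", "--playlist-end", "1",
        # %(title)s and %(ext)s are youtube-dl template fields
        "-o", os.path.join(output_dir, "%(title)s.%(ext)s"),
        playlist_url,
    ]

# subprocess.run(build_download_cmd("PLAYLISTLINK",
#                "/Users/USER/Documents/VideoBot/Output"))
```

With `-f bestaudio[ext=m4a]` the downloaded file is already m4a, so the separate ffmpeg extraction step may not be needed at all.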


-
How to save a video with a text widget on top that changes every couple of seconds in Flutter?
23 September 2023, by abdallah mostafa
I've been working on an auto-subtitle tool for videos, but I don't know how to save the final video.
Should I record the video, or take screenshots of all the frames and combine them into a video?


I've used FFmpegKit, but it's very hard to get the position of the text right.


Future<void> saveSubtitle(
    {required double leftPosition,
    required double topPosition,
    required double opacityOfBackground,
    required String backgroundColor,
    required String subtitleColor}) async {
  emit(ExportSubtitleLoading());

  String fontDirectoryPath =
      await _exportSubtitle.writeFontToFile('assets/fonts/arial.ttf');
  if (backgroundColor == 'Transparent') {
    opacityOfBackground = 0.0;
    backgroundColor = 'black';
  }
  String subtitleFilter = "";
  for (var subtitle in subtitles!.fotmatedSubtitle!) {
    double startTime = _exportSubtitle.timeToSeconds(subtitle.interval![0]);
    double endTime = _exportSubtitle.timeToSeconds(subtitle.interval![1]);
    String text = subtitle.displayText!.replaceComma;
    int fontSize = controller!.value.aspectRatio > 0.5625 ? 24 * 3 : 24;
    if (countWords(text) > 9 && countWords(text) <= 15) {
      // Add line breaks ("\n") to the text
      text = _exportSubtitle.addLineBreaks(text);
    } else {
      text = _exportSubtitle.addLineBreaks(text, true);
    }
    final centeredNumber = text.split('\n');
    for (var i = 0; i < centeredNumber.length; i++) {
      if (i == 0) {
        if (centeredNumber.length > 1 &&
            centeredNumber[i].split(' ').join().length >
                centeredNumber[i + 1].split(' ').join().length) {
          subtitleFilter +=
              " drawtext=text='${centeredNumber[i]}':enable='between(t,$startTime,$endTime)':x=$leftPosition-30:y=$topPosition:fontsize=$fontSize:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
        } else {
          subtitleFilter +=
              " drawtext=text='${centeredNumber[i]}':enable='between(t,$startTime,$endTime)':x=$leftPosition+20:y=$topPosition:fontsize=$fontSize:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
        }
      } else if (i == 1) {
        subtitleFilter +=
            " drawtext=text='${centeredNumber[i]}':enable='between(t,$startTime,$endTime)':x=$leftPosition:y=$topPosition+25:fontsize=$fontSize:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
      } else {
        if (centeredNumber.length > 1 &&
            centeredNumber[i - 1].split(' ').join().length >
                centeredNumber[i].split(' ').join().length) {
          subtitleFilter +=
              " drawtext=text='${centeredNumber[i]}':enable='between(t,$startTime,$endTime)':x=$leftPosition+text_w/16:y=$topPosition+50:fontsize=$fontSize:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
        } else {
          subtitleFilter +=
              " drawtext=text='${centeredNumber[i]}':enable='between(t,$startTime,$endTime)':x=$leftPosition-text_w/16:y=$topPosition+50:fontsize=$fontSize:fontcolor=$subtitleColor:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@$opacityOfBackground,";
        }
      }
    }
  }

  final finalFilter = "\"$subtitleFilter\"";
  final dir = await getTemporaryDirectory();
  String outputPath = '${dir.path}/ex_vid.mp4';
  final arguments = [
    '-y',
    '-i',
    inputFile,
    '-vf',
    finalFilter,
    '-c:v',
    'libx264',
    '-c:a',
    'copy',
    outputPath
  ];
  arguments.join(' ').logger;

  // Hard-coded test command, logged for comparison with the built arguments
  String command =
      "-y -i $inputFile -vf \" drawtext=text='You know those cat are memes that everybody uses\nin their videos and the TV movie clips that people use.':enable='between(t,0,4.000)':x=(w-text_w)/2:y=(h-text_h)/2:fontsize=24:fontcolor=white:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@0.5, drawtext=text='Well, who are the four best free\nwebsites to find a move?':enable='between(t,4.000,6.240)':x=$leftPosition:y=$topPosition:fontsize=24:fontcolor=white:fontfile=$fontDirectoryPath:box=1:boxcolor=$backgroundColor@0.5 \" -c:v libx264 -c:a copy $outputPath";
  command.logger;

  await FFmpegKit.execute(arguments.join(' ')).then((session) async {
    final returnCode = await session.getReturnCode();

    if (ReturnCode.isSuccess(returnCode)) {
      ('The conversion succeeded').logger;
      final path = await _exportSubtitle.exportFile(File(outputPath));
      emit(ExportSubtitleSuccess(path));
    } else if (ReturnCode.isCancel(returnCode)) {
      ('The conversion was cancelled').logger;
    } else {
      emit(ExportSubtitleerror());
      ('The conversion produced an error').logger;
    }
  });
}


This function, named saveSubtitle, is responsible for applying subtitles to a video using FFmpeg. Here's a breakdown of what it does:


It starts by emitting an event to indicate that the subtitle export process is loading.


It obtains the file path of a font (arial.ttf) from assets and stores it in fontDirectoryPath.


It checks if the background color for subtitles is set to "Transparent." If so, it sets the opacityOfBackground to 0.0 and changes the backgroundColor to black.


It initializes an empty subtitleFilter string, which will store FFmpeg filter commands for each subtitle.


It iterates through the subtitles and calculates the start and end time, text, and font size for each subtitle.


For each subtitle, it calculates the position (x and y coordinates) based on the leftPosition and topPosition. It also sets the font color, font file path, and background color with opacity for the subtitle.


It appends the FFmpeg drawtext filter command for each subtitle to the subtitleFilter string.


After processing all subtitles, it wraps the subtitleFilter string in double quotes and prepares to use it as an argument for the FFmpeg command.


It specifies the output path for the video with subtitles.


It constructs the FFmpeg command using various arguments, including the input video file, the subtitle filter, video and audio codecs, and the output path.


It executes the FFmpeg command using FFmpegKit and waits for the conversion process to complete.


Once the conversion is finished, it checks the return code to determine if it was successful. If successful, it emits a success event with the path to the exported video with subtitles. If canceled or if an error occurred, it emits corresponding events to indicate the status.


In summary, this function is used to add subtitles to a video by overlaying text on specific positions and with specified styles. It utilizes FFmpeg for video processing and emits events to notify the application about the export status.
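Much of the hand-tuned positioning above (the `-30`, `+20`, `text_w/16` offsets) can be avoided by letting drawtext center each line itself with its `w`, `h`, `text_w`, and `text_h` expression variables. A minimal sketch of that idea (written in Python here purely to assemble the filter string; in the Flutter app the same string would be built in Dart, and the line-spacing constant is an assumption):

```python
def drawtext_filter(lines, start, end, fontfile="arial.ttf", fontsize=24,
                    color="white", boxcolor="black@0.5"):
    """Build a drawtext filter chain that centers each subtitle line
    horizontally and stacks the lines from the vertical center downward,
    instead of using per-line pixel offsets."""
    parts = []
    for i, line in enumerate(lines):
        parts.append(
            f"drawtext=text='{line}'"
            f":enable='between(t,{start},{end})'"
            f":x=(w-text_w)/2"                       # ffmpeg centers each line
            f":y=(h-text_h)/2+{i * (fontsize + 6)}"  # stack lines downward
            f":fontsize={fontsize}:fontcolor={color}"
            f":fontfile={fontfile}:box=1:boxcolor={boxcolor}"
        )
    return ",".join(parts)
```

Because `x` and `y` are evaluated per line by ffmpeg, the text stays centered regardless of line length, which removes the need to compare line widths in the Dart loop.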


-
How to mix videos from two participants into a video-call layout when the participants' videos are not in sync, using Janus videoroom and ffmpeg?
5 March 2024, by Adrien3001
I am using Janus Gateway to implement a video-call app with the videoroom plugin. If, for example, there are two participants in the call, there will be four recorded mjr files (one audio and one video mjr per user). I use ffmpeg to first convert each user's audio to Opus and video to WebM, then combine the video with the audio for each user. Finally, I want to mix the two resulting WebM videos into a video-call layout where one video overlays the other at the bottom-right corner as a smaller screen (picture-in-picture mode). The problem is that sometimes one of the WebM videos has a longer duration than the other, and the final mp4 is out of sync. This is the bash script I'm using to process the recordings into the final mp4:


#!/bin/bash
set -x # Enable debugging

# Check if the correct number of arguments are provided
if [ "$#" -ne 3 ]; then
  echo "Usage: $0 videoroomid bossid minionid"
  exit 1
fi

videoroomid=$1
bossid=$2
minionid=$3

# Define paths and tools
MJR_DIR="/opt/janus/recordings-folder"
OUTPUT_DIR="/opt/janus/recordings-folder"
JANUS_PP_REC="/usr/bin/janus-pp-rec"
FFMPEG="/usr/bin/ffmpeg"
THUMBNAIL_DIR="/home/adrienubuntu/okok.spassolab-ubuntu.com/okok-recordings-thumbnail"

# Function to convert MJR to WebM (for video) and Opus (for audio)
convert_mjr() {
  local mjr_file=$1
  local output_file=$2
  local type=$(basename "$mjr_file" | cut -d '-' -f 6)

  echo "Attempting to convert file: $mjr_file"

  if [ ! -f "$mjr_file" ]; then
    echo "MJR file not found: $mjr_file"
    return 1
  fi

  if [ "$type" == "audio" ]; then
    # Convert audio to Opus format
    echo "Converting audio to Opus: $mjr_file"
    $JANUS_PP_REC "$mjr_file" "$output_file"
    if [ $? -ne 0 ]; then
      echo "Conversion failed for file: $mjr_file"
      return 1
    fi
    # Check and adjust audio sample rate
    adjust_audio_sample_rate "$output_file"
  elif [ "$type" == "video" ]; then
    # Convert video to WebM format with VP8 video codec
    echo "Converting video to WebM: $mjr_file"
    $JANUS_PP_REC "$mjr_file" "$output_file"
    if [ $? -ne 0 ]; then
      echo "Conversion failed for file: $mjr_file"
      return 1
    fi
    # Check and convert to constant frame rate
    convert_to_constant_frame_rate "$output_file"
  fi

  echo "Conversion successful: $output_file"
  return 0
}

# Function to merge audio (Opus) and video (WebM) files
merge_audio_video() {
  local audio_file=$1
  local video_file=$2
  local merged_file="${video_file%.*}_merged.webm"

  echo "Merging audio and video files into: $merged_file"

  # Merge audio and video
  $FFMPEG -y -i "$video_file" -i "$audio_file" -c:v copy -c:a libopus -map 0:v:0 -map 1:a:0 "$merged_file"

  if [ $? -eq 0 ]; then
    echo "Merging successful: $merged_file"
    return 0
  else
    echo "Error during merging."
    return 1
  fi
}

# Function to check if MJR files exist
check_mjr_files_exist() {
  local videoroomid=$1
  local bossid=$2
  local minionid=$3

  if ! ls ${MJR_DIR}/videoroom-${videoroomid}-user-${bossid}-*-video-*.mjr &>/dev/null ||
    ! ls ${MJR_DIR}/videoroom-${videoroomid}-user-${minionid}-*-video-*.mjr &>/dev/null; then
    echo "Error: MJR files not found for videoroomid: $videoroomid, bossid: $bossid, minionid: $minionid"
    exit 1
  fi
}

# Function to calculate delay
calculate_delay() {
  local video1=$1
  local video2=$2

  # Get the start time of the first video
  local start_time1=$(ffprobe -v error -show_entries format=start_time -of default=noprint_wrappers=1:nokey=1 "$video1")

  # Get the start time of the second video
  local start_time2=$(ffprobe -v error -show_entries format=start_time -of default=noprint_wrappers=1:nokey=1 "$video2")

  # Calculate the delay (in seconds)
  local delay=$(echo "$start_time2 - $start_time1" | bc)

  # If the delay is negative, make it positive
  if [ $(echo "$delay < 0" | bc) -eq 1 ]; then
    delay=$(echo "-1 * $delay" | bc)
  fi

  echo "$delay"
}

# Function to adjust audio sample rate
adjust_audio_sample_rate() {
  local audio_file=$1
  local desired_sample_rate=48000 # Set the desired sample rate

  # Get the current sample rate of the audio file
  local current_sample_rate=$(ffprobe -v error -show_entries stream=sample_rate -of default=noprint_wrappers=1:nokey=1 "$audio_file")

  # Check if the sample rate needs to be adjusted
  if [ "$current_sample_rate" -ne "$desired_sample_rate" ]; then
    echo "Adjusting audio sample rate from $current_sample_rate to $desired_sample_rate"
    local temp_file="${audio_file%.*}_temp.opus"
    $FFMPEG -y -i "$audio_file" -ar "$desired_sample_rate" "$temp_file"
    mv "$temp_file" "$audio_file"
  fi
}

# Function to convert video to a constant frame rate
convert_to_constant_frame_rate() {
  local video_file=$1
  local desired_frame_rate=30 # Set the desired frame rate

  # Check if the video has a variable frame rate
  local has_vfr=$(ffprobe -v error -select_streams v -show_entries stream=r_frame_rate -of default=noprint_wrappers=1:nokey=1 "$video_file")

  if [ "$has_vfr" == "0/0" ]; then
    echo "Video has a variable frame rate. Converting to a constant frame rate of $desired_frame_rate fps."
    local temp_file="${video_file%.*}_temp.webm"
    $FFMPEG -y -i "$video_file" -r "$desired_frame_rate" -c:v libvpx -b:v 1M -c:a copy "$temp_file"
    mv "$temp_file" "$video_file"
  fi
}

# Main processing function
process_videos() {
  # Check if MJR files exist
  check_mjr_files_exist "$videoroomid" "$bossid" "$minionid"

  # Output a message indicating the start of processing
  echo "Processing started for videoroomid: $videoroomid, bossid: $bossid, minionid: $minionid"

  # Process boss's files
  local boss_audio_files=($(ls ${MJR_DIR}/videoroom-${videoroomid}-user-${bossid}-*-audio-*.mjr))
  local boss_video_files=($(ls ${MJR_DIR}/videoroom-${videoroomid}-user-${bossid}-*-video-*.mjr))
  local boss_merged_files=()

  for i in "${!boss_audio_files[@]}"; do
    local audio_file=${boss_audio_files[$i]}
    local video_file=${boss_video_files[$i]}
    convert_mjr "$audio_file" "${audio_file%.*}.opus"
    convert_mjr "$video_file" "${video_file%.*}.webm"
    if merge_audio_video "${audio_file%.*}.opus" "${video_file%.*}.webm"; then
      boss_merged_files+=("${video_file%.*}_merged.webm")
    fi
  done

  # Concatenate boss's merged files
  if [ ${#boss_merged_files[@]} -gt 0 ]; then
    local boss_concat_list=$(mktemp)
    for file in "${boss_merged_files[@]}"; do
      echo "file '$file'" >> "$boss_concat_list"
    done
    $FFMPEG -y -f concat -safe 0 -i "$boss_concat_list" -c copy "${OUTPUT_DIR}/${bossid}_final.webm"
    rm "$boss_concat_list"
  fi

  # Process minion's files
  local minion_audio_files=($(ls ${MJR_DIR}/videoroom-${videoroomid}-user-${minionid}-*-audio-*.mjr))
  local minion_video_files=($(ls ${MJR_DIR}/videoroom-${videoroomid}-user-${minionid}-*-video-*.mjr))
  local minion_merged_files=()

  for i in "${!minion_audio_files[@]}"; do
    local audio_file=${minion_audio_files[$i]}
    local video_file=${minion_video_files[$i]}
    convert_mjr "$audio_file" "${audio_file%.*}.opus"
    convert_mjr "$video_file" "${video_file%.*}.webm"
    if merge_audio_video "${audio_file%.*}.opus" "${video_file%.*}.webm"; then
      minion_merged_files+=("${video_file%.*}_merged.webm")
    fi
  done

  # Concatenate minion's merged files
  if [ ${#minion_merged_files[@]} -gt 0 ]; then
    local minion_concat_list=$(mktemp)
    for file in "${minion_merged_files[@]}"; do
      echo "file '$file'" >> "$minion_concat_list"
    done
    $FFMPEG -y -f concat -safe 0 -i "$minion_concat_list" -c copy "${OUTPUT_DIR}/${minionid}_final.webm"
    rm "$minion_concat_list"
  fi

  if [ -f "${OUTPUT_DIR}/${bossid}_final.webm" ] && [ -f "${OUTPUT_DIR}/${minionid}_final.webm" ]; then
    final_mp4="${OUTPUT_DIR}/final-output-${videoroomid}-${bossid}-${minionid}.mp4"
    echo "Combining boss and minion videos into: $final_mp4"

    # Calculate the delay between the boss and minion videos
    delay=$(calculate_delay "${OUTPUT_DIR}/${bossid}_final.webm" "${OUTPUT_DIR}/${minionid}_final.webm")

    # Convert the delay to milliseconds for the adelay filter
    delay_ms=$(echo "$delay * 1000" | bc)

    $FFMPEG -i "${OUTPUT_DIR}/${bossid}_final.webm" -i "${OUTPUT_DIR}/${minionid}_final.webm" -filter_complex \
      "[0:v]transpose=1,scale=160:-1[boss_clip]; \
       [0:a]volume=2.0[boss_audio]; \
       [1:a]volume=2.0,adelay=${delay_ms}|${delay_ms}[minion_audio]; \
       [1:v][boss_clip]overlay=W-w-10:H-h-10:shortest=0[output]; \
       [boss_audio][minion_audio]amix=inputs=2:duration=longest[audio]" \
      -map "[output]" -map "[audio]" -c:v libx264 -crf 20 -preset veryfast -c:a aac -strict experimental "$final_mp4"

    if [ $? -ne 0 ]; then
      echo "Error combining boss and minion videos"
      exit 1
    else
      echo "Combining boss and minion videos successful"
      # Generate a thumbnail at 5 seconds into the video
      thumbnail="${OUTPUT_DIR}/$(basename "$final_mp4" .mp4).png"
      echo "Generating thumbnail for: $final_mp4"
      $FFMPEG -ss 00:00:05 -i "$final_mp4" -vframes 1 -q:v 2 "$thumbnail"
      echo "Thumbnail generated: $thumbnail"
      sudo mv -f "$thumbnail" "$THUMBNAIL_DIR/"

      sudo mv -f "$final_mp4" "/home/adrienubuntu/okok.spassolab-ubuntu.com/okok-live-recordings/"
      rm -f "${OUTPUT_DIR}"/*.opus "${OUTPUT_DIR}"/*.webm "${OUTPUT_DIR}"/*.mp4
    fi
  else
    echo "Error: One or both final videos are missing"
    exit 1
  fi

  # Output a message indicating the end of processing
  echo "Processing completed for videoroomid: $videoroomid, bossid: $bossid, minionid: $minionid"
}

process_videos



I then test it by running ./name-of-file.sh videoroomid bossid minionid. Is there a way to solve this while keeping the whole process dynamic? Thanks in advance.
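One plausible cause of the drift: the script shifts only the minion's audio (adelay) while the minion's video keeps its original timestamps, so picture and sound diverge by exactly the computed delay. A minimal sketch of a filter graph that shifts the minion's video by the same amount using ffmpeg's tpad filter (Python is used here only to assemble the string; `start_mode=clone`, which freezes the first frame during the padding, is an assumption about the desired behavior):

```python
def build_pip_filter(delay_s: float) -> str:
    """Build a filter_complex mirroring the script's PiP layout, but with the
    minion's VIDEO delayed by tpad as well as its audio by adelay, so both
    streams stay aligned with the boss recording."""
    delay_ms = int(delay_s * 1000)
    return (
        f"[0:v]transpose=1,scale=160:-1[boss_clip];"
        f"[0:a]volume=2.0[boss_audio];"
        # pad the minion video at the start by the same delay as its audio
        f"[1:v]tpad=start_duration={delay_s}:start_mode=clone[minion_v];"
        f"[1:a]volume=2.0,adelay={delay_ms}|{delay_ms}[minion_audio];"
        f"[minion_v][boss_clip]overlay=W-w-10:H-h-10:shortest=0[output];"
        f"[boss_audio][minion_audio]amix=inputs=2:duration=longest[audio]"
    )
```

Note that the script's delay uses `ffprobe format=start_time`, which is often 0 for remuxed WebM files; deriving the offset from the mjr filenames' capture timestamps (which Janus embeds) may be more reliable, but which stream actually starts later should be checked before choosing where to apply the padding.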