
Other articles (98)
-
MediaSPIP 0.1 Beta version
25 April 2011. MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)
-
APPENDIX: The plugins used specifically for the farm
5 March 2010. The central/master site of the farm needs several additional plugins, on top of those used by the channels, in order to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests for the creation of a mutualisation instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)
-
Publishing on MédiaSpip
13 June 2013. Can I post content from an iPad tablet?
Yes, if your installed MédiaSpip is at version 0.2 or higher. If needed, contact the administrator of your MédiaSpip to find out.
On other sites (9317)
-
frame skipping problem in ffmpeg and ffprobe
3 May 2019, by user3851234. I am trying to check the details of the encoded frames using ffmpeg and ffprobe. When I print the details, I notice that the details for some frames are missing. For example, using the following command
ffmpeg -i input.avi -vf showinfo -f null -
I get :
n : 247 pts : 247 pts_time:8.24157 pos : 624096 ...
n : 248 pts : 249 pts_time:8.3083 pos : 624168 ...
n : 249 pts : 258 pts_time:8.6086 pos : 624352 ...
n : 250 pts : 270 pts_time:9.009 pos : 630212 ...
You can see that there is a jump from 258 to 270. In addition, when I print the number of decoded frames, it doesn't match the actual number of frames.
ffprobe -count_frames -select_streams v:0 -show_entries stream=nb_read_frames -of default=nokey=1:noprint_wrappers=1 input.avi
It outputs 480, but the correct number is 509.
Changing nb_read_frames to nb_frames solves the frame-count problem, but the other details are still missing.
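A per-frame listing can also be obtained directly from ffprobe, which makes it easier to see exactly which timestamps are present (shown here only as a cross-check; the exact output fields vary slightly between ffprobe versions):
ffprobe -v error -select_streams v:0 -show_frames -of compact input.avi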
Why does ffmpeg/ffprobe skip some frames?
-
The problem with the AudioDispatcher, the analysis in audioDispatcherFactory is not running, TarsosDSP
4 February, by roman_gor_. I'm making an application for sound analysis and spotlight control: the colors of the spotlight change to the beat of the music. I use the TarsosDSP library for this, and I additionally downloaded the FFmpeg-Kit library to convert the audio to WAV format (16-bit little-endian PCM) for the AudioDispatcher.
The problem is that even when the audio is passed in the correct format, the dispatcher starts and immediately ends. The boolean process() method is never executed, but the processingFinished() method is. I found out that the stream starts, the file is not empty and is converted to the correct format, BUT the getFrameLength() method of the AudioStream to which I pass the file path returns -1, i.e. it is not actually filled in. I've already searched through everything, including the library code on GitHub and all the neural networks, and I don't know how to solve this. Is the problem with AudioDispatcher and AudioDispatcherFactory.fromPipe()?


private void playAndAnalyzeAudio(String filePath, Uri uri)
{
    if (mediaPlayer != null)
        mediaPlayer.release();
    mediaPlayer = MediaPlayer.create(requireContext(), uri);

    new Thread(() -> {
        // inputFilePath / outputFilePath are presumably class fields set elsewhere (not shown)
        extractAudio(inputFilePath, outputFilePath);
        getActivity().runOnUiThread(() -> {
            mediaPlayer = MediaPlayer.create(requireContext(), uri);
            if (mediaPlayer != null) {
                mediaPlayer.start(); // Start music after analysis
                startSendingData(); // Start sending data
            }
        });
    }).start();
}

private void analyzeAudio(String filePath)
{
    try {
        AudioDispatcher audioDispatcher = AudioDispatcherFactory.fromPipe(filePath, 44100, 1024, 0);
        MFCC mfcc = new MFCC(1024, 44100, 13, 50, 20, 10000);
        audioDispatcher.addAudioProcessor(mfcc);
        Log.d("AUDIO_ANALYSIS", "Starting analysis of the audio file... " + audioDispatcher);
        audioDispatcher.addAudioProcessor(new AudioProcessor() {
            @Override
            public boolean process(AudioEvent audioEvent) {
                Log.d("AUDIO_ANALYSIS", "Processing audio...");

                float[] amplitudes = audioEvent.getFloatBuffer();
                Log.d("AUDIO_ANALYSIS", "Buffer size: " + amplitudes.length);

                float[] mfccs = mfcc.getMFCC();
                if (mfccs == null) {
                    Log.e("AUDIO_ANALYSIS", "MFCC was not generated!");
                    return true;
                }

                float currentBass = mfccs[0] + mfccs[1];
                float totalEnergy = 0;
                for (float amp : amplitudes) {
                    totalEnergy += Math.abs(amp);
                }

                Log.d("AUDIO_ANALYSIS", "Bass Energy: " + currentBass + ", Total Energy: " + totalEnergy);

                if (currentBass > BASS_THRESHOLD || totalEnergy > ENERGY_THRESHOLD) {
                    changeColor();
                    Log.d("SONG", "Color was changed to: " + currentColor);
                    brightness = MAX_BRIGHTNESS;
                } else {
                    brightness *= 0.9f;
                }

                return true;
            }

            @Override
            public void processingFinished() {
                getActivity().runOnUiThread(() -> Toast.makeText(requireContext(), "Analysis finished", Toast.LENGTH_SHORT).show());
            }
        });
        File file = new File(filePath);
        if (!file.exists() || file.length() == 0) {
            Log.e("AUDIO_ANALYSIS", "Error: file is empty! " + filePath);
            return;
        } else {
            Log.d("AUDIO_ANALYSIS", "File exists, size: " + file.length() + " bytes.");
        }
        Log.d("AUDIO_ANALYSIS", "Start of analysis: " + filePath);
        File ffmpegFile = new File(getContext().getCacheDir(), "ffmpeg");
        if (!ffmpegFile.setExecutable(true)) {
            Log.e("AUDIO_ANALYSIS", "Could not make the ffmpeg binary executable!");
        } else {
            Log.e("AUDIO_ANALYSIS", "The ffmpeg binary is executable!");
        }

        new Thread(() -> {
            Log.d("AUDIO_ANALYSIS", "Starting dispatcher...");
            audioDispatcher.run();
            Log.d("AUDIO_ANALYSIS", "Dispatcher finished.");
        }).start();
    } catch (Exception e) {
        e.printStackTrace();
        Toast.makeText(requireContext(), "Error during analysis", Toast.LENGTH_SHORT).show();
    }
}
public void extractAudio(String inputFilePath, String outputFilePath) {
    File outputFile = new File(outputFilePath);
    if (outputFile.exists()) {
        outputFile.delete(); // Delete any existing output file
    }
    // Build the command to extract the audio track as 16-bit PCM WAV
    String command = "-i " + inputFilePath + " -vn -acodec pcm_s16le -ar 44100 -ac 2 " + outputFilePath;

    // Use FFmpegKit to run the command asynchronously
    FFmpegKit.executeAsync(command, session -> {
        if (session.getReturnCode().isSuccess()) {
            Log.d("AUDIO_EXTRACT", "Audio extracted successfully: " + outputFilePath);
            analyzeAudio(outputFilePath); // Continue with the audio analysis
        } else {
            Log.e("AUDIO_EXTRACT", "Audio extraction error: " + session.getFailStackTrace());
        }
    });
}
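A side note on the command string above: FFmpegKit splits it into arguments itself, so paths containing spaces need to be quoted. A minimal sketch of the same command with quoted paths (assuming the paths may contain spaces; the options are otherwise unchanged):

// Sketch: quote the input/output paths so spaces in file names survive
// FFmpegKit's command-string parsing.
String command = "-i \"" + inputFilePath + "\" -vn -acodec pcm_s16le"
        + " -ar 44100 -ac 2 \"" + outputFilePath + "\"";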



Sorry about the number of lines; I tried to describe the problem in as much detail as possible.
I tried to change AudioDispatcherFactory.fromPipe() to AudioDispatcherFactory.fromFile(), but that method is not available on Android, only in desktop Java, as far as I can tell from the error "javax.sound..., unexpected error, method not available".
I tried to change the String command in the extractAudio() method and to change the arguments of the fromPipe() method, but that did not bring success.
I want my audio file to be analyzed correctly with the AudioDispatcher, and the analysis data then transferred to the Arduino. Right now in the logs I only see "Color : null, value : 0.0".
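For comparison, a dispatcher can also be built without fromPipe() by streaming the already-extracted WAV file into TarsosDSP directly. The sketch below is only an illustration under stated assumptions: the helper name DispatcherFromWav is made up, the file is assumed to be a 44.1 kHz, 16-bit, signed little-endian, mono PCM WAV (which would mean -ac 1 rather than -ac 2 in the extraction command), and the standard 44-byte WAV header is simply skipped rather than parsed.

import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;

import be.tarsos.dsp.AudioDispatcher;
import be.tarsos.dsp.io.TarsosDSPAudioFormat;
import be.tarsos.dsp.io.UniversalAudioInputStream;

public final class DispatcherFromWav {
    // Canonical PCM WAV header size; assumes ffmpeg writes a plain 44-byte header.
    private static final int WAV_HEADER_SIZE = 44;

    public static AudioDispatcher fromWavFile(File wavFile, int bufferSize, int overlap)
            throws IOException {
        FileInputStream in = new FileInputStream(wavFile);
        if (in.skip(WAV_HEADER_SIZE) != WAV_HEADER_SIZE) {
            in.close();
            throw new IOException("Could not skip the WAV header of " + wavFile);
        }
        // 44.1 kHz, 16 bit, mono, signed, little-endian:
        // must match what the ffmpeg extraction command actually wrote.
        TarsosDSPAudioFormat format = new TarsosDSPAudioFormat(44100f, 16, 1, true, false);
        UniversalAudioInputStream stream = new UniversalAudioInputStream(in, format);
        return new AudioDispatcher(stream, bufferSize, overlap);
    }
}

A call such as DispatcherFromWav.fromWavFile(new File(outputFilePath), 1024, 0) could then take the place of the fromPipe() call, with the rest of the processor chain left unchanged.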


-
lavc/av1 : Record reference ordering information for each frame
13 April 2024, by Mark Thompson
lavc/av1 : Record reference ordering information for each frame
This is needed by Vulkan. Constructing this can't be delegated to CBS
because packets might contain multiple frames (when non-shown frames are
present) but we need separate snapshots immediately before each frame
for the decoder.