
Other articles (104)
-
MediaSPIP 0.1 Beta version
25 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
The zip file provided here contains only the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all the software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...) -
MediaSPIP version 0.1 Beta
16 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
The zip file provided here contains only the sources of MediaSPIP in its standalone version.
To get a working installation, you need to manually install all the software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to make other modifications (...) -
Improving the base version
13 September 2013
Nicer multiple selection
The Chosen plugin improves the usability of multiple-select fields. See the two images below for a comparison.
To use it, enable the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying the form elements to enhance, for example select[multiple] for multiple-select lists (...)
On other sites (15790)
-
Mix PCM data from two decoded FFmpeg AVFrame* objects
15 June 2016, by williamtroup
I'm currently using the FFmpeg AutoGen project (in C#, of course) to decode frames from an audio file and add them to a new stream that is being written to a video. This all works correctly, but I was wondering how one would go about mixing two AVFrame* objects together (after they are decoded).
I've mixed PCM data before, but was wondering whether FFmpeg has a built-in API to do the work more effectively.
This is how I'm currently doing it:
short* baseFrameBuffer1 = (short*)baseFrame->data_0;
short* baseFrameBuffer2 = (short*)baseFrame->data_1;
short* readFrameBuffer1 = (short*)readFrame->data_0;
short* readFrameBuffer2 = (short*)readFrame->data_1;

for (int frameIndex = 0; frameIndex < 1024; frameIndex++)
{
    // Sum the two planes sample by sample, clamping to the 16-bit range.
    int dataSample1 = GetInRangeSample(baseFrameBuffer1[frameIndex] + readFrameBuffer1[frameIndex]);
    int dataSample2 = GetInRangeSample(baseFrameBuffer2[frameIndex] + readFrameBuffer2[frameIndex]);

    // Write back through the short* views so the full 16-bit sample is stored,
    // not just its low byte.
    baseFrameBuffer1[frameIndex] = (short)dataSample1;
    baseFrameBuffer2[frameIndex] = (short)dataSample2;
}

private static int GetInRangeSample(int sample)
{
    if (sample > short.MaxValue)
    {
        sample = short.MaxValue;
    }
    if (sample < short.MinValue)
    {
        sample = short.MinValue;
    }
    return sample;
}
-
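For reference, FFmpeg does ship a mixing facility in libavfilter: the amix filter sums an arbitrary number of audio inputs, with built-in scaling to avoid clipping and configurable handling of inputs that end early. A minimal command-line sketch, with placeholder file names, might look like:

ffmpeg -i voice1.wav -i voice2.wav -filter_complex "amix=inputs=2:duration=longest" mixed.wav

Using it from FFmpeg AutoGen would mean building a filter graph (avfilter_graph_create_filter and related calls) rather than adding samples by hand.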
lavc/huffyuvenc: Move a variable declaration up.
10 December 2017, by Carl Eugen Hoyos -
FFmpeg: Combine video files with different start times
18 May 2016, by Fabian
I have two webm files with audio and video recordings of a video conference session. Both files only contain one side of the conversation. They are not of the same length (one participant joined before the other), but I have the unix timestamp in milliseconds of the start time of each video file.
On a timeline they look like this:

webm 1: -----------------------------------------------
webm 2:                  -----------------------------

or like this:

webm 1:                  -----------------------------------------------
webm 2: -----------------------------

I would like to combine these two video files into one file so that:
- They appear next to each other (using the hstack option), and
- they are mixed taking the time stamps of the start times into account.

The final video should then look like this:

Target result: --------------===========================----

The beginning and the end of the new video would show a black placeholder for the file that has no data at that point in the mixed stream.
At the moment I use this command:

ffmpeg -i 1463408731413703.webm -i 1463408880317860.webm -filter_complex \
"[0:v][1:v]hstack=inputs=2[v]; \
[0:a][1:a]amerge[a]" \
-map "[v]" -map "[a]" -ac 2 -c:v libvpx output.webm

This creates a video like this:

Not good result: =====================------------------
i.e. the conversation is out of sync.
How can I combine two video streams with different lengths and different start times using ffmpeg so that I end up with the "Target result" above?
Thanks a lot!
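A sketch of one possible approach (not tested): pad the start of the later stream so the two line up before stacking. OFFSET (seconds) and OFFSET_MS (milliseconds) below are placeholders for the difference between the two start timestamps; tpad needs a reasonably recent FFmpeg build, and amix is used instead of amerge because it tolerates inputs of different lengths.

ffmpeg -i 1463408731413703.webm -i 1463408880317860.webm -filter_complex \
"[1:v]tpad=start_duration=OFFSET:color=black[v1]; \
 [1:a]adelay=OFFSET_MS|OFFSET_MS[a1]; \
 [0:v][v1]hstack=inputs=2[v]; \
 [0:a][a1]amix=inputs=2:duration=longest[a]" \
-map "[v]" -map "[a]" -ac 2 -c:v libvpx output.webm

If it is the first input that starts later, the tpad/adelay pair would go on [0:v]/[0:a] instead; trailing black for the stream that ends first can be added the same way with tpad=stop_duration.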