
Other articles (70)
-
MediaSPIP version 0.1 Beta
16 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
The zip file provided here contains only the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all the software dependencies on the server.
If you want to use this archive for an installation in farm mode, you will also need to make other modifications (...) -
MediaSPIP 0.1 Beta version
25 April 2011
MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...) -
Improvements to the base version
13 September 2013
Nicer multiple selection
The Chosen plugin improves the usability of multiple-selection fields. See the two images below for a comparison.
To use it, activate the Chosen plugin (Configuration générale du site > Gestion des plugins), then configure it (Les squelettes > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-selection lists (...)
On other sites (10658)
-
Crop video into a 4x4 grid/tiles/matrix efficiently via command-line ffmpeg?
22 April 2017, by Dylan
Hello Stack Overflow community!
I dread having to ask questions, but there seems to be no efficient way to take a single input video and apply a matrix transformation, i.e. split the video into equal-sized pieces, preferably 4x4 = 16 segments per input.
I tried libraries such as ffmpeg and mencoder, but producing 16 outputs can run as slowly as 0.15x real time. The goal of my project is to split the video into 16 segments, rearrange those segments and combine them back into a final video, later reversing the process in an HTML5 canvas. Here is a picture to help you understand what I am talking about:
[Image: the source, but also the final destination after reorganizing the pieces]
I do not believe you can do this all in one command, so my goal is to crop into 16 mapped outputs quickly, then reassemble them in a different order; I can do the other parts myself. Ideally there would be a way to move pixel blocks, e.g. 100x100, and just shuffle them around. My math is not strong enough.
I really appreciate the work you guys do!
admin@dr.com
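For reference, a minimal sketch of the cropping step using ffmpeg's split and crop filters, shown here as a 2x2 example (input/output file names are placeholders; a 4x4 grid follows the same pattern with split=16 and iw/4:ih/4 tiles):

ffmpeg -i input.mp4 -filter_complex \
  "[0:v]split=4[q0][q1][q2][q3]; \
   [q0]crop=iw/2:ih/2:0:0[t0]; \
   [q1]crop=iw/2:ih/2:iw/2:0[t1]; \
   [q2]crop=iw/2:ih/2:0:ih/2[t2]; \
   [q3]crop=iw/2:ih/2:iw/2:ih/2[t3]" \
  -map "[t0]" -an tile_0.mp4 \
  -map "[t1]" -an tile_1.mp4 \
  -map "[t2]" -an tile_2.mp4 \
  -map "[t3]" -an tile_3.mp4

Reassembling the tiles in a different order can then be done in a second pass with the hstack and vstack filters (or xstack in newer ffmpeg builds).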
-
how to set ffmpeg bitrate, frames per second when scaling from low to high resolution / high to low
21 January 2015, by user2303069
I'm developing a video-sharing app for mobile device users. I noticed that if I convert a video with ffmpeg from a low resolution (e.g. 240x320) to a higher one (e.g. 480x-1), the video loses quality and there is also no sound at the end. My question to you is: how should I filter the converted file (how many fps, what bitrate, etc.)? In terms of parameters, how should I convert to make sure the video comes out in good quality at its destination (the user who will play it on their mobile device [Android/BlackBerry])?
Here is my current code (Java):
/************************ CONVERTING TIME *************************
 * Method to convert a video clip.
 */
if (sn.toString().trim().startsWith("<>")) {
    // Incoming message format: "<>WIDTHxHEIGHT.to$from$filename"
    String info = sn.toString().trim().substring(2);
    String scale = info.substring(0, info.indexOf("."));
    String width = scale.substring(0, scale.indexOf("x"));
    String name = info.substring(info.indexOf(".") + 1, info.length());
    String to = null;
    String from;
    try {
        // Extract the recipient (to) and sender (from) from the file name
        to = name.substring(0, name.indexOf("$"));
        from = name.substring(name.indexOf("$") + 1, name.lastIndexOf("$"));
        gui.textArea1.append("\nStart converting...: to=" + to + " from=" + from + " fileName=" + name);
    } catch (NullPointerException npe) {
        gui.textArea1.append("\nNullpointer in calculate name in converting: " + npe.getMessage());
    }

    final Path videoIn = Paths.get("c:\\wamp\\www\\iclips\\videoMessages\\" + name);
    final Path encodingFile = Paths.get("c:\\wamp\\www\\iclips\\videoMessages\\scaled-" + name);
    final Path errorFile = Paths.get("c:\\ffmpeg\\bin\\error.txt");

    // Pick an H.264 profile based on the target width
    String pro;
    int w = Integer.parseInt(width);
    if (w <= 240) {
        pro = "baseline";
    } else if (w > 240 && w <= 480) {
        pro = "main";
    } else if (w > 480) {
        pro = "high";
    } else {
        pro = "baseline";
    }

    try {
        Files.deleteIfExists(encodingFile);
        Files.deleteIfExists(errorFile);
        final ProcessBuilder pb
                = new ProcessBuilder("c:\\ffmpeg\\bin\\ffmpeg.exe",
                        "-i", videoIn.toString(),
                        "-y",
                        "-vf", "scale=" + width + ":-1",
                        // "-pix_fmt", "yuv420p",
                        "-vcodec", "libx264",
                        "-vprofile", pro,
                        "c:\\wamp\\www\\iclips\\videoMessages\\scaled-" + name
                ); // or other command...
        // Redirect the process's stderr and stdout to files
        pb.redirectError(errorFile.toFile());
        pb.redirectOutput(encodingFile.toFile());
        final Process p = pb.start();
        try {
            p.waitFor();
            if (p.exitValue() == 0) {
                gui.textArea1.append("\n+++++++++++++Vic-^clip converted successfully:"
                        + " ExitValue=[" + String.valueOf(p.exitValue()) + "] ++++++++++++++");
                if (Files.deleteIfExists(videoIn)) {
                    gui.textArea1.append("\n" + videoIn.toString() + " deleted!");
                }
                sendMsg("Scaling successful :-) Video-clip name=scaled-" + name + "_", "\nSent scaled successful " + username);
                NotifyClientOfScaledVideoMessage(to, "^scaled-" + name + "_");
            } else {
                gui.textArea1.append("\nSomething went wrong with process-convert: ExitValue=" + String.valueOf(p.exitValue()));
                sendMsg("Unable to scale video, try again._", "\nSent scaled failed to " + username);
            }
        } catch (InterruptedException e) {
            gui.textArea1.append("\nInterrupted process convert: " + e.getMessage());
        }
    } catch (IOException e) {
        // deal with e here
        gui.textArea1.append("\nIOException in Convert Video: " + e.getMessage());
    }
}
Thank you very much.
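A minimal sketch of the kind of ffmpeg invocation the question is after, assuming an H.264/AAC MP4 target for Android/BlackBerry playback; the frame rate, CRF and bitrate cap below are illustrative placeholders, not values from the original post:

ffmpeg -y -i input.mp4 \
  -vf "scale=480:-2" \
  -c:v libx264 -profile:v main -preset medium -crf 23 \
  -r 30 -maxrate 1500k -bufsize 3000k \
  -c:a aac -b:a 128k \
  -movflags +faststart \
  output.mp4

Here scale=480:-2 keeps the height even (libx264 rejects odd dimensions), and naming an audio codec explicitly (-c:a aac -b:a 128k) makes the audio handling visible, which is worth checking given the missing sound described above.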
-
FFMPEG - Overlaying video with transparency, alpha has strange threshold
6 March 2019, by tanker_
I'm attempting to overlay an Apple ProRes 4444 video with alpha/transparency onto a normal video. However, when inspecting the final output from FFMPEG and comparing it with the same files overlaid on top of one another and rendered in Final Cut Pro, there is a discrepancy in how the edges around the object are rendered.
[Image: screenshot comparison]
[Image: additional close-up]
Here is my command:
ffmpeg \
-i background.MOV -x264opts colormatrix=bt709 \
-i alpha_object.MOV -x264opts colormatrix=bt709 \
-filter_complex " \
[0:v]setpts=PTS-STARTPTS, scale=1920x1080[top]; \
[1:v]setpts=PTS-STARTPTS, scale=1920x1080, \
colorchannelmixer=aa=1.0[bottom]; \
[top][bottom]overlay=shortest=1" \
-vcodec libx264 -qp 15 -an -shortest output.MOV
Any idea what could be wrong? Is there an option within FFMPEG's available filters that I'm missing?
All files are ingested and rendered in 1920x1080.
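Edge fringing of this kind is often a straight-versus-premultiplied alpha mismatch. Below is a minimal sketch of one variant worth comparing, assuming the ProRes 4444 foreground carries premultiplied alpha (the output file name is a placeholder and the colormatrix options are omitted for brevity):

ffmpeg \
  -i background.MOV \
  -i alpha_object.MOV \
  -filter_complex " \
    [0:v]setpts=PTS-STARTPTS, scale=1920:1080[base]; \
    [1:v]setpts=PTS-STARTPTS, scale=1920:1080, format=yuva444p[fg]; \
    [base][fg]overlay=shortest=1:alpha=premultiplied" \
  -vcodec libx264 -qp 15 -pix_fmt yuv420p -an -shortest output_premult.MOV

If the source alpha is in fact straight, the default overlay behaviour (alpha=straight), possibly combined with an explicit unpremultiply filter before the overlay, is the counterpart to test.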