
Media (91)
-
Richard Stallman et le logiciel libre
19 October 2011
Updated: May 2013
Language: French
Type: Text
-
Stereo master soundtrack
17 October 2011
Updated: October 2011
Language: English
Type: Audio
-
Elephants Dream - Cover of the soundtrack
17 October 2011
Updated: October 2011
Language: English
Type: Image
-
#7 Ambience
16 October 2011
Updated: June 2015
Language: English
Type: Audio
-
#6 Teaser Music
16 October 2011
Updated: February 2013
Language: English
Type: Audio
-
#5 End Title
16 October 2011
Updated: February 2013
Language: English
Type: Audio
Other articles (43)
-
(De)Activating features (plugins)
18 February 2011
To manage the addition and removal of extra features (plugins), MediaSPIP uses SVP as of version 0.2.
SVP makes it easy to activate plugins from the MediaSPIP configuration area.
To access it, simply go to the configuration area and then to the "Gestion des plugins" (plugin management) page.
By default, MediaSPIP ships with the full set of so-called "compatible" plugins; they have been tested and integrated so as to work seamlessly with each (...) -
Authorizations overridden by plugins
27 April 2010
Mediaspip core
autoriser_auteur_modifier() so that visitors are able to edit their information on the authors page -
Libraries and binaries specific to video and audio processing
31 January 2010
The following software and libraries are used by SPIPmotion in one way or another.
Required binaries: FFmpeg: the main encoder, transcodes almost any type of video or audio file into formats readable on the Internet (see this tutorial for its installation); Oggz-tools: tools for inspecting ogg files; Mediainfo: retrieves information from most video and audio formats;
Optional, complementary binaries: flvtool2: (...)
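As a rough illustration of the transcoding role described above, the sketch below shells out to ffmpeg from Java to produce a web-readable H.264/AAC MP4. The paths and codec flags are examples only, not SPIPmotion's actual invocation.

import java.io.IOException;
import java.util.Arrays;
import java.util.List;

public class TranscodeExample {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Example flags only: transcode a source file to H.264 video + AAC audio in an MP4 container.
        List<String> cmd = Arrays.asList(
                "ffmpeg", "-y",
                "-i", "input.mov",
                "-c:v", "libx264",
                "-c:a", "aac",
                "output.mp4");
        Process p = new ProcessBuilder(cmd).inheritIO().start();
        System.out.println("ffmpeg exited with code " + p.waitFor());
    }
}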
On other sites (9162)
-
FFmpeg can't run on some Android devices
30 March 2016, by tqn
I'm working on a video processing project and I'm now using the ffmpeg library on Android. I'm facing a strange problem with an Asus Zenfone 4 (t00l), and probably all x86 devices; I've only tested on this x86 phone. When the command is started in the project, the app always crashes:
03-30 15:08:18.461 21068-21068/com.paditech.videa I/FFmpeg: Loading FFmpeg for armv7-neon CPU
03-30 15:08:18.781 21068-21068/com.paditech.videa I/System.out: Success........
03-30 15:26:12.933 21068-21068/com.paditech.videa E/IMGSRV: :0: PVRDRMOpen: TP3, ret = 59
03-30 15:26:12.933 21068-21068/com.paditech.videa E/IMGSRV: :0: PVRDRMOpen: TP3, ret = 63
03-30 15:26:15.913 21068-21068/com.paditech.videa D/VideoActivity: onDestroy
03-30 15:26:18.993 21068-21068/com.paditech.videa I/FFmpeg: Loading FFmpeg for armv7-neon CPU
03-30 15:26:19.263 21068-21068/com.paditech.videa I/System.out: Success........
03-30 15:26:42.583 21068-21184/com.paditech.videa D/FFmpeg: Running publishing updates method
03-30 15:26:42.583 21068-21068/com.paditech.videa E/IMGSRV: :0: PVRDRMOpen: TP3, ret = 59
03-30 15:26:42.583 21068-21068/com.paditech.videa E/IMGSRV: :0: PVRDRMOpen: TP3, ret = 85
03-30 15:26:42.683 21068-21068/com.paditech.videa D/VideoActivity: /data/data/com.paditech.videa/files/ffmpeg[1]: syntax error: '-�-4�' unexpected
03-30 15:26:42.683 21068-21068/com.paditech.videa D/VideoActivity: /data/data/com.paditech.videa/files/ffmpeg[1]: syntax error: '-�-4�' unexpected
First, I thought the problem was the ffmpeg library, so I created a test module with a simple activity that just loads the library and runs the command, and there was no problem.
Second, I suspect the app cannot load the right ffmpeg build, because my log says armv7-neon even though the device is x86 (in my test module it displays x86). According to my searches, an external library may be causing the app to register the arm ABI (Build.CPU_ABI = "armv7"). So I tried removing all the libraries from the gradle file of the Test module and identified one library, but its aar file supports both x86 and arm, and after removing it the app still crashes with the same log. Strangely, although the Test module registers as arm, it still runs successfully. (Side question: why does adding the library jp.wasabeef:picasso-transformations make the system register as armv7 even though the device is x86?)
Finally, I thought the problem was the command, but after debugging I used the same command in the Test module and it still succeeded. I'm still stuck. Could anyone help me solve it? Thanks. Here is my demo code:
String input = "/storage/emulated/0/Videa/Video/VIDEO_20160122_160020.mp4";
String output = "/storage/emulated/0/Videa/Audio/AUDIO_20160330_142501.wav";
String[] command = {
        "-y",
        "-i",
        input,
        "-vn",
        "-f",
        "wav",
        output
};
FFmpeg ffmpeg = FFmpeg.getInstance(getApplicationContext());
try {
    ffmpeg.execute(command, new ExecuteBinaryResponseHandler() {
        @Override
        public void onSuccess(String message) {
            super.onSuccess(message);
            System.out.println("Success " + message);
        }

        @Override
        public void onFailure(String message) {
            super.onFailure(message);
            System.out.println("Failure " + message);
        }
    });
} catch (Exception e) {
    e.printStackTrace();
}

And here is the failing command:
ArrayList<String> cmd = new ArrayList<String>();
cmd.add("-y");
cmd.add("-i");
cmd.add(input);
cmd.add("-vn");
cmd.add("-f");
cmd.add("wav");
cmd.add(output);
String[] result = new String[cmd.size()];
return cmd.toArray(result);
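Not part of the original post: one way to narrow down the ABI question raised above is to log what the device itself reports and compare it with the binary the wrapper chose ("armv7-neon" in the log). A minimal sketch using only the standard android.os.Build fields:

import android.os.Build;
import android.util.Log;

import java.util.Arrays;

public final class AbiDump {
    private static final String TAG = "AbiDump";

    // Log the ABIs the device reports so they can be compared with the
    // FFmpeg binary the wrapper decided to load.
    public static void logDeviceAbis() {
        if (Build.VERSION.SDK_INT >= 21) {
            Log.i(TAG, "SUPPORTED_ABIS = " + Arrays.toString(Build.SUPPORTED_ABIS));
        } else {
            Log.i(TAG, "CPU_ABI = " + Build.CPU_ABI + ", CPU_ABI2 = " + Build.CPU_ABI2);
        }
    }
}

If SUPPORTED_ABIS lists x86 first but the wrapper still extracts an ARM binary, the shell-style "syntax error" in the log above is consistent with the system trying to run a binary built for the wrong architecture.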
-
I want help making a video collage; tried everything but unsuccessful
16 March 2016, by Haider Ali
I want to make a video collage in which 2 or more videos are displayed in one frame and then converted into a single video file.
I tried examples, but they just append the videos one after another to make one long combined video.
Any help please.
String FILE_PATH = "/storage/sdcard0/testing.mp4";
String FILE_PATH2 = "/storage/sdcard0/testing1.mp4";
String FILE_PATH3 = "/storage/sdcard0/testing2.mp4";
File file1 = new File(FILE_PATH);
File file2 = new File(FILE_PATH2);
File file3 = new File(FILE_PATH3);
private ProgressDialog pDialog;
ImageView img,img2,img3;
MediaMetadataRetriever retriever2 = new MediaMetadataRetriever();
MediaMetadataRetriever retriever3 = new MediaMetadataRetriever();
ArrayList<Bitmap> bitmapArray1 = new ArrayList<Bitmap>();
ArrayList<Bitmap> bitmapArray2 = new ArrayList<Bitmap>();
ArrayList<Bitmap> bitmapArray3 = new ArrayList<Bitmap>();
File ScreenDIR = new File("/sdcard/Screens/");
// have the object build the directory structure, if needed.
double id1=0,id2=0,id3=0;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
ScreenDIR.mkdirs();
img = (ImageView)findViewById(R.id.imageView);
img2 = (ImageView)findViewById(R.id.imageView2);
img3 = (ImageView)findViewById(R.id.imageView3);
new LoadAllProducts().execute();
}
class LoadAllProducts extends AsyncTask<String, String, String> {
/**
* Before starting background thread Show Progress Dialog
* */
@Override
protected void onPreExecute() {
super.onPreExecute();
pDialog = new ProgressDialog(MainActivity.this);
pDialog.setMessage("Extracting Frames. Please wait...");
pDialog.setIndeterminate(false);
pDialog.setCancelable(false);
pDialog.show();
}
/**
* getting All products from url
* */
protected String doInBackground(String... args) {
if(file1.exists()){
for (long i = 0; i < 5000; i += 1000/14) { // lenms - video length in milliseconds
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
retriever.setDataSource(file1.toString());
// Bitmap bitmap = retriever.getFrameAtTime((i*1000/14), MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
saveBitmapToCahche( getResizedBitmap((retriever.getFrameAtTime((i*1000/14), MediaMetadataRetriever.OPTION_CLOSEST_SYNC)), 500) ,String.valueOf(id1));
id1++;
//bitmapArray1.add(bitmap);
/* File file = new File(ScreenDIR, "sketchpad1" + id1 + ".png");
FileOutputStream fOut = null;
try {
fOut = new FileOutputStream(file);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
//bitmap.compress(Bitmap.CompressFormat.PNG, 30, fOut);
try {
fOut.flush();
} catch (IOException e) {
e.printStackTrace();
}
try {
fOut.close();
} catch (IOException e) {
e.printStackTrace();
}*/
}
}
/* if(file2.exists()){
retriever2.setDataSource(file2.toString());
for (long i = 0; i < 3000; i += 1000/24) { // lenms - video length in milliseconds
bitmap2 = retriever2.getFrameAtTime(i*1000/29, MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
//bitmapArray2.add(bitmap2);
File file = new File(ScreenDIR, "sketchpad2" + id2 + ".png");
FileOutputStream fOut = null;
try {
fOut = new FileOutputStream(file);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
bitmap2.compress(Bitmap.CompressFormat.PNG, 85, fOut);
try {
fOut.flush();
} catch (IOException e) {
e.printStackTrace();
}
try {
fOut.close();
id2++;
} catch (IOException e) {
e.printStackTrace();
}
}
}
if(file3.exists()){
retriever3.setDataSource(file3.toString());
for (long i = 0; i < 3000; i += 1000/24) { // lenms - video length in milliseconds
bitmap3 = retriever3.getFrameAtTime(i*1000/29, MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
// bitmapArray3.add(bitmap3);
File file = new File(ScreenDIR, "sketchpad3" + id3 + ".png");
FileOutputStream fOut = null;
try {
fOut = new FileOutputStream(file);
} catch (FileNotFoundException e) {
e.printStackTrace();
}
bitmap3.compress(Bitmap.CompressFormat.PNG, 85, fOut);
try {
fOut.flush();
} catch (IOException e) {
e.printStackTrace();
}
try {
fOut.close();
id3++;
} catch (IOException e) {
e.printStackTrace();
}
}
}*/
return null;
}
/**
* After completing background task Dismiss the progress dialog
* **/
protected void onPostExecute(String file_url) {
// dismiss the dialog after getting all products
pDialog.dismiss();
img.setImageBitmap(retrieveBitmapFromCache(String.valueOf(id2)));
id2 = 50;
img2.setImageBitmap(retrieveBitmapFromCache(String.valueOf(id2)));
id2 = 69;
img3.setImageBitmap(retrieveBitmapFromCache(String.valueOf(id2)));
// img2.setImageBitmap(bitmapArray2.get(0));
// img3.setImageBitmap(bitmapArray3.get(0));
}
}
public void saveBitmapToCahche(Bitmap bb,String ID ){
Cache.getInstance().getLru().put(ID, bb);
}
public Bitmap retrieveBitmapFromCache(String ID) {
Bitmap bitmap = (Bitmap) Cache.getInstance().getLru().get(ID);
return bitmap;
}
public Bitmap getResizedBitmap(Bitmap image, int maxSize) {
int width = image.getWidth();
int height = image.getHeight();
float bitmapRatio = (float)width / (float) height;
if (bitmapRatio > 1) { // landscape: clamp the width to maxSize
width = maxSize;
height = (int) (width / bitmapRatio);
} else {
height = maxSize;
width = (int) (height * bitmapRatio);
}
return Bitmap.createScaledBitmap(image, width, height, true);
}
}
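Not from the original post, but one possible direction for the actual collage: FFmpeg's hstack filter places two inputs side by side in a single frame instead of concatenating them. A minimal sketch using the same wrapper API as in the first question above, with a hypothetical output path, assuming the bundled FFmpeg build includes the hstack filter:

String left = "/storage/sdcard0/testing.mp4";
String right = "/storage/sdcard0/testing1.mp4";
String out = "/storage/sdcard0/collage.mp4";   // hypothetical output path

String[] command = {
        "-y",
        "-i", left,
        "-i", right,
        // scale both inputs to the same height, then stack them horizontally
        "-filter_complex", "[0:v]scale=-2:480[l];[1:v]scale=-2:480[r];[l][r]hstack=inputs=2[v]",
        "-map", "[v]",
        out
};

FFmpeg ffmpeg = FFmpeg.getInstance(getApplicationContext());
try {
    ffmpeg.execute(command, new ExecuteBinaryResponseHandler() {
        @Override
        public void onSuccess(String message) {
            System.out.println("Collage done: " + message);
        }

        @Override
        public void onFailure(String message) {
            System.out.println("Collage failed: " + message);
        }
    });
} catch (Exception e) {
    e.printStackTrace();
}

Audio is left unmapped in this sketch; add "-map", "0:a" (or an amix filter) if a soundtrack is needed.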
-
Why are FFmpeg commands not working on Marshmallow and Lollipop?
22 February 2016, by Andy Developer
Why is my code not working on Marshmallow and Lollipop devices? Any idea how to use FFmpeg on those versions? Any help.
import android.os.Bundle;
import android.os.Environment;
import android.support.v7.app.AppCompatActivity;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.Toast;
import com.kru.ffmpeglibs.FFmpeg;
import com.kru.ffmpeglibs.FFmpegExecuteResponseHandler;
import com.kru.ffmpeglibs.FFmpegLoadBinaryResponseHandler;
import com.kru.ffmpeglibs.exceptions.FFmpegCommandAlreadyRunningException;
import com.kru.ffmpeglibs.exceptions.FFmpegNotSupportedException;
public class CommandsActivity extends AppCompatActivity {
private FFmpeg fFmpeg;
private Button btnGenerate;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
fFmpeg = FFmpeg.getInstance(CommandsActivity.this);
executeBinary();
btnGenerate = (Button) findViewById(R.id.btnGenerate);
btnGenerate.setOnClickListener(new OnClickListener() {
@Override
public void onClick(View v) {
try {
String[] ffmpegCommand = { "-i "
+ Environment.getExternalStorageDirectory()
.getPath()
+ "/vid.mp4"
+ " -r 10 "
+ Environment.getExternalStorageDirectory()
.getPath()
+ "/com.mobvcasting.mjpegffmpeg/frame_%05d.jpg" };
executeCommand(ffmpegCommand);
} catch (FFmpegCommandAlreadyRunningException e) {
e.printStackTrace();
}
}
});
}
private void executeCommand(String[] cmd)
throws FFmpegCommandAlreadyRunningException {
fFmpeg.execute(cmd, new FFmpegExecuteResponseHandler() {
@Override
public void onSuccess(String message) {
Toast.makeText(CommandsActivity.this, "Sucesses..",
Toast.LENGTH_SHORT).show();
System.out.println(message);
}
@Override
public void onProgress(String message) {
// Toast.makeText(MainActivity.this, "On Process",
// Toast.LENGTH_SHORT).show();
System.out.println(message);
}
@Override
public void onFailure(String message) {
Toast.makeText(CommandsActivity.this, "Fail this",
Toast.LENGTH_SHORT).show();
System.out.println(message);
}
@Override
public void onStart() {
}
@Override
public void onFinish() {
Toast.makeText(CommandsActivity.this, "Finish",
Toast.LENGTH_SHORT).show();
}
});
}
private void executeBinary() {
try {
fFmpeg.loadBinary(new FFmpegLoadBinaryResponseHandler() {
@Override
public void onFailure() {
}
@Override
public void onSuccess() {
}
@Override
public void onStart() {
}
@Override
public void onFinish() {
}
});
} catch (FFmpegNotSupportedException e) {
e.printStackTrace();
}
}
}
Here is my code, but it is still not working. Please tell me what is wrong with it.
The exception I got is something like this:
02-22 11:18:41.469: E/AndroidRuntime(27839): FATAL EXCEPTION: main
02-22 11:18:41.469: E/AndroidRuntime(27839): java.lang.UnsatisfiedLinkError: Native method not found: com.kru.ffmpeglibs.ArmArchHelper.cpuArchFromJNI:()Ljava/lang/String;
02-22 11:18:41.469: E/AndroidRuntime(27839): at com.kru.ffmpeglibs.ArmArchHelper.cpuArchFromJNI(Native Method)
02-22 11:18:41.469: E/AndroidRuntime(27839): at com.kru.ffmpeglibs.CpuArchHelper.getCpuArch(CpuArchHelper.java:61)
02-22 11:18:41.469: E/AndroidRuntime(27839): at com.kru.ffmpeglibs.FFmpeg.loadBinary(FFmpeg.java:40)
02-22 11:18:41.469: E/AndroidRuntime(27839): at com.kru.sampleffmpeg.MainActivity.loadFFMpegBinary(MainActivity.java:68)
02-22 11:18:41.469: E/AndroidRuntime(27839): at com.kru.sampleffmpeg.MainActivity.onCreate(MainActivity.java:36)
02-22 11:18:41.469: E/AndroidRuntime(27839): at android.app.Activity.performCreate(Activity.java:5372)
02-22 11:18:41.469: E/AndroidRuntime(27839): at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1104)
02-22 11:18:41.469: E/AndroidRuntime(27839): at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2257)
02-22 11:18:41.469: E/AndroidRuntime(27839): at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2349)
02-22 11:18:41.469: E/AndroidRuntime(27839): at android.app.ActivityThread.access$700(ActivityThread.java:159)
02-22 11:18:41.469: E/AndroidRuntime(27839): at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1316)
02-22 11:18:41.469: E/AndroidRuntime(27839): at android.os.Handler.dispatchMessage(Handler.java:99)
02-22 11:18:41.469: E/AndroidRuntime(27839): at android.os.Looper.loop(Looper.java:176)
02-22 11:18:41.469: E/AndroidRuntime(27839): at android.app.ActivityThread.main(ActivityThread.java:5419)
02-22 11:18:41.469: E/AndroidRuntime(27839): at java.lang.reflect.Method.invokeNative(Native Method)
02-22 11:18:41.469: E/AndroidRuntime(27839): at java.lang.reflect.Method.invoke(Method.java:525)
02-22 11:18:41.469: E/AndroidRuntime(27839): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1046)
02-22 11:18:41.469: E/AndroidRuntime(27839): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:862)
02-22 11:18:41.469: E/AndroidRuntime(27839): at dalvik.system.NativeStart.main(Native Method)
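Not from the original post: the UnsatisfiedLinkError above suggests the wrapper's native helper library (ArmArchHelper) was not loaded for this device's ABI, which is a packaging/ABI issue rather than a command issue. Separately, the code above passes the entire command line as a single array element; wrappers of this kind generally expect one token per element, and on Android 6.0 external-storage access additionally requires a runtime permission grant. A sketch of the split command form, reusing the paths from the question and the executeCommand() helper defined in the activity above:

try {
    String input = Environment.getExternalStorageDirectory().getPath() + "/vid.mp4";
    String outputPattern = Environment.getExternalStorageDirectory().getPath()
            + "/com.mobvcasting.mjpegffmpeg/frame_%05d.jpg";

    // One token per array element; paths taken from the question above.
    String[] ffmpegCommand = {
            "-i", input,
            "-r", "10",
            outputPattern
    };

    executeCommand(ffmpegCommand); // same helper defined in the activity above
} catch (FFmpegCommandAlreadyRunningException e) {
    e.printStackTrace();
}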