
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011, by
Updated: July 2013
Language: French
Type: Text
Other articles (81)
-
Customizing by adding your logo, banner, or background image
5 September 2013, by — Some themes support three customization elements: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013, by — Present changes to your MediaSPIP, or news about your projects, on your MediaSPIP using the news section.
In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customize the news creation form.
News creation form — For a document of the news type, the default fields are: publication date (customize the publication date) (...)
-
User profiles
12 April 2011, by — Each user has a profile page for editing their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialized, visible only when the visitor is logged in to the site.
Users can edit their profile from their author page; a "Modifier votre profil" (edit your profile) link in the navigation is (...)
On other sites (9982)
-
I want help in making a video collage, tried everything but unsuccessful
16 March 2016, by Haider Ali — I want to make a video collage in which two or more videos are displayed in one frame and are then combined into a single video file.
I tried examples, but they just append each video after the previous one to produce one long combined video.
Any help please?

import android.app.Activity;
import android.app.ProgressDialog;
import android.graphics.Bitmap;
import android.media.MediaMetadataRetriever;
import android.os.AsyncTask;
import android.os.Bundle;
import android.widget.ImageView;

import java.io.File;
import java.util.ArrayList;

// Activity that extracts frames from three local videos and displays them.
public class MainActivity extends Activity {

    String FILE_PATH = "/storage/sdcard0/testing.mp4";
    String FILE_PATH2 = "/storage/sdcard0/testing1.mp4";
    String FILE_PATH3 = "/storage/sdcard0/testing2.mp4";
    File file1 = new File(FILE_PATH);
    File file2 = new File(FILE_PATH2);
    File file3 = new File(FILE_PATH3);
    private ProgressDialog pDialog;
    ImageView img, img2, img3;
    MediaMetadataRetriever retriever2 = new MediaMetadataRetriever();
    MediaMetadataRetriever retriever3 = new MediaMetadataRetriever();
    ArrayList<Bitmap> bitmapArray1 = new ArrayList<Bitmap>();
    ArrayList<Bitmap> bitmapArray2 = new ArrayList<Bitmap>();
    ArrayList<Bitmap> bitmapArray3 = new ArrayList<Bitmap>();
    // Have the object build the directory structure, if needed.
    File ScreenDIR = new File("/sdcard/Screens/");
    double id1 = 0, id2 = 0, id3 = 0;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        ScreenDIR.mkdirs();
        img = (ImageView) findViewById(R.id.imageView);
        img2 = (ImageView) findViewById(R.id.imageView2);
        img3 = (ImageView) findViewById(R.id.imageView3);
        new LoadAllProducts().execute();
    }
    class LoadAllProducts extends AsyncTask<String, String, String> {

        /**
         * Before starting the background thread, show the progress dialog.
         */
        @Override
        protected void onPreExecute() {
            super.onPreExecute();
            pDialog = new ProgressDialog(MainActivity.this);
            pDialog.setMessage("Extracting Frames. Please wait...");
            pDialog.setIndeterminate(false);
            pDialog.setCancelable(false);
            pDialog.show();
        }

        /**
         * Extract frames from the first video in the background.
         */
        @Override
        protected String doInBackground(String... args) {
            if (file1.exists()) {
                // Grab a frame roughly every 71 ms over the first 5000 ms
                // (5000 stands in for the video length in milliseconds).
                for (long i = 0; i < 5000; i += 1000 / 14) {
                    // Creating a retriever per iteration works but is slow;
                    // it could be hoisted out of the loop.
                    MediaMetadataRetriever retriever = new MediaMetadataRetriever();
                    retriever.setDataSource(file1.toString());
                    // Note: getFrameAtTime expects a presentation time in microseconds.
                    // Bitmap bitmap = retriever.getFrameAtTime((i * 1000 / 14), MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
                    saveBitmapToCache(getResizedBitmap(retriever.getFrameAtTime((i * 1000 / 14), MediaMetadataRetriever.OPTION_CLOSEST_SYNC), 500), String.valueOf(id1));
                    id1++;
                    // bitmapArray1.add(bitmap);
                    /* File file = new File(ScreenDIR, "sketchpad1" + id1 + ".png");
                    FileOutputStream fOut = null;
                    try {
                        fOut = new FileOutputStream(file);
                    } catch (FileNotFoundException e) {
                        e.printStackTrace();
                    }
                    // bitmap.compress(Bitmap.CompressFormat.PNG, 30, fOut);
                    try {
                        fOut.flush();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                    try {
                        fOut.close();
                    } catch (IOException e) {
                        e.printStackTrace();
                    } */
                }
            }
            /* if (file2.exists()) {
                retriever2.setDataSource(file2.toString());
                for (long i = 0; i < 3000; i += 1000 / 24) {
                    bitmap2 = retriever2.getFrameAtTime(i * 1000 / 29, MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
                    // bitmapArray2.add(bitmap2);
                    File file = new File(ScreenDIR, "sketchpad2" + id2 + ".png");
                    FileOutputStream fOut = null;
                    try {
                        fOut = new FileOutputStream(file);
                    } catch (FileNotFoundException e) {
                        e.printStackTrace();
                    }
                    bitmap2.compress(Bitmap.CompressFormat.PNG, 85, fOut);
                    try {
                        fOut.flush();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                    try {
                        fOut.close();
                        id2++;
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            }
            if (file3.exists()) {
                retriever3.setDataSource(file3.toString());
                for (long i = 0; i < 3000; i += 1000 / 24) {
                    bitmap3 = retriever3.getFrameAtTime(i * 1000 / 29, MediaMetadataRetriever.OPTION_CLOSEST_SYNC);
                    // bitmapArray3.add(bitmap3);
                    File file = new File(ScreenDIR, "sketchpad3" + id3 + ".png");
                    FileOutputStream fOut = null;
                    try {
                        fOut = new FileOutputStream(file);
                    } catch (FileNotFoundException e) {
                        e.printStackTrace();
                    }
                    bitmap3.compress(Bitmap.CompressFormat.PNG, 85, fOut);
                    try {
                        fOut.flush();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                    try {
                        fOut.close();
                        id3++;
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }
            } */
            return null;
        }

        /**
         * After completing the background task, dismiss the progress dialog
         * and show three of the cached frames.
         */
        @Override
        protected void onPostExecute(String file_url) {
            pDialog.dismiss();
            img.setImageBitmap(retrieveBitmapFromCache(String.valueOf(id2)));
            id2 = 50;
            img2.setImageBitmap(retrieveBitmapFromCache(String.valueOf(id2)));
            id2 = 69;
            img3.setImageBitmap(retrieveBitmapFromCache(String.valueOf(id2)));
            // img2.setImageBitmap(bitmapArray2.get(0));
            // img3.setImageBitmap(bitmapArray3.get(0));
        }
    }

    public void saveBitmapToCache(Bitmap bb, String ID) {
        Cache.getInstance().getLru().put(ID, bb);
    }

    public Bitmap retrieveBitmapFromCache(String ID) {
        Bitmap bitmap = (Bitmap) Cache.getInstance().getLru().get(ID);
        return bitmap;
    }

    public Bitmap getResizedBitmap(Bitmap image, int maxSize) {
        int width = image.getWidth();
        int height = image.getHeight();
        float bitmapRatio = (float) width / (float) height;
        if (bitmapRatio > 1) { // landscape: clamp the width
            width = maxSize;
            height = (int) (width / bitmapRatio);
        } else { // portrait or square: clamp the height
            height = maxSize;
            width = (int) (height * bitmapRatio);
        }
        return Bitmap.createScaledBitmap(image, width, height, true);
    }
}
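The snippet above calls a Cache helper that the question does not show. A minimal sketch of what such a helper could look like, assuming it follows the common LruCache-singleton pattern this code appears to use (the Cache class name comes from the calls above; the sizing policy here is hypothetical):

import android.graphics.Bitmap;
import android.util.LruCache;

public class Cache {
    private static Cache instance;
    private final LruCache<String, Bitmap> lru;

    private Cache() {
        // Hypothetical sizing: one eighth of the VM's max memory, in kilobytes.
        int maxKb = (int) (Runtime.getRuntime().maxMemory() / 1024 / 8);
        lru = new LruCache<String, Bitmap>(maxKb) {
            @Override
            protected int sizeOf(String key, Bitmap value) {
                // Measure entries by bitmap size rather than entry count.
                return value.getByteCount() / 1024;
            }
        };
    }

    public static synchronized Cache getInstance() {
        if (instance == null) {
            instance = new Cache();
        }
        return instance;
    }

    public LruCache<String, Bitmap> getLru() {
        return lru;
    }
}

As for the collage itself, extracting bitmaps only gives thumbnails; to composite the videos into a single output file, FFmpeg's hstack/xstack filters can tile inputs within one frame (for example -filter_complex hstack=inputs=2 with two inputs of equal height), assuming an FFmpeg build recent enough to include those filters.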
-
Why are FFmpeg commands not working on Marshmallow and Lollipop?
22 February 2016, by Andy Developer — Why is my code not working on Marshmallow and Lollipop devices? Or any idea how to use FFmpeg on those versions? Any help.
import android.os.Bundle;
import android.os.Environment;
import android.support.v7.app.AppCompatActivity;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.Toast;

import com.kru.ffmpeglibs.FFmpeg;
import com.kru.ffmpeglibs.FFmpegExecuteResponseHandler;
import com.kru.ffmpeglibs.FFmpegLoadBinaryResponseHandler;
import com.kru.ffmpeglibs.exceptions.FFmpegCommandAlreadyRunningException;
import com.kru.ffmpeglibs.exceptions.FFmpegNotSupportedException;

public class CommandsActivity extends AppCompatActivity {

    private FFmpeg fFmpeg;
    private Button btnGenerate;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        fFmpeg = FFmpeg.getInstance(CommandsActivity.this);
        executeBinary();
        btnGenerate = (Button) findViewById(R.id.btnGenerate);
        btnGenerate.setOnClickListener(new OnClickListener() {
            @Override
            public void onClick(View v) {
                try {
                    String[] ffmpegCommand = { "-i "
                            + Environment.getExternalStorageDirectory().getPath()
                            + "/vid.mp4"
                            + " -r 10 "
                            + Environment.getExternalStorageDirectory().getPath()
                            + "/com.mobvcasting.mjpegffmpeg/frame_%05d.jpg" };
                    executeCommand(ffmpegCommand);
                } catch (FFmpegCommandAlreadyRunningException e) {
                    e.printStackTrace();
                }
            }
        });
    }

    private void executeCommand(String[] cmd)
            throws FFmpegCommandAlreadyRunningException {
        fFmpeg.execute(cmd, new FFmpegExecuteResponseHandler() {
            @Override
            public void onSuccess(String message) {
                Toast.makeText(CommandsActivity.this, "Success",
                        Toast.LENGTH_SHORT).show();
                System.out.println(message);
            }

            @Override
            public void onProgress(String message) {
                // Toast.makeText(MainActivity.this, "On Process",
                //         Toast.LENGTH_SHORT).show();
                System.out.println(message);
            }

            @Override
            public void onFailure(String message) {
                Toast.makeText(CommandsActivity.this, "Failed",
                        Toast.LENGTH_SHORT).show();
                System.out.println(message);
            }

            @Override
            public void onStart() {
            }

            @Override
            public void onFinish() {
                Toast.makeText(CommandsActivity.this, "Finish",
                        Toast.LENGTH_SHORT).show();
            }
        });
    }

    private void executeBinary() {
        try {
            fFmpeg.loadBinary(new FFmpegLoadBinaryResponseHandler() {
                @Override
                public void onFailure() {
                }

                @Override
                public void onSuccess() {
                }

                @Override
                public void onStart() {
                }

                @Override
                public void onFinish() {
                }
            });
        } catch (FFmpegNotSupportedException e) {
            e.printStackTrace();
        }
    }
}
Here is my code, but it is still not working. Please tell me what is wrong in the code.
The exception I got is something like this:
02-22 11:18:41.469: E/AndroidRuntime(27839): FATAL EXCEPTION: main
02-22 11:18:41.469: E/AndroidRuntime(27839): java.lang.UnsatisfiedLinkError: Native method not found: com.kru.ffmpeglibs.ArmArchHelper.cpuArchFromJNI:()Ljava/lang/String;
02-22 11:18:41.469: E/AndroidRuntime(27839): at com.kru.ffmpeglibs.ArmArchHelper.cpuArchFromJNI(Native Method)
02-22 11:18:41.469: E/AndroidRuntime(27839): at com.kru.ffmpeglibs.CpuArchHelper.getCpuArch(CpuArchHelper.java:61)
02-22 11:18:41.469: E/AndroidRuntime(27839): at com.kru.ffmpeglibs.FFmpeg.loadBinary(FFmpeg.java:40)
02-22 11:18:41.469: E/AndroidRuntime(27839): at com.kru.sampleffmpeg.MainActivity.loadFFMpegBinary(MainActivity.java:68)
02-22 11:18:41.469: E/AndroidRuntime(27839): at com.kru.sampleffmpeg.MainActivity.onCreate(MainActivity.java:36)
02-22 11:18:41.469: E/AndroidRuntime(27839): at android.app.Activity.performCreate(Activity.java:5372)
02-22 11:18:41.469: E/AndroidRuntime(27839): at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1104)
02-22 11:18:41.469: E/AndroidRuntime(27839): at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2257)
02-22 11:18:41.469: E/AndroidRuntime(27839): at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2349)
02-22 11:18:41.469: E/AndroidRuntime(27839): at android.app.ActivityThread.access$700(ActivityThread.java:159)
02-22 11:18:41.469: E/AndroidRuntime(27839): at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1316)
02-22 11:18:41.469: E/AndroidRuntime(27839): at android.os.Handler.dispatchMessage(Handler.java:99)
02-22 11:18:41.469: E/AndroidRuntime(27839): at android.os.Looper.loop(Looper.java:176)
02-22 11:18:41.469: E/AndroidRuntime(27839): at android.app.ActivityThread.main(ActivityThread.java:5419)
02-22 11:18:41.469: E/AndroidRuntime(27839): at java.lang.reflect.Method.invokeNative(Native Method)
02-22 11:18:41.469: E/AndroidRuntime(27839): at java.lang.reflect.Method.invoke(Method.java:525)
02-22 11:18:41.469: E/AndroidRuntime(27839): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1046)
02-22 11:18:41.469: E/AndroidRuntime(27839): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:862)
02-22 11:18:41.469: E/AndroidRuntime(27839): at dalvik.system.NativeStart.main(Native Method)
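Two hedged observations rather than a confirmed fix. First, the UnsatisfiedLinkError on cpuArchFromJNI usually means the APK does not contain the native helper library for the device's ABI; 64-bit arm64-v8a hardware became common with Lollipop and Marshmallow, while many prebuilt FFmpeg wrappers ship only 32-bit armeabi binaries, so it is worth checking which lib/ folders actually end up in the APK. Second, in the upstream ffmpeg-android-java project that this com.kru.ffmpeglibs package appears to be based on, execute() expects each command-line token as its own array element, not one concatenated string. A minimal sketch under that assumption, reusing the paths from the question:

// Hypothetical rewrite, assuming the library passes each array element
// to FFmpeg as a separate argument.
String input = Environment.getExternalStorageDirectory().getPath() + "/vid.mp4";
String output = Environment.getExternalStorageDirectory().getPath()
        + "/com.mobvcasting.mjpegffmpeg/frame_%05d.jpg";
String[] ffmpegCommand = { "-i", input, "-r", "10", output };
executeCommand(ffmpegCommand);
-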
FFmpeg cannot open video file after adding the GLSurfaceView to render frames
4 April 2016, by Kyle Lo — The source code works perfectly without any modification.
I successfully use the function below to play the specified video.
playview.openVideoFile("/sdcard/Test/mv.mp4");
For research purposes I need to display the frames using OpenGL ES, so I removed the original method below.
ANativeWindow* window = ANativeWindow_fromSurface(env, javaSurface);
ANativeWindow_Buffer buffer;
if (ANativeWindow_lock(window, &buffer, NULL) == 0) {
    memcpy(buffer.bits, pixels, w * h * 2);
    ANativeWindow_unlockAndPost(window);
}
ANativeWindow_release(window);
Then I added the FrameRenderer class to my project.
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import android.opengl.GLSurfaceView;

public class FrameRenderer implements GLSurfaceView.Renderer {

    public long time = 0;
    public short framerate = 0;
    public long fpsTime = 0;
    public long frameTime = 0;
    public float avgFPS = 0;
    private PlayNative mNative = null;

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) { /* do nothing */ }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) {
    }

    @Override
    public void onDrawFrame(GL10 gl) {
        mNative.render();
    }
}
On the native side I created a corresponding method in VideoPlay.cpp, and I only use glClearColor to test whether the OpenGL calls work:
void VideoPlay::render() {
    glClearColor(1.0f, 0.0f, 0.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
}
And the onCreate is as below.
protected void onCreate(Bundle savedInstanceState) {
    // TODO Auto-generated method stub
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main_layout);
    playview = new PlayView(this);
    playview.openVideoFile("/sdcard/test_tt_racing.mp4");
    //playview.openVideoFile("/sdcard/big_buck_bunny.mp4");
    GLSurfaceView surface = (GLSurfaceView) findViewById(R.id.surfaceviewclass);
    surface.setRenderer(new FrameRenderer());
    ...
Then I tested it on the phone: the screen becomes red, which means the GLSurfaceView and OpenGL work fine.
But after I press the play button, the whole app gets stuck, and an error shows in the log. My question is: why can I no longer open the video, whose path is exactly the same as before, once I have added the GLSurfaceView renderer, and how can I fix it?
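One detail worth flagging, as a guess since the full source is not shown: in the FrameRenderer above, mNative is initialized to null and never assigned, yet onDrawFrame dereferences it on the GL thread as soon as the renderer starts. A minimal sketch of handing the player to the renderer instead, assuming PlayNative is the Java wrapper that exposes the native render() shown earlier:

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import android.opengl.GLSurfaceView;

public class FrameRenderer implements GLSurfaceView.Renderer {

    private final PlayNative mNative;

    public FrameRenderer(PlayNative nativePlayer) {
        mNative = nativePlayer;
    }

    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) { /* no-op */ }

    @Override
    public void onSurfaceChanged(GL10 gl, int width, int height) { }

    @Override
    public void onDrawFrame(GL10 gl) {
        // Runs on the GLSurfaceView's GL thread, not the UI or decode thread;
        // any state shared with the FFmpeg playback code needs synchronization.
        if (mNative != null) {
            mNative.render();
        }
    }
}

If the decode loop and the GL loop contend for the same native state, the stall after pressing play could also come from the default continuous render mode; calling surface.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY) after setRenderer, and requestRender() once per decoded frame, is the usual way to keep the two threads from fighting over the file.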