
Media (1)
-
Somos millones 1
21 July 2014, by
Updated: June 2015
Language: French
Type: Video
Other articles (68)
-
Diogene: creating custom content-editing form templates
26 October 2010, by
Diogene is one of the SPIP plugins enabled by default (as an extension) when MediaSPIP is initialized.
What this plugin is for
Creating form templates
The Diogène plugin lets you create custom form templates, per section, for the three core SPIP objects: articles, sections (rubriques) and sites.
It can thus define, for a given section, one form template per object, adding or removing fields in order to make the form (...) -
MediaSPIP version 0.1 Beta
16 April 2011, by
MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
The zip file provided here contains only the MediaSPIP sources, as a standalone version.
To get a working installation, all of the software dependencies must be installed manually on the server.
If you want to use this archive for a farm-mode installation, you will also need to make further changes (...) -
Use and configuration of the script
19 January 2011, by
Information specific to the Debian distribution
If you use this distribution, you will need to enable the "debian-multimedia" repositories as explained here:
Since version 0.3.1 of the script, the repository can be enabled automatically in response to a prompt.
Retrieving the script
The installation script can be retrieved in two different ways.
Via svn, using the following command to fetch the up-to-date source code:
svn co (...)
On other sites (5211)
-
Background process that is run on a file is still running after the file is deleted
19 October 2015, by Marko Djokic
So I have a process in Android that is started like this:
process = Runtime.getRuntime().exec(commandString);
Runtime.getRuntime().addShutdownHook(new Thread(new Runnable() {
    public void run() {
        Log.d("ShutDownHook", "Called");
        process.destroy();
    }
}));
where the command string contains a file path and some other arguments.
That process is CPU-heavy and can last as long as a few hours (movie transcoding). The file is a static ffmpeg binary. The problem is that in some cases the process stays in the background even though I killed the app. That happens on one of my phones when I kill the app with the task manager; in that case onDestroy() is not called, nor is onTerminate() from the Application class, nor the shutdown hook above.
I have also created a background service with a notification, so when I exit the app the service should stay in the background. Even with this kind of control, when I kill the application with the task manager the service is restarted and I lose all references to my ffmpeg class, AsyncTask, process, etc., and I cannot stop the process because I only have null pointers.
Is there any way I can ensure that the service will not be affected when the app is killed, so that I can stop the process from the notification bar of my service?
ServiceConnection mConnection = new ServiceConnection() {
public void onServiceDisconnected(ComponentName name) {
Log.d("HelperActivity", "onServiceDisconnected");
mIsBound = false;
mServer = null;
}
public void onServiceConnected(ComponentName name, IBinder service) {
Log.d("HelperActivity", "onServiceConnected");
mIsBound = true;
NotificationService.LocalBinder mLocalBinder = (NotificationService.LocalBinder)service;
mServer = mLocalBinder.getServerInstance();
//mServer.test();
}
};
//This is the onCreate of the application class
@Override
public void onCreate() {
super.onCreate();
Intent intent = new Intent(this,
NotificationService.class);
startService(intent);
bindService(intent, mConnection, Context.BIND_AUTO_CREATE);
Log.d("onCreate", "Called");
singleton = this;
}
I have been using http://hiteshsondhi88.github.io/ffmpeg-android-java/ for the ffmpeg support. Of course I added new features, but the basic concept of running an ffmpeg command is the same.
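One direction worth sketching for the service question above: NotificationService could run as a foreground service, which raises its priority so the system is less inclined to kill it, and it can opt out of the automatic restart that currently comes back with null references. The class below is purely illustrative (its name, notification text and icon are not from the post); it only shows the startForeground() plus START_NOT_STICKY pattern, under the assumption that the poster's own NotificationService would be adapted along these lines rather than replaced.
import android.app.Notification;
import android.app.Service;
import android.content.Intent;
import android.os.IBinder;

public class TranscodeService extends Service {

    private static final int NOTIFICATION_ID = 42;

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        // Notification.Builder#build() needs API 16+; use getNotification() below that.
        Notification notification = new Notification.Builder(this)
                .setContentTitle("Transcoding")
                .setContentText("ffmpeg is running")
                .setSmallIcon(android.R.drawable.stat_sys_download)
                .build();
        // Promote to foreground: the service gets a visible notification and a much
        // higher process priority, so task-manager kills are less likely.
        startForeground(NOTIFICATION_ID, notification);
        // If the process is killed anyway, do not restart blindly with null references.
        return START_NOT_STICKY;
    }

    @Override
    public IBinder onBind(Intent intent) {
        // Started (not bound) usage in this sketch.
        return null;
    }
}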
One other strange thing: when I start the app I delete the previous file, yet the process is still running in the background, because my transcoding performance is halved. The file is copied from the assets to internal storage every time the app is started.
File ffmpegFile = new File(FileUtils.getFFmpeg(context));
if (ffmpegFile.exists()) {
    Log.d("FileCheck", "File exists!");
    ffmpegFile.delete();
}
The file is deleted, I checked, but my CPU is still heavily used.
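Two notes on this, offered as a sketch rather than a definitive answer. First, deleting the binary does not stop a process that is already executing it: on Linux the kernel keeps the unlinked file alive until the last process using it exits, which is why the CPU stays busy. Second, to regain control after the service restart loses its references, one option is to persist the child process's PID when it is started and kill it later. The helper below is illustrative only: its name is made up, and reading the hidden "pid" field via reflection relies on an implementation detail of older Android versions that may not work everywhere.
import android.content.Context;
import android.content.SharedPreferences;
import java.lang.reflect.Field;

public final class NativeProcessTracker {

    private static final String PREFS = "ffmpeg_tracker";
    private static final String KEY_PID = "ffmpeg_pid";

    private NativeProcessTracker() {}

    // Best-effort extraction of the PID of a child process; returns -1 on failure.
    public static int pidOf(Process process) {
        try {
            Field pidField = process.getClass().getDeclaredField("pid");
            pidField.setAccessible(true);
            return pidField.getInt(process);
        } catch (Exception e) {
            return -1;
        }
    }

    // Persist the PID right after Runtime.exec() so it survives a service restart.
    public static void remember(Context context, Process process) {
        int pid = pidOf(process);
        context.getSharedPreferences(PREFS, Context.MODE_PRIVATE)
                .edit().putInt(KEY_PID, pid).apply();
    }

    // Kill a previously started ffmpeg child, if one was recorded.
    public static void killRemembered(Context context) {
        SharedPreferences prefs = context.getSharedPreferences(PREFS, Context.MODE_PRIVATE);
        int pid = prefs.getInt(KEY_PID, -1);
        if (pid > 0) {
            // SIGKILL works here because the child runs under the same uid as the app.
            android.os.Process.killProcess(pid);
            prefs.edit().remove(KEY_PID).apply();
        }
    }
}
With something like this, the code right after Runtime.getRuntime().exec(commandString) could call NativeProcessTracker.remember(context, process), and the restarted service's notification action could call NativeProcessTracker.killRemembered(context) to stop a lingering ffmpeg child.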
Sorry for any mistakes in the post, it is my first one. -
Android Studio + OpenCV + FFmpeg
26 February 2016, by t0m
I have a problem with code that works only on a Genymotion device with Android 4.1.1, but not on a Genymotion device with 5.0.1 or on a real device (Huawei Honor 4C, Android 4.4.2).
I imported OpenCV 3.1 into Android Studio following http://stackoverflow.com/a/27421494/4244605
I added JavaCV with FFmpeg from https://github.com/bytedeco/javacv
Android Studio 1.5.1
minSdkVersion 15
compileSdkVersion 23
The code is only a test.
OpenCVCameraActivity.java:
package co.timeiseverything.pstimeiseverything;
import android.app.Activity;
import android.hardware.Camera;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.os.Environment;
import android.util.Log;
import android.view.Menu;
import android.view.MenuItem;
import android.view.MotionEvent;
import android.view.SubMenu;
import android.view.SurfaceView;
import android.view.View;
import android.view.WindowManager;
import android.widget.Toast;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;
import org.opencv.android.BaseLoaderCallback;
import org.opencv.android.CameraBridgeViewBase;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;
import org.opencv.core.Mat;
import java.io.File;
import java.nio.ShortBuffer;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.List;
import java.util.ListIterator;
@SuppressWarnings("ALL")
public class OpenCVCameraActivity extends Activity implements
CameraBridgeViewBase.CvCameraViewListener2,
View.OnTouchListener {
//name of activity, for DEBUGGING
private static final String TAG = OpenCVCameraActivity.class.getSimpleName();
private OpenCVCameraPreview mOpenCvCameraView;
private List<Camera.Size> mResolutionList;
private MenuItem[] mEffectMenuItems;
private SubMenu mColorEffectsMenu;
private MenuItem[] mResolutionMenuItems;
private SubMenu mResolutionMenu;
private static long frameCounter = 0;
long startTime = 0;
private Mat edgesMat;
boolean recording = false;
private int sampleAudioRateInHz = 44100;
private int imageWidth = 1920;
private int imageHeight = 1080;
private int frameRate = 30;
private Frame yuvImage = null;
private File ffmpeg_link;
private FFmpegFrameRecorder recorder;
/* audio data getting thread */
private AudioRecord audioRecord;
private AudioRecordRunnable audioRecordRunnable;
private Thread audioThread;
volatile boolean runAudioThread = true;
ShortBuffer[] samples;
private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
@Override
public void onManagerConnected(int status) {
switch (status) {
case LoaderCallbackInterface.SUCCESS:
Log.i(TAG, "OpenCV loaded successfully");
mOpenCvCameraView.enableView();
mOpenCvCameraView.setOnTouchListener(OpenCVCameraActivity.this);
break;
default:
super.onManagerConnected(status);
break;
}
}
};
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
if(Static.DEBUG) Log.i(TAG, "onCreate()");
getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
try {
setContentView(R.layout.activity_opencv);
mOpenCvCameraView = (OpenCVCameraPreview) findViewById(R.id.openCVCameraPreview);
mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);
mOpenCvCameraView.setCvCameraViewListener(this);
//mOpenCvCameraView.enableFpsMeter();
ffmpeg_link = new File(Environment.getExternalStorageDirectory(), "stream.mp4");
} catch (Exception e){
e.printStackTrace();
}
}
@Override
protected void onRestart() {
if (Static.DEBUG) Log.i(TAG, "onRestart()");
super.onRestart();
}
@Override
protected void onStart() {
if (Static.DEBUG) Log.i(TAG, "onStart()");
super.onStart();
}
@Override
protected void onResume() {
if (Static.DEBUG) Log.i(TAG, "onResume()");
super.onResume();
if (!OpenCVLoader.initDebug()) {
Log.i(TAG, "Internal OpenCV library not found. Using OpenCV Manager for initialization");
OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_11, this, mLoaderCallback);
} else {
Log.i(TAG, "OpenCV library found inside package. Using it!");
mLoaderCallback.onManagerConnected(LoaderCallbackInterface.SUCCESS);
}
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
if (Static.DEBUG) Log.i(TAG, "onCreateOptionsMenu()");
super.onCreateOptionsMenu(menu);
List<String> effects = mOpenCvCameraView.getEffectList();
if (effects == null) {
Log.e(TAG, "Color effects are not supported by device!");
return true;
}
mColorEffectsMenu = menu.addSubMenu("Color Effect");
mEffectMenuItems = new MenuItem[effects.size()];
int idx = 0;
ListIterator<String> effectItr = effects.listIterator();
while(effectItr.hasNext()) {
String element = effectItr.next();
mEffectMenuItems[idx] = mColorEffectsMenu.add(1, idx, Menu.NONE, element);
idx++;
}
mResolutionMenu = menu.addSubMenu("Resolution");
mResolutionList = mOpenCvCameraView.getResolutionList();
mResolutionMenuItems = new MenuItem[mResolutionList.size()];
ListIterator<Camera.Size> resolutionItr = mResolutionList.listIterator();
idx = 0;
while(resolutionItr.hasNext()) {
Camera.Size element = resolutionItr.next();
mResolutionMenuItems[idx] = mResolutionMenu.add(2, idx, Menu.NONE,
Integer.valueOf(element.width).toString() + "x" + Integer.valueOf(element.height).toString());
idx++;
}
return true;
}
@Override
protected void onPause() {
if (Static.DEBUG) Log.i(TAG, "onPause()");
super.onPause();
if (mOpenCvCameraView != null)
mOpenCvCameraView.disableView();
}
@Override
protected void onStop() {
if (Static.DEBUG) Log.i(TAG, "onStop()");
super.onStop();
}
@Override
protected void onDestroy() {
if (Static.DEBUG) Log.i(TAG, "onDestroy()");
super.onDestroy();
if (mOpenCvCameraView != null)
mOpenCvCameraView.disableView();
}
public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {
++frameCounter;
//Log.i(TAG, "Frame number: "+frameCounter);
return inputFrame.rgba();
}
@Override
public void onCameraViewStarted(int width, int height) {
edgesMat = new Mat();
}
@Override
public void onCameraViewStopped() {
if (edgesMat != null)
edgesMat.release();
edgesMat = null;
}
public boolean onOptionsItemSelected(MenuItem item) {
Log.i(TAG, "called onOptionsItemSelected; selected item: " + item);
if (item.getGroupId() == 1)
{
mOpenCvCameraView.setEffect((String) item.getTitle());
Toast.makeText(this, mOpenCvCameraView.getEffect(), Toast.LENGTH_SHORT).show();
} else if (item.getGroupId() == 2) {
int id = item.getItemId();
Camera.Size resolution = mResolutionList.get(id);
mOpenCvCameraView.setResolution(resolution);
resolution = mOpenCvCameraView.getResolution();
String caption = Integer.valueOf(resolution.width).toString() + "x" + Integer.valueOf(resolution.height).toString();
Toast.makeText(this, caption, Toast.LENGTH_SHORT).show();
}
return true;
}
@Override
public boolean onTouch(View v, MotionEvent event) {
Log.i(TAG,"onTouch event");
SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd_HH-mm-ss");
String currentDateandTime = sdf.format(new Date());
String fileName = Environment.getExternalStorageDirectory().getPath() +
"/sample_picture_" + currentDateandTime + ".jpg";
mOpenCvCameraView.takePicture(fileName);
Toast.makeText(this, fileName + " saved", Toast.LENGTH_SHORT).show();
return false;
}
/**
* Click to ImageButton to start recording.
*/
public void onClickBtnStartRecord2(View v) {
if (Static.DEBUG) Log.i(TAG, "onClickBtnStartRecord()");
if(!recording)
startRecording();
else
stopRecording();
}
private void startRecording() {
if (Static.DEBUG) Log.i(TAG, "startRecording()");
initRecorder();
try {
recorder.start();
startTime = System.currentTimeMillis();
recording = true;
audioThread.start();
} catch(FFmpegFrameRecorder.Exception e) {
e.printStackTrace();
}
}
private void stopRecording() {
if (Static.DEBUG) Log.i(TAG, "stopRecording()");
runAudioThread = false;
try {
audioThread.join();
} catch(InterruptedException e) {
e.printStackTrace();
}
audioRecordRunnable = null;
audioThread = null;
if(recorder != null && recording) {
recording = false;
Log.v(TAG, "Finishing recording, calling stop and release on recorder");
try {
recorder.stop();
recorder.release();
} catch(FFmpegFrameRecorder.Exception e) {
e.printStackTrace();
}
recorder = null;
}
}
//---------------------------------------
// initialize ffmpeg_recorder
//---------------------------------------
private void initRecorder() {
Log.w(TAG, "init recorder");
try {
if (yuvImage == null) {
yuvImage = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
Log.i(TAG, "create yuvImage");
}
Log.i(TAG, "ffmpeg_url: " + ffmpeg_link.getAbsolutePath());
Log.i(TAG, "ffmpeg_url: " + ffmpeg_link.exists());
recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
recorder.setFormat("mp4");
recorder.setSampleRate(sampleAudioRateInHz);
// Set in the surface changed method
recorder.setFrameRate(frameRate);
Log.i(TAG, "recorder initialize success");
audioRecordRunnable = new AudioRecordRunnable();
audioThread = new Thread(audioRecordRunnable);
runAudioThread = true;
} catch (Exception e){
e.printStackTrace();
}
}
//---------------------------------------------
// audio thread, gets and encodes audio data
//---------------------------------------------
class AudioRecordRunnable implements Runnable {
@Override
public void run() {
android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
// Audio
int bufferSize;
ShortBuffer audioData;
int bufferReadResult;
bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
audioData = ShortBuffer.allocate(bufferSize);
Log.d(TAG, "audioRecord.startRecording()");
audioRecord.startRecording();
/* ffmpeg_audio encoding loop */
while(runAudioThread) {
//Log.v(TAG,"recording? " + recording);
bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
audioData.limit(bufferReadResult);
if(bufferReadResult > 0) {
Log.v(TAG, "bufferReadResult: " + bufferReadResult);
// If "recording" isn't true when start this thread, it never get's set according to this if statement...!!!
// Why? Good question...
if(recording) {
try {
recorder.recordSamples(audioData);
//Log.v(TAG,"recording " + 1024*i + " to " + 1024*i+1024);
} catch(FFmpegFrameRecorder.Exception e) {
Log.v(TAG, e.getMessage());
e.printStackTrace();
}
}
}
}
Log.v(TAG, "AudioThread Finished, release audioRecord");
/* encoding finish, release recorder */
if(audioRecord != null) {
audioRecord.stop();
audioRecord.release();
audioRecord = null;
Log.v(TAG, "audioRecord released");
}
}
}
}
OpenCVCameraPreview.java:
package co.timeiseverything.pstimeiseverything;
import android.content.Context;
import android.hardware.Camera;
import android.util.AttributeSet;
import android.util.Log;
import org.opencv.android.JavaCameraView;
import java.io.FileOutputStream;
import java.util.List;
public class OpenCVCameraPreview extends JavaCameraView implements Camera.PictureCallback {
private static final String TAG = OpenCVCameraPreview.class.getSimpleName();
private String mPictureFileName;
public OpenCVCameraPreview(Context context, AttributeSet attrs) {
super(context, attrs);
}
public List<String> getEffectList() {
return mCamera.getParameters().getSupportedColorEffects();
}
public boolean isEffectSupported() {
return (mCamera.getParameters().getColorEffect() != null);
}
public String getEffect() {
return mCamera.getParameters().getColorEffect();
}
public void setEffect(String effect) {
Camera.Parameters params = mCamera.getParameters();
params.setColorEffect(effect);
mCamera.setParameters(params);
}
public List<Camera.Size> getResolutionList() {
return mCamera.getParameters().getSupportedPreviewSizes();
}
public void setResolution(Camera.Size resolution) {
disconnectCamera();
mMaxHeight = resolution.height;
mMaxWidth = resolution.width;
connectCamera(getWidth(), getHeight());
}
public Camera.Size getResolution() {
return mCamera.getParameters().getPreviewSize();
}
public void takePicture(final String fileName) {
Log.i(TAG, "Taking picture");
this.mPictureFileName = fileName;
// Postview and jpeg are sent in the same buffers if the queue is not empty when performing a capture.
// Clear up buffers to avoid mCamera.takePicture to be stuck because of a memory issue
mCamera.setPreviewCallback(null);
// PictureCallback is implemented by the current class
mCamera.takePicture(null, null, this);
}
@Override
public void onPictureTaken(byte[] data, Camera camera) {
Log.i(TAG, "Saving a bitmap to file");
// The camera preview was automatically stopped. Start it again.
mCamera.startPreview();
mCamera.setPreviewCallback(this);
// Write the image in a file (in jpeg format)
try {
FileOutputStream fos = new FileOutputStream(mPictureFileName);
fos.write(data);
fos.close();
} catch (java.io.IOException e) {
Log.e("PictureDemo", "Exception in photoCallback", e);
}
}
}
Gradle:
apply plugin: 'com.android.application'
android {
compileSdkVersion 23
buildToolsVersion "23.0.2"
defaultConfig {
applicationId "co.timeiseverything.pstimeiseverything"
minSdkVersion 15
targetSdkVersion 23
versionCode 1
versionName "1.0"
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
}
}
packagingOptions {
exclude 'META-INF/maven/org.bytedeco.javacpp-presets/opencv/pom.properties'
exclude 'META-INF/maven/org.bytedeco.javacpp-presets/opencv/pom.xml'
exclude 'META-INF/maven/org.bytedeco.javacpp-presets/ffmpeg/pom.properties'
exclude 'META-INF/maven/org.bytedeco.javacpp-presets/ffmpeg/pom.xml'
}
}
repositories {
mavenCentral()
}
dependencies {
compile fileTree(include: ['*.jar'], dir: 'libs')
testCompile 'junit:junit:4.12'
compile 'com.android.support:appcompat-v7:23.1.1'
compile 'com.google.android.gms:play-services-appindexing:8.1.0'
compile group: 'org.bytedeco', name: 'javacv', version: '1.1'
compile group: 'org.bytedeco.javacpp-presets', name: 'opencv', version: '3.0.0-1.1', classifier: 'android-arm'
compile group: 'org.bytedeco.javacpp-presets', name: 'opencv', version: '3.0.0-1.1', classifier: 'android-x86'
compile group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '2.8.1-1.1', classifier: 'android-arm'
compile group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '2.8.1-1.1', classifier: 'android-x86'
compile project(':openCVLibrary310')
}
proguard-rules.pro
Edited according to the linked instructions.
jniLibs:
app/src/main/jniLibs: armeabi, armeabi-v7a, arm64-v8a, mips, mips64, x86, x86_64
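Before looking at the crash below, it can help to confirm which native libraries actually got installed for the running device's ABI, since the .so files in app/src/main/jniLibs and the ones pulled in by the javacpp-presets artifacts have to agree. The small diagnostic below is only a sketch (the class name is mine, not part of the project); it just logs the contents of the app's native library directory.
import android.content.Context;
import android.util.Log;
import java.io.File;

public final class NativeLibDump {

    private NativeLibDump() {}

    // Log every native library extracted for this install's ABI.
    public static void dump(Context context) {
        String dir = context.getApplicationInfo().nativeLibraryDir;
        Log.i("NativeLibDump", "nativeLibraryDir = " + dir);
        File[] libs = new File(dir).listFiles();
        if (libs != null) {
            for (File lib : libs) {
                Log.i("NativeLibDump", lib.getName() + " (" + lib.length() + " bytes)");
            }
        }
    }
}
Calling NativeLibDump.dump(this) from onCreate() and comparing the output on the working 4.1.1 emulator with the failing devices shows whether the same set of libjniav*.so / libav*.so files is present on both.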
Problem
02-19 11:57:37.684 1759-1759/ I/OpenCVCameraActivity: onClickBtnStartRecord()
02-19 11:57:37.684 1759-1759/ I/OpenCVCameraActivity: startRecording()
02-19 11:57:37.684 1759-1759/ W/OpenCVCameraActivity: init recorder
02-19 11:57:37.691 1759-1759/ I/OpenCVCameraActivity: create yuvImage
02-19 11:57:37.691 1759-1759/ I/OpenCVCameraActivity: ffmpeg_url: /storage/emulated/0/stream.mp4
02-19 11:57:37.696 1759-1759/ I/OpenCVCameraActivity: ffmpeg_url: false
02-19 11:57:37.837 1759-1759/ W/linker: libjniavutil.so: unused DT entry: type 0x1d arg 0x18cc3
02-19 11:57:37.837 1759-1759/ W/linker: libjniavutil.so: unused DT entry: type 0x6ffffffe arg 0x21c30
02-19 11:57:37.837 1759-1759/ W/linker: libjniavutil.so: unused DT entry: type 0x6fffffff arg 0x1
02-19 11:57:37.838 1759-1759/co.timeiseverything.pstimeiseverything E/art: dlopen("/data/app/co.timeiseverything.pstimeiseverything-2/lib/x86/libjniavutil.so", RTLD_LAZY) failed: dlopen failed: cannot locate symbol "av_version_info" referenced by "libjniavutil.so"...
02-19 11:57:37.843 1759-1759/co.timeiseverything.pstimeiseverything I/art: Rejecting re-init on previously-failed class java.lang.Class
02-19 11:57:37.844 1759-1759/co.timeiseverything.pstimeiseverything E/AndroidRuntime: FATAL EXCEPTION: main
Process: co.timeiseverything.pstimeiseverything, PID: 1759
java.lang.IllegalStateException: Could not execute method of the activity
at android.view.View$1.onClick(View.java:4020)
at android.view.View.performClick(View.java:4780)
at android.view.View$PerformClick.run(View.java:19866)
at android.os.Handler.handleCallback(Handler.java:739)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:135)
at android.app.ActivityThread.main(ActivityThread.java:5254)
at java.lang.reflect.Method.invoke(Native Method)
at java.lang.reflect.Method.invoke(Method.java:372)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:903)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:698)
Caused by: java.lang.reflect.InvocationTargetException
at java.lang.reflect.Method.invoke(Native Method)
at java.lang.reflect.Method.invoke(Method.java:372)
at android.view.View$1.onClick(View.java:4015)
at android.view.View.performClick(View.java:4780)
at android.view.View$PerformClick.run(View.java:19866)
at android.os.Handler.handleCallback(Handler.java:739)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:135)
at android.app.ActivityThread.main(ActivityThread.java:5254)
at java.lang.reflect.Method.invoke(Native Method)
at java.lang.reflect.Method.invoke(Method.java:372)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:903)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:698)
Caused by: java.lang.UnsatisfiedLinkError: org.bytedeco.javacpp.avutil
at java.lang.Class.classForName(Native Method)
at java.lang.Class.forName(Class.java:309)
at org.bytedeco.javacpp.Loader.load(Loader.java:413)
at org.bytedeco.javacpp.Loader.load(Loader.java:381)
at org.bytedeco.javacpp.avcodec$AVPacket.<clinit>(avcodec.java:1650)
at org.bytedeco.javacv.FFmpegFrameRecorder.<init>(FFmpegFrameRecorder.java:149)
at org.bytedeco.javacv.FFmpegFrameRecorder.<init>(FFmpegFrameRecorder.java:129)
at co.timeiseverything.pstimeiseverything.OpenCVCameraActivity.initRecorder(OpenCVCameraActivity.java:320)
at co.timeiseverything.pstimeiseverything.OpenCVCameraActivity.startRecording(OpenCVCameraActivity.java:266)
at co.timeiseverything.pstimeiseverything.OpenCVCameraActivity.onClickBtnStartRecord2(OpenCVCameraActivity.java:259)
at java.lang.reflect.Method.invoke(Native Method)
at java.lang.reflect.Method.invoke(Method.java:372)
at android.view.View$1.onClick(View.java:4015)
at android.view.View.performClick(View.java:4780)
at android.view.View$PerformClick.run(View.java:19866)
at android.os.Handler.handleCallback(Handler.java:739)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:135)
at android.app.ActivityThread.main(ActivityThread.java:5254)
at java.lang.reflect.Method.invoke(Native Method)
at java.lang.reflect.Method.invoke(Method.java:372)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:903)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:698)
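The bottom of this trace is the real failure: java.lang.UnsatisfiedLinkError: org.bytedeco.javacpp.avutil, caused by dlopen of libjniavutil.so not finding the symbol av_version_info. That usually means the libavutil.so that ends up loaded on the device does not match the libjniav*.so shipped by the javacpp presets, for example a stale or differently built copy in jniLibs shadowing the 2.8.1-1.1 libraries. As a sketch (the class name is illustrative; Loader and avutil are the same classes JavaCV already uses in the stack trace), loading the FFmpeg presets explicitly at startup surfaces the full linker message instead of the wrapped InvocationTargetException:
import org.bytedeco.javacpp.Loader;
import org.bytedeco.javacpp.avcodec;
import org.bytedeco.javacpp.avformat;
import org.bytedeco.javacpp.avutil;

public final class FfmpegPreload {

    private FfmpegPreload() {}

    public static void tryLoad() {
        try {
            // Force the native libraries behind the presets to load now.
            Loader.load(avutil.class);
            Loader.load(avcodec.class);
            Loader.load(avformat.class);
            android.util.Log.i("FfmpegPreload", "FFmpeg natives loaded");
        } catch (UnsatisfiedLinkError e) {
            // The message here names the exact .so and the missing symbol.
            android.util.Log.e("FfmpegPreload", "FFmpeg natives failed to load", e);
        }
    }
}
Calling FfmpegPreload.tryLoad() early in onCreate(), before the first FFmpegFrameRecorder is constructed, makes it easier to see on the failing devices (the log above shows the x86 build being loaded) which library path the linker actually used.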
-
YouTube Watch Me Android application closing unexpectedly
18 September 2015, by Kichu
I created an Android application from https://github.com/youtube/yt-watchme.
When I try to create an event using the "CREATE LIVE EVENT" button, it throws the following error. I think it is happening due to a camera permission issue.
ERROR:
09-17 11:43:53.582 32383-32383/com.google.android.apps.watchme E/AndroidRuntime﹕ FATAL EXCEPTION: main
Process: com.google.android.apps.watchme, PID: 32383
java.lang.NoSuchMethodError: com.google.android.apps.watchme.StreamerActivity.checkSelfPermission
at com.google.android.apps.watchme.StreamerActivity.startStreaming(StreamerActivity.java:174)
at com.google.android.apps.watchme.StreamerActivity.access$200(StreamerActivity.java:46)
at com.google.android.apps.watchme.StreamerActivity$1.onServiceConnected(StreamerActivity.java:63)
at android.app.LoadedApk$ServiceDispatcher.doConnected(LoadedApk.java:1110)
at android.app.LoadedApk$ServiceDispatcher$RunConnection.run(LoadedApk.java:1127)
at android.os.Handler.handleCallback(Handler.java:733)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:136)
at android.app.ActivityThread.main(ActivityThread.java:5097)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:515)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:785)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:601)
at dalvik.system.NativeStart.main(Native Method)
Can anyone please suggest how I can solve this issue?
Thanks in advance.
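The NoSuchMethodError points at Activity.checkSelfPermission(), which only exists from API 23 (Android 6.0); on the Android 4.4 device the method is missing at runtime even though the code compiles against a newer SDK. A minimal sketch of the usual workaround, assuming the support library is on the classpath (StreamerActivity already imports ActivityCompat), is to route the check through ContextCompat, which simply returns PERMISSION_GRANTED on pre-Marshmallow devices:
import android.Manifest;
import android.content.Context;
import android.content.pm.PackageManager;
import android.support.v4.content.ContextCompat;

public final class PermissionChecks {

    private PermissionChecks() {}

    // True when both camera and microphone permissions are granted; on pre-M
    // devices install-time permissions make this always true.
    public static boolean hasCameraAndMic(Context context) {
        int cam = ContextCompat.checkSelfPermission(context, Manifest.permission.CAMERA);
        int mic = ContextCompat.checkSelfPermission(context, Manifest.permission.RECORD_AUDIO);
        return cam == PackageManager.PERMISSION_GRANTED
                && mic == PackageManager.PERMISSION_GRANTED;
    }
}
In startStreaming() below, the two direct checkSelfPermission(...) calls would then be replaced by ContextCompat.checkSelfPermission(this, ...).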
Code update:
Manifest file XML
<manifest package="com.google.android.apps.watchme">
<application>
<activity>
<action></action>
<category></category>
</activity>
<activity></activity>
<service></service>
</application>
</manifest>
StreamerActivity file:
/*
* Copyright (c) 2014 Google Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except
* in compliance with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software distributed under the License
* is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express
* or implied. See the License for the specific language governing permissions and limitations under
* the License.
*/
package com.google.android.apps.watchme;
import android.Manifest;
import android.app.Activity;
import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.content.ServiceConnection;
import android.content.pm.PackageManager;
import android.hardware.Camera;
import android.os.Bundle;
import android.os.IBinder;
import android.os.PowerManager;
import android.support.design.widget.Snackbar;
import android.support.v4.app.ActivityCompat;
import android.util.Log;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.ToggleButton;
import com.google.android.apps.watchme.util.Utils;
import com.google.android.apps.watchme.util.YouTubeApi;
import java.util.ArrayList;
import java.util.List;
/**
* @author Ibrahim Ulukaya <ulukaya@google.com>
* <p></p>
* StreamerActivity class which previews the camera and streams via StreamerService.
*/
public class StreamerActivity extends Activity {
// CONSTANTS
// TODO: Stop hardcoding this and read values from the camera's supported sizes.
public static final int CAMERA_WIDTH = 640;
public static final int CAMERA_HEIGHT = 480;
private static final int REQUEST_CAMERA_MICROPHONE = 0;
// Member variables
private StreamerService streamerService;
private ServiceConnection streamerConnection = new ServiceConnection() {
@Override
public void onServiceConnected(ComponentName className, IBinder service) {
Log.d(MainActivity.APP_NAME, "onServiceConnected");
streamerService = ((StreamerService.LocalBinder) service).getService();
restoreStateFromService();
startStreaming();
}
@Override
public void onServiceDisconnected(ComponentName className) {
Log.e(MainActivity.APP_NAME, "onServiceDisconnected");
// This should never happen, because our service runs in the same process.
streamerService = null;
}
};
private PowerManager.WakeLock wakeLock;
private Preview preview;
private String rtmpUrl;
private String broadcastId;
@Override
public void onCreate(Bundle savedInstanceState) {
Log.d(MainActivity.APP_NAME, "onCreate");
super.onCreate(savedInstanceState);
broadcastId = getIntent().getStringExtra(YouTubeApi.BROADCAST_ID_KEY);
//Log.v(MainActivity.APP_NAME, broadcastId);
rtmpUrl = getIntent().getStringExtra(YouTubeApi.RTMP_URL_KEY);
if (rtmpUrl == null) {
Log.w(MainActivity.APP_NAME, "No RTMP URL was passed in; bailing.");
finish();
}
Log.i(MainActivity.APP_NAME, String.format("Got RTMP URL '%s' from calling activity.", rtmpUrl));
setContentView(R.layout.streamer);
preview = (Preview) findViewById(R.id.surfaceViewPreview);
if (!bindService(new Intent(this, StreamerService.class), streamerConnection,
BIND_AUTO_CREATE | BIND_DEBUG_UNBIND)) {
Log.e(MainActivity.APP_NAME, "Failed to bind StreamerService!");
}
final ToggleButton toggleButton = (ToggleButton) findViewById(R.id.toggleBroadcasting);
toggleButton.setOnClickListener(new OnClickListener() {
@Override
public void onClick(View v) {
if (toggleButton.isChecked()) {
streamerService.startStreaming(rtmpUrl);
} else {
streamerService.stopStreaming();
}
}
});
}
@Override
protected void onResume() {
Log.d(MainActivity.APP_NAME, "onResume");
super.onResume();
if (streamerService != null) {
restoreStateFromService();
}
}
@Override
protected void onPause() {
Log.d(MainActivity.APP_NAME, "onPause");
super.onPause();
if (preview != null) {
preview.setCamera(null);
}
if (streamerService != null) {
streamerService.releaseCamera();
}
}
@Override
protected void onDestroy() {
Log.d(MainActivity.APP_NAME, "onDestroy");
super.onDestroy();
if (streamerConnection != null) {
unbindService(streamerConnection);
}
stopStreaming();
if (streamerService != null) {
streamerService.releaseCamera();
}
}
private void restoreStateFromService() {
preview.setCamera(Utils.getCamera(Camera.CameraInfo.CAMERA_FACING_FRONT));
}
private void startStreaming() {
Log.d(MainActivity.APP_NAME, "startStreaming");
PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
wakeLock = pm.newWakeLock(PowerManager.SCREEN_DIM_WAKE_LOCK, this.getClass().getName());
wakeLock.acquire();
if (!streamerService.isStreaming()) {
String cameraPermission = Manifest.permission.CAMERA;
String microphonePermission = Manifest.permission.RECORD_AUDIO;
int hasCamPermission = checkSelfPermission(cameraPermission);
int hasMicPermission = checkSelfPermission(microphonePermission);
List<String> permissions = new ArrayList<String>();
if (hasCamPermission != PackageManager.PERMISSION_GRANTED) {
permissions.add(cameraPermission);
if (ActivityCompat.shouldShowRequestPermissionRationale(this,
Manifest.permission.CAMERA)) {
// Provide rationale in Snackbar to request permission
Snackbar.make(preview, R.string.permission_camera_rationale,
Snackbar.LENGTH_INDEFINITE).show();
} else {
// Explain in Snackbar to turn on permission in settings
Snackbar.make(preview, R.string.permission_camera_explain,
Snackbar.LENGTH_INDEFINITE).show();
}
}
if (hasMicPermission != PackageManager.PERMISSION_GRANTED) {
permissions.add(microphonePermission);
if (ActivityCompat.shouldShowRequestPermissionRationale(this,
Manifest.permission.RECORD_AUDIO)) {
// Provide rationale in Snackbar to request permission
Snackbar.make(preview, R.string.permission_microphone_rationale,
Snackbar.LENGTH_INDEFINITE).show();
} else {
// Explain in Snackbar to turn on permission in settings
Snackbar.make(preview, R.string.permission_microphone_explain,
Snackbar.LENGTH_INDEFINITE).show();
}
}
if (!permissions.isEmpty()) {
String[] params = permissions.toArray(new String[permissions.size()]);
ActivityCompat.requestPermissions(this, params, REQUEST_CAMERA_MICROPHONE);
} else {
// We already have permission, so handle as normal
streamerService.startStreaming(rtmpUrl);
}
}
}
/**
* Callback received when a permissions request has been completed.
*/
@Override
public void onRequestPermissionsResult(int requestCode,
String permissions[], int[] grantResults) {
switch (requestCode) {
case REQUEST_CAMERA_MICROPHONE: {
Log.i(MainActivity.APP_NAME, "Received response for camera with mic permissions request.");
// We have requested multiple permissions for contacts, so all of them need to be
// checked.
if (Utils.verifyPermissions(grantResults)) {
// permissions were granted, yay! do the
// streamer task you need to do.
streamerService.startStreaming(rtmpUrl);
} else {
Log.i(MainActivity.APP_NAME, "Camera with mic permissions were NOT granted.");
Snackbar.make(preview, R.string.permissions_not_granted,
Snackbar.LENGTH_SHORT)
.show();
}
break;
}
// other 'switch' lines to check for other
// permissions this app might request
}
return;
}
private void stopStreaming() {
Log.d(MainActivity.APP_NAME, "stopStreaming");
if (wakeLock != null) {
wakeLock.release();
wakeLock = null;
}
if (streamerService.isStreaming()) {
streamerService.stopStreaming();
}
}
public void endEvent(View view) {
Intent data = new Intent();
data.putExtra(YouTubeApi.BROADCAST_ID_KEY, broadcastId);
if (getParent() == null) {
setResult(Activity.RESULT_OK, data);
} else {
getParent().setResult(Activity.RESULT_OK, data);
}
finish();
}
}
Updated error log:
08:57:14.447 18829-18829/com.google.android.apps.watchme E/AndroidRuntime﹕ FATAL EXCEPTION: main
Process: com.google.android.apps.watchme, PID: 18829
java.lang.UnsatisfiedLinkError: Couldn't load ffmpeg from loader dalvik.system.PathClassLoader[DexPathList[[zip file "/data/app/com.google.android.apps.watchme-1.apk"],nativeLibraryDirectories=[/data/app-lib/com.google.android.apps.watchme-1, /vendor/lib, /system/lib]]]: findLibrary returned null
at java.lang.Runtime.loadLibrary(Runtime.java:358)
at java.lang.System.loadLibrary(System.java:526)
at com.google.android.apps.watchme.Ffmpeg.<clinit>(Ffmpeg.java:26)
at com.google.android.apps.watchme.VideoStreamingConnection.open(VideoStreamingConnection.java:71)
at com.google.android.apps.watchme.StreamerService.startStreaming(StreamerService.java:80)
at com.google.android.apps.watchme.StreamerActivity.startStreaming(StreamerActivity.java:212)
at com.google.android.apps.watchme.StreamerActivity.access$200(StreamerActivity.java:47)
at com.google.android.apps.watchme.StreamerActivity$1.onServiceConnected(StreamerActivity.java:64)
at android.app.LoadedApk$ServiceDispatcher.doConnected(LoadedApk.java:1110)
at android.app.LoadedApk$ServiceDispatcher$RunConnection.run(LoadedApk.java:1127)
at android.os.Handler.handleCallback(Handler.java:733)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:136)
at android.app.ActivityThread.main(ActivityThread.java:5097)
at java.lang.reflect.Method.invokeNative(Native Method)
at java.lang.reflect.Method.invoke(Method.java:515)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:785)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:601)
at dalvik.system.NativeStart.main(Native Method)
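This updated log is a different failure: System.loadLibrary in Ffmpeg.<clinit> cannot find libffmpeg.so on this device, i.e. the native library was not packaged for the running ABI (the loader only searched the app's lib directory, /vendor/lib and /system/lib). The real fix is to build libffmpeg.so for the device's ABI, for example armeabi-v7a, and ship it with the APK (e.g. under app/src/main/jniLibs/<abi>/). As a purely illustrative guard with a made-up class name, the load can also be made non-fatal so the missing library is logged instead of crashing inside the service connection:
import android.util.Log;

public final class FfmpegLoader {

    private static final boolean AVAILABLE;

    static {
        boolean ok;
        try {
            // yt-watchme's own Ffmpeg.java does this call in its static initializer.
            System.loadLibrary("ffmpeg");
            ok = true;
        } catch (UnsatisfiedLinkError e) {
            Log.e("FfmpegLoader", "libffmpeg.so is not packaged for this ABI", e);
            ok = false;
        }
        AVAILABLE = ok;
    }

    private FfmpegLoader() {}

    public static boolean isAvailable() {
        return AVAILABLE;
    }
}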