Advanced search

Media (0)

Keyword: - Tags -/protocoles

No media matching your criteria is available on the site.

Other articles (53)

  • General document management

    13 May 2011, by

    MediaSPIP never modifies the original document that is uploaded.
    For each uploaded document it performs two successive operations: creating an additional version that can easily be viewed online, while leaving the original downloadable in case it cannot be read in a web browser; and retrieving the original document's metadata to describe the file textually.
    The tables below explain what MediaSPIP can do (...)

  • List of compatible distributions

    26 April 2011, by

    The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.

    Distribution name   Version name           Version number
    Debian              Squeeze                6.x.x
    Debian              Wheezy                 7.x.x
    Debian              Jessie                 8.x.x
    Ubuntu              The Precise Pangolin   12.04 LTS
    Ubuntu              The Trusty Tahr        14.04
    If you want to help us improve this list, you can provide us access to a machine whose distribution is not mentioned above or send the necessary fixes to add (...)

  • Sites built with MediaSPIP

    2 May 2011, by

    This page presents some of the sites running MediaSPIP.
    You can of course add your own using the form at the bottom of the page.

On other sites (6071)

  • Android + OpenCV + video recorder

    19 February 2016, by t0m

    I have a problem with code that works only on a Genymotion device running Android 4.1.1; on a Genymotion device running Android 5.0.1 and on a real device (Huawei Honor 4C, Android 4.4.2) it does not.

    I imported OpenCV 3.1 into Android Studio following: http://stackoverflow.com/a/27421494/4244605
    I added JavaCV with FFmpeg from: https://github.com/bytedeco/javacv

    minSdkVersion 15
    compileSdkVersion 23

    OpenCVCameraActivity.java:

    package co.timeiseverything.pstimeiseverything;

    import android.app.Activity;
    import android.hardware.Camera;
    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;
    import android.os.Bundle;
    import android.os.Environment;
    import android.util.Log;
    import android.view.Menu;
    import android.view.MenuItem;
    import android.view.MotionEvent;
    import android.view.SubMenu;
    import android.view.SurfaceView;
    import android.view.View;
    import android.view.WindowManager;
    import android.widget.Toast;

    import org.bytedeco.javacv.FFmpegFrameRecorder;
    import org.bytedeco.javacv.Frame;
    import org.opencv.android.BaseLoaderCallback;
    import org.opencv.android.CameraBridgeViewBase;
    import org.opencv.android.LoaderCallbackInterface;
    import org.opencv.android.OpenCVLoader;
    import org.opencv.core.Mat;

    import java.io.File;
    import java.nio.ShortBuffer;
    import java.text.SimpleDateFormat;
    import java.util.Date;
    import java.util.List;
    import java.util.ListIterator;

    @SuppressWarnings("ALL")
    public class OpenCVCameraActivity extends Activity implements
           CameraBridgeViewBase.CvCameraViewListener2,
           View.OnTouchListener {

       //name of activity, for DEBUGGING
       private static final String TAG = OpenCVCameraActivity.class.getSimpleName();

       private OpenCVCameraPreview mOpenCvCameraView;
       private List<Camera.Size> mResolutionList;
       private MenuItem[] mEffectMenuItems;
       private SubMenu mColorEffectsMenu;
       private MenuItem[] mResolutionMenuItems;
       private SubMenu mResolutionMenu;

       private static long frameCounter = 0;

       long startTime = 0;
       private Mat edgesMat;
       boolean recording = false;
       private int sampleAudioRateInHz = 44100;
       private int imageWidth = 1920;
       private int imageHeight = 1080;
       private int frameRate = 30;
       private Frame yuvImage = null;
       private File ffmpeg_link;
       private FFmpegFrameRecorder recorder;

       /* audio data getting thread */
       private AudioRecord audioRecord;
       private AudioRecordRunnable audioRecordRunnable;
       private Thread audioThread;
       volatile boolean runAudioThread = true;
       ShortBuffer[] samples;


       private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
           @Override
           public void onManagerConnected(int status) {
               switch (status) {
                   case LoaderCallbackInterface.SUCCESS:
                       Log.i(TAG, "OpenCV loaded successfully");
                       mOpenCvCameraView.enableView();
                       mOpenCvCameraView.setOnTouchListener(OpenCVCameraActivity.this);
                   break;
                   default:
                       super.onManagerConnected(status);
                   break;
               }
           }
       };

       @Override
       protected void onCreate(Bundle savedInstanceState) {
           super.onCreate(savedInstanceState);
           if(Static.DEBUG) Log.i(TAG, "onCreate()");

           getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);

           try {
               setContentView(R.layout.activity_opencv);

               mOpenCvCameraView = (OpenCVCameraPreview) findViewById(R.id.openCVCameraPreview);
               mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);
               mOpenCvCameraView.setCvCameraViewListener(this);

               //mOpenCvCameraView.enableFpsMeter();

               ffmpeg_link = new File(Environment.getExternalStorageDirectory(), "stream.mp4");
           } catch (Exception e){
               e.printStackTrace();
           }
       }

       @Override
       protected void onRestart() {
           if (Static.DEBUG) Log.i(TAG, "onRestart()");
           super.onRestart();
       }

       @Override
       protected void onStart() {
           if (Static.DEBUG) Log.i(TAG, "onStart()");
           super.onStart();
       }

       @Override
       protected void onResume() {
           if (Static.DEBUG) Log.i(TAG, "onResume()");
           super.onResume();

           if (!OpenCVLoader.initDebug()) {
               Log.i(TAG, "Internal OpenCV library not found. Using OpenCV Manager for initialization");
               OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_11, this, mLoaderCallback);
           } else {
               Log.i(TAG, "OpenCV library found inside package. Using it!");
               mLoaderCallback.onManagerConnected(LoaderCallbackInterface.SUCCESS);
           }

       }

       @Override
       public boolean onCreateOptionsMenu(Menu menu) {
           if (Static.DEBUG) Log.i(TAG, "onCreateOptionsMenu()");
           super.onCreateOptionsMenu(menu);

           List<String> effects = mOpenCvCameraView.getEffectList();

           if (effects == null) {
               Log.e(TAG, "Color effects are not supported by device!");
               return true;
           }

           mColorEffectsMenu = menu.addSubMenu("Color Effect");
           mEffectMenuItems = new MenuItem[effects.size()];

           int idx = 0;
           ListIterator<String> effectItr = effects.listIterator();
           while(effectItr.hasNext()) {
               String element = effectItr.next();
               mEffectMenuItems[idx] = mColorEffectsMenu.add(1, idx, Menu.NONE, element);
               idx++;
           }

           mResolutionMenu = menu.addSubMenu("Resolution");
           mResolutionList = mOpenCvCameraView.getResolutionList();
           mResolutionMenuItems = new MenuItem[mResolutionList.size()];

           ListIterator<Camera.Size> resolutionItr = mResolutionList.listIterator();
           idx = 0;
           while(resolutionItr.hasNext()) {
               Camera.Size element = resolutionItr.next();
               mResolutionMenuItems[idx] = mResolutionMenu.add(2, idx, Menu.NONE,
                       Integer.valueOf(element.width).toString() + "x" + Integer.valueOf(element.height).toString());
               idx++;
           }

           return true;
       }

       @Override
       protected void onPause() {
           if (Static.DEBUG) Log.i(TAG, "onPause()");
           super.onPause();

           if (mOpenCvCameraView != null)
               mOpenCvCameraView.disableView();

       }

       @Override
       protected void onStop() {
           if (Static.DEBUG) Log.i(TAG, "onStop()");
           super.onStop();
       }

       @Override
       protected void onDestroy() {
           if (Static.DEBUG) Log.i(TAG, "onDestroy()");
           super.onDestroy();

           if (mOpenCvCameraView != null)
               mOpenCvCameraView.disableView();
       }

       public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {

           ++frameCounter;
           //Log.i(TAG, "Frame number: "+frameCounter);

           return inputFrame.rgba();
       }


       @Override
       public void onCameraViewStarted(int width, int height) {
           edgesMat = new Mat();
       }

       @Override
       public void onCameraViewStopped() {
           if (edgesMat != null)
               edgesMat.release();

           edgesMat = null;
       }

       public boolean onOptionsItemSelected(MenuItem item) {
           Log.i(TAG, "called onOptionsItemSelected; selected item: " + item);
           if (item.getGroupId() == 1)
           {
               mOpenCvCameraView.setEffect((String) item.getTitle());
               Toast.makeText(this, mOpenCvCameraView.getEffect(), Toast.LENGTH_SHORT).show();
           } else if (item.getGroupId() == 2) {
               int id = item.getItemId();
               Camera.Size resolution = mResolutionList.get(id);
               mOpenCvCameraView.setResolution(resolution);
               resolution = mOpenCvCameraView.getResolution();
               String caption = Integer.valueOf(resolution.width).toString() + "x" + Integer.valueOf(resolution.height).toString();
               Toast.makeText(this, caption, Toast.LENGTH_SHORT).show();
           }

           return true;
       }

       @Override
       public boolean onTouch(View v, MotionEvent event) {
           Log.i(TAG,"onTouch event");
           SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd_HH-mm-ss");
           String currentDateandTime = sdf.format(new Date());
           String fileName = Environment.getExternalStorageDirectory().getPath() +
                   "/sample_picture_" + currentDateandTime + ".jpg";
           mOpenCvCameraView.takePicture(fileName);
           Toast.makeText(this, fileName + " saved", Toast.LENGTH_SHORT).show();
           return false;
       }

       /**
        * Click to ImageButton to start recording.
        */
       public void onClickBtnStartRecord2(View v) {
           if (Static.DEBUG) Log.i(TAG, "onClickBtnStartRecord()");

           if(!recording)
               startRecording();
           else
               stopRecording();
       }

       private void startRecording() {
           if (Static.DEBUG) Log.i(TAG, "startRecording()");
           initRecorder();

           try {
               recorder.start();
               startTime = System.currentTimeMillis();
               recording = true;
               audioThread.start();
           } catch(FFmpegFrameRecorder.Exception e) {
               e.printStackTrace();
           }
       }

       private void stopRecording() {
           if (Static.DEBUG) Log.i(TAG, "stopRecording()");

           runAudioThread = false;
           try {
               audioThread.join();
           } catch(InterruptedException e) {
               e.printStackTrace();
           }
           audioRecordRunnable = null;
           audioThread = null;

           if(recorder != null && recording) {

               recording = false;
               Log.v(TAG, "Finishing recording, calling stop and release on recorder");
               try {
                   recorder.stop();
                   recorder.release();
               } catch(FFmpegFrameRecorder.Exception e) {
                   e.printStackTrace();
               }
               recorder = null;
           }
       }


       //---------------------------------------
       // initialize ffmpeg_recorder
       //---------------------------------------
       private void initRecorder() {

           Log.w(TAG, "init recorder");
           try {

               if (yuvImage == null) {
                   yuvImage = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
                   Log.i(TAG, "create yuvImage");
               }

               Log.i(TAG, "ffmpeg_url: " + ffmpeg_link.getAbsolutePath());
               Log.i(TAG, "ffmpeg_url: " + ffmpeg_link.exists());
               recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
               recorder.setFormat("mp4");
               recorder.setSampleRate(sampleAudioRateInHz);
               // Set in the surface changed method
               recorder.setFrameRate(frameRate);

               Log.i(TAG, "recorder initialize success");

               audioRecordRunnable = new AudioRecordRunnable();
               audioThread = new Thread(audioRecordRunnable);
               runAudioThread = true;
           } catch (Exception e){
               e.printStackTrace();
           }
       }

       //---------------------------------------------
       // audio thread, gets and encodes audio data
       //---------------------------------------------
       class AudioRecordRunnable implements Runnable {

           @Override
           public void run() {
               android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

               // Audio
               int bufferSize;
               ShortBuffer audioData;
               int bufferReadResult;

               bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
               audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

               audioData = ShortBuffer.allocate(bufferSize);

               Log.d(TAG, "audioRecord.startRecording()");
               audioRecord.startRecording();

               /* ffmpeg_audio encoding loop */
               while(runAudioThread) {
                   //Log.v(TAG,"recording? " + recording);
                   bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
                   audioData.limit(bufferReadResult);
                   if(bufferReadResult > 0) {
                       Log.v(TAG, "bufferReadResult: " + bufferReadResult);
                        // If "recording" isn't true when this thread starts, it never gets set according to this if statement...!!!
                        // Why? Good question...
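                        // (Note: 'recording' is not declared volatile, so a write from the UI thread is not
                        // guaranteed by the Java memory model to become visible to this audio thread.)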
                       if(recording) {
                               try {
                                   recorder.recordSamples(audioData);
                                   //Log.v(TAG,"recording " + 1024*i + " to " + 1024*i+1024);
                               } catch(FFmpegFrameRecorder.Exception e) {
                                   Log.v(TAG, e.getMessage());
                                   e.printStackTrace();
                               }
                       }
                   }
               }
               Log.v(TAG, "AudioThread Finished, release audioRecord");

               /* encoding finish, release recorder */
               if(audioRecord != null) {
                   audioRecord.stop();
                   audioRecord.release();
                   audioRecord = null;
                   Log.v(TAG, "audioRecord released");
               }
           }
       }
    }

    OpenCVCameraPreview.java:

    package co.timeiseverything.pstimeiseverything;

    import android.content.Context;
    import android.hardware.Camera;
    import android.util.AttributeSet;
    import android.util.Log;

    import org.opencv.android.JavaCameraView;

    import java.io.FileOutputStream;
    import java.util.List;

    public class OpenCVCameraPreview extends JavaCameraView implements Camera.PictureCallback {

       private static final String TAG =  OpenCVCameraPreview.class.getSimpleName();
       private String mPictureFileName;

       public OpenCVCameraPreview(Context context, AttributeSet attrs) {
           super(context, attrs);
       }

       public List<String> getEffectList() {
           return mCamera.getParameters().getSupportedColorEffects();
       }

       public boolean isEffectSupported() {
           return (mCamera.getParameters().getColorEffect() != null);
       }

       public String getEffect() {
           return mCamera.getParameters().getColorEffect();
       }

       public void setEffect(String effect) {
           Camera.Parameters params = mCamera.getParameters();
           params.setColorEffect(effect);
           mCamera.setParameters(params);
       }

       public List<Camera.Size> getResolutionList() {
           return mCamera.getParameters().getSupportedPreviewSizes();
       }

       public void setResolution(Camera.Size resolution) {
           disconnectCamera();
           mMaxHeight = resolution.height;
           mMaxWidth = resolution.width;
           connectCamera(getWidth(), getHeight());
       }

       public Camera.Size getResolution() {
           return mCamera.getParameters().getPreviewSize();
       }

       public void takePicture(final String fileName) {
           Log.i(TAG, "Taking picture");
           this.mPictureFileName = fileName;
           // Postview and jpeg are sent in the same buffers if the queue is not empty when performing a capture.
           // Clear up buffers to avoid mCamera.takePicture to be stuck because of a memory issue
           mCamera.setPreviewCallback(null);

           // PictureCallback is implemented by the current class
           mCamera.takePicture(null, null, this);
       }

       @Override
       public void onPictureTaken(byte[] data, Camera camera) {
           Log.i(TAG, "Saving a bitmap to file");
           // The camera preview was automatically stopped. Start it again.
           mCamera.startPreview();
           mCamera.setPreviewCallback(this);

           // Write the image in a file (in jpeg format)
           try {
               FileOutputStream fos = new FileOutputStream(mPictureFileName);

               fos.write(data);
               fos.close();

           } catch (java.io.IOException e) {
               Log.e("PictureDemo", "Exception in photoCallback", e);
           }

       }
    }

    Gradle:

    apply plugin: 'com.android.application'

    android {
       compileSdkVersion 23
       buildToolsVersion "23.0.2"

       defaultConfig {
           applicationId "co.timeiseverything.pstimeiseverything"
           minSdkVersion 15
           targetSdkVersion 23
           versionCode 1
           versionName "1.0"
       }
       buildTypes {
           release {
               minifyEnabled false
               proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
           }
       }

       packagingOptions {
           exclude 'META-INF/maven/org.bytedeco.javacpp-presets/opencv/pom.properties'
           exclude 'META-INF/maven/org.bytedeco.javacpp-presets/opencv/pom.xml'
           exclude 'META-INF/maven/org.bytedeco.javacpp-presets/ffmpeg/pom.properties'
           exclude 'META-INF/maven/org.bytedeco.javacpp-presets/ffmpeg/pom.xml'
       }
    }

    repositories {
       mavenCentral()
    }

    dependencies {
       compile fileTree(include: ['*.jar'], dir: 'libs')
       testCompile 'junit:junit:4.12'
       compile 'com.android.support:appcompat-v7:23.1.1'
       compile 'com.google.android.gms:play-services-appindexing:8.1.0'

       compile group: 'org.bytedeco', name: 'javacv', version: '1.1'
       compile group: 'org.bytedeco.javacpp-presets', name: 'opencv', version: '3.0.0-1.1', classifier: 'android-arm'
       compile group: 'org.bytedeco.javacpp-presets', name: 'opencv', version: '3.0.0-1.1', classifier: 'android-x86'
       compile group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '2.8.1-1.1', classifier: 'android-arm'
       compile group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '2.8.1-1.1', classifier: 'android-x86'

       compile project(':openCVLibrary310')
    }

    proguard-rules.pro
    Edited by: link

    jniLibs (app/src/main/jniLibs):

    armeabi armeabi-v7a arm64-v8a mips mips64 x86 x86_64

    Problem

    02-19 11:57:37.684 1759-1759/ I/OpenCVCameraActivity: onClickBtnStartRecord()
    02-19 11:57:37.684 1759-1759/ I/OpenCVCameraActivity: startRecording()
    02-19 11:57:37.684 1759-1759/ W/OpenCVCameraActivity: init recorder
    02-19 11:57:37.691 1759-1759/ I/OpenCVCameraActivity: create yuvImage
    02-19 11:57:37.691 1759-1759/ I/OpenCVCameraActivity: ffmpeg_url: /storage/emulated/0/stream.mp4
    02-19 11:57:37.696 1759-1759/ I/OpenCVCameraActivity: ffmpeg_url: false
    02-19 11:57:37.837 1759-1759/ W/linker: libjniavutil.so: unused DT entry: type 0x1d arg 0x18cc3
    02-19 11:57:37.837 1759-1759/ W/linker: libjniavutil.so: unused DT entry: type 0x6ffffffe arg 0x21c30
    02-19 11:57:37.837 1759-1759/ W/linker: libjniavutil.so: unused DT entry: type 0x6fffffff arg 0x1
    02-19 11:57:37.838 1759-1759/co.timeiseverything.pstimeiseverything E/art: dlopen("/data/app/co.timeiseverything.pstimeiseverything-2/lib/x86/libjniavutil.so", RTLD_LAZY) failed: dlopen failed: cannot locate symbol "av_version_info" referenced by "libjniavutil.so"...
    02-19 11:57:37.843 1759-1759/co.timeiseverything.pstimeiseverything I/art: Rejecting re-init on previously-failed class java.lang.Class
    02-19 11:57:37.844 1759-1759/co.timeiseverything.pstimeiseverything E/AndroidRuntime: FATAL EXCEPTION: main
                                           Process: co.timeiseverything.pstimeiseverything, PID: 1759
                                           java.lang.IllegalStateException: Could not execute method of the activity
                                               at android.view.View$1.onClick(View.java:4020)
                                               at android.view.View.performClick(View.java:4780)
                                               at android.view.View$PerformClick.run(View.java:19866)
                                               at android.os.Handler.handleCallback(Handler.java:739)
                                               at android.os.Handler.dispatchMessage(Handler.java:95)
                                               at android.os.Looper.loop(Looper.java:135)
                                               at android.app.ActivityThread.main(ActivityThread.java:5254)
                                               at java.lang.reflect.Method.invoke(Native Method)
                                               at java.lang.reflect.Method.invoke(Method.java:372)
                                               at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:903)
                                               at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:698)
                                            Caused by: java.lang.reflect.InvocationTargetException
                                               at java.lang.reflect.Method.invoke(Native Method)
                                               at java.lang.reflect.Method.invoke(Method.java:372)
                                               at android.view.View$1.onClick(View.java:4015)
                                               at android.view.View.performClick(View.java:4780) 
                                               at android.view.View$PerformClick.run(View.java:19866) 
                                               at android.os.Handler.handleCallback(Handler.java:739) 
                                               at android.os.Handler.dispatchMessage(Handler.java:95) 
                                               at android.os.Looper.loop(Looper.java:135) 
                                               at android.app.ActivityThread.main(ActivityThread.java:5254) 
                                               at java.lang.reflect.Method.invoke(Native Method) 
                                               at java.lang.reflect.Method.invoke(Method.java:372) 
                                               at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:903) 
                                               at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:698) 
                                            Caused by: java.lang.UnsatisfiedLinkError: org.bytedeco.javacpp.avutil
                                               at java.lang.Class.classForName(Native Method)
                                               at java.lang.Class.forName(Class.java:309)
                                               at org.bytedeco.javacpp.Loader.load(Loader.java:413)
                                               at org.bytedeco.javacpp.Loader.load(Loader.java:381)
                                               at org.bytedeco.javacpp.avcodec$AVPacket.<clinit>(avcodec.java:1650)
                                               at org.bytedeco.javacv.FFmpegFrameRecorder.<init>(FFmpegFrameRecorder.java:149)
                                               at org.bytedeco.javacv.FFmpegFrameRecorder.<init>(FFmpegFrameRecorder.java:129)
                                               at co.timeiseverything.pstimeiseverything.OpenCVCameraActivity.initRecorder(OpenCVCameraActivity.java:320)
                                               at co.timeiseverything.pstimeiseverything.OpenCVCameraActivity.startRecording(OpenCVCameraActivity.java:266)
                                               at co.timeiseverything.pstimeiseverything.OpenCVCameraActivity.onClickBtnStartRecord2(OpenCVCameraActivity.java:259)
                                               at java.lang.reflect.Method.invoke(Native Method) 
                                               at java.lang.reflect.Method.invoke(Method.java:372) 
                                               at android.view.View$1.onClick(View.java:4015) 
                                               at android.view.View.performClick(View.java:4780) 
                                               at android.view.View$PerformClick.run(View.java:19866) 
                                               at android.os.Handler.handleCallback(Handler.java:739) 
                                               at android.os.Handler.dispatchMessage(Handler.java:95) 
                                               at android.os.Looper.loop(Looper.java:135) 
                                               at android.app.ActivityThread.main(ActivityThread.java:5254) 
                                               at java.lang.reflect.Method.invoke(Native Method) 
                                               at java.lang.reflect.Method.invoke(Method.java:372) 
                                               at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:903) 
                                               at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:698) 
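
    A minimal diagnostic sketch (assuming the org.bytedeco dependencies from the Gradle file above; the class name FFmpegLoadCheck and its run() method are made up) that force-loads the FFmpeg binding and logs the ABI the process runs as, so the UnsatisfiedLinkError surfaces before any FFmpegFrameRecorder is constructed:

     package co.timeiseverything.pstimeiseverything;

     import android.os.Build;
     import android.util.Log;

     import org.bytedeco.javacpp.Loader;
     import org.bytedeco.javacpp.avutil;

     public final class FFmpegLoadCheck {

        private static final String TAG = FFmpegLoadCheck.class.getSimpleName();

        /** Call once (e.g. from onCreate()) before any FFmpegFrameRecorder is created. */
        public static void run() {
            // The crash log shows the x86 libjniavutil.so being picked; log the ABI this process runs as.
            Log.i(TAG, "CPU_ABI: " + Build.CPU_ABI);

            try {
                // Forces JavaCPP to extract and dlopen the FFmpeg natives (libjniavutil.so, ...).
                Loader.load(avutil.class);
                Log.i(TAG, "FFmpeg natives loaded successfully");
            } catch (UnsatisfiedLinkError e) {
                // Same failure as in the crash log, but isolated from the camera/recorder code.
                Log.e(TAG, "FFmpeg natives failed to load", e);
            }
        }
     }

    If the native libraries are the issue, calling FFmpegLoadCheck.run() from onCreate() should show the same dlopen failure without going through the record button.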
  • Android Studio + OpenCV + FFmpeg

    26 February 2016, by t0m

    I have a problem with code that works only on a Genymotion device running Android 4.1.1; on a Genymotion device running Android 5.0.1 and on a real device (Huawei Honor 4C, Android 4.4.2) it does not.

    I imported OpenCV 3.1 into Android Studio following: http://stackoverflow.com/a/27421494/4244605
    I added JavaCV with FFmpeg from: https://github.com/bytedeco/javacv

    Android Studio 1.5.1
    minSdkVersion 15
    compileSdkVersion 23

    The code is only for testing.
    OpenCVCameraActivity.java:

    package co.timeiseverything.pstimeiseverything;

    import android.app.Activity;
    import android.hardware.Camera;
    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.MediaRecorder;
    import android.os.Bundle;
    import android.os.Environment;
    import android.util.Log;
    import android.view.Menu;
    import android.view.MenuItem;
    import android.view.MotionEvent;
    import android.view.SubMenu;
    import android.view.SurfaceView;
    import android.view.View;
    import android.view.WindowManager;
    import android.widget.Toast;

    import org.bytedeco.javacv.FFmpegFrameRecorder;
    import org.bytedeco.javacv.Frame;
    import org.opencv.android.BaseLoaderCallback;
    import org.opencv.android.CameraBridgeViewBase;
    import org.opencv.android.LoaderCallbackInterface;
    import org.opencv.android.OpenCVLoader;
    import org.opencv.core.Mat;

    import java.io.File;
    import java.nio.ShortBuffer;
    import java.text.SimpleDateFormat;
    import java.util.Date;
    import java.util.List;
    import java.util.ListIterator;

    @SuppressWarnings("ALL")
    public class OpenCVCameraActivity extends Activity implements
           CameraBridgeViewBase.CvCameraViewListener2,
           View.OnTouchListener {

       //name of activity, for DEBUGGING
       private static final String TAG = OpenCVCameraActivity.class.getSimpleName();

       private OpenCVCameraPreview mOpenCvCameraView;
       private List<Camera.Size> mResolutionList;
       private MenuItem[] mEffectMenuItems;
       private SubMenu mColorEffectsMenu;
       private MenuItem[] mResolutionMenuItems;
       private SubMenu mResolutionMenu;

       private static long frameCounter = 0;

       long startTime = 0;
       private Mat edgesMat;
       boolean recording = false;
       private int sampleAudioRateInHz = 44100;
       private int imageWidth = 1920;
       private int imageHeight = 1080;
       private int frameRate = 30;
       private Frame yuvImage = null;
       private File ffmpeg_link;
       private FFmpegFrameRecorder recorder;

       /* audio data getting thread */
       private AudioRecord audioRecord;
       private AudioRecordRunnable audioRecordRunnable;
       private Thread audioThread;
       volatile boolean runAudioThread = true;
       ShortBuffer[] samples;


       private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
           @Override
           public void onManagerConnected(int status) {
               switch (status) {
                   case LoaderCallbackInterface.SUCCESS:
                       Log.i(TAG, "OpenCV loaded successfully");
                       mOpenCvCameraView.enableView();
                       mOpenCvCameraView.setOnTouchListener(OpenCVCameraActivity.this);
                   break;
                   default:
                       super.onManagerConnected(status);
                   break;
               }
           }
       };

       @Override
       protected void onCreate(Bundle savedInstanceState) {
           super.onCreate(savedInstanceState);
           if(Static.DEBUG) Log.i(TAG, "onCreate()");

           getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);

           try {
               setContentView(R.layout.activity_opencv);

               mOpenCvCameraView = (OpenCVCameraPreview) findViewById(R.id.openCVCameraPreview);
               mOpenCvCameraView.setVisibility(SurfaceView.VISIBLE);
               mOpenCvCameraView.setCvCameraViewListener(this);

               //mOpenCvCameraView.enableFpsMeter();

               ffmpeg_link = new File(Environment.getExternalStorageDirectory(), "stream.mp4");
           } catch (Exception e){
               e.printStackTrace();
           }
       }

       @Override
       protected void onRestart() {
           if (Static.DEBUG) Log.i(TAG, "onRestart()");
           super.onRestart();
       }

       @Override
       protected void onStart() {
           if (Static.DEBUG) Log.i(TAG, "onStart()");
           super.onStart();
       }

       @Override
       protected void onResume() {
           if (Static.DEBUG) Log.i(TAG, "onResume()");
           super.onResume();

           if (!OpenCVLoader.initDebug()) {
               Log.i(TAG, "Internal OpenCV library not found. Using OpenCV Manager for initialization");
               OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_2_4_11, this, mLoaderCallback);
           } else {
               Log.i(TAG, "OpenCV library found inside package. Using it!");
               mLoaderCallback.onManagerConnected(LoaderCallbackInterface.SUCCESS);
           }

       }

       @Override
       public boolean onCreateOptionsMenu(Menu menu) {
           if (Static.DEBUG) Log.i(TAG, "onCreateOptionsMenu()");
           super.onCreateOptionsMenu(menu);

           List<String> effects = mOpenCvCameraView.getEffectList();

           if (effects == null) {
               Log.e(TAG, "Color effects are not supported by device!");
               return true;
           }

           mColorEffectsMenu = menu.addSubMenu("Color Effect");
           mEffectMenuItems = new MenuItem[effects.size()];

           int idx = 0;
           ListIterator<String> effectItr = effects.listIterator();
           while(effectItr.hasNext()) {
               String element = effectItr.next();
               mEffectMenuItems[idx] = mColorEffectsMenu.add(1, idx, Menu.NONE, element);
               idx++;
           }

           mResolutionMenu = menu.addSubMenu("Resolution");
           mResolutionList = mOpenCvCameraView.getResolutionList();
           mResolutionMenuItems = new MenuItem[mResolutionList.size()];

           ListIterator<Camera.Size> resolutionItr = mResolutionList.listIterator();
           idx = 0;
           while(resolutionItr.hasNext()) {
               Camera.Size element = resolutionItr.next();
               mResolutionMenuItems[idx] = mResolutionMenu.add(2, idx, Menu.NONE,
                       Integer.valueOf(element.width).toString() + "x" + Integer.valueOf(element.height).toString());
               idx++;
           }

           return true;
       }

       @Override
       protected void onPause() {
           if (Static.DEBUG) Log.i(TAG, "onPause()");
           super.onPause();

           if (mOpenCvCameraView != null)
               mOpenCvCameraView.disableView();

       }

       @Override
       protected void onStop() {
           if (Static.DEBUG) Log.i(TAG, "onStop()");
           super.onStop();
       }

       @Override
       protected void onDestroy() {
           if (Static.DEBUG) Log.i(TAG, "onDestroy()");
           super.onDestroy();

           if (mOpenCvCameraView != null)
               mOpenCvCameraView.disableView();
       }

       public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {

           ++frameCounter;
           //Log.i(TAG, "Frame number: "+frameCounter);

           return inputFrame.rgba();
       }


       @Override
       public void onCameraViewStarted(int width, int height) {
           edgesMat = new Mat();
       }

       @Override
       public void onCameraViewStopped() {
           if (edgesMat != null)
               edgesMat.release();

           edgesMat = null;
       }

       public boolean onOptionsItemSelected(MenuItem item) {
           Log.i(TAG, "called onOptionsItemSelected; selected item: " + item);
           if (item.getGroupId() == 1)
           {
               mOpenCvCameraView.setEffect((String) item.getTitle());
               Toast.makeText(this, mOpenCvCameraView.getEffect(), Toast.LENGTH_SHORT).show();
           } else if (item.getGroupId() == 2) {
               int id = item.getItemId();
               Camera.Size resolution = mResolutionList.get(id);
               mOpenCvCameraView.setResolution(resolution);
               resolution = mOpenCvCameraView.getResolution();
               String caption = Integer.valueOf(resolution.width).toString() + "x" + Integer.valueOf(resolution.height).toString();
               Toast.makeText(this, caption, Toast.LENGTH_SHORT).show();
           }

           return true;
       }

       @Override
       public boolean onTouch(View v, MotionEvent event) {
           Log.i(TAG,"onTouch event");
           SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd_HH-mm-ss");
           String currentDateandTime = sdf.format(new Date());
           String fileName = Environment.getExternalStorageDirectory().getPath() +
                   "/sample_picture_" + currentDateandTime + ".jpg";
           mOpenCvCameraView.takePicture(fileName);
           Toast.makeText(this, fileName + " saved", Toast.LENGTH_SHORT).show();
           return false;
       }

       /**
        * Click to ImageButton to start recording.
        */
       public void onClickBtnStartRecord2(View v) {
           if (Static.DEBUG) Log.i(TAG, "onClickBtnStartRecord()");

           if(!recording)
               startRecording();
           else
               stopRecording();
       }

       private void startRecording() {
           if (Static.DEBUG) Log.i(TAG, "startRecording()");
           initRecorder();

           try {
               recorder.start();
               startTime = System.currentTimeMillis();
               recording = true;
               audioThread.start();
           } catch(FFmpegFrameRecorder.Exception e) {
               e.printStackTrace();
           }
       }

       private void stopRecording() {
           if (Static.DEBUG) Log.i(TAG, "stopRecording()");

           runAudioThread = false;
           try {
               audioThread.join();
           } catch(InterruptedException e) {
               e.printStackTrace();
           }
           audioRecordRunnable = null;
           audioThread = null;

           if(recorder != null && recording) {

               recording = false;
               Log.v(TAG, "Finishing recording, calling stop and release on recorder");
               try {
                   recorder.stop();
                   recorder.release();
               } catch(FFmpegFrameRecorder.Exception e) {
                   e.printStackTrace();
               }
               recorder = null;
           }
       }


       //---------------------------------------
       // initialize ffmpeg_recorder
       //---------------------------------------
       private void initRecorder() {

           Log.w(TAG, "init recorder");
           try {

               if (yuvImage == null) {
                   yuvImage = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
                   Log.i(TAG, "create yuvImage");
               }

               Log.i(TAG, "ffmpeg_url: " + ffmpeg_link.getAbsolutePath());
               Log.i(TAG, "ffmpeg_url: " + ffmpeg_link.exists());
               recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
               recorder.setFormat("mp4");
               recorder.setSampleRate(sampleAudioRateInHz);
               // Set in the surface changed method
               recorder.setFrameRate(frameRate);

               Log.i(TAG, "recorder initialize success");

               audioRecordRunnable = new AudioRecordRunnable();
               audioThread = new Thread(audioRecordRunnable);
               runAudioThread = true;
           } catch (Exception e){
               e.printStackTrace();
           }
       }

       //---------------------------------------------
       // audio thread, gets and encodes audio data
       //---------------------------------------------
       class AudioRecordRunnable implements Runnable {

           @Override
           public void run() {
               android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

               // Audio
               int bufferSize;
               ShortBuffer audioData;
               int bufferReadResult;

               bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
               audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

               audioData = ShortBuffer.allocate(bufferSize);

               Log.d(TAG, "audioRecord.startRecording()");
               audioRecord.startRecording();

               /* ffmpeg_audio encoding loop */
               while(runAudioThread) {
                   //Log.v(TAG,"recording? " + recording);
                   bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
                   audioData.limit(bufferReadResult);
                   if(bufferReadResult > 0) {
                       Log.v(TAG, "bufferReadResult: " + bufferReadResult);
                        // If "recording" isn't true when this thread starts, it never gets set according to this if statement...!!!
                        // Why? Good question...
                       if(recording) {
                               try {
                                   recorder.recordSamples(audioData);
                                   //Log.v(TAG,"recording " + 1024*i + " to " + 1024*i+1024);
                               } catch(FFmpegFrameRecorder.Exception e) {
                                   Log.v(TAG, e.getMessage());
                                   e.printStackTrace();
                               }
                       }
                   }
               }
               Log.v(TAG, "AudioThread Finished, release audioRecord");

               /* encoding finish, release recorder */
               if(audioRecord != null) {
                   audioRecord.stop();
                   audioRecord.release();
                   audioRecord = null;
                   Log.v(TAG, "audioRecord released");
               }
           }
       }
    }

    OpenCVCameraPreview.java:

    package co.timeiseverything.pstimeiseverything;

    import android.content.Context;
    import android.hardware.Camera;
    import android.util.AttributeSet;
    import android.util.Log;

    import org.opencv.android.JavaCameraView;

    import java.io.FileOutputStream;
    import java.util.List;

    public class OpenCVCameraPreview extends JavaCameraView implements Camera.PictureCallback {

       private static final String TAG =  OpenCVCameraPreview.class.getSimpleName();
       private String mPictureFileName;

       public OpenCVCameraPreview(Context context, AttributeSet attrs) {
           super(context, attrs);
       }

       public List<String> getEffectList() {
           return mCamera.getParameters().getSupportedColorEffects();
       }

       public boolean isEffectSupported() {
           return (mCamera.getParameters().getColorEffect() != null);
       }

       public String getEffect() {
           return mCamera.getParameters().getColorEffect();
       }

       public void setEffect(String effect) {
           Camera.Parameters params = mCamera.getParameters();
           params.setColorEffect(effect);
           mCamera.setParameters(params);
       }

       public List<Camera.Size> getResolutionList() {
           return mCamera.getParameters().getSupportedPreviewSizes();
       }

       public void setResolution(Camera.Size resolution) {
           disconnectCamera();
           mMaxHeight = resolution.height;
           mMaxWidth = resolution.width;
           connectCamera(getWidth(), getHeight());
       }

       public Camera.Size getResolution() {
           return mCamera.getParameters().getPreviewSize();
       }

       public void takePicture(final String fileName) {
           Log.i(TAG, "Taking picture");
           this.mPictureFileName = fileName;
           // Postview and jpeg are sent in the same buffers if the queue is not empty when performing a capture.
           // Clear up buffers to avoid mCamera.takePicture to be stuck because of a memory issue
           mCamera.setPreviewCallback(null);

           // PictureCallback is implemented by the current class
           mCamera.takePicture(null, null, this);
       }

       @Override
       public void onPictureTaken(byte[] data, Camera camera) {
           Log.i(TAG, "Saving a bitmap to file");
           // The camera preview was automatically stopped. Start it again.
           mCamera.startPreview();
           mCamera.setPreviewCallback(this);

           // Write the image in a file (in jpeg format)
           try {
               FileOutputStream fos = new FileOutputStream(mPictureFileName);

               fos.write(data);
               fos.close();

           } catch (java.io.IOException e) {
               Log.e("PictureDemo", "Exception in photoCallback", e);
           }

       }
    }

    Gradle:

    apply plugin: 'com.android.application'

    android {
       compileSdkVersion 23
       buildToolsVersion "23.0.2"

       defaultConfig {
           applicationId "co.timeiseverything.pstimeiseverything"
           minSdkVersion 15
           targetSdkVersion 23
           versionCode 1
           versionName "1.0"
       }
       buildTypes {
           release {
               minifyEnabled false
               proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
           }
       }

       packagingOptions {
           exclude 'META-INF/maven/org.bytedeco.javacpp-presets/opencv/pom.properties'
           exclude 'META-INF/maven/org.bytedeco.javacpp-presets/opencv/pom.xml'
           exclude 'META-INF/maven/org.bytedeco.javacpp-presets/ffmpeg/pom.properties'
           exclude 'META-INF/maven/org.bytedeco.javacpp-presets/ffmpeg/pom.xml'
       }
    }

    repositories {
       mavenCentral()
    }

    dependencies {
       compile fileTree(include: ['*.jar'], dir: 'libs')
       testCompile 'junit:junit:4.12'
       compile 'com.android.support:appcompat-v7:23.1.1'
       compile 'com.google.android.gms:play-services-appindexing:8.1.0'

       compile group: 'org.bytedeco', name: 'javacv', version: '1.1'
       compile group: 'org.bytedeco.javacpp-presets', name: 'opencv', version: '3.0.0-1.1', classifier: 'android-arm'
       compile group: 'org.bytedeco.javacpp-presets', name: 'opencv', version: '3.0.0-1.1', classifier: 'android-x86'
       compile group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '2.8.1-1.1', classifier: 'android-arm'
       compile group: 'org.bytedeco.javacpp-presets', name: 'ffmpeg', version: '2.8.1-1.1', classifier: 'android-x86'

       compile project(':openCVLibrary310')
    }

    proguard-rules.pro
    Edited by: link

    jniLibs (app/src/main/jniLibs):

    armeabi armeabi-v7a arm64-v8a mips mips64 x86 x86_64

    Problem

    02-19 11:57:37.684 1759-1759/ I/OpenCVCameraActivity: onClickBtnStartRecord()
    02-19 11:57:37.684 1759-1759/ I/OpenCVCameraActivity: startRecording()
    02-19 11:57:37.684 1759-1759/ W/OpenCVCameraActivity: init recorder
    02-19 11:57:37.691 1759-1759/ I/OpenCVCameraActivity: create yuvImage
    02-19 11:57:37.691 1759-1759/ I/OpenCVCameraActivity: ffmpeg_url: /storage/emulated/0/stream.mp4
    02-19 11:57:37.696 1759-1759/ I/OpenCVCameraActivity: ffmpeg_url: false
    02-19 11:57:37.837 1759-1759/ W/linker: libjniavutil.so: unused DT entry: type 0x1d arg 0x18cc3
    02-19 11:57:37.837 1759-1759/ W/linker: libjniavutil.so: unused DT entry: type 0x6ffffffe arg 0x21c30
    02-19 11:57:37.837 1759-1759/ W/linker: libjniavutil.so: unused DT entry: type 0x6fffffff arg 0x1
    02-19 11:57:37.838 1759-1759/co.timeiseverything.pstimeiseverything E/art: dlopen("/data/app/co.timeiseverything.pstimeiseverything-2/lib/x86/libjniavutil.so", RTLD_LAZY) failed: dlopen failed: cannot locate symbol "av_version_info" referenced by "libjniavutil.so"...
    02-19 11:57:37.843 1759-1759/co.timeiseverything.pstimeiseverything I/art: Rejecting re-init on previously-failed class java.lang.Class
    02-19 11:57:37.844 1759-1759/co.timeiseverything.pstimeiseverything E/AndroidRuntime: FATAL EXCEPTION: main
                                           Process: co.timeiseverything.pstimeiseverything, PID: 1759
                                           java.lang.IllegalStateException: Could not execute method of the activity
                                               at android.view.View$1.onClick(View.java:4020)
                                               at android.view.View.performClick(View.java:4780)
                                               at android.view.View$PerformClick.run(View.java:19866)
                                               at android.os.Handler.handleCallback(Handler.java:739)
                                               at android.os.Handler.dispatchMessage(Handler.java:95)
                                               at android.os.Looper.loop(Looper.java:135)
                                               at android.app.ActivityThread.main(ActivityThread.java:5254)
                                               at java.lang.reflect.Method.invoke(Native Method)
                                               at java.lang.reflect.Method.invoke(Method.java:372)
                                               at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:903)
                                               at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:698)
                                            Caused by: java.lang.reflect.InvocationTargetException
                                               at java.lang.reflect.Method.invoke(Native Method)
                                               at java.lang.reflect.Method.invoke(Method.java:372)
                                               at android.view.View$1.onClick(View.java:4015)
                                               at android.view.View.performClick(View.java:4780) 
                                               at android.view.View$PerformClick.run(View.java:19866) 
                                               at android.os.Handler.handleCallback(Handler.java:739) 
                                               at android.os.Handler.dispatchMessage(Handler.java:95) 
                                               at android.os.Looper.loop(Looper.java:135) 
                                               at android.app.ActivityThread.main(ActivityThread.java:5254) 
                                               at java.lang.reflect.Method.invoke(Native Method) 
                                               at java.lang.reflect.Method.invoke(Method.java:372) 
                                               at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:903) 
                                               at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:698) 
                                            Caused by: java.lang.UnsatisfiedLinkError: org.bytedeco.javacpp.avutil
                                               at java.lang.Class.classForName(Native Method)
                                               at java.lang.Class.forName(Class.java:309)
                                               at org.bytedeco.javacpp.Loader.load(Loader.java:413)
                                               at org.bytedeco.javacpp.Loader.load(Loader.java:381)
                                               at org.bytedeco.javacpp.avcodec$AVPacket.<clinit>(avcodec.java:1650)
                                               at org.bytedeco.javacv.FFmpegFrameRecorder.<init>(FFmpegFrameRecorder.java:149)
                                               at org.bytedeco.javacv.FFmpegFrameRecorder.<init>(FFmpegFrameRecorder.java:129)
                                               at co.timeiseverything.pstimeiseverything.OpenCVCameraActivity.initRecorder(OpenCVCameraActivity.java:320)
                                               at co.timeiseverything.pstimeiseverything.OpenCVCameraActivity.startRecording(OpenCVCameraActivity.java:266)
                                               at co.timeiseverything.pstimeiseverything.OpenCVCameraActivity.onClickBtnStartRecord2(OpenCVCameraActivity.java:259)
                                               at java.lang.reflect.Method.invoke(Native Method) 
                                               at java.lang.reflect.Method.invoke(Method.java:372) 
                                               at android.view.View$1.onClick(View.java:4015) 
                                               at android.view.View.performClick(View.java:4780) 
                                               at android.view.View$PerformClick.run(View.java:19866) 
                                               at android.os.Handler.handleCallback(Handler.java:739) 
                                               at android.os.Handler.dispatchMessage(Handler.java:95) 
                                               at android.os.Looper.loop(Looper.java:135) 
                                               at android.app.ActivityThread.main(ActivityThread.java:5254) 
                                               at java.lang.reflect.Method.invoke(Native Method) 
                                               at java.lang.reflect.Method.invoke(Method.java:372) 
                                               at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:903) 
                                               at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:698) 
  • MediaPlayer within TextureView not working as intended

    15 March 2016, by Russiee

    I've put a MediaPlayer inside a TextureView, which itself is located inside a ListView.

    Yesterday, the MediaPlayer worked as intended with a test .mp4 clip.
    Today, the MediaPlayer goes through some kind of FFmpegExtractor, about which I've been unable to find any information.
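
    For reference, a minimal sketch of the wiring described above (a MediaPlayer rendering into a TextureView), using TextureView.SurfaceTextureListener and MediaPlayer.setSurface(); the class name TextureVideoBinder and the clip path are placeholders:

     import android.graphics.SurfaceTexture;
     import android.media.MediaPlayer;
     import android.util.Log;
     import android.view.Surface;
     import android.view.TextureView;

     // Hypothetical helper: binds a MediaPlayer to a TextureView once its SurfaceTexture is ready.
     public final class TextureVideoBinder implements TextureView.SurfaceTextureListener {

        private static final String TAG = TextureVideoBinder.class.getSimpleName();
        private final String path; // e.g. a local test .mp4 clip (placeholder)
        private MediaPlayer player;

        public TextureVideoBinder(TextureView view, String path) {
            this.path = path;
            view.setSurfaceTextureListener(this);
        }

        @Override
        public void onSurfaceTextureAvailable(SurfaceTexture texture, int width, int height) {
            player = new MediaPlayer();
            try {
                player.setDataSource(path);
                player.setSurface(new Surface(texture)); // render decoded frames into the TextureView
                player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                    @Override
                    public void onPrepared(MediaPlayer mp) {
                        mp.start();
                    }
                });
                player.prepareAsync();
            } catch (java.io.IOException e) {
                Log.e(TAG, "setDataSource failed", e);
            }
        }

        @Override
        public boolean onSurfaceTextureDestroyed(SurfaceTexture texture) {
            if (player != null) {
                player.release(); // rows inside a ListView get recycled, so release the player here
                player = null;
            }
            return true;
        }

        @Override
        public void onSurfaceTextureSizeChanged(SurfaceTexture texture, int width, int height) { }

        @Override
        public void onSurfaceTextureUpdated(SurfaceTexture texture) { }
     }

    From the ListView adapter's getView() this would be used as, for example, new TextureVideoBinder(textureView, "/sdcard/test.mp4");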

    This is the stack trace:

    03-14 13:43:00.076 477-16532/? V/FFmpegExtractor: SniffFFMPEG
    03-14 13:43:00.076 477-16532/? I/FFmpegExtractor: android-source:0xafcff040
    03-14 13:43:00.077 477-16532/? D/FFMPEG: android source begin open
    03-14 13:43:00.077 477-16532/? D/FFMPEG: android open, url: android-source:0xafcff040
    03-14 13:43:00.077 477-16532/? D/FFMPEG: ffmpeg open android data source success, source ptr: 0xafcff040
    03-14 13:43:00.077 477-16532/? D/FFMPEG: android source open success
    03-14 13:43:00.149 477-16532/? I/FFMPEG: Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'android-source:0xafcff040':
    03-14 13:43:00.149 477-16532/? I/FFMPEG:   Metadata:
    03-14 13:43:00.149 477-16532/? I/FFMPEG:     major_brand     : qt  
    03-14 13:43:00.149 477-16532/? I/FFMPEG:     minor_version   : 0
    03-14 13:43:00.149 477-16532/? I/FFMPEG:     compatible_brands: qt  
    03-14 13:43:00.149 477-16532/? I/FFMPEG:     creation_time   : 2016-03-13 19:24:58
    03-14 13:43:00.149 477-16532/? I/FFMPEG:   Duration: 00:00:10.88, start: 0.000000, bitrate: 11209 kb/s
    03-14 13:43:00.149 477-16532/? I/FFMPEG:     Stream #0:0(und): Audio: aac (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 86 kb/s (default)
    03-14 13:43:00.149 477-16532/? I/FFMPEG:     Metadata:
    03-14 13:43:00.149 477-16532/? I/FFMPEG:       creation_time   : 2016-03-13 19:24:58
    03-14 13:43:00.149 477-16532/? I/FFMPEG:       handler_name    : Core Media Data Handler
    03-14 13:43:00.149 477-16532/? I/FFMPEG:     Stream #0:1(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709), 1920x1080 [SAR 1:1 DAR 16:9], 11113 kb/s, 29.98 fps, 29.97 tbr, 600 tbn, 50 tbc (default)
    03-14 13:43:00.149 477-16532/? I/FFMPEG:     Metadata:
    03-14 13:43:00.149 477-16532/? I/FFMPEG:       creation_time   : 2016-03-13 19:24:58
    03-14 13:43:00.149 477-16532/? I/FFMPEG:       handler_name    : Core Media Data Handler
    03-14 13:43:00.149 477-16532/? I/FFMPEG:       encoder         : H.264
    03-14 13:43:00.149 477-16532/? I/FFMPEG:     Stream #0:2(und): Data: none (mebx / 0x7862656D), 1 kb/s (default)
    03-14 13:43:00.149 477-16532/? I/FFMPEG:     Metadata:
    03-14 13:43:00.149 477-16532/? I/FFMPEG:       creation_time   : 2016-03-13 19:24:58
    03-14 13:43:00.149 477-16532/? I/FFMPEG:       handler_name    : Core Media Data Handler
    03-14 13:43:00.149 477-16532/? I/FFMPEG:     Stream #0:3(und): Data: none (mebx / 0x7862656D), 0 kb/s (default)
    03-14 13:43:00.149 477-16532/? I/FFMPEG:     Metadata:
    03-14 13:43:00.149 477-16532/? I/FFMPEG:       creation_time   : 2016-03-13 19:24:58
    03-14 13:43:00.149 477-16532/? I/FFMPEG:       handler_name    : Core Media Data Handler
    03-14 13:43:00.149 477-16532/? I/FFmpegExtractor: FFmpegExtrator, url: android-source:0xafcff040, format_name: mov,mp4,m4a,3gp,3g2,mj2, format_long_name: QuickTime / MOV
    03-14 13:43:00.149 477-16532/? I/FFmpegExtractor: list the formats suppoted by ffmpeg:
    03-14 13:43:00.149 477-16532/? I/FFmpegExtractor: ========================================
    03-14 13:43:00.149 477-16532/? V/FFmpegExtractor: format_names[00]: mpeg
    03-14 13:43:00.149 477-16532/? V/FFmpegExtractor: format_names[01]: mpegts
    03-14 13:43:00.149 477-16532/? V/FFmpegExtractor: format_names[02]: mov,mp4,m4a,3gp,3g2,mj2
    03-14 13:43:00.149 477-16532/? V/FFmpegExtractor: format_names[03]: matroska,webm
    03-14 13:43:00.149 477-16532/? V/FFmpegExtractor: format_names[04]: asf
    03-14 13:43:00.149 477-16532/? V/FFmpegExtractor: format_names[05]: rm
    03-14 13:43:00.149 477-16532/? V/FFmpegExtractor: format_names[06]: flv
    03-14 13:43:00.149 477-16532/? V/FFmpegExtractor: format_names[07]: swf
    03-14 13:43:00.149 477-16532/? V/FFmpegExtractor: format_names[08]: avi
    03-14 13:43:00.149 477-16532/? V/FFmpegExtractor: format_names[09]: ape
    03-14 13:43:00.149 477-16532/? V/FFmpegExtractor: format_names[10]: dts
    03-14 13:43:00.149 477-16532/? V/FFmpegExtractor: format_names[11]: flac
    03-14 13:43:00.149 477-16532/? V/FFmpegExtractor: format_names[12]: ac3
    03-14 13:43:00.149 477-16532/? V/FFmpegExtractor: format_names[13]: wav
    03-14 13:43:00.149 477-16532/? V/FFmpegExtractor: format_names[14]: ogg
    03-14 13:43:00.149 477-16532/? V/FFmpegExtractor: format_names[15]: vc1
    03-14 13:43:00.149 477-16532/? V/FFmpegExtractor: format_names[16]: hevc
    03-14 13:43:00.149 477-16532/? I/FFmpegExtractor: ========================================
    03-14 13:43:00.149 477-16532/? V/FFmpegExtractor: major_brand tag is:qt  
    03-14 13:43:00.149 477-16532/? I/FFmpegExtractor: [mp4]format is mov, confidence should be larger than mpeg4
    03-14 13:43:00.149 477-16532/? D/FFMPEG: android source close
    03-14 13:43:00.149 477-16532/? I/FFmpegExtractor: sniff through BetterSniffFFMPEG success
    03-14 13:43:00.149 477-16532/? D/FFmpegExtractor: ffmpeg detected media content as 'video/mp4' with confidence 0.41
    03-14 13:43:00.149 477-16532/? I/MediaExtractor: Use extended extractor for the special mime(video/mp4) or codec
    03-14 13:43:00.149 477-16532/? V/FFmpegExtractor: FFmpegExtractor::FFmpegExtractor
    03-14 13:43:00.149 477-16532/? V/FFmpegExtractor: mFilename: android-source:0xafcff040
    03-14 13:43:00.150 477-16532/? D/FFMPEG: android source begin open
    03-14 13:43:00.150 477-16532/? D/FFMPEG: android open, url: android-source:0xafcff040
    03-14 13:43:00.150 477-16532/? D/FFMPEG: ffmpeg open android data source success, source ptr: 0xafcff040
    03-14 13:43:00.150 477-16532/? D/FFMPEG: android source open success
    03-14 13:43:00.230 477-16532/? V/FFmpegExtractor: file startTime: 0
    03-14 13:43:00.230 477-16532/? I/FFmpegExtractor: the duration is 00:00:10.87
    03-14 13:43:00.230 477-16532/? I/FFmpegExtractor: stream_index: 0
    03-14 13:43:00.230 477-16532/? I/FFmpegExtractor: support the codec(aac)
    03-14 13:43:00.230 477-16532/? V/FFmpegExtractor: Tag mp4a/0x6134706d with codec(aac)
    03-14 13:43:00.230 477-16532/? V/FFmpegExtractor: audio stream extradata(2):
    03-14 13:43:00.230 477-16532/? V/codec_utils: AAC
    03-14 13:43:00.230 477-16532/? V/codec_utils: aac profile: 1, sf_index: 4, channel: 1
    03-14 13:43:00.230 477-16532/? I/FFmpegExtractor: bit_rate: 86249, sample_rate: 44100, channels: 1, bits_per_coded_sample: 16, block_align:0
    03-14 13:43:00.230 477-16532/? I/FFmpegExtractor: the time is 00:00:10.93
    03-14 13:43:00.230 477-16532/? V/FFmpegExtractor: audio startTime:0
    03-14 13:43:00.230 477-16532/? V/FFmpegExtractor: create a audio track
    03-14 13:43:00.230 477-16532/? I/FFmpegExtractor: stream_index: 1
    03-14 13:43:00.230 477-16532/? I/FFmpegExtractor: support the codec(h264)
    03-14 13:43:00.230 477-16532/? V/FFmpegExtractor: Tag avc1/0x31637661 with codec(h264)
    03-14 13:43:00.230 477-16532/? V/FFmpegExtractor: video stream extradata:
    03-14 13:43:00.230 477-16532/? V/codec_utils: AVC
    03-14 13:43:00.230 477-16532/? I/FFmpegExtractor: width: 1920, height: 1080, bit_rate: 11113682
    03-14 13:43:00.230 477-16532/? I/FFmpegExtractor: the time is 00:00:10.87
    03-14 13:43:00.230 477-16532/? V/FFmpegExtractor: video startTime:0
    03-14 13:43:00.230 477-16532/? V/FFmpegExtractor: create a video track
    03-14 13:43:00.230 477-16532/? V/FFmpegExtractor: the stream is AVC, the length of a NAL unit: 4
    03-14 13:43:00.230 477-16532/? V/FFmpegExtractor: Starting reader thread
    03-14 13:43:00.230 477-16532/? D/FFmpegExtractor: Reader thread started
    03-14 13:43:00.231 477-16532/? V/FFmpegExtractor: mProbePkts: 0, mEOF: 0, pb->error(if has): 0, mDefersToCreateVideoTrack: 0, mDefersToCreateAudioTrack: 0
    03-14 13:43:00.231 477-16532/? D/FFmpegExtractor: supported mime: video/mp4
    03-14 13:43:00.231 477-16532/? V/FFmpegExtractor: FFmpegExtractor::getMetaData
    03-14 13:43:00.231 477-16537/? V/FFmpegExtractor: FFmpegExtractor enter thread(readerEntry)
    03-14 13:43:00.231 477-16532/? V/FFmpegExtractor: FFmpegExtractor::getTrack[0]
    03-14 13:43:00.231 477-16532/? V/FFmpegExtractor: FFmpegExtractor::getTrackMetaData[0]
    03-14 13:43:00.231 477-16532/? V/FFmpegExtractor: FFmpegExtractor::getTrack[1]
    03-14 13:43:00.231 477-16532/? V/FFmpegExtractor: FFmpegExtractor::getTrackMetaData[1]
    03-14 13:43:00.232 10973-10973/com.hobbyte.touringandroid E/MediaPlayer: Should have subtitle controller already set

    Upon trying to play the video file, I get this log:

    03-14 13:52:06.133 743-821/system_process W/AudioTrack: AUDIO_OUTPUT_FLAG_FAST denied by client
    03-14 13:52:06.133 477-477/? D/NuPlayerDriver: start(0xb0f05040)
    03-14 13:52:06.133 477-16531/? I/GenericSource: start
    03-14 13:52:06.133 477-16531/? V/FFmpegExtractor: FFmpegExtractor::Track::start audio
    03-14 13:52:06.133 477-16531/? V/FFmpegExtractor: FFmpegExtractor::Track::start video
    03-14 13:52:06.133 477-16532/? V/FFmpegExtractor: read audio flush pkt
    03-14 13:52:06.133 477-16532/? V/FFmpegExtractor: read video flush pkt
    03-14 13:52:06.139 477-24743/? D/SoftFFmpegAudio: SoftFFmpegAudio component: OMX.ffmpeg.aac.decoder mMode: 1
    03-14 13:52:06.141 477-24743/? V/SoftFFmpegAudio: get pcm params, nChannels:4294967295, nSamplingRate:4294967295
    03-14 13:52:06.141 477-24743/? V/SoftFFmpegAudio: set OMX_IndexParamAudioPcm, nChannels:1, nSampleRate:44100, nBitsPerSample:16
    03-14 13:52:06.141 477-24743/? V/SoftFFmpegAudio: set OMX_IndexParamAudioAac, nChannels:1, nSampleRate:44100
    03-14 13:52:06.141 477-24743/? E/OMXNodeInstance: setParameter(1866465283) ERROR: 0x8000101a
    03-14 13:52:06.141 477-24743/? V/SoftFFmpegAudio: get pcm params, nChannels:1, nSamplingRate:44100
    03-14 13:52:06.147 477-24744/? E/OMXNodeInstance: OMX_GetExtensionIndex OMX.google.android.index.storeMetaDataInBuffers failed
    03-14 13:52:06.147 477-24744/? E/ACodec: [OMX.google.h264.decoder] storeMetaDataInBuffers failed w/ err -2147483648
    03-14 13:52:06.149 477-24745/? I/SoftFFmpegAudio: got extradata, ignore: 0, size: 2
    03-14 13:52:06.150 477-24745/? I/SoftFFmpegAudio: extradata is ready, size: 2
    03-14 13:52:06.150 477-24745/? D/SoftFFmpegAudio: begin to open ffmpeg audio decoder(aac), mCtx sample_rate: 44100, channels: 1, , sample_fmt: (null)
    03-14 13:52:06.154 477-24745/? D/SoftFFmpegAudio: open ffmpeg audio decoder(aac) success, mCtx sample_rate: 44100, channels: 1, sample_fmt: fltp
    03-14 13:52:06.154 477-24745/? I/SoftFFmpegAudio: Create sample rate converter for conversion of 44100 Hz fltp 1 channels(mono) to 44100 Hz s16 1 channels(mono)!
    03-14 13:52:06.154 477-24743/? V/SoftFFmpegAudio: get pcm params, nChannels:1, nSamplingRate:44100
    03-14 13:52:06.155 477-24739/? D/AudioSink: bufferCount (8) is too small and increased to 12
    03-14 13:52:06.162 477-24747/? E/SoftAVC: Decoder failed: -2
    03-14 13:52:06.162 477-24744/? E/ACodec: [OMX.google.h264.decoder] ERROR(0x80001001)
    03-14 13:52:06.162 477-24744/? E/ACodec: signalError(omxError 0x80001001, internalError -2147483648)
    03-14 13:52:06.163 477-24741/? E/MediaCodec: Codec reported err 0x80001001, actionCode 0, while in state 6
    03-14 13:52:06.167 477-24740/? E/NuPlayerDecoder: Failed to queue input buffer for OMX.google.h264.decoder (err=-38)
    03-14 13:52:06.167 477-16531/? E/NuPlayer: received error(0xffffffda) from video decoder, flushing(0), now shutting down
    03-14 13:52:06.168 10973-11040/com.hobbyte.touringandroid E/MediaPlayer: error (1, -38)
    03-14 13:52:06.168 10973-10973/com.hobbyte.touringandroid E/MediaPlayer: Error (1,-38)
    03-14 13:52:06.168 477-24740/? E/NuPlayerDecoder: failed to flush OMX.google.h264.decoder (err=-38)
    03-14 13:52:06.168 477-16531/? E/NuPlayer: received error(0xffffffda) from video decoder, flushing(2), now shutting down
    03-14 13:52:06.169 10973-10989/com.hobbyte.touringandroid E/MediaPlayer: error (1, -38)
    03-14 13:52:06.170 10973-10973/com.hobbyte.touringandroid E/MediaPlayer: Error (1,-38)
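
    The lines that stand out to me are E/SoftAVC: Decoder failed: -2 followed by error (1, -38), which I read as the software H.264 decoder giving up on the 1080p High-profile stream before playback really starts. As a hedged diagnostic sketch (not code I have in the project yet), I could query MediaCodecList to see what the emulator's AVC decoders claim to support:

        import android.media.MediaCodecInfo;
        import android.media.MediaCodecList;
        import android.util.Log;

        // Diagnostic sketch: log every video/avc decoder on the device and
        // whether it claims to support the clip's 1920x1080 resolution.
        private static void logAvcDecoderSupport() {
            for (MediaCodecInfo info : new MediaCodecList(MediaCodecList.ALL_CODECS).getCodecInfos()) {
                if (info.isEncoder()) continue;
                for (String type : info.getSupportedTypes()) {
                    if (!type.equalsIgnoreCase("video/avc")) continue;
                    MediaCodecInfo.VideoCapabilities caps =
                            info.getCapabilitiesForType(type).getVideoCapabilities();
                    Log.d("AvcCheck", info.getName()
                            + " supports 1920x1080: " + caps.isSizeSupported(1920, 1080));
                }
            }
        }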

    For reference, here is my class file:

       package com.hobbyte.touringandroid.ui.adapter;

    import android.content.Context;
    import android.graphics.SurfaceTexture;
    import android.media.AudioManager;
    import android.media.MediaPlayer;
    import android.util.DisplayMetrics;
    import android.view.Gravity;
    import android.view.LayoutInflater;
    import android.view.Surface;
    import android.view.TextureView;
    import android.view.View;
    import android.view.ViewGroup;
    import android.widget.ArrayAdapter;
    import android.widget.ImageButton;
    import android.widget.ImageView;
    import android.widget.SeekBar;
    import android.widget.TextView;

    import com.google.android.exoplayer.ExoPlayer;
    import com.google.android.exoplayer.FrameworkSampleSource;
    import com.google.android.exoplayer.MediaCodecVideoTrackRenderer;
    import com.google.android.exoplayer.SampleSource;
    import com.google.android.exoplayer.TrackRenderer;
    import com.hobbyte.touringandroid.App;
    import com.hobbyte.touringandroid.tourdata.ListViewItem;
    import com.hobbyte.touringandroid.internet.LoadImageFromURL;
    import com.hobbyte.touringandroid.R;

    import java.io.File;
    import java.io.IOException;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    /**
    * @author Nikita
    */
    public class PoiContentAdapter extends ArrayAdapter<ListViewItem> {
       private static final String TAG = "PoiContentAdapter";

       public static final int HEADER = 0;
       public static final int BODY = 1;
       public static final int IMAGE = 2;
       public static final int VIDEO = 3;

       private static Pattern namePattern;
       private static final String FILE_NAME_PATTERN = "https?:\\/\\/[-\\w\\.\\/]*\\/(.+\\.(jpe?g|png|mp4))";

       private ListViewItem[] items;

       private String keyID;

       private TextureView textureView;
       private MediaPlayer player;
       private AudioManager audio;

       private ImageButton play;
       private ImageButton replay;
       private ImageButton mute;
       private ImageButton max;
       private SeekBar volume;

       private String filePath;

       @Override
       public int getViewTypeCount() {
           return 4;
       }

       @Override
           public int getItemViewType(int position) {
           return items[position].getType();
       }

       public PoiContentAdapter(Context context, ListViewItem[] content, String keyID) {
           super(context, 0, content);
           this.keyID = keyID;
           items = content;
           namePattern = Pattern.compile(FILE_NAME_PATTERN);
       }

       /**
        * Inflates a certain view depending on the type of ListViewItem (Normal text or Image URL)
        * @param position Position of item in the ItemList
        * @param view View
        * @param parent ParentView
        * @return the view in question
        */
       @Override
       public View getView(int position, View view, ViewGroup parent) {
           ListViewItem listViewItem = items[position];
           int listViewItemType = getItemViewType(position);
           String filename = null;

           TextView contentView;

           if (listViewItem.getUrl() != null) {
               Matcher m = namePattern.matcher(listViewItem.getUrl());
               if (m.matches()) {
                   filename = m.group(1);
               }
           }

           if (view == null) {
               if (listViewItemType == IMAGE) {
                   view = LayoutInflater.from(getContext()).inflate(R.layout.poi_image, parent, false);
               } else if(listViewItemType == VIDEO) {
                   view = LayoutInflater.from(getContext()).inflate(R.layout.poi_video, parent, false);
               } else {
                   view = LayoutInflater.from(getContext()).inflate(R.layout.poi_content, parent, false);
               }
           }

           switch (listViewItemType) {
               case IMAGE:
                   ImageView imageView = (ImageView) view.findViewById(R.id.poiContentImageView);
                   TextView textView = (TextView) view.findViewById(R.id.poiContentImageDesc);
                   textView.setText(listViewItem.getText());

                   if (filename != null) {
                       new LoadImageFromURL(imageView, App.context).execute(filename, keyID); //Load image in a separate thread
                   }
                   return view;

               case VIDEO:
                   filePath = getContext().getFilesDir() + "/" + String.format("%s/video/%s", keyID, filename);
                   File file = new File(filePath);
                   if(!file.exists()) {
                       view = LayoutInflater.from(getContext()).inflate(R.layout.poi_content, parent, false);
                       contentView = (TextView) view.findViewById(R.id.poiContentTextView);
                       contentView.setText("This contains a video." + "\n" + "Download this tour with Media to see this Video!" + "\n");
                       contentView.setGravity(Gravity.CENTER_HORIZONTAL);
                   } else {
                       System.out.println(filePath);
                       textureView = (TextureView) view.findViewById(R.id.poiContentVideoView);

                       DisplayMetrics metrics = App.context.getResources().getDisplayMetrics();
                       int height = metrics.heightPixels / 2;
                       int width = metrics.widthPixels;
                       textureView.setMinimumHeight(height);
                       textureView.setMinimumWidth(width);

                       play = (ImageButton) view.findViewById(R.id.playButton);
                       replay = (ImageButton) view.findViewById(R.id.replayButtoon);
                       mute = (ImageButton) view.findViewById(R.id.muteButton);
                       max = (ImageButton) view.findViewById(R.id.maxVolButton);
                       volume = (SeekBar) view.findViewById(R.id.volumeControl);
                       audio = (AudioManager) App.context.getSystemService(Context.AUDIO_SERVICE);

                       textureView.setSurfaceTextureListener(videoListener);
                       TextView videoDesc = (TextView) view.findViewById(R.id.poiContentVideoDesc);
                       videoDesc.setText(listViewItem.getText());
                   }
                   return view;
               case HEADER:
                   // TODO
                   if(view.findViewById(R.id.poiContentTextView) == null) {
                       view = LayoutInflater.from(getContext()).inflate(R.layout.poi_content, parent, false);
                   }
                   contentView = (TextView) view.findViewById(R.id.poiContentTextView);
                   contentView.setText(listViewItem.getText() + "\n");
                   if(listViewItem.getText().length() == 0) {
                       return new View(getContext());
                   }
                   return view;
               case BODY:
                   // TODO
                   contentView = (TextView) view.findViewById(R.id.poiContentTextView);
                   contentView.setText(listViewItem.getText() + "\n");
                   return view;
               default:
                   contentView = (TextView) view.findViewById(R.id.poiContentTextView);
                   contentView.setText("Something went wrong\n");
                   return view;
           }
       }

       private TextureView.SurfaceTextureListener videoListener = new TextureView.SurfaceTextureListener() {
           @Override
           public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
               Surface s = new Surface(surface);
               try {
                   player = new MediaPlayer();
                   player.setDataSource(filePath);
                   player.setSurface(s);
                   player.prepareAsync();
                   player.setAudioStreamType(AudioManager.STREAM_MUSIC);
                   player.setVideoScalingMode(MediaPlayer.VIDEO_SCALING_MODE_SCALE_TO_FIT);
                   player.setOnBufferingUpdateListener(new MediaPlayer.OnBufferingUpdateListener() {
                       @Override
                       public void onBufferingUpdate(MediaPlayer mp, int percent) {
                           //Do nothing
                       }
                   });
                   player.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
                       @Override
                       public void onCompletion(MediaPlayer mp) {
                           //Do nothing
                       }
                   });
                   player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                       @Override
                       public void onPrepared(final MediaPlayer mp) {
                           play.setOnClickListener(new View.OnClickListener() {
                               @Override
                               public void onClick(View v) {
                                   if(mp.isPlaying()) {
                                       mp.pause();
                                       play.setImageResource(R.mipmap.ic_play_arrow_white_36dp);
                                   } else {
                                       mp.start();
                                       play.setImageResource(R.mipmap.ic_pause_white_36dp);
                                   }
                               }
                           });

                           replay.setOnClickListener(new View.OnClickListener() {
                               @Override
                               public void onClick(View v) {
                                   if(mp.isPlaying()) {
                                       play.setImageResource(R.mipmap.ic_play_arrow_white_36dp);
                                       mp.pause();
                                       mp.seekTo(0);
                                   } else {
                                       play.setImageResource(R.mipmap.ic_play_arrow_white_36dp);
                                       mp.seekTo(0);
                                   }
                               }
                           });
                           mute.setOnClickListener(new View.OnClickListener() {
                               @Override
                               public void onClick(View v) {
                                   mp.setVolume(0.0f, 0.0f);
                                   volume.setProgress(0);
                               }
                           });
                           max.setOnClickListener(new View.OnClickListener() {
                               @Override
                               public void onClick(View v) {
                                   mp.setVolume(1.0f, 1.0f);
                                   volume.setProgress(audio.getStreamMaxVolume(AudioManager.STREAM_MUSIC));
                               }
                           });

                           int maxVolume = audio.getStreamMaxVolume(AudioManager.STREAM_MUSIC);
                           int currVolume = audio.getStreamVolume(AudioManager.STREAM_MUSIC);
                           volume.setMax(maxVolume);
                           volume.setProgress(currVolume);
                           volume.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
                               @Override
                               public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
                                   audio.setStreamVolume(AudioManager.STREAM_MUSIC, progress, 0);
                               }

                               @Override
                               public void onStartTrackingTouch(SeekBar seekBar) {

                               }

                               @Override
                               public void onStopTrackingTouch(SeekBar seekBar) {

                               }
                           });
                       }
                   });

                   player.setOnVideoSizeChangedListener(new MediaPlayer.OnVideoSizeChangedListener() {
                       @Override
                       public void onVideoSizeChanged(MediaPlayer mp, int width, int height) {
                           //Do nothing
                       }
                   });
               } catch (IllegalArgumentException e) {
                   // TODO Auto-generated catch block
                   e.printStackTrace();
               } catch (SecurityException e) {
                   // TODO Auto-generated catch block
                   e.printStackTrace();
               } catch (IllegalStateException e) {
                   // TODO Auto-generated catch block
                   e.printStackTrace();
               } catch (IOException e) {
                   // TODO Auto-generated catch block
                   e.printStackTrace();
               }
           }

           @Override
           public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {

           }

           @Override
           public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
               if (player != null) {
                   player.stop();
                   player.release();
                   player = null;
               }
               return true;
           }

           @Override
           public void onSurfaceTextureUpdated(SurfaceTexture surface) {

           }
       };
    }

    I really do not understand what is causing all these errors, or why the video file won’t play.
    If anyone is able to help, I will highly appreciate it!

    I am using Genymotion Emulator - Google Nexus 4 - API 21

    Thank you very much!