Advanced search

Media (1)

Word: - Tags -/pirate bay

Other articles (54)

  • Taking part in its translation

    10 April 2011

    You can help us improve the wording used in the software, or translate it into any new language, spreading it to new linguistic communities.
    To do so, we use SPIP's translation interface, where all of MediaSPIP's language modules are available. You just need to sign up to the translators' mailing list to ask for more information.
    At present, MediaSPIP is only available in French and (...)

  • The statuses of mutualisation instances

    13 March 2010

    For reasons of general compatibility between the mutualisation-management plugin and SPIP's original functions, instance statuses are the same as for any other object (articles, etc.); only their names in the interface differ slightly.
    The possible statuses are: prepa (requested), which corresponds to an instance requested by a user. If the site was already created in the past, it is switched to disabled mode. publie (validated), which corresponds to an instance validated by a (...)
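
    The mapping described above (internal SPIP code to interface label) can be pictured as a small enum; the sketch below is illustrative only, since the excerpt is truncated and names just two statuses, and the identifiers here are invented:

    public enum InstanceStatus {
        // Illustrative only: codes and labels are the ones named in the
        // excerpt above; the real plugin defines further statuses.
        PREPA("prepa", "requested"),    // instance requested by a user
        PUBLIE("publie", "validated"); // instance that has been validated

        private final String spipCode;  // internal SPIP status code
        private final String label;     // name shown in the interface

        InstanceStatus(String spipCode, String label) {
            this.spipCode = spipCode;
            this.label = label;
        }
    }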

  • Enhancing it visually

    10 April 2011

    MediaSPIP is based on a system of themes and squelettes (templates). The squelettes define where information is placed on the page, defining a specific use of the platform, while the themes provide the overall graphic design.
    Anyone can propose a new graphic theme or a squelette and make it available to the community.

On other sites (9886)

  • Solid FFmpeg wrapper for C#/.NET

    21 March 2018, by Jacob Poul Richardt

    I have been searching the web for some time for a solid FFmpeg wrapper for C#/.NET, but I have yet to come up with something useful. I have found the following three projects, but all of them appear to be dead, stuck in an early alpha stage.

    FFmpeg.NET
    ffmpeg-sharp
    FFLIB.NET

    So my question is: does anyone know of a wrapper project that is more mature?
    I am not looking for a full transcoding engine with job queues and more.
    Just a simple wrapper so I do not have to make a command line call and then parse the console output, but can instead make method calls and use event listeners for progress.

    And please feel free to mention any active projects, even if they are still in the early stages.
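
    For context, what such a wrapper would hide is exactly the spawn-and-parse chore described above. A minimal, untested sketch of that approach follows (in Java, matching the other code on this page; the same shape applies in C# via System.Diagnostics.Process). The regex and the ProgressListener interface are illustrative assumptions, not any real library's API:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class FfmpegRunner {
        /** Hypothetical callback, standing in for real event listeners. */
        public interface ProgressListener { void onProgress(String time); }

        // ffmpeg writes status lines such as
        // "frame=  120 fps= 25 ... time=00:00:05.00 bitrate=..." while it runs.
        private static final Pattern TIME = Pattern.compile("time=(\\S+)");

        public static int run(String input, String output, ProgressListener listener)
                throws Exception {
            Process p = new ProcessBuilder("ffmpeg", "-y", "-i", input, output)
                    .redirectErrorStream(true)   // progress goes to stderr; merge it
                    .start();
            try (BufferedReader r = new BufferedReader(
                    new InputStreamReader(p.getInputStream()))) {
                String line;
                while ((line = r.readLine()) != null) {
                    Matcher m = TIME.matcher(line);
                    if (m.find()) listener.onProgress(m.group(1));
                }
            }
            return p.waitFor();                  // ffmpeg's exit code
        }
    }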

  • FFMPEG: RTSP to HLS restream stops with "No more output streams to write to, finishing."

    1 June 2022, by Tim W

    I'm trying to do a live restream of an RTSP feed from a webcam using ffmpeg, but the stream repeatedly stops with the error:

    "No more output streams to write to, finishing."

    The problem seems to get worse at higher bitrates (256 kbps is mostly reliable) and is fairly random in its occurrence. At 1 Mbps, sometimes the stream will run for several hours without any trouble; on other occasions it will fail every few minutes. I've got a cron job running which restarts the stream automatically when it fails, but I would prefer to avoid the continued interruptions.

    I have seen this problem reported in a handful of other forums, so it is not a unique problem, but none of those reports had a solution attached. My ffmpeg command looks like this:

    ffmpeg -loglevel verbose -r 25 -rtsp_transport tcp -i rtsp://user:password@camera.url/live/ch0 -reset_timestamps 1 -movflags frag_keyframe+empty_moov -bufsize 7168k -stimeout 60000 -hls_flags temp_file -hls_time 5 -hls_wrap 180 -acodec copy -vcodec copy streaming.m3u8 > encode.log 2>&1

    What gets me is that the error makes no sense: this is a live stream, so output is always wanted until I shut the stream off. Having it shut down because output isn't wanted is downright odd. If ffmpeg were complaining about a problem with the input, it would make more sense.

    I'm running version 3.3.4, which I believe is the latest.

    Update 13 Oct 17:

    After extensive testing I've established that the "No more outputs" error message generated by FFMPEG is very misleading. The error seems to be generated if the data coming in from RTSP is delayed, e.g. by other activity on the router the camera is connected via. I've got a large buffer and timeout set which should be sufficient for 60 seconds, but I can still deliberately trigger this error with far shorter interruptions, so clearly the buffer and timeout aren't having the desired effect. This might be fixed by setting a QoS policy on the router and by checking that the TCP packets from the camera have a suitably high priority set; it's possible this isn't the case.

    However, I would still like to improve the robustness of the input stream when it is briefly interrupted. Is there any way to persuade FFMPEG to tolerate this, or to actually make use of the buffer it seems to be ignoring? Can FFMPEG be persuaded to simply stop writing output and wait for input to become available rather than bailing out? Or could I get FFMPEG to duplicate the last complete frame until it's able to get more data? I can live with the stream stuttering a bit, but I've got to significantly reduce the current behaviour where the stream drops at the slightest hint of a problem.

    Further update 13 Oct 2017:

    After more tests, I've found that the problem actually seems to be that HLS is incapable of coping with a discontinuity in the incoming video stream. If I deliberately cut the network connection between the camera and FFMPEG, FFMPEG will wait quite a long time for the connection to be re-established. If the interruption was long (>10 seconds), the stream will drop with the "No More Outputs" error the instant the connection is re-established. If the interruption was short, RTSP will actually start pulling data from the camera again, but the stream will then drop with the same error a few seconds later. So it seems clear that the gap in the input data causes the HLS encoder to give up once the stream resumes, and that the size of the gap determines whether the drop is instant or not.
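
    For reference, the cron-based restart described above can also be written as a small in-process watchdog. A minimal, untested sketch follows (in Java, matching the other code on this page; the command is abbreviated from the full one above, and the 5-second back-off is an arbitrary choice):

    import java.io.File;

    public class StreamWatchdog {
        public static void main(String[] args) throws Exception {
            while (true) {
                ProcessBuilder pb = new ProcessBuilder(
                        "ffmpeg", "-loglevel", "verbose", "-rtsp_transport", "tcp",
                        "-i", "rtsp://user:password@camera.url/live/ch0",
                        "-acodec", "copy", "-vcodec", "copy",
                        "-hls_time", "5", "streaming.m3u8");
                pb.redirectErrorStream(true);                // like 2>&1
                pb.redirectOutput(new File("encode.log"));   // like > encode.log
                int exit = pb.start().waitFor();             // block until ffmpeg dies
                System.out.println("ffmpeg exited with " + exit + "; restarting in 5s");
                Thread.sleep(5000);                          // brief back-off
            }
        }
    }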

  • Recording openCV processed frames on Android using CameraBridgeViewBase

    16 August 2018, by Razvan Ilin

    I basically want to do the same thing as here: Recording Live OpenCV Processing on Android

    I am able to preview the camera using openCV’s JavaCameraView and process the frames (finding color blobs), but I got stuck when I had to record the video as well.

    I tried to implement the highest-voted answer on the above question, but due to my limited knowledge of these technologies, I couldn’t understand exactly how to do it. Then I tried to use ffmpeg, following this as an example: https://github.com/bytedeco/javacv/blob/master/samples/RecordActivity.java

    The point where I got stuck here was how to pass my CameraBridgeViewBase to the CameraView that is used to record the frames. I’m pretty sure I need to modify my onCreate() method somehow, but I need some pointers on how to approach it.
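
    One direction that avoids wiring the two views together at all, sketched here for illustration (untested; it assumes the recorder has been started and was created with the same width/height as the processed frames): record straight from onCameraFrame(), copying the processed Mat’s pixels into a javacv Frame.

    public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {
        Mat rgba = inputFrame.rgba();
        salt(rgba.getNativeObjAddr());            // native processing, as before

        if (recording && recorder != null) {
            try {
                // Copy the processed RGBA pixels into a javacv Frame
                // (4 channels, one byte each). A real implementation would
                // reuse this Frame rather than reallocate it every callback.
                byte[] pixels = new byte[(int) (rgba.total() * rgba.channels())];
                rgba.get(0, 0, pixels);
                Frame frame = new Frame(rgba.cols(), rgba.rows(), Frame.DEPTH_UBYTE, 4);
                ((ByteBuffer) frame.image[0].position(0)).put(pixels);

                long t = 1000 * (System.currentTimeMillis() - startTime);
                if (t > recorder.getTimestamp()) {
                    recorder.setTimestamp(t);
                }
                recorder.record(frame, avutil.AV_PIX_FMT_RGBA);
            } catch (FFmpegFrameRecorder.Exception e) {
                Log.v(LOG_TAG, e.getMessage());
            }
        }
        return rgba;
    }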

    I’ll paste my code below:

    OpenCVActivity.java:

    package ch.hepia.iti.opencvnativeandroidstudio;

    import android.Manifest;
    import android.content.Context;
    import android.content.pm.PackageManager;
    import android.hardware.Camera;
    import android.media.AudioFormat;
    import android.media.AudioRecord;
    import android.media.CamcorderProfile;
    import android.media.MediaRecorder;
    import android.media.MediaScannerConnection;
    import android.os.Bundle;
    import android.os.Environment;
    import android.support.v4.app.ActivityCompat;
    import android.support.v7.app.AppCompatActivity;
    import android.util.Log;
    import android.view.Display;
    import android.view.KeyEvent;
    import android.view.LayoutInflater;
    import android.view.SurfaceHolder;
    import android.view.SurfaceView;
    import android.view.View;
    import android.view.WindowManager;
    import android.widget.Button;
    import android.widget.LinearLayout;
    import android.widget.RelativeLayout;
    import android.widget.Toast;

    import org.bytedeco.javacpp.avutil;
    import org.bytedeco.javacpp.opencv_core;
    import org.bytedeco.javacv.FFmpegFrameFilter;
    import org.bytedeco.javacv.FFmpegFrameRecorder;
    import org.bytedeco.javacv.Frame;
    import org.bytedeco.javacv.FrameFilter;
    import org.bytedeco.javacv.OpenCVFrameConverter;
    import org.opencv.android.BaseLoaderCallback;
    import org.opencv.android.CameraBridgeViewBase;
    import org.opencv.android.LoaderCallbackInterface;
    import org.opencv.android.OpenCVLoader;
    import org.opencv.core.Mat;
    import org.opencv.core.Scalar;

    import java.io.File;
    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.nio.ShortBuffer;
    import java.util.Collections;
    import java.util.Comparator;
    import java.util.List;

    public class OpenCVActivity extends AppCompatActivity implements CameraBridgeViewBase.CvCameraViewListener2, View.OnClickListener {

       private String ffmpeg_link = Environment.getExternalStorageDirectory().getAbsolutePath() + "/test2Video.mp4";
       private String LOG_TAG = "VideoTest";

       private static final String TAG = "OCVSample::Activity";
       private CameraBridgeViewBase _cameraBridgeViewBase;
       private Button recordButton;

       long startTime = 0;
       boolean recording = false;

       private FFmpegFrameRecorder recorder;

       private boolean isPreviewOn = false;

       /* Filter information; change the boolean to true if adding a filter */
       private boolean addFilter = true;
       private String filterString = "";
       FFmpegFrameFilter filter;

       private int sampleAudioRateInHz = 44100;
       private int imageWidth = 320;
       private int imageHeight = 240;
       private int frameRate = 30;

       /* audio data getting thread */
       private AudioRecord audioRecord;
       private AudioRecordRunnable audioRecordRunnable;
       private Thread audioThread;
       volatile boolean runAudioThread = true;

       /* video data getting thread */
       private Camera cameraDevice;
       private CameraView cameraView;

       private Frame yuvImage = null;

       /* layout setting */
       private final int bg_screen_bx = 232;
       private final int bg_screen_by = 128;
       private final int bg_screen_width = 700;
       private final int bg_screen_height = 500;
       private final int bg_width = 1123;
       private final int bg_height = 715;
       private final int live_width = 640;
       private final int live_height = 480;
       private int screenWidth, screenHeight;
       private Button btnRecorderControl;

       /* The number of seconds in the continuous record loop (or 0 to disable loop). */
       final int RECORD_LENGTH = 0;
       Frame[] images;
       long[] timestamps;
       ShortBuffer[] samples;
       int imagesIndex, samplesIndex;

       private BaseLoaderCallback _baseLoaderCallback = new BaseLoaderCallback(this) {
           @Override
           public void onManagerConnected(int status) {
               switch (status) {
                   case LoaderCallbackInterface.SUCCESS: {
                       Log.i(TAG, "OpenCV loaded successfully");
                       // Load ndk built module, as specified in moduleName in build.gradle
                       // after opencv initialization
                       System.loadLibrary("native-lib");
                       _cameraBridgeViewBase.enableView();
                   }
                   break;
                   default: {
                       super.onManagerConnected(status);
                   }
               }
           }
       };

       @Override
       public void onCreate(Bundle savedInstanceState) {
           super.onCreate(savedInstanceState);

           Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
           screenWidth = display.getWidth();
           screenHeight = display.getHeight();

           getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
           setContentView(R.layout.activity_main);

           // Permissions for Android 6+
           ActivityCompat.requestPermissions(OpenCVActivity.this,
                   new String[]{Manifest.permission.CAMERA},
                   1);

           _cameraBridgeViewBase = (CameraBridgeViewBase) findViewById(R.id.main_surface);
           _cameraBridgeViewBase.setVisibility(SurfaceView.VISIBLE);
           _cameraBridgeViewBase.setCvCameraViewListener(this);

           RelativeLayout.LayoutParams layoutParam = null;
           LayoutInflater myInflate = null;
           myInflate = (LayoutInflater) getSystemService(Context.LAYOUT_INFLATER_SERVICE);
           RelativeLayout topLayout = new RelativeLayout(this);
           setContentView(topLayout);
           LinearLayout preViewLayout = (LinearLayout) myInflate.inflate(R.layout.activity_main, null);
           layoutParam = new RelativeLayout.LayoutParams(screenWidth, screenHeight);
           topLayout.addView(_cameraBridgeViewBase, layoutParam);

           recordButton = (Button) findViewById(R.id.recorder_control);
           recordButton.setText("Start");
           recordButton.setOnClickListener(this);

           /* add camera view */
           int display_width_d = (int) (1.0 * bg_screen_width * screenWidth / bg_width);
           int display_height_d = (int) (1.0 * bg_screen_height * screenHeight / bg_height);
           int prev_rw, prev_rh;
           if (1.0 * display_width_d / display_height_d > 1.0 * live_width / live_height) {
               prev_rh = display_height_d;
               prev_rw = (int) (1.0 * display_height_d * live_width / live_height);
           } else {
               prev_rw = display_width_d;
               prev_rh = (int) (1.0 * display_width_d * live_height / live_width);
           }
           layoutParam = new RelativeLayout.LayoutParams(prev_rw, prev_rh);
           layoutParam.topMargin = (int) (1.0 * bg_screen_by * screenHeight / bg_height);
           layoutParam.leftMargin = (int) (1.0 * bg_screen_bx * screenWidth / bg_width);

           cameraDevice = Camera.open();
           cameraView = new CameraView(this, cameraDevice);
           topLayout.addView(cameraView, layoutParam);
       }

       //---------------------------------------
       // initialize ffmpeg_recorder
       //---------------------------------------
       private void initRecorder() {

           Log.w(LOG_TAG,"init recorder");

           if (RECORD_LENGTH > 0) {
               imagesIndex = 0;
               images = new Frame[RECORD_LENGTH * frameRate];
               timestamps = new long[images.length];
               for (int i = 0; i < images.length; i++) {
                   images[i] = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
                   timestamps[i] = -1;
               }
           } else if (yuvImage == null) {
               yuvImage = new Frame(imageWidth, imageHeight, Frame.DEPTH_UBYTE, 2);
               Log.i(LOG_TAG, "create yuvImage");
           }

           Log.i(LOG_TAG, "ffmpeg_url: " + ffmpeg_link);
           recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
           recorder.setFormat("mp4");
           recorder.setSampleRate(sampleAudioRateInHz);
           // Set in the surface changed method
           recorder.setFrameRate(frameRate);

           // The filterString  is any ffmpeg filter.
           // Here is the link for a list: https://ffmpeg.org/ffmpeg-filters.html
           filterString = "transpose=0";
           filter = new FFmpegFrameFilter(filterString, imageWidth, imageHeight);

           //default format on android
           filter.setPixelFormat(avutil.AV_PIX_FMT_NV21);

           Log.i(LOG_TAG, "recorder initialize success");

           audioRecordRunnable = new AudioRecordRunnable();
           audioThread = new Thread(audioRecordRunnable);
           runAudioThread = true;
       }

       public void startRecording() {

           initRecorder();

           try {
               recorder.start();
               startTime = System.currentTimeMillis();
               recording = true;
               audioThread.start();

               if(addFilter) {
                   filter.start();
               }

           } catch (FFmpegFrameRecorder.Exception | FrameFilter.Exception e) {
               e.printStackTrace();
           }
       }

       public void stopRecording() {

           runAudioThread = false;
           try {
               audioThread.join();
           } catch (InterruptedException e) {
               // reset interrupt to be nice
               Thread.currentThread().interrupt();
               return;
           }
           audioRecordRunnable = null;
           audioThread = null;

           if (recorder != null && recording) {
               if (RECORD_LENGTH > 0) {
                   Log.v(LOG_TAG,"Writing frames");
                   try {
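                        // images/timestamps form a ring buffer; locate its
                        // oldest (first) and newest (last) entries before
                        // flushing them to the recorder in order.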
                        int firstIndex = imagesIndex % images.length;
                       int lastIndex = (imagesIndex - 1) % images.length;
                       if (imagesIndex <= images.length) {
                           firstIndex = 0;
                           lastIndex = imagesIndex - 1;
                       }
                       if ((startTime = timestamps[lastIndex] - RECORD_LENGTH * 1000000L) < 0) {
                           startTime = 0;
                       }
                       if (lastIndex < firstIndex) {
                           lastIndex += images.length;
                       }
                       for (int i = firstIndex; i <= lastIndex; i++) {
                           long t = timestamps[i % timestamps.length] - startTime;
                           if (t >= 0) {
                               if (t > recorder.getTimestamp()) {
                                   recorder.setTimestamp(t);
                               }
                               recorder.record(images[i % images.length]);
                           }
                       }

                       firstIndex = samplesIndex % samples.length;
                       lastIndex = (samplesIndex - 1) % samples.length;
                       if (samplesIndex <= samples.length) {
                           firstIndex = 0;
                           lastIndex = samplesIndex - 1;
                       }
                       if (lastIndex < firstIndex) {
                           lastIndex += samples.length;
                       }
                       for (int i = firstIndex; i <= lastIndex; i++) {
                           recorder.recordSamples(samples[i % samples.length]);
                       }
                   } catch (FFmpegFrameRecorder.Exception e) {
                       Log.v(LOG_TAG,e.getMessage());
                       e.printStackTrace();
                   }
               }

               recording = false;
               Log.v(LOG_TAG,"Finishing recording, calling stop and release on recorder");
               try {
                   recorder.stop();
                   recorder.release();
                   filter.stop();
                   filter.release();
               } catch (FFmpegFrameRecorder.Exception | FrameFilter.Exception e) {
                   e.printStackTrace();
               }
               recorder = null;

           }
       }

       @Override
       public boolean onKeyDown(int keyCode, KeyEvent event) {

           if (keyCode == KeyEvent.KEYCODE_BACK) {
               if (recording) {
                   stopRecording();
               }

               finish();

               return true;
           }

           return super.onKeyDown(keyCode, event);
       }

       //---------------------------------------------
       // audio thread, gets and encodes audio data
       //---------------------------------------------
       class AudioRecordRunnable implements Runnable {

           @Override
           public void run() {
               android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

               // Audio
               int bufferSize;
               ShortBuffer audioData;
               int bufferReadResult;

               bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
               audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
                       AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

               if (RECORD_LENGTH > 0) {
                   samplesIndex = 0;
                   samples = new ShortBuffer[RECORD_LENGTH * sampleAudioRateInHz * 2 / bufferSize + 1];
                   for (int i = 0; i < samples.length; i++) {
                       samples[i] = ShortBuffer.allocate(bufferSize);
                   }
               } else {
                   audioData = ShortBuffer.allocate(bufferSize);
               }

               Log.d(LOG_TAG, "audioRecord.startRecording()");
               audioRecord.startRecording();

               /* ffmpeg_audio encoding loop */
               while (runAudioThread) {
                   if (RECORD_LENGTH > 0) {
                       audioData = samples[samplesIndex++ % samples.length];
                       audioData.position(0).limit(0);
                   }
                   //Log.v(LOG_TAG,"recording? " + recording);
                   bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
                   audioData.limit(bufferReadResult);
                   if (bufferReadResult > 0) {
                       Log.v(LOG_TAG,"bufferReadResult: " + bufferReadResult);
                       // If "recording" isn't true when start this thread, it never get's set according to this if statement...!!!
                       // Why?  Good question...
                       if (recording) {
                           if (RECORD_LENGTH <= 0) try {
                               recorder.recordSamples(audioData);
                               //Log.v(LOG_TAG,"recording " + 1024*i + " to " + 1024*i+1024);
                           } catch (FFmpegFrameRecorder.Exception e) {
                               Log.v(LOG_TAG,e.getMessage());
                               e.printStackTrace();
                           }
                       }
                   }
               }
               Log.v(LOG_TAG,"AudioThread Finished, release audioRecord");

               /* encoding finish, release recorder */
               if (audioRecord != null) {
                   audioRecord.stop();
                   audioRecord.release();
                   audioRecord = null;
                   Log.v(LOG_TAG,"audioRecord released");
               }
           }
       }

       //---------------------------------------------
       // camera thread, gets and encodes video data
       //---------------------------------------------
       class CameraView extends SurfaceView implements SurfaceHolder.Callback, Camera.PreviewCallback {

           private SurfaceHolder mHolder;
           private Camera mCamera;

           public CameraView(Context context, Camera camera) {
               super(context);
               Log.w("camera","camera view");
               mCamera = camera;
               mHolder = getHolder();
               mHolder.addCallback(CameraView.this);
               mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
               mCamera.setPreviewCallback(CameraView.this);
           }

           @Override
           public void surfaceCreated(SurfaceHolder holder) {
               try {
                   stopPreview();
                   mCamera.setPreviewDisplay(holder);
               } catch (IOException exception) {
                   mCamera.release();
                   mCamera = null;
               }
           }

           public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
               stopPreview();

               Camera.Parameters camParams = mCamera.getParameters();
                List<Camera.Size> sizes = camParams.getSupportedPreviewSizes();
               // Sort the list in ascending order
                Collections.sort(sizes, new Comparator<Camera.Size>() {

                   public int compare(final Camera.Size a, final Camera.Size b) {
                       return a.width * a.height - b.width * b.height;
                   }
               });

               // Pick the first preview size that is equal or bigger, or pick the last (biggest) option if we cannot
               // reach the initial settings of imageWidth/imageHeight.
               for (int i = 0; i < sizes.size(); i++) {
                   if ((sizes.get(i).width >= imageWidth && sizes.get(i).height >= imageHeight) || i == sizes.size() - 1) {
                       imageWidth = sizes.get(i).width;
                       imageHeight = sizes.get(i).height;
                       Log.v(LOG_TAG, "Changed to supported resolution: " + imageWidth + "x" + imageHeight);
                       break;
                   }
               }
               camParams.setPreviewSize(imageWidth, imageHeight);

               Log.v(LOG_TAG,"Setting imageWidth: " + imageWidth + " imageHeight: " + imageHeight + " frameRate: " + frameRate);

               camParams.setPreviewFrameRate(frameRate);
               Log.v(LOG_TAG,"Preview Framerate: " + camParams.getPreviewFrameRate());

               mCamera.setParameters(camParams);

               // Set the holder (which might have changed) again
               try {
                   mCamera.setPreviewDisplay(holder);
                   mCamera.setPreviewCallback(CameraView.this);
                   startPreview();
               } catch (Exception e) {
                   Log.e(LOG_TAG, "Could not set preview display in surfaceChanged");
               }
           }

           @Override
           public void surfaceDestroyed(SurfaceHolder holder) {
               try {
                   mHolder.addCallback(null);
                   mCamera.setPreviewCallback(null);
               } catch (RuntimeException e) {
                   // The camera has probably just been released, ignore.
               }
           }

           public void startPreview() {
               if (!isPreviewOn && mCamera != null) {
                   isPreviewOn = true;
                   mCamera.startPreview();
               }
           }

           public void stopPreview() {
               if (isPreviewOn && mCamera != null) {
                   isPreviewOn = false;
                   mCamera.stopPreview();
               }
           }

           @Override
           public void onPreviewFrame(byte[] data, Camera camera) {
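                // Until audio capture is actually recording, keep resetting
                // startTime so video timestamps stay aligned with the audio.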
               if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
                   startTime = System.currentTimeMillis();
                   return;
               }
               if (RECORD_LENGTH > 0) {
                   int i = imagesIndex++ % images.length;
                   yuvImage = images[i];
                   timestamps[i] = 1000 * (System.currentTimeMillis() - startTime);
               }


               /* get video data */
               if (yuvImage != null && recording) {
                   ((ByteBuffer)yuvImage.image[0].position(0)).put(data);

                   if (RECORD_LENGTH <= 0) try {
                       Log.v(LOG_TAG,"Writing Frame");
                       long t = 1000 * (System.currentTimeMillis() - startTime);
                       if (t > recorder.getTimestamp()) {
                           recorder.setTimestamp(t);
                       }

                       if(addFilter) {
                           filter.push(yuvImage);
                           Frame frame2;
                           while ((frame2 = filter.pull()) != null) {
                               recorder.record(frame2, filter.getPixelFormat());
                           }
                       } else {
                           recorder.record(yuvImage);
                       }
                   } catch (FFmpegFrameRecorder.Exception | FrameFilter.Exception e) {
                       Log.v(LOG_TAG,e.getMessage());
                       e.printStackTrace();
                   }
               }
           }
       }

       @Override
       public void onClick(View v) {
           if (!recording) {
               startRecording();
               Log.w(LOG_TAG, "Start Button Pushed");
               recordButton.setText("Stop");
           } else {
               // This will trigger the audio recording loop to stop and then set isRecorderStart = false;
               stopRecording();
               Log.w(LOG_TAG, "Stop Button Pushed");
               recordButton.setText("Start");
           }
       }

       @Override
       public void onPause() {
           super.onPause();
           disableCamera();
       }

       @Override
       public void onResume() {
           super.onResume();
           if (!OpenCVLoader.initDebug()) {
               Log.d(TAG, "Internal OpenCV library not found. Using OpenCV Manager for initialization");
               OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_3_0_0, this, _baseLoaderCallback);
           } else {
               Log.d(TAG, "OpenCV library found inside package. Using it!");
               _baseLoaderCallback.onManagerConnected(LoaderCallbackInterface.SUCCESS);
           }
       }

       @Override
       public void onRequestPermissionsResult(int requestCode, String permissions[], int[] grantResults) {
           switch (requestCode) {
               case 1: {
                   // If request is cancelled, the result arrays are empty.
                   if (grantResults.length > 0
                           && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
                       // permission was granted, yay! Do the
                       // contacts-related task you need to do.
                   } else {
                       // permission denied, boo! Disable the
                       // functionality that depends on this permission.
                       Toast.makeText(OpenCVActivity.this, "Permission denied to read your External storage", Toast.LENGTH_SHORT).show();
                   }
                   return;
               }
               // other 'case' lines to check for other
               // permissions this app might request
           }
       }

       public void onDestroy() {
           super.onDestroy();
           disableCamera();
       }

       public void disableCamera() {
           if (_cameraBridgeViewBase != null)
               _cameraBridgeViewBase.disableView();
       }

       public void onCameraViewStarted(int width, int height) {
           setup();
       }

       public void onCameraViewStopped() {
       }

       public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {
           Mat rgba = inputFrame.rgba();
           salt(rgba.getNativeObjAddr());

           return rgba;
       }


       public native void salt(long matAddrGray);
       public native void setup();
    }

    activity_main.xml:

    <?xml version="1.0" encoding="utf-8"?>
    <linearlayout>

       

       <button></button>

    </linearlayout>

    Could really use some help with this. Has anyone done this before?