Other articles (51)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MédiaSpip installation is at version 0.2 or later. If necessary, contact the administrator of your MédiaSpip to find out.

  • Support audio et vidéo HTML5

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used instead.
    The HTML5 player was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
    These technologies make it possible to deliver video and sound both on conventional computers (...)

  • From upload to the final video [standalone version]

    31 January 2010, by

    In SPIPMotion, the path of an audio or video document is divided into three distinct stages.
    Upload and retrieval of information about the source video
    First, a SPIP article must be created and the "source" video document attached to it.
    When this document is attached to the article, two actions are executed in addition to the normal behavior: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)
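The two extra actions described above (probing the source's technical stream information and extracting a thumbnail) can be sketched with the ffmpeg/ffprobe command line. This is not SPIPMotion's actual implementation, only an illustration; the file names are made up, and a synthetic clip is generated so the commands are self-contained:

```shell
# Synthetic stand-in for the uploaded "source" video document.
ffmpeg -y -f lavfi -i testsrc=duration=1:size=320x240:rate=25 \
       -c:v libx264 -pix_fmt yuv420p source.mp4

# Action 1: retrieve the technical information of the file's streams.
ffprobe -v error -show_entries stream=codec_name,width,height \
        -of default=noprint_wrappers=1 source.mp4

# Action 2: generate a thumbnail by extracting a single frame.
ffmpeg -y -i source.mp4 -frames:v 1 thumbnail.jpg
```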

On other sites (8336)

  • Can't we use mp4 as the format of FFmpegFrameRecorder while streaming over RTMP?

    13 April 2016, by Saty

    I am trying to stream video with FFmpegFrameRecorder using javacv. Everything works well, but I want to use "mp4" as the recording format instead of FLV. Is that possible? When I set it, the recorder crashes.

    I am using the code below, which is widely used for live streaming:

    public class MainActivity extends Activity implements OnClickListener {

    private final static String LOG_TAG = "MainActivity";

    private PowerManager.WakeLock mWakeLock;

    private String ffmpeg_link = "rtmp://username:password@xxx.xxx.xxx.xxx:1935/live/test.flv";
    //private String ffmpeg_link = "/mnt/sdcard/new_stream.flv";

    private volatile FFmpegFrameRecorder recorder;
    boolean recording = false;
    long startTime = 0;

    private int sampleAudioRateInHz = 44100;
    private int imageWidth = 320;
    private int imageHeight = 240;
    private int frameRate = 30;

    private Thread audioThread;
    volatile boolean runAudioThread = true;
    private AudioRecord audioRecord;
    private AudioRecordRunnable audioRecordRunnable;

    private CameraView cameraView;
    private IplImage yuvIplimage = null;

    private Button recordButton;
    private LinearLayout mainLayout;

    @Override
    public void onCreate(Bundle savedInstanceState) {
       super.onCreate(savedInstanceState);

       setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
       setContentView(R.layout.activity_main);

       initLayout();
       initRecorder();
    }

    @Override
    protected void onResume() {
       super.onResume();

       if (mWakeLock == null) {
           PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
           mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, LOG_TAG);
           mWakeLock.acquire();
       }
    }

    @Override
    protected void onPause() {
       super.onPause();

       if (mWakeLock != null) {
           mWakeLock.release();
           mWakeLock = null;
       }
    }

    @Override
    protected void onDestroy() {
       super.onDestroy();

       recording = false;
    }


    private void initLayout() {

       mainLayout = (LinearLayout) this.findViewById(R.id.record_layout);

       recordButton = (Button) findViewById(R.id.recorder_control);
       recordButton.setText("Start");
       recordButton.setOnClickListener(this);

       cameraView = new CameraView(this);

       LinearLayout.LayoutParams layoutParam = new LinearLayout.LayoutParams(imageWidth, imageHeight);        
       mainLayout.addView(cameraView, layoutParam);
       Log.v(LOG_TAG, "added cameraView to mainLayout");
    }

    private void initRecorder() {
       Log.w(LOG_TAG,"initRecorder");

       if (yuvIplimage == null) {
           // Recreated after frame size is set in surface change method
           yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
           //yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_32S, 2);

           Log.v(LOG_TAG, "IplImage.create");
       }

       recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
       Log.v(LOG_TAG, "FFmpegFrameRecorder: " + ffmpeg_link + " imageWidth: " + imageWidth + " imageHeight " + imageHeight);

       recorder.setFormat("flv");
       Log.v(LOG_TAG, "recorder.setFormat(\"flv\")");

       recorder.setSampleRate(sampleAudioRateInHz);
       Log.v(LOG_TAG, "recorder.setSampleRate(sampleAudioRateInHz)");

       // re-set in the surface changed method as well
       recorder.setFrameRate(frameRate);
       Log.v(LOG_TAG, "recorder.setFrameRate(frameRate)");

       // Create audio recording thread
       audioRecordRunnable = new AudioRecordRunnable();
       audioThread = new Thread(audioRecordRunnable);
    }

    // Start the capture
    public void startRecording() {
       try {
           recorder.start();
           startTime = System.currentTimeMillis();
           recording = true;
           audioThread.start();
       } catch (FFmpegFrameRecorder.Exception e) {
           e.printStackTrace();
       }
    }

    public void stopRecording() {
       // This should stop the audio thread from running
       runAudioThread = false;

       if (recorder != null && recording) {
           recording = false;
           Log.v(LOG_TAG,"Finishing recording, calling stop and release on recorder");
           try {
               recorder.stop();
               recorder.release();
           } catch (FFmpegFrameRecorder.Exception e) {
               e.printStackTrace();
           }
           recorder = null;
       }
    }

    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
       // Quit when back button is pushed
       if (keyCode == KeyEvent.KEYCODE_BACK) {
           if (recording) {
               stopRecording();
           }
           finish();
           return true;
       }
       return super.onKeyDown(keyCode, event);
    }

    @Override
    public void onClick(View v) {
       if (!recording) {
           startRecording();
           Log.w(LOG_TAG, "Start Button Pushed");
           recordButton.setText("Stop");
       } else {
           stopRecording();
           Log.w(LOG_TAG, "Stop Button Pushed");
           recordButton.setText("Start");
       }
    }

    //---------------------------------------------
    // audio thread, gets and encodes audio data
    //---------------------------------------------
    class AudioRecordRunnable implements Runnable {

       @Override
       public void run() {
           // Set the thread priority
           android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

           // Audio
           int bufferSize;
           short[] audioData;
           int bufferReadResult;

           bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
                   AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT);
           audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
                   AudioFormat.CHANNEL_CONFIGURATION_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);

           audioData = new short[bufferSize];

           Log.d(LOG_TAG, "audioRecord.startRecording()");
           audioRecord.startRecording();

           // Audio Capture/Encoding Loop
           while (runAudioThread) {
               // Read from audioRecord
               bufferReadResult = audioRecord.read(audioData, 0, audioData.length);
               if (bufferReadResult > 0) {
                   //Log.v(LOG_TAG,"audioRecord bufferReadResult: " + bufferReadResult);

                   // Changes in this variable may not be picked up despite it being "volatile"
                   if (recording) {
                       try {
                           // Write to FFmpegFrameRecorder
                           Buffer[] buffer = {ShortBuffer.wrap(audioData, 0, bufferReadResult)};                        
                           recorder.record(buffer);
                       } catch (FFmpegFrameRecorder.Exception e) {
                           Log.v(LOG_TAG,e.getMessage());
                           e.printStackTrace();
                       }
                   }
               }
           }
           Log.v(LOG_TAG,"AudioThread Finished");

           /* Capture/Encoding finished, release recorder */
           if (audioRecord != null) {
               audioRecord.stop();
               audioRecord.release();
               audioRecord = null;
               Log.v(LOG_TAG,"audioRecord released");
           }
       }
    }

    class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {

       private boolean previewRunning = false;

       private SurfaceHolder holder;
       private Camera camera;

       private byte[] previewBuffer;

       long videoTimestamp = 0;

       Bitmap bitmap;
       Canvas canvas;

       public CameraView(Context _context) {
           super(_context);

           holder = this.getHolder();
           holder.addCallback(this);
           holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
       }

       @Override
       public void surfaceCreated(SurfaceHolder holder) {
           camera = Camera.open();

           try {
               camera.setPreviewDisplay(holder);
               camera.setPreviewCallback(this);

               Camera.Parameters currentParams = camera.getParameters();
               Log.v(LOG_TAG,"Preview Framerate: " + currentParams.getPreviewFrameRate());
               Log.v(LOG_TAG,"Preview imageWidth: " + currentParams.getPreviewSize().width + " imageHeight: " + currentParams.getPreviewSize().height);

               // Use these values
               imageWidth = currentParams.getPreviewSize().width;
               imageHeight = currentParams.getPreviewSize().height;
               frameRate = currentParams.getPreviewFrameRate();                

               bitmap = Bitmap.createBitmap(imageWidth, imageHeight, Bitmap.Config.ALPHA_8);


               /*
               Log.v(LOG_TAG,"Creating previewBuffer size: " + imageWidth * imageHeight * ImageFormat.getBitsPerPixel(currentParams.getPreviewFormat())/8);
               previewBuffer = new byte[imageWidth * imageHeight * ImageFormat.getBitsPerPixel(currentParams.getPreviewFormat())/8];
               camera.addCallbackBuffer(previewBuffer);
               camera.setPreviewCallbackWithBuffer(this);
               */              

               camera.startPreview();
               previewRunning = true;
           }
           catch (IOException e) {
               Log.v(LOG_TAG,e.getMessage());
               e.printStackTrace();
           }  
       }

       public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
           Log.v(LOG_TAG,"Surface Changed: width " + width + " height: " + height);

           // We would do this if we want to reset the camera parameters
           /*
           if (!recording) {
               if (previewRunning){
                   camera.stopPreview();
               }
               try {
                   //Camera.Parameters cameraParameters = camera.getParameters();
                   //p.setPreviewSize(imageWidth, imageHeight);
                   //p.setPreviewFrameRate(frameRate);
                   //camera.setParameters(cameraParameters);

                   camera.setPreviewDisplay(holder);
                   camera.startPreview();
                   previewRunning = true;
               }
               catch (IOException e) {
                   Log.e(LOG_TAG,e.getMessage());
                   e.printStackTrace();
               }  
           }            
           */

           // Get the current parameters
           Camera.Parameters currentParams = camera.getParameters();
           Log.v(LOG_TAG,"Preview Framerate: " + currentParams.getPreviewFrameRate());
           Log.v(LOG_TAG,"Preview imageWidth: " + currentParams.getPreviewSize().width + " imageHeight: " + currentParams.getPreviewSize().height);

           // Use these values
           imageWidth = currentParams.getPreviewSize().width;
           imageHeight = currentParams.getPreviewSize().height;
           frameRate = currentParams.getPreviewFrameRate();

           // Create the yuvIplimage if needed
           yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
           //yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_32S, 2);
       }

       @Override
       public void surfaceDestroyed(SurfaceHolder holder) {
           try {
               camera.setPreviewCallback(null);

               previewRunning = false;
               camera.release();

           } catch (RuntimeException e) {
               Log.v(LOG_TAG,e.getMessage());
               e.printStackTrace();
           }
       }

       @Override
       public void onPreviewFrame(byte[] data, Camera camera) {

           if (yuvIplimage != null && recording) {
               videoTimestamp = 1000 * (System.currentTimeMillis() - startTime);

               // Put the camera preview frame right into the yuvIplimage object
               yuvIplimage.getByteBuffer().put(data);

               // FAQ about IplImage:
               // - For custom raw processing of data, getByteBuffer() returns an NIO direct
               //   buffer wrapped around the memory pointed by imageData, and under Android we can
               //   also use that Buffer with Bitmap.copyPixelsFromBuffer() and copyPixelsToBuffer().
               // - To get a BufferedImage from an IplImage, we may call getBufferedImage().
               // - The createFrom() factory method can construct an IplImage from a BufferedImage.
               // - There are also a few copy*() methods for BufferedImage<->IplImage data transfers.

               // Let's try it..
               // This works but only on transparency
               // Need to find the right Bitmap and IplImage matching types

               /*
               bitmap.copyPixelsFromBuffer(yuvIplimage.getByteBuffer());
               //bitmap.setPixel(10,10,Color.MAGENTA);

               canvas = new Canvas(bitmap);
               Paint paint = new Paint();
               paint.setColor(Color.GREEN);
               float leftx = 20;
               float topy = 20;
               float rightx = 50;
               float bottomy = 100;
               RectF rectangle = new RectF(leftx,topy,rightx,bottomy);
               canvas.drawRect(rectangle, paint);

               bitmap.copyPixelsToBuffer(yuvIplimage.getByteBuffer());
               */
               //Log.v(LOG_TAG,"Writing Frame");

               try {

                   // Get the correct time
                   recorder.setTimestamp(videoTimestamp);

                   // Record the image into FFmpegFrameRecorder
                   recorder.record(yuvIplimage);

               } catch (FFmpegFrameRecorder.Exception e) {
                   Log.v(LOG_TAG,e.getMessage());
                   e.printStackTrace();
               }
           }
       }
    }

    }
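A note that is not in the original question: RTMP is defined to carry FLV-packaged streams, and the MP4 muxer needs a seekable output to write its index (the moov atom), so pointing setFormat("mp4") at an rtmp:// URL is expected to fail. The usual pattern is to keep the FLV container for the stream and write MP4 only to a local file. The same behavior can be sketched with the ffmpeg command line (a local .flv file replaces the rtmp URL and a synthetic clip stands in for the camera, so the sketch is self-contained):

```shell
# Synthetic 2-second source standing in for the camera/microphone input.
ffmpeg -y -f lavfi -i testsrc=duration=2:size=320x240:rate=30 \
       -f lavfi -i sine=frequency=440:duration=2 \
       -c:v libx264 -pix_fmt yuv420p -c:a aac -shortest input.mp4

# For live streaming the container must be FLV; with a real server the
# output would be rtmp://server/live/stream instead of output.flv.
ffmpeg -y -re -i input.mp4 -c:v libx264 -c:a aac -f flv output.flv
```

In javacv terms this corresponds to keeping recorder.setFormat("flv") for the rtmp:// link, and using setFormat("mp4") only with a local file path.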

  • Squeezed image when the camera resolution is larger than 1024x768 in FFmpeg with javacv

    14 April 2016, by Saty

    I am using the code below to stream RTMP to my Adobe FMS server, and it works well. However, the image is squeezed when the camera resolution is above 1024x768.
    The issue appeared when we tested on a tablet whose camera resolution is 1200x800: the recorder automatically falls back to 1024x768, which squeezes both the preview and the final video. In addition, the recording format does not support MP4.

    Can anyone explain why it does not work, and whether that format can be used?

    [The same MainActivity code as in the previous question was included here.]
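Not from the original post, but a likely cause of the squeeze: when the size reported by the camera (1200x800 here, a 3:2 frame) does not match the recorder's 4:3 output size, the frame is scaled non-uniformly. A standard remedy is to scale preserving the aspect ratio and pad the remainder with black bars, sketched here with the ffmpeg command line on a synthetic 1200x800 source:

```shell
# Synthetic 1200x800 source standing in for the tablet camera.
ffmpeg -y -f lavfi -i testsrc=duration=1:size=1200x800:rate=30 \
       -c:v libx264 -pix_fmt yuv420p in_1200x800.mp4

# Fit into 1024x768 without distortion: scale down to fit, then pad
# the leftover area so the output is exactly 1024x768.
ffmpeg -y -i in_1200x800.mp4 \
       -vf "scale=1024:768:force_original_aspect_ratio=decrease,pad=1024:768:(ow-iw)/2:(oh-ih)/2" \
       -c:v libx264 -pix_fmt yuv420p out_1024x768.mp4
```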

  • ffmpeg concat of many files loses quality or makes the resulting video stall

    28 April 2016, by edotom

    I have a project that combines several images and several videos into a single mp4 output.
    The source images come from a Windows app that converted BMPs to JPGs. Those images were converted to mp4 using the following command:

    ffmpeg -report  -framerate 1/7 -i source.jpg -c:v libx264 -r 30 -pix_fmt yuv420p  destination.mp4

    The resulting video plays without problems.

    The source videos come from a VMS in mkv format and are converted to mp4 using the following:

    ffmpeg -i source.mkv -vcodec copy -acodec copy destination.mp4

    As a result I have a series of different mp4 files whose names are written to a txt file, which lets us concatenate them into a final mp4.

    The contents of the txt file look something like this (the jpg-based clips are placed alternately with the videos):

    file '1_VideoMkvto.mp4'
    file '2_VideoMkvto.mp4'
    file '3_BmptoJpg.mp4'
    file '4_VideoMkvto.mp4'
    file '5_BmptoJpg.mp4'
    file '7_VideoMkvto.mp4'
    file '8_JpgtoMp4.mp4'
    ....

    I then run the following command:

     ffmpeg -report -f concat -i allVideos.txt -c copy results.mp4

    The issue I'm experiencing is that in the joined video, the clips created from images (originally BMPs converted to JPGs) show problems: they are not shown at all, are shown for more or less time than the original clip, or, even worse, they make the video player stall when their part plays, or close without warning.

    PS: I tried concatenating all the videos into one single file and all the jpg-based mp4s into another single file, and each plays back fine on its own; it is only when they are joined together that everything starts to fail.
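A note on why the concat demuxer can misbehave here (not from the original post): "-c copy" is only safe when every listed file has identical codec parameters (resolution, frame rate, timebase, profile), and clips built from JPEGs at -framerate 1/7 typically differ from the camera footage, which can produce exactly the stalls described. One conventional workaround is to re-encode during concatenation so all inputs are normalized. A self-contained sketch with synthetic inputs whose frame rates deliberately differ:

```shell
# Two short clips with mismatched parameters (25 vs 30 fps), mimicking
# the camera footage vs. the JPEG-based clips.
ffmpeg -y -f lavfi -i testsrc=duration=1:size=320x240:rate=25 \
       -c:v libx264 -pix_fmt yuv420p a.mp4
ffmpeg -y -f lavfi -i testsrc=duration=1:size=320x240:rate=30 \
       -c:v libx264 -pix_fmt yuv420p b.mp4

printf "file 'a.mp4'\nfile 'b.mp4'\n" > allVideos.txt

# Re-encode while concatenating (instead of -c copy) so the mismatched
# frame rates/timebases are normalized into one consistent stream.
ffmpeg -y -f concat -safe 0 -i allVideos.txt -c:v libx264 -r 30 -pix_fmt yuv420p results.mp4
```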

    Raw jpg log:

    "D:\\ffmpeg\\bin\\ffmpeg.exe" -loop 1 -i "D:\\Project2\\files\\12278\\PRU_SONOMETRO.jpg" -c:v libx264 -t 5 -y -vf "fps=7,format=yuv420p" "D:\\Project2\\files\\12278\\Video\\PRU_SONOMETRO.mp4"
    ffmpeg version N-76456-g6df2c94 Copyright (c) 2000-2015 the FFmpeg developers
    built with gcc 5.2.0 (GCC)
    configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libdcadec --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-lzma --enable-decklink --enable-zlib
    libavutil      55.  5.100 / 55.  5.100
    libavcodec     57. 14.100 / 57. 14.100
    libavformat    57. 14.100 / 57. 14.100
    libavdevice    57.  0.100 / 57.  0.100
    libavfilter     6. 14.101 /  6. 14.101
    libswscale      4.  0.100 /  4.  0.100
    libswresample   2.  0.100 /  2.  0.100
    libpostproc    54.  0.100 / 54.  0.100
    [mjpeg @ 008e3300] ignoring invalid SAR: 0/0
    Input #0, image2, from 'D:\\Project2\\files\\12278\\PRU_SONOMETRO.jpg':
    Duration: 00:00:00.04, start: 0.000000, bitrate: 31662 kb/s
       Stream #0:0: Video: mjpeg, yuvj420p(pc, bt470bg/unknown/unknown), 1920x1080, 25 fps, 25 tbr, 25 tbn, 25 tbc
    [swscaler @ 00931d20] deprecated pixel format used, make sure you did set range correctly
    [libx264 @ 00900de0] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
    [libx264 @ 00900de0] profile High, level 4.0
    [libx264 @ 00900de0] 264 - core 148 r2638 7599210 - H.264/MPEG-4 AVC codec - Copyleft 2003-2015 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=7 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0
    qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
    Output #0, mp4, to 'D:\\Project2\\files\\12278\\Video\\PRU_SONOME
    TRO.mp4':
    Metadata:
       encoder         : Lavf57.14.100
       Stream #0:0: Video: h264 (libx264) ([33][0][0][0] / 0x0021), yuv420p, 1920x1
    080, q=-1--1, 7 fps, 14336 tbn, 7 tbc
       Metadata:
       encoder         : Lavc57.14.100 libx264
    Stream mapping:
    Stream #0:0 -> #0:0 (mjpeg (native) -> h264 (libx264))
    Press [q] to stop, [?] for help
    [mjpeg @ 008e5700] ignoring invalid SAR: 0/0
       Last message repeated 18 times
    [mjpeg @ 008e5700] ignoring invalid SAR: 0/0me=00:00:00.00 bitrate=N/A
       Last message repeated 18 times
    [mjpeg @ 008e5700] ignoring invalid SAR: 0/0me=00:00:00.00 bitrate=N/A
       Last message repeated 18 times
    [mjpeg @ 008e5700] ignoring invalid SAR: 0/0me=00:00:00.00 bitrate=N/A
       Last message repeated 17 times
    [mjpeg @ 008e5700] ignoring invalid SAR: 0/0me=00:00:00.00 bitrate=N/A
       Last message repeated 17 times
    [mjpeg @ 008e5700] ignoring invalid SAR: 0/0me=00:00:00.00 bitrate=N/A
       Last message repeated 17 times
    [mjpeg @ 008e5700] ignoring invalid SAR: 0/0me=00:00:00.00 bitrate=N/A
       Last message repeated 16 times
    frame=   35 fps=6.1 q=-1.0 Lsize=     212kB time=00:00:04.71 bitrate= 368.6kbits
    /s
    video:211kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing o
    verhead: 0.582017%
    [libx264 @ 00900de0] frame I:1     Avg QP:10.59  size:211484
    [libx264 @ 00900de0] frame P:9     Avg QP:19.90  size:   194
    [libx264 @ 00900de0] frame B:25    Avg QP:30.36  size:    82
    [libx264 @ 00900de0] consecutive B-frames:  2.9%  5.7%  0.0% 91.4%
    [libx264 @ 00900de0] mb I  I16..4:  1.4% 95.8%  2.8%
    [libx264 @ 00900de0] mb P  I16..4:  0.0%  0.0%  0.0%  P16..4:  0.5%  0.0%  0.0%
    0.0%  0.0%    skip:99.5%
    [libx264 @ 00900de0] mb B  I16..4:  0.0%  0.0%  0.0%  B16..8:  0.3%  0.0%  0.0%
    direct: 0.0%  skip:99.7%  L0:42.3% L1:57.7% BI: 0.0%
    [libx264 @ 00900de0] 8x8 transform intra:95.8% inter:73.4%
    [libx264 @ 00900de0] coded y,uvDC,uvAC intra: 92.7% 31.6% 11.8% inter: 0.0% 0.1%
    0.0%
    [libx264 @ 00900de0] i16 v,h,dc,p: 10% 51% 37%  2%
    [libx264 @ 00900de0] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 13% 15% 54%  3%  2%  2%  3%
    3%  4%
    [libx264 @ 00900de0] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 30% 30% 20%  3%  3%  3%  2%
    3%  5%
    [libx264 @ 00900de0] i8c dc,h,v,p: 56% 23% 20%  1%
    [libx264 @ 00900de0] Weighted P-Frames: Y:0.0% UV:0.0%
    [libx264 @ 00900de0] ref P L0: 94.3%  0.0%  2.7%  3.1%
    [libx264 @ 00900de0] ref B L0: 83.1% 15.0%  2.0%
    [libx264 @ 00900de0] ref B L1: 88.4% 11.6%
    [libx264 @ 00900de0] kb/s:344.46

    mkv converted to mp4:

    D:\>"D:\\ffmpeg\\bin\\ffmpeg.exe" -i "D:\\Project2\\files\\12278\\46_REGISTROSALIDA1.mkv" -c:v copy -c:a libfdk_aac -b:a 128k "D:\\Project2\\CDA\\SicovCDA\\files\\12278\\video\\46_REGISTROSALIDA1.mp4"
    ffmpeg version N-76456-g6df2c94 Copyright (c) 2000-2015 the FFmpeg developers
    built with gcc 5.2.0 (GCC)
    configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-av
    isynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enab
    le-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --
    enable-libdcadec --enable-libfreetype --enable-libgme --enable-libgsm --enable-l
    ibilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enab
    le-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --en
    able-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --ena
    ble-libtwolame --enable-libvidstab --enable-libvo-aacenc --enable-libvo-amrwbenc
    --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enabl
    e-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-lzma --ena
    ble-decklink --enable-zlib
    libavutil      55.  5.100 / 55.  5.100
    libavcodec     57. 14.100 / 57. 14.100
    libavformat    57. 14.100 / 57. 14.100
    libavdevice    57.  0.100 / 57.  0.100
    libavfilter     6. 14.101 /  6. 14.101
    libswscale      4.  0.100 /  4.  0.100
    libswresample   2.  0.100 /  2.  0.100
    libpostproc    54.  0.100 / 54.  0.100
    [matroska,webm @ 02fa0d40] Could not find codec parameters for stream 1 (Video:
    mjpeg, none(bt470bg/unknown/unknown), 1920x1080): unspecified pixel format
    Consider increasing the value for the 'analyzeduration' and 'probesize' options
    Input #0, matroska,webm, from 'D:\\Project2\\files\\12278\\46_REG
    ISTROSALIDA1.mkv':
    Metadata:
       encoder         : MilestoneMux
    Duration: 00:03:17.97, start: 0.000000, bitrate: 994 kb/s
       Stream #0:0(eng): Video: h264 (Main), yuvj420p(pc, bt709), 1920x1080 [SAR 1:
    1 DAR 16:9], 30 fps, 30 tbr, 1k tbn, 2k tbc (default)
       Stream #0:1(eng): Video: mjpeg, none(bt470bg/unknown/unknown), 1920x1080, SA
    R 1:1 DAR 16:9, 1k tbr, 1k tbn, 1k tbc (default)
    Codec AVOption b (set bitrate (in bits/s)) specified for output file #0 (D:\\Pro
    ject2\\files\\12278\\video\\46_REGISTROSALIDA1.mp4) has not been
    used for any stream. The most likely reason is either wrong type (e.g. a video o
    ption with no video streams) or that it is a private option of some encoder whic
    h was not actually used for any stream.
    File 'D:\\Project2\\files\\12278\\video\\46_REGISTROSALIDA1.mp4'
    already exists. Overwrite ? [y/N] y
    [mp4 @ 05d02580] Codec for stream 0 does not use global headers but container fo
    rmat requires global headers
    Output #0, mp4, to 'D:\\Project2\\files\\12278\\video\\46_REGISTR
    OSALIDA1.mp4':
    Metadata:
       encoder         : Lavf57.14.100
       Stream #0:0(eng): Video: h264 ([33][0][0][0] / 0x0021), yuvj420p, 1920x1080
    [SAR 1:1 DAR 16:9], q=2-31, 30 fps, 30 tbr, 16k tbn, 1k tbc (default)
    Stream mapping:
    Stream #0:0 -> #0:0 (copy)
    Press [q] to stop, [?] for help
    frame= 5940 fps=0.0 q=-1.0 Lsize=   24016kB time=00:03:17.97 bitrate= 993.8kbits
    /s
    video:23960kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing
    overhead: 0.234450%

    jpg (originally bmp) to mp4:

    "D:\\ffmpeg\\bin\\ffmpeg.exe" -loop 1 -i "D:\\Project2\\files\\12278\\Sensorial.jpg" -c:v libx264 -t 5 -y -vf "fps=7,format=yuv420p" "D:\\Project2\\files\\12278\\Video\\Sensorial.mp4"
    ffmpeg version N-76456-g6df2c94 Copyright (c) 2000-2015 the FFmpeg developers
    built with gcc 5.2.0 (GCC)
    configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-av
    isynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enab
    le-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --
    enable-libdcadec --enable-libfreetype --enable-libgme --enable-libgsm --enable-l
    ibilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enab
    le-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --en
    able-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --ena
    ble-libtwolame --enable-libvidstab --enable-libvo-aacenc --enable-libvo-amrwbenc
    --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enabl
    e-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-lzma --ena
    ble-decklink --enable-zlib
    libavutil      55.  5.100 / 55.  5.100
    libavcodec     57. 14.100 / 57. 14.100
    libavformat    57. 14.100 / 57. 14.100
    libavdevice    57.  0.100 / 57.  0.100
    libavfilter     6. 14.101 /  6. 14.101
    libswscale      4.  0.100 /  4.  0.100
    libswresample   2.  0.100 /  2.  0.100
    libpostproc    54.  0.100 / 54.  0.100
    Input #0, image2, from 'D:\\Project2\\files\\12278\\39_BmpSensori
    al.jpg':
    Duration: 00:00:00.04, start: 0.000000, bitrate: 25444 kb/s
       Stream #0:0: Video: mjpeg, yuvj420p(pc, bt470bg/unknown/unknown), 1920x1080
    [SAR 96:96 DAR 16:9], 25 fps, 25 tbr, 25 tbn, 25 tbc
    [swscaler @ 058663a0] deprecated pixel format used, make sure you did set range
    correctly
    [libx264 @ 00df0020] using SAR=1/1
    [libx264 @ 00df0020] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
    [libx264 @ 00df0020] profile High, level 4.0
    [libx264 @ 00df0020] 264 - core 148 r2638 7599210 - H.264/MPEG-4 AVC codec - Cop
    yleft 2003-2015 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deb
    lock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 m
    e_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chro
    ma_qp_offset=-2 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 i
    nterlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1
    b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=7 scenecu
    t=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0
    qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
    Output #0, mp4, to 'D:\\Project2\\files\\12278\\Video\\39_BmpSens
    orial.mp4':
    Metadata:
       encoder         : Lavf57.14.100
       Stream #0:0: Video: h264 (libx264) ([33][0][0][0] / 0x0021), yuv420p, 1920x1
    080 [SAR 1:1 DAR 16:9], q=-1--1, 7 fps, 14336 tbn, 7 tbc
       Metadata:
       encoder         : Lavc57.14.100 libx264
    Stream mapping:
    Stream #0:0 -> #0:0 (mjpeg (native) -> h264 (libx264))
    Press [q] to stop, [?] for help
    frame=   35 fps= 11 q=-1.0 Lsize=      83kB time=00:00:04.71 bitrate= 144.7kbits
    /s
    video:82kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing ov
    erhead: 1.497583%
    [libx264 @ 00df0020] frame I:1     Avg QP:13.49  size: 80298
    [libx264 @ 00df0020] frame P:9     Avg QP: 9.09  size:   127
    [libx264 @ 00df0020] frame B:25    Avg QP: 8.68  size:    75
    [libx264 @ 00df0020] consecutive B-frames:  2.9%  5.7%  0.0% 91.4%
    [libx264 @ 00df0020] mb I  I16..4: 51.4% 39.7%  8.8%
    [libx264 @ 00df0020] mb P  I16..4:  0.0%  0.0%  0.0%  P16..4:  0.1%  0.0%  0.0%
    0.0%  0.0%    skip:99.9%
    [libx264 @ 00df0020] mb B  I16..4:  0.0%  0.0%  0.0%  B16..8:  0.1%  0.0%  0.0%
    direct: 0.0%  skip:99.9%  L0:44.9% L1:55.1% BI: 0.0%
    [libx264 @ 00df0020] 8x8 transform intra:39.7% inter:17.3%
    [libx264 @ 00df0020] coded y,uvDC,uvAC intra: 7.1% 9.4% 8.8% inter: 0.0% 0.0% 0.
    0%
    [libx264 @ 00df0020] i16 v,h,dc,p: 79% 18%  3%  0%
    [libx264 @ 00df0020] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 57% 36%  6%  0%  0%  0%  0%
    0%  0%
    [libx264 @ 00df0020] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 29% 28% 13%  3%  5%  6%  6%
    4%  6%
    [libx264 @ 00df0020] i8c dc,h,v,p: 83% 12%  4%  1%
    [libx264 @ 00df0020] Weighted P-Frames: Y:0.0% UV:0.0%
    [libx264 @ 00df0020] ref P L0: 88.9%  1.7%  2.7%  6.8%
    [libx264 @ 00df0020] ref B L0:  4.2% 95.8%
    [libx264 @ 00df0020] ref B L1: 98.3%  1.7%
    [libx264 @ 00df0020] kb/s:133.30

    final concat output (log trimmed to the relevant parts):

       ffmpeg started on 2016-04-27 at 16:06:45
    Report written to "ffmpeg-20160427-160645.log"
    Command line:
    "D:\\ffmpeg\\bin\\ffmpeg.exe" -report -f concat -i "D:\\Project2\\files\\12297\\control.txt" -c copy "D:\\Project2\\files\\12297\\12297resultado.mp4"
    ffmpeg version N-76456-g6df2c94 Copyright (c) 2000-2015 the FFmpeg developers
     built with gcc 5.2.0 (GCC)
     configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libdcadec --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-lzma --enable-decklink --enable-zlib
     libavutil      55.  5.100 / 55.  5.100
     libavcodec     57. 14.100 / 57. 14.100
     libavformat    57. 14.100 / 57. 14.100
     libavdevice    57.  0.100 / 57.  0.100
     libavfilter     6. 14.101 /  6. 14.101
     libswscale      4.  0.100 /  4.  0.100
     libswresample   2.  0.100 /  2.  0.100
     libpostproc    54.  0.100 / 54.  0.100
    Splitting the commandline.
    Reading option '-report' ... matched as option 'report' (generate a report) with argument '1'.
    Reading option '-f' ... matched as option 'f' (force format) with argument 'concat'.
    Reading option '-i' ... matched as input file with argument 'D:\Project2\files\12297\control.txt'.
    Reading option '-c' ... matched as option 'c' (codec name) with argument 'copy'.
    Reading option 'D:\Project2\files\12297\12297resultado.mp4' ... matched as output file.
    Finished splitting the commandline.
    Parsing a group of options: global .
    Applying option report (generate a report) with argument 1.
    Successfully parsed a group of options.
    Parsing a group of options: input file D:\Project2\files\12297\control.txt.
    Applying option f (force format) with argument concat.
    Successfully parsed a group of options.
    Opening an input file: D:\Project2\files\12297\control.txt.
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00adb460] Format mov,mp4,m4a,3gp,3g2,mj2 probed with size=2048 and score=100
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00adb460] ISO: File Type Major Brand: isom
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00adb460] rfps: 7.000000 0.000016
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00adb460] rfps: 7.000000 0.000016
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00adb460] rfps: 14.000000 0.000063
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00adb460] rfps: 14.000000 0.000063
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00adb460] rfps: 21.000000 0.000143
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00adb460] rfps: 21.000000 0.000143
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00adb460] rfps: 28.000000 0.000253
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00adb460] rfps: 28.000000 0.000253
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00adb460] Before avformat_find_stream_info() pos: 7702749 bytes read:39314 seeks:1
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00adb460] All info found
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00adb460] After avformat_find_stream_info() pos: 71010 bytes read:110276 seeks:2 frames:1
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00adb460] Auto-inserting h264_mp4toannexb bitstream filter
    [concat @ 00ad1a00] Before avformat_find_stream_info() pos: 2586 bytes read:2586 seeks:0

    --- comment: file 0 (mkv converted to mp4)


    [concat @ 00ad1a00] file:0 stream:0 pts:0 pts_time:0 dts:0 dts_time:0 -> pts:0 pts_time:0 dts:0 dts_time:0
    [concat @ 00ad1a00] All info found
    [concat @ 00ad1a00] After avformat_find_stream_info() pos: 2586 bytes read:2586 seeks:0 frames:1
    Input #0, concat, from 'D:\Project2\files\12297\control.txt':
     Duration: N/A, start: 0.000000, bitrate: 509 kb/s
       Stream #0:0(eng), 1, 1/16000: Video: h264 (Main) (avc1 / 0x31637661), yuvj420p(pc, bt709), 1920x1080 [SAR 1:1 DAR 16:9], 509 kb/s, 7.01 fps, 7 tbr, 16k tbn, 32k tbc
       Metadata:
         handler_name    : VideoHandler
    Successfully opened the file.
    Parsing a group of options: output file D:\Project2\files\12297\12297resultado.mp4.
    Applying option c (codec name) with argument copy.
    Successfully parsed a group of options.
    Opening an output file: D:\Project2\files\12297\12297resultado.mp4.
    Successfully opened the file.
    [mp4 @ 00ae7780] Codec for stream 0 does not use global headers but container format requires global headers
    Output #0, mp4, to 'D:\Project2\files\12297\12297resultado.mp4':
     Metadata:
       encoder         : Lavf57.14.100
       Stream #0:0(eng), 0, 1/16000: Video: h264 ([33][0][0][0] / 0x0021), yuvj420p, 1920x1080 [SAR 1:1 DAR 16:9], q=2-31, 509 kb/s, 7.01 fps, 7 tbr, 16k tbn, 16k tbc
       Metadata:
         handler_name    : VideoHandler
    Stream mapping:
     Stream #0:0 -> #0:0 (copy)
    Press [q] to stop, [?] for help
    [concat @ 00ad1a00] file:0 stream:0 pts:2288 pts_time:0.143 dts:2288 dts_time:0.143 -> pts:2288 pts_time:0.143 dts:2288 dts_time:0.143
    [concat @ 00ad1a00] file:0 stream:0 pts:1929280 pts_time:120.58 dts:1929280 dts_time:120.58 -> pts:1929280 pts_time:120.58 dts:1929280 dts_time:120.58
    [concat @ 00ad1a00] file:0 stream:0 pts:1931568 pts_time:120.723 dts:1931568 dts_time:120.723 -> pts:1931568 pts_time:120.723 dts:1931568 dts_time:120.723
    [concat @ 00ad1a00] file:0 stream:0 pts:1933856 pts_time:120.866 dts:1933856 dts_time:120.866 -> pts:1933856 pts_time:120.866 dts:1933856 dts_time:120.866


    --- comment: file 2 (bmp to jpg, converted to mp4)

    [AVIOContext @ 00adbca0] Statistics: 2574258 bytes read, 2 seeks
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00adb460] Format mov,mp4,m4a,3gp,3g2,mj2 probed with size=2048 and score=100
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00adb460] ISO: File Type Major Brand: isom
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00adb460] Before avformat_find_stream_info() pos: 94491 bytes read:35717 seeks:1
    [h264 @ 00b73c20] user data:"x264 - core 148 r2638 7599210 - H.264/MPEG-4 AVC codec - Copyleft 2003-2015 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00"
    [h264 @ 00b73c20] no picture
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00adb460] All info found
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00adb460] After avformat_find_stream_info() pos: 73972 bytes read:109641 seeks:2 frames:1
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00adb460] Auto-inserting h264_mp4toannexb bitstream filter
    [concat @ 00ad1a00] file:2 stream:0 pts:0 pts_time:0 dts:-1024 dts_time:-0.0666667 -> pts:2808269 pts_time:182.83 dts:2807245 dts_time:182.763
    [mp4 @ 00ae7780] Non-monotonous DTS in output stream 0:0; previous: 2925280, current: 2807245; changing to 2925281. This may result in incorrect timestamps in the output file.
    [concat @ 00ad1a00] file:2 stream:0 pts:2048 pts_time:0.133333 dts:-512 dts_time:-0.0333333 -> pts:2810317 pts_time:182.963 dts:2807757 dts_time:182.797
    [mp4 @ 00ae7780] Non-monotonous DTS in output stream 0:0; previous: 2925281, current: 2807757; changing to 2925282. This may result in incorrect timestamps in the output file.
    [concat @ 00ad1a00] file:2 stream:0 pts:1024 pts_time:0.0666667 dts:0 dts_time:0 -> pts:2809293 pts_time:182.897 dts:2808269 dts_time:182.83
    [mp4 @ 00ae7780] Non-monotonous DTS in output stream 0:0; previous: 2925282, current: 2808269; changing to 2925283. This may result in incorrect timestamps in the output file.


    --- comment: file 6 (jpg converted to mp4)

    [AVIOContext @ 00adbca0] Statistics: 347409 bytes read, 2 seeks
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00ae8f40] Format mov,mp4,m4a,3gp,3g2,mj2 probed with size=2048 and score=100
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00ae8f40] ISO: File Type Major Brand: isom
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00ae8f40] Before avformat_find_stream_info() pos: 164080 bytes read:36076 seeks:1
    [h264 @ 04240060] user data:"x264 - core 148 r2638 7599210 - H.264/MPEG-4 AVC codec - Copyleft 2003-2015 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00"
    [h264 @ 04240060] no picture
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00ae8f40] All info found
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00ae8f40] After avformat_find_stream_info() pos: 144712 bytes read:180740 seeks:2 frames:1
    [mov,mp4,m4a,3gp,3g2,mj2 @ 00ae8f40] Auto-inserting h264_mp4toannexb bitstream filter
    [concat @ 00ad1a00] file:6 stream:0 pts:0 pts_time:0 dts:-1024 dts_time:-0.0666667 -> pts:5495132 pts_time:357.756 dts:5494108 dts_time:357.689
    [mp4 @ 00ae7780] Non-monotonous DTS in output stream 0:0; previous: 5724096, current: 5494108; changing to 5724097. This may result in incorrect timestamps in the output file.
    [concat @ 00ad1a00] file:6 stream:0 pts:2048 pts_time:0.133333 dts:-512 dts_time:-0.0333333 -> pts:5497180 pts_time:357.889 dts:5494620 dts_time:357.723
    [mp4 @ 00ae7780] Non-monotonous DTS in output stream 0:0; previous: 5724097, current: 5494620; changing to 5724098. This may result in incorrect timestamps in the output file.
    [mp4 @ 00ae7780] Non-monotonous DTS in output stream 0:0; previous: 5724304, current: 5600604; changing to 5724305. This may result in incorrect timestamps in the output file.
    [concat @ 00ad1a00] file:6 stream:0 pts:107008 pts_time:6.96667 dts:105984 dts_time:6.9 -> pts:5602140 pts_time:364.723 dts:5601116 dts_time:364.656
    [mp4 @ 00ae7780] Non-monotonous DTS in output stream 0:0; previous: 5724305, current: 5601116; changing to 5724306. This may result in incorrect timestamps in the output file.


    --- comment: end of process after 39 streams

    No more output streams to write to, finishing.
    frame=40348 fps=4628 q=-1.0 Lsize=  166581kB time=00:38:21.91 bitrate= 592.8kbits/s    
    video:166231kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.210593%
    Input file #0 (D:\Project2\files\12297\control.txt):
     Input stream #0:0 (video): 40348 packets read (170220039 bytes);
     Total: 40348 packets (170220039 bytes) demuxed
    Output file #0 (D:\Project2\files\12297\12297resultado.mp4):
     Output stream #0:0 (video): 40348 packets muxed (170220039 bytes);
     Total: 40348 packets (170220039 bytes) muxed
    0 frames successfully decoded, 0 decoding errors
    [AVIOContext @ 00ae85e0] Statistics: 34 seeks, 42736 writeouts
    [AVIOContext @ 00af0ca0] Statistics: 3921256 bytes read, 0 seeks
    [AVIOContext @ 00ac0d80] Statistics: 2586 bytes read, 0 seeks