
Media (39)
-
Stereo master soundtrack
17 October 2011, by
Updated: October 2011
Language: English
Type: Audio
-
ED-ME-5 1-DVD
11 October 2011, by
Updated: October 2011
Language: English
Type: Audio
-
1,000,000
27 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Demon Seed
26 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
The Four of Us are Dying
26 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Corona Radiata
26 September 2011, by
Updated: September 2011
Language: English
Type: Audio
Other articles (63)
-
Improvements to the base version
13 September 2013
Nicer multiple selection
The Chosen plugin improves the usability of multiple-selection fields. See the following two images for a comparison.
To use it, enable the Chosen plugin (Configuration générale du site > Gestion des plugins), then configure it (Les squelettes > Chosen), enabling Chosen on the public site and specifying the form elements to enhance, for example select[multiple] for multiple-selection lists (...)
-
Installation in farm mode
4 February 2011, by
Farm mode makes it possible to host several MediaSPIP sites while installing their shared functional core only once.
This is the method we use on this very platform.
Running in farm mode requires some familiarity with SPIP's internals, unlike the standalone version, which requires no real specific knowledge, since SPIP's usual private area is no longer used.
As a first step, you must have installed the same files as the installation (...)
-
Automatic backup of SPIP channels
1 April 2010, by
When setting up an open platform, it is important for hosts to have reasonably regular backups available to guard against any problem that might occur.
This task relies on two SPIP plugins: Saveauto, which performs a regular backup of the database as a MySQL dump (usable in phpMyAdmin), and mes_fichiers_2, which creates a zip archive of the site's important data (documents, elements (...)
On other sites (12418)
-
DASH package (mpd) I made freezes after raising the onratechange event
7 March 2016, by Nick Birke
I am trying to create an adaptive stream from an mp4 file. I have a decent amount of experience with streaming, and have successfully implemented smooth streaming via Silverlight. We want a solution that works everywhere in HTML5, so I am pursuing DASH. Here are the steps I take...
- Resample the mp4 to different sizes using ffmpeg.
- Ensure the files are properly fragmented using Bento4's mp4fragment utility.
- Create the DASH mpd file and associated segments using Bento4's mp4dash utility.
I have done a lot of work to learn each of these utilities and to build a process whose output I think should work: reliable DASH artifacts. But it seems I am doing something wrong. Here are examples of the command lines I run.
For step one I create three different files... for three different rates and sizes... (these may not be the exact command lines, they are from old notes... but the parameter values are the ones I am using for sure)
ffmpeg -y -i C:\_SIMULATED_SHARE_C\ff\big_buck_bunny.mp4 -c:v libx264 -x264opts keyint=24:min-keyint=24:no-scenecut -b:v 1500k -maxrate 1500k -bufsize 1000k -vf "scale=-1:720" C:\_SIMULATED_SHARE_C\ff\output\outputfile720.mp4
ffmpeg -y -i C:\_SIMULATED_SHARE_C\ff\big_buck_bunny.mp4 -c:v libx264 -x264opts keyint=24:min-keyint=24:no-scenecut -b:v 800k -maxrate 800k -bufsize 500k -vf "scale=-1:540" C:\_SIMULATED_SHARE_C\ff\output\outputfile540.mp4
ffmpeg -y -i C:\_SIMULATED_SHARE_C\ff\big_buck_bunny.mp4 -c:v libx264 -x264opts keyint=24:min-keyint=24:no-scenecut -b:v 400k -maxrate 400k -bufsize 400k -vf "scale=-1:360" C:\_SIMULATED_SHARE_C\ff\output\outputfile360.mp4
That gives you the three files that need to be fragmented. I fragment them using the mp4fragment utility from Bento4. That command line is simple, so I will not offer an example.
Then I use Bento4's mp4dash to create the DASH output. Something like this...
"E:\\_INSTALL\\Bento4\\Bento4-SDK-1-4-3-607.x86-microsoft-win32-vs2010\\Bento4-SDK-1-4-3-607.x86-microsoft-win32-vs2010\\bin\\mp4dash.exe --mpd-name=\"\\\\d9fqt52\\_SIMULATED_SHARE_C\\ff\\output\\dash\\something.mpd\" -f -o \"\\\\d9fqt52\\_SIMULATED_SHARE_C\\ff\\output\\dash\" \"\\\\d9fqt52\\_SIMULATED_SHARE_C\\ff\\output\\bunny360f.mp4\" \"\\\\d9fqt52\\_SIMULATED_SHARE_C\\ff\\output\\bunny540f.mp4\" \"\\\\d9fqt52\\_SIMULATED_SHARE_C\\ff\\output\\bunny720f.mp4\" "
This runs OK, and creates the mpd file, an audio folder with segment files, and a video folder with 3 subfolders, each containing segment files.
I am able to configure IIS to serve the mpd and the segment files OK. The video plays fine... and then it stops. It freezes between 50 and 60 seconds in. Right before it freezes I can confirm that the onratechange event of the HTML5 video tag is raised. This event fires when the playback rate changes. I do nothing to change the rate; it just happens.
The actual video in question is about 1hr long. Also I am using the dash.js library as a player.
If you can identify any components or specifics I should change, please let me know!
UPDATE ON MAR 7, 2016
As requested, here is the MPD content...
<?xml version="1.0" ?>
<mpd mediapresentationduration="PT51M51.909S" minbuffertime="PT0.80S" profiles="urn:mpeg:dash:profile:isoff-live:2011" type="static" xmlns="urn:mpeg:dash:schema:mpd:2011">
<period>
<adaptationset lang="en" mimetype="audio/mp4" segmentalignment="true" startwithsap="1">
<segmenttemplate duration="801" initialization="$RepresentationID$/init.mp4" media="$RepresentationID$/seg-$Number$.m4f" startnumber="1" timescale="1000"></segmenttemplate>
<representation audiosamplingrate="48000" bandwidth="132491" codecs="mp4a.40.2">
<audiochannelconfiguration schemeiduri="urn:mpeg:dash:23003:3:audio_channel_configuration:2011" value="2"></audiochannelconfiguration>
</representation>
</adaptationset>
<adaptationset maxheight="720" maxwidth="1280" mimetype="video/mp4" minheight="540" minwidth="640" segmentalignment="true" startwithsap="1">
<segmenttemplate duration="801" initialization="$RepresentationID$/init.mp4" media="$RepresentationID$/seg-$Number$.m4f" startnumber="1" timescale="1000"></segmenttemplate>
<representation bandwidth="785536" codecs="avc1.64001F" framerate="30000/1001" height="540" scantype="progressive" width="960"></representation>
<representation bandwidth="403537" codecs="avc1.64001E" framerate="30000/1001" height="360" scantype="progressive" width="640"></representation>
<representation bandwidth="1496514" codecs="avc1.64001F" framerate="30000/1001" height="720" scantype="progressive" width="1280"></representation>
</adaptationset>
</period>
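As a quick sanity check (my arithmetic, not output from any tool): the 801 ms SegmentTemplate duration lines up with the GOP length implied by the x264 options in the ffmpeg commands above, and gives the expected segment count for the 51 m 51.909 s presentation:
#include <cstdio>

int main() {
    // Values taken from the MPD above and the ffmpeg commands earlier.
    double fps = 30000.0 / 1001.0;           // frameRate attribute (~29.97 fps)
    double gopSeconds = 24.0 / fps;          // keyint=24 => one IDR every 0.8008 s
    double segSeconds = 801.0 / 1000.0;      // SegmentTemplate duration / timescale
    double totalSeconds = 51 * 60 + 51.909;  // mediaPresentationDuration PT51M51.909S

    std::printf("GOP %.4f s vs segment %.3f s\n", gopSeconds, segSeconds);
    std::printf("expected segments: %.1f\n", totalSeconds / segSeconds); // ~3885
    return 0;
}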
UPDATE ON MAR 7, 2016
Here is the console output for two different videos that just stop, for no obvious reason, between 50 and 60 s. (In both logs a buffer stall, "Waiting for more buffer before starting playback", is recorded just before the ratechange to 0.)
///ABS VIDEO
[58393][rulesController] [RULES]: playbackTimeRule [Fragment:download,82,MediaSegment,0video] Default
dash.all.js:11 [58395][rulesController] [RULES]: sameTimeRequestRule [Fragment:download,82,MediaSegment,0,video] Default
dash.all.js:11 [58401][indexHandler][audio] Getting the request for time: 82.082
dash.all.js:11 [58403][indexHandler][audio] Index for time 82.082 is 81
dash.all.js:11 [58405][indexHandler][audio] SegmentTemplate: 81.08099999999999 / 1361.235
dash.all.js:11 [58406][indexHandler][audio] SegmentTemplate: 82.082 / 1361.235
dash.all.js:11 [58407][rulesController] [RULES]: playbackTimeRule [Fragment:download,82,MediaSegment,0audio] Default
dash.all.js:11 [58417][fragmentLoader] loaded video:MediaSegment:82.082 (200, 18ms, 2ms)
dash.all.js:11 [58419][rulesController] [RULES]: sameTimeRequestRule [Fragment:download,82,MediaSegment,0,audio] Default
dash.all.js:11 [58435][fragmentLoader] loaded audio:MediaSegment:82.082 (200, 14ms, 0ms)
dash.all.js:11 [59390][indexHandler][video] Getting the request for time: 83.083
dash.all.js:11 [59392][indexHandler][video] Index for time 83.083 is 82
dash.all.js:11 [59393][indexHandler][video] SegmentTemplate: 82.082 / 1361.235
dash.all.js:11 [59394][indexHandler][video] SegmentTemplate: 83.08299999999998 / 1361.235
dash.all.js:11 [59395][rulesController] [RULES]: playbackTimeRule [Fragment:download,83,MediaSegment,0video] Default
dash.all.js:11 [59396][rulesController] [RULES]: sameTimeRequestRule [Fragment:download,83,MediaSegment,0,video] Default
dash.all.js:11 [59401][indexHandler][audio] Getting the request for time: 83.083
dash.all.js:11 [59403][indexHandler][audio] Index for time 83.083 is 82
dash.all.js:11 [59404][indexHandler][audio] SegmentTemplate: 82.082 / 1361.235
dash.all.js:11 [59406][indexHandler][audio] SegmentTemplate: 83.08299999999998 / 1361.235
dash.all.js:11 [59408][rulesController] [RULES]: playbackTimeRule [Fragment:download,83,MediaSegment,0audio] Default
dash.all.js:11 [59416][fragmentLoader] loaded video:MediaSegment:83.08299999999998 (200, 17ms, 1ms)
dash.all.js:11 [59418][rulesController] [RULES]: sameTimeRequestRule [Fragment:download,83,MediaSegment,0,audio] Default
dash.all.js:11 [59437][fragmentLoader] loaded audio:MediaSegment:83.08299999999998 (200, 14ms, 1ms)
dash.all.js:11 [59902][scheduleController][video] Stalling Buffer
dash.all.js:11 [59903][bufferController][video] Waiting for more buffer before starting playback.
dash.all.js:11 [59906][scheduleController][audio] Stalling Buffer
dash.all.js:11 [59907][bufferController][audio] Waiting for more buffer before starting playback.
dash.all.js:11 [63684][playbackController] <video> ratechange: 0
///HOG VIDEO
[58250][rulesController] [RULES]: playbackTimeRule [Fragment:download,105,MediaSegment,0audio] Default
dash.all.js:11 [58266][fragmentLoader] loaded video:MediaSegment:84.105 (200, 28ms, 1ms)
dash.all.js:11 [58269][rulesController] [RULES]: sameTimeRequestRule [Fragment:download,105,MediaSegment,0,audio] Default
dash.all.js:11 [58287][fragmentLoader] loaded audio:MediaSegment:84.105 (200, 14ms, 1ms)
dash.all.js:11 [58982][indexHandler][video] Getting the request for time: 84.906
dash.all.js:11 [58983][indexHandler][video] Index for time 84.906 is 105
dash.all.js:11 [58984][indexHandler][video] SegmentTemplate: 84.105 / 3111.909
dash.all.js:11 [58984][indexHandler][video] SegmentTemplate: 84.906 / 3111.909
dash.all.js:11 [58985][rulesController] [RULES]: playbackTimeRule [Fragment:download,106,MediaSegment,0video] Default
dash.all.js:11 [58986][rulesController] [RULES]: sameTimeRequestRule [Fragment:download,106,MediaSegment,0,video] Default
dash.all.js:11 [58989][indexHandler][audio] Getting the request for time: 84.906
dash.all.js:11 [58990][indexHandler][audio] Index for time 84.906 is 105
dash.all.js:11 [58991][indexHandler][audio] SegmentTemplate: 84.105 / 3111.909
dash.all.js:11 [58992][indexHandler][audio] SegmentTemplate: 84.906 / 3111.909
dash.all.js:11 [58993][rulesController] [RULES]: playbackTimeRule [Fragment:download,106,MediaSegment,0audio] Default
dash.all.js:11 [59002][fragmentLoader] loaded video:MediaSegment:84.906 (200, 14ms, 1ms)
dash.all.js:11 [59003][rulesController] [RULES]: sameTimeRequestRule [Fragment:download,106,MediaSegment,0,audio] Default
dash.all.js:11 [59014][fragmentLoader] loaded audio:MediaSegment:84.906 (200, 8ms, 1ms)
dash.all.js:11 [59735][scheduleController][video] Stalling Buffer
dash.all.js:11 [59736][bufferController][video] Waiting for more buffer before starting playback.
dash.all.js:11 [59738][scheduleController][audio] Stalling Buffer
dash.all.js:11 [59739][bufferController][audio] Waiting for more buffer before starting playback.
dash.all.js:11 [61695][playbackController] <video> ratechange: 0
-
Android recording video with overlay view
6 March 2016, by t0m
I am working on an Android app that captures video with overlay views. (I also need the onPreviewFrame method.)
Via SurfaceView and JavaCV with FFmpeg:
OpenCVCameraActivity.java:
import android.app.Activity;
import android.content.Context;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.util.Log;
import android.view.KeyEvent;
import android.view.Menu;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.widget.FrameLayout;
import android.widget.Toast;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;
import java.io.File;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.ShortBuffer;
import java.util.List;
@SuppressWarnings("ALL")
public class OpenCVCameraActivity extends Activity {
private static final String TAG = OpenCVCameraActivity.class.getSimpleName();
private long startTime = 0;
private boolean isPreviewOn = false;
private int sampleAudioRateInHz = 44100;
private Camera.Size previewSize; //preview and Camera and Recorder width and height
private int recorderFrameRate = 25;
// audio data getting thread
private AudioRecord audioRecord;
private AudioRecordRunnable audioRecordRunnable;
private Thread audioThread;
private volatile boolean runAudioThread = true;
// video data getting thread
private Camera mCamera;
private CameraView mPreview;
private FFmpegFrameRecorder recorder;
private boolean recording = false;
private Frame yuvImage = null;
//storage
private Storage storage;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
if(Static.DEBUG) Log.i(TAG, "onCreate()");
Thread.setDefaultUncaughtExceptionHandler(uncaughtExceptionHandler);
setContentView(R.layout.activity_opencv);
prepareCamera();
}
private Thread.UncaughtExceptionHandler uncaughtExceptionHandler =
new Thread.UncaughtExceptionHandler() {
public void uncaughtException(Thread thread, Throwable ex) {
if(Static.DEBUG) Log.e(TAG, "Uncaught exception", ex);
}
};
@Override
protected void onRestart() {
super.onRestart();
if (Static.DEBUG) Log.i(TAG, "onRestart()");
}
@Override
protected void onStart() {
super.onStart();
if (Static.DEBUG) Log.i(TAG, "onStart()");
}
@Override
protected void onResume() {
super.onResume();
if (Static.DEBUG) Log.i(TAG, "onResume()");
storage = new Storage(this);
if(storage.mExternalStorageAvailable == true && storage.mExternalStorageWriteable == false)
Static.showToast(this, getString(R.string.errExternalStorageReadOnly), Toast.LENGTH_LONG);
else if (storage.mExternalStorageAvailable == false && storage.mExternalStorageWriteable == false)
Static.showToast(this, getString(R.string.errExternalStorage), Toast.LENGTH_LONG);
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
if (Static.DEBUG) Log.i(TAG, "onCreateOptionsMenu()");
return super.onCreateOptionsMenu(menu);
}
@Override
protected void onPause() {
super.onPause();
if (Static.DEBUG) Log.i(TAG, "onPause()");
}
@Override
protected void onStop() {
super.onStop();
if (Static.DEBUG) Log.i(TAG, "onStop()");
}
@Override
protected void onDestroy() {
super.onDestroy();
if (Static.DEBUG) Log.i(TAG, "onDestroy()");
recording = false;
if (mPreview != null) {
mPreview.stopPreview();
}
if (mCamera != null) {
mCamera.stopPreview();
mCamera.release();
mCamera = null;
}
}
/** Prepare camera object.
* */
private void prepareCamera() {
//1. Open camera object
try {
mCamera = getCameraInstance(this);
} catch (Exception e) {
e.printStackTrace();
Static.showToast(this, e.getMessage(), Toast.LENGTH_LONG);
finish();
return;
}
setOptimalCameraParams();
//2. Connect preview + 3. Start preview (+ step 8: stop preview in the surfaceChanged method)
mPreview = new CameraView(this, mCamera);
FrameLayout preview = (FrameLayout) findViewById(R.id.cameraPreview);
preview.addView(mPreview); //surfaceView to FrameLayout
if(Static.DEBUG) Log.i(TAG, "camera preview start: OK");
}
/**
* A safe way to get an instance of the Camera object.
*/
@SuppressWarnings("deprecation")
public static Camera getCameraInstance(Context ctx) throws Exception {
Camera c = Camera.open();
if (c == null)
throw new Exception(ctx.getString(R.string.errCameraNotAvailable));
if(Static.DEBUG) Log.i(TAG, "camera open");
return c; // attempt to get a Camera instance, otherwise null
}
/** Set optimal parameters, depending on HW capabilities. */
@SuppressWarnings("deprecation")
private void setOptimalCameraParams(){
// Camera parameters
Camera.Parameters params = mCamera.getParameters();
List<String> focusModes = params.getSupportedFocusModes();
if (Static.DEBUG) Log.i(TAG, "focusModes():" + focusModes.toString());
if (Static.DEBUG) Log.i(TAG, "Camera parameters:\n" + params.flatten());
params.setRecordingHint(true); //MediaRecorder.start() to start faster
//Automatically autofocus if it's possible
if (params.getSupportedFocusModes().contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO)) {
params.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_VIDEO);
} else if (focusModes.contains(Camera.Parameters.FOCUS_MODE_AUTO)) { //at least focus auto
params.setFocusMode(Camera.Parameters.FOCUS_MODE_AUTO);
}
// set Camera parameters
mCamera.setParameters(params);
}
//---------------------------------------
// initialize ffmpeg_recorder
//---------------------------------------
private void initRecorder() throws Exception {
if(Static.DEBUG) Log.i(TAG,"init recorder");
File output = null;
try {
output = storage.getOutputMediaFile(storage.MEDIA_TYPE_VIDEO);
if(output == null)
throw new Exception();
} catch (Exception e) {
e.printStackTrace();
throw new Exception(getString(R.string.errSetOutputFile));
}
if (yuvImage == null) {
yuvImage = new Frame(previewSize.width, previewSize.height, Frame.DEPTH_UBYTE, 2);
if(Static.DEBUG) Log.i(TAG, "create yuvImage");
}
if(Static.DEBUG) Log.i(TAG, "ffmpeg_url: " + output.getPath());
recorder = new FFmpegFrameRecorder(output.getPath(), previewSize.width, previewSize.height, 1);
//recorder.setVideoCodec(avcodec.AV_CODEC_ID_H264);
//recorder.setVideoOption("preset", "veryfast"); // or ultrafast or fast, etc.
//recorder.setVideoQuality(0); // maximum quality, replace recorder.setVideoBitrate(16384);
//recorder.setPixelFormat(avutil.AV_PIX_FMT_YUV420P);
recorder.setFormat("mp4");
recorder.setSampleRate(sampleAudioRateInHz);
recorder.setFrameRate(recorderFrameRate);
if(Static.DEBUG) Log.i(TAG, "recorder initialize success");
audioRecordRunnable = new AudioRecordRunnable();
audioThread = new Thread(audioRecordRunnable);
runAudioThread = true;
}
public void startRecording() {
try {
initRecorder();
} catch (Exception e){
e.printStackTrace();
Static.showToast(this, e.getMessage(), Toast.LENGTH_LONG);
}
try {
recorder.start();
startTime = System.currentTimeMillis();
recording = true;
audioThread.start();
} catch (FFmpegFrameRecorder.Exception e) {
e.printStackTrace();
}
}
public void stopRecording() {
runAudioThread = false;
try {
audioThread.join();
} catch (InterruptedException e) {
// reset interrupt to be nice
Thread.currentThread().interrupt();
return;
}
audioRecordRunnable = null;
audioThread = null;
if (recorder != null && recording) {
recording = false;
if(Static.DEBUG) Log.i(TAG,"Finishing recording, calling stop and release on recorder");
try {
recorder.stop();
recorder.release();
} catch (FFmpegFrameRecorder.Exception e) {
e.printStackTrace();
}
recorder = null;
}
}
@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {
if (keyCode == KeyEvent.KEYCODE_BACK) {
if (recording) {
stopRecording();
}
finish();
return true;
}
return super.onKeyDown(keyCode, event);
}
public void onClickBtnStartRecord(View v) {
if (!recording) {
startRecording();
if(Static.DEBUG) Log.i(TAG, "Start Button Pushed");
} else {
// This will trigger the audio recording loop to stop and then set isRecorderStart = false;
stopRecording();
if(Static.DEBUG) Log.i(TAG, "Stop Button Pushed");
}
}
//---------------------------------------------
// audio thread, gets and encodes audio data
//---------------------------------------------
class AudioRecordRunnable implements Runnable {
@Override
public void run() {
android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
// Audio
int bufferSize;
ShortBuffer audioData;
int bufferReadResult;
bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
audioData = ShortBuffer.allocate(bufferSize);
if(Static.DEBUG) Log.i(TAG, "audioRecord.startRecording()");
audioRecord.startRecording();
/* ffmpeg_audio encoding loop */
while (runAudioThread) {
//if(Static.DEBUG) Log.i(TAG,"recording? " + recording);
bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
audioData.limit(bufferReadResult);
if (bufferReadResult > 0) {
//if(Static.DEBUG) Log.i(TAG,"bufferReadResult: " + bufferReadResult);
// If "recording" isn't true when start this thread, it never get's set according to this if statement...!!!
// Why? Good question...
if (recording) {
try {
recorder.recordSamples(audioData);
//if(Static.DEBUG) Log.i(TAG,"recording " + 1024*i + " to " + 1024*i+1024);
} catch (FFmpegFrameRecorder.Exception e) {
if(Static.DEBUG) Log.i(TAG,e.getMessage());
e.printStackTrace();
}
}
}
}
if(Static.DEBUG) Log.i(TAG,"AudioThread Finished, release audioRecord");
/* encoding finish, release recorder */
if (audioRecord != null) {
audioRecord.stop();
audioRecord.release();
audioRecord = null;
if(Static.DEBUG) Log.i(TAG,"audioRecord released");
}
}
}
/** TODO: pick the largest supported preview size that fits within the given width and height. */
private Camera.Size getBestPreviewSize(int width, int height, Camera.Parameters parameters) {
Camera.Size result=null;
for (Camera.Size size : parameters.getSupportedPreviewSizes()) {
if(Static.DEBUG) Log.i(TAG, size.width + "/" + size.height);
if (size.width<=width && size.height<=height) {
if (result==null) {
result=size;
} else {
int resultArea=result.width*result.height;
int newArea=size.width*size.height;
if (newArea>resultArea) {
result=size;
}
}
}
}
return(result);
}
//---------------------------------------------
// camera thread, gets and encodes video data
//---------------------------------------------
private class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {
private final String TAG = CameraView.class.getSimpleName();
private SurfaceHolder mHolder;
private Camera mCamera;
public CameraView(Context context, Camera camera) {
super(context);
if(Static.DEBUG) Log.i(TAG, "camera view");
mCamera = camera;
// Install a SurfaceHolder.Callback so we get notified when the
// underlying surface is created and destroyed.
mHolder = getHolder();
mHolder.addCallback(CameraView.this);
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
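            // Note: setType()/SURFACE_TYPE_PUSH_BUFFERS is deprecated and ignored since API 11;
            // it only matters on very old devices.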
mCamera.setPreviewCallback(CameraView.this);
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
try {
stopPreview();
mCamera.setPreviewDisplay(holder);
} catch (IOException exception) {
mCamera.release();
mCamera = null;
}
}
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
if(Static.DEBUG) Log.i(TAG, "surfaceChanged() => w=" + w + ", h=" + h);
// If your preview can change or rotate, take care of those events here.
// Make sure to stop the preview before resizing or reformatting it.
if (mHolder.getSurface() == null){
// preview surface does not exist
return;
}
// stop preview before making changes
try {
stopPreview();
} catch (Exception e){
// ignore: tried to stop a non-existent preview
}
// start preview with new settings
try {
Camera.Parameters params = mCamera.getParameters();
previewSize = getBestPreviewSize(w, h, params);
if(Static.DEBUG) Log.i(TAG, "getBestPreviewSize() => w=" + previewSize.width + ", h=" + previewSize.height);
if (previewSize != null)
params.setPreviewSize(previewSize.width, previewSize.height);
params.setPreviewFrameRate(recorderFrameRate);
if(Static.DEBUG) Log.i(TAG,"Preview Framerate: " + params.getPreviewFrameRate());
mCamera.setParameters(params);
mCamera.setPreviewDisplay(holder);
mCamera.setPreviewCallback(CameraView.this);
startPreview();
} catch (Exception e){
if(Static.DEBUG) Log.i(TAG, "Could not set preview display in surfaceChanged");
e.printStackTrace();
}
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
try {
mHolder.addCallback(null);
mCamera.setPreviewCallback(null);
} catch (RuntimeException e) {
// The camera has probably just been released, ignore.
}
}
public void startPreview() {
if (!isPreviewOn && mCamera != null) {
isPreviewOn = true;
mCamera.startPreview();
}
}
public void stopPreview() {
if (isPreviewOn && mCamera != null) {
isPreviewOn = false;
mCamera.stopPreview();
}
}
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
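            // NOTE: `data` is the raw camera preview frame (NV21 by default). Overlay
            // Views are composited by the UI and never appear in these bytes, so
            // recording them alone cannot capture the overlays.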
if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
startTime = System.currentTimeMillis();
return;
}
// get video data
if (yuvImage != null && recording) {
((ByteBuffer)yuvImage.image[0].position(0)).put(data);
try {
long t = 1000 * (System.currentTimeMillis() - startTime);
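                    // JavaCV's FFmpegFrameRecorder expects timestamps in microseconds,
                    // hence elapsed milliseconds * 1000.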
if(Static.DEBUG) Log.i(TAG,"Writing Frame on timestamp: "+t);
if (t > recorder.getTimestamp()) {
recorder.setTimestamp(t);
}
recorder.record(yuvImage);
} catch (FFmpegFrameRecorder.Exception e) {
if(Static.DEBUG) Log.i(TAG,e.getMessage());
e.printStackTrace();
}
}
}
}
}
activity_opencv.xml:
<?xml version="1.0" encoding="utf-8"?>
<relativelayout>
<framelayout>
</framelayout>
<ImageButton<br />
android:id="@+id/btnStartRecord"<br />
android:layout_width="70dp"<br />
android:layout_height="70dp"<br />
android:scaleType="fitXY"<br />
android:src="@drawable/record_icon"<br />
android:background="@null"<br />
android:text="@string/btnStartRecord"<br />
android:onClick="onClickBtnStartRecord"<br />
android:clickable="true"<br />
android:layout_centerVertical="true"<br />
android:layout_alignParentRight="true"<br />
android:layout_alignParentEnd="true"/>
<textview></textview>
</relativelayout>Overlay views working, but recorded video is without overlay views.
-
H264 Encoding - Could not play video using VLC Player
31 March 2016, by bot1131357
I am having trouble encoding an H264 video correctly using FFmpeg libav. I could not play the encoded video in VLC media player, and although I could play it in MPC-HC, the time shows 00:00/00:00. Clearly I'm missing something.
The media info from MPC-HC shows this:
General
Format : AVC
Format/Info : Advanced Video Codec
File size : 110 KiB
Duration : 2s 400ms
Overall bit rate : 375 Kbps
Writing library : x264 core 148 r2665 a01e339
Encoding settings : cabac=0 / ref=3 / deblock=1:0:0 / analyse=0x1:0x111 / me=hex / subme=7 / psy=1 / psy_rd=1.00:0.00 / mixed_ref=1 / me_range=16 / chroma_me=1 / trellis=1 / 8x8dct=0 / cqm=0 / deadzone=21,11 / fast_pskip=1 / chroma_qp_offset=-2 / threads=7 / lookahead_threads=1 / sliced_threads=0 / nr=0 / decimate=1 / interlaced=0 / bluray_compat=0 / constrained_intra=0 / bframes=0 / weightp=0 / keyint=12 / keyint_min=1 / scenecut=40 / intra_refresh=0 / rc_lookahead=12 / rc=abr / mbtree=1 / bitrate=2000 / ratetol=1.0 / qcomp=0.60 / qpmin=0 / qpmax=69 / qpstep=4 / ip_ratio=1.40 / aq=1:1.00
Video
Format : AVC
Format/Info : Advanced Video Codec
Format profile : Baseline@L2.1
Format settings, CABAC : No
Format settings, ReFrames : 3 frames
Format settings, GOP : M=1, N=12
Duration : 2s 400ms
Bit rate : 2 000 Kbps
Width : 320 pixels
Height : 240 pixels
Display aspect ratio : 4:3
Frame rate mode : Variable
Frame rate : 20.833 fps
Color space : YUV
Chroma subsampling : 4:2:0
Bit depth : 8 bits
Scan type : Progressive
Bits/(Pixel*Frame) : 1.250
Stream size : 586 KiB
Writing library : x264 core 148 r2665 a01e339
Encoding settings : cabac=0 / ref=3 / deblock=1:0:0 / analyse=0x1:0x111 / me=hex / subme=7 / psy=1 / psy_rd=1.00:0.00 / mixed_ref=1 / me_range=16 / chroma_me=1 / trellis=1 / 8x8dct=0 / cqm=0 / deadzone=21,11 / fast_pskip=1 / chroma_qp_offset=-2 / threads=7 / lookahead_threads=1 / sliced_threads=0 / nr=0 / decimate=1 / interlaced=0 / bluray_compat=0 / constrained_intra=0 / bframes=0 / weightp=0 / keyint=12 / keyint_min=1 / scenecut=40 / intra_refresh=0 / rc_lookahead=12 / rc=abr / mbtree=1 / bitrate=2000 / ratetol=1.0 / qcomp=0.60 / qpmin=0 / qpmax=69 / qpstep=4 / ip_ratio=1.40 / aq=1:1.00
I noticed something odd in the above info:
The frame rate is 20.833 fps, instead of the specified 10 fps.
The duration of 2s 400ms did not seem right either, since the video played for more than 4 s. (The two oddities are at least consistent with each other: 50 frames / 2.4 s = 20.833 fps, so both presumably stem from the same timestamp problem.)
Also, (AVFrame* picture)->pict_type is always set to AV_PICTURE_TYPE_NONE. I don't think this is normal.
The library I'm using is ffmpeg-20160219-git-98a0053-win32-dev. I would really appreciate it if you could help me out of this confusion.
/*
 * Video encoding example
 */
// Headers restored for completeness (the original snippet omitted them):
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libavutil/imgutils.h>
}
#include <algorithm>
#include <iostream>
using std::cerr; using std::cin; using std::endl; using std::min;

char filename[] = "test.mp4";
int main(int argc, char** argv)
{
AVCodec *codec = NULL;
AVCodecContext *codecCtx= NULL;
AVFormatContext *pFormatCtx = NULL;
AVStream * pVideoStream = NULL;
AVFrame *picture = NULL;
int i, x, y, //
ret, // Return value
got_packet_ptr; // Data encoded into packet
printf("Video encoding\n");
// Register all formats and codecs
av_register_all();
// allocate context
pFormatCtx = avformat_alloc_context();
memcpy(pFormatCtx->filename,filename,
min(strlen(filename), sizeof(pFormatCtx->filename)));
// guess format
pFormatCtx->oformat = av_guess_format("h264", NULL, NULL);
if (NULL==pFormatCtx->oformat)
{
cerr << "Could not guess output format" << endl;
return -1;
}
// Find the codec.
codec = avcodec_find_encoder(pFormatCtx->oformat->video_codec);
if (codec == NULL) {
fprintf(stderr, "Codec not found\n");
return -1;
}
// Set context
int framerate = 10;
codecCtx = avcodec_alloc_context3(codec);
avcodec_get_context_defaults3(codecCtx, codec);
codecCtx->pix_fmt = AV_PIX_FMT_YUV420P;
codecCtx->profile = FF_PROFILE_H264_BASELINE;
// Resolution must be a multiple of two.
codecCtx->width = 320;
codecCtx->height = 240;
codecCtx->bit_rate = 2000000;
codecCtx->time_base.den = framerate;
codecCtx->time_base.num = 1;
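    // One tick of this time_base is 1/10 s, so the encoder expects frame pts
    // to count frames: pts = 0, 1, 2, ... for 10 fps.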
codecCtx->gop_size = 12; // emit one intra frame every twelve frames at most
// Open the codec.
if (avcodec_open2(codecCtx, codec, NULL) < 0)
{
printf("Cannot open video codec\n");
return -1;
}
// Add stream to pFormatCtx
pVideoStream = avformat_new_stream(pFormatCtx, codec);
if (!pVideoStream)
{
printf("Cannot add new video stream\n");
return -1;
}
pVideoStream->codec = codecCtx;
pVideoStream->time_base.den = framerate;
pVideoStream->time_base.num = 1;
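    // Note: avformat_write_header() may overwrite pVideoStream->time_base;
    // packet timestamps passed to av_interleaved_write_frame() must use the
    // value it ends up with (see av_packet_rescale_ts()).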
if (avio_open2(&pFormatCtx->pb, filename, AVIO_FLAG_WRITE, NULL, NULL) < 0)
{
printf("Cannot open file\n");
return -1;
}
// Write file header.
avformat_write_header(pFormatCtx, NULL);
// Create frame
picture= av_frame_alloc();
picture->format = codecCtx->pix_fmt;
picture->width = codecCtx->width;
picture->height = codecCtx->height;
int bufferImgSize = av_image_get_buffer_size(codecCtx->pix_fmt, codecCtx->width,
codecCtx->height,1);
av_image_alloc(picture->data, picture->linesize, codecCtx->width, codecCtx->height, codecCtx->pix_fmt, 32);
AVPacket avpkt;
    /* encode 5 seconds of video: 50 frames at 10 fps */
for(i=0;i<50;i++)
{
/* prepare a dummy image */
/* Y */
        for(y=0;y<codecCtx->height;y++)
{
            for(x=0;x<codecCtx->width;x++)
{
picture->data[0][y * picture->linesize[0] + x] = x + y + i * 3;
}
}
/* Cb and Cr */
        for(y=0;y<codecCtx->height/2;y++)
{
            for(x=0;x<codecCtx->width/2;x++)
{
picture->data[1][y * picture->linesize[1] + x] = 128 + y + i * 2;
picture->data[2][y * picture->linesize[2] + x] = 64 + x + i * 5;
}
}
// Get timestamp
picture->pts = (float) i * (1000.0/(float)(codecCtx->time_base.den)) * 90;
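        // NOTE: this expression mixes units: 1000/time_base.den is a millisecond
        // step and *90 assumes a 90 kHz clock, but avcodec_encode_video2() expects
        // pts in codecCtx->time_base ticks (1/10 s here), i.e. simply pts = i.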
// Encode frame to packet
av_init_packet(&avpkt);
got_packet_ptr = 0;
int error = avcodec_encode_video2(codecCtx, &avpkt, picture, &got_packet_ptr);
if (!error && got_packet_ptr > 0)
{
// Write packet with frame.
ret = (av_interleaved_write_frame(pFormatCtx, &avpkt) == 0);
}
av_packet_unref(&avpkt);
}
// Flush remaining encoded data
while(1)
{
av_init_packet(&avpkt);
got_packet_ptr = 0;
// Encode frame to packet.
int error = avcodec_encode_video2(codecCtx, &avpkt, NULL, &got_packet_ptr);
if (!error && got_packet_ptr > 0)
{
// Write packet with frame.
ret = (av_interleaved_write_frame(pFormatCtx, &avpkt) == 0);
}
else
{
break;
}
av_packet_unref(&avpkt);
}
av_write_trailer(pFormatCtx);
av_packet_unref(&avpkt);
av_frame_free(&picture);
avcodec_close(codecCtx);
av_free(codecCtx);
cin.get();
}
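For comparison, here is a minimal sketch (mine, unverified against this exact build) of the container and timestamp handling when the target really is an MP4 file, written against the same 2016-era API as the code above. The two relevant differences are that the muxer is derived from the file name and that pts counts codec time_base ticks, rescaled per packet to the stream time base:
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libavutil/imgutils.h>
}

// Sketch only: error checks and the pixel-fill loops are elided for brevity.
int encodeWithMp4Container()
{
    const char* fname = "test.mp4";
    const int framerate = 10;

    av_register_all();

    AVFormatContext* fmt = NULL;
    // Guess the muxer from the file name: ".mp4" selects the real MP4 muxer,
    // whereas av_guess_format("h264", ...) selects the raw Annex B muxer.
    avformat_alloc_output_context2(&fmt, NULL, NULL, fname);
    if (!fmt) return -1;

    AVCodec* codec = avcodec_find_encoder(AV_CODEC_ID_H264);
    AVStream* st = avformat_new_stream(fmt, codec);
    AVCodecContext* c = st->codec;            // pre-3.x API, matching the question
    c->width = 320;
    c->height = 240;
    c->pix_fmt = AV_PIX_FMT_YUV420P;
    c->bit_rate = 2000000;
    c->time_base.num = 1;                     // encoder clock: one tick = 1/10 s
    c->time_base.den = framerate;
    c->gop_size = 12;
    if (fmt->oformat->flags & AVFMT_GLOBALHEADER)
        c->flags |= AV_CODEC_FLAG_GLOBAL_HEADER; // MP4 carries headers as extradata
    if (avcodec_open2(c, codec, NULL) < 0) return -1;

    st->time_base = c->time_base;             // a hint; the muxer may change it
    if (avio_open2(&fmt->pb, fname, AVIO_FLAG_WRITE, NULL, NULL) < 0) return -1;
    avformat_write_header(fmt, NULL);

    AVFrame* frame = av_frame_alloc();
    frame->format = c->pix_fmt;
    frame->width  = c->width;
    frame->height = c->height;
    av_image_alloc(frame->data, frame->linesize, c->width, c->height, c->pix_fmt, 32);

    for (int i = 0; i < 50; i++) {
        /* ... fill frame->data[0..2] as in the question ... */
        frame->pts = i;                       // frame i starts at i/10 s, in time_base ticks
        AVPacket pkt;
        av_init_packet(&pkt);
        pkt.data = NULL;
        pkt.size = 0;
        int got = 0;
        if (avcodec_encode_video2(c, &pkt, frame, &got) == 0 && got) {
            // Rescale from encoder ticks (1/10 s) to the muxer's stream time base.
            av_packet_rescale_ts(&pkt, c->time_base, st->time_base);
            pkt.stream_index = st->index;
            av_interleaved_write_frame(fmt, &pkt);
            av_packet_unref(&pkt);
        }
    }
    /* ... flush the encoder with avcodec_encode_video2(c, &pkt, NULL, &got),
       rescaling and writing each packet the same way ... */
    av_write_trailer(fmt);
    return 0;
}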