
Media (1)
-
Sintel MP4 Surround 5.1 Full
13 May 2011
Updated: February 2012
Language: English
Type: Video
Other articles (112)
-
Customizing by adding your logo, banner or background image
5 September 2013: Some themes support three customization elements: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013: Present the changes to your MediaSPIP, or news about your projects, in the news section.
In MediaSPIP's default spipeo theme, news items are displayed at the bottom of the main page, below the editorials.
You can customize the news-item creation form.
News-item creation form: for a document of type "news item", the default fields are: publication date (customize the publication date) (...)
-
Publishing on MediaSPIP
13 June 2013: Can I post content from an iPad tablet?
Yes, if your installed MediaSPIP is at version 0.2 or later. If needed, ask your MediaSPIP administrator.
On other sites (6617)
-
Video Recording on Android from Frames using javaCV
23 June 2016, by jawad bin zafar: I am using the javaCV library to record video on Android. They provide a sample video-recording Activity, but there is a bug I could not figure out; I don't know what I am doing wrong or what is missing.
package org.bytedeco.javacv.recordactivity;
import android.app.Activity;
import android.content.Context;
import android.content.pm.ActivityInfo;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.os.PowerManager;
import android.util.Log;
import android.view.Display;
import android.view.KeyEvent;
import android.view.LayoutInflater;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.View.OnClickListener;
import android.view.WindowManager;
import android.widget.Button;
import android.widget.LinearLayout;
import android.widget.RelativeLayout;
import java.io.IOException;
import java.nio.ShortBuffer;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import static org.bytedeco.javacpp.opencv_core.*;
public class RecordActivity extends Activity implements OnClickListener {
private final static String CLASS_LABEL = "RecordActivity";
private final static String LOG_TAG = CLASS_LABEL;
private PowerManager.WakeLock mWakeLock;
private String ffmpeg_link = "/mnt/sdcard/stream.flv";
long startTime = 0;
boolean recording = false;
private volatile FFmpegFrameRecorder recorder;
private boolean isPreviewOn = false;
private int sampleAudioRateInHz = 44100;
private int imageWidth = 320;
private int imageHeight = 240;
private int frameRate = 30;
/* audio data getting thread */
private AudioRecord audioRecord;
private AudioRecordRunnable audioRecordRunnable;
private Thread audioThread;
volatile boolean runAudioThread = true;
/* video data getting thread */
private Camera cameraDevice;
private CameraView cameraView;
private IplImage yuvIplimage = null;
/* layout setting */
private final int bg_screen_bx = 232;
private final int bg_screen_by = 128;
private final int bg_screen_width = 700;
private final int bg_screen_height = 500;
private final int bg_width = 1123;
private final int bg_height = 715;
private final int live_width = 640;
private final int live_height = 480;
private int screenWidth, screenHeight;
private Button btnRecorderControl;
/** The number of seconds in the continuous record loop (or 0 to disable loop). */
final int RECORD_LENGTH = 10;
IplImage[] images;
long[] timestamps;
ShortBuffer[] samples;
int imagesIndex, samplesIndex;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
setContentView(R.layout.main);
PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, CLASS_LABEL);
mWakeLock.acquire();
initLayout();
}
@Override
protected void onResume() {
super.onResume();
if (mWakeLock == null) {
PowerManager pm = (PowerManager) getSystemService(Context.POWER_SERVICE);
mWakeLock = pm.newWakeLock(PowerManager.SCREEN_BRIGHT_WAKE_LOCK, CLASS_LABEL);
mWakeLock.acquire();
}
}
@Override
protected void onPause() {
super.onPause();
if (mWakeLock != null) {
mWakeLock.release();
mWakeLock = null;
}
}
@Override
protected void onDestroy() {
super.onDestroy();
recording = false;
if (cameraView != null) {
cameraView.stopPreview();
}
if(cameraDevice != null) {
cameraDevice.stopPreview();
cameraDevice.release();
cameraDevice = null;
}
if (mWakeLock != null) {
mWakeLock.release();
mWakeLock = null;
}
}
private void initLayout() {
/* get size of screen */
Display display = ((WindowManager) getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();
screenWidth = display.getWidth();
screenHeight = display.getHeight();
RelativeLayout.LayoutParams layoutParam = null;
LayoutInflater myInflate = null;
myInflate = (LayoutInflater) getSystemService(Context.LAYOUT_INFLATER_SERVICE);
RelativeLayout topLayout = new RelativeLayout(this);
setContentView(topLayout);
LinearLayout preViewLayout = (LinearLayout) myInflate.inflate(R.layout.main, null);
layoutParam = new RelativeLayout.LayoutParams(screenWidth, screenHeight);
topLayout.addView(preViewLayout, layoutParam);
/* add control button: start and stop */
btnRecorderControl = (Button) findViewById(R.id.recorder_control);
btnRecorderControl.setText("Start");
btnRecorderControl.setOnClickListener(this);
/* add camera view */
int display_width_d = (int) (1.0 * bg_screen_width * screenWidth / bg_width);
int display_height_d = (int) (1.0 * bg_screen_height * screenHeight / bg_height);
int prev_rw, prev_rh;
if (1.0 * display_width_d / display_height_d > 1.0 * live_width / live_height) {
prev_rh = display_height_d;
prev_rw = (int) (1.0 * display_height_d * live_width / live_height);
} else {
prev_rw = display_width_d;
prev_rh = (int) (1.0 * display_width_d * live_height / live_width);
}
layoutParam = new RelativeLayout.LayoutParams(prev_rw, prev_rh);
layoutParam.topMargin = (int) (1.0 * bg_screen_by * screenHeight / bg_height);
layoutParam.leftMargin = (int) (1.0 * bg_screen_bx * screenWidth / bg_width);
cameraDevice = Camera.open();
Log.i(LOG_TAG, "camera open");
cameraView = new CameraView(this, cameraDevice);
topLayout.addView(cameraView, layoutParam);
Log.i(LOG_TAG, "camera preview start: OK");
}
//---------------------------------------
// initialize ffmpeg_recorder
//---------------------------------------
private void initRecorder() {
Log.w(LOG_TAG,"init recorder");
if (RECORD_LENGTH > 0) {
imagesIndex = 0;
images = new IplImage[RECORD_LENGTH * frameRate];
timestamps = new long[images.length];
for (int i = 0; i < images.length; i++) {
images[i] = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
timestamps[i] = -1;
}
} else if (yuvIplimage == null) {
yuvIplimage = IplImage.create(imageWidth, imageHeight, IPL_DEPTH_8U, 2);
Log.i(LOG_TAG, "create yuvIplimage");
}
Log.i(LOG_TAG, "ffmpeg_url: " + ffmpeg_link);
recorder = new FFmpegFrameRecorder(ffmpeg_link, imageWidth, imageHeight, 1);
recorder.setFormat("flv");
recorder.setSampleRate(sampleAudioRateInHz);
// Set in the surface changed method
recorder.setFrameRate(frameRate);
Log.i(LOG_TAG, "recorder initialize success");
audioRecordRunnable = new AudioRecordRunnable();
audioThread = new Thread(audioRecordRunnable);
runAudioThread = true;
}
public void startRecording() {
initRecorder();
try {
recorder.start();
startTime = System.currentTimeMillis();
recording = true;
audioThread.start();
} catch (FFmpegFrameRecorder.Exception e) {
e.printStackTrace();
}
}
public void stopRecording() {
runAudioThread = false;
try {
audioThread.join();
} catch (InterruptedException e) {
e.printStackTrace();
}
audioRecordRunnable = null;
audioThread = null;
if (recorder != null && recording) {
if (RECORD_LENGTH > 0) {
Log.v(LOG_TAG,"Writing frames");
try {
int firstIndex = imagesIndex % images.length; // was samples.length, a likely copy/paste slip
int lastIndex = (imagesIndex - 1) % images.length;
if (imagesIndex <= images.length) {
firstIndex = 0;
lastIndex = imagesIndex - 1;
}
if ((startTime = timestamps[lastIndex] - RECORD_LENGTH * 1000000L) < 0) {
startTime = 0;
}
if (lastIndex < firstIndex) {
lastIndex += images.length;
}
for (int i = firstIndex; i <= lastIndex; i++) {
long t = timestamps[i % timestamps.length] - startTime;
if (t >= 0) {
if (t > recorder.getTimestamp()) {
recorder.setTimestamp(t);
}
recorder.record(images[i % images.length]);
}
}
firstIndex = samplesIndex % samples.length;
lastIndex = (samplesIndex - 1) % samples.length;
if (samplesIndex <= samples.length) {
firstIndex = 0;
lastIndex = samplesIndex - 1;
}
if (lastIndex < firstIndex) {
lastIndex += samples.length;
}
for (int i = firstIndex; i <= lastIndex; i++) {
recorder.record(samples[i % samples.length]);
}
} catch (FFmpegFrameRecorder.Exception e) {
Log.v(LOG_TAG,e.getMessage());
e.printStackTrace();
}
}
recording = false;
Log.v(LOG_TAG,"Finishing recording, calling stop and release on recorder");
try {
recorder.stop();
recorder.release();
} catch (FFmpegFrameRecorder.Exception e) {
e.printStackTrace();
}
recorder = null;
}
}
@Override
public boolean onKeyDown(int keyCode, KeyEvent event) {
if (keyCode == KeyEvent.KEYCODE_BACK) {
if (recording) {
stopRecording();
}
finish();
return true;
}
return super.onKeyDown(keyCode, event);
}
//---------------------------------------------
// audio thread, gets and encodes audio data
//---------------------------------------------
class AudioRecordRunnable implements Runnable {
@Override
public void run() {
android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);
// Audio
int bufferSize;
ShortBuffer audioData;
int bufferReadResult;
bufferSize = AudioRecord.getMinBufferSize(sampleAudioRateInHz,
AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleAudioRateInHz,
AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, bufferSize);
if (RECORD_LENGTH > 0) {
samplesIndex = 0;
samples = new ShortBuffer[RECORD_LENGTH * sampleAudioRateInHz * 2 / bufferSize + 1];
for (int i = 0; i < samples.length; i++) {
samples[i] = ShortBuffer.allocate(bufferSize);
}
} else {
audioData = ShortBuffer.allocate(bufferSize);
}
Log.d(LOG_TAG, "audioRecord.startRecording()");
audioRecord.startRecording();
/* ffmpeg_audio encoding loop */
while (runAudioThread) {
if (RECORD_LENGTH > 0) {
audioData = samples[samplesIndex++ % samples.length];
audioData.position(0).limit(0);
}
//Log.v(LOG_TAG,"recording? " + recording);
bufferReadResult = audioRecord.read(audioData.array(), 0, audioData.capacity());
audioData.limit(bufferReadResult);
if (bufferReadResult > 0) {
Log.v(LOG_TAG,"bufferReadResult: " + bufferReadResult);
// If "recording" isn't true when this thread starts, it never gets set, according to this if statement...!!!
// Why? Good question...
if (recording) {
if (RECORD_LENGTH <= 0) try {
recorder.record(audioData);
//Log.v(LOG_TAG,"recording " + 1024*i + " to " + 1024*i+1024);
} catch (FFmpegFrameRecorder.Exception e) {
Log.v(LOG_TAG,e.getMessage());
e.printStackTrace();
}
}
}
}
Log.v(LOG_TAG,"AudioThread Finished, release audioRecord");
/* encoding finish, release recorder */
if (audioRecord != null) {
audioRecord.stop();
audioRecord.release();
audioRecord = null;
Log.v(LOG_TAG,"audioRecord released");
}
}
}
//---------------------------------------------
// camera thread, gets and encodes video data
//---------------------------------------------
class CameraView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {
private SurfaceHolder mHolder;
private Camera mCamera;
public CameraView(Context context, Camera camera) {
super(context);
Log.w("camera","camera view");
mCamera = camera;
mHolder = getHolder();
mHolder.addCallback(CameraView.this);
mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
mCamera.setPreviewCallback(CameraView.this);
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
try {
stopPreview();
mCamera.setPreviewDisplay(holder);
} catch (IOException exception) {
mCamera.release();
mCamera = null;
}
}
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
Log.v(LOG_TAG,"Setting imageWidth: " + imageWidth + " imageHeight: " + imageHeight + " frameRate: " + frameRate);
Camera.Parameters camParams = mCamera.getParameters();
camParams.setPreviewSize(imageWidth, imageHeight);
Log.v(LOG_TAG,"Preview Framerate: " + camParams.getPreviewFrameRate());
camParams.setPreviewFrameRate(frameRate);
mCamera.setParameters(camParams);
startPreview();
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
try {
mHolder.addCallback(null);
mCamera.setPreviewCallback(null);
} catch (RuntimeException e) {
// The camera has probably just been released, ignore.
}
}
public void startPreview() {
if (!isPreviewOn && mCamera != null) {
isPreviewOn = true;
mCamera.startPreview();
}
}
public void stopPreview() {
if (isPreviewOn && mCamera != null) {
isPreviewOn = false;
mCamera.stopPreview();
}
}
@Override
public void onPreviewFrame(byte[] data, Camera camera) {
if (audioRecord == null || audioRecord.getRecordingState() != AudioRecord.RECORDSTATE_RECORDING) {
startTime = System.currentTimeMillis();
return;
}
if (RECORD_LENGTH > 0) {
int i = imagesIndex++ % images.length;
yuvIplimage = images[i];
timestamps[i] = 1000 * (System.currentTimeMillis() - startTime);
}
/* get video data */
if (yuvIplimage != null && recording) {
yuvIplimage.getByteBuffer().put(data);
if (RECORD_LENGTH <= 0) try {
Log.v(LOG_TAG,"Writing Frame");
long t = 1000 * (System.currentTimeMillis() - startTime);
if (t > recorder.getTimestamp()) {
recorder.setTimestamp(t);
}
recorder.record(yuvIplimage);
} catch (FFmpegFrameRecorder.Exception e) {
Log.v(LOG_TAG,e.getMessage());
e.printStackTrace();
}
}
}
}
@Override
public void onClick(View v) {
if (!recording) {
startRecording();
Log.w(LOG_TAG, "Start Button Pushed");
btnRecorderControl.setText("Stop");
} else {
// This will trigger the audio recording loop to stop and then set isRecorderStart = false;
stopRecording();
Log.w(LOG_TAG, "Stop Button Pushed");
btnRecorderControl.setText("Start");
}
}
}
I am getting this error:
The type org.bytedeco.javacpp.avutil$AVFrame cannot be resolved. It is indirectly referenced from required .class files
at the following line in the above code:
if (RECORD_LENGTH <= 0) try {
recorder.record(audioData);
//Log.v(LOG_TAG,"recording " + 1024*i + " to " + 1024*i+1024);
}
at recorder.record(audioData). I don't know what I am doing wrong here; I'm new to JavaCV. Any help will be appreciated.
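An aside on the code above: the flush loop in stopRecording replays a ring buffer oldest-first, handling both the not-yet-full case and wraparound by unwrapping the index range. The same index arithmetic can be sketched in Python (write_index plays the role of imagesIndex/samplesIndex, counting total writes):

```python
def replay_order(write_index, capacity):
    # Ring never filled up: replay slots 0 .. write_index-1 in order.
    if write_index <= capacity:
        return list(range(write_index))
    first = write_index % capacity        # oldest surviving slot
    last = (write_index - 1) % capacity   # most recently written slot
    if last < first:
        last += capacity                  # unwrap so the loop is contiguous
    return [i % capacity for i in range(first, last + 1)]

print(replay_order(5, 8))    # -> [0, 1, 2, 3, 4] (not wrapped)
print(replay_order(11, 8))   # -> [3, 4, 5, 6, 7, 0, 1, 2] (wrapped)
```

The unwrapping trick (adding capacity to last, then taking each index mod capacity) is exactly what the Java loop does with `i % images.length`.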
-
using libav instead of ffmpeg
21 January 2015, by n00bie: I want to stream video over HTTP. I am using Ogg (Theora + Vorbis). I have a sender and a receiver, and I can run them from the command line:
Sender:
ffmpeg -f video4linux2 -s 320x240 -i /dev/mycam -codec:v libtheora -qscale:v 5 -f ogg http://127.0.0.1:8080
Receiver:
sudo gst-launch-0.10 tcpserversrc port = 8080 ! oggdemux ! theoradec ! autovideosink
The sender sends both audio and video, but the receiver plays only the video.
It works, but now I want to stop using the ffmpeg binary and use only the libav* libraries instead.
Here is my class for streaming:
class VCORE_LIBRARY_EXPORT VVideoWriter : private boost::noncopyable
{
public:
VVideoWriter( );
~VVideoWriter( );
bool openFile( const std::string& name,
int fps, int videoBitrate, int width, int height,
int audioSampleRate, bool stereo, int audioBitrate );
void close( );
bool writeVideoFrame( const uint8_t* image, int64_t timestamp );
bool writeAudioFrame( const int16_t* data, int64_t timestamp );
int audioFrameSize( ) const;
private:
AVFrame *m_videoFrame;
AVFrame *m_audioFrame;
AVFormatContext *m_context;
AVStream *m_videoStream;
AVStream *m_audioStream;
int64_t m_startTime;
};
Initialization:
bool VVideoWriter::openFile( const std::string& name,
int fps, int videoBitrate, int width, int height,
int audioSampleRate, bool stereo, int audioBitrate )
{
if( ! m_context )
{
// initalize the AV context
m_context = avformat_alloc_context( );
assert( m_context );
// get the output format
m_context->oformat = av_guess_format( "ogg", name.c_str( ), nullptr );
if( m_context->oformat )
{
strcpy( m_context->filename, name.c_str( ) );
auto codecID = AV_CODEC_ID_THEORA;
auto codec = avcodec_find_encoder( codecID );
if( codec )
{
m_videoStream = avformat_new_stream( m_context, codec );
assert( m_videoStream );
// initalize codec
auto codecContext = m_videoStream->codec;
bool globalHeader = m_context->oformat->flags & AVFMT_GLOBALHEADER;
if( globalHeader )
codecContext->flags |= CODEC_FLAG_GLOBAL_HEADER;
codecContext->codec_id = codecID;
codecContext->codec_type = AVMEDIA_TYPE_VIDEO;
codecContext->width = width;
codecContext->height = height;
codecContext->time_base.den = fps;
codecContext->time_base.num = 1;
codecContext->bit_rate = videoBitrate;
codecContext->pix_fmt = PIX_FMT_YUV420P;
codecContext->flags |= CODEC_FLAG_QSCALE;
codecContext->global_quality = FF_QP2LAMBDA * 5;
int res = avcodec_open2( codecContext, codec, nullptr );
if( res >= 0 )
{
auto codecID = AV_CODEC_ID_VORBIS;
auto codec = avcodec_find_encoder( codecID );
if( codec )
{
m_audioStream = avformat_new_stream( m_context, codec );
assert( m_audioStream );
// initalize codec
auto codecContext = m_audioStream->codec;
bool globalHeader = m_context->oformat->flags & AVFMT_GLOBALHEADER;
if( globalHeader )
codecContext->flags |= CODEC_FLAG_GLOBAL_HEADER;
codecContext->codec_id = codecID;
codecContext->codec_type = AVMEDIA_TYPE_AUDIO;
codecContext->sample_fmt = AV_SAMPLE_FMT_FLTP;
codecContext->bit_rate = audioBitrate;
codecContext->sample_rate = audioSampleRate;
codecContext->channels = stereo ? 2 : 1;
codecContext->channel_layout = stereo ? AV_CH_LAYOUT_STEREO : AV_CH_LAYOUT_MONO;
res = avcodec_open2( codecContext, codec, nullptr );
if( res >= 0 )
{
// try to open the file
if( avio_open( &m_context->pb, m_context->filename, AVIO_FLAG_WRITE ) >= 0 )
{
m_audioFrame->nb_samples = codecContext->frame_size;
m_audioFrame->format = codecContext->sample_fmt;
m_audioFrame->channel_layout = codecContext->channel_layout;
boost::posix_time::ptime time_t_epoch( boost::gregorian::date( 1970, 1, 1 ) );
m_context->start_time_realtime = ( boost::posix_time::microsec_clock::universal_time( ) - time_t_epoch ).total_microseconds( );
m_startTime = -1;
// write the header
if( avformat_write_header( m_context, nullptr ) >= 0 )
{
return true;
}
else std::cerr << "VVideoWriter: failed to write video header" << std::endl;
}
else std::cerr << "VVideoWriter: failed to open video file " << name << std::endl;
}
else std::cerr << "VVideoWriter: failed to initialize audio codec" << std::endl;
}
else std::cerr << "VVideoWriter: requested audio codec is not supported" << std::endl;
}
else std::cerr << "VVideoWriter: failed to initialize video codec" << std::endl;
}
else std::cerr << "VVideoWriter: requested video codec is not supported" << std::endl;
}
else std::cerr << "VVideoWriter: requested video format is not supported" << std::endl;
avformat_free_context( m_context );
m_context = nullptr;
m_videoStream = nullptr;
m_audioStream = nullptr;
}
return false;
}
Writing video:
bool VVideoWriter::writeVideoFrame( const uint8_t* image, int64_t timestamp )
{
if( m_context ) {
auto codecContext = m_videoStream->codec;
avpicture_fill( reinterpret_cast<AVPicture*>( m_videoFrame ), // template arguments restored; they were lost in HTML escaping
const_cast<uint8_t*>( image ),
codecContext->pix_fmt, codecContext->width, codecContext->height );
AVPacket pkt;
av_init_packet( & pkt );
pkt.data = nullptr;
pkt.size = 0;
int gotPacket = 0;
if( ! avcodec_encode_video2( codecContext, &pkt, m_videoFrame, & gotPacket ) ) {
if( gotPacket == 1 ) {
pkt.stream_index = m_videoStream->index;
int res;
{
pkt.pts = AV_NOPTS_VALUE;
pkt.dts = AV_NOPTS_VALUE;
pkt.stream_index = m_videoStream->index;
res = av_write_frame( m_context, &pkt );
}
av_free_packet( & pkt );
return res >= 0;
}
assert( ! pkt.size );
return true;
}
}
return false;
}
Writing audio (for now I write dummy test audio):
bool VVideoWriter::writeAudioFrame( const int16_t* data, int64_t timestamp )
{
if( m_context ) {
auto codecContext = m_audioStream->codec;
int buffer_size = av_samples_get_buffer_size(nullptr, codecContext->channels, codecContext->frame_size, codecContext->sample_fmt, 0);
float *samples = (float*)av_malloc(buffer_size);
for (int i = 0; i < buffer_size / sizeof(float); i++)
samples[i] = 1000. * sin((double)i/2.);
int ret = avcodec_fill_audio_frame( m_audioFrame, codecContext->channels, codecContext->sample_fmt, (const uint8_t*)samples, buffer_size, 0);
assert( ret >= 0 );
(void)(ret);
AVPacket pkt;
av_init_packet( & pkt );
pkt.data = nullptr;
pkt.size = 0;
int gotPacket = 0;
if( ! avcodec_encode_audio2( codecContext, &pkt, m_audioFrame, & gotPacket ) ) {
if( gotPacket == 1 ) {
pkt.stream_index = m_audioStream->index;
int res;
{
pkt.pts = AV_NOPTS_VALUE;
pkt.dts = AV_NOPTS_VALUE;
pkt.stream_index = m_audioStream->index;
res = av_write_frame( m_context, &pkt );
}
av_free_packet( & pkt );
return res >= 0;
}
assert( ! pkt.size );
return true;
}
return false;
}
return false;
}
Here is a test example (I send video from the webcam plus dummy audio):
class TestVVideoWriter : public sigslot::has_slots<>
{
public:
TestVVideoWriter( ) :
m_fileOpened( false )
{
}
void onCapturedFrame( cricket::VideoCapturer*, const cricket::CapturedFrame* capturedFrame )
{
if( m_fileOpened ) {
m_writer.writeVideoFrame( reinterpret_cast<const uint8_t*>( capturedFrame->data ), // pointer type restored; lost in HTML escaping
capturedFrame->time_stamp / 1000 );
m_writer.writeAudioFrame( nullptr , 0 );
} else {
m_fileOpened = m_writer.openFile( "http://127.0.0.1:8080",
15, 40000, capturedFrame->width, capturedFrame->height,
16000, false, 64000 );
}
}
public:
vcore::VVideoWriter m_writer;
bool m_fileOpened;
};
TestVVideoWriter testWriter;
BOOST_AUTO_TEST_SUITE(TEST_VIDEO_WRITER)
BOOST_AUTO_TEST_CASE(testWritingVideo)
{
cricket::LinuxDeviceManager deviceManager;
std::vector<cricket::Device> devs; // element type restored; lost in HTML escaping
if( deviceManager.GetVideoCaptureDevices( &devs ) ) {
if( devs.size( ) ) {
boost::shared_ptr<cricket::VideoCapturer> camera( deviceManager.CreateVideoCapturer( devs[ 0 ] ) ); // template argument restored; lost in HTML escaping
if( camera ) {
cricket::VideoFormat format( 320, 240, cricket::VideoFormat::FpsToInterval( 30 ),
camera->GetSupportedFormats( )->front( ).fourcc );
cricket::VideoFormat best;
if( camera->GetBestCaptureFormat( format, &best ) ) {
camera->SignalFrameCaptured.connect( &testWriter, &TestVVideoWriter::onCapturedFrame );
if( camera->Start( best ) != cricket::CS_FAILED ) {
boost::this_thread::sleep( boost::posix_time::seconds( 10 ) );
return;
}
}
}
}
}
std::cerr << "A problem has occurred with the camera" << std::endl;
}
BOOST_AUTO_TEST_SUITE_END() // TEST_VIDEO_WRITER
But in this case, gstreamer starts playing the video only when my test program stops executing (after 10 seconds here). That does not suit me: I want gstreamer to start playing immediately after my test program starts.
Could someone help me?
P.S. Sorry for my English.
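A side note on the dummy audio in writeAudioFrame above: the fill loop computes amplitude·sin(i/2). The same waveform, sketched in Python (the sample count here is an arbitrary example value):

```python
import math

def dummy_tone(n_samples, amplitude=1000.0):
    # Same waveform as the C++ fill loop: amplitude * sin(i / 2)
    return [amplitude * math.sin(i / 2.0) for i in range(n_samples)]

print([round(x, 2) for x in dummy_tone(4)])  # -> [0.0, 479.43, 841.47, 997.49]
```

One caveat, if I read the format right: the C++ code hands this interleaved buffer to an AV_SAMPLE_FMT_FLTP (planar) frame. With the single mono channel used in the test, planar and packed layouts coincide, but for stereo each channel would need its own plane.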
-
How to make the video module generate files readable by the JW Player module?
4 January 2015, by Dvex: I use Drupal 7.x with the Video module, Video Presets for FFmpeg, and JW Player.
I installed FFmpeg on Ubuntu 14.04 following this guide: https://trac.ffmpeg.org/wiki/CompilationGuide/Ubuntu. I am using jQuery 1.8.
I downloaded the files the JW Player module requires and renamed some of them, as the guide instructs.
I configured the video module to use FFmpeg and set up the necessary presets (FLV, MP4 and WebM) to cover every browser (desktop and mobile).
I changed the default settings of these presets and checked the option 'Force one-pass Encoding', because otherwise I get this error message:
PHPVideoToolkit error: Execute error. It was not possible to encode "/var/www/html/sites/default/files/private/videos/original/reel para web en loop_2.mp4" as FFmpeg returned an error. Note, however the error was encountered on the second pass of the encoding process and the first pass appear to go fine. The error is with the video codec of the input file. FFmpeg reports the error to be "Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height".
/usr/bin/ffmpeg -i '/var/www/html/sites/default/files/private/videos/original/reel para web en loop_2.mp4' -strict experimental -vcodec 'h264' -acodec 'aac' -vprofile 'baseline' -ac '2' -pass '1' -passlogfile '/tmp/1420389438-54a96c3eda9fc-multipass' -y /tmp/1420389438-54a96c3eda8c1.mp4
ffmpeg version 1.2.6-7:1.2.6-1~trusty1 Copyright (c) 2000-2014 the FFmpeg developers
built on Apr 26 2014 18:52:58 with gcc 4.8 (Ubuntu 4.8.2-19ubuntu1)
configuration: --arch=amd64 --disable-stripping --enable-avresample --enable-pthreads --enable-runtime-cpudetect --extra-version='7:1.2.6-1~trusty1' --libdir=/usr/lib/x86_64-linux-gnu --prefix=/usr --enable-bzlib --enable-libdc1394 --enable-libfreetype --enable-frei0r --enable-gnutls --enable-libgsm --enable-libmp3lame --enable-librtmp --enable-libopencv --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-vaapi --enable-vdpau --enable-libvorbis --enable-libvpx --enable-zlib --enable-gpl --enable-postproc --enable-libcdio --enable-x11grab --enable-libx264 --shlibdir=/usr/lib/x86_64-linux-gnu --enable-shared --disable-static
libavutil 52. 18.100 / 52. 18.100
libavcodec 54. 92.100 / 54. 92.100
libavformat 54. 63.104 / 54. 63.104
libavdevice 53. 5.103 / 53. 5.103
libavfilter 3. 42.103 / 3. 42.103
libswscale 2. 2.100 / 2. 2.100
libswresample 0. 17.102 / 0. 17.102
libpostproc 52. 2.100 / 52. 2.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/var/www/html/sites/default/files/private/videos/original/reel para web en loop_2.mp4':
Metadata:
major_brand : mp42
minor_version : 0
compatible_brands: mp42mp41
creation_time : 2013-09-12 17:30:22
Duration: 00:00:39.32, start: 0.000000, bitrate: 3190 kb/s
Stream #0:0(eng): Video: h264 (Main) (avc1 / 0x31637661), yuv420p, 720x480 [SAR 40:33 DAR 20:11], 2991 kb/s, 29.97 fps, 29.97 tbr, 29970 tbn, 59.94 tbc
Metadata:
creation_time : 2013-09-12 17:30:22
handler_name : Mainconcept MP4 Video Media Handler
Stream #0:1(eng): Audio: aac (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 189 kb/s
Metadata:
creation_time : 2013-09-12 17:30:22
handler_name : Mainconcept MP4 Sound Media Handler
[libx264 @ 0x164dec0] using SAR=40/33
[libx264 @ 0x164dec0] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
[libx264 @ 0x164dec0] profile Constrained Baseline, level 3.0
[libx264 @ 0x164dec0] 264 - core 142 r2389 956c8d8 - H.264/MPEG-4 AVC codec - Copyleft 2003-2014 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=1:0:0 analyse=0x1:0 me=dia subme=2 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=6 lookahead_threads=2 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to '/tmp/1420389438-54a96c3eda8c1.mp4':
Metadata:
major_brand : mp42
minor_version : 0
compatible_brands: mp42mp41
encoder : Lavf54.63.104
Stream #0:0(eng): Video: h264 ([33][0][0][0] / 0x0021), yuv420p, 720x480 [SAR 40:33 DAR 20:11], q=-1--1, pass 1, 11988 tbn, 29.97 tbc
Metadata:
creation_time : 2013-09-12 17:30:22
handler_name : Mainconcept MP4 Video Media Handler
Stream #0:1(eng): Audio: aac ([64][0][0][0] / 0x0040), 48000 Hz, stereo, fltp, 128 kb/s
Metadata:
creation_time : 2013-09-12 17:30:22
handler_name : Mainconcept MP4 Sound Media Handler
Stream mapping:
Stream #0:0 -> #0:0 (h264 -> libx264)
Stream #0:1 -> #0:1 (aac -> aac)
Press [q] to stop, [?] for help
frame= 88 fps=0.0 q=29.0 size= 483kB time=00:00:02.96 bitrate=1335.6kbits/s dup=1 drop=0
frame= 190 fps=189 q=29.0 size= 1916kB time=00:00:06.33 bitrate=2477.6kbits/s dup=1 drop=0
frame= 289 fps=192 q=29.0 size= 3001kB time=00:00:09.64 bitrate=2549.5kbits/s dup=1 drop=0
frame= 395 fps=196 q=29.0 size= 3732kB time=00:00:13.31 bitrate=2296.7kbits/s dup=1 drop=0
frame= 495 fps=197 q=29.0 size= 4481kB time=00:00:16.66 bitrate=2203.1kbits/s dup=1 drop=0
frame= 591 fps=196 q=29.0 size= 5152kB time=00:00:19.98 bitrate=2111.4kbits/s dup=1 drop=0
frame= 690 fps=196 q=29.0 size= 6329kB time=00:00:23.01 bitrate=2252.4kbits/s dup=1 drop=0
frame= 790 fps=197 q=29.0 size= 7184kB time=00:00:26.32 bitrate=2235.6kbits/s dup=1 drop=0
frame= 890 fps=197 q=29.0 size= 8532kB time=00:00:29.73 bitrate=2350.3kbits/s dup=1 drop=0
frame= 1000 fps=199 q=29.0 size= 9398kB time=00:00:33.42 bitrate=2303.0kbits/s dup=1 drop=0
frame= 1106 fps=200 q=29.0 size= 10423kB time=00:00:37.01 bitrate=2306.8kbits/s dup=1 drop=0
frame= 1178 fps=199 q=-1.0 Lsize= 11404kB time=00:00:39.31 bitrate=2376.1kbits/s dup=1 drop=0
video:10752kB audio:618kB subtitle:0 global headers:0kB muxing overhead 0.302603%
[libx264 @ 0x164dec0] frame I:87 Avg QP:24.23 size: 21823
[libx264 @ 0x164dec0] frame P:1091 Avg QP:25.90 size: 8351
[libx264 @ 0x164dec0] mb I I16..4: 50.9% 0.0% 49.1%
[libx264 @ 0x164dec0] mb P I16..4: 14.8% 0.0% 0.0% P16..4: 51.3% 0.0% 0.0% 0.0% 0.0% skip:33.9%
[libx264 @ 0x164dec0] coded y,uvDC,uvAC intra: 43.9% 38.7% 10.2% inter: 23.0% 14.4% 0.8%
[libx264 @ 0x164dec0] i16 v,h,dc,p: 31% 39% 19% 11%
[libx264 @ 0x164dec0] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 18% 44% 14% 5% 4% 4% 4% 4% 4%
[libx264 @ 0x164dec0] i8c dc,h,v,p: 57% 22% 17% 4%
[libx264 @ 0x164dec0] kb/s:2240.77
/usr/bin/ffmpeg -i '/var/www/html/sites/default/files/private/videos/original/reel para web en loop_2.mp4' -strict experimental -vcodec 'h264' -acodec 'aac' -vprofile 'baseline' -ac '2' -pass '2' -passlogfile '/tmp/1420389438-54a96c3eda9fc-multipass' -y /tmp/1420389438-54a96c3eda8c1.mp4
ffmpeg version 1.2.6-7:1.2.6-1~trusty1 Copyright (c) 2000-2014 the FFmpeg developers
built on Apr 26 2014 18:52:58 with gcc 4.8 (Ubuntu 4.8.2-19ubuntu1)
configuration: --arch=amd64 --disable-stripping --enable-avresample --enable-pthreads --enable-runtime-cpudetect --extra-version='7:1.2.6-1~trusty1' --libdir=/usr/lib/x86_64-linux-gnu --prefix=/usr --enable-bzlib --enable-libdc1394 --enable-libfreetype --enable-frei0r --enable-gnutls --enable-libgsm --enable-libmp3lame --enable-librtmp --enable-libopencv --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-vaapi --enable-vdpau --enable-libvorbis --enable-libvpx --enable-zlib --enable-gpl --enable-postproc --enable-libcdio --enable-x11grab --enable-libx264 --shlibdir=/usr/lib/x86_64-linux-gnu --enable-shared --disable-static
libavutil 52. 18.100 / 52. 18.100
libavcodec 54. 92.100 / 54. 92.100
libavformat 54. 63.104 / 54. 63.104
libavdevice 53. 5.103 / 53. 5.103
libavfilter 3. 42.103 / 3. 42.103
libswscale 2. 2.100 / 2. 2.100
libswresample 0. 17.102 / 0. 17.102
libpostproc 52. 2.100 / 52. 2.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/var/www/html/sites/default/files/private/videos/original/reel para web en loop_2.mp4':
Metadata:
major_brand : mp42
minor_version : 0
compatible_brands: mp42mp41
creation_time : 2013-09-12 17:30:22
Duration: 00:00:39.32, start: 0.000000, bitrate: 3190 kb/s
Stream #0:0(eng): Video: h264 (Main) (avc1 / 0x31637661), yuv420p, 720x480 [SAR 40:33 DAR 20:11], 2991 kb/s, 29.97 fps, 29.97 tbr, 29970 tbn, 59.94 tbc
Metadata:
creation_time : 2013-09-12 17:30:22
handler_name : Mainconcept MP4 Video Media Handler
Stream #0:1(eng): Audio: aac (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 189 kb/s
Metadata:
creation_time : 2013-09-12 17:30:22
handler_name : Mainconcept MP4 Sound Media Handler
[libx264 @ 0x161eec0] using SAR=40/33
[libx264 @ 0x161eec0] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
[libx264 @ 0x161eec0] constant rate-factor is incompatible with 2pass.
Output #0, mp4, to '/tmp/1420389438-54a96c3eda8c1.mp4':
Metadata:
major_brand : mp42
minor_version : 0
compatible_brands: mp42mp41
Stream #0:0(eng): Video: h264, yuv420p, 720x480 [SAR 40:33 DAR 20:11], q=-1--1, pass 2, 90k tbn, 29.97 tbc
Metadata:
creation_time : 2013-09-12 17:30:22
handler_name : Mainconcept MP4 Video Media Handler
Stream #0:1(eng): Audio: none, 48000 Hz, stereo, fltp, 128 kb/s
Metadata:
creation_time : 2013-09-12 17:30:22
handler_name : Mainconcept MP4 Sound Media Handler
Stream mapping:
Stream #0:0 -> #0:0 (h264 -> libx264)
Stream #0:1 -> #0:1 (aac -> aac)
Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height
When I save a node and add its video, the module generates the 3 files in their respective formats. But when they are handed to the JW Player module, only the FLV plays, in every browser.
The MP4 and WebM do not play. I checked the log and the status report, and everything looks in order.
So I assume the generated MP4 and WebM files are not what JW Player expects.
Does anyone have a solution to this problem? I tried recompiling FFmpeg with more codecs, but that did not solve it.
This is my FFmpeg configuration:
ffmpeg version 1.2.6-1~trusty1 Copyright (c) 2000-2014 the FFmpeg developers
built on Apr 26 2014 18:52:58 with gcc 4.8 (Ubuntu 4.8.2-19ubuntu1)
configuration: --arch=amd64 --disable-stripping --enable-avresample --enable-pthreads --enable-runtime-cpudetect --extra-version='7:1.2.6-1~trusty1' --libdir=/usr/lib/x86_64-linux-gnu --prefix=/usr --enable-bzlib --enable-libdc1394 --enable-libfreetype --enable-frei0r --enable-gnutls --enable-libgsm --enable-libmp3lame --enable-librtmp --enable-libopencv --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-vaapi --enable-vdpau --enable-libvorbis --enable-libvpx --enable-zlib --enable-gpl --enable-postproc --enable-libcdio --enable-x11grab --enable-libx264 --shlibdir=/usr/lib/x86_64-linux-gnu --enable-shared --disable-static
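As a quick sanity check on a build like the one above, one can scan the configure line for the encoder libraries that H.264/MP4 and WebM presets typically need. A hypothetical helper (the `needed` tuple is an assumption; adjust it to your presets):

```python
def missing_encoders(configure_line,
                     needed=("libx264", "libvpx", "libvorbis", "libmp3lame")):
    # Return the encoder libraries whose --enable-<lib> flag is absent.
    return [lib for lib in needed if "--enable-" + lib not in configure_line]

config = ("--enable-libmp3lame --enable-libvorbis "
          "--enable-libvpx --enable-libx264")
print(missing_encoders(config))          # -> []
print(missing_encoders("--enable-gpl"))  # -> ['libx264', 'libvpx', 'libvorbis', 'libmp3lame']
```

Run over the configure line quoted above, this reports nothing missing (libx264, libvpx, libvorbis, and libmp3lame are all enabled), which suggests the problem lies with the preset or player configuration rather than missing codecs.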