
Other articles (111)
-
The SPIPmotion queue
28 November 2010 — A queue stored in the database
When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to process; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document will automatically be attached; objet, the type of object to which (...)
-
Contribute to documentation
13 April 2011 — Documentation is vital to the development of improved technical capabilities.
MediaSPIP welcomes documentation from users as well as developers, including: critiques of existing features and functions; articles contributed by developers, administrators, content producers and editors; screenshots to illustrate the above; translations of existing documentation into other languages.
To contribute, register for the project users’ mailing (...)
-
Adding user-specific information and other changes to author-related behaviour
12 April 2011 — The simplest way to add information to authors is to install the Inscription3 plugin. It also makes it possible to modify certain user-related behaviours (refer to its documentation for more information).
It is also possible to add fields to authors by installing the champs extras 2 and Interface pour champs extras plugins.
On other sites (7868)
-
Record video with MediaRecorder in WebM format
21 May 2017, by Matho — I need to record two videos via MediaRecorder and process them with FFmpeg on Android. I want to read the videos through a pipe in the FFmpeg command string. From what I have read, I cannot use the 3GPP/MP4 formats, because they write their header at the end of the video. Because of that, I want to use the WebM format, but my app crashes when I use the webm/vp8 settings in MediaRecorder. The FFmpeg wrapper I am using is com.writingminds:FFmpegAndroid:0.3.2. Could you help me, please?
RecorderPrepareTask:
class RecorderPrepareTask extends AsyncTask<Void, Void, Boolean> {

    private Context mContext;

    public RecorderPrepareTask(Context context) {
        mContext = context;
    }

    @Override
    protected Boolean doInBackground(Void... voids) {
        Timber.d("Starting screen recording...");
        if (!outputRoot.exists() && !outputRoot.mkdirs()) {
            Timber.e("Unable to create output directory '%s'.", outputRoot.getAbsolutePath());
            return false;
        }

        // Configure MediaRecorder to capture the projection surface as WebM/VP8
        // and write into the write end of the pipe.
        recorder = new MediaRecorder();
        recorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.WEBM);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.VP8);
        recorder.setVideoSize(Config.VIDEO_SIZE_W, Config.VIDEO_SIZE_H);
        recorder.setVideoFrameRate(Config.FRAME_RATE);
        Timber.i("Output file '%s'.", outputFileName);
        recorder.setOutputFile(RecordingActivity.pipe[1].getFileDescriptor());

        try {
            recorder.prepare();
        } catch (IOException e) {
            throw new RuntimeException("Unable to prepare MediaRecorder.", e);
        }

        // Mirror the screen into the recorder's input surface via a virtual display.
        projection = projectionManager.getMediaProjection(resultCode, data);
        Surface surface = recorder.getSurface();
        display = projection.createVirtualDisplay(DISPLAY_NAME, Config.VIDEO_SIZE_W, Config.VIDEO_SIZE_H,
                560, VIRTUAL_DISPLAY_FLAG_PRESENTATION, surface, null, null);
        recorder.start();
        return true;
    }

    @Override
    protected void onPostExecute(Boolean result) {
    }
}
getConvertCmd:
private String getConvertCmd() {
    // Build the ffmpeg command: read the recorder's output from the pipe's read end and transcode it.
    return "-re -r 30 -f webm -c:v vp8 -i pipe:" + RecordingActivity.pipe[0].getFd()
            + " -threads 2 -preset ultrafast -profile:v main -level 3.1 -b:v 440k -ar 44100 -ab 128k"
            + " -s " + Config.CAMERA_VIDEO_SIZE_W + "x" + Config.CAMERA_VIDEO_SIZE_H
            + " -vcodec vp8 -acodec vorbis " + outputFileName3;
}
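One piece the question never shows is how RecordingActivity.pipe is created. A minimal sketch of the presumed setup using ParcelFileDescriptor.createPipe() — the class layout and field name are assumptions inferred from the snippets above, not code from the question:

import java.io.IOException;
import android.os.ParcelFileDescriptor;

public class RecordingActivity /* extends Activity */ {
    // Assumed setup (not shown in the question):
    // pipe[0] is the read end, handed to ffmpeg as "pipe:" + pipe[0].getFd();
    // pipe[1] is the write end, handed to MediaRecorder.setOutputFile().
    static ParcelFileDescriptor[] pipe;

    static void createPipe() {
        try {
            pipe = ParcelFileDescriptor.createPipe();
        } catch (IOException e) {
            throw new RuntimeException("Unable to create recording pipe.", e);
        }
    }
}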
Logs:
D/SoftVPXEncoder: VP8: internalSetAndroidVp8Params. BRMode: 0. TS: 0. KF: 24. QP: 0 - 0 BR0: 100. BR1: 0. BR2: 0
I/ACodec: setupVideoEncoder succeeded
E/OMXNodeInstance: setConfig(1a:google.vp8.encoder, ConfigPriority(0x6f800002)) ERROR: Undefined(0x80001001)
I/ACodec: codec does not support config priority (err -2147483648)
W/ACodec: do not know color format 0x7f000789 = 2130708361
I/SoftMPEG4Encoder: Construct SoftMPEG4Encoder
I/MediaCodec: MediaCodec will operate in async mode
E/OMXNodeInstance: getParameter(1b:google.mpeg4.encoder, ParamVideoErrorCorrection(0x6000007)) ERROR: NotImplemented(0x80001006)
I/ACodec: setupVideoEncoder succeeded
E/OMXNodeInstance: setConfig(1b:google.mpeg4.encoder, ConfigPriority(0x6f800002)) ERROR: Undefined(0x80001001)
I/ACodec: codec does not support config priority (err -2147483648)
D/SoftVPXEncoder: VP8: initEncoder. BRMode: 0. TSLayers: 0. KF: 24. QP: 0 - 0
I/AudioFlinger: AudioFlinger's thread 0xb2440000 ready to run
W/AudioFlinger: acquireAudioSessionId() unknown client 10079 for session 7
I/MediaCodec: MediaCodec will operate in async mode
/system_process I/DisplayManagerService: Display device added: DisplayDeviceInfo{"RMR": uniqueId="virtual:sk.matho.rocket_monkeys.rocketmonkeysrecorder,10052,RMR,0", 480 x 640, modeId 6, defaultModeId 6, supportedModes [{id=6, width=480, height=640, fps=60.0}], density 560, 560.0 x 560.0 dpi, appVsyncOff 0, presDeadline 16666666, touch NONE, rotation 0, type VIRTUAL, state ON, owner sk.matho.rocket_monkeys.rocketmonkeysrecorder (uid 10052), FLAG_PRIVATE, FLAG_PRESENTATION}
E/OMXNodeInstance: setConfig(1c:google.amrnb.encoder, ConfigPriority(0x6f800002)) ERROR: Undefined(0x80001001)
I/ACodec: codec does not support config priority (err -2147483648)
E/WebmElement: mmap64 failed; errno = 22
E/WebmElement: fd 18; flags: 1
E/WebmElement: mmap64 failed; errno = 22
E/WebmElement: fd 18; flags: 1
E/WebmElement: mmap64 failed; errno = 22
E/WebmElement: fd 18; flags: 1
E/WebmElement: mmap64 failed; errno = 22
E/WebmElement: fd 18; flags: 1
E/WebmElement: mmap64 failed; errno = 22
E/WebmElement: fd 18; flags: 1
E/WebmElement: mmap64 failed; errno = 22
E/WebmElement: fd 18; flags: 1
D/gralloc_ranchu: gralloc_alloc: format 1 and usage 0x10003 imply creation of host color buffer
I/MediaCodecSource: MediaCodecSource (video) starting
I/MediaCodecSource: MediaCodecSource (video) started
D/MPEG4Writer: Video track stopping
E/MPEG4Writer: Stop() called but track is not started
D/MPEG4Writer: Audio track stopping
E/MPEG4Writer: Stop() called but track is not started
[ 05-21 13:03:44.101 9757: 9766 D/ ]
HostConnection::get() New Host Connection established 0xb344f140, tid 9766
/sk.matho.rocket_monkeys.rocketmonkeysrecorder E/MediaRecorder: start failed: -2147483648
/sk.matho.rocket_monkeys.rocketmonkeysrecorder E/AndroidRuntime: FATAL EXCEPTION: AsyncTask #2
Process: sk.matho.rocket_monkeys.rocketmonkeysrecorder, PID: 10079
java.lang.RuntimeException: An error occurred while executing doInBackground()
at android.os.AsyncTask$3.done(AsyncTask.java:309)
at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:354)
at java.util.concurrent.FutureTask.setException(FutureTask.java:223)
at java.util.concurrent.FutureTask.run(FutureTask.java:242)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:234)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588)
at java.lang.Thread.run(Thread.java:818)
Caused by: java.lang.RuntimeException: start failed.
at android.media.MediaRecorder.start(Native Method)
at sk.matho.rocket_monkeys.rocketmonkeysrecorder.recording.CameraRecordingSession$MediaPrepareTask.doInBackground(CameraRecordingSession.java:181)
at sk.matho.rocket_monkeys.rocketmonkeysrecorder.recording.CameraRecordingSession$MediaPrepareTask.doInBackground(CameraRecordingSession.java:173)
at android.os.AsyncTask$2.call(AsyncTask.java:295)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:234)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588)
at java.lang.Thread.run(Thread.java:818)
[ 05-21 13:03:44.109 9757:10315 D/ ]
HostConnection::get() New Host Connection established 0xb307c1c0, tid 10315
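For what it's worth, the "E/WebmElement: mmap64 failed; errno = 22" lines and the "start failed: -2147483648" in the log above point at the output descriptor: the platform's WebM writer appears to memory-map its target, which a pipe cannot support. A hedged workaround sketch, not from the question (the helper and file names are assumptions): record into a seekable file, then point the ffmpeg command at that file once recording has stopped.

import java.io.File;
import android.media.MediaRecorder;

// Hypothetical alternative to the pipe: give MediaRecorder a real file so the
// WebM writer can seek/mmap its output.
static File useSeekableOutput(MediaRecorder recorder, File outputRoot) {
    File webmFile = new File(outputRoot, "capture.webm");
    recorder.setOutputFile(webmFile.getAbsolutePath()); // must be set before prepare()
    return webmFile; // after recorder.stop(), pass this path to ffmpeg via -i instead of "pipe:" + fd
}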
-
Can write frames to an RTSP server, but can't display them in ffplay or the live555 client
11 October 2016, by tankyx — I am working on a zero-latency streaming server using the FFmpeg libraries, and I am facing a problem.
My server works when using nvenc: I can stream successfully to my client, which is another computer on the LAN. But if I switch the encoder to libx264 (in order to reduce the latency), the server still writes the frames, while the client runs into problems with the SDP header; more specifically, the media subsession does not seem to be initialized, so the client crashes. The odd thing is that when I dump the SDP header with nvenc and with libx264, it is actually the same in both cases.
Here is the code I use to initialize my encoder:
/*
    Init the codec that is used to encode the video.
    Init the output format context (aka the RTSP URI).
*/
FfmpegEncoder::FfmpegEncoder(char *url)
{
    AVRational tmp_time_base;
    AVDictionary *options = NULL;

    this->pCodec = avcodec_find_encoder_by_name("libx264");
    if (this->pCodec == NULL)
        throw myExceptions("Error: Can't initialize the encoder. FfmpegEncoder.cpp l:9\n");
    this->pCodecCtx = avcodec_alloc_context3(this->pCodec);

    // Alloc output context
    if (avformat_alloc_output_context2(&outFormatCtx, NULL, "rtsp", url) < 0)
        throw myExceptions("Error: Can't alloc stream output. FfmpegEncoder.cpp l:17\n");

    this->st = avformat_new_stream(this->outFormatCtx, this->pCodec);
    if (this->st == NULL)
        throw myExceptions("Error: Can't create stream. FfmpegEncoder.cpp l:22\n");
    this->st->id = this->outFormatCtx->nb_streams - 1;

    // Define the framerate of the output. The numerator should stay 1;
    // the denominator is the framerate we are aiming for.
    tmp_time_base.num = 1;
    tmp_time_base.den = 60;

    // TODO: parse these values
    this->pCodecCtx->bit_rate = 5000000;
    this->pCodecCtx->width = 1280;
    this->pCodecCtx->height = 720;

    // This sets the fps. 60 fps at this point.
    this->pCodecCtx->time_base = tmp_time_base;
    this->st->time_base = tmp_time_base;

    // Insert an intra frame every gop_size frames.
    this->pCodecCtx->gop_size = 10;
    this->pCodecCtx->pix_fmt = AV_PIX_FMT_YUV420P;

    av_opt_set(this->pCodecCtx, "tune", "zerolatency", 0);
    av_opt_set(this->pCodecCtx, "vprofile", "main", 0);
    av_opt_set(this->pCodecCtx, "preset", "faster", 0);

    // Open codec, using the context + x264 options
    if (avcodec_open2(this->pCodecCtx, this->pCodec, &options) < 0)
        throw myExceptions("Error: Can't open the codec. FfmpegEncoder.cpp l:43\n");

    if (avcodec_copy_context(this->st->codec, this->pCodecCtx) != 0)
        throw myExceptions("Error: Can't copy codec context. FfmpegEncoder.cpp : l.46");

    av_dump_format(this->outFormatCtx, 0, url, 1);

    // Write the header needed to start the stream.
    if (avformat_write_header(this->outFormatCtx, NULL) != 0)
        throw myExceptions("Error: failed to connect to RTSP server. FfmpegEncoder.cpp l:48\n");
}