
Media (2)
-
Rennes Emotion Map 2010-11
19 October 2011
Updated: July 2013
Language: French
Type: Text
-
Carte de Schillerkiez
13 May 2011
Updated: September 2011
Language: English
Type: Text
Other articles (47)
-
Permissions overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors are able to modify their information on the authors page -
HTML5 audio and video support
10 April 2011. MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used instead.
The HTML5 player was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
These technologies make it possible to deliver video and sound to conventional computers as well as (...) -
From upload to final video [standalone version]
31 January 2010. The path of an audio or video document through SPIPMotion is divided into three distinct stages.
Upload and retrieval of information from the source video
First, a SPIP article must be created and the "source" video document attached to it.
When this document is attached to the article, two actions are executed in addition to the normal behavior: retrieval of the technical information of the file's audio and video streams; generation of a thumbnail: extraction of a (...)
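SPIPMotion itself is a SPIP (PHP) plugin, but as a rough illustration of those two automatic steps, here is a minimal sketch using the ffmpeg command-line tools; the file names and the one-second seek offset are illustrative choices, not anything prescribed by the article.
#include <cstdlib>

int main()
{
    // Step 1: retrieve the technical information of the audio/video streams.
    std::system("ffprobe -show_streams source.mp4 > info.txt");
    // Step 2: generate a thumbnail by extracting a single frame.
    std::system("ffmpeg -i source.mp4 -ss 1 -vframes 1 thumbnail.jpg");
    return 0;
}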
On other sites (8838)
-
OpenGL ES 2.0 on Android: YUV to RGB conversion and rendering with ffmpeg
14 October 2013, by 101110101100111111101101
My renderer dies one or two frames after the video starts showing:
FATAL ERROR 11: blabla... (it occurs exactly in glDrawElements, in the Y part)
I think the problem is glPixelStorei, or the GL_RGB / GL_LUMINANCE formats, but I can't pin it down.
My rendering path:
1. Decode the data received from the network (fetched in the SDK, decoded in the NDK) and enqueue it.
2. Another thread (synchronized, of course) dequeues it and sets up OpenGL ES 2.0 (SDK side).
3. When the onDrawFrame, onSurfaceCreated, and onSurfaceChanged methods are called, they drop down to the NDK (my NDK renderer source is attached below).
4. Render.
As you know, a fragment shader is used for the conversion. My data is YUV 420p (pix_fmt_YUV420p, 12 bits per pixel). Here is my entire source.
I had no knowledge of OpenGL ES before; this is my first time.
Please let me know what I should do to improve performance,
and which parameters I should use in glTexImage2D, glTexSubImage2D, and glRenderbufferStorage:
GL_LUMINANCE? GL_RGBA? GL_RGB? (GL_LUMINANCE is in use now.)
void Renderer::set_draw_frame(JNIEnv* jenv, jbyteArray yData, jbyteArray uData, jbyteArray vData)
{
for (int i = 0; i < 3; i++) {
if (yuv_data_[i] != NULL) {
free(yuv_data_[i]);
}
}
int YSIZE = -1;
int USIZE = -1;
int VSIZE = -1;
if (yData != NULL) {
YSIZE = (int)jenv->GetArrayLength(yData);
LOG_DEBUG("YSIZE : %d", YSIZE);
yuv_data_[0] = (unsigned char*)malloc(sizeof(unsigned char) * YSIZE);
memset(yuv_data_[0], 0, YSIZE);
jenv->GetByteArrayRegion(yData, 0, YSIZE, (jbyte*)yuv_data_[0]);
yuv_data_[0] = reinterpret_cast<unsigned char*>(yuv_data_[0]);
} else {
// yData is NULL here, so its length cannot be queried from JNI; derive the
// Y plane size from the stream dimensions instead and fill with dummy data.
YSIZE = stream_yuv_width_[0] * stream_yuv_height_[0];
yuv_data_[0] = (unsigned char*)malloc(sizeof(unsigned char) * YSIZE);
memset(yuv_data_[0], 1, YSIZE);
}
if (uData != NULL) {
USIZE = (int)jenv->GetArrayLength(uData);
LOG_DEBUG("USIZE : %d", USIZE);
yuv_data_[1] = (unsigned char*)malloc(sizeof(unsigned char) * USIZE);
memset(yuv_data_[1], 0, USIZE);
jenv->GetByteArrayRegion(uData, 0, USIZE, (jbyte*)yuv_data_[1]);
yuv_data_[1] = reinterpret_cast<unsigned char*>(yuv_data_[1]);
} else {
USIZE = YSIZE/4;
yuv_data_[1] = (unsigned char*)malloc(sizeof(unsigned char) * USIZE);
memset(yuv_data_[1], 1, USIZE);
}
if (vData != NULL) {
VSIZE = (int)jenv->GetArrayLength(vData);
LOG_DEBUG("VSIZE : %d", VSIZE);
yuv_data_[2] = (unsigned char*)malloc(sizeof(unsigned char) * VSIZE);
memset(yuv_data_[2], 0, VSIZE);
jenv->GetByteArrayRegion(vData, 0, VSIZE, (jbyte*)yuv_data_[2]);
yuv_data_[2] = reinterpret_cast<unsigned char*>(yuv_data_[2]);
} else {
VSIZE = YSIZE/4;
yuv_data_[2] = (unsigned char*)malloc(sizeof(unsigned char) * VSIZE);
memset(yuv_data_[2], 1, VSIZE);
}
glClearColor(1.0F, 1.0F, 1.0F, 1.0F);
check_gl_error("glClearColor");
glClear(GL_COLOR_BUFFER_BIT);
check_gl_error("glClear");
}
void Renderer::draw_frame()
{
// Binding created FBO
glBindFramebuffer(GL_FRAMEBUFFER, frame_buffer_object_);
check_gl_error("glBindFramebuffer");
// Add program to OpenGL environment
glUseProgram(program_object_);
check_gl_error("glUseProgram");
for (int i = 0; i < 3; i++) {
LOG_DEBUG("Success");
//Bind texture
glActiveTexture(GL_TEXTURE0 + i);
check_gl_error("glActiveTexture");
glBindTexture(GL_TEXTURE_2D, yuv_texture_id_[i]);
check_gl_error("glBindTexture");
glUniform1i(yuv_texture_object_[i], i);
check_gl_error("glBindTexture");
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, stream_yuv_width_[i], stream_yuv_height_[i], GL_RGBA, GL_UNSIGNED_BYTE, yuv_data_[i]);
check_gl_error("glTexSubImage2D");
}
LOG_DEBUG("Success");
// Load vertex information
glVertexAttribPointer(position_object_, 2, GL_FLOAT, GL_FALSE, kStride, kVertexInformation);
check_gl_error("glVertexAttribPointer");
// Load texture information
glVertexAttribPointer(texture_position_object_, 2, GL_SHORT, GL_FALSE, kStride, kTextureCoordinateInformation);
check_gl_error("glVertexAttribPointer");
LOG_DEBUG("9");
glEnableVertexAttribArray(position_object_);
check_gl_error("glEnableVertexAttribArray");
glEnableVertexAttribArray(texture_position_object_);
check_gl_error("glEnableVertexAttribArray");
// Back to window buffer
glBindFramebuffer(GL_FRAMEBUFFER, 0);
check_gl_error("glBindFramebuffer");
LOG_DEBUG("Success");
// Draw the Square
glDrawElements(GL_TRIANGLE_STRIP, 6, GL_UNSIGNED_SHORT, kIndicesInformation);
check_gl_error("glDrawElements");
}
void Renderer::setup_render_to_texture()
{
glGenFramebuffers(1, &frame_buffer_object_);
check_gl_error("glGenFramebuffers");
glBindFramebuffer(GL_FRAMEBUFFER, frame_buffer_object_);
check_gl_error("glBindFramebuffer");
glGenRenderbuffers(1, &render_buffer_object_);
check_gl_error("glGenRenderbuffers");
glBindRenderbuffer(GL_RENDERBUFFER, render_buffer_object_);
check_gl_error("glBindRenderbuffer");
glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA4, stream_yuv_width_[0], stream_yuv_height_[0]);
check_gl_error("glRenderbufferStorage");
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, render_buffer_object_);
check_gl_error("glFramebufferRenderbuffer");
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, yuv_texture_id_[0], 0);
check_gl_error("glFramebufferTexture2D");
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, yuv_texture_id_[1], 0);
check_gl_error("glFramebufferTexture2D");
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, yuv_texture_id_[2], 0);
check_gl_error("glFramebufferTexture2D");
glBindFramebuffer(GL_FRAMEBUFFER, 0);
check_gl_error("glBindFramebuffer");
GLint status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if (status != GL_FRAMEBUFFER_COMPLETE) {
print_log("renderer.cpp", "setup_graphics", "FBO setting fault.", LOGERROR);
LOG_ERROR("%d\n", status);
return;
}
}
void Renderer::setup_yuv_texture()
{
// Use tightly packed data
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
check_gl_error("glPixelStorei");
for (int i = 0; i < 3; i++) {
if (yuv_texture_id_[i]) {
glDeleteTextures(1, &yuv_texture_id_[i]);
check_gl_error("glDeleteTextures");
}
glActiveTexture(GL_TEXTURE0+i);
check_gl_error("glActiveTexture");
// Generate texture object
glGenTextures(1, &yuv_texture_id_[i]);
check_gl_error("glGenTextures");
glBindTexture(GL_TEXTURE_2D, yuv_texture_id_[i]);
check_gl_error("glBindTexture");
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
check_gl_error("glTexParameteri");
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
check_gl_error("glTexParameteri");
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
check_gl_error("glTexParameterf");
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
check_gl_error("glTexParameterf");
glEnable(GL_TEXTURE_2D);
check_gl_error("glEnable");
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, maximum_yuv_width_[i], maximum_yuv_height_[i], 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
check_gl_error("glTexImage2D");
}
}
void Renderer::setup_graphics()
{
print_gl_string("Version", GL_VERSION);
print_gl_string("Vendor", GL_VENDOR);
print_gl_string("Renderer", GL_RENDERER);
print_gl_string("Extensions", GL_EXTENSIONS);
program_object_ = create_program(kVertexShader, kFragmentShader);
if (!program_object_) {
print_log("renderer.cpp", "setup_graphics", "Could not create program.", LOGERROR);
return;
}
position_object_ = glGetAttribLocation(program_object_, "vPosition");
check_gl_error("glGetAttribLocation");
texture_position_object_ = glGetAttribLocation(program_object_, "vTexCoord");
check_gl_error("glGetAttribLocation");
yuv_texture_object_[0] = glGetUniformLocation(program_object_, "yTexture");
check_gl_error("glGetUniformLocation");
yuv_texture_object_[1] = glGetUniformLocation(program_object_, "uTexture");
check_gl_error("glGetUniformLocation");
yuv_texture_object_[2] = glGetUniformLocation(program_object_, "vTexture");
check_gl_error("glGetUniformLocation");
setup_yuv_texture();
setup_render_to_texture();
glViewport(0, 0, stream_yuv_width_[0], stream_yuv_height_[0]);//736, 480);//1920, 1080);//maximum_yuv_width_[0], maximum_yuv_height_[0]);
check_gl_error("glViewport");
}
GLuint Renderer::create_program(const char* vertex_source, const char* fragment_source)
{
GLuint vertexShader = load_shader(GL_VERTEX_SHADER, vertex_source);
if (!vertexShader) {
return 0;
}
GLuint pixelShader = load_shader(GL_FRAGMENT_SHADER, fragment_source);
if (!pixelShader) {
return 0;
}
GLuint program = glCreateProgram();
if (program) {
glAttachShader(program, vertexShader);
check_gl_error("glAttachShader");
glAttachShader(program, pixelShader);
check_gl_error("glAttachShader");
glLinkProgram(program);
/* Get a Status */
GLint linkStatus = GL_FALSE;
glGetProgramiv(program, GL_LINK_STATUS, &linkStatus);
if (linkStatus != GL_TRUE) {
GLint bufLength = 0;
glGetProgramiv(program, GL_INFO_LOG_LENGTH, &bufLength);
if (bufLength) {
char* buf = (char*) malloc(bufLength);
if (buf) {
glGetProgramInfoLog(program, bufLength, NULL, buf);
print_log("renderer.cpp", "create_program", "Could not link program.", LOGERROR);
LOG_ERROR("%s\n", buf);
free(buf);
}
}
glDeleteProgram(program);
program = 0;
}
}
return program;
}
GLuint Renderer::load_shader(GLenum shaderType, const char* pSource)
{
GLuint shader = glCreateShader(shaderType);
if (shader) {
glShaderSource(shader, 1, &pSource, NULL);
glCompileShader(shader);
/* Get a Status */
GLint compiled = 0;
glGetShaderiv(shader, GL_COMPILE_STATUS, &compiled);
if (!compiled) {
GLint infoLen = 0;
glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &infoLen);
if (infoLen) {
char* buf = (char*) malloc(infoLen);
if (buf) {
glGetShaderInfoLog(shader, infoLen, NULL, buf);
print_log("renderer.cpp", "load_shader", "Could not link program.", LOGERROR);
LOG_ERROR("%d :: %s\n", shaderType, buf);
free(buf);
}
glDeleteShader(shader);
shader = 0;
}
}
}
return shader;
}
void Renderer::onDrawFrame(JNIEnv* jenv, jbyteArray yData, jbyteArray uData, jbyteArray vData)
{
set_draw_frame(jenv, yData, uData, vData);
draw_frame();
return;
}
void Renderer::setSize(int stream_width, int stream_height) {
stream_yuv_width_[0] = stream_width;
stream_yuv_width_[1] = stream_width/2;
stream_yuv_width_[2] = stream_width/2;
stream_yuv_height_[0] = stream_height;
stream_yuv_height_[1] = stream_height/2;
stream_yuv_height_[2] = stream_height/2;
}
void Renderer::onSurfaceChanged(int width, int height)
{
mobile_yuv_width_[0] = width;
mobile_yuv_width_[1] = width/2;
mobile_yuv_width_[2] = width/2;
mobile_yuv_height_[0] = height;
mobile_yuv_height_[1] = height/2;
mobile_yuv_height_[2] = height/2;
maximum_yuv_width_[0] = 1920;
maximum_yuv_width_[1] = 1920/2;
maximum_yuv_width_[2] = 1920/2;
maximum_yuv_height_[0] = 1080;
maximum_yuv_height_[1] = 1080/2;
maximum_yuv_height_[2] = 1080/2;
// If stream size not setting, default size D1
//if (stream_yuv_width_[0] == 0) {
stream_yuv_width_[0] = 736;
stream_yuv_width_[1] = 736/2;
stream_yuv_width_[2] = 736/2;
stream_yuv_height_[0] = 480;
stream_yuv_height_[1] = 480/2;
stream_yuv_height_[2] = 480/2;
//}
setup_graphics();
return;
}
Here is my fragment and vertex shader source and the coordinates:
static const char kVertexShader[] =
"attribute vec4 vPosition; \n"
"attribute vec2 vTexCoord; \n"
"varying vec2 v_vTexCoord; \n"
"void main() { \n"
"gl_Position = vPosition; \n"
"v_vTexCoord = vTexCoord; \n"
"} \n";
static const char kFragmentShader[] =
"precision mediump float; \n"
"varying vec2 v_vTexCoord; \n"
"uniform sampler2D yTexture; \n"
"uniform sampler2D uTexture; \n"
"uniform sampler2D vTexture; \n"
"void main() { \n"
"float y=texture2D(yTexture, v_vTexCoord).r;\n"
"float u=texture2D(uTexture, v_vTexCoord).r - 0.5;\n"
"float v=texture2D(vTexture, v_vTexCoord).r - 0.5;\n"
"float r=y + 1.13983 * v;\n"
"float g=y - 0.39465 * u - 0.58060 * v;\n"
"float b=y + 2.03211 * u;\n"
"gl_FragColor = vec4(r, g, b, 1.0);\n"
"}\n";
static const GLfloat kVertexInformation[] =
{
-1.0f, 1.0f, // TexCoord 0 top left
-1.0f,-1.0f, // TexCoord 1 bottom left
1.0f,-1.0f, // TexCoord 2 bottom right
1.0f, 1.0f // TexCoord 3 top right
};
static const GLshort kTextureCoordinateInformation[] =
{
0, 0, // TexCoord 0 top left
0, 1, // TexCoord 1 bottom left
1, 1, // TexCoord 2 bottom right
1, 0 // TexCoord 3 top right
};
static const GLuint kStride = 0;//COORDS_PER_VERTEX * 4;
static const GLshort kIndicesInformation[] =
{
0, 1, 2,
0, 2, 3
};
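For what it's worth, one likely culprit in the code above: the textures are allocated and updated as GL_RGBA, yet each of the three YUV 420p planes holds a single byte per pixel. In OpenGL ES 2.0 the format passed to glTexSubImage2D must match the texture's internal format, and for one-channel planes that format is GL_LUMINANCE. Below is a minimal sketch of a per-plane upload under that assumption; upload_plane is an illustrative helper, not part of the original code, and the widths and heights are the per-plane sizes from setSize().
#include <GLES2/gl2.h>

// Sketch: upload one 8-bit Y, U or V plane into a luminance texture.
// Assumes GL_UNPACK_ALIGNMENT has been set to 1, as in setup_yuv_texture().
static void upload_plane(GLuint texture_id, GLsizei w, GLsizei h,
                         const unsigned char* plane)
{
    glBindTexture(GL_TEXTURE_2D, texture_id);
    // Allocate with matching internal format and format; OpenGL ES 2.0
    // requires the two to be identical.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, w, h, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, NULL);
    // Per frame, only the pixel data changes:
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                    GL_LUMINANCE, GL_UNSIGNED_BYTE, plane);
}
The fragment shader above already samples only the .r channel, which is exactly what a luminance texture returns, so no shader change should be needed.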
-
lavfi: do not export the filters from shared objects
28 October 2013, by Anton Khirnov
- libavfilter/af_aformat.c
- libavfilter/af_amix.c
- libavfilter/af_anull.c
- libavfilter/af_ashowinfo.c
- libavfilter/af_asyncts.c
- libavfilter/af_channelmap.c
- libavfilter/af_channelsplit.c
- libavfilter/af_join.c
- libavfilter/af_resample.c
- libavfilter/af_volume.c
- libavfilter/allfilters.c
- libavfilter/asink_anullsink.c
- libavfilter/asrc_anullsrc.c
- libavfilter/buffersink.c
- libavfilter/buffersrc.c
- libavfilter/fifo.c
- libavfilter/setpts.c
- libavfilter/split.c
- libavfilter/trim.c
- libavfilter/vf_aspect.c
- libavfilter/vf_blackframe.c
- libavfilter/vf_boxblur.c
- libavfilter/vf_copy.c
- libavfilter/vf_crop.c
- libavfilter/vf_cropdetect.c
- libavfilter/vf_delogo.c
- libavfilter/vf_drawbox.c
- libavfilter/vf_drawtext.c
- libavfilter/vf_fade.c
- libavfilter/vf_fieldorder.c
- libavfilter/vf_format.c
- libavfilter/vf_fps.c
- libavfilter/vf_frei0r.c
- libavfilter/vf_gradfun.c
- libavfilter/vf_hflip.c
- libavfilter/vf_hqdn3d.c
- libavfilter/vf_interlace.c
- libavfilter/vf_libopencv.c
- libavfilter/vf_lut.c
- libavfilter/vf_null.c
- libavfilter/vf_overlay.c
- libavfilter/vf_pad.c
- libavfilter/vf_pixdesctest.c
- libavfilter/vf_scale.c
- libavfilter/vf_select.c
- libavfilter/vf_settb.c
- libavfilter/vf_showinfo.c
- libavfilter/vf_transpose.c
- libavfilter/vf_unsharp.c
- libavfilter/vf_vflip.c
- libavfilter/vf_yadif.c
- libavfilter/vsink_nullsink.c
- libavfilter/vsrc_color.c
- libavfilter/vsrc_movie.c
- libavfilter/vsrc_nullsrc.c
- libavfilter/vsrc_testsrc.c
-
keepalive type and frequency in ffmpeg [on hold]
19 November 2013, by Jack Simth
My company distributes a bunch of IP cameras, specifically Grandstream, and the manufacturer has changed their firmware. The normal keepalive that ffmpeg uses for RTSP streams (either ff_rtsp_send_cmd_async(s, "GET_PARAMETER", rt->control_uri, NULL); or ff_rtsp_send_cmd_async(s, "OPTIONS", "*", NULL);, both in libavformat/rtspdec.c) no longer works, for two reasons:
1) The new Grandstream firmware now checks for a receiver report to determine whether the program reading the stream is live, rather than accepting just any request.
2) The new Grandstream firmware requires the keepalive receiver report at least once every 25 seconds, and on the audio stream it currently arrives only about every 30 seconds (the video stream gets one every 7 seconds or so).
So after about a minute with ffmpeg connected, the camera stops sending the audio stream, ffmpeg reads end-of-file on it, and then ffmpeg shuts everything down.
Since I can't change the firmware, I'm digging through the ffmpeg code to make it send receiver reports often enough to keep the connection alive, but I'm getting nowhere. I've added a little snippet of code to the receiver-report path so I can see when it runs under debug logging, but it's not going well.
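For orientation, the receiver reports in question are small fixed-layout RTCP packets (RFC 3550, section 6.4.2), which ffmpeg assembles in libavformat/rtpdec.c. Below is a minimal sketch of the wire format with a single report block; build_rtcp_rr and put_u32 are illustrative helpers, not ffmpeg APIs, and every statistics field is a placeholder the caller must fill from its own RTP reception state.
#include <cstdint>
#include <vector>

// Append a 32-bit value in network byte order.
static void put_u32(std::vector<uint8_t>& buf, uint32_t v)
{
    buf.push_back(v >> 24); buf.push_back(v >> 16);
    buf.push_back(v >> 8);  buf.push_back(v);
}

// Build one RTCP Receiver Report (RFC 3550, section 6.4.2) with a single
// report block: 4-byte header + our SSRC + 24-byte block = 32 bytes.
std::vector<uint8_t> build_rtcp_rr(uint32_t our_ssrc, uint32_t source_ssrc,
                                   uint8_t fraction_lost, uint32_t cum_lost,
                                   uint32_t ext_highest_seq, uint32_t jitter,
                                   uint32_t last_sr, uint32_t delay_since_sr)
{
    std::vector<uint8_t> pkt;
    pkt.push_back(0x81);                // V=2, P=0, report count = 1
    pkt.push_back(201);                 // PT = 201 (Receiver Report)
    pkt.push_back(0); pkt.push_back(7); // length: 7 more 32-bit words
    put_u32(pkt, our_ssrc);             // SSRC of this receiver
    put_u32(pkt, source_ssrc);          // SSRC of the camera's stream
    put_u32(pkt, (uint32_t(fraction_lost) << 24) | (cum_lost & 0xFFFFFF));
    put_u32(pkt, ext_highest_seq);      // extended highest sequence seen
    put_u32(pkt, jitter);               // interarrival jitter
    put_u32(pkt, last_sr);              // middle 32 bits of last SR's NTP time
    put_u32(pkt, delay_since_sr);       // delay since last SR, 1/65536 s units
    return pkt;
}
Whatever the eventual patch looks like, the behavior the camera appears to expect is one such packet on each stream's RTCP channel at least every 25 seconds, independent of how often data packets arrive.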
Test command:
ffmpeg -loglevel debug -i rtsp://admin:admin@192.168.4.3:554/0 -acodec libmp3lame -ar 22050 -vcodec copy -y -f flv /dev/null &> test.txt
Test output:
[root@localhost ffmpeg]# cat test.txt
ffmpeg version 2.0 Copyright (c) 2000-2013 the FFmpeg developers
built on Aug 21 2013 14:24:28 with gcc 4.4.7 (GCC) 20120313 (Red Hat 4.4.7-3)
configuration: --datadir=/usr/share/ffmpeg --bindir=/usr/local/bin --libdir=/usr/local/lib --incdir=/usr/local/include --shlibdir=/usr/lib --mandir=/usr/share/man --disable-avisynth --extra-cflags='-O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m32 -march=i386 -mtune=generic -fasynchronous-unwind-tables' --enable-avfilter --enable-libx264 --enable-gpl --enable-version3 --enable-postproc --enable-pthreads --enable-shared --enable-swscale --enable-vdpau --enable-x11grab --enable-librtmp --enable-libopencore-amrnb --enable-libopencore-amrwb --disable-static --enable-libgsm --enable-libxvid --enable-libvpx --enable-libvorbis --enable-libvo-aacenc --enable-libmp3lame
libavutil 52. 38.100 / 52. 38.100
libavcodec 55. 18.102 / 55. 18.102
libavformat 55. 12.100 / 55. 12.100
libavdevice 55. 3.100 / 55. 3.100
libavfilter 3. 79.101 / 3. 79.101
libswscale 2. 3.100 / 2. 3.100
libswresample 0. 17.102 / 0. 17.102
libpostproc 52. 3.100 / 52. 3.100
Splitting the commandline.
Reading option '-loglevel' ... matched as option 'loglevel' (set logging level) with argument 'debug'.
Reading option '-i' ... matched as input file with argument 'rtsp://admin:admin@192.168.4.3:554/0'.
Reading option '-acodec' ... matched as option 'acodec' (force audio codec ('copy' to copy stream)) with argument 'libmp3lame'.
Reading option '-ar' ... matched as option 'ar' (set audio sampling rate (in Hz)) with argument '22050'.
Reading option '-vcodec' ... matched as option 'vcodec' (force video codec ('copy' to copy stream)) with argument 'copy'.
Reading option '-y' ... matched as option 'y' (overwrite output files) with argument '1'.
Reading option '-f' ... matched as option 'f' (force format) with argument 'flv'.
Reading option '/dev/null' ... matched as output file.
Finished splitting the commandline.
Parsing a group of options: global .
Applying option loglevel (set logging level) with argument debug.
Applying option y (overwrite output files) with argument 1.
Successfully parsed a group of options.
Parsing a group of options: input file rtsp://admin:admin@192.168.4.3:554/0.
Successfully parsed a group of options.
Opening an input file: rtsp://admin:admin@192.168.4.3:554/0.
[rtsp @ 0x9d9ccc0] SDP:
v=0
o=StreamingServer 3331435948 1116907222000 IN IP4 192.168.4.3
s=h264.mp4
c=IN IP4 0.0.0.0
t=0 0
a=control:*
m=video 0 RTP/AVP 96
a=control:trackID=0
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1; sprop-parameter-sets=Z0LgHtoCgPRA,aM4wpIA=
m=audio 0 RTP/AVP 0
a=control:trackID=1
a=rtpmap:0 PCMU/8000
a=ptime:20
m=application 0 RTP/AVP 107
a=control:trackID=2
a=rtpmap:107 vnd.onvif.metadata/90000
[rtsp @ 0x9d9ccc0] video codec set to: h264
[NULL @ 0x9d9f400] RTP Packetization Mode: 1
[NULL @ 0x9d9f400] Extradata set to 0x9d9f900 (size: 22)!
[rtsp @ 0x9d9ccc0] audio codec set to: pcm_mulaw
[rtsp @ 0x9d9ccc0] audio samplerate set to: 8000
[rtsp @ 0x9d9ccc0] audio channels set to: 1
[rtsp @ 0x9d9ccc0] hello state=0
[h264 @ 0x9d9f400] Current profile doesn't provide more RBSP data in PPS, skipping
Last message repeated 1 times
[rtsp @ 0x9d9ccc0] All info found
Guessed Channel Layout for Input Stream #0.1 : mono
Input #0, rtsp, from 'rtsp://admin:admin@192.168.4.3:554/0':
Metadata:
title : h264.mp4
Duration: N/A, start: 0.000000, bitrate: 64 kb/s
Stream #0:0, 28, 1/90000: Video: h264 (Constrained Baseline), yuv420p, 640x480, 1/180000, 10 tbr, 90k tbn, 180k tbc
Stream #0:1, 156, 1/8000: Audio: pcm_mulaw, 8000 Hz, mono, s16, 64 kb/s
Successfully opened the file.
Parsing a group of options: output file /dev/null.
Applying option acodec (force audio codec ('copy' to copy stream)) with argument libmp3lame.
Applying option ar (set audio sampling rate (in Hz)) with argument 22050.
Applying option vcodec (force video codec ('copy' to copy stream)) with argument copy.
Applying option f (force format) with argument flv.
Successfully parsed a group of options.
Opening an output file: /dev/null.
Successfully opened the file.
detected 2 logical cores
[graph 0 input from stream 0:1 @ 0x9f15380] Setting 'time_base' to value '1/8000'
[graph 0 input from stream 0:1 @ 0x9f15380] Setting 'sample_rate' to value '8000'
[graph 0 input from stream 0:1 @ 0x9f15380] Setting 'sample_fmt' to value 's16'
[graph 0 input from stream 0:1 @ 0x9f15380] Setting 'channel_layout' to value '0x4'
[graph 0 input from stream 0:1 @ 0x9f15380] tb:1/8000 samplefmt:s16 samplerate:8000 chlayout:0x4
[audio format for output stream 0:1 @ 0x9efa7c0] Setting 'sample_fmts' to value 's32p|fltp|s16p'
[audio format for output stream 0:1 @ 0x9efa7c0] Setting 'sample_rates' to value '22050'
[audio format for output stream 0:1 @ 0x9efa7c0] Setting 'channel_layouts' to value '0x4|0x3'
[audio format for output stream 0:1 @ 0x9efa7c0] auto-inserting filter 'auto-inserted resampler 0' between the filter 'Parsed_anull_0' and the filter 'audio format for output stream 0:1'
[AVFilterGraph @ 0x9f15980] query_formats: 4 queried, 9 merged, 3 already done, 0 delayed
[auto-inserted resampler 0 @ 0x9dfada0] ch:1 chl:mono fmt:s16 r:8000Hz -> ch:1 chl:mono fmt:s16p r:22050Hz
Output #0, flv, to '/dev/null':
Metadata:
title : h264.mp4
encoder : Lavf55.12.100
Stream #0:0, 0, 1/1000: Video: h264 ([7][0][0][0] / 0x0007), yuv420p, 640x480, 1/90000, q=2-31, 1k tbn, 90k tbc
Stream #0:1, 0, 1/1000: Audio: mp3 (libmp3lame) ([2][0][0][0] / 0x0002), 22050 Hz, mono, s16p
Stream mapping:
Stream #0:0 -> #0:0 (copy)
Stream #0:1 -> #0:1 (pcm_mulaw -> libmp3lame)
Press [q] to stop, [?] for help
Current profile doesn't provide more RBSP data in PPS, skippingrate= 135.4kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 134.4kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 135.0kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 135.5kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 136.9kbits/s
Queue input is backward in time= 233kB time=00:00:13.69 bitrate= 139.4kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 136.3kbits/s
[flv @ 0x9de1200] Non-monotonous DTS in output stream 0:1; previous: 14239, current: 13926; changing to 14239. This may result in incorrect timestamps in the output file.
[flv @ 0x9de1200] Non-monotonous DTS in output stream 0:1; previous: 14239, current: 13952; changing to 14239. This may result in incorrect timestamps in the output file.
[flv @ 0x9de1200] Non-monotonous DTS in output stream 0:1; previous: 14239, current: 13979; changing to 14239. This may result in incorrect timestamps in the output file.
[flv @ 0x9de1200] Non-monotonous DTS in output stream 0:1; previous: 14239, current: 14005; changing to 14239. This may result in incorrect timestamps in the output file.
[flv @ 0x9de1200] Non-monotonous DTS in output stream 0:1; previous: 14239, current: 14031; changing to 14239. This may result in incorrect timestamps in the output file.
[flv @ 0x9de1200] Non-monotonous DTS in output stream 0:1; previous: 14239, current: 14057; changing to 14239. This may result in incorrect timestamps in the output file.
[flv @ 0x9de1200] Non-monotonous DTS in output stream 0:1; previous: 14239, current: 14083; changing to 14239. This may result in incorrect timestamps in the output file.
[flv @ 0x9de1200] Non-monotonous DTS in output stream 0:1; previous: 14239, current: 14109; changing to 14239. This may result in incorrect timestamps in the output file.
[flv @ 0x9de1200] Non-monotonous DTS in output stream 0:1; previous: 14239, current: 14135; changing to 14239. This may result in incorrect timestamps in the output file.
[flv @ 0x9de1200] Non-monotonous DTS in output stream 0:1; previous: 14239, current: 14161; changing to 14239. This may result in incorrect timestamps in the output file.
[flv @ 0x9de1200] Non-monotonous DTS in output stream 0:1; previous: 14239, current: 14188; changing to 14239. This may result in incorrect timestamps in the output file.
[flv @ 0x9de1200] Non-monotonous DTS in output stream 0:1; previous: 14239, current: 14214; changing to 14239. This may result in incorrect timestamps in the output file.
Current profile doesn't provide more RBSP data in PPS, skippingrate= 141.5kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 142.0kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 142.5kbits/s
Receiver Report delay: 469789, gettime: -1527669086, last_recep: 322446, timebase: -1534837492
Current profile doesn't provide more RBSP data in PPS, skippingrate= 141.5kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 141.7kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 141.1kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 140.6kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 140.7kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 139.9kbits/s
Receiver Report delay: 132993, gettime: -1516538925, last_recep: 322446, timebase: -1518568234
Current profile doesn't provide more RBSP data in PPS, skippingrate= 139.6kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 139.6kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 139.7kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 139.4kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 140.0kbits/s
Receiver Report delay: 897727, gettime: -1504870331, last_recep: 322446, timebase: -1518568552
[NULL @ 0x9d9f400] Current profile doesn't provide more RBSP data in PPS, skipping
Current profile doesn't provide more RBSP data in PPS, skippingrate= 139.4kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 139.1kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 139.0kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 139.0kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 138.6kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 138.5kbits/s
Current profile doesn't provide more RBSP data in PPS, skippingrate= 138.2kbits/s
EOF on sink link output stream 0:1:default.time=00:00:58.40 bitrate= 139.6kbits/s
No more output streams to write to, finishing.
[libmp3lame @ 0x9dfa580] Trying to remove 344 more samples than there are in the queue
frame= 589 fps= 11 q=-1.0 Lsize= 1003kB time=00:00:58.85 bitrate= 139.5kbits/s
video:724kB audio:231kB subtitle:0 global headers:0kB muxing overhead 4.955356%
2959 frames successfully decoded, 0 decoding errors
[AVIOContext @ 0x9e021c0] Statistics: 3 seeks, 2860 writeouts
[root@localhost ffmpeg]#