Newest 'ffmpeg' Questions - Stack Overflow
Articles published on the site
-
FFmpeg fails to build for Android
8 September 2014, by Andro-Begg
I'm trying to build FFmpeg for Android. Everything works just fine until the build run by build_android.sh, where I get at the end:

/bin/sh: 1: cd: can't cd to ../ffmpeg-build/armeabi/lib
make: *** [install-libavresample-shared] Error 2
make: *** Waiting for unfinished jobs....
/bin/sh: 1: cd: can't cd to ../ffmpeg-build/armeabi/lib
make: *** [install-libavutil-shared] Error 2
/bin/sh: 1: cd: can't cd to ../ffmpeg-build/armeabi/lib
make: *** [install-libswresample-shared] Error 2
I've been investigating this for a while but couldn't figure out where the problem is.
Any help would be much appreciated. Thank you.
-
interrupt_callback in ffmpeg
8 September 2014, by chesschi
I understand that many of the functions provided by the FFmpeg libraries are blocking. To make a blocking function time out, we can assign a callback function to interrupt_callback.
static int interrupt_cb(void *ctx)
What is the optimal value for this timeout? How will these blocking functions behave when the timeout value is too small?
For example, avformat_open_input may take 1 second to complete. If I set a timer for 500 milliseconds, will it never be able to connect to the RTSP server, and will it return a non-zero value? What is the best approach to using this callback function?
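For reference, the usual pattern is a callback that returns non-zero once a deadline passes: FFmpeg polls it periodically during blocking calls such as avformat_open_input and av_read_frame, and aborts the call with an error when it returns non-zero, so a too-small timeout makes the call fail early rather than hang forever. A minimal sketch (the TimeoutCtx struct and the 5-second value are illustrative, not FFmpeg API):

```c
#include <time.h>

/* Illustrative context: when the operation started and how long to allow. */
typedef struct {
    time_t start;       /* set just before the blocking call */
    int    timeout_sec; /* e.g. 5; tune to your network */
} TimeoutCtx;

/* Returning non-zero tells FFmpeg to abort the current blocking operation. */
static int interrupt_cb(void *opaque)
{
    const TimeoutCtx *t = (const TimeoutCtx *)opaque;
    return (time(NULL) - t->start) > t->timeout_sec;
}

/* With the FFmpeg API the wiring would look like (sketch, not compiled here):
 *
 *   AVFormatContext *fmt = avformat_alloc_context();
 *   TimeoutCtx ctx = { time(NULL), 5 };
 *   fmt->interrupt_callback.callback = interrupt_cb;
 *   fmt->interrupt_callback.opaque   = &ctx;
 *   avformat_open_input(&fmt, "rtsp://...", NULL, NULL);
 */
```

Resetting ctx.start before each blocking call (open, then each read) makes the timeout apply per operation rather than to the whole session, which is typically what you want for a live stream.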
-
How to render video via OpenGL ES 2.0 with the Android NDK
8 September 2014, by wolfz
I wrote an APK with the NDK that decodes video frames with FFmpeg and renders them with OpenGL ES 2.0, but the screen displays nothing. Did I make a mistake somewhere?
My OpenGL ES 2.0 init code is below:
static int window_init_display(void)
{
    EGLint attribs[] = {
        EGL_RED_SIZE, 8, EGL_GREEN_SIZE, 8, EGL_BLUE_SIZE, 8,
        EGL_ALPHA_SIZE, 8, EGL_DEPTH_SIZE, 16, EGL_STENCIL_SIZE, 0,
        EGL_SAMPLE_BUFFERS, 0, EGL_SAMPLES, 0,
        EGL_SURFACE_TYPE, EGL_WINDOW_BIT,
        EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
        EGL_NONE
    };
    EGLint w, h, format;
    EGLint numConfigs;
    EGLConfig config;
    EGLSurface surface;
    EGLContext context;

    EGLDisplay display = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    eglInitialize(display, 0, 0);
    eglChooseConfig(display, attribs, &config, 1, &numConfigs);
    eglGetConfigAttrib(display, config, EGL_NATIVE_VISUAL_ID, &format);

    ANativeWindow_setBuffersGeometry(g_application->window, 0, 0, format);
    surface = eglCreateWindowSurface(display, config, g_application->window, NULL);
    eglQuerySurface(display, surface, EGL_WIDTH, &w);
    eglQuerySurface(display, surface, EGL_HEIGHT, &h);
    LOGI("EGL_WIDTH=%d, EGL_HEIGHT=%d", w, h);

    eglBindAPI(EGL_OPENGL_ES_API);
    EGLint contextAttrs[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
    context = eglCreateContext(display, config, NULL, contextAttrs);
    if (eglMakeCurrent(display, surface, surface, context) == EGL_FALSE) {
        LOGW("Unable to eglMakeCurrent");
        return -1;
    }
    LOGI("OK init EGL !!!!");
My render code is below:
int m_rgbBufferSize = pPicture->iDisplayWidth * pPicture->iDisplayHeight * 4;
m_rgbBuffer = new unsigned char[m_rgbBufferSize];

struct SwsContext *m_sw_context = NULL;
m_sw_context = m_dllSwScale.sws_getCachedContext(m_sw_context,
    pPicture->iWidth, pPicture->iHeight, PIX_FMT_YUV420P,
    pPicture->iWidth, pPicture->iHeight, PIX_FMT_RGBA,
    SWS_FAST_BILINEAR, NULL, NULL, NULL);

uint8_t *psrc[] = { pPicture->data[0], pPicture->data[1], pPicture->data[2], 0 };
int srcStride[] = { pPicture->iLineSize[0], pPicture->iLineSize[1], pPicture->iLineSize[2], 0 };
uint8_t *dst[] = { m_rgbBuffer, 0, 0, 0 };
int dstStride[] = { pPicture->iDisplayWidth * 4, 0, 0, 0 };
m_dllSwScale.sws_scale(m_sw_context, psrc, srcStride, 0, pPicture->iDisplayHeight, dst, dstStride);
//LOGV("swscale OK");

glEnable(GL_TEXTURE_2D);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
if (textureid == 0)
    glGenTextures(1, &textureid);
glActiveTexture(GL_TEXTURE2);
glBindTexture(GL_TEXTURE_2D, textureid);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0,
    pPicture->iDisplayWidth, pPicture->iDisplayHeight,
    GL_RGBA, GL_UNSIGNED_BYTE, m_rgbBuffer);
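One detail worth flagging in code like the above (an observation about the API, not a verified diagnosis of this poster's bug): glTexSubImage2D only updates a texture whose storage was already allocated with glTexImage2D, and the result reaches the screen only after a draw call and eglSwapBuffers. The packed-RGBA buffer sizing handed to sws_scale can at least be sanity-checked standalone (plain C; the helper names are illustrative):

```c
#include <stddef.h>

/* PIX_FMT_RGBA output is a single packed plane: 4 bytes per pixel.
 * dstStride[0] must be at least width * 4, and the destination
 * buffer must hold height * stride bytes. */
static size_t rgba_stride(int width)
{
    return (size_t)width * 4;
}

static size_t rgba_buffer_size(int width, int height)
{
    return rgba_stride(width) * (size_t)height;
}
```

For a 720x576 frame this gives a 2880-byte stride and a 1,658,880-byte buffer, matching the iDisplayWidth*iDisplayHeight*4 allocation used above.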
-
Receiving raw MPEG2-TS data over RTSP (RTP) via FFmpeg
8 September 2014, by AGP
I am trying to read raw MPEG2-TS data packets (188 bytes) from an RTSP (RTP) stream. I cannot figure out how to code a client that opens the stream and reads the raw data. For other video formats, without raw data handling, this is much easier in FFmpeg.
How can I code this type of client with FFmpeg?
Thanks
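Whatever transport layer delivers the bytes, the raw TS framing itself is simple to handle by hand: fixed 188-byte packets, each beginning with the sync byte 0x47. A standalone sketch (plain C, no FFmpeg dependency; the function name is illustrative):

```c
#include <stddef.h>
#include <stdint.h>

#define TS_PACKET_SIZE 188
#define TS_SYNC_BYTE   0x47

/* Count complete TS packets in buf, verifying that each one starts
 * with the 0x47 sync byte. Returns the packet count, or -1 on the
 * first framing error. */
static int count_ts_packets(const uint8_t *buf, size_t len)
{
    size_t n = len / TS_PACKET_SIZE;
    for (size_t i = 0; i < n; i++)
        if (buf[i * TS_PACKET_SIZE] != TS_SYNC_BYTE)
            return -1;
    return (int)n;
}
```

On the FFmpeg side, note that avformat_open_input on an rtsp:// URL followed by av_read_frame yields demuxed elementary-stream packets, not untouched 188-byte TS packets; getting the raw TS usually means receiving the RTP payload yourself (or having the server send over plain UDP and reading the datagrams directly).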
-
JavaCV on Android : UnsatisfiedLinkError from libjniavdevice.so / dlopen failed cannot locate symbol "av_input_audio_device_next"
8 September 2014, by liorg
I am trying to work with JavaCV's FFmpegFrameGrabber on Android 4.4.2 / 4.3.1.
Inside the onCreate(...) method I am calling:
FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(path)
This causes the following runtime error:
dlopen("/data/app-lib/com.example.testjavacv-1/libjniavdevice.so") failed: dlopen failed: cannot locate symbol "av_input_audio_device_next" referenced by "libjniavdevice.so"
Other (possible) relevant information:
1) I have included all the JavaCV .so libraries inside the libs/armeabi and libs/x86 folders.
2) I have given the application permissions to read/write to external storage and access the camera and microphone.
3) I am NOT statically loading any of the JavaCV native libraries before instantiating FFmpegFrameGrabber. This is because:
- Some of the libraries cause runtime exceptions when I attempt to load them statically by myself.
- I get the impression that JavaCV handles its own native library loading. For example, the error occurs even though I have not requested to load the jniavdevice.so library, and it happens only if the new FFmpegFrameGrabber(...) call is present.
4) I have taken a look at: https://groups.google.com/forum/#!topic/javacv/bgB5rCv_fVM which seems to address a similar issue, but the suggested solutions don't seem to work.
Any help on this issue would be much appreciated. Cheers!