
Media (17)
-
Matmos - Action at a Distance
15 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
DJ Dolores - Oslodum 2004 (includes (cc) sample of “Oslodum” by Gilberto Gil)
15 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Danger Mouse & Jemini - What U Sittin’ On ? (starring Cee Lo and Tha Alkaholiks)
15 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Cornelius - Wataridori 2
15 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
The Rapture - Sister Saviour (Blackstrobe Remix)
15 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Chuck D with Fine Arts Militia - No Meaning No
15 September 2011, by
Updated: September 2011
Language: English
Type: Audio
Other articles (43)
-
Contribute to translation
13 April 2011
You can help us improve the language used in the software interface, to make MediaSPIP more accessible and user-friendly. You can also translate the interface into other languages, which lets MediaSPIP spread to new linguistic communities.
To do this we use the SPIP translation interface, where all of MediaSPIP's language modules are available. Just subscribe to the mailing list and ask for further information about translation.
MediaSPIP is currently available in French and English (...)
-
Other interesting software
13 April 2011, by
We don't claim to be the only ones doing what we do, and we certainly don't claim to be the best; we simply try to do it well and to keep getting better.
The following list covers software that is more or less similar to MediaSPIP, or whose feature set MediaSPIP more or less tries to match.
We haven't used or tested them, but you can take a look.
Videopress
Website: http://videopress.com/
License: GNU/GPL v2
Source code: (...)
-
Libraries and binaries specific to video and audio processing
31 January 2010, by
The following software and libraries are used by SPIPmotion in one way or another.
Required binaries: FFmpeg: the main encoder; it transcodes almost all types of video and audio files into formats that can be played on the Internet. See this tutorial for its installation; Oggz-tools: tools for inspecting Ogg files; MediaInfo: retrieves information from most video and audio formats;
Optional complementary binaries: flvtool2: (...)
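Editor's note: as a concrete illustration of FFmpeg's transcoding role described above, a conversion to a web-playable format looks like the command below. This is only a hedged sketch with placeholder file names; the flags SPIPmotion actually passes depend on the site's configuration.
ffmpeg -i source.avi -vcodec libtheora -acodec libvorbis destination.ogv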
Sur d’autres sites (7704)
-
OpenGL ES 2.0 on Android, YUV to RGB and rendering with FFmpeg
14 October 2013, by 101110101100111111101101
My renderer dies one or two frames after the video starts to show.
FATAL ERROR 11: blabla... (it occurs exactly in glDrawElements, in the Y-plane pass)
I think the problem is in 'glPixelStorei' or in the 'GL_RGB' / 'GL_LUMINANCE' parameters, but I can't work it out.
My rendering pipeline:
-
Decode the data received from the network (fetched in the SDK, decoded in the NDK), then enqueue it.
-
Another thread (synchronized, of course) dequeues it and sets up OpenGL ES 2.0 (SDK side).
-
When the onDrawFrame, onSurfaceCreated and onSurfaceChanged methods are called, execution drops down into the NDK. (My NDK renderer source is attached below.)
-
Render.
As you know, the fragment shader is used for the conversion.
My data is YUV 420p (pix_fmt_YUV420p), i.e. 12 bits per pixel: for a W x H frame the Y plane is W x H bytes and the U and V planes are (W/2) x (H/2) bytes each.
Here is my entire source.
I don't have any prior knowledge of OpenGL ES; this is my first time with it.
Please let me know what I should do to improve performance,
and what parameters I should use in 'glTexImage2D', 'glTexSubImage2D' and 'glRenderbufferStorage'.
GL_LUMINANCE? GL_RGBA? GL_RGB? (GL_LUMINANCE is what I use now.)

void Renderer::set_draw_frame(JNIEnv* jenv, jbyteArray yData, jbyteArray uData, jbyteArray vData)
{
for (int i = 0; i < 3; i++) {
if (yuv_data_[i] != NULL) {
free(yuv_data_[i]);
}
}
int YSIZE = -1;
int USIZE = -1;
int VSIZE = -1;
if (yData != NULL) {
YSIZE = (int)jenv->GetArrayLength(yData);
LOG_DEBUG("YSIZE : %d", YSIZE);
yuv_data_[0] = (unsigned char*)malloc(sizeof(unsigned char) * YSIZE);
memset(yuv_data_[0], 0, YSIZE);
jenv->GetByteArrayRegion(yData, 0, YSIZE, (jbyte*)yuv_data_[0]);
yuv_data_[0] = reinterpret_cast<unsigned char*>(yuv_data_[0]);
} else {
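// (editor's note) yData is NULL in this branch, so the GetArrayLength(yData)
// call below dereferences a null reference; the size has to come from
// somewhere else (e.g. the previous frame's YSIZE).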
YSIZE = (int)jenv->GetArrayLength(yData);
yuv_data_[0] = (unsigned char*)malloc(sizeof(unsigned char) * YSIZE);
memset(yuv_data_[0], 1, YSIZE);
}
if (uData != NULL) {
USIZE = (int)jenv->GetArrayLength(uData);
LOG_DEBUG("USIZE : %d", USIZE);
yuv_data_[1] = (unsigned char*)malloc(sizeof(unsigned char) * USIZE);
memset(yuv_data_[1], 0, USIZE);
jenv->GetByteArrayRegion(uData, 0, USIZE, (jbyte*)yuv_data_[1]);
yuv_data_[1] = reinterpret_cast<unsigned char*>(yuv_data_[1]);
} else {
USIZE = YSIZE/4;
yuv_data_[1] = (unsigned char*)malloc(sizeof(unsigned char) * USIZE);
memset(yuv_data_[1], 1, USIZE);
}
if (vData != NULL) {
VSIZE = (int)jenv->GetArrayLength(vData);
LOG_DEBUG("VSIZE : %d", VSIZE);
yuv_data_[2] = (unsigned char*)malloc(sizeof(unsigned char) * VSIZE);
memset(yuv_data_[2], 0, VSIZE);
jenv->GetByteArrayRegion(vData, 0, VSIZE, (jbyte*)yuv_data_[2]);
yuv_data_[2] = reinterpret_cast<unsigned char*>(yuv_data_[2]);
} else {
VSIZE = YSIZE/4;
yuv_data_[2] = (unsigned char*)malloc(sizeof(unsigned char) * VSIZE);
memset(yuv_data_[2], 1, VSIZE);
}
glClearColor(1.0F, 1.0F, 1.0F, 1.0F);
check_gl_error("glClearColor");
glClear(GL_COLOR_BUFFER_BIT);
check_gl_error("glClear");
}
void Renderer::draw_frame()
{
// Binding created FBO
glBindFramebuffer(GL_FRAMEBUFFER, frame_buffer_object_);
check_gl_error("glBindFramebuffer");
// Add program to OpenGL environment
glUseProgram(program_object_);
check_gl_error("glUseProgram");
for (int i = 0; i < 3; i++) {
LOG_DEBUG("Success");
//Bind texture
glActiveTexture(GL_TEXTURE0 + i);
check_gl_error("glActiveTexture");
glBindTexture(GL_TEXTURE_2D, yuv_texture_id_[i]);
check_gl_error("glBindTexture");
glUniform1i(yuv_texture_object_[i], i);
check_gl_error("glBindTexture");
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, stream_yuv_width_[i], stream_yuv_height_[i], GL_RGBA, GL_UNSIGNED_BYTE, yuv_data_[i]);
check_gl_error("glTexSubImage2D");
}
LOG_DEBUG("Success");
// Load vertex information
glVertexAttribPointer(position_object_, 2, GL_FLOAT, GL_FALSE, kStride, kVertexInformation);
check_gl_error("glVertexAttribPointer");
// Load texture information
glVertexAttribPointer(texture_position_object_, 2, GL_SHORT, GL_FALSE, kStride, kTextureCoordinateInformation);
check_gl_error("glVertexAttribPointer");
LOG_DEBUG("9");
glEnableVertexAttribArray(position_object_);
check_gl_error("glEnableVertexAttribArray");
glEnableVertexAttribArray(texture_position_object_);
check_gl_error("glEnableVertexAttribArray");
// Back to window buffer
glBindFramebuffer(GL_FRAMEBUFFER, 0);
check_gl_error("glBindFramebuffer");
LOG_DEBUG("Success");
// Draw the Square
glDrawElements(GL_TRIANGLE_STRIP, 6, GL_UNSIGNED_SHORT, kIndicesInformation);
check_gl_error("glDrawElements");
}
void Renderer::setup_render_to_texture()
{
glGenFramebuffers(1, &frame_buffer_object_);
check_gl_error("glGenFramebuffers");
glBindFramebuffer(GL_FRAMEBUFFER, frame_buffer_object_);
check_gl_error("glBindFramebuffer");
glGenRenderbuffers(1, &render_buffer_object_);
check_gl_error("glGenRenderbuffers");
glBindRenderbuffer(GL_RENDERBUFFER, render_buffer_object_);
check_gl_error("glBindRenderbuffer");
glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA4, stream_yuv_width_[0], stream_yuv_height_[0]);
check_gl_error("glRenderbufferStorage");
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, render_buffer_object_);
check_gl_error("glFramebufferRenderbuffer");
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, yuv_texture_id_[0], 0);
check_gl_error("glFramebufferTexture2D");
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, yuv_texture_id_[1], 0);
check_gl_error("glFramebufferTexture2D");
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, yuv_texture_id_[2], 0);
check_gl_error("glFramebufferTexture2D");
glBindFramebuffer(GL_FRAMEBUFFER, 0);
check_gl_error("glBindFramebuffer");
GLint status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if (status != GL_FRAMEBUFFER_COMPLETE) {
print_log("renderer.cpp", "setup_graphics", "FBO setting fault.", LOGERROR);
LOG_ERROR("%d\n", status);
return;
}
}
void Renderer::setup_yuv_texture()
{
// Use tightly packed data
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
check_gl_error("glPixelStorei");
for (int i = 0; i < 3; i++) {
if (yuv_texture_id_[i]) {
glDeleteTextures(1, &yuv_texture_id_[i]);
check_gl_error("glDeleteTextures");
}
glActiveTexture(GL_TEXTURE0+i);
check_gl_error("glActiveTexture");
// Generate texture object
glGenTextures(1, &yuv_texture_id_[i]);
check_gl_error("glGenTextures");
glBindTexture(GL_TEXTURE_2D, yuv_texture_id_[i]);
check_gl_error("glBindTexture");
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
check_gl_error("glTexParameteri");
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
check_gl_error("glTexParameteri");
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
check_gl_error("glTexParameterf");
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
check_gl_error("glTexParameterf");
glEnable(GL_TEXTURE_2D);
check_gl_error("glEnable");
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, maximum_yuv_width_[i], maximum_yuv_height_[i], 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
check_gl_error("glTexImage2D");
}
}
void Renderer::setup_graphics()
{
print_gl_string("Version", GL_VERSION);
print_gl_string("Vendor", GL_VENDOR);
print_gl_string("Renderer", GL_RENDERER);
print_gl_string("Extensions", GL_EXTENSIONS);
program_object_ = create_program(kVertexShader, kFragmentShader);
if (!program_object_) {
print_log("renderer.cpp", "setup_graphics", "Could not create program.", LOGERROR);
return;
}
position_object_ = glGetAttribLocation(program_object_, "vPosition");
check_gl_error("glGetAttribLocation");
texture_position_object_ = glGetAttribLocation(program_object_, "vTexCoord");
check_gl_error("glGetAttribLocation");
yuv_texture_object_[0] = glGetUniformLocation(program_object_, "yTexture");
check_gl_error("glGetUniformLocation");
yuv_texture_object_[1] = glGetUniformLocation(program_object_, "uTexture");
check_gl_error("glGetUniformLocation");
yuv_texture_object_[2] = glGetUniformLocation(program_object_, "vTexture");
check_gl_error("glGetUniformLocation");
setup_yuv_texture();
setup_render_to_texture();
glViewport(0, 0, stream_yuv_width_[0], stream_yuv_height_[0]);//736, 480);//1920, 1080);//maximum_yuv_width_[0], maximum_yuv_height_[0]);
check_gl_error("glViewport");
}
GLuint Renderer::create_program(const char* vertex_source, const char* fragment_source)
{
GLuint vertexShader = load_shader(GL_VERTEX_SHADER, vertex_source);
if (!vertexShader) {
return 0;
}
GLuint pixelShader = load_shader(GL_FRAGMENT_SHADER, fragment_source);
if (!pixelShader) {
return 0;
}
GLuint program = glCreateProgram();
if (program) {
glAttachShader(program, vertexShader);
check_gl_error("glAttachShader");
glAttachShader(program, pixelShader);
check_gl_error("glAttachShader");
glLinkProgram(program);
/* Get a Status */
GLint linkStatus = GL_FALSE;
glGetProgramiv(program, GL_LINK_STATUS, &linkStatus);
if (linkStatus != GL_TRUE) {
GLint bufLength = 0;
glGetProgramiv(program, GL_INFO_LOG_LENGTH, &bufLength);
if (bufLength) {
char* buf = (char*) malloc(bufLength);
if (buf) {
glGetProgramInfoLog(program, bufLength, NULL, buf);
print_log("renderer.cpp", "create_program", "Could not link program.", LOGERROR);
LOG_ERROR("%s\n", buf);
free(buf);
}
}
glDeleteProgram(program);
program = 0;
}
}
return program;
}
GLuint Renderer::load_shader(GLenum shaderType, const char* pSource)
{
GLuint shader = glCreateShader(shaderType);
if (shader) {
glShaderSource(shader, 1, &pSource, NULL);
glCompileShader(shader);
/* Get a Status */
GLint compiled = 0;
glGetShaderiv(shader, GL_COMPILE_STATUS, &compiled);
if (!compiled) {
GLint infoLen = 0;
glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &infoLen);
if (infoLen) {
char* buf = (char*) malloc(infoLen);
if (buf) {
glGetShaderInfoLog(shader, infoLen, NULL, buf);
print_log("renderer.cpp", "load_shader", "Could not link program.", LOGERROR);
LOG_ERROR("%d :: %s\n", shaderType, buf);
free(buf);
}
glDeleteShader(shader);
shader = 0;
}
}
}
return shader;
}
void Renderer::onDrawFrame(JNIEnv* jenv, jbyteArray yData, jbyteArray uData, jbyteArray vData)
{
set_draw_frame(jenv, yData, uData, vData);
draw_frame();
return;
}
void Renderer::setSize(int stream_width, int stream_height) {
stream_yuv_width_[0] = stream_width;
stream_yuv_width_[1] = stream_width/2;
stream_yuv_width_[2] = stream_width/2;
stream_yuv_height_[0] = stream_height;
stream_yuv_height_[1] = stream_height/2;
stream_yuv_height_[2] = stream_height/2;
}
void Renderer::onSurfaceChanged(int width, int height)
{
mobile_yuv_width_[0] = width;
mobile_yuv_width_[1] = width/2;
mobile_yuv_width_[2] = width/2;
mobile_yuv_height_[0] = height;
mobile_yuv_height_[1] = height/2;
mobile_yuv_height_[2] = height/2;
maximum_yuv_width_[0] = 1920;
maximum_yuv_width_[1] = 1920/2;
maximum_yuv_width_[2] = 1920/2;
maximum_yuv_height_[0] = 1080;
maximum_yuv_height_[1] = 1080/2;
maximum_yuv_height_[2] = 1080/2;
// If stream size not setting, default size D1
//if (stream_yuv_width_[0] == 0) {
stream_yuv_width_[0] = 736;
stream_yuv_width_[1] = 736/2;
stream_yuv_width_[2] = 736/2;
stream_yuv_height_[0] = 480;
stream_yuv_height_[1] = 480/2;
stream_yuv_height_[2] = 480/2;
//}
setup_graphics();
return;
}
Here are my fragment and vertex shader sources and the coordinates:
static const char kVertexShader[] =
"attribute vec4 vPosition; \n"
"attribute vec2 vTexCoord; \n"
"varying vec2 v_vTexCoord; \n"
"void main() { \n"
"gl_Position = vPosition; \n"
"v_vTexCoord = vTexCoord; \n"
"} \n";
static const char kFragmentShader[] =
"precision mediump float; \n"
"varying vec2 v_vTexCoord; \n"
"uniform sampler2D yTexture; \n"
"uniform sampler2D uTexture; \n"
"uniform sampler2D vTexture; \n"
"void main() { \n"
"float y=texture2D(yTexture, v_vTexCoord).r;\n"
"float u=texture2D(uTexture, v_vTexCoord).r - 0.5;\n"
"float v=texture2D(vTexture, v_vTexCoord).r - 0.5;\n"
"float r=y + 1.13983 * v;\n"
"float g=y - 0.39465 * u - 0.58060 * v;\n"
"float b=y + 2.03211 * u;\n"
"gl_FragColor = vec4(r, g, b, 1.0);\n"
"}\n";
static const GLfloat kVertexInformation[] =
{
-1.0f, 1.0f, // TexCoord 0 top left
-1.0f,-1.0f, // TexCoord 1 bottom left
1.0f,-1.0f, // TexCoord 2 bottom right
1.0f, 1.0f // TexCoord 3 top right
};
static const GLshort kTextureCoordinateInformation[] =
{
0, 0, // TexCoord 0 top left
0, 1, // TexCoord 1 bottom left
1, 1, // TexCoord 2 bottom right
1, 0 // TexCoord 3 top right
};
static const GLuint kStride = 0;//COORDS_PER_VERTEX * 4;
static const GLshort kIndicesInformation[] =
{
0, 1, 2,
0, 2, 3
};
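Editor's note: a minimal sketch of the commonly suggested remedy for exactly this question (an illustration under assumptions, not the original poster's code). Each YUV plane becomes a single-channel GL_LUMINANCE texture whose internal format, format and dimensions all match the plane, so GL reads exactly one byte per texel and the fragment shader above can keep sampling the .r channel. It is written against Android's GLES20 bindings for brevity; the C calls in the NDK take the same arguments.

import android.opengl.GLES20;
import java.nio.ByteBuffer;

// Hypothetical helper (illustration only): one GL_LUMINANCE texture per YUV
// plane. width/height are the PLANE dimensions: W x H for Y, W/2 x H/2 for
// U and V in YUV 420p.
public final class PlaneTexture {
    public static int create(int width, int height) {
        int[] id = new int[1];
        GLES20.glGenTextures(1, id, 0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, id[0]);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        // In ES 2.0 the internal format and format must both be GL_LUMINANCE.
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE, width, height,
                0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, null);
        return id[0];
    }

    public static void upload(int texture, int width, int height, byte[] plane) {
        // Plane rows are tightly packed, so the unpack alignment must be 1.
        GLES20.glPixelStorei(GLES20.GL_UNPACK_ALIGNMENT, 1);
        ByteBuffer buf = ByteBuffer.allocateDirect(plane.length);
        buf.put(plane).position(0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, texture);
        GLES20.glTexSubImage2D(GLES20.GL_TEXTURE_2D, 0, 0, 0, width, height,
                GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, buf);
    }
}

With matching formats each upload reads exactly width x height bytes, which is what the decoder's YUV420P planes provide, instead of the 4x over-read that GL_RGBA causes.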
-
Error FFmpeg in Android
19 November 2013, by Mark
I'm using FrameGrabber to load a video from the SD card and perform some image-processing operations. I followed the tutorial on the JavaCV site to load JavaCV and FFmpeg.
Currently, to run the application, I'm using my tablet (an Asus TF101G with Android 4.0.3) connected to my PC, and it works fine. But if I try to run the same application on another device (e.g. a Nexus with Android 4.1.2) I get these errors:
02-01 16:51:54.687: E/AndroidRuntime(2839): FATAL EXCEPTION: main
02-01 16:51:54.687: E/AndroidRuntime(2839): java.lang.ExceptionInInitializerError
02-01 16:51:54.687: E/AndroidRuntime(2839): at com.googlecode.javacv.FFmpegFrameGrabber.<init>(FFmpegFrameGrabber.java:104)
02-01 16:51:54.687: E/AndroidRuntime(2839): at com.example.com.uniud.avires.MainActivity.onCreate(MainActivity.java:92)
02-01 16:51:54.687: E/AndroidRuntime(2839): at android.app.Activity.performCreate(Activity.java:5008)
02-01 16:51:54.687: E/AndroidRuntime(2839): at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1079)
02-01 16:51:54.687: E/AndroidRuntime(2839): at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2023)
02-01 16:51:54.687: E/AndroidRuntime(2839): at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2084)
02-01 16:51:54.687: E/AndroidRuntime(2839): at android.app.ActivityThread.access$600(ActivityThread.java:130)
02-01 16:51:54.687: E/AndroidRuntime(2839): at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1195)
02-01 16:51:54.687: E/AndroidRuntime(2839): at android.os.Handler.dispatchMessage(Handler.java:99)
02-01 16:51:54.687: E/AndroidRuntime(2839): at android.os.Looper.loop(Looper.java:137)
02-01 16:51:54.687: E/AndroidRuntime(2839): at android.app.ActivityThread.main(ActivityThread.java:4745)
02-01 16:51:54.687: E/AndroidRuntime(2839): at java.lang.reflect.Method.invokeNative(Native Method)
02-01 16:51:54.687: E/AndroidRuntime(2839): at java.lang.reflect.Method.invoke(Method.java:511)
02-01 16:51:54.687: E/AndroidRuntime(2839): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:786)
02-01 16:51:54.687: E/AndroidRuntime(2839): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:553)
02-01 16:51:54.687: E/AndroidRuntime(2839): at dalvik.system.NativeStart.main(Native Method)
02-01 16:51:54.687: E/AndroidRuntime(2839): Caused by: java.lang.ExceptionInInitializerError
02-01 16:51:54.687: E/AndroidRuntime(2839): at java.lang.Class.classForName(Native Method)
02-01 16:51:54.687: E/AndroidRuntime(2839): at java.lang.Class.forName(Class.java:217)
02-01 16:51:54.687: E/AndroidRuntime(2839): at com.googlecode.javacpp.Loader.load(Loader.java:338)
02-01 16:51:54.687: E/AndroidRuntime(2839): at com.googlecode.javacv.cpp.avcodec.<clinit>(avcodec.java:86)
02-01 16:51:54.687: E/AndroidRuntime(2839): ... 16 more
02-01 16:51:54.687: E/AndroidRuntime(2839): Caused by: java.lang.UnsatisfiedLinkError: Cannot load library: link_image[1891]: 26325 could not load needed library 'libavutil.so' for 'libjniavutil.so' (load_library[1093]: Library 'libavutil.so' not found)
02-01 16:51:54.687: E/AndroidRuntime(2839): at java.lang.Runtime.load(Runtime.java:340)
02-01 16:51:54.687: E/AndroidRuntime(2839): at java.lang.System.load(System.java:521)
02-01 16:51:54.687: E/AndroidRuntime(2839): at com.googlecode.javacpp.Loader.loadLibrary(Loader.java:422)
02-01 16:51:54.687: E/AndroidRuntime(2839): at com.googlecode.javacpp.Loader.load(Loader.java:372)
02-01 16:51:54.687: E/AndroidRuntime(2839): at com.googlecode.javacpp.Loader.load(Loader.java:319)
02-01 16:51:54.687: E/AndroidRuntime(2839): at com.googlecode.javacv.cpp.avutil.<clinit>(avutil.java:75)
02-01 16:51:54.687: E/AndroidRuntime(2839): ... 20 more
Any suggestions, please?
I also want to attach my Android.mk file:
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
OPENCV_LIB_TYPE:=STATIC
include C:/Users/Marco/Desktop/new_ADT/adt-bundle-windows-x86_64/OpenCV-2.4.3.2-android-sdk/sdk/native/jni/OpenCV.mk
OPENCV_INSTALL_MODULES:=on
#OPENCV_CAMERA_MODULES:=on
LOCAL_MODULE := udmodule
LOCAL_SRC_FILES := udjni.cpp
LOCAL_LDLIBS += -llog -ldl
include $(BUILD_SHARED_LIBRARY)
and my Application.mk file:
APP_STL := gnustl_static
APP_CPPFLAGS := -frtti -fexceptions
APP_ABI := armeabi armeabi-v7a
#APP_ABI := all
APP_PLATFORM := android-8
Any suggestions? Thanks.
I also tried changing the Application.mk file in this way, but with no results:
APP_STL := gnustl_static
APP_CPPFLAGS := -frtti -fexceptions
APP_ABI := armeabi
LOCAL_CFLAGS += -march=armv7-a -mfloat-abi=softfp -mfpu=vfp -mfpu=neon
LOCAL_LDLIBS += -Wl,--fix-cortex-a8
APP_PLATFORM := android-8
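Editor's note: the trace above says the dynamic linker cannot find 'libavutil.so' while loading 'libjniavutil.so'. A workaround often suggested for this (a sketch under assumptions, not from the original post: it presumes the JavaCV/Bambuser builds ship libavutil.so, libavcodec.so and friends inside the APK's lib directory for the device's ABI) is to preload the FFmpeg libraries in dependency order before the JavaCPP Loader runs, so no library has to resolve its dependencies by name alone:

// Hypothetical preload block; run it once before constructing FFmpegFrameGrabber.
// Order matters: each library is loaded after the ones it depends on.
static {
    System.loadLibrary("avutil");
    System.loadLibrary("swscale");
    System.loadLibrary("avcodec");
    System.loadLibrary("avformat");
    System.loadLibrary("avfilter");
    System.loadLibrary("avdevice");
}
-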
How to stream to ffserver from android
8 March 2013, by kev
I need to stream from an Android camera or a file to a remote ffserver, which will broadcast my video. I can do this on the desktop in Ubuntu by issuing a command like:
ffmpeg -f video4linux2 -s 640x480 -r 25 -i /dev/video0 http://192.168.0.20:8090/cam1.ffm
or stream a file like this:
ffmpeg -i /home/kev/share/movie.mp4 http://192.168.0.20:8090/cam1.ffm
So basically I want to be able to do the above from Android. After several searches, this is what I've done so far: I came across this link http://bambuser.com/opensource, from which I downloaded the ffmpeg source and built it. The build outputs several things:
1. shared libs [libavcodec, libavcore, libavdevice, libavfilter, libavformat, libavutil, libswscale]
2. executables [ffmpeg, ffprobe]
I'm not sure how to plug my functionality into these resources. This is what I've tried so far:
1. Loaded the libs in my Activity using System.loadLibrary(), then copied the ffmpeg executable to the assets folder, which at runtime I copied to my application's "files" directory. I then set permissions for the executable using Runtime.getRuntime().exec(). The last step was to execute it in Java with the following statement:
Runtime.getRuntime().exec("ffmpeg -i file:///android_asset/movie.mp4 http://<server>:8090/cam1.ffm");
2. Copied ffmpeg.c, the shared libraries and the "include" folder generated by the build into my jni folder, and added a JNI function that wraps the main() function in ffmpeg.c. With this approach I found myself having to copy several header files from the ffmpeg source just to get ndk-build to succeed, and I highly doubt this is the way to go.
The above two approaches haven't worked for me, and I'm not sure where I'm going wrong, so any help on doing simple ffmpeg streaming from Android, such as streaming an mp4 file, would be highly appreciated.
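Editor's note: to make approach 1 concrete, here is a minimal sketch of the copy-and-exec flow described above. It is an illustration under assumptions: the asset name, class name and paths are placeholders, and it presumes an ffmpeg binary built for the device's ABI is bundled in assets/.

import android.content.Context;
import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;

// Hypothetical helper: extract a bundled ffmpeg binary to the app's private
// files directory, mark it executable, and spawn it.
public final class FfmpegRunner {
    public static Process stream(Context ctx, String inputPath, String serverUrl)
            throws Exception {
        File bin = new File(ctx.getFilesDir(), "ffmpeg");
        if (!bin.exists()) {
            InputStream in = ctx.getAssets().open("ffmpeg"); // binary bundled in assets/
            OutputStream out = new FileOutputStream(bin);
            byte[] buf = new byte[8192];
            for (int n; (n = in.read(buf)) > 0; ) {
                out.write(buf, 0, n);
            }
            in.close();
            out.close();
            // chmod via exec, as in the question; File.setExecutable(true) also works on API 9+.
            Runtime.getRuntime().exec("chmod 755 " + bin.getAbsolutePath()).waitFor();
        }
        // Keep input and output as separate arguments. Note that a spawned native
        // process cannot read file:///android_asset/ URLs (assets live inside the
        // APK), so inputPath must be a real filesystem path, e.g. on the sdcard.
        return Runtime.getRuntime().exec(new String[] {
                bin.getAbsolutePath(), "-i", inputPath, serverUrl });
    }
}

Called as FfmpegRunner.stream(this, "/sdcard/movie.mp4", "http://192.168.0.20:8090/cam1.ffm"), this mirrors the desktop command at the top of the question.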