
Other articles (82)
-
Permissions overridden by plugins
27 April 2010, by Mediaspip core
autoriser_auteur_modifier() so that visitors are able to modify their information on the authors page -
Customising categories
21 June 2013
Category creation form
For those who know SPIP well, a category can be thought of as a rubrique (section).
For a document of type category, the fields offered by default are: Text
This form can be modified under:
Administration > Configuration des masques de formulaire.
For a document of type media, the fields not displayed by default are: Descriptif rapide (short description)
It is also in this configuration section that you can specify the (...) -
Sites built with MediaSPIP
2 May 2011
This page showcases some of the sites running MediaSPIP.
You can of course add your own via the form at the bottom of the page.
On other sites (10147)
-
APP crashes on SDL_INIT
24 December 2014, by user2270995
I have been working on getting FFMPEG and SDL2 working on Android but kept hitting errors (please see my earlier question for those errors). I finally got rid of the SDL_main error by using #undef main after including all the SDL header files and calling SDL_SetMainReady(), but now my app crashes on the call to SDL_Init, in Android_JNI_GetTouchDeviceIds() in SDL_android.c.
My SDL and FFMPEG initializing function:
void Java_org_libsdl_app_MyClass_init(JNIEnv *env, jobject this)
{
    SDL_SetMainReady();
    LOGE("SDL main set ready");
    LOGE("SDL initializing starting");
    if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO | SDL_INIT_TIMER))
    {
        LOGE("Could not initialize SDL - %s\n", SDL_GetError());
        return;
    }
    LOGE("SDL initialized");
    ...........  // other initializations
}
CRASH DUMP
********** Crash dump: **********
Build fingerprint: 'micromax/s9311/s9311:4.2.1/JOP40D/:user/release-keys'
pid: 5222, tid: 5222, name: org.libsdl.app >>> org.libsdl.app <<<
signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr deadd00d
Stack frame #00 pc 00045ffc /system/lib/libdvm.so (dvmAbort+75)
Stack frame #01 pc 0003916f /system/lib/libdvm.so
Stack frame #02 pc 0003a22b /system/lib/libdvm.so
Stack frame #03 pc 0003e321 /system/lib/libdvm.so
Stack frame #04 pc 000b0c6c /data/app-lib/org.libsdl.app-1/libSDL2.so (Android_JNI_GetTouchDeviceIds+84): Routine Android_JNI_GetTouchDeviceIds at /home/user/Android/Eclipse-workspace/FSDL2//jni/SDL/src/core/android/SDL_android.c:1261
Stack frame #05 pc 0018aeb4 /data/app-lib/org.libsdl.app-1/libSDL2.so (Android_InitTouch+20): Routine Android_InitTouch at /home/user/Android/Eclipse-workspace/FSDL2//jni/SDL/src/video/android/SDL_androidtouch.c:58
Stack frame #06 pc 0018b4dc /data/app-lib/org.libsdl.app-1/libSDL2.so: Routine Android_VideoInit at /home/user/Android/Eclipse-workspace/FSDL2//jni/SDL/src/video/android/SDL_androidvideo.c:169
Stack frame #07 pc 00182664 /data/app-lib/org.libsdl.app-1/libSDL2.so (SDL_VideoInit_REAL+776): Routine SDL_VideoInit_REAL at /home/user/Android/Eclipse-workspace/FSDL2//jni/SDL/src/video/SDL_video.c:494
Stack frame #08 pc 000182e4 /data/app-lib/org.libsdl.app-1/libSDL2.so (SDL_InitSubSystem_REAL+308): Routine SDL_InitSubSystem_REAL at /home/user/Android/Eclipse-workspace/FSDL2//jni/SDL/src/SDL.c:173
Stack frame #09 pc 00018444 /data/app-lib/org.libsdl.app-1/libSDL2.so (SDL_Init_REAL+20): Routine SDL_Init_REAL at /home/user/Android/Eclipse-workspace/FSDL2//jni/SDL/src/SDL.c:244
Stack frame #10 pc 000badd0 /data/app-lib/org.libsdl.app-1/libSDL2.so (SDL_Init+32): Routine SDL_Init at /home/user/Android/Eclipse-workspace/FSDL2//jni/SDL/src/dynapi/SDL_dynapi_procs.h:89
Stack frame #11 pc 00001934 /data/app-lib/org.libsdl.app-1/libmain.so (Java_org_libsdl_app_MyClass_init+52): Routine Java_org_libsdl_app_MyClass_init at /home/user/Android/Eclipse-workspace/FSDL2//jni/src/../src/main.c:66
Stack frame #12 pc 0001e4d0 /system/lib/libdvm.so (dvmPlatformInvoke+112)
Stack frame #13 pc 0004dd21 /system/lib/libdvm.so (dvmCallJNIMethod(unsigned int const*, JValue*, Method const*, Thread*)+500)
Stack frame #14 pc 00050097 /system/lib/libdvm.so (dvmResolveNativeMethod(unsigned int const*, JValue*, Method const*, Thread*)+174)
Stack frame #15 pc 000278a0 /system/lib/libdvm.so
Stack frame #16 pc 0002b7fc /system/lib/libdvm.so (dvmInterpret(Thread*, Method const*, JValue*)+180)
Stack frame #17 pc 000612bf /system/lib/libdvm.so (dvmInvokeMethod(Object*, Method const*, ArrayObject*, ArrayObject*, ClassObject*, bool)+374)
Stack frame #18 pc 000690e5 /system/lib/libdvm.so
Stack frame #19 pc 000278a0 /system/lib/libdvm.so
Stack frame #20 pc 0002b7fc /system/lib/libdvm.so (dvmInterpret(Thread*, Method const*, JValue*)+180)
Stack frame #21 pc 00060f99 /system/lib/libdvm.so (dvmCallMethodV(Thread*, Method const*, Object*, bool, JValue*, std::__va_list)+272)
Stack frame #22 pc 00049ff9 /system/lib/libdvm.so
Stack frame #23 pc 0004d129 /system/lib/libandroid_runtime.so
Stack frame #24 pc 0004decd /system/lib/libandroid_runtime.so (android::AndroidRuntime::start(char const*, char const*)+400)
Stack frame #25 pc 0000208f /system/bin/app_process: Routine ??
??:0
Stack frame #26 pc 0001bd98 /system/lib/libc.so (__libc_init+64): Routine ??
??:0
Stack frame #27 pc 00001bd4 /system/bin/app_process: Routine ??
??:0
Crash dump is completed
How do I solve this? Please help (I am getting frustrated with SDL).
edit:
Is this error related to Android permissions? -
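A note on the crash above: fault addr deadd00d is the Dalvik VM's deliberate abort marker (dvmAbort at the top of the stack), which almost always means a JNI call failed (a pending exception or a missing cached class) rather than a stray pointer, so it is not an Android permissions problem. SDL2's Android backend caches the JNI environment and the activity class from its own Java_org_libsdl_app_SDLActivity_nativeInit; calling SDL_Init from a custom JNI entry point skips that setup, so SDL's first JNI call (Android_JNI_GetTouchDeviceIds) aborts. A hedged sketch of one possible fix, assuming SDL2's internal SDL_Android_Init() from src/core/android/SDL_android.c is linked in:

```c
/* Sketch only, not a drop-in fix: this mirrors what SDL2's bundled
 * SDL_android_main.c does in Java_org_libsdl_app_SDLActivity_nativeInit.
 * SDL_Android_Init() is an internal SDL2 function that caches the JNI
 * env and the activity class; without that cache, SDL's first JNI call
 * trips dvmAbort (fault addr deadd00d). */
extern void SDL_Android_Init(JNIEnv *env, jclass cls);

/* Note the second parameter is jclass here: declare the Java method as
 * static native so the activity class can be cached. */
void Java_org_libsdl_app_MyClass_init(JNIEnv *env, jclass cls)
{
    SDL_Android_Init(env, cls);  /* set up SDL's JNI state first */
    SDL_SetMainReady();

    if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO | SDL_INIT_TIMER) != 0) {
        /* SDL_GetError() names the failure */
        return;
    }
    /* ... other initializations ... */
}
```

The more robust route is to keep SDLActivity (or a subclass of it) as the Java entry point so SDL's own nativeInit runs as expected; the logcat lines just before the abort usually print the exact JNI error.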
How can I encode and segment audio files without having gaps (or audio pops) between segments when I reconstruct it ?
16 May 2013, by fenduru
I'm working on a web application that requires streaming and synchronization of multiple audio files. For this I am using the Web Audio API rather than HTML5 audio tags, because of the importance of timing the audio precisely.
Currently I'm using FFMPEG's segmentation feature to encode and segment the audio files into smaller chunks. I segment them so I can start streaming from the middle of a file instead of from the beginning (otherwise I would have just split the files using UNIX split, as shown here). The problem is that when I string the audio segments back together, I get an audio pop between segments.
If I encode the segments using a PCM encoding (pcm_s24le) in a .wav file, playback is seamless, which leads me to believe the encoder is padding either the beginning or the end of the file. Since I will be dealing with many different audio files, using .wav would require far too much bandwidth.
I'm looking for one of the following solutions to the problem:
- How can I segment encoded audio files seamlessly,
- How can I force an encoder to NOT pad audio frames using ffmpeg (or another utility), or
- What is a better way to stream audio (starting at an arbitrary track time) without using an audio tag ?
System Information
- Custom node.js server
- Upon upload of an audio file, node.js pipes the data into ffmpeg's encoder
- Need to use HTML5 Web Audio API supported encoding
- Server sends audio chunks one at a time over a WebSocket
Thanks in advance. I've tried to be as clear as possible but if you need clarification I'd be more than willing to provide it.
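The pops described above are consistent with encoder delay and padding: lossy codecs such as AAC prepend priming samples (commonly 1024, though the exact value is encoder-specific and reported in gapless metadata such as the iTunSMPB tag) and pad the final frame up to a whole frame boundary, so each independently decoded segment starts and ends with a few milliseconds of silence. A rough sketch of the client-side bookkeeping needed to trim those samples (the numbers here are typical assumptions, not guarantees):

```python
import math

def trim_window(total_samples, priming, frame_size):
    """Return (start, end, decoded): the [start, end) sample range to
    keep from a decoded segment, plus how many samples the decoder will
    output in total. Assumes the encoder prepends `priming` samples of
    delay and pads the last frame up to a multiple of `frame_size`."""
    frames = math.ceil((priming + total_samples) / frame_size)
    decoded = frames * frame_size       # samples the decoder emits
    start = priming                     # skip encoder priming at the head
    end = priming + total_samples       # stop before the tail padding
    return start, end, decoded

# One second of 44.1 kHz audio with typical AAC values (1024-sample
# priming, 1024-sample frames):
start, end, decoded = trim_window(total_samples=44100, priming=1024, frame_size=1024)
# start=1024, end=45124, decoded=46080 -> trim 1024 head + 956 tail samples
```

With the Web Audio API you would decodeAudioData() each segment and copy only samples [start, end) into the playback timeline, so consecutive segments join sample-accurately instead of butting padded silence together.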
-
Confused about x264 and encoding video frames
26 February 2015, by spartygw
I built a test driver for encoding a series of images I have captured. I am using libx264 and based my driver on this guy's answer:
In my case I start by reading in a JPG image, converting it to YUV, and passing that same frame over and over in a loop to the x264 encoder.
My expectation was that since the frame is always the same, the encoder's output would be very small and constant.
Instead I find that the NAL payload varies from a few bytes to a few KB, and also varies greatly with the frame rate I specify in the encoder parameters.
Obviously I don't understand video encoding. Why does the output size vary so much?
int main()
{
    Image image(WIDTH, HEIGHT);
    image.FromJpeg("frame-1.jpg");
    unsigned char *data = image.GetRGB();

    x264_param_t param;
    x264_param_default_preset(&param, "fast", "zerolatency");
    param.i_threads = 1;
    param.i_width = WIDTH;
    param.i_height = HEIGHT;
    param.i_fps_num = FPS;
    param.i_fps_den = 1;

    // Intra refresh:
    param.i_keyint_max = FPS;
    param.b_intra_refresh = 1;

    // Rate control:
    param.rc.i_rc_method = X264_RC_CRF;
    param.rc.f_rf_constant = FPS-5;
    param.rc.f_rf_constant_max = FPS+5;

    // For streaming:
    param.b_repeat_headers = 1;
    param.b_annexb = 1;
    x264_param_apply_profile(&param, "baseline");

    // initialize the encoder
    x264_t* encoder = x264_encoder_open(&param);
    x264_picture_t pic_in, pic_out;
    x264_picture_alloc(&pic_in, X264_CSP_I420, WIDTH, HEIGHT);

    // x264 expects YUV420P data; use libswscale
    // (from ffmpeg) to convert images to the right format
    struct SwsContext* convertCtx =
        sws_getContext(WIDTH, HEIGHT, PIX_FMT_RGB24, WIDTH, HEIGHT,
                       PIX_FMT_YUV420P, SWS_FAST_BILINEAR,
                       NULL, NULL, NULL);

    // encoding is as simple as this then, for each frame do:
    // data is a pointer to your RGB structure
    int srcstride = WIDTH*3; // RGB stride is just 3*width
    sws_scale(convertCtx, &data, &srcstride, 0, HEIGHT,
              pic_in.img.plane, pic_in.img.i_stride);

    x264_nal_t* nals;
    int i_nals;
    int frame_size =
        x264_encoder_encode(encoder, &nals, &i_nals, &pic_in, &pic_out);

    int max_loop=15;
    int this_loop=1;
    while (frame_size >= 0 && --max_loop)
    {
        cout << "------------" << this_loop++ << "-----------------\n";
        cout << "Frame size = " << frame_size << endl;
        cout << "output has " << pic_out.img.i_csp << " colorspace\n";
        cout << "output has " << pic_out.img.i_plane << " # img planes\n";
        cout << "i_nals = " << i_nals << endl;
        for (int n=0; n
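Part of the variation in the question above comes from the rate-control settings: f_rf_constant is a CRF quality level on a 0-51 scale (lower means higher quality and larger output), not a value derived from the frame rate, so setting it to FPS-5 couples output size directly to whatever frame rate is configured. In addition, b_intra_refresh spreads intra-coded regions across successive frames and the rate controller adapts the quantizer frame by frame, so even identical input frames produce differently sized NAL payloads. A hedged correction of just the rate-control block (the values are illustrative, not tuned):

```c
/* Rate control: CRF (f_rf_constant) is a quality target on a 0-51
 * scale and is unrelated to the frame rate. Deriving it from FPS,
 * as in the question, ties quality (and hence output size) to the
 * configured frame rate. */
param.rc.i_rc_method = X264_RC_CRF;
param.rc.f_rf_constant = 23;       /* x264's typical default quality */
param.rc.f_rf_constant_max = 35;   /* illustrative upper cap */
```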