
Other articles (61)
-
(De)Activating features (plugins)
18 February 2011
To manage adding and removing extra features (plugins), MediaSPIP uses SVP as of version 0.2.
SVP makes it easy to activate plugins from the MediaSPIP configuration area.
To get there, simply go to the configuration area and open the "Gestion des plugins" (plugin management) page.
MediaSPIP ships by default with the full set of so-called "compatible" plugins; they have been tested and integrated so as to work perfectly with each (...)
-
The plugin: Podcasts
14 July 2010
The problem of podcasting is once again a problem that highlights the state of data-transport standardization on the Internet.
Two interesting formats exist: the one developed by Apple, heavily geared toward iTunes, whose spec is here; and the "Media RSS Module" format, which is more "open" and backed notably by Yahoo and the Miro software.
File types supported in the feeds
Apple's format only allows the following types in its feeds: .mp3 audio/mpeg; .m4a audio/x-m4a; .mp4 (...)
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MédiaSpip installation is at version 0.2 or higher. If needed, contact your MédiaSpip administrator to find out.
On other sites (7728)
-
Running ffmpeg as a library in Android
4 February 2014, by szakal
I've got a simple task to do: I need to merge a set of pictures into a video using ffmpeg in an Android environment.
After over a week of fighting with different tutorials and examples explaining how to compile ffmpeg, I have had, let's say, moderate success: I've finally compiled ffmpeg for Android.
I followed this example:
https://github.com/appunite/AndroidFFmpeg
which worked best for me. As a result of building ffmpeg I have the following directory structure:
[Project]/jni/ffmpeg-build/armeabi-v7a/libffmpeg.so
[Project]/jni/ffmpeg-build/armeabi/libffmpeg.so
[Project]/jni/ffmpeg-build/mips/libffmpeg.so
[Project]/jni/ffmpeg-build/x86/libffmpeg.so
I also followed the NDK examples, so I can run C code from Java:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <stdbool.h>
#include <jni.h>
#include <android/log.h>
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
bool initted = false;
static JavaVM *sVm;
jstring Java_com_example_hellojni_HelloJni_stringFromJNI(JNIEnv* env, jobject thiz) {
char **argv;
char *cmd;
int argc;
// cmd = "ffmpeg -version";
// argv = parsedargs(cmd, &argc);
// ffmpeg(argc, argv);
return (*env)->NewStringUTF(env, "Hello from JNI !");
}
My question is how to call ffmpeg functions from my "hello-jni" C file. I've read that I need to write a wrapper over ffmpeg, which is what my hello-jni is intended to be.
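For what it's worth, here is a rough sketch of what such a wrapper could look like, assuming main() in ffmpeg.c has been renamed to ffmpeg_main() and compiled into libffmpeg.so; the Java-side method name and the crude tokenizer below are made up for illustration:

#include <jni.h>
#include <stdlib.h>
#include <string.h>

/* Assumed to exist: main() from ffmpeg.c, renamed to ffmpeg_main()
   and exported from libffmpeg.so. */
extern int ffmpeg_main(int argc, char **argv);

/* Very naive whitespace tokenizer, for illustration only (no quoting support). */
static char **parse_args(char *cmd, int *argc)
{
    char **argv = malloc(64 * sizeof(char *));
    int n = 0;
    char *tok = strtok(cmd, " ");
    while (tok != NULL && n < 63) {
        argv[n++] = tok;
        tok = strtok(NULL, " ");
    }
    argv[n] = NULL;
    *argc = n;
    return argv;
}

jint Java_com_example_hellojni_HelloJni_runFfmpeg(JNIEnv *env, jobject thiz, jstring jcmd)
{
    const char *utf = (*env)->GetStringUTFChars(env, jcmd, NULL);
    char *cmd = strdup(utf);            /* strtok() modifies its input */
    (*env)->ReleaseStringUTFChars(env, jcmd, utf);

    int argc;
    char **argv = parse_args(cmd, &argc);
    int ret = ffmpeg_main(argc, argv);  /* e.g. "ffmpeg -version" */

    free(argv);
    free(cmd);
    return ret;
}

One caveat: ffmpeg.c calls exit() on errors and at the end of a run, which would kill the whole Android process, so builds that go this route usually patch those exit() calls out first.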
Here is my Android.mk, which is probably an important part of achieving my goal, but honestly I don't understand some of the lines set in this file, or simply don't know how to make things work.
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE := ffmpeg-prebuilt
LOCAL_SRC_FILES := ffmpeg-build/$(TARGET_ARCH_ABI)/libffmpeg.so
LOCAL_EXPORT_C_INCLUDES := ffmpeg-build/$(TARGET_ARCH_ABI)/include
LOCAL_EXPORT_LDLIBS := ffmpeg-build/$(TARGET_ARCH_ABI)/libffmpeg.so
LOCAL_PRELINK_MODULE := true
include $(PREBUILT_SHARED_LIBRARY)
include $(CLEAR_VARS)
LOCAL_ALLOW_UNDEFINED_SYMBOLS := true
LOCAL_MODULE := hello-jni
LOCAL_SRC_FILES := hello-jni.c
LOCAL_C_INCLUDES := $(LOCAL_PATH)/ffmpeg-build/$(TARGET_ARCH_ABI)/include
LOCAL_SHARED_LIBRARIES := ffmpeg-prebuilt
#LOCAL_CFLAGS += -g -Iffmpeg-prebuilt -Ihello-jni -Wno-deprecated-declarations
#LOCAL_LDLIBS += -llog -lz -landroid ffmpeg-build/$(TARGET_ARCH_ABI)/libffmpeg.so
include $(BUILD_SHARED_LIBRARY)
One more thing. I've found an example of how to wrap ffmpeg's main function. It would be the easiest way for me to use ffmpeg, since I don't know ffmpeg's API, and I hope it's possible to run ffmpeg this way:
Can FFmpeg be used as a library, instead of a standalone program?
To sum up, I think my problems come down to a complete lack of C/C++ knowledge, especially how to call any function from a .so library.
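For reference, going through the ffmpeg libraries directly instead of wrapping main() starts out roughly like this (a minimal sketch against the avformat API of that era; error handling mostly omitted):

#include <libavformat/avformat.h>

int probe_input(const char *path)
{
    AVFormatContext *fmt = NULL;

    av_register_all();                  /* one-time global init */
    if (avformat_open_input(&fmt, path, NULL, NULL) < 0)
        return -1;
    if (avformat_find_stream_info(fmt, NULL) < 0) {
        avformat_close_input(&fmt);
        return -1;
    }
    av_dump_format(fmt, 0, path, 0);    /* log the stream layout */
    avformat_close_input(&fmt);
    return 0;
}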
I hope someone can help me :).
-
FFmpeg tee muxer
14 February 2013, by deed02392
I'm trying to use the tee muxer recently introduced in FFmpeg to both output a stream over UDP and render the output in an SDL window.
This is because I want to measure latency in my live broadcast system between the input to FFmpeg and its output (i.e. the encoding process latency).
Can anyone explain how to use it properly? This is what I have tried:
ffmpeg -re -i ......\sample_ipod.m4v -f tee "[f=mpegts]udp://127.0.0.1:1233|[f=sdl]foo"
I understood this should produce an output stream over UDP, and that I could specify additional formats with an enclosure like [f=sdl] to mean -f sdl. Then, because the sdl muxer requires an argument (the window title), I provided it the same way the example does for the stream.
Hope some FFmpeg expert can shed some light on this. I've even scanned the code that adds the tee muxer, to no avail.
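One detail worth noting: the tee muxer duplicates the same encoded packets to every slave output, while the sdl muxer only accepts raw video frames, so [f=sdl] inside a tee carrying MPEG-TS-encoded video cannot work. A possible workaround (an untested sketch; the codec choice and window title are placeholders) is to give the SDL window its own output instead of routing it through tee:

ffmpeg -re -i sample_ipod.m4v -map 0:v -c:v libx264 -f mpegts "udp://127.0.0.1:1233" -map 0:v -f sdl "latency test"

Each output then gets its own encoder: the UDP branch carries H.264 in MPEG-TS, while the SDL branch falls back to the muxer's default raw frames for display.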
-
Is 'Android+FFMpeg' friendship really available?
26 January 2016, by vold_by
The question is not about whether ffmpeg code can be used on Android; I know that it can. I'm just asking whether somebody has made real performance progress with it.
I created this question after several weeks of experimenting with the stuff, and I've had enough...
I do not want to post in threads where people don't even say what kind of video they decode (resolution, codec) and talk only about some mystical FPS; I just don't understand what they are trying to do. Also, I'm not going to develop an application only for my own phone, or only for Android 2.2+ phones with extended OpenGL features. I have a quite popular phone, the HTC Desire, so if the application does not work on it, what's next?
Well, what do I have?
-
FFMpeg source from the latest HEAD branch. Actually, I could not build it with NDK5, so I decided to use a stolen one.
-
Bambuser's build script (bash) with the appropriate ffmpeg source ([web]: http://bambuser.com/r/opensource/ffmpeg-4f7d2fe-android-2011-03-07.tar.gz).
It builds well, after some corrections, using NDK5.
-
RockPlayer's gelded ffmpeg source code, with a huge Android.mk serving as the build script ([web]: http://www.rockplayer.com/download/rockplayer_ffmpeg_git_20100418.zip).
It builds with NDK3 and NDK5 after some corrections. RockPlayer is probably the coolest media player for Android, and I assumed I would get some perks from using its build.
I had a suitable video for the project (not too big, not too small): 600x360 H.264.
Both libraries from items 2 and 3 give us the ability to get frames from video (frame by frame, seeking, etc.). I did not try to get the audio track because I did not need one for the project. I'm not publishing my source here because I think it's conventional and easy to find.
Well, what are the results with video?
HTC Desire, Android 2.2
600x360, H.264
decoding and rendering are in different threads
- Bambuser (NDK5 build for armv5te, RGBA8888): 33 ms/frame average.
- RockPlayer (NDK3 build for NEON, RGB565): 27 ms/frame average.
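(For reference, per-frame numbers like these can be collected with a loop roughly like the following, against the avcodec_decode_video2() API of that FFmpeg generation; a sketch, not the actual code of either build:)

#include <sys/time.h>
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>

/* Wall-clock time in microseconds. */
static int64_t now_us(void)
{
    struct timeval tv;
    gettimeofday(&tv, NULL);
    return (int64_t)tv.tv_sec * 1000000 + tv.tv_usec;
}

/* fmt, codec_ctx and video_stream are assumed to be opened already. */
void time_decoding(AVFormatContext *fmt, AVCodecContext *codec_ctx, int video_stream)
{
    AVFrame *frame = avcodec_alloc_frame();
    AVPacket pkt;
    int got_frame, frames = 0;
    int64_t total = 0;

    while (av_read_frame(fmt, &pkt) >= 0) {
        if (pkt.stream_index == video_stream) {
            int64_t t0 = now_us();
            avcodec_decode_video2(codec_ctx, frame, &got_frame, &pkt);
            if (got_frame) {
                total += now_us() - t0;
                frames++;
            }
        }
        av_free_packet(&pkt);
    }
    if (frames > 0)
        av_log(NULL, AV_LOG_INFO, "avg decode: %lld us/frame\n",
               (long long)(total / frames));
}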
It doesn't look bad at first glance, but bear in mind that these numbers cover decoding only.
If somebody has much better decoding times, let me know.
The hardest part for video is rendering. If we have a 600x360 bitmap, we have to scale it somehow before painting, because different phones have different screen sizes and we cannot expect our video to match the screen size.
What options do we have to rescale a frame to fit the screen?
I was able to check these cases (same phone and video source):
- The sws_scale() C function in Bambuser's build (see the sketch after this list): 70 ms/frame. Unacceptable.
- Stupid bitmap rescaling in Android (Bitmap.createScaledBitmap): 65 ms/frame. Unacceptable.
- OpenGL rendering in an ortho projection onto a textured quad. In this case I did not need to scale the frame; I just needed to prepare a 1024x512 texture (RGBA8888 in my case) containing the frame pixels and then load it into the GPU (gl.glTexImage2D). Result: 220 ms/frame to render. Unacceptable. I did not expect glTexImage2D to suck that badly on a Snapdragon.
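Here is a condensed version of what case 1 boils down to, using the old 2011-era pixel format names; the sizes are the 600x360 source from above, and in a real loop the context would be created once (or fetched via sws_getCachedContext()) rather than per frame:

#include <libavcodec/avcodec.h>
#include <libswscale/swscale.h>

/* src: a decoded 600x360 frame; dst: an RGB565 buffer of dst_w * dst_h pixels. */
void scale_frame(AVFrame *src, enum PixelFormat src_fmt,
                 uint8_t *dst, int dst_w, int dst_h)
{
    struct SwsContext *sws = sws_getContext(
        600, 360, src_fmt,
        dst_w, dst_h, PIX_FMT_RGB565,
        SWS_FAST_BILINEAR,              /* the cheapest scaler */
        NULL, NULL, NULL);

    uint8_t *dst_data[1] = { dst };
    int dst_stride[1] = { dst_w * 2 };  /* 2 bytes per RGB565 pixel */

    sws_scale(sws, (const uint8_t * const *)src->data, src->linesize,
              0, 360, dst_data, dst_stride);
    sws_freeContext(sws);
}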
That’s all.
I know there is a way to use a fragment shader to convert YUV pixels on the GPU, but we would still have the same glTexImage2D and 200 ms just for texture loading.
But this is not the end. ...my only friend, the end... :)
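On the upload side, one idea worth trying, offered purely as a suggestion (untested on this hardware): allocate the 1024x512 texture once and then update only the 600x360 video region each frame with glTexSubImage2D, so the driver does not reallocate texture storage on every frame. Sketched here in C against the GLES headers; the Java GL bindings expose the same calls:

#include <GLES/gl.h>

/* Once, at init time: allocate storage for the whole 1024x512 texture. */
void init_texture(GLuint tex)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 1024, 512, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}

/* Every frame: overwrite only the 600x360 region that actually changes. */
void upload_frame(GLuint tex, const void *rgba_pixels)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 600, 360,
                    GL_RGBA, GL_UNSIGNED_BYTE, rgba_pixels);
}

Whether this actually beats 200 ms on that Snapdragon driver is an open question, but it removes one known source of upload overhead.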
It's not a hopeless situation. Trying RockPlayer, you will definitely wonder how they do that damn frame scaling so fast. I suppose they have really good experience with the ARM architecture. They most probably use avcodec_decode_video2 and then img_convert (as I did in the RP version), but then they use some tricks (depending on the ARM version) for scaling.
Maybe they also have some "magic" build configuration for ffmpeg that decreases decoding time, but the Android.mk they published is not THE Android.mk they use. Dunno...
So, for now it looks like you cannot just build a simple JNI bridge to ffmpeg and then have a real media player for the Android platform. You can do that only if you have suitable video that you do not need to scale.
Any ideas? I'm counting on you ;)