
Other articles (67)

  • Managing creation and editing rights for objects

    8 February 2011

    By default, many features are restricted to administrators, but each can be configured independently to change the minimum status required to use it, notably: writing content on the site, configurable in the form-template management; adding notes to articles; adding captions and annotations to images;

  • Uploading media and themes via FTP

    31 May 2013

    MediaSPIP also handles media transferred via FTP. If you prefer to upload this way, retrieve the access credentials for your MediaSPIP site and use your favourite FTP client.
    From the start, your FTP space contains the following directories:
    config/ : the site's configuration directory
    IMG/ : media already processed and online on the site
    local/ : the site's cache directory
    themes/ : custom themes and stylesheets
    tmp/ : working directory (...)

  • Keeping control of your media in your hands

    13 April 2011

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

On other sites (7800)

  • Running ffmpeg as library in android

    4 February 2014, by szakal

    I've got a simple task to do. I need to merge a set of pictures into a video using ffmpeg in an Android environment.

    After over a week of fighting with different tutorials and examples explaining how to compile and run ffmpeg, I've had, let's say, moderate success: I've finally compiled ffmpeg for Android.

    I followed this example, which worked best for me:
    https://github.com/appunite/AndroidFFmpeg

    As a result of building ffmpeg, I have the following directory structure:

    [Project]/jni/ffmpeg-build/armeabi-v7a/libffmpeg.so
    [Project]/jni/ffmpeg-build/armeabi/libffmpeg.so
    [Project]/jni/ffmpeg-build/mips/libffmpeg.so
    [Project]/jni/ffmpeg-build/x86/libffmpeg.so

    I also followed the NDK examples, so I can run C code from Java:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <stdbool.h>
    #include <jni.h>
    #include <android/log.h>

    bool initted = false;    
    static JavaVM *sVm;

    jstring Java_com_example_hellojni_HelloJni_stringFromJNI(JNIEnv* env, jobject thiz) {

       char **argv;
       char *cmd;
       int argc;

    //  cmd = "ffmpeg -version";
    //  argv = parsedargs(cmd, &argc);
    //  ffmpeg(argc, argv);

       return (*env)->NewStringUTF(env, "Hello from JNI !");

    }

    My question is how to call a function from ffmpeg from my "hello-jni" C file. I've read that I need to write a wrapper over ffmpeg, which is what my hello-jni is intended to be.

    Here is my Android.mk, which is probably an important part of achieving my goal, but honestly I don't understand some of the lines set in this file, or simply don't know how to make things work.

    LOCAL_PATH := $(call my-dir)

    include $(CLEAR_VARS)
    LOCAL_MODULE := ffmpeg-prebuilt
    LOCAL_SRC_FILES := ffmpeg-build/$(TARGET_ARCH_ABI)/libffmpeg.so
    LOCAL_EXPORT_C_INCLUDES := ffmpeg-build/$(TARGET_ARCH_ABI)/include
    LOCAL_EXPORT_LDLIBS := ffmpeg-build/$(TARGET_ARCH_ABI)/libffmpeg.so
    LOCAL_PRELINK_MODULE := true
    include $(PREBUILT_SHARED_LIBRARY)


    include $(CLEAR_VARS)
    LOCAL_ALLOW_UNDEFINED_SYMBOLS=true
    LOCAL_MODULE    := hello-jni
    LOCAL_SRC_FILES := hello-jni.c
    LOCAL_C_INCLUDES := $(LOCAL_PATH)/ffmpeg-build/$(TARGET_ARCH_ABI)/include
    LOCAL_SHARED_LIBRARIES := ffmpeg-prebuilt
    #LOCAL_CFLAGS += -g -Iffmpeg-prebuilt -Ihello-jni -Wno-deprecated-declarations
    #LOCAL_LDLIBS += -llog -lz -landroid ffmpeg-build/$(TARGET_ARCH_ABI)/libffmpeg.so

    include $(BUILD_SHARED_LIBRARY)

    One more thing. I've found an example of how to wrap ffmpeg's main function. It'd be the easiest way for me to use ffmpeg, since I don't know ffmpeg's API, and I hope it's possible to run ffmpeg this way:
    Can FFmpeg be used as a library, instead of a standalone program?

    To sum up, I think my problems are due to completely lack of c/c++ knowledge at all, especially how to use run any function from .so library.

    I hope someone can help me :).

  • FFmpeg tee muxer

    14 February 2013, by deed02392

    I'm trying to use the Tee muxer recently introduced to FFmpeg, to both output a stream over UDP and render the output in an SDL window.

    This is because I want to measure latency in my live broadcast system between the input to FFmpeg and its output (i.e. the encoding process latency).

    Can anyone explain how to use it properly? This is what I have tried:

    ffmpeg -re -i ......\sample_ipod.m4v -f tee "[f=mpegts]udp://127.0.0.1:1233|[f=sdl]foo"

    I understood this should result in an output stream to UDP, and that I could specify additional formats by enclosing them like [f=sdl] to mean -f sdl. Then, because the sdl muxer requires an argument (the window title), I provided it in the same way the example gives for the stream.

    Hope some FFmpeg expert can shed some light on this. I've even scanned the code that adds the Tee muxer to no avail.
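    Not a definitive answer, but two properties of the tee muxer are worth noting: it does not auto-select output streams or codecs, so explicit -map and -c options are required; and every slave receives the same encoded packets, which means an sdl slave (which displays raw video) cannot share an output with an mpegts slave carrying H.264. A hedged sketch that sidesteps tee by using two ordinary outputs instead (the window title "Preview" is an arbitrary placeholder):

```shell
# Sketch, not a verified invocation: encode once to MPEG-TS over UDP,
# and render a second rawvideo output in an SDL window.
# Requires an ffmpeg build with the sdl output device enabled.
ffmpeg -re -i sample_ipod.m4v \
    -map 0:v -c:v libx264 -f mpegts udp://127.0.0.1:1233 \
    -map 0:v -c:v rawvideo -pix_fmt yuv420p -f sdl "Preview"
```

    If both destinations can accept the same encoded stream, the single-encode tee form with explicit -map and codec options becomes viable again.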

  • Is 'Android+FFMpeg' friendship really available?

    26 January 2016, by vold_by

    The question is not whether ffmpeg code can be used on Android. I know that it can. I'm just asking whether anybody has made real performance progress with it.
    I created this question after several weeks of experiments with the stuff, and I've had enough...
    I do not want to post in threads where people don't even say what kind of video they decode (resolution, codec) and talk only about some mystical FPS. I just don't understand what they want to do. Also, I'm not going to develop an application only for my phone, or only for Android 2.2++ phones that have some extended OpenGL features. I have a quite popular phone, the HTC Desire, so if the application does not work on it, what's next?

    Well, what do I have?

    1. FFmpeg source from the latest HEAD. Actually, I could not build it with NDK5, so I decided to use a stolen one.

    2. Bambuser's build script (bash) with the matching ffmpeg source ([web] : http://bambuser.com/r/opensource/ffmpeg-4f7d2fe-android-2011-03-07.tar.gz).
      It builds well, after some corrections, using NDK5.

    3. Rockplayer's gelded ffmpeg source code with a huge Android.mk serving as the build script ([web] : http://www.rockplayer.com/download/rockplayer_ffmpeg_git_20100418.zip).
      It builds with NDK3 and NDK5 after some corrections. Rockplayer is probably the coolest media player for Android, and I supposed I would get some perks from using its build.

    I had suitable video for the project (not too big, not too small): 600x360, H.264.

    Both libraries, obtained from items 2 and 3, let us get frames from the video (frame by frame, seeking, etc.). I did not try to get an audio track, because I did not need one for the project. I'm not publishing my source here because I think it's conventional and easy to find.

    Well, what are the results with video?
    HTC Desire, Android 2.2
    600x360, H.264
    decoding and rendering are in different threads

    1. Bambuser (NDK5 build for armv5te, RGBA8888): 33 ms/frame average.
    2. Rockplayer (NDK3 build for neon, RGB565): 27 ms/frame average.

    That's not bad at first glance, but consider that these are the results just to decode frames.
    If somebody has much better decoding times, let me know.

    The hardest part for video is rendering. If we have a 600x360 bitmap, we have to scale it somehow before painting, because different phones have different screen sizes and we cannot expect our video to match the screen.

    What options do we have to rescale a frame to fit the screen?
    I was able to check these cases (same phone and video source):

    1. The sws_scale() C function in Bambuser's build: 70 ms/frame. Unacceptable.
    2. Stupid bitmap rescaling in Android (Bitmap.createScaledBitmap): 65 ms/frame. Unacceptable.
    3. OpenGL rendering in an ortho projection on a textured quad. In this case I did not need to scale the frame; I just needed to prepare a 1024x512 texture (RGBA8888 in my case) containing the frame's pixels and then upload it to the GPU (gl.glTexImage2D). Result: 220 ms/frame just to render. Unacceptable. I did not expect glTexImage2D to be that slow on a Snapdragon CPU.

    That’s all.
    I know there is a way to use a fragment shader to convert YUV pixels on the GPU, but we would still have the same glTexImage2D and its 200 ms just for texture loading.
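    For reference, the per-pixel arithmetic such a fragment shader would run is simple. Here is a sketch in plain C of the standard BT.601 video-range YUV-to-RGB conversion (the function names are made up for illustration), using fixed-point coefficients the way a GPU or NEON implementation would:

```c
#include <stdint.h>

/* Clamp an intermediate value into the 0..255 byte range. */
static uint8_t clamp8(int v) { return v < 0 ? 0 : v > 255 ? 255 : (uint8_t)v; }

/* BT.601 video-range YUV -> RGB for one pixel. Coefficients are the
 * usual 1.164 / 1.596 / 0.392 / 0.813 / 2.017 set, scaled by 1024 so
 * the conversion stays in integer arithmetic (+512 rounds the result). */
static void yuv_to_rgb(uint8_t y, uint8_t u, uint8_t v,
                       uint8_t *r, uint8_t *g, uint8_t *b)
{
    int c = y - 16, d = u - 128, e = v - 128;
    *r = clamp8((1192 * c + 1634 * e + 512) >> 10);
    *g = clamp8((1192 * c -  401 * d -  832 * e + 512) >> 10);
    *b = clamp8((1192 * c + 2066 * d + 512) >> 10);
}
```

    Running this math per pixel on the CPU is exactly what sws_scale's conversion path does; the shader approach only wins if the raw YUV planes can be uploaded to the GPU faster than a converted RGBA frame.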

    But this is not the end. ...my only friend, the end... :)
    The situation is not hopeless.

    If you try RockPlayer, you will definitely wonder how they do that damn frame scaling so fast. I suppose they have really good experience with the ARM architecture. They most probably use avcodec_decode_video2 and then img_convert (as I did in the RP version), but then use some tricks (depending on the ARM version) for scaling.
    Maybe they also have some "magic" build configuration for ffmpeg that decreases decoding time, but the Android.mk they published is not THE Android.mk they use. Dunno...

    So, it now looks like you cannot just build an easy JNI bridge to ffmpeg and then have a real media player for the Android platform. You can do this only if you have suitable video that you do not need to scale.

    Any ideas? I'm counting on you ;)