
Other articles (57)

  • Media quality after processing

    21 June 2013

    Getting the settings of the media-processing software right matters for striking a balance between the parties involved (the host's bandwidth, media quality for the editor and the visitor, accessibility for the visitor). How should you set the quality of your media?
    The higher the media quality, the more bandwidth is used, and a visitor with a low-speed internet connection will have to wait longer. Conversely, the lower the quality, the more the media becomes degraded, even (...)
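    As a rough, hypothetical back-of-the-envelope sketch of that tradeoff (the duration, bitrate, and connection speed below are made-up numbers, not MediaSPIP settings):

      public class MediaBudget {
          public static void main(String[] args) {
              double durationSeconds = 300;  // a 5-minute clip (assumption)
              double bitrateKbps = 2000;     // encoding bitrate (assumption)
              double visitorKbps = 512;      // a slow visitor connection (assumption)

              // File size: duration times bitrate, converted from kilobits to megabytes.
              double sizeMegabytes = durationSeconds * bitrateKbps / 8 / 1000;
              // Minimum transfer time on the visitor's connection, in seconds.
              double downloadSeconds = durationSeconds * bitrateKbps / visitorKbps;

              System.out.printf("size: %.0f MB, minimum download time: %.0f s%n",
                      sizeMegabytes, downloadSeconds);
              // 2000 kb/s of video over a 512 kb/s link cannot stream in real time.
          }
      }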

  • Taking part in its translation

    10 April 2011

    You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
    To do so, we use SPIP's translation interface, where all of MediaSPIP's language modules are available. You just need to sign up for the translators' mailing list to ask for more information.
    At the moment, MediaSPIP is only available in French and (...)

  • The farm's recurring Cron tasks

    1 December 2010

    Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of every instance of the shared hosting setup on a regular basis. Coupled with a system Cron on the central site of the farm, this makes it easy to generate regular visits to the various sites and keeps the tasks of rarely visited sites from being too (...)
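    To make the mechanism concrete, here is a hypothetical sketch of the idea in Java (the real task is a SPIP job inside MediaSPIP, and the instance URLs below are placeholders): once a minute, visit each hosted instance so its own Cron gets a chance to run.

      import java.net.HttpURLConnection;
      import java.net.URL;
      import java.util.List;
      import java.util.concurrent.Executors;
      import java.util.concurrent.ScheduledExecutorService;
      import java.util.concurrent.TimeUnit;

      public class SuperCron {
          public static void main(String[] args) {
              // Placeholder URLs for the mutualized instances managed by the farm.
              List<String> instances = List.of(
                      "http://site-a.example/spip.php",
                      "http://site-b.example/spip.php");
              ScheduledExecutorService scheduler =
                      Executors.newSingleThreadScheduledExecutor();
              // Every minute, generate one visit per instance.
              scheduler.scheduleAtFixedRate(() -> {
                  for (String instance : instances) {
                      try {
                          HttpURLConnection conn =
                                  (HttpURLConnection) new URL(instance).openConnection();
                          conn.getResponseCode(); // the visit itself lets the site's Cron run
                          conn.disconnect();
                      } catch (Exception e) {
                          // an unreachable instance simply waits for the next round
                      }
                  }
              }, 0, 1, TimeUnit.MINUTES);
          }
      }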

On other sites (10987)

  • UnsatisfiedLinkError linking to FFMPEG with NDK

    28 November 2011, by U Avalos

    I compiled FFMPEG for Android using Bambuser's files. The compile runs fine, with no errors.
    I also made sure to change the package name in build.sh. However, once I try to link to the files, the phone throws an UnsatisfiedLinkError. This is the Android.mk file:

    LOCAL_PATH := $(call my-dir)

    include $(CLEAR_VARS)
    MY_LIB_PATH := ffmpeg-android/build/ffmpeg/armeabi/lib
    LOCAL_MODULE := bambuser-libavcore
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavcore.so
    include $(PREBUILT_SHARED_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE := bambuser-libavformat
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavformat.so
    include $(PREBUILT_SHARED_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE := bambuser-libavcodec
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavcodec.so
    include $(PREBUILT_SHARED_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE := bambuser-libavdevice
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavdevice.so
    include $(PREBUILT_SHARED_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE := bambuser-libavfilter
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavfilter.so
    include $(PREBUILT_SHARED_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE := bambuser-libavutil
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavutil.so
    include $(PREBUILT_SHARED_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE := bambuser-libswscale
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libswscale.so
    include $(PREBUILT_SHARED_LIBRARY)


    #local_PATH := $(call my-dir)

    include $(CLEAR_VARS)

    LOCAL_MODULE    := libtest_jni
    LOCAL_SRC_FILES := libtest/video.c

    LOCAL_C_INCLUDES := \
        $(LOCAL_PATH)/include \
        $(LOCAL_PATH)/ffmpeg-android/ffmpeg
    LOCAL_LDLIBS := -L$(NDK_PLATFORMS_ROOT)/$(TARGET_PLATFORM)/arch-arm/usr/lib -L$(LOCAL_PATH) -L$(LOCAL_PATH)/ffmpeg-android/build/ffmpeg/armeabi/lib/ -lavformat -lavcodec -lavdevice -lavfilter -lavutil -lswscale -llog -lz -lm
    # -ldl -lgcc

    include $(BUILD_SHARED_LIBRARY)

    video.c is dead simple:

    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>
    #include <libswscale/swscale.h>

    #include <jni.h>
    #include <android/log.h>

    void Java_com_bukabros_videolivewallpaper_Opengl2Renderer_loadNthFrame3
        (JNIEnv * env, jobject this, jstring fileName) {

        jboolean isCopy;
        const char * szLogThis = (*env)->GetStringUTFChars(env, fileName, &isCopy);

        __android_log_print(ANDROID_LOG_DEBUG, "NDK: ", "NDK:LC: [%s]", szLogThis);
    }

    The corresponding Java code is also simple:

    private native void loadNthFrame3(String fileName);

    static {  
    System.loadLibrary("libtest_jni");  
    }  

    But I get this error:

    E/AndroidRuntime(11489): FATAL EXCEPTION: main
    E/AndroidRuntime(11489): java.lang.ExceptionInInitializerError
    E/AndroidRuntime(11489):    at com.bukabros.videolivewallpaper.VideoLiveWallpaper$CubeEngine.<init>(VideoLiveWallpaper.java:147)
    E/AndroidRuntime(11489):    at com.bukabros.videolivewallpaper.VideoLiveWallpaper.onCreateEngine(VideoLiveWallpaper.java:120)
    E/AndroidRuntime(11489):    at android.service.wallpaper.WallpaperService$IWallpaperEngineWrapper.executeMessage(WallpaperService.java:814)
    E/AndroidRuntime(11489):    at com.android.internal.os.HandlerCaller$MyHandler.handleMessage(HandlerCaller.java:61)
    E/AndroidRuntime(11489):    at android.os.Handler.dispatchMessage(Handler.java:99)
    E/AndroidRuntime(11489):    at android.os.Looper.loop(Looper.java:123)
    E/AndroidRuntime(11489):    at android.app.ActivityThread.main(ActivityThread.java:4627)
    E/AndroidRuntime(11489):    at java.lang.reflect.Method.invokeNative(Native Method)
    E/AndroidRuntime(11489):    at java.lang.reflect.Method.invoke(Method.java:521)
    E/AndroidRuntime(11489):    at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:878)
    E/AndroidRuntime(11489):    at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:636)
    E/AndroidRuntime(11489):    at dalvik.system.NativeStart.main(Native Method)
    E/AndroidRuntime(11489): Caused by: java.lang.UnsatisfiedLinkError: Library libtest_jni not found
    E/AndroidRuntime(11489):    at java.lang.Runtime.loadLibrary(Runtime.java:461)
    E/AndroidRuntime(11489):    at java.lang.System.loadLibrary(System.java:557)
    E/AndroidRuntime(11489):    at com.bukabros.videolivewallpaper.Opengl2Renderer.<clinit>(Opengl2Renderer.java:389)
    E/AndroidRuntime(11489):    ... 12 more

    I tried manually loading the prebuilt shared libraries (the bambuser files) in Java (using System.loadLibrary) but then it tells me that the files are not found.
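    One mechanical detail worth checking (a hypothetical sketch, not a confirmed fix for this project): System.loadLibrary("foo") resolves to libfoo.so, so a module built as libtest_jni.so would be loaded as "test_jni"; loadLibrary("libtest_jni") would look for liblibtest_jni.so, which matches the "not found" message above. Prebuilt dependencies also need to be loaded before the library that links against them; the order below is an assumption based on FFmpeg's usual layering.

    static {
        // Load the prebuilt FFmpeg libraries first, lowest layer first
        // (assumed order; adjust to the actual inter-library dependencies).
        System.loadLibrary("avutil");
        System.loadLibrary("avcore");
        System.loadLibrary("avcodec");
        System.loadLibrary("avformat");
        System.loadLibrary("avdevice");
        System.loadLibrary("swscale");
        System.loadLibrary("avfilter");
        // Then the JNI wrapper itself: no "lib" prefix, no ".so" suffix.
        System.loadLibrary("test_jni");
    }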

    If it helps, here's the output of readelf:

    0x00000001 (NEEDED)                     Shared library: [libc.so]
    0x00000001 (NEEDED)                     Shared library: [libstdc++.so]
    0x00000001 (NEEDED)                     Shared library: [libm.so]
    0x00000001 (NEEDED)                     Shared library: [/data/data/com.bukabros.videolivewallpaper/lib/libavformat.so]
    0x00000001 (NEEDED)                     Shared library: [/data/data/com.bukabros.videolivewallpaper/lib/libavcodec.so]
    0x00000001 (NEEDED)                     Shared library: [/data/data/com.bukabros.videolivewallpaper/lib/libavdevice.so]
    0x00000001 (NEEDED)                     Shared library: [/data/data/com.bukabros.videolivewallpaper/lib/libavfilter.so]
    0x00000001 (NEEDED)                     Shared library: [/data/data/com.bukabros.videolivewallpaper/lib/libavutil.so]
    0x00000001 (NEEDED)                     Shared library: [/data/data/com.bukabros.videolivewallpaper/lib/libswscale.so]
    0x00000001 (NEEDED)                     Shared library: [liblog.so]
    0x00000001 (NEEDED)                     Shared library: [libz.so]
    0x00000001 (NEEDED)                     Shared library: [libdl.so]
    0x0000000e (SONAME)                     Library soname: [libtest_jni.so]
    0x00000010 (SYMBOLIC)                   0x0
    0x00000004 (HASH)                       0xd4
    0x00000005 (STRTAB)                     0x250
    0x00000006 (SYMTAB)                     0x130
    0x0000000a (STRSZ)                      712 (bytes)
    0x0000000b (SYMENT)                     16 (bytes)
    0x00000003 (PLTGOT)                     0x174c
    0x00000002 (PLTRELSZ)                   32 (bytes)
    0x00000014 (PLTREL)                     REL
    0x00000017 (JMPREL)                     0x55c
    0x6ffffffe (VERNEED)                    0x53c
    0x6fffffff (VERNEEDNUM)                 1
    0x6ffffff0 (VERSYM)                     0x518
    0x00000000 (NULL)                       0x0

    Oh yeah, I'm using NDK r5.

  • Revision 5e3439bbf7: sad_test: fix msvc compile. Fixes: error C2121: '#': invalid character: possib (...)

    30 May 2013, by James Zern

    Changed Paths:
     Modify /test/sad_test.cc

    sad_test: fix msvc compile

    Fixes:
    error C2121: '#': invalid character: possibly the result of a macro expansion

    Change-Id: I63d7ebba29f3a3cbb546406be39270290e9dc47d
    (cherry picked from commit a91e5b4fdc4e81352fa3a65c7fc795527eee6b07)

  • Webcam stream with FFMpeg on iPhone

    6 December 2011, by Saphrosit

    I'm trying to send and show a webcam stream from a Linux server to an iPhone app. I don't know if it's the best solution, but I downloaded and installed FFMpeg on the Linux server (following, for those who want to know, this tutorial).
    FFMpeg is working fine. After a lot of wandering, I managed to send a stream to the client by launching

    ffmpeg  -s 320x240 -f video4linux2 -i /dev/video0 -f mpegts -vcodec libx264 udp://192.168.1.34:1234

    where 192.168.1.34 is the address of the client. Actually the client is a Mac, but it is supposed to be an iPhone. I know the stream is sent and received correctly (tested in different ways).
    However, I haven't managed to watch the stream directly on the iPhone.
    I thought of different (possible) solutions:

    • First solution: store the incoming data in an NSMutableData object. Then, when the stream ends, save it and play it using an MPMoviePlayerController. Here's the code:

      [video writeToFile:@"videoStream.m4v" atomically:YES];
      NSURL *url = [NSURL fileURLWithPath:@"videoStream.m4v"];

      MPMoviePlayerController *videoController = [[MPMoviePlayerController alloc] initWithContentURL:url];

      [videoController.view setFrame:CGRectMake(100, 100, 150, 150)];

      [self.view addSubview:videoController.view];

      [videoController play];

      The problem with this solution is that nothing is played (I only see a black square), even though the video is saved correctly (I can play it directly from my disk using VLC). Besides, it's not such a great idea; it's just to make things work.

    • Second solution: use CMSampleBufferRef to store the incoming video. Many more problems come with this solution: first of all, there's no CoreMedia.framework on my system. Besides, I don't quite understand what this type represents or what I should do to make it work: if I start (somehow) filling this "SampleBuffer" with bytes received from the UDP connection, will it automatically call the CMSampleBufferMakeDataReadyCallback function I set during creation? If so, when? When a single frame is completed, or when the whole stream has been received?

    • Third solution: use the AVFoundation framework (this isn't actually available on my Mac either). I couldn't work out whether it's actually possible to start recording from a remote source, or even from an NSMutableData, a char*, or something like that. In the AVFoundation Programming Guide I didn't find any reference saying whether or not this is possible.

    I don't know which of these solutions is best for my purpose. ANY suggestion would be appreciated.

    Besides, there's another problem: I didn't use any segmenter program to send the video. If I'm not mistaken, a segmenter splits the source video into smaller/shorter chunks that are easier to send. If that's right, then maybe it isn't strictly necessary to make things work (it could be added later). However, since the server runs Linux, I cannot use Apple's mediastreamsegmenter. Can someone suggest an open-source segmenter to use together with FFMpeg?


    UPDATE: I edited my question to add more information about what I've done so far and what my doubts are.