Other articles (66)

  • The farm's regular cron tasks

    1 December 2010, by

    Managing the farm relies on several repetitive tasks, known as cron tasks, being run at regular intervals.
    The super cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the cron of every instance of the multi-site farm on a regular basis. Combined with a system cron on the farm's central site, this generates regular visits to the various sites and keeps the tasks of rarely visited sites from being too (...)

  • Adding user-specific information and other author-related behaviour changes

    12 April 2011, by

    The simplest way to add information to authors is to install the Inscription3 plugin. It also lets you change certain user-related behaviours (see its documentation for more information).
    It is also possible to add fields to authors by installing the plugins Champs Extras 2 and Interface pour champs extras.

  • Authorisations overridden by plugins

    27 April 2010, by

    MediaSPIP core
    autoriser_auteur_modifier() so that visitors can edit their own information on the authors page

On other sites (14025)

  • Seeking Guidance on Audio Development Libraries for Playing Various Music Formats on iOS [closed]

    3 December 2023, by FaiChou

    I am exploring audio development and am overwhelmed by the numerous libraries available:

    • AVFoundation
    • MediaPlayer
    • AVFAudio
    • Audio Engine
    • CoreAudio
    • AudioToolbox
    • Audio Unit

    Which of these should I use to play both network and local music files (flac/wav/mp3/aac/wma/ogg)? What are their specific roles?

    My understanding is that I should use FFmpeg for demuxing to unpack the container, then decode the stream to get PCM samples, possibly using Audio Engine for playback. Is this a software-decoding approach? For Apple-supported formats like mp3, can I bypass FFmpeg for decoding? Which library would be most suitable in that case?
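    The pipeline described above (demux, then decode to PCM) is exactly what FFmpeg's command-line front end does in one step; a minimal sketch, assuming an ffmpeg binary is on the PATH and using a hypothetical song.flac input:

```shell
# Demux song.flac and decode it to raw PCM samples
# (16-bit signed little-endian, 44.1 kHz, stereo).
ffmpeg -i song.flac -f s16le -ar 44100 -ac 2 song.pcm
```

    On iOS the same stages would run through the libavformat/libavcodec C APIs rather than the CLI, with the decoded PCM handed to an output API such as AVAudioEngine.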

    Additionally, how do I extract metadata (song name, artist, cover, lyrics, etc.) from music files? Apple's own solutions seem to support only ID3 or iTunes tags. How should I handle other formats like wav/flac?
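    One format-agnostic option (a suggestion on my part, assuming FFmpeg's ffprobe tool is available) is to read the tags through ffprobe, which understands flac/wav/ogg comments as well as ID3:

```shell
# Dump container-level metadata (title, artist, etc.) as JSON.
# Works for flac/wav/ogg as well as mp3; "song.flac" is a placeholder.
ffprobe -v quiet -print_format json -show_format song.flac
```

    Programmatically, the same tags are exposed through libavformat's AVFormatContext metadata dictionary via av_dict_get().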

    The two links below are the resources I've referenced, but they seem outdated. My search for newer, reliable explanations has been fruitless.

  • Piwik 3 Development Update #2 – Git master branch will become Piwik 3

    2 September 2016, by Piwik Core Team — Community, Development

    As mentioned in the Piwik 3 Development Update #1, we are actively working on the new major Piwik 3 release.

    This blog post is an announcement regarding an upcoming change on our Git repository.

    On October 4th CET, we will merge the current changes done for Piwik 3 from the “3.x-dev” branch into the “master” branch across our Piwik projects and plugin repositories. While this is not important for most of our users, it can be a problem if you have installed and deployed Piwik from git.

    • If you are currently on “master” branch and want to continue using Piwik 2, you need to checkout the newly created “2.x-dev” branch instead of “master” anytime within the next 4 weeks. Don’t forget to update your scripts and scheduled tasks (cronjobs) that may reference “master” branch.
    • If you want to receive an early version of Piwik 3 via git automatically, you won’t have to change anything.
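    For a git-deployed Piwik that should stay on version 2, the branch switch described above might look like this (the checkout location is a placeholder for your own deployment path):

```shell
# Inside your Piwik checkout, currently tracking "master"
git fetch origin
# Move to the newly created Piwik 2 branch
git checkout 2.x-dev
git pull origin 2.x-dev
```

    Remember to point any scripts or cronjobs that reference "master" at "2.x-dev" as well.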

    The final Piwik 3 release will be ready before the end of the year. If you want to give it a try, you can either use Piwik from Git and check out the “3.x-dev” branch, or download Piwik 3 from GitHub.

    Until our next Piwik 3 dev update, Happy analysis!

  • How can I play a raw H.264 codec format with the VideoView API on Android?

    31 July 2018, by lscodex

    I have a big problem. I have a video, "tc10.264", that I downloaded from live555.com. I cannot play it with VideoView, ExoPlayer, or the Vitamio SDK on Android.
    I know the file is a raw codec stream; it is not in a container like mp4, flv, etc.
    Later, I played the video with "ffplay -f h264 tc10.264" in the Windows console, and ffmpeg reported this spec:
    h264 (baseline), yuvj420p, 25 fps.

    So the video itself is valid.
    Afterwards, I downloaded ffmpeg on Ubuntu in a virtual machine and built (compiled) ffmpeg with x264 as shown below.

    My directory layout is ffmpeg > x264, and I compiled libx264.a for ffmpeg from the x264 folder.

    Note: my Android phone's architecture is armeabi-v7a.

    Here is the build_android_arm.sh script:

    #!/bin/bash
    echo starting building ....
    NDK=/home/lscodex/android-sdk/ndk-bundle
    PLATFORM=$NDK/platforms/android-19/arch-arm/
    TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64
    PREFIX=./android/arm

    function build_one
    {
     ./configure \
     --prefix=$PREFIX \
     --enable-static \
     --enable-pic \
     --host=arm-linux \
     --cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
     --sysroot=$PLATFORM

     make clean
     make
     make install
    }

    build_one

    echo Android ARM builds finished....

    Afterwards, I had a folder called android containing lib, include, and bin directories. I then ran the script shown below to produce the shared libraries (".so") in the ffmpeg folder.
    Here is the build_android_armeabi_v7a.sh script:

    #!/bin/bash

    echo Android starting armeabi_v7a
    NDK=/home/lscodex/android-sdk/ndk-bundle
    PLATFORM=$NDK/platforms/android-19/arch-arm/
    PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64
    CPU=armeabi_v7a
    PREFIX=$(pwd)/android/$CPU

    GENERAL="\
    --enable-small \
    --enable-cross-compile \
    --extra-libs="-lgcc" \
    --arch=arm \
    --cc=$PREBUILT/bin/arm-linux-androideabi-gcc \
    --cross-prefix=$PREBUILT/bin/arm-linux-androideabi- \
    --nm=$PREBUILT/bin/arm-linux-androideabi-nm \
    --extra-cflags="-I../x264/android/arm/include" \
    --extra-ldflags="-L../x264/android/arm/lib" "


    MODULES="\
    --enable-gpl \
    --enable-libx264"

    H264TEST="\
    --enable-encoder=libx264 \
    --enable-encoder=libx264rgb \
    --enable-decoder=h264 \
    --enable-muxer=h264 \
    --enable-demuxer=h264 \
    --enable-parser=h264"

    function build_ARMv7
    {
     ./configure \
     --target-os=android \
     --prefix=$PREFIX \
     ${GENERAL} \
     --sysroot=$PLATFORM \
     --enable-shared \
     ${H264TEST} \
     --disable-static \
     --extra-cflags="-march=armv7-a -mfloat-abi=softfp -mfpu=vfpv3-d16 -fomit-frame-pointer -fstrict-aliasing -funswitch-loops -finline-limit=300" \
     --extra-ldflags="-Wl,-rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib -nostdlib -lc -lm -ldl -llog" \
     --enable-zlib \
     ${MODULES} \
     --disable-doc \
     --enable-neon

     make clean
     make
     make install
    }


    build_ARMv7
    echo Android ARMv7-a builds finished

    Finally, I obtained another android folder containing the shared libraries.
    I integrated the files into Android Studio: I created a CMakeLists.txt and a cpp folder.
    So everything is working perfectly, I think.
    This NDK code lets me read the duration of the bugs_bunny.mp4 video:

    #include <jni.h>
    extern "C" {
    #include <libavformat/avformat.h>
    }

    extern "C"
    JNIEXPORT jint JNICALL
    Java_com_lscodex_just_videoplayertesting2_VideoProcessing_videoDuration(
            JNIEnv *env,
            jobject obj,
            jstring input) {

        AVFormatContext *pFormatCtx = NULL;

        // Open the input file and read its header.
        if (avformat_open_input(&pFormatCtx, convertStringFileToChar(env, input), NULL, NULL) < 0) {
            throwException(env, "Could not open input file");
            loge("Could not open input file");
            return 0;
        }
        // Read a few packets to fill in stream information.
        if (avformat_find_stream_info(pFormatCtx, NULL) < 0) {
            throwException(env, "Failed to retrieve input stream information");
            loge("Failed to retrieve input stream information");
            avformat_close_input(&pFormatCtx);
            return 0;
        }

        logd("I reached it here :) ");

        int64_t duration = pFormatCtx->duration;
        // avformat_close_input() frees the context as well, so no separate
        // avformat_free_context() call is needed.
        avformat_close_input(&pFormatCtx);
        return (jint) (duration / AV_TIME_BASE);
    }

    At least it works for the mp4 format.
    So my question is: how can I play the tc10.264 raw codec video, via ffmpeg, in ExoPlayer or the VideoView API?
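    One approach worth trying (my suggestion, not something from the original post): instead of making the players read a raw elementary stream, remux it into an mp4 container with ffmpeg. No re-encoding is needed, and mp4 is a format both VideoView and ExoPlayer support out of the box:

```shell
# Wrap the raw H.264 elementary stream in an mp4 container.
# -framerate 25 matches the 25 fps ffplay reported; -c copy avoids re-encoding.
ffmpeg -framerate 25 -i tc10.264 -c copy tc10.mp4
```

    This sidesteps the need for a custom ffmpeg-based decoder on the device entirely, since the remuxed file carries the timing information the stock players expect.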