Other articles (86)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • MediaSPIP v0.2

    21 June 2013

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013, and it is announced here.
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    As with the previous version, all of the software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also have to make other modifications (...)

  • Managing creation and editing rights for objects

    8 February 2011

    By default, many features are restricted to administrators, but they can each be configured independently to change the minimum status required to use them, notably: writing content on the site, adjustable in the form-template management; adding notes to articles; adding captions and annotations to images;

On other sites (13435)

  • Best way to stream live video under webRTC development [on hold]

    5 April 2016, by Ankit Dhanna

    I have implemented WebRTC live video chat with multiple presenters (using Licode).
    However, I want to ask: how can I broadcast this video stream so that multiple users who are not on WebRTC can also be served?

    Where, or to which server, should I broadcast this stream to get the lowest lag?

  • How can I get a ".264" codec format working in the VideoView API?

    31 July 2018, by lscodex

    I have a big problem. I have a video, "tc10.264", which I downloaded from live555.com. I cannot play it with VideoView, ExoPlayer or the Vitamio SDK
    on Android.
    I know that the file is a raw codec stream; it is not a container like mp4, flv, etc.
    Later, I played the video with "ffplay -f h264 tc10.264" in the Windows console,
    and ffmpeg showed me this spec:
    h264 (baseline), yuvj420p, 25 fps.

    OK, so the video does exist and is readable.
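
    The "-f h264" flag simply forces ffplay's raw H.264 demuxer. A minimal sketch of the same probe through the libavformat C API, assuming only the file name from above (everything else is illustrative), might look like this:

    #include <stdio.h>
    #include <libavformat/avformat.h>

    int main(void) {
        AVFormatContext *ctx = NULL;

        // av_find_input_format("h264") forces the raw H.264 demuxer,
        // the API equivalent of ffplay's "-f h264".
        if (avformat_open_input(&ctx, "tc10.264", av_find_input_format("h264"), NULL) < 0) {
            fprintf(stderr, "Could not open tc10.264 as a raw H.264 stream\n");
            return 1;
        }
        if (avformat_find_stream_info(ctx, NULL) < 0) {
            fprintf(stderr, "Could not read stream information\n");
            avformat_close_input(&ctx);
            return 1;
        }

        // Prints the detected codec, pixel format and frame rate,
        // much like the ffplay output quoted above.
        av_dump_format(ctx, 0, "tc10.264", 0);

        avformat_close_input(&ctx);
        return 0;
    }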

    Afterwards, I downloaded ffmpeg on Ubuntu in a virtual machine and built (compiled) ffmpeg together with x264, as shown below.

    My directory layout is ffmpeg > x264, and I compiled libx264.a for ffmpeg from the x264 folder.

    Note: my Android phone's architecture is armeabi-v7a.

    Here is the build_android_arm.sh script.

    #!/bin/bash
    # Cross-compile x264 as a static, position-independent library for
    # Android ARM (android-19) using the NDK's arm-linux-androideabi toolchain.
    echo starting building ....
    NDK=/home/lscodex/android-sdk/ndk-bundle
    PLATFORM=$NDK/platforms/android-19/arch-arm/
    TOOLCHAIN=$NDK/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64
    PREFIX=./android/arm

    function build_one
    {
     ./configure \
     --prefix=$PREFIX \
     --enable-static \
     --enable-pic \
     --host=arm-linux \
     --cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
     --sysroot=$PLATFORM

     make clean
     make
     make install
    }

    build_one

    echo Android ARM builds finished....

    Afterwards, I had a folder called android containing lib, include and bin directories. Then I ran the script shown below to build the shared libraries (".so") inside the ffmpeg folder.
    Here is the build_android_armeabi_v7a.sh script.

    #!/bin/bash
    # Cross-compile FFmpeg as shared libraries (.so) for armeabi-v7a,
    # linking against the x264 static library built in the previous step.

    echo Android starting armeabi_v7a
    NDK=/home/lscodex/android-sdk/ndk-bundle
    PLATFORM=$NDK/platforms/android-19/arch-arm/
    PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64
    CPU=armeabi_v7a
    PREFIX=$(pwd)/android/$CPU

    GENERAL="\
    --enable-small \
    --enable-cross-compile \
    --extra-libs="-lgcc" \
    --arch=arm \
    --cc=$PREBUILT/bin/arm-linux-androideabi-gcc \
    --cross-prefix=$PREBUILT/bin/arm-linux-androideabi- \
    --nm=$PREBUILT/bin/arm-linux-androideabi-nm \
    --extra-cflags="-I../x264/android/arm/include" \
    --extra-ldflags="-L../x264/android/arm/lib" "


    MODULES="\
    --enable-gpl \
    --enable-libx264"

    H264TEST="\
    --enable-encoder=libx264 \
    --enable-encoder=libx264rgb \
    --enable-decoder=h264 \
    --enable-muxer=h264 \
    --enable-demuxer=h264 \
    --enable-parser=h264"

    function build_ARMv7
    {
     ./configure \
     --target-os=android \
     --prefix=$PREFIX \
     ${GENERAL} \
     --sysroot=$PLATFORM \
     --enable-shared \
     ${H264TEST} \
     --disable-static \
     --extra-cflags="-march=armv7-a -mfloat-abi=softfp -mfpu=vfpv3-d16 -fomit-frame-pointer -fstrict-aliasing -funswitch-loops -finline-limit=300" \
     --extra-ldflags="-Wl,-rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib -nostdlib -lc -lm -ldl -llog" \
     --enable-zlib \
     ${MODULES} \
     --disable-doc \
     --enable-neon

     make clean
     make
     make install
    }


    build_ARMv7
    echo Android ARMv7-a builds finished

    Finally, I obtained another android folder containing the shared libraries.
    I integrated the files into Android Studio: I created a CMakeLists.txt and a cpp folder.
    So everything seems to be working correctly, I think.
    The NDK code below retrieves the duration of a bugs_bunny.mp4 video.

    extern "C"
    JNIEXPORT jint JNICALL
    Java_com_lscodex_just_videoplayertesting2_VideoProcessing_videoDuration(
           JNIEnv *env,
           jobject obj,
           jstring input) {


       AVFormatContext *pFormatCtx = NULL;

        if(avformat_open_input(&pFormatCtx,convertStringFileToChar(env,input),NULL,NULL)<0){
            throwException(env,"Could not open input file ");
            loge("Could not open input file ");
            return 0;
        }
        if (avformat_find_stream_info(pFormatCtx,NULL)<0){
            throwException(env,"Failed to retrieve input stream information");
            loge("Failed to retrieve input stream information");
            return 0;
        }

       logd("I reached it here :) ");


       int64_t duration = pFormatCtx->duration;
       avformat_close_input(&pFormatCtx);
       avformat_free_context(pFormatCtx);
       return (jint) (duration / AV_TIME_BASE);
    }

    At least it does for the mp4 video format.
    So my question is: how can I play the tc10.264 raw video stream via ffmpeg in ExoPlayer or the VideoView API?
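
    One way to make such a file playable in stock players is to remux the raw Annex B stream into an MP4 container with libavformat, so that VideoView or ExoPlayer can open the result through their normal APIs. The sketch below is only an illustration, not code from the question: remux_raw_h264 is a made-up helper name, error handling is abbreviated, and the 25 fps value is assumed from the ffplay output quoted earlier (baseline profile, so there is no B-frame reordering to worry about).

    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>
    #include <libavutil/mathematics.h>

    // Hedged sketch: wrap a raw Annex B .264 elementary stream into an .mp4
    // container. remux_raw_h264 is a hypothetical helper, not an FFmpeg API.
    static int remux_raw_h264(const char *in_path, const char *out_path, int fps) {
        AVFormatContext *in_ctx = NULL, *out_ctx = NULL;
        AVPacket *pkt = av_packet_alloc();
        int64_t frame_index = 0;

        // Force the raw H.264 demuxer, like "ffmpeg -f h264 -i <file>".
        if (avformat_open_input(&in_ctx, in_path, av_find_input_format("h264"), NULL) < 0)
            return -1;
        if (avformat_find_stream_info(in_ctx, NULL) < 0)
            return -1;

        // One video stream in, one video stream out; copy the codec parameters.
        avformat_alloc_output_context2(&out_ctx, NULL, NULL, out_path);
        AVStream *out_st = avformat_new_stream(out_ctx, NULL);
        avcodec_parameters_copy(out_st->codecpar, in_ctx->streams[0]->codecpar);
        out_st->codecpar->codec_tag = 0;

        if (avio_open(&out_ctx->pb, out_path, AVIO_FLAG_WRITE) < 0)
            return -1;
        if (avformat_write_header(out_ctx, NULL) < 0)
            return -1;

        while (av_read_frame(in_ctx, pkt) >= 0) {
            // A raw elementary stream carries no timestamps, so synthesize
            // them from the assumed constant frame rate.
            pkt->pts = pkt->dts = av_rescale_q(frame_index, (AVRational){1, fps},
                                               out_st->time_base);
            pkt->duration = av_rescale_q(1, (AVRational){1, fps}, out_st->time_base);
            pkt->stream_index = 0;
            pkt->pos = -1;
            frame_index++;
            av_interleaved_write_frame(out_ctx, pkt);
            av_packet_unref(pkt);
        }

        av_write_trailer(out_ctx);
        avio_closep(&out_ctx->pb);
        avformat_free_context(out_ctx);
        avformat_close_input(&in_ctx);
        av_packet_free(&pkt);
        return 0;
    }

    On the command line the rough equivalent is "ffmpeg -f h264 -i tc10.264 -c copy out.mp4"; once the stream sits inside an MP4 container, VideoView and ExoPlayer can play it like any other local video.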

  • Video Streaming Media Source Segments - This SourceBuffer has been removed from the parent media source

    16 September 2021, by Swifted

    I have a problem with my .mp4 file, segmented with MP4Box using the following command:

    MP4Box -dash 4000 -rap -segment-name segment_ Trailler.mp4

    This generates my files "segment_1.m4s", "segment_2.m4s", ..., as well as video_dash.mpd and segment_init.mp4.

    Then, on the JavaScript side, I use MediaSource, following this tutorial:

    https://eyevinntechnology.medium.com/how-to-build-your-own-streaming-video-html-player-6ee85d4d078a

    The tutorial works very well for me, except when I try to use my fragmented .mp4 video, at which point the following error comes up: "This SourceBuffer has been removed from the parent media source".

    Can you tell me, or point me to, why this error comes up?

    Thanks in advance