Other articles (37)

  • Writing a news item

    21 June 2013, by

    Present changes to your MediaSPIP site, or news about your projects, using the news section.
    In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customise the form used to create a news item.
    News item creation form: for a document of the news type, the default fields are: publication date (customise the publication date) (...)

  • Submitting improvements and additional plugins

    10 April 2011

    If you have developed a new extension that adds one or more features useful to MediaSPIP, let us know, and its integration into the official distribution will be considered.
    You can use the development mailing list to let us know, or to ask for help with writing the plugin. Since MediaSPIP is based on SPIP, it is also possible to use SPIP's SPIP-zone mailing list to (...)

  • Emballe médias: what is it for?

    4 February 2011, by

    This plugin is designed to manage sites that publish documents of all types.
    It creates "media", meaning: a "media" is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only one document can be linked to a so-called "media" article;

On other sites (4429)

  • How to route FFmpeg screen capture output to Python for OpenCV processing?

    16 December 2016, by Bitani

    I am trying to capture the screen of a Mac, specifically game windows, and process the frames in real time using OpenCV. I currently have this working with pyscreenshot and a conversion to a numpy array, but the maximum frame rate of that method appears to be around 10 FPS on any system.

    Obviously, with games it would be very beneficial to be able to process 30+ FPS, so I found that FFmpeg can record the screen using this command:

    ffmpeg -y -f avfoundation -capture_cursor 1 -i "1" -pix_fmt yuv420p -vf scale=1024:-1 -r 30 out.mp4

    That writes the screen to a file, out.mp4, with a width of 1024 and a frame rate of 30 FPS. Works perfectly! But now I've searched and searched for a method to route FFmpeg's output to Python and... no luck.

    If anybody knows of a way to do this I would be greatly appreciative, as I've been trying to solve this for quite a while!

    Thanks, and happy holidays!
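
    One approach known to work (a sketch, not from the original post): have ffmpeg write raw frames to stdout instead of a file, and read them from the pipe in Python. The fixed 1024x576 frame size and the bgr24 pixel format are assumptions added here; the width and height must match the scale filter exactly, because raw video carries no frame headers.

    import subprocess
    import numpy as np
    import cv2

    WIDTH, HEIGHT = 1024, 576          # must match the scale filter below

    cmd = [
        "ffmpeg",
        "-f", "avfoundation", "-capture_cursor", "1", "-i", "1",
        "-vf", "scale=%d:%d" % (WIDTH, HEIGHT),
        "-r", "30",
        "-pix_fmt", "bgr24",           # OpenCV's native channel order
        "-f", "rawvideo", "pipe:1",    # raw frames to stdout instead of out.mp4
    ]
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.DEVNULL)

    frame_size = WIDTH * HEIGHT * 3    # bytes per one bgr24 frame
    while True:
        buf = proc.stdout.read(frame_size)
        if len(buf) < frame_size:      # ffmpeg exited or the stream ended
            break
        frame = np.frombuffer(buf, dtype=np.uint8).reshape((HEIGHT, WIDTH, 3))
        cv2.imshow("capture", frame)   # any OpenCV processing goes here
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    proc.terminate()
    cv2.destroyAllWindows()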

  • libav works with RTP stream on PC, but not on Android (Same RTP stream)

    25 September 2016, by Nitay

    I'm using libav to decode video received from a third party. The video is received in an Android app, which turns it into an RTP stream and publishes it to another device.

    When I use a PC as the target device, the stream decodes properly and I see video. When I use Android (the same code, compiled for Android), the video does not decode at all.
    This happens only with the video from the third party; other video streams work fine both on PC and on Android.

    To be clear:

    • If the stream is cast from a command line using ffmpeg -> video is displayed both on Android and on PC
    • If the stream is cast from the Android app -> video is displayed only on PC (the same code, compiled for different platforms)

    libav 11.7 was compiled for Android using the following configure invocation:

    NDK=/opt/android-ndk-r12b
    SYSROOT="${NDK}/platforms/android-23/arch-arm/"
    ECFLAGS="-march=armv7-a -mfloat-abi=softfp -I /usr/local/include"
    ELDFLAGS="-Wl,--fix-cortex-a8 -L /usr/local/lib"
    ARCH_SPECIFIC="--disable-asm --arch=arm --cpu=armv7-a --cross-prefix=/opt/android-ndk-r12b/prebuilt/linux-x86_64/bin/../../../toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64/bin/arm-linux-androideabi-"

    ./configure \
    ${ARCH_SPECIFIC} \
    --target-os=linux \
    --sysroot="$SYSROOT" \
    --extra-cflags="$ECFLAGS" \
    --extra-ldflags="$ELDFLAGS" \
    --enable-shared \
    --disable-symver

    (--disable-asm is unfortunately needed to avoid text relocations in the compiled library, which are not allowed on Android)

    Here are the libav logs from the Android side: http://pastebin.com/MDE3N7BD

    The lines starting with LIBAV are libav messages; the ones without are my own log messages wrapped around the libav calls.

    Logs from the PC side: http://pastebin.com/N0Fd18F9

    The loop that reads the frames:

           // If read frame fails (which happens), keep trying
           LOG_DEBUG("Before read frame");
           while (av_read_frame(formatContext, &packet) >= 0 && !terminate)
           {
               LOG_DEBUG1("Packet read. Size: %d", packet.size);

               this->DecodeFrame(videoStreamIndex, formatContext->streams[videoStreamIndex]->codec, &packet);

               av_free_packet(&packet);
               av_init_packet(&packet);
           }

    And here's the frame-decoding code:

    void VideoDecoder::DecodeFrame(int videoStreamIndex, AVCodecContext* streamCodec, AVPacket* packet)
    {
       static bool save_file = false;

       AVPixelFormat destinationFormat = AV_PIX_FMT_RGBA;
       LOG_DEBUG("Decoding frame!");


       if (this->isFirstFrame)
       {
           LOG_DEBUG("Creating codecs");
           this->isFirstFrame = false;
           // For parsing the packets, we first need to create the right codec
           AVCodec* h264Codec = NULL;
            // (I'm not sure why ffmpeg needs this; it has an SDP file which states exactly that, but okay)
           h264Codec = avcodec_find_decoder(AV_CODEC_ID_H264);

           // Now make a copy of the codec for us to change
           codecContext = avcodec_alloc_context3(h264Codec);
           avcodec_get_context_defaults3(codecContext, h264Codec);
           avcodec_copy_context(codecContext, streamCodec);


           // Initialize codecContext to use codec
           if (avcodec_open2(codecContext, h264Codec, NULL) >= 0)
           {
               // There's a nasty edge case here that we need to handle first
               if (streamCodec->width == 0 || streamCodec->height == 0)
               {
                    // This means the stream was initialized before any packets reached it, and we can't
                    // allocate buffers without knowing their size. To handle this we allocate the largest
                    // buffer we can think of

                   codecContext->width = MAX_RESOLUTION_WIDTH;
                   codecContext->height = MAX_RESOLUTION_HEIGHT;
               }

               // Instantiate new buffers
               int size = avpicture_get_size(AV_PIX_FMT_YUV420P, codecContext->width, codecContext->height);
               originalPic = av_frame_alloc();
               originalPicBuffer = (uint8_t*)(av_malloc(size));

               avpicture_fill((AVPicture*)originalPic, originalPicBuffer, AV_PIX_FMT_YUV420P, codecContext->width, codecContext->height);
           }

           // Instantiate an output context, for usage in the conversion of the picture
           outputFormatContext = avformat_alloc_context();
       }

       if ((packet->stream_index == videoStreamIndex) && !terminate)
       {
           // Packet is video. Convert!

           if (outputStream == NULL)
           {
               //create stream in file
               outputStream = avformat_new_stream(outputFormatContext, streamCodec->codec);
               avcodec_copy_context(outputStream->codec, streamCodec);
               outputStream->sample_aspect_ratio = streamCodec->sample_aspect_ratio;
           }

           int pictureReceived = 0;
           packet->stream_index = outputStream->id;
           int result = avcodec_decode_video2(codecContext, originalPic, &pictureReceived, packet);
           //          std::cout << "Bytes decoded " << result << " check " << check << std::endl;

           if (pictureReceived)
           {
               LOG_DEBUG("New frame received");
                // NOTICE: It is generally not good practice to allocate on demand instead of at initialization.
                // In this case the edge cases demand it (what happens if width == 0 on the first packet?)
               if (this->imageConvertContext == NULL)
               {
                   // Allocate pictures and buffers for conversion
                   this->imageConvertContext = sws_getContext(
                       codecContext->width,
                       codecContext->height,
                       codecContext->pix_fmt,
                       codecContext->width,
                       codecContext->height,
                       destinationFormat,
                       SWS_BICUBIC,
                       NULL, NULL, NULL);
               }

               if (this->convertedPic == NULL)
               {
                   int size_rgba = avpicture_get_size(destinationFormat, codecContext->width, codecContext->height);
                   convertedPicBuffer = (uint8_t*)(av_malloc(size_rgba));
                   convertedPic = av_frame_alloc();
                   avpicture_fill((AVPicture*)convertedPic, convertedPicBuffer, destinationFormat, codecContext->width, codecContext->height);
               }

               // Scale the image
               sws_scale(imageConvertContext, originalPic->data, originalPic->linesize, 0, codecContext->height, convertedPic->data, convertedPic->linesize);

               // We have a frame! Callback
               if (frameReadyCallback != NULL)
               {
                   LOG_DEBUG3("Updated frame [width=%d, height=%d, ptr=0x%08x]", codecContext->width, codecContext->height, convertedPic->data[0]);
                   if (save_file)
                   {
                       save_file = false;
                       std::string filename = "/storage/emulated/0/DCIM/saved_file.rgba";
                       save_buffer_to_file((unsigned char*)convertedPic->data[0], codecContext->width * codecContext->height * 4, filename.c_str());
                       LOG_DEBUG("Exported file");
                   }
                   frameReadyCallback((char*)convertedPic->data[0], codecContext->width, codecContext->height);
               }
           }
           else
           {
               LOG_DEBUG("Packet without frame");
           }
       }
    }

    Obviously the stream from the third party is somehow different, probably from a different encoder. But it works with libav (same version) on PC. What could be different on Android that causes it not to find the frames?
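
    One way to narrow this down (a sketch, not part of the original post): dump the stream parameters each side actually sees and compare them, since encoders often differ in profile, level or resolution. The SDP file names below are hypothetical placeholders; ffprobe's JSON output exposes the relevant fields.

    import json
    import subprocess

    def probe(sdp_path):
        # Newer FFmpeg builds require a protocol whitelist for SDP input
        out = subprocess.run(
            ["ffprobe", "-v", "error",
             "-protocol_whitelist", "file,rtp,udp",
             "-show_streams", "-of", "json", sdp_path],
            capture_output=True, text=True, check=True,
        ).stdout
        return json.loads(out)["streams"]

    # Hypothetical SDP files describing the command-line cast and the app cast
    for name, sdp in [("cmdline-cast", "cmdline.sdp"), ("app-cast", "app.sdp")]:
        for s in probe(sdp):
            print(name, s.get("codec_name"), s.get("profile"),
                  s.get("level"), s.get("width"), s.get("height"))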

  • Compiling FFmpeg on OSX - "speex not found using pkg-config"

    19 September 2016, by n4zArh

    I recently had a few problems compiling FFmpeg to get a library, and I managed to get through all of them. Recently, however, I found out I need to add a Speex decoder (and possibly encoder) to my project. I built Speex from source with ./configure and make; make install (later, as I had problems, I also used Brew to install Speex). I added --enable-libspeex to my configure script, and every time I try to use it I get a "speex not found using pkg-config" error.

    I am sure the Speex files are in the /usr/local/include and lib directories, and I added those two paths to CFLAGS and LDFLAGS. I tried building Speex with and without --prefix (pointing at both /usr/ and /usr/local/), and I tried modifying FFmpeg's configure file (the require_pkg_config call for Speex), but no matter what I do the build fails every time with the same error.

    Long story short: how do I build FFmpeg with the Speex decoder on OSX? I read somewhere that libspeex-dev might be needed, but it's available through apt-get and not Brew (unless I screwed something up).

    My build script:

    #!/bin/bash

    if [ "$NDK" = "" ]; then
       echo NDK variable not set, assuming ${HOME}/android-ndk
       export NDK=${HOME}/Library/Android/sdk/ndk-bundle
    fi

    SYSROOT=$NDK/platforms/android-16/arch-arm
    # Expand the prebuilt/* path into the correct one
    TOOLCHAIN=`echo $NDK/toolchains/arm-linux-androideabi-4.9/prebuilt/darwin-x86_64`
    export PATH=$TOOLCHAIN/bin:$PATH

    rm -rf build/ffmpeg
    mkdir -p build/ffmpeg
    cd ffmpeg

    # Don't build any neon version for now
    for version in armv5te armv7a; do

    DEST=../build/ffmpeg
    FLAGS="--target-os=linux --cross-prefix=arm-linux-androideabi- --arch=arm"
    FLAGS="$FLAGS --sysroot=$SYSROOT"
    FLAGS="$FLAGS --enable-shared --disable-symver"
    FLAGS="$FLAGS --enable-small"
    FLAGS="$FLAGS --disable-everything"
    FLAGS="$FLAGS --enable-decoder=h264 --enable-decoder=adpcm_ima_oki --enable-decoder=adpcm_ima_ws"
    FLAGS="$FLAGS --enable-encoder=adpcm_ima_qt --enable-encoder=adpcm_ima_wav --enable-encoder=adpcm_g726"
    FLAGS="$FLAGS --enable-encoder=adpcm_g722 --enable-libspeex"

    case "$version" in
       neon)
           EXTRA_CFLAGS="-march=armv7-a -mfloat-abi=softfp -mfpu=neon"
           EXTRA_LDFLAGS="-Wl,--fix-cortex-a8"
           # Runtime choosing neon vs non-neon requires
           # renamed files
           ABI="armeabi-v7a"
           ;;
       armv7a)
           EXTRA_CFLAGS="-march=armv7-a -mfloat-abi=softfp"
           EXTRA_LDFLAGS=""
           ABI="armeabi-v7a"
           ;;
       *)
           EXTRA_CFLAGS=""
           EXTRA_LDFLAGS=""
           ABI="armeabi"
           ;;
    esac
    DEST="$DEST/$ABI"
    FLAGS="$FLAGS --prefix=$DEST"
    EXTRA_CFLAGS="$EXTRA_CFLAGS -I/usr/local/include/"
    EXTRA_LDFLAGS="$EXTRA_LDFLAGS -L/usr/local/lib"
    PKT_CONFIG_PATH="/usr/lib/pkgconfig/"
    mkdir -p $DEST
    echo $FLAGS --extra-cflags="$EXTRA_CFLAGS" --extra-ldflags="$EXTRA_LDFLAGS" > $DEST/info.txt
    ./configure $FLAGS --extra-cflags="$EXTRA_CFLAGS" --extra-ldflags="$EXTRA_LDFLAGS" | tee $DEST/configuration.txt
    [ $PIPESTATUS == 0 ] || exit 1
    rm compat/strtod.o
    rm compat/strtod.d
    make clean
    make -j4 || exit 1
    make install || exit 1

    done

    Tail of config.log:

    BEGIN /tmp/ffconf.QcYgKHFW.c
        1   #include <complex.h>
        2   #include <math.h>
       3   float foo(complex float f, complex float g) { return cabs(f * I); }
       4   int main(void){ return (int) foo; }
    END /tmp/ffconf.QcYgKHFW.c
    arm-linux-androideabi-gcc --sysroot=/Users/mgriszbacher/Library/Android/sdk/ndk-bundle/platforms/android-16/arch-arm -isysroot /Users/mgriszbacher/Library/Android/sdk/ndk-bundle/platforms/android-16/arch-arm -D_ISOC99_SOURCE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE -Dstrtod=avpriv_strtod -DPIC -I/usr/local/include/ -march=armv5te -std=c99 -fomit-frame-pointer -fPIC -marm -pthread -c -o /tmp/ffconf.vfjjuG7b.o /tmp/ffconf.QcYgKHFW.c
    /tmp/ffconf.QcYgKHFW.c:1:21: fatal error: complex.h: No such file or directory
      #include <complex.h>
                          ^
    compilation terminated.
    check_complexfunc cexp 1
    check_ld cc
    check_cc
    BEGIN /tmp/ffconf.QcYgKHFW.c
        1   #include <complex.h>
        2   #include <math.h>
       3   float foo(complex float f, complex float g) { return cexp(f * I); }
       4   int main(void){ return (int) foo; }
    END /tmp/ffconf.QcYgKHFW.c
    arm-linux-androideabi-gcc --sysroot=/Users/mgriszbacher/Library/Android/sdk/ndk-bundle/platforms/android-16/arch-arm -isysroot /Users/mgriszbacher/Library/Android/sdk/ndk-bundle/platforms/android-16/arch-arm -D_ISOC99_SOURCE -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE -Dstrtod=avpriv_strtod -DPIC -I/usr/local/include/ -march=armv5te -std=c99 -fomit-frame-pointer -fPIC -marm -pthread -c -o /tmp/ffconf.vfjjuG7b.o /tmp/ffconf.QcYgKHFW.c
    /tmp/ffconf.QcYgKHFW.c:1:21: fatal error: complex.h: No such file or directory
      #include <complex.h>
                          ^
    compilation terminated.
    check_pkg_config speex speex/speex.h speex_decoder_init -lspeex
    false --exists --print-errors speex
    ERROR: speex not found using pkg-config