Other articles (63)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • MediaSPIP Core: Configuration

    9 November 2010

    By default, MediaSPIP Core provides three different configuration pages (these pages rely on the CFG configuration plugin to work): a page for the general configuration of the skeleton; a page for configuring the site's home page; a page for configuring the sectors.
    It also provides an additional page, which only appears when certain plugins are enabled, for controlling their display and specific features (...)

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

On other sites (9654)

  • Fetching movie frames via ffmpeg and feeding them to vlfeat's sift

    16 November 2012, by Karl

    I am going to develop a program that uses ffmpeg and vlfeat on a Linux server.

    My task is simple: get some frames from a movie file and feed these frames to vlfeat's sift.

    I am reading through some documents on using ffmpeg in C development, mainly here and here. As stated on the site, "There is not much "web based" official documentation for using these libraries." Some of the tutorials there might be a little outdated, and I have also read that the API can differ from version to version. So I would like to ask the following:

    1. Is it safe to follow this tutorial for the current implementation?

      • If so, given the impression from above, what are some of the things that should be changed for the current implementation? (Currently I have ffmpeg-git-c995644.)
      • If not, which functions should I use to acquire frames from the movie file, in any format?
    2. On the vlfeat side, if I feed it a movie frame coming from ffmpeg, what kind of conversion is required so that vlfeat's sift implementation can "digest" it? (See the sketch below.)
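
    Whichever route is taken, the frame data ultimately has to end up as a single-channel grayscale image, because that is what vlfeat's sift operates on (its input type, vl_sift_pix, is a plain float buffer). Before wiring the two libraries together in C, the whole pipeline can also be sanity-checked from the shell: the ffmpeg command-line tool can dump individual frames as PGM images, and VLFeat ships a command-line sift driver that reads PGM directly. A rough sketch, with the file names, sampling rate and binary path purely illustrative:

    # dump one grayscale PGM frame per second of the movie
    ffmpeg -i movie.mp4 -vf "fps=1,format=gray" frame_%04d.pgm
    # run VLFeat's command-line sift driver on one of the dumped frames
    ./vlfeat/bin/glnxa64/sift --output frame_0001.sift frame_0001.pgm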

  • FFmpeg CentOS Compile Error - caused by libwebp (error: implicit declaration of function ‘libwebp_error_to_averror’)

    2 September 2022, by fvid1

    I'm trying to compile FFmpeg on CentOS 7.9. I followed the official instructions: https://trac.ffmpeg.org/wiki/CompilationGuide/Centos

    I need support for WebP images, so I installed libwebp and libwebp-devel via YUM, and used the option --enable-libwebp.

    But at the ffmpeg "make" stage, the following error is returned:

    CC      libavcodec/libwebpenc_common.o
libavcodec/libwebpenc_common.c: In function ‘ff_libwebp_get_frame’:
libavcodec/libwebpenc_common.c:283:17: error: implicit declaration of function ‘libwebp_error_to_averror’ [-Werror=implicit-function-declaration]
                 ret = libwebp_error_to_averror(pic->error_code);
                 ^
cc1: some warnings being treated as errors
make: *** [libavcodec/libwebpenc_common.o] Error 1

    Does anyone know why this error is occurring and how it can be corrected?

    I've tried compiling without using --enable-libwebp and it works fine.
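
    The failing call sits in FFmpeg's own libwebp wrapper, so one plausible culprit is the libwebp version that configure picked up; the libwebp/libwebp-devel packages in the stock CentOS 7 repositories are very old (0.3.x). One way to rule that out - a sketch rather than a verified fix, with the release number and paths purely illustrative - is to build a recent libwebp into the same $HOME/ffmpeg_build prefix that the compilation guide uses and re-run configure against it:

    # fetch and build a recent libwebp into the guide's local prefix
    curl -LO https://storage.googleapis.com/downloads.webmproject.org/releases/webp/libwebp-1.2.4.tar.gz
    tar xzf libwebp-1.2.4.tar.gz && cd libwebp-1.2.4
    ./configure --prefix="$HOME/ffmpeg_build" --disable-shared --enable-static
    make && make install
    # rebuild FFmpeg against it (add your other configure flags as before)
    cd ../ffmpeg && make distclean
    PKG_CONFIG_PATH="$HOME/ffmpeg_build/lib/pkgconfig" ./configure --enable-libwebp
    make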

  • GStreamer video latency increases with decreased FPS

    19 November 2024, by Ri Di

    I am using an RPi 5 to stream the video:

    rpicam-vid -t 0 --camera 0 --nopreview --mode 2304:1296:10:P --codec yuv420 --width 640 --height 360 --framerate 10 --rotation 0 --autofocus-mode manual --inline --listen -o - | ffmpeg -f rawvideo -pix_fmt yuv420p -s:v 640x360 -r 10 -i /dev/stdin -c:v libx264 -preset ultrafast -tune zerolatency -maxrate 300k -bufsize 50k -g 30000 -f mpegts tcp://192.168.0.147:1234

    View it with:

    gst-launch-1.0 -v tcpserversrc host=0.0.0.0 port=1234 ! queue ! tsdemux ! h264parse ! avdec_h264 ! videorate ! video/x-raw,framerate=10/1 ! videoconvert ! autovideosink sync=false

    The problem is that with 10 FPS I get around 2 s of latency, while 56 or 120 FPS results in below 300 ms latency.

    Is the problem on the sender or the reader side? Or both?

    I am not planning to use 10 FPS; it is only to demonstrate the problem. But I would like to get lower latency at 56 FPS - just like at 120 FPS (around 80-100 ms difference), or maybe even better, as latency seems to drop with higher FPS.

    Maybe there is some kind of buffering parameter which holds frames?

    (Of course, when testing with higher FPS I change both numbers in the sender command and the one in the reader command. The camera is the official RPi v3 camera.)

    Also, I'd like to mention that the same thing happens with ffplay:

    ffplay -i -probesize 3000 tcp://0.0.0.0:1234/?listen
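
    A hedged guess rather than a confirmed diagnosis: before playback starts, the MPEG-TS demuxer buffers enough input to probe the streams, and at 10 FPS with a 300k maxrate it simply takes longer to accumulate that probe data, so the player starts further behind real time and stays there. On the reader side this is easy to test with ffplay by shrinking the probe window and disabling input buffering (these are standard FFmpeg/ffplay options; the URL is the one from the question):

    ffplay -fflags nobuffer -flags low_delay -probesize 32 -analyzeduration 0 -framedrop -i tcp://0.0.0.0:1234/?listen

    If the GStreamer viewer is the long-term target, the analogous knob there is whatever latency the demuxer adds; whether the installed tsdemux exposes a latency property that can be lowered depends on the GStreamer version.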