
Media (91)

Other articles (59)

  • MediaSPIP v0.2

    21 June 2013, by

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013 and it is announced here.
    The zip file provided here contains only the MediaSPIP sources in standalone form.
    As with the previous version, all software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, you will also need to make further modifications (...)

  • The farm's regular Cron tasks

    1 December 2010, by

    Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of every instance in the shared-hosting farm on a regular basis. Combined with a system Cron on the central site of the farm, this makes it easy to generate regular visits to the various sites and to keep the tasks of rarely visited sites from being too (...)

  • Making files available

    14 April 2011, by

    By default, when it is first set up, MediaSPIP does not let visitors download files, whether they are the originals or the result of their transformation or encoding. It only lets them be viewed.
    However, it is possible and easy to give visitors access to these documents, in various forms.
    All of this is done on the skeleton's configuration page. You need to go to the channel's administration area and choose in the navigation (...)

On other sites (8331)

  • ffmpeg Nvidia build [closed]

    30 September 2022, by Never

    I followed this document to compile FFmpeg:
https://docs.nvidia.com/video-technologies/video-codec-sdk/ffmpeg-with-nvidia-gpu/#compiling-for-linux

    


    The documentation says "To compile FFmpeg, the CUDA toolkit must be installed on the system, though the CUDA toolkit is not needed to run the FFmpeg compiled binary."

    


    Why does running the compiled ffmpeg binary still require the CUDA toolkit?

    


    $ ./ffmpeg
./ffmpeg: error while loading shared libraries: libnppig.so.11: cannot open shared object file: No such file or directory
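
    The missing library here, libnppig.so.11, is one of the NPP libraries pulled in by --enable-libnpp; they ship with the CUDA toolkit rather than with the display driver. One way to see which shared libraries the binary cannot resolve (a diagnostic added here, not part of the original post) is ldd:

    # any line ending in "not found" is a library that must be present at run time
    $ ldd ./ffmpeg | grep -iE 'npp|cuda'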


    


    I then followed this document to install CUDA:
https://developer.nvidia.com/cuda-downloads?target_os=Linux&target_arch=x86_64&Distribution=Debian&target_version=11&target_type=deb_local

    


    After that, ffmpeg runs fine:

    


    $ ./ffmpeg

ffmpeg version N-108433-g8089fe072e Copyright (c) 2000-2022 the FFmpeg developers
  built with gcc 10 (Debian 10.2.1-6)
  configuration: --prefix=/root/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/root/ffmpeg_build/include --extra-ldflags=-L/root/ffmpeg_build/lib --extra-libs='-lpthread -lm' --ld=g++ --bindir=/root/bin --enable-gpl --enable-gnutls --enable-libaom --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libsvtav1 --enable-libdav1d --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-libmfx --enable-nonfree --enable-cuda-nvcc --enable-cuvid --enable-nvenc --enable-libnpp --extra-cflags=-I/usr/local/cuda/include --extra-ldflags=-L/usr/local/cuda/lib64
  libavutil      57. 38.100 / 57. 38.100
  libavcodec     59. 48.100 / 59. 48.100
  libavformat    59. 33.100 / 59. 33.100
  libavdevice    59.  8.101 / 59.  8.101
  libavfilter     8. 49.100 /  8. 49.100
  libswscale      6.  8.112 /  6.  8.112
  libswresample   4.  9.100 /  4.  9.100
  libpostproc    56.  7.100 / 56.  7.100
Hyper fast Audio and Video encoder
usage: ffmpeg [options] [[infile options] -i infile]... {[outfile options] outfile}...

Use -h to get full help or, even better, run 'man ffmpeg'


    


    How can I run ffmpeg without installing the CUDA toolkit?
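
    One option, sketched here under the assumption that the rest of the configure line stays as shown above, is to rebuild without --enable-libnpp: NVENC/NVDEC only need driver-provided libraries (libcuda, libnvcuvid, libnvidia-encode) at run time, so dropping NPP should remove the toolkit dependency, at the cost of losing the *_npp filters such as scale_npp:

    # same configure invocation as above, minus --enable-libnpp
    $ ./configure --prefix=/root/ffmpeg_build --enable-nonfree --enable-gpl \
        --enable-cuda-nvcc --enable-cuvid --enable-nvenc \
        --extra-cflags=-I/usr/local/cuda/include --extra-ldflags=-L/usr/local/cuda/lib64
    $ make -j"$(nproc)" && make install

    Alternatively, keep --enable-libnpp and copy the needed libnpp*.so.11 files to the target machine, pointing LD_LIBRARY_PATH at them, so the full toolkit does not have to be installed there.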

    


  • Restream an RTSP stream to an RTMP server

    26 September 2022, by Sayan Chakraborty

    I have a few Hikvision cameras that I want to access remotely from an app I wrote. I can get the camera feed when I am connected to the same network as the cameras. So I have set up an RTMP server on EC2 with Nginx; I will broadcast the camera video feed to it through a Raspberry Pi and then fetch the feed in my app.

    


    I used OBS to stream the camera feed to the RTMP server and was able to get the video feed from the EC2 instance, but when using ffmpeg I get the following error.

    


    pi@raspberrypi:~$ ffmpeg -re -i rtsp://admin:pass@192.168.0.253/ISAPI/Streaming/channels/101 -an -c:v libx264 -f flv rtmp://ec2instanceip/live
ffmpeg version 4.2.7-0ubuntu0.1 Copyright (c) 2000-2022 the FFmpeg developers
  built with gcc 9 (Ubuntu 9.4.0-1ubuntu1~20.04.1)
  configuration: --prefix=/usr --extra-version=0ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-nvenc --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
  libavutil      56. 31.100 / 56. 31.100
  libavcodec     58. 54.100 / 58. 54.100
  libavformat    58. 29.100 / 58. 29.100
  libavdevice    58.  8.100 / 58.  8.100
  libavfilter     7. 57.100 /  7. 57.100
  libavresample   4.  0.  0 /  4.  0.  0
  libswscale      5.  5.100 /  5.  5.100
  libswresample   3.  5.100 /  3.  5.100
  libpostproc    55.  5.100 / 55.  5.100
[rtsp @ 0x55dd3b126300] Missing PPS in sprop-parameter-sets, ignoring
[rtsp @ 0x55dd3b126300] UDP timeout, retrying with TCP
[rtsp @ 0x55dd3b126300] method PAUSE failed: 551 Option not supported
[rtsp @ 0x55dd3b126300] Could not find codec parameters for stream 0 (Video: h264, none): unspecified size
Consider increasing the value for the 'analyzeduration' and 'probesize' options
Guessed Channel Layout for Input Stream #0.1 : mono
Input #0, rtsp, from 'rtsp://admin:pass@192.168.0.253/ISAPI/Streaming/channels/101':
  Metadata:
    title           : Media Presentation
  Duration: N/A, bitrate: 64 kb/s
    Stream #0:0: Video: h264, none, 90k tbr, 90k tbn, 180k tbc
    Stream #0:1: Audio: pcm_mulaw, 8000 Hz, mono, s16, 64 kb/s
Output #0, flv, to 'rtmp://ec2instanceip/live':
Output file #0 does not contain any stream
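
    The output has no streams because ffmpeg never managed to read the H.264 parameters from the RTSP input (the "Could not find codec parameters" line). A first thing to try, sketched here rather than taken from the post, is to force TCP for the RTSP transport and give the probe more data, as the log itself suggests:

    # force TCP transport and let ffmpeg probe up to ~10 s / ~10 MB of the input
    $ ffmpeg -rtsp_transport tcp -analyzeduration 10000000 -probesize 10000000 \
        -re -i rtsp://admin:pass@192.168.0.253/ISAPI/Streaming/channels/101 \
        -an -c:v libx264 -f flv rtmp://ec2instanceip/live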


    


  • vsync flag usage in ffmpeg while filtering

    13 October 2022, by antortjim

    I am trying to apply a threshold to an input video with ffmpeg, but I observe the following warning emitted for every processed frame:

    


    [mp4 @ 0x56360181a200] Non-monotonous DTS in output stream 0:0; previous: 182272, current: 182272; changing to 182273. This may result in incorrect timestamps in the output file.


    


    where the previous and current values are always 1 less than the value to which the DTS (decoding time stamp) is changed.

    


    I have noticed that this warning is emitted only if I set -vsync passthrough (which I changed from the -vsync 0 seen in many online examples).

    


    # input.mp4 has resolution 790x790
ffmpeg -vsync passthrough  -i input.mp4  -f lavfi -i color=808080:s=790x790 -f lavfi -i color=black:s=790x790 -f lavfi -i color=white:s=790x790 -filter_complex '[0:v][1:v][2:v][3:v]threshold' -an -c:v h264_nvenc threshold.mp4
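
    To narrow down where the duplicate timestamps come from, one added diagnostic (not part of the original post) is to dump the packet timestamps of the input with ffprobe. If consecutive video packets already share a DTS, passthrough simply forwards them and the mp4 muxer complains; if they are strictly increasing, the duplicates are introduced later in the pipeline:

    $ ffprobe -v error -select_streams v:0 \
        -show_entries packet=pts,dts,duration -of csv=p=0 input.mp4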


    


    Should I leave the vsync flag set to the default (auto or -1), or is -vsync passthrough essential to guarantee that the frames are displayed in the right order? In that case, how do I handle this warning? Other online examples I found of users hitting this warning differ from mine, because in their case they are concatenating videos (see 1, 2).

    


    At the end of the documentation on the -vsync flag, I see:

    


    


    With -map you can select from which stream the timestamps should be taken. You can leave either video or audio unchanged and sync the remaining stream(s) to the unchanged one

    


    


    Maybe this warning should be handled with -map? But I don't know how.

    


    As a side note, I keep getting the deprecation warning telling me to replace -vsync with -fps_mode; however, doing so breaks the command.
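
    One plausible reason the literal substitution breaks is that -fps_mode is an output (per-stream) option, while -vsync was accepted as a global flag before the inputs; moving it next to the other output options may be enough. A sketch under that assumption:

    ffmpeg -i input.mp4 -f lavfi -i color=808080:s=790x790 \
        -f lavfi -i color=black:s=790x790 -f lavfi -i color=white:s=790x790 \
        -filter_complex '[0:v][1:v][2:v][3:v]threshold' \
        -an -c:v h264_nvenc -fps_mode passthrough threshold.mp4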

    


    FFmpeg version:

    


    commit 28ac2279adb860ea8b90d3073603912bf3eb6a83 from ffmpeg master branch

    


    ffmpeg version N-108625-g28ac2279ad Copyright (c) 2000-2022 the FFmpeg developers
built with gcc 9 (Ubuntu 9.4.0-1ubuntu1~20.04.1)
configuration: --enable-nonfree --enable-cuda-nvcc --enable-libnpp --enable-gpl --extra-cflags=-I/usr/local/cuda/include --extra-ldflags=-L/usr/local/cuda/lib64 --disable-static --enable-shared
libavutil      57. 39.101 / 57. 39.101
libavcodec     59. 50.100 / 59. 50.100
libavformat    59. 34.101 / 59. 34.101
libavdevice    59.  8.101 / 59.  8.101
libavfilter     8. 49.101 /  8. 49.101
libswscale      6.  8.112 /  6.  8.112
libswresample   4.  9.100 /  4.  9.100
libpostproc    56.  7.100 / 56.  7.100


    


    OS

    


    Ubuntu 20.04.4 LTS