
Other articles (97)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • APPENDIX: Plugins used specifically for the farm

    5 March 2010, by

    The central/master site of the farm needs several additional plugins, beyond those of the channels, in order to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to handle registrations and requests to create a shared-hosting instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)

  • Customising categories

    21 June 2013, by

    Category creation form
    For those who know SPIP well, a category can be thought of as a section (rubrique).
    For a document of type category, the fields offered by default are: Text
    This form can be modified under:
    Administration > Configuration des masques de formulaire.
    For a document of type media, the fields not displayed by default are: Short description
    It is also in this configuration area that you can specify the (...)

On other sites (10482)

  • ffmpeg continuous live streaming of dynamic files from directory

    4 September 2022, by user72261

    So I have been using ffmpeg to create a live stream from files stored in a folder, using a concat list.
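
    A typical concat-demuxer invocation for this kind of stream might look like the following; this is only a sketch, and the output URL and encoder settings are placeholders rather than the exact command in use here:

# sketch: read the segments listed in concat.txt and push them as a live RTMP stream
ffmpeg -re -f concat -safe 0 -i concat.txt \
    -c:v libx264 -preset veryfast -c:a aac \
    -f flv rtmp://example.com/live/stream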

    This works perfectly for local, static content, but I have a security system in my warehouse that uploads .mp4 files to my server in the office.

    This system doesn't have any live-stream functionality, so I would like to set up a live stream myself with ffmpeg, continually reading the latest files from the folder.

    I tried setting up a script to read all the files in the directory and sort them by file name, which is a UNIX timestamp of the date/time the video was taken. The script then selects the segments newer than the last used segment and writes a new concat list such as the one below (a rough sketch of such a script follows the list).

    file '00000.mpd'
file '00001.mp4'
file '00002.mp4'
file '00003.mp4'
file '00004.mp4'
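
    A rough sketch of the selection script mentioned above, assuming a POSIX shell; the directory, the state file and the exact naming scheme are placeholders rather than the real setup:

#!/bin/sh
# Sketch only: file names are UNIX timestamps, so sorted glob order is
# effectively chronological. DIR and last_segment.txt are assumptions.
DIR=/srv/cctv
LAST=$(cat last_segment.txt 2>/dev/null || echo 0)
: > concat.txt
for f in "$DIR"/*.mp4; do
  [ -e "$f" ] || continue                  # no matches yet
  name=$(basename "$f" .mp4)
  if [ "$name" -gt "$LAST" ]; then         # only segments newer than the last used one
    echo "file '$f'" >> concat.txt
    LAST=$name
  fi
done
echo "$LAST" > last_segment.txt            # remember where we stopped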

    After doing some reading I found that, to make ffmpeg loop the list, the file can reference itself, like so:

    file '00000.mpd'
file '00001.mp4'
file '00002.mp4'
file '00003.mp4'
file '00004.mp4'
file 'concat.txt'

    This seemed to work OK until I started getting the error impossible to open concat.txt. This file doesn't exist, which I can only guess is because ffmpeg is trying to open the file while it's being updated by the script.
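
    If that guess is right, one common workaround (an assumption on my part, not something tested here) is to have the script write the new list to a temporary file and rename it into place; the rename is atomic on the same filesystem, so ffmpeg always sees either the old or the new complete list:

# 'build_concat_list' is a hypothetical stand-in for whatever writes the list
build_concat_list > concat.txt.tmp && mv concat.txt.tmp concat.txt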

    Has anyone come across this before, or know another way to do it?

  • Subtitles in ffmpeg/libavfilter

    15 June 2021, by Captain Jack

    I have a C program that reads video/audio with the libav/ffmpeg libraries and decodes it.

    I am playing with some filters and most work just fine. I can draw text, overlay logos, flip and invert video colours. However, I am having big issues overlaying subtitles.

    My filter is very simple.

    const char *vfilter_descr = "[in]subtitles=subs.srt[out]";
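
    For context, the buffer source feeding [in] is normally created along the lines of ffmpeg's own filtering example; the snippet below is a generic sketch rather than the actual code here (dec_ctx, time_base and filter_graph are assumed to already exist), and if this time_base or the frame pts values do not match the stream, the subtitles filter has no timeline to render against:

/* Generic sketch based on ffmpeg's filtering_video.c example; the
 * subtitles filter positions each cue using frame pts in this time_base. */
char args[512];
int ret;
snprintf(args, sizeof(args),
         "video_size=%dx%d:pix_fmt=%d:time_base=%d/%d:pixel_aspect=%d/%d",
         dec_ctx->width, dec_ctx->height, dec_ctx->pix_fmt,
         time_base.num, time_base.den,
         dec_ctx->sample_aspect_ratio.num, dec_ctx->sample_aspect_ratio.den);
ret = avfilter_graph_create_filter(&buffersrc_ctx,
                                   avfilter_get_by_name("buffer"), "in",
                                   args, NULL, filter_graph);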

    On the console I get this:

    [Parsed_subtitles_0 @ 0x7fe76c703240] Shaper: FriBidi 0.19.7 (SIMPLE) HarfBuzz-ng 2.4.0 (COMPLEX)
[Parsed_subtitles_0 @ 0x7fe76c703240] Using font provider coretext
[Parsed_subtitles_0 @ 0x7fe76c703240] fontselect: (Arial, 400, 0) -> /Library/Fonts/Microsoft/Arial.ttf, -1, ArialMT
[Parsed_subtitles_0 @ 0x7fe76c703240] fontselect: (Arial, 400, 100) -> /Library/Fonts/Microsoft/Arial Italic.ttf, -1, Arial-ItalicMT

    ...which somewhat confirms that subtitles are loading, though I am not sure why there are two fonts being loaded?

    However, they are not showing at all, almost as if they never loaded. I tried several different files, including ASS ones, but no luck.

    ffmpeg version is the latest one.

    $ ffmpeg -version
ffmpeg version 4.1.3 Copyright (c) 2000-2019 the FFmpeg developers
  built with Apple LLVM version 9.0.0 (clang-900.0.39.2)
  configuration: --prefix=/usr/local/Cellar/ffmpeg/4.1.3_1 --enable-shared --enable-pthreads --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags='-I/Library/Java/JavaVirtualMachines/adoptopenjdk-11.0.2.jdk/Contents/Home/include -I/Library/Java/JavaVirtualMachines/adoptopenjdk-11.0.2.jdk/Contents/Home/include/darwin' --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libmp3lame --enable-libopus --enable-librubberband --enable-libsnappy --enable-libtesseract --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librtmp --enable-libspeex --enable-videotoolbox --disable-libjack --disable-indev=jack --enable-libaom --enable-libsoxr
  libavutil      56. 22.100 / 56. 22.100
  libavcodec     58. 35.100 / 58. 35.100
  libavformat    58. 20.100 / 58. 20.100
  libavdevice    58.  5.100 / 58.  5.100
  libavfilter     7. 40.101 /  7. 40.101
  libavresample   4.  0.  0 /  4.  0.  0
  libswscale      5.  3.100 /  5.  3.100
  libswresample   3.  3.100 /  3.  3.100
  libpostproc    55.  3.100 / 55.  3.100

    Any ideas?

  • Mixed Reality WebRTC without Signalling Server

    25 May 2021, by SilverLife

    I am trying to find a way to use Mixed Reality WebRTC (link to git-repo) without a signalling server.
In detail, I want to create an SDP file from my ffmpeg video sender and use this SDP description in my Unity project to bypass the signalling process and receive the ffmpeg video stream.
Is there a way of doing so with Mixed Reality WebRTC? I have already searched for the line of code where the SDP is created within MR-WebRTC, but I didn't find it.
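
    For what it's worth, the ffmpeg side can write such a description itself when it streams over RTP; the line below is a minimal sketch (input file, address and port are placeholders), although whether MR-WebRTC will accept an SDP produced this way is exactly the open question:

# sketch: send video over RTP and dump the corresponding SDP to sender.sdp
ffmpeg -re -i input.mp4 -an -c:v libx264 -f rtp rtp://127.0.0.1:5004 -sdp_file sender.sdp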

    I am relatively new to this topic and not sure whether this works at all, but since ffmpeg is not directly compatible with WebRTC, I thought this might be the most promising approach.