Advanced search

Media (0)

Keyword: - Tags -/organisation

No media matching your criteria is available on the site.

Other articles (82)

  • The farm's regular Cron tasks

    1 December 2010, by

    Managing the farm involves running several repetitive tasks, known as Cron tasks, at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply has the effect of regularly calling the Cron of every instance in the mutualised farm. Combined with a system Cron on the central site of the farm, this makes it possible to generate regular visits to the various sites and to keep the tasks of rarely visited sites from being too (...)

  • Support for all types of media

    10 April 2011

    Unlike many modern software packages and other document-sharing platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); textual content, code or other (open office, microsoft office (spreadsheet, presentation), web (html, css), LaTeX, Google Earth) (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed Médiaspip is at version 0.2 or higher. If necessary, contact the administrator of your MédiaSpip to find out.

On other sites (10448)

  • Best approach to set up live video streaming between HoloLens2 and Desktop PC

    19 May 2021, by SilverLife

    First of all, I am sorry for this question. I know it's more a request for advice than a question about any specific coding problem. Unfortunately I don't know where else I should ask such a question, so please be patient. (I am open to any recommendations.)

    For a while now I have been searching for a promising approach to stream live video from a desktop PC over the network to my HoloLens 2 Unity application. The video transfer should take place within an isolated network without internet access, so a direct connection between both devices, without any signaling or web servers, would be desirable if possible. For now we are thinking about sending the stream via ffmpeg and receiving it somehow in the Unity application.
So far I have come across MixedReality-WebRTC and ffmpegInterop. Unfortunately, as I read, WebRTC needs at least some kind of signaling server that manages the connection between the clients, and ffmpegInterop seems to be really difficult to integrate into Unity.

    I am completely new to the topic of low-latency video streaming and a bit lost in this complex environment. Can anybody give me advice on a promising and extensible way to receive a low-latency video stream?
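
    One way to get a point-to-point stream with no signaling server at all is to let ffmpeg push the encoded video straight to the headset over UDP. The following is only a minimal sketch under a few assumptions that are not part of the question: the HoloLens is reachable at the placeholder address 192.168.1.50, port 5000 is free, and the Unity side opens a matching MPEG-TS/UDP receiver (for instance through an FFmpeg-based plugin):

    $ ffmpeg -f gdigrab -framerate 30 -i desktop -c:v libx264 -preset ultrafast -tune zerolatency -pix_fmt yuv420p -g 30 -f mpegts udp://192.168.1.50:5000

    Compared with WebRTC this gives up congestion control and NAT traversal, which should matter little on a closed network, and the receiving application only has to read udp://0.0.0.0:5000 instead of negotiating a session.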

  • What version of ffmpeg is bundled inside electron?

    7 August 2020, by greghmerrill

    The prebuilt electron binaries for Windows include the file ffmpeg.dll. How can I determine what version of the underlying ffmpeg library was actually compiled to produce this dll? I need this information to understand what known vulnerabilities (CVEs, etc.) might be in a given version of electron via ffmpeg.

    As I understand it, the ffmpeg dll itself is taken from https://github.com/electron/nightlies/releases/ when I download my dependencies (I'm using electron-prebuilt-compile). But I'm not getting a clear picture of what the source for that binary is. I think it might be from https://chromium.googlesource.com/chromium/third_party/ffmpeg/ but then I'm not clear on the relationship of that repo with the original ffmpeg repo (e.g. how often are fixes merged from the ffmpeg repo to the chromium third-party repo, etc.)

    I tried searching the content of the dll as per cody's suggestion, but no luck:

    $ strings ffmpeg.dll  | grep -i ffmp
FFmpeg video codec #1
Huffyuv FFmpeg variant
Not yet implemented in FFmpeg, patches welcome
C:\projects\libchromiumcontent\src\out-x64\static_library\ffmpeg.dll.pdb
ffmpeg.dll


$ strings ffmpeg.dll  | grep -i version
H.263 / H.263-1996, H.263+ / H.263-1998 / H.263 version 2
MPEG-4 part 2 Microsoft variant version 1
MPEG-4 part 2 Microsoft variant version 2
MPEG-4 part 2 Microsoft variant version 3
H.263+ / H.263-1998 / H.263 version 2
On2 VP6 (Flash version)
On2 VP6 (Flash version, with alpha channel)
old standard qpel (autodetected per FOURCC/version)
direct-qpel-blocksize bug (autodetected per FOURCC/version)
edge padding bug (autodetected per FOURCC/version)
strictly conform to a older more strict version of the spec or reference software
minor_version
premiere_version
quicktime_version
Assume this x264 version if no x264 version found in any SEI
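
    Two further things may help narrow this down; both are suggestions rather than a guaranteed method. FFmpeg embeds library "ident" strings such as Lavc58.54.100, which sometimes survive even Chromium's trimmed-down build, so a different grep pattern is worth a try:

    $ strings ffmpeg.dll | grep -iE 'Lav(c|f|u)[0-9]'

    Failing that, the Electron release notes state which Chromium each Electron release bundles, and that Chromium release's DEPS file pins the exact chromium/third_party/ffmpeg commit the dll was built from; that repository is a fork which periodically merges from upstream FFmpeg rather than tracking a tagged release, so the answer is a commit rather than a version number.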

  • ffmpeg: stream videos from two cams that have the same device name

    21 November 2020, by junsang

    There are two cameras with the same device name, Microsoft® LifeCam Studio(TM).
ffmpeg -list_devices true -f dshow -i dummy prints the output below.

    C:\Users\user>ffmpeg -list_devices true -f dshow -i dummy
ffmpeg version git-2020-02-05-e6891d1 Copyright (c) 2000-2020 the FFmpeg developers
  built with gcc 9.2.1 (GCC) 20200122
  configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libdav1d --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa --enable-libspeex --enable-libxvid --enable-libaom --enable-libmfx --enable-ffnvcodec --enable-cuvid --enable-d3d11va --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth --enable-libopenmpt --enable-amf
  libavutil      56. 39.100 / 56. 39.100
  libavcodec     58. 67.101 / 58. 67.101
  libavformat    58. 37.100 / 58. 37.100
  libavdevice    58.  9.103 / 58.  9.103
  libavfilter     7. 74.100 /  7. 74.100
  libswscale      5.  6.100 /  5.  6.100
  libswresample   3.  6.100 /  3.  6.100
  libpostproc    55.  6.100 / 55.  6.100
[dshow @ 000001d5c5108dc0] DirectShow video devices (some may be both video and audio devices)
[dshow @ 000001d5c5108dc0]  "Microsoft┬« LifeCam Studio(TM)"
[dshow @ 000001d5c5108dc0]     Alternative name "@device_pnp_\\?\usb#vid_045e&pid_0811&mi_00#8&6ae46e6&0&0000#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\global"
[dshow @ 000001d5c5108dc0]  "Microsoft┬« LifeCam Studio(TM)"
[dshow @ 000001d5c5108dc0]     Alternative name "@device_pnp_\\?\usb#vid_045e&pid_0811&mi_00#8&e544916&0&0000#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\global"
[dshow @ 000001d5c5108dc0] DirectShow audio devices
[dshow @ 000001d5c5108dc0]  "Desktop Microphone (6- Microsoft┬« LifeCam Studio(TM))"
[dshow @ 000001d5c5108dc0]     Alternative name "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\wave_{D5F4881A-6E88-4563-8BA0-081CFD50E353}"
[dshow @ 000001d5c5108dc0]  "Desktop Microphone (5- Microsoft┬« LifeCam Studio(TM))"
[dshow @ 000001d5c5108dc0]     Alternative name "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\wave_{84674B28-DC68-4AC0-8331-D687C7B7D69C}"
[dshow @ 000001d5c5108dc0]  "Digital Audio (S/PDIF) (High Definition Audio Device)"
[dshow @ 000001d5c5108dc0]     Alternative name "@device_cm_{33D9A762-90C8-11D0-BD43-00A0C911CE86}\wave_{CDD24485-59D2-4BED-B6FC-B7447251C7E2}"

    Because the two cameras have the same device name, I couldn't stream both videos at the same time with this simple command: ffplay -f dshow -i video=Microsoft® LifeCam Studio(TM).
So I tried using pin names, following the ffmpeg dshow example.
The only thing I got was Could not find video device with name [video=~~pin name~~] among source devices of type video.

    What ffplay command makes it possible to stream both videos?
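
    For what it's worth, the ffmpeg dshow input documents two ways around duplicate friendly names, so a plausible (if untested here) answer is either to pick a device by index with -video_device_number or to pass the "Alternative name" from the listing above directly; the device path below is copied from that listing:

    C:\Users\user>ffplay -f dshow -video_device_number 0 -i video="Microsoft® LifeCam Studio(TM)"
    C:\Users\user>ffplay -f dshow -video_device_number 1 -i video="Microsoft® LifeCam Studio(TM)"

    C:\Users\user>ffplay -f dshow -i video="@device_pnp_\\?\usb#vid_045e&pid_0811&mi_00#8&6ae46e6&0&0000#{65e8773d-8f56-11d0-a3b9-00a0c9223196}\global"

    Each command opens a single camera, so running two of them in separate consoles should show both streams at once.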