
Media (1)
-
Map of Schillerkiez
13 May 2011
Updated: September 2011
Language: English
Type: Text
Other articles (100)
-
Updating from version 0.1 to 0.2
24 June 2013. An explanation of the various notable changes made when moving from MediaSPIP version 0.1 to version 0.3. What's new?
Regarding software dependencies: use of the latest versions of FFmpeg (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)
-
Customising by adding your own logo, banner or background image
5 September 2013. Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.
-
Multilang: improving the interface for multilingual blocks
18 February 2011. Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialised.
Once it has been activated, a preconfiguration is automatically set up by MediaSPIP init so that the new feature is immediately operational. It is therefore not necessary to go through a configuration step for this.
On other sites (12198)
-
Revision e27bcc2451: Merge "[svc] Verify and store input two pass stats data in 2nd pass rc"
27 March 2014, by Minghai Shang
-
ffmpeg crashes in Electron on Mac App Store; no suitable image found, file system sandbox blocked open() of 'libass'
20 February 2021, by Martin. I am trying to release an Electron app on the Mac App Store (MAS); my Electron app uses ffmpeg to render videos. In order to release my app on the Mac App Store it needs to be sandboxed, and by default ffmpeg makes calls to external libraries, so I need to build ffmpeg statically and package it with my app. I have successfully built my app, submitted it to the App Store, had it approved, and downloaded and used it, but ffmpeg fails with this error:


Uncaught (in promise) Error: Command was killed with SIGABRT (Aborted): /Users/martinbarker/Documents/projects/digify-new/dist/mas/Digify.app/Contents/Resources/ffmpeg -i /Users/martinbarker/Downloads/Steve Leach With The Crystal Grass Orchestra – Ocean Potion/9. Get Out In The Sun.flac -i /Users/martinbarker/Downloads/Steve Leach With The Crystal Grass Orchestra – Ocean Potion/10. Golden Hues.flac -y -filter_complex concat=n=2:v=0:a=1 -c:a libmp3lame -b:a 320k /Users/martinbarker/Downloads/Steve Leach With The Crystal Grass Orchestra – Ocean Potion/output-261020.mp3
dyld: Library not loaded: /usr/local/opt/libass/lib/libass.9.dylib
 Referenced from: /Users/martinbarker/Documents/projects/digify-new/dist/mas/Digify.app/Contents/Resources/ffmpeg
 Reason: no suitable image found. Did find:
 file system sandbox blocked open() of '/usr/local/opt/libass/lib/libass.9.dylib'
 /usr/local/opt/libass/lib/libass.9.dylib: stat() failed with errno=1
 file system sandbox blocked open() of '/usr/local/lib/libass.9.dylib'
 file system sandbox blocked open() of '/usr/local/Cellar/libass/0.15.0/lib/libass.9.dylib'
 at makeError (/Users/martinbarker/…eca/lib/error.js:59)
 at handlePromise (/Users/martinbarker/…/execa/index.js:114)
 at async file:/Users/ma…js/newindex.js:1151



I think this line is important:
file system sandbox blocked open() of '/usr/local/opt/libass/lib/libass.9.dylib'
but I'm not sure what I should change in my static ffmpeg build so that it works in production and I can avoid the above error.
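
As a first diagnostic step (a sketch, not part of the original question): otool -L lists every dynamic library a Mach-O binary depends on, so running it against the packaged ffmpeg shows which Homebrew dylibs (anything under /usr/local or /opt/homebrew) are still linked dynamically and will therefore be blocked by the sandbox. The path below is a placeholder and assumes the binary sits in ffmpeg-mac/.

# Diagnostic sketch (assumed path): list the dynamic libraries the built
# ffmpeg still depends on. Anything under /usr/local or /opt/homebrew will
# be blocked by the Mac App Store sandbox at runtime.
import subprocess

FFMPEG_PATH = "ffmpeg-mac/ffmpeg"  # placeholder, adjust to the binary you package

def dynamic_deps(binary):
    # otool -L prints the binary itself on the first line, then one
    # "<install name> (compatibility version ...)" line per dependency.
    out = subprocess.run(["otool", "-L", binary],
                         capture_output=True, text=True, check=True).stdout
    return [line.strip().split(" (")[0] for line in out.splitlines()[1:] if line.strip()]

for dep in dynamic_deps(FFMPEG_PATH):
    safe = not dep.startswith(("/usr/local", "/opt/homebrew"))
    print(("ok              " if safe else "NOT sandbox-safe") + "  " + dep)

In the error above, libass.9.dylib is exactly such an entry: the libass code was never linked into the binary, only referenced from the Homebrew prefix.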

My code is available on the branch mas-attempt-after-redesign here: https://github.com/MartinBarker/digify/tree/mas-attempt-after-redesign


Inside my package.json I have a download-ffmpeg command which clones the ffmpeg repo, runs configure with some flags, and then builds ffmpeg into a folder called 'ffmpeg-mac'; this folder gets packaged with the app for the Mac App Store build.

git clone https://git.ffmpeg.org/ffmpeg.git ffmpeg-mac && cd ffmpeg-mac && ./configure pkg_config='pkg-config --static' --pkg-config-flags='--static' --libdir=/usr/local/lib --extra-version=ntd_20150128 --disable-shared --disable-lzma --enable-gpl --enable-pthreads --enable-nonfree --enable-libass --enable-libfdk-aac --enable-libmp3lame --enable-libx264 --enable-static --enable-filters --enable-runtime-cpudetect && make && cd ..
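
One thing worth noting (an observation, not something stated in the question): --enable-static and --disable-shared control how FFmpeg's own libraries (libavcodec, libavformat, and so on) are built; they do not force external libraries such as libass to be linked statically. If the build machine only has the Homebrew dylib for a dependency, the resulting ffmpeg binary references that dylib at runtime, which is what the sandbox then blocks. A rough check, assuming Homebrew-installed dependencies (and pkg-config names that are a guess for some of the libraries), is sketched below.

# Rough check (sketch): for some of the external libraries enabled in the
# configure line, ask pkg-config where they live and see whether a static
# archive (.a) is installed there. If only a .dylib exists, ffmpeg will
# depend on that dylib at runtime.
import os
import subprocess

LIBS = ["libass", "x264", "fdk-aac"]  # assumed pkg-config names; libmp3lame usually ships no .pc file

for name in LIBS:
    try:
        libdir = subprocess.run(["pkg-config", "--variable=libdir", name],
                                capture_output=True, text=True, check=True).stdout.strip()
    except subprocess.CalledProcessError:
        print(name + ": no pkg-config entry found")
        continue
    archives = [f for f in os.listdir(libdir) if f.endswith(".a")] if os.path.isdir(libdir) else []
    print(name + ": libdir=" + libdir + "  static archives: " + (", ".join(archives) or "none"))

Depending on the result, the options are either to build those dependencies statically as well (including their own dependencies, such as freetype and fribidi for libass), or to drop the ones the app does not actually need; the failing command in the error only concatenates audio, so --enable-libass may not even be required for this use case.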



You can see the --enable-libass flag in the ffmpeg configure command above, but even though that flag is included, after I build and sign my Mac App Store build by running sudo rm -rf dist/mas/ && npm run build-mas && sh mas-sign-script.sh, the production build (once approved) fails with the error included above.

-
How to encode an HEVC video from YUV420 Mat data (from a hardware-triggered camera) via FFmpeg in Python
4 July 2023, by QuantumRiver. I am having problems encoding an HEVC video from a series of YUV420 Mat data via FFmpeg.


- I am using Python on Ubuntu 20.04;
- I am retrieving frame data from a hardware-triggered camera (Basler), using pypylon;
- I want to write video from that camera with the HEVC codec, using my GPU (NVENC);
- I guess I have to use FFmpeg to achieve this (a quick NVENC availability check is sketched just after this list).

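Before wiring anything up, it may be worth confirming that the ffmpeg binary on the system was actually built with NVENC support; hevc_nvenc only appears in the encoder list when ffmpeg was compiled against NVIDIA's codec headers. A minimal check (a sketch, assuming ffmpeg is on PATH):

# Sanity-check sketch: confirm the installed ffmpeg exposes the hevc_nvenc
# encoder before trying to use it for GPU HEVC encoding.
import subprocess

encoders = subprocess.run(["ffmpeg", "-hide_banner", "-encoders"],
                          capture_output=True, text=True, check=True).stdout

if "hevc_nvenc" in encoders:
    print("hevc_nvenc is available")
else:
    print("hevc_nvenc not found: this ffmpeg build has no NVENC support")
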
What I have tried:


- I find that FFmpeg supports encoding from a camera, but it seems to only support webcams, not the camera I use (hardware-triggered Basler cameras with pypylon APIs);
- I find that FFmpeg supports transcoding a video from one codec to another, which is not my case;
- I find that FFmpeg supports encoding a video from a series of JPEG images, but in my case it would be inefficient to first save each frame as a picture and then encode the pictures into a video;
- The frame data retrieved from the camera can be converted to YUV420 (directly from pypylon), which is suitable for HEVC encoding;
- I learnt that the basic unit FFmpeg uses to encode a video is the AVFrame. I guess I have to first turn my YUV420 data into AVFrames, then encode the AVFrames into HEVC;
- But I do not know how to achieve that in Python.

My simplified and expected code:


camera = pylon.InstantCamera(tlf.CreateFirstDevice())  # tlf: e.g. pylon.TlFactory.GetInstance()
converter = pylon.ImageFormatConverter()
converter.OutputPixelFormat = pylon.PixelType_YUV420   # convert each grab to YUV420
video_handle = xxxxxx  # HEVC encoder handle - this is the missing piece
while True:
    grabResult = camera.RetrieveResult(timeout, pylon.TimeoutHandling_ThrowException)
    image = converter.Convert(grabResult).GetArray()
    video_handle.write(image)  # encode into an HEVC video via ffmpeg with NVENC
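
One common way to realise the missing video_handle piece (a sketch under assumptions, not a tested drop-in for this camera): start ffmpeg as a subprocess that reads raw planar YUV420 frames from stdin and encodes them with hevc_nvenc. Width, height and frame rate below are placeholders and must match what the converter actually outputs; each frame written to the pipe has to be exactly width * height * 3/2 bytes.

# Sketch: pipe raw YUV420p frames into an ffmpeg subprocess that encodes
# HEVC on the GPU via hevc_nvenc. WIDTH, HEIGHT and FPS are placeholders.
import subprocess

WIDTH, HEIGHT, FPS = 1920, 1080, 30  # must match the camera / converter output

ffmpeg_proc = subprocess.Popen(
    ["ffmpeg", "-y",
     "-f", "rawvideo",            # input is raw frames with no container
     "-pix_fmt", "yuv420p",       # must match the converter's output layout
     "-s", str(WIDTH) + "x" + str(HEIGHT),
     "-r", str(FPS),
     "-i", "-",                   # read the frames from stdin
     "-c:v", "hevc_nvenc",        # HEVC encoding on the NVIDIA GPU
     "output.mp4"],
    stdin=subprocess.PIPE,
)

# Inside the grab loop, write each converted frame's raw bytes:
#     ffmpeg_proc.stdin.write(image.tobytes())
# and when grabbing stops:
#     ffmpeg_proc.stdin.close()
#     ffmpeg_proc.wait()

Wrappers such as ffmpeg-python or PyAV can express the same pipeline without hand-building the argument list, but the underlying approach is the same.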