
Other articles (71)
-
Configuring language support
15 November 2010
Accessing the configuration and adding supported languages
To enable support for new languages, you need to go to the "Administer" section of the site.
From there, the navigation menu gives you access to a "Language management" section where support for new languages can be activated.
Each newly added language can still be deactivated as long as no object has been created in that language; once one has, the language is greyed out in the configuration and (...) -
Websites made with MediaSPIP
2 May 2011
This page lists some websites based on MediaSPIP.
-
Support for all media types
10 April 2011
Unlike many modern document-sharing software packages and platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others...); audio (MP3, Ogg, Wav and others...); video (Avi, MP4, Ogv, mpg, mov, wmv and others...); or textual content, code and more (open office, microsoft office (spreadsheets, presentations), web (html, css), LaTeX, Google Earth) (...)
On other sites (7490)
-
Output a video file from ffmpeg directly to google cloud storage
24 June 2021, by Anup Sedhain
Background on the problem


We are trying to compress a video with FFmpeg on a server hosted on Google App Engine (GAE). The input file lives in Google Cloud Storage (GCS) and can easily be passed to FFmpeg as an input, and the processing itself works; however, I wanted to write the output file directly to GCS. I followed the HTTP protocol documentation at https://ffmpeg.org/ffmpeg-protocols.html#http and used what I believe are the correct headers and method to upload the file through a signed URL, but it doesn't seem to work. At this point I am not even sure whether this is possible.
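For reference, what I have been attempting is roughly the following shape of command (the bucket, object name and signature shown here are placeholders, not our real values). It assumes a V4 signed URL that authorises a PUT, and a fragmented MP4 so FFmpeg never has to seek in the output:

ffmpeg -i "https://storage.googleapis.com/my-bucket/input.mp4" \
 -c:v libx264 -c:a aac \
 -movflags frag_keyframe+empty_moov -f mp4 \
 -method PUT -headers "Content-Type: video/mp4" \
 "https://storage.googleapis.com/my-bucket/output.mp4?X-Goog-Algorithm=GOOG4-RSA-SHA256&X-Goog-Signature=..."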


Current Implementation


Currently, we first save the output file in the GAE workspace and then upload it to the bucket. This flow worked fine until we hit another problem: whenever the file is too big, processing takes more than 10 minutes, which seems to be the request limit for automatically scaled instances in the Flexible environment. To get around this we could use basic_scaling in a Standard environment, but there we cannot write files to the GAE workspace. We could write to the /tmp directory instead, but that is backed by RAM, and since many files can be uploading at the same time it is not an option.
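For context, the current flow boils down to two steps along these lines (the input URL, paths and bucket names are placeholders, and gsutil stands in for the client library we actually call):

ffmpeg -i "https://storage.googleapis.com/my-bucket/input.mp4" -c:v libx264 -crf 28 -c:a aac ./output.mp4
gsutil cp ./output.mp4 gs://my-bucket/output.mp4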

Possible Future


Right now, one solution I have seen is to use a Flexible environment with manual scaling, but that is a poor choice in terms of scaling and cost-effectiveness. Another option, which I am less sure about, would be Google Compute Engine, but I have yet to try it.


Conclusion


Not being able to make a PUT request to Google Cloud Storage directly from FFmpeg has led me through dozens of issues around GAE and its odd combination of instance types and feature sets.


I would really appreciate suggestions or possible solutions if I am missing anything. If only we could make FFmpeg output the file directly to GCS.


-
How to pipe live stream output of ffmpeg to google drive using rclone
24 February 2021, by CARE HF
I want to send the live captured output of ffmpeg to Google Drive directly (not by storing it locally first and then moving it with rclone).


ffmpeg -i "$url" -t 00:00:20 -map 0 -c copy "1.mp4" | rclone cat rclone:/rclone/1.mp4



I tried the above, but it fails.
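I suspect something along these lines is needed instead (untested on my side; the gdrive: remote name and path are placeholders): ffmpeg has to write to stdout in a stream-friendly container, and rclone has to read stdin with rcat rather than cat, which only reads from the remote.

ffmpeg -i "$url" -t 00:00:20 -map 0 -c copy -f matroska pipe:1 | rclone rcat gdrive:rclone/1.mkv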


-
Revision 95a484c01465c56cc527a044e72c2e1165f5448f : Google & co index microblogged short URLs. They should therefore be ...
6 December 2010, by Cerdic (Log)
Google & co index microblogged short URLs. They should therefore be redirected with a 301 status to indicate that the short address is permanently redirected to the full URL. git-svn-id: svn://trac.rezo.net/spip/branches/spip-2.1@16625 (...)