
Other articles (101)
-
MediaSPIP 0.1 Beta version
25 April 2011. MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...) -
Multilang: improving the interface for multilingual blocks
18 February 2011. Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
Once it has been activated, a preconfiguration is automatically set up by MediaSPIP init so that the new feature is operational right away. It is therefore not necessary to go through a configuration step for this. -
Websites made with MediaSPIP
2 May 2011. This page lists some websites based on MediaSPIP.
On other sites (16653)
-
Best practices for developing scalable video transcoding server on Amazon Web Services? [closed]
5 February, by undefined. What do people think are the most important issues when developing an application that is going to allow users to upload video and images to a server and have them transcoded by FFMPEG and stored in Amazon S3? I have a couple of options:


- Install FFMPEG on the same server that handles file uploads: when a video is uploaded and stored on an EC2 instance, call FFMPEG to convert it and, when done, write the file to an S3 bucket and dispose of the original.




How scalable is this? What happens when many users upload at the same time? How do I manage multiple processes at once? How do I know when to start another instance and load-balance this configuration?


- Have one server for processing uploads (updating the database, renaming files, etc.) and one server for doing the transcoding. Again, what is the best way to manage multiple processes? Should I be looking at Amazon SQS for this? Can I tell the transcoding server to get the file from the upload server, or should I copy the file to the transcoding server? Should I just store all files on S3 and have SQS workers read from there (a worker sketch follows after this question)? I am trying to have as little traffic as possible.




I am running a Linux box as the upload server and have FFMPEG running on it.
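
For the SQS-based option, the transcoding side is usually written as a worker loop: poll the queue for a job message, pull the source object from S3, run FFMPEG locally, push the result back to S3, and only then delete the message. The Python sketch below illustrates that loop; the queue URL, bucket names, message format and FFMPEG flags are placeholders for illustration, not anything defined in the question.

# Minimal sketch of an SQS-driven transcoding worker.
# Queue URL, bucket names and message format are assumptions for illustration.
import json
import subprocess

import boto3

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/transcode-jobs"  # placeholder
SOURCE_BUCKET = "uploads-bucket"       # placeholder
OUTPUT_BUCKET = "transcoded-bucket"    # placeholder

sqs = boto3.client("sqs")
s3 = boto3.client("s3")

def run_worker():
    while True:
        # Long-poll so idle workers do not hammer the queue.
        resp = sqs.receive_message(QueueUrl=QUEUE_URL,
                                   MaxNumberOfMessages=1,
                                   WaitTimeSeconds=20)
        for msg in resp.get("Messages", []):
            job = json.loads(msg["Body"])        # e.g. {"key": "raw/abc.mov"}
            src, dst = "/tmp/input", "/tmp/output.mp4"
            s3.download_file(SOURCE_BUCKET, job["key"], src)
            # Transcode; the codec flags here are illustrative only.
            subprocess.run(["ffmpeg", "-y", "-i", src,
                            "-c:v", "libx264", "-c:a", "aac", dst],
                           check=True)
            s3.upload_file(dst, OUTPUT_BUCKET, job["key"] + ".mp4")
            # Delete only after success, so a crashed worker lets the
            # message become visible again and be retried elsewhere.
            sqs.delete_message(QueueUrL=QUEUE_URL,
                               ReceiptHandle=msg["ReceiptHandle"]) if False else \
            sqs.delete_message(QueueUrl=QUEUE_URL,
                               ReceiptHandle=msg["ReceiptHandle"])

if __name__ == "__main__":
    run_worker()

Scaling this design then becomes a matter of running more identical workers against the same queue, for example in an Auto Scaling group driven by queue depth.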


-
install ffmpeg on amazon ecr linux python
27 May 2024, by Luka Savic. I'm trying to install ffmpeg in Docker for an Amazon Lambda function.
The code for the Dockerfile is:


FROM public.ecr.aws/lambda/python:3.8

# Copy function code
COPY app.py ${LAMBDA_TASK_ROOT}

# Install the function's dependencies using file requirements.txt
# from your project folder.

COPY requirements.txt .
RUN yum install gcc -y
RUN pip3 install -r requirements.txt --target "${LAMBDA_TASK_ROOT}"
RUN yum install -y ffmpeg

# Set the CMD to your handler (could also be done as a parameter override outside of the Dockerfile)
CMD [ "app.handler" ]



I am getting an error:


> [6/6] RUN yum install -y ffmpeg:
#9 0.538 Loaded plugins: ovl
#9 1.814 No package ffmpeg available.
#9 1.843 Error: Nothing to do
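
The failure comes from the base image itself: public.ecr.aws/lambda/python:3.8 is built on Amazon Linux, whose default yum repositories do not ship an ffmpeg package, so yum finds nothing to install. The usual workaround is to bundle ffmpeg yourself, for example by copying a static ffmpeg build into the image (or attaching it as a Lambda layer) instead of the RUN yum install -y ffmpeg line. As a hedged sketch, assuming a static binary has been placed at /usr/local/bin/ffmpeg, the handler can then invoke it directly:

# Sketch of app.py assuming a static ffmpeg binary was copied into the image
# at /usr/local/bin/ffmpeg; the path, input file and flags are assumptions.
import subprocess

def handler(event, context):
    # Lambda containers can only write under /tmp.
    src = "/tmp/input.mp4"
    dst = "/tmp/output.mp4"
    subprocess.run(["/usr/local/bin/ffmpeg", "-y", "-i", src,
                    "-vf", "scale=1280:-2", dst],
                   check=True)
    return {"transcoded": dst}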



-
Best practices for developing scalable video transcoding server on Amazon Web Services?
6 September 2016, by undefined. What do people think are the most important issues when developing an application that is going to allow users to upload video and images to a server and have them transcoded by FFMPEG and stored in Amazon S3? I have a couple of options:
1) Install FFMPEG on the same server that handles file uploads: when a video is uploaded and stored on an EC2 instance, call FFMPEG to convert it and, when done, write the file to an S3 bucket and dispose of the original.
How scalable is this? What happens when many users upload at the same time? How do I manage multiple processes at once? How do I know when to start another instance and load-balance this configuration?
2) Have one server for processing uploads (updating the database, renaming files, etc.) and one server for doing the transcoding. Again, what is the best way to manage multiple processes? Should I be looking at Amazon SQS for this? Can I tell the transcoding server to get the file from the upload server, or should I copy the file to the transcoding server? Should I just store all files on S3 and have SQS workers read from there? I am trying to have as little traffic as possible.
I am running a Linux box as the upload server and have FFMPEG running on it.
Any advice on best practices for setting up such a configuration would be appreciated. Many thanks.
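
For option 2, the usual pattern is to avoid copying files between your own servers at all: the upload server writes the original to S3 and publishes a small job message to SQS, and the transcoding workers fetch both from AWS. Below is a minimal producer-side sketch in Python, with the bucket, queue URL and message format invented for illustration.

# Sketch of the upload-side producer: store the original on S3, then enqueue
# a transcode job. Bucket, queue URL and message format are assumptions.
import json

import boto3

SOURCE_BUCKET = "uploads-bucket"  # placeholder
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/transcode-jobs"  # placeholder

s3 = boto3.client("s3")
sqs = boto3.client("sqs")

def handle_upload(local_path, key):
    # Only lightweight metadata goes on the queue; the bytes live in S3,
    # so the upload and transcoding servers never exchange files directly.
    s3.upload_file(local_path, SOURCE_BUCKET, key)
    sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps({"key": key}))

Traffic between the servers then stays minimal, since everything moves through S3 and SQS.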