
Media (2)
-
Rennes Emotion Map 2010-11
19 October 2011
Updated: July 2013
Language: French
Type: Text
-
Carte de Schillerkiez
13 May 2011
Updated: September 2011
Language: English
Type: Text
Other articles (43)
-
Sites built with MediaSPIP
2 May 2011
This page presents some of the sites running MediaSPIP. You can of course add your own using the form at the bottom of the page.
-
Submit enhancements and plugins
13 April 2011
If you have developed a new extension that adds one or more useful features to MediaSPIP, let us know and its integration into the core MediaSPIP functionality will be considered.
You can use the development discussion list to ask for help with creating a plugin. As MediaSPIP is based on SPIP, you can also use the SPIP discussion list SPIP-Zone.
-
MediaSPIP v0.2
21 June 2013
MediaSPIP 0.2 is the first stable MediaSPIP release. Its official release date is June 21, 2013, and it is announced here.
The zip file provided here only contains the sources of MediaSPIP in its standalone version. To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)
On other sites (6302)
-
Best practices for developing a scalable video transcoding server on Amazon Web Services?
6 September 2016, by undefined
What do people think are the most important issues when developing an application that lets users upload video and images to a server, has them transcoded by FFmpeg, and stores them in Amazon S3? I have a couple of options:
1) Install FFmpeg on the same server that handles file uploads. When a video is uploaded and stored on the EC2 instance, call FFmpeg to convert it; when done, write the file to an S3 bucket and dispose of the original.
How scalable is this? What happens when many users upload at the same time? How do I manage multiple processes at once? How do I know when to start another instance and load balance this configuration?
2) Have one server for processing uploads (updating the database, renaming files, etc.) and one server for transcoding. Again, what is the best way to manage multiple processes? Should I be looking at Amazon SQS for this? Can I tell the transcoding server to fetch the file from the upload server, or should I copy the file to the transcoding server? Should I just store all files on S3 and let SQS workers read from there? I am trying to generate as little traffic as possible.
I am running a Linux box as the upload server and have FFmpeg running on it.
Any advice on best practices for setting up such a configuration would be appreciated. Many thanks.
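
One way to picture option 2 is a small worker that polls SQS for upload notifications, pulls the source file from S3, runs FFmpeg, and writes the result back to S3. The sketch below is illustrative only, assuming boto3 and a locally installed ffmpeg binary; the queue URL, bucket names, message format, and encoding options are hypothetical placeholders, not taken from the original post.

# Illustrative SQS-driven transcoding worker (not from the original post).
# Queue URL, bucket names, and ffmpeg options are hypothetical placeholders.
import json
import os
import subprocess
import tempfile

import boto3

sqs = boto3.client("sqs")
s3 = boto3.client("s3")

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/transcode-jobs"  # placeholder
SOURCE_BUCKET = "uploads-bucket"       # placeholder
OUTPUT_BUCKET = "transcoded-bucket"    # placeholder


def process_message(message):
    # The upload server (or an S3 event notification) is assumed to enqueue
    # a JSON body containing the S3 key of the uploaded file.
    key = json.loads(message["Body"])["key"]
    with tempfile.TemporaryDirectory() as tmp:
        src = os.path.join(tmp, "input")
        dst = os.path.join(tmp, "output.mp4")
        s3.download_file(SOURCE_BUCKET, key, src)
        # Example transcode; real options depend on the target formats.
        subprocess.run(
            ["ffmpeg", "-y", "-i", src, "-c:v", "libx264", "-c:a", "aac", dst],
            check=True,
        )
        s3.upload_file(dst, OUTPUT_BUCKET, key + ".mp4")


def main():
    while True:
        resp = sqs.receive_message(
            QueueUrl=QUEUE_URL, MaxNumberOfMessages=1, WaitTimeSeconds=20
        )
        for message in resp.get("Messages", []):
            process_message(message)
            # Delete only after a successful transcode so failed jobs are retried.
            sqs.delete_message(
                QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"]
            )


if __name__ == "__main__":
    main()

Because each worker only talks to S3 and SQS, more workers can be added (for example, scaled on queue depth) without the upload server needing to know about them.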
-
install ffmpeg on amazon ecr linux python
27 May 2024, by Luka Savic
I'm trying to install ffmpeg in Docker for an Amazon Lambda function. The Dockerfile is:


FROM public.ecr.aws/lambda/python:3.8

# Copy function code
COPY app.py ${LAMBDA_TASK_ROOT}

# Install the function's dependencies using file requirements.txt
# from your project folder.

COPY requirements.txt .
RUN yum install gcc -y
RUN pip3 install -r requirements.txt --target "${LAMBDA_TASK_ROOT}"
RUN yum install -y ffmpeg

# Set the CMD to your handler (could also be done as a parameter override outside of the Dockerfile)
CMD [ "app.handler" ]



I am getting an error:


> [6/6] RUN yum install -y ffmpeg:
#9 0.538 Loaded plugins: ovl
#9 1.814 No package ffmpeg available.
#9 1.843 Error: Nothing to do
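
The error occurs because ffmpeg is not available in the default yum repositories of the Amazon Linux base image used by public.ecr.aws/lambda/python:3.8. One common workaround, not taken from the original post, is to copy a static ffmpeg build into the image instead of installing it via yum; the Dockerfile fragment below is a minimal sketch, and the download URL and archive layout are assumptions to verify before relying on them.

# Not from the original post: replace "RUN yum install -y ffmpeg" with a
# static build, since ffmpeg is not in the base image's yum repos.
# URL and archive layout are assumptions to verify.
RUN yum install -y tar xz && \
    curl -sL https://johnvansickle.com/ffmpeg/releases/ffmpeg-release-amd64-static.tar.xz \
        -o /tmp/ffmpeg.tar.xz && \
    tar -xJf /tmp/ffmpeg.tar.xz -C /tmp && \
    cp /tmp/ffmpeg-*-amd64-static/ffmpeg /usr/local/bin/ && \
    rm -rf /tmp/ffmpeg*

Bundling the binary alongside the function code or using a prebuilt ffmpeg Lambda layer are other options people commonly use for this.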


