
Other articles (81)
-
Submitting bugs and patches
10 April 2011
Unfortunately, no software is ever perfect...
If you think you have found a bug, report it in our ticket system, taking care to include the relevant information: the type and exact version of the browser with which you encountered the anomaly; as precise a description as possible of the problem encountered; if possible, the steps to reproduce the problem; a link to the site / page in question;
If you think you have fixed the bug yourself (...) -
Contribute to a better visual interface
13 April 2011
MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community. -
Automated installation script of MediaSPIP
25 April 2011
To overcome the difficulties mainly due to the installation of server-side software dependencies, an "all-in-one" installation script written in bash was created to facilitate this step on a server with a compatible Linux distribution.
To use it, you must have SSH access to your server and a root account, which the script uses to install the dependencies. Contact your provider if you do not have these.
Documentation on using this installation script is available here.
The code of this (...)
On other sites (4968)
-
How to transcode .mp4 files using ffmpeg, celery and RabbitMQ on Amazon Linux?
11 March 2017, by Srinivas 25
I want to transcode .mp4 and .flv files using ffmpeg, celery and RabbitMQ. With the help of these tools I am able to transcode on localhost, where my OS is Ubuntu, but I am unable to do the same on AWS Linux for production.
Here is the code I am using to integrate ffmpeg, RabbitMQ and celery to transcode on Amazon Linux:

FFMPEG_PATH = '/usr/bin/ffmpeg'
CELERY_BROKER_URL = 'amqp://guest:guest@awsuser:5672//'
CELERY_ACCEPT_CONTENT = 'file'
CELERY_RESULT_BACKEND = 'rpc://'
CELERY_TASK_SERIALIZER = 'file'

celery.py
from __future__ import absolute_import
import os
from celery import Celery
from afnity.settings import CELERY_BROKER_URL
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'afnity.settings')
app = Celery('taskapp',
             broker=CELERY_BROKER_URL,
             include=['taskapp.tasks'])
app.config_from_object('django.conf:settings', namespace='CELERY')

if __name__ == '__main__':
    app.start()

tasks.py
from .celery import app

@app.task
def add():
    return 3 + 4
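Two side notes on this configuration (not from the original post): CELERY_ACCEPT_CONTENT must be a list of serializer names, and 'file' is not a built-in Celery serializer, so these settings will fail at runtime on any host, not just AWS Linux. A minimal sketch of a working setup, with a hypothetical transcode_video task that passes file paths (not file contents) through the broker:

# settings.py (sketch; 'json' replaces the invalid 'file' serializer)
CELERY_BROKER_URL = 'amqp://guest:guest@awsuser:5672//'
CELERY_RESULT_BACKEND = 'rpc://'
CELERY_ACCEPT_CONTENT = ['json']  # must be a list of serializer names
CELERY_TASK_SERIALIZER = 'json'

# tasks.py (sketch; transcode_video is hypothetical)
import subprocess
from .celery import app

FFMPEG_PATH = '/usr/bin/ffmpeg'

@app.task
def transcode_video(src_path, dst_path):
    # The worker reads and writes on its own local disk,
    # so only the paths travel through RabbitMQ
    subprocess.run([FFMPEG_PATH, '-y', '-i', src_path, dst_path], check=True)
    return dst_path
-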
PHP - Get video duration from a file hosted on Amazon S3
8 May 2017, by Michael Lee
I am developing a back-end API in PHP to get the duration of a video file hosted on Amazon S3. The API server is hosted on Amazon EC2 and the video file is uploaded to Amazon S3. When I make a request to the API server, it has to respond with information about the video file hosted on Amazon S3, such as duration, file size, resolution, etc. At the very least, it must return the video. I have tried the following code using ffmpeg, but it is not working.
<?php
// Public S3 URL of the video file to inspect
$file = 'https://videoapptest-bucket.s3.amazonaws.com/video/10/44__55_f03876434e48d5fed7d6982b0cf578de_u_00000.ts';
// ffmpeg writes its stream info (including Duration) to stderr, hence 2>&1;
// escapeshellarg() (rather than escapeshellcmd()) quotes the URL safely as one argument
$result = shell_exec('ffmpeg -i ' . escapeshellarg($file) . ' 2>&1');
preg_match('/(?<=Duration: )(\d{2}:\d{2}:\d{2})\.\d{2}/', $result, $match);
print_r($match);
?>

How can I get the video duration from a file hosted on Amazon S3 with ffmpeg or another tool?
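One alternative (not from the original post) is to ask ffprobe, which ships with ffmpeg, for the duration directly instead of scraping ffmpeg's stderr banner. A minimal Python sketch, assuming ffprobe is installed on the EC2 host and the S3 object is publicly readable over HTTPS:

import json
import subprocess

def get_duration_seconds(url):
    # -show_format includes the container-level duration;
    # -of json makes the output machine-readable
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-show_format", "-of", "json", url],
        capture_output=True, text=True, check=True,
    ).stdout
    return float(json.loads(out)["format"]["duration"])

The same approach works for private objects by handing ffprobe a pre-signed S3 URL.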
-
Best practices for developing a scalable video transcoding server on Amazon Web Services? [closed]
5 February, by undefined
What do people think are the most important issues when developing an application that will allow users to upload video and images to a server, have them transcoded by FFMPEG and stored in Amazon S3? I have a couple of options:

- Install FFMPEG on the same server that handles file uploads: when a video is uploaded and stored on the EC2 instance, call FFMPEG to convert it, then, when done, write the file to the S3 bucket and dispose of the original.

How scalable is this? What happens when many users upload at the same time? How do I manage multiple processes at once? How do I know when to start another instance and load-balance this configuration?

- Have one server for processing uploads (updating the database, renaming files, etc.) and one server for transcoding. Again, what is the best way to manage multiple processes? Should I be looking at Amazon SQS for this? Can I tell the transcoding server to get the file from the upload server, or should I copy the file to the transcoding server? Should I just store all files on S3 and have SQS read from there? I am trying to have as little traffic as possible.

I am running a Linux box as the upload server and have FFMPEG running on it.
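As a rough illustration of the second option (a sketch only: the queue URL, bucket name and message format are hypothetical, and it assumes boto3 and ffmpeg are installed on the worker), a transcoding worker could poll SQS for jobs, pull the source from S3, transcode locally and upload the result:

import json
import os
import subprocess
import tempfile

import boto3

sqs = boto3.client("sqs")
s3 = boto3.client("s3")

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/transcode-jobs"  # hypothetical
BUCKET = "my-video-bucket"  # hypothetical

def process_one_job():
    # Long polling keeps idle workers from spinning
    resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=1, WaitTimeSeconds=20)
    for msg in resp.get("Messages", []):
        job = json.loads(msg["Body"])  # e.g. {"key": "uploads/raw/video.flv"}
        with tempfile.TemporaryDirectory() as tmp:
            src = os.path.join(tmp, os.path.basename(job["key"]))
            dst = src + ".mp4"
            s3.download_file(BUCKET, job["key"], src)
            # Re-encode to H.264 video / AAC audio in an MP4 container
            subprocess.run(["ffmpeg", "-y", "-i", src, "-c:v", "libx264", "-c:a", "aac", dst], check=True)
            s3.upload_file(dst, BUCKET, "transcoded/" + os.path.basename(dst))
        # Delete only after success, so a failed job becomes visible again for retry
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])

With this shape the upload server only writes the original to S3 and enqueues a message, so the transcoding fleet can be scaled on queue depth without any file traffic between the two tiers.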