
Other articles (111)
-
MediaSPIP 0.1 Beta version
25 April 2011. MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to perform other manual (...)
-
Multilang: improving the interface for multilingual blocks
18 February 2011. Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
Once it is activated, MediaSPIP init automatically applies a preconfiguration so that the new feature is immediately operational. No separate configuration step is therefore required.
-
User profiles
12 April 2011. Each user has a profile page where they can edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialized; it is visible only when the visitor is logged in to the site.
Users can edit their profile from their author page; a "Modifier votre profil" (edit your profile) link is available in the navigation (...)
On other sites (12049)
-
ffmpeg + AWS Lambda issues. Won't compress full video
7 July 2022, by Joesph Stah Lynn
So I followed this tutorial to set everything up, and changed the function a bit to compress video, but no matter what I try, on larger videos (basically anything over 50-100MB) the output file is always cut short, and depending on the encoding settings I'm using, it is cut by different amounts. I tried the solution found here, adding a -nostdin flag to my ffmpeg command, but that didn't seem to fix the issue either.

Another odd thing: no matter what I try, if I remove the '-f mpegts' flag, the output video will be 0B.

My Lambda function is set up with 3008MB of memory (I submitted a ticket to get my limit raised so I can use the full 10240MB available) and 2048MB of ephemeral storage (I'm honestly not sure I need anything more than the minimum 512MB, but I upped it to try to fix the issue). When I check my CloudWatch logs, really large files will occasionally time out, but other than that I get no error messages, just the standard start, end, and billable time messages.

This is the code for my lambda function.


import json
import os
import subprocess
import shlex
import boto3

S3_DESTINATION_BUCKET = "rw-video-out"
SIGNED_URL_TIMEOUT = 600

def lambda_handler(event, context):

    s3_source_bucket = event['Records'][0]['s3']['bucket']['name']
    s3_source_key = event['Records'][0]['s3']['object']['key']

    s3_source_basename = os.path.splitext(os.path.basename(s3_source_key))[0]
    s3_destination_filename = s3_source_basename + "-comp.mp4"

    s3_client = boto3.client('s3')
    s3_source_signed_url = s3_client.generate_presigned_url('get_object',
        Params={'Bucket': s3_source_bucket, 'Key': s3_source_key},
        ExpiresIn=SIGNED_URL_TIMEOUT)

    ffmpeg_cmd = f"/opt/bin/ffmpeg -nostdin -i {s3_source_signed_url} -f mpegts libx264 -preset fast -crf 28 -c:a copy - "
    command1 = shlex.split(ffmpeg_cmd)
    p1 = subprocess.run(command1, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    resp = s3_client.put_object(Body=p1.stdout, Bucket=S3_DESTINATION_BUCKET, Key=s3_destination_filename)
    s3 = boto3.resource('s3')
    s3.Object(s3_source_bucket, s3_source_key).delete()

    return {
        'statusCode': 200,
        'body': json.dumps('Processing complete successfully')
    }


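As an editorial aside (not part of the original post): MP4 output normally needs a seekable destination, so when ffmpeg writes to a stdout pipe it has to produce either a streamable container such as MPEG-TS or a fragmented MP4, which is consistent with the 0B output seen when '-f mpegts' is removed. It also looks like the libx264 token above is not preceded by -c:v, so ffmpeg would parse it as an extra output name rather than as the video encoder. A hedged variant of the command string, reusing the same presigned-URL variable, might look like this (a sketch, not a tested fix):

    ffmpeg_cmd = (f"/opt/bin/ffmpeg -nostdin -i {s3_source_signed_url} "
                  "-c:v libx264 -preset fast -crf 28 -c:a copy "
                  "-movflags frag_keyframe+empty_moov -f mp4 -")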

This is the code from the solution I mentioned, but when I try this version I get "output.mp4 not found" errors.


def lambda_handler(event, context):
    print(event)
    os.chdir('/tmp')
    s3_source_bucket = event['Records'][0]['s3']['bucket']['name']
    s3_source_key = event['Records'][0]['s3']['object']['key']

    s3_source_basename = os.path.splitext(os.path.basename(s3_source_key))[0]
    s3_destination_filename = s3_source_basename + ".mp4"

    s3_client = boto3.client('s3')
    s3_source_signed_url = s3_client.generate_presigned_url('get_object',
        Params={'Bucket': s3_source_bucket, 'Key': s3_source_key},
        ExpiresIn=SIGNED_URL_TIMEOUT)
    print(s3_source_signed_url)
    s3_client.download_file(s3_source_bucket, s3_source_key, s3_source_key)
    # ffmpeg_cmd = "/opt/bin/ffmpeg -framerate 25 -i \"" + s3_source_signed_url + "\" output.mp4 "
    ffmpeg_cmd = f"/opt/bin/ffmpeg -framerate 25 -i {s3_source_key} output.mp4 "
    # command1 = shlex.split(ffmpeg_cmd)
    # print(command1)
    os.system(ffmpeg_cmd)
    # os.system('ls')
    # p1 = subprocess.run(command1, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    file = 'output.mp4'
    resp = s3_client.put_object(Body=open(file, "rb"), Bucket=S3_DESTINATION_BUCKET, Key=s3_destination_filename)
    # resp = s3_client.put_object(Body=p1.stdout, Bucket=S3_DESTINATION_BUCKET, Key=s3_destination_filename)
    s3 = boto3.resource('s3')
    s3.Object(s3_source_bucket, s3_source_key).delete()
    return {
        'statusCode': 200,
        'body': json.dumps('Processing complete successfully')
    }



Any help would be greatly appreciated.

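For comparison only (not from the original question), here is a minimal file-based sketch that avoids both the stdout pipe and relative output paths: download the object to /tmp, encode to an explicit /tmp path, then upload the result. The bucket name, the /opt/bin/ffmpeg layer path, and the -comp.mp4 suffix are carried over from the post; the rest (paths, flags, error handling) is assumed.

import os
import subprocess
import boto3

S3_DESTINATION_BUCKET = "rw-video-out"  # bucket name taken from the original post

def lambda_handler(event, context):
    s3_client = boto3.client('s3')
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = event['Records'][0]['s3']['object']['key']

    base = os.path.splitext(os.path.basename(key))[0]
    src_path = f"/tmp/{os.path.basename(key)}"   # local copy of the source video
    dst_path = f"/tmp/{base}-comp.mp4"           # explicit output path in /tmp

    # Download the source object instead of streaming a presigned URL.
    s3_client.download_file(bucket, key, src_path)

    # Encode to a regular MP4 file; no pipe, so no fragmented-MP4 flags needed.
    cmd = ["/opt/bin/ffmpeg", "-nostdin", "-y", "-i", src_path,
           "-c:v", "libx264", "-preset", "fast", "-crf", "28", "-c:a", "copy",
           dst_path]
    subprocess.run(cmd, check=True, capture_output=True)

    # Upload the finished file and clean up /tmp (ephemeral storage is limited).
    s3_client.upload_file(dst_path, S3_DESTINATION_BUCKET, f"{base}-comp.mp4")
    os.remove(src_path)
    os.remove(dst_path)

    return {"statusCode": 200, "body": "Processing complete"}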

-
AWS Lambda subprocess OSError: [Errno 2] No such file or directory
11 September 2016, by Lev
I'm trying to create a Lambda function that makes a collection of thumbnails from a video on Amazon S3 using ffmpeg. The ffmpeg binary is included in the function package.
Function code:
# -*- coding: utf-8 -*-
import stat
import shutil
import boto3
import logging
import subprocess as sp
import os
import threading

thumbnail_prefix = 'thumb_'
thumbnail_ext = '.jpg'
time_delta = 1
video_frames_path = 'media/videos/frames'

print('Loading function')
logger = logging.getLogger()
logger.setLevel(logging.INFO)

lambda_tmp_dir = '/tmp'  # Lambda function can use this directory.

# ffmpeg is stored with this script.
# When executing ffmpeg, execute permission is required.
# But the Lambda source directory does not have permission to change it.
# So move the ffmpeg binary to `/tmp` and add permission.
ffmpeg_bin = "{0}/ffmpeg.linux64".format(lambda_tmp_dir)
shutil.copyfile('/var/task/ffmpeg.linux64', ffmpeg_bin)
os.chmod(ffmpeg_bin, 777)
# tried also:
# os.chmod(ffmpeg_bin, os.stat(ffmpeg_bin).st_mode | stat.S_IEXEC)

s3 = boto3.client('s3')

def get_thumb_filename(num):
    return '{prefix}{num:03d}{ext}'.format(prefix=thumbnail_prefix, num=num, ext=thumbnail_ext)

def create_thumbnails(video_url):
    i = 1
    filenames_list = []
    filename = None
    while i == 1 or os.path.isfile(os.path.join(os.getcwd(), get_thumb_filename(i-1))):
        if filename:
            filenames_list.append(filename)
        time = time_delta * (i - 1)
        filename = get_thumb_filename(i)
        print(ffmpeg_bin)
        if os.path.isfile(ffmpeg_bin):
            print('ok')
        sp.call(['sudo',
                 ffmpeg_bin,
                 '-ss',
                 str(time),
                 '-i',
                 video_url,
                 '-frames:v',
                 '1',
                 get_thumb_filename(i)])
        i += 1
    print(filenames_list)
    return filenames_list

def s3_upload_file(file_path, key, bucket, acl, content_type):
    file = open(file_path, 'r')
    s3.put_object(
        Bucket=bucket,
        ACL=acl,
        Body=file,
        Key=key,
        ContentType=content_type
    )
    logger.info("file {0} moved to {1}/{2}".format(file_path, bucket, key))

def s3_upload_files_in_threads(filenames_list, dir_path, bucket, s3path, acl, content_type):
    for filename in filenames_list:
        if os.path.isfile(os.path.join(dir_path, filename)):
            print(os.path.join(dir_path, filename))
            t = threading.Thread(target=s3_upload_file,
                                 args=(os.path.join(dir_path, filename),
                                       '{0}/{1}'.format(s3path, filename),
                                       bucket,
                                       acl,
                                       content_type)).start()

def lambda_handler(event, context):
    bucket = event['Records'][0]['s3']['bucket']['name']
    video_key = event['Records'][0]['s3']['object']['key']
    video_name = video_key.split('/')[-1].split('.')[0]
    video_url = 'http://{0}/{1}'.format(bucket, video_key)
    filenames_list = create_thumbnails(video_url)
    s3_upload_files_in_threads(filenames_list,
                               os.getcwd(),
                               bucket,
                               '{0}/{1}'.format(video_frames_path, video_name),
                               'public-read',
                               'image/jpeg')
    return

During the execution I get the following logs:
Loading function
/tmp/ffmpeg.linux64
ok
[Errno 2] No such file or directory: OSError
Traceback (most recent call last):
  File "/var/task/lambda_function.py", line 112, in lambda_handler
    filenames_list = create_thumbnails(video_url)
  File "/var/task/lambda_function.py", line 77, in create_thumbnails
    get_thumb_filename(i)])
  File "/usr/lib64/python2.7/subprocess.py", line 522, in call
    return Popen(*popenargs, **kwargs).wait()
  File "/usr/lib64/python2.7/subprocess.py", line 710, in __init__
    errread, errwrite)
  File "/usr/lib64/python2.7/subprocess.py", line 1335, in _execute_child
    raise child_exception
OSError: [Errno 2] No such file or directory

When I use the same sp.call() with the same ffmpeg binary on my EC2 instance, it works fine.
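An editorial aside (not from the original post): two details in the snippet above commonly produce exactly this error on Lambda. os.chmod(ffmpeg_bin, 777) passes a decimal mode rather than the octal 0o755, so the owner's execute bit is never actually set, and the Lambda execution environment has no sudo binary, so sp.call(['sudo', ...]) itself raises OSError: [Errno 2]. A hedged sketch of the usual pattern is below; grab_frame, seek_seconds and out_path are hypothetical names, not from the original code.

import os
import shutil
import subprocess as sp

# Copy the bundled binary to writable /tmp and set an octal, executable mode.
ffmpeg_bin = '/tmp/ffmpeg.linux64'
shutil.copyfile('/var/task/ffmpeg.linux64', ffmpeg_bin)
os.chmod(ffmpeg_bin, 0o755)

def grab_frame(video_url, seek_seconds, out_path):
    # No 'sudo' prefix: the function's user can execute files it owns in /tmp.
    sp.call([ffmpeg_bin, '-ss', str(seek_seconds), '-i', video_url,
             '-frames:v', '1', out_path])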
-
ffmpeg merge video and audio + fade out
1 May 2016, by jacky brown
I have one video file and one audio file.
I want to merge them so that the final output video has the length of the video and contains the audio in the background. I did:
ffmpeg -i output.avi -i bgmusic.mp3 -filter_complex " [1:0] apad " -shortest out.avi
But the background audio is cut off at the end of the final merged movie.
I want it to fade out nicely. How can I do it?
Thanks.
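A hedged sketch of one way to get the fade (not from the original thread): pad the music with silence so it covers the whole video, trim it to the video length, and apply afade over the last few seconds. The 120-second duration and the 3-second fade are placeholder values; in practice the video length would be read first, for example with ffprobe.

ffmpeg -i output.avi -i bgmusic.mp3 -filter_complex "[1:a]apad,atrim=0:120,afade=t=out:st=117:d=3[aud]" -map 0:v -map "[aud]" -c:v copy -shortest out.avi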