
Other articles (41)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things):
    - implementation costs to be shared between several different projects / individuals
    - rapid deployment of multiple unique sites
    - creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • Other interesting software

    13 April 2011

    We don’t claim to be the only ones doing what we do, and we certainly don’t claim to be the best either. We simply try to do it well and to keep improving.
    The following list covers software that is more or less similar to MediaSPIP, or whose aims MediaSPIP more or less shares.
    We don’t know these projects and we haven’t tried them, but you can take a peek.
    Videopress
    Website: http://videopress.com/
    License: GNU/GPL v2
    Source code: (...)

On other sites (6508)

  • Videos written with moviepy on Amazon AWS S3 are empty

    10 April 2019, by cellistigs

    I am working on processing a dataset of large videos (100 GB) for a collaborative project. To make it easier to share data and results, I am keeping all videos remotely in an Amazon S3 bucket and processing them by mounting the bucket on an EC2 instance.

    One of the processing steps I am trying to do involves cropping the videos and rewriting them into smaller segments. I am doing this with moviepy, splitting the video with the subclip method and calling:

    subclip.write_videofile("PathtoS3Bucket" + VideoName.split('.')[0] + 'part' + str(segment) + '.mp4', codec='mpeg4', bitrate="1500k", threads=2)

    I found that when the videos are too large (parameters set as above), calls to this function will sometimes generate empty files in my S3 bucket (10% of the time). Does anyone have insight into features of moviepy/ffmpeg/S3 that would lead to this?
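
    A possible explanation, not stated in the question and therefore an assumption: FUSE-based S3 mounts such as s3fs buffer writes and only materialise the object when the file handle is flushed, so an interrupted or partially failed encode can leave an empty object behind. A common workaround is to render each segment to local instance storage first and upload the finished file afterwards. A minimal sketch of that pattern using boto3, with hypothetical names (BUCKET, source_path, segment bounds):

    import os
    import tempfile

    import boto3
    from moviepy.editor import VideoFileClip

    BUCKET = "my-output-bucket"  # hypothetical destination bucket

    def write_segment(source_path, start, end, segment):
        """Render one subclip locally, then upload the finished .mp4 to S3."""
        base = os.path.splitext(os.path.basename(source_path))[0]
        key = base + "part" + str(segment) + ".mp4"
        clip = VideoFileClip(source_path)
        subclip = clip.subclip(start, end)
        with tempfile.TemporaryDirectory() as tmpdir:
            # Write to local disk first so a failed encode never leaves
            # a truncated or empty object in the bucket.
            local_path = os.path.join(tmpdir, key)
            subclip.write_videofile(local_path, codec="mpeg4", bitrate="1500k", threads=2)
            boto3.client("s3").upload_file(local_path, BUCKET, key)
        clip.close()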

  • How to create an m3u8 playlist from an mp4 video URL (stored in Amazon S3) and store the video chunks (.ts files) and the .m3u8 file back to another S3 bucket?

    19 May 2019, by dexter2019

    I am building an application where users can upload videos and others can watch them later. I am aiming for HLS streaming of the video on the client side, for which the video format should be .m3u8. I am using the Node fluent-ffmpeg module to do the processing; however, I have a big question: how do I ensure that all the .ts files (chunks) are also stored back in the S3 bucket along with the .m3u8 file after ffmpeg has processed the mp4 file?

    The ffmpeg command only takes the location of the .m3u8 file, so how do I handle it when I want both the input and output locations to be S3?

    Any help will be greatly appreciated.

    I am following the answer from this question: Ffmpeg creating m3u8 from mp4, video file size, which works absolutely fine on my local machine. How do I achieve the same for S3?
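
    For what it's worth, this is the general pattern rather than anything fluent-ffmpeg specific: ffmpeg only deals with local paths, so the usual approach is to download the source mp4 from S3, let ffmpeg write the .m3u8 playlist and the .ts segments into a local working directory, and then upload every file it produced back to the destination bucket. A minimal Python sketch of that flow, with segmenting options similar to those in the linked answer; bucket names and the key prefix are assumptions:

    import os
    import subprocess
    import tempfile

    import boto3

    def mp4_to_hls(source_bucket, source_key, dest_bucket, dest_prefix):
        """Download an mp4 from S3, segment it into HLS locally, upload .m3u8 and .ts back to S3."""
        s3 = boto3.client("s3")
        with tempfile.TemporaryDirectory() as tmpdir:
            local_mp4 = os.path.join(tmpdir, "input.mp4")
            s3.download_file(source_bucket, source_key, local_mp4)
            playlist = os.path.join(tmpdir, "index.m3u8")
            # ffmpeg writes the playlist *and* all .ts segments into tmpdir.
            subprocess.run([
                "ffmpeg", "-i", local_mp4,
                "-codec", "copy", "-start_number", "0",
                "-hls_time", "10", "-hls_list_size", "0",
                "-f", "hls", playlist,
            ], check=True)
            # Upload everything ffmpeg produced: the playlist and each segment.
            for name in os.listdir(tmpdir):
                if name.endswith((".m3u8", ".ts")):
                    s3.upload_file(os.path.join(tmpdir, name), dest_bucket, dest_prefix + "/" + name)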

  • PHP - Upload video, convert to mp4 and upload to Amazon S3

    31 October 2019, by Kadir Geçit

    I’m using Amazon S3 as video storage for my website. I’m having problems with some videos: black screens, sound issues, etc.

    I want to convert each video to mp4 format after it has been uploaded to my server, and then upload it to Amazon. Is this possible with FFmpeg?

    I’m using this code for uploading files now:

    $file1 = $_FILES['file']['name'];
    $videoFileType = strtolower(pathinfo($file1, PATHINFO_EXTENSION));
    $file_name = sprintf('%s_%s', uniqid(), uniqid() . "." . $videoFileType);
    $temp_file_location = $_FILES["file"]["tmp_name"];

    require 'application/libraries/Amazon/aws-autoloader.php';

    // Build the S3 client from the configured region and credentials.
    $s3 = new Aws\S3\S3Client([
        'region'  => $amazon_region,
        'version' => 'latest',
        'credentials' => [
            'key'    => $amazon_key,
            'secret' => $amazon_secret,
        ],
    ]);

    // Upload the temporary file and return its public URL.
    $result = $s3->putObject([
        'Bucket'       => $amazon_bucket,
        'Key'          => $file_name,
        'SourceFile'   => $temp_file_location,
        'ACL'          => 'public-read',
        'CacheControl' => 'max-age=3153600',
    ]);
    $filepath = $result['ObjectURL'] . PHP_EOL;

    echo json_encode([
        'status' => 'ok',
        'path'   => $filepath,
    ]);
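
    It is possible: the usual pattern is to transcode the uploaded temporary file with ffmpeg into an H.264/AAC mp4 and hand the converted file to putObject instead of the raw upload. A minimal sketch of the conversion step, written in Python only for brevity (the same ffmpeg command can be run from PHP with shell_exec or proc_open); paths and encoder settings are assumptions, not taken from the question:

    import subprocess

    def convert_to_mp4(input_path, output_path):
        """Re-encode an uploaded video to a broadly compatible H.264/AAC mp4."""
        subprocess.run([
            "ffmpeg", "-i", input_path,
            "-c:v", "libx264", "-preset", "fast",
            "-c:a", "aac",
            "-movflags", "+faststart",  # put the moov atom up front for web playback
            output_path,
        ], check=True)

    # The converted file can then be supplied as 'SourceFile' in the putObject
    # call above, in place of the original temporary upload.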