
Other articles (34)
-
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...) -
From upload to final video [standalone version]
31 January 2010
The path of an audio or video document through SPIPMotion is divided into three distinct stages.
Upload and retrieval of information about the source video
First, you need to create a SPIP article and attach the "source" video document to it.
When this document is attached to the article, two actions in addition to the normal behaviour are carried out: retrieval of the technical information about the file's audio and video streams; generation of a thumbnail: extraction of a (...) -
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used as a fallback.
The HTML5 player was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
These technologies make it possible to deliver video and sound both to conventional computers (...)
On other sites (7132)
-
Uncaught Error: spawn D:\Users\...\ffmpeg.exe ENOENT at Process.ChildProcess._handle.onexit
15 April 2020, by yasgur99
I am trying to use ffmpeg in an Electron project.
I did:
yarn add ffmpeg-static
yarn add fluent-ffmpeg



In the file where I am trying to use it, I have:



import ffmpeg from 'fluent-ffmpeg';
import ffmpegStatic from 'ffmpeg-static';




In the method that calls ffmpeg, I have:



ffmpeg.setFfmpegPath(ffmpegStatic);
ffmpeg(path)
  .audioCodec('aac')
  .videoCodec('h264')
  .videoBitrate(8192) // 8192 kbps (8 Mbps)
  .outputOptions([
    '-y',
    '-movflags', 'faststart' // the option needs its leading dash
  ])
  .output('temp.mp4')
  .on('start', () => console.log('starting'))
  .on('error', (err, stdout, stderr) => { // 'error', not 'stderr', receives (err, stdout, stderr)
    console.log('Cannot Process Video: ' + err.message);
  })
  .on('progress', progress => {
    console.log(progress.percent);
  })
  .on('end', (stdout, stderr) => {
    // fs and File must be available in this scope
    fs.readFile('temp.mp4', (error, data) => {
      console.log(error);

      if (error) throw error;
      const filename = path.replace(/^.*[\\\/]/, '');
      return new File([data], filename);
    });
  })
  .run();




When the method that contains this code runs, I get the following exception:



events.js:187 Uncaught Error: spawn D:\Users\Michael\Documents\desktopapp\app\ffmpeg.exe ENOENT
 at Process.ChildProcess._handle.onexit (internal/child_process.js:264)
 at onErrorNT (internal/child_process.js:456)
 at processTicksAndRejections (internal/process/task_queues.js:80)




If I console.log(ffmpegStatic), it outputs D:\Users\Michael\Documents\desktopapp\app\ffmpeg.exe



If I look in this folder, I do not see ffmpeg.exe. How can I fix this?
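A hedged first diagnostic step, assuming nothing beyond what the question shows: check whether the path reported by ffmpeg-static actually exists on disk before handing it to fluent-ffmpeg, and try the app.asar.unpacked location that packaged Electron builds typically use. The fallback path is an assumption for illustration, not a confirmed cause here.

import fs from 'fs';
import ffmpeg from 'fluent-ffmpeg';
import ffmpegStatic from 'ffmpeg-static';

// Sketch only: verify the resolved binary path and, if it is missing,
// try the unpacked-asar location that packaged Electron apps commonly use.
let ffmpegPath = ffmpegStatic;
if (!fs.existsSync(ffmpegPath)) {
  ffmpegPath = ffmpegStatic.replace('app.asar', 'app.asar.unpacked');
}
console.log('ffmpeg binary:', ffmpegPath, 'exists:', fs.existsSync(ffmpegPath));
ffmpeg.setFfmpegPath(ffmpegPath);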


-
Writing an MP4 slideshow video to S3 using the Lambda FFmpeg Layer
15 April 2020, by Gracie
I am using AWS Lambda with the FFmpeg layer to try to build a 15-second MP4 file (beach.mp4) from 3 static images that show for 3, 5 and 7 seconds in sequence.
These 3 images are within my Lambda upload deployment zip on S3, along with the sequence.txt file needed for the function.



SEQUENCE.TXT



file beach1.jpg
outpoint 3
file beach2.jpg
outpoint 8
file beach3.jpg
outpoint 15



FFMPEG COMMAND



ffmpeg -f concat -i sequence.txt -c:v libx264 -tune stillimage -c:a aac -b:a 192k -pix_fmt yuv420p -shortest beach.mp4



I am writing a file to S3, but it is blank, only 15 bytes, so it doesn't contain the MP4 file created by FFmpeg. I think this has something to do with syncing or streaming the video file so that both the txt can be read and the MP4 can be written to a file, but I'm not sure.



How can I read the .txt contents and then write the ffmpeg output to a file in /tmp/?



You can download or view the files at https://lifeisabeach.netlify.app/
(For some strange reason the MP4 length when built locally is 19 seconds, when it should be 15!)



const util = require('util');
const AWS = require('aws-sdk');
const s3 = new AWS.S3();
const { readFileSync, writeFileSync, unlinkSync, writeFile, readdir } = require('fs');
//const fs = require('fs');
const path = require('path');
const { spawnSync } = require('child_process');

exports.handler = async (event, context, callback) => {

  const outputBucket = 'mys3bucket';
  const sequenceTXT = "sequence.txt";

  // FFmpeg creates the file, using the contents of sequence.txt to create timed image slides
  const mp4create = await spawnSync(
    '/opt/bin/ffmpeg',
    [
      '-f',
      'concat',
      '-i',
      sequenceTXT,
      '-c:v',
      'libx264',
      '-tune',
      'stillimage',
      '-c:a',
      'aac',
      '-b:a',
      '192k',
      '-pix_fmt',
      'yuv420p',
      '-shortest',
      'beach.mp4'
    ]
  );

  // Write ffmpeg output to a file in /tmp/
  const writeMP4File = util.promisify(writeFile);
  await writeMP4File('/tmp/beach.mp4', mp4create, 'binary');
  console.log('MP4 content written to /tmp/');

  // Copy MP4 data to a variable to enable write to S3 Bucket
  let result = mp4create;
  console.log('MP4 Result contents ', result);

  const vidFile = readFileSync('/tmp/beach.mp4');

  // Set S3 bucket details and put MP4 file into S3 bucket from /tmp/
  const s3 = new AWS.S3();
  const params = {
    Bucket: outputBucket,
    Key: 'beach.mp4',
    ACL: 'private',
    Body: vidFile
  };

  // Put MP4 file from AWS Lambda function /tmp/ to an S3 bucket
  const s3Response = await s3.putObject(params).promise();
  callback(null, s3Response);

};
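A detail worth noting about the code above: spawnSync returns an object describing the child process (status, stdout, stderr), not the encoded video, and writing that object to /tmp/beach.mp4 would explain a 15-byte upload ("[object Object]" is 15 bytes). Also, an unqualified output filename lands in the Lambda task directory, which is read-only; only /tmp is writable. A minimal sketch under those assumptions (binary at /opt/bin/ffmpeg, sequence.txt and images shipped in the deployment package, bucket name taken from the question; the audio options are omitted since the inputs are still images):

const { spawnSync } = require('child_process');
const { readFileSync } = require('fs');
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

exports.handler = async () => {
  // Let ffmpeg write the MP4 directly into /tmp, the only writable path in Lambda.
  const result = spawnSync('/opt/bin/ffmpeg', [
    '-f', 'concat', '-i', 'sequence.txt',
    '-c:v', 'libx264', '-tune', 'stillimage', '-pix_fmt', 'yuv420p',
    '-y', '/tmp/beach.mp4'
  ]);
  // result holds process metadata, not video data; log stderr to debug encoding.
  console.log('ffmpeg exit status', result.status);
  if (result.stderr) console.log(result.stderr.toString());

  // Upload the file ffmpeg just wrote.
  const vidFile = readFileSync('/tmp/beach.mp4');
  return s3.putObject({
    Bucket: 'mys3bucket', // bucket name from the question
    Key: 'beach.mp4',
    ACL: 'private',
    Body: vidFile
  }).promise();
};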



-
Writing an MPEG video to S3 using the Lambda FFmpeg Layer
12 April 2020, by Gracie
I am using AWS Lambda with the FFmpeg layer to try to convert an existing local MP4 file (beach.mp4) - which is within my upload deployment zip to S3 - into an MPEG video and then write that file to S3.



I have used ffprobe, which works, so the FFmpeg layer is set up correctly.



I am writing a file to S3, but it is blank, only 15 bytes, so it doesn't contain the MPEG file created by FFmpeg.



I think this has something to do with syncing or streaming the video file so it can be written, but I'm not sure.



Here is my code, if anyone could help figure this out:



const util = require('util');
const AWS = require('aws-sdk');
const s3 = new AWS.S3();
const { readFileSync, writeFileSync, unlinkSync, writeFile, readdir } = require('fs');
//const fs = require('fs');
const path = require('path');
const { spawnSync } = require('child_process');

exports.handler = async (event, context, callback) => {

  const outputBucket = 'mys3bucket';

  const mpegcreate = await spawnSync(
    '/opt/bin/ffmpeg',
    [
      '-i',
      'beach.mp4',
      'beach.mpeg'
    ]
  );

  // Write ffmpeg output to a file in /tmp/
  const writeMPEGFile = util.promisify(writeFile);
  await writeMPEGFile('/tmp/beach.mpeg', mpegcreate, 'binary');
  console.log('MPEG content written to /tmp/');

  // Copy MPEG data to a variable to enable write to S3 Bucket
  let result = mpegcreate;
  console.log('MPEG Result contents ', result);

  const vidFile = readFileSync('/tmp/beach.mpeg');

  // Set S3 bucket details and put MPEG file into S3 bucket from /tmp/
  const s3 = new AWS.S3();
  const params = {
    Bucket: outputBucket,
    Key: 'beach.mpeg',
    ACL: 'private',
    Body: vidFile
  };

  // Put MPEG file from AWS Lambda function /tmp/ to an S3 bucket
  const s3Response = await s3.putObject(params).promise();
  callback(null, s3Response);

};
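The same caveat applies here as in the slideshow question above: the spawnSync return value describes the child process and never contains the MPEG data, and the current working directory is not writable in Lambda. A short sketch of the conversion step under that assumption, with ffmpeg writing straight to /tmp (paths taken from the question):

// Sketch only: write beach.mpeg directly into /tmp and inspect stderr on failure.
const out = spawnSync('/opt/bin/ffmpeg', ['-i', 'beach.mp4', '-y', '/tmp/beach.mpeg']);
if (out.status !== 0 && out.stderr) {
  console.log(out.stderr.toString()); // ffmpeg's diagnostic output
}
const vidFile = readFileSync('/tmp/beach.mpeg');
// ...then pass vidFile as Body to s3.putObject, as in the handler above.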