
Other articles (71)
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed MediaSPIP is at version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out. -
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...) -
From upload to the final video [standalone version]
31 January 2010
The path of an audio or video document through SPIPMotion is divided into three distinct stages.
Upload and retrieval of information about the source video
First, you need to create a SPIP article and attach the "source" video document to it.
When this document is attached to the article, two actions are carried out in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams; and generation of a thumbnail: extraction of a (...)
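(The teaser is cut off above, but the two extra actions it describes, probing the source file's streams and extracting a thumbnail frame, are the kind of steps usually delegated to ffmpeg/ffprobe. SPIPMotion itself is a SPIP plugin, so the Node.js/fluent-ffmpeg sketch below is purely illustrative of those two steps, not SPIPMotion's actual code; 'source.mp4' and the 1-second timestamp are placeholder assumptions.)

// Purely illustrative sketch of the two steps described above:
// probe the source file's streams, then extract one thumbnail frame.
// 'source.mp4' and the 1-second timestamp are placeholder assumptions.
var ffmpeg = require('fluent-ffmpeg');

// Step 1: retrieve the technical information of the audio and video streams.
ffmpeg.ffprobe('source.mp4', function (err, metadata) {
    if (err) {
        return console.error('ffprobe failed: ' + err.message);
    }
    metadata.streams.forEach(function (stream) {
        console.log(stream.codec_type + ': ' + stream.codec_name);
    });

    // Step 2: generate a thumbnail by extracting a single frame.
    ffmpeg('source.mp4')
        .on('end', function () {
            console.log('Thumbnail written.');
        })
        .screenshots({
            timestamps: [1],           // arbitrary choice: grab the frame at 1 second
            filename: 'thumbnail.png',
            folder: '.'
        });
});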
On other sites (12491)
-
ffmpeg -acodec not found but installed
16 December 2015, by nadermx
I am trying to convert a YouTube video URL to an mp3 via ffmpeg, so I can stream the response, on Ubuntu 14.04:
ffmpeg -i https://r7---sn-ab5l6ne7.googlevideo.com/videoplayback?mime=video%2Fmp4&requiressl=yes&signature=99EFD8B801C44EA1221B5D653B2EB30C4CE962C6.6CD123737E54A11B0E5B16BC7AA2572688B091D9&source=youtube&mn=sn-ab5l6ne7&upn=2c_sPjXh1FE&itag=22&pl=16&mt=1450291179&ms=au&expire=1450312883&mm=31&id=o-ABMSf6BaCXTeSmFM41vs85JJ2rmcdeD6CVriiGKVMDlG&sver=3&ratebypass=yes&ip=68.9.161.152&sparams=dur%2Cid%2Cinitcwndbps%2Cip%2Cipbits%2Citag%2Clmt%2Cmime%2Cmm%2Cmn%2Cms%2Cmv%2Cpl%2Cratebypass%2Crequiressl%2Csource%2Cupn%2Cexpire&key=yt6&lmt=1429526600478294&dur=6.385&fexp=9416126%2C9420452%2C9422596%2C9423459%2C9423662&mv=m&ipbits=0&initcwndbps=2147500 -acodec liblamemp3 -f mp3 shave.mp3
But when I run this command, I end up with this issue:
ffmpeg -i https://r7---sn-ab5l6ne7.googlevideo.com/videoplayback?mime=video%2Fmp4&requiressl=yes&signature=99EFD8B801C44EA1221B5D653B2EB30C4CE962C6.6CD123737E54A11B0E5B16BC7AA2572688B091D9&source=youtube&mn=sn-ab5l6ne7&upn=2c_sPjXh1FE&itag=22&pl=16&mt=1450291179&ms=au&expire=1450312883&mm=31&id=o-ABMSf6BaCXTeSmFM41vs85JJ2rmcdeD6CVriiGKVMDlG&sver=3&ratebypass=yes&ip=68.9.161.152&sparams=dur%2Cid%2Cinitcwndbps%2Cip%2Cipbits%2Citag%2Clmt%2Cmime%2Cmm%2Cmn%2Cms%2Cmv%2Cpl%2Cratebypass%2Crequiressl%2Csource%2Cupn%2Cexpire&key=yt6&lmt=1429526600478294&dur=6.385&fexp=9416126%2C9420452%2C9422596%2C9423459%2C9423662&mv=m&ipbits=0&initcwndbps=2147500 -acodec liblamemp3 -f mp3 shave.mp3
[19] 8925
[20] 8926
[21] 8927
[22] 8928
[23] 8929
[24] 8930
[25] 8931
[26] 8932
[27] 8933
[28] 8934
[29] 8935
[30] 8936
[31] 8937
[32] 8938
[33] 8939
[34] 8940
[35] 8941
[36] 8942
[37] 8943
[38] 8944
[39] 8945
[40] 8946
[41] 8947
ffmpeg version git-2015-12-10-3652dd5 Copyright (c) 2000-2015 the FFmpeg developers
built with gcc 4.8 (Ubuntu 4.8.4-2ubuntu1~14.04)
configuration: --enable-gpl --enable-libfaac --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-librtmp --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-openssl --enable-nonfree --enable-version3 --enable-gnutls
libavutil 55. 10.100 / 55. 10.100
libavcodec 57. 17.100 / 57. 17.100
libavformat 57. 19.100 / 57. 19.100
libavdevice 57. 0.100 / 57. 0.100
libavfilter 6. 20.100 / 6. 20.100
libswscale 4. 0.100 / 4. 0.100
libswresample 2. 0.101 / 2. 0.101
libpostproc 54. 0.100 / 54. 0.100
-acodec: command not found
[20] Done requiressl=yes
[21] Done signature=99EFD8B801C44EA1221B5D653B2EB30C4CE962C6.6CD123737E54A11B0E5B16BC7AA2572688B091D9
[22] Done source=youtube
[23] Done mn=sn-ab5l6ne7
[24] Done upn=2c_sPjXh1FE
[25] Done itag=22
[26] Done pl=16
[27] Done mt=1450291179
[28] Done ms=au
[29] Done expire=1450312883
[30] Done mm=31
[31] Done id=o-ABMSf6BaCXTeSmFM41vs85JJ2rmcdeD6CVriiGKVMDlG
[32] Done sver=3
[33] Done ratebypass=yes
[34] Done ip=68.9.161.152
[35] Done sparams=dur%2Cid%2Cinitcwndbps%2Cip%2Cipbits%2Citag%2Clmt%2Cmime%2Cmm%2Cmn%2Cms%2Cmv%2Cpl%2Cratebypass%2Crequiressl%2Csource%2Cupn%2Cexpire
[19]+ Stopped ffmpeg -i https://r7---sn-ab5l6ne7.googlevideo.com/videoplayback?mime=video%2Fmp4
[20] Done requiressl=yes
[21] Done signature=99EFD8B801C44EA1221B5D653B2EB30C4CE962C6.6CD123737E54A11B0E5B16BC7AA2572688B091D9
[22] Done source=youtube
[23] Done mn=sn-ab5l6ne7
[24] Done upn=2c_sPjXh1FE
[25] Done itag=22
[26] Done pl=16
[27] Done mt=1450291179
[28] Done ms=au
[29] Done expire=1450312883
[30] Done mm=31
[31] Done id=o-ABMSf6BaCXTeSmFM41vs85JJ2rmcdeD6CVriiGKVMDlG
[32] Done sver=3
[33] Done ratebypass=yes
[34] Done ip=68.9.161.152
[35] Done sparams=dur%2Cid%2Cinitcwndbps%2Cip%2Cipbits%2Citag%2Clmt%2Cmime%2Cmm%2Cmn%2Cms%2Cmv%2Cpl%2Cratebypass%2Crequiressl%2Csource%2Cupn%2Cexpire
[36] Done key=yt6
[37] Done lmt=1429526600478294
[38] Done dur=6.385
[39] Done fexp=9416126%2C9420452%2C9422596%2C9423459%2C9423662
[40] Done mv=m
[41] Done ipbits=0
Now I am not sure why it keeps spitting this out. I have also tried streaming the output with the '-' argument, still to no avail.
I have installed ffmpeg and all the dependencies; I used this bash script to compile ffmpeg. I have been trying for days to get this working and still cannot figure it out.
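From the output above, the most likely culprit is that the URL is not quoted: the shell splits the command at every & and runs each piece as a separate background job (hence the [19] 8925 style job numbers), and the final piece, initcwndbps=2147500 -acodec liblamemp3 -f mp3 shave.mp3, is parsed as a variable assignment followed by a command named -acodec, which is why bash reports "-acodec: command not found". Quoting the whole URL, and using the codec name libmp3lame rather than liblamemp3, should get the command through the shell. Since the goal is to stream the response from a program anyway, another way to avoid shell quoting altogether is to pass the URL as a single argument when spawning ffmpeg. The Node.js sketch below is only an illustration of that idea; the variable names are placeholders and the URL is elided.

// Illustration only: spawn ffmpeg directly so the shell never parses the URL.
// The URL is passed as one argv element, making the & characters harmless.
// videoUrl is a placeholder for the full signed googlevideo URL from the question.
var spawn = require('child_process').spawn;

var videoUrl = 'https://...'; // the full signed googlevideo URL goes here

var ffmpeg = spawn('ffmpeg', [
    '-i', videoUrl,            // ffmpeg fetches the HTTPS input itself
    '-acodec', 'libmp3lame',   // note: the codec is libmp3lame, not liblamemp3
    '-f', 'mp3',
    'shave.mp3'
]);

ffmpeg.stderr.on('data', function (data) {
    process.stderr.write(data);  // ffmpeg writes its log to stderr
});

ffmpeg.on('close', function (code) {
    console.log('ffmpeg exited with code ' + code);
});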
-
FFMpeg - how to encode a F4M manifest file to mp4
20 November 2015, by Roee
I'm working on a project where we want to encode Flash videos to mp4 (for example, we want to encode an f4v file) and live-stream them (which means we can't just download all the f4f files, encode them to a single mp4 and only then stream it; the encoding and streaming have to be done while the files are downloading).
If the format of the Flash video is flv, for example, FFMpeg can do what I've described without any problem: I just give the address of the flv file to FFMpeg, and it encodes and streams it as mp4.
But if the format of the Flash video is something more complicated, such as f4v (which gets downloaded as many f4f files), I don't have a single URL to give to FFMpeg as input; I have many addresses of f4f files. I don't even know how many f4f files the video has before playing it: it looks like the Flash player just fetches the next f4f file when needed.
I read online that there is a manifest file (its extension is f4m) that "describes" to the Flash player which f4f files it should download and play, and in what order.
My question is: if I have the URL of this f4m file, what should I do in order to encode all its f4f files to mp4?
I've tried to give FFMpeg just the f4m file as input, but it doesn't know what to do with it... I'd really appreciate any response that might help, as I've been working on this issue for a few days now and I still haven't found any answer...
Thanks,
Roee. -
How do you use Node.js to stream an MP4 file with ffmpeg?
2 November 2016, by LaserJesus
I've been trying to solve this problem for several days now and would really appreciate any help on the subject.
I'm able to successfully stream an mp4 audio file stored on a Node.js server using fluent-ffmpeg, by passing the location of the file as a string and transcoding it to mp3. If I create a file stream from the same file and pass that to fluent-ffmpeg instead, it works for an mp3 input file, but not for an mp4 file. In the case of the mp4 file no error is thrown and it claims the stream completed successfully, but nothing plays in the browser. I'm guessing this has to do with the metadata being stored at the end of an mp4 file, but I don't know how to code around this. This is the exact same file that works correctly when its location is passed to ffmpeg, rather than the stream. When I try to pass a stream of the mp4 file on s3, again no error is thrown, but nothing streams to the browser. This isn't surprising, as ffmpeg won't work with the local file as a stream, so expecting it to handle the stream from s3 is wishful thinking.
How can I stream the mp4 file from s3 without storing it locally as a file first? How do I get ffmpeg to do this without transcoding the file too? The following is the code I have at the moment, which isn't working. Note that it attempts to pass the s3 file as a stream to ffmpeg, and it's also transcoding it into an mp3, which I'd prefer not to do.
.get(function (req, res) {
    aws.s3(s3Bucket).getFile(s3Path, function (err, result) {
        if (err) {
            return next(err);
        }
        var proc = new ffmpeg(result)
            .withAudioCodec('libmp3lame')
            .format('mp3')
            .on('error', function (err, stdout, stderr) {
                console.log('an error happened: ' + err.message);
                console.log('ffmpeg stdout: ' + stdout);
                console.log('ffmpeg stderr: ' + stderr);
            })
            .on('end', function () {
                console.log('Processing finished !');
            })
            .on('progress', function (progress) {
                console.log('Processing: ' + progress.percent + '% done');
            })
            .pipe(res, {end: true});
    });
});

This is using the knox library when it calls aws.s3... I've also tried writing it using the standard aws sdk for Node.js, as shown below, but I get the same outcome as above.
var AWS = require('aws-sdk');
var s3 = new AWS.S3({
    accessKeyId: process.env.AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.AWS_SECRET_KEY,
    region: process.env.AWS_REGION_ID
});

var fileStream = s3.getObject({
    Bucket: s3Bucket,
    Key: s3Key
}).createReadStream();

var proc = new ffmpeg(fileStream)
    .withAudioCodec('libmp3lame')
    .format('mp3')
    .on('error', function (err, stdout, stderr) {
        console.log('an error happened: ' + err.message);
        console.log('ffmpeg stdout: ' + stdout);
        console.log('ffmpeg stderr: ' + stderr);
    })
    .on('end', function () {
        console.log('Processing finished !');
    })
    .on('progress', function (progress) {
        console.log('Processing: ' + progress.percent + '% done');
    })
    .pipe(res, {end: true});

=====================================
Updated
I placed an mp3 file in the same s3 bucket and the code I have here worked and was able to stream the file through to the browser without storing a local copy. So the streaming issues I face have something to do with the mp4/aac container/encoder format.
I'm still interested in a way to bring the m4a file down from s3 to the Node.js server in its entirety, then pass it to ffmpeg for streaming without actually storing the file in the local file system.
=====================================
Updated Again
I’ve managed to get the server streaming the file, as mp4, straight to the browser. This half answers my original question. My only issue now is that I have to download the file to a local store first, before I can stream it. I’d still like to find a way to stream from s3 without needing the temporary file.
aws.s3(s3Bucket).getFile(s3Path, function (err, result) {
    result.pipe(fs.createWriteStream(file_location));
    result.on('end', function () {
        console.log('File Downloaded!');
        var proc = new ffmpeg(file_location)
            .outputOptions(['-movflags isml+frag_keyframe'])
            .toFormat('mp4')
            .withAudioCodec('copy')
            .seekInput(offset)
            .on('error', function (err, stdout, stderr) {
                console.log('an error happened: ' + err.message);
                console.log('ffmpeg stdout: ' + stdout);
                console.log('ffmpeg stderr: ' + stderr);
            })
            .on('end', function () {
                console.log('Processing finished !');
            })
            .on('progress', function (progress) {
                console.log('Processing: ' + progress.percent + '% done');
            })
            .pipe(res, {end: true});
    });
});

On the receiving side I just have the following javascript in an empty html page:
window.AudioContext = window.AudioContext || window.webkitAudioContext;
context = new AudioContext();

function process(Data) {
    source = context.createBufferSource(); // Create Sound Source
    context.decodeAudioData(Data, function (buffer) {
        source.buffer = buffer;
        source.connect(context.destination);
        source.start(context.currentTime);
    });
};

function loadSound() {
    var request = new XMLHttpRequest();
    request.open("GET", "/stream/", true);
    request.responseType = "arraybuffer";
    request.onload = function () {
        var Data = request.response;
        process(Data);
    };
    request.send();
};

loadSound()

=====================================
The Answer
The code above under the title 'Updated Again' will stream an mp4 file from s3, via a Node.js server, to a browser without using Flash. It does require that the file be stored temporarily on the Node.js server, so that the metadata in the file can be moved from the end of the file to the front.
In order to stream without storing the temporary file, you need to actually modify the file on S3 first and make this metadata change. If you have changed the file in this way on S3, then you can modify the code under the title 'Updated Again' so that the result from S3 is piped straight into the ffmpeg constructor, rather than into a file stream on the Node.js server whose location is then handed to ffmpeg, as the code does now. You can change the final 'pipe' command to 'save(location)' to get a version of the mp4 file locally with the metadata moved to the front, then upload that new version of the file to S3 and try out the end-to-end streaming.
Personally, I'm now going to create a task that modifies the files in this way as they are uploaded to s3 in the first place. This allows me to record and stream in mp4 without transcoding or storing a temp file on the Node.js server.
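For reference, the metadata change described above (moving the moov atom to the front of the file) is what ffmpeg's -movflags +faststart output option does during a remux, without re-encoding; on the command line that is ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4. Below is a minimal sketch of such an upload-time task, written against the same fluent-ffmpeg API used in the code above; the file names are placeholders and the re-upload to S3 is not shown.

// Minimal sketch (not part of the original answer) of the "move the metadata to
// the front" task, using fluent-ffmpeg as in the code above. 'input.mp4' and
// 'faststart.mp4' are placeholder names; re-uploading the result to S3 is omitted.
var ffmpeg = require('fluent-ffmpeg');

new ffmpeg('input.mp4')
    .outputOptions(['-movflags +faststart']) // remux so the moov atom comes first
    .withAudioCodec('copy')                  // copy the streams, no transcoding
    .withVideoCodec('copy')
    .toFormat('mp4')
    .on('error', function (err) {
        console.log('an error happened: ' + err.message);
    })
    .on('end', function () {
        console.log('Remux finished, metadata is now at the front of the file.');
    })
    .save('faststart.mp4');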