
Other articles (28)
-
Submit bugs and patches
13 April 2011. Unfortunately, software is never perfect.
If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information: the browser you are using, including the exact version; as precise an explanation of the problem as possible; if possible, the steps taken that led to the problem; a link to the site/page in question.
If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
You may also (...)
-
Support for all media types
10 April 2011. Unlike many programs and other modern document-sharing platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are: images (png, gif, jpg, bmp and others); audio (MP3, Ogg, Wav and others); video (Avi, MP4, Ogv, mpg, mov, wmv and others); textual content, code or others (OpenOffice, Microsoft Office (spreadsheets, presentations), web (html, css), LaTeX, Google Earth) (...)
-
Supporting all media types
13 April 2011. Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)
On other sites (7378)
-
How do I independently fade in/out multiple (3+) overlay images over video using FFMPEG ?
3 March 2017, by blahblahber. Using an arbitrary source video, I would like to INDEPENDENTLY fade in/fade out a minimum of three .png overlays/watermarks at various times throughout the video. I'm having trouble getting the syntax right for the filter chain.
In the failed attempts below, I'm using four transparent .png images, all at 1920x1080, matching the size of the source input video. No scaling or positioning is needed, just the overlays fading in and out at the defined times.
I have the functionality working without fade, using 'enable', like so:
ffmpeg -i vid1.mp4 -loop 1 -i img1.png -i img2.png -i img3.png -i img4.png -filter_complex
"overlay=0:0:enable='between(t,8,11)' [tmp];
[tmp]overlay=0:0:enable='between(t,10,15)'[tmp1];
[tmp1]overlay=0:0:enable='between(t,15,138)'[tmp2];
[tmp2]overlay=0:0:enable='between(t,140,150)'"
-c:v libx264 -c:a copy
-flags +global_header -shortest -s 1920x1080 -y out.mp4
I just want the same control, using fade in/out.
The following almost works as I'd like, but I obviously don't want the entire output stream to fade out. I realize that adding the fade=out on lines 7, 8 and 9 fades the combined output (starting at line 7), but this is as close as I've come to seeing each overlay image actually fade. The fades defined on lines 3, 4 and 5 apparently don't affect anything, and that's where I defined them originally. When I copied them to the output stream, the fade worked on each overlay image, but again, I don't want it to affect the entire output stream, just the individual overlays.
ffmpeg -i vid1.mp4 -loop 1 -i img1.png -i img2.png -i img3.png -i img4.png -filter_complex
"[1:v]fade=out:st=3:d=1[watermark0];
[2:v]fade=out:st=4:d=1[watermark1];
[3:v]fade=out:st=5:d=1[watermark2];
[4:v]fade=out:st=6:d=1[watermark3];
[0:v][watermark0] overlay=0:0 [tmp0];
[tmp0][watermark1] overlay=0:0,fade=out:st=4:d=1 [tmp1];
[tmp1][watermark2] overlay=0:0,fade=out:st=6:d=1 [tmp2];
[tmp2][watermark3] overlay=0:0,fade=out:st=8:d=1 [out]" -map "[out]" -c:v libx264 -c:a copy
-flags +global_header -shortest -s 1920x1080 -y out.mp4
I've also tried 'split', with similar results to the above, but the fade only seems to work on the first image (this one uses fade in as well):
ffmpeg -i vid.mp4 -loop 1 -i img1.png -i img2.png -i img3.png -i img4.png -filter_complex
"[1:v]split=4[wm1][wm2][wm3][wm4];
[wm1]fade=in:st=1:d=1:alpha=1,fade=out:st=3:d=1:alpha=1[ovr1];
[wm2]fade=in:st=2:d=1:alpha=1,fade=out:st=4:d=1:alpha=1[ovr2];
[wm3]fade=in:st=3:d=1:alpha=1,fade=out:st=5:d=1:alpha=1[ovr3];
[wm4]fade=in:st=4:d=1:alpha=1,fade=out:st=6:d=1:alpha=1[ovr4];
[0:v][ovr1]overlay=0:0[base1];
[base1][ovr2]overlay=0:0[base2];
[base2][ovr3]overlay=0:0[base3];
[base3][ovr4]overlay=0:0[out]" -map "[out]"
-t 10 -c:v libx264 -c:a copy -flags +global_header -shortest -s 1920x1080 -y out.mp4
Any help is greatly appreciated! :)
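For what it's worth, the two attempts above can be combined into one sketch (untested here; it reuses the question's own inputs and timings): fade each image input independently with alpha=1, as in the third attempt, but feed the filters from each real input [1:v]..[4:v] rather than from a four-way split of [1:v], and chain the overlays as in the second attempt:

```
[1:v]fade=in:st=1:d=1:alpha=1,fade=out:st=3:d=1:alpha=1[ovr1];
[2:v]fade=in:st=2:d=1:alpha=1,fade=out:st=4:d=1:alpha=1[ovr2];
[3:v]fade=in:st=3:d=1:alpha=1,fade=out:st=5:d=1:alpha=1[ovr3];
[4:v]fade=in:st=4:d=1:alpha=1,fade=out:st=6:d=1:alpha=1[ovr4];
[0:v][ovr1]overlay=0:0[base1];
[base1][ovr2]overlay=0:0[base2];
[base2][ovr3]overlay=0:0[base3];
[base3][ovr4]overlay=0:0[out]
```

This filtergraph would be passed as the -filter_complex argument, with the rest of the command (inputs, -map "[out]", codec flags) unchanged.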
-
Creating a continuous stream for RTMP in Node.js
9 March 2023, by hankthetank27. I'm working on an app that downloads audio from YouTube to be streamed sequentially, similar to a radio broadcast. I'm having trouble getting the individual tracks to stream continuously. My idea was to write the tracks sequentially into a readable stream that is then read by FFMPEG. Here is the code I'm having trouble with...


import ytdl from "ytdl-core";
import ffmpeg from 'fluent-ffmpeg';
import { Readable } from "node:stream";

export async function startAudioStream(): Promise<void> {

  const mainStream = await createStreamedQueue();

  ffmpeg(mainStream)
    .inputOptions([
      '-re'
    ])
    .outputOptions([
      // '-c:v libx264',
      // '-preset veryfast',
      // '-tune zerolatency',
      '-c:a aac',
      '-ar 44100',
    ])
    .save('rtmp://localhost/live/main.flv');
};

async function createStreamedQueue(): Promise<Readable> {

  function createStream(): Readable {
    const stream = new Readable({
      read() {},
      highWaterMark: 1024 * 512,
    });
    stream._destroy = () => { stream.destroyed = true };
    return stream;
  }

  const mainStream = createStream();

  const ref1 = 'https://youtu.be/lLCEUpIg8rE';
  const ref2 = 'https://youtu.be/bRdyzdXJ0KA';

  function queueSong(src: string, stream: Readable): Promise<void> {
    return new Promise<void>((resolve, reject) => {
      ytdl(src, {
        filter: 'audioonly',
        quality: 'highestaudio'
      })
        .on('data', (data) => {
          stream.push(data);
        })
        .on('end', () => {
          resolve();
        })
        .on('error', (err) => {
          console.error('Error downloading file from YouTube.', err);
          reject(err);
        });
    });
  }

  await queueSong(ref1, mainStream);
  // console.log('after first: ', mainStream)
  await queueSong(ref2, mainStream);
  // console.log('after second: ', mainStream)
  return mainStream;
};


To start, startAudioStream is called to initiate a readable stream of audio to an RTMP server via FFMPEG. That is working fine. The part I'm having trouble with is "queuing" the tracks into the stream that's being fed into FFMPEG. Right now, I have a "main" stream that each song's data is pushed into, as you can see in queueSong. At the end of the ytdl stream, the promise is resolved, allowing the next song to be queued and its data to be pushed into mainStream. The issue I'm experiencing is that only the audio from ref1 is ever played.

I can see in the logs that mainStream does grow in length after each call to queueSong, but it still only streams the audio from the first track. My initial thought was that there is a terminating character at the end of the last data chunk that's written to the stream for each song? But maybe I'm getting mixed up on how streams work...