
Media (91)
-
Spoon - Revenge!
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
My Morning Jacket - One Big Holiday
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Zap Mama - Wadidyusay?
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
David Byrne - My Fair Lady
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Beastie Boys - Now Get Busy
15 September 2011
Updated: September 2011
Language: English
Type: Audio
-
Granite de l’Aber Ildut
9 September 2011
Updated: September 2011
Language: French
Type: Text
Other articles (25)
-
Contribute to translation
13 April 2011
You can help us improve the language used in the software interface to make MediaSPIP more accessible and user-friendly. You can also translate the interface into any language, allowing it to spread to new linguistic communities.
To do this, we use the SPIP translation interface, where all the language modules of MediaSPIP are available. Just subscribe to the mailing list and request further information on translation.
MediaSPIP is currently available in French and English (...)
-
Selection of projects using MediaSPIP
2 May 2011
The examples below are representative of specific uses of MediaSPIP for specific projects.
MediaSPIP farm @ Infini
The non-profit organization Infini develops hospitality activities, an internet access point, training, innovative projects in the field of information and communication technologies, and website hosting. It plays a unique and prominent role in the Brest (France) area and, at the national level, among the half-dozen such associations. Its members (...)
-
Use, discuss, criticize
13 April 2011
Talk to people directly involved in MediaSPIP’s development, or to people around you who could use MediaSPIP to share, enhance or develop their creative projects.
The bigger the community, the more MediaSPIP’s potential will be explored and the faster the software will evolve.
A discussion list is available for all exchanges between users.
On other sites (6851)
-
How do I independently fade in/out multiple (3+) overlay images over video using FFMPEG?
3 March 2017, by blahblahber
Using an arbitrary source video, I would like to INDEPENDENTLY fade in/fade out a minimum of three .png overlays/watermarks at various times throughout the video. I’m having trouble getting the syntax right for the filter chain.
In these failed attempts below, I’m using four transparent .png images all at 1920x1080 using the same sized source input video. No scaling/positioning needed, just the overlays fading in and out at the defined times.
I have the functionality working without fade, using ’enable’, like so:
ffmpeg -i vid1.mp4 -loop 1 -i img1.png -i img2.png -i img3.png -i img4.png -filter_complex
"overlay=0:0:enable='between(t,8,11)' [tmp];
[tmp]overlay=0:0:enable='between(t,10,15)'[tmp1];
[tmp1]overlay=0:0:enable='between(t,15,138)'[tmp2];
[tmp2]overlay=0:0:enable='between(t,140,150)'"
-c:v libx264 -c:a copy
-flags +global_header -shortest -s 1920x1080 -y out.mp4

I just want the same control, using fade in/out.
The following almost works as I’d like, but I obviously don’t want the entire output stream to fade out. I realize that adding fade=out to lines 7, 8 and 9 fades the combined output (starting at line 7), but this is the closest I’ve come to seeing each overlay image actually fade. The fades defined in lines 3, 4 and 5 apparently don’t affect anything, and that’s where I defined them originally. When I copied them to the output stream, the fade worked on each overlay image, but again, I don’t want it to affect the entire output stream, just the individual overlays.
ffmpeg -i vid1.mp4 -loop 1 -i img1.png -i img2.png -i img3.png -i img4.png -filter_complex
"[1:v]fade=out:st=3:d=1[watermark0];
[2:v]fade=out:st=4:d=1[watermark1];
[3:v]fade=out:st=5:d=1[watermark2];
[4:v]fade=out:st=6:d=1[watermark3];
[0:v][watermark0] overlay=0:0 [tmp0];
[tmp0][watermark1] overlay=0:0,fade=out:st=4:d=1 [tmp1];
[tmp1][watermark2] overlay=0:0,fade=out:st=6:d=1 [tmp2];
[tmp2][watermark3] overlay=0:0,fade=out:st=8:d=1 [out]" -map "[out]" -c:v libx264 -c:a copy
-flags +global_header -shortest -s 1920x1080 -y out.mp4

I’ve also tried ’split’ with similar results to the above, but the fade only seems to work on the first image (this one uses fade in as well):
ffmpeg -i vid.mp4 -loop 1 -i img1.png -i img2.png -i img3.png -i img4.png -filter_complex
"[1:v]split=4[wm1][wm2][wm3][wm4];
[wm1]fade=in:st=1:d=1:alpha=1,fade=out:st=3:d=1:alpha=1[ovr1];
[wm2]fade=in:st=2:d=1:alpha=1,fade=out:st=4:d=1:alpha=1[ovr2];
[wm3]fade=in:st=3:d=1:alpha=1,fade=out:st=5:d=1:alpha=1[ovr3];
[wm4]fade=in:st=4:d=1:alpha=1,fade=out:st=6:d=1:alpha=1[ovr4];
[0:v][ovr1]overlay=0:0[base1];
[base1][ovr2]overlay=0:0[base2];
[base2][ovr3]overlay=0:0[base3];
[base3][ovr4]overlay=0:0[out]" -map "[out]"
-t 10 -c:v libx264 -c:a copy -flags +global_header -shortest -s 1920x1080 -y out.mp4

Any help is greatly appreciated! :)
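For reference, one plausible direction, sketched below and not a verified command: in the last attempt above, `[1:v]split=4` makes four copies of img1.png only, so img2–img4 are never overlaid, and a `fade` without `alpha=1` darkens the PNG to black rather than making it transparent. Fading each PNG input separately with `alpha=1` before its own overlay should keep each fade independent of the base video. File names and timings are taken from the attempts above.

```shell
# Sketch: fade each overlay input's alpha channel independently, then
# chain the overlays. alpha=1 restricts the fade to transparency, so the
# composited video underneath is unaffected.
ffmpeg -i vid1.mp4 -loop 1 -i img1.png -loop 1 -i img2.png \
  -loop 1 -i img3.png -loop 1 -i img4.png -filter_complex \
"[1:v]fade=in:st=1:d=1:alpha=1,fade=out:st=3:d=1:alpha=1[ovr1]; \
 [2:v]fade=in:st=2:d=1:alpha=1,fade=out:st=4:d=1:alpha=1[ovr2]; \
 [3:v]fade=in:st=3:d=1:alpha=1,fade=out:st=5:d=1:alpha=1[ovr3]; \
 [4:v]fade=in:st=4:d=1:alpha=1,fade=out:st=6:d=1:alpha=1[ovr4]; \
 [0:v][ovr1]overlay=0:0[base1]; \
 [base1][ovr2]overlay=0:0[base2]; \
 [base2][ovr3]overlay=0:0[base3]; \
 [base3][ovr4]overlay=0:0[out]" \
-map "[out]" -c:v libx264 -c:a copy -shortest -y out.mp4
```

The `-loop 1` before each image input keeps every overlay available for the whole timeline, so each fade-in/fade-out window is governed solely by its own `st`/`d` values.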
-
Creating a continuous stream for RTMP in Node.js
9 March 2023, by hankthetank27
I'm working on an app that downloads audio from YouTube to be streamed sequentially, similar to a radio broadcast. I'm having trouble getting the individual tracks to stream continuously. My idea was to write the tracks sequentially into a readable stream that is then read by FFMPEG. Here is the code I'm having trouble with...


import ytdl from "ytdl-core";
import ffmpeg from "fluent-ffmpeg";
import { Readable } from "node:stream";

export async function startAudioStream(): Promise<void> {

  const mainStream = await createStreamedQueue();

  ffmpeg(mainStream)
    .inputOptions([
      '-re'
    ])
    .outputOption([
      // '-c:v libx264',
      // '-preset veryfast',
      // '-tune zerolatency',
      '-c:a aac',
      '-ar 44100',
    ])
    .save('rtmp://localhost/live/main.flv');
};

async function createStreamedQueue(): Promise<Readable> {

  function createStream(): Readable {
    const stream = new Readable({
      read(){},
      highWaterMark: 1024 * 512,
    });
    stream._destroy = () => { stream.destroyed = true };
    return stream;
  }

  const mainStream = createStream();

  const ref1 = 'https://youtu.be/lLCEUpIg8rE';
  const ref2 = 'https://youtu.be/bRdyzdXJ0KA';

  function queueSong(src: string, stream: Readable): Promise<void> {
    return new Promise<void>((resolve, reject) => {
      ytdl(src, {
        filter: 'audioonly',
        quality: 'highestaudio'
      })
        .on('data', (data) => {
          stream.push(data);
        })
        .on('end', () => {
          resolve();
        })
        .on('error', (err) => {
          console.error('Error downloading file from YouTube.', err);
          reject(err);
        });
    });
  }

  await queueSong(ref1, mainStream);
  // console.log('after first: ', mainStream)
  await queueSong(ref2, mainStream);
  // console.log('after second: ', mainStream)
  return mainStream;
};


To start, startAudioStream is called to initiate a readable stream of audio to an RTMP server via FFMPEG. That is working fine. The part I'm having trouble with is "queuing" the tracks into the stream that's being fed into FFMPEG. Right now, I have a "main" stream that each song's data is being pushed into, as you can see in queueSong. At the end of the ytdl stream, the promise is resolved, allowing the next song to be queued and its data to be pushed into mainStream. The issue I'm experiencing is that only the audio from ref1 is ever played.

I can see in the logs that mainStream does grow in length after each call to queueSong, but it still only streams the audio from the first track. My initial thought was that there is a terminating character at the end of the last data chunk that's being written to the stream for each song? But maybe I'm misunderstanding how streams work...