
Other articles (93)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011, by

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
    Once it is activated, a preconfiguration is set up automatically by MediaSPIP init so that the new feature is immediately operational. A configuration step is therefore not required for this.

  • Customizing by adding your logo, banner or background image

    5 September 2013, by

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

On other sites (4761)

  • Creating a continuous stream for RTMP in Node.js

    9 March 2023, by hankthetank27

    I'm working on an app that downloads audio from YouTube to be streamed sequentially, similar to a radio broadcast. I'm having trouble getting the individual tracks to stream continuously. My idea was to write the tracks sequentially into a readable stream that is then read by FFmpeg. Here is the code I'm having trouble with...

    


    import ytdl from "ytdl-core";
    import ffmpeg from 'fluent-ffmpeg';
    import { Readable } from "node:stream"

    export async function startAudioStream(): Promise<void>{

      const mainStream = await createStreamedQueue()

      ffmpeg(mainStream)
        .inputOptions([
          '-re'
        ])
        .outputOption([
          // '-c:v libx264',
          // '-preset veryfast',
          // '-tune zerolatency',
          '-c:a aac',
          '-ar 44100',
        ])
        .save('rtmp://localhost/live/main.flv');
    };

    async function createStreamedQueue(): Promise<Readable>{

      function createStream(): Readable{
        const stream = new Readable({
          read(){},
          highWaterMark: 1024 * 512,
        });
        stream._destroy = () => { stream.destroyed = true };
        return stream;
      }

      const mainStream = createStream();

      const ref1 = 'https://youtu.be/lLCEUpIg8rE'
      const ref2 = 'https://youtu.be/bRdyzdXJ0KA';

      function queueSong(src: string, stream: Readable): Promise<void>{
        return new Promise<void>((resolve, reject) => {
          ytdl(src, {
              filter: 'audioonly',
              quality: 'highestaudio'
            })
            .on('data', (data) => {
              stream.push(data);
            })
            .on('end', () => {
              resolve();
            })
            .on('error', (err) => {
              console.error('Error downloading file from YouTube.', err);
              reject(err);
            })
        })
      }

      await queueSong(ref1, mainStream);
      // console.log('after firsrt: ', mainStream)
      await queueSong(ref2, mainStream);
      // console.log('after second: ', mainStream)
      return mainStream;
    };


    To start, startAudioStream is called to initiate a readable stream of audio to an RTMP server via FFmpeg. That part is working fine. The part I'm having trouble with is "queuing" the tracks into the stream that's being fed into FFmpeg. Right now, I have a "main" stream that each song's data is pushed into, as you can see in queueSong. At the end of the ytdl stream, the promise is resolved, allowing the next song to be queued and its data to be pushed into mainStream. The issue I'm experiencing is that only the audio from ref1 is ever played.


    I can see in the logs that mainStream does grow in length after each call to queueSong, but it still only streams the audio from the first track. My initial thought was that there is a terminating character at the end of the last data chunk that's being written to the stream for each song? But maybe I'm getting mixed up on how streams work...
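    One possible explanation, not confirmed by the post: each ytdl download is a complete container file, so the bytes pushed after the first track begin with a second set of container headers that the downstream FFmpeg reader never parses. A minimal sketch of one workaround, assuming the same ytdl-core and fluent-ffmpeg packages already used above (the helper name queueSongAsMp3 and the choice of MP3 are illustrative, and an FFmpeg build with libmp3lame is assumed), is to re-encode every track into a self-delimiting audio format before pushing it, so the concatenated bytes form one continuous elementary stream:

    import ytdl from "ytdl-core";
    import ffmpeg from "fluent-ffmpeg";
    import { PassThrough, Readable } from "node:stream";

    // Sketch only: re-encode a single track to MP3 and push the resulting
    // frames into the shared stream, so consecutive tracks concatenate
    // without a second set of container headers between them.
    function queueSongAsMp3(src: string, out: Readable): Promise<void> {
      return new Promise<void>((resolve, reject) => {
        const mp3 = new PassThrough();
        ffmpeg(ytdl(src, { filter: "audioonly", quality: "highestaudio" }))
          .noVideo()
          .audioCodec("libmp3lame") // assumes libmp3lame is present in the local FFmpeg build
          .format("mp3")
          .on("error", reject)
          .pipe(mp3);
        mp3.on("data", (chunk) => out.push(chunk)); // forward re-encoded bytes into the shared stream
        mp3.on("end", () => resolve());
        mp3.on("error", reject);
      });
    }

    If this guess about container headers is right, swapping queueSong for a helper along these lines would let the main FFmpeg command read one uninterrupted audio stream; whether it actually removes the symptom described above would still need to be verified.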


  • How do I independently fade in/out multiple (3+) overlay images over video using FFMPEG?

    9 January 2019, by blahblahber

    Using an arbitrary source video, I would like to INDEPENDENTLY fade in/fade out a minimum of three .png overlays/watermarks at various times throughout the video. I’m having trouble getting the syntax right for the filter chain.

    In the failed attempts below, I'm using four transparent .png images, all at 1920x1080, over a source video of the same size. No scaling/positioning is needed, just the overlays fading in and out at the defined times.

    I have the functionality working without fades, using 'enable', like so:

    ffmpeg -i vid1.mp4 -loop 1 -i img1.png -i img2.png -i img3.png -i img4.png -filter_complex
    "overlay=0:0:enable='between(t,8,11)' [tmp];
    [tmp]overlay=0:0:enable='between(t,10,15)'[tmp1];
    [tmp1]overlay=0:0:enable='between(t,15,138)'[tmp2];
    [tmp2]overlay=0:0:enable='between(t,140,150)'"
    -c:v libx264 -c:a copy
    -flags +global_header -shortest -s 1920x1080 -y out.mp4

    I just want the same control, using fade in/out.

    The following almost works as I'd like, but I obviously don't want the entire output stream to fade out. I realize that adding fade=out on lines 7, 8 and 9 fades the combined output (starting at line 7), but this is as close as I've come to seeing each overlay image actually fade. The fades defined on lines 3, 4 and 5 apparently don't affect anything, and that's where I defined them originally. When I copied them to the output stream, the fade works on each overlay image, but again I don't want it to affect the entire output stream, just the individual overlays.

    ffmpeg -i vid1.mp4 -loop 1 -i img1.png -i img2.png -i img3.png -i img4.png -filter_complex
       "[1:v]fade=out:st=3:d=1[watermark0];
       [2:v]fade=out:st=4:d=1[watermark1];
       [3:v]fade=out:st=5:d=1[watermark2];
       [4:v]fade=out:st=6:d=1[watermark3];
       [0:v][watermark0] overlay=0:0 [tmp0];
       [tmp0][watermark1] overlay=0:0,fade=out:st=4:d=1 [tmp1];
       [tmp1][watermark2] overlay=0:0,fade=out:st=6:d=1 [tmp2];
       [tmp2][watermark3] overlay=0:0,fade=out:st=8:d=1 [out]" -map "[out]" -c:v libx264 -c:a copy
       -flags +global_header -shortest -s 1920x1080 -y out.mp4

    I've also tried 'split', with similar results to the above, but the fade only seems to work on the first image (this one uses fade in as well):

    ffmpeg -i vid.mp4 -loop 1 -i img1.png -i img2.png -i img3.png -i img4.png -filter_complex
    "[1:v]split=4[wm1][wm2][wm3][wm4];
    [wm1]fade=in:st=1:d=1:alpha=1,fade=out:st=3:d=1:alpha=1[ovr1];
    [wm2]fade=in:st=2:d=1:alpha=1,fade=out:st=4:d=1:alpha=1[ovr2];
    [wm3]fade=in:st=3:d=1:alpha=1,fade=out:st=5:d=1:alpha=1[ovr3];
    [wm4]fade=in:st=4:d=1:alpha=1,fade=out:st=6:d=1:alpha=1[ovr4];
    [0:v][ovr1]overlay=0:0[base1];
    [base1][ovr2]overlay=0:0[base2];
    [base2][ovr3]overlay=0:0[base3];
    [base3][ovr4]overlay=0:0[out]" -map "[out]"
    -t 10 -c:v libx264 -c:a copy -flags +global_header -shortest -s 1920x1080 -y out.mp4

    Any help is greatly appreciated! :)
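    For reference, a sketch of one commonly suggested variant (not verified against this footage; the timings are placeholders): fade each overlay input's alpha channel before its own overlay step, with -loop 1 repeated before every image input so each watermark persists, so that a fade only ever touches its own overlay and never the composited stream. The format=rgba step is there because fade's alpha=1 option only has an effect when the input actually carries an alpha plane.

    ffmpeg -i vid1.mp4 -loop 1 -i img1.png -loop 1 -i img2.png -loop 1 -i img3.png -loop 1 -i img4.png -filter_complex
    "[1:v]format=rgba,fade=in:st=1:d=1:alpha=1,fade=out:st=3:d=1:alpha=1[ovr1];
    [2:v]format=rgba,fade=in:st=2:d=1:alpha=1,fade=out:st=4:d=1:alpha=1[ovr2];
    [3:v]format=rgba,fade=in:st=3:d=1:alpha=1,fade=out:st=5:d=1:alpha=1[ovr3];
    [4:v]format=rgba,fade=in:st=4:d=1:alpha=1,fade=out:st=6:d=1:alpha=1[ovr4];
    [0:v][ovr1]overlay=0:0[base1];
    [base1][ovr2]overlay=0:0[base2];
    [base2][ovr3]overlay=0:0[base3];
    [base3][ovr4]overlay=0:0[out]" -map "[out]"
    -t 10 -c:v libx264 -c:a copy -flags +global_header -shortest -s 1920x1080 -y out.mp4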
