Advanced search

Media (1)


Other articles (31)

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • From upload to the final video [standalone version]

    31 January 2010

    The path of an audio or video document through SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information about the source video
    First, a SPIP article must be created and the "source" video document attached to it.
    When this document is attached to the article, two actions beyond the normal behaviour are carried out: the technical information about the file's audio and video streams is retrieved; a thumbnail is generated: extraction of a (...)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used as a fallback.
    The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
    These technologies make it possible to deliver video and sound both to conventional computers (...)

On other websites (6390)

  • How can I stream, through ffmpeg, a canvas generated in Node.js to YouTube or any other RTMP server?

    10 October 2020, by DDC

    I wanted to generate some images in Node.js, compile them into a video and stream them to YouTube. To generate the images I'm using the node-canvas module. This sounds simple enough, but I wanted to generate the images continuously and stream the result in real time. I'm very new to this whole thing, and what I was thinking about doing, after reading a bunch of resources on the internet, was:

    1. Open ffmpeg with spawn('ffmpeg', ...args), setting the output to the destination RTMP server
    2. Generate the image in the canvas
    3. Convert the content of the canvas to a buffer, and write it to the ffmpeg process through stdin
    4. Enjoy the result on YouTube

    But it's not as simple as that, is it? I saw people sharing their code involving client-side JS running in the browser, but I wanted it to be a Node app so that I could run it from a remote VPS.
Is there a way for me to do this without using something like p5 in my browser and capturing the window to restream it?
Is my thought process even remotely adequate? For now I don't really care about performance/resource usage. Thanks in advance.

    EDIT:

    I worked on it for a bit, and I couldn't get it to work...
This is my code:

    const { spawn } = require('child_process');
const { createCanvas } = require('canvas');

const canvas = createCanvas(1920, 1080);
const ctx = canvas.getContext('2d');

// ffmpeg reads PNG-encoded frames from stdin and pushes FLV to the RTMP server
const ffmpeg = spawn("ffmpeg",
    ["-re", "-f", "png_pipe", "-vcodec", "png", "-i", "pipe:0", "-vcodec", "h264", "-re", "-f", "flv", "rtmp://a.rtmp.youtube.com/live2/key-i-think"],
    { stdio: 'pipe' });

const randomColor = (depth) => Math.floor(Math.random() * depth);
const random = (min, max) => (Math.random() * (max - min)) + min;

let i = 0;
let drawSomething = function () {
    // draw a random coloured line on the canvas
    ctx.strokeStyle = `rgb(${randomColor(255)}, ${randomColor(255)}, ${randomColor(255)})`;
    let x1 = random(0, canvas.width);
    let x2 = random(0, canvas.width);
    let y1 = random(0, canvas.height);
    let y2 = random(0, canvas.height);
    ctx.moveTo(x1, y1);
    ctx.lineTo(x2, y2);
    ctx.stroke();

    // encode the canvas as a PNG and write it to ffmpeg's stdin
    let out = canvas.toBuffer();
    ffmpeg.stdin.write(out);
    i++;
    if (i >= 30) {
        // stop after 30 frames
        ffmpeg.stdin.end();
        clearInterval(int);
    }
};

drawSomething();
let int = setInterval(drawSomething, 1000);

    I'm not getting any errors, but I'm not getting any video data from it either. I have set up an RTMP server that I can connect to, and then get the stream with VLC, but I don't get any video data. Am I doing something wrong? I looked around for a while, and I can't seem to find anyone who has tried this, so I don't really have a clue...

    EDIT 2:
Apparently I was on the right track, but my approach only gave me about 2 seconds of "good" video before it started becoming blocky and messy. I think that, most likely, my method of generating images is just too slow. I'll try to use some GPU-accelerated code to generate the images instead of using the canvas, which means I'll be doing fractals all the time, since I don't know how to do anything else with that. Also, a bigger buffer in ffmpeg might help.
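A possible next step for the approach above (a sketch of mine, not code from the original post): instead of piping PNG-encoded frames with no declared frame rate, feed ffmpeg raw frames and declare the geometry, pixel format and frame rate up front, so it does not have to decode PNGs or guess the timing. The helper below only builds the argument list; the stream URL is a placeholder, and the BGRA pixel order (what node-canvas's toBuffer('raw') produces, to the best of my knowledge) is an assumption to verify.

```javascript
// Sketch: build ffmpeg arguments for a raw-frame pipe at a fixed frame rate.
// The codec/preset choices are assumptions, not from the original post.
function buildRawVideoArgs(width, height, fps, rtmpUrl) {
  return [
    // Input: uncompressed frames arrive on stdin, so the geometry,
    // pixel format and frame rate must be declared explicitly.
    "-f", "rawvideo",
    "-pix_fmt", "bgra",                 // node-canvas toBuffer('raw') order (verify)
    "-video_size", `${width}x${height}`,
    "-framerate", String(fps),
    "-i", "pipe:0",
    // Output: H.264 in an FLV container, the usual pairing for RTMP.
    "-c:v", "libx264",
    "-preset", "veryfast",
    "-pix_fmt", "yuv420p",              // widely supported output pixel format
    "-f", "flv",
    rtmpUrl,
  ];
}

// Example: the stream key is a placeholder.
const args = buildRawVideoArgs(1920, 1080, 30, "rtmp://a.rtmp.youtube.com/live2/STREAM-KEY");
console.log(args.join(" "));
```

With arguments like these, each tick would write one canvas.toBuffer('raw') frame to ffmpeg's stdin; the generator then has to keep up with the declared frame rate, which matches the suspicion in EDIT 2 that slow image generation causes the blockiness.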

  • Audio/Video de-synchronisation when playing a video on Chrome

    30 November 2020, by Sonia Seddiki

    I've been recently working on a project where I try to play a "custom-made" video in an HTML5 player. By custom-made, I mean I concatenate a bunch of videos together using the FFmpeg concat demuxer, each of them having the same properties (FPS, bitrate, resolution, timebase, etc.).

    Now I'm having a few issues with audio/video synchronisation, with a twist: it does not happen in every video player. The video is perfectly synchronised when played in Firefox, but not in Chrome. It is synchronised when played in a "local" video player like VLC.

    I assume it has to do with how the video data is presented to the player. I read a little about PTS, DTS and I-P-B frames, and I guess the final output may be a little messed up? But I don't really have a strong lead to follow here.

    I tried to find info on how the HTML5 player is implemented by both browsers, but couldn't find much (again, I'm probably not googling this right). Does anyone here know a bit more about the technical aspects of how a video is actually played in a browser? Or have any clue as to why this de-synchronisation doesn't happen on every platform?
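For reference, a small sketch (my addition, not from the question) of the list file the FFmpeg concat demuxer reads. Re-running the concatenation against such a list with re-encoding instead of stream copy regenerates the timestamps, which is one way to test whether mismatched PTS values are behind the desynchronisation; the file names below are placeholders.

```javascript
// Sketch: build the text for an FFmpeg concat demuxer list file.
// Single quotes inside file names are escaped as '\'' per the demuxer's rules.
function buildConcatList(files) {
  return files
    .map((name) => `file '${name.replace(/'/g, `'\\''`)}'`)
    .join("\n") + "\n";
}

const list = buildConcatList(["intro.mp4", "main.mp4", "outro.mp4"]);
console.log(list);
// file 'intro.mp4'
// file 'main.mp4'
// file 'outro.mp4'
```

Consuming this list with explicit re-encoding (e.g. -c:v libx264 -c:a aac rather than -c copy) is only a diagnostic guess on my part, but if the re-encoded output plays in sync in Chrome, the original timestamps were the problem.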

    Thank you so much for your help!

  • RTMP stream error with ffmpeg desktop capture in Electron JS

    9 September 2020, by Berk Coşar

    I'm making a program that captures the desktop and converts it to an RTMP stream.

    The working logic of the program:

    • Users will click the stream button in the Electron JS window.

    • The program will run ffmpeg shell code on the backend that captures the desktop as an RTMP stream.

      My ffmpeg command to run for macOS Catalina on the backend:

      ffmpeg -f avfoundation -i 1:0 -pixel_format nv12 -vcodec libx264 -preset ultrafast -pix_fmt yuv422p -s 1280x800 -r 30 -b:v 1500k -bufsize 1500k -maxrate 7000k -c:a aac -b:a 128k -ac 2 -ar 44100 -f flv rtmp://192.168.0.109/live

    • And it will play the stream in VLC.

    Okay, this ffmpeg command runs successfully in the terminal, but fails under Electron JS.

    The error:

    stream.js:172 Error: Command failed: /Users/macbook/Pictures/relteco/ffmpeg.sh
ffmpeg version 4.3.1 Copyright (c) 2000-2020 the FFmpeg developers
built with Apple clang version 11.0.3 (clang-1103.0.32.62)
configuration: --prefix=/usr/local/Cellar/ffmpeg/4.3.1 --enable-shared --enable-pthreads --enable-version3 --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libbluray --enable-libdav1d --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librtmp --enable-libspeex --enable-libsoxr --enable-videotoolbox --disable-libjack --disable-indev=jack
libavutil      56. 51.100 / 56. 51.100
libavcodec     58. 91.100 / 58. 91.100
libavformat    58. 45.100 / 58. 45.100
libavdevice    58. 10.100 / 58. 10.100
libavfilter     7. 85.100 /  7. 85.100
libavresample   4.  0.  0 /  4.  0.  0
libswscale      5.  7.100 /  5.  7.100
libswresample   3.  7.100 /  3.  7.100
libpostproc    55.  7.100 / 55.  7.100
[AVFoundation indev @ 0x7fd3a8e0ce80] Configuration of video device failed, falling back to default.
[avfoundation @ 0x7fd3a900ea00] Selected pixel format (yuv420p) is not supported by the input device.
[avfoundation @ 0x7fd3a900ea00] Supported pixel formats:
[avfoundation @ 0x7fd3a900ea00]   uyvy422
[avfoundation @ 0x7fd3a900ea00]   yuyv422
[avfoundation @ 0x7fd3a900ea00]   nv12
[avfoundation @ 0x7fd3a900ea00]   0rgb
[avfoundation @ 0x7fd3a900ea00]   bgr0
[avfoundation @ 0x7fd3a900ea00] Overriding selected pixel format to use uyvy422 instead.
/Users/macbook/Pictures/relteco/ffmpeg.sh: line 3: 10253 Abort trap: 6  ffmpeg -f avfoundation -i 1:0 -vcodec libx264 -preset ultrafast -pix_fmt yuv422p -s 1280x800 -r 30 -b:v 1500k -bufsize 1500k -maxrate 7000k -c:a aac -b:a 128k -ac 2 -ar 44100 -f flv rtmp://192.168.0.109/live

    at ChildProcess.exithandler (child_process.js:312)
    at ChildProcess.emit (events.js:310)
    at maybeClose (internal/child_process.js:1021)
    at Socket.<anonymous> (internal/child_process.js:443)
    at Socket.emit (events.js:310)
    at Pipe.<anonymous> (net.js:672)

    When the command runs in the terminal:

    (screenshot: the command running successfully in the terminal)


    The JavaScript libraries I tried:

    • child_process
      • exec
      • execFile
      • spawn
    • shelljs
    • fluent-ffmpeg

    Finally, I wrote this ffmpeg command into an .sh file, saved it, and tried to run ffmpeg.sh with these libraries, but they all gave the same error.

    My idea: yuv420p caught my attention, so I searched for it and learned that yuv420p is not an error; it is an ffmpeg warning. Is that why JS sees the warning as an error and sends me the stdout response back as stderr?


    Thanks.