Advanced search

Media (2)


Other articles (82)

  • User profiles

    12 April 2011, by

    Each user has a profile page where they can edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialised, visible only when the visitor is logged in to the site.
    The user can reach the profile editor from their author page; a "Modifier votre profil" (edit your profile) link in the navigation is (...)

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • Configuring language support

    15 November 2010, by

    Accessing the configuration and adding supported languages
    To configure support for new languages, you need to go to the "Administrer" (administration) section of the site.
    From there, in the navigation menu, you can reach a "Gestion des langues" (language management) section that lets you enable support for new languages.
    Each newly added language can still be disabled as long as no object has been created in that language; once one has, the language becomes greyed out in the configuration and (...)

On other sites (7901)

  • How to add watermark with delay in FFmpeg

    28 December 2022, by Мохамед Русланович

    I am trying to place a GIF file in each corner of the video, so I am using this command:

    ffmpeg  -y -i film.mp4 -stream_loop -1 -i gif.gif -filter_complex \ 
"[1]colorchannelmixer=aa=0.8,scale=iw*1:-1[a];[0][a]overlay=
x='if(lt(mod(t\,16)\,8)\,W-w-W*10/200\,W*10/100)':
y='if(lt(mod(t+4\,16)\,8)\,H-h-H*5/200\,H*5/200)':
shortest=1" 
 -acodec copy output_task_3.mp4


    


    But now I am required to place the GIF at each corner with a delay of 10 minutes between appearances.

    


    For example:

    


    1- Render the GIF at the top left for 16 seconds, then wait 10 minutes.

    


    2- Render the GIF at the top right for 16 seconds, then wait 10 minutes.

    


    3- Render the GIF at the bottom right for 16 seconds, then wait 10 minutes.

    


    4- Render the GIF at the bottom left for 16 seconds, then wait 10 minutes.

    


    This cycle should repeat until the movie ends.

    


    How can I achieve this?
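
    The timing being asked for can first be sketched as plain arithmetic (this is an assumption about the intended schedule, and active_corner is a hypothetical helper, not part of any ffmpeg API): each corner gets a 16-second slot followed by a 600-second pause, so the whole pattern repeats every 4 × 616 = 2464 seconds. The same arithmetic could then be expressed with the overlay filter's timeline option, e.g. enabling the first corner's overlay with something like enable='lt(mod(t,2464),16)' and offsetting the other three overlays by 616 s each.

    ```python
    # Sketch of the requested schedule (an assumption about the intended
    # timing, not a tested ffmpeg solution): each corner shows the GIF for
    # 16 s, then pauses 10 min (600 s), cycling through all four corners.

    SHOW = 16           # seconds the GIF is visible in each corner
    WAIT = 600          # 10-minute pause after each appearance
    SLOT = SHOW + WAIT  # 616 s per corner
    CYCLE = 4 * SLOT    # 2464 s for a full tour of the four corners

    CORNERS = ["top-left", "top-right", "bottom-right", "bottom-left"]

    def active_corner(t):
        """Return the corner showing the GIF at time t (seconds), or None during a pause."""
        slot, within = divmod(t % CYCLE, SLOT)
        return CORNERS[int(slot)] if within < SHOW else None
    ```

    For example, active_corner(0) is "top-left" and active_corner(616) is "top-right"; during each 10-minute pause it returns None.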

    


  • JSmpeg is not playing audio from websocket stream

    5 June 2023, by Nik

    I am trying to stream RTSP to a web browser using ffmpeg through a WebSocket relay written in Node.js, taken from https://github.com/phoboslab/jsmpeg. On the browser side I am using JSMpeg to display the RTSP stream. The video plays fine, but the audio does not.

    


    The ffmpeg command:

    


    ffmpeg -rtsp_transport tcp -i rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mp4 
       -f mpegts -c:v mpeg1video -c:a mp2 http://127.0.0.1:8081/stream_from_ffmpeg/


    


    The Node.js WebSocket relay:

    


    // Use the websocket-relay to serve a raw MPEG-TS over WebSockets. You can use
    // ffmpeg to feed the relay. ffmpeg -> websocket-relay -> browser
    // Example:
    // node websocket-relay yoursecret 8081 8082
    // ffmpeg -i <some input> -f mpegts http://localhost:8081/yoursecret

    var fs = require('fs'),
        http = require('http'),
        WebSocket = require('ws');

    if (process.argv.length < 3) {
        console.log(
            'Usage: \n' +
            'node websocket-relay.js <secret> [<stream-port> <websocket-port>]'
        );
        process.exit();
    }

    var STREAM_SECRET = process.argv[2],
        STREAM_PORT = process.argv[3] || 8081,
        WEBSOCKET_PORT = process.argv[4] || 8082,
        RECORD_STREAM = false;

    // Websocket Server
    var socketServer = new WebSocket.Server({port: WEBSOCKET_PORT, perMessageDeflate: false});
    socketServer.connectionCount = 0;
    socketServer.on('connection', function(socket, upgradeReq) {
        socketServer.connectionCount++;
        console.log(
            'New WebSocket Connection: ',
            (upgradeReq || socket.upgradeReq).socket.remoteAddress,
            (upgradeReq || socket.upgradeReq).headers['user-agent'],
            '('+socketServer.connectionCount+' total)'
        );
        socket.on('close', function(code, message){
            socketServer.connectionCount--;
            console.log(
                'Disconnected WebSocket ('+socketServer.connectionCount+' total)'
            );
        });
    });
    socketServer.broadcast = function(data) {
        socketServer.clients.forEach(function each(client) {
            if (client.readyState === WebSocket.OPEN) {
                client.send(data);
            }
        });
    };

    // HTTP Server to accept incoming MPEG-TS Stream from ffmpeg
    var streamServer = http.createServer( function(request, response) {
        var params = request.url.substr(1).split('/');

        if (params[0] !== STREAM_SECRET) {
            console.log(
                'Failed Stream Connection: '+ request.socket.remoteAddress + ':' +
                request.socket.remotePort + ' - wrong secret.'
            );
            response.end();
        }

        response.connection.setTimeout(0);
        console.log(
            'Stream Connected: ' +
            request.socket.remoteAddress + ':' +
            request.socket.remotePort
        );
        request.on('data', function(data){
            socketServer.broadcast(data);
            if (request.socket.recording) {
                request.socket.recording.write(data);
            }
        });
        request.on('end',function(){
            console.log('close');
            if (request.socket.recording) {
                request.socket.recording.close();
            }
        });

        // Record the stream to a local file?
        if (RECORD_STREAM) {
            var path = 'recordings/' + Date.now() + '.ts';
            request.socket.recording = fs.createWriteStream(path);
        }
    })
    // Keep the socket open for streaming
    streamServer.headersTimeout = 0;
    streamServer.listen(STREAM_PORT);

    console.log('Listening for incoming MPEG-TS Stream on http://127.0.0.1:'+STREAM_PORT+'/<secret>');
    console.log('Awaiting WebSocket connections on ws://127.0.0.1:'+WEBSOCKET_PORT+'/');


    The front-end code:


    <script src="jsmpeg.min.js"></script>
    <canvas id="video-canvas"></canvas>
    <script>
      let url;
      let player;
      let canvas = document.getElementById("video-canvas");
      let ipAddr = "127.0.0.1:8082";
      window.onload = async () => {
        url = `ws://${ipAddr}`;
        player = new JSMpeg.Player(url, { canvas: canvas });
      };
    </script>


    The above code works fine and plays the video, but no audio plays.

    Things I tried:


    I changed the audio context state inside the player object from suspended to running:


    player.audioOut.context.onstatechange = async () => {
        console.log("Event triggered by audio");

        // Compare the state property, not the context object itself
        if (player.audioOut.context.state === "suspended") {
            await player.audioOut.context.resume();
        }
    };


  • Turn off sw_scale conversion to planar YUV 32 byte alignment requirements

    8 November 2022, by flansel

    I am experiencing artifacts on the right edge of scaled and converted images when converting into planar YUV pixel formats with sws_scale. I am reasonably sure (although I cannot find it anywhere in the documentation) that this is because sws_scale uses an optimization for 32-byte-aligned lines in the destination. However, I would like to turn this off, because I am using sws_scale for image composition, so even though the destination lines may be 32-byte aligned, the output image may not be.


    Example.


    The full output frame is 1280x720 yuv422p10le (this is 32-byte aligned).
    However, into the top-left corner I am scaling an image with an output width of 1280 / 3 = 426.
    A width of 426 in this format is not 32-byte aligned, but I believe sws_scale sees that the output linesize is 32-byte aligned and overwrites past the width of 426, putting garbage in the next 22 bytes of data, thinking this is simply padding, when in my case this is displayable area.


    This is why I need to disable this optimization, or somehow trick sws_scale into believing it does not apply, while keeping the rest of the program, which otherwise works fine, intact.


    I have tried adding extra padding to the destination lines so they are no longer 32-byte aligned; this did not help as far as I can tell.


    Edit with code example (rendering omitted for ease of use). Here is a similar issue; unfortunately, as I stated there, the fix will not work for my use case: https://github.com/obsproject/obs-studio/pull/2836


    Use the commented-out line of code to switch between an output width that is and is not 32-byte aligned.


    #include "libswscale/swscale.h"
    #include "libavutil/imgutils.h"
    #include "libavutil/pixelutils.h"
    #include "libavutil/pixfmt.h"
    #include "libavutil/pixdesc.h"
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(int argc, char **argv) {

    /// Set up a 1280x720 window, and an item with 1/3 width and height of the window.
    int window_width, window_height, item_width, item_height;
    window_width = 1280;
    window_height = 720;
    item_width = (window_width / 3);
    item_height = (window_height / 3);

    int item_out_width = item_width;
    /// This line sets the item width to be 32 byte aligned; uncomment to see uncorrupted results.
    /// Note %16 because outformat is 2 bytes per component
    //item_out_width -= (item_width % 16);

    enum AVPixelFormat outformat = AV_PIX_FMT_YUV422P10LE;
    enum AVPixelFormat informat = AV_PIX_FMT_UYVY422;
    int window_lines[4] = {0};
    av_image_fill_linesizes(window_lines, outformat, window_width);

    uint8_t *window_planes[4] = {0};
    window_planes[0] = calloc(1, window_lines[0] * window_height);
    window_planes[1] = calloc(1, window_lines[1] * window_height);
    window_planes[2] = calloc(1, window_lines[2] * window_height); /// Fill the window with all 0s, this is green in yuv.

    int item_lines[4] = {0};
    av_image_fill_linesizes(item_lines, informat, item_width);

    uint8_t *item_planes[4] = {0};
    item_planes[0] = malloc(item_lines[0] * item_height);
    memset(item_planes[0], 100, item_lines[0] * item_height);

    struct SwsContext *ctx;
    ctx = sws_getContext(item_width, item_height, informat,
                   item_out_width, item_height, outformat, SWS_FAST_BILINEAR, NULL, NULL, NULL);

    /// Check a block in the normal region
    printf("Pre scale normal region %d %d %d\n", (int)((uint16_t*)window_planes[0])[0], (int)((uint16_t*)window_planes[1])[0],
           (int)((uint16_t*)window_planes[2])[0]);

    /// Check a block in the corrupted region (should be all zeros). These values should be outside the converted region.
    int corrupt_offset_y = (item_out_width + 3) * 2; ///(item_width + 3) * 2 bytes per component Y PLANE
    int corrupt_offset_uv = (item_out_width + 3); ///(item_width + 3) * (2 bytes per component rshift 1 for horiz scaling) U and V PLANES

    printf("Pre scale corrupted region %d %d %d\n", (int)(*((uint16_t*)(window_planes[0] + corrupt_offset_y))),
           (int)(*((uint16_t*)(window_planes[1] + corrupt_offset_uv))), (int)(*((uint16_t*)(window_planes[2] + corrupt_offset_uv))));
    sws_scale(ctx, (const uint8_t**)item_planes, item_lines, 0, item_height, window_planes, window_lines);

    /// Perform the same tests after scaling
    printf("Post scale normal region %d %d %d\n", (int)((uint16_t*)window_planes[0])[0], (int)((uint16_t*)window_planes[1])[0],
           (int)((uint16_t*)window_planes[2])[0]);
    printf("Post scale corrupted region %d %d %d\n", (int)(*((uint16_t*)(window_planes[0] + corrupt_offset_y))),
           (int)(*((uint16_t*)(window_planes[1] + corrupt_offset_uv))), (int)(*((uint16_t*)(window_planes[2] + corrupt_offset_uv))));

    return 0;
    }


    Example Output:

    //No alignment
    Pre scale normal region 0 0 0
    Pre scale corrupted region 0 0 0
    Post scale normal region 400 400 400
    Post scale corrupted region 512 36865 36865

    //With alignment
    Pre scale normal region 0 0 0
    Pre scale corrupted region 0 0 0
    Post scale normal region 400 400 400
    Post scale corrupted region 0 0 0
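
    If the over-write cannot be disabled directly, one common workaround (sketched here as an assumption, not a confirmed libswscale feature) is to let sws_scale write into a private scratch buffer whose lines are 32-byte aligned, then copy only the visible bytes of each row into the composition target, leaving the neighbouring area untouched. The byte-copy idea is illustrated below in Python with plain buffers; copy_region and all buffer names are hypothetical.

    ```python
    # Workaround sketch (hypothetical): let sws_scale scribble inside an
    # aligned scratch buffer, then copy only the visible bytes per row
    # into the composition window.

    def copy_region(dst, dst_linesize, src, src_linesize, width_bytes, height):
        """Copy width_bytes from each source row into the destination rows,
        leaving the rest of each destination line untouched."""
        for y in range(height):
            dst[y * dst_linesize : y * dst_linesize + width_bytes] = \
                src[y * src_linesize : y * src_linesize + width_bytes]

    # yuv422p10le: 2 bytes per luma sample, so a 426-wide Y row is 852 bytes.
    item_out_width = 426
    width_bytes = item_out_width * 2             # 852 visible bytes per Y row
    scratch_linesize = (width_bytes + 31) & ~31  # round up to 32-byte alignment

    height = 4                                   # a few rows suffice to illustrate
    scratch = bytearray(b"\x01" * (scratch_linesize * height))  # "scaled" rows incl. padding garbage
    window_linesize = 1280 * 2                   # the real composition target's Y linesize
    window = bytearray(window_linesize * height) # all zeros

    copy_region(window, window_linesize, scratch, scratch_linesize, width_bytes, height)
    ```

    After the copy, each window row holds 852 bytes of image data, and the byte just past the visible region (offset 852) is still zero, i.e. the neighbouring displayable area is not clobbered.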
