Advanced search

Media (91)

Other articles (89)

  • User profiles

    12 April 2011, by

    Each user has a profile page that lets them edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialized; it is visible only when the visitor is logged in to the site.
    The user can reach the profile editor from their author page; a "Modifier votre profil" (edit your profile) link in the navigation is (...)

  • Configuring language support

    15 November 2010, by

    Accessing the configuration and adding supported languages
    To configure support for new languages, go to the "Administrer" (administer) section of the site.
    From there, in the navigation menu, you can reach a "Gestion des langues" (language management) section that lets you enable support for new languages.
    Each newly added language can still be disabled as long as no object has been created in that language; once one has, it becomes greyed out in the configuration and (...)

  • Automatic backup of SPIP channels

    1 April 2010, by

    When setting up an open platform, it is important for hosts to have fairly regular backups available to guard against any potential problem.
    This task relies on two SPIP plugins: Saveauto, which performs a regular backup of the database as a MySQL dump (usable in phpmyadmin), and mes_fichiers_2, which builds a zip archive of the site's important data (the documents, the elements (...)
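
    For reference, the manual equivalent of what these two plugins automate might look like the sketch below; the database name, credentials and paths are illustrative placeholders, not values from the article (IMG/ is where SPIP keeps uploaded documents):

    # regular database dump, loadable from phpmyadmin (what Saveauto automates)
    mysqldump -u spip_user -p spip_db > spip-$(date +%F).sql
    # zip archive of the site's important data (what mes_fichiers_2 automates)
    zip -r spip-data-$(date +%F).zip IMG/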

On other sites (11675)

  • FFmpeg: Encode x264 with AMD GPU on Windows?

    20 September 2023, by ZeroTek

    I am currently trying to record a video on my Lenovo laptop with its built-in webcam, using FFmpeg on Windows 10. One of my goals is to keep CPU usage as low as possible, which is why I want to push the H.264 encoding to the GPU.
Now it gets a bit tricky with my laptop, because it has two GPUs. The first is an Intel HD 5500 graphics unit that is part of the CPU; it is most likely used for non-demanding applications like office work, to save energy. The other is an AMD R5 M330 that is used for graphics-intensive applications like gaming.

    Currently, I am using the following command to encode the webcam stream on the Intel HD GPU:

    ffmpeg -f dshow -vcodec mjpeg -video_size 1280x720 -framerate 30 -i video="Lenovo EasyCamera":audio="Mikrofon (Realtek High Definition Audio)" -c:v h264_qsv -g 60 -q 28 -look_ahead 0 -preset:v faster -c:a aac -q:a 0.6 -r 30 output.mp4

    This works so far, but it seems this GPU does not have enough power to keep up with the framerate at higher bitrates or with a high number of I-frames. The video starts lagging and skipping frames. If I use CPU encoding, everything runs smoothly.

    Since my laptop has that second, much more powerful AMD GPU, it would be worth trying to encode on that one, but I can't find any information about how to encode on AMD hardware on Windows 10. So my question is: what does the ffmpeg command look like when using AMD hardware for H.264 encoding?
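
    For what it's worth, FFmpeg exposes AMD's hardware H.264 encoder as h264_amf on Windows (the build must include AMF support; running "ffmpeg -encoders" will list it if so). Below is a minimal sketch adapted from the command above; the rate-control values are illustrative and untested on this exact hardware:

    ffmpeg -f dshow -vcodec mjpeg -video_size 1280x720 -framerate 30 -i video="Lenovo EasyCamera":audio="Mikrofon (Realtek High Definition Audio)" -c:v h264_amf -quality speed -rc cqp -qp_i 28 -qp_p 28 -g 60 -c:a aac -q:a 0.6 output.mp4

    Note that on dual-GPU laptops Windows may still schedule the process on the integrated GPU; assigning ffmpeg.exe to the high-performance GPU under Settings > Graphics settings is usually needed as well.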

  • images->video->web canvas: RGB/YUV issues

    5 February 2016, by nrob

    We’ve written a web app which:

    1. takes 3D, time-dependent weather data
    2. tiles each 3D time point to make a 2D frame (written out as a png image)
    3. stitches these frames together into a video, using ffmpeg/avconv (see the sketch below)
    4. streams this video into a web app
    5. polls the canvas for frames
    6. sends the frames to the GPU, where they are converted back to 3D and ray-traced

    You can see the app here, the code here, and the data video here
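
    As a concrete illustration of step 3, the stitching typically looks like the sketch below (the filename pattern and framerate are assumptions, not the poster's actual values):

    ffmpeg -framerate 30 -i frame_%04d.png -c:v libx264 -pix_fmt yuv420p weather.mp4

    The -pix_fmt yuv420p part is where the RGB frames get subsampled to 4:2:0 YUV, which is exactly the information loss described next.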

    Currently the pngs are written as RGB images, the video codec works in YUV, and grabbing frames back from the canvas returns RGB. As such there is a significant loss of information due to the conversions between colour spaces.

    Does anyone have suggestions for the best way around this?

    I’ve tried a bunch of RGB video codecs, but I can’t get any of them to work, and I don’t know whether web browsers would support them anyway. Can anyone suggest a good RGB codec (both lossy and lossless suggestions would be great)?
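
    For reference, one candidate is x264's RGB mode, which ffmpeg exposes as the libx264rgb encoder; a minimal lossless sketch (filenames again illustrative):

    ffmpeg -framerate 30 -i frame_%04d.png -c:v libx264rgb -qp 0 -preset veryslow weather_rgb.mp4

    The caveat is the browser-support worry above: HTML5 video decoders generally handle only 4:2:0 YUV H.264, so an RGB (or 4:4:4) stream may simply refuse to play; testing in the target browsers is unavoidable.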

    Also, is it possible to write YUV images, or read them back from a video canvas, in HTML5?

    Ultimately, I don’t even want anything to do with images or videos; I’m just hacking the codecs to stream/compress large animated 3D data volumes.

  • Live streaming: node-media-server + Dash.js configured for real-time low latency

    7 July 2021, by Maoration

    We're working on an app that enables live monitoring of your backyard.
Each client has a camera connected to the internet, streaming to our public node.js server.

    I'm trying to use node-media-server to publish an MPEG-DASH (or HLS) stream to be available for our app clients, on different networks, bandwidths and resolutions around the world.

    Our goal is to get as close as possible to live "real-time" so you can monitor what happens in your backyard instantly.

    The technical flow accomplished so far is:

    1. An ffmpeg process on our server handles the incoming camera stream (a separate child process for each camera) and publishes it via RTMP on the local machine for node-media-server to use as an input (we also save segmented files, generate thumbnails, etc.). The relevant ffmpeg arguments are:

      -c:v libx264 -preset ultrafast -tune zerolatency -b:v 900k -f flv rtmp://127.0.0.1:1935/live/office

    2. node-media-server is running with what I found to be the default configuration for live streaming:

      private NMS_CONFIG = {
        server: {
          secret: 'thisisnotmyrealsecret',
        },
        rtmp_server: {
          rtmp: {
            port: 1935,
            chunk_size: 60000,
            gop_cache: false,
            ping: 60,
            ping_timeout: 30,
          },
          http: {
            port: 8888,
            mediaroot: './server/media',
            allow_origin: '*',
          },
          trans: {
            ffmpeg: '/usr/bin/ffmpeg',
            tasks: [
              {
                app: 'live',
                hls: true,
                hlsFlags: '[hls_time=2:hls_list_size=3:hls_flags=delete_segments]',
                dash: true,
                dashFlags: '[f=dash:window_size=3:extra_window_size=5]',
              },
            ],
          },
        },
      };

    3. As I understand it, out of the box NMS (node-media-server) publishes the input stream in multiple output formats: flv, mpeg-dash and hls. With the various online players for these formats I am able to access and play the stream using the localhost URL. With mpeg-dash and hls I am getting anywhere between 10-15 seconds of delay, and more.

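    For context on the numbers in that configuration: with hls_time=2 and hls_list_size=3, the playlist alone spans 2 s × 3 = 6 s of media, which already bounds how low the delay can go, and segments can never be shorter than the encoder's keyframe interval. A hedged variant that aligns a 1-second GOP with 1-second segments is sketched below (assuming ~30 fps input, and assuming NMS passes these muxer options through to ffmpeg unchanged):

    -c:v libx264 -preset ultrafast -tune zerolatency -g 30 -sc_threshold 0 -b:v 900k -f flv rtmp://127.0.0.1:1935/live/office

    hlsFlags: '[hls_time=1:hls_list_size=2:hls_flags=delete_segments]',
    dashFlags: '[f=dash:seg_duration=1:window_size=3:extra_window_size=2]',
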
    My goal now is to implement a local client-side mpeg-dash player using dash.js, configured to be as close to live as possible.

    My code for that is:

    <div>
        <video id="videoPlayer" autoplay controls></video>
    </div>

    <script src="https://cdnjs.cloudflare.com/ajax/libs/dashjs/3.0.2/dash.all.min.js"></script>

    <script>
        (function(){
            // var url = "https://dash.akamaized.net/envivio/EnvivioDash3/manifest.mpd";
            var url = "http://localhost:8888/live/office/index.mpd";
            var player = dashjs.MediaPlayer().create();

            // config
            targetLatency = 2.0;        // Lowering this value will lower latency but may decrease the player's ability to build a stable buffer.
            minDrift = 0.05;            // Minimum latency deviation allowed before activating catch-up mechanism.
            catchupPlaybackRate = 0.5;  // Maximum catch-up rate, as a percentage, for low latency live streams.
            stableBuffer = 2;           // The time that the internal buffer target will be set to post startup/seeks (NOT top quality).
            bufferAtTopQuality = 2;     // The time that the internal buffer target will be set to once playing the top quality.

            player.updateSettings({
                'streaming': {
                    'liveDelay': 2,
                    'liveCatchUpMinDrift': 0.05,
                    'liveCatchUpPlaybackRate': 0.5,
                    'stableBufferTime': 2,
                    'bufferTimeAtTopQuality': 2,
                    'bufferTimeAtTopQualityLongForm': 2,
                    'bufferToKeep': 2,
                    'bufferAheadToKeep': 2,
                    'lowLatencyEnabled': true,
                    'fastSwitchEnabled': true,
                    'abr': {
                        'limitBitrateByPortal': true
                    },
                }
            });

            console.log(player.getSettings());

            setInterval(() => {
                console.log('Live latency= ', player.getCurrentLiveLatency());
                console.log('Buffer length= ', player.getBufferLength('video'));
            }, 3000);

            player.initialize(document.querySelector("#videoPlayer"), url, true);
        })();
    </script>

    With the online test video (https://dash.akamaized.net/envivio/EnvivioDash3/manifest.mpd) I see that the live latency value is close to 2 seconds (though I have no way to actually confirm it, since it is a streamed video file; in my office I have a camera, so there I can actually compare latency between real life and the stream I get). However, when working locally with my NMS, this value does not seem to want to go below 20-25 seconds.


    Am I doing something wrong? Is there some configuration on the player (client-side HTML) that I'm forgetting, or a missing configuration I should add on the server side (NMS)?
