
Media (1)
-
The Slip - Artworks
26 September 2011
Updated: September 2011
Language: English
Type: Text
Other articles (63)
-
The SPIPmotion queue
28 November 2010
A queue stored in the database
When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to be encoded; id_objet, the unique identifier of the object to which the encoded document will automatically be attached; objet, the type of object to which (...)
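Read as a data structure, the rows of this queue table can be sketched as follows (an illustrative TypeScript-style sketch, not from the original article; the field names come from the excerpt, whose tail is truncated, so any later fields are omitted):

// Row shape of spip_spipmotion_attentes, as described in the excerpt above.
// The excerpt is truncated after `objet`, so remaining fields are left out.
interface SpipmotionAttente {
  id_spipmotion_attente: number; // unique numeric id of the task to process
  id_document: number;           // numeric id of the original document to encode
  id_objet: number;              // unique id of the object the encoded document is attached to
  objet: string;                 // type of object the document is attached to
}
-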
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is running version 0.2 or higher. If needed, contact your MediaSPIP administrator to find out.
-
Emballe médias: what is it for?
4 February 2011
This plugin is intended to manage sites for publishing documents of all types.
It creates "media", that is: a "media" is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only one document can be linked to a given "media" article;
On other sites (8806)
-
Stream video and commands on same connection or split connections?
10 May 2013, by bizzehdee
Background
I am in the middle of writing a client/server app that I will install on every machine within my office (roughly 30-35 machines). I currently have the client connecting to the server, with the ability to send mouse movements, mouse clicks and key strokes and to execute certain commands. The next step is to stream back a video output of the screen. I am using the GDI method from "Fastest method of screen capturing" to capture the entire screen, and will be using the x264 encoder to compress the frames and transmit them back to the client, which will then decode and display the stream.
Question
Is it best (in terms of reducing lag, ensuring all commands are delivered as fast as possible, and keeping the stream as live as possible) to transmit the video back along the same connection that I established for the commands? Or should I establish a separate connection, on the same port or on a different port, to stream back the video? (A sketch of the single-connection option follows below the P.S.)
P.S.
I am aware that VNC, RDP and other tools such as TeamViewer already exist and already do this sort of thing, but none of them support all the requirements needed for this system.
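For illustration, here is a minimal sketch of the single-connection option: commands and encoded video share one TCP socket, separated by a small framing header so the receiver can tell them apart. This is my own sketch in TypeScript/Node (the question does not state its stack); the port, host, channel IDs and helper names are all made up for the example, but the framing idea carries over to any socket API:

import * as net from "net";

// Hypothetical frame layout: 1-byte channel id (0 = command, 1 = video),
// 4-byte big-endian payload length, then the payload itself.
const CH_COMMAND = 0;
const CH_VIDEO = 1;

function sendFrame(socket: net.Socket, channel: number, payload: Buffer): void {
  const header = Buffer.alloc(5);
  header.writeUInt8(channel, 0);
  header.writeUInt32BE(payload.length, 1);
  socket.write(Buffer.concat([header, payload]));
}

// Incremental parser: accumulates bytes until a complete frame is available.
function makeFrameReader(onFrame: (channel: number, payload: Buffer) => void) {
  let pending = Buffer.alloc(0);
  return (chunk: Buffer): void => {
    pending = Buffer.concat([pending, chunk]);
    while (pending.length >= 5) {
      const length = pending.readUInt32BE(1);
      if (pending.length < 5 + length) break; // wait for the rest of the frame
      onFrame(pending.readUInt8(0), pending.subarray(5, 5 + length));
      pending = pending.subarray(5 + length);
    }
  };
}

// Example wiring: send a command, receive video frames on the same socket.
const socket = net.connect(9000, "192.168.0.10", () => {
  sendFrame(socket, CH_COMMAND, Buffer.from(JSON.stringify({ type: "mouseMove", x: 10, y: 20 })));
});
socket.on("data", makeFrameReader((channel, payload) => {
  if (channel === CH_VIDEO) {
    // hand the x264 NAL units to the decoder here
  }
}));

The trade-off to weigh is head-of-line blocking: on a single connection, a large video frame queued ahead of a small command frame delays it, which is the usual argument for a second connection (or for prioritising command frames when writing).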
-
Live streaming: node-media-server + Dash.js configured for real-time low latency
7 July 2021, by Maoration
We're working on an app that enables live monitoring of your back yard.
Each client has a camera connected to the internet, streaming to our public node.js server.

I'm trying to use node-media-server to publish an MPEG-DASH (or HLS) stream to be available for our app clients, on different networks, bandwidths and resolutions around the world.

Our goal is to get as close as possible to live "real-time", so you can monitor what happens in your backyard instantly.

The technical flow already accomplished is:
-
An ffmpeg process on our server processes the incoming camera stream (a separate child process for each camera) and publishes the stream via RTMP on the local machine for node-media-server to use as an 'input' (we are also saving segmented files, generating thumbnails, etc.). The ffmpeg command responsible for that is shown below (see also the child-process sketch after this list):
-c:v libx264 -preset ultrafast -tune zerolatency -b:v 900k -f flv rtmp://127.0.0.1:1935/live/office
-
node-media-server is running with what I found to be the default configuration for 'live-streaming':
private NMS_CONFIG = {
  server: {
    secret: 'thisisnotmyrealsecret',
  },
  rtmp_server: {
    rtmp: {
      port: 1935,
      chunk_size: 60000,
      gop_cache: false,
      ping: 60,
      ping_timeout: 30,
    },
    http: {
      port: 8888,
      mediaroot: './server/media',
      allow_origin: '*',
    },
    trans: {
      ffmpeg: '/usr/bin/ffmpeg',
      tasks: [
        {
          app: 'live',
          hls: true,
          hlsFlags: '[hls_time=2:hls_list_size=3:hls_flags=delete_segments]',
          dash: true,
          dashFlags: '[f=dash:window_size=3:extra_window_size=5]',
        },
      ],
    },
  },
};
-
As I understand it, out of the box NMS (node-media-server) publishes the input stream it gets in multiple output formats: flv, mpeg-dash, hls.
With all sorts of online players for these formats I'm able to access and view the stream using the localhost URL. With mpeg-dash and hls I'm getting anywhere between 10-15 seconds of delay, and more.
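As a side note, the ffmpeg command above is quoted without its input side, so here is a sketch of how such a per-camera child process might be launched from Node; the RTSP input URL is purely illustrative (the real input is omitted in the question), while the output flags mirror the command quoted above:

import { spawn } from "child_process";

// One encoder child process per camera; the input URL is a placeholder.
function startEncoder(cameraUrl: string, streamKey: string) {
  const ffmpeg = spawn("/usr/bin/ffmpeg", [
    "-i", cameraUrl,
    "-c:v", "libx264", "-preset", "ultrafast", "-tune", "zerolatency",
    "-b:v", "900k",
    "-f", "flv", `rtmp://127.0.0.1:1935/live/${streamKey}`,
  ]);
  ffmpeg.stderr.on("data", (chunk) => process.stderr.write(chunk)); // ffmpeg logs to stderr
  ffmpeg.on("exit", (code) => console.log(`encoder for ${streamKey} exited with code ${code}`));
  return ffmpeg;
}

startEncoder("rtsp://camera.example/stream", "office");

With the http block set to port 8888, node-media-server's conventional playback URLs for this stream would be http://localhost:8888/live/office.flv (http-flv), http://localhost:8888/live/office/index.m3u8 (hls) and http://localhost:8888/live/office/index.mpd (mpeg-dash), the last of which matches the manifest URL used in the player below.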
My goal now is to implement a local client-side mpeg-dash player, using dash.js, configured to be as close to live as possible. My code for that is:
<div>
  <video id="videoPlayer" autoplay controls></video>
</div>
<script src="https://cdnjs.cloudflare.com/ajax/libs/dashjs/3.0.2/dash.all.min.js"></script>
<script>
  (function () {
    // var url = "https://dash.akamaized.net/envivio/EnvivioDash3/manifest.mpd";
    var url = "http://localhost:8888/live/office/index.mpd";
    var player = dashjs.MediaPlayer().create();

    // config
    targetLatency = 2.0;        // Lowering this value will lower latency but may decrease the player's ability to build a stable buffer.
    minDrift = 0.05;            // Minimum latency deviation allowed before activating catch-up mechanism.
    catchupPlaybackRate = 0.5;  // Maximum catch-up rate, as a percentage, for low latency live streams.
    stableBuffer = 2;           // The time that the internal buffer target will be set to post startup/seeks (NOT top quality).
    bufferAtTopQuality = 2;     // The time that the internal buffer target will be set to once playing the top quality.

    player.updateSettings({
      'streaming': {
        'liveDelay': 2,
        'liveCatchUpMinDrift': 0.05,
        'liveCatchUpPlaybackRate': 0.5,
        'stableBufferTime': 2,
        'bufferTimeAtTopQuality': 2,
        'bufferTimeAtTopQualityLongForm': 2,
        'bufferToKeep': 2,
        'bufferAheadToKeep': 2,
        'lowLatencyEnabled': true,
        'fastSwitchEnabled': true,
        'abr': {
          'limitBitrateByPortal': true
        },
      }
    });

    console.log(player.getSettings());

    setInterval(() => {
      console.log('Live latency= ', player.getCurrentLiveLatency());
      console.log('Buffer length= ', player.getBufferLength('video'));
    }, 3000);

    player.initialize(document.querySelector("#videoPlayer"), url, true);
  })();
</script>
With the online test video (https://dash.akamaized.net/envivio/EnvivioDash3/manifest.mpd) I see a live latency value close to 2 seconds (but I have no way to actually confirm it, since it's a streamed video file; in my office I have a camera, so there I can actually compare latency between real life and the stream I get). However, when working locally with my NMS, this value does not want to go below 20-25 seconds.

Am I doing something wrong? Is there any configuration on the player (client-side html) that I'm forgetting, or a missing configuration I should add on the server side (NMS)?
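(A back-of-the-envelope note, my own reasoning rather than part of the question: with hls_time=2 and window_size=3, the server cuts 2-second segments, and a player that buffers about three segments before starting already sits at roughly 3 × 2 s = 6 s, before encoding, RTMP ingest and manifest-polling delays are added. Segment-based delivery with this configuration therefore plausibly lands in the 10-25 s range observed above, and getting near 2 s generally requires shorter segments or chunked low-latency modes such as LL-DASH/LL-HLS.)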

-
images->video->web canvas: RGB/YUV issues
5 February 2016, by nrob
We've written a web app which:
- takes 3D, time-dependent weather data
- tiles each 3D time point to make a 2D frame (written out as a png image)
- stitches these frames together into a video (using ffmpeg/avconv)
- streams this video into a web app
- polls the canvas for frames
- sends the frames to the GPU where they are converted back to 3D and ray traced
You can see the app here, code here and you can see the data video here
Currently the pngs are written as RGB images, the video codec works in YUV, and getting frames from the canvas returns RGB. As such, there is a significant loss of information due to the conversions between colour spaces.
Does anyone have suggestions for the best way around this?
I've tried a bunch of RGB video codecs, but I can't get any of them to work, and I don't know whether the web browser would support them anyway. Can anyone suggest a good RGB codec (both lossy and lossless would be great)?
Also, is it possible to write YUV images, or read them back from a video canvas, in HTML5?
Ultimately, I don't even want anything to do with images/videos; I'm just hacking the codecs to stream/compress large animated 3D data volumes.
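For reference, a minimal sketch (my own illustration, not from the question) of the BT.601 full-range RGB↔YCbCr conversion that typical YUV codecs apply; the rounding in both directions, plus the 4:2:0 chroma subsampling most codecs add on top, is where the information loss described above creeps in:

// Forward transform: BT.601 full-range RGB -> YCbCr.
function rgbToYCbCr(r: number, g: number, b: number): [number, number, number] {
  const y  =  0.299 * r + 0.587 * g + 0.114 * b;
  const cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128;
  const cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128;
  return [Math.round(y), Math.round(cb), Math.round(cr)];
}

// Inverse transform; clamping and rounding make the round trip inexact.
const clamp = (v: number) => Math.min(255, Math.max(0, Math.round(v)));
function yCbCrToRgb(y: number, cb: number, cr: number): [number, number, number] {
  return [
    clamp(y + 1.402 * (cr - 128)),                            // R
    clamp(y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)), // G
    clamp(y + 1.772 * (cb - 128)),                            // B
  ];
}

// A pure-red pixel survives approximately, but not bit-exactly in general.
const [y, cb, cr] = rgbToYCbCr(255, 0, 0);
console.log(yCbCrToRgb(y, cb, cr));

Since canvas getImageData only returns RGBA, a browser round trip always ends in RGB; one common workaround for data-in-video hacks like this is to pack the payload into the luma (Y) plane only, e.g. as grayscale frames, because luma is kept at full resolution while chroma is usually subsampled.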