
Other articles (30)
-
Sites built with MediaSPIP
2 May 2011 — This page presents some of the sites running MediaSPIP.
You can of course add your own using the form at the bottom of the page. -
Videos
21 April 2011 — As with "audio" documents, MediaSPIP displays videos, whenever possible, using the HTML5 video tag.
One drawback of this tag is that it is not handled correctly by some browsers (Internet Explorer, to name one), and each browser natively supports only certain video formats.
Its main advantage is native video playback in the browser, which makes it possible to do without Flash and (...) -
Websites made with MediaSPIP
2 May 2011 — This page lists some websites based on MediaSPIP.
On other sites (2627)
-
Creating image overlay on video like tiktok on a video in flutter [closed]
12 October 2020, by Sarthak Singhal — I am trying to create an app in Flutter in which the user can add an image on top of a video, as in the given screenshot. The user should be able to move the image widget anywhere on the screen, resize it, and rotate it. When this is done, the image should be merged onto the video.


For now I have used a Stack and MatrixGestureDetector to let the user move the images on the screen.


I want to know how to merge this image onto the video with the exact scale, rotation, and position seen on the screen.


Also, if I am going in the wrong direction, please guide me toward the right way to do an image overlay on a video in Flutter, like TikTok or Snapchat.
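
For reference, a common way to do the final merge is to burn the overlay into the video with ffmpeg (in Flutter this typically runs through an ffmpeg wrapper plugin); this is not taken from the question, only a sketch of the idea. The file names, resolutions, position and angle below are hypothetical placeholders; the key step is converting the on-screen transform into video-pixel coordinates before building the filter graph.

import subprocess

# Hypothetical values read back from the on-screen transform (MatrixGestureDetector).
# Screen-space values must be rescaled to the video's pixel dimensions, because the
# preview widget is usually smaller than the actual video frame.
video_w, video_h = 1080, 1920          # real video resolution
preview_w, preview_h = 360, 640        # size of the preview widget on screen
scale = video_w / preview_w            # screen-to-video scale factor

overlay_x = int(120 * scale)           # top-left x of the image, in video pixels
overlay_y = int(300 * scale)           # top-left y of the image, in video pixels
overlay_w = int(100 * scale)           # displayed width of the image, in video pixels
angle_deg = 15                         # rotation the user applied

# Scale the overlay, rotate it while keeping transparency, then composite it
# onto the video at the recorded position.
filter_graph = (
    f"[1:v]scale={overlay_w}:-1,"
    f"rotate={angle_deg}*PI/180:c=none:ow=rotw({angle_deg}*PI/180):oh=roth({angle_deg}*PI/180)[ovr];"
    f"[0:v][ovr]overlay=x={overlay_x}:y={overlay_y}"
)

cmd = [
    "ffmpeg", "-y",
    "-i", "input.mp4",       # the recorded video (hypothetical name)
    "-i", "sticker.png",     # the image the user placed (hypothetical name)
    "-filter_complex", filter_graph,
    "-c:a", "copy",
    "output.mp4",
]
subprocess.run(cmd, check=True)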




-
Live streaming: node-media-server + Dash.js configured for real-time low latency
7 July 2021, by Maoration — We're working on an app that enables live monitoring of your back yard.
Each client has a camera connected to the internet, streaming to our public node.js server.



I'm trying to use node-media-server to publish an MPEG-DASH (or HLS) stream to be available for our app clients, on different networks, bandwidths and resolutions around the world.



Our goal is to get as close as possible to live "real-time" so you can monitor what happens in your backyard instantly.



The technical flow already accomplished is:



-
An ffmpeg process on our server handles the incoming camera stream (a separate child process for each camera) and publishes the stream via RTMP on the local machine for node-media-server to use as an 'input' (we are also saving segmented files, generating thumbnails, etc.). The ffmpeg command responsible for that is:



-c:v libx264 -preset ultrafast -tune zerolatency -b:v 900k -f flv rtmp://127.0.0.1:1935/live/office
-
node-media-server is running with what I found to be the default configuration for 'live streaming':



private NMS_CONFIG = {
  server: {
    secret: 'thisisnotmyrealsecret',
  },
  rtmp_server: {
    rtmp: {
      port: 1935,
      chunk_size: 60000,
      gop_cache: false,
      ping: 60,
      ping_timeout: 30,
    },
    http: {
      port: 8888,
      mediaroot: './server/media',
      allow_origin: '*',
    },
    trans: {
      ffmpeg: '/usr/bin/ffmpeg',
      tasks: [
        {
          app: 'live',
          hls: true,
          hlsFlags: '[hls_time=2:hls_list_size=3:hls_flags=delete_segments]',
          dash: true,
          dashFlags: '[f=dash:window_size=3:extra_window_size=5]',
        },
      ],
    },
  },
};
-
As I understand it, out of the box NMS (node-media-server) publishes the input stream it gets in multiple output formats: FLV, MPEG-DASH, HLS.
With various online players for these formats I'm able to access and play the stream using the localhost URL. With MPEG-DASH and HLS I'm getting anywhere between 10-15 seconds of delay, and more.











My goal now is to implement a local client-side MPEG-DASH player using dash.js, and to configure it to be as close to live as possible.



My code for that is:







 
 
 
 
 
<div>
  <video id="videoPlayer" autoplay controls></video>
</div>

<script src="https://cdnjs.cloudflare.com/ajax/libs/dashjs/3.0.2/dash.all.min.js"></script>

<script>
  (function () {
    // var url = "https://dash.akamaized.net/envivio/EnvivioDash3/manifest.mpd";
    var url = "http://localhost:8888/live/office/index.mpd";
    var player = dashjs.MediaPlayer().create();

    // config
    targetLatency = 2.0;        // Lowering this value will lower latency but may decrease the player's ability to build a stable buffer.
    minDrift = 0.05;            // Minimum latency deviation allowed before activating catch-up mechanism.
    catchupPlaybackRate = 0.5;  // Maximum catch-up rate, as a percentage, for low latency live streams.
    stableBuffer = 2;           // The time that the internal buffer target will be set to post startup/seeks (NOT top quality).
    bufferAtTopQuality = 2;     // The time that the internal buffer target will be set to once playing the top quality.

    player.updateSettings({
      'streaming': {
        'liveDelay': 2,
        'liveCatchUpMinDrift': 0.05,
        'liveCatchUpPlaybackRate': 0.5,
        'stableBufferTime': 2,
        'bufferTimeAtTopQuality': 2,
        'bufferTimeAtTopQualityLongForm': 2,
        'bufferToKeep': 2,
        'bufferAheadToKeep': 2,
        'lowLatencyEnabled': true,
        'fastSwitchEnabled': true,
        'abr': {
          'limitBitrateByPortal': true
        },
      }
    });

    console.log(player.getSettings());

    setInterval(() => {
      console.log('Live latency= ', player.getCurrentLiveLatency());
      console.log('Buffer length= ', player.getBufferLength('video'));
    }, 3000);

    player.initialize(document.querySelector("#videoPlayer"), url, true);
  })();
</script>








With the online test video (https://dash.akamaized.net/envivio/EnvivioDash3/manifest.mpd) I see a live latency value close to 2 seconds (but I have no way to actually confirm it: it is a streamed video file, whereas in my office I have a camera, so there I can actually compare latency between real life and the stream I get).
However, when working locally with my NMS, this value does not seem to go below 20-25 seconds.
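
As an aside, one way to get an objective latency number without a camera is to burn the wall-clock time into a test stream with ffmpeg's drawtext filter and compare what the player shows with the local clock. A rough sketch, not from the question, pushing to the same publish point used above; it assumes an ffmpeg build with drawtext enabled:

import subprocess

# Publish a synthetic test pattern with the current wall-clock time burned in.
# The difference between the time visible in the player and the local clock
# is the end-to-end latency.
cmd = [
    "ffmpeg", "-re",
    "-f", "lavfi", "-i", "testsrc2=size=1280x720:rate=30",
    "-vf", "drawtext=text='%{localtime\\:%T}':fontsize=48:fontcolor=white:x=20:y=20",
    "-c:v", "libx264", "-preset", "ultrafast", "-tune", "zerolatency",
    "-f", "flv", "rtmp://127.0.0.1:1935/live/office",
]
subprocess.run(cmd, check=True)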



Am I doing something wrong? Is there any configuration on the player (client-side HTML) that I'm forgetting?
Or is there a missing configuration I should add on the server side (NMS)?
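
For context (not from the question): the floor on DASH/HLS latency is usually set on the server side by the segment duration and the encoder GOP length, not by the player settings, so with 2-second HLS segments and the default DASH window above, several segments of delay are expected. Below is a hedged sketch, in Python for consistency with the rest of this page, of an ffmpeg invocation that produces low-latency DASH directly from the RTMP publish point instead of going through the NMS trans task; the segment and GOP values are illustrative, the output path is hypothetical, and the streaming/ldash options require a reasonably recent ffmpeg.

import subprocess

# Shorter segments and a shorter GOP lower the theoretical minimum latency.
cmd = [
    "ffmpeg",
    "-i", "rtmp://127.0.0.1:1935/live/office",   # the stream published by the camera process
    "-c:v", "libx264", "-preset", "ultrafast", "-tune", "zerolatency",
    "-g", "30", "-keyint_min", "30",             # one keyframe per second at 30 fps
    "-c:a", "aac",
    "-f", "dash",
    "-seg_duration", "1",                        # 1-second segments
    "-window_size", "5",
    "-streaming", "1", "-ldash", "1",            # chunked, low-latency DASH output
    "-use_template", "1", "-use_timeline", "0",
    "./server/media/live/office/index.mpd",      # hypothetical path under the NMS mediaroot
]
subprocess.run(cmd, check=True)

Even then, dash.js generally cannot hold a liveDelay much below one segment plus encoding and network overhead, so sub-second figures usually require chunked CMAF delivery end to end or a different transport such as WebRTC.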


-
ffmpeg: What is the best practice to keep a live connection/socket with a camera, and save time on ffprobe
15 May 2022, by Jeff Strongman — Today I used the following command, with subprocess.PIPE and subprocess.Popen, in Python 3:

ffmpeg -i udp://{address_of_camera} \
  -vf "select='if(eq(pict_type,I),st(1,t),gt(t,ld(1)))',setpts=N/FRAME_RATE/TB" \
  -f rawvideo -an -vframes {NUM_WANTED_FRAMES} pipe:



This command helps me capture NUM_WANTED_FRAMES frames from a live camera at a given moment.

However... it takes me about 4 seconds to read the frames, and about 2.5 seconds to open a socket between my computer and the camera's computer.


Is there a way, to have a socket/connection always open between my computer and the camera's computer, to save the 2.5 seconds ?


I read something about fifo_size and overrun_fatal. I thought that maybe I could set fifo_size to be equal to NUM_WANTED_FRAMES, and overrun_fatal to True? Will this solve my problem? Or is there a different and simpler/better solution?
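
For reference (not from the question): fifo_size and overrun_nonfatal — the actual option name, there is no overrun_fatal — are options of ffmpeg's UDP protocol and are passed as query parameters on the input URL. fifo_size is counted in 188-byte packets, not in frames, so tying it to NUM_WANTED_FRAMES would not mean what is intended. A minimal sketch, with an illustrative value and the camera address left as the question's placeholder:

camera_url = "udp://{address_of_camera}?fifo_size=100000&overrun_nonfatal=1"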

Should I record continuously (no -vframes flag), store the frames in a queue with a max size, and, when I want to slice the video, read from my queue buffer? Will it work well with the keyframes?

Also, what should I do when ffmpeg fails? Restart the ffmpeg command?
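
One pattern that addresses both the connection overhead and the failure handling is to keep a single long-lived ffmpeg process decoding the camera continuously, push its frames into a bounded in-memory buffer, and restart the process from a supervising loop whenever it exits. A sketch under assumed values, not taken from the question: resolution, pixel format and buffer size are hypothetical, and the camera address stays the question's placeholder.

import subprocess
import threading
from collections import deque

CAMERA_URL = "udp://{address_of_camera}"     # placeholder, as in the question
WIDTH, HEIGHT = 1280, 720                    # must match the camera's real resolution
FRAME_BYTES = WIDTH * HEIGHT * 3             # rawvideo, rgb24
MAX_FRAMES = 300                             # roughly 10 seconds at 30 fps

frames = deque(maxlen=MAX_FRAMES)            # oldest frames are dropped automatically
lock = threading.Lock()

def reader():
    # Supervising loop: if ffmpeg exits or the stream breaks, start a fresh process.
    while True:
        proc = subprocess.Popen(
            ["ffmpeg", "-i", CAMERA_URL,
             "-f", "rawvideo", "-pix_fmt", "rgb24", "-an", "pipe:1"],
            stdout=subprocess.PIPE,
            stderr=subprocess.DEVNULL,
        )
        while True:
            data = proc.stdout.read(FRAME_BYTES)
            if len(data) < FRAME_BYTES:      # EOF or short read: ffmpeg died
                break
            with lock:
                frames.append(data)
        proc.kill()

threading.Thread(target=reader, daemon=True).start()

def latest_frames(n):
    """Return a snapshot of the n most recent frames without touching the ffmpeg process."""
    with lock:
        return list(frames)[-n:]

Because this buffers raw decoded frames, memory grows quickly with resolution; if a longer history is needed, keeping encoded data instead (for example short files written by ffmpeg's segment muxer) and decoding on demand is the usual alternative.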