
Media (1)
-
Bee video in portrait format
14 May 2011
Updated: February 2012
Language: French
Type: Video
Other articles (75)
-
MediaSPIP 0.1 Beta version
25 April 2011. MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)
-
Customize by adding your logo, banner or background image
5 September 2013. Some themes support three customization elements: adding a logo; adding a banner; adding a background image.
-
Updating from version 0.1 to 0.2
24 June 2013. An explanation of the notable changes when moving from MediaSPIP version 0.1 to version 0.3. What's new?
Software dependencies: use of the latest versions of FFMpeg (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe for metadata retrieval; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)
On other sites (10164)
-
Live streaming: node-media-server + Dash.js configured for real-time low latency
7 July 2021, by Maoration. We're working on an app that enables live monitoring of your back yard.
Each client has a camera connected to the internet, streaming to our public node.js server.

I'm trying to use node-media-server to publish an MPEG-DASH (or HLS) stream to be available to our app clients on different networks, bandwidths and resolutions around the world.

Our goal is to get as close as possible to live "real-time", so you can monitor what happens in your backyard instantly.

The technical flow already accomplished is:

-
An ffmpeg process on our server processes the incoming camera stream (a separate child process for each camera) and publishes the stream via RTMP on the local machine for node-media-server to use as an 'input'; we are also saving segmented files, generating thumbnails, etc. (see the first sketch after this list). The ffmpeg command responsible for that is:

-c:v libx264 -preset ultrafast -tune zerolatency -b:v 900k -f flv rtmp://127.0.0.1:1935/live/office
-
node-media-server is running with what I found to be the default configuration for live streaming:

private NMS_CONFIG = {
  server: {
    secret: 'thisisnotmyrealsecret',
  },
  rtmp_server: {
    rtmp: {
      port: 1935,
      chunk_size: 60000,
      gop_cache: false,
      ping: 60,
      ping_timeout: 30,
    },
    http: {
      port: 8888,
      mediaroot: './server/media',
      allow_origin: '*',
    },
    trans: {
      ffmpeg: '/usr/bin/ffmpeg',
      tasks: [
        {
          app: 'live',
          hls: true,
          hlsFlags: '[hls_time=2:hls_list_size=3:hls_flags=delete_segments]',
          dash: true,
          dashFlags: '[f=dash:window_size=3:extra_window_size=5]',
        },
      ],
    },
  },
};

-
As I understand it, out of the box NMS (node-media-server) publishes the input stream it gets in multiple output formats: flv, mpeg-dash, hls.
With all sorts of online players for these formats, I'm able to access the stream using the URL on localhost (see the second sketch after this list). With mpeg-dash and hls I'm getting anything between 10-15 seconds of delay, and more.
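
For the first step, here is a rough sketch (in Python, for brevity) of how each per-camera child process might be spawned; the camera URL, stream key and helper name are placeholders of mine, and only the encoding flags come from the command above:

import subprocess


def start_camera_relay(camera_url, stream_key):
    # Spawn one ffmpeg child process that relays a camera stream to the
    # local node-media-server RTMP input, using the flags from above.
    cmd = [
        "ffmpeg",
        "-i", camera_url,  # placeholder camera input
        "-c:v", "libx264",
        "-preset", "ultrafast",
        "-tune", "zerolatency",
        "-b:v", "900k",
        "-f", "flv",
        "rtmp://127.0.0.1:1935/live/" + stream_key,
    ]
    return subprocess.Popen(cmd)


# e.g. start_camera_relay("rtsp://192.0.2.10/stream", "office")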
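
For the last step, a small sketch that checks which output URLs NMS actually serves; the .mpd path matches the player code below, while the .m3u8 name is an assumption on my part:

import urllib.request

urls = [
    "http://localhost:8888/live/office/index.mpd",   # MPEG-DASH manifest
    "http://localhost:8888/live/office/index.m3u8",  # HLS playlist (assumed name)
]
for u in urls:
    try:
        with urllib.request.urlopen(u, timeout=2) as r:
            print(u, "->", r.status)
    except OSError as e:
        print(u, "->", e)
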
My goal now is to implement a local client-side mpeg-dash player using dash.js, and to configure it to be as close to live as possible.



My code for that is:



<div>
 <video id="videoPlayer" autoplay controls></video>
</div>
<script src="https://cdnjs.cloudflare.com/ajax/libs/dashjs/3.0.2/dash.all.min.js"></script>
<script>
 (function(){
  // var url = "https://dash.akamaized.net/envivio/EnvivioDash3/manifest.mpd";
  var url = "http://localhost:8888/live/office/index.mpd";
  var player = dashjs.MediaPlayer().create();

  // config
  targetLatency = 2.0;       // Lowering this value will lower latency but may decrease the player's ability to build a stable buffer.
  minDrift = 0.05;           // Minimum latency deviation allowed before activating catch-up mechanism.
  catchupPlaybackRate = 0.5; // Maximum catch-up rate, as a percentage, for low latency live streams.
  stableBuffer = 2;          // The time that the internal buffer target will be set to post startup/seeks (NOT top quality).
  bufferAtTopQuality = 2;    // The time that the internal buffer target will be set to once playing the top quality.

  player.updateSettings({
   'streaming': {
    'liveDelay': 2,
    'liveCatchUpMinDrift': 0.05,
    'liveCatchUpPlaybackRate': 0.5,
    'stableBufferTime': 2,
    'bufferTimeAtTopQuality': 2,
    'bufferTimeAtTopQualityLongForm': 2,
    'bufferToKeep': 2,
    'bufferAheadToKeep': 2,
    'lowLatencyEnabled': true,
    'fastSwitchEnabled': true,
    'abr': {
     'limitBitrateByPortal': true
    },
   }
  });

  console.log(player.getSettings());

  setInterval(() => {
   console.log('Live latency= ', player.getCurrentLiveLatency());
   console.log('Buffer length= ', player.getBufferLength('video'));
  }, 3000);

  player.initialize(document.querySelector("#videoPlayer"), url, true);
 })();
</script>

With the online test video (https://dash.akamaized.net/envivio/EnvivioDash3/manifest.mpd) I see that the live latency value is close to 2 seconds (but I have no way to actually confirm it, since it's a streamed video file; in my office I have a camera, so there I can actually compare the latency between real life and the stream I get).
However, when working locally with my NMS, this value does not seem to want to go below 20-25 seconds.



Am I doing something wrong? Is there any configuration on the player (client-side HTML) that I'm forgetting?
Or is there a missing configuration I should add on the server side (NMS)?
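
For debugging, here is a small sketch that fetches the live MPD and prints the timing attributes relevant to the player's live-edge calculation; it assumes the localhost URL above, and not all attributes may be present in the NMS output:

import urllib.request
import xml.etree.ElementTree as ET

url = "http://localhost:8888/live/office/index.mpd"
with urllib.request.urlopen(url) as r:
    mpd = ET.fromstring(r.read())

# MPD-level attributes relevant to live latency (None if absent)
for attr in ("availabilityStartTime", "publishTime", "minimumUpdatePeriod",
             "minBufferTime", "suggestedPresentationDelay"):
    print(attr, "=", mpd.get(attr))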


-
How to apply dynamic watermarking for users watching video in real-time? [closed]
3 January, by Barun Bhattacharjee. I am working on a video streaming project where I need to apply a dynamic watermark (e.g., username and email) in real-time for security purposes. The video is streamed in DASH format, and the segment files are in .m4s format, generated via FFmpeg.


Challenges:
Is it possible to directly apply a dynamic watermark to .m4s segment files?

Video segments are generated using FFmpeg with the following command:


import ffmpeg  # ffmpeg-python

(
    ffmpeg
    .input(video_path)
    .output(mpd_path,
            format='dash',
            map='0',
            video_bitrate='2400k',
            video_size='1920x1080',
            vcodec='libx264',
            seg_duration='4',  # sets segment duration to 4 seconds
            acodec='copy')
    .run()
)




What I tried:
I attempted to use FFmpeg to apply a watermark dynamically to the .m4s files using the drawtext filter, but .m4s files are not always recognized as valid input for FFmpeg operations.


# FFmpeg command to add a watermark to an .m4s file
try:
    # FFmpeg processing
    out, err = (
        ffmpeg
        .input(m4s_file_path)  # input the segment file
        .filter(
            "drawtext",
            text=user_info,
            fontfile="font/dejavu-sans/DejaVuSans-Bold.ttf",
            fontsize=24,
            fontcolor="white",
            x=10,
            y=10
        )
        .output(
            "pipe:",  # stream output as a byte stream
            format="mp4",  # output format MP4 (compatible with MPEG-DASH)
            vcodec="libx264",
            acodec="copy",
            movflags="frag_keyframe+empty_moov"
        )
        .run(capture_stdout=True, capture_stderr=True)
    )

    logger.info(f"FFmpeg process completed. stdout length: {len(out)}, stderr: {err.decode('utf-8')}")
    logger.error(f"FFmpeg stderr: {err.decode('utf-8')}")
    return out  # return the processed video stream data

except ffmpeg.Error as e:
    stderr_output = e.stderr.decode('utf-8') if e.stderr else "No stderr available"
    logger.error(f"FFmpeg error: {stderr_output}")
    raise RuntimeError(f"Error processing video: {stderr_output}")




Error I faced:


video-streaming-backend | [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f1bf99cc640] Could not find codec parameters for stream 0 (Video: h264 (avc1 / 0x31637661), none(tv, bt709), 1920x1012): unspecified pixel format
video-streaming-backend | Consider increasing the value for the 'analyzeduration' (10000000) and 'probesize' (5000000) options
video-streaming-backend | Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'http://web:8000/media/stream_video/chunks/ec1db006-b488-47ad-8220-79a05bcaae39/segments/init-stream0.m4s':
video-streaming-backend | Metadata:
video-streaming-backend | major_brand : iso5
video-streaming-backend | minor_version : 512
video-streaming-backend | compatible_brands: iso5iso6mp41
video-streaming-backend | encoder : Lavf60.16.100
video-streaming-backend | Duration: N/A, bitrate: N/A
video-streaming-backend | Stream #0:0[0x1](und): Video: h264 (avc1 / 0x31637661), none(tv, bt709), 1920x1012, SAR 1:1 DAR 480:253, 12288 tbr, 12288 tbn (default)
video-streaming-backend | Metadata:
video-streaming-backend | handler_name : VideoHandler
video-streaming-backend | vendor_id : [0][0][0][0]
video-streaming-backend | Stream mapping:
video-streaming-backend | Stream #0:0 (h264) -> drawtext:default
video-streaming-backend | drawtext:default -> Stream #0:0 (libx264)
video-streaming-backend | Press [q] to stop, [?] for help
video-streaming-backend | Cannot determine format of input stream 0:0 after EOF
video-streaming-backend | Error marking filters as finished
video-streaming-backend | Error while filtering: Invalid data found when processing input
video-streaming-backend | [out#0/mp4 @ 0x7f1bf8e73100] Nothing was written into output file, because at least one of its streams received no packets.
video-streaming-backend | frame= 0 fps=0.0 q=0.0 Lsize= 0kB time=N/A bitrate=N/A speed=N/A 
video-streaming-backend | Conversion failed!




These errors have left me wondering if .m4s is a viable format for dynamic watermarking. If it's not, what would be the correct approach?
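
One thing that may explain the failure: with DASH, the codec parameters live in the init segment (init-stream0.m4s in the log above), so a media segment on its own cannot be probed. Below is a rough, untested sketch of feeding FFmpeg the init segment concatenated with one media segment; the helper name and paths are placeholders of mine:

import ffmpeg  # ffmpeg-python


def watermark_segment(init_path, m4s_path, user_info):
    # Concatenate the DASH init segment (which carries the codec
    # parameters) with one media segment, so FFmpeg can decode the
    # pair as an ordinary fragmented MP4.
    with open(init_path, "rb") as f_init, open(m4s_path, "rb") as f_seg:
        payload = f_init.read() + f_seg.read()

    out, err = (
        ffmpeg
        .input("pipe:", format="mp4")
        .filter("drawtext",
                text=user_info,
                fontfile="font/dejavu-sans/DejaVuSans-Bold.ttf",
                fontsize=24,
                fontcolor="white",
                x=10,
                y=10)
        .output("pipe:", format="mp4", vcodec="libx264",
                movflags="frag_keyframe+empty_moov")
        .run(input=payload, capture_stdout=True, capture_stderr=True)
    )
    return out

Note that the result is a self-contained fragmented MP4, not a drop-in replacement for the original .m4s segment inside the DASH manifest.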


-
Real time audio streaming from ffmpeg to browser (am I missing something?)
19 September 2022, by Яктенс Тид. I have tried a couple of solutions already, but nothing works for me.
I want to stream audio from my PC to another computer with almost zero latency. Things are working fine so far in terms of lag: the sound is clear and not choppy at all. But there is a delay between the moment audio starts playing on my PC and on the remote PC. For example, when I click YouTube's 'play' button, audio starts playing on the remote machine only after 3-4 seconds. The same when I click 'Pause': the sound on the remote PC stops only after a couple of seconds.


I've tried using websockets and a plain audio tag, but no luck so far.


For example, this is my solution using websockets and pipes:


import asyncio

import websockets
import win32file
import win32pipe


def create_pipe():
    return win32pipe.CreateNamedPipe(
        r'\\.\pipe\__audio_ffmpeg', win32pipe.PIPE_ACCESS_INBOUND,
        win32pipe.PIPE_TYPE_MESSAGE |
        win32pipe.PIPE_READMODE_MESSAGE |
        win32pipe.PIPE_WAIT, 1, 1024 * 8, 1024 * 8, 0, None)


async def echo(websocket):
    pipe = create_pipe()
    win32pipe.ConnectNamedPipe(pipe, None)
    while True:
        # ReadFile returns (result, data); forward the raw bytes
        data = win32file.ReadFile(pipe, 1024 * 2)
        await websocket.send(data[1])


async def main():
    async with websockets.serve(echo, "0.0.0.0", 7777):
        await asyncio.Future()  # run forever


if __name__ == '__main__':
    asyncio.run(main())
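
For reference, a minimal client sketch I can run to check how quickly chunks arrive from the server above (it assumes the server is reachable on localhost; the timing printout is only a rough indicator):

import asyncio
import time

import websockets


async def probe():
    # Connect to the audio websocket server above and time the first chunks.
    async with websockets.connect("ws://localhost:7777") as ws:
        t0 = time.monotonic()
        for _ in range(5):
            chunk = await ws.recv()
            print(f"{time.monotonic() - t0:.3f}s  {len(chunk)} bytes")


if __name__ == '__main__':
    asyncio.run(probe())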



The way I start ffmpeg:


.\ffmpeg.exe -f dshow -i audio="Stereo Mix (Realtek High Definition Audio)" -acodec libmp3lame -ab 320k -f mp3 -probesize 32 -muxdelay 0.01 -y \\.\pipe\__audio_ffmpeg



On the JS side the code is a little bit long, but essentially I am just reading a web socket and appending to a buffer:


this.buffer = this.mediaSource.addSourceBuffer('audio/mpeg')



Also, as you can see, I tried using the -probesize 32 -muxdelay 0.01 flags, but no luck there either.


I tried using a plain audio tag as well, but the couple-of-seconds delay is still there.


What can I do? Am I missing something? Maybe I have to disable buffering somewhere?