Advanced search

Media (1)

Keyword: - Tags -/iphone

Other articles (75)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

  • Customising by adding your logo, banner or background image

    5 September 2013

    Some themes take into account three customisation elements: adding a logo; adding a banner; adding a background image.

  • Updating from version 0.1 to 0.2

    24 June 2013

    An explanation of the notable changes made when moving from MediaSPIP version 0.1 to version 0.2. What's new?
    Software dependencies: the latest versions of FFmpeg (>= v1.2.1) are now used; the dependencies for Smush are installed; MediaInfo and FFprobe are installed to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)

On other sites (10164)

  • Live streaming: node-media-server + Dash.js configured for real-time low latency

    7 July 2021, by Maoration

    We're working on an app that enables live monitoring of your backyard. Each client has a camera connected to the internet, streaming to our public node.js server.

    I'm trying to use node-media-server to publish an MPEG-DASH (or HLS) stream to be available to our app clients, on different networks, bandwidths and resolutions around the world.

    Our goal is to get as close as possible to live "real-time", so you can monitor what happens in your backyard instantly.

    The technical flow we have already accomplished is:

    1. An ffmpeg process on our server processes the incoming camera stream (a separate child process for each camera) and publishes the stream via RTMP on the local machine, for node-media-server to use as an 'input' (we are also saving segmented files, generating thumbnails, etc.). The ffmpeg command responsible for that is:

      -c:v libx264 -preset ultrafast -tune zerolatency -b:v 900k -f flv rtmp://127.0.0.1:1935/live/office
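
      (The input side of this command was lost when pasting; end to end it looks roughly like the following, where the camera URL is just a placeholder rather than our real input:)

      ffmpeg -i rtsp://CAMERA_HOST/stream -c:v libx264 -preset ultrafast -tune zerolatency -b:v 900k -f flv rtmp://127.0.0.1:1935/live/office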

    2. node-media-server is running with what I found to be the default configuration for 'live streaming':

      private NMS_CONFIG = {
        server: {
          secret: 'thisisnotmyrealsecret',
        },
        rtmp_server: {
          rtmp: {
            port: 1935,
            chunk_size: 60000,
            gop_cache: false,
            ping: 60,
            ping_timeout: 30,
          },
          http: {
            port: 8888,
            mediaroot: './server/media',
            allow_origin: '*',
          },
          trans: {
            ffmpeg: '/usr/bin/ffmpeg',
            tasks: [
              {
                app: 'live',
                hls: true,
                hlsFlags: '[hls_time=2:hls_list_size=3:hls_flags=delete_segments]',
                dash: true,
                dashFlags: '[f=dash:window_size=3:extra_window_size=5]',
              },
            ],
          },
        },
      };
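
      One knob I'm considering here: assuming the bracketed hlsFlags/dashFlags are handed straight through to ffmpeg's hls and dash muxers (which the syntax suggests, though I haven't verified it in NMS), shorter segments plus the dash muxer's streaming mode should lower the latency floor, e.g.:

      dashFlags: '[f=dash:seg_duration=2:streaming=1:window_size=3:extra_window_size=5]',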

    3. As I understand it, out of the box NMS (node-media-server) publishes the input stream it gets in multiple output formats: flv, mpeg-dash and hls. With all sorts of online players for these formats, I'm able to access the stream using the URL on localhost (example URLs below). With mpeg-dash and hls I'm getting anything between 10 and 15 seconds of delay, and more.
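
      The playback URLs follow NMS's usual patterns (the flv and hls paths are how I remember them from the docs, so treat those two as assumptions; the dash one is the URL I actually use below):

      http://localhost:8888/live/office.flv
      http://localhost:8888/live/office/index.m3u8
      http://localhost:8888/live/office/index.mpd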

    My goal now is to implement a local client-side mpeg-dash player using dash.js, configured to be as close as possible to live.

    My code for that is:

    <!DOCTYPE html>
    <html>
    <body>
        <div>
            <video id="videoPlayer" autoplay controls></video>
        </div>
        <script src="https://cdnjs.cloudflare.com/ajax/libs/dashjs/3.0.2/dash.all.min.js"></script>

        <script>
            (function(){
                // var url = "https://dash.akamaized.net/envivio/EnvivioDash3/manifest.mpd";
                var url = "http://localhost:8888/live/office/index.mpd";
                var player = dashjs.MediaPlayer().create();

                // config
                targetLatency = 2.0;        // Lowering this value will lower latency but may decrease the player's ability to build a stable buffer.
                minDrift = 0.05;            // Minimum latency deviation allowed before activating catch-up mechanism.
                catchupPlaybackRate = 0.5;  // Maximum catch-up rate, as a percentage, for low latency live streams.
                stableBuffer = 2;           // The time that the internal buffer target will be set to post startup/seeks (NOT top quality).
                bufferAtTopQuality = 2;     // The time that the internal buffer target will be set to once playing the top quality.

                player.updateSettings({
                    'streaming': {
                        'liveDelay': 2,
                        'liveCatchUpMinDrift': 0.05,
                        'liveCatchUpPlaybackRate': 0.5,
                        'stableBufferTime': 2,
                        'bufferTimeAtTopQuality': 2,
                        'bufferTimeAtTopQualityLongForm': 2,
                        'bufferToKeep': 2,
                        'bufferAheadToKeep': 2,
                        'lowLatencyEnabled': true,
                        'fastSwitchEnabled': true,
                        'abr': {
                            'limitBitrateByPortal': true
                        },
                    }
                });

                console.log(player.getSettings());

                setInterval(() => {
                  console.log('Live latency= ', player.getCurrentLiveLatency());
                  console.log('Buffer length= ', player.getBufferLength('video'));
                }, 3000);

                player.initialize(document.querySelector("#videoPlayer"), url, true);

            })();
        </script>
    </body>
    </html>

    With the online test video (https://dash.akamaized.net/envivio/EnvivioDash3/manifest.mpd) I see that the live latency value is close to 2 secs (but I have no way to actually confirm it, since it's a streamed video file; in my office I have a camera, so there I can actually compare latency between real life and the stream I get). However, when working locally with my NMS, it seems this value does not want to go below 20-25 seconds.
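
    Doing the arithmetic, assuming I understand the pieces correctly: the latency floor is roughly segment duration × the number of segments the player holds, plus encoding/packaging delay. With hls_time=2 and hls_list_size=3 above, that's already about 2 s × 3 ≈ 6 s for HLS; and if NMS leaves ffmpeg's dash muxer at its default segment duration (5 s, as far as I know), three segments put the floor around 15 s before my liveDelay: 2 is even reachable. dash.js simply cannot hold a 2-second live delay when a single segment is longer than that.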

    Am I doing something wrong? Is there any configuration on the player (the client-side HTML) that I'm forgetting, or a missing configuration I should add on the server side (NMS)?

  • How to apply dynamic watermarking for users watching video in real time? [closed]

    3 January, by Barun Bhattacharjee

    I am working on a video streaming project where I need to apply dynamic watermarking (e.g., the username and email) in real time for security purposes. The video is streamed in DASH format, and the segment files are in .m4s format, generated via FFmpeg.

    Challenges:
    Is it possible to directly apply dynamic watermarking to .m4s segment files?

    Video segments are generated using FFmpeg with the following command:

    (
        ffmpeg
        .input(video_path)
        .output(mpd_path,
                format='dash',
                map='0',
                video_bitrate='2400k',
                video_size='1920x1080',
                vcodec='libx264',
                seg_duration='4',  # Sets segment duration to 4 seconds
                acodec='copy')
        .run()
    )

    What I tried:
    I attempted to use FFmpeg to apply a watermark dynamically to the .m4s files using the drawtext filter, but .m4s files are not always recognized as valid input for FFmpeg operations.

    # FFmpeg command to add watermark to m4s file
    try:
        # FFmpeg processing
        out, err = (
            ffmpeg
            .input(m4s_file_path)  # Input the segment file
            .filter(
                "drawtext",
                text=user_info,
                fontfile="font/dejavu-sans/DejaVuSans-Bold.ttf",
                fontsize=24,
                fontcolor="white",
                x=10,
                y=10
            )
            .output(
                "pipe:",  # Stream output as a byte stream
                format="mp4",  # Output format as MP4 (compatible with MPEG-DASH)
                vcodec="libx264",
                acodec="copy",
                movflags="frag_keyframe+empty_moov"
            )
            .run(capture_stdout=True, capture_stderr=True)
        )

        logger.info(f"FFmpeg process completed. stdout length: {len(out)}, stderr: {err.decode('utf-8')}")
        logger.error(f"FFmpeg stderr: {err.decode('utf-8')}")
        return out  # Return the processed video stream data

    except ffmpeg.Error as e:
        stderr_output = e.stderr.decode('utf-8') if e.stderr else "No stderr available"
        logger.error(f"FFmpeg error: {stderr_output}")

        raise RuntimeError(f"Error processing video: {stderr_output}")

    Error I faced:

    video-streaming-backend  | [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7f1bf99cc640] Could not find codec parameters for stream 0 (Video: h264 (avc1 / 0x31637661), none(tv, bt709), 1920x1012): unspecified pixel format
    video-streaming-backend  | Consider increasing the value for the 'analyzeduration' (10000000) and 'probesize' (5000000) options
    video-streaming-backend  | Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'http://web:8000/media/stream_video/chunks/ec1db006-b488-47ad-8220-79a05bcaae39/segments/init-stream0.m4s':
    video-streaming-backend  |   Metadata:
    video-streaming-backend  |     major_brand     : iso5
    video-streaming-backend  |     minor_version   : 512
    video-streaming-backend  |     compatible_brands: iso5iso6mp41
    video-streaming-backend  |     encoder         : Lavf60.16.100
    video-streaming-backend  |   Duration: N/A, bitrate: N/A
    video-streaming-backend  |   Stream #0:0[0x1](und): Video: h264 (avc1 / 0x31637661), none(tv, bt709), 1920x1012, SAR 1:1 DAR 480:253, 12288 tbr, 12288 tbn (default)
    video-streaming-backend  |     Metadata:
    video-streaming-backend  |       handler_name    : VideoHandler
    video-streaming-backend  |       vendor_id       : [0][0][0][0]
    video-streaming-backend  | Stream mapping:
    video-streaming-backend  |   Stream #0:0 (h264) -> drawtext:default
    video-streaming-backend  |   drawtext:default -> Stream #0:0 (libx264)
    video-streaming-backend  | Press [q] to stop, [?] for help
    video-streaming-backend  | Cannot determine format of input stream 0:0 after EOF
    video-streaming-backend  | Error marking filters as finished
    video-streaming-backend  | Error while filtering: Invalid data found when processing input
    video-streaming-backend  | [out#0/mp4 @ 0x7f1bf8e73100] Nothing was written into output file, because at least one of its streams received no packets.
    video-streaming-backend  | frame=    0 fps=0.0 q=0.0 Lsize=       0kB time=N/A bitrate=N/A speed=N/A
    video-streaming-backend  | Conversion failed!

    These errors have left me wondering if .m4s is a viable format for dynamic watermarking. If it's not, what would be the correct approach?
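
    One thing the log does suggest: the init segment alone contains codec parameters but no frames, and a media segment alone contains frames but no codec parameters, so neither is decodable by itself. A sketch of what I plan to try next, prepending the init segment so FFmpeg sees a self-contained fragmented MP4 (the paths and the drawtext text here are placeholders, and I haven't verified this end to end):

    import ffmpeg

    # Placeholder paths: the DASH init segment and one media segment.
    init_path = "segments/init-stream0.m4s"
    segment_path = "segments/chunk-stream0-00001.m4s"

    # A bare .m4s has no codec parameters; prepending the init segment makes
    # the concatenated bytes a decodable fragmented MP4.
    with open(init_path, "rb") as f_init, open(segment_path, "rb") as f_seg:
        joined = f_init.read() + f_seg.read()

    # Pipe the joined bytes into ffmpeg and apply drawtext as before.
    out, err = (
        ffmpeg
        .input("pipe:", format="mp4")
        .filter("drawtext", text="user@example.com",
                fontsize=24, fontcolor="white", x=10, y=10)
        .output("pipe:", format="mp4", vcodec="libx264", acodec="copy",
                movflags="frag_keyframe+empty_moov")
        .run(capture_stdout=True, capture_stderr=True, input=joined)
    )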

  • Real time audio streaming from ffmpeg to browser (am I missing something?)

    19 September 2022, by Яктенс Тид

    I have tried a couple of solutions already, but nothing works for me. I want to stream audio from my PC to another computer with almost zero latency. Things are working fine so far in terms of lag: the sound is clear and not choppy at all, but there is something like a delay between the moment audio starts playing on my PC and on the remote PC. For example, when I click YouTube's 'play' button, audio starts playing on the remote machine only after 3-4 seconds. The same when I click 'pause': the sound on the remote PC stops after a couple of seconds.

    I've tried using websockets and a plain audio tag, but no luck so far.

    For example, this is my solution using websockets and pipes:

    import asyncio

    import websockets
    import win32file
    import win32pipe


    def create_pipe():
        return win32pipe.CreateNamedPipe(r'\\.\pipe\__audio_ffmpeg', win32pipe.PIPE_ACCESS_INBOUND,
                                         win32pipe.PIPE_TYPE_MESSAGE |
                                         win32pipe.PIPE_READMODE_MESSAGE |
                                         win32pipe.PIPE_WAIT, 1, 1024 * 8, 1024 * 8, 0, None)


    async def echo(websocket):
        pipe = create_pipe()
        win32pipe.ConnectNamedPipe(pipe, None)
        while True:
            data = win32file.ReadFile(pipe, 1024 * 2)
            await websocket.send(data[1])


    async def main():
        async with websockets.serve(echo, "0.0.0.0", 7777):
            await asyncio.Future()  # run forever


    if __name__ == '__main__':
        asyncio.run(main())

    The way I start ffmpeg:

    .\ffmpeg.exe -f dshow -i audio="Stereo Mix (Realtek High Definition Audio)" -acodec libmp3lame -ab 320k -f mp3 -probesize 32 -muxdelay 0.01 -y \\.\pipe\__audio_ffmpeg
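
    A variant I still want to try turns ffmpeg's own buffering down further; these are stock ffmpeg/dshow options (-audio_buffer_size, -fflags nobuffer, -flush_packets), but whether they actually help in this setup is untested:

    .\ffmpeg.exe -f dshow -audio_buffer_size 50 -fflags nobuffer -i audio="Stereo Mix (Realtek High Definition Audio)" -acodec libmp3lame -ab 320k -f mp3 -flush_packets 1 -muxdelay 0.01 -y \\.\pipe\__audio_ffmpeg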

    On the JS side the code is a little bit long, but essentially I am just reading the web socket and appending to a buffer:

    this.buffer = this.mediaSource.addSourceBuffer('audio/mpeg')

    Also, as you can see, I tried the -probesize 32 and -muxdelay 0.01 flags, but no luck there either.

    I tried using a plain <audio> tag as well, but this couple-of-seconds delay still exists.

    What can I do? Am I missing something? Maybe I have to disable buffering somewhere?
