
Media (91)

Other articles (55)

  • Keeping control of your media in your hands

    13 April 2011

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and to the companies that profit from media sharing.
    While using MediaSPIP, you are invited to avoid words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible, and development is based on expanding the (...)

  • MediaSPIP v0.2

    21 June 2013

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013, and it is announced here.
    The zip file provided here contains only the MediaSPIP sources, in standalone form.
    As with the previous version, all software dependencies must be installed manually on the server.
    If you want to use this archive for a farm-mode installation, further modifications are also required (...)

  • Making files available

    14 April 2011

    By default, on initialisation, MediaSPIP does not let visitors download files, whether originals or the results of their transformation or encoding; it only lets them view the files.
    However, it is easy to give visitors access to these documents, in several different forms.
    All of this is handled on the template configuration page. Go to the channel's administration area and choose, in the navigation, (...)

On other sites (7723)

  • ffmpeg creating mpeg-dash chunk files too slowly resulting in 404 errors

    17 July 2021, by Danny

    I have a hardware encoder feeding FFmpeg to create an MPEG-DASH low-latency stream. It works well at first, but after letting FFmpeg run for a while and reloading the page there are many 404 errors.

    


    When that happens, the dash.js player tries to fetch the segment file at the "live edge", but the file has not yet been created by FFmpeg. For example, after running for 20-30 minutes and loading the web page player, debug code in the web server shows:

    


    2021-07-16 16:46:30.64 : GET REQUEST : /data/ott/chunk-stream0-00702.m4s
    2021-07-16 16:46:30.67 : NOT FOUND. Latest files on filesystem:
        chunk-stream0-00699.m4s.tmp
        chunk-stream0-00698.m4s
        chunk-stream0-00697.m4s
        chunk-stream0-00696.m4s
        ...


    


    So you can see the browser requested chunk 702 but the latest on the server is (part of) 699. With 2 second chunks, that is 3-5 seconds of content not yet available.
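    The player's request is driven purely by clock arithmetic, which is why it can outrun the encoder. A minimal sketch of that arithmetic (the start number and timeline values are assumptions for illustration, not read from the real manifest):

```python
# Sketch of how a DASH player derives the "live edge" segment number from
# wall-clock time (SegmentTemplate with $Number$). Values are illustrative.
SEG_DURATION = 2.0   # seconds per segment, matching -seg_duration
START_NUMBER = 1     # typical SegmentTemplate@startNumber

def live_edge_segment(now, availability_start_time):
    """Segment number a player will request at wall-clock time `now`."""
    elapsed = now - availability_start_time
    return START_NUMBER + int(elapsed // SEG_DURATION)

# If the muxer falls behind wall-clock time, the computed number runs ahead
# of the newest file on disk and the request 404s:
requested = live_edge_segment(now=1402.0, availability_start_time=0.0)  # 702, as in the log
newest_on_disk = 699   # still being written (.tmp) in the log above
gap_seconds = (requested - newest_on_disk) * SEG_DURATION
```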

    


    To analyze this, I modified FFmpeg's dashenc.c to add a timestamp every time a file is opened, which displays like:

    


    [dash @ 0x9b17c0] 21:48:52.935 1626443332.935  : dashenc_io_open() - opened /data/ott/chunk-stream0-00060.m4s.tmp


    


    And loaded the timestamps into Excel.

    


    Despite a segment duration of 2.000 seconds, the average time between file opens is 2.011 seconds. Over two hours this accumulated into a 45-second difference between the calculated live edge and the latest file on the server.
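    An extra 11 ms per segment is tiny, but it compounds. A quick sanity check, using only plain arithmetic over the averages quoted above, shows it accounts for most of the observed gap:

```python
# Back-of-the-envelope check of the accumulated drift: segments are opened
# every 2.011 s on average instead of the nominal 2.000 s.
nominal  = 2.000
measured = 2.011
wall_clock = 2 * 3600.0   # two hours of streaming, in seconds

segments_expected = wall_clock / nominal    # what clock-based live-edge math assumes
segments_written  = wall_clock / measured   # what the muxer actually produced
gap_seconds = (segments_expected - segments_written) * nominal
# roughly 39 s of drift, the same order as the ~45 s observed on the server
```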

    


    The HW encoder is set to 25 fps and a GOP size of 5. I've confirmed both by analyzing the H.264 NALUs output by the HW encoder.
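    For reference, those encoder settings imply a keyframe cadence that lines up exactly with the -frag_duration and -seg_duration values used, so misaligned cut points seem an unlikely cause. Simple arithmetic over the numbers quoted, nothing measured:

```python
# Keyframe cadence implied by the settings above: 25 fps with a GOP of 5.
fps = 25
gop_frames = 5
seg_duration = 2.0    # -seg_duration
frag_duration = 0.2   # -frag_duration

gop_seconds = gop_frames / fps                    # 0.2 s between keyframes
gops_per_segment = round(seg_duration / gop_seconds)  # 10 whole GOPs per segment
frags_align = (gop_seconds == frag_duration)      # fragments cut on GOP boundaries
```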

    


    My question: Is this a bug in FFmpeg, or can I avoid the problem by adjusting the settings of the HW encoder and/or the FFmpeg options?

    


    REFERENCE

    


    FFmpeg: version 4.4
CentOS 8
Apache 2.4.37


    


    FFmpeg command line (the pipe is fed by a process reading the HW encoder):

    


    ffmpeg -re -loglevel verbose -an -f h264 -i pipe:17 -c:v copy \
-f dash -dash_segment_type mp4 -b:v 1000000 -seg_duration 2.000000 \
-frag_type duration -frag_duration 0.200000 -target_latency 1 \
-window_size 10 -extra_window_size 5 -remove_at_exit 1 -streaming 1 \
-ldash 1 -use_template 1 -use_timeline 0 -write_prft 1 -avioflags direct \
-fflags +nobuffer+flush_packets -format_options movflags=+cmaf \
-utc_timing_url /web/be/time.php /data/ott/master.mpd


    


    Modified dashenc_io_open() from dashenc.c

    


    static int
dashenc_io_open(AVFormatContext *s, AVIOContext **pb, char *filename, AVDictionary **options)
{
    DASHContext *c = s->priv_data;
    int http_base_proto = filename ? ff_is_http_proto(filename) : 0;
    int err = AVERROR_MUXER_NOT_FOUND;
    if (!*pb || !http_base_proto || !c->http_persistent)
    {
        err = s->io_open(s, pb, filename, AVIO_FLAG_WRITE, options);

        // My Debug: log wall-clock time (HH:MM:SS.mmm plus epoch seconds) at every file open
        {
            char buf[20], milli[60];
            struct timeb tp;            // needs <sys/timeb.h> and <time.h>

            ftime(&tp); // seconds + milliseconds
            struct tm *tmInfo = localtime(&tp.time);

            // e.g. "21:48:52.935 1626443332.935"
            strftime(buf, sizeof(buf), "%H:%M:%S", tmInfo);
            snprintf(milli, sizeof(milli), "%s.%03d %ld.%03d ", buf, tp.millitm, (long)tp.time, tp.millitm);

            av_log(s, AV_LOG_INFO, "%s : dashenc_io_open() - opened %s\n", milli, filename);
        }
    }
    return err;
}


    


  • ffmpeg, dash manifest cannot be created due to unspecified pixel format

    24 September 2015, by artworkad シ

    I am using ffmpeg 2.8 on OSX.

    I am trying to convert a short mp4 video to webm for adaptive streaming, as suggested at http://wiki.webmproject.org/adaptive-streaming/instructions-to-playback-adaptive-webm-using-dash, like this:

    VP9_DASH_PARAMS="-tile-columns 6 -frame-parallel 1"

    ffmpeg -i t2.mp4 -c:v libvpx-vp9 -s 160x90 -b:v 250k -keyint_min 150 -g 150 ${VP9_DASH_PARAMS} -an -f webm -dash 1 video_160x90_250k.webm
    ffmpeg -i t2.mp4 -c:a libvorbis -b:a 128k -vn -f webm -dash 1 audio_128k.webm

    ffmpeg \
    -f webm_dash_manifest -i video_160x90_250k.webm \
    -f webm_dash_manifest -i audio_128k.webm \
    -c copy -map 0 -map 1 \
    -f webm_dash_manifest \
    -adaptation_sets "id=0,streams=0 id=1,streams=1" \
    manifest.mpd

    However, this gives me an "unspecified pixel format" warning:

    [webm_dash_manifest @ 0x7f9414812800] Could not find codec parameters for stream 0 (Video: vp9, none, 160x90): unspecified pixel format
    Consider increasing the value for the 'analyzeduration' and 'probesize' options
    video_160x90_250k.webm: could not find codec parameters
    Input #0, webm_dash_manifest, from 'video_160x90_250k.webm':
     Metadata:
       encoder         : Lavf56.36.100
     Duration: 00:00:09.97, bitrate: 111 kb/s
       Stream #0:0: Video: vp9, none, 160x90, SAR 1:1 DAR 16:9, 23.98 fps, 23.98 tbr, 1k tbn, 1k tbc (default)
       Metadata:
         webm_dash_manifest_duration: 9969
         webm_dash_manifest_initialization_range: 437
         webm_dash_manifest_file_name: video_160x90_250k.webm
         webm_dash_manifest_track_number: 1
         webm_dash_manifest_cues_start: 139297
         webm_dash_manifest_cues_end: 139399
         webm_dash_manifest_bandwidth: 99164
         webm_dash_manifest_cluster_keyframe: 1
         webm_dash_manifest_cue_timestamps: 0,2085,4171,6256,8342
    Input #1, webm_dash_manifest, from 'audio_128k.webm':
     Metadata:
       encoder         : Lavf56.36.100
     Duration: 00:00:10.01, bitrate: 120 kb/s
       Stream #1:0: Audio: vorbis, 48000 Hz, stereo, fltp (default)
       Metadata:
         webm_dash_manifest_duration: 10009
         webm_dash_manifest_initialization_range: 4697
         webm_dash_manifest_file_name: audio_128k.webm
         webm_dash_manifest_track_number: 1
         webm_dash_manifest_cues_start: 151174
         webm_dash_manifest_cues_end: 151240
         webm_dash_manifest_bandwidth: 105517
         webm_dash_manifest_cluster_keyframe: 1
         webm_dash_manifest_cue_timestamps: 0,4999,9998
    Output #0, webm_dash_manifest, to 'manifest.mpd':
     Metadata:
       encoder         : Lavf56.36.100
       Stream #0:0: Video: vp9, none, 160x90 [SAR 1:1 DAR 16:9], q=2-31, 23.98 fps, 23.98 tbr, 1k tbn, 1k tbc (default)
       Metadata:
         webm_dash_manifest_duration: 9969
         webm_dash_manifest_initialization_range: 437
         webm_dash_manifest_file_name: video_160x90_250k.webm
         webm_dash_manifest_track_number: 1
         webm_dash_manifest_cues_start: 139297
         webm_dash_manifest_cues_end: 139399
         webm_dash_manifest_bandwidth: 99164
         webm_dash_manifest_cluster_keyframe: 1
         webm_dash_manifest_cue_timestamps: 0,2085,4171,6256,8342
       Stream #0:1: Video: vorbis, none, q=2-31, 1k tbn, 1k tbc (default)
       Metadata:
         webm_dash_manifest_duration: 10009
         webm_dash_manifest_initialization_range: 4697
         webm_dash_manifest_file_name: audio_128k.webm
         webm_dash_manifest_track_number: 1
         webm_dash_manifest_cues_start: 151174
         webm_dash_manifest_cues_end: 151240
         webm_dash_manifest_bandwidth: 105517
         webm_dash_manifest_cluster_keyframe: 1
         webm_dash_manifest_cue_timestamps: 0,4999,9998
    Stream mapping:
     Stream #0:0 -> #0:0 (copy)
     Stream #1:0 -> #0:1 (copy)
    Press [q] to stop, [?] for help
    frame=    0 fps=0.0 q=-1.0 Lsize=       1kB time=00:00:00.00 bitrate=N/A    
    video:0kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
    Output file is empty, nothing was encoded (check -ss / -t / -frames parameters if used)

    Nevertheless, the manifest file is created. I tried to specify the pixel format:

    -pix_fmt yuv420p

    However, this did not change anything; the warning remains the same.

    Any ideas why the warning appears and how to fix it?

  • Live streaming : node-media-server + Dash.js configured for real-time low latency

    7 July 2021, by Maoration

    We're working on an app that enables live monitoring of your backyard.
Each client has a camera connected to the internet, streaming to our public node.js server.

    



    I'm trying to use node-media-server to publish an MPEG-DASH (or HLS) stream to be available for our app clients, on different networks, bandwidths and resolutions around the world.

    



    Our goal is to get as close as possible to live "real-time", so you can monitor what happens in your backyard instantly.

    



    The technical flow accomplished so far is:

    



      

    1. An ffmpeg process on our server processes the incoming camera stream (a separate child process for each camera) and publishes the stream via RTMP on the local machine for node-media-server to use as an 'input' (we are also saving segmented files, generating thumbnails, etc.). The ffmpeg command responsible for that is:

      



      -c:v libx264 -preset ultrafast -tune zerolatency -b:v 900k -f flv rtmp://127.0.0.1:1935/live/office

    2. node-media-server is running with what I found to be the default configuration for 'live-streaming':

       private NMS_CONFIG = {
         server: {
           secret: 'thisisnotmyrealsecret',
         },
         rtmp_server: {
           rtmp: {
             port: 1935,
             chunk_size: 60000,
             gop_cache: false,
             ping: 60,
             ping_timeout: 30,
           },
           http: {
             port: 8888,
             mediaroot: './server/media',
             allow_origin: '*',
           },
           trans: {
             ffmpeg: '/usr/bin/ffmpeg',
             tasks: [
               {
                 app: 'live',
                 hls: true,
                 hlsFlags: '[hls_time=2:hls_list_size=3:hls_flags=delete_segments]',
                 dash: true,
                 dashFlags: '[f=dash:window_size=3:extra_window_size=5]',
               },
             ],
           },
         },
       };

    3. As I understand it, out of the box NMS (node-media-server) publishes the input stream it receives in multiple output formats: flv, mpeg-dash and hls. With the various online players for these formats I'm able to access and play the stream using the localhost url. With mpeg-dash and hls I'm getting anywhere between 10-15 seconds of delay, and more.
    




    



    My goal now is to implement a local client-side mpeg-dash player using dash.js, configured to be as close as possible to live.

    



    My code for that is:

    



    

    

    <div>
        <video id="videoPlayer" autoplay controls></video>
    </div>
    <script src="https://cdnjs.cloudflare.com/ajax/libs/dashjs/3.0.2/dash.all.min.js"></script>

    <script>
        (function(){
            // var url = "https://dash.akamaized.net/envivio/EnvivioDash3/manifest.mpd";
            var url = "http://localhost:8888/live/office/index.mpd";
            var player = dashjs.MediaPlayer().create();

            // config
            targetLatency = 2.0;        // Lowering this value will lower latency but may decrease the player's ability to build a stable buffer.
            minDrift = 0.05;            // Minimum latency deviation allowed before activating catch-up mechanism.
            catchupPlaybackRate = 0.5;  // Maximum catch-up rate, as a percentage, for low latency live streams.
            stableBuffer = 2;           // The time that the internal buffer target will be set to post startup/seeks (NOT top quality).
            bufferAtTopQuality = 2;     // The time that the internal buffer target will be set to once playing the top quality.

            player.updateSettings({
                'streaming': {
                    'liveDelay': 2,
                    'liveCatchUpMinDrift': 0.05,
                    'liveCatchUpPlaybackRate': 0.5,
                    'stableBufferTime': 2,
                    'bufferTimeAtTopQuality': 2,
                    'bufferTimeAtTopQualityLongForm': 2,
                    'bufferToKeep': 2,
                    'bufferAheadToKeep': 2,
                    'lowLatencyEnabled': true,
                    'fastSwitchEnabled': true,
                    'abr': {
                        'limitBitrateByPortal': true
                    },
                }
            });

            console.log(player.getSettings());

            setInterval(() => {
                console.log('Live latency= ', player.getCurrentLiveLatency());
                console.log('Buffer length= ', player.getBufferLength('video'));
            }, 3000);

            player.initialize(document.querySelector("#videoPlayer"), url, true);
        })();
    </script>

    With the online test video (https://dash.akamaized.net/envivio/EnvivioDash3/manifest.mpd) I see that the live latency value is close to 2 seconds (but I have no way to actually confirm it, since it's a streamed video file; in my office I have a camera, so I can actually compare the latency between real life and the stream I get). However, when working locally with my NMS, the value does not want to go below 20-25 seconds.
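    Part of that 20-25 seconds is structural: with 2-second segments and a 3-segment window, the packaging chain alone puts a floor of roughly ten seconds under the achievable latency. A rough back-of-the-envelope estimate over the settings in the NMS config above (the player-buffer term is an assumption about typical player behaviour, not a measured value):

```python
# Rough latency floor implied by the packaging settings quoted above
# (hls_time=2 / hls_list_size=3 for HLS, window_size=3 for DASH).
segment_seconds = 2
window_segments = 3

packaging_delay = segment_seconds                    # a segment is only listed once complete
window_depth    = segment_seconds * window_segments  # players often start several segments back
player_buffer   = 2 * segment_seconds                # assumed startup buffer, not measured

estimate = packaging_delay + window_depth + player_buffer   # ~12 s before any network delay
```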


    Am I doing something wrong? Is there some configuration on the player (client-side html) that I'm forgetting, or a missing configuration I should add on the server side (NMS)?
