

Other articles (112)

  • Automatic installation script for MediaSPIP

    25 April 2011

    To work around installation difficulties caused mainly by server-side software dependencies, an "all-in-one" bash installation script was created to ease this step on a server running a compatible Linux distribution.
    You need SSH access to your server and a "root" account in order to use it, which makes installing the dependencies possible. Contact your hosting provider if you do not have these.
    The documentation on using the installation script (...)

  • Requesting the creation of a channel

    12 March 2010

    Depending on how the platform is configured, the user may have two different ways to request the creation of a channel. The first is at the moment of signing up; the second, after signing up, by filling in a request form.
    Both methods ask for the same information and work in much the same way: the prospective user must fill in a series of form fields that first of all give the administrators information about (...)

  • Automatic backup of SPIP channels

    1 April 2010

    When setting up an open platform, it is important for hosts to have reasonably regular backups to guard against any potential problem.
    This task relies on two SPIP plugins: Saveauto, which makes a regular backup of the database as a MySQL dump (usable in phpMyAdmin), and mes_fichiers_2, which builds a zip archive of the site's important data (the documents, the elements (...)
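
    For readers outside SPIP, a minimal shell sketch of the same two-part idea (hypothetical paths, database name, and credentials; the plugins themselves handle this from within SPIP):

# Dump the database (what Saveauto automates) and archive the site's
# important files (what mes_fichiers_2 automates). All paths are examples.
mysqldump -u spip_user -p spip_db > /var/backups/spip_$(date +%F).sql
zip -r /var/backups/spip_files_$(date +%F).zip /var/www/site/IMG /var/www/site/config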

On other sites (12397)

  • Combining two multi-channel streams into one

    31 March 2023, by Christiaan Roselaar

    Hi to all you FFmpeg gurus.

    


    I'm doing my best to combine two incoming web streams on my server. One of them contains two (2) mono audio channels, the other five (5). The objective is to create one stream containing seven (7) channels, to be routed directly to the hardware.

    


    At the two transmission origins, host_a and host_b, the streams are generated as follows:

    


    host_a: $ ffmpeg -hide_banner -loglevel error -f jack -channels 1 -i Tonmeister_mic1 -f jack -channels 1 -i Tonmeister_mic2 -nostdin -filter_complex "[0:a][1:a]amerge=inputs=2[a]" -map "[a]" -codec:a libopus -b:a 128k -vbr on -packet_loss 10 -fec on -compression_level 10 -frame_duration 20 -application audio -f mpegts udp://X.X.X.X:yyyy &

    


    host_b: $ ffmpeg -hide_banner -loglevel error -f jack -channels 1 -i Tonmeister_mic1 -f jack -channels 1 -i Tonmeister_mic2 -f jack -channels 1 -i Tonmeister_mic3 -f jack -channels 1 -i Tonmeister_mic4 -f jack -channels 1 -i Tonmeister_mic5 -nostdin -filter_complex "[0:a][1:a][2:a][3:a][4:a]amerge=inputs=5[a]" -map "[a]" -codec:a libopus -b:a 128k -vbr on -packet_loss 10 -fec on -compression_level 10 -frame_duration 20 -application audio -f mpegts udp://X.X.X.X:zzzz &

    


    The streams do seem to come in all right:

    


    cjr-macbook:CJR$ ffprobe -hide_banner -i udp://127.0.0.1:yyyy
Input #0, mpegts, from 'udp://127.0.0.1:yyyy':
  Duration: N/A, start: 4759.783900, bitrate: N/A
  Program 1
    Metadata:
      service_name    : Service01
      service_provider: FFmpeg
    Stream #0:0[0x100]: Audio: opus (Opus / 0x7375704F), 48000 Hz, stereo, fltp

    


    cjr-macbook:CJR$ ffprobe -hide_banner -i udp://127.0.0.1:zzzz
Input #0, mpegts, from 'udp://127.0.0.1:zzzz':
  Duration: N/A, start: 4856.559000, bitrate: N/A
  Program 1
    Metadata:
      service_name    : Service01
      service_provider: FFmpeg
    Stream #0:0[0x100]: Audio: opus (Opus / 0x7375704F), 48000 Hz, 5.0, fltp

    


    However, I just cannot figure out how to combine the seven (7) channels present in these two incoming streams into one seven-channel stream.

    


    At the receiving end I tried various commands, among others:

    


    cjr-macbook:CJR$ ffmpeg -hide_banner -loglevel verbose -i udp://127.0.0.1:yyyy -i udp://127.0.0.1:zzzz -filter_complex "[0:a]channelsplit=channel_layout=stereo[b];[1:a]channelsplit=channel_layout=5.0[c];[b][c]amerge=inputs=7[d]" -map "[d]" sevenchannels.wav

    


    Any help appreciated!
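
    One possible direction (an untested sketch, using the same placeholder ports as above): amerge simply concatenates the channels of its inputs, so the channelsplit stage, which would in any case need one output label per channel rather than a single [b] or [c], may be unnecessary. Feeding both streams straight into amerge should yield a seven-channel output:

ffmpeg -hide_banner -i udp://127.0.0.1:yyyy -i udp://127.0.0.1:zzzz \
  -filter_complex "[0:a][1:a]amerge=inputs=2[out]" \
  -map "[out]" -c:a pcm_s16le sevenchannels.wav

    FFmpeg may warn that the merged 7-channel layout is non-standard; that is expected when combining stereo with 5.0.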

    



  • Dash.js not playing mpd files made with ffmpeg

    31 December 2022, by Macster

    I'm using FFmpeg to create the chunks and manifest of a WebM file that I want to live-stream with Dash.js. Unfortunately, Dash.js won't play the MPD file, no matter which way I create the chunks and manifest. However, the sample MPD URL from Dash.js does work.

    


    Commands

    


    ffmpeg -re -r 25 -i Dash/strm.webm \
-map 0:v:0 \
-pix_fmt yuv420p \
-c:v libvpx \
-s 640x480 -keyint_min 60 -g 60 -speed 6 -tile-columns 4 -frame-parallel 1 -threads 8 -static-thresh 0 -max-intra-rate 300 -deadline realtime -lag-in-frames 0 -error-resilient 1 \
-b:v 3000k \
-f webm_chunk \
-header "Dash/glass_360.hdr" \
-chunk_start_index 1 Dash/glass_360_%d.chk \
-map 0:a:0 \
-c:a libvorbis \
-b:a 128k -ar 44100 \
-f webm_chunk \
-audio_chunk_duration 2000 \
-header Dash/glass_171.hdr \
-chunk_start_index 1 Dash/glass_171_%d.chk
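
    For reference, with the template options above this pass should leave one header file plus numbered chunks per stream on disk, roughly as follows (same Dash/ prefix as in the command; numbering counts up from -chunk_start_index 1):

Dash/glass_360.hdr     # video initialization data
Dash/glass_360_1.chk   # first video chunk, then _2, _3, ...
Dash/glass_171.hdr     # audio initialization data
Dash/glass_171_1.chk   # first audio chunk, then _2, _3, ...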


# Manifest
ffmpeg \
-f webm_dash_manifest -live 1 \
-i Dash/glass_360.hdr \
-f webm_dash_manifest -live 1 \
-i Dash/glass_171.hdr \
-c copy \
-map 0 -map 1 \
-f webm_dash_manifest -live 1 \
-adaptation_sets "id=0,streams=0 id=1,streams=1" \
-chunk_start_index 1 \
-chunk_duration_ms 2000 \
-time_shift_buffer_depth 7200 \
-minimum_update_period 7200 Dash/glass_video_manifest.mpd


    


    Manifest output

    


    ffmpeg version git-2020-05-27-8b5ffae Copyright (c) 2000-2020 the FFmpeg developers
  built with gcc 9.3.1 (GCC) 20200523
  configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libdav1d --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libsrt --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa --enable-libspeex --enable-libxvid --enable-libaom --disable-w32threads --enable-libmfx --enable-ffnvcodec --enable-cuda-llvm --enable-cuvid --enable-d3d11va --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth --enable-libopenmpt --enable-amf
  libavutil      56. 49.100 / 56. 49.100
  libavcodec     58. 87.101 / 58. 87.101
  libavformat    58. 43.100 / 58. 43.100
  libavdevice    58.  9.103 / 58.  9.103
  libavfilter     7. 83.100 /  7. 83.100
  libswscale      5.  6.101 /  5.  6.101
  libswresample   3.  6.100 /  3.  6.100
  libpostproc    55.  6.100 / 55.  6.100
Input #0, webm_dash_manifest, from 'Dash/glass_360.hdr':
  Metadata:
    ENCODER         : Lavf58.43.100
  Duration: N/A, bitrate: N/A
    Stream #0:0(eng): Video: vp8, yuv420p, 640x480, SAR 1:1 DAR 4:3, 1k tbr, 1k tbn, 1k tbc (default)
    Metadata:
      ALPHA_MODE      : 1
      ENCODER         : Lavc58.87.101 libvpx
      webm_dash_manifest_file_name: glass_360.hdr
      webm_dash_manifest_track_number: 1
Input #1, webm_dash_manifest, from 'Dash/glass_171.hdr':
  Metadata:
    ENCODER         : Lavf58.43.100
  Duration: N/A, bitrate: N/A
    Stream #1:0(eng): Audio: vorbis, 44100 Hz, mono, fltp (default)
    Metadata:
      ENCODER         : Lavc58.87.101 libvorbis
      webm_dash_manifest_file_name: glass_171.hdr
      webm_dash_manifest_track_number: 1
Output #0, webm_dash_manifest, to 'Dash/glass_video_manifest.mpd':
  Metadata:
    encoder         : Lavf58.43.100
    Stream #0:0(eng): Video: vp8, yuv420p, 640x480 [SAR 1:1 DAR 4:3], q=2-31, 1k tbr, 1k tbn, 1k tbc (default)
    Metadata:
      ALPHA_MODE      : 1
      ENCODER         : Lavc58.87.101 libvpx
      webm_dash_manifest_file_name: glass_360.hdr
      webm_dash_manifest_track_number: 1
    Stream #0:1(eng): Audio: vorbis, 44100 Hz, mono, fltp (default)
    Metadata:
      ENCODER         : Lavc58.87.101 libvorbis
      webm_dash_manifest_file_name: glass_171.hdr
      webm_dash_manifest_track_number: 1
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
  Stream #1:0 -> #0:1 (copy)
Press [q] to stop, [?] for help
frame=    0 fps=0.0 q=-1.0 Lsize=       1kB time=00:00:00.00 bitrate=N/A speed=   0x
video:0kB audio:0kB subtitle:0kB other streams:0kB global headers:4kB muxing overhead: unknown


    


    Manifest file (glass_video_manifest.mpd)

    I tried to delete the ContentComponent as suggested in other questions, but it didn't work.

    


<?xml version="1.0" encoding="UTF-8"?>
<Period start="PT0S">
  <AdaptationSet mimeType="video/webm" codecs="vp8" lang="eng" bitstreamSwitching="true" subsegmentAlignment="true" subsegmentStartsWithSAP="1">
    <ContentComponent type="video"/>
    <SegmentTemplate timescale="1000" duration="2000" media="glass_$RepresentationID$_$Number$.chk" startNumber="1" initialization="glass_$RepresentationID$.hdr"/>
    <Representation bandwidth="1000000" width="640" height="480" codecs="vp8" mimeType="video/webm" startsWithSAP="1"/>
  </AdaptationSet>
  <AdaptationSet mimeType="audio/webm" codecs="vorbis" lang="eng" bitstreamSwitching="true" subsegmentAlignment="true" subsegmentStartsWithSAP="1">
    <ContentComponent type="audio"/>
    <SegmentTemplate timescale="1000" duration="2000" media="glass_$RepresentationID$_$Number$.chk" startNumber="1" initialization="glass_$RepresentationID$.hdr"/>
    <Representation bandwidth="128000" audioSamplingRate="44100" codecs="vorbis" mimeType="audio/webm" startsWithSAP="1"/>
  </AdaptationSet>
</Period>

    Dash.js player

<script>
(function(){
  // var url = "https://dash.akamaized.net/envivio/EnvivioDash3/manifest.mpd";
  var url = "http://localhost:8081/videos/Dash/glass_live_manifest.mpd";
  var player = dashjs.MediaPlayer().create();

  // config
  targetLatency = 2.0;        // Lowering this value will lower latency but may decrease the player's ability to build a stable buffer.
  minDrift = 0.05;            // Minimum latency deviation allowed before activating catch-up mechanism.
  catchupPlaybackRate = 0.5;  // Maximum catch-up rate, as a percentage, for low latency live streams.
  stableBuffer = 2;           // The time that the internal buffer target will be set to post startup/seeks (NOT top quality).
  bufferAtTopQuality = 2;     // The time that the internal buffer target will be set to once playing the top quality.

  player.updateSettings({
      'streaming': {
          'liveDelay': 2,
          'liveCatchUpMinDrift': 0.05,
          'liveCatchUpPlaybackRate': 0.5,
          'stableBufferTime': 2,
          'bufferTimeAtTopQuality': 2,
          'bufferTimeAtTopQualityLongForm': 2,
          'bufferToKeep': 2,
          'bufferAheadToKeep': 2,
          'lowLatencyEnabled': true,
          'fastSwitchEnabled': true,
          'abr': {
              'limitBitrateByPortal': true
          },
      }
  });

  console.log(player.getSettings());

  setInterval(() => {
      console.log('Live latency= ', player.getCurrentLiveLatency());
      console.log('Buffer length= ', player.getBufferLength('video'));
  }, 3000);

  player.initialize(document.querySelector("#videoPlayer"), url, true);
})();
</script>

    Chrome


{debug: {…}, streaming: {…}}
dash.all.min.js:2 XHR finished loading: GET "http://localhost:8081/videos/Dash/glass_live_manifest.mpd".
load @ dash.all.min.js:2
C @ dash.all.min.js:2
load @ dash.all.min.js:2
load @ dash.all.min.js:2
load @ dash.all.min.js:2
load @ dash.all.min.js:2
se @ dash.all.min.js:2
te @ dash.all.min.js:2
initialize @ dash.all.min.js:2
(anonymous) @ Dash:92
(anonymous) @ Dash:94
DevTools failed to load SourceMap: Could not parse content for http://localhost:8081/js/dash.all.min.js.map: Cannot read property 'length' of undefined
Dash:88 Live latency=  NaN
Dash:89 Buffer length=  NaN
Dash:88 Live latency=  NaN
Dash:89 Buffer length=  NaN
Dash:88 Live latency=  NaN
Dash:89 Buffer length=  NaN
Dash:88 Live latency=  NaN
Dash:89 Buffer length=  NaN
Dash:88 Live latency=  NaN
Dash:89 Buffer length=  NaN

    UPDATE


    Well, it seems the general problem was that the MPDs wouldn't play from that /dash folder. So I took a look into the code and found a bad route. In any case, the MPD wouldn't start with the command I originally used, probably because it creates a dynamic manifest, as @Markus Schumann says. So I'm going with a new one, which seems to be working for now, but not very well.


    ffmpeg -y -re -i strm.webm \
-c:v libx264 -x264opts "keyint=24:min-keyint=24:no-scenecut" \
-r 24 -c:a aac -b:a 128k -bf 1 -b_strategy 0 -sc_threshold 0 -pix_fmt yuv420p \
-map 0:v:0 -map 0:a:0 -map 0:v:0 -map 0:a:0 -map 0:v:0 -map 0:a:0 -b:v:0 250k \
-filter:v:0 "scale=-2:240" -profile:v:0 baseline -b:v:1 750k \
-filter:v:1 "scale=-2:480" -profile:v:1 main -b:v:2 1500k \
-filter:v:2 "scale=-2:720" -profile:v:2 high \
-use_timeline 1 -use_template 1 -window_size 5 -adaptation_sets "id=0,streams=v id=1,streams=a" \
-f dash glass_video_manifest.mpd
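
    To rule out routing problems like the one described in the update while testing, the generated manifest and segments can be served straight from their output directory with a stock web server (a quick local test setup, not the application's actual route):

# serve glass_video_manifest.mpd and its segments on port 8081
python3 -m http.server 8081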
