Advanced search

Media (0)

Word: - Tags -/interaction

No media matching your criteria is available on the site.

Other articles (43)

  • Automatic backup of SPIP channels

    1 April 2010, by

    As part of setting up an open platform, it is important for hosts to have fairly regular backups available in order to guard against any potential problem.
    To carry out this task, two SPIP plugins are used: Saveauto, which performs a regular backup of the database as a MySQL dump (usable in phpmyadmin), and mes_fichiers_2, which produces a zip archive of the site's important data (the documents, the elements (...)

  • Automatic installation script for MediaSPIP

    25 April 2011, by

    To work around installation difficulties caused mainly by server-side software dependencies, an "all-in-one" installation script written in bash was created to make this step easier on a server running a compatible Linux distribution.
    You must have SSH access to your server and a "root" account in order to use it, which will make it possible to install the dependencies. Contact your host if you do not have these.
    The documentation on using the installation script (...)

  • Automated installation script of MediaSPIP

    25 April 2011, by

    To overcome difficulties caused mainly by server-side software dependencies, an "all-in-one" installation script written in bash was created to make this step easier on a server running a compatible Linux distribution.
    You must have access to your server via SSH and a root account in order to use it; these are needed to install the dependencies. Contact your provider if you do not have them.
    The documentation of the use of this installation script is available here.
    The code of this (...)

On other sites (3993)

  • Error Opening RTMP Stream through FFmpeg command when executed through exec package [closed]

    3 October 2024, by Akhil

    I have been trying to transcode a live stream from an RTMP server running at rtmp://localhost:1936/live/test with FFmpeg in a Go application using the os/exec package, but it does not seem to work and gives the input/output error attached below. The exact same ffmpeg command works as expected when I run it in a terminal. I am not sure why that is; here is my code for reproducing and analyzing the problem.

    


    ffmpegCmd := fmt.Sprintf("ffmpeg -nostdin -i rtmp://localhost:1936/live/%s -c:v libx264 -s %s -f %s %s/stream.mpd",
        streamKey, resolution, sp.OutputFormat, outputPath)
    log.Printf("Executing FFmpeg command: %s", ffmpegCmd)

    // Prepare the command execution with a timeout context
    ctx, cancel := context.WithTimeout(context.Background(), 60*time.Second) // Set a 60-second timeout
    defer cancel()

    cmd := exec.CommandContext(ctx, "bash", "-c", ffmpegCmd)


    


    The ffmpeg command looks like this:
    
ffmpeg -nostdin -i rtmp://localhost:1936/live/test -c:v libx264 -s 1920x1080 -f dash output/test/1080p/stream.mpd

    


    I get the following error:

    


Error opening input: Input/output error
Error opening input file rtmp://localhost:1936/live/test.
Error opening input files: Input/output error
Exiting normally, received signal 2.
signal: interrupt


    


    I have already tried splitting the command into separate arguments and executing it, something like:

    


    cmd := exec.CommandContext(ctx,
        "ffmpeg",
        "-nostdin",
        "-i", "rtmp://localhost:1936/live/"+streamKey,
        "-c:v", "libx264",
        "-s", resolution,
        "-f", sp.OutputFormat,
        outputPath+"/stream.mpd")
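
    For reference, here is a minimal sketch (a hypothetical helper, not the application's actual code) of the same invocation with ffmpeg's stderr captured, so the reason behind "Error opening input" shows up in the Go logs; streamKey, resolution, outputFormat and outputPath stand in for the values used above:

// Hypothetical sketch, not the question's code: run ffmpeg without a shell
// and capture its stderr so the input-open failure is visible in the logs.
package main

import (
    "bytes"
    "context"
    "log"
    "os/exec"
    "time"
)

func runFFmpeg(streamKey, resolution, outputFormat, outputPath string) error {
    ctx, cancel := context.WithTimeout(context.Background(), 60*time.Second)
    defer cancel()

    cmd := exec.CommandContext(ctx, "ffmpeg",
        "-nostdin",
        "-i", "rtmp://localhost:1936/live/"+streamKey,
        "-c:v", "libx264",
        "-s", resolution,
        "-f", outputFormat,
        outputPath+"/stream.mpd",
    )

    var stderr bytes.Buffer
    cmd.Stderr = &stderr // ffmpeg writes its diagnostics to stderr

    err := cmd.Run()
    log.Printf("ffmpeg stderr:\n%s", stderr.String())
    return err
}

func main() {
    if err := runFFmpeg("test", "1920x1080", "dash", "output/test/1080p"); err != nil {
        log.Printf("ffmpeg failed: %v", err)
    }
}

    cmd.Run blocks until ffmpeg exits or the context deadline cancels it, so the captured stderr contains the full failure message.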


    


    After running the ffmpeg command with -loglevel debug and -report, here are the logs and errors I get.

    When I run it within the Go application:

    


    ffmpeg started on 2024-10-02 at 12:00:06
Report written to "ffmpeg-20241002-120006.log"
Log level: 48
Command line:
ffmpeg -loglevel debug -report -i rtmp://localhost:1936/live/test -c:v libx264 -s 1920x1080 -f dash ./output/test/1080p/stream.mpd
ffmpeg version 7.0.2 Copyright (c) 2000-2024 the FFmpeg developers
  built with Apple clang version 15.0.0 (clang-1500.3.9.4)
  configuration: --prefix=/opt/homebrew/Cellar/ffmpeg/7.0.2_1 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags='-Wl,-ld_classic' --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libaribb24 --enable-libbluray --enable-libdav1d --enable-libharfbuzz --enable-libjxl --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libssh --enable-libsvtav1 --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox --enable-audiotoolbox --enable-neon
  libavutil      59.  8.100 / 59.  8.100
  libavcodec     61.  3.100 / 61.  3.100
  libavformat    61.  1.100 / 61.  1.100
  libavdevice    61.  1.100 / 61.  1.100
  libavfilter    10.  1.100 / 10.  1.100
  libswscale      8.  1.100 /  8.  1.100
  libswresample   5.  1.100 /  5.  1.100
  libpostproc    58.  1.100 / 58.  1.100
Splitting the commandline.
Reading option '-loglevel' ... matched as option 'loglevel' (set logging level) with argument 'debug'.
Reading option '-report' ... matched as option 'report' (generate a report) with argument '1'.
Reading option '-i' ... matched as input url with argument 'rtmp://localhost:1936/live/test'.
Reading option '-c:v' ... matched as option 'c' (select encoder/decoder ('copy' to copy stream without reencoding)) with argument 'libx264'.
Reading option '-s' ... matched as option 's' (set frame size (WxH or abbreviation)) with argument '1920x1080'.
Reading option '-f' ... matched as option 'f' (force container format (auto-detected otherwise)) with argument 'dash'.
Reading option './output/test/1080p/stream.mpd' ... matched as output url.
Finished splitting the commandline.
Parsing a group of options: global .
Applying option loglevel (set logging level) with argument debug.
Applying option report (generate a report) with argument 1.
Successfully parsed a group of options.
Parsing a group of options: input url rtmp://localhost:1936/live/test.
Successfully parsed a group of options.
Opening an input file: rtmp://localhost:1936/live/test.
[AVFormatContext @ 0x13f721f90] Opening 'rtmp://localhost:1936/live/test' for reading
[rtmp @ 0x13f6040e0] No default whitelist set
[tcp @ 0x13f7223d0] No default whitelist set
[tcp @ 0x13f7223d0] Original list of addresses:
[tcp @ 0x13f7223d0] Address ::1 port 1936
[tcp @ 0x13f7223d0] Address 127.0.0.1 port 1936
[tcp @ 0x13f7223d0] Interleaved list of addresses:
[tcp @ 0x13f7223d0] Address ::1 port 1936
[tcp @ 0x13f7223d0] Address 127.0.0.1 port 1936
[tcp @ 0x13f7223d0] Starting connection attempt to ::1 port 1936
[tcp @ 0x13f7223d0] Connection attempt to ::1 port 1936 failed: Connection refused
[tcp @ 0x13f7223d0] Starting connection attempt to 127.0.0.1 port 1936
[tcp @ 0x13f7223d0] Successfully connected to 127.0.0.1 port 1936
[rtmp @ 0x13f6040e0] Handshaking...
[rtmp @ 0x13f6040e0] Type answer 3
[rtmp @ 0x13f6040e0] Server version 13.14.10.13
[rtmp @ 0x13f6040e0] Proto = rtmp, path = /live/test, app = live, fname = test
[rtmp @ 0x13f6040e0] Window acknowledgement size = 5000000
[rtmp @ 0x13f6040e0] Max sent, unacked = 5000000
[rtmp @ 0x13f6040e0] New incoming chunk size = 4096
[rtmp @ 0x13f6040e0] Creating stream...
[rtmp @ 0x13f6040e0] Sending play command for 'test'
[rtmp @ 0x13f6040e0] Deleting stream...
[in#0 @ 0x13f721d40] Error opening input: Input/output error
Error opening input file rtmp://localhost:1936/live/test.
Error opening input files: Input/output error
Exiting normally, received signal 2.


    


    This is what I get when I run the same command in a terminal:

    


    (same as above, but please scroll further)

[rtmp @ 0x1437144c0] No default whitelist set
[tcp @ 0x143604f20] No default whitelist set
[tcp @ 0x143604f20] Original list of addresses:
[tcp @ 0x143604f20] Address ::1 port 1936
[tcp @ 0x143604f20] Address 127.0.0.1 port 1936
[tcp @ 0x143604f20] Interleaved list of addresses:
[tcp @ 0x143604f20] Address ::1 port 1936
[tcp @ 0x143604f20] Address 127.0.0.1 port 1936
[tcp @ 0x143604f20] Starting connection attempt to ::1 port 1936
[tcp @ 0x143604f20] Connection attempt to ::1 port 1936 failed: Connection refused
[tcp @ 0x143604f20] Starting connection attempt to 127.0.0.1 port 1936
[tcp @ 0x143604f20] Successfully connected to 127.0.0.1 port 1936
[rtmp @ 0x1437144c0] Handshaking...
[rtmp @ 0x1437144c0] Type answer 3
[rtmp @ 0x1437144c0] Server version 13.14.10.13
[rtmp @ 0x1437144c0] Proto = rtmp, path = /live/test, app = live, fname = test
[rtmp @ 0x1437144c0] Window acknowledgement size = 5000000
[rtmp @ 0x1437144c0] Max sent, unacked = 5000000
[rtmp @ 0x1437144c0] New incoming chunk size = 4096
[rtmp @ 0x1437144c0] Creating stream...
[rtmp @ 0x1437144c0] Sending play command for 'test'
[flv @ 0x143604b30] Format flv probed with size=2048 and score=100
[flv @ 0x143604b30] Before avformat_find_stream_info() pos: 13 bytes read:2263 seeks:0 nb_streams:0
Transform tree:
    mdct_inv_float_c - type: mdct_float, len: 64, factors[2]: [2, any], flags: [unaligned, out_of_place, inv_only]
        fft32_ns_float_neon - type: fft_float, len: 32, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
    mdct_inv_float_c - type: mdct_float, len: 64, factors[2]: [2, any], flags: [unaligned, out_of_place, inv_only]
        fft32_ns_float_neon - type: fft_float, len: 32, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
    mdct_pfa_15xM_inv_float_c - type: mdct_float, len: 120, factors[2]: [15, any], flags: [unaligned, out_of_place, inv_only]
        fft4_fwd_float_neon - type: fft_float, len: 4, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
    mdct_inv_float_c - type: mdct_float, len: 128, factors[2]: [2, any], flags: [unaligned, out_of_place, inv_only]
        fft_sr_ns_float_neon - type: fft_float, len: 64, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
    mdct_pfa_15xM_inv_float_c - type: mdct_float, len: 480, factors[2]: [15, any], flags: [unaligned, out_of_place, inv_only]
        fft16_ns_float_neon - type: fft_float, len: 16, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
    mdct_inv_float_c - type: mdct_float, len: 512, factors[2]: [2, any], flags: [unaligned, out_of_place, inv_only]
        fft_sr_ns_float_neon - type: fft_float, len: 256, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
    mdct_pfa_15xM_inv_float_c - type: mdct_float, len: 960, factors[2]: [15, any], flags: [unaligned, out_of_place, inv_only]
        fft32_ns_float_neon - type: fft_float, len: 32, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
    mdct_inv_float_c - type: mdct_float, len: 1024, factors[2]: [2, any], flags: [unaligned, out_of_place, inv_only]
        fft_sr_ns_float_neon - type: fft_float, len: 512, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
    mdct_fwd_float_c - type: mdct_float, len: 1024, factors[2]: [2, any], flags: [unaligned, out_of_place, fwd_only]
        fft_sr_ns_float_neon - type: fft_float, len: 512, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
[NULL @ 0x144124920] nal_unit_type: 7(SPS), nal_ref_idc: 3
[NULL @ 0x144124920] Decoding VUI
[NULL @ 0x144124920] nal_unit_type: 8(PPS), nal_ref_idc: 3
[NULL @ 0x144124920] Decoding VUI
[h264 @ 0x144124920] nal_unit_type: 7(SPS), nal_ref_idc: 3
[h264 @ 0x144124920] Decoding VUI
[h264 @ 0x144124920] nal_unit_type: 8(PPS), nal_ref_idc: 3
[h264 @ 0x144124920] nal_unit_type: 7(SPS), nal_ref_idc: 3
[h264 @ 0x144124920] nal_unit_type: 8(PPS), nal_ref_idc: 3
[h264 @ 0x144124920] nal_unit_type: 5(IDR), nal_ref_idc: 3
[h264 @ 0x144124920] Decoding VUI
[h264 @ 0x144124920] Format yuv420p chosen by get_format().
[h264 @ 0x144124920] Reinit context to 1280x720, pix_fmt: yuv420p
[h264 @ 0x144124920] no picture
[flv @ 0x143604b30] All info found
[flv @ 0x143604b30] rfps: 29.666667 0.016552
[flv @ 0x143604b30] rfps: 29.750000 0.009347
[flv @ 0x143604b30] rfps: 29.750000 0.009347
[flv @ 0x143604b30] rfps: 29.833333 0.004197
[flv @ 0x143604b30] rfps: 29.916667 0.001104
[flv @ 0x143604b30] rfps: 29.916667 0.001104
[flv @ 0x143604b30] rfps: 30.000000 0.000067
[flv @ 0x143604b30] rfps: 30.000000 0.000067
[flv @ 0x143604b30] rfps: 60.000000 0.000270
[flv @ 0x143604b30] rfps: 60.000000 0.000270
[flv @ 0x143604b30] rfps: 120.000000 0.001079
[flv @ 0x143604b30] rfps: 120.000000 0.001079
[flv @ 0x143604b30] rfps: 240.000000 0.004316
[flv @ 0x143604b30] rfps: 240.000000 0.004316
[flv @ 0x143604b30] rfps: 29.970030 0.000204
[flv @ 0x143604b30] rfps: 29.970030 0.000204
[flv @ 0x143604b30] rfps: 59.940060 0.000814
[flv @ 0x143604b30] rfps: 59.940060 0.000814
[flv @ 0x143604b30] After avformat_find_stream_info() pos: 496783 bytes read:496783 seeks:0 frames:179
Input #0, flv, from 'rtmp://localhost:1936/live/test':
  Metadata:
    |RtmpSampleAccess: true
    Server          : NGINX RTMP (github.com/arut/nginx-rtmp-module)
    displayWidth    : 1280
    displayHeight   : 720
    fps             : 30
    profile         :
    level           :
  Duration: 00:00:00.00, start: 6.742000, bitrate: N/A
  Stream #0:0, 138, 1/1000: Audio: aac (LC), 48000 Hz, stereo, fltp, 163 kb/s
  Stream #0:1, 41, 1/1000: Video: h264 (High), 1 reference frame, yuv420p(tv, bt709, progressive, left), 1280x720 [SAR 1:1 DAR 16:9], 0/1, 2560 kb/s, 30 fps, 30 tbr, 1k tbn
Successfully opened the file.
Parsing a group of options: output url ./output/test/1080p/stream.mpd.
Applying option c:v (select encoder/decoder ('copy' to copy stream without reencoding)) with argument libx264.
Applying option s (set frame size (WxH or abbreviation)) with argument 1920x1080.
Applying option f (force container format (auto-detected otherwise)) with argument dash.
Successfully parsed a group of options.
Opening an output file: ./output/test/1080p/stream.mpd.
[out#0/dash @ 0x123707480] No explicit maps, mapping streams automatically...
[vost#0:0/libx264 @ 0x123707d60] Created video stream from input stream 0:1
detected 10 logical cores
[h264 @ 0x123607b70] nal_unit_type: 7(SPS), nal_ref_idc: 3
[h264 @ 0x123607b70] Decoding VUI
[h264 @ 0x123607b70] nal_unit_type: 8(PPS), nal_ref_idc: 3
[aost#0:1/aac @ 0x144028080] Created audio stream from input stream 0:0
Transform tree:
    mdct_inv_float_c - type: md

    (it simply starts working)


    I am not sure whether this has something to do with permissions.
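
    Since the same binary behaves differently when launched from the terminal, one way to compare the two environments is to log what the Go process actually resolves and inherits; a rough, hypothetical sketch (not part of the code above):

// Hypothetical diagnostic, not the question's code: show which ffmpeg binary
// the Go process resolves, its file permissions, and the environment a child
// process would inherit by default.
package main

import (
    "fmt"
    "os"
    "os/exec"
)

func main() {
    path, err := exec.LookPath("ffmpeg")
    if err != nil {
        fmt.Println("ffmpeg not found in PATH:", err)
        return
    }
    fmt.Println("ffmpeg resolved to:", path)

    if info, err := os.Stat(path); err == nil {
        fmt.Println("file mode:", info.Mode())
    }

    // exec.Command passes this environment to the child unless overridden.
    for _, kv := range os.Environ() {
        fmt.Println(kv)
    }
}

    Comparing this output with `which ffmpeg` and `env` from the working terminal session shows whether the two runs really see the same binary and environment.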


  • Stream sent via FFMPEG (NodeJS) to RTMP (YouTube) not being received

    10 December 2024, by Qumber

    I am writing a very basic Chrome extension that captures a video stream and sends it to a Node.js server, which in turn sends it to the YouTube live server.


    Here is my implementation of the backend, which receives data via WebRTC and sends it to YouTube using FFmpeg:


const express = require('express');
const cors = require('cors');
const { RTCPeerConnection, RTCSessionDescription } = require('@roamhq/wrtc');
const { spawn } = require('child_process');

const app = express();
app.use(express.json());
app.use(cors());

app.post('/webrtc', async (req, res) => {
  const peerConnection = new RTCPeerConnection();

  // Start ffmpeg process for streaming
  const ffmpeg = spawn('ffmpeg', [
    '-f', 'flv',
    '-i', 'pipe:0',
    '-c:v', 'libx264',
    '-preset', 'veryfast',
    '-maxrate', '3000k',
    '-bufsize', '6000k',
    '-pix_fmt', 'yuv420p',
    '-g', '50',
    '-f', 'flv',
    'rtmp://a.rtmp.youtube.com/live2/MY_KEY'
  ]);

  ffmpeg.on('error', (err) => {
    console.error('FFmpeg error:', err);
  });

  ffmpeg.stderr.on('data', (data) => {
    console.error('FFmpeg stderr:', data.toString());
  });

  ffmpeg.stdout.on('data', (data) => {
    console.log('FFmpeg stdout:', data.toString());
  });

  // Handle incoming tracks
  peerConnection.ontrack = (event) => {
    console.log('Track received:', event.track.kind);
    const track = event.track;

    // Stream the incoming track to FFmpeg
    track.onunmute = () => {
      console.log('Track unmuted:', track.kind);
      const reader = track.createReadStream();
      reader.on('data', (chunk) => {
        console.log('Forwarding chunk to FFmpeg:', chunk.length);
        ffmpeg.stdin.write(chunk);
      });
      reader.on('end', () => {
        console.log('Stream ended');
        ffmpeg.stdin.end();
      });
    };

    track.onmute = () => {
      console.log('Track muted:', track.kind);
    };
  };

  // Set the remote description (offer) received from the client
  await peerConnection.setRemoteDescription(new RTCSessionDescription(req.body.sdp));

  // Create an answer and send it back to the client
  const answer = await peerConnection.createAnswer();
  await peerConnection.setLocalDescription(answer);

  res.json({ sdp: peerConnection.localDescription });
});

app.listen(3000, () => {
  console.log('WebRTC to RTMP server running on port 3000');
});
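
    As an aside, the spawn-ffmpeg-and-pipe pattern used above, reduced to its core, looks roughly like the following Go sketch (illustrative only, with hypothetical names; the Node code above is what is actually being run): bytes written to ffmpeg's stdin are declared as FLV input (pipe:0) and pushed to the RTMP ingest URL.

// Illustrative sketch of the same process wiring, not the question's code:
// spawn ffmpeg reading FLV from stdin and pushing to an RTMP ingest URL,
// then forward media bytes into its stdin.
package main

import (
    "io"
    "log"
    "os/exec"
)

func startRelay(ingestURL string) (io.WriteCloser, *exec.Cmd, error) {
    cmd := exec.Command("ffmpeg",
        "-f", "flv", // bytes written to stdin must already be FLV-muxed
        "-i", "pipe:0",
        "-c:v", "libx264",
        "-preset", "veryfast",
        "-maxrate", "3000k",
        "-bufsize", "6000k",
        "-pix_fmt", "yuv420p",
        "-g", "50",
        "-f", "flv",
        ingestURL,
    )
    stdin, err := cmd.StdinPipe()
    if err != nil {
        return nil, nil, err
    }
    cmd.Stderr = log.Writer() // surface ffmpeg's diagnostics
    if err := cmd.Start(); err != nil {
        return nil, nil, err
    }
    return stdin, cmd, nil
}

func main() {
    stdin, cmd, err := startRelay("rtmp://a.rtmp.youtube.com/live2/MY_KEY")
    if err != nil {
        log.Fatal(err)
    }
    // In the real application the chunks come from the incoming media track.
    if _, err := stdin.Write([]byte("...FLV bytes...")); err != nil {
        log.Println("write failed:", err)
    }
    stdin.Close()
    cmd.Wait()
}

    The sketch only illustrates the process wiring; it says nothing about how the incoming track data is turned into FLV before being written to stdin.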


    This is the output I get, but nothing gets sent to YouTube:

FFmpeg stderr: ffmpeg version 7.0.2 Copyright (c) 2000-2024 the FFmpeg developers
  built with Apple clang version 15.0.0 (clang-1500.3.9.4)

FFmpeg stderr:   configuration: --prefix=/opt/homebrew/Cellar/ffmpeg/7.0.2_1 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags='-Wl,-ld_classic' --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libaribb24 --enable-libbluray --enable-libdav1d --enable-libharfbuzz --enable-libjxl --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libssh --enable-libsvtav1 --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox --enable-audiotoolbox --enable-neon

FFmpeg stderr:   libavutil      59.  8.100 / 59.  8.100
  libavcodec     61.  3.100 / 61.  3.100
  libavformat    61.  1.100 / 61.  1.100
  libavdevice    61.  1.100 / 61.  1.100

FFmpeg stderr:   libavfilter    10.  1.100 / 10.  1.100
  libswscale      8.  1.100 /  8.  1.100
  libswresample   5.  1.100 /  5.  1.100
  libpostproc    58.  1.100 / 58.  1.100


    I do not understand what I am doing wrong. Any help would be appreciated.


    Optionally, here is the frontend code from the extension, which (to me) appears to be recording and sending the capture:


    popup.js & popup.html


document.addEventListener('DOMContentLoaded', () => {
  document.getElementById('openCapturePage').addEventListener('click', () => {
    chrome.tabs.create({
      url: chrome.runtime.getURL('capture.html')
    });
  });
});


    StreamSavvy


    capture.js & capture.html


let peerConnection;

async function startStreaming() {
  try {
    const stream = await navigator.mediaDevices.getDisplayMedia({
      video: {
        cursor: "always"
      },
      audio: false
    });

    peerConnection = new RTCPeerConnection({
      iceServers: [{
        urls: 'stun:stun.l.google.com:19302'
      }]
    });

    stream.getTracks().forEach(track => peerConnection.addTrack(track, stream));

    const offer = await peerConnection.createOffer();
    await peerConnection.setLocalDescription(offer);

    const response = await fetch('http://localhost:3000/webrtc', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json'
      },
      body: JSON.stringify({
        sdp: peerConnection.localDescription
      })
    });

    const {
      sdp
    } = await response.json();
    await peerConnection.setRemoteDescription(new RTCSessionDescription(sdp));

    console.log("Streaming to server via WebRTC...");
  } catch (error) {
    console.error("Error starting streaming:", error.name, error.message);
  }
}

async function stopStreaming() {
  if (peerConnection) {
    // Stop all media tracks
    peerConnection.getSenders().forEach(sender => {
      if (sender.track) {
        sender.track.stop();
      }
    });

    // Close the peer connection
    peerConnection.close();
    peerConnection = null;
    console.log("Streaming stopped");
  }
}

document.addEventListener('DOMContentLoaded', () => {
  document.getElementById('startCapture').addEventListener('click', startStreaming);
  document.getElementById('stopCapture').addEventListener('click', stopStreaming);
});


    StreamSavvy Capture


    background.js (service worker)


chrome.runtime.onInstalled.addListener(() => {
  console.log("StreamSavvy Extension Installed");
});

chrome.runtime.onMessage.addListener((message, sender, sendResponse) => {
  if (message.type === 'startStreaming') {
    chrome.tabs.create({
      url: chrome.runtime.getURL('capture.html')
    });
    sendResponse({
      status: 'streaming'
    });
  } else if (message.type === 'stopStreaming') {
    chrome.tabs.query({
      url: chrome.runtime.getURL('capture.html')
    }, (tabs) => {
      if (tabs.length > 0) {
        chrome.tabs.sendMessage(tabs[0].id, {
          type: 'stopStreaming'
        });
        sendResponse({
          status: 'stopped'
        });
      }
    });
  }
  return true; // Keep the message channel open for sendResponse
});


  • FFMPEG is not working in app service post deployment

    1 March 2024, by Tushar Gupta

    I am using the Xabe.FFmpeg package to generate clips from a video. The code basically converts a video into multiple clips, and it works fine locally, but whenever I deploy the code to the App Service it stops working.


    Stack: .NET
    App Service OS: Windows


    Error:


2024-02-13 08:57:34.092 +00:00 [Error] Microsoft.AspNetCore.Server.IIS.Core.IISHttpServer: Connection ID "16573246629528734243", Request ID "40000a24-0000-e600-b63f-84710c7967bb": An unhandled exception was thrown by the application.
System.ComponentModel.Win32Exception (193): An error occurred trying to start process 'C:\home\site\wwwroot\Controllers\ffmpeg\bin\ffmpeg.exe' with working directory 'C:\home\site\wwwroot'. The specified executable is not a valid application for this OS platform.
   at System.Diagnostics.Process.StartWithCreateProcess(ProcessStartInfo startInfo)
   at System.Diagnostics.Process.Start()
   at Xabe.FFmpeg.FFmpeg.RunProcess(String args, String processPath, Nullable`1 priority, Boolean standardInput, Boolean standardOutput, Boolean standardError)
   at Xabe.FFmpeg.FFmpegWrapper.<>c__DisplayClass14_0.<RunProcess>b__0()
   at System.Threading.Tasks.Task`1.InnerInvoke()
   at System.Threading.Tasks.Task.<>c.<.cctor>b__281_0(Object obj)
   at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state)
--- End of stack trace from previous location ---
   at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state)
   at System.Threading.Tasks.Task.ExecuteWithThreadLocal(Task& currentTaskSlot, Thread threadPoolThread)
--- End of stack trace from previous location ---
   at Xabe.FFmpeg.Conversion.Start(String parameters, CancellationToken cancellationToken)
   at ExtractResponseAPI.Controllers.HomeController.PrepareVideoClips(CloudBlockBlob sourceVideoBlob, TimeSpan startTime, TimeSpan endTime) in C:\Users\tushar.h.gupta\source\repos\ExtractResponseAPI\Controllers\HomeController.cs:line 164
   at ExtractResponseAPI.Controllers.HomeController.GetIntervalsAsync(String query) in C:\Users\tushar.h.gupta\source\repos\ExtractResponseAPI\Controllers\HomeController.cs:line 60
   at ExtractResponseAPI.Controllers.HomeController.Get(String query) in C:\Users\tushar.h.gupta\source\repos\ExtractResponseAPI\Controllers\HomeController.cs:line 24
   at lambda_method4(Closure, Object)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ActionMethodExecutor.AwaitableObjectResultExecutor.Execute(ActionContext actionContext, IActionResultTypeMapper mapper, ObjectMethodExecutor executor, Object controller, Object[] arguments)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.<InvokeActionMethodAsync>g__Awaited|12_0(ControllerActionInvoker invoker, ValueTask`1 actionResultValueTask)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.<InvokeNextActionFilterAsync>g__Awaited|10_0(ControllerActionInvoker invoker, Task lastTask, State next, Scope scope, Object state, Boolean isCompleted)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.Rethrow(ActionExecutedContextSealed context)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.Next(State& next, Scope& scope, Object& state, Boolean& isCompleted)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.<InvokeInnerFilterAsync>g__Awaited|13_0(ControllerActionInvoker invoker, Task lastTask, State next, Scope scope, Object state, Boolean isCompleted)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.<InvokeFilterPipelineAsync>g__Awaited|20_0(ResourceInvoker invoker, Task lastTask, State next, Scope scope, Object state, Boolean isCompleted)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.<InvokeAsync>g__Awaited|17_0(ResourceInvoker invoker, Task task, IDisposable scope)
   at Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.<InvokeAsync>g__Awaited|17_0(ResourceInvoker invoker, Task task, IDisposable scope)
   at Microsoft.AspNetCore.Authorization.AuthorizationMiddleware.Invoke(HttpContext context)
   at Microsoft.AspNetCore.Authentication.AuthenticationMiddleware.Invoke(HttpContext context)
   at Microsoft.AspNetCore.Server.IIS.Core.IISHttpContextOfT`1.ProcessRequestAsync()


    Code:


private async Task<List<string>> PrepareVideoClips(CloudBlockBlob sourceVideoBlob, TimeSpan startTime, TimeSpan endTime)
{
    string tempDirectory = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString());

    // Create a temporary directory to store clips locally
    Directory.CreateDirectory(tempDirectory);

    // Download the source video
    string sourceVideoPath = Path.Combine(tempDirectory, "sourceVideo.mp4");
    await sourceVideoBlob.DownloadToFileAsync(sourceVideoPath, FileMode.Create);
    FFmpeg.SetExecutablesPath("Controllers\\ffmpeg\\bin\\");
    // Use FFmpegCore to trim the video
    string outputVideoPath = Path.Combine(tempDirectory, "output.mp4");

    await FFmpeg.Conversions.New()
        .AddParameter($"-ss {startTime.TotalSeconds}") // Start time
        .AddParameter($"-i {sourceVideoPath}")
        .AddParameter($"-to {(endTime - startTime).TotalSeconds}") // Duration
        .SetOutput(outputVideoPath)
        .Start();

    // Return a list of file paths for the clips
    return new List<string> { outputVideoPath };
}


    I tried deploying the App Service with a different OS but still face the same issue; I also tried different packages, but the result is the same.
