
Media (3)
-
The Slip - Artworks
26 September 2011
Updated: September 2011
Language: English
Type: Text
-
Podcasting Legal guide
16 May 2011
Updated: May 2011
Language: English
Type: Text
-
Creativecommons informational flyer
16 May 2011
Updated: July 2013
Language: English
Type: Text
Other articles (46)
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out.
-
MediaSPIP v0.2
21 June 2013
MediaSPIP 0.2 is the first stable MediaSPIP release.
Its official release date is June 21, 2013, and it is announced here.
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)
-
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (5808)
-
GC and onTouch cause Fatal signal 11 (SIGSEGV) error in app using ffmpeg through ndk
30 January 2015, by grzebyk
I am getting a nasty but well-known error while working with FFmpeg and the NDK:
A/libc(9845): Fatal signal 11 (SIGSEGV), code 1, fault addr 0xa0a9f000 in tid 9921 (AsyncTask #4)
UPDATE
After a couple of hours I found out that there might be two sources of the problem. One was related to multithreading; I checked it and fixed it. Now the app crashes ONLY when the video playback (NDK) is on.
I put a "counter" in touch event
surfaceSterowanieKamera.setOnTouchListener(new View.OnTouchListener() {
int counter = 0;
@Override
public boolean onTouch(View v, MotionEvent event) {
if ((event.getAction() == MotionEvent.ACTION_MOVE)){
Log.i(TAG, "counter = " + counter);
//cameraMover.setPanTilt(some parameters);
counter++;
}And I started disabling other app functionalities one by one, but no video. I found out, that with every single functionality less, it takes app longer to crush - counter reaches higher values. After turning off everything besides video playback and touch interface (
cameraMover.setPanTilt()
commented out) the app crushes usually when counter is between 1600 - 1700.In such case logcat shows the above error and GC related info. For me it seems like GC is messing up with the ndk.
01-23 12:27:13.163: I/Display Activity(20633): n = 1649
01-23 12:27:13.178: I/art(20633): Background sticky concurrent mark sweep GC freed 158376(6MB) AllocSpace objects, 1(3MB) LOS objects, 17% free, 36MB/44MB, paused 689us total 140.284ms
01-23 12:27:13.169: A/libc(20633): Fatal signal 11 (SIGSEGV), code 1, fault addr 0x9bd6ec0c in tid 20734 (AsyncTask #3)

Why is the GC causing problems with the NDK part of the application?
ORIGINAL PROBLEM
What am I doing?
I am developing an application that streams a live video feed from a webcam and lets the user pan and tilt the remote camera. I use the FFmpeg library built with the NDK to achieve smooth playback with little delay.
I use the FFmpeg library to connect to the video stream. The NDK part then creates a bitmap, does the image processing and renders frames onto the SurfaceView videoSurfaceView object, which lives in the Android activity (Java part).
To move the webcam I created a separate class:

public class CameraMover implements Runnable { /* ... */ }

This class runs as a separate thread that connects through sockets to the remote camera and handles tasks related ONLY to pan-tilt movement.
Next, in the main activity I created a touch listener:

videoSurfaceView.setOnTouchListener(new View.OnTouchListener() {
    /* ... */
    cameraMover.setPanTilt(some parameters);
    /* ... */
});

which reads the user's finger movement and sends commands to the camera.
All tasks - moving the camera around, the touch interface and video playback - work perfectly as long as one of the others is disabled, i.e. when I disable the ability to move the camera, I can watch the video stream and register touch events until the end of time (or at least until the battery dies). The problem occurs only when the tasks are configured to work simultaneously.
I am unable to find steps to reproduce the problem. It just happens, but only after the user touches the screen to move the camera. It can be 15 seconds after the first interaction, but sometimes it takes the app 10 or more minutes to crash. Usually it is somewhere around a minute.
What have I done to fix it?
- I tried to display millions of logs in logcat to find an error, but the last log was always different.
- I created a transparent surface, put it over the videoSurfaceView and assigned the touch listener to it. It all ended in the same error.
- As I mentioned before, I turned off some functionalities to find out which one produces the error, but it appears that the error occurs only when everything is working simultaneously.
Types of the error
Almost every time the error looks like this:
A/libc(11528): Fatal signal 11 (SIGSEGV), code 1, fault addr 0x9aa9f00c in tid 11637 (AsyncTask #4)
The difference between occurrences is the number right after libc, the fault addr value and the tid number. Rarely the AsyncTask number varies - I received #1 a couple of times but was unable to reproduce it.
Question
How can I avoid this error? What could be its source?
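
One pattern worth checking here (a hypothesis, not something stated in the post): a crash that only appears once the GC has run is often caused by native code stashing a JNI local reference, for example the bitmap or surface handed to the renderer, and reusing it later from another thread. A minimal sketch of the usual safeguard, with made-up function and class names for illustration:

#include <jni.h>

static JavaVM *g_vm;        /* cached once in JNI_OnLoad */
static jobject g_bitmapRef; /* global reference, safe to keep across calls and threads */

JNIEXPORT jint JNICALL JNI_OnLoad(JavaVM *vm, void *reserved) {
    g_vm = vm;
    return JNI_VERSION_1_6;
}

/* Hypothetical entry point: Java hands the target bitmap to the native renderer once. */
JNIEXPORT void JNICALL
Java_com_example_player_NativeRenderer_setTarget(JNIEnv *env, jobject thiz, jobject bitmap) {
    if (g_bitmapRef != NULL) {
        (*env)->DeleteGlobalRef(env, g_bitmapRef);
    }
    /* Storing 'bitmap' directly would keep a local reference that the GC is free to
       invalidate after this call returns; promote it to a global reference instead. */
    g_bitmapRef = (*env)->NewGlobalRef(env, bitmap);
}

/* Called from the native decoding/rendering thread. */
static void render_frame(void) {
    JNIEnv *env;
    /* Threads created in native code must attach before touching any JNI object. */
    (*g_vm)->AttachCurrentThread(g_vm, &env, NULL);
    /* ... lock and draw into g_bitmapRef here (AndroidBitmap_lockPixels, etc.) ... */
    (*g_vm)->DetachCurrentThread(g_vm);
}

If every jobject crossing into the native side is promoted this way (and released with DeleteGlobalRef when no longer needed), the GC can no longer move or free memory out from under the native thread, which would be consistent with the fault address changing on every crash.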
-
Error Opening RTMP Stream through FFmpeg command when executed through exec package [closed]
3 October 2024, by Akhil
I have been trying to transcode the live stream from an RTMP server running at rtmp://localhost:1936/live/test with FFmpeg in a Go application using the os/exec package, but it does not work and gives an input/output error (attached below). The exact same ffmpeg command works as it is supposed to when I execute it in a terminal. I am not sure why that is; here is my code for reproducing and analyzing the mistake.

ffmpegCmd := fmt.Sprintf("ffmpeg -nostdin -i rtmp://localhost:1936/live/%s -c:v libx264 -s %s -f %s %s/stream.mpd",
    streamKey, resolution, sp.OutputFormat, outputPath)
log.Printf("Executing FFmpeg command: %s", ffmpegCmd)

// Prepare the command execution with a timeout context
ctx, cancel := context.WithTimeout(context.Background(), 60*time.Second) // Set a 60-second timeout
defer cancel()

cmd := exec.CommandContext(ctx, "bash", "-c", ffmpegCmd)



The ffmpeg command looks like this:

ffmpeg -nostdin -i rtmp://localhost:1936/live/test -c:v libx264 -s 1920x1080 -f dash output/test/1080p/stream.mpd


I get the following error:


Error opening input: Input/output error
Error opening input file rtmp://localhost:1936/live/test.
Error opening input files: Input/output error
Exiting normally, received signal 2.
signal: interrupt



I have already tried breaking the command into separate arguments and executing it directly, something like:


cmd := exec.CommandContext(ctx,
 "ffmpeg",
 "-nostdin",
 "-i", "rtmp://localhost:1936/live/"+streamKey,
 "-c:v", "libx264",
 "-s", resolution,
 "-f", sp.OutputFormat,
 outputPath+"/stream.mpd")



After running the ffmpeg command with -loglevel debug and -report, here are the logs and errors I get.

When I run it within the Go application:


ffmpeg started on 2024-10-02 at 12:00:06
Report written to "ffmpeg-20241002-120006.log"
Log level: 48
Command line:
ffmpeg -loglevel debug -report -i rtmp://localhost:1936/live/test -c:v libx264 -s 1920x1080 -f dash ./output/test/1080p/stream.mpd
ffmpeg version 7.0.2 Copyright (c) 2000-2024 the FFmpeg developers
 built with Apple clang version 15.0.0 (clang-1500.3.9.4)
 configuration: --prefix=/opt/homebrew/Cellar/ffmpeg/7.0.2_1 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags='-Wl,-ld_classic' --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libaribb24 --enable-libbluray --enable-libdav1d --enable-libharfbuzz --enable-libjxl --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libssh --enable-libsvtav1 --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox --enable-audiotoolbox --enable-neon
 libavutil 59. 8.100 / 59. 8.100
 libavcodec 61. 3.100 / 61. 3.100
 libavformat 61. 1.100 / 61. 1.100
 libavdevice 61. 1.100 / 61. 1.100
 libavfilter 10. 1.100 / 10. 1.100
 libswscale 8. 1.100 / 8. 1.100
 libswresample 5. 1.100 / 5. 1.100
 libpostproc 58. 1.100 / 58. 1.100
Splitting the commandline.
Reading option '-loglevel' ... matched as option 'loglevel' (set logging level) with argument 'debug'.
Reading option '-report' ... matched as option 'report' (generate a report) with argument '1'.
Reading option '-i' ... matched as input url with argument 'rtmp://localhost:1936/live/test'.
Reading option '-c:v' ... matched as option 'c' (select encoder/decoder ('copy' to copy stream without reencoding)) with argument 'libx264'.
Reading option '-s' ... matched as option 's' (set frame size (WxH or abbreviation)) with argument '1920x1080'.
Reading option '-f' ... matched as option 'f' (force container format (auto-detected otherwise)) with argument 'dash'.
Reading option './output/test/1080p/stream.mpd' ... matched as output url.
Finished splitting the commandline.
Parsing a group of options: global .
Applying option loglevel (set logging level) with argument debug.
Applying option report (generate a report) with argument 1.
Successfully parsed a group of options.
Parsing a group of options: input url rtmp://localhost:1936/live/test.
Successfully parsed a group of options.
Opening an input file: rtmp://localhost:1936/live/test.
[AVFormatContext @ 0x13f721f90] Opening 'rtmp://localhost:1936/live/test' for reading
[rtmp @ 0x13f6040e0] No default whitelist set
[tcp @ 0x13f7223d0] No default whitelist set
[tcp @ 0x13f7223d0] Original list of addresses:
[tcp @ 0x13f7223d0] Address ::1 port 1936
[tcp @ 0x13f7223d0] Address 127.0.0.1 port 1936
[tcp @ 0x13f7223d0] Interleaved list of addresses:
[tcp @ 0x13f7223d0] Address ::1 port 1936
[tcp @ 0x13f7223d0] Address 127.0.0.1 port 1936
[tcp @ 0x13f7223d0] Starting connection attempt to ::1 port 1936
[tcp @ 0x13f7223d0] Connection attempt to ::1 port 1936 failed: Connection refused
[tcp @ 0x13f7223d0] Starting connection attempt to 127.0.0.1 port 1936
[tcp @ 0x13f7223d0] Successfully connected to 127.0.0.1 port 1936
[rtmp @ 0x13f6040e0] Handshaking...
[rtmp @ 0x13f6040e0] Type answer 3
[rtmp @ 0x13f6040e0] Server version 13.14.10.13
[rtmp @ 0x13f6040e0] Proto = rtmp, path = /live/test, app = live, fname = test
[rtmp @ 0x13f6040e0] Window acknowledgement size = 5000000
[rtmp @ 0x13f6040e0] Max sent, unacked = 5000000
[rtmp @ 0x13f6040e0] New incoming chunk size = 4096
[rtmp @ 0x13f6040e0] Creating stream...
[rtmp @ 0x13f6040e0] Sending play command for 'test'
[rtmp @ 0x13f6040e0] Deleting stream...
[in#0 @ 0x13f721d40] Error opening input: Input/output error
Error opening input file rtmp://localhost:1936/live/test.
Error opening input files: Input/output error
Exiting normally, received signal 2.



This is what I get when I run the same command in the terminal:


<same as="as" but="but" please="please" scroll="scroll" further="further">

[rtmp @ 0x1437144c0] No default whitelist set
[tcp @ 0x143604f20] No default whitelist set
[tcp @ 0x143604f20] Original list of addresses:
[tcp @ 0x143604f20] Address ::1 port 1936
[tcp @ 0x143604f20] Address 127.0.0.1 port 1936
[tcp @ 0x143604f20] Interleaved list of addresses:
[tcp @ 0x143604f20] Address ::1 port 1936
[tcp @ 0x143604f20] Address 127.0.0.1 port 1936
[tcp @ 0x143604f20] Starting connection attempt to ::1 port 1936
[tcp @ 0x143604f20] Connection attempt to ::1 port 1936 failed: Connection refused
[tcp @ 0x143604f20] Starting connection attempt to 127.0.0.1 port 1936
[tcp @ 0x143604f20] Successfully connected to 127.0.0.1 port 1936
[rtmp @ 0x1437144c0] Handshaking...
[rtmp @ 0x1437144c0] Type answer 3
[rtmp @ 0x1437144c0] Server version 13.14.10.13
[rtmp @ 0x1437144c0] Proto = rtmp, path = /live/test, app = live, fname = test
[rtmp @ 0x1437144c0] Window acknowledgement size = 5000000
[rtmp @ 0x1437144c0] Max sent, unacked = 5000000
[rtmp @ 0x1437144c0] New incoming chunk size = 4096
[rtmp @ 0x1437144c0] Creating stream...
[rtmp @ 0x1437144c0] Sending play command for 'test'
[flv @ 0x143604b30] Format flv probed with size=2048 and score=100
[flv @ 0x143604b30] Before avformat_find_stream_info() pos: 13 bytes read:2263 seeks:0 nb_streams:0
Transform tree:
 mdct_inv_float_c - type: mdct_float, len: 64, factors[2]: [2, any], flags: [unaligned, out_of_place, inv_only]
 fft32_ns_float_neon - type: fft_float, len: 32, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
 mdct_inv_float_c - type: mdct_float, len: 64, factors[2]: [2, any], flags: [unaligned, out_of_place, inv_only]
 fft32_ns_float_neon - type: fft_float, len: 32, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
 mdct_pfa_15xM_inv_float_c - type: mdct_float, len: 120, factors[2]: [15, any], flags: [unaligned, out_of_place, inv_only]
 fft4_fwd_float_neon - type: fft_float, len: 4, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
 mdct_inv_float_c - type: mdct_float, len: 128, factors[2]: [2, any], flags: [unaligned, out_of_place, inv_only]
 fft_sr_ns_float_neon - type: fft_float, len: 64, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
 mdct_pfa_15xM_inv_float_c - type: mdct_float, len: 480, factors[2]: [15, any], flags: [unaligned, out_of_place, inv_only]
 fft16_ns_float_neon - type: fft_float, len: 16, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
 mdct_inv_float_c - type: mdct_float, len: 512, factors[2]: [2, any], flags: [unaligned, out_of_place, inv_only]
 fft_sr_ns_float_neon - type: fft_float, len: 256, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
 mdct_pfa_15xM_inv_float_c - type: mdct_float, len: 960, factors[2]: [15, any], flags: [unaligned, out_of_place, inv_only]
 fft32_ns_float_neon - type: fft_float, len: 32, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
 mdct_inv_float_c - type: mdct_float, len: 1024, factors[2]: [2, any], flags: [unaligned, out_of_place, inv_only]
 fft_sr_ns_float_neon - type: fft_float, len: 512, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
Transform tree:
 mdct_fwd_float_c - type: mdct_float, len: 1024, factors[2]: [2, any], flags: [unaligned, out_of_place, fwd_only]
 fft_sr_ns_float_neon - type: fft_float, len: 512, factor: 2, flags: [aligned, inplace, out_of_place, preshuf]
[NULL @ 0x144124920] nal_unit_type: 7(SPS), nal_ref_idc: 3
[NULL @ 0x144124920] Decoding VUI
[NULL @ 0x144124920] nal_unit_type: 8(PPS), nal_ref_idc: 3
[NULL @ 0x144124920] Decoding VUI
[h264 @ 0x144124920] nal_unit_type: 7(SPS), nal_ref_idc: 3
[h264 @ 0x144124920] Decoding VUI
[h264 @ 0x144124920] nal_unit_type: 8(PPS), nal_ref_idc: 3
[h264 @ 0x144124920] nal_unit_type: 7(SPS), nal_ref_idc: 3
[h264 @ 0x144124920] nal_unit_type: 8(PPS), nal_ref_idc: 3
[h264 @ 0x144124920] nal_unit_type: 5(IDR), nal_ref_idc: 3
[h264 @ 0x144124920] Decoding VUI
[h264 @ 0x144124920] Format yuv420p chosen by get_format().
[h264 @ 0x144124920] Reinit context to 1280x720, pix_fmt: yuv420p
[h264 @ 0x144124920] no picture 
[flv @ 0x143604b30] All info found
[flv @ 0x143604b30] rfps: 29.666667 0.016552
[flv @ 0x143604b30] rfps: 29.750000 0.009347
[flv @ 0x143604b30] rfps: 29.750000 0.009347
[flv @ 0x143604b30] rfps: 29.833333 0.004197
[flv @ 0x143604b30] rfps: 29.916667 0.001104
[flv @ 0x143604b30] rfps: 29.916667 0.001104
[flv @ 0x143604b30] rfps: 30.000000 0.000067
[flv @ 0x143604b30] rfps: 30.000000 0.000067
[flv @ 0x143604b30] rfps: 60.000000 0.000270
[flv @ 0x143604b30] rfps: 60.000000 0.000270
[flv @ 0x143604b30] rfps: 120.000000 0.001079
[flv @ 0x143604b30] rfps: 120.000000 0.001079
[flv @ 0x143604b30] rfps: 240.000000 0.004316
[flv @ 0x143604b30] rfps: 240.000000 0.004316
[flv @ 0x143604b30] rfps: 29.970030 0.000204
[flv @ 0x143604b30] rfps: 29.970030 0.000204
[flv @ 0x143604b30] rfps: 59.940060 0.000814
[flv @ 0x143604b30] rfps: 59.940060 0.000814
[flv @ 0x143604b30] After avformat_find_stream_info() pos: 496783 bytes read:496783 seeks:0 frames:179
Input #0, flv, from 'rtmp://localhost:1936/live/test':
 Metadata:
 |RtmpSampleAccess: true
 Server : NGINX RTMP (github.com/arut/nginx-rtmp-module)
 displayWidth : 1280
 displayHeight : 720
 fps : 30
 profile : 
 level : 
 Duration: 00:00:00.00, start: 6.742000, bitrate: N/A
 Stream #0:0, 138, 1/1000: Audio: aac (LC), 48000 Hz, stereo, fltp, 163 kb/s
 Stream #0:1, 41, 1/1000: Video: h264 (High), 1 reference frame, yuv420p(tv, bt709, progressive, left), 1280x720 [SAR 1:1 DAR 16:9], 0/1, 2560 kb/s, 30 fps, 30 tbr, 1k tbn
Successfully opened the file.
Parsing a group of options: output url ./output/test/1080p/stream.mpd.
Applying option c:v (select encoder/decoder ('copy' to copy stream without reencoding)) with argument libx264.
Applying option s (set frame size (WxH or abbreviation)) with argument 1920x1080.
Applying option f (force container format (auto-detected otherwise)) with argument dash.
Successfully parsed a group of options.
Opening an output file: ./output/test/1080p/stream.mpd.
[out#0/dash @ 0x123707480] No explicit maps, mapping streams automatically...
[vost#0:0/libx264 @ 0x123707d60] Created video stream from input stream 0:1
detected 10 logical cores
[h264 @ 0x123607b70] nal_unit_type: 7(SPS), nal_ref_idc: 3
[h264 @ 0x123607b70] Decoding VUI
[h264 @ 0x123607b70] nal_unit_type: 8(PPS), nal_ref_idc: 3
[aost#0:1/aac @ 0x144028080] Created audio stream from input stream 0:0
Transform tree:
 mdct_inv_float_c - type: md

<it simply="simply" starts="starts" working="working">
</it></same>


I am not sure if it has something to do with permissions.
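
For reference, a minimal sketch of the same invocation without the intermediate bash -c (so exec handles argument splitting rather than a shell), with FFmpeg's stderr forwarded so its own diagnostics reach the Go logs. The parameters simply mirror the command from the question; this is not a known-good configuration:

package main

import (
    "context"
    "log"
    "os"
    "os/exec"
    "time"
)

// runFFmpeg transcodes an RTMP input to DASH; a sketch only, mirroring the
// flags used in the question.
func runFFmpeg(streamKey, resolution, outputFormat, outputPath string) error {
    ctx, cancel := context.WithTimeout(context.Background(), 60*time.Second)
    defer cancel()

    cmd := exec.CommandContext(ctx, "ffmpeg",
        "-nostdin",
        "-i", "rtmp://localhost:1936/live/"+streamKey,
        "-c:v", "libx264",
        "-s", resolution,
        "-f", outputFormat,
        outputPath+"/stream.mpd",
    )

    // Without this, FFmpeg's "Error opening input" messages never reach the Go side.
    cmd.Stdout = os.Stdout
    cmd.Stderr = os.Stderr

    log.Printf("running: %v", cmd.Args)
    return cmd.Run()
}

func main() {
    if err := runFFmpeg("test", "1920x1080", "dash", "output/test/1080p"); err != nil {
        log.Fatal(err)
    }
}

Two things the debug report above suggests checking: whether the stream is actually being published at the moment the Go process starts (the server answers the play command and then immediately deletes the stream), and where the "received signal 2" interrupt comes from, since FFmpeg gives up as soon as it receives SIGINT.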


-
Stream sent via FFMPEG (NodeJS) to RTMP (YouTube) not being received
10 December 2024, by Qumber
I am writing a very basic Chrome extension that captures a video stream and sends it to a Node.js server, which in turn sends it to YouTube's live server.


Here is my implementation of the backend, which receives data via WebRTC and sends it to YouTube using FFmpeg:


const express = require('express');
const cors = require('cors');
const { RTCPeerConnection, RTCSessionDescription } = require('@roamhq/wrtc');
const { spawn } = require('child_process');

const app = express();
app.use(express.json());
app.use(cors());

app.post('/webrtc', async (req, res) => {
 const peerConnection = new RTCPeerConnection();

 // Start ffmpeg process for streaming
 const ffmpeg = spawn('ffmpeg', [
 '-f', 'flv',
 '-i', 'pipe:0',
 '-c:v', 'libx264',
 '-preset', 'veryfast',
 '-maxrate', '3000k',
 '-bufsize', '6000k',
 '-pix_fmt', 'yuv420p',
 '-g', '50',
 '-f', 'flv',
 'rtmp://a.rtmp.youtube.com/live2/MY_KEY'
 ]);

 ffmpeg.on('error', (err) => {
 console.error('FFmpeg error:', err);
 });

 ffmpeg.stderr.on('data', (data) => {
 console.error('FFmpeg stderr:', data.toString());
 });

 ffmpeg.stdout.on('data', (data) => {
 console.log('FFmpeg stdout:', data.toString());
 });

 // Handle incoming tracks
 peerConnection.ontrack = (event) => {
 console.log('Track received:', event.track.kind);
 const track = event.track;

 // Stream the incoming track to FFmpeg
 track.onunmute = () => {
 console.log('Track unmuted:', track.kind);
 const reader = track.createReadStream();
 reader.on('data', (chunk) => {
 console.log('Forwarding chunk to FFmpeg:', chunk.length);
 ffmpeg.stdin.write(chunk);
 });
 reader.on('end', () => {
 console.log('Stream ended');
 ffmpeg.stdin.end();
 });
 };

 track.onmute = () => {
 console.log('Track muted:', track.kind);
 };
 };

 // Set the remote description (offer) received from the client
 await peerConnection.setRemoteDescription(new RTCSessionDescription(req.body.sdp));

 // Create an answer and send it back to the client
 const answer = await peerConnection.createAnswer();
 await peerConnection.setLocalDescription(answer);

 res.json({ sdp: peerConnection.localDescription });
});

app.listen(3000, () => {
 console.log('WebRTC to RTMP server running on port 3000');
});




This is the output I get, but nothing gets sent to YouTube:




FFmpeg stderr: ffmpeg version 7.0.2 Copyright (c) 2000-2024 the FFmpeg developers
 built with Apple clang version 15.0.0 (clang-1500.3.9.4)

FFmpeg stderr: configuration: --prefix=/opt/homebrew/Cellar/ffmpeg/7.0.2_1 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags='-Wl,-ld_classic' --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libaribb24 --enable-libbluray --enable-libdav1d --enable-libharfbuzz --enable-libjxl --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libssh --enable-libsvtav1 --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox --enable-audiotoolbox --enable-neon

FFmpeg stderr: libavutil 59. 8.100 / 59. 8.100
 libavcodec 61. 3.100 / 61. 3.100
 libavformat 61. 1.100 / 61. 1.100
 libavdevice 61. 1.100 / 61. 1.100

FFmpeg stderr: libavfilter 10. 1.100 / 10. 1.100
 libswscale 8. 1.100 / 8. 1.100
 libswresample 5. 1.100 / 5. 1.100
 libpostproc 58. 1.100 / 58. 1.100





I do not understand what I am doing wrong. Any help would be appreciated.
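
Two details stand out in the backend above: as far as I know, wrtc's MediaStreamTrack does not provide a createReadStream() method, so the ontrack handler may never write anything into FFmpeg, and the FFmpeg input is declared as -f flv although what would be written to pipe:0 is raw track data, not an FLV container. A sketch of one way to bridge the two, assuming the @roamhq/wrtc fork keeps node-webrtc's nonstandard RTCVideoSink API (frames arrive as raw I420 buffers):

const { nonstandard } = require('@roamhq/wrtc');
const { spawn } = require('child_process');

// Sketch only: pipe decoded I420 frames from a WebRTC video track into FFmpeg
// as rawvideo and let FFmpeg encode and push the FLV/RTMP stream.
function pipeTrackToRtmp(track, rtmpUrl) {
  const sink = new nonstandard.RTCVideoSink(track);
  let ffmpeg = null;

  sink.onframe = ({ frame }) => {
    if (!ffmpeg) {
      // rawvideo input needs the frame geometry up front, so FFmpeg is
      // started lazily once the first frame (and its size) is known.
      ffmpeg = spawn('ffmpeg', [
        '-f', 'rawvideo', '-pix_fmt', 'yuv420p',
        '-s', `${frame.width}x${frame.height}`, '-r', '30',
        '-i', 'pipe:0',
        '-c:v', 'libx264', '-preset', 'veryfast', '-pix_fmt', 'yuv420p', '-g', '60',
        '-f', 'flv', rtmpUrl,
      ]);
      ffmpeg.stderr.on('data', (d) => process.stderr.write(d));
    }
    ffmpeg.stdin.write(Buffer.from(frame.data.buffer, frame.data.byteOffset, frame.data.byteLength));
  };

  track.onended = () => {
    sink.stop();
    if (ffmpeg) ffmpeg.stdin.end();
  };
}

YouTube ingestion also generally expects an audio track, so a silent AAC track may be needed before the dashboard reports the stream as live; a quick way to validate the RTMP URL, key and encoder flags in isolation is to stream FFmpeg's lavfi testsrc to the same URL from a terminal.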



Optionally, here is the frontend code from the extension, which (to me) appears to be recording and sending the capture:


popup.js & popup.html




document.addEventListener('DOMContentLoaded', () => {
 document.getElementById('openCapturePage').addEventListener('click', () => {
 chrome.tabs.create({
 url: chrome.runtime.getURL('capture.html')
 });
 });
});






 
 <code class="echappe-js"><script src='http://stackoverflow.com/feeds/tag/popup.js'></script>




StreamSavvy













capture.js & capture.html




let peerConnection;

async function startStreaming() {
 try {
 const stream = await navigator.mediaDevices.getDisplayMedia({
 video: {
 cursor: "always"
 },
 audio: false
 });

 peerConnection = new RTCPeerConnection({
 iceServers: [{
 urls: 'stun:stun.l.google.com:19302'
 }]
 });

 stream.getTracks().forEach(track => peerConnection.addTrack(track, stream));

 const offer = await peerConnection.createOffer();
 await peerConnection.setLocalDescription(offer);

 const response = await fetch('http://localhost:3000/webrtc', {
 method: 'POST',
 headers: {
 'Content-Type': 'application/json'
 },
 body: JSON.stringify({
 sdp: peerConnection.localDescription
 })
 });

 const {
 sdp
 } = await response.json();
 await peerConnection.setRemoteDescription(new RTCSessionDescription(sdp));

 console.log("Streaming to server via WebRTC...");
 } catch (error) {
 console.error("Error starting streaming:", error.name, error.message);
 }
}

async function stopStreaming() {
 if (peerConnection) {
 // Stop all media tracks
 peerConnection.getSenders().forEach(sender => {
 if (sender.track) {
 sender.track.stop();
 }
 });

 // Close the peer connection
 peerConnection.close();
 peerConnection = null;
 console.log("Streaming stopped");
 }
}

document.addEventListener('DOMContentLoaded', () => {
 document.getElementById('startCapture').addEventListener('click', startStreaming);
 document.getElementById('stopCapture').addEventListener('click', stopStreaming);
});






 
 <code class="echappe-js"><script src='http://stackoverflow.com/feeds/tag/capture.js'></script>




StreamSavvy Capture















background.js (service worker)




chrome.runtime.onInstalled.addListener(() => {
 console.log("StreamSavvy Extension Installed");
});

chrome.runtime.onMessage.addListener((message, sender, sendResponse) => {
 if (message.type === 'startStreaming') {
 chrome.tabs.create({
 url: chrome.runtime.getURL('capture.html')
 });
 sendResponse({
 status: 'streaming'
 });
 } else if (message.type === 'stopStreaming') {
 chrome.tabs.query({
 url: chrome.runtime.getURL('capture.html')
 }, (tabs) => {
 if (tabs.length > 0) {
 chrome.tabs.sendMessage(tabs[0].id, {
 type: 'stopStreaming'
 });
 sendResponse({
 status: 'stopped'
 });
 }
 });
 }
 return true; // Keep the message channel open for sendResponse
});