
Media (1)
-
MediaSPIP Simple : futur thème graphique par défaut ?
26 September 2013, by
Updated: October 2013
Language: French
Type: Video
Other articles (82)
-
Sites built with MediaSPIP
2 May 2011, by
This page presents some of the sites running MediaSPIP.
You can of course add your own via the form at the bottom of the page. -
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or later. If needed, contact the administrator of your MediaSPIP to find out. -
Emballe médias: what is it for?
4 February 2011, by
This plugin manages sites for uploading documents of all types.
It creates "media", meaning: a "media" is a SPIP article created automatically when a document is uploaded, whether audio, video, image, or text; only one document can be linked to a given "media" article;
On other sites (5648)
-
Export your data from Universal Analytics or Universal Analytics 360
26 June 2024, by Erin -
ffmpeg error "Error sending frames to consumers: No space left on device" [closed]
4 August 2024, by Bernard Vatonen
I use a short function to convert videos into the format I need. I have used this function all the time and it worked with no problems. Lately I made some changes, and now I get a critical error: "no space left on device".

[af#0:1 @ 0x600001ad9cb0] Error sending frames to consumers: No space left on device
[af#0:1 @ 0x600001ad9cb0] Task finished with error code: -28 (No space left on device)
[af#0:1 @ 0x600001ad9cb0] Terminating thread with return code -28 (No space left on device)


I have this function saved in my .zshrc file:




function indianull() {
  ffmpeg -i movies/$1.* -i /Documents/indianul3.png -filter_complex "[1][0]scale2ref=w=oh*mdar:h=ih*0.1[logo][video];[video][logo]overlay=x=main_w-overlay_w-(main_w*0.04):y=main_h*0.14,subtitles=subs/$1.srt:force_style='FontSize=22,WrapStyle=0,MarginV=35'" -preset fast -s 720x480 -vcodec libx264 -shortest output/$1-sub.mp4
}






indianull tt0066763





[Parsed_subtitles_2 @ 0x600001fd8bb0] libass API version: 0x1703000
[Parsed_subtitles_2 @ 0x600001fd8bb0] libass source: tarball: 0.17.3
[Parsed_subtitles_2 @ 0x600001fd8bb0] Shaper: FriBidi 1.0.15 (SIMPLE) HarfBuzz-ng 9.0.0 (COMPLEX)
[Parsed_subtitles_2 @ 0x600001fd8bb0] Using font provider coretext



Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'movies/tt0066763.mp4':
Metadata:
major_brand : isom
minor_version : 512
compatible_brands: isomiso2avc1mp41
title : Anand.1971.720p.BluRay.x264-x0r
encoder : Lavf57.83.100
Duration: 02:02:08.08, start: 0.000000, bitrate: 1212 kb/s
Chapters:
Chapter #0:0: start 0.000000, end 209.375000
Stream #0:0[0x1](hin): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 1280x720 [SAR 1:1 DAR 16:9], 1079 kb/s, 24 fps, 24 tbr, 12288 tbn (default)
Metadata:
handler_name : VideoHandler
vendor_id : [0][0][0][0]
Stream #0:1[0x2](hin): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 127 kb/s (default)
Metadata:
handler_name : SoundHandler
vendor_id : [0][0][0][0]
Stream #0:2[0x3](eng): Data: bin_data (text / 0x74786574)
Metadata:
handler_name : SubtitleHandler
Input #1, png_pipe, from '/Users/bv2004/Documents/indianul3.png':
Duration: N/A, bitrate: N/A
Stream #1:0: Video: png, rgba(pc, gbr/unknown/unknown), 630x124, 25 fps, 25 tbr, 25 tbn
File 'output/tt0066763-sub.mp4' already exists. Overwrite? [y/N] y
Stream mapping:
Stream #0:0 (h264) -> scale2ref (graph 0)
Stream #1:0 (png) -> scale2ref (graph 0)
subtitles:default (graph 0) -> Stream #0:0 (libx264)
Stream #0:1 -> #0:1 (aac (native) -> aac (native))
Press [q] to stop, [?] for help
[Parsed_subtitles_2 @ 0x600001fcc420] libass API version: 0x1703000
[Parsed_subtitles_2 @ 0x600001fcc420] libass source: tarball: 0.17.3
[Parsed_subtitles_2 @ 0x600001fcc420] Shaper: FriBidi 1.0.15 (SIMPLE) HarfBuzz-ng 9.0.0 (COMPLEX)
[Parsed_subtitles_2 @ 0x600001fcc420] Using font provider coretext
[vost#0:0/libx264 @ 0x128e07bb0] No filtered frames for output stream, trying to initialize anyway.
[libx264 @ 0x128e084e0] using SAR=32/27
[libx264 @ 0x128e084e0] using cpu capabilities: ARMv8 NEON
[libx264 @ 0x128e084e0] profile High, level 3.0, 4:2:0, 8-bit
[libx264 @ 0x128e084e0] 264 - core 164 r3108 31e19f9 - H.264/MPEG-4 AVC codec - Copyleft 2003-2023 - http://www.videolan.org/x264.html - options: cabac=1 ref=2 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=6 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=15 lookahead_threads=2 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=1 keyint=250 keyint_min=24 scenecut=40 intra_refresh=0 rc_lookahead=30 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to 'output/tt0066763-sub.mp4':
Metadata:
major_brand : isom
minor_version : 512
compatible_brands: isomiso2avc1mp41
title : Anand.1971.720p.BluRay.x264-x0r
encoder : Lavf61.1.100
encoder : Lavc61.3.100 aac
[af#0:1 @ 0x600001ad9cb0] Error sending frames to consumers: No space left on device
[af#0:1 @ 0x600001ad9cb0] Task finished with error code: -28 (No space left on device)
[af#0:1 @ 0x600001ad9cb0] Terminating thread with return code -28 (No space left on device)
[out#0/mp4 @ 0x600001dd8540] video:0KiB audio:0KiB subtitle:0KiB other streams:0KiB global headers:0KiB muxing overhead: unknown
[out#0/mp4 @ 0x600001dd8540] Output file is empty, nothing was encoded (check -ss / -t / -frames parameters if used)
frame= 0 fps=0.0 q=0.0 Lsize= 2KiB time=N/A bitrate=N/A speed=N/A
[aac @ 0x128e09070] Qavg: nan
Conversion failed!



df -h
Filesystem Size Used Avail Capacity iused ifree %iused Mounted on
/dev/disk3s3s1 926Gi 9.6Gi 539Gi 2% 404k 4.3G 0% /
devfs 200Ki 200Ki 0Bi 100% 693 0 100% /dev
/dev/disk3s6 926Gi 1.0Gi 539Gi 1% 1 5.7G 0% /System/Volumes/VM
/dev/disk3s4 926Gi 5.7Gi 539Gi 2% 1.1k 5.7G 0% /System/Volumes/Preboot
/dev/disk3s2 926Gi 89Mi 539Gi 1% 53 5.7G 0% /System/Volumes/Update
/dev/disk1s2 500Mi 6.0Mi 479Mi 2% 1 4.9M 0% /System/Volumes/xarts
/dev/disk1s1 500Mi 6.1Mi 479Mi 2% 31 4.9M 0% /System/Volumes/iSCPreboot
/dev/disk1s3 500Mi 3.9Mi 479Mi 1% 57 4.9M 0% /System/Volumes/Hardware
/dev/disk3s1 926Gi 370Gi 539Gi 41% 778k 5.7G 0% /System/Volumes/Data
map auto_home 0Bi 0Bi 0Bi 100% 0 0 - /System/Volumes/Data/home



I ran the function indianull and expected it to convert a video into my desired format, add the logo, and add the subtitles. It takes the original video from one folder and the subtitles from another, converts the video, and saves the result in a third folder.


The only issue I see is:

[af#0:1 @ 0x600001ad9cb0] Error sending frames to consumers: No space left on device
[af#0:1 @ 0x600001ad9cb0] Task finished with error code: -28 (No space left on device)
[af#0:1 @ 0x600001ad9cb0] Terminating thread with return code -28 (No space left on device)
[out#0/mp4 @ 0x600001dd8540] video:0KiB audio:0KiB subtitle:0KiB other streams:0KiB global headers:0KiB muxing overhead: unknown


I definitely have over 400 GB free on my SSD!
Is there a specific folder that is full and cannot hold any more files?


Any idea how to solve this, or any suggestion on what to try?
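Since "No space left on device" (ENOSPC, error -28) can refer to any volume the process touches rather than the root filesystem alone, one quick sanity check is the free space on the per-user temp directory and on the volume that actually holds the working folders. A minimal sketch, assuming the folder layout from the question:

```shell
# ENOSPC can come from any volume ffmpeg writes to, not just /.
# Check the per-user temp dir and the home volume separately:
df -h "${TMPDIR:-/tmp}"   # per-user scratch space
df -h ~                   # home volume (where movies/, subs/, output/ live)
```

On macOS, `/` and `/System/Volumes/Data` are separate APFS volumes, so running `df -h .` from the working directory shows the volume that matters rather than the read-only system snapshot.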


-
Stream sent via FFmpeg (Node.js) to RTMP (YouTube) not being received
10 December 2024, by Qumber
I am writing a very basic Chrome extension that captures a video stream and sends it to a Node.js server, which in turn sends it to the YouTube Live server.


Here is my implementation of the backend, which receives data via WebRTC and sends it to YouTube using FFmpeg:


const express = require('express');
const cors = require('cors');
const { RTCPeerConnection, RTCSessionDescription } = require('@roamhq/wrtc');
const { spawn } = require('child_process');

const app = express();
app.use(express.json());
app.use(cors());

app.post('/webrtc', async (req, res) => {
 const peerConnection = new RTCPeerConnection();

 // Start ffmpeg process for streaming
 const ffmpeg = spawn('ffmpeg', [
 '-f', 'flv',
 '-i', 'pipe:0',
 '-c:v', 'libx264',
 '-preset', 'veryfast',
 '-maxrate', '3000k',
 '-bufsize', '6000k',
 '-pix_fmt', 'yuv420p',
 '-g', '50',
 '-f', 'flv',
 'rtmp://a.rtmp.youtube.com/live2/MY_KEY'
 ]);

 ffmpeg.on('error', (err) => {
 console.error('FFmpeg error:', err);
 });

 ffmpeg.stderr.on('data', (data) => {
 console.error('FFmpeg stderr:', data.toString());
 });

 ffmpeg.stdout.on('data', (data) => {
 console.log('FFmpeg stdout:', data.toString());
 });

 // Handle incoming tracks
 peerConnection.ontrack = (event) => {
 console.log('Track received:', event.track.kind);
 const track = event.track;

 // Stream the incoming track to FFmpeg
 track.onunmute = () => {
 console.log('Track unmuted:', track.kind);
 const reader = track.createReadStream();
 reader.on('data', (chunk) => {
 console.log('Forwarding chunk to FFmpeg:', chunk.length);
 ffmpeg.stdin.write(chunk);
 });
 reader.on('end', () => {
 console.log('Stream ended');
 ffmpeg.stdin.end();
 });
 };

 track.onmute = () => {
 console.log('Track muted:', track.kind);
 };
 };

 // Set the remote description (offer) received from the client
 await peerConnection.setRemoteDescription(new RTCSessionDescription(req.body.sdp));

 // Create an answer and send it back to the client
 const answer = await peerConnection.createAnswer();
 await peerConnection.setLocalDescription(answer);

 res.json({ sdp: peerConnection.localDescription });
});

app.listen(3000, () => {
 console.log('WebRTC to RTMP server running on port 3000');
});




This is the output I get, but nothing gets sent to YouTube:




FFmpeg stderr: ffmpeg version 7.0.2 Copyright (c) 2000-2024 the FFmpeg developers
 built with Apple clang version 15.0.0 (clang-1500.3.9.4)

FFmpeg stderr: configuration: --prefix=/opt/homebrew/Cellar/ffmpeg/7.0.2_1 --enable-shared --enable-pthreads --enable-version3 --cc=clang --host-cflags= --host-ldflags='-Wl,-ld_classic' --enable-ffplay --enable-gnutls --enable-gpl --enable-libaom --enable-libaribb24 --enable-libbluray --enable-libdav1d --enable-libharfbuzz --enable-libjxl --enable-libmp3lame --enable-libopus --enable-librav1e --enable-librist --enable-librubberband --enable-libsnappy --enable-libsrt --enable-libssh --enable-libsvtav1 --enable-libtesseract --enable-libtheora --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-lzma --enable-libfontconfig --enable-libfreetype --enable-frei0r --enable-libass --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libspeex --enable-libsoxr --enable-libzmq --enable-libzimg --disable-libjack --disable-indev=jack --enable-videotoolbox --enable-audiotoolbox --enable-neon

FFmpeg stderr: libavutil 59. 8.100 / 59. 8.100
 libavcodec 61. 3.100 / 61. 3.100
 libavformat 61. 1.100 / 61. 1.100
 libavdevice 61. 1.100 / 61. 1.100

FFmpeg stderr: libavfilter 10. 1.100 / 10. 1.100
 libswscale 8. 1.100 / 8. 1.100
 libswresample 5. 1.100 / 5. 1.100
 libpostproc 58. 1.100 / 58. 1.100





I do not understand what I am doing wrong. Any help would be appreciated.
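One thing worth noting about the setup above: the ffmpeg process is told to parse an FLV container from stdin (`-f flv -i 'pipe:0'`), while WebRTC tracks deliver decoded media, not FLV bytes. As an illustration only, here is a hedged sketch of the argument list that raw I420 video frames would need instead; the resolution, frame rate, and the idea of obtaining frames from a nonstandard frame sink are assumptions for the sketch, not something taken from the question:

```javascript
// Sketch only: build an ffmpeg argument list for raw I420 frames on stdin.
// width/height/fps are hypothetical and would have to match the frames a
// frame sink actually reports.
function rawVideoArgs({ width, height, fps, rtmpUrl }) {
  return [
    '-f', 'rawvideo',            // stdin carries raw frames, not a container
    '-pix_fmt', 'yuv420p',       // I420 layout
    '-s', `${width}x${height}`,  // input options must precede -i
    '-r', String(fps),
    '-i', 'pipe:0',
    '-c:v', 'libx264',
    '-preset', 'veryfast',
    '-pix_fmt', 'yuv420p',
    '-f', 'flv',                 // FLV belongs on the *output* side for RTMP
    rtmpUrl,
  ];
}

const args = rawVideoArgs({
  width: 1280, height: 720, fps: 30,
  rtmpUrl: 'rtmp://a.rtmp.youtube.com/live2/MY_KEY',
});
console.log(args.join(' '));
```

With this shape, `-f flv` remains only on the output side, which is what an RTMP endpoint expects.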



Optionally, here is the frontend code from the extension, which (to me) appears to be recording and sending the capture:


popup.js & popup.html




document.addEventListener('DOMContentLoaded', () => {
 document.getElementById('openCapturePage').addEventListener('click', () => {
 chrome.tabs.create({
 url: chrome.runtime.getURL('capture.html')
 });
 });
});






 

capture.js & capture.html




let peerConnection;

async function startStreaming() {
 try {
 const stream = await navigator.mediaDevices.getDisplayMedia({
 video: {
 cursor: "always"
 },
 audio: false
 });

 peerConnection = new RTCPeerConnection({
 iceServers: [{
 urls: 'stun:stun.l.google.com:19302'
 }]
 });

 stream.getTracks().forEach(track => peerConnection.addTrack(track, stream));

 const offer = await peerConnection.createOffer();
 await peerConnection.setLocalDescription(offer);

 const response = await fetch('http://localhost:3000/webrtc', {
 method: 'POST',
 headers: {
 'Content-Type': 'application/json'
 },
 body: JSON.stringify({
 sdp: peerConnection.localDescription
 })
 });

 const {
 sdp
 } = await response.json();
 await peerConnection.setRemoteDescription(new RTCSessionDescription(sdp));

 console.log("Streaming to server via WebRTC...");
 } catch (error) {
 console.error("Error starting streaming:", error.name, error.message);
 }
}

async function stopStreaming() {
 if (peerConnection) {
 // Stop all media tracks
 peerConnection.getSenders().forEach(sender => {
 if (sender.track) {
 sender.track.stop();
 }
 });

 // Close the peer connection
 peerConnection.close();
 peerConnection = null;
 console.log("Streaming stopped");
 }
}

document.addEventListener('DOMContentLoaded', () => {
 document.getElementById('startCapture').addEventListener('click', startStreaming);
 document.getElementById('stopCapture').addEventListener('click', stopStreaming);
});






 


background.js (service worker)




chrome.runtime.onInstalled.addListener(() => {
 console.log("StreamSavvy Extension Installed");
});

chrome.runtime.onMessage.addListener((message, sender, sendResponse) => {
 if (message.type === 'startStreaming') {
 chrome.tabs.create({
 url: chrome.runtime.getURL('capture.html')
 });
 sendResponse({
 status: 'streaming'
 });
 } else if (message.type === 'stopStreaming') {
 chrome.tabs.query({
 url: chrome.runtime.getURL('capture.html')
 }, (tabs) => {
 if (tabs.length > 0) {
 chrome.tabs.sendMessage(tabs[0].id, {
 type: 'stopStreaming'
 });
 sendResponse({
 status: 'stopped'
 });
 }
 });
 }
 return true; // Keep the message channel open for sendResponse
});