
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011
Updated: July 2013
Language: French
Type: Text
Other articles (53)
-
Keeping control of your media in your hands
13 April 2011
The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media sharing.
While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...) -
Images
15 May 2013 -
Mediabox: opening images in the maximum space available to the user
8 February 2011
Image viewing is restricted by the width allotted by the site's design (depending on the theme used), so images are displayed in a reduced format. To take advantage of all the space available on the user's screen, it is possible to add a feature that displays the image in a multimedia box appearing above the rest of the content.
To do this, the "Mediabox" plugin must be installed.
Configuring the multimedia box
From (...)
On other sites (7686)
-
Firebase Functions: FFmpeg Images to Video [closed]
28 August 2020, by Vinayak Vanarse
I am testing Firebase Cloud Functions' ability to process FFmpeg commands.


I have a set of images in .jpeg format (around 92). They are named in sequence, e.g. Imgx_1.jpeg, Imgx_2.jpeg … Imgx_92.jpeg.


Each image is on average 125 KB to 200 KB in size.


Approach: a pretty straightforward one...


- Download the images in the Cloud Function into the /tmp folder (obtained from os.tmpdir()). For now I am doing it sequentially (I know this is not a good design, but let's go with it; it can be converted later to Promise.all for async processing).
- Spawn the FFmpeg command with input images from /tmp and output the video to /tmp.
- Push the /tmp video into the bucket.
On my Mac, it works perfectly fine with the same command and files.


Issue:

- The FFmpeg command exits with code 1 in the Firebase/Google Cloud Function.

Spawn error: "name":"ChildProcessError","code":1,"childProcess":


Firebase Functions (3.11.0):


"engines": {
 "node": "10"
 },



Dependencies:


"child-process-promise": "^2.2.1",
"@ffmpeg-installer/ffmpeg": "^1.0.20",



Code snippet to reproduce:


const spawn = require('child-process-promise').spawn;
const { Storage } = require('@google-cloud/storage');
var configStorage = {
  projectId: '<your project>',
  keyFilename: '<service>.json'
};
const gcs = new Storage(configStorage);
const path = require('path');
const os = require('os');
const fs = require('fs');
const mkdirp = require('mkdirp-promise');

const ffmpegPath = require('@ffmpeg-installer/ffmpeg').path;

const tempDir = os.tmpdir();
var idx = 1;
var ext = '.jpeg';
const bucket = gcs.bucket('Your_bucket');
const files = [/* <urls of files> */];
Promise.all(files.map(async (file) => {
  var finalName = 'Imgx' + '_' + idx + ext;
  await file.download({ destination: tempDir + '/' + finalName });
  idx++;
})
).then(async (result) => {
  console.log(`cmd -> ${ffmpegPath} ['-start_number 1', '-i',
  ${tempDir}/Imgx_%d${ext},
  ${tempDir}/video/videoFile.mp4]`);

  await spawn(ffmpegPath,
    ['-start_number 1', '-i', tempDir + '/Imgx_%d' + ext,
      tempDir + '/video/videoFile.mp4']);

  await bucket.upload(tempDir + '/video/videoFile.mp4', {
    destination: 'video/output.mp4'
  });
});


-
Unhandled stream error in pipe: write EPIPE in Node.js
13 July 2020, by Michael Romanenko
The idea is to serve screenshots of an RTSP video stream with an Express.js server. There is a continuously running spawned openRTSP process in flowing mode (its stdout is consumed by another ffmpeg process):



function spawnProcesses (camera) {
  var openRTSP = spawn('openRTSP', ['-c', '-v', '-t', camera.rtsp_url]),
      encoder = spawn('ffmpeg', ['-i', 'pipe:', '-an', '-vcodec', 'libvpx', '-r', 10, '-f', 'webm', 'pipe:1']);

  openRTSP.stdout.pipe(encoder.stdin);

  openRTSP.on('close', function (code) {
    if (code !== 0) {
      console.log('openRTSP process exited with code ' + code);
    }
  });

  encoder.on('close', function (code) {
    if (code !== 0) {
      console.log('Encoder process exited with code ' + code);
    }
  });

  return { rtsp: openRTSP, encoder: encoder };
}

...

camera.proc = spawnProcesses(camera);




There is an Express server with a single route:



app.get('/cameras/:id.jpg', function(req, res){
  var camera = _.find(cameras, {id: parseInt(req.params.id, 10)});
  if (camera) {
    res.set({'Content-Type': 'image/jpeg'});
    var ffmpeg = spawn('ffmpeg', ['-i', 'pipe:', '-an', '-vframes', '1', '-s', '800x600', '-f', 'image2', 'pipe:1']);
    camera.proc.rtsp.stdout.pipe(ffmpeg.stdin);
    ffmpeg.stdout.pipe(res);
  } else {
    res.status(404).send('Not found');
  }
});

app.listen(3333);




When I request http://localhost:3333/cameras/1.jpg I get the desired image, but from time to time the app breaks with this error:


stream.js:94
 throw er; // Unhandled stream error in pipe.
 ^
Error: write EPIPE
 at errnoException (net.js:901:11)
 at Object.afterWrite (net.js:718:19)




The strange thing is that sometimes it successfully streams the image to the res stream and closes the child process without any error, but sometimes it streams the image and then crashes.


I tried to create on('error', ...) event handlers on every possible stream, and tried to change pipe(...) calls to on('data', ...) constructions, but could not succeed.


My environment: node v0.10.22, OS X Mavericks 10.9.



UPDATE:



I wrapped the spawn('ffmpeg', ...) block with try-catch:


app.get('/cameras/:id.jpg', function(req, res){
....
  try {
    var ffmpeg = spawn('ffmpeg', ['-i', 'pipe:', '-an', '-vframes', '1', '-s', '800x600', '-f', 'image2', 'pipe:1']);
    camera.proc.rtsp.stdout.pipe(ffmpeg.stdin);
    ffmpeg.stdout.pipe(res);
  } catch (e) {
    console.log("Gotcha!", e);
  }
....
});




... and the error disappeared, but the log is silent: it doesn't catch any errors. What's wrong?


-
How do I send a mediaStream from the Electron renderer process to a background ffmpeg process?
26 July 2020, by Samamoma_Vadakopa
Goal (to avoid the XY problem):


I'm building a small linux desktop application using webRTC, electron, and create-react-app. The application should receive a mediaStream via a webRTC peer connection, display the stream to the user, create a virtual webcam device, and send the stream to the virtual webcam so it can be selected as the input on most major videoconferencing platforms.


Problem:


The individual parts all work: receiving the stream (webRTC), creating the webcam device (v4l2loopback), creating a child process of ffmpeg from within Electron, passing the video stream to the ffmpeg process, streaming the video to the virtual device using ffmpeg, and selecting the virtual device and seeing the video stream in a videoconference meeting.


But I'm currently stuck on tying the parts together.
The problem is, the mediaStream object is available inside electron's renderer process (as state in a deeply nested react component, FWIW). As far as I can tell, I can only create a node.js child process of ffmpeg from within electron's main process. That implies that I need to get the mediaStream from the renderer to the main process. To communicate between processes, electron uses an IPC system. Unfortunately, it seems that IPC doesn't support sending a complex object like a video stream.


What I've tried:


- Starting an ffmpeg child process (using child_process.spawn) from within the renderer process throws an 'fs.fileexistssync' error. Browsing SO indicates that only the main process can start these background processes.

- Creating a separate webRTC connection between renderer and main to re-stream the video. I'm using IPC to facilitate the connection, but offer/answer descriptions aren't reaching the other peer over IPC; my guess is this is due to the same limitations on IPC as before.

My next step is to create a separate node server on app startup which ingests the incoming RTC stream and rebroadcasts it to the app's renderer process, as well as to a background ffmpeg process.


Before I try that, though, does anyone have suggestions for approaches I should consider? (This is my first SO question, so any advice on how to improve it is appreciated.)
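Since a live MediaStream object can't cross the IPC boundary, one approach worth weighing before standing up a separate node server is to move only encoded bytes across it: encode in the renderer with MediaRecorder and ship the chunks to the main process as ArrayBuffers, which IPC can serialize. A sketch under stated assumptions; the /dev/video10 device path, the 'frame-chunk' channel name, and v4l2FfmpegArgs are made up for illustration:

```javascript
// Build the argv for an ffmpeg process that reads encoded video from
// stdin and writes raw frames to a v4l2loopback device.
function v4l2FfmpegArgs(device) {
  return ['-i', 'pipe:0', '-f', 'v4l2', '-pix_fmt', 'yuv420p', device];
}

// renderer (sketch):
//   const rec = new MediaRecorder(stream, { mimeType: 'video/webm' });
//   rec.ondataavailable = async (ev) =>
//     ipcRenderer.send('frame-chunk', await ev.data.arrayBuffer());
//   rec.start(100); // emit a chunk every 100 ms
//
// main (sketch):
//   const ffmpeg = spawn('ffmpeg', v4l2FfmpegArgs('/dev/video10'));
//   ipcMain.on('frame-chunk', (_e, chunk) =>
//     ffmpeg.stdin.write(Buffer.from(chunk)));
```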

