
Media (1)
-
GetID3 - File information block
9 April 2013, by
Updated: May 2013
Language: French
Type: Image
Other articles (109)
-
Mediabox: open images in the maximum space available to the user
8 February 2011, by
Image display is constrained by the width allowed by the site design (which depends on the theme in use), so images are shown at a reduced size. To make use of all the space available on the user's screen, it is possible to add a feature that displays the image in a media box appearing above the rest of the content.
To do this, the "Mediabox" plugin must be installed.
Configuring the media box
Once (...) -
The farm's regular Cron tasks
1 December 2010, by
Managing the farm involves running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of all the instances of the mutualisation on a regular basis. Combined with a system Cron on the central site of the mutualisation, this generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...) -
Encoding and processing into web-friendly formats
13 April 2011, by
MediaSPIP automatically converts uploaded files to internet-compatible formats.
Video files are encoded to MP4, OGV and WebM (supported by HTML5), as well as MP4 (supported by Flash).
Audio files are encoded to MP3 and Ogg (supported by HTML5), as well as MP3 (supported by Flash).
Where possible, text is analysed to extract the data needed by search engines, and is then exported as a series of image files.
All uploaded files are stored online in their original format, so you can (...)
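For illustration only (these are not MediaSPIP's actual commands, just the kind of conversion described above), the HTML5- and Flash-friendly variants can be produced with ffmpeg roughly like this, with upload.mov and upload.wav standing in for an uploaded file:

ffmpeg -i upload.mov -c:v libx264 -c:a aac video.mp4
ffmpeg -i upload.mov -c:v libtheora -c:a libvorbis video.ogv
ffmpeg -i upload.mov -c:v libvpx -c:a libvorbis video.webm
ffmpeg -i upload.wav -c:a libmp3lame audio.mp3
ffmpeg -i upload.wav -c:a libvorbis audio.ogg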
On other sites (8841)
-
How can I stop ffmpeg when no new frames come to /dev/video0
17 May 2022, by itirou tarou
I am modifying the stream from a camera with our program (programA) and outputting it as .ts files, like below.


camera => programA => /dev/video0 => ffmpeg => .ts files


When the camera stops, programA stops automatically and no new frames arrive at /dev/video0.
I want ffmpeg to stop automatically in that situation.
What ffmpeg option should I use?
Currently I'm using the options below, but ffmpeg doesn't stop.


ffmpeg -f video4linux2 -i /dev/video0 -vsync vfr -f hls -c:v h264_nvenc out.m3u8



Edit:

I tried to disable /dev/video0 (by unloading the v4l2loopback module) but it failed, as shown below.

# modprobe -r v4l2loopback
modprobe: FATAL: Module v4l2loopback is in use.



After I stopped programA and ffmpeg, the command succeeded.

ProgramA is started by a systemd service and restarts automatically when it stops.

ffmpeg is started by programB, but programB doesn't control programA.

But it sounds like a good idea to inform programB so that it can kill ffmpeg.

I'll try that way. Thank you.
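A minimal sketch of what programB could do, assuming it is a Node.js process (the 10-second threshold and the SIGINT choice are arbitrary): spawn ffmpeg with its -progress output on a pipe and kill it once the frame counter stops advancing.

var children = require('child_process');

// Sketch only: programB spawns ffmpeg and watches its -progress output.
var ffm = children.spawn('ffmpeg', [
    '-progress', 'pipe:1',                    // periodic key=value progress lines on stdout
    '-f', 'video4linux2', '-i', '/dev/video0',
    '-vsync', 'vfr', '-c:v', 'h264_nvenc', '-f', 'hls', 'out.m3u8'
]);

var lastFrame = -1;
var lastChange = Date.now();

ffm.stdout.on('data', function(chunk) {
    // progress output contains lines such as "frame=123"
    var m = /frame=\s*(\d+)/.exec(chunk.toString());
    if (m && Number(m[1]) !== lastFrame) {
        lastFrame = Number(m[1]);
        lastChange = Date.now();
    }
});

// If no new frame has been reported for 10 seconds, assume /dev/video0 has gone quiet.
var watchdog = setInterval(function() {
    if (Date.now() - lastChange > 10000) {
        ffm.kill('SIGINT');                   // SIGINT lets ffmpeg finish writing its segments
        clearInterval(watchdog);
    }
}, 1000);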

-
Live audio using ffmpeg, javascript and nodejs
8 November 2017, by klaus
I am new to this. Please don't hang me for the poor grammar. I am trying to create a proof-of-concept application which I will later extend. It does the following: we have an HTML page which asks for permission to use the microphone. We capture the microphone input and send it via WebSocket to a Node.js app.
JS (Client):
var bufferSize = 4096;
var context = new AudioContext();   // audio context used by the nodes below
var socket = new WebSocket(URL);
var myPCMProcessingNode = context.createScriptProcessor(bufferSize, 1, 1);
myPCMProcessingNode.onaudioprocess = function(e) {
    // mono Float32 samples from the microphone, converted to 16-bit PCM
    var input = e.inputBuffer.getChannelData(0);
    socket.send(convertFloat32ToInt16(input));
}
function convertFloat32ToInt16(buffer) {
    var l = buffer.length;
    var buf = new Int16Array(l);
    while (l--) {
        buf[l] = Math.min(1, buffer[l]) * 0x7FFF;
    }
    return buf.buffer;
}
navigator.mediaDevices.getUserMedia({audio: true, video: false})
    .then(function(stream) {
        var microphone = context.createMediaStreamSource(stream);
        microphone.connect(myPCMProcessingNode);
        myPCMProcessingNode.connect(context.destination);
    })
    .catch(function(e) {});

In the server we take each incoming buffer, run it through ffmpeg, and send what comes out of stdout to another device using the Node.js 'http' POST. The device has a speaker. We are basically trying to create a one-way audio link from the browser to the device.
Node.js (Server):
var WebSocketServer = require('websocket').server;
var http = require('http');
var children = require('child_process');

// HTTP and WebSocket server setup (implied by the snippet; the port is arbitrary)
var httpServer = http.createServer(function(request, response) { response.end(); });
httpServer.listen(8080);
var wsServer = new WebSocketServer({ httpServer: httpServer });

wsServer.on('request', function(request) {
    var connection = request.accept(null, request.origin);
    connection.on('message', function(message) {
        if (message.type === 'utf8') { /*NOP*/ }
        else if (message.type === 'binary') {
            // raw PCM chunk from the browser, piped straight into ffmpeg
            ffm.stdin.write(message.binaryData);
        }
    });
    connection.on('close', function(reasonCode, description) {});
    connection.on('error', function(error) {});
});

var ffm = children.spawn(
    './ffmpeg.exe',
    '-stdin -f s16le -ar 48k -ac 2 -i pipe:0 -acodec pcm_u8 -ar 48000 -f aiff pipe:1'.split(' ')
);
ffm.on('exit', function(code, signal) {});
ffm.stdout.on('data', (data) => {
    // whatever ffmpeg emits is forwarded to the device as a chunked POST body
    req.write(data);
});

var options = {
    host: 'xxx.xxx.xxx.xxx',
    port: xxxx,
    path: '/path/to/service/on/device',
    method: 'POST',
    headers: {
        'Content-Type': 'application/octet-stream',
        'Content-Length': 0,
        'Authorization': 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx',
        'Transfer-Encoding': 'chunked',
        'Connection': 'keep-alive'
    }
};
var req = http.request(options, function(res) {});

The device supports only continuous POST and only a couple of formats (ulaw, aiff, wav).
This solution doesn't seem to work. From the device speaker we only hear something like white noise.
Also, I think I may have a problem with the buffer I am sending to ffmpeg's stdin: I tried dumping whatever comes out of the WebSocket to a .wav file and playing it with VLC, and it plays everything in the recording very fast, about 10 seconds of recording in roughly 1 second.
I am new to audio processing and have searched for about 3 days now for ways to improve this, and found nothing.
I would ask the community for two things:
-
Is something wrong with my approach? What more can I do to make this work? I will post more details if required.
-
If what I am doing is reinventing the wheel, then I would like to know what other software or third-party service (like Amazon or whatever) can accomplish the same thing.
Thank you.
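One mismatch worth checking (an assumption, not something verified above): the ScriptProcessorNode delivers mono samples at the AudioContext's own sample rate (often 44100 Hz), while ffmpeg is told to expect stereo at 48 kHz. A minimal sketch of a spawn call whose input description matches what the client code above actually sends, assuming a 44100 Hz context and a single forwarded channel:

var children = require('child_process');

// Sketch only: describe the raw input exactly as the browser produces it.
var ffm = children.spawn('./ffmpeg.exe', [
    '-f', 's16le',        // raw signed 16-bit little-endian PCM
    '-ar', '44100',       // must equal context.sampleRate on the client (assumed 44100 here)
    '-ac', '1',           // the client forwards a single channel
    '-i', 'pipe:0',
    '-acodec', 'pcm_u8',
    '-ar', '48000',
    '-f', 'aiff',
    'pipe:1'
]);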
-
-
Loading and unloading a C JNI library based on when it's needed
30 September 2014, by Alin
I finally managed to compile ffmpeg for Android, and I've been able to use it in my app.
Here is the scenario of my app:
- I show the user a gridview with thumbnails of images and videos
- the user can click on a cell and is taken to the image/video details, where he can see the full image or play the video
- the user can apply an image over a video, and this is when ffmpeg is used
So basically, the user might never actually use the watermarking option, or might use it very rarely, because there are far fewer videos available than images.
I am loading the ffmpeg library the first time it is needed by running:
static {
    System.loadLibrary("ffmpeglib");
}
Now here are my questions:
- does loading the library like this use the app's memory and resources?
- can I unload the library, or rather, is it necessary to unload it? I have not found any Java call like System.unloadLibrary to take care of unloading
- since the library might be used rarely, wouldn't a load => encode => unload cycle be a better approach? Or maybe keeping it loaded would allow easy reuse, since no reloading would be necessary.
- if I use an IntentService to load the library and do the encoding, does the library get unloaded when the service completes the job?