
Other articles (54)
-
Websites made with MediaSPIP
2 May 2011. This page lists some websites based on MediaSPIP.
-
Creating farms of unique websites
13 April 2011. MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
This allows (among other things):
- implementation costs to be shared between several different projects / individuals
- rapid deployment of multiple unique sites
- creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)
-
Authorizations overridden by plugins
27 April 2010. MediaSPIP core:
autoriser_auteur_modifier() so that visitors can edit their information on the authors page
On other sites (8821)
-
ffmpeg save mp3 file from available wss stream
11 July 2021, by phoenixAZ
In a hello world node.js app I am succeeding in getting a feed from a Twilio conference and sending it to Google speech-to-text. Concurrently I want to control recording of the available audio stream to mp3 (programmatically call start and stop). The wss is subscribed to the audio stream, but I don't know how to attach ffmpeg to the local stream. I have tried:


const ffmpeg = require('fluent-ffmpeg'); // inferred from the chained API below

// ffmpeg('rtsp://host:port/path/to/stream')
// experimenting with telling it to use the local stream
//
// ffmpeg(wss.addListener) // invalid input error
// ffmpeg(wss.stream) // this hits the console error below
ffmpeg(wss.stream)
 .noVideo()
 .audioChannels(1)
 .audioBitrate(128)
 .duration('1:00')
 .on('end', function () { console.log('saved mp3'); })
 .on('error', function (err) { console.log('error mp3', err.message); })
 .save('/path/to/output.mp3');



Any suggestions are welcome. I am working in a node.js project.
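
One direction worth sketching: the chained calls above look like fluent-ffmpeg, and fluent-ffmpeg accepts a readable stream as input, so the WebSocket frames can be pumped into a PassThrough that ffmpeg then reads. This is an untested sketch, not a confirmed fix: it assumes the Twilio media stream sends JSON messages whose media events carry base64-encoded mu-law audio at 8 kHz (the Media Streams default), and that wss is the WebSocket server referenced above.

const ffmpeg = require('fluent-ffmpeg');
const { PassThrough } = require('stream');

// Bridge: WebSocket audio frames are written into a PassThrough,
// and that stream becomes ffmpeg's input.
const audio = new PassThrough();

wss.on('connection', function (ws) {
 ws.on('message', function (raw) {
 const msg = JSON.parse(raw);
 if (msg.event === 'media') {
 // decode the base64 payload and feed it to ffmpeg's input stream
 audio.write(Buffer.from(msg.media.payload, 'base64'));
 } else if (msg.event === 'stop') {
 audio.end(); // ending the input is the "stop recording" control
 }
 });
});

ffmpeg(audio)
 .inputFormat('mulaw') // raw mu-law needs an explicit format,
 .inputOptions(['-ar 8000', '-ac 1']) // sample rate, and channel count
 .audioBitrate(128)
 .on('end', function () { console.log('saved mp3'); })
 .on('error', function (err) { console.log('error mp3', err.message); })
 .save('/path/to/output.mp3');

To start and stop recording programmatically, create a fresh PassThrough and ffmpeg command per recording and call audio.end() when that recording should finish.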


-
Multiprocessing | Multithreading ffmpeg in python
13 October 2016, by lordcantide
I am developing a python WSGI script to interface with an HDHomeRun Prime. In a perfect world it will pass URI values as commands to FFMPEG and display the resulting stream in a browser. I have the "show stuff in browser" and the "pass instructions to FFMPEG" parts working fine, but I do not have them working simultaneously.
1) Given that this middleware is being used to transcode MPEG-2 to h.264, does it make more sense to use multiprocessing or multithreading to start and stop the respective processes?
2) If the WSGI script is brokering the initiation of FFMPEG feeds (if the input feed isn't already brokered) and connecting clients to the associated FFServer streams, does that mean I'll need to use some sort of pool to keep track of the middleware's activities?
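
On question 1, one common line of reasoning, sketched below: ffmpeg does the heavy MPEG-2 to h.264 work in its own OS process either way, so the Python side is mostly bookkeeping, and launching ffmpeg with subprocess plus a small registry dict (the "pool" of question 2) is often all that is needed. This is an illustrative sketch, not a drop-in answer: the channel IDs, URLs, paths, and codec flags are made-up placeholders.

import subprocess

# One ffmpeg child process per feed, tracked in a plain dict.
streams = {}  # channel id -> running ffmpeg Popen handle

def start_stream(channel_id, input_url, output_path):
    """Start a transcode for this channel unless one is already running."""
    proc = streams.get(channel_id)
    if proc is not None and proc.poll() is None:
        return proc  # feed already brokered; reuse it
    proc = subprocess.Popen([
        'ffmpeg', '-i', input_url,
        '-c:v', 'libx264', '-c:a', 'aac',
        output_path,
    ])
    streams[channel_id] = proc
    return proc

def stop_stream(channel_id):
    """Stop and forget the transcode for this channel, if any."""
    proc = streams.pop(channel_id, None)
    if proc is not None and proc.poll() is None:
        proc.terminate()
        proc.wait()

A multiprocessing.Pool adds little here, because its workers would only sit waiting on the ffmpeg children; a thread or two (or just the dict above) suffices for the bookkeeping.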
-
pnacl-clang doesn't know where ffmpeg libraries are (but Eclipse does?)
10 August 2014, by lavsprat
I'm trying to make my first "hello world"-like app using the ffmpeg libraries. I already got the NaCl SDK and downloaded & compiled the ffmpeg port.
This is my code:
main.c
#include <libavformat/avformat.h>

int main()
{
    av_register_all();
    return 0;
}

Building with
$ (...)/pnacl-clang main.c -o main -lavformat
in a terminal. The output:
main.c:2:10: fatal error: 'libavformat/avformat.h' file not found
#include <libavformat/avformat.h>
^

Now, why am I not using
-L(...)\lib
and -I(...)\include
in the build command? Because it should work without them. At my workplace, nacl-clang somehow knows where the libs are and compiles everything successfully. Why is that not working on my personal computer? How can I permanently let pnacl-clang know where to look for them?
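
For reference, here is what the same build looks like with the search paths spelled out. The (...) placeholders stand for wherever the ffmpeg port was installed, which varies by setup, so this is a pattern rather than an exact command: -I must point at the directory containing libavformat/avformat.h, and -L at the one containing libavformat.a.

$ (...)/pnacl-clang main.c -o main \
    -I(...)/include \
    -L(...)/lib \
    -lavformat

One way to make this "permanent" without patching the toolchain is to wrap the command in a small shell script or Makefile that bakes the flags in.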