
Media (1)
-
MediaSPIP Simple: future default graphic theme?
26 September 2013, by
Updated: October 2013
Language: French
Type: Video
Other articles (40)
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed MediaSPIP is at version 0.2 or higher. If needed, contact your MediaSPIP administrator to find out. -
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer flash player is used as a fallback.
The HTML5 player was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
These technologies make it possible to deliver video and sound both on conventional computers (...) -
HTML5 audio and video support
13 April 2011, by
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (6963)
-
Streaming h264 from a named pipe to a browser (how to set up the webserver)
22 August 2020, by Dean
I have an h264 stream from a webcam: I set up this stream from a C library with
mkfifo
and I can display it with mplayer using the following line:

execlp("xterm", "xterm", "-e", "mplayer", "-demuxer", "h264es", fifo_name, "-benchmark", "-really-quiet", NULL);



This opens an xterm window with an mplayer instance that demuxes the h264 stream from the named pipe.


I must, somehow, stream this to a browser (one client is enough). Is there any way to achieve this by setting up a webserver on the same local machine that reads from this named pipe and sends the h264 stream to a remote browser?
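One possible approach, sketched below in Python for brevity: let ffmpeg read the raw h264 from the fifo, remux it (no re-encode) into fragmented MP4, and write that to a browser over plain HTTP. The fifo path and port are placeholders, and this assumes ffmpeg is on the PATH; it is a minimal single-client sketch, not a production setup.

```python
# Sketch: serve a raw h264 named pipe to one browser client as
# fragmented MP4. "/tmp/camera.fifo" and port 8080 are placeholders;
# assumes ffmpeg is installed and the fifo was created with mkfifo.
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

def ffmpeg_remux_cmd(fifo_path):
    """ffmpeg command: read a raw h264 elementary stream, copy it
    without transcoding, and emit fragmented MP4, the MP4 flavour
    that can be written to a non-seekable pipe."""
    return [
        "ffmpeg",
        "-f", "h264",        # input is a raw h264 elementary stream
        "-i", fifo_path,
        "-c:v", "copy",      # remux only, no re-encoding
        "-f", "mp4",
        "-movflags", "frag_keyframe+empty_moov",  # no seeking needed
        "-",                 # write the MP4 to stdout
    ]

class StreamHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "video/mp4")
        self.end_headers()
        proc = subprocess.Popen(ffmpeg_remux_cmd("/tmp/camera.fifo"),
                                stdout=subprocess.PIPE)
        try:
            # Relay ffmpeg's stdout to the browser in 64 KiB chunks.
            for chunk in iter(lambda: proc.stdout.read(64 * 1024), b""):
                self.wfile.write(chunk)
        finally:
            proc.kill()

# To run: HTTPServer(("", 8080), StreamHandler).serve_forever()
```

The browser can then point a plain `<video>` tag at the server's URL; since the stream is H.264 in MP4, no plugin is needed.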


-
Unable to pipe the output of ffmpeg using nodejs stdout
4 March 2014, by rughimire
I am not able to pipe the output of ffmpeg over stdout.
The following is the code I have written so far:
var http = require('http')
  , fs = require('fs')
var child_process = require("child_process")

http.createServer(function (req, res) {
  console.log("Request:", dump_req(req), "\n")

  // path of the file to serve
  var path = 'test-mp4.mp4' //test-mp4-long.mp4
    , stat = fs.statSync(path)
    , total = stat.size

  var range = req.headers.range
    , parts = range.replace(/bytes=/, "").split("-")
    , partialstart = parts[0]
    , partialend = parts[1]
    , start = parseInt(partialstart, 10)
    , end = partialend ? parseInt(partialend, 10) : total - 1
    , chunksize = (end - start) + 1
  console.log('RANGE: ' + start + ' - ' + end + ' = ' + chunksize + "\n")

  var ffmpeg = child_process.spawn("ffmpeg", [
    "-i", path,       // input path
    "-b:v", "64k",    // video bitrate 64k
    "-bufsize", "64k",
    "-"               // output to STDOUT
  ]);

  // set headers
  res.writeHead(206, {
    'Content-Range': 'bytes ' + start + '-' + end + '/' + total,
    'Accept-Ranges': 'bytes',
    'Content-Length': chunksize,
    'Content-Type': 'video/mp4'
  })

  stdout[ params[1] ] = ffmpeg.stdout

  // Pipe the video output to the client response
  ffmpeg.stdout.pipe(res);
  console.log("Response", dump_res(res), "\n")
}).listen(1337)

When I replace the ffmpeg part of the above code, everything works fine. The following is the code I use instead:
var file = fs.createReadStream(path, {start: start, end: end})
and I pipe it like this:
file.pipe(res)
What am I doing wrong?
Edit:
The ffmpeg command itself works fine. I have tested it on the command line and it generates the proper output. -
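A likely culprit in the code above: no output format is given for `-`, and ffmpeg cannot guess one for a pipe, while the default MP4 muxer additionally needs a seekable output to write the moov atom at the end. Sketched below (in Python rather than Node, for brevity) is an argument list with the two options that make MP4 output work on stdout; the input path is a placeholder. Note also that when re-encoding, the output size is unknown, so the 206/Content-Range/Content-Length headers from the byte-range branch no longer apply; a plain 200 with Content-Type is more consistent.

```python
# Sketch: ffmpeg arguments that produce MP4 on stdout. Without
# "-f mp4" ffmpeg cannot infer an output format for "-", and without
# fragmented-MP4 flags the muxer tries to seek back to write the moov
# atom and fails on a pipe.
def pipeable_mp4_args(input_path):
    return [
        "-i", input_path,
        "-b:v", "64k",
        "-bufsize", "64k",
        "-f", "mp4",                              # format must be explicit on a pipe
        "-movflags", "frag_keyframe+empty_moov",  # moov up front, no seeking
        "-",                                      # write to stdout
    ]
```

The same two options can be added directly to the `child_process.spawn("ffmpeg", [...])` array in the Node code above.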
Unable to correctly format opencv raw video data to pipe to ffmpeg to transcode and stream over network
8 February 2014, by AnthonyAlatorre
I am trying to pipe the output of a simple OpenCV program to ffmpeg using the following code.
#include <opencv2/opencv.hpp>
#include <iostream>
using namespace cv;
using namespace std;

int main(){
    VideoCapture cap(-1);
    if(!cap.isOpened()){
        cout << "Unable to capture webcam." << endl;
        return -1;
    }
    cap.set(cv::CAP_PROP_FRAME_WIDTH, 640);
    cap.set(cv::CAP_PROP_FRAME_HEIGHT, 480);
    bool done = false;
    while(!done){
        Mat frame;
        cap >> frame;
        if(frame.empty()){
            cout << "Unable to read frame." << endl;
            return -1;
        }
        //imshow("frame", frame);
        cout << frame.data;
        if(cv::waitKey(25) >= 0){ done = true; }
    }
    destroyAllWindows();
    return 0;
}
I have tried using the commands from the following posts: Pipe raw OpenCV images to FFmpeg and How to avoid a growing delay with ffmpeg between sound and raw video data ?. But I still seem to be getting incorrect output. I have also tried converting the BGR image to YCrCb, splitting the channels, and merging the same image back together with the Cr and Cb channels swapped. Here is the code for that:

bool done = false;
while(!done){
    Mat frame;
    cap >> frame;
    if(frame.empty()){
        cout << "Unable to read frame." << endl;
        return -1;
    }
    //imshow("frame", frame);
    //cout << frame.data;
    cvtColor(frame, frame, cv::COLOR_BGR2YCrCb);
    vector<Mat> ycrcb(3);
    split(frame, ycrcb);
    vector<Mat> newChannels;
    newChannels.push_back(ycrcb[0]);
    newChannels.push_back(ycrcb[2]);
    newChannels.push_back(ycrcb[1]);
    Mat res;
    merge(newChannels, res);
    //cout << res.data;
    //imshow("res", res);
    cout << res.data;
    if(cv::waitKey(25) >= 0){ done = true; }
}
destroyAllWindows();
return 0;
}
This seems to stop the video from stuttering, however the color is incorrect. I am not sure whether I have to do 4:2:0 subsampling on the newly converted YCrCb image. I have also gone through these posts for help: Converting YUV into BGR or RGB in OpenCV and How to convert YUV422 (sub sampling) to YUV ?. I also have screenshots of the problems, but I do not have enough rep points to post images. Any help would be appreciated, and thank you very much for your time.
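One likely source of the corruption in the code above: `cout << frame.data` treats the pixel buffer as a NUL-terminated string, so output stops at the first zero byte and the byte count per frame is never guaranteed. The usual fix is to write the full raw BGR buffer and tell ffmpeg exactly what the bytes are with the rawvideo demuxer. Sketched below in Python (for brevity; the equivalent C++ fix is `cout.write(reinterpret_cast<char*>(frame.data), frame.total() * frame.elemSize())`); the output URL and codec choice are illustrative assumptions.

```python
# Sketch: feed raw BGR frames to ffmpeg's stdin via the rawvideo
# demuxer, which expects headerless pixel data of a declared size
# and pixel format. Assumes ffmpeg is on the PATH; the UDP output
# URL is a placeholder.
import subprocess

def rawvideo_encode_cmd(width, height, out_url):
    return [
        "ffmpeg",
        "-f", "rawvideo",            # input is headerless pixel data
        "-pixel_format", "bgr24",    # OpenCV's native Mat layout
        "-video_size", f"{width}x{height}",
        "-i", "-",                   # frames arrive on stdin
        "-c:v", "libx264",           # encode to H.264
        "-f", "mpegts",              # a stream-friendly container
        out_url,
    ]

def stream_frames(next_frame, width=640, height=480,
                  out_url="udp://127.0.0.1:1234"):
    """next_frame() should return one frame's raw BGR bytes
    (width * height * 3 of them), or None when done."""
    proc = subprocess.Popen(rawvideo_encode_cmd(width, height, out_url),
                            stdin=subprocess.PIPE)
    while (data := next_frame()) is not None:
        proc.stdin.write(data)   # every byte, zero bytes included
    proc.stdin.close()
    proc.wait()
```

With this layout there is no need for the manual BGR-to-YCrCb conversion at all: ffmpeg performs the color-space conversion and any 4:2:0 subsampling itself when encoding.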