
Medias (91)
-
Head down (wav version)
26 September 2011, by
Updated: April 2013
Language: English
Type: Audio
-
Echoplex (wav version)
26 September 2011, by
Updated: April 2013
Language: English
Type: Audio
-
Discipline (wav version)
26 September 2011, by
Updated: April 2013
Language: English
Type: Audio
-
Letting you (wav version)
26 September 2011, by
Updated: April 2013
Language: English
Type: Audio
-
1 000 000 (wav version)
26 September 2011, by
Updated: April 2013
Language: English
Type: Audio
-
999 999 (wav version)
26 September 2011, by
Updated: April 2013
Language: English
Type: Audio
Other articles (37)
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is running version 0.2 or higher. If in doubt, contact the administrator of your MediaSPIP to find out.
-
Supporting all media types
13 April 2011, by
Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats:
images: png, gif, jpg, bmp and more
audio: MP3, Ogg, Wav and more
video: AVI, MP4, OGV, mpg, mov, wmv and more
text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)
-
Specific configuration for PHP5
4 February 2011, by
PHP5 is mandatory; you can install it by following this dedicated tutorial.
It is recommended to disable safe_mode at first; however, if it is correctly configured and the necessary binaries are accessible, MediaSPIP should work correctly with safe_mode enabled.
Specific modules
Certain specific PHP modules must be installed, via your distribution's package manager or manually: php5-mysql for connectivity with the (...)
On other websites (3397)
-
Streaming client over TCP and RTSP through Wi-Fi or LAN in Android
6 January 2015, by Gowtham
I am struggling to develop a streaming client for DVR cameras. Using VLC media player over the RTSP protocol I found a working solution (with standard Wi-Fi routers such as Netgear), but the same approach does not work with other Wi-Fi modems. I am now working with the FFmpeg framework to implement the streaming client on Android through the JNI API, but I cannot work out how to implement the JNI side.
The network camera itself works with the IP Cam Viewer app.
My code is below:
/*****************************************************/
/* functional call */
/*****************************************************/
jboolean Java_FFmpeg_allocateBuffer( JNIEnv* env, jobject thiz )
{
// Allocate an AVFrame structure
pFrameRGB=avcodec_alloc_frame();
if(pFrameRGB==NULL)
return 0;
sprintf(debugMsg, "%d %d", screenWidth, screenHeight);
INFO(debugMsg);
// Determine required buffer size and allocate buffer
numBytes=avpicture_get_size(dstFmt, screenWidth, screenHeight);
/*
numBytes=avpicture_get_size(dstFmt, pCodecCtx->width,
pCodecCtx->height);
*/
buffer=(uint8_t *)av_malloc(numBytes * sizeof(uint8_t));
// Assign appropriate parts of buffer to image planes in pFrameRGB
// Note that pFrameRGB is an AVFrame, but AVFrame is a superset
// of AVPicture
avpicture_fill((AVPicture *)pFrameRGB, buffer, dstFmt, screenWidth, screenHeight);
return 1;
}
/* for each decoded frame */
jbyteArray Java_FFmpeg_getNextDecodedFrame( JNIEnv* env, jobject thiz )
{
av_free_packet(&packet);
while(av_read_frame(pFormatCtx, &packet)>=0) {
if(packet.stream_index==videoStream) {
avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
if(frameFinished) {
img_convert_ctx = sws_getContext(pCodecCtx->width, pCodecCtx->height, pCodecCtx->pix_fmt, screenWidth, screenHeight, dstFmt, SWS_BICUBIC, NULL, NULL, NULL);
/*
img_convert_ctx = sws_getContext(pCodecCtx->width, pCodecCtx->height, pCodecCtx->pix_fmt, pCodecCtx->width, pCodecCtx->height, dstFmt, SWS_BICUBIC, NULL, NULL, NULL);
*/
sws_scale(img_convert_ctx, (const uint8_t* const*)pFrame->data, pFrame->linesize,
0, pCodecCtx->height, pFrameRGB->data, pFrameRGB->linesize);
++frameCount;
/* uint8_t == unsigned 8 bits == jboolean */
jbyteArray nativePixels = (*env)->NewByteArray(env, numBytes);
(*env)->SetByteArrayRegion(env, nativePixels, 0, numBytes, buffer);
return nativePixels;
}
}
av_free_packet(&packet);
}
return NULL;
}
/*****************************************************/
/* / functional call */
/*****************************************************/
jstring
Java_FFmpeg_play( JNIEnv* env, jobject thiz, jstring jfilePath )
{
INFO("--- Play");
char* filePath = (char *)(*env)->GetStringUTFChars(env, jfilePath, NULL);
RE(filePath);
/*****************************************************/
AVFormatContext *pFormatCtx;
int i, videoStream;
AVCodecContext *pCodecCtx;
AVCodec *pCodec;
AVFrame *pFrame;
AVPacket packet;
int frameFinished;
float aspect_ratio;
struct SwsContext *img_convert_ctx;
INFO(filePath);
/* FFmpeg */
av_register_all();
if(av_open_input_file(&pFormatCtx, filePath, NULL, 0, NULL)!=0)
RE("failed av_open_input_file ");
if(av_find_stream_info(pFormatCtx)<0)
RE("failed av_find_stream_info");
videoStream=-1;
for(i=0; i<pFormatCtx->nb_streams; i++)
if(pFormatCtx->streams[i]->codec->codec_type==AVMEDIA_TYPE_VIDEO) {
videoStream=i;
break;
}
if(videoStream==-1)
RE("failed videostream == -1");
pCodecCtx=pFormatCtx->streams[videoStream]->codec;
pCodec=avcodec_find_decoder(pCodecCtx->codec_id);
if(pCodec==NULL) {
RE("Unsupported codec!");
}
if(avcodec_open(pCodecCtx, pCodec)<0)
RE("failed codec_open");
pFrame=avcodec_alloc_frame();
/* /FFmpeg */
INFO("codec name:");
INFO(pCodec->name);
INFO("Getting into stream decode:");
/* video stream */
i=0;
while(av_read_frame(pFormatCtx, &packet)>=0) {
if(packet.stream_index==videoStream) {
avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
if(frameFinished) {
++i;
INFO("frame finished");
AVPicture pict;
/*
img_convert_ctx = sws_getContext(pCodecCtx->width, pCodecCtx->height,
pCodecCtx->pix_fmt, pCodecCtx->width, pCodecCtx->height,
PIX_FMT_YUV420P, SWS_BICUBIC, NULL, NULL, NULL);
sws_scale(img_convert_ctx, (const uint8_t* const*)pFrame->data, pFrame->linesize,
0, pCodecCtx->height, pict.data, pict.linesize);
*/
}
}
av_free_packet(&packet);
}
/* /video stream */
av_free(pFrame);
avcodec_close(pCodecCtx);
av_close_input_file(pFormatCtx);
RE("end of main");
}
I can't get the frames from the network camera. Could someone also suggest how to implement a live-stream client for a DVR camera on Android?
-
Unable to transfer continuous FFmpeg buffer to client browser using node.js
10 December 2016, by chintitomasud
I have tried to transcode a video file on demand, using FFmpeg to transfer the chunks (buffers) to the client browser in MP4 format, but I have failed to get the MP4 content to show in the HTML5 video player. Without FFmpeg, all the code runs properly; I replaced createReadStream with FFmpeg and have faced some problems since. FFmpeg is new to me and I'm kind of confused by the spawn method. When I post a URL path, the following text shows on the command line:
Spawning new process /samiul113039/1080.mp4:GET
piping ffmpeg output to client, pid 10016
HTTP connection disrupted, killing ffmpeg: 10016
Spawning new process /samiul113039/1080.mp4:GET
piping ffmpeg output to client, pid 4796
HTTP connection disrupted, killing ffmpeg: 4796
ffmpeg didn't quit on q, sending signals
ffmpeg has exited: 10016, code null
ffmpeg didn't quit on q, sending signals
ffmpeg has exited: 4796, code null
var fs=require('fs');
var url=require("url");
var urlvalue="http://csestudents.uiu.ac.bd/samiul113039/1080.mp4";
var parseurl=url.parse(urlvalue);
var HDHomeRunIP = parseurl.hostname;
var HDHomeRunPort = parseurl.port;
var childKillTimeoutMs = 1000;
var parseArgs = require('minimist')(process.argv.slice(2));
// define startsWith for string
if (typeof String.prototype.startsWith != 'function') {
// see below for better implementation!
String.prototype.startsWith = function (str){
return this.indexOf(str) == 0;
};
}
// Called when the response object fires the 'close' handler, kills ffmpeg
function responseCloseHandler(command) {
if (command.exited != true) {
console.log('HTTP connection disrupted, killing ffmpeg: ' + command.pid);
// Send a 'q' which signals ffmpeg to quit.
// Then wait half a second, send a nice signal, wait another half second
// and send SIGKILL
command.stdin.write('q\n');
command.stdin.destroy();
// install timeout and wait
setTimeout(function() {
if (command.exited != true) {
console.log('ffmpeg didn\'t quit on q, sending signals');
// still connected, do safe sig kills
command.kill();
try {
command.kill('SIGQUIT');
} catch (err) {}
try {
command.kill('SIGINT');
} catch (err) {}
// wait some more!
setTimeout(function() {
if (command.exited != true) {
console.log('ffmpeg didn\'t quit on signals, sending SIGKILL');
// at this point, just give up and whack it
try {
command.kill('SIGKILL');
} catch (err) {}
}
}, childKillTimeoutMs);
}
}, childKillTimeoutMs);
}
}
// Performs a proxy. Copies data from proxy_request into response
function doProxy(request,response,http,options) {
var proxy_request = http.request(options, function (proxy_response) {
proxy_response.on('data', function(chunk) {
response.write(chunk, 'binary');
});
proxy_response.on('end', function() {
response.end();
});
response.writeHead(proxy_response.statusCode, proxy_response.headers);
});
request.on('data', function(chunk) {
proxy_request.write(chunk, 'binary');
});
// error handler
proxy_request.on('error', function(e) {
console.log('problem with request: ' + e.message);
response.writeHeader(500);
response.end();
});
proxy_request.end();
}
var child_process = require('child_process');
var auth = require('./auth');
// Performs the transcoding after the URL is validated
function doTranscode(request,response) {
//res.setHeader("Accept-Ranges", "bytes");
response.setHeader("Accept-Ranges", "bytes");
response.setHeader("Content-Type", "video/mp4");
response.setHeader("Connection","close");
response.setHeader("Cache-Control","no-cache");
response.setHeader("Pragma","no-cache");
// always write the header
response.writeHeader(200);
// if get, spawn command stream it
if (request.method == 'GET') {
console.log('Spawning new process ' + request.url + ":" + request.method);
var command = child_process.spawn('ffmpeg',
['-i','http://csestudents.uiu.ac.bd/samiul113039/1080.mp4','-f','mpegts','-'],
{ stdio: ['pipe','pipe','ignore'] });
command.exited = false;
// handler for when ffmpeg dies unexpectedly
command.on('exit',function(code,signal) {
console.log('ffmpeg has exited: ' + command.pid + ", code " + code);
// set flag saying we've quit
command.exited = true;
response.end();
});
command.on('error',function(error) {
console.log('ffmpeg error handler - unable to kill: ' + command.pid);
// on well, might as well give up
command.exited = true;
try {
command.stdin.close();
} catch (err) {}
try {
command.stdout.close();
} catch (err) {}
try {
command.stderr.close();
} catch (err) {}
response.end();
});
// handler for when client closes the URL connection - stop ffmpeg
response.on('end',function() {
responseCloseHandler(command);
});
// handler for when client closes the URL connection - stop ffmpeg
response.on('close',function() {
responseCloseHandler(command);
});
// now stream
console.log('piping ffmpeg output to client, pid ' + command.pid);
command.stdout.pipe(response);
command.stdin.on('error',function(err) {
console.log("Weird error in stdin pipe ", err);
response.end();
});
command.stdout.on('error',function(err) {
console.log("Weird error in stdout pipe ",err);
response.end();
});
}
else {
// not GET, so close response
response.end();
}
}
// Load the http module to create an http server.
var http = require('http');
// Configure our HTTP server to respond with Hello World to all requests.
var server = http.createServer(function (request, response) {
//console.log("New connection from " + request.socket.remoteAddress + ":" + request.url);
if (auth.validate(request,response)) {
// first send a HEAD request to our HD Home Run with the same url to see if the address is valid.
// This prevents an ffmpeg instance to spawn when clients request invalid things - like robots.txt/etc
var options = {method: 'HEAD', hostname: HDHomeRunIP, port: HDHomeRunPort, path: request.url};
var req = http.request(options, function(res) {
// if they do a get, and it returns good status
if (request.method == "GET" &&
res.statusCode == 200 &&
res.headers["content-type"] != null &&
res.headers["content-type"].startsWith("video")) {
// transcode is possible, start it now!
doTranscode(request,response);
}
else {
// no video or error, cannot transcode, just forward the response from the HD Home run to the client
if (request.method == "HEAD") {
response.writeHead(res.statusCode,res.headers);
response.end();
}
else {
// do a 301 redirect and have the device response directly
// just proxy it, that way browser doesn't redirect to HDHomeRun IP but keeps the node.js server IP
options = {method: request.method, hostname: HDHomeRunIP, /* port: HDHomeRunPort, */path: request.url};
doProxy(request,response,http,options);
}
}
});
req.on('error', function(e) {
console.log('problem with request: ' + e.message);
response.writeHeader(500);
response.end();
});
// finish the client request, rest of processing done in the async callbacks
req.end();
}
});
// turn on no delay for tcp
server.on('connection', function (socket) {
socket.setNoDelay(true);
});
server.listen(7000);
stdio: ['pipe','pipe','ignore']
Actually the code was written by someone else; I have just modified it. What do 'pipe', 'pipe' and 'ignore' mean here?
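For reference, the stdio array passed to child_process.spawn configures the child process's standard streams positionally: index 0 is stdin, index 1 is stdout, index 2 is stderr. A minimal sketch of the effect (the ffmpeg -version invocation is just a placeholder command):
var child_process = require('child_process');
// 'pipe' at index 0 and 1 creates command.stdin and command.stdout streams
// the parent can write to and read from; 'ignore' at index 2 discards the
// child's stderr, so ffmpeg's console logging is silently dropped.
var command = child_process.spawn('ffmpeg', ['-version'],
    { stdio: ['pipe', 'pipe', 'ignore'] });
command.stdout.pipe(process.stdout); // readable because index 1 is 'pipe'
// command.stderr is null here, because index 2 is 'ignore'.
-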
Best approach to real time http streaming to HTML5 video client
12 October 2016, by deandob
I'm really stuck trying to understand the best way to stream real-time output of ffmpeg to an HTML5 client using node.js, as there are a number of variables at play and I don't have a lot of experience in this space, having spent many hours trying different combinations.
My use case is:
1) The IP video camera's RTSP H.264 stream is picked up by FFMPEG and remuxed into an MP4 container using the following FFMPEG settings in node, output to STDOUT. This is only run on the initial client connection, so that partial content requests don't try to spawn FFMPEG again.
liveFFMPEG = child_process.spawn("ffmpeg", [
"-i", "rtsp://admin:12345@192.168.1.234:554" , "-vcodec", "copy", "-f",
"mp4", "-reset_timestamps", "1", "-movflags", "frag_keyframe+empty_moov",
"-" // output to stdout
], {detached: false});
2) I use the node http server to capture the STDOUT and stream that back to the client upon a client request. When the client first connects I spawn the above FFMPEG command line then pipe the STDOUT stream to the HTTP response.
liveFFMPEG.stdout.pipe(resp);
I have also used the stream data event to write the FFMPEG data to the HTTP response, but it makes no difference:
liveFFMPEG.stdout.on("data", function(data) {
resp.write(data);
});
I use the following HTTP header (which is also used and working when streaming pre-recorded files):
var total = 999999999 // fake a large file
var partialstart = 0
var partialend = total - 1
if (range !== undefined) {
var parts = range.replace(/bytes=/, "").split("-");
var partialstart = parts[0];
var partialend = parts[1];
}
var start = parseInt(partialstart, 10);
var end = partialend ? parseInt(partialend, 10) : total; // fake a large file if no range request
var chunksize = (end-start)+1;
resp.writeHead(206, {
'Transfer-Encoding': 'chunked'
, 'Content-Type': 'video/mp4'
, 'Content-Length': chunksize // large size to fake a file
, 'Accept-Ranges': 'bytes ' + start + "-" + end + "/" + total
});
3) The client has to use HTML5 video tags.
I have no problem streaming (using fs.createReadStream with HTTP 206 partial content) a video file previously recorded with the above FFMPEG command line (saved to a file instead of STDOUT) to the HTML5 client, so I know the FFMPEG stream is correct, and I can even see the live video correctly in VLC when connecting to the node HTTP server.
However trying to stream live from FFMPEG via node HTTP seems to be a lot harder as the client will display one frame then stop. I suspect the problem is that I am not setting up the HTTP connection to be compatible with the HTML5 video client. I have tried a variety of things like using HTTP 206 (partial content) and 200 responses, putting the data into a buffer then streaming with no luck, so I need to go back to first principles to ensure I’m setting this up the right way.
Here is my understanding of how this should work, please correct me if I’m wrong:
1) FFMPEG should be set up to fragment the output and use an empty moov (FFMPEG frag_keyframe and empty_moov mov flags). This means the client does not rely on the moov atom, which is typically at the end of the file and isn't relevant when streaming (there is no end of file); it also means no seeking is possible, which is fine for my use case.
2) Even though I use MP4 fragments and empty MOOV, I still have to use HTTP partial content, as the HTML5 player will wait until the entire stream is downloaded before playing, which with a live stream never ends so is unworkable.
3) I don’t understand why piping the STDOUT stream to the HTTP response doesn’t work when streaming live yet if I save to a file I can stream this file easily to HTML5 clients using similar code. Maybe it’s a timing issue as it takes a second for the FFMPEG spawn to start, connect to the IP camera and send chunks to node, and the node data events are irregular as well. However the bytestream should be exactly the same as saving to a file, and HTTP should be able to cater for delays.
4) When checking the network log from the HTTP client when streaming a MP4 file created by FFMPEG from the camera, I see there are 3 client requests: A general GET request for the video, which the HTTP server returns about 40Kb, then a partial content request with a byte range for the last 10K of the file, then a final request for the bits in the middle not loaded. Maybe the HTML5 client once it receives the first response is asking for the last part of the file to load the MP4 MOOV atom? If this is the case it won’t work for streaming as there is no MOOV file and no end of the file.
5) When checking the network log when trying to stream live, I get an aborted initial request with only about 200 bytes received, then a re-request again aborted with 200 bytes and a third request which is only 2K long. I don’t understand why the HTML5 client would abort the request as the bytestream is exactly the same as I can successfully use when streaming from a recorded file. It also seems node isn’t sending the rest of the FFMPEG stream to the client, yet I can see the FFMPEG data in the .on event routine so it is getting to the FFMPEG node HTTP server.
6) Although I think piping the STDOUT stream to the HTTP response buffer should work, do I have to build an intermediate buffer and stream that will allow the HTTP partial content client requests to properly work like it does when it (successfully) reads a file? I think this is the main reason for my problems however I’m not exactly sure in Node how to best set that up. And I don’t know how to handle a client request for the data at the end of the file as there is no end of file.
7) Am I on the wrong track trying to handle 206 partial content requests, and should this work with normal 200 HTTP responses? HTTP 200 responses work fine for VLC, so I suspect the HTML5 video client will only work with partial content requests? (See the sketch below.)
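One commonly suggested direction, sketched below on the assumption that the HTML5 client will accept fragmented MP4 over a plain 200 chunked response; the camera URL, port and /live path are placeholders, and this is an illustration rather than a verified solution:
var http = require('http');
var child_process = require('child_process');
http.createServer(function (request, response) {
    if (request.url !== '/live') { response.writeHead(404); response.end(); return; }
    // Plain 200 response with no Content-Length and no range handling;
    // node then falls back to chunked transfer encoding automatically.
    response.writeHead(200, { 'Content-Type': 'video/mp4' });
    // Fragmented MP4 (frag_keyframe+empty_moov) so playback can start
    // without a trailing moov atom. The RTSP URL is a placeholder.
    var ffmpeg = child_process.spawn('ffmpeg', [
        '-i', 'rtsp://admin:12345@192.168.1.234:554',
        '-vcodec', 'copy',
        '-f', 'mp4',
        '-movflags', 'frag_keyframe+empty_moov',
        '-'
    ]);
    ffmpeg.stdout.pipe(response);
    // Stop transcoding when the browser drops the connection.
    response.on('close', function () { ffmpeg.kill('SIGKILL'); });
}).listen(8000);
If the video element still stalls with this, piping an intermediate stream.PassThrough between ffmpeg.stdout and the response (the intermediate buffer from point 6) is worth trying, but it does not change the byte stream itself.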
As I'm still learning this stuff it's difficult to work through the various layers of this problem (FFMPEG, node, streaming, HTTP, HTML5 video), so any pointers will be greatly appreciated. I have spent hours researching on this site and the net, and I have not come across anyone who has been able to do real-time streaming in node, but I can't be the first, and I think this should be able to work (somehow!).