Newest 'ffmpeg' Questions - Stack Overflow
-
How does the FFmpegFrameRecorder record portrait video?
10 November 2013, by user1329261
I have used the javacv FFmpegFrameRecorder to record a video on Android, and I found that this recorder only records landscape video. Can anyone tell me how to record video in portrait orientation? Thank you very much!
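javacv's FFmpegFrameRecorder is a thin wrapper over ffmpeg's C muxing API, and at that level a portrait MP4 is usually produced by tagging the video stream with rotation metadata rather than re-encoding rotated frames. A minimal sketch of that underlying mechanism, assuming you can reach the AVStream created for the video (the function name here is illustrative):

#include <libavformat/avformat.h>

/* Mark the stream as rotated 90 degrees so players display it in
 * portrait; the encoded frames themselves stay landscape. */
static void tag_portrait(AVStream *video_stream)
{
    av_dict_set(&video_stream->metadata, "rotate", "90", 0);
}

If your javacv version does not expose the stream metadata, the fallback is to rotate each frame (for example with OpenCV) before passing it to recorder.record(), at the cost of extra CPU work per frame.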
-
How to give a file path to FFmpeg on Android
10 November 2013, by ssrp
I am developing a movie player using the FFmpeg libraries. So far I have built FFmpeg for Android and can call a native function through JNI. I want to hand a file to FFmpeg's open_file function; specifically, I need to know how to pass it a file that is stored on the device's external storage. I am using an HTC One X for debugging the app, and I already know how to get the file's absolute path in Java. Please help.
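On the Java side this usually comes down to resolving the absolute path under Environment.getExternalStorageDirectory() and passing it through JNI as a String; the native side then converts the jstring to a C string before handing it to ffmpeg. A minimal sketch of the native half, assuming a 2013-era ffmpeg build (the class and method names are hypothetical):

#include <jni.h>
#include <libavformat/avformat.h>

JNIEXPORT jint JNICALL
Java_com_example_player_NativePlayer_openFile(JNIEnv *env, jobject thiz, jstring jpath)
{
    const char *path = (*env)->GetStringUTFChars(env, jpath, NULL);
    AVFormatContext *fmt_ctx = NULL;

    av_register_all();  /* required before other ffmpeg calls in this era */
    int ret = avformat_open_input(&fmt_ctx, path, NULL, NULL);
    (*env)->ReleaseStringUTFChars(env, jpath, path);

    if (ret < 0)
        return ret;  /* ffmpeg error code: path unreadable or not a media file */

    avformat_find_stream_info(fmt_ctx, NULL);
    /* ... hand fmt_ctx over to the decoding code ... */
    return 0;
}

Also remember that reading external storage requires the READ_EXTERNAL_STORAGE permission in the manifest on newer Android versions.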
-
iOS: ffmpeg stop rtsp stream
9 November 2013, by user2964075
I am developing an iOS application that streams RTSP feeds. My application can only stream one feed at a time. My question is: how can I stop/close the current stream so that I can load a new RTSP feed and start streaming it? Thanks a lot.
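With ffmpeg's C API (which is what most iOS streaming wrappers sit on), stopping a stream means breaking out of the av_read_frame() loop and then closing the format context; an interrupt callback is also worth installing so a blocked network read can be cancelled. A minimal sketch of that pattern, assuming a standard read loop (function names are illustrative):

#include <libavformat/avformat.h>

static volatile int stop_requested = 0;   /* set from the UI thread to stop */

/* ffmpeg polls this inside blocking network reads, so a stalled RTSP
 * connection can still be torn down promptly. */
static int interrupt_cb(void *opaque)
{
    return stop_requested;
}

AVFormatContext *open_stream(const char *url)
{
    AVFormatContext *fmt_ctx = avformat_alloc_context();
    fmt_ctx->interrupt_callback.callback = interrupt_cb;
    if (avformat_open_input(&fmt_ctx, url, NULL, NULL) < 0)
        return NULL;                      /* ffmpeg freed the context on failure */
    return fmt_ctx;
}

void read_loop(AVFormatContext *fmt_ctx)
{
    AVPacket pkt;
    while (!stop_requested && av_read_frame(fmt_ctx, &pkt) >= 0) {
        /* ... decode and render the packet ... */
        av_free_packet(&pkt);             /* av_packet_unref() in later ffmpeg */
    }
    avformat_close_input(&fmt_ctx);       /* closes the stream, frees the context */
}

Once avformat_close_input() has run, a fresh avformat_open_input() call can start the next feed.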
-
FFMPEG: On Upload to YouTube the Video Starts at 0:06 (Shorter Than Original)
9 November 2013, by mashup
I uploaded an MP4 video to YouTube which had been created by FFMPEG. Unfortunately, all such videos come out shorter than the original. I read all sorts of things about the moov atom and about YouTube being unreliable, but I did not find the actual cause. MP4 is supposed to be the preferred file format: https://support.google.com/youtube/answer/1722171?hl=en
THE FILE SIZE: less than 1 MB! Original length: 24 seconds; once uploaded: 18 seconds.
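One thing worth ruling out is moov atom placement, a frequently cited suspect for upload truncation: if the index sits at the end of the MP4, some processing pipelines read an incomplete duration. Remuxing without re-encoding moves it to the front, along the lines of (file names illustrative)
ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4
If the re-uploaded file still loses the six seconds, the moov atom was not the cause and the encoding settings are the next suspect.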
-
I have an audio data stream from ffmpeg, how can I play it in a browser?
9 November 2013, by Conor Patrick
I've been able to successfully stream live audio from my mic to my node server. I would now like to stream that audio to all connected clients. I have been trying to do it with WebSockets.
I'm streaming the audio with this command:
ffmpeg -f alsa -i hw:0 -acodec mp2 -f mp3 -r 30 http://localhost:8086
Node receives the buffered data, and I broadcast it to all connected clients with the 'ws' package like so:
// HTTP server to accept the incoming MP3 stream (audio) from ffmpeg
var audioServer = require('http').createServer(function(request, response) {
    request.on('data', function(data) {              // each MP3 chunk arrives on the request
        audioSocket.broadcast(data, {binary: true});
    });
}).listen(8086);

// WebSocket server that fans the stream out to browsers
var audioSocket = new (require('ws').Server)({port: 8088});
audioSocket.broadcast = function(data, opts) {
    for (var i in this.clients) {
        this.clients[i].send(data);
    }
};
Any idea how I can play this data in a browser? I tried following this topic, but the decodeAudioData() method fails.
My client-side code:
var node = {};
var audio = new WebSocket('ws://localhost:8088/');
audio.binaryType = "arraybuffer";
var context = new webkitAudioContext();

audio.onmessage = function(msg) {
    node.buf = msg.data;   // ArrayBuffer chunk from the socket
    node.sync = 0;
    node.retry = 0;
    decode(node);
};

// Hunt for an MP3 frame sync word (0xFF followed by a byte whose top
// three bits are set) so decoding can restart on a frame boundary.
// Should be done by the API itself, and hopefully will be.
function syncStream(node) {
    var buf8 = new Uint8Array(node.buf);
    buf8.indexOf = Array.prototype.indexOf;
    var i = node.sync, b = buf8;
    while (1) {
        node.retry++;
        i = b.indexOf(0xFF, i);
        if (i == -1 || (b[i + 1] & 0xE0) == 0xE0) break;  // parentheses matter: & binds looser than ==
        i++;
    }
    if (i != -1) {
        node.buf = node.buf.slice(i);  // careful: slice returns a copy
        node.sync = i;
        return true;
    }
    return false;
}

function decode(node) {
    context.decodeAudioData(node.buf, function(decoded) {
        node.source = context.createBufferSource();
        node.source.connect(context.destination);
        node.source.buffer = decoded;
        node.source.noteOn(context.currentTime);
        console.log('IT WORKED! DECODED', decoded);
    }, function() {
        // only on error: attempt to resync on a frame boundary and retry
        if (syncStream(node)) decode(node);
    });
}