Newest 'ffmpeg' Questions - Stack Overflow
-
iPad streaming video to an ffmpeg server from the front-facing camera
6 December 2011, by IrishGringo
This is a video-chat-type program. My project is to write a native ObjC app that will stream video from the front-facing camera to a server. This server will format and relay it to another location. In a related question, I want to display video streamed from the server. The video server will probably be running ffmpeg for formatting. But this question is just asking for advice on the iPad project. I would like comments on the issues I need to be thinking about.
This is my strategy: I was thinking of using the AVFoundation framework to stream from the camera to a server URL. I don't know yet whether I will be formatting on the client or not, so some comment there would be interesting. http://developer.apple.com/library/mac/#documentation/AVFoundation/Reference/AVFoundationFramework/_index.html
For capturing the video, I was going to use: http://developer.apple.com/library/IOs/#documentation/AVFoundation/Reference/AVCaptureSession_Class/Reference/Reference.html#//apple_ref/occ/cl/AVCaptureSession
So if someone has ideas, suggestions, or extra code I can look at, I would appreciate it.
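To make the strategy concrete, here is a minimal sketch of the capture side I have in mind, assuming AVCaptureSession from AVFoundation (the delegate wiring and the actual network send are placeholders, not working upload code):

    #import <AVFoundation/AVFoundation.h>

    // Sketch: capture frames from the front-facing camera and hand them to a
    // sample-buffer delegate, which would encode/send them to the server.
    - (void)startFrontCameraCapture {
        AVCaptureSession *session = [[AVCaptureSession alloc] init];
        session.sessionPreset = AVCaptureSessionPresetMedium;

        // Pick the front-facing camera from the available video devices.
        AVCaptureDevice *frontCamera = nil;
        for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
            if (device.position == AVCaptureDevicePositionFront) {
                frontCamera = device;
                break;
            }
        }

        NSError *error = nil;
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];
        if (input) [session addInput:input];

        // Frames arrive in captureOutput:didOutputSampleBuffer:fromConnection:,
        // where they could be encoded and pushed to the server.
        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        [output setSampleBufferDelegate:self queue:dispatch_queue_create("com.example.frames", NULL)];
        [session addOutput:output];

        [session startRunning];
    }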
-
Webcam stream with FFMpeg on iPhone
6 December 2011, by Saphrosit
I'm trying to send and show a webcam stream from a Linux server in an iPhone app. I don't know if it's the best solution, but I downloaded and installed FFmpeg on the Linux server (following, for those who want to know, this tutorial). FFmpeg is working fine. After a lot of wandering, I managed to send a stream to the client by launching
ffmpeg -s 320x240 -f video4linux2 -i /dev/video0 -f mpegts -vcodec libx264 udp://192.168.1.34:1234
where 192.168.1.34 is the address of the client. Actually the client is a Mac for now, but it is eventually supposed to be an iPhone. I know the stream is sent and received correctly (tested in different ways).
However, I haven't managed to watch the stream directly on the iPhone.
I thought of several possible solutions:

First solution: store the incoming data in an NSMutableData object. Then, when the stream ends, save it to disk and play it with an MPMoviePlayerController. Here's the code:

    [video writeToFile:@"videoStream.m4v" atomically:YES];
    NSURL *url = [NSURL fileURLWithPath:@"videoStream.m4v"];
    MPMoviePlayerController *videoController = [[MPMoviePlayerController alloc] initWithContentURL:url];
    [videoController.view setFrame:CGRectMake(100, 100, 150, 150)];
    [self.view addSubview:videoController.view];
    [videoController play];
The problem with this solution is that nothing is played (I only see a black square), even though the video is saved correctly (I can play it directly from my disk using VLC). Besides, it's not such a great idea; it's just to make things work.
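Since the client will eventually be an iPhone rather than a Mac, one assumption worth flagging about the snippet above: a bare relative filename like @"videoStream.m4v" won't resolve inside the iOS app sandbox, so both the write and the playback URL would need an explicit path (a sketch, path names mine):

    // Sketch: build an absolute path in the app's Documents directory
    // instead of relying on a bare relative filename.
    NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *path = [docs stringByAppendingPathComponent:@"videoStream.m4v"];
    [video writeToFile:path atomically:YES];
    NSURL *url = [NSURL fileURLWithPath:path];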
Second solution: use CMSampleBufferRef to store the incoming video. Several more problems come with this solution: first of all, there's no CoreMedia.framework in my system. Besides, I don't fully understand what this class represents or what I should do to make it work: I mean, if I start (somehow) filling this "SampleBuffer" with the bytes I receive from the UDP connection, will it automatically call the CMSampleBufferMakeDataReadyCallback function I set during creation? If yes, when? When a single frame is complete, or when the whole stream has been received?

Third solution: use the AVFoundation framework (this isn't actually available on my Mac either). I did not understand whether it's actually possible to start recording from a remote source, or even from an NSMutableData, a char*, or something like that. In the AVFoundation Programming Guide I didn't find any reference saying whether it's possible or not.
I don't know which of these solutions is best for my purpose. ANY suggestion would be appreciated. Besides, there's also another problem: I didn't use any segmenter program to send the video. Now, if I'm not mistaken, a segmenter splits the source video into smaller/shorter chunks that are easier to send. If that's right, then maybe it's not strictly necessary to make things work (it could be added later). However, since the server is running Linux, I cannot use Apple's mediastreamsegmenter. Can someone suggest an open-source segmenter to use together with FFmpeg?
UPDATE: I edited my question, adding more information on what I've done so far and what my doubts are.
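On the segmenter question, one hedged option: newer FFmpeg builds ship a segment muxer that splits the output into chunks itself, which might remove the need for a separate tool. A sketch, assuming a build that has -f segment (segment length and filenames are placeholders):

    ffmpeg -s 320x240 -f video4linux2 -i /dev/video0 -vcodec libx264 -f segment -segment_time 10 -segment_format mpegts stream%03d.ts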
-
OpenGL ES glReadPixels EXC_BAD_ACCESS
6 December 2011, by Yanny
I'm trying to create video from images using OpenGL ES and ffmpeg, but on the iPad (4.3) I get a crash in glReadPixels:

    -(NSData *) glToUIImage {
        int numberOfComponents = NUMBER_OF_COMPONENTS; // 4
        int width = PICTURE_WIDTH;
        int height = PICTURE_HEIGHT;
        NSInteger myDataLength = width * height * numberOfComponents;
        NSMutableData *buffer = [NSMutableData dataWithLength:myDataLength];
        [self checkForGLError];
        GLenum type = NUMBER_OF_COMPONENTS == 3 ? GL_RGB : GL_RGBA; // RGBA
        glReadPixels(0, 0, width, height, type, GL_UNSIGNED_BYTE, [buffer mutableBytes]); // EXC_BAD_ACCESS here
        return buffer;
    }
It works on the iPhone 4 (4.3) and iPod Touch, but there are problems on the iPhone 3G (3.0) and iPad (4.3). Can you help me with this issue?
Also, on the iPhone 3G (3.0) and iPad (4.3) I have problems with the video: the first 5-20 frames contain garbage. Maybe an optimization issue? Or an architecture issue?
EDITED Stack trace:

    #0 0x33be3964 in void BlockNxN<64ul, 16ul, 1, BLOCK_CONVERTER_NULL_32>(unsigned long, int, int, unsigned long, int, int, unsigned int, unsigned int, unsigned int, unsigned int) ()
    #1 0x33be1c76 in glrBIFDetile ()
    #2 0x33b586b2 in sgxGetImage(SGXImageReadParams const*) ()
    #3 0x33b50d38 in gldReadPixels ()
    #4 0x31813e16 in glReadPixels_Exec ()
    #5 0x31e3c518 in glReadPixels ()
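For what it's worth (an assumption on my side, not confirmed): OpenGL ES implementations are only required to support GL_RGBA with GL_UNSIGNED_BYTE for glReadPixels, plus one implementation-specific format pair, so reading GL_RGB into a 3-byte-per-pixel buffer can fault on some GPUs. A defensive variant of the readback (a sketch, assuming the framebuffer is bound and sized to width x height):

    // Sketch: force a 4-byte-per-pixel GL_RGBA readback, which every
    // OpenGL ES implementation is required to support.
    - (NSData *)readPixelsWithWidth:(int)width height:(int)height {
        NSMutableData *buffer = [NSMutableData dataWithLength:(NSUInteger)width * height * 4];
        glPixelStorei(GL_PACK_ALIGNMENT, 1); // no row padding on pack
        glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, [buffer mutableBytes]);
        return buffer;
    }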
-
ffmpeg2theora oggfwd not working with icecast2
6 December 2011, by achilles
I have a camera streaming MJPEG at http://192.168.x.x/image (where the x's are the rest of the IP). I start my Icecast2 server (Ubuntu 10.10) and then I stream using:
ffmpeg2theora -f mjpeg http://192.168.x.x/image -o /dev/stdout - | oggfwd localhost 8000 password /test
The mountpoint is created but the video does not show in Firefox. I do see the video box, but it just shows the spinning "thinking" icon indefinitely and the video never appears.
If I download a proper ogg file and do
cat proper_ogg_file.ogg | oggfwd localhost 8000 password /test
I see the video on the icecast server's website.
In addition I did:
ffmpeg2theora -f mjpeg http://192.168.x.x/image -o test_video.ogg
Once I stop the process (CTRL+C), go to my Desktop where the video is saved, and open it with VLC or any other media player, it plays the portion of the stream that was recorded up to the moment I pressed CTRL+C.
If I take that file and use the previous method:
cat test_video.ogg | oggfwd localhost 8000 password /test
I get the same issue as when I was piping the camera directly to stdout and then to oggfwd. So I assume this is a "conversion to Ogg" issue? Can anybody help? Any idea why I can't do that?
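One hedged guess at what differs between the two cases (speculation, not confirmed): a finished .ogg file has complete headers written up front, while a live ffmpeg2theora pipe does not, and ffmpeg2theora adds an Ogg Skeleton track by default, which some players handle poorly in live Icecast streams. If your ffmpeg2theora build supports it, disabling the skeleton track is worth trying:

    ffmpeg2theora -f mjpeg http://192.168.x.x/image --no-skeleton -o /dev/stdout - | oggfwd localhost 8000 password /test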
-
Why is the converted video's file size greater than the original file size?
6 December 2011, by svk
I am using ffmpeg to convert videos to MP4. It's working fine and the result plays at high quality; no problem there. But the worst case is this: I uploaded a 14 MB file, and after converting it grows to a 30 MB file. I am using the following script to convert:
exec("ffmpeg -i videowithaudio.flv -vcodec libx264 -vpre hq -vpre ipod640 -b 250k -bt 50k -acodec libfaac -ab 56k -ac 2 -s 480x320 video_out_file.mp4 > output1.txt 2> apperror1.txt"); //webkit compatible
I am using PHP to execute this command. Could you please help me reduce the file size from 30 MB (something near the uploaded file size is OK) while keeping the same quality?
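A rough sanity check on where the size comes from (my arithmetic, not from the question): output size is approximately (video bitrate + audio bitrate) x duration. With -b 250k and -ab 56k, that is about 306 kbps, so a 30 MB output implies roughly 30 x 8192 / 306, or about 800 seconds of video; if the original 14 MB FLV has the same duration, it was encoded at only about 140 kbps, so the command is simply targeting a higher bitrate than the source. A hedged variant that lowers the target to match (bitrate values are examples, everything else unchanged):

    exec("ffmpeg -i videowithaudio.flv -vcodec libx264 -vpre hq -vpre ipod640 -b 120k -bt 30k -acodec libfaac -ab 48k -ac 2 -s 480x320 video_out_file.mp4 > output1.txt 2> apperror1.txt");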