Newest 'ffmpeg' Questions - Stack Overflow
Articles published on the site
-
Simulate concatenated file using hard link?
16 December 2011, by Sugrue
I have multiple parts of a single file which I want a third-party C++/C# plugin to read as a single file. Basically, when the plugin's file reader reaches the end of one file part, I want it to continue into the next one.
(For anyone interested, the plugin is AForge.NET FFmpeg, and I am trying to import VOB files)
It looks like quite a task to reprogram the plugin. An alternative solution is to copy the file parts to a concatenated file, but this is slow because I am dealing with many GBs of data.
Is it possible to use a file system hard link to point to multiple files? Or is there some other way to 'fake' a concatenated file? Using command-line FFmpeg I can use 'type' to stream a concatenated file in live, but I can't figure out how to achieve this in C# with this plugin.
I am on Windows 7.
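An NTFS hard link can only point at a single file, so the usual workaround is to fake the concatenation in the reading code itself. A minimal C++ sketch of the idea (the class name and interface are illustrative, not part of AForge.NET or FFmpeg): when one part is exhausted, reading continues transparently in the next part. The same pattern works with file handles instead of in-memory parts.

```cpp
#include <algorithm>
#include <cstddef>
#include <string>
#include <utility>
#include <vector>

// Presents several "parts" as one continuous byte stream.
class ConcatReader {
public:
    explicit ConcatReader(std::vector<std::string> parts)
        : parts_(std::move(parts)) {}

    // Reads up to n bytes across part boundaries; returns bytes read.
    std::size_t read(char* dst, std::size_t n) {
        std::size_t total = 0;
        while (total < n && part_ < parts_.size()) {
            const std::string& p = parts_[part_];
            std::size_t avail = p.size() - offset_;
            if (avail == 0) { ++part_; offset_ = 0; continue; }
            std::size_t take = std::min(n - total, avail);
            p.copy(dst + total, take, offset_);
            offset_ += take;
            total += take;
        }
        return total;
    }

private:
    std::vector<std::string> parts_;
    std::size_t part_ = 0;    // index of the current part
    std::size_t offset_ = 0;  // read position within that part
};
```

Without access to the plugin's reader, the other zero-code option on Windows is piping `copy /b part1+part2` output to the process, which is what the 'type' trick does on the command line.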
-
Need to set up FFmpeg in Eclipse for Windows
16 December 2011, by user1101687
I am getting this error:
Building target: cyg.exe
Invoking: Cygwin C Linker
gcc -L"C:\cygwin\usr\libjjmpeg.so" -o "cyg.exe" ./tut.o -lavcodec
/usr/lib/gcc/i686-pc-cygwin/4.5.3/../../../../i686-pc-cygwin/bin/ld: cannot find -lavcodec
collect2: ld returned 1 exit status
makefile:29: recipe for target `cyg.exe' failed
make: *** [cyg.exe] Error 1
-
FFmpeg returns black and white image
16 December 2011, by njai
The video capture device is a KWorld DVD Maker 2 (em2861). It returns the image in black and white, with a green line at the bottom of the screen; but when I tested with a webcam, it worked properly.
Here is a 30s video: http://www.youtube.com/watch?v=7q2wFGVwGGI
How can I convert it to color?
[root@localhost ~]# ffmpeg -f video4linux2 -i /dev/video0 -vcodec mpeg4 -y output.mp4
ffmpeg version git-2011-11-05-5fd1a69, Copyright (c) 2000-2011 the FFmpeg developers
  built on Nov 5 2011 21:10:52 with gcc 4.4.4 20100726 (Red Hat 4.4.4-13)
  configuration: --enable-libx264 --enable-gpl --enable-libvpx
  libavutil    51. 23. 0 / 51. 23. 0
  libavcodec   53. 27. 0 / 53. 27. 0
  libavformat  53. 18. 0 / 53. 18. 0
  libavdevice  53.  4. 0 / 53.  4. 0
  libavfilter   2. 47. 0 /  2. 47. 0
  libswscale    2.  1. 0 /  2.  1. 0
  libpostproc  51.  2. 0 / 51.  2. 0
[video4linux2,v4l2 @ 0x2a64780] Estimating duration from bitrate, this may be inaccurate
Input #0, video4linux2,v4l2, from '/dev/video0':
  Duration: N/A, start: 1320637788.113946, bitrate: 165888 kb/s
    Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 720x576, 165888 kb/s, 25 tbr, 1000k tbn, 25 tbc
Incompatible pixel format 'yuyv422' for codec 'mpeg4', auto-selecting format 'yuv420p'
[buffer @ 0x2a63920] w:720 h:576 pixfmt:yuyv422 tb:1/1000000 sar:0/1 sws_param:
[buffersink @ 0x2a63cc0] auto-inserting filter 'auto-inserted scale 0' between the filter 'src' and the filter 'out'
[scale @ 0x2a64560] w:720 h:576 fmt:yuyv422 -> w:720 h:576 fmt:yuv420p flags:0x4
Output #0, mp4, to 'output.mp4':
  Metadata:
    encoder         : Lavf53.18.0
    Stream #0:0: Video: mpeg4 ( [0][0][0] / 0x0020), yuv420p, 720x576, q=2-31, 200 kb/s, 25 tbn, 25 tbc
Stream mapping:
  Stream #0.0 -> #0.0 (rawvideo -> mpeg4)
Press [q] to stop, [?] for help
frame=  969 fps= 25 q=31.0 Lsize=    1547kB time=00:00:38.76 bitrate= 327.0kbits/s
video:1538kB audio:0kB global headers:0kB muxing overhead 0.561411%
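Note that the `yuyv422 -> yuv420p` conversion visible in that log is not what drops the colour; both formats carry chroma. A self-contained model of the repack (not FFmpeg's actual swscale code) shows that U/V samples survive the conversion, so a grey picture usually points at the capture side, e.g. a wrong TV norm (PAL vs NTSC) selected on the em2861:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// YUYV422 packed: Y0 U0 Y1 V0 | Y2 U1 Y3 V1 ...  (2 pixels per 4 bytes)
// YUV420P planar: full-resolution Y plane, then U and V planes at half
// resolution horizontally and vertically (chroma of odd rows dropped).
struct Yuv420p {
    std::vector<uint8_t> y, u, v;
};

// w and h are assumed even, as with the 720x576 capture above.
Yuv420p yuyv422_to_yuv420p(const std::vector<uint8_t>& src, int w, int h) {
    Yuv420p out;
    out.y.resize(w * h);
    out.u.resize((w / 2) * (h / 2));
    out.v.resize((w / 2) * (h / 2));
    for (int row = 0; row < h; ++row) {
        for (int col = 0; col < w; col += 2) {
            std::size_t i = (std::size_t)(row * w + col) * 2; // 2 bytes/pixel
            out.y[row * w + col]     = src[i];     // Y0
            out.y[row * w + col + 1] = src[i + 2]; // Y1
            if (row % 2 == 0) { // keep chroma from even rows only
                std::size_t c = (std::size_t)(row / 2) * (w / 2) + col / 2;
                out.u[c] = src[i + 1]; // U survives the conversion
                out.v[c] = src[i + 3]; // V survives the conversion
            }
        }
    }
    return out;
}
```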
-
FFmpeg for iPhone recorded video encoding
16 December 2011, by Sarah
Hi, I found lots and lots of links for video encoding through FFmpeg, but they all start with terminal commands. When I try to run any in the terminal, like:
git clone git://github.com/lajos/iFrameExtractor.git
it says that
-bash: git: command not found
Also, as far as I know, it is not possible to use terminal commands on the iPhone. Can anybody point out how to encode a video recorded through FFmpeg in mp4 format, and also how to reduce the size of the video? Thanks in advance.
EDIT: I am already using this method to resize my video; it runs successfully and I am able to send the video to the server, but on the server side there is a problem retrieving the data and using it.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    [self convertVideoToLowQuailtyWithInputURL:videoURL1
                                     outputURL:[NSURL fileURLWithPath:videoStoragePath]
                                       handler:^(AVAssetExportSession *exportSession)
    {
        if (exportSession.status == AVAssetExportSessionStatusCompleted)
        {
            NSLog(@"%@", exportSession.error);
            printf("completed\n");
        }
        else
        {
            NSLog(@"%@", exportSession.error);
            printf("error\n");
        }
    }];
}

- (void)convertVideoToLowQuailtyWithInputURL:(NSURL *)inputURL
                                   outputURL:(NSURL *)outputURL
                                     handler:(void (^)(AVAssetExportSession *))handler
{
    [[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset
                                                                           presetName:AVAssetExportPresetLowQuality];
    exportSession.outputURL = outputURL;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    [exportSession exportAsynchronouslyWithCompletionHandler:^(void)
    {
        handler(exportSession);
        [exportSession release];
    }];
}
-
How do I dump the buffer when encoding H264 with FFmpeg?
16 décembre 2011, par SeanI'm using a c++ library to write images captured from a webcam to an libx264 encoded mp4 file. The encoding is working properly but when it starts it writes 40 frames to the buffer. When I close the file these frames aren't flushed so about 6 seconds of video are left unwritten (cam is about 6fps).
So I'm calling:
out_size = libffmpeg::avcodec_encode_video( codecContext,
                                            data->VideoOutputBuffer,
                                            data->VideoOutputBufferSize,
                                            data->VideoFrame );
// if zero size, it means the image was buffered
if ( out_size > 0 )
{
    // ... write to file
}
I can't see a way of accessing the images that are left in the buffer. Any ideas?
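With the old avcodec_encode_video API, delayed frames are retrieved by calling the encoder again after the last input frame, passing NULL as the frame pointer, until it returns 0. A toy model of that drain pattern (not FFmpeg code; it only imitates an encoder with a fixed internal delay, like libx264 with lookahead) shows the calling sequence:

```cpp
#include <cstddef>
#include <deque>
#include <vector>

// Toy stand-in for a delayed encoder: it buffers the first `delay`
// frames and returns -1 ("buffered") before producing any output.
class DelayedEncoder {
public:
    explicit DelayedEncoder(std::size_t delay) : delay_(delay) {}

    // Pass a frame id, or -1 (like a NULL frame) to flush.
    // Returns an encoded frame id, or -1 if nothing was produced.
    int encode(int frame) {
        if (frame >= 0) queue_.push_back(frame);
        if (frame >= 0 && queue_.size() <= delay_) return -1; // buffered
        if (queue_.empty()) return -1;                        // fully drained
        int out = queue_.front();
        queue_.pop_front();
        return out;
    }

private:
    std::size_t delay_;
    std::deque<int> queue_;
};

// Drain pattern: after the last real frame, keep calling encode()
// with no input until it stops producing output.
std::vector<int> encode_all(DelayedEncoder& enc, int nframes) {
    std::vector<int> written;
    for (int f = 0; f < nframes; ++f) {
        int out = enc.encode(f);
        if (out >= 0) written.push_back(out);
    }
    for (;;) { // the flush loop is the part the question is missing
        int out = enc.encode(-1);
        if (out < 0) break;
        written.push_back(out);
    }
    return written;
}
```

Translated back to the question's code, that means calling `avcodec_encode_video(codecContext, buffer, size, NULL)` in a loop before closing the file, writing each packet it returns, until `out_size` comes back as 0.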