Newest 'ffmpeg' Questions - Stack Overflow
Articles published on the site
-
Android FFMPEG java concat multiple files
17 July 2015, by Kim T
I'm using a fork of the cordova-plugin-video-editor library:
https://github.com/jbavari/cordova-plugin-video-editor/pull/13
Which uses the android-ffmpeg-java library here:
https://github.com/guardianproject/android-ffmpeg-java
The out-of-the-box Cordova plugin example works well for encoding a single video using the following code:
File tempFile = File.createTempFile("ffmpeg", null, appContext.getCacheDir());
FfmpegController ffmpegController = new FfmpegController(appContext, tempFile);
TranscodeCallback tcCallback = new TranscodeCallback();
Clip clipIn = new Clip(videoSrcPath);
Clip clipOut = new Clip(outputFilePath);
clipOut.videoCodec = "libx264";
clipOut.videoFps = "24"; // tailor this to your needs
clipOut.videoBitrate = 512; // 512 kbps - tailor this to your needs
clipOut.audioChannels = 1;
clipOut.width = outputWidth;
clipOut.height = outputHeight;
clipOut.duration = videoDuration;
ffmpegController.processVideo(clipIn, clipOut, true, tcCallback);
This calls the android-ffmpeg-java code here:
They have a multiple file concat test example here:
So I have modified the cordova plugin code to match the example:
ArrayList<Clip> listVideos = new ArrayList<Clip>();

Clip clip = new Clip();
clip.path = new File(videoSrcPath).getCanonicalPath();
ffmpegController.getInfo(clip);
clip.duration = 5;
listVideos.add(clip);

Clip clip2 = new Clip();
clip2.path = new File(videoSrcPath2).getCanonicalPath();
ffmpegController.getInfo(clip2);
clip2.duration = 5;
listVideos.add(clip2);

Clip clipOut = new Clip();
clipOut.path = new File(outputFilePath).getCanonicalPath();

ffmpegController.concatAndTrimFilesMP4Stream(listVideos, clipOut, false, false, new ShellUtils.ShellCallback() {
    @Override
    public void shellOut(String shellLine) {
        System.out.println("fc>" + shellLine);
    }

    @Override
    public void processComplete(int exitValue) {
        if (exitValue < 0)
            System.err.println("concat non-zero exit: " + exitValue);
    }
});

However, when run I get the error:
23:15:08.498 3218-3293/com.example.hello D/VideoEditor﹕ execute method starting
07-10 23:15:08.498 3218-3293/com.example.hello D/VideoEditor﹕ transcodeVideo firing
07-10 23:15:08.499 3218-3293/com.example.hello D/VideoEditor﹕ options: {"fileUri":"content:\/\/com.android.providers.media.documents\/document\/video%3A23389","fileUri2":"content:\/\/com.android.providers.media.documents\/document\/video%3A23390","outputFileName":"1436584506888","quality":2,"outputFileType":1,"optimizeForNetworkUse":1,"duration":2}
07-10 23:15:08.615 3218-3293/com.example.hello D/VideoEditor﹕ videoSrcPath: /storage/emulated/0/Movies/-a.mp4
07-10 23:15:08.615 3218-3293/com.example.hello D/VideoEditor﹕ videoSrcPath2: /storage/emulated/0/Movies/-b.mp4
07-10 23:15:08.618 3218-3293/com.example.hello V/VideoEditor﹕ outputFilePath: /storage/emulated/0/Movies/HelloWorld/VID_1436584506888.mp4
07-10 23:15:08.618 3218-3293/com.example.hello W/PluginManager﹕ THREAD WARNING: exec() call to VideoEditor.transcodeVideo blocked the main thread for 121ms. Plugin should use CordovaInterface.getThreadPool().
07-10 23:15:09.126 3742-3742/? W/linker﹕ /data/data/com.example.hello/app_bin/ffmpeg has text relocations. This is wasting memory and prevents security hardening. Please fix.
07-10 23:15:09.506 3750-3750/? W/linker﹕ /data/data/com.example.hello/app_bin/ffmpeg has text relocations. This is wasting memory and prevents security hardening. Please fix.
07-10 23:15:09.836 3218-3264/com.example.hello I/System.out﹕ fc>/data/data/com.example.hello/app_bin/ffmpeg -y -t 0 0 : 0 0 : 5.000000 -i /storage/emulated/0/Movies/-a.mp4 -f mpegts -c copy -an -bsf:v h264_mp4toannexb /storage/emulated/0/Android/data/com.example.hello/cache/ffmpeg-246029513.tmp/0.ts
07-10 23:15:09.864 3758-3758/? W/linker﹕ /data/data/com.example.hello/app_bin/ffmpeg has text relocations. This is wasting memory and prevents security hardening. Please fix.
07-10 23:15:09.865 3218-3759/com.example.hello I/System.out﹕ fc>WARNING: linker: /data/data/com.example.hello/app_bin/ffmpeg has text relocations. This is wasting memory and prevents security hardening. Please fix.
07-10 23:15:09.869 3218-3759/com.example.hello I/System.out﹕ fc>ffmpeg version 0.11.1 Copyright (c) 2000-2012 the FFmpeg developers
07-10 23:15:09.870 3218-3759/com.example.hello I/System.out﹕ fc> built on Dec 22 2014 12:52:34 with gcc 4.6 20120106 (prerelease)
07-10 23:15:09.870 3218-3759/com.example.hello I/System.out﹕ fc> configuration: --arch=arm --cpu=cortex-a8 --target-os=linux --enable-runtime-cpudetect --prefix=/data/data/info.guardianproject.ffmpeg/app_opt --enable-pic --disable-shared --enable-static --cross-prefix=/home/n8fr8/dev/android/ndk/toolchains/arm-linux-androideabi-4.6/prebuilt/linux-x86_64/bin/arm-linux-androideabi- --sysroot=/home/n8fr8/dev/android/ndk/platforms/android-16/arch-arm --extra-cflags='-I../x264 -mfloat-abi=softfp -mfpu=neon -fPIE -pie' --extra-ldflags='-L../x264 -fPIE -pie' --enable-version3 --enable-gpl --disable-doc --enable-yasm --enable-decoders --enable-encoders --enable-muxers --enable-demuxers --enable-parsers --enable-protocols --enable-filters --enable-avresample --enable-libfreetype --disable-indevs --enable-indev=lavfi --disable-outdevs --enable-hwaccels --enable-ffmpeg --disable-ffplay --disable-ffprobe --disable-ffserver --disable-network --enable-libx264 --enable-zlib
07-10 23:15:09.870 3218-3759/com.example.hello I/System.out﹕ fc> libavutil 51. 54.100 / 51. 54.100
07-10 23:15:09.870 3218-3759/com.example.hello I/System.out﹕ fc> libavcodec 54. 23.100 / 54. 23.100
07-10 23:15:09.870 3218-3759/com.example.hello I/System.out﹕ fc> libavformat 54. 6.100 / 54. 6.100
07-10 23:15:09.870 3218-3759/com.example.hello I/System.out﹕ fc> libavdevice 54. 0.100 / 54. 0.100
07-10 23:15:09.870 3218-3759/com.example.hello I/System.out﹕ fc> libavfilter 2. 77.100 / 2. 77.100
07-10 23:15:09.870 3218-3759/com.example.hello I/System.out﹕ fc> libswscale 2. 1.100 / 2. 1.100
07-10 23:15:09.870 3218-3759/com.example.hello I/System.out﹕ fc> libswresample 0. 15.100 / 0. 15.100
07-10 23:15:09.870 3218-3759/com.example.hello I/System.out﹕ fc> libpostproc 52. 0.100 / 52. 0.100
07-10 23:15:09.870 3218-3759/com.example.hello I/System.out﹕ fc>[NULL @ 0xb6421100] Unable to find a suitable output format for '0'
07-10 23:15:09.870 3218-3759/com.example.hello I/System.out﹕ fc>0: Invalid argument
07-10 23:15:09.891 3218-3264/com.example.hello I/System.out﹕ fc>/data/data/com.example.hello/app_bin/ffmpeg -y -t 0 0 : 0 0 : 5.000000 -i /storage/emulated/0/Movies/-b.mp4 -f mpegts -c copy -an -bsf:v h264_mp4toannexb /storage/emulated/0/Android/data/com.example.hello/cache/ffmpeg-246029513.tmp/1.ts
07-10 23:15:09.912 3762-3762/? W/linker﹕ /data/data/com.example.hello/app_bin/ffmpeg has text relocations. This is wasting memory and prevents security hardening. Please fix.
07-10 23:15:09.913 3218-3763/com.example.hello I/System.out﹕ fc>WARNING: linker: /data/data/com.example.hello/app_bin/ffmpeg has text relocations. This is wasting memory and prevents security hardening. Please fix.
07-10 23:15:09.917 3218-3763/com.example.hello I/System.out﹕ fc>ffmpeg version 0.11.1 Copyright (c) 2000-2012 the FFmpeg developers
07-10 23:15:09.917 3218-3763/com.example.hello I/System.out﹕ fc> built on Dec 22 2014 12:52:34 with gcc 4.6 20120106 (prerelease)
07-10 23:15:09.918 3218-3763/com.example.hello I/System.out﹕ fc> configuration: --arch=arm --cpu=cortex-a8 --target-os=linux --enable-runtime-cpudetect --prefix=/data/data/info.guardianproject.ffmpeg/app_opt --enable-pic --disable-shared --enable-static --cross-prefix=/home/n8fr8/dev/android/ndk/toolchains/arm-linux-androideabi-4.6/prebuilt/linux-x86_64/bin/arm-linux-androideabi- --sysroot=/home/n8fr8/dev/android/ndk/platforms/android-16/arch-arm --extra-cflags='-I../x264 -mfloat-abi=softfp -mfpu=neon -fPIE -pie' --extra-ldflags='-L../x264 -fPIE -pie' --enable-version3 --enable-gpl --disable-doc --enable-yasm --enable-decoders --enable-encoders --enable-muxers --enable-demuxers --enable-parsers --enable-protocols --enable-filters --enable-avresample --enable-libfreetype --disable-indevs --enable-indev=lavfi --disable-outdevs --enable-hwaccels --enable-ffmpeg --disable-ffplay --disable-ffprobe --disable-ffserver --disable-network --enable-libx264 --enable-zlib
07-10 23:15:09.918 3218-3763/com.example.hello I/System.out﹕ fc> libavutil 51. 54.100 / 51. 54.100
07-10 23:15:09.918 3218-3763/com.example.hello I/System.out﹕ fc> libavcodec 54. 23.100 / 54. 23.100
07-10 23:15:09.918 3218-3763/com.example.hello I/System.out﹕ fc> libavformat 54. 6.100 / 54. 6.100
07-10 23:15:09.918 3218-3763/com.example.hello I/System.out﹕ fc> libavdevice 54. 0.100 / 54. 0.100
07-10 23:15:09.918 3218-3763/com.example.hello I/System.out﹕ fc> libavfilter 2. 77.100 / 2. 77.100
07-10 23:15:09.918 3218-3763/com.example.hello I/System.out﹕ fc> libswscale 2. 1.100 / 2. 1.100
07-10 23:15:09.918 3218-3763/com.example.hello I/System.out﹕ fc> libswresample 0. 15.100 / 0. 15.100
07-10 23:15:09.918 3218-3763/com.example.hello I/System.out﹕ fc> libpostproc 52. 0.100 / 52. 0.100
07-10 23:15:09.918 3218-3763/com.example.hello I/System.out﹕ fc>[NULL @ 0xb6321100] Unable to find a suitable output format for '0'
07-10 23:15:09.918 3218-3763/com.example.hello I/System.out﹕ fc>0: Invalid argument
07-10 23:15:09.940 3218-3264/com.example.hello I/System.out﹕ fc>/data/data/com.example.hello/app_bin/ffmpeg -y -i concat:/storage/emulated/0/Android/data/com.example.hello/cache/ffmpeg-246029513.tmp/0.ts|/storage/emulated/0/Android/data/com.example.hello/cache/ffmpeg-246029513.tmp/1.ts -c copy -an /storage/emulated/0/Movies/HelloWorld/VID_1436584506888.mp4
07-10 23:15:09.963 3766-3766/? W/linker﹕ /data/data/com.example.hello/app_bin/ffmpeg has text relocations. This is wasting memory and prevents security hardening. Please fix.
07-10 23:15:09.964 3218-3767/com.example.hello I/System.out﹕ fc>WARNING: linker: /data/data/com.example.hello/app_bin/ffmpeg has text relocations. This is wasting memory and prevents security hardening. Please fix.
07-10 23:15:09.971 3218-3767/com.example.hello I/System.out﹕ fc>ffmpeg version 0.11.1 Copyright (c) 2000-2012 the FFmpeg developers
07-10 23:15:09.972 3218-3767/com.example.hello I/System.out﹕ fc> built on Dec 22 2014 12:52:34 with gcc 4.6 20120106 (prerelease)
07-10 23:15:09.972 3218-3767/com.example.hello I/System.out﹕ fc> configuration: --arch=arm --cpu=cortex-a8 --target-os=linux --enable-runtime-cpudetect --prefix=/data/data/info.guardianproject.ffmpeg/app_opt --enable-pic --disable-shared --enable-static --cross-prefix=/home/n8fr8/dev/android/ndk/toolchains/arm-linux-androideabi-4.6/prebuilt/linux-x86_64/bin/arm-linux-androideabi- --sysroot=/home/n8fr8/dev/android/ndk/platforms/android-16/arch-arm --extra-cflags='-I../x264 -mfloat-abi=softfp -mfpu=neon -fPIE -pie' --extra-ldflags='-L../x264 -fPIE -pie' --enable-version3 --enable-gpl --disable-doc --enable-yasm --enable-decoders --enable-encoders --enable-muxers --enable-demuxers --enable-parsers --enable-protocols --enable-filters --enable-avresample --enable-libfreetype --disable-indevs --enable-indev=lavfi --disable-outdevs --enable-hwaccels --enable-ffmpeg --disable-ffplay --disable-ffprobe --disable-ffserver --disable-network --enable-libx264 --enable-zlib
07-10 23:15:09.973 3218-3767/com.example.hello I/System.out﹕ fc> libavutil 51. 54.100 / 51. 54.100
07-10 23:15:09.973 3218-3767/com.example.hello I/System.out﹕ fc> libavcodec 54. 23.100 / 54. 23.100
07-10 23:15:09.974 3218-3767/com.example.hello I/System.out﹕ fc> libavformat 54. 6.100 / 54. 6.100
07-10 23:15:09.974 3218-3767/com.example.hello I/System.out﹕ fc> libavdevice 54. 0.100 / 54. 0.100
07-10 23:15:09.974 3218-3767/com.example.hello I/System.out﹕ fc> libavfilter 2. 77.100 / 2. 77.100
07-10 23:15:09.975 3218-3767/com.example.hello I/System.out﹕ fc> libswscale 2. 1.100 / 2. 1.100
07-10 23:15:09.976 3218-3767/com.example.hello I/System.out﹕ fc> libswresample 0. 15.100 / 0. 15.100
07-10 23:15:09.976 3218-3767/com.example.hello I/System.out﹕ fc> libpostproc 52. 0.100 / 52. 0.100
07-10 23:15:09.976 3218-3767/com.example.hello I/System.out﹕ fc>concat:/storage/emulated/0/Android/data/com.example.hello/cache/ffmpeg-246029513.tmp/0.ts|/storage/emulated/0/Android/data/com.example.hello/cache/ffmpeg-246029513.tmp/1.ts: Not a directory
07-10 23:15:09.981 3218-3264/com.example.hello D/VideoEditor﹕ transcode exception
java.lang.Exception: There was a problem rendering the video: /storage/emulated/0/Movies/HelloWorld/VID_1436584506888.mp4
    at org.ffmpeg.android.FfmpegController.concatAndTrimFilesMP4Stream(FfmpegController.java:1272)
    at org.apache.cordova.videoeditor.VideoEditor$1.run(VideoEditor.java:257)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587)
    at java.lang.Thread.run(Thread.java:818)
I'm not sure why it says 'Not a directory'. Surely the path is valid, since the files were created in the previous step?
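Looking at the logged commands rather than the filesystem, two argument-construction problems stand out: the `-t` value reaches ffmpeg as several tokens (`-t 0 0 : 0 0 : 5.000000` instead of `-t 00:00:05.000000`), so the per-clip trims fail with "Invalid argument", and the whole `concat:` spec (pipes included) has to reach ffmpeg as a single argument. A minimal, language-neutral sketch of building those arguments correctly (helper names are hypothetical; paths are placeholders):

```python
def duration_token(seconds):
    """Format a duration as ONE argv token, e.g. 5 -> '00:00:05.000000'."""
    hh, rem = divmod(seconds, 3600)
    mm, ss = divmod(rem, 60)
    return "%02d:%02d:%09.6f" % (hh, mm, ss)

def concat_spec(ts_files):
    """Build the concat: input spec; the '|' separators must not be split."""
    return "concat:" + "|".join(ts_files)

def trim_cmd(ffmpeg, src, dst, seconds):
    # Mirrors the per-clip command from the log, with the duration intact.
    return [ffmpeg, "-y", "-t", duration_token(seconds), "-i", src,
            "-f", "mpegts", "-c", "copy", "-an",
            "-bsf:v", "h264_mp4toannexb", dst]
```

Since the two trim steps above each fail, the `.ts` intermediates are plausibly never written, which would explain why the final concat step then errors out on those paths.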
-
Recording real-time video from images with FFmpeg
17 July 2015, by Solarnum
I am really not sure what else I could be doing to achieve this. I'm trying to record the actions in one of the views in my Android app so that it can be played back later, showing the previous actions in real time. The major problem (among others, because there is no way I'm doing this optimally) is that the video takes at least 4 times longer to make than it does to play back. If I ask FFmpeg to create a 5-second video, the process will run in the background for 20 seconds and output a greatly accelerated 5-second video.
My current strategy is to use the
-loop 1
parameter on a single image file and continuously write a JPEG to that image file. (If someone has a better idea than this for feeding continuously updated image information to FFmpeg, let me know.)

encodingThread = new Thread(new Runnable() {
    private boolean running = true;

    @Override
    public void run() {
        while (running) {
            try {
                Bitmap bitmap = makeBitmapFromView();
                String filepath = Environment.getExternalStorageDirectory().getAbsolutePath() + "/test.jpg";
                File file = new File(filepath);
                FileOutputStream fout = new FileOutputStream(file);
                bitmap.compress(Bitmap.CompressFormat.JPEG, 100, fout);
                fout.flush();
                fout.close();
                Thread.sleep(50);
            } catch (IOException e) {
                e.printStackTrace();
            } catch (InterruptedException e) {
                running = false;
            }
        }
    }
});
startVideoMaking();
encodingThread.start();
The startVideoMaking method is as follows:
private void startVideoMaking() {
    ffmpeg.killRunningProcesses();
    File outputFile = new File(Environment.getExternalStorageDirectory().getAbsolutePath() + "/testout.mp4");
    String path = Environment.getExternalStorageDirectory().getAbsolutePath() + "/test.jpg";
    String output = Environment.getExternalStorageDirectory().getAbsolutePath() + "/testout.mp4";
    String command = "-loop 1 -t 5 -re -i " + path + " -c:v libx264 -loglevel verbose -vsync 0 -threads 0 -preset ultrafast -tune zerolatency -y -pix_fmt yuv420p " + output;
    executeFFM(command);
}
Just to make it clear, the FFmpeg command that I am executing is
ffmpeg -loop 1 -re -i /storage/emulated/0/test.jpg -t 5 -c:v libx264 -loglevel verbose -vsync 0 -threads 0 -preset ultrafast -tune zerolatency -y -pix_fmt yuv420p /storage/emulated/0/testout.mp4
The
makeBitmapFromView()
method takes about 50 ms to process, and writing the bitmap to the SD card takes around 200 ms, which is not great.
I'm pretty lost as to what other solutions there would be for creating a video of a single view in Android. I know there is the MediaCodec class, but I couldn't get it to work, and it would also raise my minimum SDK, which is not ideal. I'm also not sure that the MediaCodec class would even solve my problem.
Is there some way that I can get FFmpeg to create a 5-second video that is equivalent to 5 seconds of real time? I have also tried converting a single image, without updating its content continuously, and had the same results.
If my question isn't clear enough let me know.
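One assumption worth testing (this is a sketch of the timing arithmetic, not a confirmed fix): with -loop 1, the image2 demuxer stamps input frames at its -framerate input option, which defaults to 25 fps, while the capture loop above refreshes the JPEG only every few hundred milliseconds; declaring the actual capture rate should keep 5 seconds of captures equal to 5 seconds of video, and -re only paces how fast input is read, it doesn't change the timestamps. A small helper (hypothetical names) for deriving the values:

```python
def capture_params(frame_interval_ms, clip_seconds):
    """Derive ffmpeg input options from the capture loop's real cadence."""
    fps = 1000.0 / frame_interval_ms           # e.g. 250 ms per frame -> 4 fps
    total_frames = int(round(fps * clip_seconds))
    # Input options must precede -i; -framerate is an image2 demuxer option.
    input_args = ["-loop", "1", "-framerate", "%g" % fps]
    return fps, total_frames, input_args
```

With the rate declared, `-t 5` would then cover 20 real captures at 4 fps rather than a 25 fps timeline, which is one plausible source of the "greatly accelerated" output.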
-
How to take a video output file generated every 5 seconds, encode it to RTMP format, and write the data on iOS? [on hold]
16 July 2015, by Sandeep Joshi

- (void)segmentRecording:(NSTimer *)timer {
    if (!shouldBeRecording) {
        [timer invalidate];
    }
    AVAssetWriter *tempAssetWriter = self.assetWriter;
    AVAssetWriterInput *tempAudioEncoder = self.audioEncoder;
    AVAssetWriterInput *tempVideoEncoder = self.videoEncoder;
    self.assetWriter = queuedAssetWriter;
    self.audioEncoder = queuedAudioEncoder;
    self.videoEncoder = queuedVideoEncoder;
    NSLog(@"Switching encoders");

    dispatch_async(segmentingQueue, ^{
        if (tempAssetWriter.status == AVAssetWriterStatusWriting) {
            @try {
                [tempAudioEncoder markAsFinished];
                [tempVideoEncoder markAsFinished];
                [tempAssetWriter finishWritingWithCompletionHandler:^{
                    if (tempAssetWriter.status == AVAssetWriterStatusFailed) {
                        [self showError:tempAssetWriter.error];
                    } else {
                        [self uploadLocalURL:tempAssetWriter.outputURL];
                    }
                }];
            } @catch (NSException *exception) {
                NSLog(@"Caught exception: %@", [exception description]);
                //[BugSenseController logException:exception withExtraData:nil];
            }
        }
        self.segmentCount++;
        if (self.readyToRecordAudio && self.readyToRecordVideo) {
            NSError *error = nil;
            self.queuedAssetWriter = [[AVAssetWriter alloc] initWithURL:[OWUtilities urlForRecordingSegmentCount:segmentCount basePath:self.basePath] fileType:(NSString *)kUTTypeMPEG4 error:&error];
            if (error) {
                [self showError:error];
            }
            self.queuedVideoEncoder = [self setupVideoEncoderWithAssetWriter:self.queuedAssetWriter formatDescription:videoFormatDescription bitsPerSecond:videoBPS];
            self.queuedAudioEncoder = [self setupAudioEncoderWithAssetWriter:self.queuedAssetWriter formatDescription:audioFormatDescription bitsPerSecond:audioBPS];
            //NSLog(@"Encoder switch finished");
        }
    });
}

- (void)uploadLocalURL:(NSURL *)url {
    NSLog(@"upload local url: %@", url);
    NSString *inputPath = [url path];
    NSString *outputPath = [inputPath stringByReplacingOccurrencesOfString:@".mp4" withString:@".ts"];
    NSString *outputFileName = [outputPath lastPathComponent];
    NSDictionary *options = @{kFFmpegOutputFormatKey: @"mpegts"};
    NSLog(@"%@ conversion...", outputFileName);
    [ffmpegWrapper convertInputPath:[url path] outputPath:outputPath options:options progressBlock:nil completionBlock:^(BOOL success, NSError *error) {
        if (success) {
            if (!isRtmpConnected) {
                isRtmpConnected = [rtmp openWithURL:HOST_URL enableWrite:YES];
            }
            isRtmpConnected = [rtmp isConnected];
            if (isRtmpConnected) {
                NSData *video = [NSData dataWithContentsOfURL:[NSURL URLWithString:outputPath]];
                NSUInteger length = [video length];
                NSUInteger chunkSize = 1024 * 5;
                NSUInteger offset = 0;
                NSLog(@"original video length: %lu \n chunkSize : %lu", length, chunkSize);
                // Let's split video to small chunks to publish to media server
                do {
                    NSUInteger thisChunkSize = length - offset > chunkSize ? chunkSize : length - offset;
                    NSData *chunk = [NSData dataWithBytesNoCopy:(char *)[video bytes] + offset length:thisChunkSize freeWhenDone:NO];
                    offset += thisChunkSize;
                    // Write new chunk to rtmp server
                    NSLog(@"%lu", (unsigned long)[rtmp write:chunk]);
                    sleep(1);
                } while (offset < length);
            } else {
                [rtmp close];
            }
        } else {
            NSLog(@"conversion error: %@", error.userInfo);
        }
    }];
}
This code is used for live streaming, sending data with an RTMP wrapper. The socket writes don't work properly because a different output file is generated every 5 seconds.
Is this the proper way to do it?
I have no idea how to get the NSData properly.
Please help me.
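The 5 KB chunking loop in uploadLocalURL can at least be checked in isolation; here is a language-neutral sketch of the same splitting logic (this is not the RTMP wrapper's API, just the byte arithmetic):

```python
def split_chunks(data, chunk_size=5 * 1024):
    """Split a byte string into ordered pieces of at most chunk_size bytes."""
    return [data[offset:offset + chunk_size]
            for offset in range(0, len(data), chunk_size)]
```

Separately, one plausible reason the NSData comes out wrong: +[NSURL URLWithString:] expects a string with a URL scheme, so for a plain filesystem path like outputPath the usual choice is +[NSURL fileURLWithPath:] (or dataWithContentsOfFile:); with a schemeless string, dataWithContentsOfURL: can return nil.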
-
Segmentation fault with avcodec_encode_video2() while encoding H.264
16 July 2015, by Baris Demiray
I'm trying to convert a cv::Mat to an AVFrame to then encode it in H.264, and I wanted to start from a simple example, as I'm a newbie in both. So I first read in a JPEG file, and then do the pixel format conversion with sws_scale() from AV_PIX_FMT_BGR24 to AV_PIX_FMT_YUV420P, keeping the dimensions the same, and it all goes fine until I call avcodec_encode_video2().
I read quite a few discussions regarding AVFrame allocation, and the question "segmentation fault while avcodec_encode_video2" seemed like a match, but I just can't see what I'm missing or getting wrong.
Here is the minimal code with which you can reproduce the crash; it should be compiled with:
g++ -o OpenCV2FFmpeg OpenCV2FFmpeg.cpp -lopencv_imgproc -lopencv_highgui -lopencv_core -lswscale -lavutil -lavcodec -lavformat
Its output on my system:
cv::Mat [width=420, height=315, depth=0, channels=3, step=1260] I'll soon crash.. Segmentation fault
And that sample.jpg file's details, from the identify tool:

~temporary/sample.jpg JPEG 420x315 420x315+0+0 8-bit sRGB 38.3KB 0.000u 0:00.000
Please note that I'm trying to create a video out of a single image, just to keep things simple.
#include <iostream>
#include <string>

using namespace std;

extern "C" {
#include <libavcodec/avcodec.h>
#include <libswscale/swscale.h>
#include <libavformat/avformat.h>
}

#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>

const string TEST_IMAGE = "/home/baris/temporary/sample.jpg";

int main(int /*argc*/, char** argv)
{
    av_register_all();
    avcodec_register_all();

    /**
     * Initialise the encoder
     */
    AVCodec *h264encoder = avcodec_find_encoder(AV_CODEC_ID_H264);
    AVFormatContext *cv2avFormatContext = avformat_alloc_context();

    /**
     * Create a stream and allocate frames
     */
    AVStream *h264outputstream = avformat_new_stream(cv2avFormatContext, h264encoder);
    avcodec_get_context_defaults3(h264outputstream->codec, h264encoder);
    AVFrame *sourceAvFrame = av_frame_alloc(), *destAvFrame = av_frame_alloc();
    int got_frame;

    /**
     * Pixel formats for the input and the output
     */
    AVPixelFormat sourcePixelFormat = AV_PIX_FMT_BGR24;
    AVPixelFormat destPixelFormat = AV_PIX_FMT_YUV420P;

    /**
     * Create cv::Mat
     */
    cv::Mat cvFrame = cv::imread(TEST_IMAGE, CV_LOAD_IMAGE_COLOR);
    int width = cvFrame.size().width, height = cvFrame.size().height;
    cerr << "cv::Mat [width=" << width << ", height=" << height << ", depth=" << cvFrame.depth()
         << ", channels=" << cvFrame.channels() << ", step=" << cvFrame.step << "]" << endl;

    h264outputstream->codec->pix_fmt = destPixelFormat;
    h264outputstream->codec->width = cvFrame.cols;
    h264outputstream->codec->height = cvFrame.rows;

    /**
     * Prepare the conversion context
     */
    SwsContext *bgr2yuvcontext = sws_getContext(width, height, sourcePixelFormat,
                                                h264outputstream->codec->width,
                                                h264outputstream->codec->height,
                                                h264outputstream->codec->pix_fmt,
                                                SWS_BICUBIC, NULL, NULL, NULL);

    /**
     * Convert and encode frames
     */
    for (uint i = 0; i < 250; i++)
    {
        /**
         * Allocate source frame, i.e. input to sws_scale()
         */
        avpicture_alloc((AVPicture*)sourceAvFrame, sourcePixelFormat, width, height);

        for (int h = 0; h < height; h++)
            memcpy(&(sourceAvFrame->data[0][h*sourceAvFrame->linesize[0]]), &(cvFrame.data[h*cvFrame.step]), width*3);

        /**
         * Allocate destination frame, i.e. output from sws_scale()
         */
        avpicture_alloc((AVPicture *)destAvFrame, destPixelFormat, width, height);

        sws_scale(bgr2yuvcontext, sourceAvFrame->data, sourceAvFrame->linesize, 0, height,
                  destAvFrame->data, destAvFrame->linesize);

        /**
         * Prepare an AVPacket for encoded output
         */
        AVPacket avEncodedPacket;
        av_init_packet(&avEncodedPacket);
        avEncodedPacket.data = NULL;
        avEncodedPacket.size = 0;
        // av_free_packet(&avEncodedPacket); w/ or w/o result doesn't change

        cerr << "I'll soon crash.." << endl;
        if (avcodec_encode_video2(h264outputstream->codec, &avEncodedPacket, destAvFrame, &got_frame) < 0)
            exit(1);

        cerr << "Checking if we have a frame" << endl;
        if (got_frame)
            av_write_frame(cv2avFormatContext, &avEncodedPacket);

        av_free_packet(&avEncodedPacket);
        av_frame_free(&sourceAvFrame);
        av_frame_free(&destAvFrame);
    }
}

Thanks in advance!
EDIT: And the stack trace after the crash,
Thread 2 (Thread 0x7fffe5506700 (LWP 10005)):
#0 0x00007ffff4bf6c5d in poll () at /lib64/libc.so.6
#1 0x00007fffe9073268 in () at /usr/lib64/libusb-1.0.so.0
#2 0x00007ffff47010a4 in start_thread () at /lib64/libpthread.so.0
#3 0x00007ffff4bff08d in clone () at /lib64/libc.so.6

Thread 1 (Thread 0x7ffff7f869c0 (LWP 10001)):
#0 0x00007ffff5ecc7dc in avcodec_encode_video2 () at /usr/lib64/libavcodec.so.56
#1 0x00000000004019b6 in main(int, char**) (argv=0x7fffffffd3d8) at ../src/OpenCV2FFmpeg.cpp:99
EDIT2: The problem was that I hadn't called avcodec_open2() on the codec, as spotted by Ronald. The final version of the code is at https://github.com/barisdemiray/opencv2ffmpeg/, with leaks and probably other problems, hoping that I'll improve it while learning both libraries.
-
FFprobe Check Stream Link
16 July 2015, by Krasic
I am trying to use ffprobe to test whether a streaming link is active or not.
For example this is a working streaming link:
ffprobe -loglevel quiet -show_streams rtmp://Lrmh0w.cloud.influxis.com/yoy/_definst_/185
I do get output, which means the link is active.
However, once I change the link to something that doesn't work:
ffprobe -loglevel quiet -show_streams rtmp://Lrmh0w.cloud.influxis.com/yoy/_definst_/18555555555
The command keeps running in the background with no result.
Is there a way to bypass this, or is there an ffprobe timeout parameter? I couldn't find one in the official documentation.
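If ffprobe itself offers no way to abort a stalled RTMP probe, a common workaround is to enforce the limit from the caller, e.g. coreutils `timeout 10 ffprobe ...` from a shell, or a subprocess timeout from code. A sketch (the ffprobe arguments are the ones above; `run_with_timeout` is a hypothetical helper):

```python
import subprocess

def run_with_timeout(cmd, timeout_s):
    """Run a command; return its exit code, or None if it hangs past timeout_s."""
    try:
        return subprocess.run(cmd, capture_output=True, timeout=timeout_s).returncode
    except subprocess.TimeoutExpired:
        return None  # the child is killed; treat the probe as failed

def stream_is_active(url, timeout_s=10):
    """True only if ffprobe exits cleanly within the deadline."""
    rc = run_with_timeout(["ffprobe", "-loglevel", "quiet", "-show_streams", url],
                          timeout_s)
    return rc == 0
```

With this wrapper, a dead link simply turns into a None/False result after the deadline instead of a process stuck in the background.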