Newest 'ffmpeg' Questions - Stack Overflow
-
ffmpeg result to a tempfile
26 September 2012, by user1165201
I am new to Python and ffmpeg, and I have the following question.
If I run the following command from the command line, it works:
ffmpeg -i 1.flv temp_filename
But if I put it in a program:
temp_file_handle, temp_filename = tempfile.mkstemp('.flv')
command = "ffmpeg -i " + newvideo.location + " " + temp_filename
out = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
res = out.communicate()
the generated video is not written to temp_filename. Why?
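One likely culprit, offered as a guess rather than a certainty: tempfile.mkstemp actually creates the output file, so ffmpeg stops to ask whether it may overwrite it, and with the pipes attached that prompt is never answered; the open file descriptor returned by mkstemp is also never closed. A minimal sketch of a variant that avoids both issues (the input path here is a stand-in for newvideo.location from the question):
# a minimal sketch, not the original program
import os
import subprocess
import tempfile

input_path = "1.flv"                      # stand-in for newvideo.location

temp_file_handle, temp_filename = tempfile.mkstemp('.flv')
os.close(temp_file_handle)                # mkstemp returns an open fd; release it

# mkstemp already created the file, so without -y ffmpeg stops at the
# "File ... already exists. Overwrite? [y/N]" prompt; passing a list of
# arguments also avoids shell quoting problems with unusual file names
command = ['ffmpeg', '-y', '-i', input_path, temp_filename]
process = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout_data, stderr_data = process.communicate()
print(stderr_data)                        # ffmpeg writes its log to stderr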
-
ffmpeg keyframe extraction
26 September 2012, by Shikhar Shrivastav
I have been trying to extract keyframes from a video using ffmpeg 0.11.1. So far all the commands I have tried do not extract keyframes but return all the frames (i.e. 25 fps * total time) in the output. I tried setting keyint_min to 25 to make sure there is a maximum of 1 keyframe per second.
ffmpeg -vf select="eq(pict_type\,PICT_TYPE_I)" -g 250 -keyint_min 25 -i C:\test.mp4 -vsync 2 -f image2 C:\testTemp\thumbnails-%02d.jpeg
But still all the frames are returned.
Then I tried to separate the keyframes by 20 seconds:
ffmpeg -i C:\test.mp4 -vf select='eq(pict_type\,I)*(isnan(prev_selected_t)+gte(t-prev_selected_t\,20))' -vsync 0 -f image2 C:\testTemp\%09d.jpg
Again the same result: all the frames are returned.
What should I do?
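Two things may be worth ruling out here, offered as guesses rather than a confirmed fix. In the first command, -vf is placed before -i, so it is not applied as an output option, and -g/-keyint_min are encoding options that have no effect when you are only selecting existing keyframes. In the second command, cmd.exe does not strip single quotes the way a Unix shell does, so the select expression may never reach the filter intact. A sketch of the usual form on Windows, with double quotes and variable-frame-rate syncing so the selected frames are not duplicated:
ffmpeg -i C:\test.mp4 -vf "select=eq(pict_type\,I)" -vsync vfr C:\testTemp\thumbnails-%02d.jpeg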
-
Encode image to video using ffmpeg (sws_scale)
26 September 2012, by bahar_p
I'm trying to encode an image to video using the ffmpeg library. I have these global params:
//Global params
AVCodec *codec;
AVCodecContext *codecCtx;
uint8_t *output_buffer;
int output_buffer_size;
I divided the encoding into 3 methods. Initialize the encoder:
jint Java_com_camera_simpledoublewebcams2_CameraPreview_initencoder(JNIEnv* env, jobject thiz){
    avcodec_register_all();
    avcodec_init();
    av_register_all();

    int fps = 30;

    /* find the H263 video encoder */
    codec = avcodec_find_encoder(CODEC_ID_H263);
    if (!codec) {
        LOGI("avcodec_find_encoder() run fail.");
        return -5;
    }

    //allocate context
    codecCtx = avcodec_alloc_context();

    /* put sample parameters */
    codecCtx->bit_rate = 400000;
    /* resolution must be a multiple of two */
    codecCtx->width = 176;
    codecCtx->height = 144;
    /* frames per second */
    codecCtx->time_base = (AVRational){1,fps};
    codecCtx->pix_fmt = PIX_FMT_YUV420P;
    codecCtx->codec_id = CODEC_ID_H263;
    codecCtx->codec_type = AVMEDIA_TYPE_VIDEO;

    /* open it */
    if (avcodec_open(codecCtx, codec) < 0) {
        LOGI("avcodec_open() run fail.");
        return -10;
    }

    //init buffer
    output_buffer_size = 500000;
    output_buffer = malloc(output_buffer_size);

    return 0;
}
Encoding the image:
jint Java_com_camera_simpledoublewebcams2_CameraPreview_encodejpeg(JNIEnv* env, jobject thiz, jchar* cImage, jint imageSize){
    int out_size;
    AVFrame *picture;
    AVFrame *outpic;
    uint8_t *outbuffer;

    //allocate frame
    picture = avcodec_alloc_frame();
    outpic = avcodec_alloc_frame();

    int nbytes = avpicture_get_size(PIX_FMT_YUV420P, codecCtx->width, codecCtx->height);
    outbuffer = (uint8_t*)av_malloc(nbytes);
    outpic->pts = 0;

    //fill picture with image
    avpicture_fill((AVPicture*)picture, (uint8_t*)cImage, PIX_FMT_RGBA, codecCtx->width, codecCtx->height);
    //fill outpic with empty image
    avpicture_fill((AVPicture*)outpic, outbuffer, PIX_FMT_YUV420P, codecCtx->width, codecCtx->height);

    //rescale the image
    struct SwsContext* fooContext = sws_getContext(codecCtx->width, codecCtx->height, PIX_FMT_RGBA,
                                                   codecCtx->width, codecCtx->height, PIX_FMT_YUV420P,
                                                   SWS_FAST_BILINEAR, NULL, NULL, NULL);
    sws_scale(fooContext, picture->data, picture->linesize, 0, codecCtx->height, outpic->data, outpic->linesize);

    //encode the image
    out_size = avcodec_encode_video(codecCtx, output_buffer, output_buffer_size, outpic);
    out_size += avcodec_encode_video(codecCtx, output_buffer, output_buffer_size, outpic);

    //release pictures
    av_free(outbuffer);
    av_free(picture);
    av_free(outpic);

    return out_size;
}
And closing the encoder:
void Java_com_camera_simpledoublewebcams2_CameraPreview_closeencoder(JNIEnv* env, jobject thiz){
    free(output_buffer);
    avcodec_close(codecCtx);
    av_free(codecCtx);
}
When I send the first image, I get a result from the encoder. When I try to send another image, the program crashes. I tried calling init once, then sending the images, then calling close - it didn't work. I tried calling init and close for every image - that didn't work either.
Any suggestions?
Thanks!
EDIT: After further research I found that the problem is in the sws_scale method. Still don't know what is causing this issue...
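Two things around that call are worth checking, offered as guesses rather than a diagnosis: nothing verifies that the incoming buffer really contains width x height RGBA bytes (cImage arrives as a jchar*, and Android camera previews are typically NV21, not RGBA), and the SwsContext created on every frame is never checked or freed. A rough sketch of a guarded scaling step, reusing the question's globals and LOGI macro and assuming imageSize is a byte count:
#include <libavcodec/avcodec.h>
#include <libswscale/swscale.h>

// hypothetical helper, not the poster's code: validate the input, scale,
// and release the SwsContext instead of leaking one per frame
static int scale_rgba_to_yuv(uint8_t *cImage, int imageSize,
                             AVFrame *picture, AVFrame *outpic)
{
    // sws_scale will read this many bytes from cImage; if the camera buffer
    // is smaller (different resolution or not RGBA), it reads past the end
    int needed = avpicture_get_size(PIX_FMT_RGBA, codecCtx->width, codecCtx->height);
    if (imageSize < needed) {
        LOGI("input buffer smaller than an RGBA frame");
        return -1;
    }

    avpicture_fill((AVPicture*)picture, cImage, PIX_FMT_RGBA,
                   codecCtx->width, codecCtx->height);

    struct SwsContext *sws = sws_getContext(codecCtx->width, codecCtx->height, PIX_FMT_RGBA,
                                            codecCtx->width, codecCtx->height, PIX_FMT_YUV420P,
                                            SWS_FAST_BILINEAR, NULL, NULL, NULL);
    if (!sws) {
        LOGI("sws_getContext() failed");
        return -1;
    }

    sws_scale(sws, (const uint8_t * const *)picture->data, picture->linesize,
              0, codecCtx->height, outpic->data, outpic->linesize);
    sws_freeContext(sws);   // the original never frees its per-frame context

    return 0;
}
-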
Compiling ffmpeg for use in Android (ndk) for x86 architecture
26 September 2012, by Ankur22
I am trying to compile ffmpeg for Android, but for the x86 architecture. I've successfully compiled an ARM .so and it all works, and I thought that replacing arm with x86 in the relevant places within the build script would do the trick. Unfortunately that's not the case and I get some strange things occurring.
First I get this while it's compiling ffmpeg:
/home/ankur/android-ndk-r8/platforms/android-8/arch-arm//usr/include/strings.h:49: warning: redundant redeclaration of 'index'
Notice the arch-arm instead of arch-x86. Finally I get this:
/home/ankur/android-ndk-r8/toolchains/x86-4.4.3/prebuilt/linux-x86/bin/i686-android-linux-ld: libavcodec/libavcodec.a(4xm.o): Relocations in generic ELF (EM: 40)
/home/ankur/android-ndk-r8/toolchains/x86-4.4.3/prebuilt/linux-x86/bin/i686-android-linux-ld: libavcodec/libavcodec.a(4xm.o): Relocations in generic ELF (EM: 40)
/home/ankur/android-ndk-r8/toolchains/x86-4.4.3/prebuilt/linux-x86/bin/i686-android-linux-ld: libavcodec/libavcodec.a(4xm.o): Relocations in generic ELF (EM: 40)
libavcodec/libavcodec.a(4xm.o): could not read symbols: File in wrong format
The build script I've created looks like this:
#!/bin/bash
NDK=~/android-ndk-r8
PLATFORM=$NDK/platforms/android-8/arch-x86/
PREBUILT=$NDK/toolchains/x86-4.4.3/prebuilt/linux-x86

function build_one_r8 {
    ./configure \
        --disable-shared \
        --enable-static \
        --enable-gpl \
        --enable-version3 \
        --enable-nonfree \
        --disable-doc \
        --disable-ffmpeg \
        --disable-ffplay \
        --disable-ffprobe \
        --disable-ffserver \
        --disable-avdevice \
        --disable-avfilter \
        --disable-postproc \
        --enable-small \
        --cross-prefix=$PREBUILT/bin/i686-android-linux- \
        --enable-cross-compile \
        --target-os=linux \
        --extra-cflags="-I$PLATFORM/usr/include" \
        --arch=x86 \
        --disable-symver \
        --disable-debug \
        --disable-stripping \
        $ADDITIONAL_CONFIGURE_FLAG

    sed -i 's/HAVE_LRINT 0/HAVE_LRINT 1/g' config.h
    sed -i 's/HAVE_LRINTF 0/HAVE_LRINTF 1/g' config.h
    sed -i 's/HAVE_ROUND 0/HAVE_ROUND 1/g' config.h
    sed -i 's/HAVE_ROUNDF 0/HAVE_ROUNDF 1/g' config.h
    sed -i 's/HAVE_TRUNC 0/HAVE_TRUNC 1/g' config.h
    sed -i 's/HAVE_TRUNCF 0/HAVE_TRUNCF 1/g' config.h

    make clean
    make -j4 install

    $PREBUILT/bin/i686-android-linux-ar d libavcodec/libavcodec.a inverse.o
    $PREBUILT/bin/i686-android-linux-ld -rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib -soname libffmpeg.so -shared -nostdlib -z,noexecstack -Bsymbolic --whole-archive --no-undefined -o $PREFIX/libffmpeg.so libavcodec/libavcodec.a libavformat/libavformat.a libavutil/libavutil.a libswscale/libswscale.a -lc -lm -lz -ldl -llog --warn-once --dynamic-linker=/system/bin/linker $PREBUILT/lib/gcc/i686-android-linux/4.4.3/libgcc.a
}

function build_one_r8_2 {
    $PREBUILT/bin/i686-android-linux-ar d libavcodec/libavcodec.a inverse.o
    $PREBUILT/bin/i686-android-linux-ld -rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib -soname libffmpeg.so -shared -nostdlib -z,noexecstack -Bsymbolic --whole-archive --no-undefined -o $PREFIX/libffmpeg.so libavcodec/libavcodec.a libavformat/libavformat.a libavutil/libavutil.a libswscale/libswscale.a -lc -lm -lz -ldl -llog --warn-once --dynamic-linker=/system/bin/linker $PREBUILT/lib/gcc/i686-android-linux/4.4.3/libgcc.a
}

#x86
CPU=x86
OPTIMIZE_CFLAGS="-march=$CPU "
PREFIX=./android/$CPU
ADDITIONAL_CONFIGURE_FLAG=

build_one_r8
I hope I'm doing something wrong rather than this not being possible.
Thanks!
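For what it's worth, "EM: 40" is the ELF machine number for ARM, so the object files inside libavcodec.a are still the ones produced by the earlier ARM build, and the arch-arm include path in the first warning points the same way. One thing worth trying, sketched here on the assumption that the script runs in the same source tree that built the ARM .so: wipe the stale objects and configuration before reconfiguring, and point configure explicitly at the x86 compiler and sysroot so the ARM toolchain cannot be picked up by accident.
# hypothetical addition to build_one_r8, using the script's own variables
make distclean                      # removes the ARM objects and the old config.h/config.mak

./configure \
    --enable-cross-compile \
    --target-os=linux \
    --arch=x86 \
    --cross-prefix=$PREBUILT/bin/i686-android-linux- \
    --cc=$PREBUILT/bin/i686-android-linux-gcc \
    --sysroot=$PLATFORM \
    --extra-cflags="-I$PLATFORM/usr/include" \
    $ADDITIONAL_CONFIGURE_FLAG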
-
Start using ffmpeg Android with NDK - Windows
26 September 2012, by idish
I'm new to ffmpeg and I'm trying to compile it and use its libraries in my Android app with the Android NDK. I have everything ready: the NDK and the ffmpeg source, but I've been searching the web for a long time trying to compile ffmpeg into my app. I found this answer, but I don't really understand the instructions. I repeat, I'm using Windows, not a Linux-based operating system. Is it that hard to make ffmpeg work in my Android app? Could anyone please give me instructions to get it working?
Thank you.
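Not a full answer, but the usual route on Windows: ffmpeg's configure is a POSIX shell script, so it has to be run from an MSYS or Cygwin bash shell, and the NDK ships Windows prebuilt cross-compilers that the script can then call. A rough sketch, assuming NDK r8 installed at /c/android-ndk-r8 and the ARM toolchain paths below (check them against your install):
# run inside an MSYS/Cygwin bash shell, from the ffmpeg source directory
NDK=/c/android-ndk-r8
PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.4.3/prebuilt/windows
PLATFORM=$NDK/platforms/android-8/arch-arm

./configure \
    --enable-cross-compile \
    --target-os=linux \
    --arch=arm \
    --cross-prefix=$PREBUILT/bin/arm-linux-androideabi- \
    --sysroot=$PLATFORM \
    --disable-shared --enable-static
make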