Newest 'ffmpeg' Questions - Stack Overflow
Articles published on the site
-
How to continuously extract video frames from streaming RTMP using avconv / ffmpeg?
7 October 2013, by mvbl fst
We're dealing with streaming video over RTMP, and my goal is to extract frames from the stream at a given interval, e.g. every second.
Currently I run a command in a loop which grabs one frame and exports it as a base64-encoded JPEG:

    avconv -i -y -f image2 -ss 3 -vcodec mjpeg -vframes 1 -s sqcif /dev/stdout 2>/dev/null | base64 -w 0

But each of these processes is slow (it takes a few seconds, which adds even more delay to streaming video that's already not real time). I am wondering whether there is a way to make avconv or ffmpeg extract frames at an interval (in seconds or frames) and either save them as files or dump them to stdout.
I would really appreciate your help!
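If the goal is one JPEG per second from a single long-running process, ffmpeg's fps filter (or the plain -r output option on older avconv builds) can do this in one decode pass. A minimal sketch, assuming a placeholder rtmp:// input URL:

    # one long-running process, one JPEG per second, numbered output files
    ffmpeg -i rtmp://example.com/live/stream -vf fps=1 -f image2 frame_%04d.jpg

    # or stream the JPEGs to stdout for piping, e.g. into base64
    ffmpeg -i rtmp://example.com/live/stream -vf fps=1 -f image2pipe -vcodec mjpeg - | base64 -w 0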
-
How can I mux (or encapsulate) H.264 RTP output into a container using FFmpeg?
7 October 2013, by Grad
I am working on the effects of network losses in video transmission. To simulate the network losses, I use a simple program that drops random RTP packets from the output of the H.264 RTP encoder.
I use the Joint Model (JM) 14.2 reference software to encode the video. However, I don't use the Annex B format for the output; instead I choose RTP packets. The JM output is generated as a sequence of RTP packets, each consisting of an RTP header followed by its payload. Some of the RTP packets are then dropped by a simple program, after which I can decode the resulting bitstream with JM and its error-concealment methods.
The main purpose of this process is to evaluate how network losses affect human perception of video quality. To measure perceived quality, the video shown must either be in its decoded form (i.e. full resolution) or be decodable at the receiver side. The RTP packets created by the JM encoder cannot be decoded without the JM software installed. However, with the proper header (or container), most video players are able to decode the bitstream. So my goal in this question is to encapsulate my encoded RTP packet bitstream in a common container, such as AVI or MP4, so that my content is decodable on the receiver's computer.
The format of the encoded bitstream in RTP packetized form is as follows:
    | RTP Header #1 | RTP Payload #1 | RTP Header #2 | RTP Payload #2 | ...
To evaluate the video quality, I want to run a subjective test with these bitstreams. I could run such a test using the full-resolution data I decode myself, but it is very inconvenient to crowdsource a subjective test with GBs of video data over the Internet. So I want to mux these bitstreams into a container (e.g. AVI) using FFmpeg. I have tried to decode the bitstreams with ffmpeg and ffplay, but neither worked. I also tried the following command, and it didn't work either:
    ffmpeg -f h264 -i -vcodec copy -r 25 out.avi
Which format or muxer should I use? Do I need to convert these files to any other format?
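For what it's worth, ffmpeg's raw h264 demuxer expects an Annex B byte stream (NAL units separated by 00 00 00 01 start codes) rather than RTP framing, so the RTP headers would have to be stripped before muxing. A minimal sketch of the muxing step, assuming the stream has already been converted to Annex B (dump.264 is a placeholder file name):

    # wrap a raw Annex B H.264 elementary stream in MP4 without re-encoding
    ffmpeg -f h264 -r 25 -i dump.264 -vcodec copy out.mp4

The -r 25 before -i tells the raw demuxer the frame rate, since an elementary stream carries no container timestamps.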
-
Failed to build Android FFmpeg using the NDK
6 October 2013, by Blaze Tama
First, I'm a newbie with Ubuntu and Android FFmpeg builds, so please bear with me.
I'm using this library: https://github.com/appunite/AndroidFFmpeg and I'm trying to build it with Ubuntu 13.04 and Android NDK r9.
First, I was getting a "C compiler cannot create executables" error. After some research (and a little struggling), I noticed that my toolchain version is 4.8 while the version in android_build.sh is 4.4.3, so I changed all of those values and the error went away. After that, I tried to build again, but libffmpeg.so was not built, which means the build failed. I looked at the config.log (vo-amrwbenc's log) and found these errors:

    configure:4179: /home/tama/Documents/android-ndk-r9/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-x86/bin/arm-linux-androideabi-gcc --sysroot=/home/tama/Documents/android-ndk-r9/platforms/android-5/arch-arm/ -c -marm -march=armv5 -marm -march=armv5 conftest.c >&5
    conftest.c:61:29: error: expected ';', ',' or ')' before 'text'
    /home/tama/Documents/android-ndk-r9/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-x86/bin/arm-linux-androideabi-ar: conftest.o: No such file or directory
    configure:7482: /home/tama/Documents/android-ndk-r9/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-x86/bin/arm-linux-androideabi-gcc --sysroot=/home/tama/Documents/android-ndk-r9/platforms/android-5/arch-arm/ -std=gnu99 -E -marm -march=armv5 conftest.c
    conftest.c:11:28: fatal error: ac_nonexistent.h: No such file or directory
    configure:7527: $? = 0
    configure:7541: /home/tama/Documents/android-ndk-r9/toolchains/arm-linux-androideabi-4.8/prebuilt/linux-x86/bin/arm-linux-androideabi-gcc --sysroot=/home/tama/Documents/android-ndk-r9/platforms/android-5/arch-arm/ -std=gnu99 -E -marm -march=armv5 conftest.c
    conftest.c:11:28: fatal error: ac_nonexistent.h: No such file or directory
And there are more errors like these coming from the configure test C code.
Please kindly download my config.log (less than 50KB) if you need more information: https://www.dropbox.com/s/ptn1gvnik3v341y/config.log
I'm racing against time now; any help is appreciated.
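As a side note, one way to check the cross-compiler outside of the library's own build scripts is a standalone toolchain. A sketch, assuming NDK r9's make-standalone-toolchain.sh and the same paths that appear in the log above:

    # create a self-contained cross toolchain from the NDK
    /home/tama/Documents/android-ndk-r9/build/tools/make-standalone-toolchain.sh \
        --platform=android-5 \
        --toolchain=arm-linux-androideabi-4.8 \
        --install-dir=$HOME/android-toolchain

    # verify the compiler can produce executables with the same flags configure uses
    export CC=$HOME/android-toolchain/bin/arm-linux-androideabi-gcc
    echo 'int main(void){return 0;}' > t.c
    $CC -marm -march=armv5 t.c -o t && echo "compiler OK"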
-
Find the library calls from FFMPEG command line
6 October 2013, by Budius
I'm trying to create an Android app that does video editing, using FFmpeg for the task. I have already successfully compiled FFmpeg as libraries (libavcodec, libavformat, etc.) and included them in the Android project.
Note that the build does not contain ffmpeg.c, which is what can be invoked as a command line, and the problem is that I only know the command lines for the different things I want to accomplish.
So the question is: from my Linux machine, how would I call ffmpeg's main() in a "debug mode" to follow line by line what is being called in those libraries, so I can write methods that mimic what I need to get done? (Currently I only have Android Studio installed, but I'm open to installing whatever IDE people might suggest.)
Procedure entry point couldn't be located in the DLL
6 October 2013, by Michael IV
I am trying to launch a release build (32-bit) of my application in VS2012. On startup I get this error:
"Entry Point Not found" The procedure entry point __glewMatrixTranslatefEXT could not be located in the dynamic link library avcodec-55.dll.
Now, I use the OpenGL GLEW extension loader library and the FFmpeg libraries in my software, and I can't see why the error expects __glewMatrixTranslatefEXT, which is an OpenGL function pointer defined by GLEW, to be present in avcodec-55.dll, which is an FFmpeg library. By the way, it doesn't happen in the debug build, and it doesn't happen on Linux.
UPDATE:
I finally solved it based on this post: setting /OPT:NOREF in the linker optimization settings removed the error. It would still be nice to get an explanation of why it happened.
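For reference, the same switch has a command-line form if the project is linked outside the IDE (a sketch; the object and library names here are placeholders, not the poster's actual project):

    link main.obj avcodec.lib glew32.lib opengl32.lib /OPT:NOREF /OUT:app.exe

/OPT:NOREF disables the linker's elimination of unreferenced functions and data, which release builds otherwise perform by default.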