Newest 'ffmpeg' Questions - Stack Overflow
-
No data written to stdin or stderr from ffmpeg
13 December 2017, by Mikkel Bachmann
I have a dummy client that is supposed to simulate a video recorder, and on this client I want to simulate a video stream. I have gotten far enough that I can create a video from bitmap images that I create in code.
The dummy client is a Node.js application running on a Raspberry Pi 3 with the latest version of Raspbian Lite.
In order to use the video I have created, I need to get ffmpeg to dump the video to pipe:1. The problem is that I need the
-f rawvideo
input parameter, otherwise ffmpeg can't understand my video; but when I have that parameter set, ffmpeg refuses to write anything to stdout.
ffmpeg is running with these parameters
ffmpeg -r 15 -f rawvideo -s 3840x2160 -pixel_format rgba -i pipe:0 -r 15 -vcodec h264 pipe:1
Can anybody help with a solution to my problem?
--Edit
Maybe I should explain a bit more. The system I am creating is to be set up in such a way that, instead of my stream server asking the video recorder for a video stream, it will be the recorder that tells the server that there is a stream.
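One thing worth checking (an assumption on my part, not confirmed by the question): with pipe:1 as the output, ffmpeg has no file extension to guess a container from, so an explicit output -f (for instance -f h264 for a raw H.264 elementary stream) is usually required, or ffmpeg emits nothing. A minimal sketch of the adjusted command, plus the exact byte count the rawvideo demuxer expects per RGBA frame:

```shell
# Sketch: raw RGBA frames in on stdin, H.264 out on stdout.
# Assumption: a raw H.264 elementary stream is acceptable downstream.
WIDTH=3840; HEIGHT=2160
# exact number of bytes the rawvideo demuxer expects per frame (4 bytes/pixel for rgba)
FRAME_SIZE=$((WIDTH * HEIGHT * 4))
echo "$FRAME_SIZE"   # 33177600

# ffmpeg -r 15 -f rawvideo -s ${WIDTH}x${HEIGHT} -pixel_format rgba -i pipe:0 \
#        -r 15 -vcodec h264 -f h264 pipe:1
```

Note that whatever process feeds pipe:0 must also keep draining pipe:1 concurrently; if stdout is not consumed, the pipe buffer fills and ffmpeg stalls without writing anything.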
-
FFMPEG ALSA xrun crash
13 December 2017, by Liam Martens
I'm running a YouTube RTMP stream using FFmpeg with x11grab and an ALSA loopback device, but sometimes, after say 20 hours, there is an ALSA xrun and the ffmpeg command crashes; I'm not sure why or how this happens. (Mind you, the ffmpeg command does not run continuously; it gets restarted automatically every so often, but the xrun makes the command crash, sometimes causing the stream to go offline because a crash restart is not fast enough.)
I'm using
thread_queue_size
and I've even manually compiled ffmpeg with a higher ALSA buffer size, but the issue still appears to persist. Besides this, I've also scoured many posts from people having similar issues, but these never really seem to end up resolved.
This is the stream command
ffmpeg -loglevel verbose -f alsa -thread_queue_size 12288 -ac 2 -i hw:Loopback,1,0 \ -probesize 10M -f x11grab -field_order tt -thread_queue_size 12288 -video_size 1280x720 -r 30 -i :1.1 \ -c:v libx264 -c:a libmp3lame -shortest -tune fastdecode -tune zerolatency \ -crf 26 -pix_fmt yuv420p -threads 0 -maxrate 2500k -bufsize 2500k -pass 1 -af aresample=async=1 \ -movflags +faststart -flags +global_header -preset ultrafast -r 30 -g 60 -b:v 2000k -b:a 192k -ar 44100 \ -f flv -rtmp_live live rtmp://a.rtmp.youtube.com/live2/{KEY}
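Since the xrun itself may be unavoidable, one mitigation is to shrink the offline window with a tighter supervision loop around the stream command. A minimal sketch (the stand-in stream_cmd and the retry limit are placeholders for illustration; a real wrapper would invoke the ffmpeg command above and back off for a few seconds between attempts):

```shell
# Minimal supervisor sketch: restart the stream command whenever it exits
# nonzero (e.g. after an ALSA xrun crash).
stream_cmd() { false; }   # stand-in for the real ffmpeg invocation above

attempts=0
until stream_cmd; do
  attempts=$((attempts + 1))
  [ "$attempts" -ge 3 ] && break   # give up after a few failures
  sleep 0                          # back-off; a real setup might sleep 1-5 s
done
echo "restarts: $attempts"
```

Running the wrapper under a process supervisor (systemd with Restart=on-failure, for example) achieves the same effect with less hand-rolled code.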
Log excerpt
ffmpeg version N-89463-gc7a5e80 Copyright (c) 2000-2017 the FFmpeg developers
  built with gcc 6.3.0 (Debian 6.3.0-18) 20170516
  configuration: --prefix=/usr --enable-avresample --enable-avfilter --enable-gpl --enable-libmp3lame --enable-librtmp --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-libtheora --enable-postproc --enable-pic --enable-pthreads --enable-shared --disable-stripping --disable-static --enable-vaapi --enable-libopus --enable-libfreetype --enable-libfontconfig --enable-libpulse --disable-debug
  libavutil      56.  5.100 / 56.  5.100
  libavcodec     58.  6.103 / 58.  6.103
  libavformat    58.  3.100 / 58.  3.100
  libavdevice    58.  0.100 / 58.  0.100
  libavfilter     7.  7.100 /  7.  7.100
  libavresample   4.  0.  0 /  4.  0.  0
  libswscale      5.  0.101 /  5.  0.101
  libswresample   3.  0.101 /  3.  0.101
  libpostproc    55.  0.100 / 55.  0.100
Guessed Channel Layout for Input Stream #0.0 : stereo
Input #0, alsa, from 'hw:Loopback,1,0':
  Duration: N/A, start: 1513163617.594224, bitrate: 1536 kb/s
    Stream #0:0: Audio: pcm_s16le, 48000 Hz, stereo, s16, 1536 kb/s
Input #1, x11grab, from ':1.1':
  Duration: N/A, start: 1513163617.632434, bitrate: N/A
    Stream #1:0: Video: rawvideo, 1 reference frame (BGR[0] / 0x524742), bgr0(top first), 854x480, 30 fps, 30 tbr, 1000k tbn, 1000k tbc
Parsing...
Parsed protocol: 0
Parsed host    : a.rtmp.youtube.com
Parsed app     : live2
RTMP_Connect1, ... connected, handshaking
HandShake: Type Answer   : 03
HandShake: Server Uptime : 0
HandShake: FMS Version   : 4.0.0.1
HandShake: Handshaking finished....
RTMP_Connect1, handshaked
Invoking connect
HandleServerBW: server BW = 2500000
HandleClientBW: client BW = 10000000 2
HandleChangeChunkSize, received: chunk size change to 256
HandleInvoke, server invoking <_result>
HandleInvoke, received result for method call
Invoking releaseStream
Invoking FCPublish
Invoking createStream
...
Invoking publish
HandleInvoke, onStatus: NetStream.Publish.Start
Stream mapping:
  Stream #1:0 -> #0:0 (rawvideo (native) -> h264 (libx264))
  Stream #0:0 -> #0:1 (pcm_s16le (native) -> mp3 (libmp3lame))
Press [q] to stop, [?] for help
[graph 0 input from stream 1:0 @ 0x5607d087e060] w:854 h:480 pixfmt:bgr0 tb:1/30 fr:30/1 sar:0/1 sws_param:flags=2
[auto_scaler_0 @ 0x5607d087d800] w:iw h:ih flags:'bicubic' interl:0
[format @ 0x5607d087ed40] auto-inserting filter 'auto_scaler_0' between the filter 'Parsed_null_0' and the filter 'format'
[auto_scaler_0 @ 0x5607d087d800] w:854 h:480 fmt:bgr0 sar:0/1 -> w:854 h:480 fmt:yuv420p sar:0/1 flags:0x4
[swscaler @ 0x5607d0880260] Warning: data is not aligned! This can lead to a speed loss
[libx264 @ 0x5607d08684e0] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
[libx264 @ 0x5607d08684e0] profile Constrained Baseline, level 3.1
[libx264 @ 0x5607d08684e0] 264 - core 148 r2748 97eaef2 - H.264/MPEG-4 AVC codec - Copyleft 2003-2016 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=0:0:0 analyse=0:0 me=dia subme=0 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=2 lookahead_threads=2 sliced_threads=1 slices=2 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=60 keyint_min=6 scenecut=0 intra_refresh=0 rc_lookahead=0 rc=crf mbtree=0 crf=26.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 vbv_maxrate=1500 vbv_bufsize=1500 crf_max=0.0 nal_hrd=none filler=0 ip_ratio=1.40 aq=0
[graph_1_in_0_0 @ 0x5607d091c840] tb:1/48000 samplefmt:s16 samplerate:48000 chlayout:0x3
[Parsed_aresample_0 @ 0x5607d0916b40] ch:2 chl:stereo fmt:s16 r:48000Hz -> ch:2 chl:stereo fmt:s16p r:44100Hz
Output #0, flv, to 'rtmp://a.rtmp.youtube.com/live2/{KEY}':
  Metadata:
    encoder         : Lavf58.3.100
    Stream #0:0: Video: h264 (libx264), 1 reference frame ([7][0][0][0] / 0x0007), yuv420p(top coded first (swapped)), 854x480, q=-1--1, 1000 kb/s, 30 fps, 1k tbn, 30 tbc
    Metadata:
      encoder         : Lavc58.6.103 libx264
    Side data:
      cpb: bitrate max/min/avg: 1500000/0/1000000 buffer size: 1500000 vbv_delay: -1
    Stream #0:1: Audio: mp3 (libmp3lame) ([2][0][0][0] / 0x0002), 44100 Hz, stereo, s16p, delay 1105, 192 kb/s
    Metadata:
      encoder         : Lavc58.6.103 libmp3lame
frame=   29 fps=0.0 q=17.0 size=     146kB time=00:00:00.94 bitrate=1267.3kbits/s speed=1.86x
frame=   44 fps= 44 q=18.0 size=     168kB time=00:00:01.46 bitrate= 942.4kbits/s speed=1.45x
frame=   60 fps= 40 q=16.0 size=     191kB time=00:00:01.96 bitrate= 794.8kbits/s speed= 1.3x
...
frame= 2740 fps= 30 q=17.0 size=    7993kB time=00:01:31.32 bitrate= 717.0kbits/s speed=   1x
frame= 2755 fps= 30 q=18.0 size=    8013kB time=00:01:31.82 bitrate= 714.9kbits/s speed=   1x
[alsa @ 0x5607d084d7e0] ALSA buffer xrun.
-
Find timestamps of videos split using FFmpeg
13 December 2017, by Spandan Thakur
I am able to successfully split a video with FFmpeg using the command below:
ffmpeg -i Sample.mp4 -c copy -map 0 -segment_time 900 -f segment -reset_timestamps 1 output%03d.mp4
This splits the video into 15-minute chunks (900 seconds); however, this is not 100% accurate.
Is there a way to add the starting and ending timestamp of each segment, in the output file name, relative to the original file? Or generate a file containing the timestamps?
I want to know exactly where the cuts were made. I do not want to slow down the process, though, by forcing FFmpeg to cut at exactly 15 minutes.
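One option worth verifying against the segment muxer documentation: the muxer can write a listing of the actual cut points via -segment_list segments.csv -segment_list_type csv, where each row records a segment's file name together with its start and end time relative to the source. A small sketch that reads such a listing back (the two sample rows are made up for illustration):

```shell
# Made-up sample of what the segment muxer's CSV listing looks like:
# one "file,start,end" row per segment.
cat > segments.csv <<'EOF'
output000.mp4,0.000000,900.066667
output001.mp4,900.066667,1800.133333
EOF

# Report each segment's boundaries, rounded to centiseconds.
cuts=$(awk -F, '{ printf "%s: %.2fs -> %.2fs\n", $1, $2, $3 }' segments.csv)
echo "$cuts"
rm -f segments.csv
```

The end time of one row and the start time of the next show exactly where each keyframe-aligned cut landed, without forcing ffmpeg to re-encode for exact 15-minute splits.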
-
Unable to port FFmpeg C library into android
13 December 2017, by Navjot Bedi
What I exactly want to do: access the ffmpeg.c file to modify the int main(int argc, char **argv) function into a JNI function, passing the ffmpeg command as a string.
I have tried to port the FFmpeg C library to Android (ARM processor). I tried the following different ways to do this.
1st try: using the official FFmpeg installation documentation. Steps as follows:
a) git clone git://source.ffmpeg.org/ffmpeg.git ffmpeg
b) Read the INSTALL file.
c) Download the x264 library and build it using build_x264.sh, which builds successfully:
NDK=/home/nav/Work/android/ndk
PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.4.3/prebuilt
PLATFORM=$NDK/platforms/android-8/arch-arm
PREFIX=/home/nav/28ffmpeg/android-ffmpeg
./configure --prefix=$PREFIX --enable-static --enable-pic --disable-asm --disable-cli --host=arm-linux --cross-prefix=$PREBUILT/linux-x86/bin/arm-linux-androideabi- --sysroot=$PLATFORM
make
sudo make install
sudo ldconfig
d) Then I downloaded the FFmpeg source from http://ffmpeg.org/releases/ffmpeg-snapshot.tar.bz2 and built it using build_android.sh:
#!/bin/bash
NDK=/home/nav/Work/android/ndk
PLATFORM=$NDK/platforms/android-8/arch-arm
PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.4.3/prebuilt/linux-x86
PREFIX=/home/nav/28ffmpeg/android-ffmpeg
function build_one
{
./configure --target-os=linux --prefix=$PREFIX \
    --enable-cross-compile \
    --enable-runtime-cpudetect \
    --disable-asm \
    --arch=arm \
    --cc=$PREBUILT/bin/arm-linux-androideabi-gcc \
    --cross-prefix=$PREBUILT/bin/arm-linux-androideabi- \
    --disable-stripping \
    --nm=$PREBUILT/bin/arm-linux-androideabi-nm \
    --sysroot=$PLATFORM \
    --enable-nonfree \
    --enable-version3 \
    --disable-everything \
    --enable-gpl \
    --disable-doc \
    --enable-avresample \
    --enable-demuxer=rtsp \
    --enable-muxer=rtsp \
    --disable-ffplay \
    --disable-ffserver \
    --enable-ffmpeg \
    --disable-ffprobe \
    --enable-libx264 \
    --enable-encoder=libx264 \
    --enable-decoder=h264 \
    --enable-protocol=rtp \
    --enable-hwaccels \
    --enable-zlib \
    --disable-devices \
    --disable-avdevice \
    --extra-cflags="-I/home/android-ffmpeg/include -fPIC -DANDROID -D__thumb__ -mthumb -Wfatal-errors -Wno-deprecated -mfloat-abi=softfp -mfpu=vfpv3-d16 -marm -march=armv7-a" \
    --extra-ldflags="-L/home/android-ffmpeg/lib"
make -j4 install
$PREBUILT/bin/arm-linux-androideabi-ar d libavcodec/libavcodec.a inverse.o
$PREBUILT/bin/arm-linux-androideabi-ld -rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib -L$PREFIX/lib -soname libffmpeg.so -shared -nostdlib -z,noexecstack -Bsymbolic --whole-archive --no-undefined -o $PREFIX/libffmpeg.so libavcodec/libavcodec.a libavfilter/libavfilter.a libavresample/libavresample.a libavformat/libavformat.a libavutil/libavutil.a libswscale/libswscale.a -lc -lm -lz -ldl -llog -lx264 --warn-once --dynamic-linker=/system/bin/linker $PREBUILT/lib/gcc/arm-linux-androideabi/4.4.3/libgcc.a
}
build_one
e) Initially it works perfectly, but then this error comes:
libavformat/libavformat.a(log2_tab.o):(.rodata+0x0): multiple definition of `ff_log2_tab'
libavcodec/libavcodec.a(log2_tab.o):(.rodata+0x0): first defined here
libavutil/libavutil.a(log2_tab.o):(.rodata+0x0): multiple definition of `ff_log2_tab'
libavcodec/libavcodec.a(log2_tab.o):(.rodata+0x0): first defined here
build_android.sh: 48: build_one: not found
Result: libffmpeg.so not found.
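For the ff_log2_tab clash specifically, a commonly reported workaround (an assumption here, not verified against this exact tree) is that the same log2_tab.o member is archived into several FFmpeg static libraries, and deleting it from all but one archive with ar d before the final link removes the multiple definition. The sketch below demonstrates the ar d member deletion on a throw-away archive; against the real build it would target libavformat.a and libavutil.a, keeping libavcodec's copy (the script above already does the same thing for inverse.o):

```shell
# Demonstrate "ar d" (delete member) on a disposable archive.
printf 'x' > log2_tab.o           # stand-in files, not real objects
printf 'y' > other.o
ar rc libdemo.a log2_tab.o other.o   # build the archive
ar d  libdemo.a log2_tab.o           # drop the duplicated member
members=$(ar t libdemo.a)            # list what remains
echo "$members"
rm -f libdemo.a log2_tab.o other.o
```

On the real tree this would be, e.g., `$PREBUILT/bin/arm-linux-androideabi-ar d libavformat/libavformat.a log2_tab.o` and the same for libavutil.a, run after `make` and before the final ld step.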
2nd try: Then I followed the steps at http://dl.dropbox.com/u/22605641/ffmpeg_android/main.html -> Builds
a) I downloaded the pre-built libffmpeg.so from the above link.
b) Add libffmpeg.so to libs/armeabi/ .
c) Make Android.mk
include $(CLEAR_VARS)
LOCAL_MODULE := ffmpeg
LOCAL_SRC_FILES := libffmpeg.so
include $(PREBUILT_SHARED_LIBRARY)

LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE := main
LOCAL_STATIC_LIBRARIES += ffmpeg
LOCAL_SRC_FILES := ffmpeg-test.c
include $(BUILD_SHARED_LIBRARY)
d) Then I did all the NDK setup, copied ffmpeg.c from the library to ffmpeg-test.c, changed its int main function to my JNI function, and included all the necessary include files.
Error:
Console:
/home/nav/Work/android/ndk/ndk-build all
Prebuilt       : libffmpeg.so <= jni/
Install        : libffmpeg.so => libs/armeabi/libffmpeg.so
Compile thumb  : main <= ffmpeg-test.c
jni/ffmpeg-test.c: In function 'print_report':
jni/ffmpeg-test.c:1139:94: warning: incompatible implicit declaration of built-in function 'log2' [enabled by default]
SharedLibrary  : libmain.so
/home/nav/Work/android/ndk/toolchains/arm-linux-androideabi-4.6/prebuilt/linux-x86/bin/../lib/gcc/arm-linux-androideabi/4.6/../../../../arm-linux-androideabi/bin/ld: ./obj/local/armeabi/objs/main/ffmpeg-test.o: in function check_keyboard_interaction:jni/ffmpeg-test.c:2496: error: undefined reference to 'qp_hist'
and many other undefined references.
-
Android : How to make camera capture video only of preview size and not full screen ?
13 December 2017, by Ashutosh Tiwari
I am working on an app that needs to capture square videos, just like Instagram. I used many libraries for this purpose, but none helped to capture the video itself in a square size. I can make the preview look square, but not the recorded video. For that I have to use the FFmpeg library to crop the video after it has been captured. But this process takes too long for 1-minute videos, even with 480p capture quality. Someone please guide me to achieve this task.
Libraries that I have used: https://github.com/natario1/CameraView for the camera view and http://writingminds.github.io/ffmpeg-android-java/ for FFmpeg.
Commands that I have tried for the FFmpeg cropping operation are:
command = new String[]{"-y", "-f", "concat", "-safe", "0", "-i", "" + sdCardPathFile,
        "-c:v", "libx264", "-vf", "crop=" + getVideoResolution(),
        "-preset", "ultrafast", "-qscale", "0", "-crf", "28",
        "-c:a", "copy", "-flags", "+global_header",
        "" + joinedVideoFile.getAbsolutePath()};

String[] joinCommand = new String[]{"-y", "-f", "concat", "-safe", "0", "-i", "" + sdCardPathFile,
        "-filter:v", "crop=480:480", "-preset", "superfast",
        "-c:a", "copy", "" + joinedVideoFile.getAbsolutePath()};
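For reference, the crop filter takes crop=w:h:x:y, and a centered square of side S on a W x H frame sits at x=(W-S)/2, y=(H-S)/2. A small sketch computing that geometry (the 640x480 input size is an assumption for illustration; the resulting string is what would be passed to -vf):

```shell
# Compute a centered square crop geometry for the crop filter.
W=640; H=480; S=480   # assumed capture size and target square side
geometry="crop=${S}:${S}:$(((W - S) / 2)):$(((H - S) / 2))"
echo "$geometry"      # crop=480:480:80:0

# e.g. ffmpeg -i in.mp4 -vf "$geometry" -preset ultrafast -c:a copy out.mp4
```

Capturing at the lowest resolution that still covers the square, and doing the crop in the same single encode pass, keeps the post-processing time as short as the encoder allows.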