Newest 'x264' Questions - Stack Overflow
Articles published on the site
-
First/single frame encoded with ffmpeg/libavcodec library cannot be immediately decoded
11 September 2023, by Marek Kijo
I'm using the libavcodec library and the h264 codec to prepare a video stream on one end, transmit the encoded frames to another PC, and decode it there.
What I noticed is that after receiving the very first packet (the first encoded video frame) and feeding the decoder with it, it is not possible to decode that frame. Only when I receive another frame can the first one be decoded, but the 'current' one cannot. So in the end I constantly have a one-frame delay on the decoder side.
I have tried different presets (focusing on 'ultrafast'), also the 'zerolatency' tune, and a whole variety of bit_rate values of AVCodecContext. I also tried to flush (with a nullptr packet) after injecting the first frame's data, just to check whether it is caused by some internal buffer optimization - the frame is still not decoded. Experimenting with other codecs (like mpeg4) gives an even worse delay, in terms of the number of frames, before the first frames become decodable.
Is this normal and unavoidable because of some internal mechanism? If not, how can I achieve real zero latency?
Supplementary setup information:
- max_b_frames set to 0 (a higher value gives even more delay)
- pix_fmt set to AV_PIX_FMT_YUV420P
edit:
Answering some questions from the comments:
(1) What is the decoder (or playback system)?
Custom decoder written using libavcodec, the decoded frames are later displayed on screen by OpenGL.
- initialization:
```cpp
parser_ = av_parser_init(AV_CODEC_ID_H264);
codec_ = avcodec_find_decoder(AV_CODEC_ID_H264);
context_ = avcodec_alloc_context3(codec_);
context_->width = 1024;
context_->height = 768;
context_->thread_count = 1;

if ((codec_->capabilities & AV_CODEC_CAP_TRUNCATED) == 0)
{
    context_->flags |= AV_CODEC_FLAG_TRUNCATED;
}

if (avcodec_open2(context_, codec_, nullptr) < 0)
{
    throw std::runtime_error{"avcodec_open2 failed"};
}

avcodec_flush_buffers(context_);
```
- then the player periodically calls a method of the decoder that is supposed to check whether another frame can be retrieved and displayed:
```cpp
auto result = avcodec_receive_frame(context_, frame_);

if (!buffer_.empty())
{
    // upload another packet for decoding
    int used;
    if (upload_package(buffer_.data(), buffer_.size(), &used))
    {
        buffer_.erase(buffer_.begin(), buffer_.begin() + used);
    }
}

if (result == AVERROR(EAGAIN) || result == AVERROR_EOF || result < 0)
{
    return false;
}

yuv_to_rgb();
return true;
```
The boolean return value indicates whether decoding succeeded; on every call, the buffer where the incoming packets are stored is checked and its contents uploaded to the libavcodec decoder.
- and this is what the method that uploads the buffer looks like:
```cpp
bool upload_package(const void* data, const std::size_t size, int* used)
{
    auto result = av_parser_parse2(parser_, context_,
                                   &packet_->data, &packet_->size,
                                   reinterpret_cast<const std::uint8_t*>(data), size,
                                   AV_NOPTS_VALUE, AV_NOPTS_VALUE, 0);
    if (result < 0)
    {
        return false;
    }

    *used = result;

    if (packet_->size != 0)
    {
        result = avcodec_send_packet(context_, packet_);
        if (result < 0)
        {
            return false;
        }
    }

    return true;
}
```

(2) If possible, save each one as a .bin file and then share the links with us for testing.
I will try to figure out something...
(3) Show example C++ code of your encoder settings for H264...
- initialization:
```cpp
codec_ = avcodec_find_encoder(AV_CODEC_ID_H264);
context_ = avcodec_alloc_context3(codec_);
context_->bit_rate = 1048576; // 1 Mbit
context_->width = 1024;
context_->height = 768;
context_->time_base = {1, 30}; // 30 fps
context_->pix_fmt = AV_PIX_FMT_YUV420P;
context_->thread_count = 1;

av_opt_set(context_->priv_data, "preset", "ultrafast", 0);
av_opt_set(context_->priv_data, "tune", "zerolatency", 0);

avcodec_open2(context_, codec_, nullptr);

frame_->format = AV_PIX_FMT_YUV420P;
frame_->width = 1024;
frame_->height = 768;

av_image_alloc(frame_->data, frame_->linesize, 1024, 768, AV_PIX_FMT_YUV420P, 32);
```
- frame encoding:
```cpp
rgb_to_yuv();
frame_->pts = frame_num_++;

auto result = avcodec_send_frame(context_, frame_);
while (result >= 0)
{
    result = avcodec_receive_packet(context_, packet_);
    if (result == AVERROR(EAGAIN) || result == AVERROR_EOF)
    {
        return;
    }
    else if (result < 0)
    {
        throw std::runtime_error{"avcodec_receive_packet failed"};
    }

    // here the packet is sent to the decoder; the whole packet is stored in
    // the previously mentioned buffer_ and uploaded with avcodec_send_packet
    // (the whole buffer/packet is uploaded at once)
    stream_video_data(packet_->data, packet_->size);
}

av_packet_unref(packet_);
```
edit2:
I think I figured out the issue that I had.
For every incoming data packet (encoded frame) I was first calling av_parser_parse2 and then sending the data through avcodec_send_packet. But I was not repeating that procedure once buffer_ was empty, so for the first frame's data av_parser_parse2 was never called again after the data had been uploaded through avcodec_send_packet. For the second frame it was called, and the first frame was parsed, so the first frame could be properly decoded - but for that (second) frame parse2 was again not called, and so on...
So the issue in my case was the wrong sequence of av_parser_parse2 and avcodec_send_packet when handling the encoded data.
-
building ffmpeg for windows x264.h : No such file or directory
22 August 2023, by 王伯荣
I am building ffmpeg using MSYS2 on a Windows system. My compilation package is the "i686-8.1.0-release-posix-dwarf" build of mingw. My configure option parameters are as follows:
```shell
./configure --prefix="D:\fkit\b" --disable-static --enable-shared --enable-libx264 --enable-gpl --enable-version3 --enable-nonfree --enable-ffserver --enable-ffplay --disable-debug --disable-iconv --disable-zlib --extra-ldflags="-LD:\fkit\xb\lib -lx264" --extra-cflags="-ID:\ fkit\xb\include" --pkgconfigdir="D:\fkit\xb\lib\pkgconfig"
```
As you can see, "D:\fkit\xb" is my libx264 installation directory. I configured --extra-cflags and --extra-ldflags, and the configure script executed fine. But when I execute mingw32-make.exe, it reports:

libavcodec/libx264.c:36:10: fatal error: x264.h: No such file or directory

I don't have any ideas, because everything I do seems normal.
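One thing worth double-checking in the configure line quoted above: --extra-cflags="-ID:\ fkit\xb\include" contains a stray space after "D:\", so the compiler is told to search D:\ plus a non-existent second path, which would produce exactly this x264.h: No such file or directory error at build time even though configure itself succeeds. Under MSYS2, forward slashes are also generally safer than backslashes. A corrected invocation might look like this (a sketch; paths taken from the question, and note that --enable-ffserver only exists in older FFmpeg releases):

```shell
./configure --prefix="D:/fkit/b" \
    --disable-static --enable-shared \
    --enable-libx264 --enable-gpl --enable-version3 --enable-nonfree \
    --enable-ffserver --enable-ffplay \
    --disable-debug --disable-iconv --disable-zlib \
    --extra-cflags="-ID:/fkit/xb/include" \
    --extra-ldflags="-LD:/fkit/xb/lib -lx264" \
    --pkgconfigdir="D:/fkit/xb/lib/pkgconfig"
```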
-
Build FFMPEG with x264 for Android
26 May 2023, by Kage
I am trying to build FFMPEG with libx264 for Android.
I can successfully build and use FFMPEG for Android but I realized that I need the ability to encode, therefore I am trying to build FFMPEG with x264.
I am using this tutorial to build FFmpeg for Android http://www.roman10.net/how-to-build-ffmpeg-for-android/
When trying to build FFMPEG I get an error:
"ERROR: libx264 not found"
And in my log it says:
"/usr/local/lib/libx264.a: could not read symbols: Archive has no index; run ranlib to add one..."
I have the latest versions of both FFMPEG and x264. I understand that FFMPEG looks for the header and libraries in usr/lib and usr/include, so in order to make it find x264 I use the cflags and ldflags:
- --extra-cflags = " -I/usr/local/include "
- --extra-ldflags = " -L/usr/local/lib "
I have tried building x264 with many different options that other people on the internet have suggested, e.g. --enable-shared, --enable-static, --disable-pthreads, etc. Some forums say enable this; others say disable that.
Any help would be much appreciated, Thanks
EDIT:
If I build FFmpeg with the simplest commands to include libx264, then it works, i.e.:
```shell
./configure --enable-gpl --enable-libx264 --extra-cflags="-I/usr/local/include" --extra-ldflags="-L/usr/local/lib" --enable-static --enable-shared
```
However I need it to work for Android. The script I am using is:
```shell
NDK=~/Desktop/android-ndk-r7
PLATFORM=$NDK/platforms/android-8/arch-arm/
PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.4.3/prebuilt/darwin-x86

function build_one
{
./configure --target-os=linux \
    --prefix=$PREFIX \
    --enable-cross-compile \
    --enable-shared \
    --enable-static \
    --extra-libs="-lgcc" \
    --arch=arm \
    --cc=$PREBUILT/bin/arm-linux-androideabi-gcc \
    --cross-prefix=$PREBUILT/bin/arm-linux-androideabi- \
    --nm=$PREBUILT/bin/arm-linux-androideabi-nm \
    --sysroot=$PLATFORM \
    --extra-cflags=" -O3 -fpic -DANDROID -DHAVE_SYS_UIO_H=1 -Dipv6mr_interface=ipv6mr_ifindex -fasm -Wno-psabi -fno-short-enums -fno-strict-aliasing -finline-limit=300 $OPTIMIZE_CFLAGS -I/usr/local/include" \
    --extra-ldflags="-Wl,-rpath-link=$PLATFORM/usr/lib -L $PLATFORM/usr/lib -nostdlib -lc -lm -ldl -llog -L/usr/local/lib " \
    --enable-gpl \
    --enable-libx264 \
    --disable-everything \
    --enable-demuxer=mov \
    --enable-demuxer=h264 \
    --disable-ffplay \
    --enable-protocol=file \
    --enable-avformat \
    --enable-avcodec \
    --enable-decoder=rawvideo \
    --enable-decoder=mjpeg \
    --enable-decoder=h263 \
    --enable-decoder=mpeg4 \
    --enable-decoder=h264 \
    --enable-encoder=mjpeg \
    --enable-encoder=h263 \
    --enable-encoder=mpeg4 \
    --enable-encoder=h264 \
    --enable-parser=h264 \
    --disable-network \
    --enable-zlib \
    --disable-avfilter \
    --disable-avdevice \
    $ADDITIONAL_CONFIGURE_FLAG
make clean
make -j4 install
$PREBUILT/bin/arm-linux-androideabi-ar d libavcodec/libavcodec.a inverse.o
$PREBUILT/bin/arm-linux-androideabi-ld -rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib -soname libffmpeg.so -shared -nostdlib -z,noexecstack -Bsymbolic --whole-archive --no-undefined -o $PREFIX/libffmpeg.so libavcodec/libavcodec.a libavformat/libavformat.a libavutil/libavutil.a libswscale/libswscale.a -lc -lm -lz -ldl -llog --warn-once --dynamic-linker=/system/bin/linker $PREBUILT/lib/gcc/arm-linux-androideabi/4.4.3/libgcc.a
}

CPU=armv7-a
OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=vfpv3-d16 -marm -march=$CPU "
PREFIX=./android/$CPU
ADDITIONAL_CONFIGURE_FLAG=
build_one
```
I am guessing that some option in my configure command is conflicting with enabling libx264
NOTE: If I remove --enable-libx264 then it works
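The log line quoted earlier is the real clue here: /usr/local/lib/libx264.a is a build for the host machine, so the NDK's ARM linker cannot use it, and FFmpeg's configure then reports "libx264 not found". x264 would need to be cross-compiled with the same NDK toolchain first, and FFmpeg pointed at that copy instead of /usr/local. A sketch, reusing the toolchain paths from the question (x264's configure options vary between versions, so treat the exact flags as assumptions):

```shell
# Cross-compile x264 with the same NDK toolchain that FFmpeg will use
NDK=~/Desktop/android-ndk-r7
PLATFORM=$NDK/platforms/android-8/arch-arm
PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.4.3/prebuilt/darwin-x86

cd x264
./configure --enable-static \
    --host=arm-linux \
    --cross-prefix=$PREBUILT/bin/arm-linux-androideabi- \
    --sysroot=$PLATFORM \
    --prefix=$PWD/android-install
make -j4
make install

# Then point FFmpeg's configure at this copy instead of /usr/local:
#   --extra-cflags="... -I$PWD/android-install/include"
#   --extra-ldflags="... -L$PWD/android-install/lib"
```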
-
Error encountered during encoding with 1 thread and decoding with multiple threads using FFmpeg in C
17 May 2023, by Rachel
I'm currently working on a project where I'm using FFmpeg in the C programming language for encoding and decoding video. I have configured the encoder to use 1 thread and the decoder to use multiple threads (specifically, 23 threads). However, when I run the code, I encounter the following error:
avcodec_receive_frame failure - Resource temporarily unavailable
Here are the relevant code snippets:
Encoder:
```c
pt_handle->t_x264_param.i_threads = 1;
pt_handle->pt_x264_encoder = x264_encoder_open(&pt_handle->t_x264_param);
```
Decoder:
```c
pt_handle->pt_avcodec_ctx->thread_count = 23;
i_retval = avcodec_open2(pt_handle->pt_avcodec_ctx, pt_h264_decoder, NULL);
```
I'm unsure about the cause of this error and how to resolve it. Any insights or suggestions on how to address this issue would be greatly appreciated. Thank you!
-
Gstreamer convert and display video v4l2 - tee problems in rust
27 March 2023, by d3im
I have a USB grabber v4l2 source and I want to tee the stream to an autovideosink and through x264enc to a file (for now a fake black-hole sink).
When I disable one branch or the other it works, but with both together the pipeline goes:
Pipeline state changed from Null to Ready Pipeline state changed from Ready to Paused
and stays there; it never switches to Playing.
gst-launch-1.0 with similar functionality works well.
```rust
gst::Element::link_many(&[&pw_video, &v_caps, &vid_queuey, &vid_tee]).unwrap();
gst::Element::link_many(&[&vid_queue1, &autovideoconvert, &vid_queuex, &autovideosink]).unwrap();
gst::Element::link_many(&[&vid_queue2, &autovideoconvert_x264, &vid_queue3, &x264, &vid_queue4, &fake]).unwrap();

let tee_display_pad = vid_tee.request_pad_simple("src_10").unwrap();
let vid_queue1_pad = vid_queue1.static_pad("sink").unwrap();
tee_display_pad.link(&vid_queue1_pad).unwrap();

let tee_convert_pad = vid_tee.request_pad_simple("src_20").unwrap();
let vid_queue2_pad = vid_queue2.static_pad("sink").unwrap();
tee_convert_pad.link(&vid_queue2_pad).unwrap();
```
How can I use tee in rust properly to have playable pipeline with two branches?
Update: I read some posts about increasing queue sizes, so I tried it for this queue and then for all queues:
```rust
let vid_queue1 = gst::ElementFactory::make("queue")
    .name("queue1")
    .property("max-size-buffers", 5000 as u32)
    .property("max-size-bytes", 1048576000 as u32)
    .property("max-size-time", 60000000000 as u64)
    .build()
    .expect("queue1");
```
but it didn't help, so I tried setting zero latency:
```rust
let x264 = gst::ElementFactory::make("x264enc")
    .name("x264")
    .property_from_str("speed-preset", "ultrafast")
    .property_from_str("pass", "qual")
    .property_from_str("tune", "zerolatency")
    .property("quantizer", 0 as u32)
    .property("threads", 8 as u32)
    .build()
    .expect("!x264");
```
and it works now. But the comparable gst-launch-1.0 setup didn't have such an option - only the queue sizes were increased.
Is there any other option than setting zerolatency?