Newest 'ffmpeg' Questions - Stack Overflow
-
Forcing custom H.264 intra-frames (keyframes) at encode-time?
June 1, 2017, by Henry Cooke
I have a video sequence that I'd like to skip to specific frames at playback time (my player is implemented using AVPlayer on iOS, but that's incidental). Since these frames will fall at unpredictable intervals, I can't use the standard "keyframe every N frames/seconds" functionality present in most video encoders. I do, however, know the target frames in advance.
In order to do this skipping as efficiently as possible, I need to force the target frames to be I-frames at encode time, ideally in some kind of GUI that would let me scrub to a frame, mark it as a keyframe, and then (re)encode my video.
If such a tool isn't available, I have the feeling this could probably be done by rolling a custom encoder with libavcodec, but I'd rather use a higher-level (and preferably scriptable) tool to do the job if a GUI isn't possible. Is this the kind of task ffmpeg or mencoder can be bent to?
Does anybody have a technique for doing this? Also, it's entirely possible that this is an impossible task because of some fundamental ignorance I have of the H.264 codec. If so, please do put me right.
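For what it's worth, this is exactly the kind of task ffmpeg can be bent to: the -force_key_frames option takes a list of timestamps (or an expression) and forces I-frames at those points, so no custom libavcodec encoder is needed. A minimal sketch, where the file names and timestamps are placeholders:
ffmpeg -i input.mp4 \
  -force_key_frames 0:00:04.000,0:00:09.520,0:01:23.000 \
  -c:v libx264 -c:a copy output.mp4
Because the timestamp list is just a command-line argument, this is also scriptable: generate the list from your known target frames and substitute it in.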
-
Portable YUV Drawing Context
June 1, 2017, by Leif Andersen
I have a stream of YUV data (from a video file) that I want to draw to the screen in real time. (Basically, I want to write a program that plays the video in real time.)
As such, I am looking for a portable way to send YUV data to the screen, so that I don't have to reimplement it for every major platform.
I have found a few options, but all of them seem to have significant issues. They are:
- Use OpenGL directly, converting the YUV data to RGB. (And using the single quad for the whole screen trick.)
This obviously won't work, because converting YUV to RGB on the CPU is going to be too slow for displaying images in real time.
- Use OpenGL, but use a shader to convert the YUV stream to RGB.
This option is a bit better, although the problem here (afaict) is that it involves making two streams and splicing them together. It might work, but may have issues at larger resolutions. (See the shader sketch after this list.)
- Instead use SDL, which has the option of creating a YUV context directly.
The problem with this is that I'm already using a cross-platform widget library for other aspects of my program (such as playback controls). As far as I can tell, SDL only opens up in its own (possibly borderless) window. I would ideally like my controls and drawing context to be in the same window, which I can do with OpenGL, but not with SDL.
- Use SDL, and also use something like Qt for the on-screen widgets, with a message-passing protocol to communicate between the two libraries. Have the (borderless) SDL window constantly move itself on top of the OpenGL window.
While this approach is clever, it seems like the two windows could easily get out of sync, making the user experience sub-optimal.
- Forget a cross-platform library, do things OS-specific, making use of hardware acceleration if present.
This is a fine solution, although it's not cross-platform.
As such, is there any good way to draw YUV data to a screen that ideally is:
- Portable (at least to the major platforms).
- Fast enough to be real time.
- Allows other widgets in the same window.
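On option 2, the shader route is a well-worn technique: upload Y, U, and V as three single-channel textures and combine them per fragment, so the conversion happens on the GPU rather than the CPU. A minimal fragment-shader sketch in GLSL (BT.601 full-range coefficients; the sampler and varying names are placeholders):
uniform sampler2D texY;   // luma plane
uniform sampler2D texU;   // chroma U plane (half size for 4:2:0)
uniform sampler2D texV;   // chroma V plane
varying vec2 vTexCoord;

void main() {
    float y = texture2D(texY, vTexCoord).r;
    float u = texture2D(texU, vTexCoord).r - 0.5;
    float v = texture2D(texV, vTexCoord).r - 0.5;
    // BT.601 full-range YUV -> RGB
    gl_FragColor = vec4(y + 1.402 * v,
                        y - 0.344 * u - 0.714 * v,
                        y + 1.772 * u,
                        1.0);
}
Since toolkits like Qt can host an OpenGL context inside an ordinary widget, this approach also keeps the playback controls and the video in the same window.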
-
Segmentation fault on ffmpeg sws_scale
June 1, 2017, by Zach Zundel
I'm trying to convert an AVFrame from a JPEG (YUV pixel format) to RGB24 format using ffmpeg's
sws_scale
function. I set up the SwsContext as follows:
struct SwsContext *sws_ctx = NULL;
int frameFinished;
AVPacket packet;

// initialize SWS context for software scaling
sws_ctx = sws_getContext(pCodecCtx->width, pCodecCtx->height,
                         pCodecCtx->pix_fmt,
                         pCodecCtx->width, pCodecCtx->height,
                         AV_PIX_FMT_RGB24,
                         SWS_BICUBIC,
                         NULL, NULL, NULL);
And then I perform the
sws_scale
with the following call:
sws_scale(sws_ctx, (uint8_t const * const *)pFrame->data,
          pFrame->linesize, 0, pCodecCtx->height,
          pFrameRGB->data, pFrameRGB->linesize);
which gives me a segfault, though I'm not sure why. I've tried examining the values with prints, and the heights, linesizes, and everything else all appear to have valid values.
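A frequent cause of exactly this segfault (an assumption; the allocation code isn't shown in the question) is that pFrameRGB was created with av_frame_alloc() but its pixel buffers were never allocated, so sws_scale writes through the NULL pointers in pFrameRGB->data. A sketch of the missing step:
#include <libavutil/imgutils.h>

AVFrame *pFrameRGB = av_frame_alloc();
// Allocate the destination buffers that sws_scale writes into; without
// this, pFrameRGB->data[0] is NULL and the call above segfaults.
if (av_image_alloc(pFrameRGB->data, pFrameRGB->linesize,
                   pCodecCtx->width, pCodecCtx->height,
                   AV_PIX_FMT_RGB24, 32) < 0) {
    /* handle allocation failure */
}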
-
How to render one character in a different color with ffmpeg?
June 1, 2017, by Fatas
How do I render one character in a different color with ffmpeg?
I have drawn text with ffmpeg, but I can't find how to change only one character's color. I can change the color of all the text, but I need one character in a different color.
I have this piece of code:
ffmpeg -f lavfi -i color=c=white:s=1280x720 \
  -vf "drawtext=text='aaaaaaaaBaaaaa':fontcolor=black:fontsize=24" \
  -pix_fmt yuv420p -t 30 -y out.mp4
But I can't figure out how to change the letter
B
to red.
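Since drawtext colors the whole string at once, the usual workaround is to chain two drawtext filters: draw the full line in black, then overdraw just the B in red at the right offset. With a monospace font the offset is predictable: 8 preceding characters times the glyph advance width. A sketch, where the font path and the 14-pixel glyph width are placeholder assumptions:
ffmpeg -f lavfi -i color=c=white:s=1280x720 \
  -vf "drawtext=fontfile=/path/to/mono.ttf:text='aaaaaaaaBaaaaa':fontcolor=black:fontsize=24:x=100:y=100,drawtext=fontfile=/path/to/mono.ttf:text='B':fontcolor=red:fontsize=24:x=100+8*14:y=100" \
  -pix_fmt yuv420p -t 30 -y out.mp4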
-
FFMPEG live buffer stream recording
June 1, 2017, by Kencana
I am trying to record a live video stream by feeding buffer data to ffmpeg. I would like ffmpeg to keep recording the last buffered frames while it's idling (no buffer data being passed). So I tried the
-stream_loop -1
option, but it doesn't seem to work. Here is a snippet of the ffmpeg command:
ffmpeg -y -f rawvideo -video_size 1024x768 -re -fflags +genpts \
  -stream_loop -1 -i - \
  -vcodec libx264 -pix_fmt yuv420p output.mp4
Any idea on this?
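One caveat worth knowing (an inference, not from the original post):
-stream_loop
works by seeking back to the start of the input, so it generally cannot loop a non-seekable pipe like -i -. A workaround sketch, where /tmp/capture.raw is a placeholder path: land the buffer in a seekable file first, then loop that file.
# 1. Capture the incoming raw buffer (here read from stdin) to a file.
cat > /tmp/capture.raw

# 2. Loop the file instead of the pipe.
ffmpeg -y -f rawvideo -video_size 1024x768 -re -fflags +genpts \
  -stream_loop -1 -i /tmp/capture.raw \
  -vcodec libx264 -pix_fmt yuv420p output.mp4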