23:25
I'd like to embed bounding boxes, labels, etc. into a live video stream so they can optionally be drawn on the client side in a web browser, and I'm looking for a way to do this.
Could you please give me a hint on how to insert the metadata on the fly, and which player could be used for it?
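For what it's worth, one common pattern is to leave the video stream itself untouched and deliver per-frame detections over a side channel (a WebSocket, or timed metadata cues such as the ID3 data that players like hls.js surface), keyed by presentation timestamp, then draw them on a canvas overlaying the player. A minimal sketch of the producer side; the message shape and field names here are purely illustrative, not any fixed protocol:

```python
import json

def encode_detections(pts_ms, boxes):
    """Serialize one frame's detections as a JSON message keyed by
    presentation timestamp. The client matches pts_ms against the
    player's current time and draws the boxes on a <canvas> overlay.
    (Illustrative shape only.)"""
    return json.dumps({
        "pts_ms": pts_ms,
        "boxes": [
            {"label": label, "xywh": list(xywh)}
            for label, xywh in boxes
        ],
    })

msg = encode_detections(41_708, [("person", (120, 80, 64, 128))])
decoded = json.loads(msg)
```

The advantage of the side-channel approach is that it works with any player and any streaming protocol, at the cost of keeping the channel roughly in sync with playback yourself.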
17:29
I'm trying to let the user export the scene as an MP4 (video format). The scene consists of a QGraphicsVideoItem and multiple QGraphicsTextItems, and I need to export it so the user can save the video with the text items. I've found one way to do this, but the issue is that it takes hours for a simple 5-second video, since it saves an image to bytes for every millisecond to build the video. If I change the step from milliseconds to seconds it would be faster, but the video would not look as smooth. Is there a more efficient way of doing (...)
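A typical fix for the speed problem is to render the scene only at the target frame rate (say 30 fps, i.e. 150 frames for a 5-second clip instead of 5000) and pipe the raw frames straight into one ffmpeg process instead of saving intermediate images. A sketch under those assumptions; `render_frame` is a hypothetical callback that paints the scene (video item plus text items) into raw BGRA bytes:

```python
import subprocess

def ffmpeg_rawvideo_cmd(width, height, fps, out_path):
    """ffmpeg command that reads raw BGRA frames from stdin and encodes
    them with libx264 -- one frame per 1/fps seconds, not per millisecond."""
    return [
        "ffmpeg", "-y",
        "-f", "rawvideo", "-pix_fmt", "bgra",
        "-s", f"{width}x{height}", "-r", str(fps),
        "-i", "-",                      # frames arrive on stdin
        "-c:v", "libx264", "-pix_fmt", "yuv420p",
        out_path,
    ]

def export_scene(render_frame, n_frames, width, height, fps, out_path):
    """render_frame(i) is a hypothetical callback returning the i-th
    frame of the rendered scene as bytes (width*height*4 BGRA)."""
    proc = subprocess.Popen(ffmpeg_rawvideo_cmd(width, height, fps, out_path),
                            stdin=subprocess.PIPE)
    for i in range(n_frames):
        proc.stdin.write(render_frame(i))
    proc.stdin.close()
    proc.wait()

cmd = ffmpeg_rawvideo_cmd(1280, 720, 30, "out.mp4")
```

Writing to a single pipe avoids both the per-image files and the per-image encode, which is usually where the hours go.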
17:17
I am running ffmpeg in Python as a subprocess. I want to burn subtitles into a video. Using just ffmpeg (on the Windows command line, without Python), the following works:
"ffmpeg.exe" -y -i "input_file" -vf "subtitles= \\'input_file_path\\' :si=0" -acodec copy "output_file_path"
The \\ escape characters are required by ffmpeg for special characters within a filter. However, trying to replicate this within a Python subprocess has proved problematic; here is one of many failed attempts:
command = (...)
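One thing worth keeping in mind: when the command is passed to `subprocess` as a list of arguments, there is no shell in between, so only ffmpeg's own filter escaping applies (backslashes, colons and quotes are special to the filtergraph parser). Escaping in ffmpeg is layered, so the exact form can still vary by setup; the helper below is a sketch of the single level that usually suffices with a list-form subprocess call, not a guaranteed recipe:

```python
import subprocess

def escape_filter_path(path):
    """Escape a path for use inside an ffmpeg filter option value.
    Backslashes must be doubled first, then colons and single quotes
    escaped, because all three are special to the filtergraph parser."""
    return (path.replace("\\", "\\\\")
                .replace(":", "\\:")
                .replace("'", "\\'"))

def burn_subtitles_cmd(video_in, subs_path, video_out):
    vf = f"subtitles={escape_filter_path(subs_path)}:si=0"
    # A list of args avoids shell quoting entirely -- no shell=True needed.
    return ["ffmpeg", "-y", "-i", video_in, "-vf", vf,
            "-acodec", "copy", video_out]

cmd = burn_subtitles_cmd("in.mp4", r"C:\subs\movie.mkv", "out.mp4")
# subprocess.run(cmd, check=True)
```

Printing the resulting `-vf` value and comparing it against the form that works on the plain command line is a quick way to see which escaping level is missing.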
16:30
I am recording desktop video and audio with ffmpeg, controlling the process from C#. Simplified, my ffmpeg arguments are "ffmpeg -f gdigrab -framerate 30 -i desktop -preset ultrafast -pix_fmt yuv420p output.mp4"
I have some functionality to pause and resume the recording, essentially just suspending and resuming the process. This works; however, it leaves the paused section in the video as a single frozen frame for the entire pause duration. To fix this, I added "-vf setpts=N/FR/TB" to the arguments. This resolved that issue, but while recording my framerate was 1.3 fps rather than 30 and the (...)
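An alternative that sidesteps both problems is to end the recording when the user pauses, start a new segment file on resume, and join the segments afterwards with ffmpeg's concat demuxer, which copies the streams without re-encoding (so there is no `setpts` filter slowing down capture). A sketch in Python; the C# translation is mechanical, and the file names are illustrative:

```python
def concat_list(segment_paths):
    """Build the text file ffmpeg's concat demuxer expects:
    one "file '<path>'" line per recorded segment."""
    return "".join(f"file '{p}'\n" for p in segment_paths)

def concat_cmd(list_path, out_path):
    # -f concat reads the list file; -c copy joins the segments
    # without re-encoding, so the result has no frozen gap.
    return ["ffmpeg", "-y", "-f", "concat", "-safe", "0",
            "-i", list_path, "-c", "copy", out_path]

listing = concat_list(["part1.mp4", "part2.mp4"])
cmd = concat_cmd("segments.txt", "joined.mp4")
```

The list contents are written to disk (e.g. `segments.txt`) before invoking the command; because each segment was recorded with its own real timestamps, no timestamp rewriting is needed.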
15:07
I use ffmpeg to format videos according to TikTok standards, but I don't know why the videos are not cropped and lose quality.
const transcode = async () => {
  const ffmpeg = ffmpegRef.current;
  await ffmpeg.writeFile('input.mp4', await fetchFile(file));
  await ffmpeg.exec([
    '-i', 'input.mp4',
    '-vf', 'scale=1080:1920,crop=ih*(9/16):ih',
    'output.mp4'
  ]);
  const data = await ffmpeg.readFile('output.mp4');
  videoRef.current.src =
    URL.createObjectURL(new Blob([data.buffer], (...)
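The likely reason nothing is cropped: in the filter chain above, `scale` runs before `crop`, so by the time `crop` evaluates `ih` the frame is already 1920 px tall, and `crop=ih*(9/16):ih` selects a 1080x1920 region of a 1080x1920 frame, i.e. a no-op. Cropping to 9:16 of the source first and scaling second behaves as intended, and the quality loss usually comes from encoding with defaults rather than an explicit codec and quality setting. A sketch of a corrected argument list, in the same array shape `ffmpeg.exec` receives (the `libx264`/CRF 18 choice is an assumption, not a TikTok requirement):

```python
def tiktok_args(infile="input.mp4", outfile="output.mp4"):
    """Crop to a 9:16 window of the SOURCE height first, then scale
    to 1080x1920. A real pipeline may need to round the crop width
    down to an even number for yuv420p sources."""
    vf = "crop=ih*(9/16):ih,scale=1080:1920"
    return ["-i", infile, "-vf", vf,
            "-c:v", "libx264", "-crf", "18",  # explicit quality (assumed)
            outfile]

args = tiktok_args()
```

The same reordered filter string can be dropped directly into the `'-vf'` slot of the JavaScript `exec` call.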
13:13
I have an OpenGL buffer that I need to forward directly to ffmpeg for NVENC-based H.264 encoding.
My current way of doing this is glReadPixels to get the pixels out of the framebuffer, then passing that pointer into ffmpeg so that it can encode the frame into H.264 packets for RTSP. However, this is wasteful because I have to copy bytes out of GPU RAM into CPU RAM, only to copy them back into the GPU for encoding.
12:20
I tried it and, sure enough, I'm not typing the correct command in exec. Please let me know if you have a video-resizing command for exec.
const compress = async () => {
  const videoURL =
    "../public/videos/20240508_095425.mp4";
  const ffmpeg = ffmpegRef.current;
  await ffmpeg.writeFile("input.mp4", await fetchFile(videoURL));
  await ffmpeg.exec([
    "-i",
    "input.mp4",
    "-c:v",
    "copy",
    "-c:a",
    "aac",
    "-b:a",
    "128k",
    "output.mp4",
  ]);
  const fileData = await ffmpeg.readFile("output.mp4");
  const data = new (...)
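The command above cannot resize anything: `-c:v copy` passes the video stream through untouched, so no filter or size change can apply. Resizing requires re-encoding the video with a `scale` filter. A sketch of an argument list that does resize, in the same array shape `ffmpeg.exec` takes (the 640 px target width and the `libx264` choice are assumptions):

```python
def resize_args(infile="input.mp4", outfile="output.mp4", width=640):
    """-c:v copy cannot resize; re-encode with a scale filter instead.
    scale=<w>:-2 keeps the aspect ratio and forces an even height,
    which yuv420p encoders require."""
    return ["-i", infile,
            "-vf", f"scale={width}:-2",
            "-c:v", "libx264",            # re-encode video (assumed codec)
            "-c:a", "aac", "-b:a", "128k",
            outfile]

args = resize_args()
```

Re-encoding also shrinks the file, which `-c:v copy` never could, since copying preserves the original bitstream byte for byte.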