20:06
I'm trying to let the user export the scene as an MP4 (video format). The scene consists of a QGraphicsVideoItem and multiple QGraphicsTextItem objects, and I need to export the scene so the user can save the video together with the text items. I've found one way to do this, but it takes hours for a simple 5-second video, because it grabs an image of the scene every millisecond and converts each image to bytes to build the video. If I grab one image per second instead it speeds up, but the video no longer looks smooth. Is there a more efficient way of doing (...)
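A couple of hedged sketches of faster approaches (file names here are hypothetical). Grabbing one frame per *frame interval* (e.g. every 33 ms at 30 fps) rather than per millisecond cuts the work by ~33x with no visible loss, and the frames can then be muxed in one ffmpeg pass. If the text items are static overlays, ffmpeg's drawtext filter can skip frame grabbing entirely:

```shell
# Sketch 1: frames rendered once per frame interval (30 fps) as
# frame_0001.png, frame_0002.png, ... then muxed with the original audio.
ffmpeg -framerate 30 -i frame_%04d.png \
       -i original_audio.wav \
       -c:v libx264 -pix_fmt yuv420p -c:a aac -shortest out.mp4

# Sketch 2: if the text items don't move, let ffmpeg burn the text
# directly onto the source video instead of re-rendering every frame.
ffmpeg -i source.mp4 \
       -vf "drawtext=text='My caption':x=50:y=50:fontsize=36:fontcolor=white" \
       -c:a copy out.mp4
```

Both assume ffmpeg is available on the PATH; the second also assumes drawtext support (ffmpeg built with libfreetype).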
16:44
I have installed ffmpeg (version 4) with Homebrew and I am trying to use the various ffmpeg libraries in a C++ project, but I am getting multiple errors during linking.
Undefined symbols for architecture x86_64:
"_av_free", referenced from:
_main in main.cpp.o
"_av_packet_alloc", referenced from:
_main in main.cpp.o
"_av_parser_init", referenced from:
And so on ...
I have included the libraries as follows (the angle-bracketed header names were eaten by the site's formatting; the includes are the usual FFmpeg headers):
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libavutil/avutil.h>
}
But still, this doesn't work. (...)
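"Undefined symbols" at link time usually means the headers were found but the libraries were never passed to the linker. One plausible build line for a Homebrew install, using pkg-config to supply both the include and linker flags (the PKG_CONFIG_PATH export is only needed if Homebrew's .pc files aren't already visible):

```shell
# Make Homebrew's ffmpeg pkg-config files visible, then compile and link.
export PKG_CONFIG_PATH="$(brew --prefix ffmpeg)/lib/pkgconfig:$PKG_CONFIG_PATH"
g++ main.cpp -o main $(pkg-config --cflags --libs libavcodec libavformat libavutil)
```

The equivalent manual flags would be `-I`/`-L` against the Homebrew prefix plus `-lavcodec -lavformat -lavutil`, but pkg-config is less fragile across ffmpeg versions.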
15:05
I am trying to convert an audio file generated by the Flutter "text to speech" package to an MP3 file, but it fails every time.
I have written the code below for the conversion, using the ffmpeg_kit_flutter package. The conversion fails every time, and it doesn't even show why it is failing.
I have searched Stack Overflow and other sites but could not find any relevant solutions. I am using VS Code as my editor. I have attached the flutter doctor output below as well. Could anyone please guide me? Let me know if you need more information.
List<String> command = [
(...)
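Without the full argument list it is hard to say what fails, but one useful sanity check is to run the equivalent conversion on the desktop CLI first; if this works in a terminal, the problem is in the arguments or file paths handed to FFmpegKit (paths below are hypothetical):

```shell
# Assumes the TTS package wrote a WAV/PCM file; convert it to MP3.
ffmpeg -y -i tts_output.wav -c:a libmp3lame -b:a 192k output.mp3
```

One common silent failure, if I recall the packaging correctly: not every ffmpeg_kit_flutter variant bundles an MP3 encoder. libmp3lame ships in the "audio" and "full" variants but not in "min", so an MP3 conversion on the min build fails with only a log-level hint.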
15:02
I'm working on a project where I have a canvas displaying gameplay, which is currently being live-streamed to the client canvas. Now, I want to stream this gameplay canvas to a third-party web server in real time using HLS or RTMP, but I'm unsure where to start. Could anyone provide some guidance or tips on how to achieve this?
Here's what I've tried so far:
I experimented with FFmpeg for the streaming, but didn't make much progress.
I have a system in place that streams the gameplay to the client canvas.
Any advice on how to set up the (...)
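One commonly used pipeline for this (sketched under assumptions, not a definitive design): capture the canvas with `canvas.captureStream()`, record it with `MediaRecorder`, ship the WebM chunks to a small relay server over a WebSocket, and have that server feed them into ffmpeg, which transcodes and pushes RTMP. The server-side ffmpeg invocation might look like this (the ingest URL and key are placeholders):

```shell
# Reads the recorded WebM stream on stdin (e.g. chunks relayed from
# MediaRecorder over a WebSocket) and pushes it to an RTMP ingest point.
ffmpeg -i pipe:0 \
       -c:v libx264 -preset veryfast -tune zerolatency -pix_fmt yuv420p \
       -c:a aac \
       -f flv rtmp://live.example.com/app/stream_key
```

RTMP expects H.264 in FLV, hence the transcode; for HLS you would instead use `-f hls` with a segment directory served over HTTP.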
07:30
I'm trying to run a scrcpy command exactly as I did before, and I know it worked, more than once:
scrcpy -m 540 -S
But after reinstalling my OS with Fedora (which uses the dnf package manager), I get the errors below.
WARN: Demuxer 'audio': stream explicitly disabled by the device
WARN: [FFmpeg] libopenh264.so.7: cannot open shared object file: No such file or directory: libopenh264.so.7 is missing, openh264 support will be disabled
ERROR: Demuxer 'video': could not open codec
ERROR: Demuxer error
[server] INFO: Device screen turned (...)
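The warning names `libopenh264.so.7`, which on Fedora lives in the separate fedora-cisco-openh264 repository; the stock `ffmpeg-free` packages also omit H.264 support for patent reasons. Two plausible fixes, depending on how scrcpy/ffmpeg were installed (both are assumptions about this particular setup, not a confirmed diagnosis):

```shell
# Option 1: install openh264 from Fedora's Cisco repository.
sudo dnf config-manager --set-enabled fedora-cisco-openh264
sudo dnf install -y openh264

# Option 2: replace the limited ffmpeg-free packages with RPM Fusion's
# full ffmpeg build (requires the RPM Fusion repos to be set up).
sudo dnf swap ffmpeg-free ffmpeg --allowerasing
```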
04:49
We've asked a freelancer to build a video encoder with FFmpeg for iOS, but there is a bug and the freelancer is no longer available. I'm very inexperienced with FFmpeg and video encoding and am trying to debug this error.
From what I understand, we're attempting to create an output file and write a header for it; however, the return value of avformat_write_header is always less than zero. If I comment it out, it still does not work.
- (BOOL)writeHeaderWithError:(NSError *__autoreleasing *)error {
AVDictionary *options = NULL;
// Write header for output file
int (...)
01:59
Could you please help me with the below?
What's wrong with the command below? I am not seeing an output file.
ffmpeg -y -framerate 30 -start_number 1 -loop 1 -i "C:\\Intel\\images%d.jpg" -i "C:\\Intel\\audio\\synthesized_audio.wav" -analyzeduration 2500000 -probesize 5000000 -c:v libx264 -pix_fmt yuv420p -c:a aac -strict experimental -shortest -muxers -f mpg "C:\\Intel\\output.mpg"
string imagesDirectory = @"C:\Intel\images";
string audioFile = @"C:\Intel\audio\synthesized_audio.wav";
string outputVideo = @"C:\Intel\output.mpg";
string ffmpegCommand =
$"ffmpeg -y (...)
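A hedged diagnosis of the command above: `-muxers` is a stand-alone informational option, so ffmpeg prints the muxer list and exits without writing anything, which would explain the missing output file. Two likely secondary issues: the image pattern `C:\Intel\images%d.jpg` matches files in `C:\Intel\` rather than in the `images` folder (a `\` before `%d` is probably intended), and the container name for `-f` is `mpeg`, not `mpg`; MP4 suits libx264 + AAC more naturally anyway. A corrected sketch under those assumptions:

```shell
# Sketch of a corrected command (assumes frames are C:\Intel\images\1.jpg, 2.jpg, ...).
ffmpeg -y -framerate 30 -start_number 1 -i "C:\Intel\images\%d.jpg" -i "C:\Intel\audio\synthesized_audio.wav" -c:v libx264 -pix_fmt yuv420p -c:a aac -shortest "C:\Intel\output.mp4"
```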