12:26
When I decode video frames with FFmpeg (avcodec_decode_video2(), sws_scale()), with some videos (e.g., ProRes4444), I get colors pre-multiplied by alpha, and with other videos (e.g., QuickTime PNG), I get colors that aren't pre-multiplied by alpha.
How can I tell whether the colors are pre-multiplied? Alternatively, how can I tell FFmpeg to always provide either pre-multiplied or un-pre-multiplied ("straight alpha") colors?
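FFmpeg does not tag decoded frames as premultiplied or straight, but a heuristic check is possible: with premultiplied alpha, no color channel can ever exceed the pixel's alpha value. A minimal sketch (assuming the frames were already converted to 8-bit RGBA, e.g. via sws_scale):

```python
def looks_straight_alpha(rgba_pixels):
    """Heuristic: in premultiplied alpha, every color channel is <= alpha,
    so any channel exceeding alpha proves the data is straight (un-premultiplied).
    rgba_pixels is an iterable of (r, g, b, a) byte tuples."""
    return any(max(r, g, b) > a for r, g, b, a in rgba_pixels)

# A pixel with color 200 but alpha 128 can only come from straight alpha.
print(looks_straight_alpha([(200, 10, 10, 128)]))  # True  -> straight alpha
print(looks_straight_alpha([(100, 10, 10, 128)]))  # False -> inconclusive
```

Note the check is only conclusive in one direction: a frame where all channels stay below alpha may still be straight alpha with dark colors. To normalize, FFmpeg's `unpremultiply` and `premultiply` filters can convert between the two conventions.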
12:17
I'm trying to create a video from a single image with a very specific duration of 0.09375 seconds using FFmpeg. I've tried various commands, but I can't seem to get the exact duration I need; the closest I've gotten is 0.080000 seconds. The duration doesn't always have to be exactly 0.09375, but I wanted a concrete example to work with.
I've also tried trimming a video, but from what I've read so far, the encoding of the video can be a problem. Even after trying different FFmpeg commands or using MoviePy directly, I've never arrived at the desired (...)
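For what it's worth, an exact duration is only reachable when it is a whole number of frames at the chosen frame rate: 0.09375 s is exactly 3/32 s, i.e. 3 frames at 32 fps. A sketch of the arithmetic (`Fraction` is exact here because 0.09375 is a power-of-two fraction):

```python
from fractions import Fraction

def frames_for_duration(seconds_str):
    """Express a target duration as an exact (frame count, frame rate) pair."""
    d = Fraction(seconds_str)        # "0.09375" -> 3/32, exactly
    return d.numerator, d.denominator

frames, fps = frames_for_duration("0.09375")
print(frames, fps)  # 3 frames at 32 fps -> exactly 0.09375 s
```

With those numbers, a command along the lines of `ffmpeg -loop 1 -framerate 32 -i image.png -frames:v 3 -pix_fmt yuv420p out.mp4` (filenames are placeholders) should emit exactly 3 frames of 1/32 s each; whether the container then reports precisely 0.09375 s still depends on its timebase.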
12:11
OS: Windows 11 Pro
IDE: Visual Studio 2022
PL: C++
Libraries: FFMPEG shared (own x64 Windows build with libmfx)
I am trying to encode frames from a framegrabber card with the Intel Quick Sync encoder using libmfx in FFMPEG shared / libav, but I always get an error. The "same" procedure using ffmpeg.exe from the same build works fine.
I increased the FFMPEG logging level to the maximum. avcodec_send_frame() always returns -22 (which means "Invalid argument"), and I get no further information from the log output / console. The only (...)
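As a side note, -22 is AVERROR(EINVAL): FFmpeg wraps plain POSIX errno values as negative numbers, so the code can be decoded mechanically (in C the usual tool is av_strerror()). A small lookup sketch:

```python
import errno
import os

def describe_averror(code):
    """FFmpeg's AVERROR(e) is -e for plain errno values, so -22 -> EINVAL."""
    e = -code
    return f"{errno.errorcode.get(e, 'unknown')}: {os.strerror(e)}"

print(describe_averror(-22))  # EINVAL: Invalid argument
```

With QSV specifically, EINVAL from avcodec_send_frame() usually points at a mismatch between the frame and the codec context (pixel format, dimensions, or a missing hw_frames_ctx) rather than at libmfx itself; comparing your setup against what ffmpeg.exe configures with -loglevel debug is a reasonable next step.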
11:03
I want to use FFMpegCore to convert some audio files to raw PCM. I noticed that this always cuts off ~1.5 seconds of my audio from the start. I checked my input stream and saved it to disk; it is all good. If I run ffmpeg from the CLI with the same arguments, everything seems fine. I tried -ss 0, with no luck. I see this behavior with .wav files (RIFF (little-endian) data, WAVE audio, Microsoft PCM, 16 bit, stereo 44100 Hz), and the same issue at different sample rates. MP3 input works fine.
public async Task ConvertToPcmStreamAsync(Stream inputStream)
{
    var outputStream = new MemoryStream();
    var (...)
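A fixed ~1.5 s loss at the start can be quantified: for 16-bit stereo 44.1 kHz PCM it corresponds to a specific number of bytes, and comparing that figure with the actual shortfall of the output stream tells you whether whole buffers are being dropped at the pipe boundary. The arithmetic:

```python
def pcm_bytes(seconds, sample_rate=44100, channels=2, bytes_per_sample=2):
    """Raw PCM size for a stretch of audio; useful for sizing the gap."""
    return int(seconds * sample_rate * channels * bytes_per_sample)

print(pcm_bytes(1.5))  # 264600 bytes for a 1.5 s gap at 44.1 kHz stereo s16
```

If the measured shortfall matches this figure, the bytes are likely lost before ffmpeg ever sees them (a classic cause is an input stream whose position was not rewound to 0 before piping); if the gap is a constant duration regardless of sample rate, a time-based drop inside the pipeline is more likely.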
12:57
CMake cannot find FFmpeg even though everything seems set up correctly. I used vcpkg to install ffmpeg.
I can't find any information on setting up ffmpeg with CMake. I'm on Windows 10 with Visual Studio.
I downloaded FindFFMPEG.cmake and put it in the ffmpeg folder, but that doesn't solve the issue. I also changed the folder name to all-capital FFMPEG, but that still doesn't solve it.
I get this error:
Severity Code Description Project File Line Suppression State Details
Error CMake Error at CMakeProject1/CMakeLists.txt:15 (find_package):
By not providing "FindFFMPEG.cmake" in (...)
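In case it helps: vcpkg-installed packages are only visible to CMake through vcpkg's toolchain file, and the vcpkg ffmpeg port supplies its own FindFFMPEG.cmake, so copying find modules around or renaming folders has no effect. A sketch of the usual setup (the vcpkg path is an assumption; adjust it to your install):

```cmake
# Configure with the vcpkg toolchain, e.g.:
#   cmake -B out -S . -DCMAKE_TOOLCHAIN_FILE=C:/vcpkg/scripts/buildsystems/vcpkg.cmake
find_package(FFMPEG REQUIRED)  # resolved via the FindFFMPEG.cmake that vcpkg provides
target_include_directories(CMakeProject1 PRIVATE ${FFMPEG_INCLUDE_DIRS})
target_link_directories(CMakeProject1 PRIVATE ${FFMPEG_LIBRARY_DIRS})
target_link_libraries(CMakeProject1 PRIVATE ${FFMPEG_LIBRARIES})
```

With Visual Studio's built-in CMake support, the toolchain file is set in CMakeSettings.json or CMakePresets.json rather than on the command line.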
10:10
I am trying to stream a static image on the network. Everything I have tried so far has failed:
Examples:
ffmpeg -y -stream_loop -1 -r 1 -i text2.png -vcodec libx264 -crf 17 -pix_fmt yuv420p -f mpegts udp://239.1.250.12:1234
The file is generated but is unplayable by VLC or any other TS-capable device. ffplay complains about input frame errors but still produces a picture after a few seconds.
I also tried this YouTube-oriented variant, writing to a local file:
ffmpeg -f image2 -loop 1 -i text1.jpg -re -f lavfi -i anullsrc -vf format=yuv420p -c:v libx264 -b:v 2000k -maxrate 2000k (...)
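For the record, the combination that usually works for looping a still into a live TS is the image2 `-loop 1` option plus a real frame rate and `-re` pacing (so the muxer gets a steady timestamp stream), with a forced GOP so receivers can join mid-stream. A hedged sketch building such a command (the address and filename are from the question; the exact flags are an assumption, not a verified fix):

```python
import subprocess

# Loop the image at a real frame rate, pace output with -re, and force a
# keyframe every 2 s (-g 50 at 25 fps) so TS receivers can sync.
cmd = [
    "ffmpeg", "-re", "-loop", "1", "-framerate", "25", "-i", "text2.png",
    "-c:v", "libx264", "-pix_fmt", "yuv420p", "-g", "50", "-crf", "17",
    "-f", "mpegts", "udp://239.1.250.12:1234",
]
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment where ffmpeg is on PATH
```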
09:46
I have the following command:
"ffmpeg -i video.mp4 -i input.png -filter_complex [1:v]scale=iw*(iw/920):-1[scaled];[0:v][scaled]overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2 -c:v libx264 -c:a aac -strict experimental -pix_fmt yuv420p -y output-scaling.mp4"
In the scale section it takes the width of input.png, but I want the width of video.mp4. Is this possible?
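One possibility: plain scale cannot see another input's size, but the scale2ref filter exposes the reference input's dimensions (main_w, main_h, …), so the PNG can be sized from the video's width. A sketch of the filtergraph (the expressions are an assumed translation of the intended iw*(main_w/920) scaling, with h=-1 to keep the PNG's aspect ratio):

```python
# scale2ref scales input [1:v] using the size of the reference [0:v] (the
# video), so main_w below is video.mp4's width rather than the PNG's.
filter_complex = (
    "[1:v][0:v]scale2ref=w=iw*main_w/920:h=-1[logo][base];"
    "[base][logo]overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2"
)
cmd = ["ffmpeg", "-i", "video.mp4", "-i", "input.png",
       "-filter_complex", filter_complex,
       "-c:v", "libx264", "-c:a", "aac", "-pix_fmt", "yuv420p",
       "-y", "output-scaling.mp4"]
print(" ".join(cmd))
```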
05:54
So I have upscaled videos of the Fallout 1 cinematics, but they're in .avi, and I'm using Fallout1in2, which has the cinematics in .mve, so I'm trying to convert the .avi to .mve with ffmpeg. I have used this command: "ffmpeg -i C:\\PATH_Programs\\BOIL1.AVI BOIL1.MVE", but I get this error: "ffmpeg unable to choose an output format use standard extension for filename or specify the format manually". Am I doing something wrong, or is it just not possible to do it this way?
08:22
I am using "ffmpeg_kit_flutter" to merge two videos with this code:
import 'dart:io';
import 'package:ffmpeg_kit_flutter/ffmpeg_kit.dart';
import 'package:ffmpeg_kit_flutter/abstract_session.dart';
import 'package:ffmpeg_kit_flutter/return_code.dart';
import 'package:flutter/material.dart';
import 'package:flutter_bloc/flutter_bloc.dart';
import 'package:wechat_assets_picker/wechat_assets_picker.dart';
import (...)
08:00
If I want to play a video stream over the network but the video and audio are separated into two different URLs, can I use FFMPEG to merge those two sources while streaming? Waiting for FFMPEG to merge both files into a single local file would probably take some time, which wouldn't be nice for the user experience.
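FFmpeg can do this without a temporary file: read both URLs, map one video and one audio stream, and remux with -c copy into a streamable container written to stdout (or re-served over a local port). A sketch with placeholder URLs:

```python
# Remux separate video and audio URLs on the fly; -c copy avoids re-encoding
# so the merge keeps pace with real time. URLs and output are placeholders.
cmd = [
    "ffmpeg",
    "-i", "https://example.com/video_only.m4s",   # hypothetical video URL
    "-i", "https://example.com/audio_only.m4s",   # hypothetical audio URL
    "-map", "0:v:0", "-map", "1:a:0",
    "-c", "copy",
    "-f", "mpegts", "pipe:1",   # stream the result to stdout for a player
]
print(" ".join(cmd))
```

Because nothing is transcoded, the output starts almost immediately; a player reading the pipe (or a local HTTP/UDP re-serve) sees a single muxed stream.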
07:02
I am trying to convert a video to the h.264 format. I can save the new video to a normal directory, but saving it to a Windows temp file fails. However, I can save a snapshot image from the video into the Windows temp folder without problems. I cannot figure out what the problem is.
require_once(dirname(__FILE__).'/../vendor/autoload.php');
function get_ffmpeg_exe_path_arr_def()
{
    return( array( 'ffmpeg.binaries'  => "C:/bin/ffmpeg.exe",
                   'ffprobe.binaries' => "C:/bin/ffprobe.exe", // was ffmpeg.exe; likely a typo
                   'timeout' => 3600000000, (...)
04:40
When extracting audio streams with ffmpeg from containers such as MP4, what does ffmpeg do when the requested bitrate is higher than the source bitrate?
An example might be ffmpeg -i video.mp4 -f mp3 -ab 256000 -vn music.mp3. What does ffmpeg do if the incoming bitrate is 128k? Does it interpolate, or default to 128k on the output music.mp3? I know this doesn't seem like a so-called "programming question", but the ffmpeg forum says it is going out of business and no one will reply to posts (...)
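By way of illustration: ffmpeg decodes the source to PCM and re-encodes it at the requested rate, so the output size is governed by the target bitrate alone; no detail is interpolated from a 128k source, the extra bits simply re-describe the same information. The arithmetic:

```python
def mp3_size_bytes(bitrate_bps, seconds):
    """CBR output size is set by the target bitrate alone, independent of the
    source bitrate: the encoder re-encodes decoded PCM at the new rate."""
    return bitrate_bps * seconds // 8

# A 60 s track at -ab 256000 comes out near 1.92 MB whether the source
# was 128k or 320k; no extra audio detail is created.
print(mp3_size_bytes(256000, 60))  # 1920000
```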