
Media (1)
-
Ogg detection bug
22 March 2013, by
Updated: April 2013
Language: French
Type: Video
Other articles (65)
-
The accepted formats
28 January 2010 — The following commands give information about the formats and codecs handled by the local ffmpeg installation:
ffmpeg -codecs
ffmpeg -formats
The video formats accepted as input
This list is not exhaustive; it highlights the main formats used: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
The possible video output formats
As a first step we (...) -
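The output of the two commands mentioned in that article, ffmpeg -codecs and ffmpeg -formats, is line-oriented and easy to parse programmatically. A minimal Python sketch for -codecs entries (the sample lines are illustrative, not taken from a real install; the flag layout — decode/encode flags first, then the media type — follows current ffmpeg builds):

```python
def parse_codec_line(line):
    """Parse one entry of `ffmpeg -codecs` output: the first field encodes
    decode (D) and encode (E) support plus the media type (V/A/S)."""
    flags, name = line.split()[:2]
    return {
        "name": name,
        "decode": flags[0] == "D",
        "encode": flags[1] == "E",
        "type": {"V": "video", "A": "audio", "S": "subtitle"}.get(flags[2]),
    }

# Illustrative lines in the shape ffmpeg prints (not from a real install):
sample = [
    " DEV.LS h264  H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10",
    " DEV.L. theora  Theora",
    " D.V.L. m4v  MPEG-4 part 2 (raw format)",
]
for line in sample:
    print(parse_codec_line(line))
```

In practice the same parser would be fed the real output of `ffmpeg -codecs`, e.g. via a subprocess call.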
Publishing on MediaSPIP
13 June 2013 — Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out. -
Automated installation script of MediaSPIP
25 April 2011 — To overcome the difficulties, mainly due to the installation of server-side software dependencies, an "all-in-one" installation script written in bash was created to facilitate this step on a server running a compatible Linux distribution.
You must have SSH access to your server and a root account to use it, since it will install the dependencies. Contact your provider if you do not have that.
The documentation on how to use this installation script is available here.
The code of this (...)
On other sites (9318)
-
Chrome times out on streaming FFMPEG output from ASP.NET Web Api
3 August 2014, by Hayden McAfee — I've got a unique problem here!
UPDATE 2: It turns out the development below is FALSE; the inconsistency of the bug made it seem like not closing the stream made it work... but in fact the same issue persists!
UPDATE: Interesting development; if I comment out ffmpegBufferedIn.Close();
below, the entire stream always goes through fine... the request just never ends. What could be going on here?
I'm writing a web service that stores audio files in Azure Blob Storage and converts them to MP3 on the fly when requested through my ASP.NET Web API endpoint. I accomplish this by using 'DownloadToStream' via the Azure Storage API, feeding that stream through the STDIN of an FFMPEG process, and sending the STDOUT stream as the request response.
The block of code that does this looks like this:
public HttpResponseMessage Get(Guid songid)
{
// This could take awhile.
HttpContext.Current.Server.ScriptTimeout = 600;
Process ffmpeg = new Process();
ProcessStartInfo startinfo = new ProcessStartInfo(HostingEnvironment.MapPath("~/App_Data/executables/ffmpeg.exe"), "-i - -vn -ar 44100 -ac 2 -ab 192k -f mp3 - ");
startinfo.RedirectStandardError = true;
startinfo.RedirectStandardOutput = true;
startinfo.RedirectStandardInput = true;
startinfo.UseShellExecute = false;
startinfo.CreateNoWindow = true;
ffmpeg.StartInfo = startinfo;
ffmpeg.ErrorDataReceived += ffmpeg_ErrorDataReceived;
// Our response is a stream
var response = Request.CreateResponse();
response.StatusCode = HttpStatusCode.OK;
// Retrieve storage account from connection string.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
CloudConfigurationManager.GetSetting("StorageConnectionString"));
// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
// Retrieve reference to a previously created container.
CloudBlobContainer container = blobClient.GetContainerReference("songs");
// Retrieve reference to a blob
CloudBlockBlob blockBlob = container.GetBlockBlobReference(songid.ToString());
ffmpeg.Start();
ffmpeg.BeginErrorReadLine();
// Buffer the streams
var ffmpegBufferedIn = new BufferedStream(ffmpeg.StandardInput.BaseStream);
var ffmpegBufferedOut = new BufferedStream(ffmpeg.StandardOutput.BaseStream);
blockBlob.DownloadToStreamAsync(ffmpegBufferedIn).ContinueWith((t) => {
ffmpegBufferedIn.Flush();
ffmpegBufferedIn.Close();
});
response.Content = new StreamContent(ffmpegBufferedOut);
response.Content.Headers.ContentType = new MediaTypeHeaderValue("audio/mpeg");
System.Diagnostics.Debug.WriteLine("Returned response.");
return response;
}
This works quite well in all browsers - all except for Chrome, which has an interesting way of buffering audio streams. Chrome will buffer the first 2 megabytes of a stream, then keep the connection open and wait until the user gets closer to playing the next segment of the file before consuming the rest of the stream. This should be fine - and for some songs it is. For others, I get this:
At first I thought this was due to some kind of timeout - but it happens at a different time and size for each file. It is consistent to within about 15 seconds on the same songs, however. The output on the server side is normal - no exceptions thrown, and FFmpeg finishes encoding the song successfully.
Here's the server-side output of the above request:
ffmpeg version N-64919-ga613257 Copyright (c) 2000-2014 the FFmpeg developers
built on Jul 23 2014 00:27:32 with gcc 4.8.3 (GCC)
configuration: --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-decklink --enable-zlib
libavutil 52. 92.101 / 52. 92.101
libavcodec 55. 69.100 / 55. 69.100
libavformat 55. 48.101 / 55. 48.101
libavdevice 55. 13.102 / 55. 13.102
libavfilter 4. 11.102 / 4. 11.102
libswscale 2. 6.100 / 2. 6.100
libswresample 0. 19.100 / 0. 19.100
libpostproc 52. 3.100 / 52. 3.100
Input #0, mp3, from 'pipe:':
Metadata:
TSRC : AUUM71001516
title : Sunlight
track : 2
artist : Bag Raiders
copyright : 2010 Modular Recordings
genre : Electronic
album : Bag Raiders
album_artist : Bag Raiders
disc : 1/1
publisher : Modular Recordings
composer : Chris Stracey/Jack Glass/Dan Black
date : 2010
Duration: N/A, start: 0.000000, bitrate: 320 kb/s
Stream #0:0: Audio: mp3, 44100 Hz, stereo, s16p, 320 kb/s
Stream #0:1: Video: mjpeg, yuvj420p(pc, bt470bg), 600x600 [SAR 300:300 DAR 1:1], 90k tbr, 90k tbn, 90k tbc
Metadata:
title :
comment : Other
Output #0, mp3, to 'pipe:':
Metadata:
TSRC : AUUM71001516
TIT2 : Sunlight
TRCK : 2
TPE1 : Bag Raiders
TCOP : 2010 Modular Recordings
TCON : Electronic
TALB : Bag Raiders
TPE2 : Bag Raiders
TPOS : 1/1
TPUB : Modular Recordings
TCOM : Chris Stracey/Jack Glass/Dan Black
TDRL : 2010
TSSE : Lavf55.48.101
Stream #0:0: Audio: mp3 (libmp3lame), 44100 Hz, stereo, s16p, 192 kb/s
Metadata:
encoder : Lavc55.69.100 libmp3lame
Stream mapping:
Stream #0:0 -> #0:0 (mp3 (native) -> mp3 (libmp3lame))
size= 6kB time=00:00:00.21 bitrate= 227.6kbits/s
size= 102kB time=00:00:04.31 bitrate= 193.7kbits/s
size= 202kB time=00:00:08.56 bitrate= 192.9kbits/s
size= 341kB time=00:00:14.49 bitrate= 192.5kbits/s
size= 489kB time=00:00:20.82 bitrate= 192.4kbits/s
size= 642kB time=00:00:27.35 bitrate= 192.3kbits/s
size= 792kB time=00:00:33.75 bitrate= 192.2kbits/s
size= 950kB time=00:00:40.49 bitrate= 192.2kbits/s
size= 1106kB time=00:00:47.15 bitrate= 192.2kbits/s
size= 1258kB time=00:00:53.63 bitrate= 192.1kbits/s
size= 1415kB time=00:01:00.31 bitrate= 192.1kbits/s
size= 1563kB time=00:01:06.66 bitrate= 192.1kbits/s
size= 1710kB time=00:01:12.90 bitrate= 192.1kbits/s
size= 1857kB time=00:01:19.17 bitrate= 192.1kbits/s
size= 2008kB time=00:01:25.63 bitrate= 192.1kbits/s
size= 2162kB time=00:01:32.21 bitrate= 192.1kbits/s
size= 2299kB time=00:01:38.03 bitrate= 192.1kbits/s
size= 2457kB time=00:01:44.80 bitrate= 192.1kbits/s
size= 2600kB time=00:01:50.89 bitrate= 192.1kbits/s
size= 2755kB time=00:01:57.52 bitrate= 192.1kbits/s
size= 2864kB time=00:02:02.17 bitrate= 192.1kbits/s
size= 3022kB time=00:02:08.88 bitrate= 192.1kbits/s
size= 3172kB time=00:02:15.31 bitrate= 192.1kbits/s
size= 3284kB time=00:02:20.06 bitrate= 192.1kbits/s
size= 3385kB time=00:02:24.40 bitrate= 192.1kbits/s
size= 3529kB time=00:02:30.51 bitrate= 192.0kbits/s
size= 3687kB time=00:02:37.25 bitrate= 192.0kbits/s
size= 3838kB time=00:02:43.71 bitrate= 192.0kbits/s
size= 3988kB time=00:02:50.11 bitrate= 192.0kbits/s
size= 4138kB time=00:02:56.53 bitrate= 192.0kbits/s
size= 4279kB time=00:03:02.54 bitrate= 192.0kbits/s
size= 4408kB time=00:03:08.03 bitrate= 192.0kbits/s
size= 4544kB time=00:03:13.85 bitrate= 192.0kbits/s
size= 4683kB time=00:03:19.78 bitrate= 192.0kbits/s
size= 4805kB time=00:03:24.95 bitrate= 192.0kbits/s
size= 4939kB time=00:03:30.67 bitrate= 192.0kbits/s
size= 5049kB time=00:03:35.38 bitrate= 192.0kbits/s
size= 5141kB time=00:03:39.32 bitrate= 192.0kbits/s
size= 5263kB time=00:03:44.49 bitrate= 192.0kbits/s
size= 5372kB time=00:03:49.17 bitrate= 192.0kbits/s
The thread 0xb24 has exited with code 259 (0x103).
size= 5436kB time=00:03:51.91 bitrate= 192.0kbits/s
size= 5509kB time=00:03:55.02 bitrate= 192.0kbits/s
size= 5657kB time=00:04:01.32 bitrate= 192.0kbits/s
size= 5702kB time=00:04:03.22 bitrate= 192.0kbits/s
video:0kB audio:5701kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.005738%
Any ideas? I'm grateful for suggestions - I've been chasing this for a week now!
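The blob-to-ffmpeg-to-response flow in the question is a generic pipe-through pattern: one thread feeds the child process's stdin (and must close it so the child can finish) while the caller streams its stdout. A minimal Python sketch of that pattern, using a trivial stand-in child process instead of ffmpeg.exe so it is self-contained:

```python
import subprocess
import sys
import threading


def pipe_through(cmd, data, chunk=64 * 1024):
    """Feed `data` to a subprocess's stdin from a background thread while
    streaming its stdout, mirroring the blob -> ffmpeg -> response flow."""
    proc = subprocess.Popen(cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE)

    def feed():
        for i in range(0, len(data), chunk):
            proc.stdin.write(data[i:i + chunk])
        proc.stdin.close()  # without this, the child waits forever for more input

    t = threading.Thread(target=feed)
    t.start()
    out = bytearray()
    while True:
        buf = proc.stdout.read(chunk)
        if not buf:
            break
        out.extend(buf)
    t.join()
    proc.wait()
    return bytes(out)


# Stand-in for ffmpeg: a child process that upper-cases its stdin.
cmd = [sys.executable, "-c",
       "import sys; sys.stdout.write(sys.stdin.read().upper())"]
print(pipe_through(cmd, b"streamed audio bytes").decode())  # STREAMED AUDIO BYTES
```

With a real ffmpeg child the command would be the one from the question (`-i - -vn -ar 44100 -ac 2 -ab 192k -f mp3 -`), but the threading and the stdin-close step are the same.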
-
ffmpeg 4.1.3 built with the Android NDK is 4 times bigger than with Linux gcc
3 July 2020, by ravin.wang — FFmpeg version: 4.1.3

android-ndk-r19c: 44 MB (armeabi-v7a)

gcc-7: 9.7 MB (x86_64)


The configuration is almost the same:

For android-ndk-r19c:

I didn't link x264; under Linux x86_64, x264 is linked, so in theory the Linux x86_64 build links more code than the Android one.


#!/bin/bash

# set the toolchain
TOOLCHAIN=$NDK_HOME/toolchains/llvm/prebuilt/linux-x86_64
CROSS_PREFIX=$TOOLCHAIN/bin/arm-linux-androideabi-
rm -f $(pwd)/compat/strtod.o
function build_one
{
#configure ffmpeg w/o libx264
./configure \
 --prefix=$PREFIX \
 --disable-everything --disable-programs --disable-doc \
 --enable-ffmpeg --enable-jni \
 --enable-encoder=aac \
 --enable-encoder=mpeg4 \
 --enable-encoder=pcm_u16le --enable-encoder=pcm_s16le \
 --enable-encoder=yuv4 \
 --enable-encoder=rawvideo \
 --enable-encoder=wrapped_avframe \
 --enable-decoder=aac --enable-decoder=aac_at --enable-decoder=aac_fixed \
 --enable-decoder=aac_latm \
 --enable-decoder=ac3 --enable-decoder=ac3_at --enable-decoder=ac3_fixed \
 --enable-decoder=avs \
 --enable-decoder=eac3 \
 --enable-decoder=eac3_at \
 --enable-decoder=h263 \
 --enable-decoder=h264 \
 --enable-decoder=hevc \
 --enable-decoder=mp1 --enable-decoder=mp1_at --enable-decoder=mp1float \
 --enable-decoder=mp2 \
 --enable-decoder=mp2_at \
 --enable-decoder=mp2float \
 --enable-decoder=mp3 --enable-decoder=mp3_at --enable-decoder=mp3float \
 --enable-decoder=vp8 --enable-decoder=vp9 \
 --enable-decoder=mpeg1video \
 --enable-decoder=mpeg2video \
 --enable-decoder=mpeg4 \
 --enable-decoder=mpegvideo \
 --enable-decoder=pcm_u16le --enable-decoder=pcm_s16le \
 --enable-decoder=yuv4 \
 --enable-decoder=rawvideo \
 --enable-decoder=wrapped_avframe \
 --enable-muxer=mpegts \
 --enable-muxer=mp4 \
 --enable-muxer=matroska --enable-muxer=matroska_audio --enable-muxer=webm \
 --enable-muxer=h264 \
 --enable-muxer=hevc \
 --enable-muxer=rawvideo \
 --enable-muxer=adts \
 --enable-muxer=yuv4mpegpipe \
 --enable-demuxer=aac \
 --enable-demuxer=ac3 \
 --enable-demuxer=eac3 \
 --enable-demuxer=m4v \
 --enable-demuxer=matroska \
 --enable-demuxer=mp3 \
 --enable-demuxer=mpegps \
 --enable-demuxer=mpegts \
 --enable-demuxer=mpegvideo \
 --enable-demuxer=h263 \
 --enable-demuxer=h264 \
 --enable-demuxer=hevc \
 --enable-demuxer=mov \
 --enable-demuxer=rawvideo \
 --enable-demuxer=yuv4mpegpipe \
 --enable-parser=aac \
 --enable-parser=aac_latm \
 --enable-parser=ac3 \
 --enable-parser=h263 \
 --enable-parser=h264 \
 --enable-parser=hevc \
 --enable-parser=mpeg4video \
 --enable-parser=mpegaudio \
 --enable-parser=mpegvideo \
 --enable-parser=vp8 \
 --enable-parser=vp9 \
 --enable-protocol=file \
 --enable-bsf=aac_adtstoasc \
 --enable-bsf=h264_mp4toannexb \
 --enable-bsf=hevc_mp4toannexb \
 --enable-bsf=mpeg4_unpack_bframes \
 --enable-filter=scale \
 --disable-zlib --disable-xlib --disable-iconv --disable-alsa \
 --disable-libxcb_shape --disable-libxcb_xfixes --disable-lzma \
 --disable-sdl2 --disable-sndio --disable-bzlib --disable-libxcb \
 --disable-hwaccels \
 --target-os=android \
 --enable-cross-compile \
 --toolchain=clang-usan \
 --cross-prefix=$CROSS_PREFIX \
 --cc=$TOOLCHAIN/bin/armv7a-linux-androideabi28-clang \
 --cxx=$TOOLCHAIN/bin/armv7a-linux-androideabi28-clang++ \
 --extra-cflags="-Os -fpic $ADDI_CFLAGS" \
 --extra-ldflags="$ADDI_LDFLAGS" \
 --sysroot=$TOOLCHAIN/sysroot $ADDITIONAL_CONFIG_FLAG
make clean
make -j4
make install
}

CPU=armeabi-v7a
mkdir -p $(pwd)/android/$CPU
PREFIX=$(pwd)/android/$CPU 
ADDI_CFLAGS=""
ADDI_LDFLAGS="-Wl,-Bsymbolic"
ADDITIONAL_CONFIG_FLAG="--arch=arm"
build_one




After building and stripping, the resulting ffmpeg binary is 44 MB, which is enormous, but I have no idea why. I tried adding -Os to CFLAGS, but it does not help.
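Since the question hinges on the two configurations being "almost the same", it can help to diff the two flag sets mechanically rather than by eye. A small Python sketch (the flag lists below are abbreviated stand-ins for the two full scripts above):

```python
def config_flags(script_text):
    """Collect the --enable-* / --disable-* flags from a configure invocation."""
    return {tok for tok in script_text.split()
            if tok.startswith(("--enable-", "--disable-"))}


# Abbreviated flag lists standing in for the two full scripts above:
android = config_flags("--enable-ffmpeg --enable-jni --disable-zlib --disable-hwaccels")
linux = config_flags("--enable-ffmpeg --enable-libx264 --enable-gpl --disable-zlib --disable-hwaccels")

print(sorted(android - linux))  # flags only in the Android build
print(sorted(linux - android))  # flags only in the Linux build
```

Running this over the real scripts would surface every flag that differs between the two builds, not just the x264 ones.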



Here is the x86_64 version:



#!/bin/bash

configure()
{
 CPU=$1
 HOST=linux
 PREFIX=$(pwd)/$HOST/$CPU 
 ADDI_CFLAGS="-I$(pwd)/../x264-stable/$HOST/$CPU/include"
 ADDI_LDFLAGS="-L$(pwd)/../x264-stable/$HOST/$CPU/lib"
 ADDITIONAL_CONFIG_FLAG=""

 ./configure \
 --prefix=$PREFIX \
 --disable-everything --disable-programs --disable-doc \
 --enable-ffmpeg --enable-libx264 --enable-gpl \
 --enable-encoder=aac \
 --enable-encoder=libx264 \
 --enable-encoder=mpeg4 \
 --enable-encoder=pcm_u16le --enable-encoder=pcm_s16le \
 --enable-encoder=yuv4 \
 --enable-encoder=rawvideo \
 --enable-encoder=wrapped_avframe \
 --enable-decoder=aac \
 --enable-decoder=aac_at \
 --enable-decoder=aac_fixed \
 --enable-decoder=aac_latm \
 --enable-decoder=ac3 --enable-decoder=ac3_at --enable-decoder=ac3_fixed \
 --enable-decoder=avs \
 --enable-decoder=eac3 \
 --enable-decoder=eac3_at \
 --enable-decoder=h263 \
 --enable-decoder=h264 \
 --enable-decoder=hevc \
 --enable-decoder=mp1 --enable-decoder=mp1_at --enable-decoder=mp1float \
 --enable-decoder=mp2 \
 --enable-decoder=mp2_at \
 --enable-decoder=mp2float \
 --enable-decoder=mp3 --enable-decoder=mp3_at --enable-decoder=mp3float \
 --enable-decoder=vp8 --enable-decoder=vp9 \
 --enable-decoder=mpeg1video \
 --enable-decoder=mpeg2video \
 --enable-decoder=mpeg4 \
 --enable-decoder=mpegvideo \
 --enable-decoder=pcm_u16le --enable-decoder=pcm_s16le \
 --enable-decoder=yuv4 \
 --enable-decoder=rawvideo --enable-decoder=wrapped_avframe \
 --enable-muxer=mpegts \
 --enable-muxer=mp4 \
 --enable-muxer=matroska --enable-muxer=matroska_audio --enable-muxer=webm \
 --enable-muxer=h264 \
 --enable-muxer=hevc \
 --enable-muxer=rawvideo \
 --enable-muxer=adts \
 --enable-muxer=yuv4mpegpipe \
 --enable-demuxer=aac \
 --enable-demuxer=ac3 \
 --enable-demuxer=eac3 \
 --enable-demuxer=m4v \
 --enable-demuxer=matroska \
 --enable-demuxer=mp3 \
 --enable-demuxer=mpegps \
 --enable-demuxer=mpegts \
 --enable-demuxer=mpegvideo \
 --enable-demuxer=h263 \
 --enable-demuxer=h264 \
 --enable-demuxer=hevc \
 --enable-demuxer=mov \
 --enable-demuxer=rawvideo \
 --enable-demuxer=yuv4mpegpipe \
 --enable-parser=aac \
 --enable-parser=aac_latm \
 --enable-parser=ac3 \
 --enable-parser=h263 \
 --enable-parser=h264 \
 --enable-parser=hevc \
 --enable-parser=mpeg4video \
 --enable-parser=mpegaudio \
 --enable-parser=mpegvideo \
 --enable-parser=vp8 \
 --enable-parser=vp9 \
 --enable-protocol=file \
 --enable-bsf=aac_adtstoasc \
 --enable-bsf=h264_mp4toannexb \
 --enable-bsf=hevc_mp4toannexb \
 --enable-bsf=mpeg4_unpack_bframes \
 --enable-filter=scale \
 --disable-zlib --disable-xlib --disable-iconv \
 --disable-alsa --disable-libxcb_shape --disable-libxcb_xfixes \
 --disable-lzma --disable-sdl2 --disable-sndio \
 --disable-bzlib --disable-libxcb \
 --disable-hwaccels \
 --extra-cflags="-Os -fpic $ADDI_CFLAGS" \
 --extra-ldflags="$ADDI_LDFLAGS"
}

build()
{
 make clean
 cpu=$1

 echo "build $cpu"

 configure $cpu
 make -j4
 make install
}

build linux_x86_64




-
using the ffmpeg library to write to an mp4, ffprobe shows there are 100 frames and 100 packets, but av_interleaved_write_frame is only called 50 times
2 May 2023, by ollydbg23 — Here is my code to generate an mp4 file using the ffmpeg and opencv libraries. The opencv part just generates 100 images (frames), and the ffmpeg library compresses those images into an mp4 file.


Here is the working code:


#include <iostream>
#include <vector>
#include <cstring>
#include <fstream>
#include <sstream>
#include <stdexcept>
#include <opencv2/opencv.hpp>
extern "C" {
#include <libavutil/imgutils.h>
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libavutil/opt.h>
}

#include <cstdlib> // to generate time stamps

using namespace std;
using namespace cv;

int main()
{
 // Set up input frames as BGR byte arrays
 vector<Mat> frames;

 int width = 640;
 int height = 480;
 int num_frames = 100;
 Scalar black(0, 0, 0);
 Scalar white(255, 255, 255);
 int font = FONT_HERSHEY_SIMPLEX;
 double font_scale = 1.0;
 int thickness = 2;

 for (int i = 0; i < num_frames; i++) {
 Mat frame = Mat::zeros(height, width, CV_8UC3);
 putText(frame, std::to_string(i), Point(width / 2 - 50, height / 2), font, font_scale, white, thickness);
 frames.push_back(frame);
 }

 // generate a serial of time stamps which is used to set the PTS value
 // suppose they are in ms unit, the time interval is between 30ms to 59ms
 vector<int> timestamps;

 for (int i = 0; i < num_frames; i++) {
 int timestamp;
 if (i == 0)
 timestamp = 0;
 else
 {
 int random = 30 + (rand() % 30);
 timestamp = timestamps[i-1] + random; // was timestamps[i-0], which reads one past the end
 }

 timestamps.push_back(timestamp);
 }

 // Populate frames with BGR byte arrays

 // Initialize FFmpeg
 //av_register_all();

 // Set up output file
 AVFormatContext* outFormatCtx = nullptr;
 //AVCodec* outCodec = nullptr;
 AVCodecContext* outCodecCtx = nullptr;
 //AVStream* outStream = nullptr;
 //AVPacket outPacket;

 const char* outFile = "output.mp4";
 int outWidth = frames[0].cols;
 int outHeight = frames[0].rows;
 int fps = 25;

 // Open the output file context
 avformat_alloc_output_context2(&outFormatCtx, nullptr, nullptr, outFile);
 if (!outFormatCtx) {
 cerr << "Error: Could not allocate output format context" << endl;
 return -1;
 }

 // Open the output file
 if (avio_open(&outFormatCtx->pb, outFile, AVIO_FLAG_WRITE) < 0) {
 cerr << "Error opening output file" << std::endl;
 return -1;
 }

 // Set up output codec
 const AVCodec* outCodec = avcodec_find_encoder(AV_CODEC_ID_H264);
 if (!outCodec) {
 cerr << "Error: Could not find H.264 codec" << endl;
 return -1;
 }

 outCodecCtx = avcodec_alloc_context3(outCodec);
 if (!outCodecCtx) {
 cerr << "Error: Could not allocate output codec context" << endl;
 return -1;
 }
 outCodecCtx->codec_id = AV_CODEC_ID_H264;
 outCodecCtx->codec_type = AVMEDIA_TYPE_VIDEO;
 outCodecCtx->pix_fmt = AV_PIX_FMT_YUV420P;
 outCodecCtx->width = outWidth;
 outCodecCtx->height = outHeight;
 //outCodecCtx->time_base = { 1, fps*1000 }; // 25000
 outCodecCtx->time_base = { 1, fps}; // 25000
 outCodecCtx->framerate = {fps, 1}; // 25
 outCodecCtx->bit_rate = 4000000;

 //https://github.com/leandromoreira/ffmpeg-libav-tutorial
 //We set the flag AV_CODEC_FLAG_GLOBAL_HEADER which tells the encoder that it can use the global headers.
 if (outFormatCtx->oformat->flags & AVFMT_GLOBALHEADER)
 {
 outCodecCtx->flags |= AV_CODEC_FLAG_GLOBAL_HEADER; //
 }

 // Open output codec
 if (avcodec_open2(outCodecCtx, outCodec, nullptr) < 0) {
 cerr << "Error: Could not open output codec" << endl;
 return -1;
 }

 // Create output stream
 AVStream* outStream = avformat_new_stream(outFormatCtx, outCodec);
 if (!outStream) {
 cerr << "Error: Could not allocate output stream" << endl;
 return -1;
 }

 // Configure output stream parameters (e.g., time base, codec parameters, etc.)
 // ...

 // Connect output stream to format context
 outStream->codecpar->codec_id = outCodecCtx->codec_id;
 outStream->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
 outStream->codecpar->width = outCodecCtx->width;
 outStream->codecpar->height = outCodecCtx->height;
 outStream->codecpar->format = outCodecCtx->pix_fmt;
 outStream->time_base = outCodecCtx->time_base;

 int ret = avcodec_parameters_from_context(outStream->codecpar, outCodecCtx);
 if (ret < 0) {
 cerr << "Error: Could not copy codec parameters to output stream" << endl;
 return -1;
 }

 outStream->avg_frame_rate = outCodecCtx->framerate;
 //outStream->id = outFormatCtx->nb_streams++; <--- We shouldn't modify outStream->id

 ret = avformat_write_header(outFormatCtx, nullptr);
 if (ret < 0) {
 cerr << "Error: Could not write output header" << endl;
 return -1;
 }

 // Convert frames to YUV format and write to output file
 int frame_count = -1;
 for (const auto& frame : frames) {
 frame_count++;
 AVFrame* yuvFrame = av_frame_alloc();
 if (!yuvFrame) {
 cerr << "Error: Could not allocate YUV frame" << endl;
 return -1;
 }
 av_image_alloc(yuvFrame->data, yuvFrame->linesize, outWidth, outHeight, AV_PIX_FMT_YUV420P, 32);

 yuvFrame->width = outWidth;
 yuvFrame->height = outHeight;
 yuvFrame->format = AV_PIX_FMT_YUV420P;

 // Convert BGR frame to YUV format
 Mat yuvMat;
 cvtColor(frame, yuvMat, COLOR_BGR2YUV_I420);
 memcpy(yuvFrame->data[0], yuvMat.data, outWidth * outHeight);
 memcpy(yuvFrame->data[1], yuvMat.data + outWidth * outHeight, outWidth * outHeight / 4);
 memcpy(yuvFrame->data[2], yuvMat.data + outWidth * outHeight * 5 / 4, outWidth * outHeight / 4);

 // Set up output packet
 //av_init_packet(&outPacket); //error C4996: 'av_init_packet': was declared deprecated
 AVPacket* outPacket = av_packet_alloc(); // av_packet_alloc already zero-initializes the packet
 //memset(outPacket, 0, sizeof(outPacket)); // wrong: sizeof(outPacket) is the size of the pointer, not the packet
 //outPacket->data = nullptr;
 //outPacket->size = 0;

 // set the frame pts, do I have to set the package pts?

 // yuvFrame->pts = av_rescale_q(timestamps[frame_count]*25, outCodecCtx->time_base, outStream->time_base); //Set PTS timestamp
 yuvFrame->pts = av_rescale_q(frame_count*frame_count, outCodecCtx->time_base, outStream->time_base); //Set PTS timestamp

 // Encode frame and write to output file
 int ret = avcodec_send_frame(outCodecCtx, yuvFrame);
 if (ret < 0) {
 cerr << "Error: Could not send frame to output codec" << endl;
 return -1;
 }
 while (ret >= 0)
 {
 ret = avcodec_receive_packet(outCodecCtx, outPacket);

 if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
 {
 break; // the encoder needs more input (or is fully drained)
 }
 else if (ret < 0)
 {
 cerr << "Error: Could not receive packet from output codec" << endl;
 return -1;
 }

 //av_packet_rescale_ts(&outPacket, outCodecCtx->time_base, outStream->time_base);

 outPacket->stream_index = outStream->index;

 outPacket->duration = av_rescale_q(1, outCodecCtx->time_base, outStream->time_base); // Set packet duration

 ret = av_interleaved_write_frame(outFormatCtx, outPacket);

 static int call_write = 0;

 call_write++;
 printf("av_interleaved_write_frame %d\n", call_write);

 av_packet_unref(outPacket);
 if (ret < 0) {
 cerr << "Error: Could not write packet to output file" << endl;
 return -1;
 }
 }

 av_frame_free(&yuvFrame);
 }

 // Flush the encoder
 ret = avcodec_send_frame(outCodecCtx, nullptr);
 if (ret < 0) {
 std::cerr << "Error flushing encoder: " << std::endl;
 return -1;
 }

 while (ret >= 0) {
 AVPacket* pkt = av_packet_alloc();
 if (!pkt) {
 std::cerr << "Error allocating packet" << std::endl;
 return -1;
 }
 ret = avcodec_receive_packet(outCodecCtx, pkt);

 // Write the packet to the output file
 if (ret == 0)
 {
 pkt->stream_index = outStream->index;
 pkt->duration = av_rescale_q(1, outCodecCtx->time_base, outStream->time_base); // <---- Set packet duration
 ret = av_interleaved_write_frame(outFormatCtx, pkt);
 av_packet_unref(pkt);
 if (ret < 0) {
 std::cerr << "Error writing packet to output file: " << std::endl;
 return -1;
 }
 }
 }


 // Write output trailer
 av_write_trailer(outFormatCtx);

 // Clean up
 avcodec_free_context(&outCodecCtx); // frees and closes the codec context; avcodec_close is deprecated
 avio_closep(&outFormatCtx->pb); // flush and close the file opened with avio_open
 avformat_free_context(outFormatCtx);

 return 0;
}

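The pts handling in the code above relies on av_rescale_q, which converts a count from one rational time base to another. A Python model of that arithmetic for non-negative values (the 1/12800 stream time base is an assumption for illustration; the mp4 muxer chooses its own):

```python
def av_rescale_q(a, bq, cq):
    """Model of FFmpeg's av_rescale_q for non-negative a: rescale a from
    time base bq to cq, i.e. a * bq / cq rounded to nearest (half up)."""
    num = a * bq[0] * cq[1]
    den = bq[1] * cq[0]
    return (2 * num + den) // (2 * den)


# A pts of frame_count*frame_count in the codec time base {1, 25},
# rescaled to an assumed stream time base of {1, 12800}:
for frame_count in (0, 1, 2, 3):
    print(frame_count, av_rescale_q(frame_count * frame_count, (1, 25), (1, 12800)))
    # -> 0, 512, 2048, 4608
```

This is why a quadratic pts like frame_count*frame_count produces ever-growing gaps between frames once rescaled into the stream time base.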


Note that I have used the ffprobe tool (one of the tools shipped with ffmpeg) to inspect the generated mp4 file.

I see that the mp4 file has 100 frames and 100 packets, but my code contains these lines:


static int call_write = 0;

 call_write++;
 printf("av_interleaved_write_frame %d\n", call_write);



I see that the av_interleaved_write_frame function is called only 50 times, not the expected 100 times. Can anyone explain this?

Thanks.
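One thing worth noting about the avcodec_send_frame / avcodec_receive_packet contract is that an encoder with internal lookahead may answer EAGAIN for many early frames and only emit the buffered packets during the flush, so a counter placed only inside the main loop can undercount. A toy Python model of that behaviour (the delay of 50 is purely illustrative, not a claim about x264's actual lookahead):

```python
class MockEncoder:
    """Toy model of avcodec_send_frame/avcodec_receive_packet: the encoder
    holds up to `delay` frames internally, so early receives report EAGAIN
    and the buffered packets only come out during the flush (frame=None)."""
    EAGAIN, EOF = "EAGAIN", "EOF"

    def __init__(self, delay):
        self.delay = delay
        self.queue = []
        self.flushing = False

    def send_frame(self, frame):
        if frame is None:
            self.flushing = True
        else:
            self.queue.append(frame)

    def receive_packet(self):
        if self.flushing:
            return self.queue.pop(0) if self.queue else self.EOF
        if len(self.queue) > self.delay:
            return self.queue.pop(0)
        return self.EAGAIN


enc = MockEncoder(delay=50)
writes_in_loop = 0
for i in range(100):
    enc.send_frame(i)
    while True:
        pkt = enc.receive_packet()
        if pkt in (enc.EAGAIN, enc.EOF):
            break
        writes_in_loop += 1

enc.send_frame(None)  # flush, like avcodec_send_frame(ctx, nullptr)
writes_at_flush = 0
while True:
    pkt = enc.receive_packet()
    if pkt == enc.EOF:
        break
    writes_at_flush += 1

print(writes_in_loop, writes_at_flush)  # 50 50 -> 100 packets in total
```

Under this model the main loop writes only 50 packets while the flush loop writes the other 50; whether that matches the real run depends on the encoder's configured lookahead.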


BTW, from the ffmpeg documentation (see here: "For video, it should typically contain one compressed frame"), I see that a packet mainly contains one video frame, so ffprobe's result looks correct.

Here are the commands I used to inspect the mp4 file:


ffprobe -show_frames output.mp4 >> frames.txt
ffprobe -show_packets output.mp4 >> packets.txt
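The frame and packet counts can also be extracted from those dumps mechanically; ffprobe's default writer wraps each entry in [FRAME] ... [/FRAME] or [PACKET] ... [/PACKET] markers. A short Python sketch (the sample dump is illustrative):

```python
SAMPLE = """\
[PACKET]
codec_type=video
pts=0
[/PACKET]
[PACKET]
codec_type=video
pts=512
[/PACKET]
"""


def count_sections(text, name):
    """Count entries in ffprobe's default -show_* output, where each entry
    opens with a [NAME] marker on its own line."""
    return text.count("[" + name + "]")


print(count_sections(SAMPLE, "PACKET"))  # 2 for the illustrative dump above
```

Applied to the real packets.txt and frames.txt this gives the same 100/100 counts that reading the dumps by hand does.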



My testing code is derived from an answer in another question here: avformat_write_header() function call crashed when I try to save several RGB data to a output.mp4 file