
Media (91)
-
Valkaama DVD Cover Outside
4 October 2011, by
Updated: October 2011
Language: English
Type: Image
-
Valkaama DVD Label
4 October 2011, by
Updated: February 2013
Language: English
Type: Image
-
Valkaama DVD Cover Inside
4 October 2011, by
Updated: October 2011
Language: English
Type: Image
-
1,000,000
27 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Demon Seed
26 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
The Four of Us are Dying
26 September 2011, by
Updated: September 2011
Language: English
Type: Audio
Other articles (45)
-
Customising categories
21 June 2013, by
Form for creating a category
For those who know SPIP well, a category can be thought of as the equivalent of a rubrique (section).
For a document of type category, the fields offered by default are: Text
This form can be modified under:
Administration > Configuration des masques de formulaire (form template configuration).
For a document of type media, the fields not displayed by default are: Short description
It is also in this configuration section that you can specify the (...)
-
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used.
The HTML5 player used was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
These technologies make it possible to deliver video and sound both to conventional computers (...)
-
HTML5 audio and video support
13 April 2011, by
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
For older browsers the Flowplayer flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
On other sites (9167)
-
Webm - Video playback stops after seeking
27 February 2017, by Ryan Vancity Gosling
I am attempting to stream a WebM video file encoded with FFmpeg.
However, when I seek to a position that hasn't been buffered yet, video playback stops.
I've verified that the correct byte range has been downloaded.
Any idea what's causing this? Seeking worked flawlessly for MP4 files encoded with x264; WebM seems to be the problem. It might be an issue with the way the video was encoded.
ffmpeg -i input.mp4 -codec:v libvpx -quality best -b:v 4000k -qmin 10
-qmax 42 -maxrate 4024k -bufsize 8048k -vf scale=-1:1080 -codec:a libvorbis -b:a 384k -movflags faststart out.webm
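Two caveats, offered as assumptions rather than facts from the question: -movflags faststart is an MP4/MOV muxer option and has no effect on WebM output, and WebM seeking generally relies on the Cues (the Matroska seek index), which the muxer writes at the end of the file by default. If the player needs the Cues before it can jump into an unbuffered byte range, remuxing so that space for them is reserved near the start of the file may help. A rough sketch, assuming the matroska/webm muxer in this FFmpeg build supports the reserve_index_space option:
ffmpeg -i out.webm -c copy -reserve_index_space 50000 out_seekable.webm
If seeking then works, the same option can be added to the original encoding command instead of remuxing afterwards.
-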
FFmpeg is slow on Android [duplicate]
28 June 2017, by Oleg Filimonov
This question already has an answer here:
I'm using this library: https://github.com/WritingMinds/ffmpeg-android
Executing this command:
ffmpeg -y -i video.mp4 -ignore_loop 0 -i overlay.gif -filter_complex `
>> "[0][1]overlay=x=mod((1080/4 + sin(t)*1080/4)\,1920):y=mod(t*300\,1080):shortest=1[res];`
>> [res][1]overlay=x=mod((200*1 + 1080/4 - sin(2*t)*1080/4)\,1920):y=mod(100 + t*300\,1080):shortest=1[res];`
>> [res][1]overlay=x=mod((200*2 + 1080/4 - sin(-90 + t/2)*1080/4)\,1920):y=mod(150 + t*250\,1080):shortest=1[res];`
>> [res][1]overlay=x=mod((200*3 + 1080/4 - sin(90 + 3*t)*1080/4)\,1920):y=mod(200 + t*100\,1080):shortest=1[res];`
>> [res][1]overlay=x=mod((200*4 + 1080/4 - sin(180 + t)*1080/4)\,1920):y=mod(t*250\,1080):shortest=1[res];`
>> [res][1]overlay=x=mod((200*5 + 1080/4 - sin(-180 + t)*1080/4)\,1920):y=mod(t*100\,1080):shortest=1[res];`
>> [res][1]overlay=x=mod((200*6 + 1080/4 - sin(2*t)*1080/4)\,1920):y=100 + mod(100 + t*300\,1080):shortest=1[res]`
>> " -map "[res]" -preset ultrafast out.mp4(Basically add a bunch of animated overlays to the video)
On a PC this command takes about a second to execute (render speed is about 278 fps), but on an Android device with a Snapdragon 821 it takes about 10 seconds (about 30 fps).
The curious thing is that while the command is executing on the PC, CPU load is around 90%, but on Android it's less than 1% (see screenshot: the initial spike is UI work plus loading the binaries; after that, ffmpeg was executing until the 10-second mark).
I tried executing the same command, but writing to /dev/null instead of the file:
ffmpeg -y -i video.mp4 -ignore_loop 0 -i overlay.gif -filter_complex `
>> "[0][1]overlay=x=mod((1080/4 + sin(t)*1080/4)\,1920):y=mod(t*300\,1080):shortest=1[res];`
>> [res][1]overlay=x=mod((200*1 + 1080/4 - sin(2*t)*1080/4)\,1920):y=mod(100 + t*300\,1080):shortest=1[res];`
>> [res][1]overlay=x=mod((200*2 + 1080/4 - sin(-90 + t/2)*1080/4)\,1920):y=mod(150 + t*250\,1080):shortest=1[res];`
>> [res][1]overlay=x=mod((200*3 + 1080/4 - sin(90 + 3*t)*1080/4)\,1920):y=mod(200 + t*100\,1080):shortest=1[res];`
>> [res][1]overlay=x=mod((200*4 + 1080/4 - sin(180 + t)*1080/4)\,1920):y=mod(t*250\,1080):shortest=1[res];`
>> [res][1]overlay=x=mod((200*5 + 1080/4 - sin(-180 + t)*1080/4)\,1920):y=mod(t*100\,1080):shortest=1[res];`
>> [res][1]overlay=x=mod((200*6 + 1080/4 - sin(2*t)*1080/4)\,1920):y=100 + mod(100 + t*300\,1080):shortest=1[res]`
>> " -map "[res]" -preset ultrafast -f null /dev/nullThis reduced time by 3 seconds.
I also tried replacing the GIF overlay with a PNG overlay, but speed didn't improve significantly.
Here's the log of the command execution on Android:
ffmpeg version n3.0.1 Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 4.8 (GCC)
configuration: --target-os=linux --cross-prefix=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/bin/arm-linux-androideabi- --arch=arm --cpu=cortex-a8 --enable-runtime-cpudetect --sysroot=/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/sysroot --enable-pic --enable-libx264 --enable-libass --enable-libfreetype --enable-libfribidi --enable-libmp3lame --enable-fontconfig --enable-pthreads --disable-debug --disable-ffserver --enable-version3 --enable-hardcoded-tables --disable-ffplay --disable-ffprobe --enable-gpl --enable-yasm --disable-doc --disable-shared --enable-static --pkg-config=/home/vagrant/SourceCode/ffmpeg-android/ffmpeg-pkg-config --prefix=/home/vagrant/SourceCode/ffmpeg-android/build/armeabi-v7a --extra-cflags='-I/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/include -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -fno-strict-overflow -fstack-protector-all' --extra-ldflags='-L/home/vagrant/SourceCode/ffmpeg-android/toolchain-android/lib -Wl,-z,relro -Wl,-z,now -pie' --extra-libs='-lpng -lexpat -lm' --extra-cxxflags=
libavutil 55. 17.103 / 55. 17.103
libavcodec 57. 24.102 / 57. 24.102
libavformat 57. 25.100 / 57. 25.100
libavdevice 57. 0.101 / 57. 0.101
libavfilter 6. 31.100 / 6. 31.100
libswscale 4. 0.100 / 4. 0.100
libswresample 2. 0.101 / 2. 0.101
libpostproc 54. 0.100 / 54. 0.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/data/user/0/com.example.user.proofofconcept/files/in_10s.mp4':
Metadata:
major_brand : isom
minor_version : 512
compatible_brands: isomiso2avc1mp41
creation_time : 2017-06-27 18:07:54
Duration: 00:00:10.47, start: 0.000000, bitrate: 14043 kb/s
Stream #0:0(eng): Video: h264 (Baseline) (avc1 / 0x31637661), yuvj420p(pc, smpte170m), 1280x720, 13971 kb/s, 29.86 fps, 30.01 tbr, 90k tbn, 180k tbc (default)
Metadata:
creation_time : 2017-06-27 18:07:53
handler_name : VideoHandle
Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 96 kb/s (default)
Metadata:
creation_time : 2017-06-27 18:07:54
handler_name : SoundHandle
Input #1, gif, from '/data/user/0/com.example.user.proofofconcept/files/overlay.gif':
Duration: N/A, bitrate: N/A
Stream #1:0: Video: gif, bgra, 130x98, 6.17 fps, 4.92 tbr, 100 tbn, 100 tbc
[swscaler @ 0xeddb3000] deprecated pixel format used, make sure you did set range correctly
[libx264 @ 0xee1fdc00] using cpu capabilities: none!
[libx264 @ 0xee1fdc00] profile Constrained Baseline, level 3.2
[libx264 @ 0xee1fdc00] 264 - core 148 - H.264/MPEG-4 AVC codec - Copyleft 2003-2015 - http://www.videolan.org/x264.html - options: cabac=0 ref=1 deblock=0:0:0 analyse=0:0 me=dia subme=0 psy=1 psy_rd=1.00:0.00 mixed_ref=0 me_range=16 chroma_me=1 trellis=0 8x8dct=0 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=0 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=0 weightp=0 keyint=250 keyint_min=25 scenecut=0 intra_refresh=0 rc=crf mbtree=0 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=0
Output #0, mp4, to '/storage/emulated/0/temp/out.mp4':
Metadata:
major_brand : isom
minor_version : 512
compatible_brands: isomiso2avc1mp41
encoder : Lavf57.25.100
Stream #0:0: Video: h264 (libx264) ([33][0][0][0] / 0x0021), yuv420p, 1280x720, q=-1--1, 30.01 fps, 90k tbn, 30.01 tbc (default)
Metadata:
encoder : Lavc57.24.102 libx264
Side data:
unknown side data type 10 (24 bytes)
Stream mapping:
Stream #0:0 (h264) -> overlay:main
Stream #1:0 (gif) -> overlay:overlay
Stream #1:0 (gif) -> overlay:overlay
Stream #1:0 (gif) -> overlay:overlay
Stream #1:0 (gif) -> overlay:overlay
Stream #1:0 (gif) -> overlay:overlay
Stream #1:0 (gif) -> overlay:overlay
Stream #1:0 (gif) -> overlay:overlay
overlay -> Stream #0:0 (libx264)
Press [q] to stop, [?] for help
frame= 16 fps=0.0 q=24.0 size= 227kB time=00:00:00.29 bitrate=6210.2kbits/s dup=2 drop=0 speed=0.56x
frame= 29 fps= 28 q=24.0 size= 451kB time=00:00:00.73 bitrate=5042.1kbits/s dup=2 drop=0 speed=0.707x
frame= 40 fps= 26 q=24.0 size= 637kB time=00:00:01.09 bitrate=4744.5kbits/s dup=2 drop=0 speed=0.71x
frame= 57 fps= 28 q=24.0 size= 903kB time=00:00:01.66 bitrate=4441.3kbits/s dup=2 drop=0 speed=0.812x
frame= 75 fps= 29 q=24.0 size= 1186kB time=00:00:02.26 bitrate=4288.0kbits/s dup=2 drop=0 speed=0.885x
frame= 94 fps= 31 q=24.0 size= 1456kB time=00:00:02.89 bitrate=4113.4kbits/s dup=2 drop=0 speed=0.945x
frame= 113 fps= 32 q=25.0 size= 1780kB time=00:00:03.53 bitrate=4127.3kbits/s dup=2 drop=0 speed=0.985x
frame= 131 fps= 32 q=24.0 size= 2101kB time=00:00:04.13 bitrate=4165.9kbits/s dup=2 drop=0 speed=1.01x
frame= 150 fps= 33 q=24.0 size= 2461kB time=00:00:04.76 bitrate=4231.2kbits/s dup=2 drop=0 speed=1.03x
frame= 165 fps= 32 q=24.0 size= 2721kB time=00:00:05.26 bitrate=4234.1kbits/s dup=2 drop=0 speed=1.03x
frame= 185 fps= 33 q=24.0 size= 3028kB time=00:00:05.93 bitrate=4182.4kbits/s dup=2 drop=0 speed=1.05x
frame= 205 fps= 33 q=24.0 size= 3329kB time=00:00:06.59 bitrate=4133.3kbits/s dup=2 drop=0 speed=1.07x
frame= 224 fps= 34 q=24.0 size= 3610kB time=00:00:07.23 bitrate=4089.7kbits/s dup=2 drop=0 speed=1.08x
frame= 243 fps= 34 q=24.0 size= 3901kB time=00:00:07.86 bitrate=4063.3kbits/s dup=2 drop=0 speed=1.09x
frame= 264 fps= 34 q=24.0 size= 4297kB time=00:00:08.56 bitrate=4110.6kbits/s dup=2 drop=0 speed=1.11x
frame= 284 fps= 35 q=24.0 size= 4601kB time=00:00:09.23 bitrate=4083.5kbits/s dup=2 drop=0 speed=1.12x
frame= 304 fps= 35 q=24.0 size= 4886kB time=00:00:09.89 bitrate=4044.4kbits/s dup=2 drop=0 speed=1.14x
frame= 314 fps= 35 q=-1.0 Lsize= 5138kB time=00:00:10.46 bitrate=4022.4kbits/s dup=2 drop=0 speed=1.16x
video:5135kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.040961%
[libx264 @ 0xee1fdc00] frame I:2 Avg QP:20.50 size: 84700
[libx264 @ 0xee1fdc00] frame P:312 Avg QP:24.06 size: 16310
[libx264 @ 0xee1fdc00] mb I I16..4: 100.0% 0.0% 0.0%
[libx264 @ 0xee1fdc00] mb P I16..4: 7.7% 0.0% 0.0% P16..4: 54.9% 0.0% 0.0% 0.0% 0.0% skip:37.4%
[libx264 @ 0xee1fdc00] coded y,uvDC,uvAC intra: 62.7% 22.9% 3.4% inter: 26.1% 8.4% 0.9%
[libx264 @ 0xee1fdc00] i16 v,h,dc,p: 13% 14% 59% 13%
[libx264 @ 0xee1fdc00] i8c dc,h,v,p: 47% 24% 22% 6%
[libx264 @ 0xee1fdc00] kb/s:4020.26
Success
time used: 9843
So what could be the reason for such low CPU load?
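A debugging step worth trying (my suggestion, not part of the original post): ffmpeg's -benchmark option prints the CPU time actually consumed, and passing an explicit -threads value rules out a bad thread-count autodetection on the device. Something along these lines, with the filtergraph from above abbreviated for readability:
ffmpeg -benchmark -y -i video.mp4 -ignore_loop 0 -i overlay.gif -filter_complex "<same overlay chain as above>" -map "[res]" -preset ultrafast -threads 4 -f null /dev/null
If -benchmark reports far less CPU time than the roughly 10 seconds of wall-clock time, the process is waiting on something (I/O, the GIF decoder, or the filter chain) rather than being CPU-bound, which would fit the very low CPU load observed.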
-
FFmpeg filtergraph memory leak
5 July 2017, by Leif Andersen
I have an FFmpeg program that:
- Demuxes and decodes a video file.
- Passes it through a filtergraph.
- Encodes and muxes the new video.
The filtergraph itself is rather complex, and can be run directly from the command line as follows:
ffmpeg -i demo.mp4 -filter_complex \
"[audio3]atrim=end=30:start=10[audio2];\
[video5]trim=end=30:start=10[video4];[audio2]anull[audio6];\
[video4]scale=width=1920:height=1080[video7];[audio6]anull[audio8];\
[video7]fps=fps=30[video9];[audio8]anull[audio10];\
[video9]format=pix_fmts=yuv420p[video11];\
[audio10]asetpts=expr=PTS-STARTPTS[audio12];\
[video11]setpts=expr=PTS-STARTPTS[video13];\
[audio15]concat=v=0:a=1:n=1[audio14];\
[video17]concat=v=1:a=0:n=1[video16];\
[audio12]afifo[audio15];[video13]fifo[video17];\
[audio14]afifo[audio18];[video16]fifo[video19];\
[audio18]anull[audio20];\
[video19]pad=width=1920:height=1080[video21];\
[audio20]anull[audio22];[video21]fps=fps=25[video23];\
[audio22]aformat=sample_fmts=fltp:sample_rates=44100:channel_layouts=stereo[fa];\
[video23]format=pix_fmts=yuv420p[fv];[0:a]afifo[audio3];\
[0:v]fifo[video5]" \
-map "[fv]" -map "[fa]" out.mp4I realize this is a massive filtergraph with a lot of no-op filters, it was autogenerated rather than being hand written. Here is a more cleaner version of the graph. (Its a graphviz file, you can run it in the command line or here.)
Anyway, when I run the program that uses this filtergraph, my memory usage spikes: I end up using about 7 GB of RAM for a 30-second clip. However, when I run the same filtergraph with the ffmpeg command above, it peaks at about 600 MB of RAM. This leads me to believe that the problem is not the ungodly size of the filtergraph, but how my program is using it.
The program sets up the filtergraph (using avfilter_graph_parse_ptr, giving it the filtergraph string shown above), encoder, muxer, decoder, and demuxer, then spawns two threads: one that sends frames into the filtergraph, and one that receives them. The thread that sends them looks something like this:
void decode () {
while(... more_frames ...) {
AVFrame *frame = av_frame_alloc();
... fill next frame of stream ...
av_buffersrc_write_frame(ctx, frame);
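/* av_buffersrc_write_frame() keeps its own reference to the frame
   (it is equivalent to av_buffersrc_add_frame_flags() with
   AV_BUFFERSRC_FLAG_KEEP_REF), so freeing the frame right away is safe. */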
av_frame_free(&frame);
}
(I have elided the avcodec_send_packet/avcodec_receive_frame calls, as they don't seem to be leaking memory. I have also elided flushing the buffersrc, as that won't happen until the end, and the memory spikes long before that.)
And the encoder thread looks similar:
void encode() {
while(... nodes_in_graph ...) {
AVFrame *frame = av_frame_alloc();
av_buffersink_get_frame(ctx, frame);
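/* av_buffersink_get_frame() returns AVERROR(EAGAIN) when no filtered frame is
   ready yet and AVERROR_EOF at the end of the stream; on success the caller
   owns the returned reference and must unref or free it, as done below. */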
... ensure frame actually was filled ...
... send frame to encoder ...
av_frame_free(&frame);
}
As with the decoder, I have elided the avcodec_send_frame/avcodec_receive_packet combo, as they don't seem to be leaking memory. I have also elided the details of ensuring that the frame actually was filled; the code loops until the frame eventually does get filled.
Every frame I allocate, I deallocate fairly quickly. I additionally handle all of the error cases that ffmpeg can give (elided in the example).
I have also tried having only one frame for the encoder and one for the decoder (and calling av_frame_unref in each iteration of the loop).
Am I forgetting to free something, or am I just using the libavfilter calls incorrectly, such that it has to buffer all of the data? I don't think the leak is caused by the filtergraph itself, because running it from the command line doesn't seem to cause the same memory explosion.
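For what it's worth, one pattern that prevents frames from piling up inside the graph's fifo/afifo filters is to drain the buffersink until it returns AVERROR(EAGAIN) every time a frame is pushed into the buffersrc, instead of relying on a separate consumer thread to keep pace. The sketch below only illustrates that idea and is not the asker's code; push_and_drain is a hypothetical helper, and buffersrc_ctx/buffersink_ctx are assumed to be the already-configured filter contexts.
static int push_and_drain(AVFilterContext *buffersrc_ctx, AVFilterContext *buffersink_ctx, AVFrame *in)
{
    int ret = av_buffersrc_write_frame(buffersrc_ctx, in);
    if (ret < 0)
        return ret;
    AVFrame *out = av_frame_alloc();
    if (!out)
        return AVERROR(ENOMEM);
    /* Pull everything the graph has ready before pushing the next frame,
       so frames cannot accumulate inside the fifo/afifo filters. */
    while ((ret = av_buffersink_get_frame(buffersink_ctx, out)) >= 0) {
        /* ... hand the filtered frame to the encoder here ... */
        av_frame_unref(out);
    }
    av_frame_free(&out);
    if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
        ret = 0;
    return ret;
}
Whether this is actually the cause of the 7 GB spike depends on how quickly the consuming thread drains the sink, but it is a cheap hypothesis to test.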
FWIW, the actual code is here, although it's written in Racket. I do have a minimal example that also seems to duplicate this behavior (modified from the doc/examples/filtering_video.c file in the ffmpeg source):
#include <unistd.h>
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libavfilter/avfiltergraph.h>
#include <libavfilter/buffersink.h>
#include <libavfilter/buffersrc.h>
#include <libavutil/opt.h>
const char *filter_descr = "trim=start=10:end=30,scale=78:24,transpose=cclock";
static AVFormatContext *fmt_ctx;
static AVCodecContext *dec_ctx;
AVFilterContext *buffersink_ctx;
AVFilterContext *buffersrc_ctx;
AVFilterGraph *filter_graph;
static int video_stream_index = -1;
static int64_t last_pts = AV_NOPTS_VALUE;
static int open_input_file(const char *filename)
{
int ret;
AVCodec *dec;
if ((ret = avformat_open_input(&fmt_ctx, filename, NULL, NULL)) < 0) {
av_log(NULL, AV_LOG_ERROR, "Cannot open input file\n");
return ret;
}
if ((ret = avformat_find_stream_info(fmt_ctx, NULL)) < 0) {
av_log(NULL, AV_LOG_ERROR, "Cannot find stream information\n");
return ret;
}
/* select the video stream */
ret = av_find_best_stream(fmt_ctx, AVMEDIA_TYPE_VIDEO, -1, -1, &dec, 0);
if (ret < 0) {
av_log(NULL, AV_LOG_ERROR, "Cannot find a video stream in the input file\n");
return ret;
}
video_stream_index = ret;
/* create decoding context */
dec_ctx = avcodec_alloc_context3(dec);
if (!dec_ctx)
return AVERROR(ENOMEM);
avcodec_parameters_to_context(dec_ctx, fmt_ctx->streams[video_stream_index]->codecpar);
av_opt_set_int(dec_ctx, "refcounted_frames", 1, 0);
/* init the video decoder */
if ((ret = avcodec_open2(dec_ctx, dec, NULL)) < 0) {
av_log(NULL, AV_LOG_ERROR, "Cannot open video decoder\n");
return ret;
}
return 0;
}
static int init_filters(const char *filters_descr)
{
char args[512];
int ret = 0;
AVFilter *buffersrc = avfilter_get_by_name("buffer");
AVFilter *buffersink = avfilter_get_by_name("buffersink");
AVFilterInOut *outputs = avfilter_inout_alloc();
AVFilterInOut *inputs = avfilter_inout_alloc();
AVRational time_base = fmt_ctx->streams[video_stream_index]->time_base;
enum AVPixelFormat pix_fmts[] = { AV_PIX_FMT_GRAY8, AV_PIX_FMT_NONE };
filter_graph = avfilter_graph_alloc();
if (!outputs || !inputs || !filter_graph) {
ret = AVERROR(ENOMEM);
goto end;
}
/* buffer video source: the decoded frames from the decoder will be inserted here. */
snprintf(args, sizeof(args),
"video_size=%dx%d:pix_fmt=%d:time_base=%d/%d:pixel_aspect=%d/%d",
dec_ctx->width, dec_ctx->height, dec_ctx->pix_fmt,
time_base.num, time_base.den,
dec_ctx->sample_aspect_ratio.num, dec_ctx->sample_aspect_ratio.den);
ret = avfilter_graph_create_filter(&buffersrc_ctx, buffersrc, "in",
args, NULL, filter_graph);
if (ret < 0) {
av_log(NULL, AV_LOG_ERROR, "Cannot create buffer source\n");
goto end;
}
/* buffer video sink: to terminate the filter chain. */
ret = avfilter_graph_create_filter(&buffersink_ctx, buffersink, "out",
NULL, NULL, filter_graph);
if (ret < 0) {
av_log(NULL, AV_LOG_ERROR, "Cannot create buffer sink\n");
goto end;
}
ret = av_opt_set_int_list(buffersink_ctx, "pix_fmts", pix_fmts,
AV_PIX_FMT_NONE, AV_OPT_SEARCH_CHILDREN);
if (ret < 0) {
av_log(NULL, AV_LOG_ERROR, "Cannot set output pixel format\n");
goto end;
}
outputs->name = av_strdup("in");
outputs->filter_ctx = buffersrc_ctx;
outputs->pad_idx = 0;
outputs->next = NULL;
inputs->name = av_strdup("out");
inputs->filter_ctx = buffersink_ctx;
inputs->pad_idx = 0;
inputs->next = NULL;
if ((ret = avfilter_graph_parse_ptr(filter_graph, filters_descr,
&inputs, &outputs, NULL)) < 0)
goto end;
if ((ret = avfilter_graph_config(filter_graph, NULL)) < 0)
goto end;
end:
avfilter_inout_free(&inputs);
avfilter_inout_free(&outputs);
return ret;
}
int main(int argc, char **argv)
{
int ret;
AVPacket packet;
AVFrame *frame = av_frame_alloc();
AVFrame *filt_frame = av_frame_alloc();
if (!frame || !filt_frame) {
perror("Could not allocate frame");
exit(1);
}
if (argc != 2) {
fprintf(stderr, "Usage: %s file\n", argv[0]);
exit(1);
}
av_register_all();
avfilter_register_all();
if ((ret = open_input_file(argv[1])) < 0)
goto end;
if ((ret = init_filters(filter_descr)) < 0)
goto end;
/* read all packets */
while (1) {
if ((ret = av_read_frame(fmt_ctx, &packet)) < 0)
break;
if (packet.stream_index == video_stream_index) {
ret = avcodec_send_packet(dec_ctx, &packet);
if (ret < 0) {
av_log(NULL, AV_LOG_ERROR, "Error while sending a packet to the decoder\n");
break;
}
while (ret >= 0) {
ret = avcodec_receive_frame(dec_ctx, frame);
if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF) {
break;
} else if (ret < 0) {
av_log(NULL, AV_LOG_ERROR, "Error while receiving a frame from the decoder\n");
goto end;
}
if (ret >= 0) {
frame->pts = av_frame_get_best_effort_timestamp(frame);
/* push the decoded frame into the filtergraph */
if (av_buffersrc_add_frame_flags(buffersrc_ctx, frame, AV_BUFFERSRC_FLAG_KEEP_REF) < 0) {
av_log(NULL, AV_LOG_ERROR, "Error while feeding the filtergraph\n");
break;
}
/* pull filtered frames from the filtergraph */
while (1) {
ret = av_buffersink_get_frame(buffersink_ctx, filt_frame);
if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
break;
if (ret < 0)
goto end;
av_frame_unref(filt_frame);
}
av_frame_unref(frame);
}
}
}
av_packet_unref(&packet);
}
end:
avfilter_graph_free(&filter_graph);
avcodec_free_context(&dec_ctx);
avformat_close_input(&fmt_ctx);
av_frame_free(&frame);
av_frame_free(&filt_frame);
return ret;
}