
Other articles (50)
-
The SPIPmotion queue
28 November 2010 — A queue stored in the database
When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
This new table consists of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to process; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document should automatically be attached; objet, the type of object to which (...)
-
Websites made with MediaSPIP
2 May 2011 — This page lists some websites based on MediaSPIP.
-
Creating farms of unique websites
13 April 2011 — MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)
On other sites (7473)
-
ffmpeg - Adding audio to a complex filter
9 January 2015, by user3630651 — I am using C# to programmatically write an ffmpeg command to crossfade a logo into a video. I also need to cut a couple of seconds into the video.
Here is the currently working ffmpeg command (no audio):

-y -loop 1 -t 3.2 -i "logo.png" -i "video.mp4" -r 29.97 -f lavfi -i color=black -filter_complex "
[0:v] scale=640:-1, crop=640:360, format=pix_fmts=yuv420p, fade=t=out:st=2.7:d=0.5:alpha=1, setpts=PTS-STARTPTS [va0];
[1:v] format=pix_fmts=yuv420p, fade=t=in:st=0:d=0.5:alpha=1, setpts=PTS-STARTPTS + 3.2 / TB [va1];
[2:v] scale=640:-1, crop=640:360, trim=duration=147.768 [over];
[over][va0] overlay [over1];
[over1][va1] overlay=format=yuv420 [outv]
" -vcodec libx264 -map [outv] "Finished.mp4"

I want to add audio from "video.mp4" ([1:a]) to the output, starting a few seconds in. How do I go about this without causing a buffer overflow?
Thank you for your help in advance!
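(An editorial aside, not from the thread: one hedged way to sketch this is to label an audio chain in the same filter_complex and map it alongside [outv]. The adelay filter and the 3200 ms value below are my assumptions, chosen to match the 3.2 s logo intro; the "..." stands for the asker's original video chains, which stay unchanged.)

```shell
# Hedged sketch, not the asker's actual command: append an audio chain to the
# existing graph and map both labelled outputs. adelay takes one delay in
# milliseconds per channel, so 3200|3200 shifts both stereo channels by 3.2 s.
ffmpeg -y -loop 1 -t 3.2 -i "logo.png" -i "video.mp4" -r 29.97 -f lavfi -i color=black \
  -filter_complex "... ; [1:a] adelay=3200|3200 [outa]" \
  -map "[outv]" -map "[outa]" -vcodec libx264 -acodec aac "Finished.mp4"
```

If seconds also need to be cut from the front of the audio, an atrim + asetpts step before adelay would be the analogous audio-side counterpart of the video trim.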
Edit: adding ffmpeg output on request:
ffmpeg version N-66012-g97b8809 Copyright (c) 2000-2014 the FFmpeg developers
built on Sep 1 2014 00:21:15 with gcc 4.8.3 (GCC)
configuration: --disable-static --enable-shared --enable-gpl --enable-version3 --disable-w32threads --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-decklink --enable-zlib
libavutil 54. 7.100 / 54. 7.100
libavcodec 56. 1.100 / 56. 1.100
libavformat 56. 3.100 / 56. 3.100
libavdevice 56. 0.100 / 56. 0.100
libavfilter 5. 0.103 / 5. 0.103
libswscale 3. 0.100 / 3. 0.100
libswresample 1. 1.100 / 1. 1.100
libpostproc 53. 0.100 / 53. 0.100
Input #0, image2, from 'C:\VideoApplicationAssets\test\logo.png':
Duration: 00:00:00.04, start: 0.000000, bitrate: N/A
Stream #0:0: Video: png, rgba, 500x400 [SAR 9065:9065 DAR 5:4], 25 fps, 25 tbr, 25 tbn, 25 tbc
Input #1, mov,mp4,m4a,3gp,3g2,mj2, from 'C:\VideoApplicationAssets\test\2.mp4':
Metadata:
major_brand : mp42
minor_version : 0
compatible_brands: isommp42
creation_time : 2014-03-07 04:36:36
Duration: 00:02:24.68, start: 0.000000, bitrate: 574 kb/s
Stream #1:0(und): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p, 640x360 [SAR 1:1 DAR 16:9], 475 kb/s, 29.97 fps, 29.97 tbr, 30k tbn, 59.94 tbc (default)
Metadata:
handler_name : VideoHandler
Stream #1:1(und): Audio: aac (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 96 kb/s (default)
Metadata:
creation_time : 2014-03-07 04:36:36
handler_name : IsoMedia File Produced by Google, 5-11-2011
Input #2, lavfi, from 'color=black':
Duration: N/A, start: 0.000000, bitrate: N/A
Stream #2:0: Video: rawvideo (I420 / 0x30323449), yuv420p, 320x240 [SAR 1:1 DAR 4:3], 25 tbr, 25 tbn, 25 tbc
[libx264 @ 000000000049cf60] using SAR=1/1
[libx264 @ 000000000049cf60] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
[libx264 @ 000000000049cf60] profile High, level 3.0
[libx264 @ 000000000049cf60] 264 - core 142 r2479 dd79a61 - H.264/MPEG-4 AVC codec - Copyleft 2003-2014 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to 'C:\VideoApplicationAssets\test\Finished.mp4':
Metadata:
encoder : Lavf56.3.100
Stream #0:0: Video: h264 (libx264) ([33][0][0][0] / 0x0021), yuv420p, 640x360 [SAR 1:1 DAR 16:9], q=-1--1, 29.97 fps, 11988 tbn, 29.97 tbc (default)
Metadata:
encoder : Lavc56.1.100 libx264
Stream mapping:
Stream #0:0 (png) -> scale
Stream #1:0 (h264) -> format
Stream #2:0 (rawvideo) -> scale
overlay -> Stream #0:0 (libx264)
Press [q] to stop, [?] for help
frame= 149 fps=0.0 q=29.0 size= 23kB time=00:00:03.23 bitrate= 57.6kbits/s
frame= 240 fps=239 q=29.0 size= 200kB time=00:00:06.27 bitrate= 260.6kbits/s
frame= 345 fps=229 q=29.0 size= 343kB time=00:00:09.77 bitrate= 287.2kbits/s
frame= 431 fps=215 q=29.0 size= 496kB time=00:00:12.64 bitrate= 321.1kbits/s
frame= 502 fps=200 q=29.0 size= 633kB time=00:00:15.01 bitrate= 345.5kbits/s
frame= 659 fps=219 q=29.0 size= 737kB time=00:00:20.25 bitrate= 298.0kbits/s
frame= 843 fps=240 q=26.0 size= 790kB time=00:00:26.39 bitrate= 245.1kbits/s
frame= 988 fps=246 q=29.0 size= 982kB time=00:00:31.23 bitrate= 257.7kbits/s
frame= 1124 fps=249 q=29.0 size= 1200kB time=00:00:35.76 bitrate= 274.9kbits/s
frame= 1292 fps=257 q=29.0 size= 1410kB time=00:00:41.37 bitrate= 279.3kbits/s
frame= 1437 fps=260 q=29.0 size= 1773kB time=00:00:46.21 bitrate= 314.2kbits/s
frame= 1565 fps=260 q=29.0 size= 2206kB time=00:00:50.48 bitrate= 358.0kbits/s
frame= 1709 fps=262 q=29.0 size= 2542kB time=00:00:55.28 bitrate= 376.7kbits/s
frame= 1872 fps=266 q=29.0 size= 2920kB time=00:01:00.72 bitrate= 393.9kbits/s
frame= 2000 fps=265 q=29.0 size= 3157kB time=00:01:04.99 bitrate= 397.9kbits/s
frame= 2108 fps=262 q=29.0 size= 3378kB time=00:01:08.60 bitrate= 403.4kbits/s
frame= 2196 fps=257 q=29.0 size= 3592kB time=00:01:11.53 bitrate= 411.4kbits/s
frame= 2291 fps=254 q=29.0 size= 3791kB time=00:01:14.70 bitrate= 415.7kbits/s
frame= 2394 fps=251 q=29.0 size= 4013kB time=00:01:18.14 bitrate= 420.7kbits/s
frame= 2492 fps=248 q=29.0 size= 4191kB time=00:01:21.41 bitrate= 421.7kbits/s
frame= 2670 fps=253 q=29.0 size= 4323kB time=00:01:27.35 bitrate= 405.4kbits/s
frame= 2829 fps=256 q=29.0 size= 4437kB time=00:01:32.65 bitrate= 392.3kbits/s
frame= 2972 fps=257 q=29.0 size= 4749kB time=00:01:37.43 bitrate= 399.3kbits/s
frame= 3143 fps=261 q=29.0 size= 5016kB time=00:01:43.13 bitrate= 398.4kbits/s
frame= 3247 fps=259 q=29.0 size= 5430kB time=00:01:46.60 bitrate= 417.3kbits/s
frame= 3396 fps=260 q=29.0 size= 5737kB time=00:01:51.57 bitrate= 421.2kbits/s
frame= 3544 fps=262 q=29.0 size= 5995kB time=00:01:56.51 bitrate= 421.5kbits/s
frame= 3691 fps=263 q=29.0 size= 6251kB time=00:02:01.42 bitrate= 421.7kbits/s
frame= 3880 fps=267 q=29.0 size= 6342kB time=00:02:07.72 bitrate= 406.7kbits/s
frame= 4059 fps=270 q=29.0 size= 6426kB time=00:02:13.70 bitrate= 393.7kbits/s
frame= 4158 fps=267 q=29.0 size= 6699kB time=00:02:17.00 bitrate= 400.5kbits/s
frame= 4309 fps=268 q=29.0 size= 6911kB time=00:02:22.04 bitrate= 398.6kbits/s
frame= 4388 fps=265 q=29.0 size= 7187kB time=00:02:24.67 bitrate= 407.0kbits/s
frame= 4430 fps=260 q=-1.0 Lsize= 7537kB time=00:02:27.74 bitrate= 417.9kbits/s dup=0 drop=2
video:7491kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.618146%
[libx264 @ 000000000049cf60] frame I:40 Avg QP:18.78 size: 22979
[libx264 @ 000000000049cf60] frame P:1964 Avg QP:22.33 size: 3056
[libx264 @ 000000000049cf60] frame B:2426 Avg QP:27.35 size: 309
[libx264 @ 000000000049cf60] consecutive B-frames: 24.5% 5.1% 6.6% 63.7%
[libx264 @ 000000000049cf60] mb I I16..4: 19.5% 35.8% 44.7%
[libx264 @ 000000000049cf60] mb P I16..4: 4.1% 5.7% 2.0% P16..4: 24.4% 6.9% 2.5% 0.0% 0.0% skip:54.5%
[libx264 @ 000000000049cf60] mb B I16..4: 0.3% 0.2% 0.1% B16..8: 11.1% 0.7% 0.1% direct: 0.4% skip:87.0% L0:44.2% L1:52.6% BI: 3.2%
[libx264 @ 000000000049cf60] 8x8 transform intra:45.7% inter:56.0%
[libx264 @ 000000000049cf60] coded y,uvDC,uvAC intra: 45.5% 44.9% 11.1% inter: 5.9% 6.3% 0.4%
[libx264 @ 000000000049cf60] i16 v,h,dc,p: 24% 39% 14% 23%
[libx264 @ 000000000049cf60] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 28% 24% 24% 3% 4% 4% 5% 3% 4%
[libx264 @ 000000000049cf60] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 22% 26% 14% 5% 7% 7% 7% 5% 6%
[libx264 @ 000000000049cf60] i8c dc,h,v,p: 57% 22% 17% 4%
[libx264 @ 000000000049cf60] Weighted P-Frames: Y:6.6% UV:4.4%
[libx264 @ 000000000049cf60] ref P L0: 79.1% 10.7% 8.3% 1.9% 0.1%
[libx264 @ 000000000049cf60] ref B L0: 92.1% 7.3% 0.7%
[libx264 @ 000000000049cf60] ref B L1: 94.8% 5.2%
[libx264 @ 000000000049cf60] kb/s:415.11
-
Display filtered ffmpeg frame with GTK
22 January 2015, by Fjotten — I want to display cropped and scaled frames using the ffmpeg API, and I am using GTK+ 3 for the GUI components. Following this tutorial and the ffmpeg examples, I am able to display unfiltered frames, though with some instability. The filtered frames do not display correctly at all; the output is mostly completely black. I suspect that this is due to sws_scale(), but I have not found out why this is happening.
Using the "trivial" display from the ffmpeg examples I can confirm that the frame is being cropped and scaled properly.
Running the code below I get a bunch of errors:

[swscaler @ 0x7fb58b025400] bad src image pointers
[swscaler @ 0x7fb58b025400] bad dst image pointers

I also get this error:
[swscaler @ 0x7fd05c025600] Warning: data is not aligned! This can lead to a speedloss
I tried making a 16-byte-aligned buffer, but it did not seem to have any effect on the result.
This is how I decode the frames and apply the filters:
void decode(gpointer args) {
int ret;
AVPacket packet;
AVFrame *frame = av_frame_alloc();
AVFrame *filt_frame = av_frame_alloc();
int got_frame;
if(!frame || !filt_frame) {
perror("Could not allocate frame");
exit(1);
}
/* read all packets */
while (1) {
if ((ret = av_read_frame(fmt_ctx, &packet)) < 0)
break;
if (packet.stream_index == video_stream_index) {
got_frame = 0;
ret = avcodec_decode_video2(dec_ctx, frame, &got_frame, &packet);
if (ret < 0) {
av_log(NULL, AV_LOG_ERROR, "Error decoding video\n");
break;
}
if (got_frame) {
frame->pts = av_frame_get_best_effort_timestamp(frame);
/* push the decoded frame into the filtergraph */
if (av_buffersrc_add_frame_flags(buffersrc_ctx, frame, AV_BUFFERSRC_FLAG_KEEP_REF) < 0) {
av_log(NULL, AV_LOG_ERROR, "Error while feeding the filtergraph\n");
break;
}
/* pull filtered frames from the filtergraph */
while (1) {
ret = av_buffersink_get_frame(buffersink_ctx, filt_frame);
if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
break;
if (ret < 0)
goto end;
display_frame2(filt_frame, buffersink_ctx->inputs[0]->time_base);
av_frame_unref(filt_frame);
}
av_frame_unref(frame);
}
}
av_free_packet(&packet);
}
end:
avfilter_graph_free(&filter_graph);
avcodec_close(dec_ctx);
avformat_close_input(&fmt_ctx);
av_frame_free(&frame);
av_frame_free(&filt_frame);
if (ret < 0 && ret != AVERROR_EOF) {
fprintf(stderr, "Error occurred: %s\n", av_err2str(ret));
exit(1);
}
}

And this is how I display the frames:
void display_frame2(const AVFrame *frame, AVRational time_base) {
GdkPixbuf *pixbuf;
int64_t delay;
AVFrame *filt;
uint8_t *buffer;
int num_bytes, i;
buffer = NULL;
filt = av_frame_alloc();
num_bytes = avpicture_get_size(PIX_FMT_RGB24, dec_ctx->width, dec_ctx->height);
buffer = (uint8_t *)av_malloc(num_bytes * sizeof(uint8_t));
avpicture_fill((AVPicture *)filt, buffer, PIX_FMT_RGB24, dec_ctx->width, dec_ctx->height);
if (frame->pts != AV_NOPTS_VALUE) {
if (last_pts != AV_NOPTS_VALUE) {
/* sleep roughly the right amount of time;
* usleep is in microseconds, just like AV_TIME_BASE. */
delay = av_rescale_q(frame->pts - last_pts,
time_base, AV_TIME_BASE_Q);
if (delay > 0 && delay < 1000000)
usleep(delay);
}
last_pts = frame->pts;
}
sws_scale( sws_ctx,
(uint8_t const * const *)frame->data,
frame->linesize,
0,
frame->height,
filt->data,
filt->linesize);
pixbuf = gdk_pixbuf_new_from_data( filt->data[0], GDK_COLORSPACE_RGB,
0, 8, dec_ctx->width, dec_ctx->height,
filt->linesize[0], NULL, NULL);
gtk_image_set_from_pixbuf((GtkImage *)image, pixbuf);
free( filt );
free( buffer );
}

EDIT:
After some more thought and experimentation I got the filtered frames to display, albeit in SDL, not GTK+. I used the transcoding example from ffmpeg to see if I could re-encode the video with the filters, and that does indeed work. With that example I basically changed the filter being fed to the filtergraph, and most of the work is already done. At this point all I am doing is displaying the video using SDL as shown in dranger's tutorial. The cropping filter creates a lot of artifacts, but it is at least showing something.

I have to do some more work to see if it will work with GTK+. I have not taken a detailed look at the differences between the above program and the one in the transcoding example, so I have not yet figured out why my old code does not display filtered frames. Both sets of code use sws_scale(), but I get no errors with the new code, so something must be different. I will update this post once I make more progress.
EDIT 2:
Added a small compilable example that should work, as per @drahnr's request. I have not had the chance to try out replacing GdkPixbuf.

#define _XOPEN_SOURCE 600
#include <libavformat/avformat.h>
#include <libavformat/avio.h>
#include <libavfilter/avfiltergraph.h>
#include <libavfilter/avcodec.h>
#include <libavfilter/buffersink.h>
#include <libavfilter/buffersrc.h>
#include <libavcodec/avcodec.h>
#include <libswscale/swscale.h>
#include <libavutil/avstring.h>
#include <libavutil/time.h>
#include <libavutil/opt.h>
#include <unistd.h>
#include <gtk/gtk.h>
#include <gdk/gdkx.h>
GtkWidget *image;
GtkWidget *window;
struct SwsContext *sws_ctx;
char *filter_descr = "crop=100:100,scale=640:360";
static AVFormatContext *fmt_ctx;
static AVCodecContext *dec_ctx;
AVFilterContext *buffersink_ctx;
AVFilterContext *buffersrc_ctx;
AVFilterGraph *filter_graph;
static int video_stream_index = -1;
static void open_input_file(const char *filename)
{
AVCodec *dec;
avformat_open_input(&fmt_ctx, filename, NULL, NULL);
avformat_find_stream_info(fmt_ctx, NULL);
video_stream_index = av_find_best_stream(fmt_ctx, AVMEDIA_TYPE_VIDEO, -1, -1, &dec, 0);
dec_ctx = fmt_ctx->streams[video_stream_index]->codec;
av_opt_set_int(dec_ctx, "refcounted_frames", 1, 0);
avcodec_open2(dec_ctx, dec, NULL);
}
static void init_filters(const char *filters_descr)
{
char args[512];
AVFilter *buffersrc = avfilter_get_by_name("buffer");
AVFilter *buffersink = avfilter_get_by_name("buffersink");
AVFilterInOut *outputs = avfilter_inout_alloc();
AVFilterInOut *inputs = avfilter_inout_alloc();
AVRational time_base = fmt_ctx->streams[video_stream_index]->time_base;
enum AVPixelFormat pix_fmts[] = { PIX_FMT_RGB24, AV_PIX_FMT_NONE };
filter_graph = avfilter_graph_alloc();
snprintf(args, sizeof(args),
"video_size=%dx%d:pix_fmt=%d:time_base=%d/%d:pixel_aspect=%d/%d",
dec_ctx->width, dec_ctx->height, dec_ctx->pix_fmt,
time_base.num, time_base.den,
dec_ctx->sample_aspect_ratio.num, dec_ctx->sample_aspect_ratio.den);
avfilter_graph_create_filter(&buffersrc_ctx, buffersrc, "in", args, NULL, filter_graph);
avfilter_graph_create_filter(&buffersink_ctx, buffersink, "out", NULL, NULL, filter_graph);
av_opt_set_int_list(buffersink_ctx, "pix_fmts", pix_fmts, AV_PIX_FMT_NONE, AV_OPT_SEARCH_CHILDREN);
outputs->name = av_strdup("in");
outputs->filter_ctx = buffersrc_ctx;
outputs->pad_idx = 0;
outputs->next = NULL;
inputs->name = av_strdup("out");
inputs->filter_ctx = buffersink_ctx;
inputs->pad_idx = 0;
inputs->next = NULL;
avfilter_graph_parse_ptr(filter_graph, filters_descr, &inputs, &outputs, NULL);
avfilter_graph_config(filter_graph, NULL);
}
static void display_frame2(const AVFrame *frame, AVRational time_base) {
GdkPixbuf *pixbuf;
AVFrame *filt;
uint8_t *buffer;
int num_bytes;
buffer = NULL;
filt = av_frame_alloc();
num_bytes = avpicture_get_size(PIX_FMT_RGB24, dec_ctx->width, dec_ctx->height);
buffer = (uint8_t *)av_malloc(num_bytes * sizeof(uint8_t));
avpicture_fill((AVPicture *)filt, buffer, PIX_FMT_RGB24, dec_ctx->width, dec_ctx->height);
usleep(33670 / 4);
sws_scale( sws_ctx,
(uint8_t const * const *)frame->data,
frame->linesize,
0,
frame->height,
filt->data,
filt->linesize);
pixbuf = gdk_pixbuf_new_from_data( filt->data[0], GDK_COLORSPACE_RGB,
0, 8, dec_ctx->width, dec_ctx->height,
filt->linesize[0], NULL, NULL);
gtk_image_set_from_pixbuf((GtkImage *)image, pixbuf);
free( filt );
free( buffer );
}
void decode(gpointer args) {
int ret;
AVPacket packet;
AVFrame *frame = av_frame_alloc();
AVFrame *filt_frame = av_frame_alloc();
int got_frame;
while (1) {
av_read_frame(fmt_ctx, &packet);
if (packet.stream_index == video_stream_index) {
got_frame = 0;
avcodec_decode_video2(dec_ctx, frame, &got_frame, &packet);
if (got_frame) {
frame->pts = av_frame_get_best_effort_timestamp(frame);
if (av_buffersrc_add_frame_flags(buffersrc_ctx, frame, AV_BUFFERSRC_FLAG_KEEP_REF) < 0) {
av_log(NULL, AV_LOG_ERROR, "Error while feeding the filtergraph\n");
break;
}
while (1) {
ret = av_buffersink_get_frame(buffersink_ctx, filt_frame);
if (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF)
break;
// Display original frame
display_frame2(frame, buffersink_ctx->inputs[0]->time_base);
// Display filtered frame
// display_frame2(filt_frame, buffersink_ctx->inputs[0]->time_base);
av_frame_unref(filt_frame);
}
av_frame_unref(frame);
}
}
av_free_packet(&packet);
}
}
static void realize_cb(GtkWidget *widget, gpointer data) {
GThread *tid;
tid = g_thread_new("video", decode, NULL);
}
static void destroy(GtkWidget *widget, gpointer data) {
gtk_main_quit();
}
int main(int argc, char **argv)
{
av_register_all();
avfilter_register_all();
open_input_file(argv[1]);
init_filters(filter_descr);
sws_ctx = NULL;
sws_ctx = sws_getContext( dec_ctx->width, dec_ctx->height, dec_ctx->pix_fmt, dec_ctx->width, dec_ctx->height,
PIX_FMT_RGB24, SWS_BILINEAR, NULL, NULL, NULL );
av_dump_format( fmt_ctx, 0, argv[1], 0);
gtk_init(&argc, &argv);
window = gtk_window_new( GTK_WINDOW_TOPLEVEL );
g_signal_connect(window, "realize", G_CALLBACK(realize_cb), NULL);
g_signal_connect(window, "destroy", G_CALLBACK(destroy), NULL);
gtk_container_set_border_width(GTK_CONTAINER(window), 10);
image = gtk_image_new();
gtk_widget_show(image);
gtk_container_add(GTK_CONTAINER(window), image);
gtk_widget_show(window);
gtk_main();
return 0;
}

-
OUTPUT webcam video over rtp
20 August 2013, by Manjinder Singh Sekhon — I'm currently trying to stream webcam video from a BeagleBone Black to my laptop, and am using this command:
avconv -f video4linux2 -i /dev/video0 -debug -vcodec mpeg2video -r 25 -pix_fmt yuv420p -me_method epzs -b 2600k -bt 256k -f rtp rtp://192.168.1.12:5004
The program starts working, but after a while it doesn't really do anything.
Debug log:

avconv version v0.8.4, Copyright (c) 2000-2012 the Libav developers
built on May 20 2013 13:00:42 with gcc 4.7.3 20130205 (prerelease)
configuration: --enable-shared --enable-pthreads --enable-gpl --enable-postproc --enable-avfilter --cross-prefix=arm-angstrom-linux-gnueabi- --prefix=/usr --enable-avserver --enable-avplay --enable-x11grab --enable-libtheora --enable-libvorbis --enable-libx264 --arch=arm --target-os=linux --enable-cross-compile --extra-cflags=' -fexpensive-optimizations -fomit-frame-pointer -O4 -ffast-math -march=armv7-a -mthumb-interwork -mfloat-abi=softfp -mfpu=neon -mtune=cortex-a8 --sysroot=/build/v2012.12/build/tmp-angstrom_v2012_12-eglibc/sysroots/beaglebone' --extra-ldflags='-Wl,-O1 -Wl,--hash-style=gnu -Wl,--as-needed' --sysroot=/build/v2012.12/build/tmp-angstrom_v2012_12-eglibc/sysroots/beaglebone --enable-hardcoded-tables --cpu=cortex-a8
libavutil 51. 22. 1 / 51. 22. 1
libavcodec 53. 35. 0 / 53. 35. 0
libavformat 53. 21. 0 / 53. 21. 0
libavdevice 53. 2. 0 / 53. 2. 0
libavfilter 2. 15. 0 / 2. 15. 0
libswscale 2. 1. 0 / 2. 1. 0
libpostproc 52. 0. 0 / 52. 0. 0
[video4linux2 @ 0x2dae0] [3]Capabilities: 84000001
[video4linux2 @ 0x2dae0] Querying the device for the current frame size
[video4linux2 @ 0x2dae0] Setting frame size to 640x480
[video4linux2 @ 0x2dae0] The V4L2 driver changed the pixel format from 0x32315559 to 0x56595559
Last message repeated 1 times
[video4linux2 @ 0x2dae0] The V4L2 driver changed the pixel format from 0x50323234 to 0x56595559
[video4linux2 @ 0x2dae0] The V4L2 driver set input_id: 0, input: Camera 1

From what I can see, the program stops because it can't get any input from the webcam. The webcam works when I use an OpenCV program (so /dev/video0 definitely works). Any ideas? I've also tried using ffmpeg, but I get a message that I should use avconv.
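(An editorial aside, not from the thread: the log shows the V4L2 driver twice rejecting the requested pixel format, so one hedged thing to try is pinning the capture geometry and rate on the input side instead of letting avconv negotiate. The -s and -r values below are illustrative assumptions; only the device path and the RTP destination come from the question.)

```shell
# Hedged sketch: declare the capture size and a modest rate before -i so the
# V4L2 device is opened with explicit parameters; values here are assumptions.
avconv -f video4linux2 -s 640x480 -r 15 -i /dev/video0 \
  -vcodec mpeg2video -pix_fmt yuv420p -b 2600k -f rtp rtp://192.168.1.12:5004
```

Testing the same input options against a local file output first (e.g. an .mpg on disk) would separate capture problems from RTP muxing problems.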