
Other articles (53)
-
Keeping control of your media in your hands
13 April 2011 — The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and to the companies that profit from media sharing.
While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)
-
Customizing by adding your logo, banner or background image
5 September 2013 — Some themes support three customization elements: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013 — Present the changes to your MediaSPIP, or news about your projects, in your MediaSPIP's news section.
In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customize the news item creation form.
News item creation form: for a document of the news item type, the fields offered by default are: publication date (customize the publication date) (...)
On other sites (6823)
-
Understanding FFMPEG Video Encoding
20 June 2014, by SetSlapShot — I got this from the encoding example in ffmpeg. I can somewhat follow the author's example for audio encoding, but I find myself befuddled looking at the C code (I added block-number comments to help me reference what I'm talking about)...
static void video_encode_example(const char *filename)
{
    AVCodec *codec;
    AVCodecContext *c = NULL;
    int i, out_size, size, x, y, outbuf_size;
    FILE *f;
    AVFrame *picture;
    uint8_t *outbuf, *picture_buf;                      //BLOCK ONE

    printf("Video encoding\n");

    /* find the mpeg1 video encoder */
    codec = avcodec_find_encoder(CODEC_ID_MPEG1VIDEO);
    if (!codec) {
        fprintf(stderr, "codec not found\n");
        exit(1);                                        //BLOCK TWO
    }

    c = avcodec_alloc_context();
    picture = avcodec_alloc_frame();

    /* put sample parameters */
    c->bit_rate = 400000;
    /* resolution must be a multiple of two */
    c->width = 352;
    c->height = 288;
    /* frames per second */
    c->time_base = (AVRational){1, 25};
    c->gop_size = 10; /* emit one intra frame every ten frames */
    c->max_b_frames = 1;
    c->pix_fmt = PIX_FMT_YUV420P;                       //BLOCK THREE

    /* open it */
    if (avcodec_open(c, codec) < 0) {
        fprintf(stderr, "could not open codec\n");
        exit(1);
    }

    f = fopen(filename, "wb");
    if (!f) {
        fprintf(stderr, "could not open %s\n", filename);
        exit(1);
    }                                                   //BLOCK FOUR

    /* alloc image and output buffer */
    outbuf_size = 100000;
    outbuf = malloc(outbuf_size);
    size = c->width * c->height;
    picture_buf = malloc((size * 3) / 2); /* size for YUV 420 */

    picture->data[0] = picture_buf;
    picture->data[1] = picture->data[0] + size;
    picture->data[2] = picture->data[1] + size / 4;
    picture->linesize[0] = c->width;
    picture->linesize[1] = c->width / 2;
    picture->linesize[2] = c->width / 2;                //BLOCK FIVE

    /* encode 1 second of video */
    for (i = 0; i < 25; i++) {
        fflush(stdout);
        /* prepare a dummy image */
        /* Y */
        for (y = 0; y < c->height; y++) {
            for (x = 0; x < c->width; x++) {
                picture->data[0][y * picture->linesize[0] + x] = x + y + i * 3;
            }
        }                                               //BLOCK SIX

        /* Cb and Cr */
        for (y = 0; y < c->height / 2; y++) {
            for (x = 0; x < c->width / 2; x++) {
                picture->data[1][y * picture->linesize[1] + x] = 128 + y + i * 2;
                picture->data[2][y * picture->linesize[2] + x] = 64 + x + i * 5;
            }
        }                                               //BLOCK SEVEN

        /* encode the image */
        out_size = avcodec_encode_video(c, outbuf, outbuf_size, picture);
        printf("encoding frame %3d (size=%5d)\n", i, out_size);
        fwrite(outbuf, 1, out_size, f);
    }                                                   //BLOCK EIGHT

    /* get the delayed frames */
    for (; out_size; i++) {
        fflush(stdout);
        out_size = avcodec_encode_video(c, outbuf, outbuf_size, NULL);
        printf("write frame %3d (size=%5d)\n", i, out_size);
        fwrite(outbuf, 1, out_size, f);
    }                                                   //BLOCK NINE

    /* add sequence end code to have a real mpeg file */
    outbuf[0] = 0x00;
    outbuf[1] = 0x00;
    outbuf[2] = 0x01;
    outbuf[3] = 0xb7;
    fwrite(outbuf, 1, 4, f);
    fclose(f);
    free(picture_buf);
    free(outbuf);

    avcodec_close(c);
    av_free(c);
    av_free(picture);
}                                                       //BLOCK TEN
Here's what I can get from the author's code, block by block...
BLOCK ONE: Initializing variables and pointers. I couldn't find the AVFrame struct in the ffmpeg source code yet, so I don't know what it's referencing.
BLOCK TWO: Looks up the codec; if it isn't found, exit.
BLOCK THREE: Sets sample video parameters. The only thing I don't really get is gop_size. I read about intra frames and I still don't get what they are.
BLOCK FOUR: Open the file for writing...
BLOCK FIVE: Here's where they really start losing me. Part of it is probably because I don't know exactly what AVFrame is, but why do they only use 3/2 of the image size? (See the sketch after this list.)
BLOCK SIX & SEVEN: I don't understand what they are trying to accomplish with this math.
BLOCK EIGHT: It looks like the avcodec function does all the work here; I'm not concerned with that for the time being.
BLOCK NINE: Since it's outside the 25-frame for loop, I assume it gets the leftover (delayed) frames?
BLOCK TEN: Close, free memory, etc...
I know this is a large block of code to be confused by; any input would be helpful. I got put in over my head at work. Thanks in advance, SO.
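A minimal sketch (not part of the original post) of the arithmetic behind BLOCK FIVE: the frame is YUV 4:2:0, so there is one full-resolution luma (Y) plane plus two chroma planes (Cb, Cr) that are subsampled by two in each direction, i.e. each chroma plane is a quarter of the luma size, and the whole frame needs 3/2 of the luma size. The gop_size in BLOCK THREE simply means a key (intra) frame, decodable on its own, is emitted every ten frames, and the loops in BLOCKS SIX and SEVEN just fill the three planes with gradients that shift with the frame index i so the 25 dummy frames differ from each other.

#include <stdio.h>

/* Sketch only: shows why picture_buf is allocated as (size * 3) / 2
 * for a YUV 4:2:0 frame; width/height match the example (352x288). */
int main(void)
{
    int w = 352, h = 288;
    int y_size = w * h;               /* full-resolution luma (Y) plane      */
    int c_size = (w / 2) * (h / 2);   /* one quarter-resolution chroma plane */
    int total  = y_size + 2 * c_size; /* == w * h * 3 / 2                    */

    printf("Y=%d  Cb=Cr=%d  total=%d (= %d)\n",
           y_size, c_size, total, w * h * 3 / 2);
    return 0;
}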
-
How to display an image for a particular length of time in a video using ffmpeg
18 June 2013, by Pratik Bhingardeve — I am trying to create a video from a sequence of images, but I have to display each image for a different number of seconds. How can I do this with FFmpeg?
Thanks in advance.
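One way this can be done (a sketch, not from the original thread; the file names and durations below are made up, and it assumes an ffmpeg build that includes the concat demuxer) is to list each image with its own duration and let the concat demuxer assemble them:

# slides.txt -- hypothetical file names; durations are in seconds
file 'img1.jpg'
duration 3
file 'img2.jpg'
duration 5
file 'img3.jpg'
duration 2
# the last file is repeated so its duration is honoured
file 'img3.jpg'

Then encode with:

ffmpeg -f concat -safe 0 -i slides.txt -vsync vfr -pix_fmt yuv420p out.mp4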
-
ffmpeg with a set of image files not formatted as %d.jpg
12 June 2012, by Rob Lourens — I'm using ffmpeg to compile a set of jpgs into a video. There is plenty written about this, but it seems that the only way to do it is to have images named as consecutive padded numbers, e.g. 0001.jpg, 0002.jpg, ... The ffmpeg documentation states that it is possible to use other types of patterns, such as %*.jpg to capture all *.jpg files, but the only pattern that I have gotten to work on my own is the %0Nd-type pattern. The man page only mentions that type.
I really want to have ffmpeg use a set of images with arbitrary names. It would simplify my app quite a lot and make it easier to keep thumbnails and metadata in sync as images are inserted and deleted, etc. Creating links is not an option since I'm working on Android. Is there any way to do this?
I'm also willing to modify the ffmpeg source or work with the C API to get it to do what I want, but I can't find the right spot in the code to do it, or appropriate docs for the C API. Any advice? Thanks.
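Two approaches that may apply here (sketches, not from the original thread; the file names are made up, and both depend on how the ffmpeg binary was built, which matters on Android): a glob input pattern, or piping the images so that ffmpeg never sees the file names at all.

# 1) glob pattern (needs a build with glob support; not available everywhere)
ffmpeg -framerate 25 -pattern_type glob -i '*.jpg' out.mp4

# 2) pipe the files in any order, with any names; older builds may need -r
#    instead of -framerate
cat cover.jpg thumb_a.jpg photo-17.jpg | ffmpeg -f image2pipe -framerate 25 -c:v mjpeg -i - out.mp4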