
Media (91)
-
Spoon - Revenge!
15 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
My Morning Jacket - One Big Holiday
15 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Zap Mama - Wadidyusay?
15 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
David Byrne - My Fair Lady
15 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Beastie Boys - Now Get Busy
15 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Granite de l’Aber Ildut
9 September 2011, by
Updated: September 2011
Language: French
Type: Text
Other articles (27)
-
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash player is used.
The HTML5 player in use was created specifically for MediaSPIP: its appearance is fully customisable to match the chosen theme.
These technologies make it possible to deliver video and sound both to conventional computers (...)
-
Farm management
2 March 2010, by
The farm as a whole is managed by "super admins".
Some settings can be adjusted to suit the needs of the different channels.
Initially it uses the "Gestion de mutualisation" plugin
-
Supporting all media types
13 April 2011, by
Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats:
- images: png, gif, jpg, bmp and more
- audio: MP3, Ogg, Wav and more
- video: AVI, MP4, OGV, mpg, mov, wmv and more
- text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)
On other sites (3937)
-
Center crop image overlay using FFmpeg
27 May 2020, by HB.
I currently have an image that is being overlaid on top of a video, as demonstrated in this image:

[image]

The blue square represents the video and the purple lines represent the image on top of the video.

Currently, I have the following command:

"-i", InputVideoPath, "-i", InputImagePath, "-filter_complex", "[0:v]scale=iw*sar:ih,setsar=1,pad='max(iw\\,2*trunc(ih*9/16/2))':'max(ih\\,2*trunc(ow*16/9/2))':(ow-iw)/2:(oh-ih)/2[v0];[1:v][v0]scale2ref[v1][v0];[v0][v1]overlay=x=(W-w)/2:y=(H-h)/2[v]", "-map", "[v]", "-map", "0:a", "-c:v", "libx264", "-preset", "ultrafast", "-r", "30", OutputPath




This adds black padding to the sides of the video and outputs the following:

[image]

But I would like to center-crop the image that is being overlaid instead, giving me this output:

[image]

I've seen answers that demonstrate how to crop an image or crop two videos, but I couldn't find a way to center-crop an image that is being overlaid on top of a video.

The video I'm testing with is 1920x1080, and the size of the image is not constant.

Any help in achieving this will be appreciated.

EDIT (this edit is to add more clarification).

Please have a look at the image below:

[image]

The image above demonstrates:

- Purple lines: the entire screen of the device/player; this will be used as the input image. The user draws on the screen/player.
- Blue: the input video, scaled to fill the screen.
- Green: the actual size of the input video.

With this example, the player/image is 1920x1080 and the actual video size is 640x640. So the video is scaled 440x440 to fill the player.

I tried to use a simple overlay, hoping that it would crop the video/image and output a video with the image at the same position as it was displayed on the device, by doing the following:

ffmpeg -i InputVideo -i InputImage -filter_complex "overlay=(main_w-overlay_w)/2:(main_h-overlay_h)/2" -c:v libx264 -preset ultrafast OutputPath




But the image is not at the same position as it was on the device.

I suspect that I will have to take into account how the video was scaled to fit into the video player.

I'm not sure how I can do this.
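
My current thinking (an untested sketch, with the same placeholder file names as above): assuming the drawing covers the whole 1920x1080 screen and the 640x640 video is shown as a centred 1080x1080 square inside it, I would crop the matching central region out of the image, scale it down to the video's resolution, and only then overlay it over the full frame:

# assumes: image/screen 1920x1080, video displayed as a centred 1080x1080 square, actual video 640x640
ffmpeg -i InputVideo -i InputImage -filter_complex "[1:v]crop=1080:1080:(in_w-1080)/2:(in_h-1080)/2,scale=640:640[ovl];[0:v][ovl]overlay=0:0" -c:v libx264 -preset ultrafast -c:a copy OutputPath

If the player layout or the video size changes, the crop and scale numbers would have to be recomputed, and the drawing would need an alpha channel so that the video remains visible underneath it.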


-
How to change the colour of pixels in videos with OpenGL in C++ [closed]
8 June 2020, by Dennio
I would like to overlay a specifically coloured pattern or matrix over every frame of a video, so that every pixel changes its colour slightly according to a data matrix which I generate from a bitstream. It would begin with the upper-left pixel and go on till the end of the "line", and so on. I would like to change the red and blue values, which means that if the bitstream begins with a "1" the amount of red should be raised by 5, and if it begins with a "0" the amount of blue should be raised by 5. That would be done for every pixel of the frame.

I can already open a video using FFmpeg in a self-made video player, and I can also generate the data matrix, but I just don't know which way is suitable to manipulate the video frames in C++. I have already successfully compiled some OpenGL and OpenGL ES triangle examples on my Raspberry Pi 4. Is it possible to convert the frames and pixels into textures and go from there to display everything? Or is there maybe a better way to do this? I would like to use the GPU of the Raspberry Pi for this task to get good performance.
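
To make the rule concrete, here is a rough CPU-side sketch in C of the per-pixel change I have in mind (the RGB24 buffer layout, the apply_bit_matrix name and the 0/1 matrix are just assumptions for illustration); on the Raspberry Pi I would want the same arithmetic to run in a fragment shader fed with the frame and the data matrix as textures:

#include <stdint.h>
#include <stddef.h>

/* bit 1 -> raise red by 5, bit 0 -> raise blue by 5, clamped to 255 */
static void apply_bit_matrix(uint8_t *rgb, int width, int height,
                             const uint8_t *bits /* width*height values, 0 or 1 */)
{
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            size_t p = (size_t)(y * width + x) * 3;     /* pixel offset: R, G, B */
            int channel = bits[y * width + x] ? 0 : 2;  /* 1 -> red, 0 -> blue */
            int v = rgb[p + channel] + 5;
            rgb[p + channel] = (uint8_t)(v > 255 ? 255 : v);
        }
    }
}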


-
How do I set the framerate/FPS in an FFmpeg code (C)?
2 June 2020, by Tobias v. Brevern
I am trying to encode single pictures into an .avi video. The goal is to have every picture displayed for a set amount of seconds to create a slide show. I tried my script with 10 pictures and a delay of 1/5 of a second each, but the output file was not even half a second long (though it displayed every picture). For setting the framerate I use the time_base option of the AVCodecContext:



ctx->time_base = (AVRational) {1, 5};



When I use the command

ffmpeg -framerate 1/3 -i img%03d.png -codec png output.avi

everything works fine and I get the file I want. I use the png codec because it was the only one I tried that is playable with Windows Media Player.


Am I missing anything here? Is there another option that has an impact on the framerate?
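
For reference, my current understanding (an untested sketch using the same ctx, frame and count variables as in the code below) is that the output duration normally comes from the pts values set on each AVFrame, expressed in ctx->time_base units, and my code never sets frame->pts:

/* sketch: with time_base = 1/5, a pts step of 1 means one picture per 1/5 s */
ctx->time_base = (AVRational){1, 5};
ctx->framerate = (AVRational){5, 1};
/* ... for each picture, before avcodec_send_frame(): */
frame->pts = count;   /* 0, 1, 2, ... -> shown at count * 1/5 s */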



This is my code so far:

Note: I use a couple of self-made data structures and methods from other classes. They are the ones written in all caps. They basically do what the name suggests but are necessary for my project. The input array contains the pictures that I want to encode.

#include <libavcodec/avcodec.h>
#include <libavutil/opt.h>
#include <libavutil/imgutils.h>
#include <libavutil/error.h>

void PixmapsToAVI (ARRAY* arr, String outfile, double secs)
{
    if (arr != nil && outfile != "" && secs != 0) {
        AVCodec* codec = avcodec_find_encoder(AV_CODEC_ID_PNG);
        if (codec) {
            int width = -1;
            int height = -1;
            int ret = 0;

            AVCodecContext* ctx = avcodec_alloc_context3(codec);
            AVFrame* frame = av_frame_alloc();
            AVPacket* pkt = av_packet_alloc();

            FILE* file = fopen(outfile, "wb");

            ARRAYELEMENT* e;
            int count = 0;
            forall (e, *arr) {
                BITMAP bitmap (e->value, false);
                if (width < 0) {
                    /* first picture: configure the encoder with its dimensions */
                    width = bitmap.Width();
                    height = bitmap.Height();

                    ctx->width = width;
                    ctx->height = height;
                    ctx->time_base = (AVRational){1, 5};
                    ctx->framerate = (AVRational){5, 1};
                    ctx->pix_fmt = AV_PIX_FMT_RGB24;
                    ret = avcodec_open2(ctx, codec, NULL);

                    frame->width = width;
                    frame->height = height;
                    frame->format = ctx->pix_fmt;
                    av_opt_set(ctx->priv_data, "preset", "slow", 1);
                }
                ret = av_frame_get_buffer(frame, 1);
                frame->linesize[0] = width * 3;

                bitmap.Convert32();
                byte* pixels = bitmap.PixelsRGB();

                /* The two methods above convert the pixmap into the RGB structure we need.
                   They are not needed to get an output file but are needed to get one that makes sense. */

                fflush(stdout);
                int writeable = av_frame_make_writable(frame);
                if (writeable >= 0) {
                    for (int i = 0; i < (height * width * 3); i++) {
                        frame->data[0][i] = pixels[i];
                    }
                }
                ret = avcodec_send_frame(ctx, frame);
                while (ret >= 0) {
                    ret = avcodec_receive_packet(ctx, pkt);
                }
                count++;
                avcodec_receive_packet(ctx, pkt);
                fwrite(pkt->data, 1, pkt->size, file);
                fflush(stdout);
                av_packet_unref(pkt);
            }
            fclose(file);
            avcodec_free_context(&ctx);
            av_frame_free(&frame);
            av_packet_free(&pkt);
        }
    }
}