
Media (1)
-
The Slip - Artworks
26 September 2011, by
Updated: September 2011
Language: English
Type: Text
Other articles (49)
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed MediaSPIP is at version 0.2 or later. If needed, contact your MediaSPIP administrator to find out. -
HTML5 audio and video support
13 April 2011, by
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player was created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...) -
HTML5 audio and video support
10 April 2011
MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
For older browsers, the Flowplayer Flash fallback is used.
The HTML5 player was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
These technologies make it possible to deliver video and audio both on conventional computers (...)
On other sites (10197)
-
ffmpeg how to end video with a black frame/solid color
10 May 2021, by request
I'm scaling two videos of different sizes and using hstack to place them side by side.


If the two videos have different lengths and, for example, the first one ends first, its last frame is shown for the rest of the output. I want it to show nothing / a black frame instead until the other video ends.


My command so far:


ffmpeg -i vid1.mp4 -i vid2.mp4 -filter_complex "[1][0]scale2ref[2nd][ref];[ref][2nd]hstack" -vsync 0 output.mp4



How could this be achieved? (Something with tpad or stop_mode, maybe?)
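One possible direction, sketched under the assumption that vid1.mp4 is the clip that ends first: pad that input indefinitely with black frames using tpad, then let hstack stop when the unpadded input ends.

ffmpeg -i vid1.mp4 -i vid2.mp4 -filter_complex "[1][0]scale2ref[2nd][ref];[ref]tpad=stop=-1:stop_mode=add:color=black[refpad];[refpad][2nd]hstack=shortest=1" -vsync 0 output.mp4

Here tpad=stop=-1:stop_mode=add:color=black appends black frames to vid1 after it ends, and hstack=shortest=1 makes the output stop when vid2 ends; if vid2 is the shorter clip instead, the same tpad filter can be moved to the [2nd] branch.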


Here are two sample videos to test with:






-
Using GDIgrab in FFmpeg with dshow Audio produces black screen
16 March 2016, by Spready
Here is my command:
ffmpeg -f gdigrab -framerate 25 -offset_x 10 -offset_y 10 -show_region 1 -draw_mouse 1 -video_size 1280x720 -i desktop -f dshow -i audio="Microphone (2- ATR USB microphone)" -r 25 -threads 4 -c:v libx264 -pix_fmt yuv422p -preset superfast -tune fastdecode -x264opts keyint=25:min-keyint=1 -crf 4 -c:a aac -profile:a aac_low -async 25 "C:\Users\david\Desktop\%output%.mp4"
The gdigrab video works great when it is on its own (no audio). The audio works fine when it is on its own (no video). When I join the two commands to capture both together, as soon as I move a window within my capture area, the area goes black.
In Windows 7, I used to get around this by stopping the desktop composition service prior to capture (SC stop uxsms), but this is no longer possible in Windows 10.
I thought it may be something graphics card related.
My main monitor is on an Nvidia card, with my second running from the onboard Intel. This is set up for Quicksync H264 playback and encoding with my NLE. I know that I could use a dshow screen capture driver such as UScreen, but I am trying to avoid that because I need the capture area to be specified each time from a simple batch file.
Any help appreciated to solve this black area problem; it's driving me crazy!
David -
YUV to RGB by Shader on iPhone
27 October 2012, by user1333656
I am currently developing a video player using FFmpeg.
I'm converting YUV420P to RGB in a shader to reduce the performance hit, and the conversion itself works fine. The problem appears when I try to change the image size.
Case 1: The YUV-to-RGB conversion is perfect, but the image does not fit the texture bounds exactly.
For example, if I play a 640x360 video, the right part of the video (640 minus 512 pixels) is cropped and the bottom of the texture (512 minus 360 lines) is filled with a green rectangle.

FRAME_X = 512; // This is the texture size
FRAME_Y = 512;
// Fill the destination picture with the texture-sized buffer, then copy the decoded frame into it
avpicture_fill((AVPicture *) f, [currentVideoBuffer.data mutableBytes],
    enc->pix_fmt,
    FRAME_X, FRAME_Y);
av_picture_copy((AVPicture *) f, (AVPicture *) avFrame,
    enc->pix_fmt,
    enc->width, enc->height);
...
// Upload each plane as an 8-bit luminance texture (U and V are half size in each dimension for YUV420P)
int yuvWidth = FRAME_X;
int yuvHeight = FRAME_Y;
glBindTexture(GL_TEXTURE_2D, textureIdY);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE,
    yuvWidth, yuvHeight, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, y_channel);
glBindTexture(GL_TEXTURE_2D, textureIdU);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE,
    yuvWidth/2, yuvHeight/2, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, u_channel);
glBindTexture(GL_TEXTURE_2D, textureIdV);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE,
    yuvWidth/2, yuvHeight/2, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, v_channel);

Case 2: If I set the actual image size as the texture size, the image fits the texture exactly, but the colors look strange: there is too much green.
Can anybody give me some clues about this?
Thanks in advance.