
Other articles (96)
-
Publishing on MediaSPIP
13 June 2013
Can I post content from an iPad tablet?
Yes, if your MediaSPIP installation is version 0.2 or later. If needed, contact your MediaSPIP administrator to find out.
-
HTML5 audio and video support
13 April 2011
MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
The MediaSPIP player was created specifically for MediaSPIP and can easily be adapted to fit a specific theme.
For older browsers, the Flowplayer Flash fallback is used.
MediaSPIP allows for media playback on major mobile platforms with the above (...)
-
Emballe médias: what is it for?
4 February 2011
This plugin is designed to manage sites that publish documents of all types online.
It creates "médias": a "média" is an article, in the SPIP sense, created automatically when a document is uploaded, whether it is audio, video, image or text; only one document can be linked to a given "média" article;
On other sites (9185)
-
A lot of GREEN color in YUV420p -> RGB in an OpenGL ES 2.0 shader on iOS
25 October 2012, by 이형근
I want to make a movie player for iOS using ffmpeg and OpenGL ES 2.0,
but I have a problem: the output RGB image has a lot of GREEN color.
Here is the code and the sizes involved:
- 480x320 video width & height
- 512x512 texture width & height
I get YUV420p raw data from the ffmpeg AVFrame:

// Copy the Y, U and V planes from the AVFrame (skipping linesize padding)
// into one contiguous buffer: Y plane first, then U, then V.
for (int i = 0, nDataLen = 0; i < 3; i++) {
    int nShift = (i == 0) ? 0 : 1;
    uint8_t *pYUVData = (uint8_t *)_frame->data[i];
    for (int j = 0; j < (mHeight >> nShift); j++) {
        memcpy(&pData->pOutBuffer[nDataLen], pYUVData, (mWidth >> nShift));
        pYUVData += _frame->linesize[i];
        nDataLen += (mWidth >> nShift);
    }
}

Then I prepare textures for the Y, U and V channels:
//: U texture (texture unit 1, half resolution)
if (sampler1Texture) glDeleteTextures(1, &sampler1Texture);
glActiveTexture(GL_TEXTURE1);
glGenTextures(1, &sampler1Texture);
glBindTexture(GL_TEXTURE_2D, sampler1Texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
// This is necessary for non-power-of-two textures
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glEnable(GL_TEXTURE_2D);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, texW / 2, texH / 2, 0,
             GL_LUMINANCE, GL_UNSIGNED_BYTE, NULL);

//: V texture (texture unit 2, half resolution)
if (sampler2Texture) glDeleteTextures(1, &sampler2Texture);
glActiveTexture(GL_TEXTURE2);
glGenTextures(1, &sampler2Texture);
glBindTexture(GL_TEXTURE_2D, sampler2Texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
// This is necessary for non-power-of-two textures
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glEnable(GL_TEXTURE_2D);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, texW / 2, texH / 2, 0,
             GL_LUMINANCE, GL_UNSIGNED_BYTE, NULL);

//: Y texture (texture unit 0, full resolution)
if (sampler0Texture) glDeleteTextures(1, &sampler0Texture);
glActiveTexture(GL_TEXTURE0);
glGenTextures(1, &sampler0Texture);
glBindTexture(GL_TEXTURE_2D, sampler0Texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
// This is necessary for non-power-of-two textures
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glEnable(GL_TEXTURE_2D);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, texW, texH, 0,
             GL_LUMINANCE, GL_UNSIGNED_BYTE, NULL);

The rendering part is below:
int _idxU = mFrameW * mFrameH;
int _idxV = _idxU + (_idxU / 4);

// U data
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, sampler1Texture);
glUniform1i(sampler1Uniform, 1);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0,
                mFrameW / 2,  // source width
                mFrameH / 2,  // source height
                GL_LUMINANCE, GL_UNSIGNED_BYTE, &_frameData[_idxU]);

// V data
glActiveTexture(GL_TEXTURE2);
glBindTexture(GL_TEXTURE_2D, sampler2Texture);
glUniform1i(sampler2Texture, 2);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0,
                mFrameW / 2,  // source width
                mFrameH / 2,  // source height
                GL_LUMINANCE, GL_UNSIGNED_BYTE, &_frameData[_idxV]);

// Y data
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, sampler0Texture);
glUniform1i(sampler0Uniform, 0);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0,
                mFrameW,  // source width
                mFrameH,  // source height
                GL_LUMINANCE, GL_UNSIGNED_BYTE, _frameData);

The vertex shader and fragment shader are below:
// Vertex shader
attribute vec4 Position;
attribute vec2 TexCoordIn;
varying vec2 TexCoordOut;
varying vec2 TexCoordOut_UV;
uniform mat4 Projection;
uniform mat4 Modelview;

void main()
{
    gl_Position = Projection * Modelview * Position;
    TexCoordOut = TexCoordIn;
}

// Fragment shader
uniform sampler2D sampler0; // Y texture sampler
uniform sampler2D sampler1; // U texture sampler
uniform sampler2D sampler2; // V texture sampler
varying highp vec2 TexCoordOut;

void main()
{
    highp float y = texture2D(sampler0, TexCoordOut).r;
    highp float u = texture2D(sampler2, TexCoordOut).r - 0.5;
    highp float v = texture2D(sampler1, TexCoordOut).r - 0.5;
    //y = 0.0;
    //u = 0.0;
    //v = 0.0;

    // YUV -> RGB conversion (BT.601-style coefficients)
    highp float r = y + 1.13983 * v;
    highp float g = y - 0.39465 * u - 0.58060 * v;
    highp float b = y + 2.03211 * u;
    gl_FragColor = vec4(r, g, b, 1.0);
}

The Y texture (grayscale) is correct, but U and V come out with a lot of green,
so the final RGB image (Y+U+V) has a lot of GREEN color.
What is the problem? Please help.
Thanks.
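(Not part of the original question.) One likely culprit is visible in the posted rendering code: the V-plane upload passes the texture handle sampler2Texture to glUniform1i instead of the uniform location sampler2Uniform, so the sampler2 uniform is never set; in addition, the shader reads u from sampler2 and v from sampler1, the opposite of the units the U and V planes were bound to. A minimal sketch of the suspected fix, reusing the names from the question:

// Suspected fix (sketch only; identifiers are the ones used in the question).
// Pass the uniform location, not the texture object, for the V sampler:
glActiveTexture(GL_TEXTURE2);
glBindTexture(GL_TEXTURE_2D, sampler2Texture);
glUniform1i(sampler2Uniform, 2);   // was: glUniform1i(sampler2Texture, 2);

// And keep the fragment shader consistent with the bindings above
// (unit 1 holds the U plane, unit 2 holds the V plane):
//     highp float u = texture2D(sampler1, TexCoordOut).r - 0.5;
//     highp float v = texture2D(sampler2, TexCoordOut).r - 0.5;

If only the Y texture looks right, checking which integer values sampler1 and sampler2 actually receive is a quick way to confirm this.
-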
Fragmented MP4 has wrong duration in Windows Media Player
3 February 2020, by Expressingx
I'm streaming video from a camera to a file using ffmpeg. The reason for using Fragmented MP4 is that the power can suddenly go off, and the video would otherwise be lost. I've configured the fragments to be at least 30 seconds long, and that is all Windows Media Player reports for the duration even if the recording is 2 hours long; playback continues past that point, but the file is not seekable after the 30th second. How can that be fixed?

ffmpeg -c copy -movflags frag_keyframe+separate_moof+omit_tfhd_offset -min_frag_duration 30000000 -bsf:a aac_adtstoasc \
    out.mp4

If I use -movflags frag_keyframe+empty_moov, then the video is not seekable at all and no duration is reported. I am aiming for a seekable video with the correct duration in both VLC and Windows Media Player.

ffmpeg version: 3.3.4
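(Not part of the original question.) A common workaround, offered here only as an assumption rather than a confirmed fix: keep recording in fragmented MP4 for crash safety, then, when a recording finishes cleanly, remux it into a regular MP4 so the moov atom contains the full index and duration that VLC and Windows Media Player can seek through:

# Post-processing sketch; file names are placeholders.
ffmpeg -i out.mp4 -c copy -movflags +faststart out_fixed.mp4

If the power is cut mid-recording, the original fragmented file remains playable up to the last completed fragment.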
-
Smooth FPS change via FFMPEG
2 June 2020, by RazrFalcon
I have a 240 FPS video from a GoPro and I want to slow it down to 30 FPS linearly. For this I'm using:

ffmpeg -i raw.mp4 -filter_complex "[v:0]setpts='lerp(2,8,T/5)*PTS'" -r 30 output.mp4

* 5 is the video duration in seconds

The problem is that the resulting video stutters pretty heavily. But if I use a fixed setpts, like setpts=8*PTS, then everything is fine.

How can I make a smooth FPS transition via FFMPEG?
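(Not part of the original question.) One possible direction, stated as an assumption rather than a verified fix: after the variable setpts ramp, -r 30 drops frames at uneven intervals, which shows up as stutter. Motion-interpolating to a constant frame rate after the ramp can smooth the result, at a significant encoding cost, for example with ffmpeg's minterpolate filter:

ffmpeg -i raw.mp4 \
  -filter_complex "[0:v]setpts='lerp(2,8,T/5)*PTS',minterpolate=fps=30:mi_mode=mci" \
  output.mp4

mi_mode=blend is much faster than mci if full motion-compensated interpolation turns out to be too slow.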