
Medias (91)
-
Spoon - Revenge!
15 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
My Morning Jacket - One Big Holiday
15 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Zap Mama - Wadidyusay?
15 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
David Byrne - My Fair Lady
15 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Beastie Boys - Now Get Busy
15 September 2011, by
Updated: September 2011
Language: English
Type: Audio
-
Granite de l’Aber Ildut
9 September 2011, by
Updated: September 2011
Language: French
Type: Text
Other articles (20)
-
Accepted formats
28 January 2010, by
The following commands give information about the formats and codecs handled by the local ffmpeg installation:
ffmpeg -codecs
ffmpeg -formats
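As a quick illustration (added here, not part of the original article), the output of these commands can be filtered to check for one specific codec or container, for example:
# is H.264 handled by the local build?
ffmpeg -codecs 2>/dev/null | grep -i h264
# is the FLV container available?
ffmpeg -formats 2>/dev/null | grep -i flv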
Accepted input video formats
This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
Possible output video formats
To start with, we (...) -
Adding notes and captions to images
7 February 2011, by
To add notes and captions to images, the first step is to install the "Légendes" plugin.
Once the plugin is activated, you can configure it in the configuration area to change the rights for creating, modifying and deleting notes. By default, only site administrators can add notes to images.
Changes when adding a media item
When adding a media item of type "image", a new button appears above the preview (...) -
Videos
21 April 2011, by
Like "audio" documents, MediaSPIP displays videos whenever possible using the HTML5 video tag.
One of the drawbacks of this tag is that it is not correctly recognised by some browsers (Internet Explorer, to name one) and that each browser natively supports only certain video formats.
Its main advantage, on the other hand, is native video playback in the browser, which removes the need for Flash and (...)
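As an illustration of working around that limitation (a sketch added here, not from the article; file names are placeholders), a source video can be transcoded with ffmpeg into the two formats most widely played natively by browsers:
# H.264/AAC in MP4, for browsers with native H.264 support
ffmpeg -i source.avi -c:v libx264 -c:a aac -movflags +faststart output.mp4
# VP8/Vorbis in WebM, as a fallback for the others
ffmpeg -i source.avi -c:v libvpx -c:a libvorbis output.webm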
On other websites (4219)
-
avcodec: move mastering display colour volume SEI handling to h2645_sei
11 July 2023, by Jan Ekström
avcodec: move mastering display colour volume SEI handling to h2645_sei
This allows this common H.274 SEI to be parsed from both H.264
as well as HEVC, as well as probably from VVC in the future.
Generally attempts to keep the original code as similar as possible.
FATE test reference changes only change the order of side data
export within a single frame. Nothing else seems to have changed.
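For context (an illustration added here, not part of the commit message; the file name is a placeholder), the SEI this commit relocates is exported as frame side data, which can be inspected with ffprobe on an HDR sample:
# show the mastering display metadata exported for the first video frame
ffprobe -v error -select_streams v:0 -show_frames -read_intervals "%+#1" hdr_sample.mkv | grep -i -A 8 mastering
-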
10bit DPX to DNXHR_444 with FFmpeg causing colour shift
15 August 2019, by Josh Northeast
I'm trying to build a Python application to convert a 10bit DPX sequence to a 4K DNXHR_444 MOV with an Arri to Rec709 LUT, just as I would in DaVinci Resolve.
ffmpeg -f image2 -framerate 24 -pattern_type glob -i INPUT.dpx -c:v dnxhd -profile:v dnxhr_444 -vf lut3d=ArriAlexa_LogCtoRec709_Resolve.cube,colormatrix=bt601:bt709 -pix_fmt yuv444p10le -c:a pcm_s16le -y -timecode 00:00:41:16 OUTPUT.mov
When comparing the output to the DPX in Resolve with the LUT on it, there is a slight colour shift making everything slightly more red. Even when I take the LUT out of the ffmpeg command, there is still a slight redness. The colormatrix filter helps a bit to get it closer, but it isn't close enough. Any ideas why I can't get them to match?
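One thing that may be worth checking (a suggestion added here, not part of the original post) is which colour properties ffmpeg reads from the DPX frames and tags on the MOV, since missing or mismatched colour metadata is a common cause of this kind of shift between tools. For a source frame and the output (file names as in the command above):
ffprobe -v error -select_streams v:0 -show_entries stream=pix_fmt,color_space,color_primaries,color_transfer,color_range INPUT.dpx
ffprobe -v error -select_streams v:0 -show_entries stream=pix_fmt,color_space,color_primaries,color_transfer,color_range OUTPUT.mov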
LOG:
ffmpeg -f image2 -framerate 24 -pattern_type glob -i /dpx/*.dpx -c:v dnxhd -profile:v dnxhr_444 -vf lut3d=/Arri/ArriAlexa_LogCtoRec709_Resolve.cube,colormatrix=bt601:bt709 -pix_fmt yuv444p10le -c:a pcm_s16le -y -timecode 00:00:41:16 /dpx/test.mov
ffmpeg version 4.2-static https://johnvansickle.com/ffmpeg/ Copyright (c) 2000-2019 the FFmpeg developers
built with gcc 6.3.0 (Debian 6.3.0-18+deb9u1) 20170516
configuration: --enable-gpl --enable-version3 --enable-static --disable-debug --disable-ffplay --disable-indev=sndio --disable-outdev=sndio --cc=gcc-6 --enable-fontconfig --enable-frei0r --enable-gnutls --enable-gmp --enable-libgme --enable-gray --enable-libaom --enable-libfribidi --enable-libass --enable-libvmaf --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-librubberband --enable-libsoxr --enable-libspeex --enable-libsrt --enable-libvorbis --enable-libopus --enable-libtheora --enable-libvidstab --enable-libvo-amrwbenc --enable-libvpx --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libdav1d --enable-libxvid --enable-libzvbi --enable-libzimg
libavutil 56. 31.100 / 56. 31.100
libavcodec 58. 54.100 / 58. 54.100
libavformat 58. 29.100 / 58. 29.100
libavdevice 58. 8.100 / 58. 8.100
libavfilter 7. 57.100 / 7. 57.100
libswscale 5. 5.100 / 5. 5.100
libswresample 3. 5.100 / 3. 5.100
libpostproc 55. 5.100 / 55. 5.100
Input #0, image2, from '/dpx/*.dpx':
Duration: 00:00:07.67, start: 0.000000, bitrate: N/A
Stream #0:0: Video: dpx, gbrp10le, 4096x1716 [SAR 1:1 DAR 1024:429], 24 tbr, 24 tbn, 24 tbc
Stream mapping:
Stream #0:0 -> #0:0 (dpx (native) -> dnxhd (native))
Press [q] to stop, [?] for help
Output #0, mov, to '/dpx/test.mov':
Metadata:
timecode : 00:00:41:16
encoder : Lavf58.29.100
Stream #0:0: Video: dnxhd (DNXHR 444) (AVdh / 0x68645641), yuv444p10le, 4096x1716 [SAR 1:1 DAR 1024:429], q=2-1024, 200 kb/s, 0.04 fps, 12288 tbn, 24 tbc
Metadata:
encoder : Lavc58.54.100 dnxhd
frame= 184 fps=1.9 q=1.0 Lsize= 1117250kB time=00:00:07.62 bitrate=1200316.8kbits/s speed=0.0799x
video:1117248kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.000187%
Here is a:
- JPG - reference of what the DPX + LUT should look like
- DPX - a frame of the dpx sequence I’m trying to convert to DNXHR_444
- the ArriLogC_to_Rec709 LUT to be applied to the DPX
Let me know if you need anything else.
https://drive.google.com/drive/folders/1j2Qq1sV5ZJJsMYe3DOFOV0dnQ3VIotcw
Cheers,
Josh -
Fragment shader does not show any colour when compiled with vs2013
11 June 2015, by 5mayfive
When compiled with VS2010, the fragment shader works, but when compiled and run in VS2013, the output is grey.
My fragment shader converts the YUV texture into RGB.
Below is my fragment shader code:
const char *FProgram =
"uniform sampler2D Ytex;\n"
"uniform sampler2D Utex;\n"
"uniform sampler2D Vtex;\n"
"void main(void) {\n"
" vec4 c = vec4((texture2D(Ytex, gl_TexCoord[0]).r - 16./255.) * 1.164);\n"
" vec4 U = vec4(texture2D(Utex, gl_TexCoord[0]).r - 128./255.);\n"
" vec4 V = vec4(texture2D(Vtex, gl_TexCoord[0]).r - 128./255.);\n"
" c += V * vec4(1.596, -0.813, 0, 0);\n"
" c += U * vec4(0, -0.392, 2.017, 0);\n"
" c.a = 1.0;\n"
" gl_FragColor = c;\n"
"}\n";
glClearColor(0, 0, 0, 0);
PHandle = glCreateProgram();
FSHandle = glCreateShader(GL_FRAGMENT_SHADER);
glShaderSource(FSHandle, 1, &FProgram, NULL);
glCompileShader(FSHandle);
glAttachShader(PHandle, FSHandle);
glLinkProgram(PHandle);
glUseProgram(PHandle);
glDeleteProgram(PHandle);
glDeleteProgram(FSHandle);
This is my texture code. I receive the linesize and YUV frame data from ffmpeg and turn them into textures. Everything works fine on the VS2010 machine, but when compiled and run on the VS2013 machine, the output is grey (black and white), with no colour.
/* Select texture unit 1 as the active unit and bind the U texture. */
glPixelStorei(GL_UNPACK_ROW_LENGTH, linesize1);
glActiveTexture(GL_TEXTURE1);
i = glGetUniformLocation(PHandle, "Utex");
glUniform1i(i, 1); /* Bind Utex to texture unit 1 */
glBindTexture(GL_TEXTURE_2D, 1);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width / 2, height / 2, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, frame1);
glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);
/* Select texture unit 2 as the active unit and bind the V texture. */
glPixelStorei(GL_UNPACK_ROW_LENGTH, linesize2);
glActiveTexture(GL_TEXTURE2);
i = glGetUniformLocation(PHandle, "Vtex");
glUniform1i(i, 2); /* Bind Vtext to texture unit 2 */
glBindTexture(GL_TEXTURE_2D, 2);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width / 2, height / 2, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, frame2);
glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);
/* Select texture unit 0 as the active unit and bind the Y texture. */
glPixelStorei(GL_UNPACK_ROW_LENGTH, linesize0);
glActiveTexture(GL_TEXTURE0);
i = glGetUniformLocation(PHandle, "Ytex");
glUniform1i(i, 0);
glBindTexture(GL_TEXTURE_2D, 0);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, frame0);
glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);
glClear(GL_COLOR_BUFFER_BIT);
/* Draw image (again and again). */
glBegin(GL_QUADS);
glTexCoord2i(0, 0);
glVertex2i(-w / 2, h / 2);
glTexCoord2i(1, 0);
glVertex2i(w / 2, h / 2);
glTexCoord2i(1, 1);
glVertex2i(w / 2, -h / 2);
glTexCoord2i(0, 1);
glVertex2i(-w / 2, -h / 2);
glEnd();
Need guidance here, thanks in advance!