
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011
Updated: July 2013
Language: French
Type: Text
Other articles (63)
-
Participate in its translation
10 April 2011
You can help us improve the wording used in the software, or translate it into any new language so that it can reach new linguistic communities.
To do so, we use the SPIP translation interface, where all of MediaSPIP's language modules are available. You simply need to subscribe to the translators' mailing list to ask for more information.
At the moment MediaSPIP is only available in French and (...)
-
List of compatible distributions
26 April 2011
The table below lists the Linux distributions compatible with the automated installation script of MediaSPIP.
Distribution name   Version name           Version number
Debian              Squeeze                6.x.x
Debian              Wheezy                 7.x.x
Debian              Jessie                 8.x.x
Ubuntu              The Precise Pangolin   12.04 LTS
Ubuntu              The Trusty Tahr        14.04
If you want to help us improve this list, you can give us access to a machine whose distribution is not mentioned above, or send us the fixes needed to add (...)
-
Submit bugs and patches
13 April 2011
Unfortunately, no software is ever perfect.
If you think you have found a bug, report it using our ticket system. Please help us fix it by providing the following information: the browser you are using, including its exact version; as precise an explanation of the problem as possible; if possible, the steps that lead to the problem; and a link to the site / page in question.
If you think you have fixed the bug, open a ticket and attach the corrective patch to it.
You may also (...)
On other sites (8699)
-
NV12 textures not working in DirectX 11.1
28 March 2017, by André Vitor
I'm trying to render NV12 textures from frames decoded with ffmpeg 2.8.11 using DirectX 11.1, but when I render them the texture is broken and the colours are always off.
The result is: http://imgur.com/a/YIVQk
The code below shows how I take the frame decoded by ffmpeg, which is in YUV420P format, and then convert it (not sure) to NV12 format by interleaving the U and V planes.
static uint8_t *pixelsPtr_ = nullptr;

// NV12 layout: a full-resolution Y plane followed by a half-height plane of
// interleaved U/V bytes, with an even row pitch.
UINT rowPitch = ((width + 1) >> 1) * 2;
UINT imageSize = (rowPitch * height) + ((rowPitch * height + 1) >> 1);
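// Worked example (illustration only, a 1280x720 frame assumed):
// rowPitch  = ((1280 + 1) >> 1) * 2 = 1280,
// imageSize = 1280 * 720 + (1280 * 720 + 1) / 2 = 921600 + 460800 = 1382400 bytes,
// i.e. the usual NV12 size of width * height * 3 / 2.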
if (!pixelsPtr_)
{
    pixelsPtr_ = new uint8_t[imageSize];
}

int j, position = 0;

uint32_t pitchY = avFrame.linesize[0];
uint32_t pitchU = avFrame.linesize[1];
uint32_t pitchV = avFrame.linesize[2];

uint8_t *avY = avFrame.data[0];
uint8_t *avU = avFrame.data[1];
uint8_t *avV = avFrame.data[2];

::SecureZeroMemory(pixelsPtr_, imageSize);

// Copy the Y plane row by row (linesize can be larger than width).
for (j = 0; j < height; j++)
{
    ::CopyMemory(pixelsPtr_ + position, avY, (width));
    position += (width);
    avY += pitchY;
}

// Copy the chroma planes: a half-width row of U, then a half-width row of V,
// appended one after the other.
for (j = 0; j < height >> 1; j++)
{
    ::CopyMemory(pixelsPtr_ + position, avU, (width >> 1));
    position += (width >> 1);
    avU += pitchU;

    ::CopyMemory(pixelsPtr_ + position, avV, (width >> 1));
    position += (width >> 1);
    avV += pitchV;
}

This is how I'm creating the Texture2D with the data I just got.
// Create the NV12 texture.
D3D11_TEXTURE2D_DESC desc;
desc.Width = width;
desc.Height = height;
desc.MipLevels = 1;
desc.ArraySize = 1;
desc.Format = DXGI_FORMAT_NV12;
desc.SampleDesc.Count = 1;
desc.SampleDesc.Quality = 0;
desc.Usage = D3D11_USAGE_DEFAULT;
desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
desc.CPUAccessFlags = 0;
desc.MiscFlags = 0;

D3D11_SUBRESOURCE_DATA initData;
initData.pSysMem = pixelsPtr_;
initData.SysMemPitch = rowPitch;

ID3D11Texture2D* tex = nullptr;
hr = d3dDevice->CreateTexture2D(&desc, &initData, &tex);

if (SUCCEEDED(hr) && tex != 0)
{
    D3D11_SHADER_RESOURCE_VIEW_DESC SRVDesc;
    memset(&SRVDesc, 0, sizeof(SRVDesc));
    SRVDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D;
    SRVDesc.Texture2D.MipLevels = 1;

    // View of the luma plane.
    SRVDesc.Format = DXGI_FORMAT_R8_UNORM;
    hr = d3dDevice->CreateShaderResourceView(tex, &SRVDesc, &textureViewYUV[0]);
    if (FAILED(hr))
    {
        tex->Release();
        return hr;
    }

    // View of the interleaved chroma plane.
    SRVDesc.Format = DXGI_FORMAT_R8G8_UNORM;
    hr = d3dDevice->CreateShaderResourceView(tex, &SRVDesc, &textureViewYUV[1]);
    if (FAILED(hr))
    {
        tex->Release();
        return hr;
    }

    tex->Release();
}

Then I pass both shader resource views to the pixel shader:
graphics->Context()->PSSetShaderResources(0, 2, textureViewYUV);
This is the pixel shader:
struct PixelShaderInput
{
    float4 pos      : SV_POSITION;
    float4 Color    : COLOR;
    float2 texCoord : TEXCOORD;
};

static const float3x3 YUVtoRGBCoeffMatrix =
{
    1.164383f,  1.164383f,  1.164383f,
    0.000000f, -0.391762f,  2.017232f,
    1.596027f, -0.812968f,  0.000000f
};

Texture2D<float>  luminanceChannel;
Texture2D<float2> chrominanceChannel;

SamplerState linearfilter
{
    Filter = MIN_MAG_MIP_LINEAR;
};

float3 ConvertYUVtoRGB(float3 yuv)
{
    // Derived from https://msdn.microsoft.com/en-us/library/windows/desktop/dd206750(v=vs.85).aspx
    // Section: Converting 8-bit YUV to RGB888
    // These values are calculated from (16 / 255) and (128 / 255)
    yuv -= float3(0.062745f, 0.501960f, 0.501960f);
    yuv = mul(yuv, YUVtoRGBCoeffMatrix);

    return saturate(yuv);
}

float4 main(PixelShaderInput input) : SV_TARGET
{
    float  y  = luminanceChannel.Sample(linearfilter, input.texCoord);
    float2 uv = chrominanceChannel.Sample(linearfilter, input.texCoord);

    float3 YUV  = float3(y, uv.x, uv.y);
    float4 YUV4 = float4(YUV.x, YUV.y, YUV.z, 1);

    float3 RGB  = ConvertYUVtoRGB(YUV);
    float4 RGB4 = float4(RGB.x, RGB.y, RGB.z, 1);

    return RGB4;
}

Can someone help me? What am I doing wrong?
EDIT #1
int k;
int skipLineArea = 0;
int uvCount = (height >> 1) * (width >> 1);

// Interleave U and V byte by byte; whenever a source row of width/2 samples
// has been consumed, jump over the padding at the end of the U/V line.
for (j = 0, k = 0; j < uvCount; j++, k++)
{
    if (skipLineArea == (width >> 1))
    {
        k += pitchU - (width >> 1);
        skipLineArea = 0;
    }

    pixelsPtr_[position++] = avU[k];
    pixelsPtr_[position++] = avV[k];
    skipLineArea++;
}

EDIT #2
Updating the texture instead of creating new ones
D3D11_MAPPED_SUBRESOURCE mappedResource;
d3dContext->Map(tex, 0, D3D11_MAP_WRITE_DISCARD, 0, &mappedResource);

uint8_t* mappedData = reinterpret_cast<uint8_t*>(mappedResource.pData);

// Copy height rows of luma plus height/2 rows of chroma, advancing the
// destination by the pitch reported by the driver.
for (UINT i = 0; i < height * 1.5; ++i)
{
    memcpy(mappedData, frameData, rowPitch);
    mappedData += mappedResource.RowPitch;
    frameData += rowPitch;
}

d3dContext->Unmap(tex, 0);
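For reference, here is a minimal sketch (the function name and parameters are assumptions for illustration, not code from the question) of copying a YUV420P frame straight into a mapped NV12 texture, interleaving U and V per chroma row and honouring the driver-reported RowPitch for both planes:

#include <cstdint>
#include <cstring>
#include <d3d11.h>

// Sketch only: copies a YUV420P frame (separate Y, U, V planes with their own
// line sizes) into a texture mapped as DXGI_FORMAT_NV12. "mapped" is the result
// of ID3D11DeviceContext::Map on the NV12 texture; width/height are the frame size.
static void CopyYUV420PToNV12(const D3D11_MAPPED_SUBRESOURCE& mapped,
                              const uint8_t* srcY, int pitchY,
                              const uint8_t* srcU, int pitchU,
                              const uint8_t* srcV, int pitchV,
                              int width, int height)
{
    uint8_t* dst = static_cast<uint8_t*>(mapped.pData);

    // Luma plane: one full-width row per line, stepping by the texture's RowPitch.
    for (int y = 0; y < height; ++y)
    {
        std::memcpy(dst + y * mapped.RowPitch, srcY + y * pitchY, width);
    }

    // Chroma plane: height/2 rows of width bytes, each row holding width/2
    // interleaved U/V pairs.
    uint8_t* dstUV = dst + static_cast<size_t>(height) * mapped.RowPitch;
    for (int y = 0; y < height / 2; ++y)
    {
        uint8_t* row      = dstUV + y * mapped.RowPitch;
        const uint8_t* u  = srcU + y * pitchU;
        const uint8_t* v  = srcV + y * pitchV;
        for (int x = 0; x < width / 2; ++x)
        {
            row[2 * x]     = u[x];
            row[2 * x + 1] = v[x];
        }
    }
}

The sketch assumes width and height are even and that the chroma plane of a mapped NV12 texture begins after height rows of RowPitch bytes.
-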
h264_mp4toannexb: Improve extradata overread checks
14 December 2019, by Andreas Rheinhardt
h264_mp4toannexb: Improve extradata overread checks
Currently, while parsing the extradata, h264_mp4toannexb checks for
overreads by adding the size of the current unit to the current position
pointer and comparing this to the end position of the extradata. But
pointer comparisons and pointer arithmetic are only defined as long as they
do not go beyond the object they are used on (one past the last element of
an array is allowed, too). In practice, this might lead to overflows. Therefore
the check has been changed to use bytestream2_get_bytes_left(), which
means that the pointers get subtracted and the result gets compared to
the available size.

Furthermore, the error code has been fixed.
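As a rough illustration of the difference (a simplified standalone sketch, not the actual FFmpeg code):

#include <stddef.h>
#include <stdint.h>

/* Old style: form a pointer past the unit and compare it with the end pointer.
 * If current + unit_size does not point into (or one past) the buffer, the
 * addition and the comparison are undefined behaviour and may wrap around. */
static int unit_fits_pointer_style(const uint8_t *current, const uint8_t *end,
                                   size_t unit_size)
{
    return current + unit_size <= end;
}

/* New style: compare sizes only, as bytestream2_get_bytes_left() allows.
 * The remaining byte count comes from subtracting two valid pointers, so no
 * out-of-range pointer is ever created. */
static int unit_fits_size_style(size_t bytes_left, size_t unit_size)
{
    return unit_size <= bytes_left;
}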
Signed-off-by: Andreas Rheinhardt <andreas.rheinhardt@gmail.com>
Signed-off-by: Michael Niedermayer <michael@niedermayer.cc>
-
HEVC: Fetching the input width and height from an input bin stream
3 September 2014, by Zax
I have created an elementary bin stream using the HM-12.0 reference code, so the output is an HEVC-encoded bin stream (say input.bin).
I have a task which involves reading the header of this elementary stream. That is, I need to fetch information such as the stream width, height, etc. from the input.bin file.
After looking at a lot of streams, I can conclude that all these bin streams start with the sequence:
00 00 00 01
So whenever I see this sequence in a bin stream, I can say that the stream has to be decoded by an HEVC decoder.
Further, if I want to fetch the width, height, fps, etc. from input.bin (like the ff_raw_video_read_header function in ffmpeg), what are the steps that need to be performed to fetch this information?
I have gone through the parsing section of the HEVC draft, but it is very complicated for my level in the video domain. Can anyone suggest a simple way to fetch the required information from the encoded bin file?
Any suggestions will be really helpful to me. Thanks in advance.
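As a starting point, here is a minimal sketch (the function name and the printf-based output are assumptions for illustration) that scans an Annex-B stream for the start code mentioned above and reports each NAL unit type; type 33 is the SPS, which is where the width and height live:

#include <cstdint>
#include <cstdio>
#include <vector>

// Sketch: list the NAL units of an Annex-B HEVC stream by looking for the
// 00 00 00 01 start code (3-byte 00 00 01 start codes can also occur and are
// not handled here). The HEVC NAL header is two bytes; the unit type sits in
// bits 6..1 of the first header byte.
static void listNalUnits(const std::vector<uint8_t>& bs)
{
    for (size_t i = 0; i + 4 < bs.size(); ++i)
    {
        if (bs[i] == 0 && bs[i + 1] == 0 && bs[i + 2] == 0 && bs[i + 3] == 1)
        {
            uint8_t nalType = (bs[i + 4] >> 1) & 0x3F;
            std::printf("NAL unit at offset %zu, type %u%s\n",
                        i, static_cast<unsigned>(nalType),
                        nalType == 33 ? " (SPS)" : "");
            i += 3; // continue scanning after the start code
        }
    }
}

Reading the actual values still requires parsing the SPS payload itself: pic_width_in_luma_samples and pic_height_in_luma_samples are Exp-Golomb coded fields that follow the profile_tier_level section, so a small bit reader is needed, or an existing parser (for example ffmpeg's) can be reused.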