Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • iOS multiple video display

    28 March 2017, by Cuevas

    I'm currently working on an iOS project that uses IJKPlayer, which is based on FFmpeg and SDL, to display an RTSP feed from a certain source.

    I have no problem displaying a single video feed, but my project requires me to display multiple streams on the screen simultaneously. The problem I'm facing right now is separating the individual streams and displaying each one in its own player instance, for n instances of the player.

    RTSP -> stream 0, stream 1, stream 2, stream 4 -> display
    

    Here is a sample output I want to achieve. Each color represents a single stream. Thanks!

    [Image: mock-up of the desired multi-stream layout, one colored region per stream]

    Edit: If this is not possible with IJKPlayer, can someone recommend a different approach to implementing this?
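
    For what it's worth, here is a minimal sketch of the underlying demuxing step, written against FFmpeg's libavformat C API rather than IJKPlayer's wrapper (an assumption about where the split could happen; the function name is hypothetical). Each enumerated stream could then drive its own decoder and player view:

    // Sketch only: open one RTSP input and enumerate its elementary streams.
    extern "C" {
    #include <libavformat/avformat.h>
    }
    #include <cstdio>
    
    int listStreams(const char* url)   // e.g. an rtsp:// URL (hypothetical)
    {
        av_register_all();             // still required in FFmpeg of this era
        avformat_network_init();
    
        AVFormatContext* fmt = nullptr;
        if (avformat_open_input(&fmt, url, nullptr, nullptr) < 0)
            return -1;
        if (avformat_find_stream_info(fmt, nullptr) < 0) {
            avformat_close_input(&fmt);
            return -1;
        }
    
        // Packets from av_read_frame() carry a stream_index and can be
        // routed to a per-stream decoder, one per on-screen player instance.
        for (unsigned i = 0; i < fmt->nb_streams; ++i)
            std::printf("stream %u: codec id %d\n", i,
                        (int)fmt->streams[i]->codecpar->codec_id);
    
        avformat_close_input(&fmt);
        return 0;
    }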

  • NV12 textures not working in DirectX 11.1

    28 March 2017, by André Vitor

    I’m trying to render NV12 textures from frames decoded with ffmpeg 2.8.11, using DirectX 11.1, but when I render them the texture is broken and the colors are always off.

    Result is: http://imgur.com/a/YIVQk

    The code below shows how I take the frame decoded by ffmpeg, which is in YUV420P format, and then convert it (I'm not sure correctly) to NV12 format by interleaving the U and V planes.

    static uint8_t *pixelsPtr_ = nullptr;
    
    UINT rowPitch = ((width + 1) >> 1) * 2;                                 // width rounded up to even
    UINT imageSize = (rowPitch * height) + ((rowPitch * height + 1) >> 1);  // Y plane + half-size UV plane
    
    if (!pixelsPtr_)
    {
        pixelsPtr_ = new uint8_t[imageSize];
    }
    
    int j, position = 0;
    
    uint32_t pitchY = avFrame.linesize[0];
    uint32_t pitchU = avFrame.linesize[1];
    uint32_t pitchV = avFrame.linesize[2];
    
    uint8_t *avY = avFrame.data[0];
    uint8_t *avU = avFrame.data[1];
    uint8_t *avV = avFrame.data[2];
    
    ::SecureZeroMemory(pixelsPtr_, imageSize);
    
    for (j = 0; j < height; j++)
    {
        // Copy one full row of luma, then step the source pointer by its
        // line stride (which may be larger than the visible width).
        ::CopyMemory(pixelsPtr_ + position, avY, (width));
        position += (width);
        avY += pitchY;
    }
    
    for (j = 0; j < height >> 1; j++)
    {
        // Copy a half-width row of U followed by a half-width row of V.
        // Note: this alternates U and V per row, not per byte, so the result
        // is row-interleaved rather than the UVUV... byte pairs NV12 expects.
        ::CopyMemory(pixelsPtr_ + position, avU, (width >> 1));
        position += (width >> 1);
        avU += pitchU;
    
        ::CopyMemory(pixelsPtr_ + position, avV, (width >> 1));
        position += (width >> 1);
        avV += pitchV;
    }
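
    For comparison, this is a minimal sketch of a byte-interleaved I420-to-NV12 repack that respects the source line strides; the function name and the tightly packed destination buffer are assumptions, not code from the project:

    // Sketch only: repack planar I420 (separate U and V planes) into NV12
    // (Y plane followed by one interleaved UV plane). Assumes even
    // width/height and a tightly packed destination (pitch == width).
    static void I420ToNV12(const uint8_t* srcY, int pitchY,
                           const uint8_t* srcU, int pitchU,
                           const uint8_t* srcV, int pitchV,
                           uint8_t* dst, int width, int height)
    {
        for (int y = 0; y < height; ++y)                  // luma plane
            ::CopyMemory(dst + y * width, srcY + y * pitchY, width);
    
        uint8_t* dstUV = dst + width * height;            // chroma plane
        for (int y = 0; y < height / 2; ++y)
        {
            const uint8_t* u = srcU + y * pitchU;
            const uint8_t* v = srcV + y * pitchV;
            for (int x = 0; x < width / 2; ++x)
            {
                *dstUV++ = u[x];                          // U then V per sample
                *dstUV++ = v[x];
            }
        }
    }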
    

    This is how I’m creating the Texture2D with the data I just got.

    // Create texture
    D3D11_TEXTURE2D_DESC desc;
    desc.Width = width;
    desc.Height = height;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_NV12;
    desc.SampleDesc.Count = 1;
    desc.SampleDesc.Quality = 0;
    desc.Usage = D3D11_USAGE_DEFAULT;
    desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    desc.CPUAccessFlags = 0;
    desc.MiscFlags = 0;
    
    D3D11_SUBRESOURCE_DATA initData;
    initData.pSysMem = pixelsPtr_;
    initData.SysMemPitch = rowPitch;
    
    ID3D11Texture2D* tex = nullptr;
    hr = d3dDevice->CreateTexture2D(&desc, &initData, &tex);
    if (SUCCEEDED(hr) && tex != 0)
    {
        D3D11_SHADER_RESOURCE_VIEW_DESC SRVDesc;
        memset(&SRVDesc, 0, sizeof(SRVDesc));
        SRVDesc.Format = DXGI_FORMAT_R8_UNORM;      // luma (Y) plane view of the NV12 texture
        SRVDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D;
        SRVDesc.Texture2D.MipLevels = 1;
    
        hr = d3dDevice->CreateShaderResourceView(tex, &SRVDesc, &textureViewYUV[0]);
        if (FAILED(hr))
        {
            tex->Release();
            return hr;
        }
    
        SRVDesc.Format = DXGI_FORMAT_R8G8_UNORM;    // chroma (interleaved UV) plane view
    
        hr = d3dDevice->CreateShaderResourceView(tex, &SRVDesc, &textureViewYUV[1]);
        if (FAILED(hr))
        {
            tex->Release();
            return hr;
        }
    
        tex->Release();
    }
    

    Then I pass both Shader Resource Views to the pixel shader:

    graphics->Context()->PSSetShaderResources(0, 2, textureViewYUV);
    

    This is the pixel shader:

    struct PixelShaderInput
    {
        float4 pos         : SV_POSITION;
        float4 Color       : COLOR;
        float2 texCoord    : TEXCOORD;
    };
    
    static const float3x3 YUVtoRGBCoeffMatrix =
    {
        1.164383f,  1.164383f, 1.164383f,
        0.000000f, -0.391762f, 2.017232f,
        1.596027f, -0.812968f, 0.000000f
    };
    
    Texture2D  luminanceChannel;
    Texture2D chrominanceChannel;
    
    SamplerState linearfilter
    {
        Filter = MIN_MAG_MIP_LINEAR;
    };
    
    float3 ConvertYUVtoRGB(float3 yuv)
    {
        // Derived from https://msdn.microsoft.com/en-us/library/windows/desktop/dd206750(v=vs.85).aspx
        // Section: Converting 8-bit YUV to RGB888
    
        // These values are calculated from (16 / 255) and (128 / 255)
        yuv -= float3(0.062745f, 0.501960f, 0.501960f);
        yuv = mul(yuv, YUVtoRGBCoeffMatrix);
    
        return saturate(yuv);
    }
    
    float4 main(PixelShaderInput input) : SV_TARGET
    {
        // Y comes from the R8 view, UV from the R8G8 view of the same texture.
        float y = luminanceChannel.Sample(linearfilter, input.texCoord).r;
        float2 uv = chrominanceChannel.Sample(linearfilter, input.texCoord).rg;
    
        float3 YUV = float3(y, uv.x, uv.y);
        float4 YUV4 = float4(YUV.x, YUV.y, YUV.z, 1);
    
        float3 RGB = ConvertYUVtoRGB(YUV);
        float4 RGB4 = float4(RGB.x, RGB.y, RGB.z, 1);
    
        return RGB4;
    }
    

    Can someone help me? What am I doing wrong?

    EDIT #1

    int skipLineArea = 0;
    int uvCount = (height >> 1) * (width >> 1);
    int k;
    
    // Byte-interleave the chroma planes (U0 V0 U1 V1 ...). k walks the source
    // rows and skips their stride padding after each half-width row; this
    // assumes pitchU == pitchV, which holds for ffmpeg's yuv420p buffers here.
    for (j = 0, k = 0; j < uvCount; j++, k++)
    {
        if (skipLineArea == (width >> 1))
        {
            k += pitchU - (width >> 1);
            skipLineArea = 0;
        }
    
        pixelsPtr_[position++] = avU[k];
        pixelsPtr_[position++] = avV[k];
        skipLineArea++;
    }
    

    EDIT #2

    Updating the texture instead of creating new ones:

    D3D11_MAPPED_SUBRESOURCE mappedResource;
    d3dContext->Map(tex, 0, D3D11_MAP_WRITE_DISCARD, 0, &mappedResource);
    
    uint8_t* mappedData = reinterpret_cast<uint8_t*>(mappedResource.pData);
    
    for (UINT i = 0; i < height * 1.5; ++i)   // height rows of Y plus height/2 rows of interleaved UV
    {
        memcpy(mappedData, frameData, rowPitch);
        mappedData += mappedResource.RowPitch;
        frameData += rowPitch;
    }
    
    d3dContext->Unmap(tex, 0);
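
    One caveat: Map() with D3D11_MAP_WRITE_DISCARD only succeeds on a CPU-writable resource, so the texture description shown earlier would also need to change from D3D11_USAGE_DEFAULT along these lines:

    // Required for Map(..., D3D11_MAP_WRITE_DISCARD, ...) to succeed:
    desc.Usage = D3D11_USAGE_DYNAMIC;
    desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;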
    
  • FFMPEG Output to DeckLink Duo 2

    28 March 2017, by Harry

    I'm trying to use FFmpeg to stream out to my DeckLink Duo 2, but I'm unable to do so.

    Here's the command I'm trying:

    ffmpeg -f decklink -i "DeckLink Duo (1)@5"
    -pix_fmt uyvy422 -r 25000/1000 "DeckLink Duo (3)"
    

    I tried just sending color bars as an output, but it still doesn't work.

    Command:

    ffmpeg -f lavfi -i color=color=red -t 10000 "DeckLink Duo (3)"
    

    Both commands produce this result:

    ffmpeg version N-82794-g3ab1311 Copyright (c) 2000-2016 the FFmpeg developers
      built with gcc 5.4.0 (GCC)
      configuration: --enable-gpl --enable-version3 --disable-w32threads
     --enable-dxva2 --enable-libmfx --enable-nvenc 
    --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r 
    --enable-gnutls --enable-iconv --enable-libass 
    --enable-libbluray --enable-libbs2b --enable-libcaca 
    --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc 
    --enable-libmodplug --enable-libmp3lame --enable-libopencore-amrnb 
    --enable-libopencore-amrwb --enable-libopenh264 --enable-libopenjpeg 
    --enable-libopus --enable-librtmp --enable-libschroedinger 
    --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora 
    --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc 
    --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp 
    --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg
     --enable-lzma --enable-decklink --enable-zlib
      libavutil      55. 42.100 / 55. 42.100
      libavcodec     57. 67.100 / 57. 67.100
      libavformat    57. 58.102 / 57. 58.102
      libavdevice    57.  2.100 / 57.  2.100
      libavfilter     6. 68.100 /  6. 68.100
      libswscale      4.  3.101 /  4.  3.101
      libswresample   2.  4.100 /  2.  4.100
      libpostproc    54.  2.100 / 54.  2.100
    Input #0, lavfi, from 'color=color=red':
      Duration: N/A, start: 0.000000, bitrate: N/A
        Stream #0:0: Video: rawvideo (I420 / 0x30323449), yuv420p, 320x240 [SAR 1:1 DAR 4:3], 25 tbr, 25 tbn, 25 tbc
    
    [NULL @ 00000000025745a0] Unable to find a suitable output format for 'DeckLink Duo (3)'
    DeckLink Duo (3): Invalid argument
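
    The "Unable to find a suitable output format" error means ffmpeg is treating the device name as an output file, so the DeckLink output muxer has to be forced with -f decklink. A sketch that should get further, assuming the card supports the chosen mode (the resolution and frame rate must match one of the device's display modes, and DeckLink output expects uyvy422):

    ffmpeg -f lavfi -i color=color=red -pix_fmt uyvy422 -s 1920x1080 -r 25 -f decklink "DeckLink Duo (3)"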
    

    Any suggestions or help in solving this would be much appreciated.

  • How to Play a Video file in javacv / javacpp

    28 March 2017, by Floesmaan

    Does someone have some example code to play a simple video file with the current JavaCPP/JavaCV version and the FFmpegFrameGrabber?

    I tried this solution, but it's apparently too old and does not work with the current javacv version because of an incompatible FrameGrabber interface (it returns a "Frame" object instead of an "IplImage" object). If I change the code manually (using Frame instead of IplImage), it fails with the following error:

    java.lang.VerifyError: Bad type on operand stack
    Exception Details:
      Location:
        org/bytedeco/javacv/FFmpegFrameGrabber.startUnsafe()V @1291: invokespecial
      Reason:
        Type 'org/bytedeco/javacpp/avutil$AVFrame' (current frame, stack[2]) is not assignable to 'org/bytedeco/javacpp/Pointer'
      Current Frame:
        bci: @1291
        flags: { }
        locals: { 'org/bytedeco/javacv/FFmpegFrameGrabber', integer, 'org/bytedeco/javacpp/avformat$AVInputFormat', 'org/bytedeco/javacpp/avutil$AVDictionary', integer, 'org/bytedeco/javacpp/avcodec$AVCodec', integer, integer, integer, integer }
        stack: { uninitialized 1283, uninitialized 1283, 'org/bytedeco/javacpp/avutil$AVFrame' }
      Bytecode:
        0x0000000: 2a01 b500 332a bb00 8659 01b7 0087 b500............
    

    FYI: I'm comparing different Java libraries for playing video files and extracting their pixel data (xuggler, vlcj, ...) in search of the best one. I would really like to include javacv in my tests, but it's not working :(

  • What is the meaning of "sw" in libswscale of ffmpeg?

    28 March 2017, by synyo xu

    In FFmpeg, we always say hello to libswscale and libswresample, but what is the meaning of "sw" in libswscale and libswresample?