On other sites (8917)

  • Render YUV video from ffmpeg in OpenGL using CVPixelBufferRef and shaders

    4 September 2012, by resident_

    I'm trying to render YUV frames decoded by ffmpeg using the iOS 5.0 method "CVOpenGLESTextureCacheCreateTextureFromImage".

    My code is based on the Apple sample GLCameraRipple.

    My result on the iPhone screen is this: iPhone Screen

    I need to know what I'm doing wrong.

    I've included part of my code below so the errors can be found.

    ffmpeg configuration of the frames:

    ctx->p_sws_ctx = sws_getContext(ctx->p_video_ctx->width,
                                    ctx->p_video_ctx->height,
                                    ctx->p_video_ctx->pix_fmt,
                                    ctx->p_video_ctx->width,
                                    ctx->p_video_ctx->height,
                                    PIX_FMT_YUV420P, SWS_FAST_BILINEAR,
                                    NULL, NULL, NULL);

    // Buffer for the converted YUV420P data
    // (despite its name, p_picture_rgb holds YUV here)
    ctx->p_frame_buffer = malloc(avpicture_get_size(PIX_FMT_YUV420P,
                                                    ctx->p_video_ctx->width,
                                                    ctx->p_video_ctx->height));

    avpicture_fill((AVPicture *)ctx->p_picture_rgb, ctx->p_frame_buffer,
                   PIX_FMT_YUV420P,
                   ctx->p_video_ctx->width,
                   ctx->p_video_ctx->height);
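
    A note on the formats involved: sws_getContext above produces planar PIX_FMT_YUV420P (three separate planes), while the render method below wraps the buffer as kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, which is bi-planar NV12 (a Y plane followed by one interleaved CbCr plane). If that mismatch is the cause of the broken image, here is a minimal sketch of requesting NV12 from swscale instead, reusing the variables above:

    // Ask swscale for bi-planar NV12 so the buffer layout matches
    // kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange.
    ctx->p_sws_ctx = sws_getContext(ctx->p_video_ctx->width,
                                    ctx->p_video_ctx->height,
                                    ctx->p_video_ctx->pix_fmt,
                                    ctx->p_video_ctx->width,
                                    ctx->p_video_ctx->height,
                                    PIX_FMT_NV12, SWS_FAST_BILINEAR,
                                    NULL, NULL, NULL);

    ctx->p_frame_buffer = malloc(avpicture_get_size(PIX_FMT_NV12,
                                                    ctx->p_video_ctx->width,
                                                    ctx->p_video_ctx->height));

    avpicture_fill((AVPicture *)ctx->p_picture_rgb, ctx->p_frame_buffer,
                   PIX_FMT_NV12,
                   ctx->p_video_ctx->width,
                   ctx->p_video_ctx->height);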

    My render method:

    if (NULL == videoTextureCache) {
       NSLog(@"displayPixelBuffer error");
       return;
    }    


    CVPixelBufferRef pixelBuffer;
    CVPixelBufferCreateWithBytes(kCFAllocatorDefault, mTexW, mTexH,
                                 kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                                 buffer, mFrameW * 3, NULL, 0, NULL, &pixelBuffer);



    CVReturn err;    
    // Y-plane
    glActiveTexture(GL_TEXTURE0);
    err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                      videoTextureCache,
                                                      pixelBuffer,
                                                      NULL,
                                                      GL_TEXTURE_2D,
                                                      GL_RED_EXT,
                                                      mTexW,
                                                      mTexH,
                                                      GL_RED_EXT,
                                                      GL_UNSIGNED_BYTE,
                                                      0,
                                                      &_lumaTexture);
    if (err)
    {
       NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
    }  

    glBindTexture(CVOpenGLESTextureGetTarget(_lumaTexture), CVOpenGLESTextureGetName(_lumaTexture));
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);    

    // UV-plane
    glActiveTexture(GL_TEXTURE1);
    err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                      videoTextureCache,
                                                      pixelBuffer,
                                                      NULL,
                                                      GL_TEXTURE_2D,
                                                      GL_RG_EXT,
                                                      mTexW/2,
                                                      mTexH/2,
                                                      GL_RG_EXT,
                                                      GL_UNSIGNED_BYTE,
                                                      1,
                                                      &_chromaTexture);
    if (err)
    {
       NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
    }

    glBindTexture(CVOpenGLESTextureGetTarget(_chromaTexture), CVOpenGLESTextureGetName(_chromaTexture));
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);    

    glBindFramebuffer(GL_FRAMEBUFFER, defaultFramebuffer);

    // Set the view port to the entire view
    glViewport(0, 0, backingWidth, backingHeight);

    static const GLfloat squareVertices[] = {
       1.0f, 1.0f,
       -1.0f, 1.0f,
       1.0f,  -1.0f,
       -1.0f,  -1.0f,
    };

    GLfloat textureVertices[] = {
       1, 1,
       1, 0,
       0, 1,
       0, 0,
    };

    // Draw the texture on the screen with OpenGL ES 2
    [self renderWithSquareVertices:squareVertices textureVertices:textureVertices];


    // Flush the CVOpenGLESTexture cache and release the texture
    CVOpenGLESTextureCacheFlush(videoTextureCache, 0);    
    CVPixelBufferRelease(pixelBuffer);    

    [moviePlayerDelegate bufferDone];
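
    One thing worth checking in the pixel-buffer creation above: CVPixelBufferCreateWithBytes takes a single contiguous buffer with a single bytesPerRow, and mFrameW * 3 is not a valid stride for 4:2:0 data (the Y plane's stride is mFrameW). For bi-planar data, Core Video provides CVPixelBufferCreateWithPlanarBytes; a sketch, reusing the variables above and assuming buffer holds NV12 with the CbCr plane stored immediately after the Y plane:

    // Describe the two NV12 planes explicitly (assumes CbCr follows Y).
    void  *planes[2]       = { buffer, (uint8_t *)buffer + mTexW * mTexH };
    size_t planeWidths[2]  = { mTexW, mTexW / 2 };
    size_t planeHeights[2] = { mTexH, mTexH / 2 };
    size_t planeStrides[2] = { mTexW, mTexW };  // mTexW/2 CbCr pairs = mTexW bytes per row

    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferCreateWithPlanarBytes(kCFAllocatorDefault, mTexW, mTexH,
                                       kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                                       NULL, 0,  // no contiguous descriptor
                                       2, planes, planeWidths, planeHeights, planeStrides,
                                       NULL, NULL, NULL, &pixelBuffer);

    Also note that pixel buffers wrapped around client-allocated memory are not IOSurface-backed, and CVOpenGLESTextureCacheCreateTextureFromImage can reject them; if it does, creating the buffer with CVPixelBufferCreate plus the kCVPixelBufferIOSurfacePropertiesKey attribute and copying the planes into it is a common workaround.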

    The renderWithSquareVertices method:

    - (void)renderWithSquareVertices:(const GLfloat *)squareVertices textureVertices:(const GLfloat *)textureVertices
    {
        // Use shader program.
        glUseProgram(shader.program);

        // Update attribute values.
        glVertexAttribPointer(ATTRIB_VERTEX, 2, GL_FLOAT, 0, 0, squareVertices);
        glEnableVertexAttribArray(ATTRIB_VERTEX);
        glVertexAttribPointer(ATTRIB_TEXTUREPOSITON, 2, GL_FLOAT, 0, 0, textureVertices);
        glEnableVertexAttribArray(ATTRIB_TEXTUREPOSITON);

        glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

        // Present
        glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);
        [context presentRenderbuffer:GL_RENDERBUFFER];
    }
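
    For the two texture units used above (GL_TEXTURE0 for luma, GL_TEXTURE1 for chroma) to reach the shader, the sampler uniforms must also be bound to those units once after linking; this is easy to miss. A minimal sketch, assuming shader.program is the program used in renderWithSquareVertices:

    glUseProgram(shader.program);
    // SamplerY reads from texture unit 0, SamplerUV from unit 1,
    // matching the glActiveTexture calls in the render method.
    glUniform1i(glGetUniformLocation(shader.program, "SamplerY"), 0);
    glUniform1i(glGetUniformLocation(shader.program, "SamplerUV"), 1);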

    My fragment shader:

    uniform sampler2D SamplerY;
    uniform sampler2D SamplerUV;

    varying highp vec2 _texcoord;

    void main()
    {
        mediump vec3 yuv;
        lowp vec3 rgb;

        yuv.x  = texture2D(SamplerY, _texcoord).r;
        yuv.yz = texture2D(SamplerUV, _texcoord).rg - vec2(0.5, 0.5);

        // BT.601, the standard for SDTV, is kept as a reference:
        /* rgb = mat3(1,      1,        1,
                      0,     -.34413,   1.772,
                      1.402, -.71414,   0) * yuv; */

        // Using BT.709, the standard for HDTV:
        rgb = mat3(1,        1,        1,
                   0,       -.18732,   1.8556,
                   1.57481, -.46813,   0) * yuv;

        gl_FragColor = vec4(rgb, 1);
    }
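
    The fragment shader reads the varying _texcoord, which the vertex shader has to supply. For reference, a minimal matching vertex shader, assuming these attribute names are the ones bound to ATTRIB_VERTEX and ATTRIB_TEXTUREPOSITON:

    attribute vec4 position;
    attribute vec2 texcoord;

    varying highp vec2 _texcoord;

    void main()
    {
        gl_Position = position;
        _texcoord = texcoord;
    }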

    Many thanks.

  • Is it possible to play h264-mpegts format in iOS?

    11 October 2012, by jAckOdE

    The server transcodes the video to h264/aac and sends the video data to the client simultaneously. To play and save the video data at the same time, I use mpegts as the container format, but the problem is that the iOS MediaPlayer cannot play an mpegts file.

    Google says that I could use an iOS build of ffmpeg to do the task, but that seems to be overkill. Is there any other way to play mpegts on the iPhone using just the iOS SDK?
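
    For reference, the only MPEG-TS playback path in the stock iOS SDK is HTTP Live Streaming: the server publishes an .m3u8 playlist referencing .ts segments, which MPMoviePlayerController (or AVPlayer) plays natively. A minimal sketch, with a hypothetical playlist URL:

    #import <MediaPlayer/MediaPlayer.h>

    // Hypothetical URL; the server would have to expose the TS stream
    // as an HLS playlist instead of a single raw .ts file.
    NSURL *url = [NSURL URLWithString:@"http://example.com/stream/playlist.m3u8"];
    MPMoviePlayerController *player =
        [[MPMoviePlayerController alloc] initWithContentURL:url];
    player.movieSourceType = MPMovieSourceTypeStreaming;
    [player play];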

  • Displaying YUV420 data using an OpenGL ES shader is too slow

    28 November 2012, by user1278982

    I have a child thread called A that decodes video using ffmpeg on an iPhone 3GS, and another thread called B that displays the YUV data. In thread B, I use glTexSubImage2D to upload the Y, U and V textures and then convert the YUV data to RGB in a shader, but the frame rate in the decode thread is only 15 fps. Why?
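
    For reference, a generic sketch of this kind of per-plane upload (hypothetical texture handles, not the code in question, using the DVDVideoPicture struct shown below), assuming each plane's iLineSize equals its width, since OpenGL ES 2.0 has no GL_UNPACK_ROW_LENGTH to skip row padding:

    #include <OpenGLES/ES2/gl.h>

    // texY/texU/texV are hypothetical handles allocated earlier with
    // glTexImage2D as GL_LUMINANCE textures at full and half resolution.
    static void uploadYUVPlanes(const DVDVideoPicture *pic, int w, int h,
                                GLuint texY, GLuint texU, GLuint texV)
    {
        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_2D, texY);
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                        GL_LUMINANCE, GL_UNSIGNED_BYTE, pic->plane[0]);

        glActiveTexture(GL_TEXTURE1);
        glBindTexture(GL_TEXTURE_2D, texU);
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w / 2, h / 2,
                        GL_LUMINANCE, GL_UNSIGNED_BYTE, pic->plane[1]);

        glActiveTexture(GL_TEXTURE2);
        glBindTexture(GL_TEXTURE_2D, texV);
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w / 2, h / 2,
                        GL_LUMINANCE, GL_UNSIGNED_BYTE, pic->plane[2]);
    }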

    Update:
    The frame size is 720 × 576.
    I also found something interesting: if I don't start the thread displaying the YUV data, the frame rate calculated in the decode thread is 22 fps, otherwise 15 fps. So I think my displaying method must be inefficient. The code is below.

    I have a callback in the decode thread:

    typedef struct _DVDVideoPicture
    {
        char *plane[4];
        int   iLineSize[4];
    } DVDVideoPicture;

    void YUVCallBack(void *pYUVData, void *pContext)
    {
        VideoView *view = (VideoView *)pContext;
        [view.glView copyYUVData:(DVDVideoPicture *)pYUVData];
        [view calculateFrameRate];
    }

    The copyYUVData method extracts the Y, U and V planes separately. The following is the displaying thread method.