Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • Automatically correct a/v sync with ffmpeg and mediainfo

    28 January 2016, by alexy13

    I have a bunch of .ts files whose audio is out of sync by ~30 ms. Out of curiosity, I ran mediainfo on one of the files, and it said (among many other things):

    Delay relative to video : -23ms

    I then set the audio delay to -23ms, and it synced up almost perfectly.

    I'm currently running ffmpeg -i in.mp4 -itsoffset delay_in_milliseconds -i in.mp4 -map 0:0 -map 1:1 -acodec copy -bsf:a aac_adtstoasc -vcodec copy out.mp4 to convert to mp4 and add an audio delay. (Strangely, not adding an audio delay causes mediainfo to report nothing for Delay relative to video in the final mp4 file.) This works, but I have a lot of files to process, and doing them one by one would take hours.

    Is it possible for ffmpeg to automatically correct the a/v sync?
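
    For what it's worth, a minimal sketch of how this might be automated (an assumption, not a tested recipe: it relies on mediainfo's Video_Delay field carrying the millisecond offset quoted above, and -itsoffset expects seconds, hence the division by 1000):

    #!/bin/sh
    # Hedged sketch: read each file's audio delay with mediainfo and
    # hand it to ffmpeg, mirroring the command from the question.
    for f in *.ts; do
      delay_ms=$(mediainfo --Inform="Audio;%Video_Delay%" "$f")
      delay_s=$(awk "BEGIN { printf \"%.3f\", $delay_ms / 1000 }")
      ffmpeg -i "$f" -itsoffset "$delay_s" -i "$f" \
             -map 0:0 -map 1:1 -acodec copy -bsf:a aac_adtstoasc -vcodec copy \
             "${f%.ts}.mp4"
    done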

  • Stream mjpeg with ffserver

    28 January 2016, by user3115426

    I am trying to feed my ffserver with a single .jpeg file as MJPEG and play it in VLC. I am sending my image file with this command:

    ffmpeg -r 0.33 -i test.jpg http://localhost:8090/feed1.ffm
    

    My ffserver configuration looks like this:

    Port 8090
    BindAddress 0.0.0.0
    MaxHTTPConnections 2000
    MaxClients 10
    MaxBandwidth 100000
    CustomLog -

    <Feed feed1.ffm>
    File /tmp/feed1.ffm
    FileMaxSize 10M
    ACL allow 127.0.0.1
    </Feed>

    # "test.mjpg" is a placeholder; the original <Stream> tag name was lost in formatting
    <Stream test.mjpg>
    Feed feed1.ffm
    Format mjpeg
    VideoSize 640x480
    VideoFrameRate 10
    VideoBitRate 2000
    VideoQMin 1
    VideoQMax 10
    NoAudio
    </Stream>


    When I open the stream in VLC it disappears quickly, but when I turn on the 'loop' option in VLC I can see that there is a frame. My goal is to have a continuous stream without any loop options. What am I doing wrong? Do I have to change the framerate or something else?
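
    One way to get a continuous stream (a hedged sketch, not from the question: it assumes the image2 demuxer's -loop option) is to have ffmpeg loop the single JPEG forever, so the feed never runs out of frames:

    # Loop the still image indefinitely, reading in real time at the
    # stream's frame rate, and keep pushing it to the feed.
    ffmpeg -re -loop 1 -framerate 10 -i test.jpg http://localhost:8090/feed1.ffm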

  • Including ffmpeg with Chrome?

    28 January 2016, by alexino2

    I'm trying to do something similar to VideoStream.

    They compile ffmpeg with NaCl and include it in a Chrome app.

    I've been working on it, but I can't get the naclports ffmpeg working in my project!

    The reasons are these:

    • NaCl doesn't work with main() and a command line, but ffmpeg requires them
    • glibc, the only toolchain that compiles it, expects a main()! It seems impossible to remove (maybe with PPAPI?); one possible workaround is sketched below

    So my question is simple: how can I make ffmpeg and Chrome communicate? If VideoStream can do it, why can't I?

    Thanks a lot for your answers!
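
    A minimal sketch of that workaround (an assumption about the usual approach, not VideoStream's actual code: it presumes ffmpeg's command-line tool is compiled with its main() renamed, e.g. via -Dmain=ffmpeg_main, so the module can call it directly):

    /* The NaCl module has no command line, so build a synthetic argv and
     * run the renamed ffmpeg entry point on a worker thread. */
    #include <pthread.h>

    int ffmpeg_main(int argc, char *argv[]);  /* ffmpeg.c's main(), renamed at build time */

    static void *run_ffmpeg(void *arg)
    {
        static char *argv[] = { "ffmpeg", "-i", "in.mp4", "out.webm", NULL };
        ffmpeg_main(4, argv);
        return NULL;
    }

    /* Called from the PPAPI instance once the module has loaded. */
    void start_transcode(void)
    {
        pthread_t t;
        pthread_create(&t, NULL, run_ffmpeg, NULL);
    }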

  • Video rendering in OpenGL on Qt5

    28 January 2016, by Bobnono

    I'm trying to render my decoded FFMPEG frames into a QOpenGLWindow. FFMPEG gives me an NV12, YUV420p, or RGB AVFrame; I chose the simplest, RGB.

    I created a c_gl_video_yuv class inheriting from QOpenGLWindow and QOpenGLFunctions_3_3_Core.

    I want to use a shader to draw my rectangle and texture it with the video frame (for YUV I want the shader to convert it to RGB and apply the texture).

    My c_gl_video_yuv class is defined as below:

    class c_gl_video_yuv : public QOpenGLWindow, protected QOpenGLFunctions_3_3_Core
    {
    public:
       c_gl_video_yuv();
       ~c_gl_video_yuv();
       // ...
       void update_texture(AVFrame *frame, int w, int h);
    
    protected:
       void initializeGL();
       void paintGL();
       void resizeGL(int width, int height);
    
       void paintEvent(QPaintEvent *);
    
    private:
    
       // ...
       GLuint textures[2];
       // ...
       // Shader program
       QOpenGLShaderProgram *m_program;
    
       GLint locVertices;
       GLint locTexcoord;
    
    };
    

    I initialise OpenGL:

       void c_gl_video_yuv::initializeGL()
        {
          // Init shader program
            initializeOpenGLFunctions();
            glGenTextures(2, textures);
            /* Apply some filter on the texture */
            glBindTexture(GL_TEXTURE_2D, textures[0]);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    
            glBindTexture(GL_TEXTURE_2D, textures[1]);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
            glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    
            qDebug() << "c_gl_video_yuv::initializeGL() initialise shader"<< endl;
            m_program = new QOpenGLShaderProgram(this);
            m_program->addShaderFromSourceFile(QOpenGLShader::Vertex, ":/shader/Ressources/vertex_yuv.glsl");
        m_program->addShaderFromSourceFile(QOpenGLShader::Fragment, ":/shader/Ressources/rgb_to_rgb_shader.glsl");
            m_program->link();
    
        /* Grab location of shader attributes. */
            locVertices = m_program->attributeLocation("position");
            locTexcoord = m_program->attributeLocation("texpos");
    
            /* Enable vertex arrays to push the data. */
            glEnableVertexAttribArray(locVertices);
            glEnableVertexAttribArray(locTexcoord);
    
            /* set data in the arrays. */
                glVertexAttribPointer(locVertices, 2, GL_FLOAT, GL_FALSE, 0,
                              &vertices[0][0]);
                glVertexAttribPointer(locTexcoord, 2, GL_FLOAT, GL_FALSE, 0,
                              &texcoords[0][0]);
    
            // GL options
            glEnable(GL_DEPTH_TEST);    
        }
    

    And I render:

    void c_gl_video_yuv::paintGL()
        {
    
            qDebug() << "paintGL() set viewport" << endl;
            /* Clear background. */
            glClearColor(0.5f,0.5f,0.5f,1.0f);
            glClear(GL_COLOR_BUFFER_BIT);
    
            if(first_frame)
            {
                qDebug() <<  "paintGL() Bind shader" <<  endl;
                m_program->bind();
    
                /* Get Ytex attribute to associate to TEXTURE0 */
                    m_program->bindAttributeLocation("Ytex",0);
                    m_program->bindAttributeLocation("UVtex",1);
    
                    qDebug() <<  "paintGL() Bind texture" <<  endl;
                if(!is_init)
                {
                    glActiveTexture(GL_TEXTURE0);
                    glBindTexture(GL_TEXTURE_2D, textures[0]);
                    glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB, frame_width, frame_height, 0, GL_RGB, GL_UNSIGNED_BYTE, frame_yuv->data[0] );
    
                    glActiveTexture(GL_TEXTURE1);
                    glBindTexture(GL_TEXTURE_2D, textures[1]);
                    glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB, frame_width, frame_height, 0, GL_RGB, GL_UNSIGNED_BYTE, frame_yuv->data[0] );
                    is_init = true;
                }
                else
                {
                    glActiveTexture(GL_TEXTURE0);
                    glBindTexture(GL_TEXTURE_2D, textures[0]);
                   glTexSubImage2D(GL_TEXTURE_2D, 0, 0,0, frame_width, frame_height, GL_RGB, GL_UNSIGNED_BYTE, frame_yuv->data[0]);
    
                   glActiveTexture(GL_TEXTURE1);
                   glBindTexture(GL_TEXTURE_2D, textures[1]);
                   glTexSubImage2D(GL_TEXTURE_2D, 0, 0,0, frame_width, frame_height, GL_RGB, GL_UNSIGNED_BYTE, frame_yuv->data[0]);
                }
    
    
    
                glVertexAttribPointer(locVertices, 2, GL_FLOAT, GL_FALSE, 0, vertices);
                glVertexAttribPointer(locTexcoord, 2, GL_FLOAT, GL_FALSE, 0, texcoords);
    
                glEnableVertexAttribArray(0);
                glEnableVertexAttribArray(1);
    
                glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    
                glDisableVertexAttribArray(1);
                glDisableVertexAttribArray(0);
                m_program->release();
            }
        }
    

    the vertex shader is :

     #version 330 core
        attribute vec3 position;
        attribute vec2 texpos;
        varying vec2 opos;
        void main(void)
        {
            opos = texpos;
            gl_Position = vec4(position, 1.0);
        }
    

    and the Fragment shader is :

    #version 330 core
    in vec2 TexCoords;
    out vec4 color;
    uniform sampler2D YImage;
    uniform sampler2D UVImage;
    const vec3 r_c = vec3(1.164383,  0.000000,  1.596027);
    const vec3 g_c = vec3(1.164383, -0.391762, -0.812968);
    const vec3 b_c = vec3(1.164383,  2.017232,  0.000000);
    const vec3 offset = vec3(-0.0625, -0.5, -0.5);
    void main()
    {
       float y_val = texture(YImage, TexCoords).r;
       float u_val = texture(UVImage, TexCoords).r;
       float v_val = texture(UVImage, TexCoords).g;
       vec3 yuv = vec3(y_val, u_val, v_val);
       yuv += offset;
       color.r = dot(yuv, r_c);
       color.g = dot(yuv, g_c);
       color.b = dot(yuv, b_c);
       color.a = 1.0;
    };
    

    (For an RGB frame I replace the vec3(1.164383, 0.000000, 1.596027) with vec3(1.0, 1.0, 1.0), and so on.)

    Before I receive a frame it renders nothing, just a grey window, which is normal. After I receive a frame, the textures are uploaded and the shaders should normally produce the picture. But nothing appears, not even a black rectangle; just plain grey.

    What is wrong? Is this not the right way to upload textures, or are my vertices not created? Of course I declared my vertices and texture coordinates:

    const GLfloat vertices[][2] = {
        {-1.f, -1.f},
        {1.f, -1.f},
        {-1.f, 1.f},
        {1.f, 1.f}
    };
    const GLfloat texcoords[][2] = {
        {0.0f, 1.0f},
        {1.0f, 1.0f},
        {0.0f, 0.0f},
        {1.0f, 0.0f}
    };
    

    I am very new to OpenGL so it is all quite fuzzy in my head, but I thought it would not be really hard to draw a rectangle with a streaming texture. Maybe I should use a VBO or an FBO, but I still don't really understand them.

    If someone can help me, I would appreciate it! Thanks
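
    One detail worth checking in the code above (an observation, not a confirmed fix): with #version 330 core, the vertex shader's output (opos) and the fragment shader's input (TexCoords) must have matching names, or the interpolated texture coordinates never reach the fragment stage. A minimal matching vertex shader, assuming the 2-component position attribute that the glVertexAttribPointer calls describe, might look like:

    // Vertex shader sketch: forward the texture coordinate under the exact
    // name the fragment shader above declares as its input (TexCoords).
    #version 330 core
    in vec2 position;
    in vec2 texpos;
    out vec2 TexCoords;
    void main(void)
    {
        TexCoords = texpos;
        gl_Position = vec4(position, 0.0, 1.0);
    }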

  • How to add a user-defined library to the ffmpeg compilation

    28 January 2016, by krishna

    I have to plug my static library (abc.a) into the ffmpeg project. How do I declare my library in ffmpeg's configure/Makefile so that ffmpeg is built with it? Please help me out.
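
    For reference, a hedged sketch of the usual route (the paths are placeholders, and it assumes the library needs no extra build rules of its own): ffmpeg's configure script accepts extra compiler and linker flags, so a prebuilt static archive can be linked in without editing the Makefile:

    # Point configure at the library's headers, then pass the archive
    # itself through to the final link step.
    ./configure --extra-cflags="-I/path/to/abc/include" \
                --extra-libs="/path/to/abc.a"
    make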