Advanced search

Media (2)

Keyword: - Tags -/documentation

Other articles (10)

  • Possible deployments

    31 January 2010

    Two types of deployment are possible, depending on two aspects: the installation method chosen (standalone or as a server farm); and the expected number of daily encodings and the expected traffic.
    Video encoding is a heavy process that consumes an enormous amount of system resources (CPU and RAM), and all of this has to be taken into account. The system is therefore only feasible on one or more dedicated servers.
    Single-server version
    The single-server version consists of using only one (...)

  • Managing object creation and editing rights

    8 February 2011

    By default, many features are restricted to administrators, but each can be configured independently to change the minimum status required to use it, in particular: writing content on the site, adjustable in the form template management; adding notes to articles; adding captions and annotations to images;

  • What is an editorial

    21 June 2013

    Write your point of view in an article. It will be filed in a section set aside for this purpose.
    An editorial is a text-only article. Its purpose is to gather points of view in a dedicated section. A single editorial is featured on the home page. To read previous ones, consult the dedicated section.
    You can customize the editorial creation form.
    Editorial creation form In the case of an editorial-type document, the (...)

On other sites (3017)

  • OpenGL ES 2.0 on Android, YUV to RGB and rendering with FFmpeg

    14 October 2013, by 101110101100111111101101

    My renderer dies one or two frames after the video starts showing.

    FATAL ERROR 11: blabla... (it occurs exactly in glDrawElements, in the Y part)

    I think the problem is with 'glPixelStorei', or with 'GL_RGB' / 'GL_LUMINANCE', but I don't get it.

    My rendering pipeline:

    1. Decode the data received from the network (received in the SDK, decoded in the NDK) and enqueue it.

    2. Another thread (synchronized, of course) dequeues it and prepares the OpenGL ES 2.0 setup (SDK side).

    3. When the onDrawFrame, onSurfaceCreated and onSurfaceChanged methods are called, the calls drop down into the NDK. (My NDK Renderer source is attached below.)

    4. Rendering.

    As you know, a fragment shader is used for the conversion.
    My data is YUV 420p (pix_fmt_YUV420p), 12 bits per pixel.
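
    (A quick sanity check on those sizes, my own arithmetic rather than part of the original post: in YUV 420p every 2×2 block of pixels shares one U and one V sample, which is where the 12 bits per pixel come from. For the 736×480 stream hard-coded later in the code, the plane sizes work out as follows.)

    // Y plane: full resolution, 1 byte per pixel
    736 * 480         = 353280 bytes
    // U and V planes: half resolution in each dimension
    (736/2) * (480/2) =  88320 bytes each
    // total: 353280 + 2 * 88320 = 529920 bytes = 1.5 bytes (12 bits) per pixel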

    Here is my entire source.

    I don't have any prior knowledge of OpenGL ES; this is my first time using it.

    Please also let me know what I should do to improve performance.

    And which parameters should I use in 'glTexImage2D', 'glTexSubImage2D' and 'glRenderbufferStorage'?
    GL_LUMINANCE? GL_RGBA? GL_RGB? (GL_LUMINANCE is what I am using now.)
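
    For reference, here is a minimal sketch (mine, not from the post, so the helper name and arguments are placeholders) of how the three planes are usually uploaded on unextended OpenGL ES 2.0: each plane goes into its own single-channel GL_LUMINANCE texture, allocated and updated with GL_LUMINANCE / GL_UNSIGNED_BYTE and with that plane's own width and height (for 420p the U and V planes are half-size in each dimension); the fragment shader then reads each value back from the .r component of its sampler, exactly as the shader at the end of this post already does.

    #include <GLES2/gl2.h>

    // Hypothetical helper: upload one YUV plane into its own 8-bit texture.
    // 'tex' must already be generated and configured; 'w' and 'h' are the
    // plane's own dimensions, not the frame's.
    static void upload_plane(GLuint tex, GLint unit, const unsigned char* data,
                             GLsizei w, GLsizei h, bool allocate)
    {
        glActiveTexture(GL_TEXTURE0 + unit);
        glBindTexture(GL_TEXTURE_2D, tex);
        glPixelStorei(GL_UNPACK_ALIGNMENT, 1);   // planes are tightly packed
        if (allocate) {
            // On unextended ES 2.0 the internal format must match the format,
            // so a one-channel plane is GL_LUMINANCE in both places.
            glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, w, h, 0,
                         GL_LUMINANCE, GL_UNSIGNED_BYTE, data);
        } else {
            glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                            GL_LUMINANCE, GL_UNSIGNED_BYTE, data);
        }
    }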

    void Renderer::set_draw_frame(JNIEnv* jenv, jbyteArray yData, jbyteArray uData, jbyteArray vData)
    {
       for (int i = 0; i < 3; i++) {
           if (yuv_data_[i] != NULL) {
               free(yuv_data_[i]);
           }
       }

     int YSIZE = -1;
     int USIZE = -1;
     int VSIZE = -1;

     if (yData != NULL) {
           YSIZE = (int)jenv->GetArrayLength(yData);
       LOG_DEBUG("YSIZE : %d", YSIZE);
           yuv_data_[0] = (unsigned char*)malloc(sizeof(unsigned char) * YSIZE);
       memset(yuv_data_[0], 0, YSIZE);
           jenv->GetByteArrayRegion(yData, 0, YSIZE, (jbyte*)yuv_data_[0]);
        yuv_data_[0] = reinterpret_cast<unsigned char*>(yuv_data_[0]);
       } else {
           YSIZE = (int)jenv->GetArrayLength(yData);
           yuv_data_[0] = (unsigned char*)malloc(sizeof(unsigned char) * YSIZE);
       memset(yuv_data_[0], 1, YSIZE);
     }

       if (uData != NULL) {
           USIZE = (int)jenv->GetArrayLength(uData);
       LOG_DEBUG("USIZE : %d", USIZE);
           yuv_data_[1] = (unsigned char*)malloc(sizeof(unsigned char) * USIZE);
       memset(yuv_data_[1], 0, USIZE);
           jenv->GetByteArrayRegion(uData, 0, USIZE, (jbyte*)yuv_data_[1]);
        yuv_data_[1] = reinterpret_cast<unsigned char*>(yuv_data_[1]);
       } else {
           USIZE = YSIZE/4;
           yuv_data_[1] = (unsigned char*)malloc(sizeof(unsigned char) * USIZE);
       memset(yuv_data_[1], 1, USIZE);
     }

       if (vData != NULL) {
           VSIZE = (int)jenv->GetArrayLength(vData);
       LOG_DEBUG("VSIZE : %d", VSIZE);
           yuv_data_[2] = (unsigned char*)malloc(sizeof(unsigned char) * VSIZE);
       memset(yuv_data_[2], 0, VSIZE);
           jenv->GetByteArrayRegion(vData, 0, VSIZE, (jbyte*)yuv_data_[2]);
        yuv_data_[2] = reinterpret_cast<unsigned char*>(yuv_data_[2]);
       } else {
           VSIZE = YSIZE/4;
           yuv_data_[2] = (unsigned char*)malloc(sizeof(unsigned char) * VSIZE);
       memset(yuv_data_[2], 1, VSIZE);
     }

       glClearColor(1.0F, 1.0F, 1.0F, 1.0F);
       check_gl_error("glClearColor");
       glClear(GL_COLOR_BUFFER_BIT);
       check_gl_error("glClear");
    }

    void Renderer::draw_frame()
    {
     // Binding created FBO
     glBindFramebuffer(GL_FRAMEBUFFER, frame_buffer_object_);
     check_gl_error("glBindFramebuffer");
       // Add program to OpenGL environment
       glUseProgram(program_object_);
       check_gl_error("glUseProgram");

     for (int i = 0; i < 3; i++) {
       LOG_DEBUG("Success");
         //Bind texture
         glActiveTexture(GL_TEXTURE0 + i);
         check_gl_error("glActiveTexture");
         glBindTexture(GL_TEXTURE_2D, yuv_texture_id_[i]);
         check_gl_error("glBindTexture");
         glUniform1i(yuv_texture_object_[i], i);
         check_gl_error("glBindTexture");
       glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, stream_yuv_width_[i], stream_yuv_height_[i], GL_RGBA, GL_UNSIGNED_BYTE, yuv_data_[i]);
         check_gl_error("glTexSubImage2D");
     }

     LOG_DEBUG("Success");
       // Load vertex information
       glVertexAttribPointer(position_object_, 2, GL_FLOAT, GL_FALSE, kStride, kVertexInformation);
       check_gl_error("glVertexAttribPointer");
       // Load texture information
       glVertexAttribPointer(texture_position_object_, 2, GL_SHORT, GL_FALSE, kStride, kTextureCoordinateInformation);
       check_gl_error("glVertexAttribPointer");

    LOG_DEBUG("9");
       glEnableVertexAttribArray(position_object_);
       check_gl_error("glEnableVertexAttribArray");
       glEnableVertexAttribArray(texture_position_object_);
       check_gl_error("glEnableVertexAttribArray");

     // Back to window buffer
     glBindFramebuffer(GL_FRAMEBUFFER, 0);
     check_gl_error("glBindFramebuffer");
     LOG_DEBUG("Success");
       // Draw the Square
       glDrawElements(GL_TRIANGLE_STRIP, 6, GL_UNSIGNED_SHORT, kIndicesInformation);
       check_gl_error("glDrawElements");
    }

    void Renderer::setup_render_to_texture()
    {
        glGenFramebuffers(1, &frame_buffer_object_);
       check_gl_error("glGenFramebuffers");
       glBindFramebuffer(GL_FRAMEBUFFER, frame_buffer_object_);
       check_gl_error("glBindFramebuffer");
        glGenRenderbuffers(1, &render_buffer_object_);
       check_gl_error("glGenRenderbuffers");
       glBindRenderbuffer(GL_RENDERBUFFER, render_buffer_object_);
       check_gl_error("glBindRenderbuffer");
       glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA4, stream_yuv_width_[0], stream_yuv_height_[0]);
       check_gl_error("glRenderbufferStorage");
       glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, render_buffer_object_);
       check_gl_error("glFramebufferRenderbuffer");
     glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, yuv_texture_id_[0], 0);
       check_gl_error("glFramebufferTexture2D");  
     glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, yuv_texture_id_[1], 0);
       check_gl_error("glFramebufferTexture2D");  
     glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, yuv_texture_id_[2], 0);
       check_gl_error("glFramebufferTexture2D");  

     glBindFramebuffer(GL_FRAMEBUFFER, 0);
       check_gl_error("glBindFramebuffer");

       GLint status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
       if (status != GL_FRAMEBUFFER_COMPLETE) {
           print_log("renderer.cpp", "setup_graphics", "FBO setting fault.", LOGERROR);
           LOG_ERROR("%d\n", status);
           return;
       }
    }

    void Renderer::setup_yuv_texture()
    {
       // Use tightly packed data
       glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
       check_gl_error("glPixelStorei");

     for (int i = 0; i < 3; i++) {
       if (yuv_texture_id_[i]) {
          glDeleteTextures(1, &yuv_texture_id_[i]);
         check_gl_error("glDeleteTextures");
       }
         glActiveTexture(GL_TEXTURE0+i);
         check_gl_error("glActiveTexture");
         // Generate texture object
          glGenTextures(1, &yuv_texture_id_[i]);
         check_gl_error("glGenTextures");
         glBindTexture(GL_TEXTURE_2D, yuv_texture_id_[i]);
         check_gl_error("glBindTexture");
       glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
         check_gl_error("glTexParameteri");
         glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
         check_gl_error("glTexParameteri");
         glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
         check_gl_error("glTexParameterf");
         glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
         check_gl_error("glTexParameterf");
       glEnable(GL_TEXTURE_2D);
       check_gl_error("glEnable");
       glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, maximum_yuv_width_[i], maximum_yuv_height_[i], 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
         check_gl_error("glTexImage2D");
     }
    }

    void Renderer::setup_graphics()
    {
       print_gl_string("Version", GL_VERSION);
       print_gl_string("Vendor", GL_VENDOR);
       print_gl_string("Renderer", GL_RENDERER);
       print_gl_string("Extensions", GL_EXTENSIONS);

       program_object_ = create_program(kVertexShader, kFragmentShader);
       if (!program_object_) {
           print_log("renderer.cpp", "setup_graphics", "Could not create program.", LOGERROR);
           return;
       }

       position_object_ = glGetAttribLocation(program_object_, "vPosition");
       check_gl_error("glGetAttribLocation");
       texture_position_object_ = glGetAttribLocation(program_object_, "vTexCoord");
       check_gl_error("glGetAttribLocation");

       yuv_texture_object_[0] = glGetUniformLocation(program_object_, "yTexture");
       check_gl_error("glGetUniformLocation");
     yuv_texture_object_[1] = glGetUniformLocation(program_object_, "uTexture");
       check_gl_error("glGetUniformLocation");
       yuv_texture_object_[2] = glGetUniformLocation(program_object_, "vTexture");
       check_gl_error("glGetUniformLocation");

     setup_yuv_texture();
       setup_render_to_texture();

     glViewport(0, 0, stream_yuv_width_[0], stream_yuv_height_[0]);//736, 480);//1920, 1080);//maximum_yuv_width_[0], maximum_yuv_height_[0]);
     check_gl_error("glViewport");
    }

    GLuint Renderer::create_program(const char* vertex_source, const char* fragment_source)
    {
       GLuint vertexShader = load_shader(GL_VERTEX_SHADER, vertex_source);
       if (!vertexShader) {
           return 0;
       }

       GLuint pixelShader = load_shader(GL_FRAGMENT_SHADER, fragment_source);
       if (!pixelShader) {
           return 0;
       }

       GLuint program = glCreateProgram();
       if (program) {
           glAttachShader(program, vertexShader);
           check_gl_error("glAttachShader");
           glAttachShader(program, pixelShader);
           check_gl_error("glAttachShader");
           glLinkProgram(program);
           /* Get a Status */
           GLint linkStatus = GL_FALSE;
            glGetProgramiv(program, GL_LINK_STATUS, &linkStatus);
           if (linkStatus != GL_TRUE) {
               GLint bufLength = 0;
                glGetProgramiv(program, GL_INFO_LOG_LENGTH, &bufLength);
               if (bufLength) {
                   char* buf = (char*) malloc(bufLength);
                   if (buf) {
                       glGetProgramInfoLog(program, bufLength, NULL, buf);
                       print_log("renderer.cpp", "create_program", "Could not link program.", LOGERROR);
                       LOG_ERROR("%s\n", buf);
                       free(buf);
                   }
               }
               glDeleteProgram(program);
               program = 0;
           }
       }
       return program;
    }

    GLuint Renderer::load_shader(GLenum shaderType, const char* pSource)
    {
       GLuint shader = glCreateShader(shaderType);
           if (shader) {
                glShaderSource(shader, 1, &pSource, NULL);
               glCompileShader(shader);
               /* Get a Status */
               GLint compiled = 0;
                glGetShaderiv(shader, GL_COMPILE_STATUS, &compiled);
               if (!compiled) {
                   GLint infoLen = 0;
                    glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &infoLen);
                   if (infoLen) {
                       char* buf = (char*) malloc(infoLen);
                       if (buf) {
                           glGetShaderInfoLog(shader, infoLen, NULL, buf);
                           print_log("renderer.cpp", "load_shader", "Could not link program.", LOGERROR);
                                     LOG_ERROR("%d :: %s\n", shaderType, buf);
                           free(buf);
                       }
                       glDeleteShader(shader);
                       shader = 0;
                   }
               }
           }
       return shader;
    }


    void Renderer::onDrawFrame(JNIEnv* jenv, jbyteArray yData, jbyteArray uData, jbyteArray vData)
    {
       set_draw_frame(jenv, yData, uData, vData);
       draw_frame();
       return;
    }

    void Renderer::setSize(int stream_width, int stream_height) {
     stream_yuv_width_[0] = stream_width;
     stream_yuv_width_[1] = stream_width/2;
     stream_yuv_width_[2] = stream_width/2;
     stream_yuv_height_[0] = stream_height;
     stream_yuv_height_[1] = stream_height/2;
     stream_yuv_height_[2] = stream_height/2;
    }

    void Renderer::onSurfaceChanged(int width, int height)
    {
     mobile_yuv_width_[0] = width;
     mobile_yuv_width_[1] = width/2;
     mobile_yuv_width_[2] = width/2;
     mobile_yuv_height_[0] = height;
     mobile_yuv_height_[1] = height/2;
     mobile_yuv_height_[2] = height/2;

     maximum_yuv_width_[0] = 1920;
     maximum_yuv_width_[1] = 1920/2;
     maximum_yuv_width_[2] = 1920/2;
     maximum_yuv_height_[0] = 1080;
     maximum_yuv_height_[1] = 1080/2;
     maximum_yuv_height_[2] = 1080/2;

     // If stream size not setting, default size D1
     //if (stream_yuv_width_[0] == 0) {
       stream_yuv_width_[0] = 736;
       stream_yuv_width_[1] = 736/2;
       stream_yuv_width_[2] = 736/2;
       stream_yuv_height_[0] = 480;
       stream_yuv_height_[1] = 480/2;
       stream_yuv_height_[2] = 480/2;
     //}

       setup_graphics();
       return;
    }

    Here are my fragment shader, vertex shader and coordinate data:

    static const char kVertexShader[] =
       "attribute vec4 vPosition;      \n"
         "attribute vec2 vTexCoord;        \n"
         "varying vec2 v_vTexCoord;        \n"
       "void main() {                        \n"
           "gl_Position = vPosition;       \n"
           "v_vTexCoord = vTexCoord;       \n"
       "}                                          \n";

    static const char kFragmentShader[] =
           "precision mediump float;               \n"
           "varying vec2 v_vTexCoord;          \n"
           "uniform sampler2D yTexture;        \n"
           "uniform sampler2D uTexture;        \n"
           "uniform sampler2D vTexture;        \n"
           "void main() {                      \n"
               "float y=texture2D(yTexture, v_vTexCoord).r;\n"
               "float u=texture2D(uTexture, v_vTexCoord).r - 0.5;\n"
               "float v=texture2D(vTexture, v_vTexCoord).r - 0.5;\n"
               "float r=y + 1.13983 * v;\n"
               "float g=y - 0.39465 * u - 0.58060 * v;\n"
               "float b=y + 2.03211 * u;\n"
               "gl_FragColor = vec4(r, g, b, 1.0);\n"
           "}\n";

    static const GLfloat kVertexInformation[] =
    {
            -1.0f, 1.0f,           // TexCoord 0 top left
            -1.0f,-1.0f,           // TexCoord 1 bottom left
             1.0f,-1.0f,           // TexCoord 2 bottom right
             1.0f, 1.0f            // TexCoord 3 top right
    };
    static const GLshort kTextureCoordinateInformation[] =
    {
             0, 0,         // TexCoord 0 top left
             0, 1,         // TexCoord 1 bottom left
             1, 1,         // TexCoord 2 bottom right
             1, 0          // TexCoord 3 top right
    };
    static const GLuint kStride = 0;//COORDS_PER_VERTEX * 4;
    static const GLshort kIndicesInformation[] =
    {
       0, 1, 2,
       0, 2, 3
    };
  • I am trying to use the RTSPPlay project from mooncatventures, but it shows me this error

    7 October 2013, by kamlesh shingarakhiya

    I am trying to use the RTSPPlay project from mooncatventures, but it shows me an error like this.

    Ld /Users/admin/Library/Developer/Xcode/DerivedData/Intercom-bpixnryueeelexfnzxtmgchcfgzq/Build/Products/Debug-iphonesimulator/Intercom.app/Intercom normal i386
       cd /Volumes/IPHONE/Intercom
       setenv IPHONEOS_DEPLOYMENT_TARGET 6.1
       setenv PATH "/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/usr/bin:/Applications/Xcode.app/Contents/Developer/usr/bin:/usr/bin:/bin:/usr/sbin:/sbin"
       /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang -arch i386 -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator6.1.sdk -L/Users/admin/Library/Developer/Xcode/DerivedData/Intercom-bpixnryueeelexfnzxtmgchcfgzq/Build/Products/Debug-iphonesimulator -F/Users/admin/Library/Developer/Xcode/DerivedData/Intercom-bpixnryueeelexfnzxtmgchcfgzq/Build/Products/Debug-iphonesimulator -F/Users/admin/Downloads/RTSPPlay-master -filelist /Users/admin/Library/Developer/Xcode/DerivedData/Intercom-bpixnryueeelexfnzxtmgchcfgzq/Build/Intermediates/Intercom.build/Debug-iphonesimulator/Intercom.build/Objects-normal/i386/Intercom.LinkFileList -Xlinker -objc_abi_version -Xlinker 2 -Xlinker -no_implicit_dylibs -fobjc-link-runtime -mios-simulator-version-min=6.1 -framework FFmpegDecoder -framework FFPlayer -framework SDL -framework OpenGLES -framework MediaPlayer -framework CoreMedia -framework CoreVideo -framework MobileCoreServices -framework AVFoundation -framework CFNetwork -lbz2 -lxml2 -lz.1.1.3 -lz -framework QuartzCore -framework Security -framework SystemConfiguration -framework UIKit -framework Foundation -framework CoreGraphics -o /Users/admin/Library/Developer/Xcode/DerivedData/Intercom-bpixnryueeelexfnzxtmgchcfgzq/Build/Products/Debug-iphonesimulator/Intercom.app/Intercom



    Undefined symbols for architecture i386:
     "_SDL_AddTimer", referenced from:
         _SDL_main in FFPlayer(main.o)
     "_SDL_CloseAudio", referenced from:
         _stream_component_close in FFPlayer(main.o)
     "_SDL_CondSignal", referenced from:
         _SDL_main in FFPlayer(main.o)
         _stream_component_close in FFPlayer(main.o)
         _packet_queue_abort in FFPlayer(main.o)
         _packet_queue_put in FFPlayer(main.o)
     "_SDL_CondWait", referenced from:
         _video_thread in FFPlayer(main.o)
         _subtitle_thread in FFPlayer(main.o)
         _packet_queue_get in FFPlayer(main.o)
     "_SDL_CreateCond", referenced from:
         _SDL_main in FFPlayer(main.o)
         _packet_queue_init in FFPlayer(main.o)
     "_SDL_CreateMutex", referenced from:
         _SDL_main in FFPlayer(main.o)
         _packet_queue_init in FFPlayer(main.o)
     "_SDL_CreateRGBSurface", referenced from:
         _SDL_main in FFPlayer(main.o)
         _video_open in FFPlayer(main.o)
     "_SDL_CreateRenderer", referenced from:
         _SDL_main in FFPlayer(main.o)
     "_SDL_CreateThread", referenced from:
         _SDL_main in FFPlayer(main.o)
         _stream_component_open in FFPlayer(main.o)
     "_SDL_CreateWindow", referenced from:
         _SDL_main in FFPlayer(main.o)
     "_SDL_Delay", referenced from:
         _video_thread in FFPlayer(main.o)
         _subtitle_thread in FFPlayer(main.o)
         _decode_thread in FFPlayer(main.o)
     "_SDL_DestroyCond", referenced from:
         _do_exit in FFPlayer(main.o)
         _stream_component_close in FFPlayer(main.o)
     "_SDL_DestroyMutex", referenced from:
         _do_exit in FFPlayer(main.o)
         _stream_component_close in FFPlayer(main.o)
     "_SDL_DestroyTexture", referenced from:
         _video_display in FFPlayer(main.o)
     "_SDL_EventState", referenced from:
         _SDL_main in FFPlayer(main.o)
     "_SDL_FillRect", referenced from:
         _video_display in FFPlayer(main.o)
     "_SDL_FreeSurface", referenced from:
         _SDL_main in FFPlayer(main.o)
         _do_exit in FFPlayer(main.o)
     "_SDL_GL_SetAttribute", referenced from:
         _SDL_main in FFPlayer(main.o)
     "_SDL_Init", referenced from:
         _SDL_main in FFPlayer(main.o)
     "_SDL_LockSurface", referenced from:
         _video_thread in FFPlayer(main.o)
     "_SDL_MapRGB", referenced from:
         _video_display in FFPlayer(main.o)
     "_SDL_OpenAudio", referenced from:
         _stream_component_open in FFPlayer(main.o)
     "_SDL_PauseAudio", referenced from:
         _stream_component_open in FFPlayer(main.o)
     "_SDL_PixelFormatEnumToMasks", referenced from:
         _SDL_main in FFPlayer(main.o)
         _video_open in FFPlayer(main.o)
     "_SDL_PollEvent", referenced from:
         _SDL_main in FFPlayer(main.o)
     "_SDL_PushEvent", referenced from:
         _sdl_refresh_timer_cb in FFPlayer(main.o)
         _video_thread in FFPlayer(main.o)
         _decode_thread in FFPlayer(main.o)
     "_SDL_RenderPresent", referenced from:
         _video_display in FFPlayer(main.o)
     "_SDL_SendQuit", referenced from:
         -[SDLUIKitDelegate quitSdl] in SDL_uikitappdelegate.o
         -[SDLUIKitDelegate applicationWillTerminate:] in SDL_uikitappdelegate.o
         -[SDLUIKitDelegate applicationDidEnterBackground:] in SDL_uikitappdelegate.o
     "_SDL_UnlockSurface", referenced from:
         _video_thread in FFPlayer(main.o)
     "_SDL_WaitEvent", referenced from:
         _SDL_main in FFPlayer(main.o)
     "_SDL_WaitThread", referenced from:
         _do_exit in FFPlayer(main.o)
         _stream_component_close in FFPlayer(main.o)
     "_SDL_mutexP", referenced from:
         _SDL_main in FFPlayer(main.o)
         _stream_component_close in FFPlayer(main.o)
         _video_thread in FFPlayer(main.o)
         _subtitle_thread in FFPlayer(main.o)
         _packet_queue_get in FFPlayer(main.o)
         _packet_queue_abort in FFPlayer(main.o)
         _packet_queue_flush in FFPlayer(main.o)
         ...
     "_SDL_mutexV", referenced from:
         _SDL_main in FFPlayer(main.o)
         _stream_component_close in FFPlayer(main.o)
         _video_thread in FFPlayer(main.o)
         _subtitle_thread in FFPlayer(main.o)
         _packet_queue_get in FFPlayer(main.o)
         _packet_queue_abort in FFPlayer(main.o)
         _packet_queue_flush in FFPlayer(main.o)
         ...
     "_av_audio_convert", referenced from:
         _sdl_audio_callback in FFPlayer(main.o)
     "_av_audio_convert_alloc", referenced from:
         _sdl_audio_callback in FFPlayer(main.o)
     "_av_audio_convert_free", referenced from:
         _stream_component_close in FFPlayer(main.o)
         _sdl_audio_callback in FFPlayer(main.o)
     "_av_bitstream_filter_next", referenced from:
         _show_formats in cmdutils.o
     "_av_close_input_file", referenced from:
         _decode_thread in FFPlayer(main.o)
     "_av_codec_next", referenced from:
         _show_formats in cmdutils.o
     "_av_dup_packet", referenced from:
         _packet_queue_put in FFPlayer(main.o)
     "_av_find_input_format", referenced from:
         _opt_format in FFPlayer(main.o)
     "_av_find_opt", referenced from:
         _opt_default in cmdutils.o
     "_av_find_stream_info", referenced from:
         _decode_thread in FFPlayer(main.o)
     "_av_free", referenced from:
         _SDL_main in FFPlayer(main.o)
         _free_subpicture in FFPlayer(main.o)
         _video_thread in FFPlayer(main.o)
         _packet_queue_get in FFPlayer(main.o)
     "_av_free_packet", referenced from:
         _sdl_audio_callback in FFPlayer(main.o)
         _video_thread in FFPlayer(main.o)
         _subtitle_thread in FFPlayer(main.o)
         _packet_queue_flush in FFPlayer(main.o)
         _decode_thread in FFPlayer(main.o)
     "_av_freep", referenced from:
         _free_subpicture in FFPlayer(main.o)
         _packet_queue_flush in FFPlayer(main.o)
     "_av_get_bits_per_sample_format", referenced from:
         _sdl_audio_callback in FFPlayer(main.o)
     "_av_get_int", referenced from:
         _video_thread in FFPlayer(main.o)
     "_av_get_string", referenced from:
         _set_context_opts in cmdutils.o
     "_av_gettime", referenced from:
         _SDL_main in FFPlayer(main.o)
         _get_master_clock in FFPlayer(main.o)
         _video_display in FFPlayer(main.o)
         _get_video_clock in FFPlayer(main.o)
         _stream_component_open in FFPlayer(main.o)
         _sdl_audio_callback in FFPlayer(main.o)
         _stream_pause in FFPlayer(main.o)
         ...
     "_av_iformat_next", referenced from:
         _show_formats in cmdutils.o
     "_av_init_packet", referenced from:
         _SDL_main in FFPlayer(main.o)
         _decode_thread in FFPlayer(main.o)
     "_av_log", referenced from:
         _decode_thread in FFPlayer(main.o)
     "_av_malloc", referenced from:
         _packet_queue_put in FFPlayer(main.o)
     "_av_mallocz", referenced from:
         _SDL_main in FFPlayer(main.o)
     "_av_metadata_get", referenced from:
         _decode_thread in FFPlayer(main.o)
     "_av_oformat_next", referenced from:
         _show_formats in cmdutils.o
     "_av_open_input_file", referenced from:
         _decode_thread in FFPlayer(main.o)
     "_av_protocol_next", referenced from:
         _show_formats in cmdutils.o
     "_av_read_frame", referenced from:
         _decode_thread in FFPlayer(main.o)
     "_av_read_pause", referenced from:
         _decode_thread in FFPlayer(main.o)
     "_av_read_play", referenced from:
         _decode_thread in FFPlayer(main.o)
     "_av_realloc", referenced from:
         _opt_default in cmdutils.o
     "_av_register_all", referenced from:
         _SDL_main in FFPlayer(main.o)
     "_av_set_string3", referenced from:
         _opt_default in cmdutils.o
         _set_context_opts in cmdutils.o
     "_av_strdup", referenced from:
         _parse_options in cmdutils.o
     "_av_strlcat", referenced from:
         _show_help_options in cmdutils.o
     "_av_strlcpy", referenced from:
         _show_help_options in cmdutils.o
         _SDL_main in FFPlayer(main.o)
     "_avcodec_alloc_context2", referenced from:
         _SDL_main in FFPlayer(main.o)
     "_avcodec_alloc_frame", referenced from:
         _video_thread in FFPlayer(main.o)
     "_avcodec_close", referenced from:
         _stream_component_close in FFPlayer(main.o)
     "_avcodec_decode_audio3", referenced from:
         _sdl_audio_callback in FFPlayer(main.o)
     "_avcodec_decode_subtitle2", referenced from:
         _subtitle_thread in FFPlayer(main.o)
     "_avcodec_decode_video2", referenced from:
         _video_thread in FFPlayer(main.o)
     "_avcodec_find_decoder", referenced from:
         _stream_component_open in FFPlayer(main.o)
     "_avcodec_flush_buffers", referenced from:
         _sdl_audio_callback in FFPlayer(main.o)
         _video_thread in FFPlayer(main.o)
         _subtitle_thread in FFPlayer(main.o)
     "_avcodec_open", referenced from:
         _stream_component_open in FFPlayer(main.o)
     "_avcodec_register_all", referenced from:
         _SDL_main in FFPlayer(main.o)
     "_avcodec_thread_init", referenced from:
         _stream_component_open in FFPlayer(main.o)
     "_avcodec_version", referenced from:
         _print_all_lib_versions in cmdutils.o
     "_avdevice_register_all", referenced from:
         _SDL_main in FFPlayer(main.o)
     "_avdevice_version", referenced from:
         _print_all_lib_versions in cmdutils.o
     "_avfilter_version", referenced from:
         _print_all_lib_versions in cmdutils.o
     "_avformat_alloc_context", referenced from:
         _SDL_main in FFPlayer(main.o)
     "_avformat_seek_file", referenced from:
         _decode_thread in FFPlayer(main.o)
     "_avformat_version", referenced from:
         _print_all_lib_versions in cmdutils.o
     "_avpicture_alloc", referenced from:
         _SDL_main in FFPlayer(main.o)
     "_avutil_version", referenced from:
         _print_all_lib_versions in cmdutils.o
     "_dump_format", referenced from:
         _decode_thread in FFPlayer(main.o)
     "_parse_date", referenced from:
         _parse_time_or_die in cmdutils.o
     "_sws_getCachedContext", referenced from:
         _video_thread in FFPlayer(main.o)
     "_sws_getContext", referenced from:
         _SDL_main in FFPlayer(main.o)
     "_sws_scale", referenced from:
         _video_thread in FFPlayer(main.o)
     "_swscale_version", referenced from:
         _print_all_lib_versions in cmdutils.o
     "_url_feof", referenced from:
         _decode_thread in FFPlayer(main.o)
     "_url_ferror", referenced from:
         _decode_thread in FFPlayer(main.o)
     "_url_ftell", referenced from:
         _SDL_main in FFPlayer(main.o)
     "_url_set_interrupt_cb", referenced from:
         _decode_thread in FFPlayer(main.o)
    ld: symbol(s) not found for architecture i386
    clang: error: linker command failed with exit code 1 (use -v to see invocation)

    Can someone please help me? Thanks, guys. I have checked the SDL.framework and I don't see any problem with it. Do I have to add some library to my project, or install something on my Mac, to run this project?
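
    The undefined _SDL_* and _av* symbols suggest that the SDL and libav static libraries are not actually being pulled into the i386 (simulator) link, even though the FFPlayer/FFmpegDecoder frameworks reference them. As a hedged sketch only (the search path and library names below are placeholders and depend on how those libraries were built), the missing pieces would normally be added to the target's "Other Linker Flags", and every archive involved needs an i386 slice for the simulator:

    # hypothetical additions to Other Linker Flags; adjust to your own build
    -L$(PROJECT_DIR)/libs/i386
    -lSDL -lavformat -lavcodec -lavutil -lavdevice -lavfilter -lswscale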

  • Converting DVD image with subtitles to MKV using avconv

    16 January 2016, by Carlos Eugenio Thompson Pinzón

    This is the procedure I know for converting a DVD image to another video format (e.g. MP4):

    1. concatenate the VTS_01_n.VOB files inside the VIDEO_TS folder (for n >= 0) into a single VOB file.
    2. use avconv or ffmpeg to convert that VOB into another format (a concrete sketch of both steps follows below).
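
    As a concrete sketch of those two steps (essentially the same commands that appear in the update further down, with placeholder file names):

    # 1. concatenate the title's VOB files into a single stream
    cat VIDEO_TS/VTS_01_1.VOB VIDEO_TS/VTS_01_2.VOB VIDEO_TS/VTS_01_3.VOB > mymovie.VOB
    # 2. convert the concatenated VOB to another container/codec
    avconv -i mymovie.VOB mymovie.mkv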

    So far so good. However, now I want to convert the DVD image together with its subtitles. As far as I know, the MKV format supports subtitles, so it seems an obvious choice. Alternatively, I could use any other format with hard subtitles (subtitles burnt into the video image).

    However, the subtitle encoding in the DVD image is dvdsub, and I get the following error:

    Only SUBTITLE_ASS type supported.
    Subtitle encoding failed

    However, when running avconv -codecs I get:

    ...
    DES    dvdsub          DVD subtitles
    ...

    And the -c:s copy switch, while it prevents the command from failing, does not seem to produce a subtitle stream that the player can understand.
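
    One thing that may be worth checking (my own suggestion, not part of the original question): avconv only picks one stream of each type by default, so it can help to map the streams explicitly and copy just the subtitle track you actually want, for example along these lines (stream indices taken from the listing in the update below, output codecs being the defaults shown there):

    # hypothetical mapping: video, second audio track, first dvdsub track (copied)
    avconv -i mymovie.VOB -map 0:0 -map 0:9 -map 0:1 \
           -c:v mpeg4 -c:a libvorbis -c:s copy mymovie.mkv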

    So, how can I create ASS subtitles from dvdsub using avconv?

    My VOB file has eight subtitle channels and two audio channels. The Ubuntu video app does not show any subtitles, and only the first audio channel seems to work, so the DVD image might be broken.

    Another file, from a double-layer DVD, shows the same subtitle encoding error; however, when using the -c:s copy switch it additionally displays:

    Application provided invalid, non monotonically increasing dts to muxer in stream 2: 49 >= 49
    av_interleaved_write_frame(): Invalid argument

    Thank you in advance for any ideas on how to solve these problems.

    (I am using Ubuntu where ffmpeg is an alias for avconv. I know it is possible to install the real ffmpeg but so far I have not done so.)


    Update: commands and console outputs:

    commands

    cat VTS_01_1.VOB VTS_01_2.VOB VTS_01_3.VOB VTS_01_4.VOB VTS_01_5.VOB > ~/temp/mymovie.VOB
    cd ~/temp
    avconv -i mymovie.VOB

    output

    avconv version 0.8.6-6:0.8.6-1ubuntu2, Copyright (c) 2000-2013 the Libav developers
     built on Mar 30 2013 22:20:06 with gcc 4.7.2
    [mpeg @ 0x1a64d40] max_analyze_duration reached
    Input #0, mpeg, from 'mymovie.VOB':
     Duration: 00:00:01.95, start: 0.280633, bitrate: -2147483 kb/s
       Stream #0.0[0x1e0]: Video: mpeg2video (Main), yuv420p, 720x480 [PAR 8:9 DAR 4:3], 7500 kb/s, 29.97 fps, 29.97 tbr, 90k tbn, 59.94 tbc
       Stream #0.1[0x20]: Subtitle: dvdsub
       Stream #0.2[0x21]: Subtitle: dvdsub
       Stream #0.3[0x22]: Subtitle: dvdsub
       Stream #0.4[0x23]: Subtitle: dvdsub
       Stream #0.5[0x24]: Subtitle: dvdsub
       Stream #0.6[0x25]: Subtitle: dvdsub
       Stream #0.7[0x26]: Subtitle: dvdsub
       Stream #0.8[0x27]: Subtitle: dvdsub
       Stream #0.9[0x80]: Audio: ac3, 48000 Hz, 5.1, s16, 448 kb/s
       Stream #0.10[0x81]: Audio: ac3, 48000 Hz, 5.1, s16, 448 kb/s
    At least one output file must be specified

    command

    avconv -i mymovie.VOB mymovie.mkv

    output

    avconv version 0.8.6-6:0.8.6-1ubuntu2, Copyright (c) 2000-2013 the Libav developers
     built on Mar 30 2013 22:20:06 with gcc 4.7.2
    [mpeg @ 0x1cdad40] max_analyze_duration reached
    Input #0, mpeg, from 'mymovie.VOB':
     Duration: 00:00:01.95, start: 0.280633, bitrate: -2147483 kb/s
       Stream #0.0[0x1e0]: Video: mpeg2video (Main), yuv420p, 720x480 [PAR 8:9 DAR 4:3], 7500 kb/s, 29.97 fps, 29.97 tbr, 90k tbn, 59.94 tbc
       Stream #0.1[0x20]: Subtitle: dvdsub
       Stream #0.2[0x21]: Subtitle: dvdsub
       Stream #0.3[0x22]: Subtitle: dvdsub
       Stream #0.4[0x23]: Subtitle: dvdsub
       Stream #0.5[0x24]: Subtitle: dvdsub
       Stream #0.6[0x25]: Subtitle: dvdsub
       Stream #0.7[0x26]: Subtitle: dvdsub
       Stream #0.8[0x27]: Subtitle: dvdsub
       Stream #0.9[0x80]: Audio: ac3, 48000 Hz, 5.1, s16, 448 kb/s
       Stream #0.10[0x81]: Audio: ac3, 48000 Hz, 5.1, s16, 448 kb/s
    File 'mymovie.mkv' already exists. Overwrite ? [y/N] y
    [buffer @ 0x1ce23c0] w:720 h:480 pixfmt:yuv420p
    Output #0, matroska, to 'mymovie.mkv':
     Metadata:
       encoder         : Lavf53.21.1
       Stream #0.0: Video: mpeg4, yuv420p, 720x480 [PAR 8:9 DAR 4:3], q=2-31, 200 kb/s, 1k tbn, 29.97 tbc
       Stream #0.1: Audio: libvorbis, 48000 Hz, 5.1, s16
       Stream #0.2: Subtitle: ass, 200 kb/s
    Stream mapping:
     Stream #0:0 -> #0:0 (mpeg2video -> mpeg4)
     Stream #0:9 -> #0:1 (ac3 -> libvorbis)
     Stream #0:1 -> #0:2 (dvdsub -> ass)
    Press ctrl-c to stop encoding
    [ass @ 0x1ce0140] Only SUBTITLE_ASS type supported.
    Subtitle encoding failed

    command

    avconv -i mymovie.VOB -c:s copy mymovie.mkv

    output omitted.

    command

    avconv -i mymovie.mkv

    output

    avconv version 0.8.6-6:0.8.6-1ubuntu2, Copyright (c) 2000-2013 the Libav developers
     built on Mar 30 2013 22:20:06 with gcc 4.7.2
    [matroska,webm @ 0xbc1d40] Estimating duration from bitrate, this may be inaccurate
    Input #0, matroska,webm, from 'mymovie.mkv':
     Metadata:
       ENCODER         : Lavf53.21.1
     Duration: 01:05:09.47, start: 0.000000, bitrate: N/A
       Stream #0.0: Video: mpeg4 (Simple Profile), yuv420p, 720x480 [PAR 8:9 DAR 4:3], 29.97 fps, 29.97 tbr, 1k tbn, 30k tbc (default)
       Stream #0.1: Audio: vorbis, 48000 Hz, 5.1, s16 (default)
       Stream #0.2: Subtitle: dvdsub (default)
    At least one output file must be specified

    Now, for the double-layer :
    commands

    cat VTS_01_1.VOB VTS_01_2.VOB VTS_01_3.VOB VTS_01_4.VOB VTS_01_5.VOB VTS_01_6.VOB VTS_01_7.VOB VTS_01_8.VOB > ~/temp/mylongmovie.VOB
    cd ~/temp
    avconv -i mylongmovie.VOB mylongmovie.mkv

    output

    avconv version 0.8.6-6:0.8.6-1ubuntu2, Copyright (c) 2000-2013 the Libav developers
     built on Mar 30 2013 22:20:06 with gcc 4.7.2
    [mpeg @ 0x13c2d40] max_analyze_duration reached
    Input #0, mpeg, from 'Cosmos-0203.VOB':
     Duration: 00:00:30.24, start: 0.280633, bitrate: 2103365 kb/s
       Stream #0.0[0x1e0]: Video: mpeg2video (Main), yuv420p, 720x480 [PAR 8:9 DAR 4:3], 8000 kb/s, 27.46 fps, 59.94 tbr, 90k tbn, 59.94 tbc
       Stream #0.1[0x20]: Subtitle: dvdsub
       Stream #0.2[0x21]: Subtitle: dvdsub
       Stream #0.3[0x22]: Subtitle: dvdsub
       Stream #0.4[0x23]: Subtitle: dvdsub
       Stream #0.5[0x24]: Subtitle: dvdsub
       Stream #0.6[0x25]: Subtitle: dvdsub
       Stream #0.7[0x26]: Subtitle: dvdsub
       Stream #0.8[0x27]: Subtitle: dvdsub
       Stream #0.9[0x81]: Audio: ac3, 48000 Hz, 5.1, s16, 448 kb/s
       Stream #0.10[0x80]: Audio: ac3, 48000 Hz, 5.1, s16, 448 kb/s
    File 'mylongmovie.mkv' already exists. Overwrite ? [y/N] y
    [buffer @ 0x13ca3c0] w:720 h:480 pixfmt:yuv420p
    Output #0, matroska, to 'mylongmovie.mkv':
     Metadata:
       encoder         : Lavf53.21.1
       Stream #0.0: Video: mpeg4, yuv420p, 720x480 [PAR 8:9 DAR 4:3], q=2-31, 200 kb/s, 1k tbn, 59.94 tbc
       Stream #0.1: Audio: libvorbis, 48000 Hz, 5.1, s16
       Stream #0.2: Subtitle: ass, 200 kb/s
    Stream mapping:
     Stream #0:0 -> #0:0 (mpeg2video -> mpeg4)
     Stream #0:9 -> #0:1 (ac3 -> libvorbis)
     Stream #0:1 -> #0:2 (dvdsub -> ass)
    Press ctrl-c to stop encoding
    [ass @ 0x13d19c0] Only SUBTITLE_ASS type supported.
    Subtitle encoding failed

    command

    avconv -i mylongmovie.VOB -c:s copy mylongmovie.mkv

    output

    avconv version 0.8.6-6:0.8.6-1ubuntu2, Copyright (c) 2000-2013 the Libav developers
     built on Mar 30 2013 22:20:06 with gcc 4.7.2
    [mpeg @ 0xce1d40] max_analyze_duration reached
    Input #0, mpeg, from 'mylongmovie.VOB':
     Duration: 00:00:30.24, start: 0.280633, bitrate: 2103365 kb/s
       Stream #0.0[0x1e0]: Video: mpeg2video (Main), yuv420p, 720x480 [PAR 8:9 DAR 4:3], 8000 kb/s, 27.46 fps, 59.94 tbr, 90k tbn, 59.94 tbc
       Stream #0.1[0x20]: Subtitle: dvdsub
       Stream #0.2[0x21]: Subtitle: dvdsub
       Stream #0.3[0x22]: Subtitle: dvdsub
       Stream #0.4[0x23]: Subtitle: dvdsub
       Stream #0.5[0x24]: Subtitle: dvdsub
       Stream #0.6[0x25]: Subtitle: dvdsub
       Stream #0.7[0x26]: Subtitle: dvdsub
       Stream #0.8[0x27]: Subtitle: dvdsub
       Stream #0.9[0x81]: Audio: ac3, 48000 Hz, 5.1, s16, 448 kb/s
       Stream #0.10[0x80]: Audio: ac3, 48000 Hz, 5.1, s16, 448 kb/s
    File 'mylongmovie.mkv' already exists. Overwrite ? [y/N] y
    [buffer @ 0xce93c0] w:720 h:480 pixfmt:yuv420p
    Output #0, matroska, to 'mylongmovie.mkv':
     Metadata:
       encoder         : Lavf53.21.1
       Stream #0.0: Video: mpeg4, yuv420p, 720x480 [PAR 8:9 DAR 4:3], q=2-31, 200 kb/s, 1k tbn, 59.94 tbc
       Stream #0.1: Audio: libvorbis, 48000 Hz, 5.1, s16
       Stream #0.2: Subtitle: dvdsub
    Stream mapping:
     Stream #0:0 -> #0:0 (mpeg2video -> mpeg4)
     Stream #0:9 -> #0:1 (ac3 -> libvorbis)
     Stream #0:1 -> #0:2 (copy)
    Press ctrl-c to stop encoding
    [matroska @ 0xce4b40] Application provided invalid, non monotonically increasing dts to muxer in stream 2: 49 >= 49
    av_interleaved_write_frame(): Invalid argument