Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • Reading RTMP streams using FFMPEG returns AVERROR_EOF randomly

    9 October 2013, by user2628781

    I am using FFmpeg to receive RTMP streams. This logic sits inside a custom video player I am building.

    I managed to connect to the RTMP stream and display the video correctly. However, after some time the stream terminates prematurely with an AVERROR_EOF when I call av_read_frame().

    This tells my code that the file has ended, so the demux, video, and audio threads shut down as if playback were complete. But the video has not actually reached its end (in the case of a file streamed over RTMP), and a live stream should run forever. The EOFs arrive at random: the stream may run for, say, 7 minutes before one occurs, or only 3.

    Is this characteristic behaviour of RTMP, or am I doing something incorrectly? I am also having a similar problem with HTTP Live Streams through FFmpeg.

    A small snippet of the code is provided below:

    AVPacket packet;
    //start timeout timer and timeout
    __interrupt_timer.start();

    int ret = av_read_frame(format_context, &packet); //0: ok, <0: error/end

    //invalidate the timer
    __interrupt_timer.invalidate();

    if (ret != 0) {
        if (ret == AVERROR_EOF) { //end of file. FIXME: why no eof if replaying by seek(0)?            
            if (!eof) {
                eof = true;
                started_ = false;
                pkt->data = QByteArray(); //flush
                pkt->markEnd();
                qDebug("End of file. %s %d", __FUNCTION__, __LINE__);
                emit finished();
                return true;
            }
            pkt->data = QByteArray(); //flush
            //return true;
            return false; //frames after eof are eof frames
        } else if (ret == AVERROR_INVALIDDATA) {
            qWarning("AVERROR_INVALIDDATA");
        } else if (ret == AVERROR(EAGAIN)) {
            return true;
        }
        qWarning("[AVDemuxer] error: %s", av_err2str(ret));
        return false;
    }
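
    For what it's worth, AVERROR_EOF on a live RTMP source often means the connection was dropped rather than that the stream truly ended. Below is a minimal sketch of one workaround, assuming the original URL is still available; read_frame_with_reconnect and url are hypothetical names, not part of the code above.

    #include <libavformat/avformat.h>

    //read a frame; on EOF, assume the connection dropped and reopen it
    static int read_frame_with_reconnect(AVFormatContext **fmt_ctx,
                                         const char *url, AVPacket *pkt)
    {
        int ret = av_read_frame(*fmt_ctx, pkt);
        if (ret != AVERROR_EOF)
            return ret; //success, or a real error other than EOF

        //tear down the old context and reconnect to the live source
        avformat_close_input(fmt_ctx);
        ret = avformat_open_input(fmt_ctx, url, NULL, NULL);
        if (ret < 0)
            return ret; //reconnect failed, give up
        ret = avformat_find_stream_info(*fmt_ctx, NULL);
        if (ret < 0)
            return ret;
        return av_read_frame(*fmt_ctx, pkt); //resume reading
    }

    A retry limit and a short delay between attempts would be sensible additions in a real player.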
    
  • Trying to compile the FFmpeg libraries for the iPhoneOS platform with armv6 and armv7 architectures

    9 October 2013, by Er Amit Sachdeva

    I'm trying to compile the FFmpeg libraries for the iOS platform with the armv7 architecture. I downloaded the FFmpeg sources from http://www.ffmpeg.org.

    I can successfully build the static libraries for the i386 architecture for the iPhone Simulator, but I need the libraries for iPhoneOS, and there the configure command gives me an error. My command details are below.

    For i386 (which is error-free):

    ./configure  --enable-cross-compile --disable-debug --disable-ffmpeg --disable-ffplay --disable-ffprobe --disable-ffserver --disable-doc --disable-network  --disable-everything  --enable-protocol=file --enable-demuxer=mov  --enable-muxer=mpegts --enable-bsf=h264_mp4toannexb   --arch=i386 --target-os=darwin --cc=/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/usr/bin/gcc --as='./gas-preprocessor/gas-preprocessor.pl /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/gcc' --sysroot=/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneOS6.1.sdk --cpu=i386 --extra-cflags='-arch i386' --extra-ldflags='-arch i386 -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator6.1.sdk' --prefix="/Users/amit/Desktop/LivuLib-master/LivuLIb/ffmpeg/bin/newLib"
    

    For armv7:

    ./configure  --enable-cross-compile --disable-debug --disable-ffmpeg --disable-ffplay --disable-ffprobe --disable-ffserver --disable-doc --disable-network  --disable-everything  --enable-protocol=file --enable-demuxer=mov  --enable-muxer=mpegts --enable-bsf=h264_mp4toannexb   --arch=armv7 --target-os=darwin --cc=/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/gcc --as='./gas-preprocessor/gas-preprocessor.pl /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/gcc' --sysroot=/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS6.1.sdk --cpu=cortex-a9 --extra-cflags='-arch armv7' --extra-ldflags='-arch armv7 -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS6.1.sdk' --prefix="/Users/amit/Desktop/LivuLib-master/LivuLIb/ffmpeg/bin/newLib"
    

    For the armv7 command above I'm getting this error:

    /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/gcc is unable to create an executable file.
    C compiler test failed.
    
    If you think configure made a mistake, make sure you are using the latest
    version from SVN.
    

    Please suggest a possible solution. Why is it not working, or do I need to update something? Thanks.
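
    One thing worth checking first (an assumption based on the paths above, not a confirmed diagnosis): since Xcode 4.3 the platform SDKs live inside Xcode.app, so the --sysroot=/Developer/... path in the armv7 command probably does not exist on disk, which would make every compiler test fail. Note that the --extra-ldflags in the same command already uses the Xcode.app location. On a default Xcode install the corrected flag would look like this:

    # point --sysroot at the SDK inside Xcode.app instead of /Developer
    --sysroot=/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS6.1.sdk

    config.log in the build directory records the exact compiler invocation that failed, which should confirm or rule this out.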

  • FFmpeg libx264 iOS

    8 October 2013, by user441918

    I have to stream RTSP video from an IP camera in iOS. I found that I can use FFmpeg to handle the RTSP stream, and it seems to work. When I reviewed FFmpeg's legal terms, they say "Make sure your program is not using any GPL libraries (notably libx264)". Does this mean I cannot use the libx264 library in an iOS application, or do I have to compile the FFmpeg library with a "disable-gpl" option?

    If FFmpeg with libx264 cannot be used on iOS, is there an alternative library available?
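
    For context: libx264 is an encoder, so it is only needed to produce H.264, not to play it; FFmpeg's native H.264 decoder is part of the LGPL core. GPL code is excluded by default and is only pulled in if you pass --enable-gpl (plus --enable-libx264) at configure time. A minimal sketch of the licensing-relevant part of an LGPL-only configure, with all other flags omitted:

    # FFmpeg is LGPL by default: GPL code such as libx264 is only
    # included if explicitly requested, so configure without
    # --enable-gpl and --enable-libx264 (shown explicitly here)
    ./configure --disable-gpl

    The resulting libraries then fall under the LGPL, whose own conditions on linking still apply when shipping an iOS app.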

  • Capture raw video using v4l2, encode with the MPEG-2 codec, and stream MPEG-TS to a socket

    8 October 2013, by Dennis Jansky

    Sorry for the inconvenience, but I am a newbie to ffmpeg. I am trying to take raw video from a camera, encode it with MPEG-2, and transport it using MPEG-TS. This seemed relatively easy until I added the requirement that it all be done over sockets. I have looked at the ffmpeg documentation and I am able to capture the video, encode it, and stream it to a file. However, I would like to cut out the file and send the data straight down a socket. Does anyone know if this can be done? Again, I am sorry if this is a newbie question :-\

    Thanks, Dennis
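
    This is possible without any intermediate file: ffmpeg's mpegts muxer can write straight to a network protocol such as udp:// or tcp://. A minimal sketch, assuming the camera is /dev/video0 and the receiver's address is 192.168.1.10:1234 (both assumptions):

    # capture from v4l2, encode as MPEG-2, mux to MPEG-TS over UDP
    ffmpeg -f v4l2 -i /dev/video0 -c:v mpeg2video -b:v 4M -f mpegts udp://192.168.1.10:1234

    On the receiving end, ffplay udp://192.168.1.10:1234 (or any MPEG-TS-capable player) can consume the stream; for TCP, append ?listen to whichever side should wait for the connection.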

  • YUV to RGB conversion using OpenGL ES 2.0 in Android

    8 October 2013, by 101110101100111111101101

    I have two questions about YUV to RGB conversion using OpenGL ES 2.0 in Android.

    The first one needs a little background.

    -------BACKGROUND-------

    When I put in random data (as YUV), it renders something plausible. (I can't be certain, because the data is random.)

    However, when I put in 'real' data, the output is awkward; the important point is that it still 'renders something'.

    I checked my code everywhere (renderer, data parser, etc.) and found no doubtful part, except for the use of GL_RGB-style parameters in 'glRenderbufferStorage' and 'glTexImage2D'.

    I changed the 'glRenderbufferStorage' parameter many times (GL_RGBA4, GL_RGB565, etc.; currently it is GL_RGBA4), but nothing changed. (Some parameters cause an error and nothing renders at all.)

    I tried the same with 'glTexImage2D's parameters (currently GL_LUMINANCE).

    Before conversion: YUV420P (ffmpeg's PIX_FMT_YUV420P; 12 bpp, each 2x2 block holding 4 Y samples, 1 Cb, and 1 Cr). After conversion: RGB (I don't know the practical difference within the RGB family: RGBA, RGB565, etc.). The decoded linesize is 736 and the height is 480 (fixed).

    So the buffer size is 12 * 736 * 480 / 8 bytes: the Y plane is 736 * 480 bytes, and the U and V planes are each a quarter of the Y size.

    ---------BACKGROUND END-----------

    I am wondering: do parameters such as GL_RGB, GL_RGBA4, or GL_RGB565 affect the output, not in terms of quality, but in terms of whether anything renders at all?
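
    For reference, the plane sizes described above imply a texture upload along these lines (a sketch; y_tex, u_tex, and the plane pointers are hypothetical names, and a GLES2 context is assumed):

    //rows in a YUV plane are byte-aligned, so disable 4-byte row padding
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

    //one single-channel texture per plane; U and V are half-size in
    //both dimensions for YUV420P (Y: 736x480, U and V: 368x240)
    glBindTexture(GL_TEXTURE_2D, y_tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, 736, 480, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, y_plane);

    glBindTexture(GL_TEXTURE_2D, u_tex); //likewise for the V plane
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, 368, 240, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, u_plane);

    Two details here can produce the 'renders something awkward' symptom: without mipmaps, the default GL_TEXTURE_MIN_FILTER makes a texture incomplete (it samples as black), and a plane uploaded with the wrong width scrambles the image, whereas the renderbuffer format (GL_RGBA4 vs GL_RGB565) mainly affects output precision.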

    The second question is about the fragment shader.

    My rendering engine uses three textures in the fragment shader (source attached below):

            "precision mediump float;               \n"
        "varying vec2 v_vTexCoord;          \n"
        "uniform sampler2D yTexture;        \n"
        "uniform sampler2D uTexture;        \n"
        "uniform sampler2D vTexture;        \n"
        "void main() {                      \n"
            "float y=texture2D(yTexture, v_vTexCoord).r;\n"
            "float u=texture2D(uTexture, v_vTexCoord).r;\n"
            "float v=texture2D(vTexture, v_vTexCoord).r;\n"
            "y=1.1643 * (y - 0.0625);\n"
            "u=u - 0.5;\n"
            "v=v - 0.5;\n"
            "float r=y + 1.5958 * v;\n"
            "float g=y - 0.39173 * u - 0.81290 * v;\n"
            "float b=y + 2.017 * u;\n"
            "gl_FragColor = vec4(r, g, b, 1.0);\n"
        "}\n";
    

    As you can see, there are three textures: the Y, U, and V planes. The shader converts them and sets gl_FragColor = vec4(r, g, b, 1.0). I don't understand what gl_FragColor = vec4(r, g, b, 1.0) means. Of course, I know gl_FragColor is being set, but how can I get the actual r, g, b values? Does it render into the texture automatically?
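
    To the last point: gl_FragColor is the shader's output. Whatever vec4 is assigned to it becomes the colour of the fragment being processed, and the GPU writes it into the currently bound framebuffer (or its renderbuffer/texture attachment); there is no CPU-side r, g, b variable to inspect. If you want the numeric values, you can read the framebuffer back after drawing (a sketch; x and y are hypothetical pixel coordinates):

    //read one pixel of the bound framebuffer back to the CPU;
    //GL_RGBA + GL_UNSIGNED_BYTE is the combination GLES2 guarantees
    GLubyte pixel[4];
    glReadPixels(x, y, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, pixel);
    //pixel[0..3] now hold the r, g, b, a values the shader wrote there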