Newest 'libx264' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/libx264

Articles published on the site

  • Where to find ffserver static for linux

    29 May 2013, by sajad

    I just need to download a static ffserver binary to use on Linux (Ubuntu 12.04 x64), built with the libx264 library. I don't want to take on the task of compiling it on Linux; I just want a portable static binary. I have spent a long time searching for this. Can anybody guide me?

  • libx264.dll exposes function x264_encoder_open_130, shouldn't this be x264_encoder_open?

    18 May 2013, by Kim Kardashian

    I checked my build of the x264 DLL and found that the open function is exported as 'x264_encoder_open_130', where 130 happens to be the specific build number of the x264 version I use. I build with MinGW on Windows and link against it, so my program would crash if I tried to use a newer DLL. Is there a way to get around this?
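
    As far as I can tell, this renaming is deliberate: x264.h wraps x264_encoder_open in a preprocessor glue macro that appends X264_BUILD to the symbol name, precisely so that linking against a DLL built from a different, potentially ABI-incompatible version fails instead of misbehaving at run time. A hedged sketch of one workaround is below: resolve the versioned export at run time with LoadLibrary/GetProcAddress. The probed build-number range is an assumption for illustration, and mixing builds can still break if the ABI (e.g. the x264_param_t layout) changed between them.

    // Sketch only: resolve the versioned x264_encoder_open_### export at run time.
    // The build range probed here is an assumption; an ABI mismatch between builds
    // can still misbehave even if the symbol resolves.
    #include <windows.h>
    #include <stdio.h>
    #include <x264.h>

    typedef x264_t *(*encoder_open_fn)(x264_param_t *);

    static encoder_open_fn load_encoder_open(HMODULE dll)
    {
        char name[64];
        // Try the build we compiled against first, then a few older ones.
        for (int build = X264_BUILD; build >= X264_BUILD - 5; build--)
        {
            snprintf(name, sizeof(name), "x264_encoder_open_%d", build);
            FARPROC sym = GetProcAddress(dll, name);
            if (sym)
                return (encoder_open_fn)sym;
        }
        return NULL;
    }

    int main(void)
    {
        HMODULE dll = LoadLibraryA("libx264.dll");
        if (!dll)
            return 1;
        encoder_open_fn open_fn = load_encoder_open(dll);
        printf("x264_encoder_open resolved to %p\n", (void *)open_fn);
        FreeLibrary(dll);
        return 0;
    }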

  • Decoding with libjpeg -> encoding with x264, strange artefacts on frames

    15 May 2013, by mmmaaak

    I have a collection of JPEGs which must be decoded with libjpeg and then encoded with x264 (the encoded packets are then streamed via RTMP). The code I use for decoding:

    struct my_error_mgr 
    {   
        struct jpeg_error_mgr pub;
        jmp_buf setjmp_buffer;
    };
    typedef my_error_mgr *my_error_ptr;
    
    METHODDEF(void) my_error_exit (j_common_ptr cinfo)
    {
        my_error_ptr myerr = (my_error_ptr) cinfo->err;
        (*cinfo->err->output_message) (cinfo);
        longjmp(myerr->setjmp_buffer, 1);   
    }
    
    void init_source(j_decompress_ptr ptr)
    {
        Q_UNUSED(ptr)
    }
    
    boolean fill_input_buffer(j_decompress_ptr ptr) 
    {
        Q_UNUSED(ptr)
        return TRUE;
    }
    
    void term_source(j_decompress_ptr ptr)
    {
        Q_UNUSED(ptr)
    }
    
    void skip_input_data(j_decompress_ptr ptr, long num_bytes)
    {
        if(num_bytes>0) 
        {
            ptr->src->next_input_byte+=(size_t)num_bytes;
            ptr->src->bytes_in_buffer-=(size_t)num_bytes;   
        }
    }
    
    EtherDecoder::EtherDecoder(QObject *parent):
    QObject(parent)
    {
    }
    
    void EtherDecoder::dataBlockReady(QByteArray data)
    {
        jpeg_decompress_struct decompressInfo;
        jpeg_create_decompress(&decompressInfo);
        my_error_mgr err;
        decompressInfo.do_fancy_upsampling = FALSE;
        decompressInfo.src = (jpeg_source_mgr *) (*decompressInfo.mem->alloc_small) ((j_common_ptr) &decompressInfo, JPOOL_PERMANENT, sizeof(jpeg_source_mgr));
        decompressInfo.err = jpeg_std_error(&err.pub);
        err.pub.error_exit = my_error_exit;
        if (setjmp(err.setjmp_buffer))  
        {
            jpeg_destroy_decompress(&decompressInfo);
            return; 
        }
        decompressInfo.src->init_source = init_source;
        decompressInfo.src->resync_to_restart = jpeg_resync_to_restart;
        decompressInfo.src->fill_input_buffer = fill_input_buffer;
        decompressInfo.src->skip_input_data = skip_input_data;
        decompressInfo.src->term_source = term_source;
        decompressInfo.src->next_input_byte = reinterpret_cast<const JOCTET *>(data.data());
        decompressInfo.src->bytes_in_buffer = data.size();
        jpeg_read_header(&decompressInfo, TRUE);
        jpeg_start_decompress(&decompressInfo);
        int size = 0;
        int n_samples = 0;
        char *samples = new char[5242880];
        char *reserv = samples;
        while (decompressInfo.output_scanline < decompressInfo.output_height)   
        {
            n_samples = jpeg_read_scanlines(&decompressInfo, (JSAMPARRAY) &samples, 1);
            samples += n_samples * decompressInfo.image_width * decompressInfo.num_components;
            size += n_samples * decompressInfo.image_width * decompressInfo.num_components; 
        }
        jpeg_finish_decompress(&decompressInfo);
        QByteArray output(reserv, size);
        emit frameReady(output, decompressInfo.output_width, decompressInfo.output_height);
        jpeg_destroy_decompress(&decompressInfo);
        delete[] reserv;
    }
    

    When I emit the frameReady signal, I send the data to the encoder. The method where I initialize the encoder looks like this:

    bool EtherEncoder::initEncoder(unsigned int width, unsigned int height)
    {
        x264_param_t param;
        x264_param_default_preset(&param, "veryfast", "zerolatency");
        param.i_width=width;
        param.i_height=height;
        param.i_frame_total=0;
        param.i_csp=X264_CSP_I420;
        param.i_timebase_num=1;
        param.i_timebase_den=96000;
        param.b_annexb=true;
        param.b_repeat_headers=false;
        x264_param_apply_fastfirstpass(&param);
        x264_param_apply_profile(&param, "baseline");
        _context=x264_encoder_open(&param);
        if(!_context)
            return false;
        int nal_count;
        x264_nal_t *nals;
        if(x264_encoder_headers(_context, &nals, &nal_count)<0) 
        {
            x264_encoder_close(_context);
            _context=0;
            return false;   
        }
        _extradata=QByteArray();
        _width=width;
        _height=height;
        if(nal_count>0) 
        {
            _extradata=QByteArray(
                (const char *)nals[0].p_payload,
                nals[nal_count-1].p_payload+nals[nal_count-1].i_payload-nals[0].p_payload); 
        }
        return true;
    }
    

    And the encoding method:

    void EtherEncoder::onFrameReady(QByteArray data, int width, int height)
    {
        while(data.size()>0)    
        {
            if(!_context && initEncoder(width, height))
            {
                _timestampDelta=realTimestamp();
            }
            if(_context)
            {
                x264_picture_t pic;
                x264_picture_init(&pic);
                pic.i_type=X264_TYPE_AUTO;
                pic.i_pts=_timestampDelta*96000;
                pic.img.i_csp=X264_CSP_I420;
                pic.img.i_plane=3;
                int planeSize = width*height;
                uint8_t *p = (uint8_t*)data.data();
                pic.img.plane[0]=p;
                p+=planeSize;
                pic.img.plane[1]=p;
                p+=planeSize/4;
                pic.img.plane[2]=p;
                pic.img.i_stride[0]=width;
                pic.img.i_stride[1]=width/2;
                pic.img.i_stride[2]=width/2;
                if(_forceKeyFrame)
                {
                    pic.i_type=X264_TYPE_I;
                    _forceKeyFrame=false;
                }
                int nal_count;
                x264_nal_t *nals;
                int rc=x264_encoder_encode(_context, &nals, &nal_count, &pic, &pic);
                if(rc>0)
                {
                    _mutex.lock();
                    _packets.push_back(
                        Packet(
                            QByteArray(
                                (const char *)nals[0].p_payload, nals[nal_count-1].p_payload+nals[nal_count-1].i_payload-nals[0].p_payload),
                            _timestampDelta/96.0,
                            _timestampDelta/96.0,
                            pic.b_keyframe));
                    _timestampDelta+=40;
                    data.clear();
                    _mutex.unlock();
                    emit onPacketReady();
                }
            }   
        }
    }
    

    Decoding and encoding proceed without errors, and at the end I get a valid video stream, but it seems that in one of these steps I set invalid data for the decoder/encoder. I get only 1/4 of the image (the top-left part, as I understand it), and it has invalid colors and some color stripes. Maybe I set invalid strides and planes when encoding the frame, or maybe the way I set up the data for the libjpeg decoder is incorrect. Please ask questions about my code and I'll try to explain; it is exploding my brain. Thank you.
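
    A likely cause, offered here as an educated guess rather than a confirmed fix: by default libjpeg hands back packed scanlines (RGB for a colour JPEG), so the decoder above emits roughly width*height*3 bytes of interleaved data, while the encoder reads width*height*3/2 bytes as three planar I420 planes. Feeding packed data where planar Y/U/V is expected would produce exactly this kind of partial image with broken colours. A minimal conversion sketch is below; the function name and the assumption that the input holds packed RGB24 scanlines with even width/height are illustrative, not part of the question.

    // Sketch only: convert packed RGB24 (libjpeg's default colour output) into
    // planar I420 before handing the buffer to x264. Integer BT.601 coefficients,
    // chroma taken from the top-left pixel of each 2x2 block, even width/height assumed.
    #include <QByteArray>
    #include <cstdint>

    static QByteArray rgb24ToI420(const uint8_t *rgb, int width, int height)
    {
        QByteArray out(width * height * 3 / 2, 0);
        uint8_t *y = reinterpret_cast<uint8_t *>(out.data());
        uint8_t *u = y + width * height;
        uint8_t *v = u + width * height / 4;

        for (int row = 0; row < height; ++row)
        {
            for (int col = 0; col < width; ++col)
            {
                const uint8_t *px = rgb + (row * width + col) * 3;
                const int r = px[0], g = px[1], b = px[2];
                // Luma for every pixel.
                y[row * width + col] = uint8_t(((66 * r + 129 * g + 25 * b + 128) >> 8) + 16);
                // Chroma once per 2x2 block.
                if ((row & 1) == 0 && (col & 1) == 0)
                {
                    const int idx = (row / 2) * (width / 2) + col / 2;
                    u[idx] = uint8_t(((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128);
                    v[idx] = uint8_t(((112 * r - 94 * g - 18 * b + 128) >> 8) + 128);
                }
            }
        }
        return out;
    }

    The converted buffer then matches the plane pointers and strides already set in onFrameReady (a Y plane of width*height bytes, U and V planes of width*height/4 bytes each, strides of width and width/2). Alternatively, choosing a different decompressInfo.out_color_space before jpeg_start_decompress and converting from that output is another route; either way, a conversion step between the packed libjpeg output and the planar x264 input appears to be needed.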

  • Stream video and commands on the same connection, or split connections?

    10 May 2013, by bizzehdee

    Background

    I am in the middle of writing a client/server app that I will install on every machine in my office (roughly 30-35 machines). I currently have the client connecting to the server, and it can send mouse movement, mouse clicks and key strokes and execute certain commands. The next step is to stream back a video of the screen output: I am using the GDI method from "Fastest method of screen capturing" to capture the entire screen, and will use the x264 encoder to compress the frames and transmit them back to the client, which will then decode and display the stream (a rough sketch of that capture call follows this question).

    Question

    Is it best (in terms of reducing lag, ensuring all commands are delivered as fast as possible, and keeping the stream as live as possible) to transmit the video back along the same connection that I established for the commands, or should I establish a separate connection, on the same port or on a different port, to stream back the video?

    P.S.

    I am aware that VNC, RD and other things such as TeamViewer already exist and already do this sort of thing, but none of them support all the requirements we need within this system.
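
    For reference, a minimal sketch of the GDI capture step mentioned in the background (BitBlt into a 32-bit top-down DIB section) is shown below. The function name and the way the pixels are copied out are illustrative assumptions; the question only references the approach, not this code.

    // Sketch only: grab the primary screen into packed BGRA via GDI.
    // Error handling is minimal and the output format (32-bit BGRA, top-down rows)
    // is chosen for convenience before colour conversion and encoding.
    #include <windows.h>
    #include <vector>
    #include <cstdint>

    static bool captureScreenBGRA(std::vector<uint8_t> &out, int &width, int &height)
    {
        width  = GetSystemMetrics(SM_CXSCREEN);
        height = GetSystemMetrics(SM_CYSCREEN);

        HDC screenDC = GetDC(nullptr);
        HDC memDC    = CreateCompatibleDC(screenDC);

        BITMAPINFO bmi = {};
        bmi.bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
        bmi.bmiHeader.biWidth       = width;
        bmi.bmiHeader.biHeight      = -height;   // negative height = top-down rows
        bmi.bmiHeader.biPlanes      = 1;
        bmi.bmiHeader.biBitCount    = 32;
        bmi.bmiHeader.biCompression = BI_RGB;

        void *bits = nullptr;
        HBITMAP dib = CreateDIBSection(memDC, &bmi, DIB_RGB_COLORS, &bits, nullptr, 0);
        bool ok = false;
        if (dib && bits)
        {
            HGDIOBJ old = SelectObject(memDC, dib);
            ok = BitBlt(memDC, 0, 0, width, height, screenDC, 0, 0, SRCCOPY) != 0;
            if (ok)
            {
                const uint8_t *p = static_cast<const uint8_t *>(bits);
                out.assign(p, p + size_t(width) * height * 4);   // packed BGRA
            }
            SelectObject(memDC, old);
        }
        if (dib) DeleteObject(dib);
        DeleteDC(memDC);
        ReleaseDC(nullptr, screenDC);
        return ok;
    }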

  • Is this right when I use x264?

    22 April 2013, by chinayin

    Is this the right way to use the new x264 API? I want to get a frame, encode it with x264, and then save it to a .264 file, but I do not know whether this is right:

    picIn.img.plane[0] = pInBuffer ;
    picIn.img.plane[1] = pInBuffer +m_VideoEncParam.nWidth*m_VideoEncParam.nHeight ;
    picIn.img.plane[2] = pInBuffer +m_VideoEncParam.nWidth*m_VideoEncParam.nHeight*5/4 ;
    
    picIn.img.i_stride[0] = m_VideoEncParam.nWidth ;
    picIn.img.i_stride[1] = m_VideoEncParam.nWidth /2 ;
    picIn.img.i_stride[2] = m_VideoEncParam.nWidth /2 ;
    
    nRet = x264_encoder_encode(
                                m_pX264Handle,
                                &pNal,
                                &nNalCount,
                                &picIn,
                                &picOut);
    if (nRet<0)
    {
        return -1 ;
    }
    
    for (int i = 0 ; i < nNalCount ; i++)
    {
        //memcpy(pOut+nLenOut,&pNal[i],pNal[i].i_payload) ;
        x264_nal_encode(m_pX264Handle,pOutBuffer+nLenOut,&pNal[i]);
        nLenOut += pNal[i].i_payload ;
    }
    int nSize = 0 ;
    FILE *pFile = fopen(pchFileName,"w") ;
    if (!pFile)
    {
        return -1 ;
    }
    nSize = fwrite(pOutBuffer,1,nLenOut,pFile) ;
    fclose(pFile) ;
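
    For comparison, a hedged sketch of a typical write path with the current x264 API is below. x264_encoder_encode already returns fully formed NAL units (with Annex-B start codes when b_annexb is set), and their payloads are laid out sequentially in memory, so they can be written straight to the file; a second x264_nal_encode pass should not be needed. The helper name is illustrative and assumes an already-opened encoder handle and an I420 input picture.

    #include <stdio.h>
    #include <x264.h>

    /* Encode one frame and append the resulting NAL units to an already-open file.
     * Returns the number of bytes written, or -1 on error. */
    static int encode_and_write_frame(x264_t *handle, x264_picture_t *pic_in, FILE *file)
    {
        x264_nal_t *nals = NULL;
        int nal_count = 0;
        x264_picture_t pic_out;

        int frame_size = x264_encoder_encode(handle, &nals, &nal_count, pic_in, &pic_out);
        if (frame_size < 0)
            return -1;

        /* The payloads of the returned NALs are sequential in memory, so one fwrite
         * of frame_size bytes is equivalent to writing each nals[i].p_payload /
         * i_payload pair in turn. */
        if (frame_size > 0)
            fwrite(nals[0].p_payload, 1, (size_t)frame_size, file);
        return frame_size;
    }

    Two smaller points on the code above: the output file should be opened with "wb" rather than "w", since text mode on Windows will corrupt the byte stream, and after the last input frame the encoder should be flushed by calling x264_encoder_encode with a NULL input picture while x264_encoder_delayed_frames() returns non-zero.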