Newest 'ffmpeg' Questions - Stack Overflow

http://stackoverflow.com/questions/tagged/ffmpeg

Articles published on the site

  • How to fill an AVFrame structure in order to encode a YUY2 video (or UYVY) into H265

    22 April, by Rich Deng

    I want to compress a video stream in YUY2 or UYVY format to, say, H.265. If I understand the answers given in this thread correctly, I should be able to use av_image_fill_arrays() to fill the data and linesize arrays of an AVFrame, then call avcodec_send_frame() followed by avcodec_receive_packet() to get the encoded data:

    bool VideoEncoder::Init(const AM_MEDIA_TYPE* pMediaType)
    {
        // we should have a valid pointer
        if (pMediaType)
        {
            m_mtInput.Empty();
            m_mtInput.Set(*pMediaType);
        }
        else
            return false;
    
        // find encoder
        m_pCodec = m_spAVCodecDlls->avcodec_find_encoder(AV_CODEC_ID_HEVC);
        m_pCodecCtx = m_spAVCodecDlls->avcodec_alloc_context3(m_pCodec);
        if (!m_pCodec || !m_pCodecCtx)
        {
            Log.Log(_T("Failed to find or allocate codec context!"));
            return false;
        }
    
        AVPixelFormat ePixFmtInput = GetInputPixelFormat();
        if (CanConvertInputFormat(ePixFmtInput) == false)
        {
            return false;
        }
    
        // we are able to convert
        // so continue with setting it up
        int nWidth = m_mtInput.GetWidth();
        int nHeight = m_mtInput.GetHeight();
    
        // Set encoding parameters
    
        // Set bitrate (4 Mbps for 1920x1080)
        m_pCodecCtx->bit_rate = (((int64_t)4000000 * nWidth / 1920) * nHeight / 1080);
    
        m_pCodecCtx->width = nWidth;  
        m_pCodecCtx->height = nHeight;
    
    
        // use reference time as time_base
        m_pCodecCtx->time_base.den = 10000000;  
        m_pCodecCtx->time_base.num = 1;
    
        SetAVRational(m_pCodecCtx->framerate, m_mtInput.GetFrameRate());
        //m_pCodecCtx->framerate = (AVRational){ 30, 1 };
        m_pCodecCtx->gop_size = 10;  // GOP size
        m_pCodecCtx->max_b_frames = 1;
    
        // set pixel format
        m_pCodecCtx->pix_fmt = ePixFmtInput;  // YUV 4:2:0 format or YUV 4:2:2
    
        // Open the codec
        if (m_spAVCodecDlls->avcodec_open2(m_pCodecCtx, m_pCodec, NULL) < 0)
        {
            return false;
        }
    
        return true;
    }
    
    bool VideoEncoder::AllocateFrame()
    {
    
        m_pFrame = m_spAVCodecDlls->av_frame_alloc();
        if (m_pFrame == NULL)
        {
            Log.Log(_T("Failed to allocate frame object!"));
            return false;
        }
    
        m_pFrame->format = m_pCodecCtx->pix_fmt;
        m_pFrame->width = m_pCodecCtx->width;
        m_pFrame->height = m_pCodecCtx->height;
    
        m_pFrame->time_base.den = m_pCodecCtx->time_base.den;
        m_pFrame->time_base.num = m_pCodecCtx->time_base.num;
    
    
        return true;
    }
    
    bool VideoEncoder::Encode(IMediaSample* pSample)
    {
        if (m_pFrame == NULL)
        {
            return false;
        }
    
        // get the time stamps
        REFERENCE_TIME rtStart, rtEnd;
        HRESULT hr = pSample->GetTime(&rtStart, &rtEnd);
        m_rtInputFrameStart = rtStart;
        m_rtInputFrameEnd = rtEnd;
    
    
        // get length
        int nLength = pSample->GetActualDataLength();
    
        // get pointer to actual sample data
        uint8_t* pData = NULL;
        hr = pSample->GetPointer(&pData);
    
        if (FAILED(hr) || NULL == pData)
            return false;
    
        m_pFrame->flags = (S_OK == pSample->IsSyncPoint()) ? (m_pFrame->flags | AV_FRAME_FLAG_KEY) : (m_pFrame->flags & ~AV_FRAME_FLAG_KEY);
    
        // clear old data
        for (int n = 0; n < AV_NUM_DATA_POINTERS; n++)
        {
            m_pFrame->data[n] = NULL;
            m_pFrame->linesize[n] = 0;
        }
    
    
        int nRet = 0;
        int nStride = m_mtInput.GetStride();  // actual stride of the input buffer (note: not passed to av_image_fill_arrays below)
        AVPixelFormat ePixFmt = (AVPixelFormat)m_pFrame->format;
        nRet = m_spAVCodecDlls->av_image_fill_arrays(m_pFrame->data, m_pFrame->linesize, pData, ePixFmt, m_pFrame->width, m_pFrame->height, 32);
        if (nRet < 0)
        {
            return false;
        }
    
        m_pFrame->pts = (int64_t) rtStart;
        m_pFrame->duration = rtEnd - rtStart;
        nRet = m_spAVCodecDlls->avcodec_send_frame(m_pCodecCtx, m_pFrame);
        if (nRet == AVERROR(EAGAIN))
        {
            ReceivePacket();
            nRet = m_spAVCodecDlls->avcodec_send_frame(m_pCodecCtx, m_pFrame);
        }
    
        if (nRet < 0)
        {
            return false;
        }
    
        // Receive the encoded packets
        ReceivePacket();
    
        return true;
    }
    
    bool VideoEncoder::ReceivePacket()
    {
        bool bRet = true;
        AVPacket* pkt = m_spAVCodecDlls->av_packet_alloc();
        while (m_spAVCodecDlls->avcodec_receive_packet(m_pCodecCtx, pkt) == 0)
        {
            // Write pkt->data to output file or stream
            m_pCallback->VideoEncoderWriteEncodedSample(pkt);
            if (m_OutFile.IsOpen())
                m_OutFile.Write(pkt->data, pkt->size);
            m_spAVCodecDlls->av_packet_unref(pkt);
        }
        m_spAVCodecDlls->av_packet_free(&pkt);
    
        return bRet;
    }
    

    I must have done something wrong. The result is not correct. For example, rather than a video with a person's face showing in the middle of the screen, I get a mostly green screen with parts of the face showing up at the lower left and lower right corners.

    Can someone help me?
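
    A symptom like this (picture fragments displaced, large green areas) is often a row-stride mismatch: the encoder walks the buffer with one linesize while the capture buffer was written with another. The sketch below is not the poster's code; it is a hypothetical helper that mirrors, as I understand it, the alignment arithmetic av_image_fill_arrays() applies to the packed YUY2 layout (one plane, 2 bytes per pixel, rows rounded up to the requested alignment), to show how align=32 can disagree with a tightly packed capture stride:

    ```cpp
    #include <cassert>

    // Hypothetical helper (not part of the poster's code) mirroring the stride
    // arithmetic av_image_fill_arrays() applies to the packed YUY2 layout
    // (AV_PIX_FMT_YUYV422): a single plane, 2 bytes per pixel, with each row
    // rounded up to the requested alignment.
    static int yuy2_linesize(int width, int align)
    {
        int stride = width * 2;                      // 2 bytes per YUY2 pixel
        return (stride + align - 1) / align * align; // round up to `align`
    }

    int main()
    {
        // For a 1920-wide frame the natural stride is already 32-aligned,
        // so align=32 happens to be harmless:
        assert(yuy2_linesize(1920, 32) == 3840);

        // For a 1366-wide frame it is not: the tight stride is 2732 bytes,
        // but align=32 would make the fill assume 2752. If the capture
        // buffer is tightly packed, the encoder then starts each row
        // 20 bytes late, and the offset accumulates down the frame.
        assert(yuy2_linesize(1366, 32) == 2752);
        return 0;
    }
    ```

    When the two stride values differ, every row the encoder reads starts a little later than the row the camera wrote, which accumulates down the frame; comparing this computed linesize against the stride reported by m_mtInput.GetStride() would be a quick sanity check.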

  • mobile-ffmpeg-https (4.3.1) POD install failed

    22 April, by shruti tupkari

    I am getting an error on my pod install command; the package that fails is mobile-ffmpeg-https (4.3.1). Image attached for more details.

    I have not actually used this package anywhere in my project. I tried it once and then removed it from the project, but I don't understand why it still shows up during pod install.

    I tried deleting my Podfile.lock and running pod install again, but the issue remains.


    Thanks in advance.

  • libavcodec.so.58 not found when running software compiled with OpenCV

    22 April, by Abinaya

    I am using Ubuntu 22.04. Every time I try to run software compiled with OpenCV, I get the following error:

        libavcodec.so.58 => not found
        libavformat.so.58 => not found
        libavutil.so.56 => not found
        libswscale.so.5 => not found

    Looking around /lib/x86_64-linux-gnu/, I can find libavcodec.so.59, but not libavcodec.so.58.

    When trying to run sudo apt-get install libavcodec58, I get:

    Package 'libavcodec58' has no installation candidate

    I've scoured the internet in search of an answer, but could not find anything at this point. Any help with solving this problem will be very much appreciated.

    I have tried to recreate the symbolic link. ls -l libavcodec.so.59 shows:

        libavcodec.so.59 -> libavcodec.so.59.37.100

    and ldconfig likewise reports:

        libavcodec.so.59 -> libavcodec.so.59.37.100

    But I am still stuck.
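
    For context, the sonames correspond to different FFmpeg major series (libavcodec.so.58 is FFmpeg 4.x, libavcodec.so.59 is FFmpeg 5.x), and the soname bump reflects an ABI break, so symlinking .59 to .58 cannot work. A short diagnostic sketch ("./your_app" is a placeholder for the OpenCV-linked executable):

    ```shell
    # See which libavcodec runtime libraries the dynamic linker knows about:
    ldconfig -p | grep libavcodec

    # List exactly which shared objects the program cannot resolve
    # ("./your_app" is a placeholder for the OpenCV-linked executable):
    ldd ./your_app | grep "not found"
    ```

    If libavcodec58 has no installation candidate in the configured repositories, the two usual paths are rebuilding OpenCV (or the affected software) against the FFmpeg version actually installed, or installing the FFmpeg 4.x runtime from a repository that still carries it.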

  • FFmpeg command not streaming to YouTube [closed]

    21 April, by Ahmed Seddik Bouchiba

    I'm trying to stream my desktop (from an X11 session) to YouTube Live using ffmpeg. I'm running this on a Linux machine with an active X server, and I set the DISPLAY variable accordingly (:0 in most cases).

    Here's the ffmpeg command I've tried:

    ffmpeg -loglevel info \
        -probesize ${PROBESIZE} -analyzeduration ${ANALYZE_DURATION} \
        -f x11grab -video_size ${VIDEO_SIZE} -r ${FRAME_RATE} -draw_mouse 0 -i ${DISPLAY} \
        -f alsa -i default \
        -deinterlace -vcodec libx264 -pix_fmt yuv420p -preset fast \
        -r 30 -g 60 -b:v 2000k -bufsize 4000k \
        -acodec libmp3lame -ar 44100 -b:a 128k \
        -map 0:v:0 -map 1:a:0 -vsync 0 \
        -f flv "${RTMP_URL}" &
    

    Environment variables are set correctly (DISPLAY, VIDEO_SIZE, FRAME_RATE, etc.), and I replaced ${RTMP_URL} with the correct YouTube RTMP endpoint (e.g., rtmp://a.rtmp.youtube.com/live2/). But nothing seems to work — the stream never starts or appears on YouTube, and sometimes I get timeout or "connection refused" errors.

    I've checked:

    That I'm logged into an active X session

    That I have access to the display (even tried xhost +)

    That ffmpeg has access to ALSA (sound seems okay)

    Questions:

    Am I missing something in my command?

    Is there a better way to stream both screen and audio from an X server to YouTube Live?

    Could this be a codec or YouTube-specific format issue?

    Any help or working examples would be really appreciated. Thanks!
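
    One way to narrow this down is to take x11grab out of the loop and push a synthetic source at the same endpoint; YouTube's ingest documentation also recommends AAC audio, so -acodec libmp3lame is worth swapping out regardless. A minimal sketch, assuming RTMP_URL holds the full ingest URL including the stream key (the bare rtmp://a.rtmp.youtube.com/live2/ without a key will be refused):

    ```shell
    # Stream a synthetic test pattern plus silent audio to the same endpoint.
    # If this shows up in YouTube Studio, the problem is on the x11grab/ALSA
    # capture side; if not, check the URL/stream key and network path first.
    ffmpeg -re \
        -f lavfi -i testsrc2=size=1280x720:rate=30 \
        -f lavfi -i anullsrc=r=44100:cl=stereo \
        -c:v libx264 -preset veryfast -pix_fmt yuv420p -g 60 -b:v 2500k \
        -c:a aac -b:a 128k \
        -f flv "${RTMP_URL}"
    ```

    If the test pattern goes live, the remaining suspects are the DISPLAY permissions and the ALSA capture device; if it does not, the codec settings were never the problem.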

  • How to stream to YouTube from an X server using DISPLAY with ffmpeg? [closed]

    21 April, by Ahmed Seddik Bouchiba
