Advanced search

Media (0)

No media matching your criteria is available on the site.

Other articles (15)

  • Submit bugs and patches

    13 April 2011

    Unfortunately, software is never perfect.
    If you think you have found a bug, report it using our ticket system. To help us fix it, please provide the following information: the browser you are using, including its exact version; as precise a description of the problem as possible; if possible, the steps that lead to the problem; and a link to the site/page in question.
    If you think you have solved the bug, fill in a ticket and attach a corrective patch to it.
    You may also (...)

  • Automatic backup of SPIP channels

    1 April 2010

    As part of setting up an open platform, it is important for hosting providers to have reasonably regular backups available, in order to guard against any potential problem.
    To carry out this task, we rely on two SPIP plugins: Saveauto, which performs a regular backup of the database in the form of a MySQL dump (usable in phpMyAdmin), and mes_fichiers_2, which builds a zip archive of the site's important data (the documents, the elements (...)

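    As a rough illustration of what these two plugins automate, a minimal manual equivalent might look like this (the paths, database name, and user below are hypothetical examples):

    mysqldump -u spip -p spip_db > backups/spip_db_$(date +%F).sql
    zip -r backups/spip_files_$(date +%F).zip IMG/
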
  • Accepted formats

    28 January 2010

    The following commands give information about the formats and codecs handled by the local ffmpeg installation:
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    To begin with, we (...)

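    For example, to check whether a particular codec or container is available in the local build, the output of these commands can be filtered (the grep patterns are illustrative):

    ffmpeg -codecs 2>/dev/null | grep -i h264
    ffmpeg -formats 2>/dev/null | grep -i flv
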
On other sites (6121)

  • lavf/aiffdec : Default to full rate qcelp as QT does.

    11 October 2016, by Carl Eugen Hoyos

    Fixes decoding of the output file from ticket #4009.

    • [DH] libavformat/aiffdec.c
    • [DH] libavformat/version.h
  • FFMPEG - subtitles not showing for the full duration of video

    26 June 2023, by Caio Maia

    I could not achieve the following:

    1. Have images loaded from a text file to create a slideshow.
    2. Have background music with volume control.
    3. Have my voice.mp3 over the bg music.
    4. Have subtitles in the ASS format.
    5. Have a text shown with drawtext for the full duration of the video.
    6. All of this in only one command, if possible.

    The command I tried is:

    ffmpeg.exe -f concat -i images.txt -i bg_music.m4a -i voice.mp3 -filter_complex "[0:v]drawtext=fontfile='fonte.TTF':fontsize=20:fontcolor=white:text='Imagens da internet':x=w-tw-10:y=h-th-10,[0]overlay=10:10,ass=subtitles.txt[out],[1]volume=0.3[a1];[2]volume=2[a2];[a1][a2]amix=inputs=2:duration=shortest[aud]" -map "[out]" -map "[aud]":a -pix_fmt yuv420p -c:v libx264 -c:s mov_text -r 30 -y out.mp4

    It works, except that the subtitles only start showing after the first image of the slideshow appears.

    The content of images.txt is:

    file 'image1.png'
    duration 20
    file 'image2.png'
    duration 5
    file 'image3.png'
    duration 5

    The content of subtitles.txt is:

    Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text
    Dialogue: 0,0:00:01.00,0:00:06.00,Default,,0,0,0,,Subscribe!
    Dialogue: 0,0:00:07.00,0:00:16.00,Default2,,0,0,0,,Like!
    Dialogue: 0,0:00:17.00,0:00:26.00,Default,,0,0,0,,Share!

    The problem is that only the "Share!" text is shown.

  • Getting green screen in ffplay : Streaming desktop (DirectX surface) as H264 video over RTP stream using Live555

    7 November 2019, by Ram

    I’m trying to stream the desktop (a DirectX surface in NV12 format) as H.264 video over RTP using Live555 and Windows Media Foundation’s hardware encoder on Windows 10, expecting it to be rendered by ffplay (ffmpeg 4.2), but all I get is a green screen.

    I referred to the MFWebCamToRTP Media Foundation sample and “Encoding DirectX surface using hardware MFT” when implementing Live555’s FramedSource, changing the input source to a DirectX surface instead of a webcam.

    Here is an excerpt of my implementation of Live555’s doGetNextFrame callback, which feeds input samples from the DirectX surface:

    virtual void doGetNextFrame()
    {
       if (!_isInitialised)
       {
           if (!initialise()) {
               printf("Video device initialisation failed, stopping.");
               return;
           }
           else {
               _isInitialised = true;
           }
       }

       //if (!isCurrentlyAwaitingData()) return;

       DWORD processOutputStatus = 0;
       HRESULT mftProcessOutput = S_OK;
       MFT_OUTPUT_STREAM_INFO StreamInfo;
       IMFMediaBuffer *pBuffer = NULL;
       IMFSample *mftOutSample = NULL;
       DWORD mftOutFlags;
       bool frameSent = false;
       bool bTimeout = false;

       // Create sample
       CComPtr<IMFSample> videoSample = NULL;

       // Create buffer
       CComPtr<IMFMediaBuffer> inputBuffer;
       // Get next event
       CComPtr<IMFMediaEvent> event;
       HRESULT hr = eventGen->GetEvent(0, &event);
       CHECK_HR(hr, "Failed to get next event");

       MediaEventType eventType;
       hr = event->GetType(&eventType);
       CHECK_HR(hr, "Failed to get event type");

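       // Asynchronous MFTs signal readiness through IMFMediaEventGenerator events:
       // METransformNeedInput means the encoder can accept another input sample,
       // METransformHaveOutput means an encoded output sample is ready to be drained.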

       switch (eventType)
       {
       case METransformNeedInput:
           {
               hr = MFCreateDXGISurfaceBuffer(__uuidof(ID3D11Texture2D), surface, 0, FALSE, &inputBuffer);
               CHECK_HR(hr, "Failed to create IMFMediaBuffer");

               hr = MFCreateSample(&videoSample);
               CHECK_HR(hr, "Failed to create IMFSample");
               hr = videoSample->AddBuffer(inputBuffer);
               CHECK_HR(hr, "Failed to add buffer to IMFSample");

               if (videoSample)
               {
                   _frameCount++;

                   CHECK_HR(videoSample->SetSampleTime(mTimeStamp), "Error setting the video sample time.\n");
                   CHECK_HR(videoSample->SetSampleDuration(VIDEO_FRAME_DURATION), "Error getting video sample duration.\n");

                   // Pass the video sample to the H.264 transform.

                   hr = _pTransform->ProcessInput(inputStreamID, videoSample, 0);
                   CHECK_HR(hr, "The resampler H264 ProcessInput call failed.\n");

                   mTimeStamp += VIDEO_FRAME_DURATION;
               }
           }

           break;

       case METransformHaveOutput:

           {
               CHECK_HR(_pTransform->GetOutputStatus(&mftOutFlags), "H264 MFT GetOutputStatus failed.\n");

               if (mftOutFlags == MFT_OUTPUT_STATUS_SAMPLE_READY)
               {
                   MFT_OUTPUT_DATA_BUFFER _outputDataBuffer;
                   memset(&_outputDataBuffer, 0, sizeof _outputDataBuffer);
                   _outputDataBuffer.dwStreamID = outputStreamID;
                   _outputDataBuffer.dwStatus = 0;
                   _outputDataBuffer.pEvents = NULL;
                   _outputDataBuffer.pSample = nullptr;

                   mftProcessOutput = _pTransform->ProcessOutput(0, 1, &_outputDataBuffer, &processOutputStatus);

                   if (mftProcessOutput != MF_E_TRANSFORM_NEED_MORE_INPUT)
                   {
                       if (_outputDataBuffer.pSample) {

                           //CHECK_HR(_outputDataBuffer.pSample->SetSampleTime(mTimeStamp), "Error setting MFT sample time.\n");
                           //CHECK_HR(_outputDataBuffer.pSample->SetSampleDuration(VIDEO_FRAME_DURATION), "Error setting MFT sample duration.\n");

                           IMFMediaBuffer *buf = NULL;
                           DWORD bufLength;
                           CHECK_HR(_outputDataBuffer.pSample->ConvertToContiguousBuffer(&buf), "ConvertToContiguousBuffer failed.\n");
                           CHECK_HR(buf->GetCurrentLength(&bufLength), "Get buffer length failed.\n");
                           BYTE * rawBuffer = NULL;

                           fFrameSize = bufLength;
                           fDurationInMicroseconds = 0;
                           gettimeofday(&fPresentationTime, NULL);

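                           // fTo is the delivery buffer provided by Live555; a frame larger
                           // than fMaxSize must be truncated and reported via
                           // fNumTruncatedBytes before calling afterGetting().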
                           buf->Lock(&rawBuffer, NULL, NULL);
                           memmove(fTo, rawBuffer, fFrameSize);

                           FramedSource::afterGetting(this);

                           buf->Unlock();
                           SafeRelease(&buf);

                           frameSent = true;
                           _lastSendAt = GetTickCount();

                           _outputDataBuffer.pSample->Release();
                       }

                       if (_outputDataBuffer.pEvents)
                           _outputDataBuffer.pEvents->Release();
                   }

                   //SafeRelease(&pBuffer);
                   //SafeRelease(&mftOutSample);

                   break;
               }
           }

           break;
       }

       if (!frameSent)
       {
           envir().taskScheduler().triggerEvent(eventTriggerId, this);
       }

       return;

    done:

       printf("MediaFoundationH264LiveSource doGetNextFrame failed.\n");
       envir().taskScheduler().triggerEvent(eventTriggerId, this);
    }

    Initialise method:

    bool initialise()
    {
       HRESULT hr;
       D3D11_TEXTURE2D_DESC desc = { 0 };

       HDESK CurrentDesktop = nullptr;
       CurrentDesktop = OpenInputDesktop(0, FALSE, GENERIC_ALL);
       if (!CurrentDesktop)
       {
           // We do not have access to the desktop so request a retry
           return false;
       }

       // Attach desktop to this thread
       bool DesktopAttached = SetThreadDesktop(CurrentDesktop) != 0;
       CloseDesktop(CurrentDesktop);
       CurrentDesktop = nullptr;
       if (!DesktopAttached)
       {
           printf("SetThreadDesktop failed\n");
       }

       UINT32 activateCount = 0;

       // h264 output
       MFT_REGISTER_TYPE_INFO info = { MFMediaType_Video, MFVideoFormat_H264 };

       UINT32 flags =
           MFT_ENUM_FLAG_HARDWARE |
           MFT_ENUM_FLAG_SORTANDFILTER;

       // ------------------------------------------------------------------------
       // Initialize D3D11
       // ------------------------------------------------------------------------

       // Driver types supported
       D3D_DRIVER_TYPE DriverTypes[] =
       {
           D3D_DRIVER_TYPE_HARDWARE,
           D3D_DRIVER_TYPE_WARP,
           D3D_DRIVER_TYPE_REFERENCE,
       };
       UINT NumDriverTypes = ARRAYSIZE(DriverTypes);

       // Feature levels supported
       D3D_FEATURE_LEVEL FeatureLevels[] =
       {
           D3D_FEATURE_LEVEL_11_0,
           D3D_FEATURE_LEVEL_10_1,
           D3D_FEATURE_LEVEL_10_0,
           D3D_FEATURE_LEVEL_9_1
       };
       UINT NumFeatureLevels = ARRAYSIZE(FeatureLevels);

       D3D_FEATURE_LEVEL FeatureLevel;

       // Create device
       for (UINT DriverTypeIndex = 0; DriverTypeIndex < NumDriverTypes; ++DriverTypeIndex)
       {
           hr = D3D11CreateDevice(nullptr, DriverTypes[DriverTypeIndex], nullptr,
               D3D11_CREATE_DEVICE_VIDEO_SUPPORT,
               FeatureLevels, NumFeatureLevels, D3D11_SDK_VERSION, &device, &FeatureLevel, &context);
           if (SUCCEEDED(hr))
           {
               // Device creation success, no need to loop anymore
               break;
           }
       }

       CHECK_HR(hr, "Failed to create device");

       // Create device manager
       UINT resetToken;
       hr = MFCreateDXGIDeviceManager(&resetToken, &deviceManager);
       CHECK_HR(hr, "Failed to create DXGIDeviceManager");

       hr = deviceManager->ResetDevice(device, resetToken);
       CHECK_HR(hr, "Failed to assign D3D device to device manager");


       // ------------------------------------------------------------------------
       // Create surface
       // ------------------------------------------------------------------------
       desc.Format = DXGI_FORMAT_NV12;
       desc.Width = surfaceWidth;
       desc.Height = surfaceHeight;
       desc.MipLevels = 1;
       desc.ArraySize = 1;
       desc.SampleDesc.Count = 1;

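       // Note: NV12 requires even width and height; an odd-sized surface will fail.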
       hr = device->CreateTexture2D(&desc, NULL, &surface);
       CHECK_HR(hr, "Could not create surface");

       hr = MFTEnumEx(
           MFT_CATEGORY_VIDEO_ENCODER,
           flags,
           NULL,
           &info,
           &activateRaw,
           &activateCount
       );
       CHECK_HR(hr, "Failed to enumerate MFTs");

       CHECK(activateCount, "No MFTs found");

       // Choose the first available encoder
       activate = activateRaw[0];

       for (UINT32 i = 0; i < activateCount; i++)
           activateRaw[i]->Release();

       // Activate
       hr = activate->ActivateObject(IID_PPV_ARGS(&_pTransform));
       CHECK_HR(hr, "Failed to activate MFT");

       // Get attributes
       hr = _pTransform->GetAttributes(&attributes);
       CHECK_HR(hr, "Failed to get MFT attributes");

       // Unlock the transform for async use and get event generator
       hr = attributes->SetUINT32(MF_TRANSFORM_ASYNC_UNLOCK, TRUE);
       CHECK_HR(hr, "Failed to unlock MFT");

       eventGen = _pTransform;
       CHECK(eventGen, "Failed to QI for event generator");

       // Get stream IDs (expect 1 input and 1 output stream)
       hr = _pTransform->GetStreamIDs(1, &inputStreamID, 1, &outputStreamID);
       if (hr == E_NOTIMPL)
       {
           inputStreamID = 0;
           outputStreamID = 0;
           hr = S_OK;
       }
       CHECK_HR(hr, "Failed to get stream IDs");

       // ------------------------------------------------------------------------
       // Configure hardware encoder MFT
       // ------------------------------------------------------------------------
       CHECK_HR(_pTransform->ProcessMessage(MFT_MESSAGE_SET_D3D_MANAGER, reinterpret_cast<ULONG_PTR>(deviceManager.p)), "Failed to set device manager.\n");

       // Set low latency hint
       hr = attributes->SetUINT32(MF_LOW_LATENCY, TRUE);
       CHECK_HR(hr, "Failed to set MF_LOW_LATENCY");

       hr = MFCreateMediaType(&outputType);
       CHECK_HR(hr, "Failed to create media type");

       hr = outputType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
       CHECK_HR(hr, "Failed to set MF_MT_MAJOR_TYPE on H264 output media type");

       hr = outputType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_H264);
       CHECK_HR(hr, "Failed to set MF_MT_SUBTYPE on H264 output media type");

       hr = outputType->SetUINT32(MF_MT_AVG_BITRATE, TARGET_AVERAGE_BIT_RATE);
       CHECK_HR(hr, "Failed to set average bit rate on H264 output media type");

       hr = MFSetAttributeSize(outputType, MF_MT_FRAME_SIZE, desc.Width, desc.Height);
       CHECK_HR(hr, "Failed to set frame size on H264 MFT out type");

       hr = MFSetAttributeRatio(outputType, MF_MT_FRAME_RATE, TARGET_FRAME_RATE, 1);
       CHECK_HR(hr, "Failed to set frame rate on H264 MFT out type");

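       // 2 == MFVideoInterlace_Progressive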
       hr = outputType->SetUINT32(MF_MT_INTERLACE_MODE, 2);
       CHECK_HR(hr, "Failed to set MF_MT_INTERLACE_MODE on H.264 encoder MFT");

       hr = outputType->SetUINT32(MF_MT_ALL_SAMPLES_INDEPENDENT, TRUE);
       CHECK_HR(hr, "Failed to set MF_MT_ALL_SAMPLES_INDEPENDENT on H.264 encoder MFT");

       hr = _pTransform->SetOutputType(outputStreamID, outputType, 0);
       CHECK_HR(hr, "Failed to set output media type on H.264 encoder MFT");

       hr = MFCreateMediaType(&inputType);
       CHECK_HR(hr, "Failed to create media type");

       for (DWORD i = 0;; i++)
       {
           inputType = nullptr;
           hr = _pTransform->GetInputAvailableType(inputStreamID, i, &inputType);
           CHECK_HR(hr, "Failed to get input type");

           hr = inputType->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
           CHECK_HR(hr, "Failed to set MF_MT_MAJOR_TYPE on H264 MFT input type");

           hr = inputType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_NV12);
           CHECK_HR(hr, "Failed to set MF_MT_SUBTYPE on H264 MFT input type");

           hr = MFSetAttributeSize(inputType, MF_MT_FRAME_SIZE, desc.Width, desc.Height);
           CHECK_HR(hr, "Failed to set MF_MT_FRAME_SIZE on H264 MFT input type");

           hr = MFSetAttributeRatio(inputType, MF_MT_FRAME_RATE, TARGET_FRAME_RATE, 1);
           CHECK_HR(hr, "Failed to set MF_MT_FRAME_RATE on H264 MFT input type");

           hr = _pTransform->SetInputType(inputStreamID, inputType, 0);
           CHECK_HR(hr, "Failed to set input type");

           break;
       }

       CheckHardwareSupport();

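       // An async MFT only begins firing METransformNeedInput events after it has
       // received the BEGIN_STREAMING and START_OF_STREAM notifications below.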
       CHECK_HR(_pTransform->ProcessMessage(MFT_MESSAGE_COMMAND_FLUSH, NULL), "Failed to process FLUSH command on H.264 MFT.\n");
       CHECK_HR(_pTransform->ProcessMessage(MFT_MESSAGE_NOTIFY_BEGIN_STREAMING, NULL), "Failed to process BEGIN_STREAMING command on H.264 MFT.\n");
       CHECK_HR(_pTransform->ProcessMessage(MFT_MESSAGE_NOTIFY_START_OF_STREAM, NULL), "Failed to process START_OF_STREAM command on H.264 MFT.\n");

       return true;

    done:

       printf("MediaFoundationH264LiveSource initialisation failed.\n");
       return false;
    }


       HRESULT CheckHardwareSupport()
       {
           IMFAttributes *attributes;
           HRESULT hr = _pTransform->GetAttributes(&attributes);
           UINT32 dxva = 0;

           if (SUCCEEDED(hr))
           {
               hr = attributes->GetUINT32(MF_SA_D3D11_AWARE, &dxva);
           }

           if (SUCCEEDED(hr))
           {
               hr = attributes->SetUINT32(CODECAPI_AVDecVideoAcceleration_H264, TRUE);
           }

    #if defined(CODECAPI_AVLowLatencyMode) // Win8 only

           hr = _pTransform->QueryInterface(IID_PPV_ARGS(&mpCodecAPI));

           if (SUCCEEDED(hr))
           {
               VARIANT var = { 0 };

               // FIXME: encoder only
               var.vt = VT_UI4;
               var.ulVal = 0;

               hr = mpCodecAPI->SetValue(&CODECAPI_AVEncMPVDefaultBPictureCount, &var);

               var.vt = VT_BOOL;
               var.boolVal = VARIANT_TRUE;
               hr = mpCodecAPI->SetValue(&CODECAPI_AVEncCommonLowLatency, &var);
               hr = mpCodecAPI->SetValue(&CODECAPI_AVEncCommonRealTime, &var);

               hr = attributes->SetUINT32(CODECAPI_AVLowLatencyMode, TRUE);

               if (SUCCEEDED(hr))
               {
                   var.vt = VT_UI4;
                   var.ulVal = eAVEncCommonRateControlMode_Quality;
                   hr = mpCodecAPI->SetValue(&CODECAPI_AVEncCommonRateControlMode, &var);

                   // This property controls the quality level when the encoder is not using
                   // a constrained bit rate; AVEncCommonRateControlMode determines whether it is.
                   VARIANT quality;
                   InitVariantFromUInt32(50, &quality);
                   hr = mpCodecAPI->SetValue(&CODECAPI_AVEncCommonQuality, &quality);
               }
           }
    #endif

           return hr;
       }

    ffplay command:

    ffplay -protocol_whitelist file,udp,rtp -i test.sdp -x 800 -y 600 -profile:v baseline

    SDP:

    v=0
    o=- 0 0 IN IP4 127.0.0.1
    s=No Name
    t=0 0
    c=IN IP4 127.0.0.1
    m=video 1234 RTP/AVP 96
    a=rtpmap:96 H264/90000
    a=fmtp:96 packetization-mode=1

    I don’t know what I am missing. I have been trying to fix this for almost a week without any progress, and have tried almost everything I could. Also, online resources for encoding a DirectX surface as video are very limited.

    Any help would be appreciated.