
Media (1)


Other articles (70)

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

  • Customizing by adding your logo, banner or background image

    5 September 2013, by

    Some themes take three customization elements into account: adding a logo; adding a banner; adding a background image.

  • Creating farms of unique websites

    13 April 2011, by

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects / individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

On other sites (8819)

  • ffmpeg frame mapping to rgb32 dynamic resource using directx memcpy

    9 January 2019, by Sang Hun Kim

    I have been trying to solve this problem for a month by googling, but now I have to ask for help here.

    I want to render a frame decoded with ffmpeg.
    I decode the video and get a frame (I believe its format is YUV444P).
    So I convert it to YUV420P, and to RGB32 in the same way (only the destination format changes).

    AVFrame* YUVFrame = av_frame_alloc();

    SwsContext * swsContext = sws_getContext(pVFrame->width, pVFrame->height, pVideoCodecCtx->pix_fmt,
       pVFrame->width, pVFrame->height, AV_PIX_FMT_YUV420P, SWS_FAST_BILINEAR, 0, 0, 0);
    if (swsContext == NULL) {
       return false;
    }
    *YUVFrame = *pVFrame;
    YUVFrame->data[0] = pVFrame->data[0];   YUVFrame->linesize[0] = pVFrame->linesize[0];
    YUVFrame->data[1] = pVFrame->data[1];   YUVFrame->linesize[1] = pVFrame->linesize[1];
    YUVFrame->data[2] = pVFrame->data[2];   YUVFrame->linesize[2] = pVFrame->linesize[2];
    YUVFrame->width = pVFrame->width;       YUVFrame->height = pVFrame->height;


    sws_scale(swsContext, pVFrame->data, pVFrame->linesize, 0, (int)pVFrame->height, YUVFrame->data, YUVFrame->linesize);

    if (YUVFrame == NULL) {
       av_frame_unref(YUVFrame);
       return false;
    }
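
    For comparison, here is a minimal sketch of that conversion step in which the destination frame gets its own buffers from av_frame_get_buffer instead of reusing the source frame's pointers; the packed AV_PIX_FMT_RGBA target and the variable names are assumptions chosen for illustration, not taken from the original code:

    // Sketch only: convert the decoded frame to packed 8-bit RGBA with its own storage.
    // Headers (at file scope):
    extern "C" {
    #include <libavutil/frame.h>
    #include <libswscale/swscale.h>
    }

    AVFrame* rgbaFrame = av_frame_alloc();
    if (!rgbaFrame) return false;
    rgbaFrame->format = AV_PIX_FMT_RGBA;                 // packed RGBA, 4 bytes per pixel
    rgbaFrame->width  = pVFrame->width;
    rgbaFrame->height = pVFrame->height;
    if (av_frame_get_buffer(rgbaFrame, 0) < 0) {         // allocate data[] / linesize[]
        av_frame_free(&rgbaFrame);
        return false;
    }

    SwsContext* toRgba = sws_getContext(pVFrame->width, pVFrame->height, pVideoCodecCtx->pix_fmt,
                                        pVFrame->width, pVFrame->height, AV_PIX_FMT_RGBA,
                                        SWS_FAST_BILINEAR, nullptr, nullptr, nullptr);
    if (!toRgba) return false;
    sws_scale(toRgba, pVFrame->data, pVFrame->linesize, 0, pVFrame->height,
              rgbaFrame->data, rgbaFrame->linesize);      // write into the new frame's planes
    sws_freeContext(toRgba);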

    Using that frame, I then try to render it to a Direct3D 11 2D texture.

    ZeroMemory(&TextureDesc, sizeof(TextureDesc));

    TextureDesc.Height = pFrame->height;
    TextureDesc.Width = pFrame->width;
    TextureDesc.MipLevels = 1;
    TextureDesc.ArraySize = 1;
    TextureDesc.Format = DXGI_FORMAT_R32G32B32A32_FLOAT;            //size 16
    TextureDesc.SampleDesc.Count = 1;
    TextureDesc.SampleDesc.Quality = 0;
    TextureDesc.Usage = D3D11_USAGE_DYNAMIC;
    TextureDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    TextureDesc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;
    TextureDesc.MiscFlags = 0;

    result = m_device->CreateTexture2D(&TextureDesc, NULL, &m_2DTex);
    if (FAILED(result))     return false;

    ShaderResourceViewDesc.Format = TextureDesc.Format;
    ShaderResourceViewDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D;
    ShaderResourceViewDesc.Texture2D.MostDetailedMip = 0;
    ShaderResourceViewDesc.Texture2D.MipLevels = 1;
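
    For reference, if the decoded frame were converted to 8-bit RGBA as sketched above, a matching texture description would use a 4-byte-per-texel format; this is an illustrative variant, not the original code:

    // Sketch only: texture description matching a packed 8-bit RGBA frame.
    D3D11_TEXTURE2D_DESC RgbaDesc = {};
    RgbaDesc.Width = rgbaFrame->width;                  // rgbaFrame from the conversion sketch above
    RgbaDesc.Height = rgbaFrame->height;
    RgbaDesc.MipLevels = 1;
    RgbaDesc.ArraySize = 1;
    RgbaDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;       // 4 bytes per texel, same layout as AV_PIX_FMT_RGBA
    RgbaDesc.SampleDesc.Count = 1;
    RgbaDesc.Usage = D3D11_USAGE_DYNAMIC;               // CPU write, GPU read
    RgbaDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    RgbaDesc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;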

    I also map it (because the usage is DYNAMIC with CPU_ACCESS_WRITE):

       D3D11_MAPPED_SUBRESOURCE S_mappedResource_tt = { 0, };

    ZeroMemory(&S_mappedResource_tt, sizeof(D3D11_MAPPED_SUBRESOURCE));
    DWORD   Stride = pFrame->linesize[0];

    result = m_deviceContext->Map(m_2DTex, 0, D3D11_MAP_WRITE_DISCARD, 0, &S_mappedResource_tt);
    if (FAILED(result)) return false;
    BYTE* mappedData = reinterpret_cast<BYTE*>(S_mappedResource_tt.pData);
    for (UINT i = 0; i < 3; ++i) {
       memcpy(mappedData, pFrame->data, Stride * 3);
       mappedData += S_mappedResource_tt.RowPitch;
       *pFrame->data += Stride * 3;
    }

    m_deviceContext->Unmap(m_2DTex, 0);

    result = m_device->CreateShaderResourceView(m_2DTex, &ShaderResourceViewDesc, &m_ShaderResourceView);
    if (FAILED(result))     return false;

    m_deviceContext->PSSetShaderResources(0, 1, &m_ShaderResourceView);

    But it only shows me a black screen (nothing is rendered).
    I guess the memcpy size is wrong, but the biggest problem is that I don't know what the problem actually is.

    Question 1:
    Is the frame decoded and converted correctly with ffmpeg? (I think it is, but I am just guessing.)

    Question 2:
    Is there any problem with how the 2D texture is created for mapping?

    Question 3:
    What size should I pass to memcpy (in relation to the pixel format)?
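
    For illustration, a common pattern for uploading a packed RGBA frame into a mapped dynamic texture is to copy one row at a time, honoring both the frame's linesize and the texture's RowPitch; this sketch assumes the rgbaFrame and R8G8B8A8 texture from the sketches above rather than the original R32G32B32A32_FLOAT setup:

    // Sketch only: row-by-row upload of a packed 8-bit RGBA frame.
    D3D11_MAPPED_SUBRESOURCE mapped = {};
    if (FAILED(m_deviceContext->Map(m_2DTex, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped)))
        return false;

    const BYTE* src = rgbaFrame->data[0];
    BYTE* dst = reinterpret_cast<BYTE*>(mapped.pData);
    const UINT bytesPerRow = rgbaFrame->width * 4;      // 4 bytes per RGBA pixel

    for (int y = 0; y < rgbaFrame->height; ++y) {
        memcpy(dst, src, bytesPerRow);                  // copy exactly one row of pixels
        src += rgbaFrame->linesize[0];                  // source rows may be padded
        dst += mapped.RowPitch;                         // destination rows may be padded too
    }

    m_deviceContext->Unmap(m_2DTex, 0);

    In particular, the copy size and the texture format have to agree: with a 16-byte-per-texel format such as DXGI_FORMAT_R32G32B32A32_FLOAT, raw 8-bit frame bytes cannot simply be memcpy'd across.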

    Thank you for reading; please reply.

  • Ffmpeg dynamic cropping for every frame

    25 January 2024, by Aman Mehta

    I have different crop parameters for every frame. I was using sendcmd, but apparently that is not working.


    This is the format of my sendcmd file


    0.00 crop 'w=466:h=432:x=1373:y=336';
    0.02 crop 'w=324:h=382:x=277:y=693';
    0.03 crop 'w=304:h=332:x=1364:y=794';
    0.05 crop 'w=348:h=448:x=1500:y=966';
    0.07 crop 'w=466:h=412:x=794:y=193';
    0.08 crop 'w=390:h=502:x=1425:y=813';


    And I am running this command


    ffmpeg -ss 0 -t 1 -i Demo.mp4 -filter_complex "[0:v]sendcmd=f=crop_parameters.txt[video]" -map "[video]" output_%03d.png


    These are the logs coming from FFmpeg


    [Parsed_sendcmd_0 @ 0x6000016709a0] [expr] interval #0 start_ts:0.000000 end_ts:9223372036854.775391 ts:0.083367
    [Parsed_sendcmd_0 @ 0x6000016709a0] [expr] interval #1 start_ts:0.040000 end_ts:9223372036854.775391 ts:0.083367
    [Parsed_sendcmd_0 @ 0x6000016709a0] [enter+expr] interval #2 start_ts:0.080000 end_ts:9223372036854.775391 ts:0.083367
    [Parsed_sendcmd_0 @ 0x6000016709a0] Processing command #0 target:crop command:w arg:147
    [Parsed_sendcmd_0 @ 0x6000016709a0] Command reply for command #0: ret:Function not implemented res:
    [Parsed_sendcmd_0 @ 0x6000016709a0] Processing command #1 target:crop command:h arg:258
    [Parsed_sendcmd_0 @ 0x6000016709a0] Command reply for command #1: ret:Function not implemented res:
    [Parsed_sendcmd_0 @ 0x6000016709a0] Processing command #2 target:crop command:x arg:928
    [Parsed_sendcmd_0 @ 0x6000016709a0] Command reply for command #2: ret:Function not implemented res:
    [Parsed_sendcmd_0 @ 0x6000016709a0] Processing command #3 target:crop command:y arg:102
    [Parsed_sendcmd_0 @ 0x6000016709a0] Command reply for command #3: ret:Function not implemented res:


    This resulted in no cropping of the video; it just gives back the exact same frames.
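
    The "Function not implemented" replies suggest that the crop filter in this build does not accept w, h, x and y as runtime commands. As a diagnostic sketch (not part of the original command), one can list the crop filter's options; in recent FFmpeg builds, options that can be changed at runtime via sendcmd carry a 'T' flag, and sendcmd can only reach a crop filter that is actually present in the filter graph:

    # List the crop filter's options; runtime-settable options are marked with a 'T' flag in recent builds.
    ffmpeg -h filter=crop

    # Illustrative variant of the graph with an explicit crop instance for sendcmd to drive:
    #   [0:v]sendcmd=f=crop_parameters.txt,crop[video]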


  • Dynamic subtitles by ffmpeg

    8 September 2019, by Saeron Meng

    I would like to add some commentary texts to my video, but I do not know how to do this with ffmpeg. The comments are like bullets flying across the screen: they appear at the right margin, move and scroll horizontally, and disappear at the left.

    My idea is to measure the length of each comment and assign it a speed to move at, and I already have the comments saved as an XML file. But even though I can convert it into an SRT file, the tricky problem is that it is hard to express the speed of a subtitle (or anything like that) in an SRT file and apply it through ffmpeg commands or APIs. Here is an example of the comments (XML file):

    <chat timestamp="671.195">
       <ems utctime="1562584080" sender="Bill">
           <richtext></richtext>
       </ems>
    </chat>
    <chat timestamp="677.798">
       <ems utctime="1562584086" sender="Jack">
           <richtext></richtext>
       </ems>
    </chat>

    The final result looks like this (I did not find an example on English-language websites; in China such moving subtitles are called "danmu" or "screen bullets"): colorful characters that move horizontally from right to left:

    (example image)

    1. I have searched for solutions on the Internet, most of which talk about how to write ASS/SRT files and add motionless subtitles, like this:

    ffmpeg -i infile.mp4 -i infile.srt -c copy -c:s mov_text outfile.mp4

    3
    00:00:39,770 --> 00:00:41,880
    When I was lying there in the VA hospital ...

    4
    00:00:42,550 --> 00:00:44,690
    ... with a big hole blown through the middle of my life,

    5
    00:00:45,590 --> 00:00:48,120
    ... I started having these dreams of flying.

    But I need another kind of "subtitle", one that can move.

    1. When it comes to scrolling subtitles, there are also some solutions: Scrolling from RIGHT to LEFT in ffmpeg / drawtext (a drawtext-based approach, sketched below).
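
    The linked drawtext approach drives the text's x position with a time-based expression. A minimal sketch for one comment (the font size, speed of 120 px/s, start time of 5 s and display window are arbitrary example values):

    # One "danmu" comment scrolling from right to left between t=5s and t=15s.
    # w = video width, t = timestamp in seconds; some builds also need fontfile=/path/to/font.ttf.
    ffmpeg -i infile.mp4 \
        -vf "drawtext=text='Hello danmu':fontsize=36:fontcolor=white:y=50:x=w-(t-5)*120:enable='between(t,5,15)'" \
        -c:a copy outfile.mp4

    Each comment would get its own drawtext instance with its own start time and its own y value, which is one way to arrange them from top to bottom as asked below.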

    So my question is: how can I combine the solutions above to arrange the subtitles from top to bottom and make them move according to the timestamps of the comments?