Advanced search

Media (0)

No media matching your criteria is available on this site.

Other articles (37)

  • HTML5 audio and video support (French version)

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer flash player is used instead.
    The HTML5 player was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
    These technologies make it possible to deliver video and sound both on conventional computers (...)

  • HTML5 audio and video support

    13 April 2011, by

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • From upload to the final video [standalone version]

    31 January 2010, by

    The path of an audio or video document through SPIPMotion is divided into three distinct stages.
    Upload and retrieval of information about the source video
    First, an SPIP article has to be created and the "source" video document attached to it.
    When this document is attached to the article, two actions are executed in addition to the normal behaviour: retrieval of the technical information of the file's audio and video streams (see the probing sketch after this list), and generation of a thumbnail by extracting a (...)
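
    The stream-probing step mentioned in the SPIPMotion excerpt above can be illustrated with FFmpeg's own libavformat API. This is a rough sketch only, not SPIPMotion's actual code: it assumes a recent FFmpeg (no av_register_all() call needed) and uses a placeholder input path.

    extern "C" {
    #include <libavformat/avformat.h>
    }
    #include <cstdio>

    int main(int argc, char* argv[])
    {
        const char* path = argc > 1 ? argv[1] : "source.mp4";   // placeholder input

        AVFormatContext* fmt = nullptr;
        if (avformat_open_input(&fmt, path, nullptr, nullptr) < 0)
            return 1;
        if (avformat_find_stream_info(fmt, nullptr) < 0) {
            avformat_close_input(&fmt);
            return 1;
        }

        // Walk the streams and report the basic technical information of each one.
        for (unsigned i = 0; i < fmt->nb_streams; ++i) {
            const AVCodecParameters* par = fmt->streams[i]->codecpar;
            if (par->codec_type == AVMEDIA_TYPE_VIDEO)
                std::printf("stream %u: video %dx%d\n", i, par->width, par->height);
            else if (par->codec_type == AVMEDIA_TYPE_AUDIO)
                std::printf("stream %u: audio %d Hz\n", i, par->sample_rate);
        }

        avformat_close_input(&fmt);
        return 0;
    }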

On other sites (6726)

  • ffmpeg hangs when run in a multi-threaded environment

    3 February 2018, by Zaid Amir

    I have a service that needs to transcode a large number of videos in different formats. The service spawns five threads, one per video, and each thread runs ffmpeg with the following command:

    ffmpeg -i %%FILEPATH%% -vf scale=X:Y -ab 128k -c:a aac -movflags faststart -strict -2 -ar 22050 -r 24 -c:v libx264 -crf 25 -y %%OUTPUT.MP4%%

    where X and Y are the desired dimensions based on the orientation of the original file: basically it is either 640:trunc(ow*a/2)*2 for landscape or trunc(oh*a/2)*2:640 for portrait.

    This is my ffmpeg info:

    ffmpeg version 2.4.3-1ubuntu1~trusty6 Copyright (c) 2000-2014 the FFmpeg developers
     built on Nov 22 2014 17:07:19 with gcc 4.8 (Ubuntu 4.8.2-19ubuntu1)
     configuration: --prefix=/usr --extra-version='1ubuntu1~trusty6' --build-suffix=-ffmpeg --toolchain=hardened --extra-cflags= --extra-cxxflags= --libdir=/usr/lib/x86_64-linux-gnu --shlibdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --enable-shared --disable-stripping --enable-avresample --enable-avisynth --enable-fontconfig --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librtmp --enable-libschroedinger --enable-libshine --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-opengl --enable-x11grab --enable-libxvid --enable-libx265 --enable-libdc1394 --enable-libiec61883 --enable-libzvbi --enable-libzmq --enable-frei0r --enable-libx264 --enable-libsoxr --enable-openal --enable-libopencv
     libavutil      54.  7.100 / 54.  7.100
     libavcodec     56.  1.100 / 56.  1.100
     libavformat    56.  4.101 / 56.  4.101
     libavdevice    56.  0.100 / 56.  0.100
     libavfilter     5.  1.100 /  5.  1.100
     libavresample   2.  1.  0 /  2.  1.  0
     libswscale      3.  0.100 /  3.  0.100
     libswresample   1.  1.100 /  1.  1.100
     libpostproc    53.  0.100 / 53.  0.100
    Hyper fast Audio and Video encoder
    usage: ffmpeg [options] [[infile options] -i infile]... {[outfile options] outfile}...

    The service is written in Java and runs on Ubuntu Server 14.04; the machine is a 64-bit octa-core server.

    This is the code block that executes ffmpeg:

    try
    {
        // Build the ffmpeg command line from the source path, the target
        // dimensions and the output path, then split it into arguments.
        // (Uses java.io.BufferedReader, java.io.InputStreamReader, java.util.Arrays.)
        String sArgs = String.format("ffmpeg -i %s -vf scale=%s:%s -ab 128k -c:a aac -movflags faststart -strict -2 -ar 22050 -r 24 -c:v libx264 -crf 25 -y %s",
                originalPath,
                outWidth,
                outHeight,
                targetPath
        );

        String[] arrArgs = sArgs.split("\\s+");
        ProcessBuilder procBuilder = new ProcessBuilder(Arrays.asList(arrArgs));
        // Merge stderr into stdout so one reader drains both pipes; an undrained
        // pipe buffer would eventually block ffmpeg.
        procBuilder.redirectErrorStream(true);

        Process process = procBuilder.start();

        // Read ffmpeg's output until the process exits, then collect its exit code.
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream())))
        {
            String line = null;
            while ((line = reader.readLine()) != null)
            {
                System.out.println(line);
            }
            errorCode = process.waitFor();
        }
    }
    catch (Throwable ex)
    {
        ex.printStackTrace();
    }

    I am currently spawning five threads, and each thread runs a single instance of ffmpeg targeting a single video file. This works fine most of the time, but every once in a while the threads start to hang. I noticed from top that ffmpeg hangs indefinitely on some files, with one of the threads using 100% of a CPU core while no progress is made. It happens with different file types; I have seen it on mkv, avi, wmv and mp4 files.

    I am not sure what is causing ffmpeg to hang. It does not happen right at the beginning of the transcoding process: ffmpeg starts converting the file fine, but somewhere in the middle it gets stuck.

    This is not a problem with the files: when I run the same command on the same file manually, it works fine. It also only seems to happen when multiple instances of ffmpeg are running at the same time; I have since changed my service to run a single thread, and it has been running for almost a month with no issues.

    Is there an option I need to use to allow multiple instances of ffmpeg to run at the same time? Is it something in the command line I currently use that causes this?

  • rgb32 data resource mapping. using directx memcpy

    17 January 2019, by Sang Hun Kim

    I have been trying to solve this problem for a month by Googling, but now I have to ask for help here.

    I want to render an ffmpeg-decoded frame. The frame is converted to RGB32 format, and I try to render it with a DirectX 2D texture.

    ZeroMemory(&TextureDesc, sizeof(TextureDesc));

    TextureDesc.Height = pFrame->height;
    TextureDesc.Width = pFrame->width;
    TextureDesc.MipLevels = 1;
    TextureDesc.ArraySize = 1;
    TextureDesc.Format = DXGI_FORMAT_R32G32B32A32_FLOAT;            //size 16
    TextureDesc.SampleDesc.Count = 1;
    TextureDesc.SampleDesc.Quality = 0;
    TextureDesc.Usage = D3D11_USAGE_DYNAMIC;
    TextureDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    TextureDesc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;
    TextureDesc.MiscFlags = 0;

    result = m_device->CreateTexture2D(&TextureDesc, NULL, &m_2DTex);
    if (FAILED(result))     return false;

    ShaderResourceViewDesc.Format = TextureDesc.Format;
    ShaderResourceViewDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D;
    ShaderResourceViewDesc.Texture2D.MostDetailedMip = 0;
    ShaderResourceViewDesc.Texture2D.MipLevels = 1;

    D3D11_MAPPED_SUBRESOURCE S_mappedResource_tt = { 0, };

    ZeroMemory(&S_mappedResource_tt, sizeof(D3D11_MAPPED_SUBRESOURCE));

    result = m_deviceContext->Map(m_2DTex, 0, D3D11_MAP_WRITE_DISCARD, 0, &S_mappedResource_tt);
    if (FAILED(result)) return false;
    BYTE* mappedData = reinterpret_cast<BYTE*>(S_mappedResource_tt.pData);
    for (auto i = 0; i < pFrame->height; ++i) {
       memcpy(mappedData, pFrame->data, pFrame->linesize[0]);
       mappedData += S_mappedResource_tt.RowPitch;
       pFrame->data[0] += pFrame->linesize[0];
    }

    m_deviceContext->Unmap(m_2DTex, 0);

    result = m_device->CreateShaderResourceView(m_2DTex, &ShaderResourceViewDesc, &m_ShaderResourceView);
    if (FAILED(result))     return false;

    m_deviceContext->PSSetShaderResources(0, 1, &m_ShaderResourceView);

    but it shows me just a black screen (nothing renders).
    I guess the memcpy size is wrong.
    The biggest problem is that I don’t know what the problem is.

    Question 1:
    Is there any problem with how I create the 2D texture for mapping?

    Question 2:
    What size should the memcpy parameters be (related to the format)?

    I based my code on the links below.

    [1]https://www.gamedev.net/forums/topic/667097-copy-2d-array-into-texture2d/
    [2]https://www.gamedev.net/forums/topic/645514-directx-11-maping-id3d11texture2d/
    [3]https://www.gamedev.net/forums/topic/606100-solved-dx11-updating-texture-data/

    Thank you for reading; please reply.
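
    As a side note on Question 2: the amount to copy per row is the pixel width multiplied by the bytes per pixel of the texture format, and the frame layout has to match that format (DXGI_FORMAT_R32G32B32A32_FLOAT is 16 bytes per pixel, while an 8-bit RGB32/BGRA frame is 4). The following is a minimal sketch of such a row-by-row copy, assuming the frame was converted with sws_scale to AV_PIX_FMT_BGRA and the texture was created with the matching DXGI_FORMAT_B8G8R8A8_UNORM; it is an illustration, not a drop-in fix for the code above.

    #include <d3d11.h>
    #include <algorithm>
    #include <cstdint>
    #include <cstring>

    // Copies one tightly packed BGRA frame (4 bytes per pixel) into a dynamic
    // D3D11 texture created with DXGI_FORMAT_B8G8R8A8_UNORM and CPU write access.
    bool UploadBgraFrame(ID3D11DeviceContext* ctx, ID3D11Texture2D* tex,
                         const uint8_t* srcData, int srcLinesize,
                         int width, int height)
    {
        D3D11_MAPPED_SUBRESOURCE mapped = {};
        if (FAILED(ctx->Map(tex, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped)))
            return false;

        const size_t rowBytes = static_cast<size_t>(width) * 4;   // 4 bytes per BGRA pixel
        uint8_t* dst = static_cast<uint8_t*>(mapped.pData);

        for (int y = 0; y < height; ++y) {
            // RowPitch may be larger than width*4, so advance the destination by the
            // pitch and the source by its own linesize, copying one row of pixels.
            std::memcpy(dst + static_cast<size_t>(y) * mapped.RowPitch,
                        srcData + static_cast<size_t>(y) * srcLinesize,
                        std::min<size_t>(rowBytes, mapped.RowPitch));
        }

        ctx->Unmap(tex, 0);
        return true;
    }

    Called with pFrame->data[0] and pFrame->linesize[0], this also leaves the source frame untouched instead of advancing pFrame->data[0] inside the loop.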

  • lavc/libaomenc: Add a maximum constraint of 64 encoder threads.

    27 November 2018, by Jun Zhao
    lavc/libaomenc: Add a maximum constraint of 64 encoder threads.
    

    fixed the error in Intel(R) Xeon(R) Gold 6152 CPU like:
    [libaom-av1 @ 0x469f340] Failed to initialize encoder : Invalid parameter
    [libaom-av1 @ 0x469f340] Additional information : g_threads out of range [..MAX_NUM_THREADS]

    Signed-off-by: Jun Zhao <mypopydev@gmail.com>
    Signed-off-by: James Almer <jamrial@gmail.com>

    • [DH] libavcodec/libaomenc.c
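
    The log above refers to libaom's g_threads field going out of range. Purely as an illustration of the constraint the commit subject describes (the real change lives in libavcodec/libaomenc.c; this is not the actual patch), the clamp amounts to:

    #include <algorithm>

    // Cap the requested encoder thread count at the 64-thread maximum mentioned
    // in the commit before writing it into the libaom encoder configuration.
    static unsigned clamp_aom_threads(unsigned requested)
    {
        return std::min(requested, 64u);
    }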