Other articles (99)

  • Customising by adding a logo, banner or background image

    5 September 2013

    Some themes support three customisation options: adding a logo; adding a banner; adding a background image.

  • The SPIPmotion queue

    28 November 2010

    A queue stored in the database
    When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
    This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to process; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document should automatically be attached; objet, the type of object to which (...)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialised.
    Once activated, a preconfiguration is automatically put in place by MediaSPIP init so that the new feature is immediately operational. No configuration step is therefore required.

On other sites (11568)

  • FFMPEG result video freezes in the middle of playback

    10 May 2013, by Revoluzifer

    I'm trying to stitch four source videos together into one grid. For testing purposes I'm working with a single source flv, using this command:

    ffmpeg -i woohoo.flv -i woohoo.flv -i woohoo.flv -i woohoo.flv \
    -filter_complex "[0:0]pad=iw:ih[a];[1:0]scale=w=iw/2:h=ih/2[b];
    [2:0]scale=w=iw/2:h=ih/2[c];[3:0]scale=w=iw/2:h=ih/2[d];
    [0:0]scale=w=iw/2:h=ih/2[e];[a][b]overlay=w[x];[x][c]overlay=0:h[y];
    [y][d]overlay=w:h[z];[z][e]overlay=0:0" -qscale 1 stitched.avi

    The stitching itself works great, but the resulting video does not play back as expected.
    After running for a few frames, it gets stuck on a single frame for most of the video, skipping the frames that should be there.

    While encoding, ffmpeg throws a huge number of messages like:

    [Parsed_overlay_8 @ 000000000459e080] Buffer queue overflow, dropping.

    Any hints on how to solve my problem?
    My source flv is h.264-encoded, coming from a Red5 media server.

    Edit: Output is:

    root@s15757871:/opt/red5/webapps/woohoo/streams# ffmpeg -i out_3.mp4 -i 3.mp4 -filter_complex "[0:0]pad=iw:ih[a];[1:0]scale=w=iw/2:h=ih/2[b];[a][b]overlay=w:h" -vcodec libx264 -r 30 -acodec copy out_final.mp4
    ffmpeg version N-52943-g500220a Copyright (c) 2000-2013 the FFmpeg developers
    built on May 10 2013 09:46:03 with gcc 4.4.3 (Ubuntu 4.4.3-4ubuntu5.1)
    configuration: --enable-gpl --enable-libx264 --enable-libmp3lame --enable-nonfree --enable-libaacplus
     libavutil      52. 30.100 / 52. 30.100
     libavcodec     55.  7.100 / 55.  7.100
     libavformat    55.  4.101 / 55.  4.101
     libavdevice    55.  0.100 / 55.  0.100
     libavfilter     3. 63.101 /  3. 63.101
     libswscale      2.  3.100 /  2.  3.100
     libswresample   0. 17.102 /  0. 17.102
     libpostproc    52.  3.100 / 52.  3.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'out_3.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf55.4.101
    Duration: 00:00:26.53, start: 0.000000, bitrate: 513 kb/s
    Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 640x360 [SAR 1:1 DAR 16:9], 447 kb/s, 30 fps, 30 tbr, 15360 tbn, 60 tbc
    Metadata:
     handler_name    : VideoHandler
    Stream #0:1(und): Audio: mp3 (mp4a / 0x6134706D), 44100 Hz, mono, s16p, 62 kb/s
    Metadata:
     handler_name    : SoundHandler
    Input #1, mov,mp4,m4a,3gp,3g2,mj2, from '3.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
       encoder         : Lavf55.4.101
    Duration: 00:00:26.60, start: 0.000000, bitrate: 639 kb/s
    Stream #1:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 640x360 [SAR 1:1 DAR 16:9], 574 kb/s, 30 fps, 30 tbr, 15360 tbn, 60 tbc
    Metadata:
     handler_name    : VideoHandler
    Stream #1:1(und): Audio: mp3 (mp4a / 0x6134706D), 44100 Hz, mono, s16p, 62 kb/s
    Metadata:
     handler_name    : SoundHandler
    File 'out_final.mp4' already exists. Overwrite ? [y/N] y
    [libx264 @ 0x19baee0] using SAR=1/1
    [libx264 @ 0x19baee0] using cpu capabilities: MMX2 SSE2Fast SSEMisalign LZCNT
    [libx264 @ 0x19baee0] profile High, level 3.0
    [libx264 @ 0x19baee0] 264 - core 132 r2310 76a5c3a - H.264/MPEG-4 AVC codec - Copyleft 2003-2013 - http://www.videolan.org/x264.html - options:  cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=12 lookahead_threads=2 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
    Output #0, mp4, to 'out_final.mp4':
     Metadata:
       major_brand     : isom
       minor_version   : 512
       compatible_brands: isomiso2avc1mp41
    encoder         : Lavf55.4.101
    Stream #0:0: Video: h264 (libx264) ([33][0][0][0] / 0x0021), yuv420p, 640x360 [SAR 1:1 DAR 16:9], q=-1--1, 15360 tbn, 30 tbc
    Stream #0:1(und): Audio: mp3 (i[0][0][0] / 0x0069), 44100 Hz, mono, 62 kb/s
    Metadata:
     handler_name    : SoundHandler
    Stream mapping:
     Stream #0:0 (h264) -> pad
     Stream #1:0 (h264) -> scale
     overlay -> Stream #0:0 (libx264)
     Stream #0:1 -> #0:1 (copy)
    Press [q] to stop, [?] for help
    Buffer queue overflow, dropping.     583kB time=00:00:08.00 bitrate= 597.3kbits/s
    [Parsed_overlay_2 @ 0x19be420] Buffer queue overflow, dropping.
    Last message repeated 202 times
    frame=  795 fps= 96 q=-2.0 Lsize=    1700kB time=00:00:27.04 bitrate= 514.9kbits/s dup=203 drop=0
    video:1487kB audio:196kB subtitle:0 global headers:0kB muxing overhead 1.025454%
    [libx264 @ 0x19baee0] frame I:4     Avg QP:19.83  size: 34019
    [libx264 @ 0x19baee0] frame P:322   Avg QP:22.85  size:  3858
    [libx264 @ 0x19baee0] frame B:469   Avg QP:24.57  size:   305
    [libx264 @ 0x19baee0] consecutive B-frames: 13.8% 15.8% 20.0% 50.3%
    [libx264 @ 0x19baee0] mb I  I16..4: 10.5% 47.8% 41.7%
    [libx264 @ 0x19baee0] mb P  I16..4:  1.4%  3.0%  0.8%  P16..4: 27.1% 14.4%  9.2%  0.0% 0.0%    skip:44.1%
    [libx264 @ 0x19baee0] mb B  I16..4:  0.0%  0.1%  0.0%  B16..8: 15.3%  1.5%  0.2%  direct: 0.3%  skip:82.6%  L0:39.7% L1:53.8% BI: 6.5%
    [libx264 @ 0x19baee0] 8x8 transform intra:56.6% inter:64.9%
    [libx264 @ 0x19baee0] coded y,uvDC,uvAC intra: 58.4% 72.9% 39.0% inter: 8.3% 11.9% 1.0%
    [libx264 @ 0x19baee0] i16 v,h,dc,p: 63% 18% 10% 10%
    [libx264 @ 0x19baee0] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 23% 17% 20%  4%  7%  9%  6%  6%  8%
    [libx264 @ 0x19baee0] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 27% 21% 13%  5%  7%  8%  6%  6%  7%
    [libx264 @ 0x19baee0] i8c dc,h,v,p: 52% 18% 22%  9%
    [libx264 @ 0x19baee0] Weighted P-Frames: Y:7.5% UV:4.0%
    [libx264 @ 0x19baee0] ref P L0: 70.8% 16.9%  9.3%  2.8%  0.2%
    [libx264 @ 0x19baee0] ref B L0: 90.6%  8.1%  1.3%
    [libx264 @ 0x19baee0] ref B L1: 96.4%  3.6%
    [libx264 @ 0x19baee0] kb/s:459.39

    The source mp4 files used here have successfully been converted from flv by ffmpeg before.
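
    A note on a common remedy: "Buffer queue overflow" warnings tend to appear when the same file is opened as several inputs and the branches of the filter graph consume frames at different rates. One frequently suggested workaround is to decode the input once and duplicate it with the split filter, so every branch advances in lockstep. A sketch of the same grid built that way (untested; the pad/overlay layout is copied from the command above):

    ffmpeg -i woohoo.flv \
    -filter_complex "[0:v]split=5[bg][p1][p2][p3][p4];
    [bg]pad=iw:ih[a];[p1]scale=w=iw/2:h=ih/2[b];
    [p2]scale=w=iw/2:h=ih/2[c];[p3]scale=w=iw/2:h=ih/2[d];
    [p4]scale=w=iw/2:h=ih/2[e];[a][b]overlay=w[x];[x][c]overlay=0:h[y];
    [y][d]overlay=w:h[z];[z][e]overlay=0:0" -qscale 1 stitched.avi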

  • how to use crop filter twice in ffmpeg

    30 January 2018, by user1502679

    I want to use crop filter in two different locations, for example top left + bottom right and to merge them, how can I do this ?

    I want to do it in a single command, not with two independent crops that are merged afterwards.

    EDIT: an image: http://alexvorn.com/wp-content/uploads/2013/07/ffmpeg-multiple-crop.png
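
    One way to do this in a single command is to split the decoded video, crop each copy, and join the pieces with hstack. A sketch, assuming the two regions are the top-left and bottom-right quarters and that input.mp4/output.mp4 are placeholder names:

    ffmpeg -i input.mp4 \
    -filter_complex "[0:v]split=2[a][b];
    [a]crop=iw/2:ih/2:0:0[tl];
    [b]crop=iw/2:ih/2:iw/2:ih/2[br];
    [tl][br]hstack" output.mp4

    hstack requires its inputs to have equal heights, which both quarter crops do; for regions of equal width, vstack works the same way.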

  • FFmpeg - feed raw frames via pipe - FFmpeg does not detect pipe closure

    8 September 2018, by Rumble

    I'm trying to follow these examples from C++ on Windows: Python example, C# example.

    I have an application that produces raw frames that are to be encoded with FFmpeg.
    The raw frames are transferred via an IPC pipe to FFmpeg's STDIN. That part works as expected; FFmpeg even displays the number of frames received so far.

    The problem occurs when we are done sending frames. When I close the write end of the pipe, I would expect FFmpeg to detect that, finish up and output the video. But that does not happen: FFmpeg stays open and seems to wait for more data.

    I made a small test project in Visual Studio.

    #include "stdafx.h"
    //// stdafx.h
    //#include "targetver.h"
    //#include <stdio.h>
    //#include <tchar.h>
    //#include <iostream>

    #include "Windows.h"
    #include <cstdlib>
    #include <iostream>

    using namespace std;

    bool WritePipe(void* WritePipe, const UINT8 *const Buffer, const UINT32 Length)
    {
        if (WritePipe == nullptr || Buffer == nullptr || Length == 0)
        {
            cout << __FUNCTION__ << ": Some input is useless";
            return false;
        }

        // Write to pipe
        UINT32 BytesWritten = 0;
        UINT8 newline = '\n';
        bool bIsWritten = WriteFile(WritePipe, Buffer, Length, (::DWORD*)&BytesWritten, nullptr);
        cout << __FUNCTION__ << " Bytes written to pipe " << BytesWritten << endl;
        //bIsWritten = WriteFile(WritePipe, &newline, 1, (::DWORD*)&BytesWritten, nullptr); // Do we need this? Actually this should destroy the image.

        FlushFileBuffers(WritePipe); // Do we need this?

        return bIsWritten;
    }

    #define PIXEL 80 // must be a multiple of 8, otherwise we get the warning: bytes are not aligned

    int main()
    {
        HANDLE PipeWriteEnd = nullptr;
        HANDLE PipeReadEnd = nullptr;
        {
            // create a pipe for inter-process communication
            SECURITY_ATTRIBUTES Attr = { sizeof(SECURITY_ATTRIBUTES), NULL, true };
            if (!CreatePipe(&PipeReadEnd, &PipeWriteEnd, &Attr, 0))
            {
                cout << "Could not create pipes " << ::GetLastError() << endl;
                system("Pause");
                return 0;
            }
        }

        // Set up the variables needed for CreateProcess
        // initialize process attributes
        SECURITY_ATTRIBUTES Attr;
        Attr.nLength = sizeof(SECURITY_ATTRIBUTES);
        Attr.lpSecurityDescriptor = NULL;
        Attr.bInheritHandle = true;

        // initialize process creation flags
        UINT32 CreateFlags = NORMAL_PRIORITY_CLASS;
        CreateFlags |= CREATE_NEW_CONSOLE;

        // initialize window flags
        UINT32 dwFlags = 0;
        UINT16 ShowWindowFlags = SW_HIDE;

        if (PipeWriteEnd != nullptr || PipeReadEnd != nullptr)
        {
            dwFlags |= STARTF_USESTDHANDLES;
        }

        // initialize startup info
        STARTUPINFOA StartupInfo = {
            sizeof(STARTUPINFO),
            NULL, NULL, NULL,
            (::DWORD)CW_USEDEFAULT,
            (::DWORD)CW_USEDEFAULT,
            (::DWORD)CW_USEDEFAULT,
            (::DWORD)CW_USEDEFAULT,
            (::DWORD)0, (::DWORD)0, (::DWORD)0,
            (::DWORD)dwFlags,
            ShowWindowFlags,
            0, NULL,
            HANDLE(PipeReadEnd), // the child reads its STDIN from our pipe's read end
            HANDLE(nullptr),
            HANDLE(nullptr)
        };

        // CreateProcessA may modify the command line, so use a writable buffer
        char ffmpegURL[] = "\"PATHTOFFMPEGEXE\" -y -loglevel verbose -f rawvideo -vcodec rawvideo -framerate 1 -video_size 80x80 -pixel_format rgb24 -i - -vcodec mjpeg -framerate 1/4 -an \"OUTPUTDIRECTORY\"";

        // Finally create the process
        PROCESS_INFORMATION ProcInfo;
        if (!CreateProcessA(NULL, ffmpegURL, &Attr, &Attr, true, (::DWORD)CreateFlags, NULL, NULL, &StartupInfo, &ProcInfo))
        {
            cout << "CreateProcess failed " << ::GetLastError() << endl;
        }
        //CloseHandle(ProcInfo.hThread);


        // Create images and write them to the pipe
    #define MYARRAYSIZE (PIXEL*PIXEL*3) // each pixel has 3 bytes

        UINT8* Bitmap = new UINT8[MYARRAYSIZE];

        for (INT32 outerLoopIndex = 9; outerLoopIndex >= 0; --outerLoopIndex)   // frame loop
        {
            for (INT32 innerLoopIndex = MYARRAYSIZE - 1; innerLoopIndex >= 0; --innerLoopIndex) // create the pixels for each frame
            {
                Bitmap[innerLoopIndex] = (UINT8)(outerLoopIndex * 20); // some gray color
            }
            system("pause");
            if (!WritePipe(PipeWriteEnd, Bitmap, MYARRAYSIZE))
            {
                cout << "Failed writing to pipe" << endl;
            }
        }

        // Done sending images. Tell the other process. IS THIS NEEDED? HOW TO TELL FFmpeg WE ARE DONE?
        //UINT8 endOfFile = 0xFF; // EOF = -1 == 1111 1111 for uint8
        //if (!WritePipe(PipeWriteEnd, &endOfFile, 1))
        //{
        //  cout << "Failed writing to pipe" << endl;
        //}
        //FlushFileBuffers(PipeReadEnd); // Do we need this?

        delete[] Bitmap; // array delete to match new[]


        system("pause");

        // clean up
        FlushFileBuffers(PipeWriteEnd); // Do we need this?

        if (PipeWriteEnd != NULL && PipeWriteEnd != INVALID_HANDLE_VALUE)
        {
            CloseHandle(PipeWriteEnd);
        }

        // We do not want to destroy the read end of the pipe? Should not, as that belongs to FFmpeg
        //if (PipeReadEnd != NULL && PipeReadEnd != INVALID_HANDLE_VALUE)
        //{
        //  ::CloseHandle(PipeReadEnd);
        //}


        return 0;
    }

    And here is the output of FFmpeg:

    ffmpeg version 3.4.1 Copyright (c) 2000-2017 the FFmpeg developers
     built with gcc 7.2.0 (GCC)
     configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-bzlib --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvorbis --enable-cuda --enable-cuvid --enable-d3d11va --enable-nvenc --enable-dxva2 --enable-avisynth --enable-libmfx
     libavutil      55. 78.100 / 55. 78.100
     libavcodec     57.107.100 / 57.107.100
     libavformat    57. 83.100 / 57. 83.100
     libavdevice    57. 10.100 / 57. 10.100
     libavfilter     6.107.100 /  6.107.100
     libswscale      4.  8.100 /  4.  8.100
     libswresample   2.  9.100 /  2.  9.100
     libpostproc    54.  7.100 / 54.  7.100
    [rawvideo @ 00000221ff992120] max_analyze_duration 5000000 reached at 5000000 microseconds st:0
    Input #0, rawvideo, from 'pipe:':
     Duration: N/A, start: 0.000000, bitrate: 153 kb/s
       Stream #0:0: Video: rawvideo, 1 reference frame (RGB[24] / 0x18424752), rgb24, 80x80, 153 kb/s, 1 fps, 1 tbr, 1 tbn, 1 tbc
    Stream mapping:
     Stream #0:0 -> #0:0 (rawvideo (native) -> mjpeg (native))
    [graph 0 input from stream 0:0 @ 00000221ff999c20] w:80 h:80 pixfmt:rgb24 tb:1/1 fr:1/1 sar:0/1 sws_param:flags=2
    [auto_scaler_0 @ 00000221ffa071a0] w:iw h:ih flags:'bicubic' interl:0
    [format @ 00000221ffa04e20] auto-inserting filter 'auto_scaler_0' between the filter 'Parsed_null_0' and the filter 'format'
    [swscaler @ 00000221ffa0a780] deprecated pixel format used, make sure you did set range correctly
    [auto_scaler_0 @ 00000221ffa071a0] w:80 h:80 fmt:rgb24 sar:0/1 -> w:80 h:80 fmt:yuvj444p sar:0/1 flags:0x4
    Output #0, mp4, to 'c:/users/vr3/Documents/Guenni/sometest.mp4':
     Metadata:
       encoder         : Lavf57.83.100
       Stream #0:0: Video: mjpeg, 1 reference frame (mp4v / 0x7634706D), yuvj444p(pc), 80x80, q=2-31, 200 kb/s, 1 fps, 16384 tbn, 1 tbc
       Metadata:
         encoder         : Lavc57.107.100 mjpeg
       Side data:
         cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
    frame=   10 fps=6.3 q=1.6 size=       0kB time=00:00:09.00 bitrate=   0.0kbits/s speed=5.63x

    As you can see in the last line of the FFmpeg output, the images got through: 10 frames are available. But after closing the pipe, FFmpeg does not terminate; it is still expecting input.

    As the linked examples show, this should be a valid method.

    Trying for a week now...
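
    A plausible explanation, for what it's worth: CreatePipe is called with bInheritHandle = true, so the ffmpeg child process inherits copies of both pipe handles, including the write end. Closing the parent's write handle then never produces EOF on ffmpeg's STDIN, because the child still holds its own inherited copy of it. A commonly suggested fix (a sketch, untested here) is to strip inheritance from the write end after CreatePipe and before CreateProcessA:

    // Keep the read end inheritable (the child uses it as STDIN), but
    // make sure the child does NOT inherit the write end; otherwise
    // closing our copy never signals end-of-stream to ffmpeg.
    SetHandleInformation(PipeWriteEnd, HANDLE_FLAG_INHERIT, 0);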