
Media (1)

Keyword: - Tags - / e-book

Other articles (99)

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

  • Improvements to the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the usability of multiple-select fields. See the two images below for a comparison.
    To use it, simply enable the Chosen plugin (Configuration générale du site > Gestion des plugins), then configure it (Les squelettes > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-select lists (...)

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things):
    - implementation costs to be shared between several different projects / individuals
    - rapid deployment of multiple unique sites
    - creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

On other sites (9539)

  • "Smooth" the ffmpeg showwaves video filter output

    24 December 2019, by goodytx

    I think I’m close to what I’m after, thanks to excellent answers like: FFMPEG: Fill/Change (part of) audio waveform color as per actual progress with respect to time progress... but not quite there!

    I’m scripting the generation of a waveform video with showwaves and overlaying it onto a static image background, turning the whole thing into a video. All is well, except I’d like to "smooth" the waveform. In most tutorials/answers you will see jagged "peaks" in the waveform animation, since of course the audio data varies widely between peaks and troughs (the difference, I think, between the lowest and highest amplitude in each audio frame sample). I’d like to get more of a consistent "flowing wave", without the sharp spikes up and down.

    Currently using the filter much as described in the previous link:

    [1:a]aformat=channel_layouts=mono,asplit[red][white]; \
    [red]showwaves=s=1280x100:rate=15:mode=cline:scale=sqrt:colors=0xff0000[red]; \

    Which results in the following (ignoring the red-on-white overlay, which I may drop):

    (Image: current vs. desired waveform)

    This generates the type of wave plot I want (solid vs a sine, for example), but I think I need to either tweak the audio stream or possibly post-filter the showwaves video stream (though I suspect the former) to get the effect I want. I tried reducing the rate of showwaves, but that just results in juddery video. So to me it feels like the audio data is what needs to change.

    My limited audio knowledge leads me to think I should apply gain, compression, normalization, or all of the above to the audio input, which I have tried, but I can’t seem to nail it. Keep in mind that the audio stream in this filter chain purely feeds the showwaves filter, so I can "trash" it as much as needed in order to get the visualization I want. (A pre-processing sketch along these lines is appended after this question.)

    I hope this all makes sense, and if I need to provide more info, please let me know. Thanks!
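
    A possible direction, sketched here rather than taken from the post: since the branch feeding showwaves can be "trashed" freely, compress and low-pass only that branch so the drawn amplitude varies more slowly. The file names (background.png, input.mp3) and the acompressor/lowpass settings are illustrative assumptions; the showwaves parameters are the ones used above.

    ffmpeg -loop 1 -i background.png -i input.mp3 -filter_complex \
      "[1:a]aformat=channel_layouts=mono,acompressor=threshold=0.1:ratio=9:attack=5:release=250,lowpass=f=40[viz]; \
       [viz]showwaves=s=1280x100:rate=15:mode=cline:scale=sqrt:colors=0xff0000[wave]; \
       [0:v][wave]overlay=(W-w)/2:H-h[outv]" \
      -map "[outv]" -map 1:a -c:v libx264 -pix_fmt yuv420p -c:a aac -shortest out.mp4

    Raising the compressor ratio or lowering the lowpass cutoff flattens the drawn peaks further; the audio actually muxed into the output is mapped untouched from 1:a, so the visualization branch can be pushed as hard as needed.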

  • x264 configure linking and building problems

    16 January 2014, by Pie

    I am trying to build x264 from source on 32-bit Ubuntu in order to convert a sequence of JPG or PNG images into an MP4 video: x264 site, sample images

    The downloaded binary is able to convert the sequence into an MKV video (or a few other formats) when I run this command:

    ./x264dist ~/Dev/x264emp/img/FLYOVER%4d.JPG -o abc.mkv

    x264dist is what I renamed the binary downloaded from the site.

    However, when I grab the source and compile with a simple configure:

    $ ./configure --enable-shared --enable-static --enable-pic

    platform:      X86
    system:        LINUX
    cli:           yes
    libx264:       internal
    shared:        yes
    static:        yes
    asm:           yes
    interlaced:    yes
    avs:           avxsynth
    lavf:          no
    ffms:          no
    mp4:           no
    gpl:           yes
    thread:        posix
    opencl:        yes
    filters:       crop select_every
    debug:         no
    gprof:         no
    strip:         no
    PIC:           yes
    bit depth:     8
    chroma format: all

    then $ make. Then I use the resulting binary to run exactly the same command as above, but I get this error:

    ./x264 ~/Dev/x264emp/img/FLYOVER%4d.JPG -o abc.mkv
    raw [error]: raw input requires a resolution.
    x264 [error]: could not open input file `/home/tmd/Dev/x264emp/img/FLYOVER%4d.JPG' via any method!

    It seems like it can’t read any input at all, though I can still run --help on that binary.

    Then I noticed that the downloaded binary is 3.5 MB while my custom build produces a 1.5 MB binary.

    So I just want to know what build configuration the official build uses, and/or whether there is a dependency I am missing that leads to this problem. (A rebuild sketch addressing the "lavf: no" line above is appended after this question.)

    The reason I am trying to build it myself is that I want to port the x264 library to JavaScript using Emscripten. There is an existing solution based on FFmpeg, but it seems I don’t need the whole video-processing library, only a simple H.264 encoder. So I need to solve this configure/compile/linking problem to port it properly.

    Possibly similar: How to configure X264 build before running make on OS X
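
    One note, sketched here rather than taken from the question: the configure summary above reports lavf: no and ffms: no, and an x264 CLI built without libavformat/ffms2 support can only read raw YUV or Y4M input, which matches the "raw input requires a resolution" error when it is handed JPG files (and also explains the smaller binary). A rough rebuild sketch for Ubuntu follows; the package names are assumptions about the distro release.

    # install FFmpeg's libavformat/libswscale development headers (exact package names may vary)
    sudo apt-get install libavformat-dev libswscale-dev

    # re-run configure; lavf support is autodetected, so the summary should now report "lavf: yes"
    ./configure --enable-shared --enable-static --enable-pic
    make

    With lavf enabled, the same ./x264 ~/Dev/x264emp/img/FLYOVER%4d.JPG -o abc.mkv invocation should be able to decode the image sequence, as the prebuilt binary does.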

  • Intel IPP RGBToYUV420 function is getting IppStsSizeErr result code

    6 February 2018, by yesilcimen.ahmet

    I am using IPP 2017.0.3 (r55431) and Delphi 10.2. I am trying to convert RGB to YUV420P, but I am getting an IppStsSizeErr result code.

    I have m_dst_picture and m_src_picture, two AVPicture structures created by FFmpeg.

    { allocate the encoded raw picture }
    ret := avpicture_alloc(@m_dst_picture, AV_PIX_FMT_YUV420P, c^.width, c^.height);
    if (ret < 0) then
      Exit(False);

    { allocate the BGR frame that we will convert into the YUV frame }
    ret := avpicture_alloc(@m_src_picture, AV_PIX_FMT_BGR24, c^.width, c^.height);
    if (ret < 0) then
      Exit(False);
    // It works fine.
    { convert the BGR frame (m_src_picture) to the YUV frame (m_dst_picture) }
    sws_scale(sws_ctx, @m_src_picture.data[0], @m_src_picture.linesize, 0, c^.height, @m_dst_picture.data[0], @m_dst_picture.linesize);

    I want to convert the RGB buffer directly to YUV420P. The original code first loads RGB into the AVPicture and then converts it to YUV420P with sws_scale, which is slow.

    Here (in the procedure below) I copy the BGR buffer into FFmpeg’s m_src_picture, but this copy also costs performance, so I want to convert it directly to YUV420P using Intel IPP instead.

    procedure WriteFrameBGR24(frame: PByte);
    var
      y: Integer;
    begin
      { Copy each row of the bottom-up BGR bitmap (frame points at Bmp.ScanLine[0], the highest address) into the top-down AVPicture buffer. }
      for y := 0 to m_c^.height - 1 do
        Move(PByte(frame - (y * dstStep))^, PByte(m_src_picture.data[0] + (y * m_src_picture.linesize[0]))^, dstStep);
    end;

    In the code below I am trying to convert using Intel IPP.

    { Converting RGB to YUV420P. }

    Notes:
    - roiSize is 1920 x 1080.
    - The values FFmpeg created for YUV420P in m_dst_picture.linesize are [0]=1920, [1]=960, [2]=960 respectively. Do I need to convert the linesize values to something else?
    - The srcStep parameter is negative because the bitmap is bottom-up and the frame pointer holds the Bmp.ScanLine[0] address, which is the highest row address.

    srcStep := (((width * (3 * 8)) + 31) and not 31) div 8; // row stride of the 24-bit bitmap, padded up to a 32-bit boundary
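    // e.g. for width = 1920: (((1920 * 24) + 31) and not 31) div 8 = 5760 bytes per row, i.e. exactly 1920 * 3 (already 32-bit aligned)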

    { Swap BGR channels to RGB. }
    // It works fine.
    st := ippiSwapChannels_8u_C3IR(frame, -srcStep, roiSize, @BGRToRGBArray[0]);

    { Convert RGB to YUV420P. }
    // Returns IppStsSizeErr.
    st := ippiRGBToYUV420_8u_C3P3R(frame, -srcStep, @m_dst_picture.data[0], @m_dst_picture.linesize[0], roiSize);

    How do I solve this problem? (A sketch of the documented call shape is appended below.)

    Thank you.
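
    For reference (no solution is included in this excerpt), here is a minimal sketch of the call shape the IPP documentation gives for ippiRGBToYUV420_8u_C3P3R: a packed source pointer and stride, an array of three destination plane pointers, an array of three destination strides, and an IppiSize ROI with positive, even width and height (the output is 4:2:0). It reuses the variables from the question; the Delphi binding types and the explicit packing are assumptions, not a confirmed fix.

    var
      dstPlanes: array[0..2] of PByte;   { Y, U and V plane pointers }
      dstSteps:  array[0..2] of Integer; { plane strides: 1920, 960, 960 here }
      roi:       IppiSize;
    begin
      { Destination planes and strides exactly as FFmpeg allocated them. }
      dstPlanes[0] := m_dst_picture.data[0];
      dstPlanes[1] := m_dst_picture.data[1];
      dstPlanes[2] := m_dst_picture.data[2];
      dstSteps[0]  := m_dst_picture.linesize[0];
      dstSteps[1]  := m_dst_picture.linesize[1];
      dstSteps[2]  := m_dst_picture.linesize[2];

      { ROI in pixels; both fields must be positive and even for 4:2:0 output. }
      roi.width  := 1920;
      roi.height := 1080;

      st := ippiRGBToYUV420_8u_C3P3R(frame, -srcStep, @dstPlanes[0], @dstSteps[0], roi);
    end;

    If the status is still IppStsSizeErr with the ROI filled like this, the remaining size-related inputs worth double-checking are the (negative) srcStep and the destination strides, since together with roiSize they are the only size parameters the function receives.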