Other articles (39)

  • The SPIPmotion queue

    28 November 2010, by

    A queue stored in the database
    When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
    This new table consists of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to process; id_document, the numeric identifier of the original document to encode; id_objet, the unique identifier of the object to which the encoded document will automatically be attached; objet, the type of object to which (...)

  • Keeping control of your media in your hands

    13 April 2011, by

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media-sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

  • The farm's regular Cron tasks

    1 December 2010, by

    Managing the farm involves running several repetitive tasks, known as Cron tasks, at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of all the farm's instances on a regular basis. Combined with a system Cron on the farm's central site, this makes it possible to generate regular visits to the various sites and to prevent the tasks of rarely visited sites from being too (...)
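
    For illustration, a hypothetical system crontab entry of the kind described above (the URL and the use of curl are assumptions, not taken from the article):

# Hypothetical crontab entry on the farm's central site: hitting the site once
# a minute triggers SPIP's task scheduler, which runs gestion_mutu_super_cron
# and, through it, the Cron of each instance.
* * * * * curl -s -o /dev/null https://ferme.example.org/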

On other sites (6748)

  • FFmpeg uses too much memory when repeating split, select, overlay

    13 November 2020, by finefoot

    I'm running

    ffmpeg -i input.mp4 -filter_complex_script script.txt output.mp4

    with the following minimal example script:

    split[tmp],
select='between(t,1,2)',
select='between(n,0,1)',
[tmp]overlay=enable='between(t,1,2)':eof_action=repeat,
split[tmp],
select='between(t,3,4)',
select='between(n,0,1)',
[tmp]overlay=enable='between(t,3,4)':eof_action=repeat

    What I want to do is take one frame at a certain position and repeat it for a certain duration, basically "pausing" the video while overlaying it over the original so the output keeps the same length. In the example, I'm doing this twice: I use split[tmp] to get a second input stream to work on, select the time at position 00:01 with select='between(t,1,2)', select the first frame from that position with select='between(n,0,1)', and finally overlay that frame over the input. The same thing is repeated a second time at position 00:03. I have tested this and it does exactly what I'm looking for.

    However, in my real script, I'm repeating this about 1000 times for different positions in the stream (and for shorter durations than 1 second), which results in running out of memory. What am I doing wrong? What can I do to optimize?
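
    Not from the original question, but for reference: a rough sketch of how such a long script.txt might be generated instead of written by hand, assuming the pause start times (whole seconds, one per line) are listed in a hypothetical positions.txt. Each position becomes one split/select/overlay group like those above, joined by commas:

# Sketch: build script.txt from positions.txt, one freeze-frame group per
# listed position. Assumes whole-second start times and one-second pauses.
graph=""
while read -r t; do
  end=$((t + 1))
  group="split[tmp],
select='between(t,$t,$end)',
select='between(n,0,1)',
[tmp]overlay=enable='between(t,$t,$end)':eof_action=repeat"
  if [ -z "$graph" ]; then
    graph="$group"
  else
    # separate successive groups with a comma, as in the hand-written script
    graph="$graph,
$group"
  fi
done < positions.txt
printf '%s\n' "$graph" > script.txt

    Generating the graph this way only removes the manual editing; it does not by itself change the memory behaviour of the filter chain.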

  • checkasm/hevc_pel: Fix stack buffer overreads

    28 September 2021, by Andreas Rheinhardt
    checkasm/hevc_pel: Fix stack buffer overreads

    This patch increases several stack buffers in order to fix
    stack-buffer-overflows (e.g. in put_hevc_qpel_uni_hv_9 in
    line 814 of hevcdsp_template.c) detected with ASAN in the hevc_pel
    checkasm test.
    The buffers are increased by the minimal amount necessary
    in order not to mask potential future bugs.

    Reviewed-by: Martin Storsjö <martin@martin.st>
    Reviewed-by: "zhilizhao(赵志立)" <quinkblack@foxmail.com>
    Signed-off-by: Andreas Rheinhardt <andreas.rheinhardt@outlook.com>

    • [DH] tests/checkasm/hevc_pel.c
  • Create a mkv file with colored background and containing a given audio and subtitle stream

    25 May 2023, by rdrg109

    Table of contents

    • The context
    • Minimal working example
    • What I've tried
      • Create a mkv file with colored background and an audio stream
      • Create a mkv file with colored background, an audio stream and a subtitles stream
    • The question

    The context

    I have a *.flac file and a *.srt file. I want to merge those files into an MKV file, but at the same time, I want to add a video stream. I want the video stream to show a green background the entire time.

    Minimal working example

    For our experimentation, let's create two sample files: one *.flac file and one *.srt file.

    The following command creates a *.flac file that lasts 60 seconds and contains a sine wave.

    $ ffmpeg -y -f lavfi -i "sine=f=1000:d=60" input.flac

    The following command creates a *.srt file. Note that our last subtitle lasts until the sixth second; this is intended.

$ cat << EOF > input.srt
1
00:00:00,000 --> 00:00:03,000
This is the first subtitle in a
SRT file.

2
00:00:03,000 --> 00:00:06,000
This is the second subtitle in a
SRT file.
EOF

    What I've tried

    Create a mkv file with colored background and an audio stream

    I know how to create an MKV file containing a given audio stream and a colored background as the video stream.

    The following command creates an MKV file containing input.flac as the audio stream and a green background as the video stream. The MKV file has the same duration as input.flac.

$ ffmpeg \
  -y \
  -f lavfi \
  -i color=c=green:s=2x2 \
  -i input.flac \
  -c:v libx264 \
  -c:a copy \
  -shortest \
  output.mkv

    The following command shows the duration of the streams in the resulting file.

$ ffprobe -v error -print_format json -show_entries stream=codec_type:stream_tags=duration output.mkv | jq -r ''

{
  "programs": [],
  "streams": [
    {
      "codec_type": "video",
      "tags": {
        "DURATION": "00:00:58.200000000"
      }
    },
    {
      "codec_type": "audio",
      "tags": {
        "DURATION": "00:01:00.000000000"
      }
    }
  ]
}

    Create a mkv file with colored background, an audio stream and a subtitles stream

    To add a subtitles stream, I just need to specify the *.srt file. However, when I do this, the duration of the video is set to the time of the last subtitle in the *.srt file. This is expected because I have used -shortest. I would get the result I'm looking for if it were possible to specify the stream that -shortest gives top priority to. I haven't found this information on the Internet.

$ ffmpeg \
  -y \
  -f lavfi \
  -i color=c=green:s=2x2 \
  -i input.flac \
  -i input.srt \
  -c:v libx264 \
  -c:a copy \
  -shortest \
  output.mkv

    The following command shows the duration of the streams in the resulting file. Note that the longest stream in the resulting file lasts 6 seconds, whereas in the file from the previous section it lasted 1 minute.

$ ffprobe -v error -print_format json -show_entries stream=codec_type:stream_tags=duration output.mkv | jq -r ''

{
  "programs": [],
  "streams": [
    {
      "codec_type": "video",
      "tags": {
        "DURATION": "00:00:01.160000000"
      }
    },
    {
      "codec_type": "audio",
      "tags": {
        "DURATION": "00:00:03.134000000"
      }
    },
    {
      "codec_type": "subtitle",
      "tags": {
        "DURATION": "00:00:06.000000000"
      }
    }
  ]
}

    The question

    Given a *.flac file and a *.srt file, how can I merge them into a *.mkv file so that it has the *.flac file as the audio stream, the *.srt file as the subtitles stream, and a green background as the video stream?
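
    Not part of the original question, but one possible direction as a sketch: if the lavfi color source is given an explicit duration matching the audio (d=60 for the 60-second example above; this assumes the audio duration is known, e.g. from ffprobe), then -shortest can be dropped and the 6-second subtitle stream no longer limits the output:

$ ffmpeg \
  -y \
  -f lavfi \
  -i color=c=green:s=2x2:d=60 \
  -i input.flac \
  -i input.srt \
  -c:v libx264 \
  -c:a copy \
  output.mkv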
