
Media (1)

Keyword: - Tags -/iphone

Other articles (76)

  • The farm's regular Cron tasks

    1 December 2010, by

    Managing the farm involves running several repetitive tasks, known as Cron tasks, at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of every instance of the shared hosting (mutualisation) on a regular basis. Combined with a system Cron on the central site of the farm, it generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...)
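
    As an illustration only, here is a small Python sketch of what such a super Cron amounts to: visit each instance so that its own Cron tasks get a chance to run. The list of sites and the spip.php?action=cron trigger URL are assumptions for the example, not details taken from the article.

import urllib.request

# Hypothetical list of mutualised instances; in practice this would come
# from the farm's own configuration.
SITES = [
    "https://site1.example.org",
    "https://site2.example.org",
]

def super_cron():
    """Visit every instance so that its own Cron tasks run regularly."""
    for site in SITES:
        try:
            # Assumed trigger URL; a plain page hit would also do, since the
            # point is simply to generate a regular visit on each site.
            urllib.request.urlopen(site + "/spip.php?action=cron", timeout=10)
        except OSError:
            pass  # an unreachable site should not block the others

if __name__ == "__main__":
    # Called every minute from a system Cron entry on the central site.
    super_cron()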

  • Contribute to translation

    13 April 2011

    You can help us improve the language used in the software interface to make MediaSPIP more accessible and user-friendly. You can also translate the interface into any language, allowing it to spread to new linguistic communities.
    To do this, we use the translation interface of SPIP, where all the language modules of MediaSPIP are available. Just subscribe to the mailing list and request further information on translation.
    MediaSPIP is currently available in French and English (...)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

On other sites (10462)

  • avcodec/v4l2: set sizeimage param for non-raw buffers [fixes #6716]

    4 October 2017, by Jorge Ramirez-Ortiz

    Some V4L2 drivers fail to allocate buffers when sizeimage is not set
    to a max value. This is indeed the case for s5p-mfc [1]

    Most drivers should be able to calculate this value from the frame
    dimensions and format - or at least have their own default.

    However, since this workaround should not impact those drivers doing
    the "right thing", this commit just provides such a default.

    The calculations were extracted from the v4l2 driver used to develop
    the ffmpeg v4l2_m2m support [2]. See venc.c and vdec.c

    [1] linux.git/drivers/media/platform/s5p-mfc
    [2] linux.git/drivers/media/platform/qcom/venus/

    • [DH] libavcodec/v4l2_context.c
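
    As a rough illustration of the idea (this is not the exact formula used in libavcodec/v4l2_context.c, only the general shape suggested by the description above), a worst-case default can be derived from the frame dimensions:

def default_sizeimage(width, height):
    """Illustrative worst-case size for a compressed (non-raw) frame buffer.

    The commit takes its calculations from the driver code referenced above
    (venc.c and vdec.c); this sketch only shows the principle: a generous
    upper bound derived from the frame dimensions, never zero.
    """
    # A raw 4:2:0 frame occupies width * height * 3 / 2 bytes, which is a
    # conservative ceiling for a compressed frame of the same dimensions.
    raw_420 = width * height * 3 // 2
    # Arbitrary floor (an assumption here) so that tiny resolutions still
    # get a usable buffer.
    return max(raw_420, 1 << 20)

    For a 1920x1080 frame this gives 3110400 bytes, roughly 3 MB.
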
  • confused about lhls in ffmpeg

    27 December 2020, by JohnL

    I am more than a little confused about trying to use the experimental lhls option in ffmpeg.

    I have built ffmpeg from source (which takes a while), and currently have version 4.1.6 running.

    Reading other threads here, and the ffmpeg docs, it appears that the lhls option is available only in the Dash muxer.

    I gravitated to HLS because of its support on Apple devices, and I thought that lhls was related (and Apple was drafting the spec).

    Then there is the comment in the ffmpeg docs that states that the hls.js folks are working on supporting lhls in their library, which I also use to support browsers like Chrome.

    Should I be using dash instead of hls for my prototype? I want to try and get very low latency for a construction inspection app I am considering building. Right now, the latency I am seeing is 10-15 seconds.

    Any insights would be appreciated. TIA

    See my comment below. I have many of the dash format flags working, but can't get -lhls to work. Here are some of the build flags used:

    conversion: avcodec configuration: --prefix=/usr --extra-version='1 deb10u1+rpt1' --toolchain=hardened --incdir=/usr/include/arm-linux-gnueabihf --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-omx-rpi --enable-mmal --enable-neon --enable-rpi --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --libdir=/usr/lib/arm-linux-gnueabihf/neon/vfp --cpu=cortex-a7 --arch=armv6t2 --disable-thumb --enable-shared --disable-doc --disable-programs
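
    For what it is worth, here is a minimal sketch of the kind of DASH-muxer invocation this points towards, written as a Python subprocess call to match the other snippets on this page. The input file, codecs and segment settings are assumptions, not a verified low-latency setup; the streaming and hls_playlist options are included on the assumption that HLS playlists are wanted alongside the MPD.

import subprocess

# Illustrative only: source, codecs and segment settings are assumptions.
cmd = [
    "ffmpeg",
    "-re", "-i", "input.mp4",
    "-c:v", "libx264", "-c:a", "aac",
    "-f", "dash",
    "-streaming", "1",       # write segments incrementally
    "-hls_playlist", "1",    # also emit HLS playlists next to the MPD
    "-lhls", "1",            # the experimental low-latency HLS option
    "-seg_duration", "2",
    "-use_template", "1",
    "-use_timeline", "0",
    "stream.mpd",
]
subprocess.run(cmd, check=True)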

  • ffmpeg does not recognize long string filter in execv

    4 May 2023, by incertia

    I am writing a simple Python script to call ffmpeg and concat some clips together. However, it doesn't work for reasons I am unable to explain.

    Below is a working version of the code after some debugging:

import itertools
import os
from tkinter.filedialog import askopenfilenames

# quality (the CRF value) and out (the output file name) are assumed to be
# defined earlier in the script.
inputs = sorted(list(askopenfilenames()))
n = len(inputs)

# Build the filter_complex string: one "[i:v] [i:a]" pair per input, then concat.
filter = []
for i in range(n):
    filter.append("[{}:v]".format(i))
    filter.append("[{}:a]".format(i))
filter.append("concat={}:v=1:a=1".format(n))
filter.append("[v]")
filter.append("[a]")
filter = " ".join(filter)

# One "-i <file>" pair per input, then the filter, codecs and output.
# Note that the filter string is wrapped in literal double-quote characters here.
fargs = zip(itertools.repeat('-i'), inputs)
fargs = itertools.chain(
    ["ffmpeg"],
    itertools.chain.from_iterable(fargs),
    ["-filter_complex", '"{}"'.format(filter), "-vsync", "vfr", "-map", "[v]", "-map", "[a]"],
    ["-c:v", "libx264", "-crf", "{}".format(quality)],
    ["-c:a", "aac", "-b:a", "192k"],
    [out]
    )

# Replace the current process with ffmpeg (argv[0] is "ffmpeg").
os.execvp("ffmpeg", list(fargs))

    But the entire fargs construction causes ffmpeg to complain about the filter chain when the quotes are not used, e.g. with the process below:

fargs = itertools.chain(
    ["ffmpeg", "-loglevel", "debug"],
    itertools.chain.from_iterable(fargs),
    #["-filter_complex", '"{}"'.format(filter), "-vsync", "vfr", "-map", "[v]", "-map", "[a]"],
    ["-filter_complex", filter, "-vsync", "vfr", "-map", "[v]", "-map", "[a]"],
    ["-c:v", "libx264", "-crf", "{}".format(quality)],
    ["-c:a", "aac", "-b:a", "192k"],
    [out]
)

    we see that ffmpeg somehow sees this as multiple arguments

    Reading option '-filter_complex' ... matched as option 'filter_complex' (create a complex filtergraph) with argument '[0:v]'.
Reading option '[0:a]' ... matched as output url.
Reading option '[1:v]' ... matched as output url.
Reading option '[1:a]' ... matched as output url.
Reading option '[2:v]' ... matched as output url.
Reading option '[2:a]' ... matched as output url.
Reading option 'concat=3:v=1:a=1' ... matched as output url.
Reading option '[v]' ... matched as output url.
Reading option '[a]' ... matched as output url.

    even though a simple print(list(fargs)) yields

    ['ffmpeg', '-loglevel', 'debug', '-i', 'a.mp4', '-i', 'b.mp4', '-i', 'c.mp4', '-filter_complex', '[0:v] [0:a] [1:v] [1:a] [2:v] [2:a] concat=3:v=1:a=1 [v] [a]', '-vsync', 'vfr', '-map', '[v]', '-map', '[a]', '-c:v', 'libx264', '-crf', '20', '-c:a', 'aac', '-b:a', '192k', 'asdf.mp4']


    implying that the long filter string is being passed to ffmpeg as a single argument.
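
    For comparison, here is a minimal sketch (using subprocess.run instead of os.execvp, with hypothetical file names) in which the whole filter graph is passed as a single argv element with no literal quote characters around it; since no shell is involved, each list element reaches ffmpeg as exactly one argument.

import subprocess

# Hypothetical inputs; in the real script these come from askopenfilenames().
inputs = ["a.mp4", "b.mp4", "c.mp4"]
quality = 20
out = "out.mp4"

# Same filter graph as above: [0:v][0:a]...[n-1:a] concat=n=N:v=1:a=1 [v][a]
pairs = "".join("[{0}:v][{0}:a]".format(i) for i in range(len(inputs)))
filter_graph = "{}concat=n={}:v=1:a=1[v][a]".format(pairs, len(inputs))

cmd = ["ffmpeg"]
for path in inputs:
    cmd += ["-i", path]
cmd += [
    "-filter_complex", filter_graph,  # one list element, no extra quotes
    "-map", "[v]", "-map", "[a]",
    "-c:v", "libx264", "-crf", str(quality),
    "-c:a", "aac", "-b:a", "192k",
    out,
]
subprocess.run(cmd, check=True)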