Other articles (108)

  • MediaSPIP version 0.1 Beta

    16 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in "farm" mode, you will also need to make other modifications (...)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to make other manual (...)

  • Improving the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the ergonomics of multiple-selection fields. See the two images that follow for a comparison.
    To use it, activate the Chosen plugin (Configuration générale du site > Gestion des plugins), then configure the plugin (Les squelettes > Chosen) by enabling Chosen on the public site and specifying the form elements to improve, for example select[multiple] for multiple-selection lists (...)

On other sites (10388)

  • Why is ffmpeg's hstack so much slower than overlay and pad?

    27 January 2021, by cgenco

    I'm using ffmpeg to stitch together two videos of people chatting into a video with each of them side-by-side, like this:

    left.mp4 + right.mp4 = out.mp4

    Here's the command I'm currently using to get this done, which runs at 2.5x on my 13" M1 MacBook Pro:

    ffmpeg -y -i left.mp4 -i right.mp4 -filter_complex "
  [0:v] crop=w=in_w/2 [croppedLeft];
  [1:v][1:v] overlay=x=overlay_w/4 [shiftedRight];
  [shiftedRight][croppedLeft] overlay [vout];
  [0:a][1:a] amix [aout]
" -map "[vout]" -map "[aout]" -ac 2 out.mp4

    This command crops the left video to half of its original width (cropping so the video stays centered), shifts the right video a quarter of its width to the right, then overlays the cropped left video onto the left half of the shifted right video to form the merged output.

    One day on my weekly fun-time read-through of the FFmpeg filters documentation I stumbled on a filter named hstack, which is described as being "faster than using overlay and pad filter to create same output."

    My ex-wife can affirm that there are few higher priorities in my life than going faster, so I altered my ffmpeg script to use hstack instead of two overlays:

    ffmpeg -y -i left.mp4 -i right.mp4 -filter_complex "
  [0:v] crop=w=in_w/2 [croppedLeft];
  [1:v] crop=w=in_w/2 [croppedRight];
  [croppedLeft][croppedRight] hstack [vout];
  [0:a][1:a] amix [aout]
" -map "[vout]" -map "[aout]" -ac 2 out.mp4

    ...but that command runs painfully slowly, like 0.1x. It takes multiple minutes to render a single second.

    So uhhh what's going on here? Why is hstack taking so long when it's supposed to be faster?

    I've tried this on both the M1-native build from OSXExperts (version N-99816-g3da35b7) and the standard ffmpeg from brew, and hstack is just as slow on each.
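
    One possible explanation, sketched below and not taken from the question itself: hstack consumes its inputs in lockstep, so inputs that disagree on frame rate, timestamps, or pixel format can force the filter to buffer heavily. Normalizing both streams before stacking is worth trying; the fps=30 and format=yuv420p values are illustrative assumptions, not read from the actual files.

# Hypothetical normalization before hstack: force a matching frame rate,
# pixel format, and sample aspect ratio on both inputs (values assumed).
ffmpeg -y -i left.mp4 -i right.mp4 -filter_complex "
  [0:v] crop=w=in_w/2,fps=30,format=yuv420p,setsar=1 [croppedLeft];
  [1:v] crop=w=in_w/2,fps=30,format=yuv420p,setsar=1 [croppedRight];
  [croppedLeft][croppedRight] hstack [vout];
  [0:a][1:a] amix [aout]
" -map "[vout]" -map "[aout]" -ac 2 out.mp4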

  • A Docker Ubuntu image with ARM Mali G610 GPU support [closed]

    7 November 2023, by Тимур Тимаев

    I have a docker-compose project which automates the creation of GFX video files used in TV production. The program takes an order from a user, takes screenshots of various web pages, composes these screenshots into an HTML-based animation, and renders the animation to a video file. The user gets a broadcast-ready .MP4 file.

    I currently export the animation from HTML by extracting every single frame as a PNG sequence and then stitching the frames together using FFMPEG, along the lines of the sketch below. However, there's a much faster way to do this: record the HTML animation as it plays in a browser.
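
    The stitch step might look roughly like the following; the frame pattern, frame rate, and output name are illustrative assumptions, not the project's actual values.

# Hypothetical PNG-sequence stitch: read numbered frames at an assumed
# 30 fps and encode them to H.264 for a broadcast-friendly MP4.
ffmpeg -framerate 30 -i frame_%04d.png -c:v libx264 -pix_fmt yuv420p animation.mp4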

    The issue is that the animation is choppy due to the lack of GPU support in a Docker container. I'm using Seleniarm/Chrome and it's not hardware accelerated.

    I'm trying to run this project on an OrangePI 5 Plus SBC, which features the Rockchip RK3588 SoC with an ARM Mali G610 GPU. I have found an Ubuntu image by Joshua Riek which, in my opinion, implements GPU acceleration best; it's very smooth. Joshua adds the SoC's GPU drivers to the Ubuntu image, thus adding GPU support.

    I want to build a Docker image from this repo so I can spin it up, install Chrome and Selenium in the container, play the HTML animation, and record it via FFMPEG's x11grab device, roughly as sketched below.
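
    A minimal sketch of that capture step, assuming an X display at :0.0 and a 1920x1080 desktop; the display number, resolution, frame rate, and duration are illustrative assumptions.

# Hypothetical x11grab capture: record the assumed X display :0.0 for
# 10 seconds and encode to H.264 (all values are placeholders).
ffmpeg -f x11grab -framerate 30 -video_size 1920x1080 -i :0.0 -t 10 -c:v libx264 -pix_fmt yuv420p capture.mp4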

    My problem is that I have little knowledge of building containers myself, let alone tinkering with container drivers. I'm not even sure it's possible at all.

    Can anybody help me or point me in the right direction?

    I have found some links, but I have no clue where to start.
I've looked into Seleniarm, but some discussions mentioned that it uses Xvfb, which cannot be hardware-accelerated by definition.

  • Unable to find suitable output format for 'ffmpeg'

    23 May 2023, by kumarvhat

    I'm trying to use the ffmpeg toolset to stitch a set of images into an .mp4 and am receiving the following error:

    [NULL @ 000001d48d1dcc00] Unable to find a suitable output format for 'ffmpeg'
    ffmpeg: Invalid argument

    The command I'm using is:

    ffmpeg -r 30 -f image2 -s 1920x1080 -i C://MYPATH//back_squat_%03d.png -vcodec libx264 -crf 25 -pix_fmt yuv420p C://MYPATH//back_squat.mp4

    And the full output looks like this:

    ffmpeg version 4.2.3 Copyright (c) 2000-2020 the FFmpeg developers
  built with gcc 9.3.1 (GCC) 20200523
  configuration: --enable-gpl --enable-version3 --enable-sdl2 --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libdav1d --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa --enable-libspeex --enable-libxvid --enable-libaom --enable-libmfx --enable-amf --enable-ffnvcodec --enable-cuvid --enable-d3d11va --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth --enable-libopenmpt
  libavutil      56. 31.100 / 56. 31.100
  libavcodec     58. 54.100 / 58. 54.100
  libavformat    58. 29.100 / 58. 29.100
  libavdevice    58.  8.100 / 58.  8.100
  libavfilter     7. 57.100 /  7. 57.100
  libswscale      5.  5.100 /  5.  5.100
  libswresample   3.  5.100 /  3.  5.100
  libpostproc    55.  5.100 / 55.  5.100
Input #0, image2, from 'C://MYPATH//back_squat_%03d.png':
  Duration: 00:00:02.37, start: 0.000000, bitrate: N/A
    Stream #0:0: Video: png, rgba(pc), 900x1000 [SAR 2835:2835 DAR 9:10], 30 fps, 30 tbr, 30 tbn, 30 tbc
Input #1, image2, from 'C://MYPATH//back_squat_%03d.png':
  Duration: 00:00:02.37, start: 0.000000, bitrate: N/A
    Stream #1:0: Video: png, rgba(pc), 900x1000 [SAR 2835:2835 DAR 9:10], 30 fps, 30 tbr, 30 tbn, 30 tbc
[NULL @ 000002001b8d5780] Unable to find a suitable output format for 'ffmpeg'
ffmpeg: Invalid argument

    What is wrong with the syntax of the command string?
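
    A hedged reading of the log above, not a confirmed diagnosis: the same PNG pattern is opened twice (Input #0 and Input #1), which suggests the token ffmpeg appeared a second time on the command line and was parsed as an output filename, one for which no output format exists. Running the command exactly once should avoid the error:

# The single invocation from the question, run on its own. If the word
# "ffmpeg" is not repeated on the command line, there is no stray output
# filename for the muxer to reject.
ffmpeg -r 30 -f image2 -s 1920x1080 -i C://MYPATH//back_squat_%03d.png -vcodec libx264 -crf 25 -pix_fmt yuv420p C://MYPATH//back_squat.mp4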