Advanced search

Media (0)

Word: - Tags -/clipboard

No media matching your criteria is available on the site.

Other articles (96)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

  • APPENDIX: The plugins used specifically for the farm

    5 March 2010, by

    In order to work properly, the central/master site of the farm needs several plugins in addition to those used by the channels: the Gestion de la mutualisation plugin; the inscription3 plugin, to handle registrations and requests to create a mutualisation instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)

  • Installation in standalone mode

    4 February 2011, by

    Installing the MediaSPIP distribution involves several steps: retrieving the required files (at this point two methods are possible: installing the ZIP archive containing the whole distribution, or fetching the sources of each module separately via SVN); preconfiguration; the final installation.
    [mediaspip_zip]Installing the MediaSPIP ZIP archive
    This installation mode is the simplest way to install the whole distribution (...)

On other sites (5394)

  • How to convert a DNxHD file and interlace at 25fps?

    27 October 2022, by Moritz Bastian

    I need some help transcoding a file with ffmpeg.

    I'm trying to transcode a progressive file to an interlaced DNxHD 185 using different filters.

    The goal is to achieve better interlacing for a couple of problematic shots, e.g. zooms / drone footage.

    Can someone help with the proper command?

    Best,
Moritz
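
    A possible starting point, offered as a sketch rather than a verified answer: assuming the source is (or is first converted to) 1920x1080 at 50 progressive frames per second, and the target is the 8-bit 185 Mbit/s 1080i25 DNxHD flavour, the interlace filter can weave pairs of progressive frames into fields while the ildct/ilme flags tell the encoder to code them as interlaced. input.mov and output.mov are placeholder names here.

    ffmpeg -i input.mov -vf "fps=50,interlace=scan=tff" -c:v dnxhd -b:v 185M -pix_fmt yuv422p -flags +ildct+ilme -c:a pcm_s16le output.mov

    If the footage only exists at 25p there is no second moment in time to put into the other field, so this will look like flagged progressive rather than true interlacing; a motion-interpolated frame-rate conversion (for example minterpolate=fps=50) before the interlace filter is one option to experiment with, at the cost of possible artefacts.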

  • FFMPEG blending screen two libvpx-vp9 webm yuva420p video files comes out looking incorrect

    30 November 2022, by OneWorld

    I'm trying to screen blend two libvpx-vp9 webm files, so that the blend comes out looking correct in FFMPEG. The example below takes two rgba png input files, loops them for a couple of seconds into libvpx-vp9 webm files with the pixel format yuva420p. It then tries to blend them using FFMPEG. I then output frames of these to visualise how it looks here in this Stack Overflow post.

    I have these two input rgba pngs (circle and Pikachu)

    (images: the circle and Pikachu input PNGs, circle_50_rgba.png and pikachu_rgba.png)

    I create two libvpx-vp9 webm files from them like this :-

    ffmpeg -loop 1 -i circle_50_rgba.png -c:v libvpx-vp9 -t 2 -pix_fmt yuva420p circle_libvpx-vp9_yuva420p.webm

ffmpeg -loop 1 -i pikachu_rgba.png -c:v libvpx-vp9 -t 2 -pix_fmt yuva420p pikachu_libvpx-vp9_yuva420p.webm

    I then try and do a blend of these two libvpx-vp9 webm files like this :-

    ffmpeg -y -c:v libvpx-vp9 -i circle_libvpx-vp9_yuva420p.webm -c:v libvpx-vp9 -i pikachu_libvpx-vp9_yuva420p.webm -filter_complex "[1:v][0:v]blend=all_mode=screen" pikachu_reverse_all_mode_screened_onto_circle_both_yuva420p.webm

    and extract a frame from that like this

    ffmpeg  -c:v libvpx-vp9 -i pikachu_reverse_all_mode_screened_onto_circle_both_yuva420p.webm -frames:v 1  pikachu_reverse_all_mode_screened_onto_circle_from_yuva420p.png

    Which looks like this :-
    (image: extracted frame from the blended output)

    If I do this without all_mode, like this

    ffmpeg -y -c:v libvpx-vp9 -i circle_libvpx-vp9_yuva420p.webm -c:v libvpx-vp9 -i pikachu_libvpx-vp9_yuva420p.webm -filter_complex "[1:v][0:v]blend=screen" pikachu_reverse_screened_onto_circle_both_yuva420p.webm

    and then extract the png so we can visualise it, like this :-

    ffmpeg  -c:v libvpx-vp9 -i pikachu_reverse_screened_onto_circle_both_yuva420p.webm -frames:v 1  pikachu_reverse_screened_onto_circle_from_yuva420p.png

    it gives this output :-
    (image: extracted frame from the blended output)

    which is also incorrect because the white part of the circle should be completely white in the screen blend. We shouldn't see a faint yellow outline of Pikachu inside the white part.

    It should look like this :-
    (image: the expected result of the screen blend)

    Here is the full log of this :-

    ffmpeg -y -c:v libvpx-vp9 -i circle_libvpx-vp9_yuva420p.webm -c:v libvpx-vp9 -i pikachu_libvpx-vp9_yuva420p.webm -filter_complex "[1:v][0:v]blend=screen" pikachu_reverse_screened_onto_circle_both_yuva420p.webm
ffmpeg version 4.2.7-0ubuntu0.1 Copyright (c) 2000-2022 the FFmpeg developers
  built with gcc 9 (Ubuntu 9.4.0-1ubuntu1~20.04.1)
  configuration: --prefix=/usr --extra-version=0ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-nvenc --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
  libavutil      56. 31.100 / 56. 31.100
  libavcodec     58. 54.100 / 58. 54.100
  libavformat    58. 29.100 / 58. 29.100
  libavdevice    58.  8.100 / 58.  8.100
  libavfilter     7. 57.100 /  7. 57.100
  libavresample   4.  0.  0 /  4.  0.  0
  libswscale      5.  5.100 /  5.  5.100
  libswresample   3.  5.100 /  3.  5.100
  libpostproc    55.  5.100 / 55.  5.100
[libvpx-vp9 @ 0x55d5b1f34680] v1.8.2
    Last message repeated 1 times
Input #0, matroska,webm, from 'circle_libvpx-vp9_yuva420p.webm':
  Metadata:
    ENCODER         : Lavf58.29.100
  Duration: 00:00:02.00, start: 0.000000, bitrate: 19 kb/s
    Stream #0:0: Video: vp9 (Profile 0), yuva420p(tv), 50x50, SAR 1:1 DAR 1:1, 25 fps, 25 tbr, 1k tbn, 1k tbc (default)
    Metadata:
      alpha_mode      : 1
      ENCODER         : Lavc58.54.100 libvpx-vp9
      DURATION        : 00:00:02.000000000
[libvpx-vp9 @ 0x55d5b1f854c0] v1.8.2
    Last message repeated 1 times
Input #1, matroska,webm, from 'pikachu_libvpx-vp9_yuva420p.webm':
  Metadata:
    ENCODER         : Lavf58.29.100
  Duration: 00:00:02.00, start: 0.000000, bitrate: 29 kb/s
    Stream #1:0: Video: vp9 (Profile 0), yuva420p(tv), 50x50, SAR 1:1 DAR 1:1, 25 fps, 25 tbr, 1k tbn, 1k tbc (default)
    Metadata:
      alpha_mode      : 1
      ENCODER         : Lavc58.54.100 libvpx-vp9
      DURATION        : 00:00:02.000000000
[libvpx-vp9 @ 0x55d5b1f38940] v1.8.2
[libvpx-vp9 @ 0x55d5b1f49440] v1.8.2
Stream mapping:
  Stream #0:0 (libvpx-vp9) -> blend:bottom
  Stream #1:0 (libvpx-vp9) -> blend:top
  blend -> Stream #0:0 (libvpx-vp9)
Press [q] to stop, [?] for help
[libvpx-vp9 @ 0x55d5b1f49440] v1.8.2
[libvpx-vp9 @ 0x55d5b1f38940] v1.8.2
[libvpx-vp9 @ 0x55d5b1f80c40] v1.8.2
Output #0, webm, to 'pikachu_reverse_screened_onto_circle_both_yuva420p.webm':
  Metadata:
    encoder         : Lavf58.29.100
    Stream #0:0: Video: vp9 (libvpx-vp9), yuva420p, 50x50 [SAR 1:1 DAR 1:1], q=-1--1, 200 kb/s, 25 fps, 1k tbn, 25 tbc (default)
    Metadata:
      encoder         : Lavc58.54.100 libvpx-vp9
    Side data:
      cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
frame=   50 fps=0.0 q=0.0 Lsize=       7kB time=00:00:01.96 bitrate=  29.3kbits/s speed=33.2x    
video:4kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 96.711426%

    I also tried doing a conversion to rgba, like this :-

    ffmpeg -y -c:v libvpx-vp9 -i circle_libvpx-vp9_yuva420p.webm -c:v libvpx-vp9 -i pikachu_libvpx-vp9_yuva420p.webm -filter_complex "[0:v]format=pix_fmts=rgba[zero];[1:v]format=pix_fmts=rgba[one];[one][zero]blend=screen" pikachu_reverse_screened_all_mode_onto_circle_after_rgba_conversion_webm.webm

    However, the result of this also comes out with yellow inside the white circle, which should be white.

    I was wondering what I need to do so that the blend of these two webm libvpx-vp9 video files looks correct, like it does above.

    Note: I need to retain the alpha channels, because sometimes assets have transparent alpha channels. In the examples above the assets happen to have opaque alpha channels.
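
    One avenue that might be worth trying, as a sketch rather than a confirmed fix: blend=screen only sets c0_mode, the blend mode of the first plane, and even all_mode=screen applies the screen formula to each Y, U, V and A plane independently, which is not the same operation as a screen blend computed in RGB. Converting both inputs to a planar RGB format that keeps the alpha channel (gbrap), screening there with all_mode, and only then going back to yuva420p for the VP9 encode would look roughly like this (the output file name is just a placeholder):

    ffmpeg -y -c:v libvpx-vp9 -i circle_libvpx-vp9_yuva420p.webm -c:v libvpx-vp9 -i pikachu_libvpx-vp9_yuva420p.webm -filter_complex "[0:v]format=gbrap[zero];[1:v]format=gbrap[one];[one][zero]blend=all_mode=screen,format=yuva420p" -c:v libvpx-vp9 screen_blend_in_rgb.webm

    The limited-range yuva420p(tv) to full-range RGB conversion is another thing to keep an eye on when comparing extracted frames against the original PNGs.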

  • StackOverflowException with Process in C#

    4 June 2015, by user3633222

    I have a process, which runs in a console app. It runs forever.

    After a couple of days the app crashes with a StackOverflowException.

    The essence of the app is that I spin up a Process with FFMpeg.exe and create a screenshot of a video stream. It works very well, but only for a few days at a time.

    I am pretty sure it has to do with the disposal of the FFMpeg process or some internal Process stuff.

    Here is the code

    using ( Process ffmpegProcess = new Process() ) {

       //arguments for running ffmpeg
       ffmpegProcess.StartInfo.UseShellExecute = false;
       ffmpegProcess.StartInfo.CreateNoWindow = true;
       ffmpegProcess.StartInfo.RedirectStandardOutput = true;

       //specific for our screenshots
       ffmpegProcess.StartInfo.FileName = string.Concat( Environment.CurrentDirectory, Path.DirectorySeparatorChar, ffmpegProgramName );

       try {
           //todo: log this stopwatch somewhere perhaps
           processWatch.Start();

           //set arguments every time we want to create a new screen shot
           ffmpegProcess.StartInfo.Arguments = string.Format( @"-y -i {0}{1} -threads 0 -ss 00:00:01.000 -f image2 -s 620x349 -vframes 1 ../../web/{2}.jpg", server, streamPath, slug );
           ffmpegProcess.Start();
           ffmpegProcess.WaitForExit( 500 );

           Console.WriteLine( slug );
           Console.WriteLine( processWatch.Elapsed );

           processWatch.Reset();
           runCount++;
           cacheIndexer++;

           //lets see how many spins we've had!
           Console.WriteLine( string.Format( "SERVER CACHE INDEX : {0}", cacheIndexer ) );
           Console.WriteLine( string.Format( "RUN : {0}", runCount ) );
           Console.WriteLine( Environment.NewLine );

       } catch ( Exception ex ) {
           //Console.WriteLine( "Ex " + ex );
       }
    }

    The loop looks like this.

       public void RecurseTask() {
           /*
            You can try one of these, but you will see CPU usage go crazy and perhaps concurrency errors due to IO blocking

           Parallel.ForEach( _videoStreamSlugs, ( e ) => _videoStreamScreenShots.GrabScreenShot( e ) );

           foreach ( var slug in _videoStreamSlugs ) {
               Task.Run( () => _videoStreamScreenShots.GrabScreenShot( slug ) );
           }
           */

            //we want to grab screen shots for every slug in our slug list!
           foreach ( var slug in _videoStreamSlugs ) {
               _videoStreamScreenShots.GrabScreenShot( slug );
           }

           //sleep for a little while
           Thread.Sleep( _recurseInterval );

           //A heavy clean up!
            //We do this, trying to avoid a StackOverflowException in the recursive method
           //Please inspect this if problems arise
           GC.Collect();

           //lets grab over again
           RecurseTask();
       }

    I added a GC.Collect out of curiosity to see if it made a difference.

    I am not running this as a Windows Service.
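
    For what it is worth, a hedged observation rather than a confirmed diagnosis: RecurseTask ends by calling itself and never returns, so every pass leaves another frame on the stack, the runtime does not guarantee tail-call elimination for C#, and GC.Collect only reclaims heap objects, not stack frames. A sketch of the same behaviour written as a plain loop (RunForever is just a placeholder name; the fields are the ones from the question):

    public void RunForever() {
        while ( true ) {
            //grab a screen shot for every slug in our slug list
            foreach ( var slug in _videoStreamSlugs ) {
                _videoStreamScreenShots.GrabScreenShot( slug );
            }

            //sleep for a little while before the next pass
            Thread.Sleep( _recurseInterval );
        }
    }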