Advanced search

Media (2)

Keyword: - Tags - / documentation

Other articles (31)

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects or individuals; rapid deployment of multiple unique sites; and the creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • Customise by adding your logo, banner or background image

    5 September 2013

    Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.

  • Permissions overridden by plugins

    27 April 2010

    MediaSPIP core
    autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page

On other sites (5660)

  • How to determine when FFmpeg process completes

    18 April 2017, by ajbeaven
    var origFilePath = "C:/MyOriginalFile.webm";
    var processedFilePath = "C:/MyProcessedFile.webm";

    RunFfmpeg($"-i \"{origFilePath}\" -af \"silenceremove=1:0.1:0.001, areverse, silenceremove=1:0.1:0.001, areverse\" \"{processedFilePath}\" -y");

    // fails with an IOException as the file has presumably not been released by FFmpeg
    System.IO.File.Delete(origFilePath);

    When the file is deleted, the following exception is frequently (maybe 80% of the time) thrown:

    IOException: The process cannot access the file 'C:\MyOriginalFile.webm' because it is being used by another process.

    The call to create and run the FFmpeg process goes like this:

    private List<string> RunFfmpeg(string arguments)
    {
       using (var process = new Process())
       {
           process.StartInfo.RedirectStandardOutput = true;
           process.StartInfo.RedirectStandardError = true;
           process.StartInfo.FileName = _hostingEnvironment.ContentRootPath + _settings.FfmpegPath;
           process.StartInfo.Arguments = arguments;
           process.StartInfo.UseShellExecute = false;
           process.StartInfo.CreateNoWindow = true;

           // ffmpeg only uses stderr for its output
           var output = new List<string>();
           process.ErrorDataReceived += new DataReceivedEventHandler((s, e) => {
               if (e.Data != null)
                   output.Add(e.Data);
           });

           process.Start();
           process.BeginErrorReadLine();
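           // note: the parameterless WaitForExit() overload also waits for the
           // asynchronous stderr read started above to complete before returning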
           process.WaitForExit();
           return output;
       }
    }

    It appears that even though the process has exited by the time the file is deleted, FFmpeg is still processing the file or has otherwise not released its handle on it.

    Is this behaviour expected? How do I go about ensuring that FFmpeg has finished processing the files before continuing?

  • ffmpeg determine frame number for fade filter

    16 June 2016, by Gert Gottschalk

    I am looking for an easy way to set the time points for the fade filter, particularly for the fade-out at the end. A time-based format would be best. I understand the fade filter works on frame numbers, but is there a way to switch that to timestamps? In particular, I have a hard time getting the number of the last frame. Some means to tell the fade filter to start the fade 0.5 sec before the end would be awesome. Maybe something like:

    -filter:v 'fade=out:-0.5:0.3'

    Read: 'start the fade-out 0.5 sec before the end and have the fade take 0.3 sec', i.e. leave 0.2 sec of black at the end.

    I would also be OK with doing this in frame counts.

    My grief right now is that the frame count reported by ffprobe, for example, seems to be roughly half of what it really is. A fade filter applied using the count provided by ffprobe turns my video to black about half-way through the clip. I'm not sure what I'm missing there.

    Again, my primary question is: how do I determine the correct frame number for the fade-out filter?

    Thanks,
    Gert
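
    One way to sidestep frame counting entirely is to work in seconds: read the clip duration with ffprobe and pass a time-based start to the fade filter. The following is only a sketch, assuming a build of ffmpeg whose fade filter supports the start_time (st) and duration (d) options, that bc is available, and using input.mp4 / output.mp4 as placeholder file names:

    # read the duration (in seconds) from the container metadata
    DUR=$(ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 input.mp4)
    # start the fade-out 0.5 s before the end and fade for 0.3 s
    ST=$(echo "$DUR - 0.5" | bc)
    ffmpeg -i input.mp4 -filter:v "fade=t=out:st=${ST}:d=0.3" -c:a copy output.mp4

    This leaves roughly 0.2 s of black at the end, matching the arithmetic above.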

  • ffmpeg: how to determine frame rate automatically?

    25 December 2016, by mrgloom

    I use this simple script to convert a video to images using ffmpeg, but the frame rate is fixed; how can I determine it automatically? (See the sketch after the script below.)

    FRAME_RATE="30"
    SEPARATOR='/'


    VIDEO_PATH=$1

    VIDEO_BASE_DIR=`dirname $1`
    FRAMES_DIR=$VIDEO_BASE_DIR$SEPARATOR"Frames"
    rm -rf $FRAMES_DIR
    mkdir $FRAMES_DIR

    #Convert video to images
    ./ffmpeg -r $FRAME_RATE -i $VIDEO_PATH $FRAMES_DIR$SEPARATOR"image%d.png"
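
    One option (just a sketch, assuming ffprobe sits next to ffmpeg in the same directory) is to ask ffprobe for the video stream's rate instead of hard-coding it; note that simply omitting -r also makes ffmpeg follow the input frame rate:

    # r_frame_rate is reported as a fraction such as 30/1 or 30000/1001,
    # which ffmpeg's -r option accepts directly
    FRAME_RATE=$(./ffprobe -v error -select_streams v:0 -show_entries stream=r_frame_rate -of default=noprint_wrappers=1:nokey=1 "$VIDEO_PATH")
    ./ffmpeg -r $FRAME_RATE -i $VIDEO_PATH $FRAMES_DIR$SEPARATOR"image%d.png"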

    UPDATE:

    Using ffprobe I checked that my first video's frame rate is 30.
    Also, the results are the same (339 frames are produced) even if I reduce the frame rate, so does the -r option not work, or does it work in some other way? (See the note after the output log below.)

    These commands give the same result:

    ./ffmpeg -r 10 -i $VIDEO_PATH $FRAMES_DIR$SEPARATOR"image%d"$EXTENSION
    ./ffmpeg -r 30 -i $VIDEO_PATH $FRAMES_DIR$SEPARATOR"image%d"$EXTENSION
    ./ffmpeg -i $VIDEO_PATH $FRAMES_DIR$SEPARATOR"image%d"$EXTENSION

    Output:

    ffmpeg version N-63893-gc69defd Copyright (c) 2000-2014 the FFmpeg developers
     built on Jul 16 2014 05:38:01 with gcc 4.6 (Debian 4.6.3-1)
     configuration: --prefix=/root/ffmpeg-static/64bit --extra-cflags='-I/root/ffmpeg-static/64bit/include -static' --extra-ldflags='-L/root/ffmpeg-static/64bit/lib -static' --extra-libs='-lxml2 -lexpat -lfreetype' --enable-static --disable-shared --disable-ffserver --disable-doc --enable-bzlib --enable-zlib --enable-postproc --enable-runtime-cpudetect --enable-libx264 --enable-gpl --enable-libtheora --enable-libvorbis --enable-libmp3lame --enable-gray --enable-libass --enable-libfreetype --enable-libopenjpeg --enable-libspeex --enable-libvo-aacenc --enable-libvo-amrwbenc --enable-version3 --enable-libvpx
     libavutil      52. 89.100 / 52. 89.100
     libavcodec     55. 66.101 / 55. 66.101
     libavformat    55. 43.100 / 55. 43.100
     libavdevice    55. 13.101 / 55. 13.101
     libavfilter     4.  8.100 /  4.  8.100
     libswscale      2.  6.100 /  2.  6.100
     libswresample   0. 19.100 /  0. 19.100
     libpostproc    52.  3.100 / 52.  3.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/home/user/myvideo1.mp4':
     Metadata:
       major_brand     : mp42
       minor_version   : 0
       compatible_brands: isommp42
       creation_time   : 2016-01-16 05:30:03
     Duration: 00:00:11.33, start: 0.000000, bitrate: 4659 kb/s
       Stream #0:0(eng): Video: h264 (High) (avc1 / 0x31637661), yuv420p, 640x480, 4539 kb/s, SAR 65536:65536 DAR 4:3, 30 fps, 30 tbr, 90k tbn, 180k tbc (default)
       Metadata:
         rotate          : 90
         creation_time   : 2016-01-16 05:30:03
         handler_name    : VideoHandle
       Stream #0:1(eng): Audio: aac (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 127 kb/s (default)
       Metadata:
         creation_time   : 2016-01-16 05:30:03
         handler_name    : SoundHandle
    Output #0, image2, to '/home/user/Frames/image%d.png':
     Metadata:
       major_brand     : mp42
       minor_version   : 0
       compatible_brands: isommp42
       encoder         : Lavf55.43.100
       Stream #0:0(eng): Video: png, rgb24, 640x480 [SAR 1:1 DAR 4:3], q=2-31, 200 kb/s, 30 fps, 90k tbn, 30 tbc (default)
       Metadata:
         rotate          : 90
         creation_time   : 2016-01-12 05:38:03
         handler_name    : VideoHandle
         encoder         : Lavc55.66.101 png
    Stream mapping:
     Stream #0:0 -> #0:0 (h264 (native) -> png (png))
    Press [q] to stop, [?] for help
    frame=  339 fps= 68 q=0.0 Lsize=N/A time=00:00:11.30 bitrate=N/A    
    video:195852kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
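
    A note on the UPDATE above (an assumption about why the count stays at 339): when -r is placed before -i it acts as an input option and only re-stamps the input timestamps, so every decoded frame is still written out; placed after -i it acts as an output option and drops or duplicates frames to reach the requested rate. A hypothetical variant of the same command with -r on the output side:

    # with -r as an output option, an ~11.3 s, 30 fps clip should yield roughly 113 images
    ./ffmpeg -i $VIDEO_PATH -r 10 $FRAMES_DIR$SEPARATOR"image%d.png"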