Advanced search

Media (0)

No media matching your criteria is available on the site.

Other articles (38)

  • Authorizations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to modify their information on the authors page

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer flash player is used as a fallback.
    The HTML5 player was created specifically for MediaSPIP: its appearance is fully customisable to match a chosen theme.
    These technologies make it possible to deliver video and sound both to conventional computers (...)

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers the Flowplayer flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (6111)

  • Place two videos side by side using ffmpeg and download it as one file?

    5 August 2019, by marceloo1223

    I have two video files that I want to play side by side and then download as a single file. I used FFmpeg to merge them into one:

    protected void combinetwovideo()
    {
       string strParam;
       string Path_FFMPEG = Path.Combine(HttpContext.Current.Server.MapPath(ConfigurationManager.AppSettings["FFMpegPath"]));
       string apppath = HttpRuntime.AppDomainAppPath;
       // Merging two videos
       string video1 = apppath + "\\recordings\\client2019-08-03 02_23_59";
       string video2 = apppath + "\\userrecord\\User2019-08-03 02_24_00";
       string strResult = apppath + "\\RESULT\\";
       strParam = string.Format("-i ('" + video1 + "') -i ('" + video2 + "') -filter_complex \'[0:v]pad=iw*2:ih[int];[int][1:v]overlay=W/2:0[vid]' /-map [vid] -c:v libx264 -crf 23 -preset veryfast output.mp4");
       process(Path_FFMPEG, strParam);
    }
    public void process(string Path_FFMPEG, string strParam)
    {
       try
       {
           Process ffmpeg = new Process();
           ProcessStartInfo ffmpeg_StartInfo = new ProcessStartInfo(Path_FFMPEG, strParam);
           ffmpeg_StartInfo.UseShellExecute = false;
           ffmpeg_StartInfo.RedirectStandardError = true;
           ffmpeg_StartInfo.RedirectStandardOutput = true;
           ffmpeg.StartInfo = ffmpeg_StartInfo;
           ffmpeg_StartInfo.CreateNoWindow = true;
           ffmpeg.EnableRaisingEvents = true;
           ffmpeg.Start();
           ffmpeg.WaitForExit();
           ffmpeg.Close();
           ffmpeg.Dispose();
           ffmpeg = null;
       }
       catch (Exception ex)
       {
       }
    }

    But this method produces no output and does not throw any errors. I'm confused about the strParam string. Did I write it wrong, or am I missing something? Any help would be appreciated.
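
    A few things stand out in the quoted code: strParam wraps each input path in ('…'), leaves the quoting around the filter graph unbalanced, and puts a stray / in front of -map, so ffmpeg never receives a valid command line; strResult is built but never used, so output.mp4 would land in the process working directory; and stderr/stdout are redirected but never read while the empty catch hides any startup error. For comparison only, here is a minimal sketch of the intended invocation written as a Java ProcessBuilder call with the arguments passed as a list rather than one formatted string; the file names are placeholders and ffmpeg is assumed to be on the PATH.

    import java.io.IOException;
    import java.util.Arrays;
    import java.util.List;

    public class SideBySide {
        public static void main(String[] args) throws IOException, InterruptedException {
            // Placeholder paths -- substitute the real recordings and output file.
            String left = "client.mp4";
            String right = "user.mp4";
            String out = "output.mp4";

            // Double the canvas width with pad, then overlay the second input on
            // the right half. Passing each token separately avoids any shell
            // quoting around the filter graph.
            List<String> cmd = Arrays.asList(
                "ffmpeg", "-y",
                "-i", left,
                "-i", right,
                "-filter_complex", "[0:v]pad=iw*2:ih[int];[int][1:v]overlay=W/2:0[vid]",
                "-map", "[vid]",
                "-c:v", "libx264", "-crf", "23", "-preset", "veryfast",
                out);

            Process p = new ProcessBuilder(cmd).inheritIO().start();
            int exit = p.waitFor();
            System.out.println("ffmpeg exited with " + exit);
        }
    }

    In the original C#, the equivalent change is to drop the parentheses around the inputs, the stray / before -map, and the unbalanced quotes; since the filter graph contains no spaces it does not even need quoting inside strParam.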

  • How to read percentage from ffmpeg command in Java?

    23 July 2019, by Prasab R

    I am trying to convert a video file to a specific format by executing an ffmpeg command. While it runs I want to compute the progress percentage by matching the time pattern in ffmpeg's output, but so far I have not managed to.

    I have tried the code below. Specifically, I am getting null in the while loop condition.

    import java.io.*;
    import java.util.Scanner;
    import java.util.regex.Pattern;

    class Test {
     public static void main(String[] args) throws IOException {
       ProcessBuilder pb = new ProcessBuilder("ffmpeg","-i","in.webm","out.mp4");
       final Process p = pb.start();

       new Thread() {
         public void run() {

           Scanner sc = new Scanner(p.getErrorStream());

           // Find duration
           Pattern durPattern = Pattern.compile("(?<=Duration: )[^,]*");
           String dur = sc.findWithinHorizon(durPattern, 0);
           if (dur == null)
             throw new RuntimeException("Could not parse duration.");
           String[] hms = dur.split(":");
           double totalSecs = Integer.parseInt(hms[0]) * 3600
                            + Integer.parseInt(hms[1]) *   60
                            + Double.parseDouble(hms[2]);
           System.out.println("Total duration: " + totalSecs + " seconds.");

           // Find time as long as possible.
           Pattern timePattern = Pattern.compile("(?<=time=)[\\d.]*");
           String match;
           while (null != (match = sc.findWithinHorizon(timePattern, 0))) {
             double progress = Double.parseDouble(match) / totalSecs;
             System.out.printf("Progress: %.2f%%%n", progress * 100);
           }
         }
       }.start();

     }
    }

    I am expecting a value in the while condition, but it comes back as null.
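
    A null from findWithinHorizon means no time= line was seen at all, which usually indicates ffmpeg printed the input details and then stopped before encoding anything, so the rest of the stderr output is worth inspecting (adding -y also avoids a possible overwrite prompt for an existing out.mp4). Note too that with a current ffmpeg the pattern would only capture the hours field of time=HH:MM:SS.xx. One alternative that avoids scraping the log is ffmpeg's -progress option, which emits machine-readable key=value lines (out_time, progress, ...) to a target such as pipe:1. Below is a minimal sketch of that approach, assuming the total duration is already known (260.0 is a placeholder) and reusing the in.webm/out.mp4 names from the question.

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;

    public class FfmpegProgress {
        public static void main(String[] args) throws IOException, InterruptedException {
            // Placeholder: total duration in seconds, e.g. parsed beforehand
            // from the "Duration:" line on stderr or with ffprobe.
            double totalSecs = 260.0;

            // -progress pipe:1 prints key=value progress lines on stdout;
            // -nostats silences the periodic stats line on stderr.
            ProcessBuilder pb = new ProcessBuilder(
                "ffmpeg", "-y", "-i", "in.webm", "-progress", "pipe:1", "-nostats", "out.mp4");
            // Let the normal ffmpeg log go to the console so its pipe never fills up.
            pb.redirectError(ProcessBuilder.Redirect.INHERIT);
            Process p = pb.start();

            try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
                String line;
                while ((line = r.readLine()) != null) {
                    // out_time has the form HH:MM:SS.microseconds
                    if (line.startsWith("out_time=") && !line.endsWith("N/A")) {
                        String[] hms = line.substring("out_time=".length()).split(":");
                        double secs = Integer.parseInt(hms[0]) * 3600
                                    + Integer.parseInt(hms[1]) * 60
                                    + Double.parseDouble(hms[2]);
                        System.out.printf("Progress: %.2f%%%n", 100.0 * secs / totalSecs);
                    }
                }
            }
            p.waitFor();
        }
    }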

  • How to fix the "FFmpeg cannot edit existing files in-place." problem in Java

    7 July 2019, by Suione

    I am trying to write an audio file into an existing video file. Unfortunately I get the log message that FFmpeg cannot edit existing files in-place.

    I have found some threads about the so-called muxing of video and audio, but none of them really helped me. Maybe someone here can.
    Also, this isn't my code; I am only the one who has to use it, so I have to fix it.

    Thanks for your help

    String[] args = {
            FFMPEG_FILE.getAbsolutePath(), "-y",
            "-i", videoPath,
            "-ss", String.valueOf(-offset.toFrames() / 50.0f),
            "-i", audioPath,
            "-c", "copy",
            "-map", "0:v", "-map", "1:a",
            "-bsf:v", "h264_mp4toannexb",
            "-r", String.valueOf(project.getFramerate()),
            "-shortest", outputPath };

    ProcessBuilder pb = new ProcessBuilder(args).inheritIO();
    Process p = pb.start();
    int exit = p.waitFor();
    if (exit != 0) {
        throw new IOException("ffmpeg terminated with a failure");
    }

    EDIT: As someone commented, outputPath was the same as videoPath. I switched it to a different path so this problem would not appear again, but I still have no sound in my output videos. Here is the log information; maybe you can read something from it.

    INFO: start ffmpeg: [C:\Program Files\ffmpeg\bin\ffmpeg.exe, -y, -i, C:\Users\USERNAME\Desktop\Test_Workflow\1.mkv, -ss, 0.0, -i, C:\Users\USERNAME\Desktop\Test\resources/videos/P1000501.MP4\audio.ogg, -c, copy, -map, 0:v, -map, 1:a, -bsf:v, h264_mp4toannexb, -r, 25, -shortest, C:\Users\USERNAME\Desktop\Test\testoutput.mp4]
    ffmpeg version N-92087-gdcbd89e000 Copyright (c) 2000-2018 the FFmpeg developers
     built with gcc 8.2.1 (GCC) 20180813
     configuration: --disable-static --enable-shared --enable-gpl --enable-version3 --enable-sdl2 --enable-fontconfig --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libtheora --enable-libtwolame --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libzimg --enable-lzma --enable-zlib --enable-gmp --enable-libvidstab --enable-libvorbis --enable-libvo-amrwbenc --enable-libmysofa --enable-libspeex --enable-libxvid --enable-libaom --enable-libmfx --enable-amf --enable-ffnvcodec --enable-cuvid --enable-d3d11va --enable-nvenc --enable-nvdec --enable-dxva2 --enable-avisynth
     libavutil      56. 19.101 / 56. 19.101
     libavcodec     58. 31.102 / 58. 31.102
     libavformat    58. 18.104 / 58. 18.104
     libavdevice    58.  4.105 / 58.  4.105
     libavfilter     7. 33.100 /  7. 33.100
     libswscale      5.  2.100 /  5.  2.100
     libswresample   3.  2.100 /  3.  2.100
     libpostproc    55.  2.100 / 55.  2.100
    Input #0, matroska,webm, from 'C:\Users\USERNAME\Desktop\Test_Workflow\1.mkv':
     Metadata:
       ENCODER         : Lavf56.4.101
     Duration: 00:00:04.60, start: 0.080000, bitrate: 1377 kb/s
       Stream #0:0: Video: h264 (High), yuv420p(progressive), 1920x1080, SAR 1:1 DAR 16:9, 25 fps, 25 tbr, 1k tbn, 50 tbc (default)
    Input #1, ogg, from 'C:\Users\USERNAME\Desktop\Test\resources/videos/P1000501.MP4\audio.ogg':
     Duration: 00:04:20.65, start: 0.000000, bitrate: 86 kb/s
       Stream #1:0(und): Audio: vorbis, 48000 Hz, stereo, fltp, 112 kb/s
       Metadata:
         CREATION_TIME   : 2018-11-09T13:20:07.000000Z
         LANGUAGE        : und
         ENCODER         : Lavc58.31.102 libvorbis
         MAJOR_BRAND     : mp42
         MINOR_VERSION   : 1
         COMPATIBLE_BRANDS: mp42avc1
    [mp4 @ 000001e67c0a00c0] track 1: codec frame size is not set
    Output #0, mp4, to 'C:\Users\USERNAME\Desktop\Test\testoutput.mp4':
     Metadata:
       encoder         : Lavf58.18.104
       Stream #0:0: Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 1920x1080 [SAR 1:1 DAR 16:9], q=2-31, 25 fps, 25 tbr, 12800 tbn, 25 tbc (default)
       Stream #0:1(und): Audio: vorbis (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 112 kb/s
       Metadata:
         CREATION_TIME   : 2018-11-09T13:20:07.000000Z
         LANGUAGE        : und
         ENCODER         : Lavc58.31.102 libvorbis
         MAJOR_BRAND     : mp42
         MINOR_VERSION   : 1
         COMPATIBLE_BRANDS: mp42avc1
    Stream mapping:
     Stream #0:0 -> #0:0 (copy)
     Stream #1:0 -> #0:1 (copy)
    Press [q] to stop, [?] for help
    frame=  113 fps=0.0 q=-1.0 Lsize=     824kB time=00:00:04.41 bitrate=1530.8kbits/s speed= 634x    
    video:772kB audio:43kB subtitle:0kB other streams:0kB global headers:4kB muxing overhead: 1.092082%
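
    One thing the log itself shows is that the Vorbis audio is stream-copied into the MP4 container (Stream #0:1 ... vorbis (mp4a / 0x6134706D)), which is also what the "codec frame size is not set" warning relates to. Vorbis inside MP4 is something many players will not decode, so the file can look fine to ffmpeg yet play silently. A hedged variant of the same call that re-encodes the audio to AAC instead of copying it, and drops the h264_mp4toannexb bitstream filter (it targets Annex B outputs such as MPEG-TS rather than MP4), may be worth trying; the -ss offset and -r options from the original can be added back unchanged, and the paths below are placeholders.

    import java.io.IOException;

    public class MuxAudio {
        public static void main(String[] args) throws IOException, InterruptedException {
            // Placeholder paths -- substitute the real files.
            String videoPath = "1.mkv";
            String audioPath = "audio.ogg";
            String outputPath = "testoutput.mp4";

            String[] cmd = {
                "ffmpeg", "-y",
                "-i", videoPath,
                "-i", audioPath,
                "-map", "0:v", "-map", "1:a",
                "-c:v", "copy",   // keep the H.264 video untouched
                "-c:a", "aac",    // re-encode Vorbis to AAC so MP4 players can decode it
                "-shortest",
                outputPath
            };

            Process p = new ProcessBuilder(cmd).inheritIO().start();
            int exit = p.waitFor();
            if (exit != 0) {
                throw new IOException("ffmpeg terminated with a failure");
            }
        }
    }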