Other articles (35)

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is running version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out.

  • Encoding and processing into web-friendly formats

    13 April 2011

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded to MP4, OGV and WebM (supported by HTML5) and to MP4 (supported by Flash).
    Audio files are encoded to MP3 and Ogg (supported by HTML5) and to MP3 (supported by Flash).
    Where possible, text is analyzed to extract the data needed for search-engine indexing, and is then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)

  • Accepted formats

    28 January 2010

    The following commands give information about the formats and codecs supported by the local ffmpeg installation:
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it simply highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 Part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    To begin with, we (...)

On other sites (6890)

  • Shaking/trembling in video slideshow generated by frames from an image

    18 May 2017, by raziel

    I have a PHP program which is used to generate a video slideshow from a series of images. Basically, I just need to smoothly ‘move’ from one image area to another according to the specified top/left coordinates and width/height of each area. To make the movement smooth, I use easing functions when calculating the coordinates for each video frame. I generate a JPEG frame image from these calculations using PHP’s Imagick library, then combine all the generated frames into a single video using an ffmpeg command.

    <?php
    const TEMP_FRAMES_DIR = __DIR__;
    const VIDEO_WIDTH = 1080;
    const VIDEO_HEIGHT = 720;
    const FPS = 30;
    const MOVEMENT_DURATION_SECONDS = 3;

    const IMAGE_PATH = __DIR__ . '/test_image.png';

    $start_coords = [
       'x' => 100,
       'y' => 100,
       'width'  => 480,
       'height' => 270
    ];
    $end_coords = [
       'x' => 400,
       'y' => 200,
       'width'  => 480,
       'height' => 270
    ];

    $timeline = make_timeline($start_coords, $end_coords);
    render_frames(IMAGE_PATH, $timeline);
    render_video_from_frames();

    function make_timeline($start_coords, $end_coords) {
       $timeline = [];

       $total_frames = MOVEMENT_DURATION_SECONDS * FPS;

       $x_change      = $end_coords['x']      - $start_coords['x'];
       $y_change      = $end_coords['y']      - $start_coords['y'];
       $width_change  = $end_coords['width']  - $start_coords['width'];
       $height_change = $end_coords['height'] - $start_coords['height'];

       for ($i = 0; $i < $total_frames; $i++) {
           $timeline[$i] = [
               'x'      => easingOutExpo($i, $start_coords['x'], $x_change, $total_frames),
               'y'      => easingOutExpo($i, $start_coords['y'], $y_change, $total_frames),
               'width'  => easingOutExpo($i, $start_coords['width'], $width_change, $total_frames),
               'height' => easingOutExpo($i, $start_coords['height'], $height_change, $total_frames)
           ];
       }
       return $timeline;
    }

    function render_frames($image_path, $timeline) {
       $image = new Imagick($image_path);
       //remove frames from the previous render
       array_map('unlink', glob( TEMP_FRAMES_DIR . "/frame*" ));

       foreach ($timeline as $frame_number => $frame) {
           $frame_img = clone $image;
           $frame_img->cropImage($frame["width"],$frame["height"], $frame["x"],$frame["y"]);
           $frame_img->resizeImage(VIDEO_WIDTH, VIDEO_HEIGHT, Imagick::FILTER_LANCZOS, 0.9);
           $frame_img->writeImage(TEMP_FRAMES_DIR. "/frame$frame_number.jpg");
       }
    }

    function render_video_from_frames() {
       $fps = FPS;
       $frames_dir = TEMP_FRAMES_DIR;
       $SEP = DIRECTORY_SEPARATOR;

       $video_file = $frames_dir. $SEP . 'video.mp4';

       if (file_exists($video_file)) unlink($video_file);

       system("ffmpeg -framerate $fps -i $frames_dir{$SEP}frame%01d.jpg $video_file");
    }

    // Exponential ease-out: $t = current frame index, $b = start value,
    // $c = total change over the movement, $d = total number of frames.
    function easingOutExpo($t, $b, $c, $d) {
       return $c * ( -pow( 2, -10 * $t/$d ) + 1 ) + $b;
    }

    The problem is that there is annoying shaking/trembling when the movement is slow (for example towards the end of the ease-out expo function).
    Here you can get the test video showing the problem, the test image that was used, and the PHP script:
    https://drive.google.com/drive/u/1/folders/0B9FOrF6IlWaGeHJCS1h6djhVZ28

    You can see this shaking starting from the middle of the test video (1.5 sec).

    How can I avoid shaking in this kind of situation? Thanks in advance!

  • Releasing GME Players and Tools

    22 May 2012, by Multimedia Mike — General, alsa, github, gme, pulseaudio, Python, sdl

    I just can’t stop living in the past. To that end, I’ve been playing around with the Game Music Emu (GME) library again. This is a software library that plays an impressive variety of special music files extracted from old video games.

    I have just posted a series of GME tools and associated utilities up on Github.

    Clone the repo and try them out. The repo includes a small test corpus since one of the most tedious parts about playing these files tends to be tracking them down in the first place.

    Players
    At first, I started with trying to write some simple command line audio output programs based on GME. GME has to be the simplest software library that it has ever been my pleasure to code against. All it took was a quick read through the gme.h header file and it was immediately obvious how to write a simple program.

    First, I wrote a command line tool that output audio through PulseAudio on Linux. Then I made a second program that used ALSA. Guess what I learned through this exercise? PulseAudio is actually far easier to program than ALSA.

    I also created an SDL player, seen in my last post regarding how to write an oscilloscope. I think I have the A/V sync correct now. It’s a little more fun to use than the command line tools. It also works on non-Linux platforms (tested at least on Mac OS X).

    Utilities
    I also wrote some utilities. I’m interested in exporting metadata from these rather opaque game music files in order to make them a bit more accessible. To that end, I wrote gme2json, a program that uses the GME library to fetch data from a game music file and then print it out in JSON format. This makes it trivial to extract the data from a large corpus of game music files and work with it in many higher level languages.

    Finally, I wrote a few utilities that repack certain ad-hoc community-supported game music archives into... well, an ad-hoc game music archive of my own devising. Perhaps it’s a bit NIH syndrome, but I don’t think certain of these ad-hoc community formats were very well thought out, or perhaps they made sense a decade or more ago. I guess I’m trying to bring a bit of innovation to this archival process.

    Endgame
    I haven’t given up on that SaltyGME idea (playing these game music files directly in a Google Chrome web browser). All of this ancillary work is leading up to that goal.

    Silly? Perhaps. But I still think it would be really neat to be able to easily browse and play these songs, and make them accessible to a broader audience.

  • How to make video from images using Java + x264; cross-platform solution required

    19 October 2014, by Shashank Tulsyan

    I have made software which records my entire day into a video.
    Example video: https://www.youtube.com/watch?v=ITZYMMcubdw (Note: >16 hrs compressed into 2 mins; the playback speed is very high and might cause epilepsy :P)

    The approach I use right now is Avisynth + x264 + Java.
    This is very, very efficient. The video for the entire day is created in 3-4 minutes and is only 40-50 MB in size. This is perfect; the only issue is that this solution is not cross-platform.
    Does anyone have a better idea?

    I tried using Java-based x264 libraries, but:

    1. They are slow as hell
    2. The video output size is too big
    3. The video quality is not satisfactory.

    Some websites suggest a command such as:

    x264.exe --crf 18 --fps 24 --input-res 1920x1080 --input-csp rgb -o "T:\crf18.mkv" "T:\___BBB\big_buck_bunny_%05d.png"

    There are 2 problems with this approach.

    1. As far as I know, x264 does accept image sequence as input, ffmpeg does
    2. The input images are not named in sequence, such as image01.png, image02.png, etc. They are named as timestamp_as_longinteger.png. So in order to allow x264 to accept these images as input, I have to rename all of them (I make a symbolic link for each image in a new folder). This approach is again unsatisfactory, because I need more flexibility in selecting/deselecting the files that get converted to a video. Right now my approach is a hack.

    The best solution is x264, but I am not sure how I can send it an image sequence from Java, especially when the images are not named in sequential fashion.
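
    If the only obstacle is the file naming, one cross-platform workaround is to decode the PNG files in Java and pipe the raw RGB pixels straight into the x264 executable over stdin, exactly as the adapted solution further down does with live screen captures; x264 then never sees the file names at all. The following is only an illustrative sketch, not the author's code: the frames folder, the "x264" binary being on the PATH, and the assumption that every frame shares one resolution are all hypothetical.

    package weeklyvideomaker;

    import java.awt.image.BufferedImage;
    import java.io.File;
    import java.io.OutputStream;
    import java.util.Arrays;
    import java.util.Comparator;
    import javax.imageio.ImageIO;

    // Hypothetical sketch: stream arbitrarily named PNG frames to the x264 CLI as raw RGB.
    public class ImageSequenceToX264 {
       public static void main(String[] args) throws Exception {
           // Assumption: a folder of frames named timestamp_as_longinteger.png
           File dir = new File("frames");
           File[] frames = dir.listFiles((d, name) -> name.endsWith(".png"));
           // Order the frames by the numeric timestamp embedded in the file name.
           Arrays.sort(frames, Comparator.comparingLong(
                   f -> Long.parseLong(f.getName().replace(".png", ""))));

           // Use the first frame to discover the resolution; every frame must match it.
           BufferedImage first = ImageIO.read(frames[0]);
           int width = first.getWidth(), height = first.getHeight();

           // Flags mirror the adapted solution below: packed RGB frames read from stdin ("-").
           ProcessBuilder pb = new ProcessBuilder(
                   "x264",
                   "--input-csp", "rgb",
                   "--input-res", width + "x" + height,
                   "--fps", "24",
                   "--output", "day.264",
                   "-");
           pb.redirectOutput(ProcessBuilder.Redirect.INHERIT);
           pb.redirectError(ProcessBuilder.Redirect.INHERIT);
           Process x264 = pb.start();

           try (OutputStream toEncoder = x264.getOutputStream()) {
               byte[] rgb = new byte[width * height * 3];
               for (File frame : frames) {
                   BufferedImage img = ImageIO.read(frame);
                   int i = 0;
                   for (int y = 0; y < height; y++) {
                       for (int x = 0; x < width; x++) {
                           int argb = img.getRGB(x, y);
                           rgb[i++] = (byte) (argb >> 16); // R
                           rgb[i++] = (byte) (argb >> 8);  // G
                           rgb[i++] = (byte) argb;         // B
                       }
                   }
                   toEncoder.write(rgb); // one raw frame per image
               }
           }
           System.out.println("x264 exited with code " + x264.waitFor());
       }
    }

    Because x264 only ever receives pixels, selecting or skipping files is just a matter of filtering the frames array before the loop, which removes the need for renaming or symbolic links.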


    BTW, the purpose of making the video is to go back in time and find out how time was spent/wasted.
    The software is aware of what the user is doing, so using this I can find out (visually) how a class evolved with time, and how much time I spent on a particular class/package/module/project/customer. The granularity right now is up to the class level; I wish to take it to the function level. The software is called jitendriya.

    Here is a sample graph


    Here is one solution:
    How does one encode a series of images into H264 using the x264 C API?

    But this is for C. If I have to do the same in Java, in a cross-platform fashion, I will have to resort to JNA/JNI. JNA might have a significant performance hit; JNI would be more work.
    FFmpeg also looks like a nice alternative, but I am still not satisfied by any of these solutions, looking at the pros and cons.


    Adapted solution:

    package weeklyvideomaker;

    import java.awt.AWTException;
    import java.awt.Rectangle;
    import java.awt.Robot;
    import java.awt.Toolkit;
    import java.awt.image.BufferedImage;
    import java.io.File;
    import java.io.IOException;
    import java.io.OutputStream;
    import java.util.Calendar;
    import java.util.LinkedList;
    import java.util.logging.Level;
    import java.util.logging.Logger;
    import neembuu.release1.util.StreamGobbler;
    import org.shashaank.activitymonitor.ScreenCaptureHandler;
    import org.shashaank.jitendriya.JitendriyaParams;

    /**
    *
    * @author Shashank
    */
    public class DirectVideoScreenHandler implements ScreenCaptureHandler {
       private final JitendriyaParams  jp;

       private String extension="264";
       private boolean lossless=false;
       private String fps="24/1";

       private Process p = null;
       private Rectangle r1;
       private Robot r;

       private int currentDay;

       private static final String[]weeks={"sun","mon","tue","wed","thu","fri","sat"};

       public DirectVideoScreenHandler(JitendriyaParams jp) {
           this.jp = jp;
       }

       public String getExtension() {
           return extension;
       }

       public void setExtension(String extension) {
           this.extension = extension;
       }

       public boolean isLossless() {
           return lossless;
       }

       public void setLossless(boolean lossless) {
           this.lossless = lossless;
       }

       public String getFps() {
           return fps;
       }

       public void setFps(String fps) {
           this.fps = fps;
       }

       private static int getday(){
           return Calendar.getInstance().get(Calendar.DAY_OF_WEEK) - 1;
       }

       public void make()throws IOException,AWTException{
           currentDay = getday();
           File week = jp.getWeekFolder();

           String destinationFile = week+"\\videos\\"+weeks[currentDay]+"_"+System.currentTimeMillis()+"_direct."+extension;

           r = new Robot();
           r1 = getScreenSize();

           ProcessBuilder pb = makeProcess(destinationFile, 0, r1.width, r1.height);

           p = pb.start();
           StreamGobbler out = new StreamGobbler(p.getInputStream(), "out");
           StreamGobbler err = new StreamGobbler(p.getErrorStream(), "err");
           out.start();err.start();
       }

       private static Rectangle getScreenSize(){
           return new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
       }

       private void screenShot(OutputStream os)throws IOException{        
           BufferedImage bi = r.createScreenCapture(r1);
           int[]intRawData = ((java.awt.image.DataBufferInt)
                   bi.getRaster().getDataBuffer()).getData();
           byte[]rawData = new byte[intRawData.length*3];
           for (int i = 0; i < intRawData.length; i++) {
               int rgb = intRawData[i];
               rawData[ i*3 + 0 ] = (byte) (rgb >> 16);
               rawData[ i*3 + 1 ] = (byte) (rgb >> 8);
               rawData[ i*3 + 2 ] = (byte) (rgb);
           }
           os.write(rawData);
       }

       private ProcessBuilder makeProcess(String destinationFile, int numberOfFrames,
               int width, int height){
       LinkedList<String> commands = new LinkedList<>();
           commands.add("\""+encoderPath()+"\"");
           if(true){
               commands.add("-");
               if(lossless){
                   commands.add("--qp");
                   commands.add("0");
               }
               commands.add("--keyint");
               commands.add("240");
               commands.add("--sar");
               commands.add("1:1");
               commands.add("--output");
               commands.add("\""+destinationFile+"\"");
               if(numberOfFrames>0){
                   commands.add("--frames");
                   commands.add(String.valueOf(numberOfFrames));
               }else{
                   commands.add("--stitchable");
               }
               commands.add("--fps");
               commands.add(fps);
               commands.add("--input-res");
               commands.add(width+"x"+height);
               commands.add("--input-csp");
               commands.add("rgb");//i420
           }
           return new ProcessBuilder(commands);
       }

       private String encoderPath(){
           return jp.getToolsPath()+File.separatorChar+"x264_64.exe";
       }

       @Override public void run() {
           try {
               if(p==null){
                   make();
               }
               if(currentDay!=getday()){// day changed
                   destroy();
                   return;
               }
               if(!r1.equals(getScreenSize())){// screensize changed
                   destroy();
                   return;
               }
               screenShot(p.getOutputStream());
           } catch (Exception ex) {
               Logger.getLogger(DirectVideoScreenHandler.class.getName()).log(Level.SEVERE, null, ex);
           }
       }

       private void destroy()throws Exception{
           p.getOutputStream().flush();
           p.getOutputStream().close();
           p.destroy();
           p = null;
       }

    }

    package weeklyvideomaker;

    import org.shashaank.jitendriya.JitendriyaParams;

    /**
    *
    * @author Shashank
    */
    public class DirectVideoScreenHandlerTest {
       public static void main(String[] args)throws Exception {
           JitendriyaParams  jp = new JitendriyaParams.Builder()
                   .setToolsPath("F:\\GeneralProjects\\JReminder\\development_environment\\tools")
                   .setOsDependentDataFolderPath("J:\\jt_data")
                   .build();
           DirectVideoScreenHandler w = new DirectVideoScreenHandler(jp);
           w.setExtension("264");
           w.setFps("24/1");
           w.setLossless(false);
           w.make();

           for (int i = 0; ; i++) {
               w.run();
               Thread.sleep(1000);
           }
       }
    }