
Media (91)

Other articles (21)

  • Supporting all media types

    13 April 2011

    Unlike most software and media-sharing platforms, MediaSPIP aims to manage as many different media types as possible. The following are just a few examples from an ever-expanding list of supported formats: images: png, gif, jpg, bmp and more; audio: MP3, Ogg, Wav and more; video: AVI, MP4, OGV, mpg, mov, wmv and more; text, code and other data: OpenOffice, Microsoft Office (Word, PowerPoint, Excel), web (html, CSS), LaTeX, Google Earth and (...)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • MediaSPIP v0.2

    21 June 2013

    MediaSPIP 0.2 is the first stable MediaSPIP release.
    Its official release date is June 21, 2013, and it is announced here.
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

On other sites (5614)

  • avcodec/mpegaudio_parser: Skip APE tags when parsing mp3 packets.

    30 January 2018, by Dale Curtis
    avcodec/mpegaudio_parser: Skip APE tags when parsing mp3 packets.
    

    Otherwise the decoder will throw "Missing header" errors when the
    packets are sent for decoding.

    This is similar to 89a420b71b5.

    Signed-off-by: Michael Niedermayer <michael@niedermayer.cc>

    • [DH] libavcodec/mpegaudio_parser.c
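
    For context: an APEv2 tag block begins with a fixed 32-byte header whose first 8 bytes are the preamble "APETAGEX" and which carries the size of the rest of the tag, so a parser can recognise the block and jump over it instead of handing it to the decoder. The snippet below is only a conceptual sketch of that check in Python (field offsets taken from the APEv2 specification, so treat them as assumptions to verify); the real change lives in the C file listed above.

    import struct

    APE_PREAMBLE = b"APETAGEX"   # 8-byte magic opening the 32-byte APEv2 header
    APE_HEADER_SIZE = 32

    def ape_tag_skip_len(buf, pos=0):
        """Return how many bytes to skip if an APEv2 tag starts at pos, else 0."""
        header = buf[pos:pos + APE_HEADER_SIZE]
        if len(header) < APE_HEADER_SIZE or not header.startswith(APE_PREAMBLE):
            return 0
        # Tag size is a little-endian uint32 at offset 12; per the APEv2 spec it
        # covers the tag items and footer but not this header.
        tag_size = struct.unpack_from("<I", header, 12)[0]
        return APE_HEADER_SIZE + tag_size

    # Usage idea: advance past the tag instead of feeding it to the mp3 decoder.
    # skip = ape_tag_skip_len(packet_bytes)
    # if skip:
    #     packet_bytes = packet_bytes[skip:]
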
  • Improve performance of FFMPEG with Java [on hold]

    20 January 2018, by Nitishkumar Singh

    This question is about improving the performance of a Java application. In this Java application, I am trying to convert a video into small clips.

    Here is the implementation class:

    package ffmpeg.clip.process;

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;
    import java.nio.file.Paths;
    import java.time.Duration;
    import java.time.LocalTime;
    import java.time.temporal.ChronoUnit;
    import java.util.concurrent.CompletableFuture;
    import java.util.concurrent.ExecutionException;
    import java.util.stream.Collectors;

    import ffmpeg.clip.utils.VideoConstant;
    import ffmpeg.clip.utils.VideoUtils;

    /*
    * @author Nitishkumar Singh
    * @Description: this class uses ffmpeg to break a source video into clips
    */
    public class VideoToClip {

       /*
        * Prevent from creating instance
        */
       private VideoToClip() {
       }

       /**
        * Get video duration in milliseconds
        *
        * @Exception IOException - file does not exist; VideoException - video file has data issues
        */
       static LocalTime getDuration(String sourceVideoFile) throws Exception {
           if (!Paths.get(sourceVideoFile).toFile().exists())
               throw new Exception("File does not exist!!");

           Process proc = new ProcessBuilder(VideoConstant.SHELL, VideoConstant.SHELL_COMMAND_STRING_ARGUMENT,
                   String.format(VideoConstant.DURATION_COMMAND, sourceVideoFile)).start();
           boolean errorOccured = (new BufferedReader(new InputStreamReader(proc.getErrorStream())).lines()
                   .count() > VideoConstant.ZERO);
           String durationInSeconds = new BufferedReader(new InputStreamReader(proc.getInputStream())).lines()
                   .collect(Collectors.joining(System.lineSeparator()));
           proc.destroy();
           if (errorOccured || (durationInSeconds.length() == VideoConstant.ZERO))
               throw new Exception("Video File have some issues!");
           else
               return VideoUtils.parseHourMinuteSecondMillisecondFormat(durationInSeconds);
       }

       /**
        * Create Clips for Video Using Start and End Second
        *
        * @Exception IOException - clip creation process failed; InterruptedException - clip creation task fails
        */
       static String toClipProcess(String sourceVideo, String outputDirectory, LocalTime start, LocalTime end,
               String fileExtension) throws IOException, InterruptedException, ExecutionException {

           String clipName = String.format(VideoConstant.CLIP_FILE_NAME,
                   VideoUtils.getHourMinuteSecondMillisecondFormat(start),
                   VideoUtils.getHourMinuteSecondMillisecondFormat(end), fileExtension);

           String command = String.format(VideoConstant.FFMPEG_OUTPUT_COMMAND, sourceVideo,
                   VideoUtils.getHourMinuteSecondMillisecondFormat(start),
                   VideoUtils.getHourMinuteSecondMillisecondFormat(end.minus(start.toNanoOfDay(), ChronoUnit.NANOS)),
                   outputDirectory, clipName);
           LocalTime startTime = LocalTime.now();
           System.out.println("Clip Name: " + clipName);
           System.out.println("FFMPEG Process Execution Started");
           CompletableFuture<Process> completableFuture = CompletableFuture.supplyAsync(() -> {
               try {
                   return executeProcess(command);
               } catch (InterruptedException | IOException ex) {
                   throw new RuntimeException(ex);
               }
           });
           completableFuture.get();
           // remove
           LocalTime endTime = LocalTime.now();
           System.out.println("Clip Name: " + clipName);
           System.out.println("FFMPEG Process Execution Finished");
           System.out.println("Duration: " + Duration.between(startTime, endTime).toMillis() / 1000);

           return clipName;
       }

       /**
        * Create and Execute Process for each command
        */
       static Process executeProcess(String command) throws InterruptedException, IOException {
           Process clipProcess = Runtime.getRuntime().exec(command);
           clipProcess.waitFor();
           return clipProcess;
       }
    }

    The entire solution is available on GitHub. So I am actually using CompletableFuture and running the FFMPEG command by creating a Java Process. The time it takes is too much: for a 40-minute video, it takes more than 49 minutes, and that is on a 64-CPU machine. I am trying to reduce the core count to 8 or so, as well as improve its performance. Any help would be really great.
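
    Independent of the Java wrapper, most of the wall-clock time in a setup like this usually goes to ffmpeg re-encoding each clip in sequence. Two language-agnostic levers are seeking before the input and stream-copying (-ss and -t placed before -i, plus -c copy, which avoids re-encoding but cuts on keyframes), and running a bounded number of clip commands concurrently. The sketch below illustrates both ideas; it is written in Python only to keep it short (the same argument list can be handed to Java's ProcessBuilder), and the file names, clip list and pool size are hypothetical.

    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    SOURCE = "input.mp4"                        # hypothetical source video
    CLIPS = [("00:00:00", "00:05:00"),          # hypothetical (start, duration) pairs
             ("00:05:00", "00:05:00")]

    def cut(idx, start, duration):
        cmd = ["ffmpeg", "-y",
               "-ss", start,                    # seek before -i: fast input seeking
               "-t", duration,
               "-i", SOURCE,
               "-c", "copy",                    # no re-encode; cuts land on keyframes
               "clip_%d.mp4" % idx]
        subprocess.run(cmd, check=True, capture_output=True)

    # A small bounded pool; one task per core rarely helps because each ffmpeg
    # process already uses several threads.
    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = [pool.submit(cut, i, start, dur) for i, (start, dur) in enumerate(CLIPS)]
        for f in futures:
            f.result()                          # surface any ffmpeg failure
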

  • Pipe video frame to OpenCV image and then to FFmpeg

    8 February 2018, by Pureheart

    There is a similar question here:
    Getting ’av_interleaved_write_frame(): Broken pipe’ error

    But what should I do if I want to write the data?

    I put pipe_out.stdin.write(image.tostring()) in the while loop, like this:

    FFMPEG_BIN = "/home/media/Downloads/ffmpeg"
    import subprocess as sp
    import sys
    width = 360
    height = 240
    command_in = [ FFMPEG_BIN,
               '-i', '/home/media/Videos/mytestvideo/zhou.avi',
               '-f', 'image2pipe',
               '-pix_fmt', 'bgr24',
               '-vcodec', 'rawvideo', '-']
    pipe_in = sp.Popen(command_in, stdout = sp.PIPE, bufsize = 10**8)

    command_out = [ FFMPEG_BIN,
           '-y', # (optional) overwrite output file if it exists
           '-f', 'rawvideo',
           '-vcodec','rawvideo',
           '-s', '360x240', # size of one frame
           '-pix_fmt', 'bgr24',
           '-r', '28', # frames per second
           '-i', '-', # The input comes from a pipe
           '-an', # Tells FFMPEG not to expect any audio
           #'-vcodec', 'mpeg',
           'my_output_videofile.mp4' ]

    pipe_out = sp.Popen( command_out, stdin=sp.PIPE, stderr=sp.PIPE)

    import numpy
    import cv2
    import pylab
    # read width*height*3 bytes (= 1 frame)
    while True:
       raw_image = pipe_in.stdout.read(width*height*3)
       image =  numpy.fromstring(raw_image, dtype='uint8')
       image = image.reshape((height,width,3))
       pipe_in.communicate()
       pipe_out.stdin.write(image.tostring())
       pipe_out.communicate()
       pipe_in.stdout.flush()
       #cv2.imshow('image',image)
       #cv2.waitKey(0)
       # throw away the data in the pipe's buffer.


    '''
    pipe_in.stdin.close()
    pipe_in.stderr.close()
    pipe_in.wait()
    pipe_out.stdout.close()
    pipe_out.stderr.close()
    pipe_out.wait()
    '''
    #pipe_out.stdin.write(image.tostring())

    However, the output video has only 1 frame (the first frame of the input video).

    Any ideas?

    Thanks!
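
    A likely explanation, offered here only as a hedged sketch rather than a verified fix: subprocess's communicate() waits for the child process to exit and closes its pipes, so calling pipe_in.communicate() and pipe_out.communicate() inside the loop shuts both ffmpeg processes down after the first frame. A loop that only reads and writes, and closes the pipes once afterwards, would look roughly like this (reusing pipe_in, pipe_out, width and height from the question above):

    frame_size = width * height * 3

    while True:
        raw_image = pipe_in.stdout.read(frame_size)
        if len(raw_image) < frame_size:        # short read: the input stream ended
            break
        image = numpy.frombuffer(raw_image, dtype='uint8').reshape((height, width, 3))
        # (any OpenCV processing of `image` would go here)
        pipe_out.stdin.write(image.tobytes())  # unchanged, this equals raw_image

    pipe_out.stdin.close()                     # let the writer finish the output file
    pipe_out.wait()
    pipe_in.stdout.close()
    pipe_in.wait()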