Media (1)

Keyword: - Tags -/Christian Nold

Other articles (52)

  • Authorizations overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to edit their own information on the authors page

  • Libraries and binaries specific to video and audio processing

    31 January 2010, by

    The following software and libraries are used by SPIPmotion in one way or another.
    Required binaries FFMpeg: the main encoder, used to transcode almost every type of video and audio file into formats playable on the Internet. See this tutorial for installing it; Oggz-tools: tools for inspecting ogg files; Mediainfo: retrieves information from most video and audio formats;
    Complementary and optional binaries flvtool2: (...)

  • Contribute to a better visual interface

    13 April 2011

    MediaSPIP is based on a system of themes and templates. Templates define the placement of information on the page, and can be adapted to a wide range of uses. Themes define the overall graphic appearance of the site.
    Anyone can submit a new graphic theme or template and make it available to the MediaSPIP community.

On other sites (9544)

  • FFmpeg - cannot find ExecuteBinaryResponseHandler - Android/Java

    20 July 2018, by pudility

    I am trying to make a module for react-native that will convert a video into a GIF. I have little to no experience with Android Studio/Java, but I would love to learn more! I am using this library to convert the video to a GIF. Here is my code:

    package com.reactlibrary;

    import android.widget.Toast;
    import com.facebook.react.bridge.ReactApplicationContext;
    import com.facebook.react.bridge.ReactContextBaseJavaModule;
    import com.facebook.react.bridge.ReactMethod;
    import com.github.hiteshsondhi88.libffmpeg.FFmpeg;

    public class RNGifMakerModule extends ReactContextBaseJavaModule {

     private final ReactApplicationContext reactContext;

     public RNGifMakerModule(ReactApplicationContext reactContext) {
       super(reactContext);
       this.reactContext = reactContext;
     }

     @Override
     public String getName() {
       return "RNGifMakerModule";
     }

     @ReactMethod
     public void alert(String message) {
         Toast.makeText(getReactApplicationContext(), "Error", Toast.LENGTH_LONG).show();
         String[] cmd = {"-i"
                 , message
                 , "Image.gif"};
         conversion(cmd);
     }

     public void conversion(String[] cmd) {

       FFmpeg ffmpeg = FFmpeg.getInstance(this.reactContext);

       try {


         // to execute "ffmpeg -version" command you just need to pass "-version"
         ffmpeg.execute(cmd, new ExecuteBinaryResponseHandler() {

           @Override
           public void onStart() {
           }

           @Override
           public void onProgress(String message) {
           }

           @Override
           public void onFailure(String message) {
           }

           @Override
           public void onSuccess(String message) {
           }

           @Override
           public void onFinish() {
           }
         });
       } catch (FFmpegCommandAlreadyRunningException e) {
         // Handle if FFmpeg is already running
         e.printStackTrace();
       }
     }
    }

    And I get this error:

    Error:(43, 31) error: cannot find symbol class ExecuteBinaryResponseHandler

    This seems odd to me, because the documentation for ffmpeg-android-java says to use almost exactly the same code.
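
    For what it's worth, this particular "cannot find symbol" usually just means the handler and exception classes are not imported: FFmpeg is imported above, but ExecuteBinaryResponseHandler and FFmpegCommandAlreadyRunningException are not. A minimal sketch of the import block, assuming the package layout used by ffmpeg-android-java (worth checking against the version actually bundled):

     import android.widget.Toast;
     import com.facebook.react.bridge.ReactApplicationContext;
     import com.facebook.react.bridge.ReactContextBaseJavaModule;
     import com.facebook.react.bridge.ReactMethod;
     // Assumed package paths for the two classes missing from the original file:
     import com.github.hiteshsondhi88.libffmpeg.ExecuteBinaryResponseHandler;
     import com.github.hiteshsondhi88.libffmpeg.FFmpeg;
     import com.github.hiteshsondhi88.libffmpeg.exceptions.FFmpegCommandAlreadyRunningException;

    With those two extra imports in place, the anonymous ExecuteBinaryResponseHandler in conversion() should compile unchanged.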

    Bounty

    The bounty will be awarded to you if you can find a way to convert a video.mp4 into a gif. You do not necessarily have to use FFmpeg, but your solution has to work with Java/Android Studio.

  • Fluent-ffmpeg and complex filter in Electron (node)

    9 July 2016, by Matt Sergej Rinc

    I want to use the fluent-ffmpeg module to call ffmpeg with a complex filter from Electron, but have had no success. The error '[AVFilterGraph @ 0xb8.......] No such filter " Error initializing complex filters. Invalid argument' is the same as in the question Error: Running FFmpeg command in Android, splitting command in array not working, but the context is different.

    What is needed?
    Run this ffmpeg command using fluent-ffmpeg:

    ffmpeg -i safework-background-0.mp4 -i image1.png -i image2.png -i
    image3.png -filter_complex "[0:v][1:v]
    overlay=1:1:enable='between(t,5,8.5)' [tmp] ; [tmp][2:v]
    overlay=1:1:enable='between(t,8.5,12)' [tmp] ; [tmp][3:v]
    overlay=1:1:enable='between(t,12,15)'" test-video-safework3.mp4

    It uses a complex filter to overlay three images on a video in sequence and exports a new video.

    What doesn't work?
    My conclusion is that fluent-ffmpeg chokes on the quotes the complex filter requires (the same issue as in the Android variant question above).

    What works without fluent-ffmpeg in Electron?
    As you can guess, I had to resort to calling ffmpeg directly. To help others, the command above, with the input and output video filenames parametrized, translates to Electron as:

     var spawn = require('child_process').spawn
     var fargs = ['-y', '-i', sourceDir.path() + '/' + inVideoName, '-i', tempDir.path() + '/' + 'image1.png',
     '-i', tempDir.path() + '/' + 'image2.png', '-i', tempDir.path() + '/' + 'image3.png',
     '-filter_complex', '[0:v][1:v]overlay=1:1:enable=\'between(t,5,8.5)\'[tmp];' +
     '[tmp][2:v]overlay=1:1:enable=\'between(t,8.5,12)\'[tmp];[tmp][3:v]' +
     'overlay=1:1:enable=\'between(t,12,15)\'', targetDir.path() + '/' + outVideoName]
     var ffmpeg = spawn(ffmpegc, fargs, { cwd:jetpack.cwd(app.getPath('home')).path() })
    // some code omitted
     ffmpeg.on('close', (code) => {
       console.log(`child process exited with code ${code}`)
       webContents.send('notify-user-reply', 'Video processing done.')
     })

    In the command above I have already removed the spaces between the individual filters (inside the complex filter) for each image, or it would also choke.

    I would really love to use fluent-ffmpeg in Electron, not just for the convenience of calling ffmpeg more elegantly but also for some additional features like easy progress reporting.
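
    If fluent-ffmpeg is still preferred, its complexFilter() method takes the filtergraph as an array of plain filter strings, so no shell-level quoting is involved; only ffmpeg's own quotes around between(...) remain. A minimal sketch along those lines, reusing the path variables from the spawn version above and assuming the stock fluent-ffmpeg API (untested in this exact setup):

     var Ffmpeg = require('fluent-ffmpeg')
     // Ffmpeg.setFfmpegPath(ffmpegc) // if the ffmpeg binary is not on PATH
     Ffmpeg(sourceDir.path() + '/' + inVideoName)
       .input(tempDir.path() + '/image1.png')
       .input(tempDir.path() + '/image2.png')
       .input(tempDir.path() + '/image3.png')
       .complexFilter([
         // same overlay chain as the CLI command, one string per filter step
         "[0:v][1:v]overlay=1:1:enable='between(t,5,8.5)'[tmp1]",
         "[tmp1][2:v]overlay=1:1:enable='between(t,8.5,12)'[tmp2]",
         "[tmp2][3:v]overlay=1:1:enable='between(t,12,15)'"
       ])
       .on('progress', (p) => console.log('progress: ' + p.percent + '%'))
       .on('error', (err) => console.error(err.message))
       .on('end', () => webContents.send('notify-user-reply', 'Video processing done.'))
       .save(targetDir.path() + '/' + outVideoName)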

  • JavaCV read video and record

    4 January 2015, by John

    I am trying to merge an mp4 and an mp3 on Android using JavaCV. Please check my code below first; I explain the strange error after it:

    private void test2() throws Exception {
       String path = Environment.getExternalStorageDirectory()
               .getAbsolutePath();

       int testId = 4;
       String videoPath = path + "/" + "sample" + testId + ".mp4";
       String audioPath = path + "/" + "love.mp3";
       String outputPath = path + "/" + "out" + testId + ".mp4";

       FrameGrabber grabber1 = new FFmpegFrameGrabber(videoPath);
       FrameGrabber grabber2 = new FFmpegFrameGrabber(audioPath);

       grabber1.start();
       grabber2.start();

       FrameRecorder recorder = new FFmpegFrameRecorder(outputPath,
               grabber1.getImageWidth(), grabber1.getImageHeight(),
               grabber2.getAudioChannels());

       double frameRate = grabber1.getFrameRate();
       recorder.setFrameRate(frameRate);
       recorder.setSampleRate(grabber2.getSampleRate());
       recorder.setVideoQuality(1);

       recorder.start();

       Frame frame1, frame2 = null;

       // getLengthInTime is correct, but getLengthInFrames not accurate.
       Log.d(TAG, " Video lengthInTime:" + grabber1.getLengthInTime()
               + " Video frames:" + grabber1.getLengthInFrames());

       // Record video.
       int count = 0;
       while (true) {
           frame1 = grabber1.grabFrame();
           if (frame1 == null) {
               break;
           }

           frame1.samples = null;
           recorder.record(frame1);
           count++;

           Log.d(TAG, "Video frame timestamp:" + grabber1.getTimestamp());
       }
       Log.d(TAG, " Video frame count:" + count);

       // Record audio.
       long videoTimestamp = recorder.getTimestamp();
       while (true) {
           frame2 = grabber2.grabFrame();
           if (frame2 != null && grabber2.getTimestamp() <= videoTimestamp) {
               frame2.image = null;
               recorder.record(frame2);

               // Log.d(TAG, "Audio frame timestamp:" +
               // grabber2.getTimestamp());
           } else {
               break;
           }
       }

       // release
       recorder.stop();
       recorder.release();

       grabber1.stop();
       grabber2.stop();
    }

    The output's audio is OK, but the video is strange: it plays for one second, then stops for one second, and repeats like that. The input video was recorded with the phone's camera.

    I tried to count the real number of video frames, and found it is much larger than the number returned by getLengthInFrames().
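
    One thing worth checking (not something established in the post): the video loop above discards the source timestamps, so the recorder paces frames purely from getFrameRate(), which for variable-frame-rate phone footage is only an estimate. A minimal sketch of the video loop that forwards the grabber's timestamps to the recorder instead, assuming the usual JavaCV FrameGrabber/FrameRecorder API:

       // Hypothetical variant of the video loop: skip audio-only frames from
       // the video file (audio is taken from grabber2) and preserve the source
       // timing by copying each grabbed frame's timestamp into the recorder.
       Frame frame;
       int count = 0;
       while ((frame = grabber1.grabFrame()) != null) {
           if (frame.image == null) {
               continue; // audio-only frame from the mp4, ignore it
           }
           recorder.setTimestamp(grabber1.getTimestamp()); // keep original pacing
           recorder.record(frame);
           count++;
       }

    Whether this removes the play-one-second, stall-one-second pattern depends on the source file, but it rules out a mismatch between the camera's real frame timing and the fixed frame rate the recorder otherwise assumes.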