Advanced search

Media (0)

Keyword: - Tags -/presse-papier

No media matching your criteria is available on the site.

Other articles (33)

  • The SPIPmotion queue

    28 November 2010

    A queue stored in the database
    When it is installed, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
    This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to be encoded; id_objet, the unique identifier of the object to which the encoded document should automatically be attached; objet, the type of object to which (...)

  • Emballe médias: what is it for?

    4 February 2011

    This plugin is designed to manage sites that publish documents of all types.
    It creates "médias", namely: a "média" is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only a single document can be linked to a so-called "média" article;

  • Changing your graphic theme

    22 February 2011

    The graphic theme does not affect the actual layout of the elements on the page; it only changes how the elements look.
    The placement can indeed be altered, but this change is purely visual and does not affect the semantic structure of the page.
    Changing the graphic theme in use
    To change the graphic theme in use, the zen-garden plugin must be enabled on the site.
    You then simply go to the configuration area of the (...)

On other sites (6611)

  • Is there a set of working P/Invoke declarations for FFMpeg, libavutil, libavformat and libavcodec in .NET?

    11 February 2014, by casperOne

    I'm currently looking to access libavutil, libavformat and libavcodec (all part of FFMpeg) from .NET.

    Currently, I'm getting the libraries from the automated builds of the shared FFMpeg package performed every night for Windows 32-bit.

    I am also using the code from the ffmpeg-sharp project. In that project, I have removed a number of classes that were not compiling (they are wrapper classes, not the P/Invoke declarations).

    The code compiles fine, but I am running into a few issues.

    First, it appears that the build of av*.dll uses the cdecl calling convention, as I was getting PInvokeStackImbalanceException errors when trying to call av_open_input_file. This was easy enough to change to get it to work right. The AVFormatContext structure is populated.

    After that, I want to call av_find_stream_info to get information about the streams in the file. However, when calling that with the AVFormatContext retrieved from the call to av_open_input_file, an AccessViolationException is thrown indicating that I am trying to read or write from protected memory.

    Has anyone used P/Invoke to access the libavutil, libavformat and libavcodec DLLs and gotten it to work?

    I should mention that working with the command-line version of FFMpeg, while a solution, is not viable in this case; access needs to occur through the libraries. The reason for this is that I'd have to thrash the disk far too much to do what I need to do (I have to do a frame-by-frame analysis of some very high-definition video), and I want to avoid the disk as much as possible.

  • ffmpeg not working with piping to stdin

    16 January 2016, by Guig

    I want to stream a file that is being uploaded into ffmpeg. I'm using node.js, and it's not working!

    I ended up testing piping an input to ffmpeg from a local file, and this doesn't work either. Here's my code:

    var child_process = require('child_process');
    var fs = require('fs');

    var processVideo = function(videoStream, resultPath) {
     var cmdParams = [
       '-i', '-',
       '-y',
       '-f', 'mp4',
       '-vcodec', 'libx264',
       '-vf', 'scale=-1:720',
       '-f', 'mp4',
       resultPath
     ];
     var ffmpeg = child_process.spawn('ffmpeg', cmdParams);

     var data = '';
     ffmpeg.stdout
       .on('data', function(chunk) { data += chunk; })
       .on('end', function() { console.log('result', data); });

     var err = '';
     ffmpeg.stderr
       .on('data', function(chunk) { err += chunk; })
       .on('end', function() { console.log('error', err);});
     videoStream.pipe(ffmpeg.stdin);
    };

    processVideo(fs.createReadStream(pathToLocalMP4File), localPathToResultFile);

    The output I get is

    error ffmpeg version 2.8 Copyright (c) 2000-2015 the FFmpeg developers
     built with Apple LLVM version 7.0.0 (clang-700.1.76)
     configuration: --prefix=/usr/local/Cellar/ffmpeg/2.8 --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-opencl --enable-libx264 --enable-libmp3lame --enable-libvo-aacenc --enable-libxvid --enable-libtheora --enable-vda
     libavutil      54. 31.100 / 54. 31.100
     libavcodec     56. 60.100 / 56. 60.100
     libavformat    56. 40.101 / 56. 40.101
     libavdevice    56.  4.100 / 56.  4.100
     libavfilter     5. 40.101 /  5. 40.101
     libavresample   2.  1.  0 /  2.  1.  0
     libswscale      3.  1.101 /  3.  1.101
     libswresample   1.  2.101 /  1.  2.101
     libpostproc    53.  3.100 / 53.  3.100
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fd1e2802a00] stream 0, offset 0x20: partial file
    [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7fd1e2802a00] Could not find codec parameters for stream 0 (Video: h264 (avc1 / 0x31637661), none, 1920x1080, 16242 kb/s): unspecified pixel format
    Consider increasing the value for the 'analyzeduration' and 'probesize' options
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'pipe:':
     Metadata:
       major_brand     : mp42
       minor_version   : 537134592
       compatible_brands: mp42    
       creation_time   : 2015-12-26 12:47:49
     Duration: 00:00:04.00, bitrate: N/A
       Stream #0:0(und): Video: h264 (avc1 / 0x31637661), none, 1920x1080, 16242 kb/s, 29.97 fps, 29.97 tbr, 30k tbn, 60k tbc (default)
       Metadata:
         creation_time   : 2015-12-26 12:47:49
         encoder         : AVC Coding
       Stream #0:1(und): Audio: aac (mp4a / 0x6134706D), 32000 Hz, mono, fltp, 46 kb/s (default)
       Metadata:
         creation_time   : 2015-12-26 12:47:49
    [buffer @ 0x7fd1e1c08e20] Unable to parse option value "-1" as pixel format
       Last message repeated 1 times
    [buffer @ 0x7fd1e1c08e20] Error setting option pix_fmt to value -1.
    [graph 0 input from stream 0:0 @ 0x7fd1e1c08f60] Error applying options to the filter.
    Error opening filters!

    result

    I tried setting the -pix_fmt option, as well as -analyzeduration and -probesize as suggested by the error message and this question, to no avail.
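    For reference, -analyzeduration and -probesize are input options, so in the spawn array they would have to appear before the '-i', '-' pair; the following is only a minimal sketch of that placement (the values are illustrative, -pix_fmt yuv420p is taken from the terminal command below, and as noted above this did not resolve the issue here):

    var cmdParams = [
      '-analyzeduration', '50000000',   // input options must precede '-i'
      '-probesize', '50000000',
      '-i', '-',
      '-y',
      '-f', 'mp4',
      '-vcodec', 'libx264',
      '-pix_fmt', 'yuv420p',
      '-vf', 'scale=-1:720',
      '-f', 'mp4',
      resultPath
    ];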

    However, ffmpeg -i pathToLocalMP4File -y -f mp4 -pix_fmt yuv420p -vcodec libx264 -vf scale=-1:720 -f mp4 localPathToResultFile works perfectly in the terminal...
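    For comparison, that working terminal command maps onto spawn arguments roughly like this when ffmpeg reads the file by path instead of from a pipe (a sketch only; the path variables are the same placeholders as above):

    var cmdParams = [
      '-i', pathToLocalMP4File,   // read directly from the file instead of stdin
      '-y',
      '-f', 'mp4',
      '-pix_fmt', 'yuv420p',
      '-vcodec', 'libx264',
      '-vf', 'scale=-1:720',
      '-f', 'mp4',
      localPathToResultFile
    ];
    var ffmpeg = child_process.spawn('ffmpeg', cmdParams);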

    Any idea?

  • Why are FFMPEG commands not working on Marshmallow and Lollipop?

    22 February 2016, by Andy Developer

    Why is my code not working on Marshmallow and Lollipop devices? Does anyone have an idea how to use FFMPEG on those versions? Any help is appreciated.

    import android.os.Bundle;
    import android.os.Environment;
    import android.support.v7.app.AppCompatActivity;
    import android.view.View;
    import android.view.View.OnClickListener;
    import android.widget.Button;
    import android.widget.Toast;

    import com.kru.ffmpeglibs.FFmpeg;
    import com.kru.ffmpeglibs.FFmpegExecuteResponseHandler;
    import com.kru.ffmpeglibs.FFmpegLoadBinaryResponseHandler;
    import com.kru.ffmpeglibs.exceptions.FFmpegCommandAlreadyRunningException;
    import com.kru.ffmpeglibs.exceptions.FFmpegNotSupportedException;

    public class CommandsActivity extends AppCompatActivity {
    private FFmpeg fFmpeg;
    private Button btnGenerate;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
       super.onCreate(savedInstanceState);
       setContentView(R.layout.activity_main);

       fFmpeg = FFmpeg.getInstance(CommandsActivity.this);
       executeBinary();

       btnGenerate = (Button) findViewById(R.id.btnGenerate);
       btnGenerate.setOnClickListener(new OnClickListener() {
           @Override
           public void onClick(View v) {
               try {
                   String[] ffmpegCommand = { "-i "
                           + Environment.getExternalStorageDirectory()
                                   .getPath()
                           + "/vid.mp4"
                           + " -r 10 "
                           + Environment.getExternalStorageDirectory()
                                   .getPath()
                           + "/com.mobvcasting.mjpegffmpeg/frame_%05d.jpg" };

                   executeCommand(ffmpegCommand);

               } catch (FFmpegCommandAlreadyRunningException e) {
                   e.printStackTrace();
               }
           }
       });
    }

    private void executeCommand(String[] cmd)
           throws FFmpegCommandAlreadyRunningException {

       fFmpeg.execute(cmd, new FFmpegExecuteResponseHandler() {
           @Override
           public void onSuccess(String message) {

               Toast.makeText(CommandsActivity.this, "Sucesses..",
                       Toast.LENGTH_SHORT).show();

               System.out.println(message);
           }

           @Override
           public void onProgress(String message) {
               // Toast.makeText(MainActivity.this, "On Process",
               // Toast.LENGTH_SHORT).show();
               System.out.println(message);
           }

           @Override
           public void onFailure(String message) {
               Toast.makeText(CommandsActivity.this, "Fail this",
                       Toast.LENGTH_SHORT).show();
               System.out.println(message);
           }

           @Override
           public void onStart() {

           }

           @Override
           public void onFinish() {
               Toast.makeText(CommandsActivity.this, "Finish",
                       Toast.LENGTH_SHORT).show();

           }
       });
    }

    private void executeBinary() {

       try {
           fFmpeg.loadBinary(new FFmpegLoadBinaryResponseHandler() {
               @Override
               public void onFailure() {

               }

               @Override
               public void onSuccess() {

               }

               @Override
               public void onStart() {

               }
               @Override
               public void onFinish() {

               }
           });
       } catch (FFmpegNotSupportedException e) {
           e.printStackTrace();
       }
     }
    }

    Here is my code, but it is still not working. Please tell me what is wrong with it.
    The exception I got is something like this:

    02-22 11:18:41.469: E/AndroidRuntime(27839): FATAL EXCEPTION: main
    02-22 11:18:41.469: E/AndroidRuntime(27839):     java.lang.UnsatisfiedLinkError: Native method not found: com.kru.ffmpeglibs.ArmArchHelper.cpuArchFromJNI:()Ljava/lang/String;
    02-22 11:18:41.469: E/AndroidRuntime(27839):    at com.kru.ffmpeglibs.ArmArchHelper.cpuArchFromJNI(Native Method)
    02-22 11:18:41.469: E/AndroidRuntime(27839):    at com.kru.ffmpeglibs.CpuArchHelper.getCpuArch(CpuArchHelper.java:61)
    02-22 11:18:41.469: E/AndroidRuntime(27839):    at com.kru.ffmpeglibs.FFmpeg.loadBinary(FFmpeg.java:40)
    02-22 11:18:41.469: E/AndroidRuntime(27839):   at com.kru.sampleffmpeg.MainActivity.loadFFMpegBinary(MainActivity.java:68)
    02-22 11:18:41.469: E/AndroidRuntime(27839):   at com.kru.sampleffmpeg.MainActivity.onCreate(MainActivity.java:36)
    02-22 11:18:41.469: E/AndroidRuntime(27839):   at android.app.Activity.performCreate(Activity.java:5372)
    02-22 11:18:41.469: E/AndroidRuntime(27839):   at android.app.Instrumentation.callActivityOnCreate(Instrumentation.java:1104)
    02-22 11:18:41.469: E/AndroidRuntime(27839):   at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2257)
    02-22 11:18:41.469: E/AndroidRuntime(27839):   at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2349)
    02-22 11:18:41.469: E/AndroidRuntime(27839):   at android.app.ActivityThread.access$700(ActivityThread.java:159)
    02-22 11:18:41.469: E/AndroidRuntime(27839):   at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1316)
    02-22 11:18:41.469: E/AndroidRuntime(27839):   at android.os.Handler.dispatchMessage(Handler.java:99)
    02-22 11:18:41.469: E/AndroidRuntime(27839):   at android.os.Looper.loop(Looper.java:176)
    02-22 11:18:41.469: E/AndroidRuntime(27839):   at android.app.ActivityThread.main(ActivityThread.java:5419)
    02-22 11:18:41.469: E/AndroidRuntime(27839):   at java.lang.reflect.Method.invokeNative(Native Method)
    02-22 11:18:41.469: E/AndroidRuntime(27839):   at java.lang.reflect.Method.invoke(Method.java:525)
    02-22 11:18:41.469: E/AndroidRuntime(27839):   at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1046)
    02-22 11:18:41.469: E/AndroidRuntime(27839):   at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:862)
    02-22 11:18:41.469: E/AndroidRuntime(27839):   at dalvik.system.NativeStart.main(Native Method)