Media (91)

Other articles (97)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed Médiaspip is at version 0.2 or higher. If needed, contact the administrator of your MédiaSpip to find out.

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed with other manual (...)

  • Frequent problems

    10 March 2010, by

    PHP with safe_mode enabled
    One of the main sources of problems comes from the PHP configuration, in particular having safe_mode enabled.
    The solution would be either to disable safe_mode, or to place the script in a directory accessible to Apache for the site.

On other sites (8018)

  • can't record docker selenium with ffmpeg and ubuntu 18

    1 January 2019, by Alex028502

    I have a script that starts a selenium/standalone-chrome container, starts recording with ffmpeg, and runs the selenium tests. However, it is not working on Ubuntu 18, possibly because of the ffmpeg version (3.4.4 instead of 2.8.15).
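
    For context, the overall shape of such a script would be roughly the sketch below; the container name, the sleep, and the test-runner command are placeholders rather than the actual setup:

    # sketch: start the container, record its display, run the tests, then clean up
    docker run -d --name selenium --network=host --shm-size=2g \
        -e SCREEN_WIDTH=1920 -e SCREEN_HEIGHT=1080 \
        selenium/standalone-chrome:3.141.59-antimony
    sleep 5                          # give Xvfb inside the container time to come up
    ffmpeg -f x11grab -video_size 1920x1080 -i :99 -codec:v libx264 -r 4 test.mp4 &
    FFMPEG_PID=$!
    ./run-selenium-tests.sh          # placeholder for the real test runner
    kill -INT "$FFMPEG_PID"          # a first SIGINT lets ffmpeg finish the file cleanly
    wait "$FFMPEG_PID"
    docker stop selenium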

    I have broken down the problem into a couple of commands which work fine on Ubuntu 16 but not on Ubuntu 18:

    start selenium container in terminal #1

    docker run --network=host --shm-size=2g -e SCREEN_WIDTH=1920 -e SCREEN_HEIGHT=1080 selenium/standalone-chrome:3.141.59-antimony

    start recording in terminal #2

    rm -f test.mp4
    # :99 seems to be the default for the selenium container
    ffmpeg -f x11grab -video_size 1920x1080 -i :99 -codec:v libx264 -r 4 test.mp4

    and I get something that looks like this

    frame=    2 fps=0.1 q=-1.0 Lsize=       2kB time=00:00:00.25 bitrate=  75.9kbits/s dup=0 drop=413 speed=0.0165

    the time stays at 0, and then goes up to 25msec when I stop it.

    On the other hand, if I just start a regular screen buffer in terminal #1

    Xvfb :99 -screen 0 1920x1080x24

    and run the same thing as above in terminal #2, everything works.

    Also, I am pretty sure that the above ffmpeg command worked on Ubuntu 16, with ffmpeg 3.

    So, to summarise, here is when the above ffmpeg command seems to work:

                        | ubuntu 16 (ffmpeg 3) | ubuntu 18 (ffmpeg 4)
    just start Xvfb     | works                | works
    selenium container  | works                | DOES NOT WORK

    Any ideas?

  • Use Windows ffmpeg to record audio output without using the StereoMix

    5 January 2019, by DevtelSoftware

    I am looking for a way to record the audio output (speakers) using Windows ffmpeg.
    I need to do this WITHOUT installing any extra dshow filters and without having the StereoMix input enabled (since this is not available on many computers).

    I have read in the ffmpeg documentation that the -map option would allow redirecting an audio output so that ffmpeg sees it as an audio input, but I can’t find any example of how to do that.
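
    For reference, the typical use of -map is to pick which of the streams already opened with -i end up in the output file; a minimal illustration with hypothetical filenames:

    # keep the video stream of the first input and the audio stream of the second
    ffmpeg -i video_only.mp4 -i audio_only.wav -map 0:v -map 1:a combined.mkv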

    On Linux I managed to do it like this:

    # first input: the speakers' monitor source (what is currently being played back)
    # second input: the microphone; amix merges the two into a single audio track
    ffmpeg -f pulse -ac 2 -ar 44100 -i alsa_output.pci-0000_00_1f.4.analog-stereo.monitor \
        -f pulse -ac 2 -ar 44100 -i alsa_input.pci-0000_00_1f.4.analog-stereo \
        -filter_complex amix=inputs=2 test.mp4

    However, I can’t find a similar way to do it on Windows and macOS.

    So, in short: is it possible with the Windows ffmpeg build to record audio from the speakers without extra dshow filters (out of the box)? The same question goes for macOS.

    Thanks!

  • FFmpeg record and stream

    16 January 2019, by Robert

    I’m getting the following error.

    E/FFmpeg: Exception while trying to run:
    [/data/user/0/com.example.pathways.testipcam/files/ffmpeg, -y, -i,
    rtsp://log:pass@IP:port/video.h264, -acodec, copy, -vcodec, copy, -t,
    00:03:00,
    content://com.example.android.fileprovider/external_files/Android/data/com.example.pathways.testipcam/files/Movies/IPcam_20190116_150628_6019720208966811003.mkv]
    java.io.IOException: Cannot run program "/data/user/0/com.example.pathways.testipcam/files/ffmpeg": error=2, No such file or directory

    OK, I’m trying to learn how to record and stream my IP cam on Android. I can stream the video in a SurfaceView media player with no problem, so I then moved on to the recording task. Now I’m a bit lost. I know this is possible, as I have read people say they used this library to do it.

    I started by implementing

    implementation 'nl.bravobit:android-ffmpeg:1.1.5'

    But I can’t figure out what I am missing or doing wrong, as there aren’t really any tutorials explaining this completely. So hopefully this can be the thread everyone else finds for a solution. I have listed my code below. What exactly have I got wrong here? It runs, goes to onStart... then straight to onFailure.
    Do I need to have two streams in order to view and record, or what’s the deal?
    I know FFmpeg can do that:
    "ffmpeg supports multiple outputs created out of the same input(s) in the same process. The usual way to accomplish this is:

    ffmpeg -i input1 -i input2 \
    -acodec … -vcodec … output1 \
    -acodec … -vcodec … output2 \"

    but I have no idea how to use that (a sketch of that pattern is below).
    Anyway, I know I’m kind of close, or at least I hope so; I just need a little help getting over the finish line on this. What have I done wrong, and how does this work?
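
    A minimal sketch of that "one input, several outputs" pattern applied to an RTSP camera; the output filename and the re-publish address are placeholders, not something taken from this setup:

    # one ffmpeg process, one RTSP input, two outputs:
    #   output 1: record the camera stream to a file without re-encoding
    #   output 2: copy the same stream to a second destination (placeholder address)
    ffmpeg -i rtsp://login:password@IP:port/video.h264 \
        -acodec copy -vcodec copy -t 00:03:00 recording.mkv \
        -acodec copy -vcodec copy -f rtsp rtsp://localhost:8554/restream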

    Here is what I did. I streamed the video as the app does, no problem:

       surfaceView = (SurfaceView) findViewById(R.id.videoView);
       _surfaceHolder = surfaceView.getHolder();
       _surfaceHolder.addCallback(this);
       _surfaceHolder.setFixedSize(320, 240);

    ....

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
       mpPlayerRun();

    }

    public void mpPlayerRun(){
       _mediaPlayer = new MediaPlayer();
       _mediaPlayer.setDisplay(_surfaceHolder);

       try {
           // Specify the IP camera's URL and auth headers.
           _mediaPlayer.setDataSource(RTSP_URL);

           // Begin the process of setting up a video stream.
           _mediaPlayer.setOnPreparedListener(this);
           _mediaPlayer.prepareAsync();
       }
       catch (Exception e) {}
    }
    ...
    @Override
    public void onPrepared(MediaPlayer mp) {
       _mediaPlayer.start();
    }

    OK, so then I created a button to record.

    @Override
    public void onClick(View view) {
       switch (view.getId()) {
           case R.id.IPcamback:
               break;

           case R.id.IPcamrecord:
               if (ContextCompat.checkSelfPermission(this,
                       Manifest.permission.WRITE_EXTERNAL_STORAGE) == PackageManager.PERMISSION_GRANTED) {
                   takeIPvid();
               } else {
                   requestIPCamPermission();
               }

    .....

    I requested the permission, then handled the results:

       private void requestIPCamPermission() {
       if (ActivityCompat.shouldShowRequestPermissionRationale(this,
               Manifest.permission.WRITE_EXTERNAL_STORAGE)) {

           new AlertDialog.Builder(this)
                   .setTitle("Permission needed")
                   .setMessage("This permission is needed due to Android safety protocol")
                   .setPositiveButton("ok", new DialogInterface.OnClickListener() {
                       @Override
                       public void onClick(DialogInterface dialog, int which) {
                           ActivityCompat.requestPermissions(MainActivity.this,
                                   new String[] {Manifest.permission.WRITE_EXTERNAL_STORAGE}, 420);
                       }
                   })
                   .setNegativeButton("cancel", new DialogInterface.OnClickListener() {
                       @Override
                       public void onClick(DialogInterface dialog, int which) {
                           dialog.dismiss();
                       }
                   })
                   .create().show();

       } else {
           ActivityCompat.requestPermissions(this,
                   new String[] {Manifest.permission.WRITE_EXTERNAL_STORAGE}, 420);
       }
    }

    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {

       if (requestCode == 420)  {
           if (grantResults.length > 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
               takeIPvid();
           } else {
               Toast.makeText(this, "Permission DENIED", Toast.LENGTH_SHORT).show();
           }
    .....

    Then I made the file provider, the createVideoOutputFile() method, and the takeIPvid() method:

    private File createVideoOutputFile() throws IOException {
       String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
       String imageFileName = "IPcam_" + timeStamp + "_";
       IPstorageDir = getExternalFilesDir(Environment.DIRECTORY_MOVIES);
       IPvideo_file = File.createTempFile(
               imageFileName,  /* prefix */
               ".mp4",         /* suffix */
               IPstorageDir      /* directory */
       );

       // Save a file: path for use with ACTION_VIEW intents
       String IPmVideoFilename = IPvideo_file.getAbsolutePath();
       return IPvideo_file;
    }

      private void takeIPvid() {

           File ipfile = null;
           try {
               ipfile = createVideoOutputFile();
           } catch (IOException e) {
               e.printStackTrace();
           }
       IPvideo_uri = FileProvider.getUriForFile(this,
               "com.example.android.fileprovider",
               ipfile);


    String[] cmd = {"-y", "-i", "rtsp://Login:Password@IP:port/video.h264", "-acodec", "copy", "-vcodec", "copy", "-t", "00:00:20", IPvideo_uri.toString() };

       FFmpeg.getInstance(this).execute(cmd,new ExecuteBinaryResponseHandler(){

           @Override
           public void onStart() {
               super.onStart();


           }

           @Override
           public void onFailure(String message) {
               super.onFailure(message);



           }

           @Override
           public void onSuccess(String message) {
               super.onSuccess(message);


           }

           @Override
           public void onProgress(String message) {
               super.onProgress(message);

           }

           @Override
           public void onFinish() {
               super.onFinish();

           }
       });

    }