Other articles (50)

  • Automatic backup of SPIP channels

    1 April 2010

    When setting up an open platform, it is important for hosts to have reasonably regular backups on hand to guard against any eventual problem.
    This task relies on two SPIP plugins: Saveauto, which makes a regular backup of the database as a MySQL dump (usable in phpMyAdmin), and mes_fichiers_2, which builds a zip archive of the site's important data (documents, elements (...)
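
    For illustration only, here is a rough sketch of what those two plugins automate, assuming a mysqldump binary on the PATH, an existing backup/ directory, and a hypothetical IMG/ documents directory:

       import java.io.File;
       import java.io.FileOutputStream;
       import java.io.IOException;
       import java.io.UncheckedIOException;
       import java.nio.file.Files;
       import java.nio.file.Path;
       import java.nio.file.Paths;
       import java.util.zip.ZipEntry;
       import java.util.zip.ZipOutputStream;

       public class SpipBackup {
           public static void main(String[] args) throws Exception {
               // Dump the database to a .sql file (what Saveauto automates);
               // the dump stays usable in phpMyAdmin.
               new ProcessBuilder("mysqldump", "-u", "spip", "spip_db")
                       .redirectOutput(new File("backup/spip.sql"))
                       .start()
                       .waitFor();

               // Zip the site's important data (what mes_fichiers_2 automates).
               Path docs = Paths.get("IMG");  // hypothetical documents directory
               try (ZipOutputStream zip = new ZipOutputStream(
                       new FileOutputStream("backup/site-data.zip"))) {
                   Files.walk(docs).filter(Files::isRegularFile).forEach(p -> {
                       try {
                           zip.putNextEntry(new ZipEntry(docs.relativize(p).toString()));
                           Files.copy(p, zip);
                           zip.closeEntry();
                       } catch (IOException e) {
                           throw new UncheckedIOException(e);
                       }
                   });
               }
           }
       }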

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or higher. If necessary, contact your MediaSPIP administrator to find out.

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

On other sites (10816)

  • Use FFmpeg in Android for playing video

    14 February 2017, by devxcon

    I am trying to use FFmpeg in Android. Here is the code so far; I took this project as a reference. It only lets me convert a video file, but I want to play a video file using FFmpeg. Is that possible? If yes, how can it be done? (One possible playback approach is sketched after the code below.)

    package com.ffmpeg;

    import android.os.Bundle;
    import android.support.v7.app.AppCompatActivity;

    import com.github.hiteshsondhi88.libffmpeg.ExecuteBinaryResponseHandler;
    import com.github.hiteshsondhi88.libffmpeg.FFmpeg;
    import com.github.hiteshsondhi88.libffmpeg.LoadBinaryResponseHandler;
    import com.github.hiteshsondhi88.libffmpeg.exceptions.FFmpegCommandAlreadyRunningException;
    import com.github.hiteshsondhi88.libffmpeg.exceptions.FFmpegNotSupportedException;

    public class MainActivity extends AppCompatActivity {


       Boolean loadedFlag = false;

       @Override
       protected void onCreate(Bundle savedInstanceState) {
           super.onCreate(savedInstanceState);
           setContentView(R.layout.activity_main);
           if (!loadedFlag) {
               FFmpegInitLoader();
           }
           // Note: loadBinary() is asynchronous, so the FFmpeg binary may not
           // be ready yet when decodeVideo() runs here.
           decodeVideo();
       }

       public void FFmpegInitLoader() {
           FFmpeg ffmpeg = FFmpeg.getInstance(this);
           try {
               ffmpeg.loadBinary(new LoadBinaryResponseHandler() {
                   @Override
                   public void onStart() {
                   }

                   @Override
                   public void onFailure() {
                   }

                   @Override
                   public void onSuccess() {
                       System.out.println("Successfully loaded FFmpeg!!!");
                       loadedFlag = true;
                   }

                   @Override
                   public void onFinish() {
                   }
               });
           } catch (FFmpegNotSupportedException e) {
               System.out.println("Whatever....this thing is not supported :::::::::::::::::::: ");
           }
       }

       public void decodeVideo() {
           FFmpeg ffmpeg = FFmpeg.getInstance(this);
           try {
               ffmpeg.execute(new String[]{"-y", "-i", "/storage/sdcard0/AVSEQ02.mp4", "-c:v", "libx264", "/storage/sdcard0/conv.mp4"}, new ExecuteBinaryResponseHandler() {
                   @Override
                   public void onStart() {
                       System.out.println("FFmpeg started for decoding");
                   }

                   @Override
                   public void onProgress(String message) {
                       System.out.println("progress message:::: " + message);
                   }

                   @Override
                   public void onFailure(String message) {
                       System.out.println("failure message:::: " + message);
                   }

                   @Override
                   public void onSuccess(String message) {
                       System.out.println("success message:::: " + message);
                   }

                   @Override
                   public void onFinish() {
                   }
               });
           } catch (FFmpegCommandAlreadyRunningException e) {
               System.out.println("already running::::::");
           }
       }
    }
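
    The libffmpeg wrapper used above only executes ffmpeg commands; it does not render video itself. As a minimal sketch (with a hypothetical activity_player layout and video_view id), playback of the converted file would usually be handed to a platform player such as VideoView:

       import android.net.Uri;
       import android.os.Bundle;
       import android.support.v7.app.AppCompatActivity;
       import android.widget.MediaController;
       import android.widget.VideoView;

       public class PlayerActivity extends AppCompatActivity {
           @Override
           protected void onCreate(Bundle savedInstanceState) {
               super.onCreate(savedInstanceState);
               setContentView(R.layout.activity_player);  // hypothetical layout holding the VideoView

               VideoView videoView = (VideoView) findViewById(R.id.video_view);  // hypothetical id
               videoView.setMediaController(new MediaController(this));
               // Play the file produced by decodeVideo() above.
               videoView.setVideoURI(Uri.parse("file:///storage/sdcard0/conv.mp4"));
               videoView.start();  // playback begins once the video is prepared
           }
       }
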
  • Playing video in Android using JavaCV FFmpeg

    20 February 2017, by d91

    I'm trying to play a video stored on the SD card using JavaCV. The following is my code:

    public class MainActivity extends AppCompatActivity {

       @Override
       protected void onCreate(Bundle savedInstanceState) {
           super.onCreate(savedInstanceState);
           setContentView(R.layout.activity_main);
           playthevideo();
       }

       protected void playthevideo() {

          String imageInSD = "/storage/080A-0063/dama/" + "test3.mp4";

          FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(imageInSD);

          AndroidFrameConverter converterToBitmap = new AndroidFrameConverter();

          OpenCVFrameConverter.ToIplImage converterToipi = new OpenCVFrameConverter.ToIplImage();

              try {
                  Log.d("Tag", "try");
                  grabber.start();
                  Log.d("Tag", "started");

                  int i = 0;
                  IplImage grabbedImage = null;

                  ImageView mimg = (ImageView) findViewById(R.id.a);

                  grabber.setFrameRate(grabber.getFrameRate());
                  ArrayList<Bitmap> bitmapArray = new ArrayList<Bitmap>();
                  while (((grabbedImage = converterToipi.convert(grabber.grabImage())) != null)) {

                      Log.d("Tag", String.valueOf(i));

                      if (grabbedImage == null) {
                          // Guard before use; the while condition already filters nulls.
                          Log.d("Tag", "error");
                          continue;
                      }

                      int width = grabbedImage.width();

                      int height = grabbedImage.height();

                      IplImage container = IplImage.create(width, height, IPL_DEPTH_8U, 4);

                      cvCvtColor(grabbedImage, container, CV_BGR2RGBA);

                      Bitmap bitmapnew = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);

                      bitmapnew.copyPixelsFromBuffer(container.getByteBuffer());

                      if (bitmapnew == null) {
                          Log.d("Tag", "bit error");
                      }

                      mimg.setImageBitmap(bitmapnew);

                      mimg.requestFocus();

                      i++;
                  }

                  Log.d("Tag", "go");

              } catch (Exception e) {
                  // Don't swallow errors silently; log them.
                  Log.e("Tag", "playback failed", e);
              }
        }
    }

    Just ignore the log tags; they are only there for my testing purposes.
    When I run this code, the main activity layout is still loading while the Android monitor shows the value of "i" (the current frame number); then, after frame number 3671, the code suddenly exits the while loop and the ImageView shows a frame that is not the end frame of the video (it is somewhere near the start of the video).
    I was unable to find another way to show the frames grabbed by FFmpegFrameGrabber, so I decided to display them in an ImageView like this. Can anybody tell me why I am getting this behaviour, or suggest another approach to play and show the video in an Android activity? (One workaround is sketched below.)
    By the way, JavaCV 1.3.1 is imported correctly into my Android dev environment. Thanks.
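
    One workaround, as a minimal sketch (assuming imageInSD and mimg from the question are fields or effectively final, plus the usual org.bytedeco.javacv imports): grab on a background thread and post each converted Bitmap to the UI thread, so the layout can actually render between frames. AndroidFrameConverter can turn a grabbed Frame directly into a Bitmap, which avoids the manual cvCvtColor/copyPixelsFromBuffer step:

       new Thread(new Runnable() {
           @Override
           public void run() {
               try {
                   FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(imageInSD);
                   AndroidFrameConverter toBitmap = new AndroidFrameConverter();
                   grabber.start();
                   // Pace playback roughly at the stream's frame rate.
                   double fps = grabber.getFrameRate();
                   long frameDelayMs = fps > 0 ? (long) (1000 / fps) : 33;
                   Frame frame;
                   while ((frame = grabber.grabImage()) != null) {
                       final Bitmap bmp = toBitmap.convert(frame);
                       runOnUiThread(new Runnable() {
                           @Override
                           public void run() {
                               mimg.setImageBitmap(bmp);
                           }
                       });
                       Thread.sleep(frameDelayMs);
                   }
                   grabber.stop();
               } catch (Exception e) {
                   Log.e("Tag", "playback failed", e);
               }
           }
       }).start();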

  • FFMPEG Android Camera Streaming to RTMP

    2 February 2017, by Omer Abbas

    I need help streaming the Android camera to an RTMP server using FFmpeg. I have compiled FFmpeg for Android as a shared library, and everything on the FFmpeg side works perfectly fine: I have streamed an existing video file to RTMP and it works great. I then used the camera, wrote its raw video to a file, and streamed that file to the RTMP server while it was still being written; this also works, but the file keeps growing. I have read that MediaRecorder can stream data to FFmpeg over a LocalSocket (a Unix domain socket), so I took the MediaRecorder sample from https://github.com/googlesamples/android-MediaRecorder and modified it to use a local socket:

    receiver = new LocalSocket();
    try {
        localSocketServer = new LocalServerSocket("camera2rtsp");

        // FileDescriptor the Camera can send to
        sender = localSocketServer.accept();
        sender.setReceiveBufferSize(500000);
        sender.setSendBufferSize(500000);

    } catch (IOException e1) {
        e1.printStackTrace();
        super.onResume();
        finish();
        //return;
    }

    mMediaRecorder.setOutputFile(sender.getFileDescriptor());

    I tried to access this socket with an ffmpeg command, but it failed with the error "unix://camera2rtsp: no directory or file found":

    ffmpeg -i unix://camera2rtsp -vcodec libx264 -f flv rtmp://server

    So I tried a ParcelFileDescriptor pipe:

    pipe = getPipe();
    ParcelFileDescriptor parcelWrite = new ParcelFileDescriptor(pipe[1]);
    mMediaRecorder.setOutputFile(parcelWrite.getFileDescriptor());

    with the command:

    "ffmpeg -re -r 30 -f rawvideo -i pipe :"+ pipe[0].getfd() +" -vcodec libx264 -f flv rtmp ://server"

    But this does not work either; ffmpeg seems to be reading an empty pipe and gives a warning about the size of Stream #0:0.

    Also, MediaRecorder with a ParcelFileDescriptor as setOutputFile gives the error "E/MediaRecorder: start failed: -2147483648" on a Galaxy S7, but works on a Motorola phone running KitKat.

    I might have been using the pipe or socket incorrectly. If someone has any idea or experience using FFmpeg streaming with the Android camera, please help me.

    I have figured out that MediaRecorder encodes in MP4 format, which is not a seekable or streamable format while recording, because its headers/metadata are only finalized at the end.

    So I read a tutorial, http://www.oodlestechnologies.com/blogs/Stream-video-from-camera-preview-to-wowza-server-in-Android, which shows that you can write raw video to a file and pass that file to ffmpeg. It works great, but the file size keeps increasing. So my question is: can I pass the camera's raw video to ffmpeg with a ParcelFileDescriptor pipe or a LocalSocket? (A sketch of the pipe idea follows the code below.)

    Code:

    File f = new File(Environment.getExternalStorageDirectory() + "/test.data");
    if (!f.exists()) {
        f.createNewFile();
    }

    OutputStream outStream = new FileOutputStream(f);

    Camera.Parameters parameters = mCamera.getParameters();
    int imageFormat = parameters.getPreviewFormat();
    if (imageFormat == ImageFormat.NV21) {
        Camera.Size previewSize = parameters.getPreviewSize();
        int frameWidth = previewSize.width;
        int frameHeight = previewSize.height;

        Rect rect = new Rect(0, 0, frameWidth, frameHeight);
        YuvImage img = new YuvImage(arg0, ImageFormat.NV21, frameWidth, frameHeight, null);

        // Append the raw NV21 preview frame (arg0) to the ever-growing file.
        outStream.write(arg0);
        outStream.flush();

        //img.compressToJpeg(rect, 50, processIn);
    }
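
    A minimal sketch of the pipe idea (assumptions: an ffmpeg binary already shipped with the app at a known path, NV21 preview frames at roughly 30 fps, and a placeholder RTMP URL): write each raw preview frame straight into the ffmpeg process's stdin ("pipe:0") instead of appending it to a growing file, so nothing accumulates on disk.

       import android.hardware.Camera;
       import android.util.Log;

       import java.io.IOException;
       import java.io.OutputStream;

       // Hypothetical helper: pipes NV21 preview frames into ffmpeg's stdin.
       public class RawPipeStreamer implements Camera.PreviewCallback {

           private OutputStream ffmpegIn;

           public void start(String ffmpegPath, int width, int height) throws IOException {
               Process proc = new ProcessBuilder(
                       ffmpegPath,                                  // path to the bundled ffmpeg binary
                       "-f", "rawvideo", "-pix_fmt", "nv21",
                       "-s", width + "x" + height, "-r", "30",
                       "-i", "pipe:0",                              // read raw frames from stdin
                       "-c:v", "libx264", "-pix_fmt", "yuv420p",
                       "-f", "flv", "rtmp://server/live/stream")    // placeholder URL
                       .start();
               ffmpegIn = proc.getOutputStream();
           }

           @Override
           public void onPreviewFrame(byte[] data, Camera camera) {
               try {
                   ffmpegIn.write(data);  // one NV21 frame per callback
                   ffmpegIn.flush();
               } catch (IOException e) {
                   Log.e("Stream", "ffmpeg pipe closed", e);
               }
           }
       }

    Registering this callback with mCamera.setPreviewCallback(...) would take the place of the FileOutputStream in the snippet above, so the frames never touch the SD card.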