
Media (91)

Other articles (32)

  • No talk of markets, the cloud, etc.

    10 April 2011

    The vocabulary used on this site tries to avoid any reference to the fashions that flourish so freely
    on Web 2.0 and in the companies that live off it.
    You are therefore invited to avoid using terms such as "Brand", "Cloud", "Market", etc.
    Our motivation is above all to create a simple tool, accessible to everyone, that encourages
    the sharing of creations on the Internet and lets authors keep as much autonomy as possible.
    No "Gold or Premium contract" is therefore planned, no (...)

  • Selection of projects using MediaSPIP

    2 May 2011

    The examples below are representative of specific uses of MediaSPIP for particular projects.
    MediaSPIP farm @ Infini
    The non-profit organization Infini develops hospitality activities, an Internet access point, training, innovative projects in the field of information and communication technologies, and website hosting. It plays a unique and prominent role in the Brest (France) area and, at the national level, among the half-dozen associations of its kind. Its members (...)

  • General document management

    13 May 2011

    MediaSPIP never modifies the original document that is uploaded.
    For each uploaded document it performs two successive operations: creating an additional version that can easily be viewed online, while keeping the original available for download in case it cannot be read in a web browser; and retrieving the metadata of the original document to describe the file textually.
    The tables below explain what MediaSPIP can do (...)

On other sites (4301)

  • Android FFMpeg guardian project IOException: Permission Denied

    30 December 2016, by sector11

    I have cloned and compiled the ffmpeg binary from https://github.com/guardianproject/android-ffmpeg and I am using it in an Android project, following the reference project at https://github.com/guardianproject/android-ffmpeg-java.

    But I am not able to get the output of the -version command; it fails with Caused by: java.io.IOException: Permission denied.

    The error logs are as follows:

    Permission changed now
    12-30 02:30:20.083 3765-3765/in.ashish29agre.ffmpeg I/System.out: Permission changed now
    12-30 02:30:20.083 3765-3765/in.ashish29agre.ffmpeg I/System.out: Binary path: /data/data/in.ashish29agre.ffmpeg/app_bin/ffmpeg
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg I/TestActivity: Progress: -version
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err: java.io.IOException: Error running exec(). Command: [-version] Working Directory: /data/data/in.ashish29agre.ffmpeg/app_bin Environment: [ANDROID_ROOT=/system, LOOP_MOUNTPOINT=/mnt/obb, ANDROID_BOOTLOGO=1, LD_LIBRARY_PATH=/vendor/lib:/system/lib, EXTERNAL_STORAGE=/storage/sdcard, ANDROID_SOCKET_zygote=9, ANDROID_DATA=/data, PATH=/sbin:/vendor/bin:/system/sbin:/system/bin:/system/xbin, ANDROID_ASSETS=/system/app, ASEC_MOUNTPOINT=/mnt/asec, BOOTCLASSPATH=/system/framework/core.jar:/system/framework/conscrypt.jar:/system/framework/okhttp.jar:/system/framework/core-junit.jar:/system/framework/bouncycastle.jar:/system/framework/ext.jar:/system/framework/framework.jar:/system/framework/framework2.jar:/system/framework/telephony-common.jar:/system/framework/voip-common.jar:/system/framework/mms-common.jar:/system/framework/android.policy.jar:/system/framework/services.jar:/system/framework/apache-xml.jar:/system/framework/webviewchromium.jar, ANDROID_PROPERTY_WORKSPACE=8,0, ANDROID_STORAGE=/storage]
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at java.lang.ProcessManager.exec(ProcessManager.java:211)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at java.lang.ProcessBuilder.start(ProcessBuilder.java:195)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at org.ffmpeg.android.FfmpegController.execProcess(FfmpegController.java:137)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at org.ffmpeg.android.FfmpegController.execFFMPEG(FfmpegController.java:101)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at org.ffmpeg.android.FfmpegController.execFFMPEG(FfmpegController.java:111)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at in.ashish29agre.ffmpeg.TestActivity.onResume(TestActivity.java:37)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at android.app.Instrumentation.callActivityOnResume(Instrumentation.java:1192)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at android.app.Activity.performResume(Activity.java:5310)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at android.app.ActivityThread.performResumeActivity(ActivityThread.java:2778)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at android.app.ActivityThread.handleResumeActivity(ActivityThread.java:2817)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2250)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at android.app.ActivityThread.access$800(ActivityThread.java:135)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1196)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at android.os.Handler.dispatchMessage(Handler.java:102)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at android.os.Looper.loop(Looper.java:136)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at android.app.ActivityThread.main(ActivityThread.java:5017)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at java.lang.reflect.Method.invokeNative(Native Method)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at java.lang.reflect.Method.invoke(Method.java:515)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:779)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:595)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at dalvik.system.NativeStart.main(Native Method)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err: Caused by: java.io.IOException: Permission denied
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at java.lang.ProcessManager.exec(Native Method)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:     at java.lang.ProcessManager.exec(ProcessManager.java:209)
    12-30 02:30:20.093 3765-3765/in.ashish29agre.ffmpeg W/System.err:   ... 20 more

    My Activity code is as follows:

    @Override
       protected void onResume() {
           super.onResume();
           File fileTmp = getCacheDir();
           File fileAppRoot = new File(getApplicationInfo().dataDir);
           List<String> commands = new ArrayList<>();
           commands.add("-version");
           try {
               FfmpegController ffmpegController = new FfmpegController(this, fileAppRoot);
               ffmpegController.installBinaries(this, true);
               System.out.println("Binary path: " + ffmpegController.getBinaryPath());

               ffmpegController.execFFMPEG(commands, new ShellUtils.ShellCallback() {
                   @Override
                   public void shellOut(String shellLine) {
                       logger.info("Progress: " + shellLine);

                   }

                   @Override
                   public void processComplete(int exitValue) {
                       logger.info("Process complete");
                   }
               });
           } catch (IOException e) {
               e.printStackTrace();
           } catch (InterruptedException e) {
               e.printStackTrace();
           }
       }

    Just to make sure: I added the read and write external storage permissions in the manifest. Here is the AndroidManifest.xml:

     <?xml version="1.0" encoding="utf-8"?>
     <manifest xmlns:android="http://schemas.android.com/apk/res/android"
         package="in.test.ffmpeg">

         <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
         <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />

         <application>
             <activity android:name=".TestActivity">
                 <intent-filter>
                     <action android:name="android.intent.action.MAIN" />

                     <category android:name="android.intent.category.LAUNCHER" />
                 </intent-filter>
             </activity>
         </application>

     </manifest>
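
    The logcat above shows that the command list passed to exec() contains only [-version], without the ffmpeg binary path, and that the call fails with Permission denied before ffmpeg ever runs. Below is a minimal diagnostic sketch, using only plain Java process APIs and the binary path printed above; the helper method name is hypothetical and not part of the android-ffmpeg-java library. It simply checks whether the installed binary can be executed directly:

    // Hypothetical diagnostic helper, independent of FfmpegController's own exec path.
    private void runFfmpegVersionDirectly(String binaryPath) throws IOException, InterruptedException {
        File ffmpegBin = new File(binaryPath);
        // "Permission denied" (EACCES) on exec usually means the execute bit is missing.
        if (!ffmpegBin.canExecute()) {
            ffmpegBin.setExecutable(true, false);
        }

        ProcessBuilder pb = new ProcessBuilder(ffmpegBin.getAbsolutePath(), "-version");
        pb.redirectErrorStream(true);
        Process process = pb.start();

        BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
        String line;
        while ((line = reader.readLine()) != null) {
            Log.i("TestActivity", line); // prints the ffmpeg version banner if the binary runs
        }
        process.waitFor();
    }

    Called, for example, as runFfmpegVersionDirectly(ffmpegController.getBinaryPath()) from inside the existing try block in onResume().
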
  • FFMPEG Android Camera Streaming to RTMP

    2 February 2017, by Omer Abbas

    I need help streaming the Android camera to an RTMP server using FFmpeg. I have compiled FFmpeg for Android as a shared library, and everything on the FFmpeg side works perfectly fine. I have tried streaming an already existing video file to RTMP and it works great. Then I used the camera, wrote raw video to a file and streamed that file to the RTMP server while the raw camera video was still being written to it; that also works, but the issue is that the file size keeps growing. I have read that MediaRecorder can stream data to ffmpeg through a LocalSocket used as a Unix domain socket, so I took the MediaRecorder sample from https://github.com/googlesamples/android-MediaRecorder and modified it to use a local socket:

    receiver = new LocalSocket();
       try {

           localSocketServer = new LocalServerSocket("camera2rtsp");

           // FileDescriptor the Camera can send to
           sender = localSocketServer.accept();
           sender.setReceiveBufferSize(500000);
           sender.setSendBufferSize(500000);

       } catch (IOException e1) {
           e1.printStackTrace();
           super.onResume();
           finish();
           //return;
       }

       mMediaRecorder.setOutputFile(sender.getFileDescriptor());

    I tried to access this socket with an ffmpeg command, but it failed with the error unix://camera2rtsp: no directory or file found.

    ffmpeg -i unix://camera2rtsp -vcodec libx264 -f flv rtmp://server

    So I tried a ParcelFileDescriptor pipe:

    pipe = getPipe();
    ParcelFileDescriptor parcelWrite  = new ParcelFileDescriptor(pipe[1]);
    mMediaRecorder.setOutputFile(parcelWrite.getFileDescriptor());

    with the command:

    "ffmpeg -re -r 30 -f rawvideo -i pipe:" + pipe[0].getFd() + " -vcodec libx264 -f flv rtmp://server"

    But that is not working either; ffmpeg seems to be trying to read an empty pipe and gives a warning about the size of Stream #0:0.

    Also, MediaRecorder with setOutputFile given a ParcelFileDescriptor fails with "E/MediaRecorder: start failed: -2147483648" on a Galaxy S7, but works on a Motorola phone running KitKat.

    I may have been using the pipe or the socket incorrectly. If someone has any idea of, or experience with, streaming the Android camera through ffmpeg, please help me.

    I have figured out that MediaRecorder encodes in MP4, which is not a seekable or streamable format because of its headers and metadata.

    So I read the tutorial http://www.oodlestechnologies.com/blogs/Stream-video-from-camera-preview-to-wowza-server-in-Android, which shows that we can write raw video to a file and pass that file to ffmpeg. It works great, but the issue is that the file size keeps increasing. So now my question is: can I pass raw camera video to ffmpeg through a ParcelFileDescriptor pipe or a LocalSocket? A sketch of that idea follows the code below.

    Code:

    // arg0 is the raw NV21 byte[] delivered to the camera preview callback.
    File f = new File(Environment.getExternalStorageDirectory() + "/test.data");
    if (!f.exists()) {
        f.createNewFile();
    }

    OutputStream outStream = new FileOutputStream(f);

    Camera.Parameters parameters = mCamera.getParameters();
    int imageFormat = parameters.getPreviewFormat();
    if (imageFormat == ImageFormat.NV21) {
        Camera.Size previewSize = parameters.getPreviewSize();
        int frameWidth = previewSize.width;
        int frameHeight = previewSize.height;

        Rect rect = new Rect(0, 0, frameWidth, frameHeight);
        YuvImage img = new YuvImage(arg0, ImageFormat.NV21, frameWidth, frameHeight, null);

        // Append each raw frame to the file, which is why the file keeps growing.
        outStream.write(arg0);
        outStream.flush();

        //img.compressToJpeg(rect, 50, processIn);
    }
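
    One possible variation on the snippet above, sketched purely as an assumption and not a tested setup: write the NV21 preview bytes into the write side of a ParcelFileDescriptor pipe instead of an ever-growing file, and hand the read side's descriptor to ffmpeg (for example through the pipe: input used in the command earlier). ParcelFileDescriptor.createPipe() and AutoCloseOutputStream are standard Android APIs; the field names and the wiring are assumptions.

    // Hypothetical sketch: feed raw NV21 preview frames into a pipe instead of a file.
    private ParcelFileDescriptor readSide;   // pass readSide.getFd() to the "pipe:<fd>" ffmpeg input
    private OutputStream pipeOut;

    private void setUpPipe() throws IOException {
        ParcelFileDescriptor[] pipe = ParcelFileDescriptor.createPipe();
        readSide = pipe[0];
        pipeOut = new ParcelFileDescriptor.AutoCloseOutputStream(pipe[1]);
    }

    private final Camera.PreviewCallback previewCallback = new Camera.PreviewCallback() {
        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {
            try {
                // Each raw NV21 frame goes into the pipe; ffmpeg drains the other end,
                // so nothing accumulates on disk. Writes block if the reader falls behind.
                pipeOut.write(data);
                pipeOut.flush();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    };
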
  • playing video in android using javacv ffmpeg

    20 February 2017, by d91

    I’m trying to play a video stored on the SD card using JavaCV. Following is my code:

    public class MainActivity extends AppCompatActivity {

       @Override
       protected void onCreate(Bundle savedInstanceState) {
           super.onCreate(savedInstanceState);
           setContentView(R.layout.activity_main);
           playthevideo();
       }

       protected void playthevideo() {

          String imageInSD = "/storage/080A-0063/dama/" + "test3.mp4";

          FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(imageInSD);

          AndroidFrameConverter converterToBitmap = new AndroidFrameConverter();

          OpenCVFrameConverter.ToIplImage converterToipi = new OpenCVFrameConverter.ToIplImage();

              try {
                  Log.d("Tag", "try");
                  grabber.start();
                  Log.d("Tag", "started");

                  int i = 0;
                  IplImage grabbedImage = null;

                  ImageView mimg = (ImageView) findViewById(R.id.a);

                  grabber.setFrameRate(grabber.getFrameRate());
                   ArrayList<Bitmap> bitmapArray = new ArrayList<Bitmap>();
                  while (((grabbedImage = converterToipi.convert(grabber.grabImage())) != null)) {

                      Log.d("Tag", String.valueOf(i));

                      int width = grabbedImage.width();

                      int height = grabbedImage.height();

                      if (grabbedImage == null) {
                          Log.d("Tag", "error");
                      }

                      IplImage container = IplImage.create(width, height, IPL_DEPTH_8U, 4);

                      cvCvtColor(grabbedImage, container, CV_BGR2RGBA);

                      Bitmap bitmapnew = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);

                      bitmapnew.copyPixelsFromBuffer(container.getByteBuffer());

                      if (bitmapnew == null) {
                          Log.d("Tag", "bit error");
                      }

                      mimg.setImageBitmap(bitmapnew);

                      mimg.requestFocus();

                      i++;
                  }

                  Log.d("Tag", "go");

              }
              catch(Exception e) {

             }
        }
    }

    Just ignore the log tags; they are only there for my testing purposes.
    When I run this code, the main activity layout is still loading while the Android monitor shows the value of "i" (the current frame number); then, suddenly, after frame number 3671 the code exits the while loop and the ImageView shows a frame that is not the end frame of the video (it is somewhere around the start of the video).
    I was unable to find a way to display the frames grabbed by FFmpegFrameGrabber, so I decided to show them in an ImageView this way. Can anybody tell me why this happens, or suggest another, error-free way to play and show the video in an Android activity?
    BTW, JavaCV 1.3.1 is imported correctly into my Android development environment. Thanks.
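
    For reference, a sketch of one way this is often restructured, assuming the same JavaCV classes as in the snippet above (plus org.bytedeco.javacv.Frame): the while loop above runs in onCreate() on the main thread, so the ImageView is never redrawn until grabbing finishes. Grabbing on a background thread and posting each converted Bitmap to the UI thread avoids that, and AndroidFrameConverter.convert(Frame) returns a Bitmap directly, so the IplImage/cvCvtColor round trip is not needed just for display.

    // Hypothetical sketch: grab on a background thread, draw on the UI thread.
    protected void playthevideo() {
        final String imageInSD = "/storage/080A-0063/dama/" + "test3.mp4";
        final ImageView mimg = (ImageView) findViewById(R.id.a);

        new Thread(new Runnable() {
            @Override
            public void run() {
                FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(imageInSD);
                AndroidFrameConverter converterToBitmap = new AndroidFrameConverter();
                try {
                    grabber.start();
                    Frame frame;
                    while ((frame = grabber.grabImage()) != null) {
                        // The converter may reuse its Bitmap between frames, so copy it before posting.
                        final Bitmap bitmap = converterToBitmap.convert(frame)
                                .copy(Bitmap.Config.ARGB_8888, false);
                        runOnUiThread(new Runnable() {
                            @Override
                            public void run() {
                                mimg.setImageBitmap(bitmap);
                            }
                        });
                        // Crude pacing; a real player would sync to the stream timestamps.
                        Thread.sleep((long) (1000 / grabber.getFrameRate()));
                    }
                    grabber.stop();
                } catch (Exception e) {
                    Log.d("Tag", "playback failed", e);
                }
            }
        }).start();
    }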