
Other articles (79)

  • Contribute to documentation

    13 April 2011

    Documentation is vital to the development of improved technical capabilities.
    MediaSPIP welcomes documentation by users as well as developers - including:
    - critique of existing features and functions
    - articles contributed by developers, administrators, content producers and editors
    - screenshots to illustrate the above
    - translations of existing documentation into other languages
    To contribute, register to the project users’ mailing (...)

  • User profiles

    12 April 2011

    Each user has a profile page that lets them edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialised; it is visible only when the visitor is logged in to the site.
    The user can edit their profile from their author page; a "Modifier votre profil" (Edit your profile) link in the navigation is (...)

  • Selection of projects using MediaSPIP

    2 May 2011

    The examples below are representative of how MediaSPIP is used in specific projects.
    MediaSPIP farm @ Infini
    The non-profit organization Infini develops hospitality activities, an internet access point, training, and innovative projects in the field of information and communication technologies, and hosts websites. It plays a unique and prominent role in the Brest (France) area and at the national level, among the half-dozen associations of its kind. Its members (...)

On other sites (8892)

  • Issue with image rotation using JavaCV library

    26 February 2015, by intrepidkarthi

    I am writing an Android application which records video for a specified amount of time. Everything works fine if I record using the smartphone’s back camera. The app has a pause/record feature like in the Vine app. The issue comes when recording with the device’s front camera: the preview frame looks fine, but when the video is stored and played back it is upside down. There is a lot of discussion about this issue everywhere, but I didn’t find any solution that works.

    Have a look at the code and images below.

    Here is the original image taken from the front camera. I have turned it upside down for a better view.

    [image: original frame from the front camera]

    Here is what I actually get after rotation:

    [image: frame after rotation]

    Method:

        IplImage copy = cvCloneImage(image);
        IplImage rotatedImage = cvCreateImage(cvGetSize(copy), copy.depth(), copy.nChannels());
        //Define Rotational Matrix
        CvMat mapMatrix = cvCreateMat(2, 3, CV_32FC1);

        //Define Mid Point
        CvPoint2D32f centerPoint = new CvPoint2D32f();
        centerPoint.x(copy.width() / 2);
        centerPoint.y(copy.height() / 2);

        //Get Rotational Matrix
        cv2DRotationMatrix(centerPoint, angle, 1.0, mapMatrix);

        //Rotate the Image
        cvWarpAffine(copy, rotatedImage, mapMatrix, CV_INTER_CUBIC + CV_WARP_FILL_OUTLIERS, cvScalarAll(170));
        cvReleaseImage(copy);
        cvReleaseMat(mapMatrix);

    I have tried doing the following:

        double angleTemp = angle;

        angleTemp= ((angleTemp / 90)%4)*90;      
        final int number = (int) Math.abs(angleTemp/90);

        for(int i = 0; i != number; ++i){            
            cvTranspose(rotatedImage, rotatedImage);
            cvFlip(rotatedImage, rotatedImage, 0);          
        }

    This ends up throwing an exception saying that the source and destination don’t match in the number of columns and rows.
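
    That exception is consistent with cvTranspose requiring the destination to have swapped (transposed) dimensions, which an in-place call cannot satisfy for a non-square frame. As a minimal sketch (not from the original post; the helper name rotate90Clockwise and the rotation direction are assumptions), a 90-degree rotation can allocate a destination with swapped dimensions and then transpose and flip:

        // Sketch only: rotate an IplImage 90° clockwise with the old JavaCV C API
        // (static imports from org.bytedeco.javacpp.opencv_core).
        static IplImage rotate90Clockwise(IplImage src) {
            // The destination must have swapped width/height, same depth and channel count.
            IplImage dst = cvCreateImage(cvSize(src.height(), src.width()),
                    src.depth(), src.nChannels());
            cvTranspose(src, dst); // transpose into the swapped-size image
            cvFlip(dst, dst, 1);   // flip around the y-axis => 90° clockwise
            return dst;
        }

    For a 90-degree counter-clockwise rotation, flip around the x-axis (flip mode 0) after the transpose instead.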

    Update:

    The video is recorded in this way:

     IplImage newImage = null;
     if (cameraSelection == CameraInfo.CAMERA_FACING_FRONT) {
         newImage = videoRecorder.rotate(yuvIplImage, 180);
         videoRecorder.record(newImage);
     } else {
         videoRecorder.record(yuvIplImage);
     }

    Rotation is done in this way:

       IplImage img = IplImage.create(image.height(), image.width(),
               image.depth(), image.nChannels());

       for (int i = 0; i < 180; i++) {
           cvTranspose(image, img);
           cvFlip(img, img, 0);
       }

    Can anyone point out what is wrong here, if you have experienced this before?

  • Issue in concatenating two video files using FFMPEG

    12 May 2014, by intrepidkarthi

    I am trying to concatenate two MP4 files taken from the gallery. I am getting a process execution failure. I have added the code and the error log below. I am using the ffmpeg library taken from the Guardian Project.

    I am running this on a Samsung Galaxy S3 device.

    The error is thrown on this particular line:

    ProcessBuilder pb = new ProcessBuilder(cmds);
    pb.directory(fileExec);
    Process process = pb.start();  

    When I replace the last line above with this,

    Process process = Runtime.getRuntime().exec("chmod 777 "+cmds.toArray(new String[cmds.size()]));

    It works without throwing the exception shown below, but the output doesn’t seem to appear.
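
    Note that "chmod 777 " + cmds.toArray(new String[cmds.size()]) concatenates the array object’s toString() into the command string, so that call almost certainly does not do what was intended. Given the "Permission denied" at the bottom of the error log below, a sketch of what was probably intended is to mark the ffmpeg binary itself executable before launching it (the binary’s location under fileAppRoot is an assumption, not taken from the original code):

        // Sketch only: ensure the bundled ffmpeg binary is executable, then launch it.
        File ffmpegBin = new File(fileAppRoot, "ffmpeg");  // assumed install location
        if (!ffmpegBin.canExecute()) {
            ffmpegBin.setExecutable(true);                 // Java equivalent of "chmod +x"
        }
        ProcessBuilder pb = new ProcessBuilder(cmds);      // cmds as shown in the error log
        pb.directory(fileExec);
        Process process = pb.start();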

    File concatenation code:

       File fileVideoOutput = new File(getApplicationContext()
               .getExternalFilesDir("test") + "hello.mp4");
       fileVideoOutput.delete();

       File fileTmp = getApplicationContext().getCacheDir();
       File fileAppRoot = new File(getApplicationContext()
               .getApplicationInfo().dataDir);

       try {
           FfmpegController fc = new FfmpegController(fileTmp, fileAppRoot);

           ArrayList<Clip> listVideos = new ArrayList<Clip>();
           Clip clip = new Clip();
           clip.path = video1;
           fc.getInfo(clip);
           clip.duration = clip.duration;
           System.out.println("Clip1 duration " + clip.duration);
           listVideos.add(clip);

           Clip clip2 = new Clip();
           clip2.path = video2;
           fc.getInfo(clip2);
           clip2.duration = clip2.duration;
           System.out.println("Clip2 duration " + clip2.duration);
           listVideos.add(clip2);

           Clip clipOut = new Clip();
           clipOut.path = fileVideoOutput.getCanonicalPath();

           fc.concatAndTrimFilesMP4Stream(listVideos, clipOut, false, false,
                   new ShellUtils.ShellCallback() {

                       @Override
                       public void shellOut(String shellLine) {

                           System.out.println("fc>" + shellLine);
                       }

                       @Override
                       public void processComplete(int exitValue) {

                            if (exitValue < 0)
                               System.err.println("concat non-zero exit: "
                                       + exitValue);
                       }
                   });
       } catch (Exception e1) {
           e1.printStackTrace();
       }

    Error log:

    05-08 11:17:03.765: W/System.err(25209): java.io.IOException: Error running exec(). Command: [ffmpeg, -y, -i, /storage/emulated/0/DCIM/Camera/20140507_155713.mp4, -f, mpegts, -c, copy, -an, -bsf:v, h264_mp4toannexb, /data/data/com.yoyo.videoeditor/cache/0.ts] Working Directory: /data/data/com.yoyo.videoeditor/lib Environment: [VIBE_PIPE_PATH=/dev/pipes, ANDROID_ROOT=/system, EMULATED_STORAGE_SOURCE=/mnt/shell/emulated, LOOP_MOUNTPOINT=/mnt/obb, EMULATED_STORAGE_TARGET=/storage/emulated, ANDROID_BOOTLOGO=1, LD_LIBRARY_PATH=/vendor/lib:/system/lib, EXTERNAL_STORAGE=/storage/emulated/legacy, ANDROID_SOCKET_zygote=9, ANDROID_DATA=/data, PATH=/sbin:/vendor/bin:/system/sbin:/system/bin:/system/xbin, ANDROID_ASSETS=/system/app, ASEC_MOUNTPOINT=/mnt/asec, BOOTCLASSPATH=/system/framework/core.jar:/system/framework/core-junit.jar:/system/framework/bouncycastle.jar:/system/framework/ext.jar:/system/framework/framework.jar:/system/framework/framework2.jar:/system/framework/telephony-common.jar:/system/framework/voip-common.jar:/system/framework/mms-common.jar:/system/framework/android.policy.jar:/system/framework/services.jar:/system/framework/apache-xml.jar:/system/framework/sec_edm.jar:/system/framework/seccamera.jar:/system/framework/scrollpause.jar:/system/framework/stayrotation.jar:/system/framework/smartfaceservice.jar:/system/framework/sc.jar:/system/framework/secocsp.jar:/system/framework/commonimsinterface.jar, ANDROID_PROPERTY_WORKSPACE=8,66560, SECONDARY_STORAGE=/storage/extSdCard:/storage/UsbDriveA:/storage/UsbDriveB:/storage/UsbDriveC:/storage/UsbDriveD:/storage/UsbDriveE:/storage/UsbDriveF, ANDROID_STORAGE=/storage]
    05-08 11:17:03.770: W/System.err(25209):    at java.lang.ProcessManager.exec(ProcessManager.java:211)
    05-08 11:17:03.770: W/System.err(25209):    at java.lang.ProcessBuilder.start(ProcessBuilder.java:195)
    05-08 11:17:03.770: W/System.err(25209):    at org.ffmpeg.android.FfmpegController.execProcess(FfmpegController.java:101)
    05-08 11:17:03.770: W/System.err(25209):    at org.ffmpeg.android.FfmpegController.execFFMPEG(FfmpegController.java:71)
    05-08 11:17:03.770: W/System.err(25209):    at org.ffmpeg.android.FfmpegController.execFFMPEG(FfmpegController.java:75)
    05-08 11:17:03.775: W/System.err(25209):    at org.ffmpeg.android.FfmpegController.convertToMP4Stream(FfmpegController.java:657)
    05-08 11:17:03.775: W/System.err(25209):    at org.ffmpeg.android.FfmpegController.concatAndTrimFilesMP4Stream(FfmpegController.java:1107)
    05-08 11:17:03.775: W/System.err(25209):    at com.yoyo.videoeditor.EditorActivity.mergeVideosOld(EditorActivity.java:271)
    05-08 11:17:03.775: W/System.err(25209):    at com.yoyo.videoeditor.EditorActivity.access$0(EditorActivity.java:243)
    05-08 11:17:03.775: W/System.err(25209):    at com.yoyo.videoeditor.EditorActivity$3.onClick(EditorActivity.java:85)
    05-08 11:17:03.775: W/System.err(25209):    at android.view.View.performClick(View.java:4475)
    05-08 11:17:03.775: W/System.err(25209):    at android.view.View$PerformClick.run(View.java:18786)
    05-08 11:17:03.780: W/System.err(25209):    at android.os.Handler.handleCallback(Handler.java:730)
    05-08 11:17:03.780: W/System.err(25209):    at android.os.Handler.dispatchMessage(Handler.java:92)
    05-08 11:17:03.780: W/System.err(25209):    at android.os.Looper.loop(Looper.java:176)
    05-08 11:17:03.780: W/System.err(25209):    at android.app.ActivityThread.main(ActivityThread.java:5419)
    05-08 11:17:03.780: W/System.err(25209):    at java.lang.reflect.Method.invokeNative(Native Method)
    05-08 11:17:03.785: W/System.err(25209):    at java.lang.reflect.Method.invoke(Method.java:525)
    05-08 11:17:03.785: W/System.err(25209):    at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1046)
    05-08 11:17:03.785: W/System.err(25209):    at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:862)
    05-08 11:17:03.785: W/System.err(25209):    at dalvik.system.NativeStart.main(Native Method)
    05-08 11:17:03.785: W/System.err(25209): Caused by: java.io.IOException: Permission denied
    05-08 11:17:03.785: W/System.err(25209):    at java.lang.ProcessManager.exec(Native Method)
    05-08 11:17:03.790: W/System.err(25209):    at java.lang.ProcessManager.exec(ProcessManager.java:209)
    05-08 11:17:03.790: W/System.err(25209):    ... 20 more

    Here are my ffmpeg commands, as given on the ffmpeg tutorial page:

    ffmpeg -i input1.mp4 -c copy -bsf:v h264_mp4toannexb -f mpegts intermediate1.ts
    ffmpeg -i input2.mp4 -c copy -bsf:v h264_mp4toannexb -f mpegts intermediate2.ts
    ffmpeg -i "concat:intermediate1.ts|intermediate2.ts" -c copy -bsf:a aac_adtstoasc output.mp4

    The FFmpeg library doesn’t seem to be working as expected. When I add the commands and create the output file using the "touch" shell command, the file is created, but I still can’t see the expected output from ffmpeg.
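
    If the intermediate files are really being produced, a minimal sketch for debugging is to run one of the commands above directly through ProcessBuilder, drain its output, and check the exit code; ffmpeg writes its diagnostics to stderr, so if that stream is never read the actual failure reason is easy to miss. The ffmpegBin variable is a placeholder for the path to the binary, and exception handling is omitted:

        // Sketch only: run one ffmpeg command, echo its output, report the exit code.
        // Requires java.io.BufferedReader, java.io.InputStreamReader, java.util.Arrays, java.util.List.
        List<String> cmd = Arrays.asList(ffmpegBin.getAbsolutePath(),
                "-i", "input1.mp4", "-c", "copy",
                "-bsf:v", "h264_mp4toannexb", "-f", "mpegts", "intermediate1.ts");
        ProcessBuilder pb = new ProcessBuilder(cmd);
        pb.redirectErrorStream(true);                      // merge ffmpeg's stderr into stdout
        Process p = pb.start();
        BufferedReader reader = new BufferedReader(new InputStreamReader(p.getInputStream()));
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println("ffmpeg> " + line);         // surfaces the real failure reason
        }
        System.out.println("ffmpeg exited with " + p.waitFor());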

  • issue after video rotation how fix

    2 April 2015, by Vahagn

    I have the following code to rotate the video:

     OpenCVFrameConverter.ToIplImage converter2 = new OpenCVFrameConverter.ToIplImage();

     for (int i = firstIndex; i <= lastIndex; i++) {
         long t = timestamps[i % timestamps.length] - startTime;
         if (t >= 0) {
             if (t > recorder.getTimestamp()) {
                 recorder.setTimestamp(t);
             }
             Frame g = converter2.convert(rotate(converter2.convertToIplImage(images[i % images.length]), 90));
             recorder.record(g);
         }
     }

    images[i] is a Frame in JavaCV.
    Afterwards, the video has green lines.
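
    One thing worth checking (an assumption, since the recorder setup is not shown here): when the frames are rotated by 90 degrees, the FFmpegFrameRecorder has to be created with the width and height swapped. Feeding rotated pixel data into an encoder configured for the original dimensions misreads the pixel planes and commonly shows up as green stripes or garbage. A sketch, where outputPath, imageWidth and imageHeight are hypothetical names:

        // Sketch only: the recorder's dimensions must match the *rotated* frames.
        FFmpegFrameRecorder recorder =
                new FFmpegFrameRecorder(outputPath, imageHeight, imageWidth); // swapped for a 90° rotation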

    UPDATE
    Conversion function:

    /*
    * Copyright (C) 2015 Samuel Audet
    *
    * This file is part of JavaCV.
    *
    * JavaCV is free software: you can redistribute it and/or modify
    * it under the terms of the GNU General Public License as published by
    * the Free Software Foundation, either version 2 of the License, or
    * (at your option) any later version (subject to the "Classpath" exception
    * as provided in the LICENSE.txt file that accompanied this code).
    *
    * JavaCV is distributed in the hope that it will be useful,
    * but WITHOUT ANY WARRANTY; without even the implied warranty of
    * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    * GNU General Public License for more details.
    *
    * You should have received a copy of the GNU General Public License
    * along with JavaCV.  If not, see <http://www.gnu.org/licenses/>.
    */

    package com.example.vvardanyan.ffmpeg;

    import org.bytedeco.javacpp.BytePointer;
    import org.bytedeco.javacpp.Pointer;

    import java.nio.Buffer;

    import static org.bytedeco.javacpp.opencv_core.CV_16S;
    import static org.bytedeco.javacpp.opencv_core.CV_16U;
    import static org.bytedeco.javacpp.opencv_core.CV_32F;
    import static org.bytedeco.javacpp.opencv_core.CV_32S;
    import static org.bytedeco.javacpp.opencv_core.CV_64F;
    import static org.bytedeco.javacpp.opencv_core.CV_8S;
    import static org.bytedeco.javacpp.opencv_core.CV_8U;
    import static org.bytedeco.javacpp.opencv_core.CV_MAKETYPE;
    import static org.bytedeco.javacpp.opencv_core.IPL_DEPTH_16S;
    import static org.bytedeco.javacpp.opencv_core.IPL_DEPTH_16U;
    import static org.bytedeco.javacpp.opencv_core.IPL_DEPTH_32F;
    import static org.bytedeco.javacpp.opencv_core.IPL_DEPTH_32S;
    import static org.bytedeco.javacpp.opencv_core.IPL_DEPTH_64F;
    import static org.bytedeco.javacpp.opencv_core.IPL_DEPTH_8S;
    import static org.bytedeco.javacpp.opencv_core.IPL_DEPTH_8U;
    import static org.bytedeco.javacpp.opencv_core.IplImage;
    import static org.bytedeco.javacpp.opencv_core.Mat;

    /**
    * A utility class to map data between {@link Frame} and {@link IplImage} or {@link Mat}.
    * Since this is an abstract class, one must choose between two concrete classes:
    * {@link ToIplImage} or {@link ToMat}.
    *
    * @author Samuel Audet
    */
     public abstract class OpenCVFrameConverter<F> extends FrameConverter<F> {
       IplImage img;
       Mat mat;

        public static class ToIplImage extends OpenCVFrameConverter<IplImage> {
           @Override public IplImage convert(Frame frame) { return convertToIplImage(frame); }
       }

        public static class ToMat extends OpenCVFrameConverter<Mat> {
           @Override public Mat convert(Frame frame) { return convertToMat(frame); }
       }

       public static int getFrameDepth(int depth) {
           switch (depth) {
               case IPL_DEPTH_8U:  case CV_8U:  return Frame.DEPTH_UBYTE;
               case IPL_DEPTH_8S:  case CV_8S:  return Frame.DEPTH_BYTE;
               case IPL_DEPTH_16U: case CV_16U: return Frame.DEPTH_USHORT;
               case IPL_DEPTH_16S: case CV_16S: return Frame.DEPTH_SHORT;
               case IPL_DEPTH_32F: case CV_32F: return Frame.DEPTH_FLOAT;
               case IPL_DEPTH_32S: case CV_32S: return Frame.DEPTH_INT;
               case IPL_DEPTH_64F: case CV_64F: return Frame.DEPTH_DOUBLE;
               default: return -1;
           }
       }

       public static int getIplImageDepth(Frame frame) {
           switch (frame.imageDepth) {
               case Frame.DEPTH_UBYTE:  return IPL_DEPTH_8U;
               case Frame.DEPTH_BYTE:   return IPL_DEPTH_8S;
               case Frame.DEPTH_USHORT: return IPL_DEPTH_16U;
               case Frame.DEPTH_SHORT:  return IPL_DEPTH_16S;
               case Frame.DEPTH_FLOAT:  return IPL_DEPTH_32F;
               case Frame.DEPTH_INT:    return IPL_DEPTH_32S;
               case Frame.DEPTH_DOUBLE: return IPL_DEPTH_64F;
               default:  return -1;
           }
       }
       static boolean isEqual(Frame frame, IplImage img) {
            return img != null && frame != null && frame.image != null && frame.image.length > 0
                    && frame.imageWidth == img.width() && frame.imageHeight == img.height()
                    && frame.imageChannels == img.nChannels() && getIplImageDepth(frame) == img.depth()
                    && new Pointer(frame.image[0]).address() == img.imageData().address()
                    && frame.imageStride * Math.abs(frame.imageDepth) / 8 == img.widthStep();
       }
       public IplImage convertToIplImage(Frame frame) {
           if (frame == null) {
               return null;
           } else if (frame.opaque instanceof IplImage) {
               return (IplImage)frame.opaque;
           } else if (!isEqual(frame, img)) {
               int depth = getIplImageDepth(frame);
            img = depth < 0 ? null : IplImage.createHeader(frame.imageWidth, frame.imageHeight, depth, frame.imageChannels)
                       .imageData(new BytePointer(new Pointer(frame.image[0].position(0)))).widthStep(frame.imageStride * Math.abs(frame.imageDepth) / 8);
           }
           return img;
       }
       public Frame convert(IplImage img) {
           if (img == null) {
               return null;
           } else if (!isEqual(frame, img)) {
               frame = new Frame();
               frame.imageWidth = img.width();
               frame.imageHeight = img.height();
               frame.imageDepth = getFrameDepth(img.depth());
               frame.imageChannels = img.nChannels();
               frame.imageStride = img.widthStep() * 8 / Math.abs(frame.imageDepth);
               frame.image = new Buffer[] { img.createBuffer() };
               frame.opaque = img;
           }
           return frame;
       }

       public static int getMatDepth(Frame frame) {
           switch (frame.imageDepth) {
               case Frame.DEPTH_UBYTE:  return CV_8U;
               case Frame.DEPTH_BYTE:   return CV_8S;
               case Frame.DEPTH_USHORT: return CV_16U;
               case Frame.DEPTH_SHORT:  return CV_16S;
               case Frame.DEPTH_FLOAT:  return CV_32F;
               case Frame.DEPTH_INT:    return CV_32S;
               case Frame.DEPTH_DOUBLE: return CV_64F;
               default:  return -1;
           }
       }
       static boolean isEqual(Frame frame, Mat mat) {
            return mat != null && frame != null && frame.image != null && frame.image.length > 0
                    && frame.imageWidth == mat.cols() && frame.imageHeight == mat.rows()
                    && frame.imageChannels == mat.channels() && getMatDepth(frame) == mat.depth()
                    && new Pointer(frame.image[0]).address() == mat.data().address()
                    && frame.imageStride * Math.abs(frame.imageDepth) / 8 == (int)mat.step();
       }
       public Mat convertToMat(Frame frame) {
           if (frame == null) {
               return null;
           } else if (frame.opaque instanceof Mat) {
               return (Mat)frame.opaque;
           } else if (!isEqual(frame, mat)) {
               int depth = getMatDepth(frame);
            mat = depth < 0 ? null : new Mat(frame.imageHeight, frame.imageWidth, CV_MAKETYPE(depth, frame.imageChannels),
                       new Pointer(frame.image[0].position(0)), frame.imageStride * Math.abs(frame.imageDepth) / 8);
           }
           return mat;
       }
       public Frame convert(Mat mat) {
           if (mat == null) {
               return null;
           } else if (!isEqual(frame, mat)) {
               frame = new Frame();
               frame.imageWidth = mat.cols();
               frame.imageHeight = mat.rows();
               frame.imageDepth = getFrameDepth(mat.depth());
               frame.imageChannels = mat.channels();
               frame.imageStride = (int)mat.step() * 8 / Math.abs(frame.imageDepth);
               frame.image = new Buffer[] { mat.createBuffer() };
               frame.opaque = mat;
           }
           return frame;
       }
    }