
Media (1)

Keyword: - Tags -/censure

Other articles (106)

  • MediaSPIP 0.1 Beta version

    25 April 2011, by

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • Improvements to the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the usability of multiple-select fields. See the two images below for a comparison.
    To set it up, simply enable the Chosen plugin (Configuration générale du site > Gestion des plugins), then configure it (Les squelettes > Chosen) by enabling Chosen on the public site and specifying which form elements to enhance, for example select[multiple] for multiple-select lists (...)

  • Updating from version 0.1 to 0.2

    24 June 2013, by

    Explanation of the notable changes made when moving from MediaSPIP version 0.1 to version 0.3. What is new
    Regarding software dependencies: use of the latest versions of FFMpeg (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe to retrieve metadata; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)

On other sites (15916)

  • libav ffmpeg multiple streams performance issue

    11 April 2015, by hillsm

    I have a program capable of generating 5 MPEG transport streams simultaneously. Each stream has its own context and runs in an independent thread. When only one stream is active everything works great; as soon as I activate another stream, the frame rate drops tremendously, although both streams still work correctly. I have tracked it down to the call to av_interleaved_write_frame: it goes from approximately 4 ms with a single stream to over 50 ms with two. I am running on a 6-core Haswell-E with hyperthreading and the machine is nowhere near overloaded.

  • Manage multiple IP cameras at the same time

    4 June 2015, by Alessio

    How can I manage multiple GoPro cameras at the same time? I want to stream three videos from three GoPro cameras at the same time and record the videos to the hard disk.

    I have written a tool in Java for one GoPro and it works correctly.

    Help me, please!

    This is the code:

    import java.awt.BorderLayout;
    import java.awt.event.ActionEvent;
    import java.awt.event.ActionListener;
    import java.io.IOException;
    import java.io.InputStream;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.HttpURLConnection;
    import java.net.InetSocketAddress;
    import java.net.SocketException;
    import java.net.URL;
    import java.util.Locale;

    import javax.swing.JButton;
    import javax.swing.JFrame;
    import javax.swing.JPanel;
    import javax.swing.border.EmptyBorder;

    public class GoProStreamer extends JFrame {

    private static final String CAMERA_IP = "10.5.5.9";
    private static int PORT = 8080;
    private static DatagramSocket mOutgoingUdpSocket;
    private Process streamingProcess;
    private Process writeVideoProcess;
    private KeepAliveThread mKeepAliveThread;

    private JPanel contentPane;

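    // Builds the Swing control panel and wires the start/stop streaming and recording buttons.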
    public GoProStreamer() {
       setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
       setBounds(800, 10, 525, 300);

       contentPane = new JPanel();
       contentPane.setBorder(new EmptyBorder(5, 5, 5, 5));
       contentPane.setLayout(new BorderLayout(0, 0));
       setContentPane(contentPane);

       JButton btnStop = new JButton("Stop stream");
       JButton btnStart = new JButton("Start stream");
       JButton btnRec = new JButton("Rec");
       JButton btnStopRec = new JButton("Stop Rec");
       //      JButton btnZoomIn = new JButton("Zoom In sincrono");
       //      JButton btnZoomOut = new JButton("Zoom out sincrono");
       //      JButton btnZoomIn1 = new JButton("Zoom In Camera 1");
       //      JButton btnZoomOut1 = new JButton("Zoom out Camera 1");
       //      JButton btnZoomIn2 = new JButton("Zoom in Camera 2");
       //      JButton btnZoomOut2 = new JButton("Zoom out Camera 2");
       //      JButton btnZoomIn3 = new JButton("Zoom in camera 3");
       //      JButton btnZoomOut3 = new JButton("Zoom out Camera 3");

       btnStop.setEnabled(false);
       btnRec.setEnabled(false);
       btnStopRec.setEnabled(false);
       //      btnZoomIn.setEnabled(false);
       //      btnZoomOut.setEnabled(false);
       //      btnZoomIn1.setEnabled(false);
       //      btnZoomOut1.setEnabled(false);
       //      btnZoomIn2.setEnabled(false);
       //      btnZoomOut2.setEnabled(false);
       //      btnZoomIn3.setEnabled(false);
       //      btnZoomOut3.setEnabled(false);

       JPanel panel = new JPanel();
       //      JPanel panel2 = new JPanel();
       //      JPanel panel3 = new JPanel();
       //      JPanel panel4 = new JPanel();

       panel.add(btnStart);
       panel.add(btnStop);
       panel.add(btnRec);
       panel.add(btnStopRec);
       //      panel2.add(btnZoomIn1);
       //      panel3.add(btnZoomOut1);
       //      panel2.add(btnZoomIn2);
       //      panel3.add(btnZoomOut2);
       //      panel2.add(btnZoomIn3);
       //      panel3.add(btnZoomOut3);
       //      panel4.add(btnZoomIn);
       //      panel4.add(btnZoomOut);

       contentPane.add(panel, BorderLayout.SOUTH);
       //  contentPane.add(panel2, BorderLayout.NORTH);
       //  contentPane.add(panel3, BorderLayout.CENTER);

       btnStart.addActionListener(new ActionListener() {
           public void actionPerformed(ActionEvent arg0) {
               startStreamService();
               keepAlive();
               startStreaming();

               btnStart.setEnabled(false);
               btnStop.setEnabled(true);
               btnRec.setEnabled(true);
               btnStopRec.setEnabled(false);
               //              btnZoomIn.setEnabled(true);
               //              btnZoomOut.setEnabled(true);
               //              btnZoomIn1.setEnabled(true);
               //              btnZoomOut1.setEnabled(true);
               //              btnZoomIn2.setEnabled(true);
               //              btnZoomOut2.setEnabled(true);
               //              btnZoomIn3.setEnabled(true);
               //              btnZoomOut3.setEnabled(true);
           }
       });

       btnStop.addActionListener(new ActionListener() {
           public void actionPerformed(ActionEvent arg0) {
               stopStreaming();
               stopKeepalive();

               btnStart.setEnabled(true);
               btnStop.setEnabled(false);
               btnRec.setEnabled(false);
               btnStopRec.setEnabled(false);
           }
       });

       btnRec.addActionListener(new ActionListener() {
           public void actionPerformed(ActionEvent arg0) {
               startRec();

               btnStart.setEnabled(false);
               btnStop.setEnabled(false);
               btnRec.setEnabled(false);
               btnStopRec.setEnabled(true);
           }
       });

       btnStopRec.addActionListener(new ActionListener() {
           public void actionPerformed(ActionEvent arg0) {
               stopRec();

               btnStart.setEnabled(false);
               btnStop.setEnabled(true);
               btnRec.setEnabled(true);
               btnStopRec.setEnabled(false);
           }
       });
    }

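    // Asks the camera's embedded web server to (re)start its live stream before ffplay connects.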
    private void startStreamService() {
       HttpURLConnection localConnection = null;
       try {
           String str = "http://" + CAMERA_IP + "/gp/gpExec?p1=gpStreamA9&c1=restart";
           localConnection = (HttpURLConnection) new URL(str).openConnection();
           localConnection.addRequestProperty("Cache-Control", "no-cache");
           localConnection.setConnectTimeout(5000);
           localConnection.setReadTimeout(5000);
           int i = localConnection.getResponseCode();
           if (i >= 400) {
               throw new IOException("sendGET HTTP error " + i);
           }
       }
       catch (Exception e) {

       }
       if (localConnection != null) {
           localConnection.disconnect();
       }
    }

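    // Builds and sends one "_GPHD_" keep-alive datagram to the camera; called periodically by KeepAliveThread.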
    @SuppressWarnings("static-access")
    private void sendUdpCommand(int paramInt) throws SocketException, IOException {
       Locale localLocale = Locale.US;
       Object[] arrayOfObject = new Object[4];
       arrayOfObject[0] = Integer.valueOf(0);
       arrayOfObject[1] = Integer.valueOf(0);
       arrayOfObject[2] = Integer.valueOf(paramInt);
       arrayOfObject[3] = Double.valueOf(0.0D);
       byte[] arrayOfByte = String.format(localLocale, "_GPHD_:%d:%d:%d:%1f\n", arrayOfObject).getBytes();
       String str = CAMERA_IP;
       int i = PORT;
       DatagramPacket localDatagramPacket = new DatagramPacket(arrayOfByte, arrayOfByte.length, new InetSocketAddress(str, i));
       this.mOutgoingUdpSocket.send(localDatagramPacket);
    }

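    // Launches ffplay against the camera's HLS playlist and echoes its stderr output.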
    private void startStreaming() {
       Thread threadStream = new Thread() {
           @Override
           public void run() {
               try {
                   streamingProcess = Runtime.getRuntime().exec("ffmpeg-20150318-git-0f16dfd-win64-static\\bin\\ffplay -i http://10.5.5.9:8080/live/amba.m3u8");
                   InputStream errorStream = streamingProcess.getErrorStream();
                   byte[] data = new byte[1024];
                   int length = 0;
                   while ((length = errorStream.read(data, 0, data.length)) > 0) {
                       System.out.println(new String(data, 0, length));
                       System.out.println(System.currentTimeMillis());
                   }

               } catch (IOException e) {

               }
           }
       };
       threadStream.start();
    }

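    // Records the HLS stream to a file with stream copy (-c copy) and no audio (-an).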
    private void startRec() {
       Thread threadRec = new Thread() {
           @Override
           public void run() {
               try {
                   writeVideoProcess = Runtime.getRuntime().exec("ffmpeg-20150318-git-0f16dfd-win64-static\\bin\\ffmpeg -re -i http://10.5.5.9:8080/live/amba.m3u8 -c copy -an Video_GoPro_" + Math.random() + ".avi");
                   InputStream errorRec = writeVideoProcess.getErrorStream();
                   byte[] dataRec = new byte[1024];
                   int lengthRec = 0;
                   while ((lengthRec = errorRec.read(dataRec, 0, dataRec.length)) > 0) {
                       System.out.println(new String(dataRec, 0, lengthRec));
                       System.out.println(System.currentTimeMillis());
                   }
               } catch (IOException e) {

               }
           }
       };
       threadRec.start();
    }

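    // Starts the background thread that keeps the camera streaming.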
    private void keepAlive() {
       mKeepAliveThread = new KeepAliveThread();
       mKeepAliveThread.start();
    }

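    // Sends a keep-alive datagram every 2.5 seconds until interrupted.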
    class KeepAliveThread extends Thread {
       public void run() {
           try {
               Thread.currentThread().setName("gopro");
               if (mOutgoingUdpSocket == null) {
                   mOutgoingUdpSocket = new DatagramSocket();
               }
               while ((!Thread.currentThread().isInterrupted()) && (mOutgoingUdpSocket != null)) {
                   sendUdpCommand(2);
                   Thread.sleep(2500L);
               }
           }
           catch (SocketException e) {

           }
           catch (InterruptedException e) {

           }
           catch (Exception e) {

           }
       }
    }

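    // Tears down the ffplay process, the keep-alive thread and the UDP socket.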
    private void stopStreaming() {
       if (streamingProcess != null) {
           streamingProcess.destroy();
           streamingProcess = null;
       }
       stopKeepalive();
       mOutgoingUdpSocket.disconnect();
       mOutgoingUdpSocket.close();
    }

    private void stopRec() {
       writeVideoProcess.destroy();
       writeVideoProcess = null;
    }

    private void stopKeepalive() {
       if (mKeepAliveThread != null) {
           mKeepAliveThread.interrupt();
           try {
               mKeepAliveThread.join(10L);
               mKeepAliveThread = null;
           }
           catch (InterruptedException e) {
               Thread.currentThread().interrupt();
           }
       }
    }

    public static void main(String[] args) {
       GoProStreamer streamer = new GoProStreamer();
       streamer.setVisible(true);
       streamer.setTitle("Pannello di controllo");
    }

    }
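
    One possible way to generalize the recording side to several cameras is to run one process per camera, as in the minimal sketch below. Everything in it is illustrative: the MultiCamRecorder class, the placeholder addresses and the output file naming are assumptions, each camera must actually be reachable at its own address (with GoPros this usually means one Wi-Fi interface per camera), and the stream-restart and keep-alive requests shown above would still have to be issued once per camera.

    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    public class MultiCamRecorder {

       private final List<Process> recorders = new ArrayList<>();

       // Starts one ffmpeg stream-copy recording per camera address.
       // Assumes an ffmpeg binary on the PATH and the same HLS playlist path as above.
       public void startAll(List<String> cameraIps) throws IOException {
           for (String ip : cameraIps) {
               String input = "http://" + ip + ":8080/live/amba.m3u8";
               String output = "Video_GoPro_" + ip.replace('.', '_') + "_" + System.currentTimeMillis() + ".avi";
               ProcessBuilder pb = new ProcessBuilder(
                       "ffmpeg", "-re", "-i", input, "-c", "copy", "-an", output);
               pb.redirectErrorStream(true);                       // merge stderr into stdout
               pb.redirectOutput(ProcessBuilder.Redirect.INHERIT); // echo ffmpeg output to the console
               recorders.add(pb.start());
           }
       }

       // Stops every recording process.
       public void stopAll() {
           for (Process p : recorders) {
               p.destroy();
           }
           recorders.clear();
       }

       public static void main(String[] args) throws IOException {
           MultiCamRecorder rec = new MultiCamRecorder();
           // Placeholder addresses: each GoPro normally sits on its own Wi-Fi network.
           rec.startAll(Arrays.asList("10.5.5.9", "10.5.5.10", "10.5.5.11"));
           Runtime.getRuntime().addShutdownHook(new Thread(rec::stopAll));
       }
    }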

  • Issue after video rotation, how to fix it

    2 April 2015, by Vahagn

    I have the following code to rotate a video:

    OpenCVFrameConverter.ToIplImage converter2 = new OpenCVFrameConverter.ToIplImage() ;

    for (int i = firstIndex; i <= lastIndex; i++) {
       long t = timestamps[i % timestamps.length] - startTime;
       if (t >= 0) {
           if (t > recorder.getTimestamp()) {
               recorder.setTimestamp(t);
           }
            Frame g = converter2.convert(rotate(converter2.convertToIplImage(images[i % images.length]), 90));
            recorder.record(g);
       }
    }

     images[i] is a Frame in JavaCV.
     Afterwards, the video has green lines.

    UPDATE
     The conversion function:

    /*
    * Copyright (C) 2015 Samuel Audet
    *
    * This file is part of JavaCV.
    *
    * JavaCV is free software: you can redistribute it and/or modify
    * it under the terms of the GNU General Public License as published by
    * the Free Software Foundation, either version 2 of the License, or
    * (at your option) any later version (subject to the "Classpath" exception
    * as provided in the LICENSE.txt file that accompanied this code).
    *
    * JavaCV is distributed in the hope that it will be useful,
    * but WITHOUT ANY WARRANTY; without even the implied warranty of
    * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    * GNU General Public License for more details.
    *
    * You should have received a copy of the GNU General Public License
     * along with JavaCV.  If not, see <http://www.gnu.org/licenses/>.
    */

    package com.example.vvardanyan.ffmpeg;

    import org.bytedeco.javacpp.BytePointer;
    import org.bytedeco.javacpp.Pointer;

    import java.nio.Buffer;

    import static org.bytedeco.javacpp.opencv_core.CV_16S;
    import static org.bytedeco.javacpp.opencv_core.CV_16U;
    import static org.bytedeco.javacpp.opencv_core.CV_32F;
    import static org.bytedeco.javacpp.opencv_core.CV_32S;
    import static org.bytedeco.javacpp.opencv_core.CV_64F;
    import static org.bytedeco.javacpp.opencv_core.CV_8S;
    import static org.bytedeco.javacpp.opencv_core.CV_8U;
    import static org.bytedeco.javacpp.opencv_core.CV_MAKETYPE;
    import static org.bytedeco.javacpp.opencv_core.IPL_DEPTH_16S;
    import static org.bytedeco.javacpp.opencv_core.IPL_DEPTH_16U;
    import static org.bytedeco.javacpp.opencv_core.IPL_DEPTH_32F;
    import static org.bytedeco.javacpp.opencv_core.IPL_DEPTH_32S;
    import static org.bytedeco.javacpp.opencv_core.IPL_DEPTH_64F;
    import static org.bytedeco.javacpp.opencv_core.IPL_DEPTH_8S;
    import static org.bytedeco.javacpp.opencv_core.IPL_DEPTH_8U;
    import static org.bytedeco.javacpp.opencv_core.IplImage;
    import static org.bytedeco.javacpp.opencv_core.Mat;

    /**
    * A utility class to map data between {@link Frame} and {@link IplImage} or {@link Mat}.
    * Since this is an abstract class, one must choose between two concrete classes:
    * {@link ToIplImage} or {@link ToMat}.
    *
    * @author Samuel Audet
    */
     public abstract class OpenCVFrameConverter<F> extends FrameConverter<F> {
       IplImage img;
       Mat mat;

        public static class ToIplImage extends OpenCVFrameConverter<IplImage> {
           @Override public IplImage convert(Frame frame) { return convertToIplImage(frame); }
       }

        public static class ToMat extends OpenCVFrameConverter<Mat> {
           @Override public Mat convert(Frame frame) { return convertToMat(frame); }
       }

       public static int getFrameDepth(int depth) {
           switch (depth) {
               case IPL_DEPTH_8U:  case CV_8U:  return Frame.DEPTH_UBYTE;
               case IPL_DEPTH_8S:  case CV_8S:  return Frame.DEPTH_BYTE;
               case IPL_DEPTH_16U: case CV_16U: return Frame.DEPTH_USHORT;
               case IPL_DEPTH_16S: case CV_16S: return Frame.DEPTH_SHORT;
               case IPL_DEPTH_32F: case CV_32F: return Frame.DEPTH_FLOAT;
               case IPL_DEPTH_32S: case CV_32S: return Frame.DEPTH_INT;
               case IPL_DEPTH_64F: case CV_64F: return Frame.DEPTH_DOUBLE;
               default: return -1;
           }
       }

       public static int getIplImageDepth(Frame frame) {
           switch (frame.imageDepth) {
               case Frame.DEPTH_UBYTE:  return IPL_DEPTH_8U;
               case Frame.DEPTH_BYTE:   return IPL_DEPTH_8S;
               case Frame.DEPTH_USHORT: return IPL_DEPTH_16U;
               case Frame.DEPTH_SHORT:  return IPL_DEPTH_16S;
               case Frame.DEPTH_FLOAT:  return IPL_DEPTH_32F;
               case Frame.DEPTH_INT:    return IPL_DEPTH_32S;
               case Frame.DEPTH_DOUBLE: return IPL_DEPTH_64F;
               default:  return -1;
           }
       }
       static boolean isEqual(Frame frame, IplImage img) {
            return img != null && frame != null && frame.image != null && frame.image.length > 0
                    && frame.imageWidth == img.width() && frame.imageHeight == img.height()
                    && frame.imageChannels == img.nChannels() && getIplImageDepth(frame) == img.depth()
                    && new Pointer(frame.image[0]).address() == img.imageData().address()
                    && frame.imageStride * Math.abs(frame.imageDepth) / 8 == img.widthStep();
       }
       public IplImage convertToIplImage(Frame frame) {
           if (frame == null) {
               return null;
           } else if (frame.opaque instanceof IplImage) {
               return (IplImage)frame.opaque;
           } else if (!isEqual(frame, img)) {
               int depth = getIplImageDepth(frame);
                img = depth < 0 ? null : IplImage.createHeader(frame.imageWidth, frame.imageHeight, depth, frame.imageChannels)
                        .imageData(new BytePointer(new Pointer(frame.image[0].position(0)))).widthStep(frame.imageStride * Math.abs(frame.imageDepth) / 8);
           }
           return img;
       }
       public Frame convert(IplImage img) {
           if (img == null) {
               return null;
           } else if (!isEqual(frame, img)) {
               frame = new Frame();
               frame.imageWidth = img.width();
               frame.imageHeight = img.height();
               frame.imageDepth = getFrameDepth(img.depth());
               frame.imageChannels = img.nChannels();
               frame.imageStride = img.widthStep() * 8 / Math.abs(frame.imageDepth);
               frame.image = new Buffer[] { img.createBuffer() };
               frame.opaque = img;
           }
           return frame;
       }

       public static int getMatDepth(Frame frame) {
           switch (frame.imageDepth) {
               case Frame.DEPTH_UBYTE:  return CV_8U;
               case Frame.DEPTH_BYTE:   return CV_8S;
               case Frame.DEPTH_USHORT: return CV_16U;
               case Frame.DEPTH_SHORT:  return CV_16S;
               case Frame.DEPTH_FLOAT:  return CV_32F;
               case Frame.DEPTH_INT:    return CV_32S;
               case Frame.DEPTH_DOUBLE: return CV_64F;
               default:  return -1;
           }
       }
       static boolean isEqual(Frame frame, Mat mat) {
            return mat != null && frame != null && frame.image != null && frame.image.length > 0
                    && frame.imageWidth == mat.cols() && frame.imageHeight == mat.rows()
                    && frame.imageChannels == mat.channels() && getMatDepth(frame) == mat.depth()
                    && new Pointer(frame.image[0]).address() == mat.data().address()
                    && frame.imageStride * Math.abs(frame.imageDepth) / 8 == (int)mat.step();
       }
       public Mat convertToMat(Frame frame) {
           if (frame == null) {
               return null;
           } else if (frame.opaque instanceof Mat) {
               return (Mat)frame.opaque;
           } else if (!isEqual(frame, mat)) {
               int depth = getMatDepth(frame);
                mat = depth < 0 ? null : new Mat(frame.imageHeight, frame.imageWidth, CV_MAKETYPE(depth, frame.imageChannels),
                        new Pointer(frame.image[0].position(0)), frame.imageStride * Math.abs(frame.imageDepth) / 8);
           }
           return mat;
       }
       public Frame convert(Mat mat) {
           if (mat == null) {
               return null;
           } else if (!isEqual(frame, mat)) {
               frame = new Frame();
               frame.imageWidth = mat.cols();
               frame.imageHeight = mat.rows();
               frame.imageDepth = getFrameDepth(mat.depth());
               frame.imageChannels = mat.channels();
               frame.imageStride = (int)mat.step() * 8 / Math.abs(frame.imageDepth);
               frame.image = new Buffer[] { mat.createBuffer() };
               frame.opaque = mat;
           }
           return frame;
       }
    }
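
     The rotate helper used in the loop above is not shown; below is a minimal sketch of one way to implement a 90° rotation with the OpenCV C API that JavaCV wraps (cvTranspose followed by cvFlip). The RotateUtil class name is hypothetical. Green lines after rotation are frequently a sign of a size or stride mismatch, so the recorder would also have to be created with the swapped (rotated) width and height.

     import static org.bytedeco.javacpp.opencv_core.IplImage;
     import static org.bytedeco.javacpp.opencv_core.cvFlip;
     import static org.bytedeco.javacpp.opencv_core.cvTranspose;

     public class RotateUtil {

        // Rotates an IplImage 90 degrees clockwise into a newly allocated image.
        public static IplImage rotate90Clockwise(IplImage src) {
            // The destination has swapped width and height, same depth and channel count.
            IplImage dst = IplImage.create(src.height(), src.width(), src.depth(), src.nChannels());
            cvTranspose(src, dst); // swap rows and columns
            cvFlip(dst, dst, 1);   // flip around the vertical axis to complete the clockwise turn
            return dst;
        }
     }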