
Tag: musée

Other articles (104)

  • The farm's regular Cron tasks

    1 December 2010

    Managing the farm relies on running several repetitive tasks, known as Cron tasks, at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of every instance in the mutualised farm on a regular basis. Combined with a system Cron on the central site of the farm, it generates regular visits to the various sites and prevents the tasks of rarely visited sites from being too (...)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • Customising by adding your own logo, banner or background image

    5 September 2013

    Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.

On other sites (12779)

  • How to programmatically read an audio RTP stream using javacv and ffmpeg?

    21 May 2019, by Chris

    I am trying to use javaCV to read an audio RTP stream produced by ffmpeg on the command line. I create a DatagramSocket that listens on a specified port, but I can't get the audio frames.

    I have tried different buffer types to play the audio through my speakers, but I get a lot of "Invalid return value 0 for stream protocol" error messages and no audio.

    I am running the following command to stream an audio file:

    ffmpeg -re -i /some/file.wav -ar 44100 -f mulaw -f rtp rtp://127.0.0.1:7780

    And here is an excerpt of my code so far (a hedged alternative sketch follows after the logs at the end of this question):

    import java.io.*;
    import java.net.*;
    import java.nio.ByteBuffer;
    import java.nio.ShortBuffer;
    import java.util.concurrent.*;
    import javax.sound.sampled.*;
    import org.bytedeco.javacv.FFmpegFrameGrabber;
    import org.bytedeco.javacv.Frame;

    public class FrameGrabber implements Runnable {

    private static final TimeUnit SECONDS = TimeUnit.SECONDS;
    private InetAddress ipAddress;
    private DatagramSocket serverSocket;

    public FrameGrabber(Integer port) throws UnknownHostException, SocketException {
       super();

       // Bind a UDP socket to the given port on a fixed local address
       this.ipAddress = InetAddress.getByName("192.168.44.18");
       serverSocket = new DatagramSocket(port, ipAddress);

    }

    public AudioFormat getAudioFormat() {
       float sampleRate = 44100.0F;
       // 8000,11025,16000,22050,44100
       int sampleSizeInBits = 16;
       // 8,16
       int channels = 1;
       // 1,2
       boolean signed = true;
       // true,false
       boolean bigEndian = false;
       // true,false
       return new AudioFormat(sampleRate, sampleSizeInBits, channels, signed, bigEndian);
    }

    @Override
    public void run() {


       byte[] buffer = new byte[2048];
       DatagramPacket packet = new DatagramPacket(buffer, buffer.length);

       // Wrap the packet's backing byte array in a stream for the grabber to read from
       DataInputStream dis = new DataInputStream(new ByteArrayInputStream(packet.getData(), packet.getOffset(), packet.getLength()));


       // Configure the grabber to treat the incoming bytes as raw mu-law audio
       FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(dis);
       grabber.setFormat("mulaw");
       grabber.setSampleRate((int) getAudioFormat().getSampleRate());
       grabber.setAudioChannels(getAudioFormat().getChannels());

       SourceDataLine soundLine = null;


       try {
           grabber.start();


           if (grabber.getSampleRate() > 0 && grabber.getAudioChannels() > 0) {
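                // Open a speaker line that matches the format reported by the grabber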

               AudioFormat audioFormat = new AudioFormat(grabber.getSampleRate(), 16, grabber.getAudioChannels(), true, true);

               DataLine.Info info = new DataLine.Info(SourceDataLine.class, audioFormat);
               soundLine = (SourceDataLine) AudioSystem.getLine(info);
               soundLine.open(audioFormat);

               soundLine.start();
           }

           ExecutorService executor = Executors.newSingleThreadExecutor();


           while (true) {
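                // Receive the next UDP packet, then ask the grabber for a decoded frame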

               try {
                   serverSocket.receive(packet);
               } catch (IOException e) {
                   e.printStackTrace();
               }

               Frame frame = grabber.grab();

               //if (frame == null) break;


               if (frame != null && frame.samples != null) {
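                    // Convert the frame's 16-bit samples to bytes and hand them to the playback thread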

                   ShortBuffer channelSamplesFloatBuffer = (ShortBuffer) frame.samples[0];
                   channelSamplesFloatBuffer.rewind();

                    ByteBuffer outBuffer = ByteBuffer.allocate(channelSamplesFloatBuffer.capacity() * 2);

                    for (int i = 0; i < channelSamplesFloatBuffer.capacity(); i++) {
                        short val = channelSamplesFloatBuffer.get(i);
                        outBuffer.putShort(val);
                    }

                   if (soundLine == null) return;
                   try {
                       SourceDataLine finalSoundLine = soundLine;
                       executor.submit(() -> {
                           finalSoundLine.write(outBuffer.array(), 0, outBuffer.capacity());
                           outBuffer.clear();
                       }).get();
                   } catch (InterruptedException interruptedException) {
                       Thread.currentThread().interrupt();
                   }
               }

           }

           /*
           executor.shutdownNow();
           executor.awaitTermination(1, SECONDS);

           if (soundLine != null) {
               soundLine.stop();
           }

           grabber.stop();
           grabber.release();*/

       } catch (ExecutionException ex) {
           System.out.println("ExecutionException");
           ex.printStackTrace();
       } catch (org.bytedeco.javacv.FrameGrabber.Exception ex) {
           System.out.println("FrameGrabberException");
           ex.printStackTrace();
       } catch (LineUnavailableException ex) {
           System.out.println("LineUnavailableException");
           ex.printStackTrace();
       }/* catch (InterruptedException e) {
           System.out.println("InterruptedException");
           e.printStackTrace();
       }*/


    }

    public static void main(String[] args) throws SocketException, UnknownHostException {
       Runnable apRunnable = new FrameGrabber(7780);
       Thread ap = new Thread(apRunnable);
       ap.start();
       }
    }

    At this stage, I am just trying to play the audio file through my speakers, but I am getting the following logs:

    Task :FrameGrabber.main()
    Invalid return value 0 for stream protocol
    Invalid return value 0 for stream protocol
    Input #0, mulaw, from 'java.io.DataInputStream@474e6cea':
    Duration: N/A, bitrate: 352 kb/s
    Stream #0:0: Audio: pcm_mulaw, 44100 Hz, 1 channels, s16, 352 kb/s
    Invalid return value 0 for stream protocol
    Invalid return value 0 for stream protocol
    Invalid return value 0 for stream protocol
    Invalid return value 0 for stream protocol
    ...

    What am I doing wrong?

    Thanks in advance!
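
    A minimal sketch of an alternative wiring, not taken from the question: if the sender is switched to plain UDP with no RTP framing (for example ffmpeg -re -i /some/file.wav -ar 44100 -f mulaw udp://127.0.0.1:7780), FFmpegFrameGrabber can open the socket itself instead of being fed a DatagramPacket-backed stream. The class name and port below are illustrative only.

    // Sketch only: assumes the sender pushes raw mu-law over plain UDP (no RTP headers),
    // e.g. ffmpeg -re -i /some/file.wav -ar 44100 -f mulaw udp://127.0.0.1:7780
    import org.bytedeco.javacv.FFmpegFrameGrabber;
    import org.bytedeco.javacv.Frame;
    import java.nio.ShortBuffer;

    public class UdpMulawGrabber {
        public static void main(String[] args) throws Exception {
            // Let the grabber bind and read the UDP socket itself
            FFmpegFrameGrabber grabber = new FFmpegFrameGrabber("udp://127.0.0.1:7780");
            grabber.setFormat("mulaw");      // raw mu-law bytes, no container
            grabber.setSampleRate(44100);    // raw formats need the rate and channel count up front
            grabber.setAudioChannels(1);
            grabber.start();

            Frame frame;
            while ((frame = grabber.grab()) != null) {
                if (frame.samples != null) {
                    // frame.samples[0] holds the decoded 16-bit PCM samples
                    ShortBuffer samples = (ShortBuffer) frame.samples[0];
                    System.out.println("Got " + samples.remaining() + " samples");
                }
            }
            grabber.stop();
            grabber.release();
        }
    }

    The assumption doing the work here is that the payload is a continuous raw mu-law byte stream, so the mulaw demuxer only needs the sample rate and channel count; the playback side could stay as in the code above.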

  • I need to build a camera server which connects to several cameras and serves video streams to several clients. Preferably using C# [closed]

    10 March 2020, by George Nardes

    I need to build an application that distributes video from IP cameras.
    The cameras stream via the RTSP protocol, and I need to distribute the received frames to several clients.
    It is a requirement that the system uses C#.

    I would like help with frameworks or libraries that I can use to solve my problem. Thank you.

  • Merge commit 'ec4c48397641dbaf4ae8df36c32aaa5a311a11bf'

    4 March 2016, by Derek Buitenhuis
    Merge commit 'ec4c48397641dbaf4ae8df36c32aaa5a311a11bf'
    

    This is a no-op. API is already implemented by us.

    * commit 'ec4c48397641dbaf4ae8df36c32aaa5a311a11bf':
    lavf: add a protocol whitelist/blacklist for file opened internally

    Merged-by: Derek Buitenhuis <derek.buitenhuis@gmail.com>
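
    For context, the whitelist merged here is the libavformat option that typically has to be widened when ffmpeg opens nested resources, such as the udp/rtp sockets referenced from an SDP description of an RTP session. A hedged sketch of passing it from javacv, where stream.sdp is a placeholder file name and not something taken from the commit:

    import org.bytedeco.javacv.FFmpegFrameGrabber;

    public class SdpWhitelistExample {
        public static void main(String[] args) throws Exception {
            // stream.sdp is a hypothetical SDP file describing an incoming RTP session
            FFmpegFrameGrabber grabber = new FFmpegFrameGrabber("stream.sdp");
            // Allow libavformat to open the network protocols referenced by the SDP;
            // by default a local file may not open udp/rtp internally.
            grabber.setOption("protocol_whitelist", "file,udp,rtp");
            grabber.start();
            System.out.println("Audio: " + grabber.getSampleRate() + " Hz, "
                    + grabber.getAudioChannels() + " channel(s)");
            grabber.stop();
            grabber.release();
        }
    }

    On the ffmpeg command line the equivalent is passing -protocol_whitelist file,udp,rtp before the input.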