Other articles (94)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to carry out other manual (...)

  • Multilang: improving the interface for multilingual blocks

    18 February 2011

    Multilang is an additional plugin that is not enabled by default when MediaSPIP is initialized.
    Once it is activated, a preconfiguration is automatically applied by MediaSPIP init so that the new feature is immediately operational. No configuration step is therefore required.

  • Improvements to the base version

    13 September 2013

    Nicer multiple selection
    The Chosen plugin improves the ergonomics of multiple-selection fields; see the two images below for a comparison.
    To use it, enable the Chosen plugin (Site general configuration > Plugin management), then configure it (Templates > Chosen) by enabling Chosen on the public site and specifying the form elements to enhance, for example select[multiple] for multiple-selection lists (...)

On other sites (6330)

  • Most efficient way to render bitmap to screen on Linux [on hold]

    22 July 2019, by Maximus

    My goal is to receive a video feed over wifi and display it on my screen. For this, I've created a couple of small programs and a bash script to automate running them. It works like this:

    UDPBitmap/Plotter & ffplay -i - < UDPBitmap/pipe & python requester.py;

    Translation: there is a C++ program called Plotter whose job is to receive packets on an assigned UDP port, process them, and write the result to a named pipe (UDPBitmap/pipe). The pipe is read by ffplay, which renders the video on screen. The Python file is solely responsible for accessing and controlling the camera with various HTTP requests.

    The above command works as expected, but the resulting latency and framerate are a bit worse than I'd like. The bottleneck is not the pipe (it is fast enough), and the wifi transmission is also fast enough; the remaining suspect is ffplay.
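
    As an aside, ffplay's own input buffering can often be trimmed with a few flags; a sketch, assuming the pipe carries raw RGB24 frames at a known size (both are assumptions here):

    ffplay -fflags nobuffer -probesize 32 -framedrop -f rawvideo -pixel_format rgb24 -video_size 640x480 -i UDPBitmap/pipe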

    Question:

    What is the most efficient way to render a bitmap to the screen on Linux? Is there a de facto library for this that I can use? (A baseline sketch follows the notes below.)

    Note:

    • Language/framework/library does not matter (C, C++, Java, Python, native Linux tools and so on...)
    • I do not need a window handle, but is SDL+OpenGL the way to go?
    • Writing directly to the framebuffer would be super cool...
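
    For what it's worth, plain Java can already blit a streaming bitmap at a decent rate using an AWT Canvas with a BufferStrategy. This is a minimal, hypothetical baseline (the frame size and the pixel source are assumptions), not a claim about the most efficient option:

    import java.awt.*;
    import java.awt.image.BufferStrategy;
    import java.awt.image.BufferedImage;
    import java.awt.image.DataBufferInt;

    public class BitmapViewer {
        public static void main(String[] args) {
            int w = 640, h = 480;                          // assumed frame size
            Frame frame = new Frame("bitmap");
            Canvas canvas = new Canvas();
            canvas.setPreferredSize(new Dimension(w, h));
            frame.add(canvas);
            frame.pack();
            frame.setVisible(true);

            canvas.createBufferStrategy(2);                // double buffering
            BufferStrategy strategy = canvas.getBufferStrategy();
            BufferedImage img = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
            int[] pixels = ((DataBufferInt) img.getRaster().getDataBuffer()).getData();

            while (true) {
                // fill pixels[] with the next decoded frame here (0xRRGGBB ints)
                Graphics g = strategy.getDrawGraphics();
                g.drawImage(img, 0, 0, null);
                g.dispose();
                strategy.show();                           // flip buffers
            }
        }
    }

    If that is not fast enough, SDL with a streaming texture is the commonly suggested next step.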
  • How to programmatically read an audio RTP stream using javacv and ffmpeg?

    21 May 2019, by Chris

    I am trying to read an audio RTP stream, produced by ffmpeg on the command line, using JavaCV. I create a DatagramSocket that listens on a specified port but can't get the audio frames.

    I have tried different types of buffer to play the audio through my speakers, but I am getting a lot of "Invalid return value 0 for stream protocol" error messages and no audio.

    I am running the following command to stream an audio file:

    ffmpeg -re -i /some/file.wav -ar 44100 -f mulaw -f rtp rtp://127.0.0.1:7780
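
    (For reference: when the output is an rtp:// URL, ffmpeg prints an SDP description of the session, and players generally need that SDP to receive the stream, e.g. ffplay -protocol_whitelist file,udp,rtp -i session.sdp, where session.sdp is a placeholder for the printed SDP saved to a file.)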

    And here is an excerpt of my code so far:

    import java.io.ByteArrayInputStream;
    import java.io.DataInputStream;
    import java.io.IOException;
    import java.net.*;
    import java.nio.ByteBuffer;
    import java.nio.ShortBuffer;
    import java.util.concurrent.*;
    import javax.sound.sampled.*;

    import org.bytedeco.javacv.FFmpegFrameGrabber;
    import org.bytedeco.javacv.Frame;

    public class FrameGrabber implements Runnable {

    private static final TimeUnit SECONDS = TimeUnit.SECONDS;
    private InetAddress ipAddress;
    private DatagramSocket serverSocket;

    public FrameGrabber(Integer port) throws UnknownHostException, SocketException {
       super();

       this.ipAddress = InetAddress.getByName("192.168.44.18");
       serverSocket = new DatagramSocket(port, ipAddress);

    }

    public AudioFormat getAudioFormat() {
       float sampleRate = 44100.0F;
       // 8000,11025,16000,22050,44100
       int sampleSizeInBits = 16;
       // 8,16
       int channels = 1;
       // 1,2
       boolean signed = true;
       // true,false
       boolean bigEndian = false;
       // true,false
       return new AudioFormat(sampleRate, sampleSizeInBits, channels, signed, bigEndian);
    }

    @Override
    public void run() {


       byte[] buffer = new byte[2048];
       DatagramPacket packet = new DatagramPacket(buffer, buffer.length);

       // note: this stream wraps the packet buffer as it is now (before any
       // receive), and it is never rebuilt after packets arrive
       DataInputStream dis = new DataInputStream(new ByteArrayInputStream(packet.getData(), packet.getOffset(), packet.getLength()));


       FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(dis);
       grabber.setFormat("mulaw");
       grabber.setSampleRate((int) getAudioFormat().getSampleRate());
       grabber.setAudioChannels(getAudioFormat().getChannels());

       SourceDataLine soundLine = null;


       try {
           grabber.start();


           if (grabber.getSampleRate() > 0 && grabber.getAudioChannels() > 0) {

               AudioFormat audioFormat = new AudioFormat(grabber.getSampleRate(), 16, grabber.getAudioChannels(), true, true);

               DataLine.Info info = new DataLine.Info(SourceDataLine.class, audioFormat);
               soundLine = (SourceDataLine) AudioSystem.getLine(info);
               soundLine.open(audioFormat);

               soundLine.start();
           }

           ExecutorService executor = Executors.newSingleThreadExecutor();


           while (true) {

               try {
                   serverSocket.receive(packet);
               } catch (IOException e) {
                   e.printStackTrace();
               }

               Frame frame = grabber.grab();

               //if (frame == null) break;


               if (frame != null && frame.samples != null) {

                   ShortBuffer channelSamplesFloatBuffer = (ShortBuffer) frame.samples[0];
                   channelSamplesFloatBuffer.rewind();

                   ByteBuffer outBuffer = ByteBuffer.allocate(channelSamplesFloatBuffer.capacity() * 2);
                   float[] samples = new float[channelSamplesFloatBuffer.capacity()];

                   for (int i = 0; i < channelSamplesFloatBuffer.capacity(); i++) {
                       short val = channelSamplesFloatBuffer.get(i);
                       outBuffer.putShort(val);
                   }

                   if (soundLine == null) return;
                   try {
                       SourceDataLine finalSoundLine = soundLine;
                       executor.submit(() -> {
                           finalSoundLine.write(outBuffer.array(), 0, outBuffer.capacity());
                           outBuffer.clear();
                       }).get();
                   } catch (InterruptedException interruptedException) {
                       Thread.currentThread().interrupt();
                   }
               }

           }

           /*
           executor.shutdownNow();
           executor.awaitTermination(1, SECONDS);

           if (soundLine != null) {
               soundLine.stop();
           }

           grabber.stop();
           grabber.release();*/

       } catch (ExecutionException ex) {
           System.out.println("ExecutionException");
           ex.printStackTrace();
       } catch (org.bytedeco.javacv.FrameGrabber.Exception ex) {
           System.out.println("FrameGrabberException");
           ex.printStackTrace();
       } catch (LineUnavailableException ex) {
           System.out.println("LineUnavailableException");
           ex.printStackTrace();
       } /* catch (InterruptedException e) {
           System.out.println("InterruptedException");
           e.printStackTrace();
       } */


    }

    public static void main(String[] args) throws SocketException, UnknownHostException {
       Runnable apRunnable = new FrameGrabber(7780);
       Thread ap = new Thread(apRunnable);
       ap.start();
       }
    }

    At this stage, I am trying to play the audio through my speakers, but I am getting the following logs:

    Task :FrameGrabber.main()
    Invalid return value 0 for stream protocol
    Invalid return value 0 for stream protocol
    Input #0, mulaw, from 'java.io.DataInputStream@474e6cea':
    Duration: N/A, bitrate: 352 kb/s
    Stream #0:0: Audio: pcm_mulaw, 44100 Hz, 1 channels, s16, 352 kb/s
    Invalid return value 0 for stream protocol
    Invalid return value 0 for stream protocol
    Invalid return value 0 for stream protocol
    Invalid return value 0 for stream protocol
    ...

    What am I doing wrong ?

    Thanks in advance!
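
    A plausible reading of the "Invalid return value 0" messages: the DataInputStream wraps a single, never-refilled packet buffer, so reads return no data. Below is a minimal sketch (untested, and RtpPayloadStream is a hypothetical helper) of one way to bridge the socket to the grabber instead, assuming each datagram is mulaw payload behind a fixed 12-byte RTP header with no CSRCs or extensions:

    import java.io.IOException;
    import java.io.InputStream;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;

    class RtpPayloadStream extends InputStream {
        private static final int RTP_HEADER = 12;  // assumes no CSRCs/extensions
        private final DatagramSocket socket;
        private final byte[] buf = new byte[2048];
        private int pos = 0, end = 0;

        RtpPayloadStream(DatagramSocket socket) { this.socket = socket; }

        @Override
        public int read() throws IOException {
            while (pos >= end) {                   // block until a datagram arrives,
                DatagramPacket p = new DatagramPacket(buf, buf.length);
                socket.receive(p);                 // so read() never hands back 0 bytes
                pos = RTP_HEADER;
                end = p.getLength();
            }
            return buf[pos++] & 0xFF;
        }
    }

    With that in place, the grabber would be built as new FFmpegFrameGrabber(new RtpPayloadStream(serverSocket)) and grab() called in a loop, without a separate serverSocket.receive() in the playback loop.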

  • Render Multiple Gifs with ffplay/ffmpeg in Winform

    17 June 2019, by Purqs

    I'm trying to get some number of animated gifs to render on a Panel or PictureBox, respecting the transparency in each gif. I've tried a couple of approaches but am not super familiar with ffmpeg and such. Below is some code I use to render a gif inside a panel, but I can't figure out how to get, say, 5 gifs to stack/layer on one another and still render as you would expect.

    I need/want this to render in the form, not in a separate output. I am a little confused as to why ffplay.exe doesn't take the -i flag here, and that might be why I can't get it to render. Any ideas?

    Working example below.

    using System;
    using System.Collections.Generic;
    using System.ComponentModel;
    using System.Data;
    using System.Drawing;
    using System.Linq;
    using System.Text;
    using System.Threading.Tasks;
    using System.Windows.Forms;
    using System.Diagnostics;
    using System.Threading;
    using System.IO;
    using System.Reflection;
    using System.Runtime.InteropServices;
    using System.Drawing.Text;
    using System.Text.RegularExpressions;
    using System.Configuration;
    using Microsoft.Win32;
    using System.Windows.Forms.VisualStyles;

    //FOR THIS EXAMPLE CREATE FORM HAVE BUTTON ON IT AND PANEL.
    //button: button's click is "button1_Click"
    //panel: Needed to output the render on it.
    //FILES:
    //Test.Gif
    //These ff files came from the ffmpeg offical site.
    //ffplay.exe //currently using
    //ffmpeg.exe //thinking i need to use to get it how I want.
    //I took most of the code below from https://stackoverflow.com/questions/31465630/ffplay-successfully-moved-inside-my-winform-how-to-set-it-borderless which was a good starting point.

    namespace Test_Form
    {
       public partial class Form1 : Form
       {
           [DllImport("user32.dll", SetLastError = true)]
           private static extern bool MoveWindow(IntPtr hWnd, int X, int Y, int nWidth, int nHeight, bool bRepaint);

           [DllImport("user32.dll")]
           private static extern IntPtr SetParent(IntPtr hWndChild, IntPtr hWndNewParent);


           //Process ffplay = null;

           public Form1()
           {
               InitializeComponent();
               Application.EnableVisualStyles();
               this.DoubleBuffered = true;
           }



           public Process ffplay = new Process();
           private void FFplay()
           {
               ffplay.StartInfo.FileName = "ffplay.exe";
               ffplay.StartInfo.Arguments = "-noborder Test.gif"; //THIS IS WHERE I INPUT THE GIF FILE
               ffplay.StartInfo.CreateNoWindow = true;
               ffplay.StartInfo.RedirectStandardOutput = true;
               ffplay.StartInfo.UseShellExecute = false;

               ffplay.EnableRaisingEvents = true;
               ffplay.OutputDataReceived += (o, e) => Debug.WriteLine(e.Data ?? "NULL", "ffplay");
               ffplay.ErrorDataReceived += (o, e) => Debug.WriteLine(e.Data ?? "NULL", "ffplay");
               ffplay.Exited += (o, e) => Debug.WriteLine("Exited", "ffplay");
               ffplay.Start();

               Thread.Sleep(1000); // you need to wait/check the process started, then...

               // child, new parent
               // make 'this' the parent of ffmpeg (presuming you are in scope of a Form or Control)
               SetParent(ffplay.MainWindowHandle, this.Handle);

               // window, x, y, width, height, repaint
               // move the ffplayer window to the top-left corner and set the size to 320x280
               MoveWindow(ffplay.MainWindowHandle, 800, 600, 320, 280, true);

               SetParent(ffplay.MainWindowHandle, this.panel1.Handle);
               MoveWindow(ffplay.MainWindowHandle, -5, -30, 320, 280, true);
           }

           //runs the FFplay Command
           private void button1_Click(object sender, EventArgs e)
           {
               FFplay();

           }

           private void Form1_FormClosed(object sender, FormClosedEventArgs e)
           {
               try { ffplay.Kill(); }
               catch { }
           }
       }
    }

    I would like the button to let me add any number of gifs (say 5 or 10), all in the same area, animated, with their transparency showing whatever is underneath each gif.

    So, for example, I could have a circle image, then a spinning/loading transparent gif on top of it, and then a gif that counts up/down on top of that, giving the effect of a countdown.
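
    (For reference, if compositing outside the form were acceptable, ffmpeg's overlay filter honors gif alpha, so the layers could be merged into one animation first; a sketch with placeholder filenames: ffmpeg -i circle.gif -i spinner.gif -filter_complex "[0][1]overlay" -y out.gif.)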

    Thanks for all the help!