Advanced search

Media (3)


Other articles (89)

  • Creating farms of unique websites

13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • Authorizations overridden by plugins

    27 April 2010

    MediaSPIP core
    autoriser_auteur_modifier() so that visitors can edit their information on the authors page

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MédiaSpip installation is at version 0.2 or higher. If needed, contact your MédiaSpip administrator to find out.

On other sites (6452)

  • Can't play at a good framerate with JavaAV

    31 March 2017, by TW2

    I have a framerate problem when I use JavaAV (which uses JavaCPP + ffmpeg). When I set my framerate as mentioned in DemuxerExample.java, the frame changes every 47000 ms, which is far too long and wrong. It's better when I use 1000 / demuxer.getFrameRate() * 1000 (~47 ms), but that is also wrong. I can't get a correct framerate. Here is my player class:

    package gr.av;

    import hoary.javaav.Audio;
    import hoary.javaav.AudioFrame;
    import hoary.javaav.Demuxer;
    import hoary.javaav.Image;
    import hoary.javaav.JavaAVException;
    import hoary.javaav.MediaFrame;
    import hoary.javaav.VideoFrame;
    import java.awt.Color;
    import java.awt.Graphics;
    import java.awt.image.BufferedImage;
    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.util.logging.Level;
    import java.util.logging.Logger;
    import javax.sound.sampled.AudioFormat;
    import javax.sound.sampled.AudioInputStream;
    import javax.sound.sampled.AudioSystem;
    import javax.sound.sampled.DataLine;
    import javax.sound.sampled.LineUnavailableException;
    import javax.sound.sampled.SourceDataLine;
    import javax.sound.sampled.UnsupportedAudioFileException;
    import javax.swing.JPanel;

    public class Player extends JPanel {

       VideoThread videoTHREAD;
       AudioThread audioTHREAD;
       BufferedImage img = null;

       public Player() {
           init();
       }

       private void init(){        
           videoTHREAD = new VideoThread(this);
           audioTHREAD = new AudioThread();
       }

       public void setImage(BufferedImage img){
           this.img = img;
           repaint();
       }

       @Override
       public void paint(Graphics g){
           if(img != null){
               g.drawImage(img, 0, 0, null);
           }else{
               g.setColor(Color.blue);
               g.fillRect(0, 0, getWidth(), getHeight());
           }
       }

       public void setFilename(String filename){
           videoTHREAD.setFilename(filename);
           audioTHREAD.setFilename(filename);
       }

       public void play(){
           videoTHREAD.playThread();
           audioTHREAD.playThread();
       }

       public void stop(){
           videoTHREAD.stopThread();
           audioTHREAD.stopThread();
       }

       public static class VideoThread extends Thread {

           //The video filename and a controller
           String filename = null;
           private volatile boolean active = false;

           //Panel to see video
           Player player;

           public VideoThread(Player player) {
               this.player = player;
           }

           public void setFilename(String filename){
               this.filename = filename;
           }

           public void playThread(){
               if(filename != null && active == false){
                   active = true;
                   this.start();
               }
           }

           public void stopThread(){
               if(active == true){
                   active = false;
                   this.interrupt();
               }            
           }

           public void video() throws JavaAVException, InterruptedException, IOException, UnsupportedAudioFileException{            
               Demuxer demuxer = new Demuxer();
               demuxer.open(filename);

               MediaFrame mediaFrame;
               while (active && (mediaFrame = demuxer.readFrame()) != null) {
                   if (mediaFrame.getType() == MediaFrame.Type.VIDEO) {

                       VideoFrame videoFrame = (VideoFrame) mediaFrame;

                       player.setImage(Image.createImage(videoFrame, BufferedImage.TYPE_3BYTE_BGR));

                       double FPS = demuxer.getFrameRate() * 1000d;
                       long ms = (long)(1000d / FPS);
                       System.out.println("FPS = " + FPS + " ; Milliseconds = " + ms);
                       java.util.concurrent.TimeUnit.MILLISECONDS.sleep(ms);
                   }
               }
               demuxer.close();
           }

           @Override
           public void run() {
               if(filename != null){
                   try {
                       video();
                   } catch (JavaAVException | InterruptedException | IOException | UnsupportedAudioFileException ex) {
                          Logger.getLogger(Player.class.getName()).log(Level.SEVERE, null, ex);
                   }
               }
           }

       }

       public static class AudioThread extends Thread {

           //The video filename and a controller
           String filename = null;
           private volatile boolean active = false;

           //Audio
           AudioFormat format = new AudioFormat(44000, 16, 2, true, false);
           AudioInputStream ais;
           DataLine.Info info = new DataLine.Info(SourceDataLine.class, format);
           SourceDataLine soundLine;

           public AudioThread() {            
           }

           public void setFilename(String filename){
               this.filename = filename;
           }

           public void playThread(){
               if(filename != null && active == false){
                   active = true;
                   soundON();
                   this.start();
               }
           }

           public void stopThread(){
               if(active == true){
                   active = false;
                   soundOFF();
                   this.interrupt();
               }            
           }

           public void audio() throws JavaAVException, IOException {            
               Demuxer demuxer = new Demuxer();
               demuxer.open(filename);

               MediaFrame mediaFrame;
               while (active && (mediaFrame = demuxer.readFrame()) != null) {
                   if (mediaFrame.getType() == MediaFrame.Type.AUDIO) {
                       AudioFrame audioFrame = (AudioFrame) mediaFrame;
                       byte[] bytes = Audio.getAudio16(audioFrame);

                       try (ByteArrayInputStream bais = new ByteArrayInputStream(bytes)) {
                           ais = new AudioInputStream(bais, format, bytes.length / format.getFrameSize());
                           try (ByteArrayOutputStream baos = new ByteArrayOutputStream()) {
                               int nBufferSize = bytes.length * format.getFrameSize();
                               byte[] abBuffer = new byte[nBufferSize];

                               while (true){
                                   int nBytesRead = ais.read(abBuffer);
                                   if (nBytesRead == -1)
                                       break;

                                   baos.write(abBuffer, 0, nBytesRead);
                               }

                               byte[] abAudioData = baos.toByteArray();
                               soundLine.write(abAudioData, 0, abAudioData.length);
                           }
                           ais.close();
                       }

                   }
               }

               demuxer.close();
           }

           @Override
           public void run() {
               if(filename != null){
                   try {
                       audio();
                   } catch (JavaAVException | IOException ex) {
                       Logger.getLogger(Player.class.getName()).log(Level.SEVERE, null, ex);
                   }
               }
           }

           private void soundON(){
               try {
                   soundLine = (SourceDataLine) AudioSystem.getLine(info);
                   soundLine.open(format);
                   soundLine.start();
               } catch (LineUnavailableException ex) {
                   Logger.getLogger(Player.class.getName()).log(Level.SEVERE, null, ex);
               }
           }

           private void soundOFF(){
               soundLine.drain();
               soundLine.stop();
               soundLine.close();
           }

       }

    }
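A note on the timing math in video(): multiplying the frame rate by 1000 produces neither a frame rate nor a usable delay. The inter-frame delay in milliseconds is simply 1000 / fps (and even then, sleeping a fixed delay ignores decode time; scheduling against each frame's timestamp is more accurate). A minimal sketch of the delay computation, assuming getFrameRate() returns frames per second (FrameDelay is a hypothetical helper, not part of JavaAV):

```java
public class FrameDelay {

    // Inter-frame delay in milliseconds for a stream running at
    // framesPerSecond frames per second: 1000 / fps.
    public static long delayMillis(double framesPerSecond) {
        return (long) (1000d / framesPerSecond);
    }

    public static void main(String[] args) {
        System.out.println(delayMillis(25.0));  // 40 ms between frames
        System.out.println(delayMillis(29.97)); // 33 ms between frames
    }
}
```

In the loop above this would replace the two lines computing FPS and ms with a single `long ms = FrameDelay.delayMillis(demuxer.getFrameRate());`.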
  • Stream RTP to FFMPEG using SDP

    21 March 2017, by Johnathan Kanarek

    I get an RTP stream from a WebRTC server (I used mediasoup) using node.js, and I receive the decrypted raw RTP packet data from the stream.
    I want to forward this RTP data to ffmpeg and from there save it to a file, or push it as an RTMP stream to other media servers.
    I guess the best way would be to create an SDP file that describes both the audio and video streams and send the packets through new sockets.

    The ffmpeg command is:

    ffmpeg -loglevel debug -protocol_whitelist file,crypto,udp,rtp -re -vcodec libvpx -acodec opus -i test.sdp -vcodec libx264 -acodec aac -y output.mp4

    I tried to send the packets over UDP, using this SDP:

    v=0
    o=mediasoup 7199daf55e496b370e36cd1d25b1ef5b9dff6858 0 IN IP4 192.168.193.182
    s=7199daf55e496b370e36cd1d25b1ef5b9dff6858
    c=IN IP4 192.168.193.182
    t=0 0
    m=audio 33301 RTP/AVP 111
    a=rtpmap:111 /opus/48000
    a=fmtp:111 minptime=10;useinbandfec=1
    a=rtcp-fb:111 transport-cc
    a=sendrecv
    m=video 33302 RTP/AVP 100
    a=rtpmap:100 /VP8/90000
    a=rtcp-fb:100 ccm fir
    a=rtcp-fb:100 nack
    a=rtcp-fb:100 nack pli
    a=rtcp-fb:100 goog-remb
    a=rtcp-fb:100 transport-cc
    a=sendrecv

    But I always get (boring parts removed):

    Opening an input file: test.sdp.

    [sdp @ 0x103dea0] Format sdp probed with size=2048 and score=50
    [sdp @ 0x103dea0] audio codec set to: (null)
    [sdp @ 0x103dea0] audio samplerate set to: 44100
    [sdp @ 0x103dea0] audio channels set to: 1
    [sdp @ 0x103dea0] video codec set to: (null)
    [udp @ 0x10402e0] end receive buffer size reported is 131072
    [udp @ 0x10400c0] end receive buffer size reported is 131072
    [sdp @ 0x103dea0] setting jitter buffer size to 500
    [udp @ 0x1040740] bind failed: Address already in use
    [AVIOContext @ 0x1046980] Statistics: 473 bytes read, 0 seeks
    test.sdp: Invalid data found when processing input

    Note that I get this even if I don't open a socket at all or send anything to this port, as if ffmpeg itself tries to open these ports more than once.

    I also tried opening two TCP servers (video and audio) and defining the SDP with TCP:

    v=0
    o=mediasoup 7199daf55e496b370e36cd1d25b1ef5b9dff6858 0 IN IP4 192.168.193.182
    s=7199daf55e496b370e36cd1d25b1ef5b9dff6858
    c=IN IP4 192.168.193.182
    t=0 0
    m=audio 33301 TCP 111
    a=rtpmap:111 /opus/48000
    a=fmtp:111 minptime=10;useinbandfec=1
    a=rtcp-fb:111 transport-cc
    a=setup:active
    a=connection:new
    a=sendrecv
    m=video 33302 TCP 100
    a=rtpmap:100 /VP8/90000
    a=rtcp-fb:100 ccm fir
    a=rtcp-fb:100 nack
    a=rtcp-fb:100 nack pli
    a=rtcp-fb:100 goog-remb
    a=rtcp-fb:100 transport-cc
    a=setup:active
    a=connection:new
    a=sendrecv

    However, I don't see any incoming connections on my TCP servers, and I get the following from ffmpeg:

    Opening an input file: test.sdp.

    [sdp @ 0xdddea0] Format sdp probed with size=2048 and score=50
    [sdp @ 0xdddea0] audio codec set to: (null)
    [sdp @ 0xdddea0] audio samplerate set to: 44100
    [sdp @ 0xdddea0] audio channels set to: 1
    [sdp @ 0xdddea0] video codec set to: (null)
    [udp @ 0xde02e0] end receive buffer size reported is 131072
    [udp @ 0xde00c0] end receive buffer size reported is 131072
    [sdp @ 0xdddea0] setting jitter buffer size to 500
    [udp @ 0xde0740] end receive buffer size reported is 131072
    [udp @ 0xde0180] end receive buffer size reported is 131072
    [sdp @ 0xdddea0] setting jitter buffer size to 500
    [sdp @ 0xdddea0] Before avformat_find_stream_info() pos: 593 bytes read:593 seeks:0 nb_streams:2
    [libvpx @ 0xdeea80] v1.3.0
    [libvpx @ 0xdeea80] --target=x86_64-linux-gcc --enable-pic --disable-install-srcs --as=nasm --enable-shared --prefix=/usr --libdir=/usr/lib64

    [sdp @ 0xdddea0] Could not find codec parameters for stream 1 (Video: vp8, 1 reference frame, none): unspecified size
    Consider increasing the value for the 'analyzeduration' and 'probesize' options
    [sdp @ 0xdddea0] After avformat_find_stream_info() pos: 593 bytes read:593 seeks:0 frames:0
    Input #0, sdp, from 'test.sdp':
     Metadata:
       title           : 7199daf55e496b370e36cd1d25b1ef5b9dff6858
     Duration: N/A, bitrate: N/A
       Stream #0:0, 0, 1/90000: Audio: opus, 48000 Hz, mono, fltp
       Stream #0:1, 0, 1/90000: Video: vp8, 1 reference frame, none, 90k tbr, 90k tbn, 90k tbc
    Successfully opened the file.
    Parsing a group of options: output file output.mp4.
    Successfully parsed a group of options.
    Opening an output file: output.mp4.
    [file @ 0xde3660] Setting default whitelist 'file,crypto'
    Successfully opened the file.

    detected 1 logical cores
    [graph 0 input from stream 0:0 @ 0xde3940] Setting 'time_base' to value '1/48000'
    [graph 0 input from stream 0:0 @ 0xde3940] Setting 'sample_rate' to value '48000'
    [graph 0 input from stream 0:0 @ 0xde3940] Setting 'sample_fmt' to value 'fltp'
    [graph 0 input from stream 0:0 @ 0xde3940] Setting 'channel_layout' to value '0x4'
    [graph 0 input from stream 0:0 @ 0xde3940] tb:1/48000 samplefmt:fltp samplerate:48000 chlayout:0x4
    [audio format for output stream 0:0 @ 0xe37900] Setting 'sample_fmts' to value 'fltp'
    [audio format for output stream 0:0 @ 0xe37900] Setting 'sample_rates' to value '96000|88200|64000|48000|44100|32000|24000|22050|16000|12000|11025|8000|7350'
    [AVFilterGraph @ 0xde0220] query_formats: 4 queried, 9 merged, 0 already done, 0 delayed

    Output #0, mp4, to 'output.mp4':
     Metadata:
       title           : 7199daf55e496b370e36cd1d25b1ef5b9dff6858
       encoder         : Lavf57.56.100
       Stream #0:0, 0, 1/48000: Audio: aac (LC) ([64][0][0][0] / 0x0040), 48000 Hz, mono, fltp, delay 1024, 69 kb/s
       Metadata:
         encoder         : Lavc57.64.100 aac
    Stream mapping:
     Stream #0:0 -> #0:0 (opus (native) -> aac (native))
    Press [q] to stop, [?] for help
    cur_dts is invalid (this is harmless if it occurs once at the start per stream)

    test.sdp: Connection timed out
    cur_dts is invalid (this is harmless if it occurs once at the start per stream)
    cur_dts is invalid (this is harmless if it occurs once at the start per stream)
    [output stream 0:0 @ 0xde3b40] EOF on sink link output stream 0:0:default.
    No more output streams to write to, finishing.
    [aac @ 0xde2b00] Trying to remove 1024 samples, but the queue is empty
    [aac @ 0xde2b00] Trying to remove 1024 more samples than there are in the queue
    [mp4 @ 0xe6a540] Timestamps are unset in a packet for stream 0. This is deprecated and will stop working in the future. Fix your code to set the timestamps properly
    [mp4 @ 0xe6a540] Encoder did not produce proper pts, making some up.
    [aac @ 0xde2b00] Trying to remove 1024 samples, but the queue is empty
    [aac @ 0xde2b00] Trying to remove 1024 more samples than there are in the queue
    size=       1kB time=00:00:00.04 bitrate= 157.9kbits/s speed=0.00426x
    video:0kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 3268.000000%
    Input file #0 (test.sdp):
     Input stream #0:0 (audio): 0 packets read (0 bytes); 0 frames decoded (0 samples);
     Input stream #0:1 (video): 0 packets read (0 bytes);
     Total: 0 packets (0 bytes) demuxed
    Output file #0 (output.mp4):
     Output stream #0:0 (audio): 0 frames encoded (0 samples); 2 packets muxed (25 bytes);
     Total: 2 packets (25 bytes) muxed
    0 frames successfully decoded, 0 decoding errors
    [AVIOContext @ 0xde37a0] Statistics: 30 seeks, 25 writeouts
    [aac @ 0xde2b00] Qavg: 47249.418

    [AVIOContext @ 0xde6980] Statistics: 593 bytes read, 0 seeks

    Note the "Connection timed out" in the log above.

    I guess that both my SDPs are wrong. Any suggestions?

    Alternatives to SDP are also most welcome.
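One concrete problem in both SDPs above is the rtpmap syntax: per RFC 4566 the attribute is `a=rtpmap:<payload> <encoding>/<clock rate>[/<channels>]`, so the stray leading slash before the codec name leaves the codec unparseable, which matches the `audio codec set to: (null)` lines in the logs (and Opus is conventionally signaled as opus/48000/2, per RFC 7587). A corrected UDP SDP might look like this sketch (rtcp-fb lines omitted for brevity; ffmpeg's SDP demuxer generally ignores attributes it does not understand):

```
v=0
o=mediasoup 7199daf55e496b370e36cd1d25b1ef5b9dff6858 0 IN IP4 192.168.193.182
s=7199daf55e496b370e36cd1d25b1ef5b9dff6858
c=IN IP4 192.168.193.182
t=0 0
m=audio 33301 RTP/AVP 111
a=rtpmap:111 opus/48000/2
a=fmtp:111 minptime=10;useinbandfec=1
m=video 33302 RTP/AVP 100
a=rtpmap:100 VP8/90000
```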

  • Evolution #3237: Use medium-editor

    15 February 2021, by RastaPopoulos ♥

    Hmm, not so sure. When you try the live demo there are lots of problems: the "B" (bold) button lights up even when you're on a link or an H4, H5, etc., where there is no strong element, just because the CSS style happens to be bold or something... The link editor is awful: you can only add links, but when you're on a piece of text that already has a link and you click on it, it deletes the link entirely! Plus several other problems and gaps of that kind... it's really not the best WYSIWYM editor I've used...