Other articles (108)

  • Videos

    21 April 2011

    Like "audio" documents, MediaSPIP displays videos whenever possible using the HTML5 video tag.
    One drawback of this tag is that it is not recognized correctly by some browsers (Internet Explorer, to name just one) and that each browser natively supports only certain video formats.
    Its main advantage is that video playback is handled natively by the browser, so there is no need for Flash and (...)

  • Customizing by adding your logo, banner or background image

    5 September 2013

    Some themes support three customization elements: adding a logo; adding a banner; adding a background image.

  • Websites made with MediaSPIP

    2 May 2011

    This page lists some websites based on MediaSPIP.

On other websites (9533)

  • FFmpeg library drops frame in mp4 video using filter 'fps' or 'framerate'

    28 June 2019, by Dzmitry

    After decoding/encoding an original mp4 video with fps 25 I get a video with fps 25, but if I set the video filter fps=fps=25 then the last frame is dropped and I get the following result (AV_LOG_TRACE):

    ...
    [Parsed_fps_0 @ 0x7f8f2c735f80] fps=25/1
    [Parsed_fps_0 @ 0x7f8f2c735f80] Read frame with in pts 1024, out pts 2
    [Parsed_fps_0 @ 0x7f8f2c735f80] Read frame with in pts 1536, out pts 3
    [Parsed_fps_0 @ 0x7f8f2c735f80] Set first pts to 2
    [Parsed_fps_0 @ 0x7f8f2c735f80] Writing frame with pts 2 to pts 2
    [Parsed_fps_0 @ 0x7f8f2c735f80] Read frame with in pts 2048, out pts 4
    [Parsed_fps_0 @ 0x7f8f2c735f80] Writing frame with pts 3 to pts 3
    [Parsed_fps_0 @ 0x7f8f2c735f80] Read frame with in pts 2560, out pts 5
    [Parsed_fps_0 @ 0x7f8f2c735f80] Writing frame with pts 4 to pts 4
    ...
    [Parsed_fps_0 @ 0x7f8f2c735f80] Read frame with in pts 90624, out pts 177
    [Parsed_fps_0 @ 0x7f8f2c735f80] Writing frame with pts 176 to pts 176
    [Parsed_fps_0 @ 0x7f8f2c735f80] Read frame with in pts 91136, out pts 178
    [Parsed_fps_0 @ 0x7f8f2c735f80] Writing frame with pts 177 to pts 177
    [Parsed_fps_0 @ 0x7f8f2c735f80] Dropping frame with pts 178
    [Parsed_fps_0 @ 0x7f8f2c735f80] 177 frames in, 176 frames out; 1 frames dropped, 0 frames duplicated.

    I've tried to set input/output options like these:

    //input
    av_dict_set(&options, "r", "25", 0); //doesn't help
    avformat_open_input(&pFormatContext, NULL, NULL, &options);
    //input video graph filter fps
    ...
    // encoder
    AVOutputFormat* pOutputFormat = av_guess_format("mp4", NULL, NULL);
    AVFormatContext* pFormatContext = avformat_alloc_context();
    pFormatContext->oformat = pOutputFormat;
    AVStream* pVideoStream = avformat_new_stream(pFormatContext, NULL);

    AVCodec* pCodec = avcodec_find_encoder_by_name("libx264");
    AVCodecContext* pCodecContext = avcodec_alloc_context3(pCodec);
    pCodecContext->codec_id = pOutputFormat->video_codec;
    pCodecContext->codec_type = AVMEDIA_TYPE_VIDEO;
    pCodecContext->level = 51;
    pCodecContext->width = 800;
    pCodecContext->height = 600;
    pCodecContext->time_base = (AVRational) {1, 25};
    pCodecContext->framerate = av_inv_q(pCodecContext->time_base);
    pCodecContext->gop_size = 12;
    pCodecContext->max_b_frames = 1;
    pCodecContext->qcompress = 1.0f;
    pCodecContext->pix_fmt = AV_PIX_FMT_YUV420P;
    //...
    av_opt_set(pCodecContext->priv_data, "x264opts", "fps=25", 0); //doesn't help

    pVideoStream->time_base = pCodecContext->time_base;
    pVideoStream->avg_frame_rate = pCodecContext->framerate; //doesn't help
    // output
    av_dict_set(&options, "r", "25", 0); //doesn't help
    avformat_write_header(pFormatContext, &options);
    // flush decoder
    // flush filterbuffer and encoder
    av_write_trailer(pFormatContext);

    I expected to get a video with fps 25 and no dropped frames.
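
    For reference, here is a minimal sketch of the EOF handling that the comments above (// flush decoder, // flush filterbuffer and encoder) refer to, since the trace shows the drop happening when the filter reaches end of stream. The names pBufferSrcCtx, pBufferSinkCtx, pFiltFrame and pPacket are assumptions (endpoints of the filter graph containing fps=25 and scratch objects, not taken from the excerpt); recent FFmpeg versions also document an eof_action option (round or pass) on the fps filter that affects what happens to the last buffered frame.

    // Needs <libavfilter/buffersrc.h> and <libavfilter/buffersink.h> in addition
    // to the headers used above. Sketch only: pBufferSrcCtx, pBufferSinkCtx,
    // pCodecContext, pVideoStream and pFormatContext are assumed to be the
    // filter-graph, encoder and muxer contexts set up earlier.
    AVFrame* pFiltFrame = av_frame_alloc();
    AVPacket* pPacket = av_packet_alloc();

    // Signal EOF to the filter graph so 'fps' can handle its last buffered frame.
    av_buffersrc_add_frame_flags(pBufferSrcCtx, NULL, 0);

    // Drain the remaining filtered frames and feed them to the encoder.
    while (av_buffersink_get_frame(pBufferSinkCtx, pFiltFrame) >= 0) {
        avcodec_send_frame(pCodecContext, pFiltFrame);
        while (avcodec_receive_packet(pCodecContext, pPacket) >= 0) {
            av_packet_rescale_ts(pPacket, pCodecContext->time_base, pVideoStream->time_base);
            pPacket->stream_index = pVideoStream->index;
            av_interleaved_write_frame(pFormatContext, pPacket);
        }
        av_frame_unref(pFiltFrame);
    }

    // Flush the encoder itself, then write the trailer as in the excerpt above.
    avcodec_send_frame(pCodecContext, NULL);
    while (avcodec_receive_packet(pCodecContext, pPacket) >= 0) {
        av_packet_rescale_ts(pPacket, pCodecContext->time_base, pVideoStream->time_base);
        pPacket->stream_index = pVideoStream->index;
        av_interleaved_write_frame(pFormatContext, pPacket);
    }

    av_packet_free(&pPacket);
    av_frame_free(&pFiltFrame);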

  • Using HW accelerated video encoding

    9 June 2019, by RoboGil

    I am trying to get ffmpeg HW acceleration on an UpBoard.
    It uses an Atom x5-Z8350 processor and an Intel HD 400 GPU.

    I have also followed this tutorial in order to compile ffmpeg with libmfx enabled.

    $ ffmpeg -encoders | grep qsv
    ffmpeg version 4.1.3 Copyright (c) 2000-2019 the FFmpeg developers
     built with gcc 5.4.0 (Ubuntu 5.4.0-6ubuntu1~16.04.11) 20160609
     configuration: --enable-libmfx
     libavutil      56. 22.100 / 56. 22.100
     libavcodec     58. 35.100 / 58. 35.100
     libavformat    58. 20.100 / 58. 20.100
     libavdevice    58.  5.100 / 58.  5.100
     libavfilter     7. 40.101 /  7. 40.101
     libswscale      5.  3.100 /  5.  3.100
     libswresample   3.  3.100 /  3.  3.100
    V..... h264_qsv             H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10 (Intel Quick Sync Video acceleration) (codec h264)
    V..... hevc_qsv             HEVC (Intel Quick Sync Video acceleration) (codec hevc)
    V..... mjpeg_qsv            MJPEG (Intel Quick Sync Video acceleration) (codec mjpeg)
    V..... mpeg2_qsv            MPEG-2 video (Intel Quick Sync Video acceleration) (codec mpeg2video)

    However, when trying to encode a video I am getting:

    $ ffmpeg -i in.mp4 -c:v h264_qsv -global_quality 10 -look_ahead 1 out.mp4


    ffmpeg version 4.1.3 Copyright (c) 2000-2019 the FFmpeg developers
     built with gcc 5.4.0 (Ubuntu 5.4.0-6ubuntu1~16.04.11) 20160609
     configuration: --enable-libmfx
     libavutil      56. 22.100 / 56. 22.100
     libavcodec     58. 35.100 / 58. 35.100
     libavformat    58. 20.100 / 58. 20.100
     libavdevice    58.  5.100 / 58.  5.100
     libavfilter     7. 40.101 /  7. 40.101
     libswscale      5.  3.100 /  5.  3.100
     libswresample   3.  3.100 /  3.  3.100
    Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'in.mp4':
     Metadata:
       major_brand     : mp42
       minor_version   : 0
       compatible_brands: isommp42
       creation_time   : 2019-06-04T19:06:40.000000Z
     Duration: 00:04:03.07, start: 0.000000, bitrate: 408 kb/s
       Stream #0:0(und): Video: h264 (Constrained Baseline) (avc1 / 0x31637661), yuv420p(tv, bt709), 640x360 [SAR 1:1 DAR 16:9], 310 kb/s, 25 fps, 25 tbr, 12800 tbn, 50 tbc (default)
       Metadata:
         creation_time   : 2019-06-04T19:06:40.000000Z
         handler_name    : ISO Media file produced by Google Inc. Created on: 06/04/2019.
       Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 95 kb/s (default)
       Metadata:
         creation_time   : 2019-06-04T19:06:40.000000Z
         handler_name    : ISO Media file produced by Google Inc. Created on: 06/04/2019.
    File 'out.mp4' already exists. Overwrite ? [y/N] y
    Stream mapping:
     Stream #0:0 -> #0:0 (h264 (native) -> h264 (h264_qsv))
     Stream #0:1 -> #0:1 (aac (native) -> aac (native))
    Press [q] to stop, [?] for help


    [h264_qsv @ 0x3b30a00] Error initializing an internal MFX session: unsupported (-3)
    Error initializing output stream 0:0 -- Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height
    Conversion failed!

    What am I missing?

    Thank you very much!
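
    For what it is worth, here is a minimal diagnostic C sketch (assuming FFmpeg 4.x headers; names are illustrative and not from the question) that checks for the h264_qsv encoder and then tries to create a QSV hardware device context, which goes through roughly the same Intel Media SDK (MFX) session initialization that fails in the log above.

    #include <stdio.h>
    #include <libavcodec/avcodec.h>
    #include <libavutil/error.h>
    #include <libavutil/hwcontext.h>

    int main(void)
    {
        // The encoder is compiled in, as the 'ffmpeg -encoders' output above shows.
        const AVCodec* codec = avcodec_find_encoder_by_name("h264_qsv");
        if (!codec) {
            fprintf(stderr, "h264_qsv encoder not found in this build\n");
            return 1;
        }

        // Creating a QSV device context initializes an MFX session, which is
        // roughly the step behind "Error initializing an internal MFX session"
        // when no usable Media SDK runtime or GPU driver is available.
        AVBufferRef* hw_device_ctx = NULL;
        int ret = av_hwdevice_ctx_create(&hw_device_ctx, AV_HWDEVICE_TYPE_QSV,
                                         NULL, NULL, 0);
        if (ret < 0) {
            char err[AV_ERROR_MAX_STRING_SIZE];
            av_strerror(ret, err, sizeof(err));
            fprintf(stderr, "Failed to create QSV device: %s\n", err);
            return 1;
        }

        printf("QSV device created successfully\n");
        av_buffer_unref(&hw_device_ctx);
        return 0;
    }

    If this small program fails the same way, the problem likely lies in the MFX runtime or GPU driver setup on the system rather than in the ffmpeg command line.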

  • How to programmatically read an audio RTP stream using javacv and ffmpeg?

    21 May 2019, by Chris

    I am trying to read an audio RTP stream coming from ffmpeg on the command line using javaCV. I create a DatagramSocket that listens on a specified port but can't get the audio frames.

    I have tried different buffer types to play the audio through my speakers, but I am getting a lot of "Invalid return value 0 for stream protocol" error messages and no audio in my speakers.

    I am running the following command to stream an audio file:

    ffmpeg -re -i /some/file.wav -ar 44100 -f mulaw -f rtp rtp://127.0.0.1:7780

    And here is an excerpt of my code so far:

    public class FrameGrabber implements Runnable {

    private static final TimeUnit SECONDS = TimeUnit.SECONDS;
    private InetAddress ipAddress;
    private DatagramSocket serverSocket;

    public FrameGrabber(Integer port) throws UnknownHostException, SocketException {
       super();

       this.ipAddress = InetAddress.getByName("192.168.44.18");
       serverSocket = new DatagramSocket(port, ipAddress);

    }

    public AudioFormat getAudioFormat() {
       float sampleRate = 44100.0F;
       // 8000,11025,16000,22050,44100
       int sampleSizeInBits = 16;
       // 8,16
       int channels = 1;
       // 1,2
       boolean signed = true;
       // true,false
       boolean bigEndian = false;
       // true,false
       return new AudioFormat(sampleRate, sampleSizeInBits, channels, signed, bigEndian);
    }

    @Override
    public void run() {


       byte[] buffer = new byte[2048];
       DatagramPacket packet = new DatagramPacket(buffer, buffer.length);

       DataInputStream dis = new DataInputStream(new ByteArrayInputStream(packet.getData(), packet.getOffset(), packet.getLength()));


       FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(dis);
       grabber.setFormat("mulaw");
       grabber.setSampleRate((int) getAudioFormat().getSampleRate());
       grabber.setAudioChannels(getAudioFormat().getChannels());

       SourceDataLine soundLine = null;


       try {
           grabber.start();


           if (grabber.getSampleRate() > 0 && grabber.getAudioChannels() > 0) {

               AudioFormat audioFormat = new AudioFormat(grabber.getSampleRate(), 16, grabber.getAudioChannels(), true, true);

               DataLine.Info info = new DataLine.Info(SourceDataLine.class, audioFormat);
               soundLine = (SourceDataLine) AudioSystem.getLine(info);
               soundLine.open(audioFormat);

               soundLine.start();
           }

           ExecutorService executor = Executors.newSingleThreadExecutor();


           while (true) {

               try {
                   serverSocket.receive(packet);
               } catch (IOException e) {
                   e.printStackTrace();
               }

               Frame frame = grabber.grab();

               //if (frame == null) break;


               if (frame != null && frame.samples != null) {

                   ShortBuffer channelSamplesFloatBuffer = (ShortBuffer) frame.samples[0];
                   channelSamplesFloatBuffer.rewind();

                   ByteBuffer outBuffer = ByteBuffer.allocate(channelSamplesFloatBuffer.capacity() * 2);
                   float[] samples = new float[channelSamplesFloatBuffer.capacity()];

                   for (int i = 0; i < channelSamplesFloatBuffer.capacity(); i++) {
                       short val = channelSamplesFloatBuffer.get(i);
                       outBuffer.putShort(val);
                   }

                   if (soundLine == null) return;
                   try {
                       SourceDataLine finalSoundLine = soundLine;
                       executor.submit(() -> {
                           finalSoundLine.write(outBuffer.array(), 0, outBuffer.capacity());
                           outBuffer.clear();
                       }).get();
                   } catch (InterruptedException interruptedException) {
                       Thread.currentThread().interrupt();
                   }
               }

           }

           /*
           executor.shutdownNow();
           executor.awaitTermination(1, SECONDS);

           if (soundLine != null) {
               soundLine.stop();
           }

           grabber.stop();
           grabber.release();*/

        } catch (ExecutionException ex) {
           System.out.println("ExecutionException");
           ex.printStackTrace();
       } catch (org.bytedeco.javacv.FrameGrabber.Exception ex) {
           System.out.println("FrameGrabberException");
           ex.printStackTrace();
       } catch (LineUnavailableException ex) {
           System.out.println("LineUnavailableException");
           ex.printStackTrace();
       }/* catch (InterruptedException e) {
           System.out.println("InterruptedException");
           e.printStackTrace();
       }*/


    }

    public static void main(String[] args) throws SocketException, UnknownHostException {
       Runnable apRunnable = new FrameGrabber(7780);
       Thread ap = new Thread(apRunnable);
       ap.start();
    }
    }

    At this stage, I am trying to play the audio file through my speakers, but I am getting the following logs:

    Task :FrameGrabber.main()
    Invalid return value 0 for stream protocol
    Invalid return value 0 for stream protocol
    Input #0, mulaw, from 'java.io.DataInputStream@474e6cea':
      Duration: N/A, bitrate: 352 kb/s
        Stream #0:0: Audio: pcm_mulaw, 44100 Hz, 1 channels, s16, 352 kb/s
    Invalid return value 0 for stream protocol
    Invalid return value 0 for stream protocol
    Invalid return value 0 for stream protocol
    Invalid return value 0 for stream protocol
    ...

    What am I doing wrong?

    Thanks in advance!
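
    As a point of comparison, FFmpegFrameGrabber wraps the FFmpeg libraries, and at the libavformat level an RTP session like this is usually consumed through a small SDP description rather than by reading raw UDP packets yourself. Below is a hedged C sketch of that approach; stream.sdp is a hypothetical file describing the rtp://127.0.0.1:7780 session (ffmpeg can write one with its -sdp_file option on the sending side).

    #include <inttypes.h>
    #include <stdio.h>
    #include <libavformat/avformat.h>

    int main(void)
    {
        avformat_network_init();

        // The protocol whitelist is needed because the SDP file references
        // rtp/udp addresses.
        AVDictionary* opts = NULL;
        av_dict_set(&opts, "protocol_whitelist", "file,udp,rtp", 0);

        AVFormatContext* fmt = NULL;
        if (avformat_open_input(&fmt, "stream.sdp", NULL, &opts) < 0) {
            fprintf(stderr, "Could not open RTP session\n");
            return 1;
        }
        if (avformat_find_stream_info(fmt, NULL) < 0) {
            fprintf(stderr, "Could not read stream info\n");
            return 1;
        }

        // libavformat owns the UDP socket and performs the RTP depacketization,
        // so no DatagramSocket is needed on the receiving side.
        AVPacket* pkt = av_packet_alloc();
        while (av_read_frame(fmt, pkt) >= 0) {
            printf("stream %d, pts %" PRId64 ", %d bytes\n",
                   pkt->stream_index, pkt->pts, pkt->size);
            av_packet_unref(pkt);
        }

        av_packet_free(&pkt);
        avformat_close_input(&fmt);
        av_dict_free(&opts);
        return 0;
    }

    The same idea may apply on the javacv side: pointing the grabber at the SDP file or RTP URL and letting it own the socket, instead of feeding it a DataInputStream built from a DatagramPacket, is one thing worth trying against the "Invalid return value 0 for stream protocol" errors.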