Other articles (34)

  • MediaSPIP version 0.1 Beta

    16 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP declared "usable".
    The zip file provided here contains only the MediaSPIP sources, in standalone form.
    To get a working installation, you must manually install all the software dependencies on the server.
    If you want to use this archive for an installation in farm mode, you will also need to make further modifications (...)

  • Libraries and binaries specific to video and audio processing

    31 January 2010

    The following software packages and libraries are used by SPIPmotion in one way or another.
    Required binaries: FFMpeg: the main encoder, which transcodes almost every type of video and audio file into formats playable on the web (see this tutorial for its installation); Oggz-tools: inspection tools for ogg files; Mediainfo: retrieves information from most video and audio formats;
    Complementary, optional binaries: flvtool2: (...)

  • HTML5 audio and video support

    10 April 2011

    MediaSPIP uses the HTML5 video and audio tags to play multimedia documents, taking advantage of the latest W3C innovations supported by modern browsers.
    For older browsers, the Flowplayer Flash player is used instead.
    The HTML5 player was created specifically for MediaSPIP: its appearance is fully customizable to match a chosen theme.
    These technologies make it possible to deliver video and audio both to conventional computers (...)

On other sites (4082)

  • JavaCV: avformat_open_input() hangs (not network, but with custom AVIOContext)

    14 October 2015, by Yun Tao Hai

    I'm using a custom AVIOContext to bridge FFMpeg with Java IO. The function avformat_open_input() never returns. I have searched the web for similar problems, all of which were caused by faulty networks or wrong server configurations. However, I'm not using the network at all, as you can see in the following little program:

    package com.example;

    import org.bytedeco.javacpp.*;
    import java.io.File;
    import java.io.IOException;
    import java.io.RandomAccessFile;
    import static org.bytedeco.javacpp.avcodec.*;
    import static org.bytedeco.javacpp.avformat.*;
    import static org.bytedeco.javacpp.avutil.*;
    import static org.bytedeco.javacpp.avdevice.*;
    import static org.bytedeco.javacpp.avformat.AVFormatContext.*;

    public class Test {

       public static void main(String[] args) throws Exception {
           File dir = new File(System.getProperty("user.home"), "Desktop");
           File file = new File(dir, "sample.3gp");
           final RandomAccessFile raf = new RandomAccessFile(file, "r");

           Loader.load(avcodec.class);
           Loader.load(avformat.class);
           Loader.load(avutil.class);
           Loader.load(avdevice.class);
           Loader.load(swscale.class);
           Loader.load(swresample.class);

           avcodec_register_all();
           av_register_all();
           avformat_network_init();
           avdevice_register_all();

           Read_packet_Pointer_BytePointer_int reader = new Read_packet_Pointer_BytePointer_int() {
               @Override
               public int call(Pointer pointer, BytePointer buf, int bufSize) {
                   try {
                       byte[] data = new byte[bufSize]; // this is inefficient, just use as a quick example
                       int read = raf.read(data);

                       if (read <= 0) {
                           System.out.println("EOF found.");
                           return AVERROR_EOF;
                       }

                       System.out.println("Successfully read " + read + " bytes of data.");
                       buf.position(0);
                       buf.put(data, 0, read);
                       return read;
                   } catch (Exception ex) {
                       ex.printStackTrace();
                       return -1;
                   }
               }
           };

           Seek_Pointer_long_int seeker = new Seek_Pointer_long_int() {
               @Override
               public long call(Pointer pointer, long offset, int whence) {
                   try {
                       raf.seek(offset);
                       System.out.println("Successfully seeked to position " + offset + ".");
                       return offset;
                   } catch (IOException ex) {
                       return -1;
                   }
               }
           };

           int inputBufferSize = 32768;
           BytePointer inputBuffer = new BytePointer(av_malloc(inputBufferSize));
           AVIOContext ioContext = avio_alloc_context(inputBuffer, inputBufferSize, 1, null, reader, null, seeker);
           AVInputFormat format = av_find_input_format("3gp");
           AVFormatContext formatContext = avformat_alloc_context();
           formatContext.iformat(format);
           formatContext.flags(formatContext.flags() | AVFMT_FLAG_CUSTOM_IO);
           formatContext.pb(ioContext);

           // This never returns, so result is never assigned.
           int result = avformat_open_input(formatContext, "", format, null);

           // all clean-up code omitted for simplicity
       }

    }

    Below is my sample console output:

    Successfully read 32768 bytes of data.
    Successfully read 32768 bytes of data.
    Successfully read 32768 bytes of data.
    Successfully read 32768 bytes of data.
    Successfully read 32768 bytes of data.
    Successfully read 7240 bytes of data.
    EOF found.

    I've checked the sum of the bytes, which matches the file size; EOF is also hit, meaning the file is read completely. Actually, I am a bit skeptical: why would avformat_open_input() even read the entire file and still not return? There must be something wrong with what I am doing. Can any expert shed some light or point me in the right direction? I'm new to JavaCV and FFmpeg, and especially to programming with buffers. Any help, suggestion or criticism is welcome. Thanks in advance.
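
    One plausible culprit, offered here as an editor's sketch rather than as part of the original question: when FFmpeg probes an input, its avio layer may invoke the seek callback with whence == AVSEEK_SIZE to ask for the total stream size rather than to move the file pointer. A seek callback that treats every call as a plain offset can leave the demuxer spinning. Below is a minimal variant of the seeker above that handles this case, assuming the AVSEEK_SIZE constant exposed by the JavaCPP avformat preset (already covered by the static imports in the program):

    Seek_Pointer_long_int seeker = new Seek_Pointer_long_int() {
        @Override
        public long call(Pointer pointer, long offset, int whence) {
            try {
                if (whence == AVSEEK_SIZE) {
                    // FFmpeg is asking for the total stream size, not seeking.
                    return raf.length();
                }
                raf.seek(offset);
                return offset; // report the new position on success
            } catch (IOException ex) {
                return -1;
            }
        }
    };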

  • ffmpeg stream chrome kiosk mode ubuntu 16.04 server

    21 December 2016, by Raul

    I have a weird out-of-sync issue while using ffmpeg to stream a Chrome browser to YouTube Live from an Ubuntu 16.04 server.

    Issue: the output video streamed to YouTube has audio and video out of sync, sometimes by as much as 3s.

    Current flow:

    1) Start pulseaudio. We use something like this to start it:

    pulseaudio --start -vvv --disallow-exit --log-target=syslog --high-priority --exit-idle-time=-1 --daemonize

    2) Start Xvfb

    Xvfb :0 -ac -screen 0 1920x1080x24

    3) Start Chrome in kiosk mode

    google-chrome --kiosk --disable-gpu --incognito --no-first-run --disable-java --disable-plugins --disable-translate --disk-cache-size=$((1024 * 1024)) --disk-cache-dir=/tmp/chrome/ --user-data-dir=/tmp/chrome/ --force-device-scale-factor=1 --window-size=1920,1080 --window-position=0,0 LOCATION_URL

    4) Start ffmpeg

    ffmpeg -y \
     -thread_queue_size 8192 -rtbufsize 250M -f x11grab -video_size 1920x1080 -framerate 24 -i :0 \
     -thread_queue_size 8192 -channel_layout stereo -f alsa -i pulse \
     -c:v libx264 -pix_fmt yuv420p -c:v libx264 -g 48 -crf 24 -filter:v fps=24 -preset ultrafast -tune zerolatency \
     -c:a aac -strict -2 -channel_layout stereo -ab 96k -ac 2 -flags +global_header \
     -f flv YOUTUBE_LIVE_STREAMING_RTMP

    Note: this runs on an Amazon EC2 instance, meaning there is no sound card, so ALSA and pulseaudio create a dummy audio card. However, the latency does not come from there. Logs:

    Nov 25 06:14:22 ip-172-31-29-8 pulseaudio[26602]: [pulseaudio] protocol-native.c: Adjust latency mode enabled, configuring sink latency to half of overall latency.
    Nov 25 06:14:22 ip-172-31-29-8 pulseaudio[26602]: [pulseaudio] protocol-native.c: Requested latency=23.22 ms, Received latency=23.22 ms
    Nov 25 06:14:22 ip-172-31-29-8 pulseaudio[26602]: [pulseaudio] protocol-native.c: Final latency 69.66 ms = 23.22 ms + 2*11.61 ms + 23.22 ms

    At this point, here's what we observed:

    1. If we start ffmpeg right after issuing the command to start Chrome, we see DTS errors from ffmpeg. Audio is out of sync with the video, running 3-5 seconds AHEAD. We also noticed the offset stays the same for the full duration of the stream.

    2. If we start ffmpeg after around 10 seconds, audio and video are almost in sync. We then manually added -itsoffset -0.125 to the ffmpeg command and everything is perfect.

    Questions:

    1. Why would ffmpeg lag so much if it's started right after Chrome?
    2. Is starting ffmpeg after 10s (or X seconds) the expected behavior? That is, does the system need to wait for the audio/video signals to be "ready" or something?
    3. Is there a way to know reliably when Chrome is fully ready and only then start ffmpeg? We found it sometimes takes 5s, sometimes 10, depending on the URL we load. (See the sketch after this list.)
    4. Besides the DTS errors that ffmpeg throws, is there any other way to know that audio and video are out of sync? Sometimes we have a delay of between 0.5 and 1s, but ffmpeg does not report anything, and a restart is required to "re-balance" the audio/video inputs and bring them back into sync.
    5. Can pulseaudio be the problem in this scenario?
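
    For question 3, one heuristic, sketched here by the editor and not taken from the original post: treat Chrome as "ready" (at least on the audio side) once it has registered a sink input with pulseaudio, which can be polled with pactl. FFMPEG_CMD below is a hypothetical environment variable standing in for the ffmpeg invocation from step 4:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;

    public class WaitForChromeAudio {
        public static void main(String[] args) throws Exception {
            // Poll pulseaudio until Chrome shows up as a sink input.
            while (!chromeHasSinkInput()) {
                Thread.sleep(500);
            }
            // Chrome's audio pipeline is connected; start capturing.
            // FFMPEG_CMD is a hypothetical variable holding the command from step 4.
            new ProcessBuilder("bash", "-c", System.getenv("FFMPEG_CMD"))
                    .inheritIO().start().waitFor();
        }

        static boolean chromeHasSinkInput() throws Exception {
            Process p = new ProcessBuilder("pactl", "list", "sink-inputs").start();
            try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
                // Client names such as "Google Chrome" or "Chromium" appear in the output.
                return r.lines().anyMatch(line -> line.toLowerCase().contains("chrom"));
            }
        }
    }

    This only approximates readiness of the audio path; whether the page has finished rendering video would need a separate check.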

    Thank you

    UPDATE Dec 20

    We were able to do some tricks to force Chrome to start audio on page load, which forces it to connect to pulseaudio. With that, plus a 3s delay before starting ffmpeg, there is no more delay when ffmpeg starts.
    However, our app is a WebRTC app, and a STRANGER thing happens: if we start the page with no webcam/audio, then once the webcam/audio is enabled, ffmpeg (while showing no errors) lags by about 2s. If we keep talking, within at most 30s that delay is GONE.

    So the new questions are:

    1. Besides the DTS errors that ffmpeg throws, is there any other way to know that audio and video are out of sync? Sometimes we have a delay of between 0.5 and 1s, but ffmpeg does not report anything.
    2. What could cause the initial audio/video desync that then catches up on its own?