
Other articles (21)

  • Retrieving information from the master site when installing an instance

    26 November 2010

    Purpose
    On the main site, a shared ("mutualised") instance is defined by several things: the data in the spip_mutus table; its logo; its main author (the id_admin in the spip_mutus table, corresponding to an id_auteur in the spip_auteurs table), who is the only one able to definitively create the shared instance;
    It can therefore make good sense to retrieve some of this information in order to complete the installation of an instance, for example to retrieve the (...)

  • Custom menus

    14 November 2010

    MediaSPIP uses the Menus plugin to manage several configurable navigation menus.
    This lets channel administrators configure those menus in detail.
    Menus created when the site is initialised
    By default, three menus are created automatically when the site is initialised: the main menu; identifier: barrenav; this menu is generally inserted at the top of the page after the header block, and its identifier makes it compatible with Zpip-based templates; (...)

  • Contribute to translation

    13 April 2011

    You can help us improve the language used in the software interface to make MediaSPIP more accessible and user-friendly. You can also translate the interface into any language, which allows it to spread to new linguistic communities.
    To do this, we use the SPIP translation interface, where all the language modules of MediaSPIP are available. Just subscribe to the mailing list and request further information on translation.
    MediaSPIP is currently available in French and English (...)

On other sites (5576)

  • Anomalie #3461 (New): bug in the URLs caused by an incorrect global $profondeur_url

    4 June 2015, by Maïeul Rouquette

    The symptom

    From time to time (but fairly frequently in my case), when a site lives in a subdirectory, the URLs produced by #URL_ARTICLE and friends contain "../". As a result, clicking them goes up one level and lands on a 404 page.

    The cause

    Apparently, according to https://core.spip.net/projects/spip/repository/revisions/20729, the cause is an incorrect global $profondeur_url.
    Quoting ESJ:

    Finally understood why SPIP sometimes compiles templates where the global profondeur_url is incorrect. When a redirection such as "ErrorDocument 403 /?page=403" is placed in ecrire/.htaccess, Apache curiously puts the original URL (the one containing .../ecrire/...) in $_SERVER['REQUEST_URI'], while it puts the redirection URL (in the example above, a page at the root) in $_SERVER['SCRIPT_NAME']. As a result, that root page is compiled with the URL depth of ecrire/ rather than that of the root. If that page and its inclusions are cached, all the other pages sharing them end up with bad URLs.
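    For reference, the redirection ESJ describes is a one-line directive; this is a reconstruction from the quote, not the exact file:

    ```apache
    # In ecrire/.htaccess: serve a root page for 403 errors.
    # With this in place, Apache reports the original URL in
    # $_SERVER['REQUEST_URI'] but the redirect target in
    # $_SERVER['SCRIPT_NAME'], which throws off SPIP's URL-depth detection.
    ErrorDocument 403 /?page=403
    ```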

    The attempted fixes

    Two commits on the 2.1 branch tried to fix the problems:
    https://core.spip.net/projects/spip/repository/revisions/20729
    et https://core.spip.net/projects/spip/repository/revisions/20762

    They were ported to 3.0 in
    - https://core.spip.net/projects/spip/repository/revisions/20744
    - https://core.spip.net/projects/spip/repository/revisions/20745
    then reverted, as they were apparently not fully functional, in

    - https://core.spip.net/projects/spip/repository/revisions/20746
    - https://core.spip.net/projects/spip/repository/revisions/20747

    As far as I know, things have been at a standstill ever since.

  • What is Google Analytics data sampling and what’s so bad about it?

    16 August 2019, by Joselyn Khor (Analytics Tips, Development)


    Google (2019) explains what data sampling is:

    “In data analysis, sampling is the practice of analysing a subset of all data in order to uncover the meaningful information in the larger data set.”[1]

    This basically means that instead of analysing all of the data, only data up to a threshold is analysed; anything beyond that threshold is an estimate extrapolated from the sampled patterns.

    Google’s (2019) data sampling thresholds:

    Ad-hoc queries of your data are subject to the following general thresholds for sampling:
    [Google] Analytics Standard: 500K sessions at the property level for the date range you are using
    [Google] Analytics 360: 100M sessions at the view level for the date range you are using (para. 3) [2]

    This threshold is limiting because your data in GA may become more inaccurate as the traffic to your website increases.

    Say you’re looking through all your traffic data from the last year and find you have 5 million page views. Only 500K of that 5 million is analysed accurately! The data for the remaining 4.5 million (90%) is an assumption based on the 500K sample.
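    To make the arithmetic concrete, here is a minimal sketch of how the estimated share grows with traffic. It assumes the 500K Analytics Standard threshold quoted above; the traffic figures are purely illustrative:

    ```java
    public class SamplingShare {
        // Fraction of the data that exceeds the sampling threshold and is
        // therefore estimated rather than directly measured.
        static double estimatedShare(long total, long threshold) {
            if (total <= threshold) return 0.0;
            return (double) (total - threshold) / total;
        }

        public static void main(String[] args) {
            long threshold = 500_000L; // GA Standard ad-hoc query threshold
            for (long total : new long[] {400_000L, 5_000_000L, 50_000_000L}) {
                System.out.printf("%,d page views -> %.0f%% estimated%n",
                        total, 100 * estimatedShare(total, threshold));
            }
        }
    }
    ```

    With 5 million page views this reproduces the 90% figure above; at 50 million, 99% of the report would be extrapolated.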

    This is a key weapon Google uses to sell to large businesses. In order to increase that threshold for more accurate reporting, upgrading to premium Google Analytics 360 for approximately US$150,000 per year seems to be the only choice.

    What’s so bad about data sampling?

    It’s unfair to say sampled data should be disregarded completely. A calculation ensures it is representative and can give you good-enough insights. However, we don’t encourage it, because we don’t just want “good enough” data. We want the actual facts.

    In a recent survey sent to Matomo customers, we found a large proportion of users switched from GA to Matomo due to the data sampling issue.

    The two reasons why data sampling isn’t preferable:

    1. If the selected sample size is too small, you won’t get a good representation of all the data.
    2. The bigger your website grows, the more inaccurate your reports will become.

    As an example of why we don’t fully trust sampled data: say you have an ecommerce store and your GA revenue reports don’t match the actual sales data because of data sampling. In GA you may be seeing revenue for the month as $1 million, instead of actual sales of $800K.

    The sampling here has caused an inaccuracy that could have negative financial implications. What you get in the GA report is an estimated dollar figure rather than the actual sales. Making decisions based on inaccurate data can be costly in this case. 

    Another disadvantage of sampled data is that you might be missing out on opportunities you would have noticed if you were given a view of the whole, e.g. not being able to see real patterns because the data has already been extrapolated.

    Not getting the chance to see things as they are, and only being able to jump to the conclusions and assumptions made by GA, is risky. The bigger your business grows, the less you can risk making business decisions based on assumptions that could be inaccurate.

    If you feel you could be missing out on opportunities because your GA data is sampled data, get 100% accurately reported data. 

    The benefits of 100% accurate data

    Matomo doesn’t use data sampling on any of our products or plans. You get to see all of your data and not a sampled data set.

    Data quality is necessary for high impact decision-making. It’s hard to make strategic changes if you don’t have confidence that your data is reliable and accurate.

    Learn about how Matomo is a serious contender to Google Analytics 360. 

    Now you can import your Google Analytics data directly into your Matomo

    If you’re wanting to make the switch to Matomo but worried about losing all your historic Google Analytics data, you can now import this directly into your Matomo with the Google Analytics Importer tool.


    Take the challenge!

    Compare your Google Analytics data (sampled data) against your Matomo data, or if you don’t have Matomo data yet, sign up for our 30-day free trial and start tracking!

    References :

    [1 & 2] About data sampling. (2019). In Analytics Help. Retrieved August 14, 2019, from https://support.google.com/analytics/answer/2637192

  • Send AVPacket over Network

    2 December 2019, by Yondonator

    I’m generating AVPackets with an FFmpeg encoder, and now I want to send them over UDP to another computer and display them there.
    The problem is that I don’t know how to convert a packet to bytes and back. I tried this to copy the packet:

    AVPacket newPacket = avcodec.av_packet_alloc();


    ByteBuffer byteBuffer = packet.buf().buffer().asByteBuffer();
    int bufferSize = byteBuffer.capacity();
    byte bytes[] = new byte[bufferSize];
    byteBuffer.get(bytes);
    AVBufferRef newBufferRef = avutil.av_buffer_alloc(bufferSize);
    newBufferRef.data(new BytePointer(bytes));
    newPacket.buf(newBufferRef);


    ByteBuffer dataBuffer = packet.data().asByteBuffer();
    int dataSize = dataBuffer.capacity();
    byte dataBytes[] = new byte[dataSize];
    dataBuffer.get(dataBytes);
    BytePointer dataPointer = new BytePointer(dataBytes);
    newPacket.data(dataPointer);


    newPacket.dts(packet.dts());
    newPacket.duration(packet.duration());
    newPacket.flags(packet.flags());
    newPacket.pos(packet.pos());
    newPacket.pts(packet.pts());
    newPacket.side_data_elems(0);
    newPacket.size(packet.size());
    newPacket.stream_index(packet.stream_index());


    videoPlayer.sendPacket(newPacket);

    This gives me this error:

    [h264 @ 0000018951be8440] Invalid NAL unit size (3290676 > 77).
    [h264 @ 0000018951be8440] Error splitting the input into NAL units.
    [h264 @ 0000018951bf6480] Invalid NAL unit size (15305314 > 163).
    [h264 @ 0000018951bf6480] Error splitting the input into NAL units.

    The problem is newPacket.data(). When I set it directly with newPacket.data(packet.data()), it works. Also, packet.data().asByteBuffer().capacity() returns 1 and packet.data().capacity() returns 0.

    This is my method that creates the decoder :

    private void startUnsafe() throws Exception
       {
           int result;

           convertContext = null;
           codec = null;
           codecContext = null;
           AVFrame = null;
           RGBAVFrame = null;
           frame = new Frame();

           codec = avcodec_find_decoder(codecID);
           if(codec == null)
           {
               throw new Exception("Unable to find decoder");
           }

           codecContext = avcodec_alloc_context3(codec);
           if(codecContext == null)
           {
               releaseUnsafe();
               throw new Exception("Unable to alloc codec context!");
           }

           AVCodecParameters para = avcodec_parameters_alloc();
           para.bit_rate(streamBitrate);
           para.width(streamWidth);
           para.height(streamHeight);
           para.codec_id(codecID);
           para.codec_type(AVMEDIA_TYPE_VIDEO);
           try
           {
               byte extradataByte[] = Files.readAllBytes(new File("extradata.byte").toPath());
               para.extradata(new BytePointer(extradataByte));
               para.extradata_size(extradataByte.length);
           }
           catch (IOException e1)
           {
               e1.printStackTrace();
               throw new Exception("extradata file not available");
           }

           result = avcodec_parameters_to_context(codecContext, para);
           if(result < 0)
           {
               throw new Exception("Unable to copy parameters to context! [" + result + "]");
           }

           codecContext.thread_count(0);

           result = avcodec_open2(codecContext, codec, new AVDictionary());
           if(result < 0)
           {
               releaseUnsafe();
               throw new Exception("Unable to open codec context![" + result + "]");
           }

           AVFrame = av_frame_alloc();
           if(AVFrame == null)
           {
               releaseUnsafe();
               throw new Exception("Unable to alloc AVFrame!");
           }

           RGBAVFrame = av_frame_alloc();
           if(RGBAVFrame == null)
           {
               releaseUnsafe();
               throw new Exception("Unable to alloc AVFrame!");
           }
           initRGBAVFrame();

           TimerTask task = new TimerTask() {

               @Override
               public void run()
               {
                   timerTask();
               }
           };
           timer = new Timer();
           timer.scheduleAtFixedRate(task, 0, (long) (1000/streamFramerateDouble));

           window.setVisible(true);
       }

    The file extradata.byte contains some bytes I got from another video, because without them it doesn’t work either.

    EDIT:

    package org.stratostream.streaming;

    import java.nio.ByteBuffer;


    import org.bytedeco.javacpp.BytePointer;
    import org.bytedeco.javacpp.Pointer;
    import org.bytedeco.javacpp.avcodec;
    import org.bytedeco.javacpp.avutil;
    import org.bytedeco.javacpp.avcodec.AVPacket;
    import org.bytedeco.javacpp.avcodec.AVPacketSideData;


    public class PacketIO {


       public static final int SIDE_DATA_FIELD = 0;
       public static final int SIDE_ELEMENTS_FIELD = 4;
       public static final int SIDE_TYPE_FIELD = 8;
       public static final int DTS_FIELD = 12;
       public static final int PTS_FIELD = 20;
       public static final int FLAGS_FIELD = 28;
       public static final int DATA_OFFSET = 32;

       public static byte[] toByte(AVPacket packet) throws Exception
       {
           int dataSize = packet.size();
           ByteBuffer dataBuffer = packet.data().capacity(dataSize).asByteBuffer();
           byte dataBytes[] = new byte[dataSize];
           dataBuffer.get(dataBytes);

           AVPacketSideData sideData = packet.side_data();
           int sideSize = sideData.size();
           ByteBuffer sideBuffer = sideData.data().capacity(sideSize).asByteBuffer();
           byte sideBytes[] = new byte[sideSize];
           sideBuffer.get(sideBytes);

           int sideOffset = DATA_OFFSET + dataSize;
           int resultSize = sideOffset + sideSize;
           byte resultBytes[] = new byte[resultSize];
           System.arraycopy(dataBytes, 0, resultBytes, DATA_OFFSET, dataSize);
           System.arraycopy(sideBytes, 0, resultBytes, sideOffset, sideSize);
           resultBytes[SIDE_DATA_FIELD] = (byte) (sideOffset >>> 24);
           resultBytes[SIDE_DATA_FIELD+1] = (byte) (sideOffset >>> 16);
           resultBytes[SIDE_DATA_FIELD+2] = (byte) (sideOffset >>> 8);
           resultBytes[SIDE_DATA_FIELD+3] = (byte) (sideOffset >>> 0);

           int sideType = sideData.type();
           intToByte(resultBytes, SIDE_TYPE_FIELD, sideType);

           int sideElements = packet.side_data_elems();
           intToByte(resultBytes, SIDE_ELEMENTS_FIELD, sideElements);

           long dts = packet.dts();
           longToByte(resultBytes, DTS_FIELD, dts);

           long pts = packet.pts();
           longToByte(resultBytes, PTS_FIELD, pts);

           int flags = packet.flags();
           intToByte(resultBytes, FLAGS_FIELD, flags);

           return resultBytes;
       }

       public static AVPacket toPacket(byte bytes[]) throws Exception
       {
           AVPacket packet = avcodec.av_packet_alloc();

           int sideOffset = byteToInt(bytes, SIDE_DATA_FIELD);
           int sideElements = byteToInt(bytes, SIDE_ELEMENTS_FIELD);
           int sideType = byteToInt(bytes, SIDE_TYPE_FIELD);
           int dataSize = sideOffset - DATA_OFFSET;
           int sideSize = bytes.length - sideOffset;

           long pts = byteToLong(bytes, PTS_FIELD);
           long dts = byteToLong(bytes, DTS_FIELD);
           int flags = byteToInt(bytes, FLAGS_FIELD);

           packet.pts(pts);
           packet.dts(dts);
           packet.flags(flags);


           Pointer newDataPointer =  avutil.av_malloc(bytes.length);
           BytePointer dataPointer = new BytePointer(newDataPointer);
           byte dataBytes[] = new byte[dataSize];
           System.arraycopy(bytes, DATA_OFFSET, dataBytes, 0, dataSize);
           dataPointer.put(dataBytes);
           packet.data(dataPointer);
           packet.size(dataSize);

           Pointer newSidePointer = avutil.av_malloc(sideSize);
           BytePointer sidePointer = new BytePointer(newSidePointer);
           byte sideBytes[] = new byte[sideSize];
           System.arraycopy(bytes, sideOffset, sideBytes, 0, sideSize);
           sidePointer.put(sideBytes);
           AVPacketSideData sideData = new AVPacketSideData();
           sideData.data(sidePointer);
           sideData.type(sideType);
           sideData.size(sideSize);
           //packet.side_data(sideData);
           //packet.side_data_elems(sideElements);

           return packet;
       }

       private static void intToByte(byte[] bytes, int offset, int value)
       {
           bytes[offset] = (byte) (value >>> 24);
           bytes[offset+1] = (byte) (value >>> 16);
           bytes[offset+2] = (byte) (value >>> 8);
           bytes[offset+3] = (byte) (value >>> 0);
       }

       private static void longToByte(byte[] bytes, int offset, long value)
       {
           bytes[offset] = (byte) (value >>> 56);
           bytes[offset+1] = (byte) (value >>> 48);
           bytes[offset+2] = (byte) (value >>> 40);
           bytes[offset+3] = (byte) (value >>> 32);
           bytes[offset+4] = (byte) (value >>> 24);
           bytes[offset+5] = (byte) (value >>> 16);
           bytes[offset+6] = (byte) (value >>> 8);
           bytes[offset+7] = (byte) (value >>> 0);
       }

       private static int byteToInt(byte[] bytes, int offset)
       {
           return (bytes[offset]   << 24) & 0xff000000
                | (bytes[offset+1] << 16) & 0x00ff0000
                | (bytes[offset+2] <<  8) & 0x0000ff00
                | (bytes[offset+3]      ) & 0x000000ff;
       }

       private static long byteToLong(byte[] bytes, int offset)
       {
           // Widen each byte to long before shifting: Java int shifts are
           // taken mod 32, so (bytes[offset] << 56) would really shift by 24.
           return ((long) (bytes[offset]   & 0xff) << 56)
                | ((long) (bytes[offset+1] & 0xff) << 48)
                | ((long) (bytes[offset+2] & 0xff) << 40)
                | ((long) (bytes[offset+3] & 0xff) << 32)
                | ((long) (bytes[offset+4] & 0xff) << 24)
                | ((long) (bytes[offset+5] & 0xff) << 16)
                | ((long) (bytes[offset+6] & 0xff) <<  8)
                | ((long) (bytes[offset+7] & 0xff));
       }

    }
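    The fixed-offset big-endian header layout that PacketIO relies on can be exercised without any FFmpeg calls. This hypothetical check uses plain java.nio.ByteBuffer, which reads and writes big-endian by default, so it should agree with the manual shift helpers; the offsets are the same constants as in PacketIO:

    ```java
    import java.nio.ByteBuffer;

    public class HeaderRoundTrip {
        // Same fixed offsets as PacketIO's header layout.
        static final int DTS_FIELD = 12, PTS_FIELD = 20, FLAGS_FIELD = 28;
        static final int DATA_OFFSET = 32;

        public static void main(String[] args) {
            byte[] header = new byte[DATA_OFFSET];
            ByteBuffer buf = ByteBuffer.wrap(header); // big-endian by default

            // Write the timing fields the way the sender would.
            buf.putLong(DTS_FIELD, 72646L);
            buf.putLong(PTS_FIELD, 72652L);
            buf.putInt(FLAGS_FIELD, 1);

            // Read them back the way the receiver would.
            System.out.println(buf.getLong(DTS_FIELD)); // 72646
            System.out.println(buf.getLong(PTS_FIELD)); // 72652
            System.out.println(buf.getInt(FLAGS_FIELD)); // 1
        }
    }
    ```

    ByteBuffer avoids the hand-rolled shift-and-mask bookkeeping entirely, which makes the header packing easier to rule out when hunting the network corruption.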

    Now I have this class, which works fine within the same program, but when I send the bytes over the network I get bad output and this error is printed to the console:

    [h264 @ 00000242442acc40] Missing reference picture, default is 72646
    [h264 @ 000002424089de00] Missing reference picture, default is 72646
    [h264 @ 000002424089e3c0] mmco: unref short failure
    [h264 @ 000002424081a580] reference picture missing during reorder
    [h264 @ 000002424081a580] Missing reference picture, default is 72652
    [h264 @ 000002424082c400] mmco: unref short failure
    [h264 @ 000002424082c400] co located POCs unavailable
    [h264 @ 000002424082c9c0] co located POCs unavailable
    [h264 @ 00000242442acc40] co located POCs unavailable
    [h264 @ 000002424089de00] mmco: unref short failure

    I think it’s because I don’t set the side data field, but when I try to set it the encoder crashes on the second packet.

    The output looks like this:

    [image: Decoder Output]