Other articles (54)

  • Keeping control of your media in your hands

    13 April 2011

    The vocabulary used on this site, and around MediaSPIP in general, aims to avoid references to Web 2.0 and the companies that profit from media sharing.
    While using MediaSPIP, you are invited to avoid using words like "Brand", "Cloud" and "Market".
    MediaSPIP is designed to facilitate the sharing of creative media online, while allowing authors to retain complete control of their work.
    MediaSPIP aims to be accessible to as many people as possible and development is based on expanding the (...)

  • Customising by adding your logo, banner or background image

    5 September 2013

    Some themes support three customisation elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013

    Present the changes in your MediaSPIP, or news about your projects, via the news section.
    In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customise the form used to create a news item.
    News item creation form: for a document of the news type, the default fields are: Publication date (customise the publication date) (...)

On other sites (7127)

  • libav ffmpeg codec copy rtp_mpegts streaming with very bad quality

    27 December 2017, by Dinkan

    I am trying to do a codec copy of a stream (testing with a file now, and
    later a live stream) with the rtp_mpegts format over the network, and play
    it with the VLC player. I started my proof-of-concept code from a slightly
    modified remuxing.c from the examples.

    What I am essentially trying to do is replicate:
    ./ffmpeg -re -i TEST_VIDEO.ts -acodec copy -vcodec copy -f rtp_mpegts rtp://239.245.0.2:5002

    Streaming happens, but the quality is terrible. It looks like many frames
    are skipped, and the stream plays back really slowly (VLC reports buffer
    underflows).

    The file plays perfectly fine directly in VLC.
    Please help.

    Stream details:
    Input #0, mpegts, from 'TEST_VIDEO.ts':
      Duration: 00:10:00.40, start: 41313.400811, bitrate: 2840 kb/s
      Program 1
        Stream #0:0[0x11]: Video: h264 (High) ([27][0][0][0] / 0x001B), yuv420p(tv, bt709, top first), 1440x1080 [SAR 4:3 DAR 16:9], 29.97 fps, 59.94 tbr, 90k tbn, 59.94 tbc
        Stream #0:1[0x14]: Audio: ac3 (AC-3 / 0x332D4341), 48000 Hz, stereo, fltp, 448 kb/s
    Output #0, rtp_mpegts, to 'rtp://239.255.0.2:5004':
      Metadata:
        encoder         : Lavf57.83.100
        Stream #0:0: Video: h264 (High) ([27][0][0][0] / 0x001B), yuv420p(tv, bt709, top first), 1440x1080 [SAR 4:3 DAR 16:9], q=2-31, 29.97 fps, 59.94 tbr, 90k tbn, 29.97 tbc
        Stream #0:1: Audio: ac3 (AC-3 / 0x332D4341), 48000 Hz, stereo, fltp, 448 kb/s
    Stream mapping:
      Stream #0:0 -> #0:0 (copy)
      Stream #0:1 -> #0:1 (copy)
    Press [q] to stop, [?] for help
    frame=  418 fps=5.2 q=-1.0 size=    3346kB time=00:00:08.50 bitrate=3223.5kbits/s speed=0.106x

    My complete source code (it is almost the same as remuxing.c):

    #include <libavutil/timestamp.h>
    #include <libavformat/avformat.h>

    static void log_packet(const AVFormatContext *fmt_ctx, const AVPacket *pkt, const char *tag)
    {
       AVRational *time_base = &fmt_ctx->streams[pkt->stream_index]->time_base;

       printf("%s: pts:%s pts_time:%s dts:%s dts_time:%s duration:%s duration_time:%s stream_index:%d\n",
              tag,
              av_ts2str(pkt->pts), av_ts2timestr(pkt->pts, time_base),
              av_ts2str(pkt->dts), av_ts2timestr(pkt->dts, time_base),
              av_ts2str(pkt->duration), av_ts2timestr(pkt->duration, time_base),
              pkt->stream_index);
    }


    int main(int argc, char **argv)
    {
       AVOutputFormat *ofmt = NULL;
       AVFormatContext *ifmt_ctx = NULL, *ofmt_ctx = NULL;
       AVPacket pkt;
       const char *in_filename, *out_filename;
       int ret, i;
       int stream_index = 0;
       int *stream_mapping = NULL;
       int stream_mapping_size = 0;
       AVRational mux_timebase;
       int64_t start_time = 0;        // (of->start_time == AV_NOPTS_VALUE) ? 0 : of->start_time;
       int64_t ost_tb_start_time = 0; // av_rescale_q(start_time, AV_TIME_BASE_Q, ost->mux_timebase);

       if (argc < 3) {
           printf("usage: %s input output\n"
                  "API example program to remux a media file with libavformat and libavcodec.\n"
                  "The output format is guessed according to the file extension.\n"
                  "\n", argv[0]);
           return 1;
       }

       in_filename  = argv[1];
       out_filename = argv[2];

       av_register_all();
       avcodec_register_all();
       avformat_network_init();

       if ((ret = avformat_open_input(&ifmt_ctx, in_filename, 0, 0)) < 0) {
           fprintf(stderr, "Could not open input file '%s'", in_filename);
           goto end;
       }

       if ((ret = avformat_find_stream_info(ifmt_ctx, 0)) < 0) {
           fprintf(stderr, "Failed to retrieve input stream information");
           goto end;
       }

       av_dump_format(ifmt_ctx, 0, in_filename, 0);

       avformat_alloc_output_context2(&ofmt_ctx, NULL, "rtp_mpegts", out_filename);
       if (!ofmt_ctx) {
           fprintf(stderr, "Could not create output context\n");
           ret = AVERROR_UNKNOWN;
           goto end;
       }

       stream_mapping_size = ifmt_ctx->nb_streams;
       stream_mapping = av_mallocz_array(stream_mapping_size, sizeof(*stream_mapping));
       if (!stream_mapping) {
           ret = AVERROR(ENOMEM);
           goto end;
       }

       ofmt = ofmt_ctx->oformat;

       for (i = 0; i < ifmt_ctx->nb_streams; i++)
       {
           AVStream *out_stream;
           AVStream *in_stream = ifmt_ctx->streams[i];
           AVCodecParameters *in_codecpar = in_stream->codecpar;

           if (in_codecpar->codec_type != AVMEDIA_TYPE_AUDIO &&
               in_codecpar->codec_type != AVMEDIA_TYPE_VIDEO &&
               in_codecpar->codec_type != AVMEDIA_TYPE_SUBTITLE) {
               stream_mapping[i] = -1;
               continue;
           }

           stream_mapping[i] = stream_index++;


           out_stream = avformat_new_stream(ofmt_ctx, NULL);
           if (!out_stream) {
               fprintf(stderr, "Failed allocating output stream\n");
               ret = AVERROR_UNKNOWN;
               goto end;
           }

           //out_stream->codecpar->codec_tag = 0;
           if (0 == out_stream->codecpar->codec_tag)
           {
               unsigned int codec_tag_tmp;

               if (!out_stream->codecpar->codec_tag ||
                   av_codec_get_id(ofmt->codec_tag, in_codecpar->codec_tag) == in_codecpar->codec_id ||
                   !av_codec_get_tag2(ofmt->codec_tag, in_codecpar->codec_id, &codec_tag_tmp))
                   out_stream->codecpar->codec_tag = in_codecpar->codec_tag;
           }
           //ret = avcodec_parameters_to_context(ost->enc_ctx, ist->st->codecpar);

           ret = avcodec_parameters_copy(out_stream->codecpar, in_codecpar);
           if (ret < 0) {
               fprintf(stderr, "Failed to copy codec parameters\n");
               goto end;
           }
           //out_stream->codecpar->codec_tag = codec_tag;
           // copy timebase while removing common factors

           /* cast to long long so the format specifier matches int64_t portably */
           printf("bit_rate %lld sample_rate %d frame_size %d\n",
                  (long long)in_codecpar->bit_rate, in_codecpar->sample_rate,
                  in_codecpar->frame_size);

           out_stream->avg_frame_rate = in_stream->avg_frame_rate;

           ret = avformat_transfer_internal_stream_timing_info(ofmt, out_stream, in_stream,
                                                               AVFMT_TBCF_AUTO);
           if (ret < 0) {
               fprintf(stderr, "avformat_transfer_internal_stream_timing_info failed\n");
               goto end;
           }

           if (out_stream->time_base.num <= 0 || out_stream->time_base.den <= 0)
               out_stream->time_base = av_add_q(av_stream_get_codec_timebase(out_stream),
                                                (AVRational){0, 1});

           // copy estimated duration as a hint to the muxer
           if (out_stream->duration <= 0 && in_stream->duration > 0)
               out_stream->duration = av_rescale_q(in_stream->duration,
                                                   in_stream->time_base, out_stream->time_base);

           // copy disposition
           out_stream->disposition = in_stream->disposition;

           out_stream->sample_aspect_ratio = in_stream->sample_aspect_ratio;
           out_stream->avg_frame_rate = in_stream->avg_frame_rate;
           out_stream->r_frame_rate = in_stream->r_frame_rate;

           if (in_codecpar->codec_type == AVMEDIA_TYPE_VIDEO)
           {
               mux_timebase = in_stream->time_base;
           }

           if (in_stream->nb_side_data) {
               /* use a separate loop variable here: reusing 'i' would
                * corrupt the outer loop over the input streams */
               int j;

               for (j = 0; j < in_stream->nb_side_data; j++) {
                   const AVPacketSideData *sd_src = &in_stream->side_data[j];
                   uint8_t *dst_data;

                   dst_data = av_stream_new_side_data(out_stream, sd_src->type, sd_src->size);
                   if (!dst_data) {
                       ret = AVERROR(ENOMEM);
                       goto end; /* was 'return'; go through cleanup instead */
                   }
                   memcpy(dst_data, sd_src->data, sd_src->size);
               }
           }
       }

       av_dump_format(ofmt_ctx, 0, out_filename, 1);

       start_time = ofmt_ctx->duration;
       ost_tb_start_time = av_rescale_q(ofmt_ctx->duration, AV_TIME_BASE_Q, mux_timebase);

       if (!(ofmt->flags & AVFMT_NOFILE))
       {
           ret = avio_open(&ofmt_ctx->pb, out_filename, AVIO_FLAG_WRITE);
           if (ret < 0) {
               fprintf(stderr, "Could not open output file '%s'", out_filename);
               goto end;
           }
       }

       ret = avformat_write_header(ofmt_ctx, NULL);
       if (ret < 0) {
           fprintf(stderr, "Error occurred when opening output file\n");
           goto end;
       }

       while (1)
       {
           AVStream *in_stream, *out_stream;

           ret = av_read_frame(ifmt_ctx, &pkt);
           if (ret < 0)
               break;

           in_stream  = ifmt_ctx->streams[pkt.stream_index];
           if (pkt.stream_index >= stream_mapping_size ||
               stream_mapping[pkt.stream_index] < 0) {
               av_packet_unref(&pkt);
               continue;
           }

           pkt.stream_index = stream_mapping[pkt.stream_index];
           out_stream = ofmt_ctx->streams[pkt.stream_index];

           //log_packet(ifmt_ctx, &pkt, "in");

           //ofmt_ctx->bit_rate = ifmt_ctx->bit_rate;
           ofmt_ctx->duration = ifmt_ctx->duration;

           /* copy packet: rescale timestamps from the input stream's
            * time base to the muxer's time base */
           //pkt.pts = av_rescale_q_rnd(pkt.pts, in_stream->time_base, out_stream->time_base, AV_ROUND_NEAR_INF|AV_ROUND_PASS_MINMAX);
           //pkt.dts = av_rescale_q_rnd(pkt.dts, in_stream->time_base, out_stream->time_base, AV_ROUND_NEAR_INF|AV_ROUND_PASS_MINMAX);

           if (pkt.pts != AV_NOPTS_VALUE)
               pkt.pts = av_rescale_q(pkt.pts, in_stream->time_base, mux_timebase) - ost_tb_start_time;

           /* only rescale a valid dts; rescaling AV_NOPTS_VALUE
            * produces a garbage timestamp */
           if (pkt.dts != AV_NOPTS_VALUE) {
               pkt.dts = av_rescale_q(pkt.dts, in_stream->time_base, mux_timebase);
               pkt.dts -= ost_tb_start_time;
           }

           pkt.duration = av_rescale_q(pkt.duration, in_stream->time_base, mux_timebase);
           //pkt.duration = av_rescale_q(1, av_inv_q(out_stream->avg_frame_rate), mux_timebase);
           pkt.pos = -1;
           //log_packet(ofmt_ctx, &pkt, "out");

           ret = av_interleaved_write_frame(ofmt_ctx, &pkt);
           if (ret < 0) {
               fprintf(stderr, "Error muxing packet\n");
               break;
           }
           av_packet_unref(&pkt);
       }

       av_write_trailer(ofmt_ctx);
    end:

       avformat_close_input(&ifmt_ctx);

       /* close output */
       if (ofmt_ctx && !(ofmt->flags & AVFMT_NOFILE))
           avio_closep(&ofmt_ctx->pb);
       avformat_free_context(ofmt_ctx);

       av_freep(&stream_mapping);

       if (ret < 0 && ret != AVERROR_EOF) {
           fprintf(stderr, "Error occurred: %s\n", av_err2str(ret));
           return 1;
       }

       return 0;
    }
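
    One notable difference from the ffmpeg command being replicated is the -re flag, which makes ffmpeg read the input at its native frame rate; the loop above writes packets as fast as the muxer accepts them. As a rough sketch only (the helper below is ours and hypothetical, not part of remuxing.c), the pacing that -re performs could look like this, assuming packet dts values in the input stream's time base:

    #include <libavutil/time.h>        /* av_gettime(), av_usleep() */
    #include <libavutil/mathematics.h> /* av_rescale_q() */
    #include <libavformat/avformat.h>

    /* Hypothetical helper: block until the wall clock catches up with a
     * packet's dts, measured from the first packet seen. This approximates
     * what ffmpeg's -re flag does when reading from a file. */
    static void pace_to_dts(int64_t dts, AVRational time_base, int64_t *wall_start)
    {
       int64_t dts_us, elapsed_us;

       if (dts == AV_NOPTS_VALUE)
           return;

       /* convert the packet timestamp to microseconds */
       dts_us = av_rescale_q(dts, time_base, AV_TIME_BASE_Q);

       /* remember the wall-clock time corresponding to the first timestamp */
       if (!*wall_start)
           *wall_start = av_gettime() - dts_us;

       elapsed_us = av_gettime() - *wall_start;
       if (dts_us > elapsed_us)
           av_usleep((unsigned)(dts_us - elapsed_us));
    }

    Called once per packet before av_interleaved_write_frame(), with an int64_t wall_start = 0 kept across calls, this would throttle writes to roughly real time.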
  • What is Google Analytics data sampling and what’s so bad about it?

    16 August 2019, by Joselyn Khor — Analytics Tips, Development

    Google (2019) explains what data sampling is:

    “In data analysis, sampling is the practice of analysing a subset of all data in order to uncover the meaningful information in the larger data set.”[1]

    This essentially means that instead of analysing all of the data, there is a threshold on how much data is analysed, and any data beyond that threshold is an assumption based on patterns.

    Google’s (2019) data sampling thresholds:

    Ad-hoc queries of your data are subject to the following general thresholds for sampling:
    [Google] Analytics Standard: 500k sessions at the property level for the date range you are using
    [Google] Analytics 360: 100M sessions at the view level for the date range you are using (para. 3) [2]

    This threshold is limiting because your data in GA may become more inaccurate as the traffic to your website increases.

    Say you’re looking through all your traffic data from the last year and find you have 5 million page views. Only 500K of that 5 million is accurate! The data for the remaining 4.5 million (90%) is an assumption based on the 500K sample size.
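
    As a back-of-the-envelope illustration (our own arithmetic, not Google’s sampling algorithm), the sampled share works out like this:

    #include <stdio.h>

    /* Rough illustration: with 5,000,000 sessions against a 500,000-session
     * threshold, only 10% of the data is analysed; the rest is extrapolated. */
    int main(void)
    {
        double total    = 5000000.0; /* sessions in the report's date range */
        double analysed =  500000.0; /* Analytics Standard threshold */

        printf("analysed:     %.0f%%\n", 100.0 * analysed / total);           /* 10% */
        printf("extrapolated: %.0f%%\n", 100.0 * (total - analysed) / total); /* 90% */
        return 0;
    }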

    This is a key weapon Google uses to sell to large businesses. In order to increase that threshold for more accurate reporting, upgrading to premium Google Analytics 360 for approximately US$150,000 per year seems to be the only choice.

    What’s so bad about data sampling?

    It’s unfair to say sampled data should be disregarded completely. A calculation ensures it is representative, and it can give you good enough insights. However, we don’t encourage it, as we don’t just want “good enough” data. We want the actual facts.

    In a recent survey sent to Matomo customers, we found a large proportion of users switched from GA to Matomo due to the data sampling issue.

    Two reasons why data sampling isn’t preferable:

    1. If the selected sample size is too small, you won’t get a good representation of all the data.
    2. The bigger your website grows, the more inaccurate your reports will become.

    As an example of why we don’t fully trust sampled data: say you have an ecommerce store and see that, due to data sampling, your GA revenue reports don’t match your actual sales data. In GA you may see revenue for the month reported as $1 million, instead of the actual $800K in sales.

    The sampling here has caused an inaccuracy that could have negative financial implications. What you get in the GA report is an estimated dollar figure rather than the actual sales. Making decisions based on inaccurate data can be costly in this case. 

    Another disadvantage of sampled data is that you might miss opportunities you would have noticed with a view of the whole, e.g. real patterns that stay hidden because the data has been extrapolated rather than observed.

    Not getting the chance to see things as they are, and only being able to rely on the conclusions and assumptions GA makes, is risky. The bigger your business grows, the less you can afford to make business decisions based on assumptions that could be inaccurate.

    If you feel you could be missing out on opportunities because your GA data is sampled, get 100% accurately reported data.

    The benefits of 100% accurate data

    Matomo doesn’t use data sampling on any of our products or plans. You get to see all of your data and not a sampled data set.

    Data quality is necessary for high impact decision-making. It’s hard to make strategic changes if you don’t have confidence that your data is reliable and accurate.

    Learn about how Matomo is a serious contender to Google Analytics 360. 

    Now you can import your Google Analytics data directly into your Matomo

    If you want to make the switch to Matomo but are worried about losing all your historic Google Analytics data, you can now import it directly into Matomo with the Google Analytics Importer tool.


    Take the challenge!

    Compare your Google Analytics data (sampled) against your Matomo data; or, if you don’t have Matomo data yet, sign up for our 30-day free trial and start tracking!

    References:

    [1 & 2] About data sampling. (2019). In Analytics Help. Retrieved August 14, 2019, from https://support.google.com/analytics/answer/2637192

  • Understanding Data Processing Agreements and How They Affect GDPR Compliance

    9 October 2023, by Erin — GDPR

    The General Data Protection Regulation (GDPR) affects international organisations that conduct business or handle personal data in the European Union (EU), and those organisations must know how to stay compliant.

    One way of ensuring GDPR compliance is to implement a data processing agreement (DPA). Most businesses overlook DPAs when considering ways of maintaining user data security. So, what exactly is a DPA’s role in ensuring GDPR compliance?

    In this article, we’ll discuss DPAs, their advantages, which data protection laws require them and the clauses that make up a DPA. We’ll also cover the consequences of non-compliance and how you can maintain GDPR compliance using Matomo.

    What is a data processing agreement?

    A data processing agreement (also called a data protection agreement or data processing addendum) is a contractual agreement between a data controller (a company) and a data processor (a third-party service provider). It defines each party’s rights and obligations regarding data protection.

    A DPA also defines the responsibilities of the controller and the processor and sets out the terms they’ll use for data processing. For instance, when MHP/Team SI sought the services of Matomo (a data processor) to get reliable and compliant web analytics, a DPA helped to outline their responsibilities and liabilities.

    A DPA is one of the basic requirements for GDPR compliance. The GDPR is an EU regulation concerning personal data protection and security. The GDPR is binding on any company that actively collects data from EU residents or citizens, regardless of their location.

    As a business, you need to know what goes into a DPA to identify possible liabilities that may arise if you don’t comply with European data protection laws. For example, having a recurrent security incident can lead to data breaches as you process customer personal data.

    The average data breach cost for 2023 is $4.45 million. This amount includes regulatory fines, containment costs and business losses. As such, a DPA can help you assess the organisational security measures of your data processing methods and define the protocol for reporting a data breach.

    Why is a DPA essential for your business?

    If your company processes personal data from your customers, such as contact details, you need a DPA to ensure compliance with data security laws like the GDPR. You’ll also need a DPA whenever you hire a third party to process your data, e.g. for web analytics or cloud storage.

    But what are the benefits of having a DPA in place?

    Benefits of a data processing agreement

    A key benefit of signing a DPA is it outlines business terms with a third-party data processor and guarantees compliance with the relevant data privacy laws. A DPA also helps to create an accountability framework between you and your data processor by establishing contractual obligations.

    Additionally, a DPA helps to minimise the risk of unauthorised access to sensitive data. A DPA defines organisational measures that help protect the rights of individuals and safeguard personal data against unauthorised disclosure. Overall, before choosing a data processor, having a DPA ensures that they are capable, compliant and qualified.

    More than 120 countries have already adopted some form of international data protection laws to protect their citizens and their data better. Hence, knowing which laws require a DPA and how you can better ensure compliance is important.

    Which data protection laws require a DPA?

    Regulatory bodies enact data protection laws to grant consumers greater control over their data and how businesses use it. These laws ensure transparency in data processing and compliance for businesses.

    Data protection laws that require a DPA

    The following are some of the relevant data privacy laws that require you to have a DPA:

    • UK GDPR
    • Brazil LGPD
    • EU GDPR
    • Dubai PDPA
    • Colorado CPA
    • California CCPA/CPRA
    • Virginia VCDPA
    • Connecticut DPA
    • South African POPIA
    • Thailand PDPA

    Companies that don’t adhere to these data protection obligations usually face liabilities such as fines and penalties. With a DPA, you can set clear expectations regarding data processing between you and your customers.

    Review and update any DPAs with third-party processors to ensure compliance with GDPR and the laws we mentioned above. Additionally, confirm that all the relevant clauses are present for compliance with relevant data privacy laws. 

    So, what key data processing clauses should you have in your DPA? Let’s take a closer look in the next section.

    Key clauses in a data processing agreement

    The GDPR provides some general recommendations for what you should state in a DPA.

    Key elements found in a DPA

    Here are the elements you should include:

    Data processing specifications

    Your DPA should address the specific business purposes for data processing, the duration of processing and the categories of data under processing. It should also clearly state the party responsible for maintaining GDPR compliance and who the data subjects are, including their location and nationality.

    Your DPA should also address the data processor and controller’s responsibilities concerning data deletion and contract termination.

    Role of processor

    Your DPA should clearly state what your data processor is responsible for and liable for. Some key responsibilities include record keeping, reporting breaches and maintaining data security.

    Other roles of your data processor include providing you with audit opportunities and cooperating with data protection authorities during inquiries. If you decide to end your contract, the data processor is responsible for deleting or returning data, depending on your agreement.

    Role of controller

    Your DPA should set out the responsibilities of the data controller, which typically include issuing processing instructions to the data processor and directing them on how to handle data processing.

    Your DPA should let you define the lawful data processes the data processor should follow and how you’ll uphold the data protection rights of individuals’ sensitive data.

    Organisational and technical specifications

    Your DPA should define specifications such as how third-party processors encrypt, access and test personal data. It should also include specifications on how the data processor and controller will maintain ongoing data security through various factors such as:

    • State of the technology: Do third-party processors have reliable technology, and can they ensure data security within their systems?
    • Costs of implementation: Does the data controller’s budget allow them to seek third-party services from industry-leading providers who can guarantee a certain level of security?
    • Variances in users’ personal freedom: Are there privacy policies and opt-out forms for users to express how they want companies to use their sensitive data?

    Moreover, your DPA should define how you and your data processor will ensure the confidentiality, availability and integrity of data processing services and systems.

    What are the penalties for DPA GDPR non-compliance?

    Regulators use the GDPR’s stiff fines to encourage data controllers and third-party processors to follow best data security practices. One way of maintaining compliance is to draft a DPA with your data processor.

    The DPA should clearly outline the necessary legal requirements and include all the relevant clauses mentioned above. Understand what goes into this agreement, since data protection authorities can hold your business accountable for a breach, even if a processor’s error caused it.

    Data protection authorities can issue penalties now that the GDPR is in place. For example, according to Article 83 of the GDPR, penalties for data or privacy breaches or non-compliance can amount to up to €20 million or 4% of your annual global revenue, whichever is higher.

    There are two tiers of fines: tier one and tier two. Violations related to data processors typically attract tier-one fines, which can cost your business up to €10 million or 2% of your company’s global revenue.

    Tier-two fines result from infringements of the right to be forgotten and the right to privacy of your consumers. They can cost your business up to €20 million or 4% of your company’s global revenue.
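
    To make the two ceilings concrete, here is a small sketch (our own illustration; under Article 83 the applicable cap is whichever amount is higher):

    #include <stdio.h>

    /* Illustration of the GDPR fine ceilings: the cap is the fixed amount or
     * the percentage of global annual revenue, whichever is higher. */
    static double fine_cap(double revenue, double fixed_cap, double pct)
    {
        double pct_cap = revenue * pct;
        return pct_cap > fixed_cap ? pct_cap : fixed_cap;
    }

    int main(void)
    {
        double revenue = 1e9; /* e.g., EUR 1 billion in global annual revenue */

        printf("tier-one cap: EUR %.0f\n", fine_cap(revenue, 10e6, 0.02)); /* 20,000,000 */
        printf("tier-two cap: EUR %.0f\n", fine_cap(revenue, 20e6, 0.04)); /* 40,000,000 */
        return 0;
    }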

    GDPR fines make non-compliance an expensive mistake for businesses of all sizes. As such, signing a DPA with any party that acts as a data processor for your business can help you remain GDPR-compliant.

    How a DPA can help your business remain GDPR compliant

    A DPA can help your business define and adhere to lawful data processes.

    Steps to take to be DPA GDPR compliant

    So, in what other ways can a DPA help you remain compliant with the GDPR? Let’s take a look!

    1. Assess the data processor’s compliance

    Having a DPA helps ensure that the data processor you are working with is GDPR-compliant. You should check that they offer a DPA, and confirm the processor’s terms of service and legal basis.

    For example, if you want an alternative to Google Analytics that’s GDPR compliant, you can opt for Matomo. Matomo provides a DPA, which you can agree to when you sign up for web analytics services or later.

    2. Establish lawful data processes

    A DPA can also help you review your data processes to ensure they’re GDPR compliant. For example, by defining lawful data processes, you gain a better understanding of personally identifiable information (PII) and how it relates to data privacy.

    Further, you can allow users to opt out of sharing their data. For instance, Matomo lets you honour Do Not Track preferences on your website.

    With this feature, users can opt in or out of tracking via a toggle in their respective browsers.
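
    As an illustration of how that signal reaches a server, a minimal CGI-style check of the DNT request header might look like this (a sketch of ours, not Matomo’s implementation):

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Minimal CGI-style sketch: browsers with Do Not Track enabled send the
     * "DNT: 1" request header, which CGI exposes as the HTTP_DNT variable. */
    int main(void)
    {
        const char *dnt = getenv("HTTP_DNT");

        if (dnt && strcmp(dnt, "1") == 0)
            printf("Visitor opted out: skip analytics tracking.\n");
        else
            printf("No DNT signal: tracking may proceed, subject to consent rules.\n");
        return 0;
    }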

    Indeed, establishing lawful data processes helps you define the specific business purposes for collecting and processing personal data. By doing so, you get to notify your users why you need their data and get their consent to process it by including a GDPR-compliant privacy policy on your website.

    3. Anonymise your data

    Global privacy laws like the GDPR and ePrivacy require companies to display cookie banners or seek consent before tracking visitors’ data. You can either include a cookie consent banner on your site or stop tracking cookies, in order to follow the applicable regulations.

    Further, you can enable cookie-less tracking or easily let users opt out. For example, you can use Matomo without a cookie consent banner, which exempts it from consent-banner requirements in many countries.

    Additionally, through a DPA, you can define organisational measures for how you’ll anonymise all your users’ data. Matomo can help you anonymise IP addresses, and we recommend that you anonymise at least the last two bytes.
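
    As an illustration of what masking the last two bytes means (our own sketch, not Matomo’s code), an IPv4 address can be truncated like this:

    #include <stdio.h>
    #include <arpa/inet.h>

    /* Sketch of two-byte IPv4 anonymisation: zero the low 16 bits so that,
     * for example, 203.0.113.42 becomes 203.0.0.0. */
    static void anonymise_ipv4(const char *ip, char *out, socklen_t out_len)
    {
        struct in_addr addr;

        if (inet_pton(AF_INET, ip, &addr) != 1) {
            snprintf(out, out_len, "invalid");
            return;
        }
        /* addresses are stored big-endian; mask away the last two bytes */
        addr.s_addr &= htonl(0xFFFF0000u);
        inet_ntop(AF_INET, &addr, out, out_len);
    }

    int main(void)
    {
        char buf[INET_ADDRSTRLEN];

        anonymise_ipv4("203.0.113.42", buf, sizeof buf);
        printf("%s\n", buf); /* prints 203.0.0.0 */
        return 0;
    }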

    As one of the few web analytics tools that can collect data without tracking consent, Matomo has also been approved by the French Data Protection Authority (CNIL).

    4. Assess the processor’s bandwidth

    Having a DPA can help you implement data retention policies with clear retention periods. Such policies are useful when ending a contract with a third-party service provider and determining how they should handle your data.

    A DPA also helps you ensure the processor has the necessary technology to store personal data securely. You can conduct an audit to understand possible vulnerabilities and your data processor’s technological capacity.

    5. Obtain legal counsel

    When drafting a DPA, it’s important to get a consultation on what is needed to ensure complete compliance. Obtaining legal counsel points you in the right direction so you don’t make any mistakes that may lead to non-compliance.

    Conclusion

    Businesses that process users’ data are subject to several DPA contract requirements under the GDPR. One of the most important is having a DPA with every third-party provider that helps them perform data processing.

    It’s important to stay updated on GDPR requirements for compliance. As such, Matomo can help you maintain lawful data processes. Matomo gives you complete control over your data and complies with GDPR requirements.

    To get started with Matomo, you can sign up for a 21-day free trial. No credit card required.

    Disclaimer

    We are not lawyers and don’t claim to be. The information provided here is to help give an introduction to GDPR. We encourage every business and website to take data privacy seriously and discuss these issues with your lawyer if you have any concerns.