Recherche avancée (Advanced search)

Other articles (58)

  • Participate in its translation

    10 April 2011

    You can help us improve the wording used in the software, or translate it into any new language, allowing it to reach new linguistic communities.
    To do so, we use SPIP’s translation interface, where all of MediaSPIP’s language modules are available. You simply need to sign up to the translators’ mailing list to ask for more information.
    Currently MediaSPIP is only available in French and (...)

  • Support for all media types

    10 April 2011

    Unlike many other modern document-sharing programs and platforms, MediaSPIP aims to handle as many different document formats as possible, whether they are : images (png, gif, jpg, bmp and others...) ; audio (MP3, Ogg, Wav and others...) ; video (Avi, MP4, Ogv, mpg, mov, wmv and others...) ; textual content, code or other (OpenOffice, Microsoft Office (spreadsheets, presentations), web (html, css), LaTeX, Google Earth) (...)

  • MediaSPIP Player : potential problems

    22 February 2011, by

    The player does not work in Internet Explorer
    On Internet Explorer (8 and 7 at least), the plugin uses the Flash player flowplayer to play video and audio. If the player does not seem to work, the cause may be the configuration of Apache’s mod_deflate.
    If the configuration of this Apache module contains a line that looks like the following, try removing it or commenting it out to see whether the player works correctly : /** * GeSHi (C) 2004 - 2007 Nigel McNie, (...)

On other sites (14824)

  • Developing A Shader-Based Video Codec

    22 June 2013, by Multimedia Mike — Outlandish Brainstorms

    Early last month, this thing called ORBX.js was in the news. It ostensibly has something to do with streaming video and codec technology, which naturally catches my interest. The hype was kicked off by Mozilla honcho Brendan Eich when he posted an article asserting that HD video decoding could be entirely performed in JavaScript. We’ve seen this kind of thing before using Broadway, an H.264 decoder implemented entirely in JS. But that exposes some very obvious limitations (notably CPU usage).

    But this new video codec promises 1080p HD playback directly in JavaScript which is a lofty claim. How could it possibly do this ? I got the impression that performance was achieved using WebGL, an extension which allows JavaScript access to accelerated 3D graphics hardware. Browsing through the conversations surrounding the ORBX.js announcement, I found this confirmation from Eich himself :

    You’re right that WebGL does heavy lifting.

    As of this writing, ORBX.js remains some kind of private tech demo. If there were a public demo available, it would necessarily be easy to reverse engineer the downloadable JavaScript decoder.

    But the announcement was enough to make me wonder how it could be possible to create a video codec which effectively leverages 3D hardware.

    Prior Art
    In theorizing about this, it continually occurs to me that I can’t possibly be the first person to attempt to do this (or the ORBX.js people, for that matter). In googling on the matter, I found various forums and Q&A posts where people asked if it were possible to, e.g., accelerate JPEG decoding and presentation using 3D hardware, with no answers. I also found a blog post which describes a plan to use 3D hardware to accelerate VP8 video decoding. It was a project done under the banner of Google’s Summer of Code in 2011, though I’m not sure which open source group mentored the effort. The project did not end up producing the shader-based VP8 codec originally chartered but mentions that “The ‘client side’ of the VP8 VDPAU implementation is working and is currently being reviewed by the libvdpau maintainers.” I’m not sure what that means. Perhaps it includes modifications to the public API that supports VP8, but is waiting for the underlying hardware to actually implement VP8 decoding blocks in hardware.

    What’s So Hard About This ?
    Video decoding is a computationally intensive task. GPUs are known to be really awesome at chewing through computationally intensive tasks. So why aren’t GPUs a natural fit for decoding video codecs ?

    Generally, it boils down to parallelism, or lack of opportunities thereof. GPUs are really good at doing the exact same operations over lots of data at once. The problem is that decoding compressed video usually requires multiple phases that must be performed in sequence, and the individual phases themselves often cannot be parallelized. In strictly mathematical terms, a compressed data stream will need to be decoded by applying a function f(x) over each data element, x_0 .. x_n. However, the function relies on having applied the function to the previous data element, i.e. :

    f(x_n) = f(f(x_{n-1}))

    What happens when you try to parallelize such an algorithm ? Temporal rifts in the space/time continuum, if you’re in a Star Trek episode. If you’re in the real world, you’ll get incorrect, unusable data as the parallel computation is seeded with a bunch of invalid data at multiple points (which is illustrated in some of the pictures in the aforementioned blog post about accelerated VP8).
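    To make the dependency concrete, here is a minimal TypeScript sketch (my own illustration, not from the original post or any codec's source) of the kind of serially dependent step described above, using the differential DC prediction discussed in the next section : each output depends on the previously decoded value, so the loop cannot be split across parallel workers without feeding them invalid starting values.

    // Illustrative sketch only : reversing differential DC prediction.
    // Each decoded value depends on the one before it, so iteration i
    // cannot start until iteration i-1 has finished.
    function reverseDcPrediction(diffs: number[]): number[] {
      const dc: number[] = new Array(diffs.length);
      let predictor = 0;
      for (let i = 0; i < diffs.length; i++) {
        predictor += diffs[i]; // uses the result of the previous iteration
        dc[i] = predictor;
      }
      return dc;
    }

    // Splitting this loop across workers would require each worker to guess
    // its starting predictor, which is exactly the "invalid seed" problem.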

    Example : JPEG
    Let’s take a very general look at the various stages involved in decoding the ubiquitous JPEG format :


    High level JPEG decoding flow

    What are the opportunities to parallelize these various phases ?

    • Huffman decoding (run length decoding and zig-zag reordering are assumed to be rolled into this phase) : not many opportunities for parallelizing the various Huffman formats out there, including this one. Decoding most Huffman streams is necessarily a sequential operation. I once hypothesized that it would be possible to engineer a codec to achieve some parallelism during the entropy decoding phase, and later found that On2’s VP8 codec employs the scheme. However, such a scheme is unlikely to break down to the fine level of granularity that WebGL would require.
    • Reverse DC prediction : JPEG — and many other codecs — doesn’t store full DC coefficients. It stores differences in successive DC coefficients. Reversing this process can’t be parallelized. See the discussion in the previous section.
    • Dequantize coefficients : This could be parallelized very effectively. It should be noted that software decoders often don’t dequantize all coefficients ; many coefficients are 0, and dequantizing them is a wasted multiplication. Thus, this phase is sometimes rolled into the Huffman decoding phase.
    • Invert discrete cosine transform : This seems like it could be highly parallelizable. I will be exploring this further in this post.
    • Convert YUV -> RGB for final display : This is a well-established use case for 3D acceleration (a fragment-shader sketch for this phase follows the list).
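    As a concrete illustration of that last phase, here is a hedged sketch of my own (not from the post) of a GLSL ES fragment shader, written as the string it would be embedded as in a WebGL program ; the shading language itself is introduced in the next section. The texture names and the BT.601-style constants are assumptions for illustration, with one texture per plane.

    // Hypothetical fragment shader for the YUV -> RGB phase. Every pixel is
    // independent, which is why this step maps so naturally onto the GPU.
    const yuvToRgbFragmentShader = `
      precision mediump float;
      varying vec2 vTexCoord;
      uniform sampler2D uTexY;   // one texture per plane (illustrative names)
      uniform sampler2D uTexU;
      uniform sampler2D uTexV;
      void main() {
        float y = texture2D(uTexY, vTexCoord).r;
        float u = texture2D(uTexU, vTexCoord).r - 0.5;
        float v = texture2D(uTexV, vTexCoord).r - 0.5;
        // Approximate BT.601 conversion
        gl_FragColor = vec4(y + 1.402 * v,
                            y - 0.344 * u - 0.714 * v,
                            y + 1.772 * u,
                            1.0);
      }`;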

    Crash Course in 3D Shaders and Humility
    So I wanted to see if I could accelerate some parts of JPEG decoding using something called shaders. I made an effort to understand 3D programming and its associated math throughout the 1990s but 3D technology left me behind a very long time ago while I got mixed up in this multimedia stuff. So I plowed through a few books concerning WebGL (thanks to my new Safari Books Online subscription). After I learned enough about WebGL/JS to be dangerous and just enough about shader programming to be absolutely lethal, I set out to try my hand at optimizing IDCT using shaders.

    Here’s my extremely high level (and probably hopelessly naive) view of the modern GPU shader programming model :


    Basic WebGL rendering pipeline

    The WebGL program written in JavaScript drives the show. It sends a set of vertices into the WebGL system and each vertex is processed through a vertex shader. Then, each pixel that falls within a set of vertices is sent through a fragment shader to compute the final pixel attributes (R, G, B, and alpha value). Another consideration is textures : data that the program uploads to GPU memory and that the shaders can access programmatically.
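    For readers who have never touched WebGL, here is a minimal sketch of that flow, assuming a plain WebGL 1 context (my own example, written in TypeScript) : the script uploads vertices and issues the draw call, the vertex shader positions each vertex and writes a varying, and the fragment shader computes a colour for every covered pixel.

    const canvas = document.querySelector('canvas') as HTMLCanvasElement;
    const gl = canvas.getContext('webgl') as WebGLRenderingContext;

    const vertexSrc = `
      attribute vec2 aPosition;
      varying vec2 vTexCoord;              // handed to the fragment shader
      void main() {
        vTexCoord = aPosition * 0.5 + 0.5; // map clip space to 0..1
        gl_Position = vec4(aPosition, 0.0, 1.0);
      }`;
    const fragmentSrc = `
      precision mediump float;
      varying vec2 vTexCoord;              // arrives interpolated per pixel
      void main() {
        gl_FragColor = vec4(vTexCoord, 0.0, 1.0);
      }`;

    function compile(type: number, src: string): WebGLShader {
      const shader = gl.createShader(type)!;
      gl.shaderSource(shader, src);
      gl.compileShader(shader);
      return shader;
    }

    const program = gl.createProgram()!;
    gl.attachShader(program, compile(gl.VERTEX_SHADER, vertexSrc));
    gl.attachShader(program, compile(gl.FRAGMENT_SHADER, fragmentSrc));
    gl.linkProgram(program);
    gl.useProgram(program);

    // Two triangles (6 vertices) covering the viewport.
    const vertices = new Float32Array([-1, -1, 1, -1, -1, 1, -1, 1, 1, -1, 1, 1]);
    gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
    gl.bufferData(gl.ARRAY_BUFFER, vertices, gl.STATIC_DRAW);
    const loc = gl.getAttribLocation(program, 'aPosition');
    gl.enableVertexAttribArray(loc);
    gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);
    gl.drawArrays(gl.TRIANGLES, 0, 6);

    Texture data would be uploaded separately with gl.texImage2D() and sampled from the shaders with texture2D().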

    These shaders (vertex and fragment) are key to the GPU’s programmability. How are they programmed ? Using a special C-like shading language. Thought I : “C-like language ? I know C ! I should be able to master this in short order !” So I charged forward with my assumptions and proceeded to get smacked down repeatedly by the overall programming paradigm. I came to recognize this as a variation of the scientific method : Develop a hypothesis (in my case, a mental model of how the system works) ; develop an experiment (short program) to prove or disprove the model ; realize something fundamental that I was overlooking ; formulate a new hypothesis and repeat.

    First Approach : Vertex Workhorse
    My first pitch goes like this :

    • Upload DCT coefficients to GPU memory in the form of textures
    • Program a vertex mesh that encapsulates 16×16 macroblocks
    • Distribute the IDCT effort among multiple vertex shaders
    • Pass transformed Y, U, and V blocks to fragment shader which will convert the samples to RGB

    So the idea is that decoding of 16×16 macroblocks is parallelized. A macroblock embodies 6 blocks :


    JPEG macroblocks

    It would be nice to process one of these 6 blocks in each vertex. But that means drawing a square with 6 vertices. How do you do that ? I eventually realized that 6 vertices is exactly the recommended way to draw a square on 3D hardware : 2 triangles, each with 3 vertices (0, 1, 2 ; 3, 4, 5) :


    2 triangles make a square

    A vertex shader knows which (x, y) coordinates it has been assigned, so it could figure out which sections of coefficients it needs to access within the textures. But how would a vertex shader know which of the 6 blocks it should process ? Solution : Misappropriate the vertex’s z coordinate. It’s not used for anything else in this case.

    So I set all of that up. Then I hit a new roadblock : How to get the reconstructed Y, U, and V samples transported to the fragment shader ? I have found that communicating between shaders is quite difficult. Texture memory ? WebGL doesn’t allow shaders to write back to texture memory ; shaders can only read it. The standard way to communicate data from a vertex shader to a fragment shader is to declare variables as “varying”. Up until this point, I knew about varying variables but there was something I didn’t quite understand about them and it nagged at me : If 3 different executions of a vertex shader set 3 different values to a varying variable, what value is passed to the fragment shader ?

    It turns out that the varying variable varies, which means that the GPU passes interpolated values to each fragment shader invocation. This completely destroys this idea.
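    To make the failure mode concrete, here is a small hypothetical shader pair in the spirit of the first pitch (my own illustration, with assumed names) : the vertex shader writes the misappropriated z coordinate into a varying as a block index, but the fragment shader receives a value blended from the triangle’s three vertices rather than any single vertex’s value.

    const blockIndexVertexSrc = `
      attribute vec3 aPosition;      // z misappropriated as the block index
      varying float vBlockIndex;
      void main() {
        vBlockIndex = aPosition.z;   // 0.0 .. 5.0, set per vertex
        gl_Position = vec4(aPosition.xy, 0.0, 1.0);
      }`;
    const blockIndexFragmentSrc = `
      precision mediump float;
      varying float vBlockIndex;     // arrives interpolated, e.g. 2.37,
      void main() {                  // not the value any one vertex wrote
        gl_FragColor = vec4(vec3(vBlockIndex / 5.0), 1.0);
      }`;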

    Second Idea : Vertex Workhorse, Take 2
    The revised pitch is to work around the interpolation issue by just having each vertex shader invocation perform all 6 block transforms. That seems like a lot of redundant work. However, I figured out that I can draw a square with only 4 vertices by arranging them in an ‘N’ pattern and asking WebGL to draw a TRIANGLE_STRIP instead of TRIANGLES. Now it’s only doing 4x the extra work, not 6x. GPUs are supposed to be great at this type of work, so it shouldn’t matter, right ?
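    A quick sketch of that 4-vertex arrangement (reusing the gl context and attribute setup from the earlier sketch) : ordering the corners in an ‘N’ so that a TRIANGLE_STRIP covers the square with triangles (0, 1, 2) and (1, 2, 3).

    // The 'N' ordering: bottom-left, top-left, bottom-right, top-right.
    const stripVertices = new Float32Array([
      -1, -1,
      -1,  1,
       1, -1,
       1,  1,
    ]);
    gl.bufferData(gl.ARRAY_BUFFER, stripVertices, gl.STATIC_DRAW);
    gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);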

    I wired up an experiment and then ran into a new problem : While I was able to transform a block (or at least pretend to), and load up a varying array (that wouldn’t vary since all vertex shaders wrote the same values) to transmit to the fragment shader, the fragment shader can’t access specific values within the varying block. To clarify, a WebGL shader can use a constant value — or a value that can be evaluated as a constant at compile time — to index into arrays ; a WebGL shader cannot compute an index into an array. Per my reading, this is a WebGL security consideration, and the limitation may not be present in other OpenGL(-ES) implementations.
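    For illustration, the restriction looks roughly like this in a GLSL ES 1.0 fragment shader (a hedged example of my own ; exact behaviour varies by implementation, but WebGL only guarantees constant-index-expressions here) :

    const indexingFragmentSrc = `
      precision mediump float;
      varying vec4 vSamples[4];        // hypothetical varying array
      uniform int uWhich;
      void main() {
        vec4 ok = vSamples[2];          // constant index: always allowed
        // vec4 bad = vSamples[uWhich]; // computed index: not guaranteed to
        //                              // compile under WebGL's GLSL ES rules
        gl_FragColor = ok;
      }`;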

    Not Giving Up Yet : Choking The Fragment Shader
    You might want to be sitting down for this pitch :

    • Vertex shader only interpolates texture coordinates to transmit to fragment shader
    • Fragment shader performs IDCT for a single Y sample, U sample, and V sample
    • Fragment shader converts YUV -> RGB

    Seems straightforward enough. However, that step concerning IDCT for Y, U, and V entails a gargantuan number of operations. When computing the IDCT for an entire block of samples, it’s possible to leverage a lot of redundancy in the math which equates to far fewer overall operations. If you absolutely have to compute each sample individually, for an 8×8 block, that requires 64 multiplication/accumulation (MAC) operations per sample. For 3 color planes, and including a few extra multiplications involved in the RGB conversion, that tallies up to about 200 MACs per pixel. Then there’s the fact that this approach means 4x redundant operations on the color planes.

    It’s crazy, but I just want to see if it can be done. My approach is to pre-compute a pile of IDCT constants in the JavaScript and transmit them to the fragment shader via uniform variables. For a first order optimization, the IDCT constants are formatted as 4-element vectors. This allows computing 16 dot products rather than 64 individual multiplication/addition operations. Ideally, GPU hardware executes the dot products faster (and there is also the possibility of lining these calculations up as matrices).
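    Here is my rough reading of that first-order optimisation as a shader snippet (a sketch with assumed names, not the author’s code) : the 64 constants needed for one output sample arrive as 16 vec4 uniforms, and the 64 dequantized coefficients, already fetched into 16 vec4s, are combined with 16 dot products.

    // Fragment-shader snippet (as a string, coefficient fetching omitted).
    const idctSampleSnippet = `
      uniform vec4 uIdctConsts[16];    // 64 precomputed constants for this sample
      float idctSample(vec4 coeffs[16]) {
        float sum = 0.0;
        for (int i = 0; i < 16; i++) {           // constant loop bound, so the
          sum += dot(coeffs[i], uIdctConsts[i]); // indexing stays within WebGL's rules
        }
        return sum;
      }`;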

    I can report that I actually got a sample correctly transformed using this approach. Just one sample, though. Then I ran into some new problems :

    Problem #1 : Computing sample #1 vs. sample #0 requires a different table of 64 IDCT constants. Okay, so create a long table of 64 * 64 IDCT constants. However, this suffers from the same problem as seen in the previous approach : I can’t dynamically compute the index into this array. What’s the alternative ? Maintain 64 separate named arrays and implement 64 branches, when branching of any kind is ill-advised in shader programming to begin with ? I started to go down this path until I ran into…

    Problem #2 : Shaders can only be so large. 64 * 64 floats (4 bytes each) requires 16 kbytes of data and this well exceeds the amount of shader storage that I can assume is allowed. That brings this path of exploration to a screeching halt.

    Further Brainstorming
    I suppose I could forgo pre-computing the constants and directly compute the IDCT for each sample which would entail lots more multiplications as well as 128 cosine calculations per sample (384 considering all 3 color planes). I’m a little stuck with the transform idea right now. Maybe there are some other transforms I could try.

    Another idea would be vector quantization. What little ORBX.js literature is available indicates that there is a method to allow real-time streaming but that it requires GPU assistance to yield enough horsepower to make it feasible. When I think of such severe asymmetry between compression and decompression, my mind drifts towards VQ algorithms. As I come to understand the benefits and limitations of GPU acceleration, I think I can envision a way that something similar to SVQ1, with its copious, hierarchical vector tables stored as textures, could be implemented using shaders.

    So far, this all pertains to intra-coded video frames. What about opportunities for inter-coded frames ? The only approach that I can envision here is to use WebGL’s readPixels() function to fetch the rasterized frame out of the GPU, and then upload it again as a new texture which a new frame processing pipeline could reference. Whether this idea is plausible would require some profiling.
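    A sketch of that round trip with the standard WebGL 1 calls (texture parameters and pipeline wiring mostly elided) : read the rendered frame back into JavaScript, then re-upload it as a texture that the next frame’s shaders can sample as a reference.

    const w = canvas.width, h = canvas.height;
    const pixels = new Uint8Array(w * h * 4);
    gl.readPixels(0, 0, w, h, gl.RGBA, gl.UNSIGNED_BYTE, pixels);

    // Re-upload the frame as a reference texture for the next pipeline pass.
    const referenceTex = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, referenceTex);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, w, h, 0,
                  gl.RGBA, gl.UNSIGNED_BYTE, pixels);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);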

    Using interframes in such a manner seems to imply that the entire codec would need to operate in RGB space and not YUV.

    Conclusions
    The people behind ORBX.js have apparently figured out a way to create a shader-based video codec. I have yet to even begin to reason out a plausible approach. However, I’m glad I did this exercise since I have finally broken through my ignorance regarding modern GPU shader programming. It’s nice to have a topic like multimedia that allows me a jumping-off point to explore other areas.

  • A Guide to Ethical Web Analytics in 2024

    17 June 2024, by Erin

    User data is more valuable and sought after than ever. 

    Ninety-four percent of respondents in Cisco’s Data Privacy Benchmark Study said their customers wouldn’t buy from them if their data weren’t protected, with 95% saying privacy was a business imperative. 

    Unfortunately, the data collection practices of most businesses are far from acceptable and often put their customers’ privacy at risk. 

    But it doesn’t have to be this way. You can ethically collect valuable and insightful customer data—you just need the right tools.

    In this article, we show you what an ethical web analytics solution can look like, why Google Analytics is a problem and how you can collect data without risking your customers’ privacy.

    What is ethical web analytics ?

    Ethical web analytics put user privacy first. These platforms prioritise privacy and transparency by only collecting necessary data, avoiding implicit user identification and openly communicating data practices and tracking methods. 

    Ethical tools adhere to data protection laws like GDPR as standard (meaning businesses using these tools never have to worry about fines or disruptions). In other words, ethical web analytics tools refrain from exploiting and profiting from user behaviour and data. 

    Unfortunately, most traditional data solutions collect as much data as possible without users’ knowledge or consent.

    Why does digital privacy matter ?

    Digital privacy matters because companies have repeatedly proven they will collect and use data for financial gain. It also presents security risks. Unsecured user data can lead to identity theft, cyberattacks and harassment. 

    Big tech companies like Google and Meta are often to blame for all this. These companies collect millions of user data points — like age, gender, income, political beliefs and location. Worse still, they share this information with interested third parties.

    After public outrage over data breaches and other privacy scandals, consumers are taking active steps to disallow tracking where possible. IAPP’s Privacy and Consumer Trust Report finds that 68% of consumers across 19 countries are somewhat or very concerned about their digital privacy. 

    There’s no way around it : companies of all sizes and shapes need to consider how they handle and protect customers’ private information.

    Why should you use an ethical web analytics tool ?

    When companies use ethical web analytics tools, they can build customer trust, boost their brand reputation, improve data security practices and futureproof their website tracking solution. 

    Boost brand reputation

    The fallout from a data privacy scandal can be severe. 

    Just look at what happened to Facebook during the Cambridge Analytica data scandal. The eponymous consulting firm harvested 50 million Facebook profiles and used that information to target people with political messages. Due to the instant public backlash, Facebook’s stock tanked, and use of the “delete Facebook” hashtag increased by 423% in the following days.

    That’s because consumers care about data privacy, according to Deloitte’s Connected Consumer Study :

    • Almost 90 percent agree they should be able to view and delete data companies collect 
    • 77 percent want the government to introduce stricter regulations
    • Half feel the benefits they get from online services outweigh data privacy concerns.

    If you can prove you buck the trend by collecting data using ethical methods, it can boost your brand’s reputation. 

    Build trust with customers

    At the same time, collecting data in an ethical way can help you build customer trust. You’ll go a long way to changing consumer perceptions, too. Almost half of consumers don’t like sharing data, and 57% believe companies sell their data. 

    This additional trust should generate a positive ROI for your business. According to Cisco’s Data Privacy Benchmark Study, the average company gains $180 for every $100 they invest in privacy. 

    Improve data security

    According to IBM’s Cost of a Data Breach report, the average cost of a data breach is nearly $4.5 million. This kind of scenario becomes much less likely when you use an ethical tool that collects less data overall and anonymises the data you do collect. 

    Futureproof your web analytics solution

    The obvious risk of not complying with privacy regulations is a fine — which can be up to €20 million, or 4% of worldwide annual revenue in the case of GDPR.

    It’s not just fines and penalties you risk if you fail to comply with privacy regulations like GDPR. For some companies, especially larger ones, the biggest risk of non-compliance with privacy regulations is the potential sudden need to abandon Google Analytics and switch to an ethical alternative.

    If Data Protection Authorities ban Google Analytics again, as has happened in Austria, France, and other countries, businesses will be forced to drop everything and make an immediate transition to a compliant web analytics solution.

    When an organisation’s entire marketing operation relies on data, migrating to a new solution can be incredibly painful and time-consuming. So, the sooner you switch to an ethical tool, the less of a headache the process will be. 

    The problem with Google Analytics

    Google Analytics (GA) is the most popular analytics platform in the world, but it’s a world away from being an ethical tool. Here’s why :

    You don’t have data ownership

    Google Analytics is attractive to businesses of all sizes because of its price. Everyone loves getting something for free, but there’s still a cost — your and your customers’ data.

    That’s because Google combines the data you collect with information from the millions of other websites it tracks to inform its advertising efforts. It may also use your data to train large language models like Gemini. 

    It has a rocky history with GDPR laws

    Google and EU regulators haven’t always got along. For example, the German Data Protection Authority is investigating 200,000 pending cases against websites using GA. The platform has also been banned and added back to the EU-US Data Privacy Framework several times over the past few years. 

    You can use GA to collect data about EU customers right now, but there’s no guarantee you’ll be able to do so in the future. 

    It requires a specific setup to remain compliant

    While you can currently use GA in a GDPR-compliant way — owing to its inclusion in the EU-US Data Privacy Framework — you have to set it up in a very specific way. That’s because the platform’s compliance depends on what data you collect, how you inform users and the level of consent you acquire. You’ll still need to include an extensive privacy policy on your website. 

    What does ethical web analytics look like ?

    An ethical web analytics solution should put user privacy first, ensure compliance with regulations like GDPR, give businesses 100% control of the data they collect and be completely transparent about data collection and storage practices. 

    What does ethical web tracking look like?

    100% data ownership

    You don’t fully control customer data when you use Google Analytics. The search giant uses your data for its own advertising purposes and may also use it to train large language models like Gemini. 

    When you choose an ethical web analytics alternative like Matomo, you can ensure you completely own your data.

    Respects user privacy

    It’s possible to track and measure user behaviour without collecting personally identifiable information (PII). Just look at the ethical web analytics tools we’ve reviewed below. 

    These platforms respect user privacy and conform to strict privacy regulations like GDPR, CCPA and HIPAA by incorporating some or all of the following features :

    In Matomo’s case, it’s all of the above. Better still, you can check our privacy credentials yourself. Our software’s source code is open source on GitHub and accessible to anyone at any time. 

    Compliant with government regulations

    While Google’s history with data regulations is tumultuous, an ethical web analytics platform should follow even the strictest privacy laws, including GDPR, HIPAA, CCPA, LGPD and PECR.

    But why stop there ? Matomo has been approved by the French Data Protection Authority (CNIL) as one of the few web analytics tools that French sites can use to collect data without tracking consent. So you don’t need an annoying consent banner popping up on your website anymore. 

    Complete transparency 

    Ethical web analytics tools will be upfront about their data collection practices, whether that’s in the U.S., EU, or on your own private servers. Look for a solution that refrains from collecting personally identifiable information, shows where data is stored, and lets you alter tracking methods to increase privacy even further. 

    Some solutions, like Matomo, will increase transparency further by providing open source software. Anyone can find our source code on GitHub to see exactly how our platform tracks and stores user data. This means our code is regularly examined and reviewed by a community of developers, making it more secure, too.

    Ethical web analytics solutions

    There are several options for an ethical web analytics tool. We list three of the best providers below. 

    Matomo

    Matomo is an open source web analytics tool and privacy-focused Google Analytics alternative used by over one million sites globally. 

    Screenshot example of the Matomo dashboard

    Matomo is fully compliant with prominent global privacy regulations like GDPR, CCPA and HIPAA, meaning you never have to worry about collecting consent when tracking user behaviour. 

    The data you collect is completely accurate, since Matomo doesn’t use data sampling, and it is 100% yours. We don’t share data with third parties, and we can prove it : our product source code is publicly available on GitHub. As a community-led project, you can download and install it yourself for free. 

    With Matomo, you get a full range of web analytics and behavioural analytics capabilities. That includes your standard metrics (think visitors, traffic sources, bounce rates, etc.) as well as advanced features to analyse user behaviour, like A/B Testing, Form Analytics, Heatmaps and Session Recordings. 

    Migrating to Matomo is easy. You can even import historical Google Analytics data to generate meaningful insights immediately. 

    Fathom

    Fathom Analytics is a lightweight privacy-focused analytics solution that launched in 2018. It aims to be an easy-to-use Google Analytics alternative that doesn’t compromise privacy. 

    A screenshot of the Fathom website

    Like Matomo, Fathom complies with all major privacy regulations, including GDPR and CCPA. It also provides 100% accurate, unsampled reports and doesn’t share your data with third parties. 

    While Fathom provides fairly comprehensive analytics reports, it doesn’t have some of Matomo’s more advanced features. That includes e-commerce tracking, heatmaps, session recordings, and more. 

    Plausible

    Plausible Analytics is another open source Google Analytics alternative that was built and hosted in the EU. 

    A screenshot of the Plausible website

    Launched in 2019, Plausible is a newer player in the privacy-focused analytics market. Still, its ultra-lightweight script makes it an attractive option for organisations that prioritise speed over everything else. 

    Like Matomo and Fathom, Plausible is GDPR and CCPA-compliant by design. There’s also no cap on the amount of data you can collect, no debate over whether the data is accurate (Plausible doesn’t use data sampling), and no question of who owns the data (you do). 

    Matomo makes it easy to migrate to an ethical web analytics alternative

    There’s no reason to put your users’ privacy at risk, especially when there are so many benefits to choosing an ethical tool. Whether you want to avoid fines, build trust with your customers, or simply know you’re doing the right thing, choosing a privacy-focused, ethical solution like Matomo is taking a massive step in the right direction. 

    Making the switch is easy, too. Matomo is one of the few options that lets you import historical Google Analytics data, so starting from scratch is unnecessary. 

    Get started today by trying Matomo for free for 21 days. No credit card required. 

  • CRO Testing : The 6-Steps for Maximising Conversion Rates

    10 March 2024, by Erin

    It’s a nightmare every marketing manager faces. Traffic is soaring after you’ve launched new digital marketing campaigns, but conversions have barely moved.

    Sound familiar ?

    The good news is you’re not alone — loads of marketing managers struggle to get potential customers to purchase. The better news is that you can test dozens of strategies to turn around your site’s fortunes. 

    Conversion rate optimisation testing (CRO testing for short) is the name for this kind of experimentation — and it can send conversion rates and revenue soaring.

    In this article, we’ll explain CRO testing and how you can start doing it today using Matomo. 

    What is CRO Testing ? 

    CRO testing is the process of optimising your site’s conversion funnel through a series of experiments designed to improve conversion rates.

    A CRO test can take several forms, but it usually involves changing one or more elements of your landing page. It looks something like this :

    1. You hypothesise what you expect to happen.
    2. You then run an A/B test using a dedicated CRO platform or tool.
    3. This tool will divide your site’s traffic, sending one segment to one variation and the other segment to another (a simplified sketch of this kind of split follows the list).
    4. The CRO tool will measure conversions, track statistical significance, and declare one variation the winner. 
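    As a generic illustration of step 3 (not Matomo’s actual mechanism), a CRO tool might assign variants deterministically by hashing a visitor identifier, so the same visitor always sees the same variation across visits :

    // Hypothetical variant assignment by hashing a visitor ID.
    function assignVariant(visitorId: string, variants: string[]): string {
      let hash = 0;
      for (const ch of visitorId) {
        hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple 32-bit hash
      }
      return variants[hash % variants.length];
    }

    assignVariant('visitor-42', ['original', 'wider-header']);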

    A CRO tool isn’t the only software you can use to gather data when running tests. There are several other valuable data sources, including :

    • A web analytics platform : to identify issues with your website
    • User surveys : to find out what your target audience thinks about your site
    • Heatmaps : to learn where users focus their attention
    • Session recordings : to discover how visitors browse your site

    Use as many of these features, tools, and methods as you can when brainstorming hypotheses and measuring results. After all, your CRO test is only as good as your data.

    On that note, we need to mention the importance of data accuracy when researching issues with your website and running CRO tests. If you trust a platform like Google Analytics that uses data sampling (where only a subset of data is analysed), then there’s a risk you make business decisions based on inaccurate reports.

    In practice, that could see you overestimate the effectiveness of a landing page, potentially wasting thousands in ad spend on poorly converting pages. 

    That’s why over a million websites rely on Matomo as their web analytics solution—it doesn’t sample data, providing 100% accurate website traffic insights you can trust to make informed decisions.

    Types of CRO Testing 

    There are three core types of CRO tests :

    A/B testing

    A/B testing, or split testing, is when you test two versions of the same page against each other. Usually, the two pages have only one difference, such as a new headline or a different CTA. 

    An A/B test setup in Matomo

    In the test above, for example, we test what happens if we remove one of the affiliate links from a page. We hypothesise that conversions won’t change because these links aren’t effective.

    A/B/n testing

    A/B/n testing is when you test multiple variations of the same element on the same page. 

    Rather than just testing one headline against another, for example, you test multiple different headlines at once.

    A screenshot of A/B test results run using Matomo

    In the test above in Matomo, we’re testing a website’s original header against a wider and smaller version. It turns out the wider header converts significantly better. 

    Multivariate testing

    In a multivariate CRO test, you test multiple different elements at the same time. That could mean testing combining a different headline, CTA button, and image. 

    Multivariate testing can save time because you test multiple elements at once and find the best combination of elements. But you’ll usually need a lot of traffic to find a statistically significant result.

    Why is CRO testing important ?

    Who doesn’t want more conversions, right ? Improving your conversion rate is the core benefit of running a CRO test, but there are a couple of other reasons you should do it, too :

    Why Is CRO Testing Important?

    Improve conversion rates

    How well does your website convert visitors ? The average conversion rate of a typical website is 2.35%, but better-performing websites have significantly higher conversion rates. The top 25% of websites across all industries convert at a rate of 5.31% or higher.

    CRO testing is the best way to improve your site’s conversion rate by tweaking elements of your website and implementing the best results. And because it’s based on data, not your intuition, you’re likely to identify changes that move the needle. 

    Optimise the user experience

    CRO tests are also a great way to improve your site’s user experience. The process of CRO testing forces you to understand how users navigate your website (using heatmaps and session recordings) and to fix the issues they face. 

    You could simplify your form fields to make them easier to fill in, for example, or make your pages easier to navigate. In both cases, your actions will also increase conversion rates.

    Decrease acquisition costs

    Improving your conversion rate using CRO testing will usually mean a decrease in customer acquisition costs and improvements in your other conversion metrics.

    After all, if the cost of your PPC ads stays the same but you convert more traffic, then each new customer will cost less to acquire.
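    For example, $1,000 of ad spend that brings in 1,000 visitors at a 2% conversion rate yields 20 customers, or $50 per customer ; lift the conversion rate to 4% and the same spend yields 40 customers at $25 each.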

    How to do CRO testing in 6 steps 

    Ready to get your hands dirty ? Follow these six steps to set up your first CRO test :

    Have a clear goal

    Don’t jump straight into testing. You need to be clear about what you want to achieve ; otherwise, you risk wasting time on irrelevant experiments. 

    If you’re unsure what to focus on, look back through your web analytics data and other tools like heatmaps, form analytics, and session recordings to get a feel for some of your site’s biggest conversion roadblocks. 

    Maybe there’s a page with a much lower conversion rate, for example — or a form that most users fail to complete. 

    If it’s the former, then your goal could be to increase the conversion rate of this specific landing page by 25%, bringing it in line with your site’s average. 

    The Goals dashboard in Matomo

    Make sure your new conversion goal is set up properly in your website analytics platform, too. This will ensure you’re tracking conversions accurately. 

    Set a hypothesis

    Now that you’ve got a goal, it’s time to create a hypothesis : an assumption you make about the outcome of your conversion rate optimisation test, based on your available research.

    A heatmap of your poorly converting landing page may show that users aren’t focusing on your CTA button because it’s hidden below the fold. 

    You could hypothesise that by placing the CTA button directly under your headline above the fold, your conversion rate should increase. 

    Whatever your goal, you can use the following template to write a hypothesis :

    If we [make this specific change], then [this specific outcome] will occur because [reason].

    Design your test elements

    Most marketing managers won’t be able to run CRO tests independently. A team of talented experts must create the assets you need for a successful experiment. This includes designers, copywriters, and web developers. 

    Don’t just have them create one new element at a time. Accelerate the process by having your team create dozens of designs simultaneously. That way, you can run a new CRO test as soon as your current test has finished. 

    Create and launch the test

    It’s time to launch your test. Use a CRO tool to automate building your test and tracking results. 

    With Matomo’s A/B Testing feature, it’s as easy as giving your test a name, writing a hypothesis and description, and uploading the URLs of your page variants.

    How to create a new A/B test in Matomo

    Matomo handles everything else, giving you a detailed breakdown at the end of the test with the winning variant. 

    Analyse the results

    You can only review the results of your CRO test once it has reached statistical significance — which means the observed outcome isn’t the result of chance.

    In the same way you wouldn’t say a die is unbiased after three rolls, you need thousands of visitors to see your landing pages and take action before deciding which is better. 

    Luckily, most CRO testing platforms, including Matomo, will highlight when a test reaches statistical significance. That means you only need to look at the result to see if your hypothesis is correct. 
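    For the curious, the underlying check is typically something like a two-proportion z-test ; here is a minimal sketch of my own (not Matomo’s implementation) :

    // Two-proportion z-test comparing the conversion rates of two variants.
    function zScore(convA: number, visitsA: number,
                    convB: number, visitsB: number): number {
      const pA = convA / visitsA;
      const pB = convB / visitsB;
      const pooled = (convA + convB) / (visitsA + visitsB);
      const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitsA + 1 / visitsB));
      return (pB - pA) / se; // |z| > 1.96 roughly corresponds to 95% confidence
    }

    // Example: 2,000 visitors per variant, 50 vs. 72 conversions.
    zScore(50, 2000, 72, 2000); // ≈ 2.0, just past the 95% threshold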

    Implement and repeat

    Was your test a success ? Great, you can implement the results and test a new element. 

    Yep, that’s right. There’s no time to rest on your laurels. Continuous CRO testing is necessary to squeeze every conversion possible from your website. Just like fashion trends, website effectiveness changes over time. What works today might not work tomorrow, making ongoing CRO testing beneficial and necessary.

    That’s why it’s a good idea to choose a CRO testing platform like Matomo with no data limits.

    CRO testing examples you can run today 

    There’s no shortage of CRO tests you can run. Here are some experiments to get started with :

    Change your CTA design and copy

    Calls to action (CTAs) are the best elements to optimise during your first CRO test. You can change many things about them ; even the smallest optimisation can have a huge impact. 

    Just take a look at the image below to see how diverse your CTAs could be :

    A range of different CTA buttons

    Changing your CTA’s copy is a great place to start, especially if you have generic instructions like “Apply Now.”

    Try a more specific instruction like “Download your free trial” or “Buy now to get 30% off.” Or test benefit-led instructions like “Reduce your ad spend today” or “Take back control of your data.”

    Changing the colour of your CTAs can also yield more conversions. Bright colours are always a good bet. Just make sure your button stands out from the rest of your page. 

    Move the CTA button placement

    The placement of your CTA can be just as important as its copy or colour. If it’s down at the bottom of your page, there’s a good chance most of your visitors will miss it. 

    Try moving it above the fold to see if that makes a difference. Then, test multiple CTA buttons as opposed to just one. 

    Heatmaps and session recordings can identify whether this test is worthwhile. If users rarely focus on your CTA or just don’t scroll far enough to find it, then it’s a good bet you could see an uptick in conversions by moving it. 

    Try different headlines

    Your website’s headlines are another great place to start CRO testing. These are usually the first (and sometimes only) things visitors read, so optimising them as much as possible makes sense. 

    There are entire books written about creating persuasive headlines, but start with one of the following tactics :

    • Include a benefit
      • “Achieve radiant skin—discover the secret !”
    • Add numbers
      • “3 foolproof methods for saving money on your next vacation”
    • Using negative words instead of positive ones
      • “Avoid these 7 mistakes to unlock your potential for personal growth”
    • Shortening or lengthening your headline
      • Shortened : “Crush your fitness goals : Expert tips for success”
      • Lengthened : “Embark on your fitness journey : Learn from experts with proven tips to crush your wellness goals”

    Add more trust signals

    Adding trust signals to your website, such as brand logos, customer reviews, and security badges, can increase your conversion rate.

    We use this tactic at Matomo, adding the logos of well-known clients like the United Nations and Amnesty International underneath our CTAs.

    Trust signals on the Matomo website

    It’s incredibly effective, too. Research by Edelman finds that trust is among the top three most important buying decision factors, above brand likeability.

    Start CRO testing with Matomo

    CRO testing is a data-backed method to improve your site’s conversion rate, making it more user-friendly and decreasing customer acquisition costs. Even a small improvement will be worth the cost of the tools and your time. 

    Fortunately, there’s no need to allocate hundreds of dollars monthly for multiple specialised testing tools. With Matomo, you get a comprehensive platform offering web analytics, user behaviour insights, and CRO testing – all conveniently bundled into one solution. Matomo’s pricing starts from just $19 per month, making it accessible to businesses of all sizes.

    Plus, you can rest assured that you’re GDPR compliant and that the data provided is 100% accurate, ethically empowering you to make informed decisions with confidence. 

    Take the first step on your CRO testing journey by trying Matomo free for 21 days ; no credit card required.