
Other articles (62)

  • Configuring language support

    15 November 2010

    Accessing the configuration and adding supported languages
    To enable support for new languages, go to the "Administrer" (Administer) section of the site.
    From there, the navigation menu gives access to a "Gestion des langues" (language management) section where support for new languages can be activated.
    Each newly added language can still be deactivated as long as no object has been created in that language. Once one has been, the language becomes greyed out in the configuration and (...)

  • Support for all media types

    10 April 2011

    Unlike many modern document-sharing programs and platforms, MediaSPIP aims to handle as many different document formats as possible, whether images (png, gif, jpg, bmp and others), audio (MP3, Ogg, Wav and others), video (Avi, MP4, Ogv, mpg, mov, wmv and others), or textual content, code and more (OpenOffice, Microsoft Office (spreadsheets, presentations), web (HTML, CSS), LaTeX, Google Earth) (...)

  • Emballe médias: what is it for?

    4 February 2011

    This plugin is designed to manage sites for publishing documents of all types.
    It creates "media", namely: a "media" is an article in the SPIP sense, created automatically when a document is uploaded, whether audio, video, image or text; only one document can be linked to a given "media" article;

On other sites (8190)

  • 7 Reasons to Migrate from Google Analytics to Matomo Now

    15 May 2022, by Erin

    The release of Google Analytics 4 (GA4), and the subsequent deprecation of Universal Analytics, has caused a stir amongst webmasters, SEO experts, marketers and the like.

    Google’s Universal Analytics is the most widely used web analytics platform in the world, but from 1 July 2023, it will no longer process any new data. Google is now pushing users to set up GA4 tracking imminently.

    If you’re like many and wondering if you should upgrade to Google Analytics 4, there are two key reasons why this might be a risk:

    1. GDPR violations: recent rulings have deemed Google Analytics illegal in France and Austria, and it’s likely that this trend will continue across the EU.
    2. Data loss: users switching to Google Analytics 4 can’t migrate their data from Universal Analytics.

    To mitigate these risks, many organisations are looking to switch to a Google Analytics alternative like Matomo. This is an ideal option for organisations that want to take ownership of their data, get compliant with privacy regulations and save themselves the stress of Google deprecating the software they rely on.

    Whilst there are two major reasons to steer clear of Google Analytics 4, there are 7 reasons why migrating to Matomo instead could save your business time and money, and give you peace of mind.

    If you want to avoid the pitfalls of GA4 and are thinking about migrating from Universal Analytics to Matomo, here’s why you should make the switch now.

    1. Keep your historical Universal Analytics data

    Users switching to Google Analytics 4 will be disappointed to find out that GA4 does not accept data imports from Universal Analytics. On top of that, Google also announced that after Universal Analytics stops processing new data (1 July 2023), users will only be able to access this data for “at least six months”. 

    Years of valuable insights will be completely wiped, and organisations will not be able to report on year-over-year results.

    Fortunately, any organisation using Universal Analytics can import this data into Matomo using our Google Analytics Importer plugin. So you can reduce business disruptions and retain years of valuable web analytics data when you switch to Matomo.

    Our comprehensive migration documentation features a handy video, written guides and FAQs to ensure a smooth migration process.

    2. Ease of use

    Web analytics is complicated enough without having to navigate confusing platform user interfaces (UIs). One of GA4’s biggest drawbacks is its “awful and unusable” interface, which has received an overwhelming backlash online.

    Matomo’s intuitive UI contains many of the familiar features that made Universal Analytics so well-liked. You’ll find the same popular features like Visitors, Behaviour, and Acquisition to name a few.

    User Flow in Matomo

    When you switch to Matomo you can get up to speed quickly and spend more time focusing on high-value tasks, rather than learning about everything new in GA4.

    3. 100% accurate unsampled data

    GA4 implements data sampling and machine learning to fill gaps. Often what you are basing critical business decisions on is actually an estimate of activity. 

    Matomo does not use data sampling, so this guarantees you will always see the full picture.

    “My primary reason to use Matomo is to get the unsampled data, [...] if your website gets lots of traffic and you can’t afford an enterprise level tool like GA premium [GA360] then Matomo is your best choice.”

    Himanshu Sharma, Digital Marketing Consultant & Founder at Optimize Smart.

    With Matomo you can be confident your data-driven decisions are being made with real data.

    4. Privacy by design

    Built-in privacy has always been at the core of Matomo. One key way we achieve this is by giving you 100% ownership of your web analytics data. You don’t ever have to worry about the data landing in the wrong hands or being used in unethical ways – like unsolicited advertising.

    On the contrary, Google Analytics is regularly under fire for controversial uses of data. While Google has made changes to make GA4 more privacy-focused, it’s all just smoke and mirrors. The data collected from Google Analytics accounts is used by Google to create digital profiles on internet users, which is then used for advertising. 

    Consumers are becoming increasingly concerned about how businesses are using their data. Businesses that develop privacy strategies and utilise privacy-focused tools will gain a competitive advantage and a loyal customer base.

    Prioritise the protection of your user data by switching to a privacy-by-design analytics solution.

    5. Compliance with global privacy laws

    To date, Google Analytics has been deemed illegal to use in France and Austria due to data transfers to the US. Upgrading to GA4 doesn’t make this problem go away either since data is still transferred to the US. 

    Matomo is easily configured to follow even the strictest of privacy laws, like GDPR, HIPAA, CCPA, LGPD and PECR.

    Matomo can also be used without cookie consent banners (unlike with Google Analytics, which will always need user consent to track). Matomo has been approved by the French Data Protection Authority (CNIL) as one of the select few web analytics tools that can be used to collect data without tracking consent.
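
    For illustration, cookieless tracking is a small change to the standard Matomo JavaScript snippet. A minimal sketch, assuming a self-hosted instance (the matomo.example.com URL and site ID 1 are placeholders):

        var _paq = window._paq = window._paq || [];
        _paq.push(['disableCookies']);       // must run before trackPageView
        _paq.push(['trackPageView']);
        _paq.push(['enableLinkTracking']);
        (function () {
            var u = 'https://matomo.example.com/';   // placeholder Matomo URL
            _paq.push(['setTrackerUrl', u + 'matomo.php']);
            _paq.push(['setSiteId', '1']);           // placeholder site ID
            var d = document, g = d.createElement('script'),
                s = d.getElementsByTagName('script')[0];
            g.async = true;
            g.src = u + 'matomo.js';
            s.parentNode.insertBefore(g, s);
        })();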

    Every year more countries are drafting legislation that mirrors the European Union’s GDPR (like the Brazilian LGPD). Matomo is designed to stay data-privacy law compliant, and always will be.

    Stay on top of global privacy laws and reduce the time you spend on compliance by switching to a privacy-compliant solution. 

    6. All-in-one web analytics

    Matomo gives you easy access to Heatmaps, Session Recordings, A/B testing, Funnels analytics, and more right out of the box. This means that digital marketing, UX and procurement teams won’t need to set up and manage multiple tools for behavioural analytics – it’s all in one place.

    Learn more about your audience, save money and reduce complexity by switching to an all-in-one analytics solution.

    Check out Matomo’s extensive product features.

    Heatmaps in Matomo

    Page Scroll Depth in Matomo

    7. Tag Manager built-in

    Unlike GA4, the Matomo Tag Manager comes built-in for an efficient and consistent user experience. Matomo Tag Manager offers a pain-free solution for embedding tracking codes on your website without needing help from a web developer or someone with technical knowledge.
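
    For illustration, embedding a Tag Manager container is a single snippet pasted into the page once; tags can then be managed from the Matomo UI with no further code changes. A minimal sketch, assuming a self-hosted instance (the URL and container ID are placeholders):

        var _mtm = window._mtm = window._mtm || [];
        _mtm.push({'mtm.startTime': (new Date().getTime()), 'event': 'mtm.Start'});
        var d = document, g = d.createElement('script'),
            s = d.getElementsByTagName('script')[0];
        g.async = true;
        g.src = 'https://matomo.example.com/js/container_AbCdEf12.js'; // placeholder container ID
        s.parentNode.insertBefore(g, s);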

    Help your Marketing team track more website actions and give time back to your web developer by switching to Matomo Tag Manager.

    Final Thoughts

    Google Analytics is free to use, but the legal issues surrounding the platform and the implications of switching to GA4 make migration a tough choice for many businesses.

    Now is the chance for organisations to step away from the advertising tech giant, take ownership of web analytics data and get compliant. Switch to the leading Google Analytics alternative and see why over 1 million websites choose Matomo for their web analytics.

    Ready to get started with your own Google Analytics to Matomo migration? Try Matomo free for 21 days now – no credit card required.

  • How to Stream RTP (IP camera) Into React App setup

    10 November 2024, by sharon2469

    I am trying to transfer a live broadcast from an IP camera, or any other broadcast coming from an RTP/RTSP source, to my React application. But it must be live.

    My setup at the moment is:

    IP Camera -> (RTP) -> FFmpeg -> (udp) -> Server(nodeJs) -> (WebRTC) -> React app

    


    In the current situation there is almost no delay, but there are some things here that I can't avoid and don't understand. Here are my questions:

    1) First, is this setup even correct, and is it the only way to stream RTP video into a web app?

    


    2) Is it possible to avoid re-encoding the stream? The RTP transmission already arrives as H.264, so I shouldn't really need to run the following command:

    


        return spawn('ffmpeg', [
            '-re',                              // Read input at its native frame rate; important for live streaming
            '-probesize', '32',                 // Set probing size to 32 bytes (32 is the minimum)
            '-analyzeduration', '1000000',      // Analyze input for 1 second
            '-c:v', 'h264',                     // Video codec of the input video
            '-i', 'rtp://238.0.0.2:48888',      // Input stream URL
            '-map', '0:v?',                     // Select video from the input stream
            '-c:v', 'libx264',                  // Video codec of the output stream
            '-preset', 'ultrafast',             // Faster encoding for lower latency
            '-tune', 'zerolatency',             // Optimize for zero latency
            // '-s', '768x480',                 // Adjust the resolution (experiment with values)
            '-f', 'rtp', `rtp://127.0.0.1:${udpPort}` // Output stream URL
        ]);


    


    As you can see, in this command I re-encode to libx264. But if I pass FFmpeg '-c:v', 'copy' instead of '-c:v', 'libx264', FFmpeg throws an error saying it doesn't know how to encode h264 and only knows libx264. Basically, I want to stop the re-encoding because there is really no need for it; the stream is already encoded as H.264. Are there any recommendations?
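
    For what it’s worth, a stream-copy variant of the command above might look like the sketch below. This is an untested sketch based only on the command in the question; the key detail is that a '-c:v' placed before '-i' selects the input decoder, while '-c:v copy' placed after '-i' tells FFmpeg to pass the already-encoded H.264 packets through untouched:

        import {spawn} from "child_process";

        const udpPort = 48888; // same placeholder port as above

        // Sketch: forward the H.264 stream without re-encoding.
        // '-c:v copy' appears AFTER '-i', so it applies to the output;
        // a '-c:v' given before '-i' would instead select an input decoder,
        // which is one common source of the "unknown encoder" error described above.
        const ffmpeg = spawn('ffmpeg', [
            '-re',
            '-probesize', '32',
            '-analyzeduration', '1000000',
            '-i', 'rtp://238.0.0.2:48888',            // input stream URL
            '-map', '0:v?',                           // select video from input
            '-c:v', 'copy',                           // pass H.264 through untouched
            '-f', 'rtp', `rtp://127.0.0.1:${udpPort}` // output stream URL
        ]);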

    


    3) I thought about dropping FFmpeg completely, but the RTP packets arrive at 1200+ bytes while WebRTC is limited to packets of up to 1280 bytes. Is there a way to handle this without damaging the video, and is it worth going down that path? I guess the whole jitter-buffer story comes into play here.
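
    As a quick sanity check before committing to that route, one could measure the incoming packet sizes directly. A minimal sketch, assuming the raw RTP arrives on the same UDP port used above:

        import {createSocket} from "dgram";

        // Sketch: count raw RTP packets larger than ~1200 bytes, a common
        // safe payload ceiling for WebRTC transports.
        const probe = createSocket("udp4");
        let oversized = 0;

        probe.on("message", (data) => {
            if (data.length > 1200) {
                oversized += 1;
                console.warn(`packet of ${data.length} bytes (#${oversized} oversized so far)`);
            }
        });

        probe.bind(48888); // placeholder: the port the camera/FFmpeg output targets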

    


    This is my server-side code (this is just test code):

    


    import {
    MediaStreamTrack,
    randomPort,
    RTCPeerConnection,
    RTCRtpCodecParameters,
    RtpPacket,
} from 'werift'
import {Server} from "ws";
import {createSocket} from "dgram";
import {spawn} from "child_process";
import LoggerFactory from "./logger/loggerFactory";

//

const log = LoggerFactory.getLogger('ServerMedia')

// Websocket server -> WebRTC
const serverPort = 8888
const server = new Server({port: serverPort});
log.info(`Server Media start on port: ${serverPort}`);

// UDP server -> ffmpeg
const udpPort = 48888
const udp = createSocket("udp4");
// udp.bind(udpPort, () => {
//     udp.addMembership("238.0.0.2");
// })
udp.bind(udpPort)
log.info(`UDP port: ${udpPort}`)


const createFFmpegProcess = () => {
    log.info(`Start ffmpeg process`)
    return spawn('ffmpeg', [
        '-re',                              // Read input at its native frame rate Important for live-streaming
        '-probesize', '32',                 // Set probing size to 32 bytes (32 is minimum)
        '-analyzeduration', '1000000',      // An input duration of 1 second
        '-c:v', 'h264',                     // Video codec of input video
        '-i', 'rtp://238.0.0.2:48888',      // Input stream URL
        '-map', '0:v?',                     // Select video from input stream
        '-c:v', 'libx264',                  // Video codec of output stream
        '-preset', 'ultrafast',             // Faster encoding for lower latency
        '-tune', 'zerolatency',             // Optimize for zero latency
        // '-s', '768x480',                    // Adjust the resolution (experiment with values)
        '-f', 'rtp', `rtp://127.0.0.1:${udpPort}` // Output stream URL
    ]);

}

let ffmpegProcess = createFFmpegProcess();


const attachFFmpegListeners = () => {
    // Capture standard output and print it
    ffmpegProcess.stdout.on('data', (data) => {
        log.info(`FFMPEG process stdout: ${data}`);
    });

    // Capture standard error and print it
    ffmpegProcess.stderr.on('data', (data) => {
        console.error(`ffmpeg stderr: ${data}`);
    });

    // Listen for the exit event
    ffmpegProcess.on('exit', (code, signal) => {
        if (code !== null) {
            log.info(`ffmpeg process exited with code ${code}`);
        } else if (signal !== null) {
            log.info(`ffmpeg process killed with signal ${signal}`);
        }
    });
};


attachFFmpegListeners();


server.on("connection", async (socket) => {
    const payloadType = 96; // Dynamic payload type assigned to the codec in the SDP offer/answer exchange; used here for H264
    // Create a peer connection with the codec parameters set in advance.
    const pc = new RTCPeerConnection({
        codecs: {
            audio: [],
            video: [
                new RTCRtpCodecParameters({
                    mimeType: "video/H264",
                    clockRate: 90000, // 90000 is the default value for H264
                    payloadType: payloadType,
                }),
            ],
        },
    });

    const track = new MediaStreamTrack({kind: "video"});


    udp.on("message", (data) => {
        console.log(data)
        const rtp = RtpPacket.deSerialize(data);
        rtp.header.payloadType = payloadType;
        track.writeRtp(rtp);
    });

    udp.on("error", (err) => {
        console.log(err)

    });

    udp.on("close", () => {
        console.log("close")
    });

    pc.addTransceiver(track, {direction: "sendonly"});

    await pc.setLocalDescription(await pc.createOffer());
    const sdp = JSON.stringify(pc.localDescription);
    socket.send(sdp);

    socket.on("message", (data: any) => {
        if (data.toString() === 'resetFFMPEG') {
            ffmpegProcess.kill('SIGINT');
            log.info(`FFMPEG process killed`)
            setTimeout(() => {
                ffmpegProcess = createFFmpegProcess();
                attachFFmpegListeners();
            }, 5000)
        } else {
            pc.setRemoteDescription(JSON.parse(data));
        }
    });
});


    


    And this is the frontend:

    


<!DOCTYPE html>
<html>
<head>
    <script
            crossorigin
            src="https://unpkg.com/react@16/umd/react.development.js"
    ></script>
    <script
            crossorigin
            src="https://unpkg.com/react-dom@16/umd/react-dom.development.js"
    ></script>
    <script
            crossorigin
            src="https://cdnjs.cloudflare.com/ajax/libs/babel-core/5.8.34/browser.min.js"
    ></script>
    <script src="https://cdn.jsdelivr.net/npm/babel-regenerator-runtime@6.5.0/runtime.min.js"></script>
</head>
<body>
<div id="app1"></div>

<script type="text/babel">
    let rtc;

    const App = () => {
        const [log, setLog] = React.useState([]);
        const videoRef = React.useRef();
        const socket = new WebSocket("ws://localhost:8888");
        const [peer, setPeer] = React.useState(null); // State to keep track of the peer connection

        React.useEffect(() => {
            (async () => {
                await new Promise((r) => (socket.onopen = r));
                console.log("open websocket");

                const handleOffer = async (offer) => {
                    console.log("new offer", offer.sdp);

                    const updatedPeer = new RTCPeerConnection({
                        iceServers: [],
                        sdpSemantics: "unified-plan",
                    });

                    updatedPeer.onicecandidate = ({candidate}) => {
                        if (!candidate) {
                            const sdp = JSON.stringify(updatedPeer.localDescription);
                            console.log(sdp);
                            socket.send(sdp);
                        }
                    };

                    updatedPeer.oniceconnectionstatechange = () => {
                        console.log(
                            "oniceconnectionstatechange",
                            updatedPeer.iceConnectionState
                        );
                    };

                    updatedPeer.ontrack = (e) => {
                        console.log("ontrack", e);
                        videoRef.current.srcObject = e.streams[0];
                    };

                    await updatedPeer.setRemoteDescription(offer);
                    const answer = await updatedPeer.createAnswer();
                    await updatedPeer.setLocalDescription(answer);

                    setPeer(updatedPeer);
                };

                socket.onmessage = (ev) => {
                    const data = JSON.parse(ev.data);
                    if (data.type === "offer") {
                        handleOffer(data);
                    } else if (data.type === "resetFFMPEG") {
                        console.log("FFmpeg reset requested");
                    }
                };
            })();
        }, []);

        const sendRequestToResetFFmpeg = () => {
            socket.send("resetFFMPEG");
        };

        return (
            <div>
                Video:
                <video ref={videoRef} autoPlay muted />
                <button onClick={() => sendRequestToResetFFmpeg()}>Reset FFMPEG</button>
            </div>
        );
    };

    ReactDOM.render(<App />, document.getElementById("app1"));
</script>
</body>
</html>

  • Four Trends Shaping the Future of Analytics in Banking

    27 November 2024, by Daniel Crough — Banking and Financial Services

    While retail banking revenues have been growing in recent years, trends like rising financial crime and the capital required for generative AI and ML pose significant risks and increase operating costs across the financial industry, according to McKinsey’s State of Retail Banking report.

    Today’s financial institutions are focused on harnessing AI and advanced analytics to make their data work for them. To be up to the task, analytics solutions must allow banks to give consumers the convenient, personalised experiences they want while respecting their privacy.

    In this article, we’ll explore some of the big trends shaping the future of analytics in banking and finance. We’ll also look at how banks use data and technology to cut costs and personalise customer experiences.

    So, let’s get into it.

    Graph: the average age of IT applications in insurance is 18 years

    Legacy systems like these don’t just represent a security risk; they also impact usability for both customers and employees. Does any of the following sound familiar?

    • Only specific senior employees know how to navigate the software to generate custom reports or use its more advanced features.
    • Customer complaints about your site’s usability or online banking experience are routine.
    • Onboarding employees takes much longer than necessary because of convoluted systems.
    • Teams and departments experience ‘data siloing,’ meaning that not everyone can access the data they need.

    These are warning signs that IT systems are ready for a review. Anyone thinking, “If it’s not broken, why fix it?” should consider that legacy systems can also present data security risks. As more countries introduce regulations to protect customer privacy, staying ahead of the curve is increasingly important to avoid penalties and litigation.

    And regulations aren’t the only trends impacting the future of financial institutions’ IT and analytics.

    4 trends shaping the future of analytics in banking

    New regulations and new technology have changed the landscape of analytics in banking.

    New privacy regulations impact banks globally

    The first major international example was the advent of GDPR, which went into effect in the EU in 2018. But a lot has happened since. New privacy regulations and restrictions around AI continue to roll out.

    • The European Artificial Intelligence Act (EU AI Act), held up as the world’s first comprehensive legislation on AI, entered into force on 1 August 2024.
    • In Europe’s federated data initiative, Gaia-X’s planned cloud infrastructure will provide for more secure, transparent, and trustworthy data storage and processing.
    • The revised Payment Services Directive (PSD2) makes payments more secure and strengthens protections for European businesses and consumers, aiming to create a more integrated and efficient payments market.

    But even businesses that don’t have customers in Europe aren’t safe. Consumer privacy is a hot-button issue globally.

    For example, the California Consumer Privacy Act (CCPA), which took effect in January 2020, impacts the financial services industry more than any other. Case in point: 34% of CCPA-related cases filed in 2022 were related to the financial sector.

    California’s privacy regulations were the first in the US, but other states are following closely behind. On 1 July 2024, new privacy laws went into effect in Florida, Oregon, and Texas, giving people more control over their data.

    Share of CCPA cases in the financial industry in 2022 (34%)

    One typical issue for companies in the banking industry is that the privacy measures applied to user data collected from their website are much more lax than those in their online banking system.

    It’s better to proactively invest in a privacy-centric analytics platform before you get tangled up in a lawsuit and have to pay a fine (and are forced to change your system anyway). 

    And regulatory compliance isn’t the only bonus of an ethical analytics solution. The right alternative can unlock key customer insights that can help you improve the user experience.

    The demand for personalised banking services

    At the same time, consumers expect an increasingly streamlined, personal experience from financial institutions. 86% of bank employees say personalisation is a clear priority for their company, but 63% describe resources as limited or only available after a clear business case has been demonstrated.

    McKinsey’s The data and analytics edge in corporate and commercial banking points out how advanced analytics are empowering frontline bank employees to give customers more personalised experiences at every stage:

    • Pre-meeting/meeting prep: using advanced analytics to assess customer potential, recommend products and identify the prospects most likely to convert
    • Meetings/negotiation: applying advanced models to support price negotiations, run what-if scenarios and price multiple products simultaneously
    • Post-meeting/tracking: using advanced models to identify behaviours that lead to high performance and to improve forecast accuracy and sales execution

    Today’s banks must deliver the personalisation that drives customer satisfaction and engagement to outperform their competitors.

    The rise of AI and its role in banking

    With AI and machine learning technologies becoming more powerful and accessible, financial institutions around the world are already reaping the rewards.

    McKinsey estimates that AI could add $200 billion to $340 billion in annual value across the global banking sector through productivity gains. Common applications include:

    • Credit card fraud prevention: algorithms analyse usage to flag and block fraudulent transactions.
    • More accurate forecasting: AI-based tools can analyse a broader spectrum of data points and forecast more accurately.
    • Better risk assessment and modelling: more advanced analytics and predictive models help avoid extending credit to high-risk customers.
    • Predictive analytics: helps spot the clients most likely to churn.
    • Gen-AI assistants: instantly analyse customer profiles and apply predictive models to suggest the next best actions.

    Considering these market trends, let’s discuss how you can move your bank into the future.

    Using analytics to minimise risk and establish a competitive edge 

    With the right approach, you can leverage analytics and AI to help future-proof your bank against changing customer expectations, increased fraud, and new regulations.

    Use machine learning to prevent fraud

    Every year, more consumers are victims of credit and debit card fraud. Debit card skimming cases nearly doubled in the US in 2023. The last thing you want as a bank is to put your customer in a situation where a criminal has spent their money.

    This not only leads to a horrible customer experience but also creates a lot of internal work and additional costs. Thankfully, machine learning can help identify suspicious activity and stop transactions before they go through. For example, Mastercard’s fraud prevention model has improved fraud detection rates by 20–300%.

    A credit card fraud detection robot

    Implementing a solution like this (or partnering with credit card companies who use it) may be a way to reduce risk and improve customer trust.

    Foresee and avoid future issues with AI-powered risk management

    Regardless of what type of financial products organisations offer, AI can be an enormous tool. Here are just a few ways in which it can mitigate financial risk in the future:

    • Predictive analytics can evaluate risk exposure and allow for more informed decisions about whether to approve commercial loan applications.
    • With better credit risk modelling, banks can avoid extending personal loans to customers most likely to default.
    • Investment banks (or individual traders or financial analysts) can use AI- and ML-based systems to monitor market and trading activity more effectively.

    Those are just a few examples that barely scratch the surface. Many other AI-based applications and analytics use cases exist across all industries and market segments.

    Protect customer privacy while still getting detailed analytics

    New regulations and increasing consumer privacy concerns don’t mean banks and financial institutions should forego website analytics altogether. Its insights into performance and customer behaviour are simply too valuable. And without customer interaction data, you’ll only know something’s wrong if someone complains.

    Fortunately, it doesn’t have to be one or the other. The right financial analytics solution can give you the data and insights needed without compromising privacy while complying with regulations like GDPR and CCPA.

    That way, you can track usage patterns and improve site performance and content quality based on accurate data — without compromising privacy. Reliable, precise analytics are crucial for any bank that’s serious about user experience.

    Use A/B testing and other tools to improve digital customer experiences

    Personalised digital experiences can be key differentiators in banking and finance when done well. But there’s stiff competition. In 2023, 40% of bank customers rated their bank’s online and mobile experience as excellent. 

    Improving digital experiences for users while respecting their privacy means going above and beyond a basic web analytics tool like Google Analytics. Invest in a platform with features like A/B tests and user session analysis for deeper insights into user behaviour.

    Diagram of an A/B test: four visitors divided into two groups, each shown a different option

    Behavioural analytics are crucial to understanding customer interactions. By identifying points of friction and drop-off points, you can make digital experiences smoother and more engaging.

    Matomo offers all this and is a great GDPR-compliant alternative to Google Analytics for banks and financial institutions.

    Of course, this can be challenging. This is why taking an ethical and privacy-centric approach to analytics can be a key competitive edge for banks. Prioritising data security and privacy will attract other like-minded, ethically conscious consumers and boost customer loyalty.

    Get privacy-friendly web analytics suitable for banking & finance with Matomo

    Improving digital experiences for today’s customers requires a solid web analytics platform that prioritises data privacy and accurate analytics. And choosing the wrong one could even mean ending up in legal trouble or scrambling to reconstruct your entire analytics setup.

    Matomo provides privacy-friendly analytics with 100% data accuracy (no sampling), advanced privacy controls and the ability to run A/B tests and user session analysis within the same platform (limiting risk and minimising costs). 

    It’s easy to get started with Matomo. Users can access clear, easy-to-understand metrics and plenty of pre-made reports that deliver valuable insights from day one. Form usage reports can help banks and fintechs identify potential issues with broken links or technical glitches and reveal clues on improving UX in the short term.

    Over one million websites, including some of the world’s top banks and financial institutions, use Matomo for their analytics.

    Start your 21-day free trial to see why, or book a demo with one of our analytics experts.