Advanced search

Media (0)

Keyword: - Tags -/organisation

No media matching your criteria are available on the site.

Other articles (77)

  • Configuring language support

    15 November 2010

    Accessing the configuration and adding supported languages
    To enable support for new languages, you need to go to the “Administer” section of the site.
    From there, in the navigation menu, you can reach a “Language management” section where support for new languages can be activated.
    Each newly added language can still be deactivated as long as no object has been created in that language. Once one has, it becomes greyed out in the configuration and (...)

  • Automatic backup of SPIP channels

    1 April 2010

    When setting up an open platform, it is important for hosts to have reasonably regular backups on hand to guard against any potential problem.
    This task relies on two SPIP plugins: Saveauto, which performs regular backups of the database as a MySQL dump (usable in phpMyAdmin), and mes_fichiers_2, which creates a zip archive of the site’s important data (the documents, the elements (...)

  • Automated installation script of MediaSPIP

    25 April 2011

    To overcome the difficulties mainly due to the installation of server side software dependencies, an "all-in-one" installation script written in bash was created to facilitate this step on a server with a compatible Linux distribution.
    You must have access to your server via SSH and a root account to use it, which will install the dependencies. Contact your provider if you do not have that.
    Documentation on using this installation script is available here.
    The code of this (...)

On other sites (6714)

  • Open Banking Security 101: Is open banking safe?

    3 December 2024, by Daniel Crough — Banking and Financial Services

    Open banking is changing the financial industry. Statista reports that open banking transactions hit $57 billion worldwide in 2023 and will likely reach $330 billion by 2027. According to ACI, global real-time payment (RTP) transactions are expected to exceed $575 billion by 2028.

    Open banking is changing how banking works, but is it safe? And what are the data privacy and security implications for global financial service providers?

    This post explains the essentials of open banking security and addresses critical data protection and compliance questions. We’ll explore how a privacy-first approach to data analytics can help you meet regulatory requirements, build customer trust and ultimately thrive in the open banking market while offering innovative financial products.


    What is open banking?

    Open banking is a system that connects banks, authorised third-party providers and technology, empowering customers to securely share their financial data with other companies. At the same time, it unlocks access to more innovative and personalised financial products and services like spend management solutions, tailored budgeting apps and more convenient payment gateways. 

    With open banking, consumers have greater choice and control over their financial data, ultimately fostering a more competitive financial industry, supporting technological innovation and paving the way for a more customer-centric financial future.

    Imagine offering your clients a service that analyses spending habits across all accounts — no matter the institution — and automatically finds ways to save them money. Envision providing personalised financial advice tailored to individual needs or enabling customers to apply for a mortgage with just a few taps on their phone. That’s the power of open banking.

    Embracing this technology is an opportunity for banks and fintech companies to build new solutions for customers who are eager for a more transparent and personalised digital experience.

    How is open banking different from traditional banking?

    In traditional banking, consumers’ financial data is locked away and siloed within each bank’s systems, accessible only to the bank and the account holder. While account holders could manually aggregate and share this data, the process is cumbersome and prone to errors.

    With open banking, users can choose what data to share and with whom, allowing trusted third-party providers to access their financial information directly from the source. 

    Side-by-side comparison between open banking and traditional banking showing the flow of financial information between the bank and the user with and without a third party.

    How does open banking work?

    The technology that makes open banking possible is the application programming interface (API). Think of banking APIs as digital translators for different software systems; instead of translating languages, they translate data and code.

    The bank creates and publishes APIs that provide secure access to specific types of customer data, like credit card transaction history and account balances. The open banking API acts like a friendly librarian, ready to assist apps in accessing the information they need in a secure and organised way.

    Third-party providers, like fintech companies, use these APIs to build their applications and services. Some tech companies also act as intermediaries between fintechs and banks to simplify connections to multiple APIs simultaneously.

    For example, banks like BBVA (Spain) and Capital One (USA) offer secure API platforms. Fintechs like Plaid and TrueLayer use those banking APIs as a bridge to users’ financial data. This bridge gives other service providers like Venmo, Robinhood and Coinbase access to customer data, allowing them to offer new payment gateways and investment tools that traditional banks don’t provide.
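
    To make those mechanics concrete, here is a minimal sketch of how a third-party app might fetch balances once a user has granted consent. It is an illustration only: the endpoint, token and response shape are hypothetical, and real open banking APIs vary by bank and by standard.

        import requests  # third-party HTTP library: pip install requests

        # Hypothetical base URL and token; a real token comes out of an
        # OAuth 2.0 consent flow between the user, the bank and the provider.
        API_BASE = "https://api.examplebank.com/open-banking/v3"
        ACCESS_TOKEN = "user-consented-oauth-token"

        def get_account_balances(account_id: str) -> dict:
            """Fetch balances for one account the user agreed to share."""
            response = requests.get(
                f"{API_BASE}/accounts/{account_id}/balances",
                headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
                timeout=10,
            )
            response.raise_for_status()  # surface authorisation errors early
            return response.json()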

    Is open banking safe for global financial services?

    Yes, open banking is designed from the ground up to be safe for global financial services.

    Open banking doesn’t make customer financial data publicly available. Instead, it uses a secure, regulated framework for sharing information. This framework relies on strong security measures and regulatory oversight to protect user data and ensure responsible access by authorised third-party providers.

    In the following sections, we’ll explore the key security features and banking regulations that make this technology safe and reliable.

    Regulatory compliance in open banking

    Regulatory oversight is a cornerstone of open banking security.

    In the UK and the EU, strict regulations govern how companies access and use customer data. The revised Payment Services Directive (PSD2) in Europe mandates strong customer authentication and secure communication, promoting a high level of security for open banking services.

    To offer open banking services, companies must register with their respective regulatory bodies and comply with all applicable data protection laws.

    For example, third-party service providers in the UK must be authorised by the Financial Conduct Authority (FCA) and listed on the Financial Services Register. Depending on the service they provide, they must get an Account Information Service Provider (AISP) or a Payment Initiation Service Provider (PISP) license.

    Similar regulations and registries exist across Europe, enforced by each country’s national competent authority, such as BaFin in Germany and the ACPR in France.

    In the United States, open banking providers don’t require a special federal license. However, this will soon change, as the U.S. Consumer Financial Protection Bureau (CFPB) unveiled a series of rules on 22 October 2024 to establish a regulatory framework for open banking.

    These regulations ensure that only trusted providers can participate in the open banking ecosystem. Anyone can check if a company is a trusted provider on public databases like the Regulated Providers registry on openbanking.org.uk. While being registered doesn’t guarantee fair play, it adds a layer of safety for consumers and banks.

    Key open banking security features that make it safe for global financial services

    Open banking is built on a foundation of solid security measures. Let’s explore five key features that make it safe and reliable for financial institutions and their customers.

    List of the five most important features that make open banking safe for global finance

    Strong Customer Authentication (SCA)

    Strong Customer Authentication (SCA) is a security principle that protects against unauthorised access to user financial data. It’s a regulated and legally required form of multi-factor authentication (MFA) within the European Economic Area.

    SCA mandates that users verify their identity using at least two of the following three factors:

    • Something they know (a password, PIN, security question, etc.)
    • Something they have (a mobile phone, a hardware token or a bank card)
    • Something they are (a fingerprint, facial recognition or voice recognition)

    This type of authentication helps reduce the risk of fraud and unauthorised transactions.
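
    As a toy illustration of the two-of-three principle, the sketch below combines a knowledge factor (a password checked against a stored PBKDF2 hash) with a possession factor (a TOTP code of the kind an authenticator app generates). It is not how any particular bank implements SCA; the salt, hash and secret are placeholders.

        import base64, hashlib, hmac, os, struct, time

        def totp(secret_b32: str, digits: int = 6, step: int = 30) -> str:
            """RFC 6238 TOTP: HMAC-SHA1 over the current 30-second counter."""
            key = base64.b32decode(secret_b32, casefold=True)
            counter = struct.pack(">Q", int(time.time() // step))
            digest = hmac.new(key, counter, hashlib.sha1).digest()
            offset = digest[-1] & 0x0F
            code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % 10**digits
            return f"{code:0{digits}d}"

        def sca_check(password: str, otp: str, salt: bytes, stored_hash: bytes, secret_b32: str) -> bool:
            """Pass only if BOTH factors verify: knowledge AND possession."""
            knows = hmac.compare_digest(
                hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000),
                stored_hash,
            )
            has = hmac.compare_digest(otp, totp(secret_b32))
            return knows and has

        # Enrolment-time placeholders: a stored password hash and the base32
        # TOTP secret shared with the user's authenticator app.
        salt = os.urandom(16)
        stored = hashlib.pbkdf2_hmac("sha256", b"correct horse", salt, 200_000)
        secret = base64.b32encode(os.urandom(20)).decode()
        assert sca_check("correct horse", totp(secret), salt, stored, secret)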

    API security

    PSD2 regulations mandate that banks provide open APIs, giving consumers the right to use any third-party service provider for their online banking services. According to McKinsey research, this has led to a surge in API adoption within the banking sector, with the largest banks allocating 14% of their IT budget to APIs. 

    To ensure API security, banks and financial service providers implement several measures, including:

    • API gateways, which act as a central point of control for all API traffic, enforcing security policies and preventing unauthorised access
    • API keys and tokens to authenticate and authorise API requests (the equivalent of a library card for apps)
    • Rate limiting to prevent denial-of-service attacks by limiting the number of requests a third-party application can make within a specific timeframe (a common implementation is the token bucket, sketched after this list)
    • Regular security audits and penetration testing to identify and address potential vulnerabilities in the API infrastructure
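
    As one example, here is a minimal token-bucket sketch of the rate limiting mentioned above; the per-key numbers are made up for illustration, not anyone’s production policy.

        import time

        class TokenBucket:
            """Allow up to `rate` requests/second with bursts up to `capacity`."""

            def __init__(self, rate: float, capacity: int):
                self.rate = rate
                self.capacity = capacity
                self.tokens = float(capacity)
                self.last = time.monotonic()

            def allow(self) -> bool:
                now = time.monotonic()
                self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
                self.last = now
                if self.tokens >= 1:
                    self.tokens -= 1
                    return True
                return False  # caller should answer HTTP 429 Too Many Requests

        # One bucket per API key: here, 5 requests/second, bursts of 10
        buckets: dict[str, TokenBucket] = {}

        def check(api_key: str) -> bool:
            bucket = buckets.setdefault(api_key, TokenBucket(rate=5, capacity=10))
            return bucket.allow()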

    Data minimisation and purpose limitation

    Data minimisation and purpose limitation are fundamental principles of data protection that contribute significantly to open banking safety.

    Data minimisation means third parties will collect and process only the data necessary to provide their service. Purpose limitation requires them to use the collected data only for its original purpose.

    For example, a budgeting app that helps users track their spending only needs access to transaction history and account balances. It doesn’t need access to the user’s full transaction details, investment portfolio or loan applications.

    Limiting the data collected from individual banks significantly reduces the risk of potential misuse or exposure in a data breach.
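
    In practice, data minimisation can be as simple as whitelisting the fields a service actually needs and discarding the rest before anything is stored. A minimal sketch, with illustrative field names:

        # Only the fields a budgeting app needs for its stated purpose
        ALLOWED_FIELDS = {"date", "amount", "currency", "merchant_category"}

        def minimise(transaction: dict) -> dict:
            """Drop every field not needed for the stated purpose."""
            return {k: v for k, v in transaction.items() if k in ALLOWED_FIELDS}

        raw = {
            "date": "2024-11-02",
            "amount": -42.50,
            "currency": "EUR",
            "merchant_category": "groceries",
            "merchant_name": "Corner Shop",  # not needed, discarded
            "account_iban": "DE89 3704 0044 0532 0130 00",  # not needed, discarded
        }
        assert set(minimise(raw)) == ALLOWED_FIELDS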

    Encryption

    Encryption is a security method that protects data in transit and at rest. It scrambles data into an unreadable format, making it useless to anyone without the decryption key.

    In open banking, encryption protects users’ data as it travels between the bank and the third-party provider’s systems via the API. It also protects data stored on the bank’s and the provider’s servers. Encryption ensures that even if a breach occurs, user data remains confidential.
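
    For illustration, here is a minimal sketch of authenticated encryption with AES-256-GCM using Python’s cryptography library; in a real deployment the key would live in a KMS or HSM rather than in memory.

        import os
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

        key = AESGCM.generate_key(bit_length=256)
        aead = AESGCM(key)

        nonce = os.urandom(12)  # must be unique per message for a given key
        plaintext = b'{"account": "12345", "balance": 1042.17}'
        ciphertext = aead.encrypt(nonce, plaintext, b"accounts-api-v3")

        # Without the key the ciphertext is unreadable, and decryption fails
        # loudly if the ciphertext or associated data has been tampered with.
        assert aead.decrypt(nonce, ciphertext, b"accounts-api-v3") == plaintext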

    Explicit consent

    In open banking, before a third-party provider can access user data, it must first inform the user what data it will pull and why. The customer must then give their explicit consent to the third party collecting and processing that data.

    This transparency and control are essential for building trust and ensuring customers feel safe using third-party services.

    But beyond that, from the bank’s perspective, explicit customer consent is also vital for compliance with GDPR and other data protection regulations. It can also help limit the bank’s liability in case of a data breach.

    Explicit consent goes beyond sharing financial data. It’s also part of new data privacy regulations around tracking user behaviour online. This is where an ethical web analytics solution like Matomo can be invaluable. Matomo fully complies with some of the world’s strictest privacy regulations, like GDPR, LGPD and HIPAA. With Matomo, you get peace of mind knowing you can continue gathering valuable insights to improve your services and user experience while respecting user privacy and adhering to regulations.

    Risks of open banking for global financial services

    While open banking offers significant benefits, it’s crucial to acknowledge the associated risks. Understanding these risks allows financial institutions to implement safeguards and protect themselves and their customers.

    List of the three key risks that banks should always keep in mind.

    Risk of data breaches

    By its nature, open banking is like adding more doors and windows to your house. It’s convenient but also gives burglars more ways to break in.

    Open banking increases what cybersecurity professionals call the “attack surface,” or the number of potential points of vulnerability for hackers to steal financial data.

    Data breaches are a serious threat to banks and financial institutions. According to IBM’s 2024 Cost of a Data Breach Report, each breach costs companies in the US an average of $4.88 million. Therefore, banks and fintechs must prioritise strong security measures and data protection protocols to mitigate these risks.

    Risk of third-party access

    By definition, open banking involves granting third-party providers access to customer financial information. This introduces a level of risk outside the bank’s direct control.

    Financial institutions must carefully vet third-party providers, ensuring they meet stringent security standards and comply with all relevant data protection regulations.

    Risk of user account takeover

    Open banking can increase the risk of user account takeover if adequate security measures are not in place. For example, if a malicious third-party provider gains unauthorised access to a user’s bank login details, they could take control of the user’s account and make fraudulent bank transactions.

    A proactive approach to security, continuous monitoring and a commitment to evolving best practices and security protocols are crucial for navigating the open banking landscape.

    Open banking and data analytics : A balancing act for financial institutions

    The additional data exchanged through open banking unveils deeper insights into customer behaviour and preferences. This data can fuel innovation, enabling the development of personalised products and services and improved risk management strategies.

    However, using this data responsibly requires a careful balancing act.

    Too much reliance on data without proper safeguards can erode trust and invite regulatory issues. The opposite can stifle innovation and limit the technology’s potential.

    Matomo Analytics derisks web and app environments by giving full control over what data is tracked and how it is stored. The platform prioritises user data privacy and security while providing valuable data and analytics that will be familiar to anyone who has used Google Analytics.

    Open banking, data privacy and AI

    The future of open banking is entangled with emerging technologies like artificial intelligence (AI) and machine learning. These technologies significantly enhance open banking analytics, personalise services, and automate financial tasks.

    Several banks, credit unions and financial service providers are already exploring AI’s potential in open banking. For example, HSBC developed the AI-enabled FX Prompt in 2023 to improve forex trading. The bank processed 823 million client API calls, many of them open banking calls.

    However, using AI in open banking raises important data privacy considerations. As the American Bar Association highlights, balancing personalisation with responsible AI use is crucial for open banking’s future. Financial institutions must ensure that AI-driven solutions are developed and implemented ethically, respecting customer privacy and data protection.

    Conclusion

    Open banking presents a significant opportunity for innovation and growth in the financial services industry. While it’s important to acknowledge the associated risks, security measures like explicit customer consent, encryption and regulatory frameworks make open banking a safe and reliable system for banks and their clients.

    Financial service providers must adopt a multifaceted approach to data privacy, implementing privacy-centred solutions across all aspects of their business, from open banking to online services and web analytics.

    By prioritising data privacy and security, financial institutions can build customer trust, unlock the full potential of open banking and thrive in today’s changing financial environment.

  • Six Best Amplitude Alternatives

    10 December 2024, by Daniel Crough

    Product analytics is big business. Gone are the days when we could only guess what customers were doing with our products or services. Now, we can track, visualise, and analyse how they interact with them and, with that, constantly improve and optimise. 

    The problem is that many product analytics tools are expensive and complicated — especially for smaller businesses. They’re also packed with functionality more attuned to the needs of massive companies. 

    Amplitude is such a tool. It’s brilliant and it has all the bells and whistles that you’ll probably never need. Fortunately, there are alternatives. In this guide, we’ll explore the best of those alternatives and, along the way, provide the insight you’ll need to select the best analytics tool for your organisation. 

    Amplitude : a brief overview

    To set the stage, it makes sense to understand exactly what Amplitude offers. It’s a real-time data analytics tool for tracking user actions and gaining insight into engagement, retention, and revenue drivers. It helps you analyse that data and find answers to questions about what happened, why it happened, and what to do next.

    However, as good as Amplitude is, it has some significant disadvantages. While it does offer data export functionality, that seems deliberately restricted. It allows data exports for specific events, but it’s not possible to export complete data sets to manipulate or format in another tool. Even pulling it into a CSV file has a 10,000-row limit. There is an API, but not many third-party integration options.

    Getting data in can also be a problem. Amplitude requires manual tags on events that must be tracked for analysis, which can leave holes in the data if every possible subsequent action isn’t tagged. That’s a time-consuming exercise, and it’s made worse because those tags will have to be updated every time the website or app is updated. 

    As good as it is, it can also be overwhelming because it’s stacked with features that can create confusion for novice or inexperienced analysts. It’s also expensive. There is a freemium plan that limits functionality and events. Still, when an organisation wants to upgrade for additional functionality or to analyse more events, the step up to the paid plan is massive.

    Lastly, Amplitude has made some strides towards being a web analytics option, but it lacks some basic functionality that may frustrate people who are trying to see the full picture from web to app.

    Snapshot of Amplitude alternatives

    So, in place of Amplitude, what product analytics tools are available that won’t break the bank and still provide the functionality needed to improve your product? The good news is that there are literally hundreds of alternatives, and we’ve picked out six of the best.

    1. Matomo – Best privacy-focused web and mobile analytics
    2. Mixpanel – Best for product analytics
    3. Google Analytics – Best free option
    4. Adobe Analytics – Best for predictive analytics
    5. Umami – Best lightweight tool for web analytics
    6. Heap – Best for automatic user data capture

    A more detailed analysis of the Amplitude alternatives

    Now, let’s dive deeper into each of the six Amplitude alternatives. We’ll cover standout features, integrations, pricing, use cases and community critiques. By the end, you’ll know which analytics tool can help optimise website and app performance to grow your business.

    1. Matomo – Best privacy-friendly web and app analytics

    Privacy is a big concern these days, especially for organisations with a presence in the European Union (EU). Unlike other analytics tools, Matomo ensures you comply with privacy laws and regulations, like the General Data Protection Regulation (GDPR) and California’s Consumer Privacy Act (CCPA).

    Matomo helps businesses get the insights they need without compromising user privacy. It’s also one of the few self-hosted tools, ensuring data never has to leave your site.

    Matomo is open-source, which is also rare in this class of tools. That means it’s available for anyone to adapt and customise as they wish. Everything you need to build custom APIs is there.

    Image showing the origin of website traffic.
    The Locations page in Matomo shows the countries, continents, regions, and cities where website traffic originates.

    Its most useful capabilities include visitor logs and session recordings to trace the entire customer journey, spot drop-off points, and fine-tune sales funnels. The platform also comes with heatmaps and A/B testing tools. Heatmaps provide a useful visual representation of your data, while A/B testing allows for more informed, data-driven decisions.

    Despite its range of features, many reviewers laud Matomo’s user interface for its simplicity and user-friendliness. 

    Why Matomo: Matomo is an excellent alternative because it fills in the gaps where Amplitude comes up short, like cookieless tracking. Also, while Amplitude focuses mainly on behavioural analytics, Matomo offers both behavioural and traditional analytics, which allows more profound insight into your data. Furthermore, Matomo fully complies with the strictest privacy regulations worldwide, including GDPR, LGPD, and HIPAA.

    Standout features include multi-touch attribution, visits log, content engagement, ecommerce, customer segments, event tracking, goal tracking, custom dimensions, custom reports, automated email reports, tag manager, session recordings, roll-up reporting that can pull data from multiple websites or mobile apps, Google Analytics importer, comprehensive visitor tracking, heatmaps, and more.

    Integrations with 100+ technologies, including Cloudflare, WordPress, Magento, Google Ads, Drupal, WooCommerce, Vue, SharePoint and Wix.

    Pricing is free for Matomo On-Premise and $23 per month for Matomo Cloud, which comes with a 21-day free trial (no credit card required).

    Strengths

    • Privacy focused
    • Cookieless consent banners
    • 100% accurate, unsampled data
    • Open-source code 
    • Complete data ownership (no sharing with third parties)
    • Self-hosting and cloud-based options
    • Built-in GDPR Manager
    • Custom alerts, white labelling, dashboards and reports

    Community critiques 

    • Premium features are expensive and proprietary
    • Learning curve for non-technical users

    2. Mixpanel – Best for product analytics

    Mixpanel is a dedicated product analytics tool. It tracks and analyses customer interactions with a product across different platforms and helps optimise digital products to improve the user experience. It works with real-time data and can provide answers from customer and revenue data in seconds.

    It also presents data visualisations to show how customers interact with products.

    Screenshot reflecting useful customer trends

    Mixpanel allows you to play around with filters and views to reveal and chart some useful customer trends. (Image source)

    Why Mixpanel: One of the strengths of this platform is the ability to test hypotheses. Need to test an ambitious idea? Mixpanel can do it with real user analytics. That allows you to make data-driven decisions to find the best path forward.

    Standout features include automatic funnel segment analysis, behavioural segmentation, cohort segmentation, collaboration support, customisable dashboards, data pipelines, filtered data views, SQL queries, warehouse connectors and a wide range of pre-built integrations.

    Integrations available include Appcues, AppsFlyer, AWS, Databox, Figma, Google Cloud, Hotjar, HubSpot, Intercom, Integromat, MailChimp, Microsoft Azure, Segment, Slack, Statsig, VWO, Userpilot, WebEngage, Zapier, Zoho and dozens of others.

    Pricing starts with a freemium plan valid for up to 20 million events per month. The growth plan is affordable at $25 per month and adds features like no-code data transformations and data pipeline add-ons. The enterprise version runs at a monthly cost of $833 and provides the full suite of features and services and premium support.

    There’s a caveat. Those prices only allow up to 1,000 Monthly Tracked Users (MTUs), calculated based on the number of visitors that perform a qualifying event each month. Beyond that, MTU plans start at $20,000 per year.

    Strengths

    • User behaviour and interaction tracking
    • Unlimited cohort segmentation capabilities
    • Drop-off analysis showing where users get stuck
    • A/B testing capabilities

    Community critiques 

    • Expensive enterprise features
    • Extensive setup and configuration requirements

    3. Google Analytics 4 – Best free web analytics tool

    The first thing to know about Google Analytics 4 is that it’s a web analytics tool. In other words, it tracks sessions, not user behaviours in app environments. It can provide details on how people found your website and where they go once there, but it doesn’t offer much detail on how people use your product.

    There is also an enterprise version, Google Analytics 360, which is not free. We’ve broken down the differences between the two versions elsewhere.

    Image showing audience-related data provided by GA4

    GA4’s audience overview shows visitors, sessions, session lengths, bounce rates, and user engagement data. (Image source)


    Why Google Analytics: It’s great for gauging the effectiveness of marketing campaigns, tracking goal completions (purchases, cart additions, etc.) and spotting trends and patterns in user engagement.

    Standout features include built-in automation, customisable conversion goals, data drill-down functionality, detailed web acquisition metrics, media spend ROI calculations and out-of-the-box web analytics reporting.

    Integrations include all major CRM platforms, CallRail, DoubleClick DCM, Facebook, Hootsuite, Marketo, Shopify, VWO, WordPress, Zapier and Zendesk, among many others.

    Pricing is free for the basic version (Google Analytics 4) and scales based on features and data volume. The advanced features (in Google Analytics 360) are pitched at enterprises, and pricing is custom.

    Strengths

    • Free to start
    • Multiple website management
    • Traffic source details
    • Up-to-date traffic data

    Community critiques 

    • Steep learning curve 
    • Data sampling

    4. Adobe Analytics – Best for predictive analytics

    A fully configured Adobe Analytics implementation is the Swiss army knife of analytics tools. It begins with web analytics, adds product analytics, and then wraps it up nicely with predictive analytics.

    Unlike the other Amplitude alternatives here, there’s no free version. Adobe Analytics has a complicated pricing matrix with options like website analytics, marketing analytics, attribution, and predictive analytics. It also has a wide range of customisation options that will appeal to large businesses. But for smaller organisations, it may all be a bit too much.


    Screenshot categorising online orders by marketing channel

    Adobe Analytics’ cross-channel attribution ties actions from different channels into a single customer journey. (Image source)


    Why Adobe Analytics: For current Adobe customers, this is a logical next step. Either way, Adobe Analytics can combine, evaluate, and analyse data from any part of the customer journey. It analyses that data with predictive intelligence to provide insights to enhance customer experiences.


    Standout features include AI-powered prediction analysis, attribution analysis, multi-channel data collection, segmentation and detailed customer journey analytics, product analytics and web analytics.


    Integrations are available through the Adobe Experience Cloud Exchange. Adobe Analytics also supports data exchange with brands such as BrightEdge, Branch.io, Google Ads, Hootsuite, Invoca, Salesforce and over 200 other integrations.


    Pricing starts at $500 monthly, but prospective customers are encouraged to contact the company for a needs-based quotation.


    Strengths

    • Drag-and-drop interface
    • Flexible segmentation 
    • Easy-to-create conversion funnels
    • Threshold-based alerts and notifications

    Community critiques 

    • No free version
    • Lack of technical support
    • Steep learning curve

    5. Umami – Best lightweight tool for web analytics

    The second of our open-source analytics solutions is Umami, a favourite in the software development community. Like Matomo, it’s a powerful and privacy-focused alternative that offers complete data control and respects user privacy. It’s also available as a cloud-based freemium plan or as a self-hosted solution.


    Image showing current user traffic and hourly traffic going back 24 hours

    Umami’s dashboard reveals the busiest times of day and which pages are visited when. (Image source)


    Why Umami: Umami has a clear and simple user interface (UI) that lets you measure important metrics such as page visits, referrers, and user agents. It also features event tracking, although some reviewers complain that it’s quite limited.

    Standout features can be summed up in five words: privacy, simplicity, lightweight, real-time, and open-source. Umami’s UI is clean, intuitive and modern, and it doesn’t slow down your website.

    Integrations include plugins for VuePress, Gatsby, Craft CMS, Docusaurus, WordPress and Publii, and a module for Nuxt. Umami’s API works with JavaScript, PHP (Laravel) and Python.

    Pricing is free for up to 100k monthly events and three websites, but with limited support and data retention restrictions. The Pro plan costs $20 a month and gives you unlimited websites and team members, a million events (plus $0.00002 for each event over that), five years of data retention and email support. The Enterprise plan has custom pricing.

    Strengths

    • Freemium plan
    • Open-source
    • Lightweight 

    Community critiques 

    • Limited support options
    • Data retention restrictions
    • No funnel functionality

    6. Heap – Best for automatic data capture

    Product analytics with a twist is a good description of Heap. It features event auto-capture to track user interactions across all touchpoints in the user journey. This lets you fully understand how and why customers engage with your product and website. 

    Using a single JavaScript snippet, Heap automatically collects data on everything users do, including how they got to your website. It also helps identify how different cohorts engage with your product, providing the critical insights teams need to boost conversion rates.

    Image showing funnel and path analysis data and insights

    Heap’s journeys feature combines funnel and path analysis. (Image source)


    Why Heap: The auto-capture functionality solves a major shortcoming of many product analytics tools — manual tracking. Instead of having to set up manual tags on events, Heap automatically captures all data on user activity from the start.

    Standout features include event auto-capture, session replay, heatmaps, segments (or cohorts) and journeys, the last of which combines the functions of funnel and path analysis tools into a single feature.

    Integrations include AWS, Google, Microsoft Azure, major CRM platforms, Snowflake and many other data manipulation platforms.

    Pricing is quote-based across all payment tiers. There is also a free plan and a 14-day free trial.

    Strengths

    • Session replay
    • Heatmaps 
    • User segmentation
    • Simple setup 
    • Event auto-capture 

    Community critiques 

    • No A/B testing functionality
    • No GDPR compliance support

    Choosing the best solution for your team

    When selecting a tool, it’s crucial to understand how product analytics and web analytics solutions differ. 

    Product analytics tools track users or accounts and record the features they use, the funnels they move through, and the cohorts they’re part of. Web analytics tools focus more on sessions than users because they’re interested in data that can help improve website usage. 

    Some tools combine product and web analytics to do both of these jobs.

    Area of focus

    Product analytics tools track user behaviour within SaaS- or app-based products. They’re helpful for analysing features, user journeys, engagement metrics, product development and iteration. 

    Web analytics tools analyse web traffic, user demographics, and traffic sources. They’re most often used for marketing and SEO insights.

    Level of detail

    Product analytics tools provide in-depth tracking and analysis of user interactions, feature usage, and cohort analysis.

    Web analytics tools provide broader data on page views, bounce rates, and conversion tracking to analyse overall site performance.

    Whatever tools you try, your first step should be to search for reviews online to see what people who’ve used them think. There are some great review sites you can try: see what people are saying on Capterra, G2, Gartner Peer Insights, or TrustRadius.

    Use Matomo to power your web and app analytics

    Web and product analytics is a competitive field, and there are many other tools worth considering. This list is a small cross-section of what’s available.

    That said, if you have concerns about privacy and costs, consider choosing Matomo. Start your 21-day free trial today.

  • Adventures In NAS

    1 January, by Multimedia Mike — General

    In my post last year about my out-of-control single-board computer (SBC) collection, which included my meager network attached storage (NAS) solution, I noted that:

    I find that a lot of my fellow nerds massively overengineer their homelab NAS setups. I’ll explore this in a future post. For my part, people tend to find my homelab NAS solution slightly underengineered.

    So here I am, exploring this in that future post. I’ve been in the home NAS game a long time, but have never had very elaborate solutions. For my part, I tend to take an obsessively reductionist view of what constitutes a NAS: any small computer with a pool of storage and a network connection, running the Linux operating system and the Samba file sharing service.


    Simple hard drive and ethernet cable

    Many home users prefer to buy turnkey boxes, usually ones that allow you to install hard drives yourself and then configure the box and its services with a friendly UI. My fellow weird computer nerds often buy cast-off enterprise hardware and set up more resilient, over-engineered solutions, as long as they have strategies to mitigate the noise and dissipate the heat, and don’t mind the electricity bills.

    If it works, awesome! As an old hand at this, however, I am rather stuck in my ways, preferring to do my own stunts, both with the hardware and software solutions.

    My History With Home NAS Setups
    In 1998, I bought myself a new computer — a beige box tower PC, as was the style at the time. This was when normal people had one computer at most. It ran Windows, but I was curious about this new thing called “Linux” and learned to dual boot that. Later that year, it dawned on me that nothing prevented me from buying a second ugly beige box PC and running Linux exclusively on it. Further, it could be a headless Linux box, connected by ethernet, and I could consolidate files into a single place using this file sharing software named Samba.

    I remember it being fairly onerous to get Samba working in those days, and the internet was not nearly as helpful as it is now. I recall that what blocked me for a while was needing to know that I had to specify an entry for the Samba server machine in the LMHOSTS (Lanman hosts) file on the Windows 95 machine.

    However, after I cracked that code, I have pretty much always had some kind of ad-hoc home NAS setup, often combined with a headless Linux development box.

    In the early 2000s, I built a new beige box PC for a file server, with a new hard disk, and a coworker tutored me on setting up a (P)ATA UDMA 133 (or was it 150? Anyway, it was (P)ATA’s last hurrah before SATA conquered all) expansion card, and I remember profiling that the attached hard drive read at a full 21 MBytes/s. It was pretty slick. Except I hadn’t really thought things through. You see, I had a hand-me-down ethernet hub, cast off from my job at the time, which I wanted to use. It was a 100 Mbps repeater hub, not a switch, so the catch was that all connected machines had to be capable of 100 Mbps. So, after getting all of my machines (3 at the time) upgraded to support 10/100 ethernet (the old off-brand PowerPC running Linux was the biggest challenge), I profiled transfers and realized that the best this repeater hub could achieve was about 3.6 MBytes/s. For a long time after that, I just assumed that was the upper limit of what a 100 Mbps network could achieve. Obviously, I now know that the upper limit ought to be around 11.2 MBytes/s, and if I had gamed out that fact in advance, I would have realized it didn’t make sense to care about super-fast (for the time) disk performance.
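
    A quick back-of-the-envelope check of that ceiling, counting Ethernet framing plus IPv4/TCP header overhead (a sketch, not a precise model):

        # What TCP payload rate can 100 Mbps Fast Ethernet actually carry?
        line_rate = 100e6 / 8          # 12.5 MBytes/s of raw bits
        wire_frame = 1500 + 38         # MTU + preamble, headers, FCS, inter-frame gap
        tcp_payload = 1500 - 20 - 20   # MTU minus IPv4 and TCP headers

        ceiling = line_rate * tcp_payload / wire_frame
        print(f"{ceiling / 1e6:.1f} MBytes/s")  # ~11.9 theoretical; ~11.2 in practice
        # A repeater hub fares far worse: it is a single shared, half-duplex
        # collision domain, so every frame contends with all other traffic.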

    At this time, I was doing a lot of development for MPlayer/xine/FFmpeg. I stored all of my multimedia material on this NAS. I remember being confused when I was working with Y4M data, which is raw frames, which is lots of data. xine, which employed a pre-buffering strategy, would play fine for a few seconds and then stutter. Eventually, I reasoned out that the files I was working with had a data rate about twice what my awful repeater hub supported, which is probably the first time I came to really understand and respect streaming speeds and their implications for multimedia playback.

    Smaller Solutions
    For a period, I didn’t have a NAS. Then I got an Apple AirPort Extreme, which I noticed had a USB port. So I bought a dual drive brick to plug into it and used that for a time. Later (2009), I had this thing called the MSI Wind Nettop which is the only PC I’ve ever seen that can use a CompactFlash (CF) card for a boot drive. So I did just that, and installed a large drive so it could function as a NAS, as well as a headless dev box. I’m still amazed at what a low-power I/O beast this thing is, at least when compared to all the ARM SoCs I have tried in the intervening 1.5 decades. I’ve had spinning hard drives in this thing that could read at 160 MBytes/s (‘dd’ method) and have no trouble saturating the gigabit link at 112 MBytes/s, all with its early Intel Atom CPU.

    Around 2015, I wanted a more capable headless dev box and discovered Intel’s line of NUCs. I got one of the fat models that can hold a conventional 2.5″ spinning drive in addition to the M.2 SATA SSD and I was off and running. That served me fine for a few years, until I got into the ARM SBC scene. One major limitation here is that 2.5″ drives aren’t available in nearly the capacities that make a NAS solution attractive.

    Current Solution
    My current NAS solution, chronicled in my last SBC post, is the ODroid-HC2, a highly compact ARM SoC board with an integrated USB3-SATA bridge so that a SATA drive can be connected directly to it:


ODROID-HC2 NAS


    I tend to be weirdly proficient at recalling dates, so I’m surprised that I can’t recall when I ordered this and put it into service. But I’m pretty sure it was circa 2018. It’s only equipped with an 8 TB drive now, but I seem to recall that it started out with only a 4 TB drive. I think I upgraded to the 8 TB drive early in the pandemic in 2020, when ISPs were implementing temporary data cap amnesty and I was doing what a r/DataHoarder does.

    The HC2 has served me well, even though it has a number of shortcomings for a hardware set chartered for NAS:

    1. While it has a gigabit ethernet port, it’s documented that it never really exceeds about 70 MBytes/s, due to the SoC’s limitations
    2. The specific ARM chip (Samsung Exynos 5422; more than a decade old as of this writing) lacks cryptography instructions, slowing down encryption if that’s your thing (e.g., LUKS)
    3. While the SoC supports USB3, that block is tied up for the SATA interface; the remaining USB port is only capable of USB2 speeds
    4. 32-bit ARM, which prevented me from running certain bits of software I wanted to try (like Minio)
    5. Only 1 drive, so no possibility for RAID (again, if that’s your thing)

    I also love to brag about the HC2’s power usage: I once profiled the unit for a month using a Kill-A-Watt under normal usage (with the drive spinning only when in active use), and it consumed 4.5 kWh… in an entire month. Spread over roughly 720 hours, that works out to an average draw of just over 6 watts.

    New Solution
    Enter the ODroid-HC4 (I purchased mine from Ameridroid, but Hardkernel works with numerous distributors):


ODroid-HC4 with an SSD and a conventional drive


    I ordered this earlier in the year and, after many months of procrastinating and obsessing over the best approach to take with its general usage, I finally have it in service as my new NAS. Comparing point by point with the HC2:

    1. The gigabit ethernet runs at full speed (though a few things on my network run at 2.5 GbE now, so I guess I’ll always be behind)
    2. The ARM chip (Amlogic S905X3) has AES cryptography acceleration and handles all the LUKS stuff without breaking a sweat; “cryptsetup benchmark” reports between 500 and 600 MBytes/s on all the AES variants
    3. The USB port is still only USB2, so no improvement there
    4. 64-bit ARM, which means I can run Minio to simulate block storage in a local dev environment for some larger projects I would like to undertake
    5. Supports 2 drives, if RAID is your thing

    How I Set It Up
    How to set up the drive configuration? As should be apparent from the photo above, I elected for an SSD (500 GB) for speed, paired with a conventional spinning HDD (18 TB) for sheer capacity. I’m not particularly trusting of RAID. I’ve watched it fail too many times, on systems that I don’t even manage, not to mention the aforementioned RAID brick that I had attached to the Apple AirPort Extreme.

    I had long been planning to use bcache, the block caching interface for Linux, which can use the SSD as a speedy cache in front of the more capacious disk. There is also LVM cache, which is supposed to achieve something similar. And then I had to evaluate the trade-offs in whether I wanted write-back, write-through, or write-around configurations.

    This was all predicated on the assumption that the spinning drive would not be able to saturate the gigabit connection. When I got around to setting up the hardware and trying some basic tests, I found that the conventional HDD had no trouble keeping up with the gigabit data rate, both reading and writing, somewhat obviating the need for SSD acceleration using any elaborate caching mechanisms.
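
    For reference, a quick sequential-read check along the lines of the ‘dd’ method mentioned earlier (a sketch only; the device path is a placeholder, and reading a raw device needs root):

        import time

        def sequential_read_mbps(path: str, chunk: int = 1 << 20, limit: int = 2 << 30) -> float:
            """Rough 'dd'-style test: stream up to `limit` bytes, report MBytes/s."""
            done = 0
            start = time.monotonic()
            with open(path, "rb", buffering=0) as f:
                while done < limit:
                    data = f.read(chunk)
                    if not data:
                        break
                    done += len(data)
            return done / (time.monotonic() - start) / 1e6

        # e.g. sequential_read_mbps("/dev/sda") -- reading the raw device
        # sidesteps the filesystem cache on the first pass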

    Maybe that’s because I sprang for the WD Red Pro series this time, rather than the Red Plus? I’m guessing that conventional drives do deteriorate over the years. I’ll find out.

    For the operating system, I stuck with my newest favorite Linux distro : DietPi. While HardKernel (parent of ODroid) makes images for the HC units, I had also used DietPi for the HC2 for the past few years, as it tends to stay more up to date.

    Then I rsync’d my data from HC2 -> HC4. It was only about 6.5 TB of total data but it took days as this WD Red Plus drive is only capable of reading at around 10 MBytes/s these days. Painful.

    For file sharing, I’m pretty sure most normal folks have nice web UIs in their NAS boxes which allow them to easily configure and monitor the shares. I know there are such applications I could set up. But I’ve been doing this so long, I just do a bare-bones setup through the terminal. I installed regular Samba and then brought over my smb.conf file from the HC2. One by one, I tested that each of the old shares was activated on the new NAS and deactivated on the old NAS. I also set up a new share for the SSD. I guess that will just serve as fast I/O scratch space on the NAS.

    The conventional drive spins up and down. That’s annoying when I’m actively working on something but manage not to hit the drive for like 5 minutes and then an application blocks while the drive wakes up. I suppose I could set it up so that it is always running. However, I micro-manage this with a custom bash script I wrote a long time ago which logs into the NAS and runs the “date” command every 2 minutes, appending the output to a file. As a bonus, it also prints data rate up/down stats every 5 seconds. The spinning file (“nas-main/zz-keep-spinning/keep-spinning.txt”) has never been cleared and has nearly a quarter million lines. I suppose that implies that it has kept the drive spinning for 1/2 million minutes which works out to around 347 total days. I should compare that against the drive’s SMART stats, if I can remember how. The earliest timestamp in the file is from March 2018, so I know the HC2 NAS has been in service at least that long.
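
    For the curious, here is a minimal sketch of that kind of keep-spinning loop, with Python standing in for the original bash script; the SSH alias is a placeholder, and the real script also logs transfer statistics.

        import subprocess, time

        NAS_HOST = "nas"  # hypothetical SSH alias for the NAS
        SPIN_FILE = "nas-main/zz-keep-spinning/keep-spinning.txt"

        while True:
            # Appending a timestamp every 2 minutes keeps the drive awake
            subprocess.run(["ssh", NAS_HOST, f"date >> {SPIN_FILE}"], check=False)
            time.sleep(120)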

    For tasks, vintage cron still does everything I could need. In this case, that means reaching out to websites (like this one) and automatically backing up static files.

    I also have to have a special script for starting up. Fortunately, I was able to bring this over from the HC2 and tweak it. The data disks (though not boot disk) are encrypted. Those need to be unlocked and only then is it safe for the Samba and Minio services to start up. So one script does all that heavy lifting in the rare case of a reboot (this is the type of system that’s well worth having on a reliable UPS).

    Further Work
    I need to figure out how to use the OLED display on the NAS, and how to make it show something more useful than the current time and date, which is what it does in its default configuration with HardKernel’s own Linux distro. With DietPi, it does nothing by default. I’m thinking it should be able to show the percent usage of each of the 2 drives, at a minimum.

    I also need to establish a more responsible backup regimen. I’m way too lazy about this. Fortunately, I reason that I can keep the original HC2 in service, repurposed to accept backups from the main NAS. Again, I’m sort of micro-managing this since a huge amount of data isn’t worth backing up (remember the whole DataHoarder bit), but the most important stuff will be shipped off.

    The post Adventures In NAS first appeared on Breaking Eggs And Making Omelettes.