
Other articles (68)
- User profiles
12 April 2011
Each user has a profile page where they can edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialised, visible only when the visitor is logged in to the site.
The user can reach the profile editor from their author page; a "Modifier votre profil" ("Edit your profile") link in the navigation is (...)
- Configuring language support
15 November 2010
Accessing the configuration and adding supported languages
To configure support for new languages, go to the "Administrer" (Administer) section of the site.
From there, the navigation menu gives access to a "Gestion des langues" (Language management) section where support for new languages can be enabled.
Each newly added language can still be disabled as long as no object has been created in that language. Once an object exists in it, the language is greyed out in the configuration and (...)
- XMP PHP
13 May 2011
According to Wikipedia, XMP stands for:
Extensible Metadata Platform, or XMP, an XML-based metadata format used in PDF, photography and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
Being XML-based, it handles a set of dynamic tags for use within the Semantic Web.
XMP can store information about a file as an XML document: title, author, history (...)
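As a concrete illustration of that last point, here is a minimal, simplified sketch of an XMP packet and of reading it back with Python's standard library. Real packets usually wrap values in rdf:Alt/rdf:Seq containers and are embedded inside the host file; those details are omitted here.

```python
import xml.etree.ElementTree as ET

# A simplified XMP packet: RDF/XML carrying Dublin Core properties.
# Real-world packets wrap values in rdf:Alt/rdf:Seq language alternatives.
XMP_PACKET = """<x:xmpmeta xmlns:x="adobe:ns:meta/">
  <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
           xmlns:dc="http://purl.org/dc/elements/1.1/">
    <rdf:Description rdf:about="">
      <dc:title>Example Photo</dc:title>
      <dc:creator>Jane Doe</dc:creator>
    </rdf:Description>
  </rdf:RDF>
</x:xmpmeta>"""

NS = {
    "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
    "dc": "http://purl.org/dc/elements/1.1/",
}

root = ET.fromstring(XMP_PACKET)
desc = root.find(".//rdf:Description", NS)
print("title:", desc.findtext("dc:title", namespaces=NS))
print("creator:", desc.findtext("dc:creator", namespaces=NS))
```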
On other sites (7750)
- Cohort Analysis 101: How-To, Examples & Top Tools
13 November 2023, by Erin (Analytics Tips)
- Overcoming Fintech and Finserv’s Biggest Data Analytics Challenges
Data powers innovation in financial technology (fintech), from personalized banking services to advanced fraud detection systems. Industry leaders recognize the value of strong security measures and customer privacy. A recent survey highlights this focus, with 72% of finance Chief Risk Officers identifying cybersecurity as their primary concern.
Beyond cybersecurity, fintech and financial services (finserv) companies are bogged down with massive amounts of data spread throughout disconnected systems. Between this, a complex regulatory landscape and an increasingly tech-savvy and sceptical consumer base, fintech and finserv companies have a lot on their plates.
How can marketing teams get the information they need while staying focused on compliance and providing customer value?
This article will examine strategies to address common challenges in the finserv and fintech industries. We’ll focus on using appropriate tools, following effective data management practices, and learning from traditional banks’ approaches to similar issues.
What are the biggest fintech data analytics challenges, and how do they intersect with traditional banking?
Recent years have been tough for the fintech industry, especially after the pandemic. This period has brought new hurdles in data analysis and made existing ones more complex. As the market stabilises, both fintech and finserv companies must tackle these evolving data issues.
Let’s examine some of the most significant data analytics challenges facing the fintech industry, starting with an issue that’s prevalent across the financial sector:
1. Battling data silos
In a recent survey by InterSystems, 54% of financial institution leaders said data silos are their biggest barrier to innovation, while 62% said removing silos is their priority data strategy for the next year.
Data silos segregate data repositories across departments, products and other divisions. This is a major issue in traditional banking and something fintech companies should avoid inheriting at all costs.
Siloed data makes it harder for decision-makers to view business performance with 360-degree clarity. It’s also expensive to maintain and operationalise and can evolve into privacy and data compliance issues if left unchecked.
To avoid or remove data silos, develop a data governance framework and centralise your data repositories. Next, simplify your analytics stack into as few integrated tools as possible because complex tech stacks are one of the leading causes of data silos.
Use an analytics system like Matomo that incorporates web analytics, marketing attribution and CRO testing into one toolkit.
Matomo’s support plans help you implement a data system to meet the unique needs of your business and avoid issues like data silos. We also offer data warehouse exporting as a feature to bring all of your web analytics, customer data, support data, etc., into one centralised location.
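As a rough sketch of what that export step can look like in practice, the snippet below pulls a visits summary from the Matomo HTTP Reporting API as JSON, ready to be loaded into a central warehouse. The instance URL, site ID and token are placeholders; production code would send the token via POST rather than in the URL and would handle errors properly.

```python
import json
import urllib.parse
import urllib.request

# Placeholders: point these at your own Matomo instance.
MATOMO_URL = "https://analytics.example.com/index.php"
TOKEN_AUTH = "your_api_token"  # created under your Matomo user's security settings
SITE_ID = "1"

params = urllib.parse.urlencode({
    "module": "API",
    "method": "VisitsSummary.get",  # daily visit totals; other methods cover other reports
    "idSite": SITE_ID,
    "period": "day",
    "date": "last7",
    "format": "JSON",
    "token_auth": TOKEN_AUTH,
})

with urllib.request.urlopen(f"{MATOMO_URL}?{params}") as resp:
    report = json.load(resp)

# 'report' is plain JSON keyed by date, ready for a warehouse table.
for day, metrics in report.items():
    print(day, metrics)
```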
Try Matomo for free today, or contact our sales team to discuss support plans.
2. Compliance with laws and regulations
A survey by Alloy reveals that 93% of fintech companies find it difficult to meet compliance regulations. The cost of staying compliant tops their list of worries (23%), outranking even the financial hit from fraud (21%) – and this in a year marked by cyber threats.
Data privacy laws are constantly changing, and the landscape varies across global regions, making adherence even more challenging for fintechs and traditional banks operating in multiple markets.
In the US market, companies grapple with regulations at both the federal and state levels. Here are some of the state-level laws coming into effect in 2024-2026:
- Oregon – Oregon Consumer Privacy Act (July 1, 2024)
- Florida – Florida Digital Bill of Rights (July 1, 2024)
- Texas – Texas Data Privacy and Security Act (July 1, 2024)
- Montana – Montana Consumer Data Privacy Act (October 1, 2024)
- Delaware – Delaware Personal Data Privacy Act (January 1, 2025)
- Iowa – Iowa Consumer Data Protection Act (January 1, 2025)
- New Jersey – SB 332 (January 15, 2025)
- Tennessee – Tennessee Information Protection Act (July 1, 2025)
- Indiana – Indiana Consumer Data Protection Act (January 1, 2026)
Other countries are also ramping up regional regulations. For instance, Canada has Quebec’s Act Respecting the Protection of Personal Information in the Private Sector and British Columbia’s Personal Information Protection Act (BC PIPA).
Ignorance of country- or region-specific laws will not stop companies from suffering the consequences of violating them.
The only answer is to invest in adherence and manage business growth accordingly. Ultimately, compliance is more affordable than non-compliance – not only in terms of the potential fines but also the potential risks to reputation, consumer trust and customer loyalty.
This is an expensive lesson that fintech and traditional financial companies have had to learn together. GDPR regulators hit CaixaBank S.A., one of Spain’s largest banks, with multiple multi-million euro fines, and fined Klarna Bank AB, a popular Swedish fintech company, €720,000.
To avoid similar fates, companies should :
- Build solid data systems
- Hire compliance experts
- Train their teams thoroughly
- Choose data analytics tools carefully
Remember, even popular tools like Google Analytics aren’t automatically safe. Find out how Matomo helps you gather useful insights while sticking to rules like GDPR.
3. Protecting against data security threats
Cyber threats are increasing in volume and sophistication, with the financial sector becoming the most breached in 2023.
The cybersecurity risks will only worsen, with the WEF estimating annual cybercrime costs of up to US$10.5 trillion globally by 2025, up from US$3 trillion in 2015.
While technology brings new security solutions, it also amplifies existing risks and creates new ones. A 2024 McKinsey report warns that the risk of data breaches will continue to increase as the financial industry grows more reliant on third-party data tools and cloud computing services, unless firms simultaneously improve their security posture.
The reality is that adopting a third-party data system without taking the proper precautions means adopting its security vulnerabilities.
In 2023, the MOVEit data breach affected companies worldwide, including financial institutions using its file transfer system. One hack created a global data crisis, potentially affecting the customer data of every company using this one software product.
The McKinsey report emphasises choosing tools wisely. Why? Because when customer data is compromised, it’s your company that takes the heat, not the tool provider. As the report states:
“Companies need reliable, insightful metrics and reporting (such as security compliance, risk metrics and vulnerability tracking) to prove to regulators the health of their security capabilities and to manage those capabilities.”
Don’t put user or customer data in the hands of companies you can’t trust. Work with providers that care about security as much as you do. With Matomo, you own all of your data, ensuring it’s never used for unknown purposes.
4. Protecting users’ privacy
With security threats increasing, fintech companies and traditional banks must prioritise user privacy protection. Users are also increasingly aware of privacy threats and ready to walk away from companies that lose their trust.
Cisco’s 2023 Data Privacy Benchmark Study reveals some eye-opening statistics:
- 94% of companies said their customers wouldn’t buy from them if their data wasn’t protected, and
- 95% see privacy as a business necessity, not just a legal requirement.
Modern financial companies must balance data collection and management with increasing privacy demands. This may sound contradictory for companies reliant on dated practices like third-party cookies, but they need to learn to thrive in a cookieless web as customers move to banks and service providers that have strong data ethics.
This privacy protection journey starts with implementing web analytics ethically from the very first session.
The most important elements of ethically sound web analytics in fintech are:
- 100% data ownership: Make sure your data isn’t used in other ways by the tools that collect it.
- Respecting user privacy: Only collect the data you absolutely need to do your job and avoid personally identifiable information.
- Regulatory compliance: Stick with solutions built for compliance to stay out of legal trouble.
- Data transparency: Know how your tools use your data and let your customers know how you use it.
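To ground two of these points, here is a small illustrative sketch (the function names are ours, not Matomo’s API) of the kind of server-side IP masking and DoNotTrack handling a privacy-respecting tracker applies before anything is stored:

```python
import ipaddress

def anonymise_ip(ip: str, mask_bytes: int = 2) -> str:
    """Zero the trailing `mask_bytes` bytes of an IPv4 address,
    e.g. 203.0.113.42 -> 203.0.0.0 with a two-byte mask (the mask
    length is a configurable setting in tools like Matomo)."""
    packed = bytearray(ipaddress.IPv4Address(ip).packed)
    for i in range(1, mask_bytes + 1):
        packed[-i] = 0
    return str(ipaddress.IPv4Address(bytes(packed)))

def should_track(request_headers: dict) -> bool:
    """Respect the browser's DoNotTrack signal: skip tracking entirely
    when the DNT header is set to "1"."""
    return request_headers.get("DNT") != "1"

# Example: only record a (masked) visit when the user has not opted out.
headers = {"DNT": "1"}
if should_track(headers):
    print("record visit from", anonymise_ip("203.0.113.42"))
else:
    print("visitor opted out; nothing recorded")
```

Matomo ships both behaviours as built-in settings; the sketch just shows the principle that the raw identifier never needs to reach storage.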
Read our guide to ethical web analytics for more information.
5. Comparing customer trust across industries
While fintech companies are making waves in the financial world, they’re still playing catch-up when it comes to earning customer trust. According to RFI Global, fintech has a consumer trust score of 5.8/10 in 2024, while traditional banking scores 7.6/10.
This trust gap isn’t just about perception – it’s rooted in real issues:
- Security breaches are making headlines more often.
- Privacy regulations like GDPR are making consumers more aware of their rights.
- Some fintech companies are struggling to handle fraud effectively.
According to the UK’s Payment Systems Regulator, digital banking brands Monzo and Starling had some of the highest fraudulent activity rates in 2022. Yet, Monzo only reimbursed 6% of customers who reported suspicious transactions, compared to 70% for NatWest and 91% for Nationwide.
So, what can fintech firms do to close this trust gap?
- Start with privacy-centric analytics from day one. This shows customers you value their privacy from the get-go.
- Build and maintain a long-term reputation free of data leaks and privacy issues. One major breach can undo years of trust-building.
- Learn from traditional banks when it comes to handling issues like fraudulent transactions, identity theft, and data breaches. Prompt, customer-friendly resolutions go a long way.
- Remember: cutting-edge financial technology doesn’t make up for poor customer care. If your digital bank won’t refund customers who’ve fallen victim to credit card fraud, they’ll likely switch to a traditional bank that will.
The fintech sector has made strides in innovation, but there’s still work to do in establishing trustworthiness. By focusing on robust security, transparent practices, and excellent customer service, fintech companies can bridge the trust gap and compete more effectively with traditional banks.
6. Collecting quality data
Adhering to data privacy regulations, protecting user data and implementing ethical analytics raise another challenge: how can companies do all of these things and still collect reliable, quality data?
Google’s answer is using predictive models, but this replaces real data with calculations and guesswork. The worst part is that Google Analytics doesn’t even let you use all of the data you collect in the first place. Instead, it uses something called data sampling once you pass certain thresholds.
In practice, this means that Google Analytics uses a limited set of your data to calculate reports. We’ve discussed GA4 data sampling at length before, but there are two key problems for companies here:
- A sample size that’s too small won’t give you a full representation of your data.
- The more visitors that come to your site, the less accurate your reports will become.
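To see why both problems bite, consider this back-of-the-envelope simulation (illustrative only; GA4’s actual sampling thresholds and mechanics are not public in detail). With a fixed sample size, the sampled share of traffic shrinks as visits grow, and the absolute error in the extrapolated segment count grows with it:

```python
import random

random.seed(7)

def sampled_estimate(events, sample_size):
    """Extrapolate a segment total from a fixed-size random sample,
    scaled back up by the sampling ratio."""
    sample = random.sample(events, sample_size)
    scale = len(events) / sample_size
    return sum(sample) * scale

for n_visits in (10_000, 100_000, 1_000_000):
    # 1 marks a visit in the segment of interest (~3% of traffic here).
    events = [1 if random.random() < 0.03 else 0 for _ in range(n_visits)]
    true_total = sum(events)
    estimate = sampled_estimate(events, sample_size=5_000)
    print(f"{n_visits:>9} visits: true={true_total:>6}, "
          f"sampled estimate={estimate:>9.0f}, "
          f"absolute error={abs(estimate - true_total):>7.0f}")
```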
For high-growth companies, data sampling simply can’t keep up. Financial marketers widely recognise the shortcomings of big tech analytics providers. In fact, 80% of them say they’re concerned about data bias from major providers like Google and Meta affecting valuable insights.
This is precisely why CRO:NYX Digital approached us after discovering Google Analytics wasn’t providing accurate campaign data. We set up an analytics system to suit the company’s needs and tested it alongside Google Analytics for multiple campaigns. In one instance, Google Analytics failed to register 6,837 users in a single day, approximately 9.8% of the total tracked by Matomo.
In another instance, Google Analytics only tracked 600 visitors over 24 hours, while Matomo recorded nearly 71,000 visitors – an 11,700% discrepancy.
Financial companies need a more reliable, privacy-centric alternative to Google Analytics that captures quality data without putting users at potential risk. This is why we built Matomo and why our customers love having total control and visibility of their data.
Unlock the full power of fintech data analytics with Matomo
Fintech companies face many data-related challenges, so compliant web analytics shouldn’t be one of them.
With Matomo, you get :
- An all-in-one solution that handles traditional web analytics, behavioural analytics and more with strong integrations to minimise the likelihood of data siloing
- Full compliance with GDPR, CCPA, PIPL and more
- Complete ownership of your data to minimise cybersecurity risks caused by negligent third parties
- An abundance of ways to protect customer privacy, like IP address anonymisation and respect for DoNotTrack settings
- The ability to import data from Google Analytics and distance yourself from big tech
- High-quality data that doesn’t rely on sampling
- A tool built with financial analytics in mind
Don’t let big tech companies limit the power of your data with sketchy privacy policies and counterintuitive systems like data sampling.
Start your Matomo free trial or request a demo to unlock the full power of fintech data analytics without putting your customers’ personal information at unnecessary risk.
-
Saying Goodbye To Old Machines
I recently sent a few old machines off for recycling. Both had relevance to the early days of the FATE testing effort. As is my custom, I photographed them (poorly, of course).
First, there’s the PowerPC-based Mac Mini I procured thanks to a Craigslist ad in late 2006. I had plans to develop automated FFmpeg building and testing and was already looking ahead toward testing multiple CPU architectures. Again, this was 2006 and PowerPC wasn’t completely on the outs yet – although Apple’s MacTel transition was in full swing, the entire new generation of video game consoles was based on PowerPC.
I remember trying to find a Mac Mini PPC on Craigslist. Many were to be found, but all asked more than the price of even a new Mac Mini Intel, always because the seller was leaving all of last year’s applications and perhaps including a monitor, neither of which I needed. Fortunately, I found this bare Mac Mini. Also fortunate was the fact that it was far easier to install Linux on it than the first PowerPC machine I owned.
After FATE operation transitioned away from me, I still kept the machine in service as an edge server and automated backup machine. That is, until the hard drive failed on reboot one day. Thus, when it was finally time to recycle the computer, I felt it necessary to disassemble the machine and remove the hard drive for possible salvage and then for destruction.
If you’ve ever attempted to upgrade or otherwise service this style of Mac Mini, you will no doubt recognize the pictured paint scraper tool as standard kit. I have had that tool since I first endeavored to upgrade the RAM to 1 GB from the standard 1/2 GB. Performing such activities on a Mac Mini is tedious, but only if you care about putting it back together afterwards.
The next machine is a bit older. I put it together nearly a decade ago, early in 2005. This machine’s original duty was “download agent” – this would be more specifically called a BitTorrent machine in modern tech parlance. Back then, I placed it on someone else’s woefully underutilized home broadband connection (with their permission, of course) when I was too cheap to upgrade from dialup.
This is a small form factor system from VIA that was clearly designed with home theater PC (HTPC) use cases in mind. It has a VIA C3 x86-compatible CPU (according to my notes, Centaur VIA Samuel 2 stepping 03, flags: fpu de tsc msr cx8 mtrr pge mmx 3dnow) and 128 MB of RAM (initially; I upgraded it to 512 MB some years later, just for the sake of doing it). And then there was the 120 GB PATA HD for all that downloaded goodness.
I have specific memories of a time when my main computer at home wasn’t working correctly for one reason or another. Instead, I logged into this machine remotely via SSH to make several optimizations and fixes on FFmpeg’s VP3/Theora video decoder, all from the terminal, without being able to see the decoded images with my own eyes (which is why I insist that even blind people could work on video codecs).
By the time I got my own broadband, I had become inspired to attempt the automated build and test system for FFmpeg. This was the machine I used for prototyping early brainstorms of FATE. By the time I put a basic build/test system into place in early 2008, I had much faster computers that could build and test the project – the obvious limitation of this machine was that it could take at least half an hour to build the entire codebase, and that was the codebase as it stood 8 years ago.
So the machine got stuffed in a closet somewhere along the line. The next time I pulled it out was in 2010 when I wanted to toy with Dreamcast programming once more (the machine appears in one of the photos in this post). This was the only machine I still owned which still had an RS-232 serial port (I didn’t know much about USB serial converters yet), plus it still had a bunch of pre-compiled DC homebrew binaries (I was having trouble getting the toolchain to work right).
The next time I dusted off this machine was late last year when I was trying some experiments with the Microsoft Xbox’s IDE drive (a photo in that post also shows the machine; this thing shows up a lot on this blog). The VIA machine was the only machine I still owned with 40-pin IDE connectors, which was crucial to the experiment.
At this point, I was trying to make the machine more useful, which meant replacing the ancient Gentoo Linux distribution as well as simply interacting with it via a keyboard and mouse. I have a long Evernote entry documenting a comedy of errors revolving around this little box. The interaction troubles were due to the fact that I didn’t have any PS/2 keyboards left and I couldn’t make a USB keyboard work with it. Diego was able to explain that I needed to flip a bit in the BIOS to address this, which worked. As for upgrading the OS, I tried numerous Linux distributions large and small, mostly focusing on the small. None worked. I eventually learned that, while I was trying to use i686 distributions, this machine did not actually qualify as an i686 CPU; installations usually booted but failed because the default kernel required the cmov instruction. I was advised to try i386 distros instead. My notes don’t indicate whether I had any luck on this front before I gave up and moved on.
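For the record, checking whether a CPU advertises cmov is easy from a running system; here is a minimal Linux-only sketch, reading the same flags line quoted from the notes above:

```python
# Minimal Linux-only check for the cmov flag that stock i686 kernels
# assume; the VIA C3's flags list above notably lacks it.
with open("/proc/cpuinfo") as cpuinfo:
    flags = set()
    for line in cpuinfo:
        if line.startswith("flags"):
            flags = set(line.split(":", 1)[1].split())
            break

print("cmov present" if "cmov" in flags
      else "no cmov: default i686 kernels will fail; use an i386 build")
```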
I just made the connection that this VIA machine has two 40-pin IDE connectors, which means that the thing was technically capable of supporting up to 4 IDE devices. Obviously, the computer couldn’t really accommodate that in terms of space or power. When I wanted to try installing a new OS, I needed to take off the top and connect a rather bulky IDE CD-ROM drive. This computer’s casing was supposed to be able to support a slimline optical drive (perhaps like the type found in laptops), but I could never quite visualize how that was supposed to work, space-wise. When I disassembled the PowerPC Mac Mini, I realized I might be able to repurpose that machine’s optical drive for this computer. Obviously, I thought better of trying since both machines are off to the recycle pile.
I would still like to work on the Xbox project a bit more, but I procured a different, unused, much more powerful yet still old computer that has a motherboard with 1 PATA connector in addition to 6 SATA connectors. If I ever get around to toying with Linux kernel development, this should be a much more appropriate platform to use.
I thought about turning this machine into an old Windows XP (and lower, down to Windows 3.1) gaming platform; the capabilities of the machine would probably be perfect for a huge portion of my Windows game collection. But I think the lack of an optical drive renders this idea intractable. External USB drives are likely out of the question since there is very little chance that this motherboard featured USB 2.0 (the specs don’t mention 2.0, so the USB ports are probably 1.1).
So it is with fond memories that I send off both machines, sans hard drives, to the recycle pile. I’m still deciding on an appropriate course of action for failed hard drives, though.