Other articles (19)

  • Other interesting software

    13 April 2011, by

    We don’t claim to be the only ones doing what we do, and we certainly don’t claim to be the best. We simply try to do what we do well, and to keep getting better.
    The following list covers software that is more or less similar to MediaSPIP, or that MediaSPIP more or less tries to emulate, in one way or another.
    We don’t know these projects and haven’t tried them, but you can take a peek.
    Videopress
    Website: http://videopress.com/
    License: GNU/GPL v2
    Source code: (...)

  • No talk of markets, clouds, etc.

    10 April 2011

    The vocabulary used on this site tries to avoid any reference to the buzzwords that flourish
    across Web 2.0 and in the businesses that live off it.
    You are therefore invited to avoid terms such as "Brand", "Cloud", "Market", etc.
    Our motivation is above all to create a simple tool, accessible to everyone, that encourages
    the sharing of creative work on the Internet and lets authors keep as much autonomy as possible.
    No "Gold or Premium contract" is therefore planned, no (...)

  • Media quality after processing

    21 June 2013, by

    Properly configuring the software that processes media matters for striking a balance between the parties involved (the host’s bandwidth, media quality for the author and the visitor, accessibility for the visitor). So how should you set the quality of your media?
    The higher the media quality, the more bandwidth is used, and visitors on slow connections will have to wait longer. Conversely, the lower the quality, the more degraded the media becomes (...)

On other sites (3125)

  • Lean Analytics in a Privacy-First Environment – Bootcamp with Timo Dechau

    In a recent bootcamp, Timo Dechau walked attendees through his approach to data and measurement in privacy-focused analytics environments. He demonstrated how to shift from a chaotic, ‘track-it-all’ mentality to a focused method that prioritises quality over quantity. This post summarises some of his key privacy-first analytics ideas, but be sure to check out the on-demand video for more detail.

    Watch the bootcamp on demand

    The consequences of more data are missing and incomplete data that mess up attribution and measurement.

    Unrestrained data collection leads to data bloat

    Marketing and the business world are experiencing a data problem. Analysts and business intelligence teams grapple with large amounts of data that aren’t always useful and are often incomplete. The idea that “more data is better” became a guiding principle in the early 2000s, encouraging companies to gather everything possible using all available data collection methods. This unrestrained pursuit often led to an unexpected problem: data bloat. Too much data, too little clarity. Digital marketers, analysts, and business leaders now try to navigate vast amounts of information that create more confusion than insight, especially when the data is incomplete due to privacy regulations.

    Cutting through the noise, focusing on what matters

    The “more data is better” mindset emerged when digital marketers were beginning to understand data’s potential. It seemed logical: more data should mean more opportunities to optimise, personalise, and drive results. But in practice, gathering every possible piece of data often leads to a cluttered, confusing pile of metrics that can mislead more than guide.

    This approach carries hidden costs. Excessive data collection burns resources, increases privacy concerns, and leaves teams unfocused. It’s easy to get lost trying to make sense of endless dashboards, metrics, and reports. More data doesn’t necessarily lead to better decisions; it often just leads to more noise, hindering effective data management.

    Rethinking data management: From data overload to data mindfulness

    Data management has often prioritised comprehensive data gathering without considering the specific value of each data point. This approach has created more information, but not necessarily better insights.

    Data mindfulness is about taking a deliberate, focused approach to data collection and analysis. Instead of trying to collect everything, it emphasises gathering only what truly adds value. It’s about ensuring the data you collect serves a purpose and directly contributes to better insights and data-driven decision-making.

    Think of it like applying a “lean” methodology to data—trimming away the unnecessary and keeping only what is essential. Or consider embracing data minimalism to declutter your data warehouse, keeping only what truly sparks insight.

    Mindful data is ethical data

    Adopting a mindful approach to data can pay off in several ways:

    • Reduces overwhelm: When you reduce the clutter, you’re left with fewer, clearer metrics that lead to stronger decisions and actionable data insights.

    • Mitigates compliance risks: By collecting less, companies align better with privacy regulations and build trust with their customers. Privacy-first analytics and privacy-compliant analytics practices mean there’s no need for invasive tracking if it doesn’t add value—and customers will appreciate that.

    • Enhances data ethics: Focusing on the quality rather than the quantity of data collected ensures ethical data collection and management. Companies use data responsibly, respect user privacy, and minimise unnecessary data handling, strengthening customer relationships and brand integrity.

    • Improves data efficiency: Focused analytics means better use of resources. You’re spending less time managing meaningless metrics and more time working on meaningful insights. Many companies have found success by switching to a leaner, quality-first data approach, reporting sharper, more impactful results.

    Shifting towards simplicity and lean analytics

    If data mindfulness sounds appealing, here’s how you can get started:

    1. Ask the right questions. Before collecting any data, ask yourself: Why are we collecting this? How will it drive value? If you can’t answer these questions clearly, that data probably isn’t worth collecting. This is a key step in smart data management.

    2. Simplify metrics. Focus on the KPIs that truly matter for your business. Choose a handful of key metrics that reflect your goals rather than a sprawling list of nice-to-haves. Embracing data simplicity helps in targeting data collection effectively.

    3. Audit your current data. Review your existing data collection processes. Which metrics are you actively using to make decisions? Eliminate any redundant or low-value metrics that create noise, and use ethical data management practices to ensure data efficiency and compliance.

    4. Implement lean analytics practices. Shift towards lean analytics by cutting down on unnecessary tracking. This can involve reducing reliance on multiple tracking scripts, simplifying your reporting, and setting up a streamlined dashboard focused on key outcomes. Embrace data reduction strategies to eliminate waste and boost effectiveness.

    Who should watch this bootcamp

    This bootcamp is perfect for data analysts, product managers, digital marketers and business leaders who are seeking a more streamlined approach to data measurement. If you’re interested in moving away from a chaotic “track-it-all” mentality and towards a focused, lean, and privacy-first analytics strategy, this workshop is for you.

    What you’ll discover

    • Practical steps: Learn actionable strategies to reduce data bloat and implement lean, privacy-first analytics in your organisation.

    • Real-life examples: Explore case studies of companies that have successfully adopted focused and privacy-first analytics.

    • Deep insights: Gain a deeper understanding of how to prioritise quality over quantity without sacrificing valuable insights.

    Watch the bootcamp on demand

    For a comprehensive dive into these topics, watch the full workshop video or download the detailed transcript. Equip yourself with the knowledge and tools to transform your data management approach today.

  • Overcoming Fintech and Finserv’s Biggest Data Analytics Challenges

    13 September 2024, by Daniel Crough — Banking and Financial Services, Marketing, Security

    Data powers innovation in financial technology (fintech), from personalized banking services to advanced fraud detection systems. Industry leaders recognize the value of strong security measures and customer privacy. A recent survey highlights this focus, with 72% of finance Chief Risk Officers identifying cybersecurity as their primary concern.

    Beyond cybersecurity, fintech and financial services (finserv) companies are bogged down with massive amounts of data spread throughout disconnected systems. Between this, a complex regulatory landscape, and an increasingly tech-savvy and sceptical consumer base, fintech and finserv companies have a lot on their plates.

    How can marketing teams get the information they need while staying focused on compliance and providing customer value?

    This article will examine strategies to address common challenges in the finserv and fintech industries. We’ll focus on using appropriate tools, following effective data management practices, and learning from traditional banks’ approaches to similar issues.

    What are the biggest fintech data analytics challenges, and how do they intersect with traditional banking?

    Recent years have been tough for the fintech industry, especially after the pandemic. This period has brought new hurdles in data analysis and made existing ones more complex. As the market stabilises, both fintech and finserv companies must tackle these evolving data issues.

    Let’s examine some of the most significant data analytics challenges facing the fintech industry, starting with an issue that’s prevalent across the financial sector:

    1. Battling data silos

    In a recent survey by InterSystems, 54% of financial institution leaders said data silos are their biggest barrier to innovation, while 62% said removing silos is their priority data strategy for the next year.

    a graphic highlighting fintech concerns about siloed data

    Data silos segregate data repositories across departments, products and other divisions. This is a major issue in traditional banking and something fintech companies should avoid inheriting at all costs.

    Siloed data makes it harder for decision-makers to view business performance with 360-degree clarity. It’s also expensive to maintain and operationalise and can evolve into privacy and data compliance issues if left unchecked.

    To avoid or remove data silos, develop a data governance framework and centralise your data repositories. Next, simplify your analytics stack into as few integrated tools as possible because complex tech stacks are one of the leading causes of data silos.

    Use an analytics system like Matomo that incorporates web analytics, marketing attribution and CRO testing into one toolkit.

    A screenshot of Matomo web analytics

    Matomo’s support plans help you implement a data system to meet the unique needs of your business and avoid issues like data silos. We also offer data warehouse exporting as a feature to bring all of your web analytics, customer data, support data, etc., into one centralised location.
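
    As a rough illustration, here is a minimal sketch of what pulling a report out of Matomo and into a central store can look like, using the HTTP Reporting API (just one way to do it; the instance URL, site ID and token below are placeholders, and the warehouse-loading step is left to whatever pipeline you already run):

    // Sketch: fetch yesterday's visit summary from the Matomo Reporting API.
    // The instance URL, site ID and token are placeholder values.
    // Run as an ES module on Node.js 18+ (global fetch, top-level await).
    const MATOMO_URL = 'https://analytics.example.com/index.php';
    const params = new URLSearchParams({
      module: 'API',
      method: 'VisitsSummary.get', // any Reporting API method works the same way
      idSite: '1',
      period: 'day',
      date: 'yesterday',
      format: 'JSON',
      token_auth: 'YOUR_TOKEN_AUTH',
    });

    const response = await fetch(`${MATOMO_URL}?${params}`);
    const report = await response.json();
    // Hand the JSON report to your warehouse loader of choice here.
    console.log(`${report.nb_visits} visits yesterday`);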

    Try Matomo for free today, or contact our sales team to discuss support plans.

    2. Compliance with laws and regulations

    A survey by Alloy reveals that 93% of fintech companies find it difficult to meet compliance regulations. The cost of staying compliant tops their list of worries (23%), outranking even the financial hit from fraud (21%) – and this in a year marked by cyber threats.

    a bar chart shows the top concerns of fintech regulation compliance

    Data privacy laws are constantly changing, and the landscape varies across global regions, making adherence even more challenging for fintechs and traditional banks operating in multiple markets. 

    In the US market, companies grapple with regulations at both federal and state levels. Here is some of the state-level legislation coming into effect for 2024-2026:

    Other countries are also ramping up regional regulations. For instance, Canada has Quebec’s Act Respecting the Protection of Personal Information in the Private Sector and British Columbia’s Personal Information Protection Act (BC PIPA).

    Ignorance of country- or region-specific laws will not stop companies from suffering the consequences of violating them.

    The only answer is to invest in adherence and manage business growth accordingly. Ultimately, compliance is more affordable than non-compliance – not only in terms of the potential fines but also the potential risks to reputation, consumer trust and customer loyalty.

    This is an expensive lesson that fintech and traditional financial companies have had to learn together. GDPR regulators have hit CaixaBank S.A., one of Spain’s largest banks, with multiple multi-million-euro fines, and fined Klarna Bank AB, a popular Swedish fintech company, €720,000.

    To avoid similar fates, companies should:

    1. Build solid data systems
    2. Hire compliance experts
    3. Train their teams thoroughly
    4. Choose data analytics tools carefully

    Remember, even popular tools like Google Analytics aren’t automatically safe. Find out how Matomo helps you gather useful insights while sticking to rules like GDPR.

    3. Protecting against data security threats

    Cyber threats are increasing in volume and sophistication, with the financial sector becoming the most breached in 2023.

    a bar chart showing the percentage of data breaches per industry from 2021 to 2023

    The cybersecurity risks will only worsen, with the WEF estimating annual cybercrime costs of up to US$10.5 trillion globally by 2025, up from US$3 trillion in 2015.

    While technology brings new security solutions, it also amplifies existing risks and creates new ones. A 2024 McKinsey report warns that the risk of data breaches will continue to increase as the financial industry increasingly relies on third-party data tools and cloud computing services unless they simultaneously improve their security posture.

    The reality is that adopting a third-party data system without taking the proper precautions means adopting its security vulnerabilities.

    In 2023, the MOVEit data breach affected companies worldwide, including financial institutions using its file transfer system. One hack created a global data crisis, potentially affecting the customer data of every company using this one software product.

    The McKinsey report emphasises choosing tools wisely. Why? Because when customer data is compromised, it’s your company that takes the heat, not the tool provider. As the report states:

    “Companies need reliable, insightful metrics and reporting (such as security compliance, risk metrics and vulnerability tracking) to prove to regulators the health of their security capabilities and to manage those capabilities.”

    Don’t put user or customer data in the hands of companies you can’t trust. Work with providers that care about security as much as you do. With Matomo, you own all of your data, ensuring it’s never used for unknown purposes.

    A screenshot of Matomo visitor reporting

    4. Protecting users’ privacy

    With security threats increasing, fintech companies and traditional banks must prioritise user privacy protection. Users are also increasingly aware of privacy threats and ready to walk away from companies that lose their trust.

    Cisco’s 2023 Data Privacy Benchmark Study reveals some eye-opening statistics:

    • 94% of companies said their customers wouldn’t buy from them if their data wasn’t protected, and 
    • 95% see privacy as a business necessity, not just a legal requirement.

    Modern financial companies must balance data collection and management with increasing privacy demands. This may sound contradictory for companies reliant on dated practices like third-party cookies, but they need to learn to thrive in a cookieless web as customers move to banks and service providers that have strong data ethics.

    This privacy protection journey starts with implementing web analytics ethically from the very first session.

    A graphic showing the four key elements of ethical web analytics: 100% data ownership, respecting user privacy, regulatory compliance and data transparency

    The most important elements of ethically sound web analytics in fintech are as follows (a minimal tracking snippet illustrating them appears after the list):

    1. 100% data ownership: Make sure your data isn’t used in other ways by the tools that collect it.
    2. Respecting user privacy: Only collect the data you absolutely need to do your job and avoid personally identifiable information.
    3. Regulatory compliance: Stick with solutions built for compliance to stay out of legal trouble.
    4. Data transparency: Know how your tools use your data and let your customers know how you use it.
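
    As a sketch of what this looks like in practice, here is the standard Matomo JavaScript embed with its built-in privacy options switched on. The analytics URL and site ID are placeholders, and whether you disable cookies entirely depends on your own setup:

    var _paq = window._paq = window._paq || [];
    _paq.push(['setDoNotTrack', true]); // honour the browser's Do Not Track setting
    _paq.push(['disableCookies']);      // track without setting any cookies
    _paq.push(['trackPageView']);
    _paq.push(['enableLinkTracking']);
    (function () {
      var u = 'https://analytics.example.com/'; // placeholder Matomo instance URL
      _paq.push(['setTrackerUrl', u + 'matomo.php']);
      _paq.push(['setSiteId', '1']);            // placeholder site ID
      var d = document, g = d.createElement('script'), s = d.getElementsByTagName('script')[0];
      g.async = true; g.src = u + 'matomo.js';
      s.parentNode.insertBefore(g, s);
    })();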

    Read our guide to ethical web analytics for more information.

    5. Comparing customer trust across industries 

    While fintech companies are making waves in the financial world, they’re still playing catch-up when it comes to earning customer trust. According to RFI Global, fintech has a consumer trust score of 5.8/10 in 2024, while traditional banking scores 7.6/10.

    a comparison of consumer trust in fintech vs traditional finance

    This trust gap isn’t just about perception – it’s rooted in real issues:

    • Security breaches are making headlines more often.
    • Privacy regulations like GDPR are making consumers more aware of their rights.
    • Some fintech companies are struggling to handle fraud effectively.

    According to the UK’s Payment Systems Regulator, digital banking brands Monzo and Starling had some of the highest fraudulent activity rates in 2022. Yet, Monzo only reimbursed 6% of customers who reported suspicious transactions, compared to 70% for NatWest and 91% for Nationwide.

    So, what can fintech firms do to close this trust gap?

    • Start with privacy-centric analytics from day one. This shows customers you value their privacy from the get-go.
    • Build and maintain a long-term reputation free of data leaks and privacy issues. One major breach can undo years of trust-building.
    • Learn from traditional banks when it comes to handling issues like fraudulent transactions, identity theft, and data breaches. Prompt, customer-friendly resolutions go a long way.
    • Remember: cutting-edge financial technology doesn’t make up for poor customer care. If your digital bank won’t refund customers who’ve fallen victim to credit card fraud, they’ll likely switch to a traditional bank that will.

    The fintech sector has made strides in innovation, but there’s still work to do in establishing trustworthiness. By focusing on robust security, transparent practices, and excellent customer service, fintech companies can bridge the trust gap and compete more effectively with traditional banks.

    6. Collecting quality data

    Adhering to data privacy regulations, protecting user data and implementing ethical analytics raises another challenge. How can companies do all of these things and still collect reliable, quality data?

    Google’s answer is using predictive models, but this replaces real data with calculations and guesswork. The worst part is that Google Analytics doesn’t even let you use all of the data you collect in the first place. Instead, it uses something called data sampling once you pass certain thresholds.

    In practice, this means that Google Analytics uses a limited set of your data to calculate reports. We’ve discussed GA4 data sampling at length before, but there are two key problems for companies here:

    1. A sample size that’s too small won’t give you a full representation of your data.
    2. The more visitors that come to your site, the less accurate your reports will become.

    For high-growth companies, data sampling simply can’t keep up. Financial marketers widely recognise the shortcomings of big tech analytics providers. In fact, 80% of them say they’re concerned about data bias from major providers like Google and Meta affecting valuable insights.

    This is precisely why CRO:NYX Digital approached us after discovering Google Analytics wasn’t providing accurate campaign data. We set up an analytics system to suit the company’s needs and tested it alongside Google Analytics for multiple campaigns. In one instance, Google Analytics failed to register 6,837 users in a single day, approximately 9.8% of the total tracked by Matomo.

    In another instance, Google Analytics only tracked 600 visitors over 24 hours, while Matomo recorded nearly 71,000 visitors – an 11,700% discrepancy.

    a data visualisation showing the discrepancy in Matomo's reporting vs Google Analytics

    Financial companies need a more reliable, privacy-centric alternative to Google Analytics that captures quality data without putting users at potential risk. This is why we built Matomo and why our customers love having total control and visibility of their data.

    Unlock the full power of fintech data analytics with Matomo

    Fintech companies face many data-related challenges, so compliant web analytics shouldn’t be one of them. 

    With Matomo, you get:

    • An all-in-one solution that handles traditional web analytics, behavioural analytics and more with strong integrations to minimise the likelihood of data siloing
    • Full compliance with GDPR, CCPA, PIPL and more
    • Complete ownership of your data to minimise cybersecurity risks caused by negligent third parties
    • An abundance of ways to protect customer privacy, like IP address anonymisation and respect for DoNotTrack settings
    • The ability to import data from Google Analytics and distance yourself from big tech
    • High-quality data that doesn’t rely on sampling
    • A tool built with financial analytics in mind

    Don’t let big tech companies limit the power of your data with sketchy privacy policies and counterintuitive systems like data sampling. 

    Start your Matomo free trial or request a demo to unlock the full power of fintech data analytics without putting your customers’ personal information at unnecessary risk.

  • D3D11 hardware screen recording with ffmpeg using Intel H264 QSV hardware encoding

    13 May 2024, by Cactus

    I'm trying to screen record using ffmpeg and having it all done via hardware on my Intel integrated graphics.

    Specs

    For reference, here are my specs:

    • Windows 11 Pro Version 10.0.22631 Build 22631
    • Model: Dell Inc. OptiPlex 3090
    • Processor: 11th Gen Intel(R) Core(TM) i3-1115G4 @ 3.00GHz, 2995 MHz, 2 Core(s), 4 Logical Processor(s)
    • Terminal: Windows Terminal using Windows PowerShell

    In this wiki, it looked like I could use the "Windows 8+ Desktop Duplication API" to be "directly encoded by a compatible hardware encoder" — so I thought I could use h264_qsv for encoding, letting me record without much performance overhead.

    Attempt 1

    I've tried copying one of the command examples, but it didn't work:

    ffmpeg -init_hw_device d3d11va -filter_complex ddagrab=0

    Output:

    ffmpeg version 2024-05-02-git-71669f2ad5-full_build-www.gyan.dev Copyright (c) 2000-2024 the FFmpeg developers
      built with gcc 13.2.0 (Rev5, Built by MSYS2 project)
      configuration: --enable-gpl --enable-version3 --enable-static --disable-w32threads --disable-autodetect --enable-fontconfig --enable-iconv --enable-gnutls --enable-libxml2 --enable-gmp --enable-bzlib --enable-lzma --enable-libsnappy --enable-zlib --enable-librist --enable-libsrt --enable-libssh --enable-libzmq --enable-avisynth --enable-libbluray --enable-libcaca --enable-sdl2 --enable-libaribb24 --enable-libaribcaption --enable-libdav1d --enable-libdavs2 --enable-libuavs3d --enable-libxevd --enable-libzvbi --enable-librav1e --enable-libsvtav1 --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs2 --enable-libxeve --enable-libxvid --enable-libaom --enable-libjxl --enable-libopenjpeg --enable-libvpx --enable-mediafoundation --enable-libass --enable-frei0r --enable-libfreetype --enable-libfribidi --enable-libharfbuzz --enable-liblensfun --enable-libvidstab --enable-libvmaf --enable-libzimg --enable-amf --enable-cuda-llvm --enable-cuvid --enable-dxva2 --enable-d3d11va --enable-d3d12va --enable-ffnvcodec --enable-libvpl --enable-nvdec --enable-nvenc --enable-vaapi --enable-libshaderc --enable-vulkan --enable-libplacebo --enable-opencl --enable-libcdio --enable-libgme --enable-libmodplug --enable-libopenmpt --enable-libopencore-amrwb --enable-libmp3lame --enable-libshine --enable-libtheora --enable-libtwolame --enable-libvo-amrwbenc --enable-libcodec2 --enable-libilbc --enable-libgsm --enable-libopencore-amrnb --enable-libopus --enable-libspeex --enable-libvorbis --enable-ladspa --enable-libbs2b --enable-libflite --enable-libmysofa --enable-librubberband --enable-libsoxr --enable-chromaprint
      libavutil      59. 16.101 / 59. 16.101
      libavcodec     61.  5.103 / 61.  5.103
      libavformat    61.  3.103 / 61.  3.103
      libavdevice    61.  2.100 / 61.  2.100
      libavfilter    10.  2.101 / 10.  2.101
      libswscale      8.  2.100 /  8.  2.100
      libswresample   5.  2.100 /  5.  2.100
      libpostproc    58.  2.100 / 58.  2.100
    Stream mapping:
      ddagrab:default -> Stream #0:0 (h264_qsv)
    Press [q] to stop, [?] for help
    Impossible to convert between the formats supported by the filter 'Parsed_ddagrab_0' and the filter 'auto_scale_0'
    [fc#0 @ 000001e4ed4393c0] Error configuring filter graph: Function not implemented
    [fc#0 @ 000001e4ed4393c0] Task finished with error code: -40 (Function not implemented)
    [fc#0 @ 000001e4ed4393c0] Terminating thread with return code -40 (Function not implemented)
    [vost#0:0/h264_qsv @ 000001e4ed454000] Could not open encoder before EOF
    [vost#0:0/h264_qsv @ 000001e4ed454000] Task finished with error code: -22 (Invalid argument)
    [vost#0:0/h264_qsv @ 000001e4ed454000] Terminating thread with return code -22 (Invalid argument)
    [out#0/matroska @ 000001e4ed453280] Nothing was written into output file, because at least one of its streams received no packets.
    frame=    0 fps=0.0 q=0.0 Lsize=       0KiB time=N/A bitrate=N/A speed=N/A
    Conversion failed!

    Attempt 2

    I then tried copying one of the command examples on the linked ddagrab wiki page, but again had no luck.

    ffmpeg -f lavfi -i ddagrab -c:v h264_qsv output.mkv

    Output:

    ffmpeg version 2024-05-02-git-71669f2ad5-full_build-www.gyan.dev Copyright (c) 2000-2024 the FFmpeg developers
      (build configuration and library versions identical to the first attempt)
    Input #0, lavfi, from 'ddagrab':
      Duration: N/A, start: 0.000000, bitrate: N/A
      Stream #0:0: Video: wrapped_avframe, d3d11, 1920x1080 [SAR 1:1 DAR 16:9], 30 fps, 30 tbr, 1000k tbn
    File 'output.mkv' already exists. Overwrite? [y/N] y
    Stream mapping:
      Stream #0:0 -> #0:0 (wrapped_avframe (native) -> h264 (h264_qsv))
    Press [q] to stop, [?] for help
    Impossible to convert between the formats supported by the filter 'Parsed_null_0' and the filter 'auto_scale_0'
    [vf#0:0 @ 00000115e105bd40] Error reinitializing filters!
    [vf#0:0 @ 00000115e105bd40] Task finished with error code: -40 (Function not implemented)
    [vf#0:0 @ 00000115e105bd40] Terminating thread with return code -40 (Function not implemented)
    [vost#0:0/h264_qsv @ 00000115e105aa00] Could not open encoder before EOF
    [vost#0:0/h264_qsv @ 00000115e105aa00] Task finished with error code: -22 (Invalid argument)
    [vost#0:0/h264_qsv @ 00000115e105aa00] Terminating thread with return code -22 (Invalid argument)
    [out#0/matroska @ 00000115e1054b80] Nothing was written into output file, because at least one of its streams received no packets.
    frame=    0 fps=0.0 q=0.0 Lsize=       0KiB time=N/A bitrate=N/A speed=N/A
    Conversion failed!

    My main problem is that I don't really know how to use any of these encoders, and the examples I found above don't work for me. What command should I use to screen record with hardware acceleration on my Intel system? I don't care about any presets or other options at the moment.
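
    (From what I've gathered so far, the failure seems to be that ddagrab outputs d3d11 frames that never get converted into the qsv format the encoder expects, so something along these lines, deriving a QSV device from the D3D11 one via hwmap, might be needed. I haven't confirmed it works, and the device name "dx" is just a label I picked:)

    ffmpeg -init_hw_device d3d11va=dx -filter_hw_device dx -filter_complex "ddagrab=0,hwmap=derive_device=qsv,format=qsv" -c:v h264_qsv output.mkv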

    Is there also a way to record audio? When I tried to use -f dshow -i audio="Stereo Mix (Realtek(R) Audio)" by itself, which I made sure was turned on in Windows' settings and named correctly, it didn't work (is it because I'm using Bluetooth headphones that aren't connected to Realtek's system?). Should this be asked as a separate question?
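
    (One check I still plan to do is listing the devices ffmpeg actually sees, to make sure the audio device name matches the -i audio=... string exactly:)

    ffmpeg -list_devices true -f dshow -i dummy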
