
Other articles (23)

  • Publish on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, provided your installed MédiaSpip is at version 0.2 or higher. If in doubt, contact your MédiaSpip administrator to find out.

  • XMP PHP

    13 May 2011

    According to Wikipedia, XMP stands for:
    Extensible Metadata Platform, or XMP, an XML-based metadata format used in PDF, photography, and graphics applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
    Being based on XML, it handles a set of dynamic tags for use within the Semantic Web.
    XMP makes it possible to record information about a file as an XML document: title, author, history (...)

  • The plugin: Podcasts.

    14 July 2010

    The problem of podcasting is, once again, one that reveals the state of standardisation of data transport on the Internet.
    Two interesting formats exist: the one developed by Apple, strongly geared towards iTunes, whose spec is here; and the "Media RSS Module" format, which is more "open" and backed notably by Yahoo and the Miro software.
    File types supported in feeds
    Apple's format only allows the following formats in its feeds: .mp3 audio/mpeg .m4a audio/x-m4a .mp4 (...)

On other sites (5250)

  • FFmpeg - Max rtbufsize via dshow?

    14 September 2018, by Nimble

    I recently added an additional 4K capture card to my setup, and now I’m dropping frames while initializing and ending recordings. In the past I was encoding a 1080p60 stream and a 4K60 stream simultaneously with no issues, but now that I am trying to encode two 4K60 streams at once I’m dropping frames as mentioned above.

    The error displays as:

    [dshow @ 000001499bb17180] real-time buffer [Video (00 Pro Capture HDMI 4K+)] [video input] too full or near too full (62% of size: 2147480000 [rtbufsize parameter])! frame dropped!

    or

    [dshow @ 00000149944e7080] real-time buffer [AVerMedia HD Capture GC573 1] [video input] too full or near too full (62% of size: 2147480000 [rtbufsize parameter])! frame dropped!

    This repeats 10–20 times when starting or ending a recording.

    You’d think the solution would simply be to increase rtbufsize, but when I do I just get another error:

    [dshow @ 00000250df6c7080] Value 3000000000.000000 for parameter 'rtbufsize' out of range [0 - 2.14748e+09]
    [dshow @ 00000250df6c7080] Error setting option rtbufsize to value 3000M.
    video=AVerMedia HD Capture GC573 1:audio=SPDIF/ADAT (1+2) (RME Fireface UC): Result too large

    The same error appears whenever I try to increase rtbufsize past 2147.48M on any input, so I assume it’s a limitation of FFmpeg and not of my hardware? If it is a baked-in limitation of FFmpeg, what is the reasoning behind it? Is there any way to bypass it, or are there other possible solutions?
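    For what it’s worth, the 2.14748e+09 upper bound in that second error is just INT_MAX (2^31 - 1), which suggests rtbufsize is parsed into a signed 32-bit integer, i.e. the cap is baked into the option itself rather than into my hardware. One untested workaround I may try is splitting the two 4K captures into separate ffmpeg processes, so each dshow graph gets its own buffer and the startup/teardown stall of one can’t drain the other. A rough sketch, with the device names from above and the encoder settings trimmed for brevity:

    ffmpeg -y -f dshow -rtbufsize 2147.48M -video_size 3840x2160 -framerate 60 -pixel_format nv12 `
    -i video="Video (00 Pro Capture HDMI 4K+)" `
    -c:v h264_nvenc -preset llhp -pix_fmt nv12 -b:v 250M Magewell.ts

    ffmpeg -y -f dshow -rtbufsize 2147.48M -video_size 3840x2160 -framerate 60 -pixel_format nv12 `
    -i video="AVerMedia HD Capture GC573 1" `
    -c:v h264_nvenc -preset llhp -pix_fmt nv12 -b:v 250M Camera.ts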

    Full command:

    ffmpeg -y -hide_banner -thread_queue_size 9999 -indexmem 9999 -guess_layout_max 0 -f dshow -rtbufsize 2147.48M `
    -i audio="Analog (1+2) (RME Fireface UC)" `
    -thread_queue_size 9999 -indexmem 9999 -guess_layout_max 0 -f dshow -rtbufsize 2147.48M `
    -i audio="ADAT (5+6) (RME Fireface UC)" `
    -thread_queue_size 9999 -indexmem 9999 -guess_layout_max 0 -f dshow -video_size 3840x2160 -rtbufsize 2147.48M `
    -framerate 60 -pixel_format nv12 -i video="Video (00 Pro Capture HDMI 4K+)":audio="ADAT (3+4) (RME Fireface UC)" `
    -thread_queue_size 9999 -indexmem 9999 -guess_layout_max 0 -f dshow -video_size 3840x2160 -rtbufsize 2147.48M `
    -framerate 60 -pixel_format nv12 -i video="AVerMedia HD Capture GC573 1":audio="SPDIF/ADAT (1+2) (RME Fireface UC)" `
    -thread_queue_size 9999 -indexmem 9999 -r 25 -f lavfi -rtbufsize 2147.48M -i color=c=black:s=50x50 `
    -map 4,0 -map 0 -c:v libx264 -r 25 -rc-lookahead 50 -forced-idr 1 -sc_threshold 0 -flags +cgop `
    -force_key_frames "expr:gte(t,n_forced*2)" -preset ultrafast -pix_fmt nv12 -b:v 16K -minrate 16K -maxrate 16K -bufsize 16k `
    -c:a aac -ar 44100 -b:a 384k -ac 2 -af "aresample=async=250" -vsync 1 -ss 00:00:01.768 `
    -max_muxing_queue_size 9999 -f segment -segment_time 600 -segment_wrap 9 -reset_timestamps 1 `
    -segment_format_options max_delay=0 C:\Users\djcim\Videos\Main\Discord\Discord%02d.ts `
    -map 4,1 -map 1 -c:v libx264 -r 25 -rc-lookahead 50 -forced-idr 1 -sc_threshold 0 -flags +cgop `
    -force_key_frames "expr:gte(t,n_forced*2)" -preset ultrafast -pix_fmt nv12 -b:v 16K -minrate 16K -maxrate 16K -bufsize 16k `
    -c:a aac -ar 44100 -b:a 384k -ac 2 -af "aresample=async=250" -vsync 1 -ss 00:00:01.071 `
    -max_muxing_queue_size 9999 -f segment -segment_time 600 -segment_wrap 9 -reset_timestamps 1 `
    -segment_format_options max_delay=0 C:\Users\djcim\Videos\Main\Soundboard\Soundboard%02d.ts `
    -map 2:0,2:1 -map 2:1 -c:v h264_nvenc -r 60 -rc-lookahead 120 -forced-idr 1 -strict_gop 1 -sc_threshold 0 -flags +cgop `
    -force_key_frames "expr:gte(t,n_forced*2)" -preset llhp -pix_fmt nv12 -b:v 250M -minrate 250M -maxrate 250M -bufsize 250M `
    -c:a aac -ar 44100 -b:a 384k -ac 2 -af "atrim=0.086, asetpts=PTS-STARTPTS, aresample=async=250" -vsync 1 -ss 00:00:00.102 `
    -max_muxing_queue_size 9999 -f segment -segment_time 600 -segment_wrap 9 -reset_timestamps 1 `
    -segment_format_options max_delay=0 C:\Users\djcim\Videos\Main\Magewell\Magewell%02d.ts `
    -map 3:0,3:1 -map 3:1 -c:v h264_nvenc -r 60 -rc-lookahead 120 -forced-idr 1 -strict_gop 1 -sc_threshold 0 -flags +cgop `
    -force_key_frames "expr:gte(t,n_forced*2)" -preset llhp -pix_fmt nv12 -b:v 250M -minrate 250M -maxrate 250M -bufsize 250M `
    -c:a aac -ar 44100 -b:a 384k -ac 2 -af "pan=mono|c0=c0, aresample=async=250" -vsync 1 `
    -max_muxing_queue_size 9999 -f segment -segment_time 600 -segment_wrap 9 -reset_timestamps 1 `
    -segment_format_options max_delay=0 C:\Users\djcim\Videos\Main\Camera\Camera%02d.ts

    EDIT: Also worth mentioning that I only drop frames when starting and ending a recording; everything is fine in the middle. I wonder if I could somehow "ease" the recording in?

    (09/13/2018): I was able to stop frames from dropping when starting a recording by rearranging inputs and outputs; however, no matter how I order things, I still drop frames when ending recordings.

  • Four Trends Shaping the Future of Analytics in Banking

    27 November 2024, by Daniel Crough — Banking and Financial Services

    While retail banking revenues have been growing in recent years, trends like rising financial crimes and capital required for generative AI and ML tech pose significant risks and increase operating costs across the financial industry, according to McKinsey’s State of Retail Banking report.

    Today’s financial institutions are focused on harnessing AI and advanced analytics to make their data work for them. To be up to the task, analytics solutions must allow banks to give consumers the convenient, personalised experiences they want while respecting their privacy.

    In this article, we’ll explore some of the big trends shaping the future of analytics in banking and finance. We’ll also look at how banks use data and technology to cut costs and personalise customer experiences.

    So, let’s get into it.

    (Image: graph showing that the average age of IT applications in insurance is 18 years)

    Many financial institutions still run on legacy IT applications that are, on average, nearly two decades old. This doesn’t just represent a security risk; it also impacts usability for both customers and employees. Does any of the following sound familiar?

    • Only specific senior employees know how to navigate the software to generate custom reports or use its more advanced features.
    • Customer complaints about your site’s usability or online banking experience are routine.
    • Onboarding employees takes much longer than necessary because of convoluted systems.
    • Teams and departments experience ‘data siloing,’ meaning that not everyone can access the data they need.

    These are warning signs that IT systems are ready for a review. Anyone thinking, “If it’s not broken, why fix it?” should consider that legacy systems can also present data security risks. As more countries introduce regulations to protect customer privacy, staying ahead of the curve is increasingly important to avoid penalties and litigation.

    And regulations aren’t the only trends impacting the future of financial institutions’ IT and analytics.

    4 trends shaping the future of analytics in banking

    New regulations and new technology have changed the landscape of analytics in banking.

    New privacy regulations impact banks globally

    The first major international example was the advent of GDPR, which went into effect in the EU in 2018. But a lot has happened since. New privacy regulations and restrictions around AI continue to roll out.

    • The European Artificial Intelligence Act (EU AI Act), billed as the world’s first comprehensive legislation on AI, entered into force on 1 August 2024.
    • Gaia-X, Europe’s federated data initiative, plans a cloud infrastructure that will provide more secure, transparent, and trustworthy data storage and processing.
    • The revised Payment Services Directive (PSD2) makes payments more secure and strengthens protections for European businesses and consumers, aiming to create a more integrated and efficient payments market.

    But even businesses that don’t have customers in Europe aren’t exempt. Consumer privacy is a hot-button issue globally.

    For example, the California Consumer Privacy Act (CCPA), which took effect in January 2020, impacts the financial services industry more than any other. Case in point: 34% of CCPA-related cases filed in 2022 were related to the financial sector.

    California’s privacy regulations were the first in the US, but other states are following closely behind. On 1 July 2024, new privacy laws went into effect in Florida, Oregon, and Texas, giving people more control over their data.

    (Image: share of CCPA cases related to the financial industry in 2022: 34%)

    One typical issue for companies in the banking industry is that the privacy measures applied to user data collected from their website are much more lax than those in their online banking system.

    It’s better to proactively invest in a privacy-centric analytics platform before you get tangled up in a lawsuit and have to pay a fine (and are forced to change your system anyway). 

    And regulatory compliance isn’t the only bonus of an ethical analytics solution. The right alternative can unlock key customer insights that can help you improve the user experience.

    The demand for personalised banking services

    At the same time, consumers expect an increasingly streamlined, personalised experience from financial institutions. 86% of bank employees say personalisation is a clear priority for the company, but 63% describe resources as limited or only available after demonstrating a clear business case.

    McKinsey’s The data and analytics edge in corporate and commercial banking points out how advanced analytics are empowering frontline bank employees to give customers more personalised experiences at every stage:

    • Pre-meeting/meeting prep: Using advanced analytics to assess customer potential, recommend products, and identify the prospects most likely to convert
    • Meetings/negotiation: Applying advanced models to support price negotiations, run what-if scenarios, and price multiple products simultaneously
    • Post-meeting/tracking: Using advanced models to identify behaviours that lead to high performance and to improve forecast accuracy and sales execution

    Today’s banks must deliver the personalisation that drives customer satisfaction and engagement to outperform their competitors.

    The rise of AI and its role in banking

    With AI and machine learning technologies becoming more powerful and accessible, financial institutions around the world are already reaping the rewards.

    McKinsey estimates that AI could add $200 billion to $340 billion annually across the global banking sector through productivity gains. Applications include:

    • Credit card fraud prevention: Algorithms analyse usage to flag and block fraudulent transactions.
    • More accurate forecasting: AI-based tools can analyse a broader spectrum of data points and forecast more accurately.
    • Better risk assessment and modelling: More advanced analytics and predictive models help avoid extending credit to high-risk customers.
    • Predictive analytics: Helps spot the clients most likely to churn.
    • Gen-AI assistants: Instantly analyse customer profiles and apply predictive models to suggest the next best actions.

    Considering these market trends, let’s discuss how you can move your bank into the future.

    Using analytics to minimise risk and establish a competitive edge 

    With the right approach, you can leverage analytics and AI to help future-proof your bank against changing customer expectations, increased fraud, and new regulations.

    Use machine learning to prevent fraud

    Every year, more consumers fall victim to credit and debit card fraud; debit card skimming cases nearly doubled in the US in 2023. The last thing you want as a bank is for your customers to discover that a criminal has spent their money.

    This not only leads to a horrible customer experience but also creates a lot of internal work and additional costs. Thankfully, machine learning can help identify suspicious activity and stop transactions before they go through. For example, Mastercard’s fraud prevention model has improved fraud detection rates by 20–300%.


    Implementing a solution like this (or partnering with credit card companies who use it) may be a way to reduce risk and improve customer trust.

    Foresee and avoid future issues with AI-powered risk management

    Regardless of what type of financial products organisations offer, AI can be an enormously useful tool. Here are just a few ways it can mitigate financial risk in the future:

    • Predictive analytics can evaluate risk exposure and allow for more informed decisions about whether to approve commercial loan applications.
    • With better credit risk modelling, banks can avoid extending personal loans to customers most likely to default.
    • Investment banks (or individual traders or financial analysts) can use AI- and ML-based systems to monitor market and trading activity more effectively.

    Those are just a few examples that barely scratch the surface. Many other AI-based applications and analytics use cases exist across all industries and market segments.

    Protect customer privacy while still getting detailed analytics

    New regulations and increasing consumer privacy concerns don’t mean banks and financial institutions should forego website analytics altogether. Its insights into performance and customer behaviour are simply too valuable. And without customer interaction data, you’ll only know something’s wrong if someone complains.

    Fortunately, it doesn’t have to be one or the other. The right financial analytics solution can give you the data and insights needed without compromising privacy while complying with regulations like GDPR and CCPA.

    That way, you can track usage patterns and improve site performance and content quality based on accurate data — without compromising privacy. Reliable, precise analytics are crucial for any bank that’s serious about user experience.

    Use A/B testing and other tools to improve digital customer experiences

    Personalised digital experiences can be key differentiators in banking and finance when done well. But there’s stiff competition. In 2023, 40% of bank customers rated their bank’s online and mobile experience as excellent. 

    Improving digital experiences for users while respecting their privacy means going above and beyond a basic web analytics tool like Google Analytics. Invest in a platform with features like A/B tests and user session analysis for deeper insights into user behaviour.

    (Image: diagram of an A/B test in which four visitors are split into two groups and shown different options)

    Behavioural analytics are crucial to understanding customer interactions. By identifying points of friction and drop-off points, you can make digital experiences smoother and more engaging.

    Matomo offers all this and is a great GDPR-compliant alternative to Google Analytics for banks and financial institutions.

    Of course, getting this right can be challenging, which is why taking an ethical and privacy-centric approach to analytics can be a key competitive edge for banks. Prioritising data security and privacy will attract like-minded, ethically conscious consumers and boost customer loyalty.

    Get privacy-friendly web analytics suitable for banking & finance with Matomo

    Improving digital experiences for today’s customers requires a solid web analytics platform that prioritises data privacy and accurate analytics. And choosing the wrong one could even mean ending up in legal trouble or scrambling to reconstruct your entire analytics setup.

    Matomo provides privacy-friendly analytics with 100% data accuracy (no sampling), advanced privacy controls and the ability to run A/B tests and user session analysis within the same platform (limiting risk and minimising costs). 

    It’s easy to get started with Matomo. Users can access clear, easy-to-understand metrics and plenty of pre-made reports that deliver valuable insights from day one. Form usage reports can help banks and fintechs identify potential issues with broken links or technical glitches and reveal clues on improving UX in the short term.

    Over one million websites, including some of the world’s top banks and financial institutions, use Matomo for their analytics.

    Start your 21-day free trial to see why, or book a demo with one of our analytics experts.

  • WebRTC predictions for 2016

    17 February 2016, by silvia

    I wrote these predictions in the first week of January and meant to publish them as encouragement to think about where WebRTC still needs some work. I’d like to be able to compare the state of WebRTC in the browser a year from now. Therefore, without further ado, here are my thoughts.

    WebRTC Browser support

    I’m quite optimistic when it comes to browser support for WebRTC. We saw Edge bring in initial support last year, and Apple is looking to hire engineers to implement WebRTC. My prediction is that we will see the following developments in 2016:

    • Edge will become interoperable with Chrome and Firefox, i.e. it will publish VP8/VP9 and H.264/H.265 support
    • Firefox of course continues to support both VP8/VP9 and H.264/H.265
    • Chrome will follow the spec and implement H.264/H.265 support (to add to their already existing VP8/VP9 support)
    • Safari will enter the WebRTC space but only with H.264/H.265 support

    Codec Observations

    With Edge and Safari entering the WebRTC space, there will be a larger focus on H.264/H.265. It will help with creating interoperability between the browsers.

    However, since there are so many flavours of H.264/H.265, I expect that when different browsers are used at different endpoints, we will get poor quality video calls because of having to negotiate a common denominator. Certainly, baseline will work interoperably, but better encoding quality and lower bandwidth will only be achieved if all endpoints use the same browser.

    Thus, we will get to the funny situation where we buy ourselves interoperability at the cost of video quality and bandwidth. I’d call that a “degree of interoperability” and not the best possible outcome.

    I’m going to go out on a limb and say that, at this stage, Google will strongly consider improving the case for VP8/VP9 by improving its bandwidth adaptability: I think they will buy themselves some SVC capability and make VP9 the best-quality codec for live video conferencing. Thus, when Safari eventually follows the standard and also implements VP8/VP9 support, the interoperability win of H.264/H.265 will prove only temporary, overshadowed by the vastly better video quality of VP9.

    The Enterprise Boundary

    Like all video conferencing technology, WebRTC is having a hard time dealing with the corporate boundary : firewalls and proxies get in the way of setting up video connections from within an enterprise to people outside.

    The telco world has come up with the concept of the SBC (session border controller). SBCs come packed with functionality to deal with security, signalling protocol translation, Quality of Service policing, regulatory requirements, statistics, billing, and even media services like transcoding.

    SBCs are total overkill for a world where a large number of Web applications simply want to add a WebRTC feature – probably mostly to provide a video or audio customer support service, but it could be a live training session with call-in, or an interest group conference call.

    We cannot install a custom SBC solution for every WebRTC service provider in every enterprise. That’s like saying we need a custom Web proxy for every Web server. It doesn’t scale.

    Cloud services thrive on their ability to sell directly to an individual in an organisation on their credit card without that individual having to ask their IT department to put special rules in place. WebRTC will not make progress in the corporate environment unless this is fixed.

    We need a solution that allows all WebRTC services to get through an enterprise firewall and enterprise proxy. I think the WebRTC standards have done pretty well with firewalls, and connecting to a TURN server on port 443 will do the trick most of the time. But enterprise proxies are the next frontier.
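    As a rough illustration of the port-443 trick, and assuming the open-source coturn server (the certificate paths, realm, and credentials below are placeholders, not a tested deployment), a TURN listener placed where firewalls expect HTTPS might look like this:

    # Run TURN over TLS on 443 so middleboxes see what looks like ordinary HTTPS
    turnserver --tls-listening-port=443 \
      --cert=/etc/ssl/turn.example.com.crt \
      --pkey=/etc/ssl/turn.example.com.key \
      --realm=turn.example.com \
      --lt-cred-mech --user=webrtc:secret \
      --fingerprint

    A Web app would then reach it via a turns: URI on port 443, e.g. turns:turn.example.com:443?transport=tcp in its iceServers configuration, so the TLS handshake looks like ordinary HTTPS traffic from the enterprise network’s point of view.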

    What it takes is some kind of media packet forwarding service that sits on the firewall or in a proxy and allows WebRTC media packets through – maybe with some configuration that is necessary in the browsers or the Web app to add this service as another type of TURN server.

    I don’t have a full understanding of the problems involved, but I think such a solution is vital before WebRTC can go mainstream. I expect that this year we will see some clever people coming up with a solution for this and a new type of product will be born and rolled out to enterprises around the world.

    Summary

    So these are my predictions. In summary, they address the key areas where I think WebRTC still has to make progress : interoperability between browsers, video quality at low bitrates, and the enterprise boundary. I’m really curious to see where we stand with these a year from now.

    It’s worth mentioning Philipp Hancke’s tweet reply to my post:

    — we saw some clever people come up with a solution already. Now it needs to be implemented