
Other articles (18)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MédiaSpip is at version 0.2 or later. If needed, contact the administrator of your MédiaSpip to find out.

  • Accepted formats

    28 January 2010

    The following commands provide information about the formats and codecs supported by the local ffmpeg installation:
    ffmpeg -codecs
    ffmpeg -formats
    Accepted input video formats
    This list is not exhaustive; it highlights the main formats in use: h264: H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10; m4v: raw MPEG-4 video format; flv: Flash Video (FLV) / Sorenson Spark / Sorenson H.263; Theora; wmv:
    Possible output video formats
    To begin with, we (...)

  • Adding notes and captions to images

    7 February 2011

    To add notes and captions to images, the first step is to install the "Légendes" plugin.
    Once the plugin is activated, you can configure it in the configuration area to adjust the rights for creating, modifying and deleting notes. By default, only site administrators can add notes to images.
    Changes when adding a media item
    When adding a media item of type "image", a new button appears above the preview (...)

On other sites (5329)

  • Google Analytics 4 (GA4) vs Matomo

    7 April 2022, by Erin

    Google announced that Universal Analytics’ days are numbered. Universal Analytics will be replaced by Google Analytics 4 (or GA4) on the 1st of July 2023. 

    If Google Analytics users want to compare year-on-year data, they have until July 2022 to get set up and start collecting data before the sun sets on Universal Analytics (or UA).

    But is upgrading to Google Analytics 4 the right move? There’s a lot to consider, and many organisations are looking for an alternative to Google Analytics. So in this blog, we’ll compare GA4 to Matomo – the leading Google Analytics alternative.

    In this blog, we’ll look at what Matomo is, what Google Analytics 4 is, and how the two platforms compare.

    What is Matomo?

    Matomo is a powerful privacy-first web analytics platform that gives you 100% data ownership. First launched in 2007, Matomo is now the world’s leading open-source web analytics platform and is used by more than 1 million websites. 

    Matomo’s core values are based on ethical data collection and processing. A growing number of businesses and organisations around the globe are adopting privacy-compliant web analytics solutions like Matomo.

    Matomo offers both Cloud and On-Premise solutions (and a five-star rated WordPress plugin), making for an adaptable and flexible solution. 

    What is Google Analytics 4?

    Google Analytics 4 is the latest version of Google Analytics and takes a completely different approach to data modelling than its predecessor, Universal Analytics. For an in-depth look at how GA4 and UA compare, check out this Google Analytics 4 vs Universal Analytics comparison.

    Google Analytics 4 will soon be the only available version of analytics software from Google. So what’s the issue? Surely, in 2022, Google makes it easy to migrate to their newest (and only) analytics platform? Not quite.

    Google Analytics 4 vs Matomo

    Whilst the core purpose of GA4 and Matomo is similar (providing web analytics that help to optimise your website and grow your business), there are several key differences that organisations should consider before making the switch.

    Importing Historical Data from Universal Analytics

    Google Analytics 4

    Users assuming that historical data from Universal Analytics could be imported into Google Analytics 4 were faced with swift disappointment. Unfortunately, Google Analytics 4 does not have an option to import data from its predecessor, Universal Analytics. This means that businesses won’t be able to import and compare data from previous years.

    Matomo

    If you don’t want to start from scratch with your web analytics data, then Matomo is an ideal solution for data continuity. Matomo offers users the ability to import their historical Universal Analytics data. So you can keep all that valuable historical data you’ve collected over the years.

    Image: Google Analytics 4 migration tweet, Tino Didriksen via Twitter

    User Interface

    Google Analytics 4

    GA4’s new user interface has been met with mixed reviews. Many claim that it’s overly complex and difficult to navigate. Some have even suggested that the tool has been designed specifically for enterprises with specialised analytics teams. 

    Image: tweet from Kevin Levesquea via Twitter

    Matomo

    Matomo, on the other hand, is recognised for its easy-to-use interface, with a rating of 4.5 out of 5 stars for ease of use on Capterra. Matomo perfectly balances powerful features with a user-friendly interface so valuable insights are only a click away. There’s a reason why over 1 million websites are using Matomo.

    Matomo Features

    Advanced Behavioural Analytics Features 

    Google Analytics 4

    While Google Analytics is undoubtedly robust in some areas (machine learning, for instance), what it really lacks is advanced behavioural analytics. Heatmaps, session recordings and other advanced tools can give you valuable insights into how users are engaging with your site, well beyond pageviews and other standard metrics.

    Unfortunately, with this new generation of GA, Google still hasn’t introduced these features. So users have to manage subscriptions and tracking in third-party behavioural analytics tools like Hotjar or Lucky Orange, which is inefficient, costly and time-consuming.

    Matomo Heatmaps Feature

    Matomo 

    Meanwhile, Matomo is a one-stop shop for all of your web analytics needs. Not only do you get access to the metrics you’ve grown accustomed to with Universal Analytics, but you also get built-in behavioural analytics features like Heatmaps, Scroll Depth, Session Recordings and more. 

    Want to know if visitors are reaching your call to action at the bottom of the page? Scroll Depth will answer that.

    Want to know why visitors aren’t clicking through to the next page? Heatmaps will give you the insights you need.

    You get the picture – the full picture, that is. 


    Data Accuracy

    Google Analytics 4

    GA4 aims to make web and app analytics more privacy-centric by reducing the reliance on cookies to record certain events across platforms and devices. 

    However, when site and application visitors opt out of cookie tracking, GA4 instead relies on machine learning to fill in the gaps. This data sampling could mean that your business is making decisions based on inaccurate reports.

    Matomo

    Data is the backbone of web analytics, so why make critical business decisions on sampled data? With Matomo, you’re guaranteed 100% accurate, unsampled data. So you can rest assured that any decisions you make are based on actual facts.

    Compliance with Privacy Laws (GDPR, CCPA, etc.) 

    Google Analytics 4

    Google is making changes in an attempt to become compliant with privacy laws. However, even with GA4, users are still transferring data to the US. For this reason, both the Austrian and French data protection authorities have ruled that the use of Google Analytics violates the GDPR.

    The only possible workaround is “Privacy Shield 2.0”, but GDPR experts remain sceptical of it.

    Matomo

    If compliance with global privacy laws is a concern (and it should be), then Matomo is the clear winner here. 

    As an EU-hosted web analytics tool, Matomo stores your data in Europe, and no data is transferred to the US. If you choose to self-host instead, the data is stored in your country of choice.

    In addition, with cookieless tracking enabled, you can say goodbye to those pesky cookie consent screens. 

    Also, remember that under GDPR, and many other data privacy laws like CCPA and LGPD, end users have a legal right to access, amend and/or erase the personal data collected about them. 

    With Matomo, you get 100% ownership of your web analytics data. This means that we don’t sell your data to third parties, we can’t claim ownership of it, and you can export your data at any time.

    Image: Matomo vs GA4 tweet, @tersmantoll via Twitter

    Wrap up

    At the end of the day, the worst thing an organisation can do is nothing. Waiting until July 2023 to migrate to GA4 or another web analytics platform would be very disruptive and costly. Organisations need to consider their options now and start migrating in the next few months. 

    With all that said, moving to Google Analytics 4 could prove to be a costly and time-consuming operation. The global trend towards increased data privacy is a threat to platforms like Google Analytics, which use data for advertising and transfer it across borders.

    With Matomo, you get an easy-to-use, all-in-one web analytics platform and keep your historical Universal Analytics data. Plus, you can future-proof your business by staying compliant with global privacy laws and get access to advanced behavioural analytics features.

    There’s a lot to weigh up here, but fortunately, getting started with Matomo is easy. Try it free for 21 days (no credit card required) and see for yourself why over 1 million websites choose Matomo.

    While this is the end of the road for Universal Analytics, it’s also an opportune time for organisations to find a better-fitting web analytics tool.

  • SEO for Financial Services : The Ultimate Guide

    26 June 2024, by Erin

    You know that having a digital marketing strategy is crucial for helping your financial services business capture the attention and trust of potential customers and thrive in an increasingly competitive digital landscape.

    The question is — what’s the best way to go about improving your ranking in SERPs and driving organic traffic to your website?

    That’s where SEO strategies for financial services come into play. 

    This article will cover everything your company needs to know about SEO for financial services — from the unique challenges you’ll face to the proven tips and strategies you can implement to boost your ranking in SERPs. 

    What is SEO for financial services?

    SEO — short for search engine optimisation — refers to optimising your content and website for search engines, particularly Google. 

    The main goal of an SEO strategy is to make your site search-engine-friendly, show that you’re a trusted source and increase the likelihood of appearing in SERPs when potential customers look up relevant keywords — ultimately driving organic visibility and traffic. 

    Now, when it comes to evaluating the success of your financial services SEO strategy, there are certain key performance indicators (KPIs) you should keep track of — including:

    • SEO ranking, or the position your web pages show up in SERPs for specific search terms (the terms and phrases identified during keyword research) 
    • SEO Score, which shows a website’s overall SEO health and indicates how well it will rank in SERPs
    • Impressions, or the number of times users saw your pages when they looked up relevant search terms 
    • Organic traffic, or the number of people that visit your website via search engines
    • Engagement metrics, such as time on page, pages per session, and bounce rate 
    • Conversion rates from website traffic, including both “hard” conversions (lead generation and purchases) and “soft” conversions (such as newsletter subscriptions) 

    It’s important to note that the financial services industry is incredibly competitive — especially given the large-scale digital transformations in the financial sector and the rise of fintech companies. 

    According to a 2022 report, the global market for financial services was valued at $25.51 trillion. Moreover, it’s expected to grow at a compound annual growth rate of 9.7%, reaching $58.69 trillion by 2031.
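
    As a quick sanity check, those figures are internally consistent: $25.51 trillion compounding at 9.7% a year for the nine years from 2022 to 2031 gives 25.51 × 1.097^9 ≈ $58.7 trillion.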

    Importance and challenges of financial services SEO 

    The financial services industry is changing rapidly, mainly driven by globalisation, innovation, shifting economies, and compliance risks. It’s crucial for financial service companies to develop effective SEO strategies that align with the opportunities and challenges unique to this sector. 

    Some benefits of a well-executed SEO strategy are “universal”: better search engine rankings, more search traffic, a better user experience, and improved ROI that supports business growth.

    Illustration of top position in SERPs

    Financial services SEO efforts can provide a number of benefits. They can help you:

    • Improve lead generation and customer acquisition; the more search traffic you get, the higher the chances of converting visitors into potential clients
    • Build a strong online presence and brand awareness, which comes as a result of increased visibility in organic search results and reaching a wider audience 
    • Increase your credibility and authority within the industry, primarily through high-quality content that shows your expertise and backlinks from authoritative websites 
    • Gain a competitive edge by analysing and outranking your main competitors 

    That said, financial services companies face some unique challenges:

    High competition: The digital arena for financial services is highly competitive, with numerous companies vying for the same business.

    YMYL (Your Money or Your Life) content: Google’s YMYL framework places higher scrutiny on financial content, demanding higher standards for experience, expertise, authoritativeness, and trustworthiness. We’ll cover this topic in greater detail shortly.

    Regulatory changes and compliance: The financial services sector is characterised by constant regulatory changes and new compliance requirements that businesses must navigate. Sometimes this makes it difficult to gather insights and market to your audience.

    As a privacy-first, compliant web analytics solution, Matomo can provide valuable insights to support your SEO efforts. Matomo ensures compliance with privacy laws — including GDPR, CCPA and more — and provides 20-40% more comprehensive data than Google Analytics.


    8 proven strategies for implementing SEO for financial services 

    SEO for financial services involves a wide range of strategies — including keyword optimisation, technical SEO, content marketing, link building and other off-page SEO activities — that can help your website rank higher in SERPs. 

    Of course, it’s not just about better search rankings. It’s about attracting the right search traffic to your website — potential clients interested in your financial services.

    Here are some proven financial services SEO strategies you should implement:

    1. Build trust and topical authority 

    Financial services content typically covers more complex topics that could impact the reader’s financial stability and well-being — or, as Google calls them, “Your Money or Your Life” topics (YMYL). As such, it’s subject to much stricter quality standards. 

    To improve your YMYL content, you’ll need to apply the E-E-A-T framework — short for “Experience, Expertise, Authoritativeness and Trustworthiness”.

    This is a key part of Google’s Search Quality Rater Guidelines for evaluating a website’s quality and credibility.

    The E-E-A-T standards become even more relevant to financial topics such as investment strategies, financial advice, taxes, and retirement planning. 

    In that sense, the overarching goal of your content strategy should be to build customer trust by demonstrating real expertise and topical authority through in-depth educational content. 

    2. Earn reputable external links through link-building 

    You also need to monitor your off-page SEO — factors outside your website that can’t be directly controlled but can still build trust and contribute to better ranking in SERPs.

    These include everything from social media engagement and unlinked brand mentions in blog posts, news articles, user reviews and social media discussions — to inbound links from other reputable websites in the finance industry.

    That brings us to high-quality backlinks as a significant factor for YMYL content that can improve your financial services website’s SEO performance:

    Earning external links can improve your domain authority and reinforce your brand’s position as a reliable source in the financial services niche — which, in turn, can contribute to better search engine rankings and drive more website traffic.

    Here are a few link-building strategies you can try:

    • Use tools like Ahrefs and Semrush to identify reputable websites, then ask them to link to your site
    • Demonstrate your expertise and get backlinks from reputable media outlets through Help a Reporter Out (HARO)
    • Reach out to authoritative websites that mention your company without linking to you directly and ask them to include a link to your website

    3. Conduct an SEO audit 

    An SEO audit is a key step in developing and implementing a successful financial SEO strategy. It sets the foundation for all your future efforts — and allows you to measure progress further down the line. 

    You’ll need to perform a comprehensive SEO audit, covering both the existing content and technical aspects of your website — including:

    • Indexing issues
    • Internal linking and site architecture 
    • Duplicate content 
    • Backlink profile 
    • Broken links 
    • Page titles and metadata 

    It’s possible to do this manually, but third-party tools will let you dig deeper and speed up the process. Ahrefs and Screaming Frog — to name a few — can help you evaluate your website’s overall health and structure. And with a web analytics platform like Matomo, you can easily measure the success of your SEO efforts.

    But this shouldn’t be a one-time thing; be sure to perform audits regularly — ideally every six months.
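
    Parts of an audit like this are easy to script yourself. Here is a minimal Python sketch (assuming the requests and beautifulsoup4 packages are installed; an illustration, not a full crawler) that flags broken links on a single page:

      import requests
      from bs4 import BeautifulSoup
      from urllib.parse import urljoin

      def find_broken_links(page_url):
          # collect all absolute link targets found on the page
          html = requests.get(page_url, timeout=10).text
          soup = BeautifulSoup(html, 'html.parser')
          links = {urljoin(page_url, a['href']) for a in soup.find_all('a', href=True)}
          broken = []
          for link in sorted(links):
              if not link.startswith('http'):
                  continue  # skip mailto:, tel:, fragment-only links
              try:
                  status = requests.head(link, timeout=10, allow_redirects=True).status_code
              except requests.RequestException:
                  status = None  # DNS failure, timeout, etc.
              if status is None or status >= 400:
                  broken.append((link, status))
          return broken

      for link, status in find_broken_links('https://example.com'):
          print(status, link)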

    4. Understand your target audience

    You can’t create helpful content without learning about your customers’ needs, pain points and preferences. 

    For example, a financial service provider focusing on individuals nearing retirement would prioritise content that educates on retirement planning strategies, investment options for seniors, and tax-efficient withdrawal strategies, aiming to guide clients through the transition from saving to managing retirement funds effectively.

    In contrast, a provider targeting small business owners would emphasise content related to small business loans, funding options, and financial management advice tailored to entrepreneurs seeking to expand their businesses and navigate financial challenges effectively.

    So, before you dive into keyword research and content creation, ensure you have a deep understanding of your target audience. 

    Identifying different audience categories and developing detailed customer personas for each segment is crucial for creating content that resonates with them and aligns with their search intent. 

    Matomo’s Segmentation tool can be of huge help here. It allows you to divide your audience into smaller groups based on factors like demographics and website interactions:

    Screenshot of Matomo’s Segmentation tool demo

    In addition to that, you can:

    • Engage with your frontline teams that interact directly with clients to gain deeper insights into prospects’ needs and concerns
    • Track social media channels and other online discussions related to the financial world and your audience
    • Gather qualitative insights from your site visitors through the Matomo Surveys plugin (questions like “What financial services are you most interested in?” or “Are there any specific financial topics you would like us to cover in more detail?” will help you understand your visitors better)
    • Watch out for financial trends and developments that could directly impact your audience’s needs and preferences 

    5. Identify new opportunities through keyword research 

    Comprehensive keyword research can help you identify key search terms — specific phrases that potential customers may use when looking up things related to their finances. 

    It’s best to start with a brainstorming session and assemble a list of relevant topics and core keywords. Once you have an initial list, use tools like Ahrefs and Semrush to get more keyword ideas based on your seed keywords, including:

    • More specific long-tail keywords, which are often less competitive and indicate a clearer intent to convert. For example:
      • “low-risk investment options for retirees”
      • “financial planning for freelancers”
      • “small business loan requirements”
    • Keywords that your competitors already rank for. For instance:
      • If a competing investment firm ranks for “best investment strategies for beginners,” targeting similar keywords can attract novice investors.
      • A competitor’s high ranking for “life insurance quotes online” suggests potential to optimise your own content around similar terms.
    • Location-specific keywords (if you have physical store locations)

    Google Search Console can provide information about the search terms you’re already ranking for — including underperforming content that may benefit from further optimisation. If you want deeper SEO insights, you can import your search keywords into Matomo. 

    While you’re at it, try Matomo’s Site Search feature, too. It will show you the exact terms and phrases visitors enter when using your website’s search bar — and you can use that information to find more content opportunities.


    Of course, not all keywords are equal — and it would be impossible to target them all. Instead, prioritise keywords based on two factors:

    • Search volume, which indicates the “popularity” of a particular query
    • Keyword difficulty, which indicates how hard it’ll be to rank for a specific term, depending on domain authority, search volume and competition 
    Illustration of search engine optimisation concept

    6. Find your main organic competitors 

    Besides performing an SEO audit, finding your core keywords, and researching your target market, competitor analysis is another crucial aspect of SEO for finance companies. 

    Before you start, it’s important to differentiate between your main organic search competitors and your direct industry competitors:

    You’ll always have direct competitors — other financial services brands offering similar products and services and targeting the same audience as you.

    However, regarding search results, your financial services business won’t be in a “bubble” specifically reserved for the financial industry. Depending on the specific search queries — and the search intent behind them — SERPs could feature a wider range of online content, from niche finance blogs to news websites, and huge financial publications.

    Even if another company doesn’t offer the same services, they’re an organic competitor if you’re both ranking for the same keywords. 

    Once you determine who your main organic competitors are, you can analyse their websites to:

    • Check how they’re getting search traffic 
    • See which types of content they’re publishing 
    • Find and fill in any potential content gaps 
    • Assess the quality of their backlink profile 
    • See if they currently have any featured snippets

    7. Consider local SEO

    According to a 2023 survey, 21% of US-based consumers report using the internet to look up local businesses daily, while another 32% do so multiple times a week. 

    Local SEO is worth investing in as a financial services provider, especially if you have physical locations. Prospective clients will typically look up nearby financial services when they need additional information or are ready to engage in financial planning, investment, or other financial activities.

    Here are a few suggestions on how to optimise your site for local searches:

    • Create listings on online business directories, like Google Business Profile (previously known as Google My Business)
    • If your financial service company operates in more than one physical location, be sure to create a separate Google Business Profile for each one 
    • Identify location-specific keywords that will help you rank in local SERPs
    • Make sure that your name, address, and phone number (NAP) citations are correct and consistent 
    • Leverage positive customer reviews and testimonials as social proof

    8. Optimise technical aspects of your website 

    Technical SEO — which primarily deals with the website’s underlying structure — is another crucial factor that financial services brands must monitor. 

    It’s an umbrella term that covers a wide range of elements, including:

    • Site speed 
    • Indexing issues 
    • Broken links, orphaned pages, improper redirects 
    • On-page optimisation 
    • Mobile responsiveness

    In 2020, Google introduced Core Web Vitals, a set of metrics that measure web page performance in three key areas — loading speed, responsiveness and visual stability. 

    Given that they’re now a part of Google’s core ranking systems, you should consider using Matomo’s SEO Web Vitals feature to monitor these crucial metrics. Here’s why:

    When technical aspects of your website — namely, site speed and mobile responsiveness — are properly optimised, you can deliver a better user experience. That’s what Google seeks to reward. 

    Plus, it can be a critical brand differentiator for your business. 

    Conclusion 

    Investing in SEO for financial services is crucial for boosting online visibility and driving organic traffic and business growth. However, one thing to keep in mind is that SEO efforts shouldn’t be a one-time thing:

    SEO is an ongoing process, and it will take time to establish your company as a trustworthy source and see real results. 

    You can start building that trust by using a web analytics platform that offers crucial insights for improving your website’s ranking in SERPs and maintains full compliance with GDPR and other privacy regulations. 

    That’s why Matomo is trusted by more than 1 million websites around the globe. As an ethical alternative to Google Analytics that doesn’t rely on data sampling, Matomo is not only easy to use but more accurate, too — providing 20-40% more data compared to GA4. 

    Sign up for a 21-day free trial and see how Matomo can support your financial services SEO strategy. No credit card required.

  • Ogg objections

    3 March 2010, by Mans — Multimedia

    The Ogg container format is being promoted by the Xiph Foundation for use with its Vorbis and Theora codecs. Unfortunately, a number of technical shortcomings in the format render it ill-suited to most, if not all, use cases. This article examines the most severe of these flaws.

    Overview of Ogg

    The basic unit in an Ogg stream is the page consisting of a header followed by one or more packets from a single elementary stream. A page can contain up to 255 packets, and a packet can span any number of pages. The following table describes the page header.

    Field                     Size (bits)   Description
    capture_pattern           32            magic number “OggS”
    version                   8             always zero
    flags                     8             continuation, BOS and EOS flags
    granule_position          64            abstract timestamp
    bitstream_serial_number   32            elementary stream number
    page_sequence_number      32            incremented by 1 each page
    checksum                  32            CRC of entire page
    page_segments             8             length of segment_table
    segment_table             variable      list of packet sizes
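
    Parsing this layout takes only a few lines. The following Python sketch (my illustration, not normative code; all multi-byte fields are little-endian, as the Ogg specification prescribes) reads one page header and its segment table:

      import struct

      def read_page_header(f):
          # 27-byte fixed header: magic, version, flags, granule position,
          # stream serial, page sequence, CRC checksum, segment count
          magic, version, flags, granule, serial, seq, crc, nsegs = \
              struct.unpack('<4sBBqIIIB', f.read(27))
          assert magic == b'OggS' and version == 0
          segment_table = f.read(nsegs)  # one lacing value per segment
          return flags, granule, serial, seq, crc, segment_table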

    Elementary stream types are identified by looking at the payload of the first few pages, which contain any setup data required by the decoders. For full details, see the official format specification.

    Generality

    Ogg, legend tells, was designed to be a general-purpose container format. To most multimedia developers, a general-purpose format is one in which encoded data of any type can be encapsulated with a minimum of effort.

    The Ogg format defined by the specification does not fit this description. For every format one wishes to use with Ogg, a complex mapping must first be defined. This mapping defines how to identify a codec, how to extract setup data, and even how timestamps are to be interpreted. All this is done differently for every codec. To correctly parse an Ogg stream, every such mapping ever defined must be known.

    Under this premise, a centralised repository of codec mappings would seem like a sensible idea, but alas, no such thing exists. It is simply impossible to obtain an exhaustive list of defined mappings, which makes the task of creating a complete implementation somewhat daunting.

    One brave soul, Tobias Waldvogel, created a mapping, OGM, capable of storing any Microsoft AVI compatible codec data in Ogg files. This format saw some use in the wild, but was frowned upon by Xiph, and it was eventually displaced by other formats.

    True generality is evidently not to be found with the Ogg format.

    A good example of a general-purpose format is Matroska. This container can trivially accommodate any codec; all it requires is a unique string to identify the codec. For codecs requiring setup data, a standard location for this is provided in the container. Furthermore, an official list of codec identifiers is maintained, meaning all information required to fully support Matroska files is available from one place.

    Matroska also has probably the greatest advantage of all: it is in active, widespread use. Historically, standards derived from existing practice have proven more successful than those created by a design committee.

    Overhead

    When designing a container format, one important consideration is that of overhead, i.e. the extra space required in addition to the elementary stream data being combined. For any given container, the overhead can be divided into a fixed part, independent of the total file size, and a variable part growing with increasing file size. The fixed overhead is not of much concern, its relative contribution being negligible for typical file sizes.

    The variable overhead in the Ogg format comes from the page headers, mostly from the segment_table field. This field uses a most peculiar encoding, somewhat reminiscent of Roman numerals. In Roman times, numbers were written as a sequence of symbols, each representing a value, the combined value being the sum of the constituent values.

    The segment_table field lists the sizes of all packets in the page. Each value in the list is coded as a number of bytes equal to 255 followed by a final byte with a smaller value. The packet size is simply the sum of all these bytes. Any strictly additive encoding, such as this, has the distinct drawback of coded length being linearly proportional to the encoded value. A value of 5000, a reasonable packet size for video of moderate bitrate, requires no less than 20 bytes to encode.
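
    To make the cost concrete, here is a small Python sketch of the lacing-value encoding just described (my own illustration):

      def encode_lacing(packet_size):
          # a size is coded as a run of 255-valued bytes plus one final byte < 255
          full, last = divmod(packet_size, 255)
          return bytes([255] * full + [last])

      def decode_lacing(segment_table):
          # sum lacing values; a value < 255 terminates the current packet
          sizes, current = [], 0
          for v in segment_table:
              current += v
              if v < 255:
                  sizes.append(current)
                  current = 0
          return sizes

      print(len(encode_lacing(5000)))            # 20 bytes, as claimed above
      print(decode_lacing(encode_lacing(5000)))  # [5000]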

    On top of this we have the 27-byte page header which, although paling in comparison to the packet size encoding, is still much larger than necessary. Starting at the top of the list:

    • The version field could be disposed of, a single-bit marker being adequate to separate this first version from hypothetical future versions. One of the unused positions in the flags field could be used for this purpose.
    • A 64-bit granule_position is completely overkill. 32 bits would be more than enough for the vast majority of use cases. In extreme cases, a one-bit flag could be used to signal an extended timestamp field.
    • 32-bit elementary stream number? Are they anticipating files with four billion elementary streams? An eight-bit field, if not smaller, would seem more appropriate here.
    • The 32-bit page_sequence_number is inexplicable. The intent is to allow detection of page loss due to transmission errors. ISO MPEG-TS uses a 4-bit counter per 188-byte packet for this purpose, and that format is used where packet loss actually happens, unlike any use of Ogg to date.
    • A mandatory 32-bit checksum is nothing but a waste of space when using a reliable storage/transmission medium. Again, a flag could be used to signal the presence of an optional checksum field.

    With the changes suggested above, the page header would shrink from 27 bytes to 12 bytes in size.

    We thus see that in an Ogg file, the packet size fields alone contribute an overhead of 1/255 or approximately 0.4%. This is a hard lower bound on the overhead, not attainable even in theory. In reality the overhead tends to be closer to 1%.

    Contrast this with the ISO MP4 file format, which can easily achieve an overhead of less than 0.05% with a 1 Mbps elementary stream.

    Latency

    In many applications end-to-end latency is an important factor. Examples include video conferencing, telephony, live sports events, interactive gaming, etc. With the codec layer contributing as little as 10 milliseconds of latency, the amount imposed by the container becomes an important factor.

    Latency in an Ogg-based system is introduced at both the sender and the receiver. Since the page header depends on the entire contents of the page (packet sizes and checksum), a full page of packets must be buffered by the sender before a single bit can be transmitted. This sets a lower bound for the sending latency at the duration of a page.

    On the receiving side, playback cannot commence until packets from all elementary streams are available. Hence, with two streams (audio and video) interleaved at the page level, playback is delayed by at least one page duration (two if checksums are verified).

    Taking both send and receive latencies into account, the minimum end-to-end latency for Ogg is thus twice the duration of a page, triple if strict checksum verification is required. If page durations are variable, the maximum value must be used in order to avoid buffer underflows.

    Minimum latency is clearly achieved by minimising the page duration, which in turn implies sending only one packet per page. This is where the size of the page header becomes important. The header for a single-packet page is 27 + packet_size/255 bytes in size. For a 1 Mbps video stream at 25 fps, each packet is 5000 bytes and its header 47 bytes, giving an overhead of approximately 1%. With a typical audio packet size of 400 bytes, the overhead becomes a staggering 7%. The average overhead for a multiplex of these two streams is 1.4%.

    As it stands, the Ogg format is clearly not a good choice for a low-latency application. The key to low latency is small packets and fine-grained interleaving of streams, and although Ogg can provide both of these, by sending a single packet per page, the price in overhead is simply too high.

    ISO MPEG-PS has an overhead of 9 bytes on most packets (a 5-byte timestamp is added a few times per second), and Microsoft’s ASF has a 12-byte packet header. My suggestions for compacting the Ogg page header would bring it in line with these formats.

    Random access

    Any general-purpose container format needs to allow random access for direct seeking to any given position in the file. Despite this goal being explicitly mentioned in the Ogg specification, the format only allows the most crude of random access methods.

    While many container formats include an index allowing a time to be directly translated into an offset into the file, Ogg has nothing of this kind, the stated rationale for the omission being that this would require two-pass multiplexing, the second pass creating the index. This is obviously not true; the index could simply be written at the end of the file. Those objecting that this index would be unavailable in a streaming scenario are forgetting that seeking is impossible there regardless.

    The method for seeking suggested by the Ogg documentation is to perform a binary search on the file, after each file-level seek operation scanning for a page header, extracting the timestamp, and comparing it to the desired position. When the elementary stream encoding allows only certain packets as random access points (video key frames), a second search will have to be performed to locate the entry point closest to the desired time. In a large file (sizes upwards of 10 GB are common), 50 seeks might be required to find the correct position.

    A typical hard drive has an average seek time of roughly 10 ms, giving a total time for the seek operation of around 500 ms, an annoyingly long time. On a slow medium, such as an optical disc or files served over a network, the times are orders of magnitude longer.
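
    In rough Python terms, the suggested procedure amounts to the following sketch (simplified; scan_page_at is a hypothetical helper that scans forward from a byte offset to the next “OggS” capture pattern and returns that page’s timestamp and offset):

      def ogg_seek(f, file_size, target, scan_page_at):
          # bisect on byte offsets; every probe costs a disk seek plus a scan
          lo, hi = 0, file_size
          while lo < hi:
              mid = (lo + hi) // 2
              timestamp, page_offset = scan_page_at(f, mid)
              if timestamp < target:
                  lo = page_offset + 1  # desired position lies further on
              else:
                  hi = mid
          return lo  # a second pass to find a key frame may still be needed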

    A factor further complicating the seeking process is the possibility of header emulation within the elementary stream data. To safeguard against this, one has to read the entire page and verify the checksum. If the storage medium cannot provide data much faster than during normal playback, this provides yet another substantial delay towards finishing the seeking operation. This too applies to both network delivery and optical discs.

    Although optical disc usage is perhaps in decline today, one should bear in mind that the Ogg format was designed at a time when CDs and DVDs were rapidly gaining ground, and network-based storage is most certainly on the rise.

    The final nail in the coffin of seeking is the codec-dependent timestamp format. At each step in the seeking process, the timestamp parsing specified by the codec mapping corresponding to the current page must be invoked. If the mapping is not known, the best one can do is skip pages until one with a known mapping is found. This delays the seeking and complicates the implementation, both bad things.

    Timestamps

    A problem old as multimedia itself is that of synchronising multiple elementary streams (e.g. audio and video) during playback; badly synchronised A/V is highly unpleasant to view. By the time Ogg was invented, solutions to this problem were long since explored and well-understood. The key to proper synchronisation lies in tagging elementary stream packets with timestamps, packets carrying the same timestamp intended for simultaneous presentation. The concept is as simple as it seems, so it is astonishing to see the amount of complexity with which the Ogg designers managed to imbue it. So bizarre is it, that I have devoted an entire article to the topic, and will not cover it further here.

    Complexity

    Video and audio decoding are time-consuming tasks, so containers should be designed to minimise extra processing required. With the data volumes involved, even an act as simple as copying a packet of compressed data can have a significant impact. Once again, however, Ogg lets us down. Despite the brevity of the specification, the format is remarkably complicated to parse properly.

    The unusual and inefficient encoding of the packet sizes limits the page size to somewhat less than 64 kB. To still allow individual packets larger than this limit, it was decided to allow packets spanning multiple pages, a decision with unfortunate implications. A page-spanning packet as it arrives in the Ogg stream will be discontiguous in memory, a situation most decoders are unable to handle, and reassembly, i.e. copying, is required.
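
    The reassembly (and hence copying) forced on a software demultiplexer can be sketched in Python as follows (my own illustration, reusing the lacing rules described earlier and ignoring error handling):

      def packets_from_pages(pages):
          # pages: iterable of (segment_table, payload) tuples in stream order
          pending = b''  # partial packet carried over from the previous page
          for segment_table, payload in pages:
              offset = size = 0
              for v in segment_table:
                  size += v
                  if v < 255:  # a lacing value < 255 ends the current packet
                      yield pending + payload[offset:offset + size]
                      pending, offset, size = b'', offset + size, 0
              # a trailing run of 255s leaves an unfinished packet: copy and carry it
              pending += payload[offset:]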

    The knowledgeable reader may at this point remark that the MPEG-TS format also splits packets into pieces requiring reassembly before decoding. There is, however, a significant difference there. MPEG-TS was designed for hardware demultiplexing feeding directly into hardware decoders. In such an implementation the fragmentation is not a problem. Rather, the fine-grained interleaving is a feature allowing smaller on-chip buffers.

    Buffering is also an area in which Ogg suffers. To keep the overhead down, pages must be made as large as practically possible, and page size translates directly into demultiplexer buffer size. Playback of a file with two elementary streams thus requires 128 kB of buffer space. On a modern PC this is perhaps nothing to be concerned about, but in a small embedded system, e.g. a portable media player, it can be relevant.

    In addition to the above, a number of other issues, some of them minor, others more severe, make Ogg processing a painful experience. A selection follows:

    • 32-bit random elementary stream identifiers mean a simple table-lookup cannot be used. Instead the list of streams must be searched for a match. While trivial to do in software, it is still annoying, and a hardware demultiplexer would be significantly more complicated than with a smaller identifier.
    • Semantically ambiguous streams are possible. For example, the continuation flag (bit 1) may conflict with continuation (or lack thereof) implied by the segment table on the preceding page. Such invalid files have been spotted in the wild.
    • Concatenating independent Ogg streams forms a valid stream. While finding a use case for this strange feature is difficult, an implementation must of course be prepared to encounter such streams. Detecting and dealing with these adds pointless complexity.
    • Unusual terminology: inventing new terms for well-known concepts is confusing for the developer trying to understand the format in relation to others. A few examples:

      Ogg name            Usual name
      logical bitstream   elementary stream
      grouping            multiplexing
      lacing value        packet size (approximately)
      segment             imaginary element serving no real purpose
      granule position    timestamp

    Final words

    We have found the Ogg format to be a dubious choice in just about every situation. Why then do certain organisations and individuals persist in promoting it with such ferocity?

    When challenged, three types of reaction are characteristic of the Ogg campaigners.

    On occasion, these people will assume an apologetic tone, explaining how Ogg was only ever designed for simple audio-only streams (ignoring that it is as bad for these as for anything), and this is no doubt true. Why then, I ask again, do they continue to tout Ogg as the one-size-fits-all solution they have already admitted it is not?

    More commonly, the Ogg proponents will respond with hand-waving arguments best summarised as Ogg isn’t bad, it’s just different. My reply to this assertion is twofold:

    • Being too different is bad. We live in a world where multimedia files come in many varieties, and a decent media player will need to handle the majority of them. Fortunately, most multimedia file formats share some basic traits, and they can easily be processed in the same general framework, the specifics being taken care of at the input stage. A format deviating too far from the standard model becomes problematic.
    • Ogg is bad. When every angle of examination reveals serious flaws, bad is the only fitting description.

    The third reaction bypasses all technical analysis : Ogg is patent-free, a claim I am not qualified to directly discuss. Assuming it is true, it still does not alter the fact that Ogg is a bad format. Being free from patents does not magically make Ogg a good choice as file format. If all the standard formats are indeed covered by patents, the only proper solution is to design a new, good format which is not, this time hopefully avoiding the old mistakes.