Advanced search

Media (0)

Word: - Tags -/protocols

No media matching your criteria is available on this site.

Other articles (102)

  • MediaSPIP 0.1 Beta version

25 April 2011, by

MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed as "usable".
The zip file provided here only contains the sources of MediaSPIP in its standalone version.
To get a working installation, you must manually install all software dependencies on the server.
If you want to use this archive for an installation in "farm mode", you will also need to proceed with other manual (...)

The farm's regular Cron tasks

1 December 2010, by

Managing the farm involves running several repetitive tasks, known as Cron tasks, at regular intervals.
The super Cron (gestion_mutu_super_cron)
This task, scheduled every minute, simply calls the Cron of all the instances in the shared-hosting farm on a regular basis. Combined with a system Cron on the central site of the farm, this makes it easy to generate regular visits to the various sites and keeps the tasks of rarely visited sites from being too (...)

Emballe Médias: Putting documents online, simply

29 October 2010, by

The emballe médias plugin was developed primarily for the MediaSPIP distribution, but it is also used in other related projects, such as géodiversité. Required and compatible plugins
To work, this plugin requires that other plugins be installed: CFG, Saisies, SPIP Bonux, Diogène, swfupload, jqueryui
Other plugins can be used alongside it to improve its capabilities: Ancres douces, Légendes, photo_infos, spipmotion (...)

On other sites (13774)

  • Adventures In NAS

1 January, by Multimedia Mike - General

In my post last year about my out-of-control single-board computer (SBC) collection, which included my meager network attached storage (NAS) solution, I noted that:

    I find that a lot of my fellow nerds massively overengineer their homelab NAS setups. I’ll explore this in a future post. For my part, people tend to find my homelab NAS solution slightly underengineered.

So here I am, exploring this in a future post. I’ve been in the home NAS game a long time, but have never had very elaborate solutions for such. For my part, I tend to take an obsessively reductionist view of what constitutes a NAS: any small computer with a pool of storage and a network connection, running the Linux operating system and the Samba file sharing service.


    Simple hard drive and ethernet cable

Many home users prefer to buy turnkey boxes, usually ones that allow you to install hard drives yourself and then configure the box and its services with a friendly UI. My fellow weird computer nerds often buy cast-off enterprise hardware and set up more resilient, over-engineered solutions, as long as they have strategies to mitigate the noise and dissipate the heat, and don’t mind the electricity bills.

If it works, awesome! As an old hand at this, I am rather stuck in my ways, however, preferring to do my own stunts, both with the hardware and software solutions.

    My History With Home NAS Setups
In 1998, I bought myself a new computer — beige box tower PC, as was the style at the time. This was when normal people only had one computer at most. It ran Windows, but I was curious about this new thing called “Linux” and learned to dual boot that. Later that year, it dawned on me that nothing prevented me from buying a second ugly beige box PC and running Linux exclusively on it. Further, it could be a headless Linux box, connected by ethernet, and I could consolidate files into a single place using this file sharing software named Samba.

I remember it being fairly onerous to get Samba working in those days, and the internet was not nearly as helpful back then. I recall that the thing that blocked me for a while was needing to know that I had to specify an entry for the Samba server machine in the LMHOSTS (Lanman hosts) file on the Windows 95 machine.

    However, after I cracked that code, I have pretty much always had some kind of ad-hoc home NAS setup, often combined with a headless Linux development box.

In the early 2000s, I built a new beige box PC for a file server, with a new hard disk, and a coworker tutored me on setting up a (P)ATA UDMA 133 (or was it 150? Anyway, it was (P)ATA’s last hurrah before SATA conquered all) expansion card, and I remember profiling that the attached hard drive worked at a full 21 MBytes/s reading. It was pretty slick. Except I hadn’t really thought things through. You see, I had a hand-me-down ethernet hub, cast off from my job at the time, which I wanted to use. It was a 100 Mbps repeater hub, not a switch, so the catch was that all connected machines had to be capable of 100 Mbps. So, after getting all of my machines (3 at the time) upgraded to support 10/100 ethernet (the old off-brand PowerPC running Linux was the biggest challenge), I profiled transfers and realized that the best this repeater hub could achieve was about 3.6 MBytes/s. For a long time after that, I just assumed that was the upper limit of what a 100 Mbps network could achieve. Obviously, I now know that the upper limit ought to be around 11.2 MBytes/s, and if I had gamed out that fact in advance, I would have realized it didn’t make sense to care about super-fast (for the time) disk performance.

At this time, I was doing a lot of development for MPlayer/xine/FFmpeg. I stored all of my multimedia material on this NAS. I remember being confused when I was working with Y4M data, which is raw frames, which is lots of data. xine, which employed a pre-buffering strategy, would play fine for a few seconds and then stutter. Eventually, I reasoned out that the files I was working with had a data rate about twice what my awful repeater hub supported, which is probably the first time I came to really understand and respect streaming speeds and their implications for multimedia playback.

    Smaller Solutions
    For a period, I didn’t have a NAS. Then I got an Apple AirPort Extreme, which I noticed had a USB port. So I bought a dual drive brick to plug into it and used that for a time. Later (2009), I had this thing called the MSI Wind Nettop which is the only PC I’ve ever seen that can use a CompactFlash (CF) card for a boot drive. So I did just that, and installed a large drive so it could function as a NAS, as well as a headless dev box. I’m still amazed at what a low-power I/O beast this thing is, at least when compared to all the ARM SoCs I have tried in the intervening 1.5 decades. I’ve had spinning hard drives in this thing that could read at 160 MBytes/s (‘dd’ method) and have no trouble saturating the gigabit link at 112 MBytes/s, all with its early Intel Atom CPU.

    Around 2015, I wanted a more capable headless dev box and discovered Intel’s line of NUCs. I got one of the fat models that can hold a conventional 2.5″ spinning drive in addition to the M.2 SATA SSD and I was off and running. That served me fine for a few years, until I got into the ARM SBC scene. One major limitation here is that 2.5″ drives aren’t available in nearly the capacities that make a NAS solution attractive.

    Current Solution
My current NAS solution, chronicled in my last SBC post, is the ODroid-HC2, a highly compact ARM SoC with an integrated USB3-SATA bridge so that a SATA drive can be connected directly to it:


ODROID-HC2 NAS


I tend to be weirdly proficient at recalling dates, so I’m surprised that I can’t recall when I ordered this and put it into service. But I’m pretty sure it was circa 2018. It’s only equipped with an 8 TB drive now, but I seem to recall that it started out with only a 4 TB drive. I think I upgraded to the 8 TB drive early in the pandemic in 2020, when ISPs were implementing temporary data cap amnesty and I was doing what an r/DataHoarder does.

The HC2 has served me well, even though it has a number of shortcomings for a hardware set chartered for NAS:

1. While it has a gigabit ethernet port, it’s documented that it never really exceeds about 70 MBytes/s, due to the SoC’s limitations
2. The specific ARM chip (Samsung Exynos 5422; more than a decade old as of this writing) lacks cryptography instructions, slowing down encryption if that’s your thing (e.g., LUKS)
3. While the SoC supports USB3, that block is tied up for the SATA interface; the remaining USB port is only capable of USB2 speeds
4. 32-bit ARM, which prevented me from running certain bits of software I wanted to try (like Minio)
5. Only 1 drive, so no possibility for RAID (again, if that’s your thing)

I also love to brag on the HC2’s power usage: I once profiled the unit with a Kill-A-Watt for a month of normal usage (with the drive spinning only when in active use), and it consumed 4.5 kWh… in an entire month.

    New Solution
Enter the ODroid-HC4 (I purchased mine from Ameridroid, but Hardkernel works with numerous distributors):


ODroid-HC4 with an SSD and a conventional drive


I ordered this earlier in the year and, after many months of procrastinating and obsessing over the best approach to take with its general usage, I finally have it in service as my new NAS. Comparing point by point with the HC2:

1. The gigabit ethernet runs at full speed (though a few things on my network run at 2.5 GbE now, so I guess I’ll always be behind)
2. The ARM chip (Amlogic S905X3) has AES cryptography acceleration and handles all the LUKS stuff without breaking a sweat; “cryptsetup benchmark” reports between 500 and 600 MBytes/s on all the AES variants
3. The USB port is still only USB2, so no improvement there
4. 64-bit ARM, which means I can run Minio to simulate block storage in a local dev environment for some larger projects I would like to undertake
5. Supports 2 drives, if RAID is your thing

    How I Set It Up
How to set up the drive configuration? As should be apparent from the photo above, I elected for an SSD (500 GB) for speed, paired with a conventional spinning HDD (18 TB) for sheer capacity. I’m not particularly trusting of RAID. I’ve watched it fail too many times, on systems that I don’t even manage, not to mention the aforementioned RAID brick that I had attached to the Apple AirPort Extreme.

    I had long been planning to use bcache, the block caching interface for Linux, which can use the SSD as a speedy cache in front of the more capacious disk. There is also LVM cache, which is supposed to achieve something similar. And then I had to evaluate the trade-offs in whether I wanted write-back, write-through, or write-around configurations.
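For reference, had the caching route proven worthwhile, a minimal bcache setup might look roughly like the sketch below. This is not from the post: it assumes the bcache-tools package, the device names are placeholders, the commands destroy existing data on those devices, and the cache_mode line is where the write-back / write-through / write-around decision gets made.

    # Sketch only: /dev/sda (SSD) and /dev/sdb (HDD) are placeholder devices
    make-bcache -C /dev/sda                  # register the SSD as the cache device
    make-bcache -B /dev/sdb                  # register the big HDD as the backing device

    # Attach the cache set to the new bcache device
    CSET=$(bcache-super-show /dev/sda | awk '/cset.uuid/ {print $2}')
    echo "$CSET" > /sys/block/bcache0/bcache/attach

    # Pick a caching policy: writethrough, writeback, or writearound
    echo writethrough > /sys/block/bcache0/bcache/cache_mode

    mkfs.ext4 /dev/bcache0                   # then format and mount as usual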

    This was all predicated on the assumption that the spinning drive would not be able to saturate the gigabit connection. When I got around to setting up the hardware and trying some basic tests, I found that the conventional HDD had no trouble keeping up with the gigabit data rate, both reading and writing, somewhat obviating the need for SSD acceleration using any elaborate caching mechanisms.

Maybe that’s because I sprang for the WD Red Pro series this time, rather than the Red Plus? I’m guessing that conventional drives do deteriorate over the years. I’ll find out.

For the operating system, I stuck with my newest favorite Linux distro: DietPi. While HardKernel (parent of ODroid) makes images for the HC units, I had also used DietPi on the HC2 for the past few years, as it tends to stay more up to date.

    Then I rsync’d my data from HC2 -> HC4. It was only about 6.5 TB of total data but it took days as this WD Red Plus drive is only capable of reading at around 10 MBytes/s these days. Painful.
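For what it’s worth, that kind of transfer boils down to a single long-running command along these lines (the hostname and paths here are placeholders, not the actual layout from the post):

    # Pull everything from the old NAS to the new one; -a preserves attributes,
    # -v is verbose, -P shows progress and lets an interrupted copy resume
    rsync -avP mike@hc2-nas:/mnt/nas-main/ /mnt/hdd/nas-main/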

For file sharing, I’m pretty sure most normal folks have nice web UIs in their NAS boxes which allow them to easily configure and monitor the shares. I know there are such applications I could set up. But I’ve been doing this so long, I just do a bare bones setup through the terminal. I installed regular Samba and then brought over my smb.conf file from the HC2. One by one, I tested that each of the old shares was activated on the new NAS and deactivated on the old NAS. I also set up a new share for the SSD. I guess that will just serve as a fast I/O scratch space on the NAS.
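As a rough sketch of what adding that SSD share involves, a minimal Samba share definition could be appended like this (the share name, path and user below are hypothetical, not taken from the post’s smb.conf):

    # Append a minimal share definition to the Samba config, then reload
    {
      echo '[ssd-scratch]'
      echo '   path = /mnt/ssd/scratch'
      echo '   read only = no'
      echo '   valid users = mike'
    } >> /etc/samba/smb.conf

    testparm -s                  # sanity-check smb.conf syntax
    systemctl restart smbd       # pick up the new share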

    The conventional drive spins up and down. That’s annoying when I’m actively working on something but manage not to hit the drive for like 5 minutes and then an application blocks while the drive wakes up. I suppose I could set it up so that it is always running. However, I micro-manage this with a custom bash script I wrote a long time ago which logs into the NAS and runs the “date” command every 2 minutes, appending the output to a file. As a bonus, it also prints data rate up/down stats every 5 seconds. The spinning file (“nas-main/zz-keep-spinning/keep-spinning.txt”) has never been cleared and has nearly a quarter million lines. I suppose that implies that it has kept the drive spinning for 1/2 million minutes which works out to around 347 total days. I should compare that against the drive’s SMART stats, if I can remember how. The earliest timestamp in the file is from March 2018, so I know the HC2 NAS has been in service at least that long.
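The post doesn’t include the script itself, but the core of that keep-spinning idea is just a loop along these lines (the host name, path, and interval are placeholders, and the data-rate stats are omitted):

    #!/bin/bash
    # Sketch: touch a file on the NAS every 2 minutes so the drive never idles out
    NAS_HOST="nas"
    SPIN_FILE="/mnt/nas-main/zz-keep-spinning/keep-spinning.txt"

    while true; do
        ssh "$NAS_HOST" "date >> $SPIN_FILE"
        sleep 120
    done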

    For tasks, vintage cron still does everything I could need. In this case, that means reaching out to websites (like this one) and automatically backing up static files.
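As a sketch, one of those jobs could be as simple as a single crontab line (the URL and backup path here are hypothetical):

    # Mirror a site's static files every night at 03:30
    30 3 * * *  wget --mirror --no-parent --quiet -P /mnt/hdd/backups https://example.com/blog/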

I also have to have a special script for starting up. Fortunately, I was able to bring this over from the HC2 and tweak it. The data disks (though not the boot disk) are encrypted. Those need to be unlocked, and only then is it safe for the Samba and Minio services to start up. So one script does all that heavy lifting in the rare case of a reboot (this is the type of system that’s well worth having on a reliable UPS).
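The general shape of such a bring-up script might be something like the following sketch (device names, mapper names, mount points and the minio service name are all placeholders, not the author’s actual script):

    #!/bin/bash
    # Post-reboot bring-up: unlock the encrypted data disks, mount them,
    # and only then start the services that depend on them
    set -e

    cryptsetup open /dev/sda1 nas-ssd      # prompts for the LUKS passphrase
    cryptsetup open /dev/sdb1 nas-hdd

    mount /dev/mapper/nas-ssd /mnt/ssd
    mount /dev/mapper/nas-hdd /mnt/hdd

    systemctl start smbd nmbd
    systemctl start minio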

    Further Work
    I need to figure out how to use the OLED display on the NAS, and how to make it show something more useful than the current time and date, which is what it does in its default configuration with HardKernel’s own Linux distro. With DietPi, it does nothing by default. I’m thinking it should be able to show the percent usage of each of the 2 drives, at a minimum.

    I also need to establish a more responsible backup regimen. I’m way too lazy about this. Fortunately, I reason that I can keep the original HC2 in service, repurposed to accept backups from the main NAS. Again, I’m sort of micro-managing this since a huge amount of data isn’t worth backing up (remember the whole DataHoarder bit), but the most important stuff will be shipped off.

    The post Adventures In NAS first appeared on Breaking Eggs And Making Omelettes.

CCPA vs GDPR: Understanding Their Impact on Data Analytics

19 March, by Alex Carmona

    With over 400 million internet users in Europe and 331 million in the US (11% of which reside in California alone), understanding the nuances of privacy laws like the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) is crucial for compliant and ethical consumer data collection.

    Navigating this compliance landscape can be challenging for businesses serving European and Californian markets.

    This guide explores the key differences between CCPA and GDPR, their impact on data analytics, and how to ensure your business meets these essential privacy requirements.

What is the California Consumer Privacy Act (CCPA)?

    The California Consumer Privacy Act (CCPA) is a data privacy law that gives California consumers control over their personal information. It applies to for-profit businesses operating in California that meet specific criteria related to revenue, data collection and sales.

    Origins and purpose

    The CCPA addresses growing concerns about data privacy and how businesses use personal information in California. The act passed in 2018 and went into effect on 1 January 2020.

    Key features

    • Grants consumers the right to know what personal information is collected
    • Provides the right to delete personal information
    • Allows consumers to opt out of the sale of their personal information
    • Prohibits discrimination against consumers who exercise their CCPA rights

    Key definitions under the CCPA framework

• Business: A for-profit entity doing business in California and meeting one or more of these conditions:
  • Has annual gross revenues over $25 million;
  • Buys, receives, sells or shares 50,000 or more consumers’ personal information; or
  • Derives 50% or more of its annual revenues from selling consumers’ personal information
• Consumer: A natural person who is a California resident
• Personal Information: Information that could be linked to, related to or used to identify a consumer or household, such as online identifiers, IP addresses, email addresses, social security numbers, cookie identifiers and more

What is the General Data Protection Regulation (GDPR)?

    The General Data Protection Regulation (GDPR) is a data privacy and protection law passed by the European Union (EU). It’s one of the strongest and most influential data privacy laws worldwide and applies to all organisations that process the personal data of individuals in the EU.

    Origins and purpose

    The GDPR was passed in 2016 and went into effect on 25 May 2018. It aims to harmonise data privacy laws in Europe and give people in the European Economic Area (EEA) privacy rights and control over their data.

    Key features

    • Applies to all organisations that process the personal data of individuals in the EEA
    • Grants individuals a wide range of privacy rights over their data
    • Requires organisations to obtain explicit and informed consent for most data processing
    • Mandates appropriate security measures to protect personal data
    • Imposes significant fines and penalties for non-compliance

    Key definitions under the GDPR framework

• Data Subject: An identified or identifiable person
• Personal Data: Any information relating to a data subject
• Data Controller: The entity or organisation that determines how and why personal data is processed
• Data Processor: The entity or organisation that processes the data on behalf of the controller

CCPA vs. GDPR: Key similarities

    The CCPA and GDPR enhance consumer privacy rights and give individuals greater control over their data.

Dimension | CCPA | GDPR
Purpose | Protect consumer privacy | Protect individual data rights
Key Rights | Right to access, delete and opt out of sale | Right to access, rectify, erase and restrict processing
Transparency | Requires transparency around data collection and use | Requires transparency about data collection, processing and use

CCPA vs. GDPR: Key differences

    While they have similar purposes, the CCPA and GDPR differ significantly in their scope, approach and specific requirements.

Dimension | CCPA | GDPR
Scope | For-profit businesses only | All organisations processing EU consumer data
Territorial Reach | California-based natural persons | All data subjects within the EEA
Consent | Opt-out system | Opt-in system
Penalties | Per violation, based on its intentional or negligent nature | Case-by-case, based on comprehensive assessment
Individual Rights | Narrower (relative to GDPR) | Broader (relative to CCPA)

CCPA vs. GDPR: A multi-dimensional comparison

    The previous sections gave a broad overview of the similarities and differences between CCPA and GDPR. Let’s now examine nine key dimensions where these regulations converge or diverge and discuss their impact on data analytics.

    Regulatory overlap between GDPR and CCPA.

    #1. Scope and territorial reach

    The GDPR has a much broader scope than the CCPA. It applies to all organisations that process the personal data of individuals in the EEA, regardless of their business model, purpose or physical location.

    The CCPA applies to medium and large for-profit businesses that derive a substantial portion of their earnings from selling Californian consumers’ personal information. It doesn’t apply to non-profits, government agencies or smaller for-profit companies.

    Impact on data analytics

The difference in scope significantly impacts data analytics practices. Smaller businesses may not need to comply with either regulation, some may only need to follow the CCPA, and most global businesses must comply with both. This often requires different methods for collecting and processing data in California, Europe and elsewhere.

    #2. Penalties and fines for non-compliance

Both the CCPA and GDPR impose penalties for non-compliance, but the severity of fines differs significantly:

CCPA maximum penalty:
• $2,500 per unintentional violation
• $7,500 per intentional violation

    “Per violation” means per violation per impacted consumer. For example, three intentional CCPA violations affecting 1,000 consumers would result in 3,000 total violations and a $22.5 million maximum penalty (3,000 × $7,500).

    The largest CCPA fine to date was Zoom’s $85 million settlement in 2021.

    In contrast, the GDPR has resulted in 2,248 fines totalling almost €6.6 billion since 2018 — €2.4 billion of which were for non-compliance.

GDPR maximum penalty:
• €20 million, or
• 4% of all revenue earned the previous year

    So far, the biggest fine imposed under the GDPR was Meta’s €1.2 billion fine in May 2023 — 15 times more than Zoom had to pay California.

    Impact on data analytics

    The significant difference in potential fines demonstrates the importance of regulatory compliance for data analytics professionals. Non-compliance can have severe financial consequences, directly affecting budget allocation and business operations.

    Businesses must ensure their data collection, storage and processing practices comply with regulations in both Europe and California.

    Choosing privacy-first, compliance-ready analytics platforms like Matomo is instrumental for mitigating non-compliance risks.

    #3. Data subject rights and consumer rights

    The CCPA and GDPR give people similar rights over their data, but their limitations and details differ.

    Rights common to the CCPA and GDPR

• Right to Access/Know: People can access their personal information and learn what data is collected, its source, its purpose and how it’s shared
• Right to Delete/Erasure: People can request the deletion of their personal information, with some exceptions
• Right to Non-Discrimination: Businesses can’t discriminate against people who exercise their privacy rights

    Consumer rights unique to the CCPA

• Right to Opt Out of Sale: Consumers can prohibit the sale of their personal information
• Right to Notice: Businesses must inform consumers about data collection practices
• Right to Disclosure: Consumers can request specific information collected about them

    Data subject rights unique to the GDPR

• Right to be Informed: Broader transparency requirements encompass data retention, automated decision-making and international transfers
• Right to Rectification: Data subjects may request the correction of inaccurate data
• Right to Restrict Processing: Consumers may limit data use in certain situations
• Right to Data Portability: Businesses must provide individual consumer data in a secure, portable format when requested
• Right to Withdraw Consent: Consumers may withdraw previously granted consent to data processing
Right | CCPA | GDPR
Right to Access or Know | ✓ | ✓
Right to Delete or Erase | ✓ | ✓
Right to Non-Discrimination | ✓ | ✓
Right to Opt-Out | ✓ | –
Right to Notice | ✓ | –
Right to Disclosure | ✓ | –
Right to be Informed | – | ✓
Right to Rectification | – | ✓
Right to Restrict Processing | – | ✓
Right to Data Portability | – | ✓
Right to Withdraw Consent | – | ✓

    Impact on data analytics

    Data analysts must understand these rights and ensure compliance with both regulations, which could potentially require separate data handling processes for EU and California consumers.

    #4. Opt-out vs. opt-in

    The CCPA generally follows an opt-out model, while the GDPR requires explicit consent from individuals before processing their data.

    Impact on data analytics

    For CCPA compliance, businesses can collect data by default if they provide opt-out mechanisms. Failing to process opt-out requests can result in severe penalties, like Sephora’s $1.2 million fine.

    Under GDPR, organisations must obtain explicit consent before collecting any data, which can limit the amount of data available for analysis.

    #5. Parental consent

    The CCPA and GDPR have provisions regarding parental consent for processing children’s data. The CCPA requires parental consent for children under 13, while the GDPR sets the age at 16, though member states can lower it to 13.

    Impact on data analytics

    This requirement significantly impacts businesses targeting younger audiences. In Europe and the US, companies must implement different methods to verify users’ ages and obtain parental consent when necessary.

    The California Attorney General’s Office recently fined Tilting Point Media LLC $500,000 for sharing children’s data without parental consent.

    #6. Data security requirements

    Both regulations require businesses to implement adequate security measures to protect personal data. However, the GDPR has more prescriptive requirements, outlining specific security measures and emphasising a risk-based approach.

    Impact on data analytics

    Data analytics professionals must ensure that data is processed and stored securely to avoid breaches and potential fines.

    #7. International data transfers

    Both the CCPA and GDPR address international data transfers. Under the CCPA, businesses must only inform consumers about international transfers. The GDPR has stricter requirements, including ensuring adequate data protection safeguards for transfers outside the EEA.


    Other rules, like the Payment Services Directive 2 (PSD2), also affect international data transfers, especially in the financial industry.

    PSD2 requires strong customer authentication and secure communication channels for payment services. This adds complexity to cross-border data flows.

    Impact on data analytics

    The primary impact is on businesses serving European residents from outside Europe. Processing data within the European Union is typically advisable. Meta’s record-breaking €1.2 billion fine was specifically for transferring data from the EEA to the US without sufficient safeguards.

    Choosing the right analytics platform helps avoid these issues.

    For example, Matomo offers a free, open-source, self-hosted analytics platform you can deploy anywhere. You can also choose a managed, GDPR-compliant cloud analytics solution with all data storage and processing servers within the EU (in Germany), ensuring your data never leaves the EEA.

    #8. Enforcement mechanisms

    The California Attorney General is responsible for enforcing CCPA requirements, while in Europe, the Data Protection Authority (DPA) in each EU member state enforces GDPR requirements.

    Impact on data analytics

    Data analytics professionals should be familiar with their respective enforcement bodies and their powers to support compliance efforts and minimise the risk of fines and penalties.

    #9. Legal basis for personal data processing

The GDPR outlines six legal grounds for processing personal data:

    • Consent
    • Contract
    • Legal obligation
    • Vital interests
    • Public task
    • Legitimate interests

    The CCPA doesn’t explicitly define lawful bases but focuses on consumer rights and transparency in general.

    Impact on data analytics

    Businesses subject to the GDPR must identify and document a valid lawful basis for each processing activity.

    Compliance rules under CCPA and GDPR

Complying with the CCPA and GDPR requires a comprehensive approach to data privacy. Here’s a summary of the essential compliance rules for each framework:

    Key compliance points under CCPA and GDPR.

    CCPA compliance rules

    • Create clear and concise privacy policies outlining data collection and use practices
• Give consumers the right to opt out
    • Respond to consumer requests to access, delete and correct their personal information
    • Implement reasonable security measures for consumers’ personal data protection
    • Never discriminate against consumers who exercise their CCPA rights

    GDPR compliance rules

    • Obtain explicit and informed consent for data processing activities
    • Implement technical and organisational controls to safeguard personal data
    • Designate a Data Protection Officer (DPO) if necessary
    • Perform data protection impact assessments (DPIAs) for high-risk processing activities
    • Maintain records of processing activities
    • Promptly report data breaches to supervisory authorities

    Navigating the CCPA and GDPR with confidence

    Understanding the nuances of the CCPA and GDPR is crucial for businesses operating in the US and Europe. These regulations significantly impact data collection and analytics practices.

    Implementing robust data security practices and prioritising privacy and compliance are essential to avoid severe penalties and build trust with today’s privacy-conscious consumers.

    Privacy-centric analytics platforms like Matomo enable businesses to collect, analyse and use data responsibly and transparently, extracting valuable insights while maintaining compliance with both CCPA and GDPR requirements.


Multilingual SEO: A Marketer’s Guide to Measuring and Optimising Multilingual Websites

26 June, by Joe

The web, and search engines in particular, makes it easier than ever for businesses of any size to reach an international audience.

     
    A multilingual website makes sense, especially when the majority of websites are in English. After all, you want to stand out to customers by speaking their local language. But it’s no good having a multilingual site if people can’t find it. 

    That’s where multilingual SEO comes in. 

    In this article, we’ll show you how to build a multilingual website that ranks in Google and other local search engines. You’ll learn why multilingual SEO is about more than translating your content and specific tasks you need to tick off to make your multilingual site as visible as possible. 

¡Vamos!

What is multilingual SEO?

    Multilingual SEO is the process of optimising your website to improve search visibility in more than one language. It involves creating high-quality translations (including SEO metadata), targeting language-specific keywords and building links in the target language. 

    A definition of multilingual SEO

    The goal is to make your site as discoverable and accessible as possible for users searching Google and other search engines in their local language. 

    It’s worth pointing out that multilingual SEO differs slightly from international SEO, even if the terms are used interchangeably. With multilingual SEO, you are optimising for a language (so Spanish targets every Spanish-speaking country, not just Spain). In international SEO, you target specific countries, so you might have a different strategy for targeting Argentinian customers vs. Mexican customers. 

Why adopt a multilingual SEO strategy?

There are two major reasons to adopt a multilingual SEO strategy: to reach more customers and to deliver the best experience possible.


    Reach a wider audience

    Not everyone searches the web in English. Even if non-native speakers eventually resort to English, many will try Googling in their own language first. That means if you target customers in multiple non-English-speaking countries, then creating a multilingual SEO is a must to reach as many of them as possible. 

    A multilingual SEO strategy also boosts your website’s chances of appearing in country-specific search engines like Baidu and Yandex — and in localised versions of Google like Google.fr and Google.de.

    Deliver a better user experience

Multilingual SEO gives your customers what they want: the ability to search, browse and shop in their native language. This is a big deal, with 89% of consumers saying it’s important to deal with a brand in their own language.

    Improving the user experience also increases the likelihood of non-English-speaking customers converting. As many as 82% of people won’t make a purchase in major consumer categories without local language support. 

    How to prepare for multilingual SEO success

    Before you start creating multilingual SEO content, you need to take care of a couple of things. 

    Identify target markets

    The first step is to identify the languages you want to target. You know your customers better than anyone, so it’s likely you have one or two languages in mind already. 

But if you don’t, why not analyse your existing website traffic to discover which languages to target first? The Locations report in Matomo (found in the Visitors section of Matomo’s navigation) shows you which countries your visitors hail from.

    A screenshot of Matomo's Location Report

    In the example above, targeting German and Indonesian searchers would be a sensible strategy. 

    Target local keywords

    Once you’ve decided on your target markets, it’s time to find localised keywords. Keywords are the backbone of any SEO campaign, so take your time to find ones that are specific to your local markets.

Yes, that means you shouldn’t just translate your English keywords into French or Spanish! French or Spanish searchers may use completely different terms to find your products or services.

    That’s why it’s vital to use a tool like Ahrefs or Semrush to do multilingual keyword research. 


    This may be a bit tricky if you aren’t a native speaker of your target language, but you can translate your English keywords using Google Translate to get started. 

    Remember, search volumes won’t be as high as English keywords since fewer people are searching for them. So don’t be scared off by small keyword volumes. Besides, even in the U.S. around 95% of keywords get 10 searches per month or fewer. 

    Choose your URL structure

    The final step in preparing your multilingual SEO strategy is deciding on your URL structure, whether that’s using separate domains, subdomains or subfolders. 

    This is important for SEO as it will avoid duplicate content issues. Using language indicators within these URLs will also help both users and search engines differentiate versions of your site. 

    The first option is to have a separate domain for each target language. 

    • yoursite.com
    • yoursite.fr
    • yoursite.es

    Using subdomains would mean you keep one domain but have completely separate sites :

    • fr.yoursite.com
    • es.yoursite.com
    • de.yoursite.com

    Using subfolders keeps everything clean but can result in long URLs :

    • yoursite.com/en
    • yoursite.com/de
    • yoursite.com/es

As you can see in the image below, we use subdomains to separate multilingual versions of our site:

    A browser showing a language-specific URL structure

    While separate domains provide more precise targeting, it’s a lot of work to manage them. So, unless you have a keyword-rich, unbranded domain name that needs translating, we’d recommend using either subdomains or subdirectories. It’s slightly easier to manage subfolders, but subdomains offer users a clearer divide between different versions of your site. 

    If you want to make your site even easier to navigate, then you can incorporate language indicators into your page’s design to make it easy for consumers to switch languages. These are the little dropdown menus you see containing various flags that let users browse in different languages.

    5 multilingual SEO strategies to use in 2024

    Now you’ve got the basics in order, use the following SEO strategies to improve your multilingual rankings. 

    Use hreflang tags

There’s another way that Google and other search engines determine the language and region your website is targeting: hreflang.

    Hreflang is an HTML attribute that Google and other search engines use to ensure they serve users the right version of the page.

You can insert it into the header section of the page, as in this example for the German version of a site:

<link rel="alternate" href="https://yourwebsite.com/de" hreflang="de" />

Or you can add the relevant markup to your website’s sitemap. Here’s what the same German markup would look like:

<xhtml:link rel="alternate" hreflang="de" href="https://yourwebsite.com/de/" />

Whichever method you choose, include one language code in ISO 639-1 format. You can also include a region code in ISO 3166-1 Alpha 2 format, and you can list multiple region-specific alternates. A web page in German, for example, could target both German and Austrian consumers with hreflang="de-DE" and hreflang="de-AT".

    Hreflang tags also avoid duplicate content issues. 

    With a multilingual site, you could have a dozen different versions of the same page, showing the same content but in a different language. Without an hreflang tag specifying that these are different versions of the same page, Google may penalise your site.

    Invest in high-quality translations

    Google rewards good content. And, while you’d hope Google Translate would be good enough, it usually isn’t.

    Instead, make sure you are using professional linguists to translate your content. They won’t only be able to produce accurate and contextually relevant translations — the kind that Google may reward with higher rankings — but they’ll also be able to account for cultural differences between languages. 

    Imagine you are translating a web page from U.S. English into Italian, for example. You’ve not only got to translate the words themselves but also the measurements (from inches to cm), dates (from mm/dd/yy to dd/mm/yy), currencies, idioms and more. 

    Translate your metadata, too

    You need to translate more than just the content of your website. You should translate its metadata — the descriptive information search engines use to understand your page — to help you rank better in Google and localised search engines. 

As you can see in the image below, we’ve translated the French version of our homepage’s title and meta description:

    Matomo's meta data translated into French

Page titles and meta descriptions aren’t the only pieces of metadata you need to pay attention to. Make sure you translate the following:

    • URLs
    • Image alt tags
    • Canonical tags
    • Structured data markup

    While you’re at it, make sure you have translated all of your website’s content, too. It’s easy to miss error messages, contact forms and checkout pages that would otherwise ruin the user experience. 

    Build multilingual backlinks

Building backlinks is an important step in any SEO strategy. But it’s doubly important in multilingual SEO, where links in your target language also help Google understand that you have a translated website.

    While you want to prioritise links from websites in your target language, make sure that websites are relevant to your niche. It’s no good having a link from a Spanish recipe blog if you have a marketing SaaS tool. 

    A great place to start is by mining the links of competitors in your target market. Your competitors have already done the hard work acquiring these links, and there’s every chance these websites will link to your translated content, too.

    Search competitor backlinks for multilingual link opportunities

    Don’t forget about internal linking pages in the same language, either. This will obviously help users stay in the same language while navigating your site, but it will also show Google the depth of your multilingual content.

    Monitor the SEO health of your multilingual site

    The technical performance of your multilingual pages has a significant impact on your ability to rank and convert. 

We know for a fact that Google uses page performance metrics, in the form of Core Web Vitals, as a search ranking factor. What’s more, research by WP Rocket finds that a site loading in one second has a conversion rate three times better than a site loading in five seconds.

With that in mind, make sure your site is performing at optimal levels using Matomo’s SEO Web Vitals report. Our SEO Web Vitals feature tracks all of Google’s Core Web Vitals, including:

• Page Speed Score
• First Contentful Paint (FCP)
• First Input Delay (FID)
• Largest Contentful Paint (LCP)
• Cumulative Layout Shift (CLS)

    The report displays each metric in a different colour depending on your site’s performance, with green meaning good, orange meaning average, and red meaning poor.

    Matomo's SEO Web Vitals Report

    Check in on these metrics regularly or set up custom alerts to automatically notify you when a specific metric drops below or exceeds a certain threshold — like if your Page Speed score falls below 50, for example. 

    How to track your multilingual SEO efforts with Matomo

Matomo isn’t just a great tool to track your site’s SEO health; you can also use our privacy-focused analytics platform to track your multilingual SEO success.


    If you want to analyse the performance of your new language, for example, you can segment traffic by URL. In our case, we use the segment “Page URL contains fr.matomo.org” to measure the impact of our French website. 

    We can also track the performance of every language except French by using the segment “Page URL does not contain fr.matomo.org”.

You can use Matomo to track your keyword performance, too. Unlike search engine-owned platforms like Google Analytics and Google Search Console, which no longer share keyword data, Matomo lets you see exactly which keywords users search to find your site in the Combined keywords report:

    Matomo's Combined Keywords Report

    This is valuable information you can use to identify new keyword opportunities and improve your multilingual content strategy. 

For example, you could use the report to focus your multilingual SEO efforts on a single language if searches in that language are starting to rival English. Or you could decide to translate your most-trafficked English keywords into your target languages, regardless of whether a tool like Ahrefs or Semrush says those keywords get searches.

    For international brands that have separate websites and apps for each target language or region, Matomo’s Roll-Up Reporting lets you keep track of aggregate data in one place. 

    A diagram that shows how Roll-up reporting works

Roll-Up Reporting lets you view data from multiple websites and apps as if they were a single site. This lets you quickly answer questions like:

• How many visits happened across all of my multilingual websites?
• Which languages contributed the most conversions?
• How does the performance of my Spanish app compare to my Spanish website?

    Is it any wonder, then, that Matomo is used by over one million sites in 190 countries to track their web and SEO performance in a privacy-friendly way ?

    Join them today by trying Matomo free for 21 days, no credit card required. Alternatively, request a demo to see how Matomo can help you track your multilingual SEO efforts.