Advanced search

Media (0)


No media matching your criteria is available on this site.

Other articles (6)

  • Videos

    21 April 2011, by

    Like "audio" documents, MediaSPIP displays videos wherever possible using the HTML5 video tag.
    One drawback of this tag is that it is not recognised correctly by some browsers (Internet Explorer, not to name names), and each browser natively supports only certain video formats.
    Its main advantage is that video plays natively in the browser, which removes the need for Flash and (...)

  • Publishing on MédiaSpip

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your installed MédiaSpip is version 0.2 or later. If needed, contact the administrator of your MédiaSpip to find out.

  • Custom menus

    14 November 2010, by

    MediaSPIP uses the Menus plugin to manage several configurable navigation menus.
    This lets channel administrators configure these menus in detail.
    Menus created when the site is initialised
    By default, three menus are created automatically when the site is initialised: the main menu; identifier: barrenav; this menu is usually inserted at the top of the page after the header block, and its identifier makes it compatible with Zpip-based templates; (...)

On other sites (3465)

  • ffmpeg Command in Docker with Rust Tokio Closes Warp Server Connection (curl 52 Error)

    3 June 2024, by user762345

    I’m encountering an issue where executing an ffmpeg concatenation command through Rust’s Tokio process in a Docker container causes subsequent HTTP requests to fail. The error occurs exclusively after running the ffmpeg command and making immediate requests, resulting in a “curl 52 empty response from server” error with the connection being closed. Notably, this issue does not occur when running the same setup outside of Docker. Additionally, if no HTTP requests are made after the ffmpeg command, the curl 52 error does not occur.

    Here is the verbose curl output of my minimum reproducible example (see below).

    curl -v "http://localhost:3030"
*   Trying 127.0.0.1:3030...
* Connected to localhost (127.0.0.1) port 3030 (#0)
> GET / HTTP/1.1
> Host: localhost:3030
> User-Agent: curl/8.1.2
> Accept: */*
> 
* Empty reply from server
* Closing connection 0
curl: (52) Empty reply from server

    Here are the Docker logs from my minimum reproducible example (see below). The WAV files are concatenated successfully, then the container appears to rebuild.

    [2024-06-03T05:26:58Z INFO  minimal_docker_webserver_post_error] Starting server on 0.0.0.0:3030
[2024-06-03T05:26:58Z INFO  warp::server] Server::run; addr=0.0.0.0:3030
[2024-06-03T05:26:58Z INFO  warp::server] listening on http://0.0.0.0:3030
[2024-06-03T05:27:07Z INFO  minimal_docker_webserver_post_error] WAV files concatenated successfully
[Running 'cargo run']
    Finished `dev` profile [unoptimized + debuginfo] target(s) in 0.06s
     Running `target/debug/minimal_docker_webserver_post_error`
[2024-06-03T05:27:08Z INFO  minimal_docker_webserver_post_error] Starting server on 0.0.0.0:3030
[2024-06-03T05:27:08Z INFO  warp::server] Server::run; addr=0.0.0.0:3030
[2024-06-03T05:27:08Z INFO  warp::server] listening on http://0.0.0.0:3030

    What I have tried:
I tried using different web frameworks (Warp, Actix-web) and request crates (reqwest, ureq). I also tried running the setup outside of Docker, which worked as expected without any issues. Additionally, I tried running the setup in Docker without making any HTTP requests after the ffmpeg command, and the connection closed successfully without errors. I also tried posting to httpbin with a minimal request, but the issue persisted.

    Minimum reproducible example:

    main.rs

use warp::Filter;
use reqwest::Client;
use std::convert::Infallible;
use log::{info, error};
use env_logger;
use tokio::process::Command;

#[tokio::main]
async fn main() {
    std::env::set_var("RUST_LOG", "debug");
    env_logger::init();

    let route = warp::path::end()
        .and_then(handle_request);

    info!("Starting server on 0.0.0.0:3030");
    warp::serve(route)
        .run(([0, 0, 0, 0], 3030))
        .await;
}

async fn handle_request() -> Result<impl warp::Reply, Infallible> {
    let client = Client::new();

    // Concatenate the two WAV files ("-y" overwrites combined.wav if it already exists)
    let output = Command::new("ffmpeg")
        .args(&[
            "-y",
            "-i", "concat:/usr/src/minimal_docker_webserver_post_error/file1.wav|/usr/src/minimal_docker_webserver_post_error/file2.wav",
            "-c", "copy",
            "/usr/src/minimal_docker_webserver_post_error/combined.wav"
        ])
        .output()
        .await;

    match output {
        Ok(output) => {
            if output.status.success() {
                info!("WAV files concatenated successfully");
            } else {
                error!("Failed to concatenate WAV files: {:?}", output);
                return Ok(warp::reply::with_status("Failed to concatenate WAV files", warp::http::StatusCode::INTERNAL_SERVER_ERROR));
            }
        },
        Err(e) => {
            error!("Failed to execute ffmpeg: {:?}", e);
            return Ok(warp::reply::with_status("Failed to execute ffmpeg", warp::http::StatusCode::INTERNAL_SERVER_ERROR));
        }
    }

    // ISSUE: Connection closes with curl: (52) Empty reply from server
    match client.get("https://httpbin.org/get").send().await {
        Ok(response) => info!("GET request successful: {:?}", response),
        Err(e) => error!("GET request failed: {:?}", e),
    }

    match client.post("https://httpbin.org/post")
        .body("field1=value1&field2=value2")
        .send().await {
        Ok(response) => info!("POST request successful: {:?}", response),
        Err(e) => error!("POST request failed: {:?}", e),
    }

    Ok(warp::reply::with_status("Request handled", warp::http::StatusCode::OK))
}


    FFmpeg command to generate the two WAV files for concatenation


    ffmpeg -f lavfi -i "sine=frequency=1000:duration=5" file1.wav &amp;&amp; ffmpeg -f lavfi -i "sine=frequency=500:duration=5" file2.wav&#xA;


    Dockerfile


# Use the official Rust image as the base image
FROM rust:latest

# Install cargo-watch
RUN cargo install cargo-watch

# Install ffmpeg
RUN apt-get update && apt-get install -y ffmpeg

# Set the working directory inside the container
WORKDIR /usr/src/minimal_docker_webserver_post_error

# Copy the Cargo.toml and Cargo.lock files
COPY Cargo.toml Cargo.lock ./

# Copy the source code
COPY src ./src

# Copy wav files
COPY file1.wav /usr/src/minimal_docker_webserver_post_error/file1.wav
COPY file2.wav /usr/src/minimal_docker_webserver_post_error/file2.wav

# Install dependencies
RUN cargo build --release

# Expose the port that the application will run on
EXPOSE 3030

# Set the entry point to use cargo-watch
CMD ["cargo", "watch", "-x", "run"]


    Cargo.toml


[package]
name = "minimal_docker_webserver_post_error"
version = "0.1.0"
edition = "2021"

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[dependencies]
warp = "0.3"
reqwest = { version = "0.12.4", features = ["json"] }
tokio = { version = "1", features = ["full"] }
log = "0.4"
env_logger = "0.11.3"


    Making the request to the warp server


    curl -v "http://localhost:3030"&#xA;


  • How to Use Web Analytics to Improve SEO

    5 January 2022, by erin — Analytics Tips

    Everyone wants their website to rank highly in Google — and that’s exactly why the world of SEO is so competitive.

    In order to succeed in such a crowded space, it’s essential to equip yourself with the right tools and processes to ensure your website is maximally optimised for search engines.

    If you’d like to improve your website’s SEO rankings, leveraging web analytics is one of the best places to start. Web analytics provides valuable insights to help you assess performance, user behaviour and optimisation opportunities.

    In this blog, we’ll cover:

    The basics of SEO and web analytics

    Before we discuss how to use web analytics for SEO, let’s start with a quick explanation of both.

    SEO (Search Engine Optimisation) encompasses a broad set of activities aimed at increasing a website’s position in search engine results pages (SERPs). When a user enters a query (e.g. ‘marketing agencies in Dallas’) in a search engine, the websites that appear near the top of the page are optimised for search engines and therefore rank for that particular term.

    Web analytics refers to the monitoring/assessment of metrics that track traffic sources and user behaviour on a website. This involves the use of a web analytics tool to collect, aggregate, organise and visualise website data so that meaningful conclusions can be drawn.
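
    To make this concrete, collection typically starts with a small tracking snippet embedded on every page. Below is a minimal TypeScript sketch based on Matomo’s documented JavaScript tracker; the server URL (matomo.example.com) and site ID are placeholders you would replace with your own values.

// Minimal Matomo tracking embed (sketch; placeholder URL and site ID).
declare global { interface Window { _paq: unknown[][]; } }

window._paq = window._paq || [];
window._paq.push(['trackPageView']);      // record a pageview for the current URL
window._paq.push(['enableLinkTracking']); // also track outbound links and downloads

(function () {
  const u = 'https://matomo.example.com/';
  window._paq.push(['setTrackerUrl', u + 'matomo.php']);
  window._paq.push(['setSiteId', '1']);
  const g = document.createElement('script'); // load the tracker script asynchronously
  g.async = true;
  g.src = u + 'matomo.js';
  document.head.appendChild(g);
})();

export {}; // keep this file a module so `declare global` is valid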

    The importance of website analytics for SEO

    SEO revolves around search engine algorithms – a set of rules that dictates a website’s ranking for a given search query (i.e. keyword). The algorithm takes numerous factors into account to determine a particular site’s SERP ranking. So, to achieve strong SEO, your website needs to exhibit qualities that the algorithm deems important. That’s where web analytics comes into play.

    Web analytics allows you to track key metrics and data points that affect how the algorithm ranks your website. For example, how much time do users spend on your site? Which external links are referring traffic to your site? How do your site’s Core Web Vitals stack up?

    Understanding this data will supply you with the insights needed to make positive adjustments, ultimately improving your website’s SEO. 

    How do you analyse a website for SEO?

    The SEO analysis of a website needs to focus on data that is relevant to search engine rankings. When conducting your website SEO analysis, here are some notable metrics and data fields to pay attention to:

    1. Bounce rate and dwell time

    These metrics denote how much time users spend on your website. If users frequently exit your site after only a few seconds, Google may view this as a negative indicator. To reduce bounce rate and increase dwell time, you should work towards making your site’s content more captivating and ensuring that there aren’t any technical issues with your site (e.g. pages taking too long to load or not optimised for mobile).

    Bounce rate and average time on page via Matomo’s Pages report

    2. Broken/dead links

    Perform a technical analysis to scan your website for faulty links. If your site contains broken links that lead to 404 pages, this can detract from your website’s SEO rankings. Redirect those links to a related page or remove them.

    404 errors via Matomo’s Crawling Errors report

    Matomo’s Crawling Errors report can give you instant access to this technical information so you can resolve it before it begins to impact your ranking.

    3. Scroll depth

    Measuring scroll depth (how far users scroll down the page) can help you gauge the quality of your content — and this goes hand-in-hand with bounce rate and dwell time. To assess scroll depth, you can use a Tag Manager to track the specific scroll percentage on your site’s pages.
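
    Matomo’s Tag Manager ships a scroll trigger for this, but a hand-rolled version is easy to sketch. The TypeScript below reports 25/50/75/100% milestones as Matomo events via the documented trackEvent API; it assumes the Matomo tracker (_paq) is already loaded on the page.

// Scroll-depth tracking sketch: push a Matomo event the first time each milestone is reached.
declare global { interface Window { _paq: unknown[][]; } }

const reported = new Set<number>();

window.addEventListener('scroll', () => {
  const scrollable = document.documentElement.scrollHeight - window.innerHeight;
  if (scrollable <= 0) return; // page shorter than the viewport: nothing to measure
  const percent = Math.round((window.scrollY / scrollable) * 100);
  for (const milestone of [25, 50, 75, 100]) {
    if (percent >= milestone && !reported.has(milestone)) {
      reported.add(milestone);
      // trackEvent(category, action, name, value)
      window._paq.push(['trackEvent', 'Engagement', 'Scroll Depth', location.pathname, milestone]);
    }
  }
}, { passive: true });

export {};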

    4. Transitions

    Studying how users transition from page to page within your site can help you understand their behaviour more holistically. Which pages do they tend to gravitate towards? Are there CTAs on your blog that aren’t driving many click-throughs? Optimising user journeys will, in turn, elevate the overall user experience on your site.

    Previous and following actions of visitors on a website’s cart page, via Matomo’s Transitions report

    5. Internal site search

    You can use site search tracking and reporting to learn what your audience is looking for. If you notice a trend (e.g. the majority of searches are for pricing because your pricing page isn’t in the navigation menu), this can inform both site architecture and content planning.

    List of keywords via Matomo’s Site Search Keywords report
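
    If your search results are rendered client-side, you can feed this report yourself with Matomo’s documented trackSiteSearch call. A minimal sketch (assuming the tracker is already loaded; the helper name is hypothetical):

// Report an internal site search to Matomo: keyword, optional category (false = none), result count.
declare global { interface Window { _paq: unknown[][]; } }

function reportSiteSearch(keyword: string, category: string | false, resultCount: number): void {
  window._paq.push(['trackSiteSearch', keyword, category, resultCount]);
}

// e.g. after rendering results for a query typed into the site's search box:
reportSiteSearch('pricing', false, 12);

export {};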

    Ecommerce sites in particular should monitor branded queries, especially with regard to brand misspellings that could be causing users to bounce off the site.

    6. Segments

    Separating your visitors into distinct segments can produce granular insights that paint a more accurate picture.

    For example, perhaps you notice that your bounce rate is far higher on mobile, or with users from the UK. In both cases, this knowledge will provide clarity on where to focus your optimisation efforts (e.g. mobile responsiveness, UK-specific content/landing pages, etc.).

    Matomo’s Site Search Keywords report combined with the Returning Users segment

    7. Acquisition channels

    It’s crucial to analyse where your website traffic is coming from. Among other things, reviewing your acquisition metrics will reveal which external websites are referring the most traffic to your website. 

    Links from external sites (also known as backlinks) are one of the most important ranking factors because this tells Google that your site is reputable and credible. So, you may choose to cultivate a relationship with these sites (or similar sites) by offering guest blogging and other link building initiatives.

    Referral websites via Matomo’s Websites report

    In addition to the above, you should also be monitoring your Core Web Vitals — which leads us to our next section.

    What are Core Web Vitals and why are they important?

    Core Web Vitals are a set of three primary metrics that reflect the general user experience of a website: load time, interactivity and stability.

    1. Load time (LCP) refers to the amount of time it takes for your website’s text and images to load.
    2. Interactivity (FID) refers to the amount of time it takes for user input areas (buttons, form fields, etc.) to become functional.
    3. Stability (CLS) refers to the visual/spatial integrity of your website. If text, images, and other elements tend to suddenly shift position when a user is viewing the site, this will hurt your CLS score.
    Core Web Vitals metrics via Matomo’s SEO Web Vitals report
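
    If you want to sample these metrics directly in the browser, Google’s open-source web-vitals npm package exposes one callback per metric. Below is a sketch against the package’s v3 API (where FID is still reported); forwarding the values to your analytics backend is left out.

// Measure Core Web Vitals with the `web-vitals` npm package (v3 API).
import { onLCP, onFID, onCLS, Metric } from 'web-vitals';

function report(metric: Metric): void {
  // metric.value is milliseconds for LCP/FID and a unitless score for CLS
  console.log(`${metric.name}: ${metric.value}`);
}

onLCP(report); // load time: Largest Contentful Paint
onFID(report); // interactivity: First Input Delay
onCLS(report); // stability: Cumulative Layout Shift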

    So, why are these Core Web Vitals metrics important for SEO? Generally speaking, Google prioritises user experience — and Core Web Vitals affect users’ satisfaction with a website. Furthermore, Google has confirmed that Core Web Vitals are, indeed, a ranking factor.

    Matomo enables you to track Core Web Vitals metrics, which we refer to as SEO Web Vitals.

    How to measure and track keyword performance

    We can’t talk about SEO and analytics without touching on keywords. Keywords (the words and phrases that users type into a search engine) are arguably the most fundamental component of SEO. So, beyond website performance, it’s also necessary to track the keywords your website is ranking for.

    Recall from above that SEO is all about ranking highly on SERPs for certain search queries (i.e. keywords). To assess your Search Engine Keyword Performance, you can use an analytics tool to view Keyword reports for your website. These reports will show you which keywords your site ranks for, the average SERP position your site achieves for each keyword, the amount of traffic you receive from each keyword, and more.

    Top keywords generating traffic via Matomo’s Search Engines & Keywords Performance report

    Digging into your keyword performance can help you identify valuable keyword opportunities and improvement goals.

    For example, upon reviewing your highest-traffic keywords, you may choose to create more blog content around those keywords to bolster your success. Or, perhaps you notice that your average position for a high-intent keyword is quite low. In that case, you could implement a targeted link building campaign to help boost your ranking for that keyword. 

    Final thoughts

    In this article, we’ve discussed the benefits of web analytics — particularly in regards to SEO. When it comes to selecting a web analytics tool, Google Analytics is by far the most popular choice. But that doesn’t make it the best.

    At Matomo, we’re committed to providing a superior alternative to Google Analytics. Matomo is a powerful, open-source web analytics platform that gives you 100% data ownership — protecting both your data and your customers’ privacy.

    Try our live demo or start a free 21-day trial now – no credit card required.

  • Meta Receives a Record GDPR Fine from The Irish Data Protection Commission

    29 May 2023, by Erin — GDPR

    The Irish Data Protection Commission (the DPC) issued a €1.2 billion fine to Meta on 22 May 2023 for violating the General Data Protection Regulation (GDPR).

    The regulator ruled that Meta was unlawfully transferring European users’ data to its US-based servers and failing to take sufficient measures to ensure users’ privacy.

    Meta must now suspend data transfers within five months and delete EU/EEA users’ personal data that was illegally transferred across the border, or it risks facing another round of repercussions.

    Meta had continued to transfer personal user data to the USA after an earlier ruling of the Court of Justice of the European Union (CJEU) that had already addressed problematic EU-US data flows. It did so on the basis of the updated Standard Contractual Clauses (“SCCs”) adopted by the European Commission in 2021.

    The Irish regulator successfully proved that these arrangements did not sufficiently address the “fundamental rights and freedoms” of European data subjects outlined in the CJEU ruling. Meta was not doing enough to protect EU users’ data against possible surveillance and unconsented use by US authorities or other authorised entities.

    Why Are European Regulators After US Big Tech Firms?

    GDPR regulations have been a sore area of compliance for US-based big tech companies. 

    Effectively, they had to adopt a host of new measures: collecting user consent, ensuring compliant data storage and supporting the right to request data removal for a substantial part of their user bases.

    The wrinkle, however, is that companies like Google and Meta, among others, don’t have separate data processing infrastructure for different markets. Instead, all user data gets commingled on the companies’ servers, which are located in the US.

    The location of data storage facilities is the issue. In 2020, the CJEU made a historic ruling known as the invalidation of the Privacy Shield. Originally, international companies were allowed to transfer data between the EU and the US provided they adhered to seven data protection principles; this arrangement was called the Privacy Shield.

    However, subsequent investigation found that the Privacy Shield scheme was not GDPR compliant, and companies could therefore no longer use it to justify cross-border data transfers.

    The invalidation of the Privacy Shield gave grounds for further investigations into the big tech companies’ compliance status.

    In March 2022, the Irish DPC issued a first fine of €17 million to Meta for “insufficient technical and organisational measures to ensure information security of European users”. In September 2022, Meta was again hit with a €405 million fine for Instagram’s breaches of GDPR principles.

    2023 began with another series of rulings, with the DPC concluding that Meta had committed breaches of the GDPR relating to its Facebook service (a €210 million fine) and breaches relating to Instagram (a €180 million fine).

    Clearly, Meta already knew it wasn’t doing enough for GDPR compliance, and yet it refused to take privacy-focused action.

    Is Google GDPR Compliant?

    Google has a similar “track record” to Meta when it comes to ensuring full compliance with the GDPR. Although Google claims to provide users with more controls for managing their data privacy, the proposed solutions merely scratch the surface.

    In the background, Google continues to leverage its ample reserves of user browsing, behavioural and device data in product development and advertising. 

    In 2022, the Irish Council for Civil Liberties (ICCL) found that Google used web users’ information in its real-time bidding ad system without their knowledge or consent. The French data regulator (CNIL), in turn, fined Google €150 million the same year over poor cookie consent banners.

    Google Analytics’ GDPR compliance status is, however, the bigger concern.

    Neither Google Universal Analytics (UA) nor Google Analytics 4 is GDPR compliant following the invalidation of the Privacy Shield framework in 2020.

    Individual regulators in Sweden, France, Austria, Italy, Denmark, Finland and Norway have ruled that Google Analytics is not GDPR compliant and is therefore illegal to use.

    These regulatory rulings affect not just Google but also GA users: because the product is in breach of European privacy laws, the people using it are complicit as well. Privacy groups like noyb, for example, are exercising their right to sue individual websites that use Google Analytics.

    How to Stay GDPR Compliant With Website Analytics 

    To avoid potential risk exposure, carefully investigate each website analytics provider’s data storage and management practices.

    Start by asking about the company’s data storage locations. For example, Matomo Cloud keeps all data in the EU, while the Matomo On-Premise edition gives you the option to store data in any country of your choice.

    Secondly, ask about their process for consent tracking and subsequent data analysis. Our website analytics product is fully GDPR compliant: we enable first-party cookies by default, offer a convenient opt-out mechanism, provide a data removal mechanism and practise safe data storage. In fact, Matomo was approved by the French Data Protection Authority (CNIL) as one of the few web analytics apps that can be used to collect data without tracking consent.
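
    In practice, consent-aware tracking with Matomo’s documented JavaScript consent API looks roughly like the sketch below; the two handler functions are hypothetical names you would wire to your own consent banner’s buttons.

// Hold back all tracking until the visitor consents (Matomo JS tracker consent API).
declare global { interface Window { _paq: unknown[][]; } }

window._paq = window._paq || [];
window._paq.push(['requireConsent']); // buffer tracking calls until consent is given
window._paq.push(['trackPageView']);  // queued, not sent yet

function onAcceptClicked(): void {
  window._paq.push(['rememberConsentGiven']); // persist consent and flush queued calls
}

function onDeclineClicked(): void {
  window._paq.push(['forgetConsentGiven']); // revoke any previously given consent
}

export {};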

    Using the built-in GDPR Manager, Matomo users can implement the right set of controls for their market and industry. For example, you can enable additional data and IP anonymisation, or disable visitor logs and profiles.

    Thanks to our privacy-by-design architecture and native controls, users can make their Matomo analytics compliant with even the strictest privacy laws, such as HIPAA, CCPA, LGPD and PECR.

    Learn more about GDPR-friendly website analytics.

    Final Thoughts

    Since the GDPR came into effect in 2018, over 1,400 fines have been issued to companies in breach of the regulations. Meta and Google were initially lax in responding to European regulatory demands, but as new fines follow and consumer pressure mounts, Big Tech companies are being forced to take more proactive measures: adding opt-outs for personalised ads and introducing alternative mechanisms to third-party cookies.

    Companies using non-GDPR-compliant tools risk finding themselves in the crossfire of consumer anger and regulatory criticism. To operate an ethical, compliant business, consider privacy-focused alternatives to Google products, especially in the area of website analytics.