Advanced search

Media (0)


No media matching your criteria is available on the site.

Other articles (35)

  • The farm’s regular Cron tasks

    1 December 2010, by

    Managing the farm relies on running several repetitive so-called Cron tasks at regular intervals.
    The super Cron (gestion_mutu_super_cron)
    This task, scheduled every minute, simply calls the Cron of every instance in the shared-hosting farm on a regular basis. Combined with a system Cron on the farm’s central site, this generates regular visits to the various sites and keeps the tasks of rarely visited sites from being too (...)

  • XMP PHP

    13 May 2011, by

    According to Wikipedia, XMP stands for:
    Extensible Metadata Platform, or XMP, is an XML-based metadata format used in PDF, photography and graphic design applications. It was launched by Adobe Systems in April 2001, integrated into version 5.0 of Adobe Acrobat.
    Being based on XML, it handles a set of dynamic tags for use in the context of the Semantic Web.
    XMP makes it possible to record, as an XML document, information about a file: title, author, history (...)

  • Websites made with MediaSPIP

    2 May 2011, by

    This page lists some websites based on MediaSPIP.

On other sites (4596)

  • Increasing Website Traffic: 11 Tips To Attract Visitors

    25 August 2023, by Erin — Analytics Tips, Marketing

    For your website and business to succeed, you need to focus on building traffic.

    However, you aren’t the only one with that goal in mind.

    There are millions of other websites trying to increase their traffic as well. With that much competition, it’s important to make sure your website stands out. Accomplishing that can require a great deal of strategy.

    We’ve compiled a list of tips to help you develop a solid plan for increasing website traffic, expanding your reach, growing your audience and boosting customer engagement. Using these tips, more visitors will find their way to your website, meaning more customers for your business.

    Why is website traffic important?

    Website traffic is essentially the number of people visiting your website. When someone lands on your site, they’re considered a visitor and increase your website traffic. 

    When your website traffic is high, you’ll get more clicks, customer interactions and brand engagement. As a result, search engines will have a positive impression of your website and send more people there, meaning even more people will see your content and have the opportunity to buy your product.

    When using a website for your business or any other venture, tracking your website traffic using a web analytics solution like Matomo is critical.

    A screenshot of Matomo's Visits Dashboard

    With over 200 million actively maintained and visited websites in 2023, it’s important to make sure yours stands out if you want to increase your website traffic and grow your online presence. 

    11 tips for increasing website traffic

    Here are 11 tips to increase your organic traffic and elevate your business.

    1. Perfect your SEO

    Optimising your website to show up in search engine results shouldn’t be overlooked, as 63% of consumers start researching a product by using a search engine. Search engine optimisation, or SEO, increases the visibility and discoverability of your website on search engine results pages (SERPs). SEO targets organic searches, which means it doesn’t add to social media traffic, direct traffic or referrals, and it isn’t paid traffic.

    SEO is number one on this list for a reason — most of these tips will directly, or indirectly, improve your SEO efforts. 

    Steps to improve your search engine optimisation can include:

    • Using relevant keywords that are incorporated naturally throughout your content
    • Using a web analytics tool like Matomo, with its search keyword feature, to gain insights and identify opportunities for improvement
    • Using descriptive meta titles and meta descriptions
    • Linking to your own content internally with descriptive anchor text, and removing unused pages
    • Keeping your target audience in mind and marketing your content toward them
    • Making sure your website’s structure is optimised to be mobile-friendly, fast and responsive — such as with Matomo’s SEO Web Vitals feature, which monitors key metrics like your website’s page speed and loading performance, pivotal for optimising search engine results
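As an example of the meta tag point above, descriptive titles and descriptions live in the page head. All values here are placeholders, not recommendations for any particular site:

```html
<head>
  <!-- Shown as the clickable headline in search results -->
  <title>Handmade Ceramic Mugs | Example Pottery Co.</title>
  <!-- Often shown as the snippet under the headline -->
  <meta name="description"
        content="Small-batch ceramic mugs, hand-thrown and dishwasher safe. Free shipping on orders over $50.">
</head>
```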

    2. Research the competition

    It’s important to remember that while your business might be unique, it’s likely not the only one in its field. Thousands of other websites from other companies are also looking to improve their website traffic and increase sales, and you have to outcompete them.

    Looking at what your competitors are doing is vital from a strategic perspective. You can see what their content looks like, how they’re framing their specific use cases and what target audience they’re marketing toward.

    Knowing what your competitors are doing can help you find ways to improve your content and make it unique. Are your competitors missing a specific use case or neglecting a particular audience? Fill in their content gaps on your website, and pick up the traffic they’re missing.

    3. Create high-quality, evergreen content

    If your content is high-quality, visitors will read more of it and stay longer on your site. This obviously increases the likelihood they will purchase your product or service, and it tells search engines that your website is a good answer for a search query.

    High-quality content will also be shared more often, leading to even more website traffic. You should aim to develop content that doesn’t lose relevance over time (aka “evergreen content”). If you include time-sensitive data, statistics or content in your website, blog posts or articles, it’ll be relevant only around that time frame. 

    While this month’s viral content is highly popular, it likely won’t be relevant in a few months. Instead, if you ensure your content is evergreen, it will continue to get engagement long after it’s published.

    4. Implement creative visuals

    It’s important to have engaging, fun and interactive media on your website to keep visitors on your site longer. Like good content, interesting visuals (and the resulting longer visits) can translate to more purchases (and favourable assessments by search engines).

    A screenshot of Matomo's Media Dashboard

    Media can take the form of videos, infographics, images or web graphics. 

    With Matomo’s Media Analytics feature, you can automatically gain even deeper insights into how your visitors engage with your media content, enhancing your understanding of their preferences and behaviours.

    If you have interesting, captivating visuals, visitors will be more likely to stay on your website longer and see what you have to offer. Without captivating visuals to break up walls of text, you’ll likely find visitors will tend to leave your site in favour of something more engaging.

    Just make sure you design your visuals with your target audience in mind. Flashy, fun graphics might not be a good fit for a professional audience, but they’re great for younger audiences. If you get your audience correct, they may also share the images with others. Depending on your business, that might be a useful infographic shared across LinkedIn, or a picture of a clever use case shared on Pinterest. 

    As a bonus, if other companies use your graphics on their websites, that earns you some backlinks — more on those in a bit.

    5. Create a comprehensive knowledge base

    Having a knowledge base is critical to making sure your service or product is well understood and well documented, especially in the tech industry. If a visitor or potential customer is interested in your product or service, they need to know exactly what it will do for them and that they have a good foundation of support in case they need help. A knowledge base is also a good place for internal links (more on those in a bit).

    Visitors can also use your knowledge base as a source of information, and if they cite you as a source, that’ll lead right back to more website traffic for you (see our backlinks section for more about this). If your website is a good source of information, visitors will come back to it again and again.

    6. Use social media often and consistently

    Digital marketing nowadays heavily relies on social media platforms. Having an online presence no longer means just having a website — if you’re not using social media sites, you’re missing out on a huge portion of potential visitors and customers.

    A strong social media presence with profiles on platforms like Facebook, X (formerly Twitter), Instagram or LinkedIn can be invaluable for increasing your website traffic. Visitors to your social media profiles will click on regularly shared content, read your blog posts and possibly become customers.

    Participating in relevant communities and networking with other companies in groups in your industry can also be invaluable. If you participate in online communities and forums for your niche, you can offer insight, answer questions and plug your website. All of this will increase your clicks, which will increase your website traffic.

    If you’ve managed to build your own community on social media, make sure to keep them engaged! Implementing your own forum, hosting live chats and Q&As, and offering helpful, engaging content will keep visitors coming back and spreading the word.

    7. Use email marketing or newsletters

    Having an email list and sending marketing emails or newsletters is a great way to increase website traffic. You can offer subscribers exclusive content, discounts or resources for when they return to your website. This keeps your loyal audience engaged, entices new visitors to subscribe, gives you a chance to upsell to people who have already expressed an interest in your product and can convert curious subscribers into customers.

    8. Make sure your content can earn backlinks

    A backlink is when a website links to a different website — ideally using relevant anchor text — and it’s an effective strategy for increasing referral traffic, that is, visitors who get to your website via a link on another website. The more backlinks you have, the more your referral traffic will increase. Social share buttons make it easy for people to cite you on social platforms, too. 

    We’ve already talked about making expert content that’s link-worthy, but also make sure that you’re creating linkable assets (like those interesting visuals mentioned earlier), building relationships with other sites that will link to you (like by inviting an expert or influencer to write on your page and promote it from their platform, or by writing your own guest content for their sites) and sharing your own content. All of this can help increase your referral traffic, particularly when you’re linked from websites with a higher domain authority than you have.

    You can also make sure your website is listed in online directories. Some sites will do interviews and roundups, as well — these are great opportunities to increase your backlinks.

    9. Optimise your CTR

    Click-through rate, or CTR, is the percentage of users who click on specific links to your website. A high CTR means your visitors are following a link — whether in an advertisement, a search result or a social media post — and a low CTR means they’re passing it by. Optimising your CTR can greatly improve your website traffic.
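The percentage itself is simple arithmetic, as a quick sketch shows (the function name is my own, for illustration):

```javascript
// Compute click-through rate as a percentage.
// clicks: number of link clicks; impressions: times the link was shown.
function clickThroughRate(clicks, impressions) {
  // Guard against division by zero when a link had no impressions.
  if (impressions === 0) return 0;
  return (clicks * 100) / impressions;
}

// 50 clicks on 2,000 impressions is a 2.5% CTR.
console.log(clickThroughRate(50, 2000)); // 2.5
```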

    To improve CTR, identify which elements of your ads perform well (copy, imagery, offers), so you can amplify what works and drop what doesn’t.

    10. Ensure your website is responsive and mobile-friendly

    If a visitor is frustrated by your site being slow, laggy, clunky or not mobile-friendly, they won’t stay long. Visitors who bounce quickly also send a poor signal to the search engines that brought them there. Your website needs to be clean, responsive, user-friendly and accessible.

    If your website is slow, try improving its performance by:

    • Optimise images: Reduce the size of images and compress them for faster load times. Opt for JPEG format for photos and PNG format for graphics.
    • Limit plugins: If you are using a CMS like WordPress, remove plugins that aren’t essential.
    • Embrace lazy loading: To further enhance site speed and reduce initial load times, set up your site to load images and content only as visitors scroll down. Prioritising the content and images at the top of the page makes the site feel faster. Some CMS platforms offer this option, but others may require a bit of coding to set up.
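The loading decision in the last bullet can be sketched as a pure function (the function name and offsets are hypothetical, for illustration only); in practice, modern browsers also support the native loading="lazy" attribute on img elements, or an IntersectionObserver:

```javascript
// Decide which images should be loaded, given each image's vertical
// page offset and the current viewport. Images within `margin` pixels
// below the viewport are preloaded so they are ready when scrolled to.
function imagesToLoad(imageOffsets, scrollTop, viewportHeight, margin = 200) {
  const limit = scrollTop + viewportHeight + margin;
  return imageOffsets
    .map((offsetTop, index) => ({ offsetTop, index }))
    .filter((img) => img.offsetTop <= limit)
    .map((img) => img.index);
}

// At the top of the page (scrollTop 0, viewport 800px), only the images
// positioned near the top qualify for loading.
console.log(imagesToLoad([0, 500, 1500, 3000], 0, 800)); // [ 0, 1 ]
```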

    Many people rely on their phones to research services or products, especially if they’re doing a quick search. Make sure your website is friendly to mobile users. It should scale vertically and scroll smoothly so users aren’t frustrated when using your site. They should be able to find the info they need immediately without any technical issues.

    11. Track your website’s metrics

    As you test out each of these strategies to increase your web traffic, don’t forget to closely analyse the performance of your site. To truly understand the impact of your efforts, you’ll need a reliable web analytics solution. Think of a dependable web analytics solution as your website’s GPS. Without it, you’d be lost, unsure of your direction and missing out on valuable insights to steer your growth.

    Matomo is a powerful web analytics tool that can help you do just that by providing information on your site visitors and campaign performance, complemented by an array of behavioural analytics features that delve into user interactions. Among these, our heatmap feature stands out, enabling greater insights into user interactions and optimisation of your site’s effectiveness.

    Screenshot of Matomo heatmap feature

    Google Analytics is another powerful analytics option, though it has challenges with data accuracy; there are multiple other web analytics solutions as well.

    Regardless of what web analytics solution you choose, the process of analysing your website metrics is incredibly important for identifying areas of improvement to increase website traffic.

    Increasing your web traffic is a process

    Increasing website traffic isn’t something you accomplish overnight. It’s a comprehensive, ongoing endeavour that requires constant analysis and fine-tuning. 

    By applying these tips to create consistent, high-quality content that gets spotlighted on search engines, shared on social media and returned to again and again, you’ll see a steady stream of increased traffic. 

    With Matomo, you can understand your visitor behaviour to see what works and what doesn’t as you work to increase your website traffic. Get your free 21-day trial now. No credit card required.

  • Open Media Developers Track at OVC 2011

    11 October 2011, by silvia

    The Open Video Conference that took place on 10-12 September was so overwhelming, I’ve still not been able to catch my breath! It was a dense three days for me, even though I only focused on the technology sessions of the conference and utterly missed out on all the policy and content discussions.

    Roughly 60 people participated in the Open Media Software (OMS) developers track. This was an amazing group of people capable and willing to shape the future of video technology on the Web:

    • HTML5 video developers from Apple, Google, Opera, and Mozilla (though we missed the NZ folks),
    • codec developers from WebM, Xiph, and MPEG,
    • Web video developers from YouTube, JWPlayer, Kaltura, VideoJS, PopcornJS, etc.,
    • content publishers from Wikipedia, Internet Archive, YouTube, Netflix, etc.,
    • open source tool developers from FFmpeg, gstreamer, flumotion, VideoLAN, PiTiVi, etc.,
    • and many more.

    To provide a summary of all the discussions would be impossible, so I just want to share the key take-aways that I had from the main sessions.

    WebRTC: Realtime Communications and HTML5

    Tim Terriberry (Mozilla), Serge Lachapelle (Google) and Ethan Hugg (CISCO) moderated this session together (slides). There are activities both at the W3C and at IETF – the ones at IETF are supposed to focus on protocols, while the W3C ones on HTML5 extensions.

    The current proposal of a PeerConnection API has been implemented in WebKit/Chrome as open source. It is expected that Firefox will have an add-on by Q1 next year. It enables video conferencing, including media capture, media encoding, signal processing (echo cancellation, etc.), secure transmission, and a data stream exchange.

    Current discussions are around the signalling protocol and whether SIP needs to be required by the standard. Further, the codec question is under discussion: whether to mandate VP8 and Opus, since transcoding gateways are not desirable. Another question is how to measure the quality of the connection and how to report errors so as to allow adaptation.

    What always amazes me about RTC is the sheer number of specialised protocols that seem to be required to implement it. WebRTC does not disappoint: in fact, the question was asked whether there could be a lighter alternative than re-using dozens of years of protocol development – is it over-engineered? Can desktop players connect to a WebRTC session?

    We are already in a second or third revision of this part of the HTML5 specification and yet it seems the requirements are still being collected. I’m quietly confident that everything is done to make the lives of the Web developer easier, but it sure looks like a huge task.

    The Missing Link : Flash to HTML5

    Zohar Babin (Kaltura) and myself moderated this session and I must admit that this session was the biggest eye-opener for me amongst all the sessions. There was a large number of Flash developers present in the room and that was great, because sometimes we just don’t listen enough to lessons learnt in the past.

    This session gave me one of those aha-moments, in the form of the Flash appendBytes() API function.

    The appendBytes() function allows a Flash developer to take a byteArray out of a connected video resource and do something with it – such as feed it to a video for display. When I heard that Web developers want that functionality for JavaScript and the video element, too, I instinctively rejected the idea, wondering why on earth a Web developer would want to touch encoded video bytes – why not leave that to the browser?

    But as it turns out, this is actually a really powerful enabler of functionality. For example, you can use it to :

    • display mid-roll video ads as part of the same video element,
    • sequence playlists of videos into the same video element,
    • implement DVR functionality (high-speed seeking),
    • do mash-ups,
    • do video editing,
    • implement adaptive streaming.

    This totally blew my mind and I am now completely supportive of having such a function in HTML5. Together with media fragment URIs you could even leave all the header download management for resources to the Web browser and just request time ranges from a video through an appendBytes() function. This would be easier on the Web developer than having to deal with byte ranges and making sure that appropriate decoding pipelines are set up.

    Standards for Video Accessibility

    Philip Jagenstedt (Opera) and myself moderated this session. We focused on the HTML5 track element and the WebVTT file format. Many issues were identified that will still require work.

    One particular topic was finding a standard means of rendering the UI for caption, subtitle, and description selection. For example, what icons should be used to indicate that subtitles or captions are available? While this is not part of the HTML5 specification, it’s still important to get this right across browsers, since otherwise users will get confused by diverging interfaces.

    Chaptering was discussed and a particular need to allow URLs to directly point at chapters was expressed. I suggested the use of named Media Fragment URLs.
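For illustration, a named Media Fragment URI addressing a chapter might look like this (URL and fragment name are hypothetical); the temporal form, e.g. #t=120,180, addresses a time range instead:

```
http://example.com/video.webm#id=chapter-3
```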

    The use of WebVTT for descriptions for the blind was also discussed. A suggestion was made to use the voice tag <v> to allow for “styling” (i.e. selection) of the screen reader voice.
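As a sketch of that suggestion, a WebVTT description cue using the voice tag might look like this (cue text and voice name are invented for the example):

```
WEBVTT

00:01:15.000 --> 00:01:18.000
<v Describer>A door opens and a woman enters the room.
```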

    Finally, multitrack audio or video resources were also discussed and the @mediagroup attribute was explained. A question about how to identify the language used in different alternative dubs was asked. This is an issue because @srclang is not on audio or video, only on text, so it’s a missing feature for the multitrack API.

    Beyond this session, there was also a breakout session on WebVTT and the track element. As a consequence, a number of bugs were registered in the W3C bug tracker.

    WebM: Testing, Metrics and New Features

    This session was moderated by John Luther and John Koleszar, both of the WebM Project. They started off with a presentation on current work on WebM, which includes quality testing and improvements, and encoder speed improvement. Then they moved on to questions about how to involve the community more.

    The community criticised that communication of what is happening around WebM is very scarce. More sharing of information was requested, including a move to using open Google+ hangouts instead of Google internal video conferences. More use of the public bug tracker can also help include the community better.

    Another pain point for the community was that code is introduced and removed without much feedback. It was requested that a peer review process be introduced, and that example code snippets be published when new features are announced so others can replicate the claims.

    This all indicates to me that the WebM project is becoming increasingly open, but that there is still a lot to learn.

    Standards for HTTP Adaptive Streaming

    This session was moderated by Frank Galligan and Aaron Colwell (Google), and Mark Watson (Netflix).

    Mark started off by giving us an introduction to MPEG DASH, the MPEG file format for HTTP adaptive streaming. MPEG has just finalized the format and he was able to show us some examples. DASH is XML-based and thus rather verbose. It is covering all eventualities of what parameters could be switched during transmissions, which makes it very broad. These include trick modes e.g. for fast forwarding, 3D, multi-view and multitrack content.

    MPEG have defined profiles – one for live streaming which requires chunking of the files on the server, and one for on-demand which requires keyframe alignment of the files. There are clear specifications for how to do these with MPEG. Such profiles would need to be created for WebM and Ogg Theora, too, to make DASH universally applicable.

    Further, the Web case needs a more restrictive adaptation approach, since the video element’s API is already accounting for some of the features that DASH provides for desktop applications. So, a Web-specific profile of DASH would be required.

    Then Aaron introduced us to the MediaSource API and in particular the webkitSourceAppend() extension that he has been experimenting with. It is essentially an implementation of the appendBytes() function of Flash, which the Web developers had been asking for just a few sessions earlier. This was likely the biggest announcement of OVC, alas a quiet and technically-focused one.

    Aaron explained that he had been trying to find a way to implement HTTP adaptive streaming into WebKit in a way in which it could be standardised. While doing so, he also came across other requirements around such chunked video handling, in particular around dynamic ad insertion, live streaming, DVR functionality (fast forward), constraint video editing, and mashups. While trying to sort out all these requirements, it became clear that it would be very difficult to implement strategies for stream switching, buffering and delivery of video chunks into the browser when so many different and likely contradictory requirements exist. Also, once an approach is implemented and specified for the browser, it becomes very difficult to innovate on it.

    Instead, the easiest way to solve it right now and learn about what would be necessary to implement into the browser would be to actually allow Web developers to queue up a chunk of encoded video into a video element for decoding and display. Thus, the webkitSourceAppend() function was born (specification).

    The proposed extension to the HTMLMediaElement is as follows:

    partial interface HTMLMediaElement {
      // URL passed to src attribute to enable the media source logic.
      readonly attribute [URL] DOMString webkitMediaSourceURL;

      bool webkitSourceAppend(in Uint8Array data);

      // end of stream status codes.
      const unsigned short EOS_NO_ERROR = 0;
      const unsigned short EOS_NETWORK_ERR = 1;
      const unsigned short EOS_DECODE_ERR = 2;

      void webkitSourceEndOfStream(in unsigned short status);

      // states
      const unsigned short SOURCE_CLOSED = 0;
      const unsigned short SOURCE_OPEN = 1;
      const unsigned short SOURCE_ENDED = 2;

      readonly attribute unsigned short webkitSourceState;
    };

    The code is already checked into WebKit, but commented out behind a command-line compiler flag.

    Frank then stepped forward to show how webkitSourceAppend() can be used to implement HTTP adaptive streaming. His example uses WebM – there are no examples with MPEG or Ogg yet.

    The chunks that Frank’s demo used were 150 video frames long (6.25s) and 5s long audio. Stream switching only switched video, since audio data is much lower bandwidth and more important to retain at high quality. Switching was done on multiplexed files.

    Every chunk requires an XHR range request – this could be optimised if the connections were kept open per adaptation. Seeking works, too, but since decoding requires download of a whole chunk, seeking latency is determined by the time it takes to download and decode that chunk.
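The per-chunk range requests described here can be sketched as follows. The offset table and function names are my own illustration, not code from Frank’s demo; a real player would read the chunk offsets from a manifest:

```javascript
// Build one HTTP Range header value per chunk from a table of chunk
// start offsets (the final entry marks the end of the last chunk).
function chunkRanges(offsets) {
  const ranges = [];
  for (let i = 0; i < offsets.length - 1; i++) {
    ranges.push('bytes=' + offsets[i] + '-' + (offsets[i + 1] - 1));
  }
  return ranges;
}

// In the browser, each range would be fetched with an XHR
// (responseType 'arraybuffer') and the resulting Uint8Array handed to
// video.webkitSourceAppend(); after the last chunk,
// video.webkitSourceEndOfStream(EOS_NO_ERROR) signals completion.
console.log(chunkRanges([0, 1000, 2500])); // [ 'bytes=0-999', 'bytes=1000-2499' ]
```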

    Similar to DASH, when using this approach for live streaming, the server has to produce one file per chunk, since byte range requests are not possible on a continuously growing file.

    Frank did not use DASH as the manifest format for his HTTP adaptive streaming demo, but instead used a hacked-up custom XML format. It would be possible to use JSON or any other format, too.

    After this session, I was actually completely blown away by the possibilities that such a simple API extension allows. If I wasn’t sold on the idea of an appendBytes() function in the earlier session, this one completely changed my mind. While I still believe we need to standardise an HTTP adaptive streaming file format that all browsers will support for all codecs, and I still believe that a native implementation for support of such a file format is necessary, I also believe that this approach of webkitSourceAppend() is what HTML needs – and maybe it needs it faster than native HTTP adaptive streaming support.

    Standards for Browser Video Playback Metrics

    This session was moderated by Zachary Ozer and Pablo Schklowsky (JWPlayer). Their motivation for the topic was, in fact, also HTTP adaptive streaming. Once you leave the decisions about when to do stream switching to JavaScript (through a function such as webkitSourceAppend()), you have to expose stream metrics to the JS developer so they can make informed decisions. The other use case is, of course, monitoring the quality of video delivery for reporting to the provider, who may then decide to change their delivery environment.

    The discussion found that we really care about metrics on three different levels:

    • measuring the network performance (bandwidth)
    • measuring the decoding pipeline performance
    • measuring the display quality

    In the end, it seemed that work previously done by Steve Lacey on a proposal for video metrics was generally acceptable, except for the playbackJitter metric, which may be too aggregate to mean much.

    Device Inputs / A/V in the Browser

    I didn’t actually attend this session held by Anant Narayanan (Mozilla), but from what I heard, the discussion focused on how to manage permission of access to video camera, microphone and screen, e.g. when multiple applications (tabs) want access or when the same site wants access in a different session. This may apply to real-time communication with screen sharing, but also to photo sharing, video upload, or canvas access to devices e.g. for time lapse photography.

    Open Video Editors

    This was another session that I wasn’t able to attend, but I believe the creation of good open source video editing software and similar video creation software is really crucial to giving video a broader user appeal.

    Jeff Fortin (PiTiVi) moderated this session and I was fascinated to later see his analysis of the lifecycle of open source video editors. It is shocking to see how many people/projects have tried to create an open source video editor and how many have stopped their project. It is likely that the creation of a video editor is such a complex challenge that it requires a larger and more committed open source project – single people will just run out of steam too quickly. This may be comparable to the creation of a Web browser (see the size of the Mozilla project) or a text processing system (see the size of the OpenOffice project).

    Jeff also mentioned the need to create open video editor standards around playlist file formats etc. Possibly the Open Video Alliance could help. In any case, something has to be done in this space – maybe this would be a good topic to focus next year’s OVC on?

    Monday’s Breakout Groups

    The conference ended officially on Sunday night, but we had a third day of discussions / hackday at the wonderful New York Law School venue. We had collected issues of interest during the two previous days and organised the breakout groups in the morning (Schedule).

    In the Content Protection/DRM session, Mark Watson from Netflix explained how their API works and that they believe that all we need in browsers is a secure way to exchange keys and an indicator of which protection scheme is used – the actual protection scheme would not be implemented by the browser, but be provided by the underlying system (media framework/operating system). I think that until somebody actually implements something in a browser fork and shows how this can be done, we won’t have much progress. In my understanding, we may also need to disable part of the video API for encrypted content, because otherwise you can always e.g. grab frames from the video element into canvas and save them from there.

    In the Playlists and Gapless Playback session, there was massive brainstorming about what new cool things can be done with the video element in browsers if playback between snippets can be made seamless. Further discussions were about standard playlist file formats (such as XSPF, MRSS or M3U), media fragment URIs in playlists for mashups, and the need to expose track metadata for HTML5 media elements.

    What more can I say? It was an amazing three days, and the complexity of the problems that we’re dealing with is a tribute to how far HTML5 and open video have already come, and exciting news for the kind of applications that will be possible (both professional and community) once we’ve solved the problems of today. It will be exciting to see what progress we will have made by next year’s conference.

    Thanks go to Google for sponsoring my trip to OVC.

    UPDATE: We actually have a mailing list for open media developers who are interested in these and similar topics – do join at http://lists.annodex.net/cgi-bin/mailman/listinfo/foms.

  • Revision 32594: lowercase plugin names, and aliases for site names

    1 November 2009, by fil@… — Log

    lowercase plugin names, and aliases for site names