
Other articles (96)

  • MediaSPIP 0.1 Beta version

    25 April 2011

    MediaSPIP 0.1 beta is the first version of MediaSPIP proclaimed "usable".
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

  • HTML5 audio and video support

    13 April 2011

    MediaSPIP uses HTML5 video and audio tags to play multimedia files, taking advantage of the latest W3C innovations supported by modern browsers.
    The MediaSPIP player used has been created specifically for MediaSPIP and can be easily adapted to fit in with a specific theme.
    For older browsers, the Flowplayer Flash fallback is used.
    MediaSPIP allows for media playback on major mobile platforms with the above (...)

  • APPENDIX: The plugins used specifically for the farm

    5 March 2010

    The central/master site of the farm needs several additional plugins, beyond those used by the channels, in order to function properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a shared-hosting instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); and the champs extras v2 plugin, required by inscription3 (...)

On other sites (6593)

  • CJEU rules US cloud servers don’t comply with GDPR and what this means for web analytics

    17 July 2020, by Jake Thornton

    Breaking news: On July 16, 2020, the Court of Justice of the European Union (CJEU) ruled that cloud services hosted in the US are incapable of complying with the GDPR and EU privacy laws.

    In August 2016, the EU-US Privacy Shield framework came into effect, which “protects the fundamental rights of anyone in the EU whose personal data is transferred to the United States for commercial purposes. It allows the free transfer of data to companies that are certified in the US under the Privacy Shield.” – European Commission website

    However, with today’s CJEU ruling, the Privacy Shield framework has been invalidated due to significant differences between EU and US privacy laws.

    European privacy law activist Max Schrems summarised it as follows: “The Court clarified for a second time now that there is a clash between EU privacy law and US surveillance law. As the EU will not change its fundamental rights to please the NSA, the only way to overcome this clash is for the US to introduce solid privacy rights for all people – including foreigners. Surveillance reform thereby becomes crucial for the business interests of Silicon Valley.” – noyb website

    Today’s ruling also deepens concerns about the adequacy of US privacy laws, which don’t fully protect people’s personal data when it is hosted on US-based cloud servers.

    Web analytics hosted on US cloud servers don’t comply with GDPR

    How will this affect you?

    If you operate a website in the EU, or your website receives traffic from EU visitors, you need to know what data you’re capturing and where that data is stored.

    Here’s what Maja Smoltczyk (Berlin’s Commissioner for Data Protection and Freedom of Information) says:

    Controllers who transfer personal data to the USA, especially when using cloud-based services, are now required to switch immediately to service providers based in the European Union or a country that can ensure an adequate level of data protection. The CJEU has made it refreshingly clear that data exports are not just financial decisions, as people’s fundamental rights must also be considered as a matter of priority. This ruling will put an end to the transfer of personal data to the USA for the sake of convenience or to cut costs.

    The controller is you (not Google), and by transferring data to the US you risk being fined up to €20 million or 4% of your annual worldwide turnover for not being GDPR compliant.

    It’s you who has to take action, not Google or other US companies. The court’s decision has immediate effect. While we assume there will be a grace period, companies should act now, as finding and implementing alternative solutions can take a while.

    Does this mean no data can be exported outside the EU anymore?

    Data can still be exported outside the EU if an adequate level of data protection is guaranteed. This is the case for some trading partners of the EU such as New Zealand, Japan, Switzerland, and Canada. They have been certified by the EU as having a comparable level of privacy protection and therefore demonstrate adequacy at a country level.

    Necessary data can still flow to countries like the US, too. This is the case, for example, when someone books a hotel in the US or sends an email to someone in the US. Backups for disaster recovery and most other purposes don’t qualify as necessary.

    In all other cases you can still send data to countries like the US if you get explicit and informed consent from the user, meaning the user has been informed about all possible risks of sending the data to the US and about who can access it (for example, the US government).

    How this affects Google Analytics and Google Tag Manager users

    If your website uses Google Analytics, the safest bet is to deactivate it immediately. Otherwise, you must ask every visitor for consent and inform them that their data will be processed in the United States under less strict privacy laws, along with all the associated risks; a consent-gated loading sketch follows the checklist below. If you don’t, you could be liable for privacy-law infringements and face fines for not complying with the GDPR. This also applies to Google Tag Manager, as it transfers the visitor’s IP address, which is considered personal data under the GDPR, to the US.

    Consent needs to be:

    • Freely given (the user must have a real choice not to give consent and must be able to opt out at any time)
    • Informed (you need to disclose who is processing the data, what data is processed, where it will be stored, and how to opt out)
    • Specific (consent is only valid for the specific, informed purpose)
    • Unambiguous (for example, pre-ticked boxes or similar aren’t allowed)
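
    As an illustration only, here is a minimal consent-gated loading sketch in plain JavaScript. The askForConsent() helper is hypothetical – it stands in for whatever consent banner you use and is assumed to return a promise that resolves to true only on an explicit opt-in – and the measurement ID is a placeholder.

      // Load the official gtag.js snippet only after explicit consent.
      function loadGoogleAnalytics(measurementId) {
        var s = document.createElement('script');
        s.async = true;
        s.src = 'https://www.googletagmanager.com/gtag/js?id=' + measurementId;
        document.head.appendChild(s);

        window.dataLayer = window.dataLayer || [];
        function gtag() { dataLayer.push(arguments); }
        gtag('js', new Date());
        gtag('config', measurementId);
      }

      // askForConsent() is a hypothetical helper returning Promise<boolean>.
      askForConsent().then(function (consented) {
        if (consented) {
          loadGoogleAnalytics('UA-XXXXXX-1'); // placeholder ID
        }
        // Without consent, nothing is loaded and no data leaves the browser.
      });
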
    Web analytics that complies with GDPR

    If users don’t give you consent, you are not allowed to track them with Google Analytics or any other US-based cloud solution.

    Update August 19, 2020

    A month after this ruling, the European privacy campaign group noyb has filed over 100 complaints against websites that continue to send data to the US via Google Analytics or Facebook. It is clear that Google and Facebook fall under US surveillance laws such as FISA 702, and the court ruled that these companies cannot rely on SCCs to transfer data to the US. Anyone still using Google Analytics is now at risk of facing fines and compensation claims.

    How this affects Matomo users

    Our cloud servers are based in Germany.

    Matomo On-Premise users choose the location of their data themselves. If your servers are located in the EU, nothing changes. If they are located outside the EU, and your website targets EU users and tracks personal data, you need to assess whether you are required to ask for tracking consent.

    If the data is stored inside the EU, you can use Matomo without asking for any consent, and you can continue tracking users even if they reject a consent screen, which greatly increases the quality of your data.
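
    For reference, here is a sketch of the standard Matomo JavaScript tracking snippet, as documented by Matomo; the tracker URL (analytics.example.org) and the site ID are placeholders for your own EU-hosted instance.

      var _paq = window._paq = window._paq || [];
      _paq.push(['trackPageView']);
      _paq.push(['enableLinkTracking']);
      (function () {
        var u = 'https://analytics.example.org/'; // placeholder: your EU-hosted instance
        _paq.push(['setTrackerUrl', u + 'matomo.php']);
        _paq.push(['setSiteId', '1']);
        var d = document, g = d.createElement('script'),
            s = d.getElementsByTagName('script')[0];
        g.async = true;
        g.src = u + 'matomo.js';
        s.parentNode.insertBefore(g, s);
      })();
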

    Want to avoid informing users about transferring their data to the US and all associated risks?

    Try Matomo now for free! No credit card required.

  • Progress with rtc.io

    12 August 2014, by silvia

    At the end of July, I gave a presentation about WebRTC and rtc.io at the WDCNZ Web Dev Conference in beautiful Wellington, NZ.

    Putting that talk together reminded me how far we have come in the last year, both with the progress of WebRTC, its standards and browser implementations, and with our own small team at NICTA and our rtc.io WebRTC toolbox.

    [Slide 5 of the WDCNZ presentation]

    One of the most exciting opportunities is still under-exploited: the data channel. When I talked about the above slide and pointed out Bananabread, PeerCDN, Copay, PubNub and, later on, WebTorrent, that’s where Web developers really started to get excited about WebRTC. They can totally see the paradigm shift towards peer-to-peer applications, away from the server-based architecture of the current Web.
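
    To make that concrete, here is a minimal sketch of the browser’s native data channel API that all of these applications build on; the signalling step (exchanging the offer/answer and ICE candidates between the peers) is omitted.

      var pc = new RTCPeerConnection();
      var channel = pc.createDataChannel('chat');

      channel.onopen = function () {
        // Application data flows directly between peers, no server round trip.
        channel.send('hello, peer!');
      };
      channel.onmessage = function (evt) {
        console.log('peer says:', evt.data);
      };
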

    Many were also excited to learn more about rtc.io, our own npm-modules-based approach to a JavaScript API for WebRTC.

    [Image: the rtc.io modules]

    We believe that the world of JavaScript has reached a critical stage where we can no longer code by copy-and-pasting JavaScript snippets from all over the Web. We need a more structured approach to module reuse in JavaScript. Node, with JavaScript on the back end, really only accelerated this development; we have needed it on the front end for a long time, too. One big library (jQuery, anyone?) that does everything anyone could ever need on the front end isn’t going to work any longer with the amount of functionality we now expect Web applications to support. Just look at the insane growth of npm compared to other module collections:

    [Chart: packages per day across popular platforms (shamelessly copied from http://blog.nodejitsu.com/npm-innovation-through-modularity/)]

    For those that – like myself – found it difficult to understand how to tap into the sheer power of npm modules as a front-end developer: simply use browserify. npm modules are prepared following the CommonJS module definition spec, and browserify works natively with that, “compiling” all the dependencies of an npm module into a single bundle.js file that you can use on the front end through a script tag, just as you would in plain HTML. You can learn more about browserify and module definitions, and how to use browserify.
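
    As a minimal sketch of that workflow (the file names are illustrative): write ordinary CommonJS modules, run browserify main.js -o bundle.js, and load the resulting bundle.js with a script tag.

      // greet.js – an ordinary CommonJS module
      module.exports = function (name) {
        return 'Hello, ' + name + '!';
      };

      // main.js – the entry point; browserify follows the require() calls
      // and inlines every dependency into a single bundle.js
      var greet = require('./greet');
      document.body.textContent = greet('WebRTC');
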

    For those of you not quite ready to dive in with browserify, we have prepared the rtc module, which exposes the most commonly used packages of rtc.io through an “RTC” object in a browserified JavaScript file. You can also directly download the JavaScript file from GitHub.

    Using the rtc.io rtc JS library

    So, I hope you enjoy rtc.io, and I hope you enjoy my slides and the large collection of interesting links inside the deck. And of course: enjoy WebRTC! Thanks to Damon, Jeff, Cathy, Pete and Nathan – you’re an awesome team!

    On a side note, I was really excited to meet the author of browserify, James Halliday (@substack), at WDCNZ. His talk on “building your own tools” seemed to take me back to the times when everything was done on the command line. I think James is using Node and the Web in a way that would appeal to a Linux kernel developer. Fascinating!
