
Media (1)

Keyword: - Tags -/censure

Other articles (46)

  • MediaSPIP v0.2

    21 June 2013, by

    MediaSPIP 0.2 is the first stable version of MediaSPIP.
    Its official release date is 21 June 2013, and it is announced here.
    The zip file provided here contains only the MediaSPIP sources in the standalone version.
    As with the previous version, all software dependencies must be installed manually on the server.
    If you wish to use this archive for a farm-mode installation, you will also need to make other changes (...)

  • Making files available

    14 April 2011, by

    By default, when it is initialised, MediaSPIP does not allow visitors to download files, whether they are the originals or the result of their transformation or encoding. It only allows them to be viewed.
    However, it is possible, and easy, to give visitors access to these documents in various forms.
    All of this is handled on the template configuration page. You need to go to the channel’s administration area and choose in the navigation (...)

  • Farm-mode installation

    4 February 2011, by

    Farm mode lets you host several MediaSPIP sites while installing the functional core only once.
    This is the method we use on this very platform.
    Using farm mode requires some familiarity with how SPIP works, unlike the standalone version, which does not really require any specific knowledge, since SPIP’s usual private area is no longer used.
    To start with, you must have installed the same files as the installation (...)

On other sites (6251)

  • CJEU rules US cloud servers don’t comply with GDPR and what this means for web analytics

    17 July 2020, by Jake Thornton

    Breaking news: On July 16, 2020, the Court of Justice of the European Union (CJEU) ruled that cloud services hosted in the US are incapable of complying with the GDPR and EU privacy laws.

    In August 2016, the EU-US Privacy Shield framework came into effect, which “protects the fundamental rights of anyone in the EU whose personal data is transferred to the United States for commercial purposes. It allows the free transfer of data to companies that are certified in the US under the Privacy Shield.” – European Commission website

    However, with today’s CJEU ruling, the Privacy Shield framework has been invalidated because of significant differences between EU and US privacy laws.

    European privacy law activist Max Schrems summarised it as follows: “The Court clarified for a second time now that there is a clash between EU privacy law and US surveillance law. As the EU will not change its fundamental rights to please the NSA, the only way to overcome this clash is for the US to introduce solid privacy rights for all people – including foreigners. Surveillance reform thereby becomes crucial for the business interests of Silicon Valley.” – noyb website

    Today’s ruling also continues to raise concerns about the legitimacy of US privacy laws, which do not fully protect people’s personal data when it is hosted on cloud servers based in the US.

    Web analytics hosted on US cloud servers don’t comply with GDPR

    How will this affect you?

    If your business operates a website in the EU, or your website receives traffic from EU visitors, you need to know what data you’re capturing and where that data is being stored.

    Here’s what Maja Smoltczyk (Berlin’s Commissioner for Data Protection and Freedom of Information) says:

    “Controllers who transfer personal data to the USA, especially when using cloud-based services, are now required to switch immediately to service providers based in the European Union or a country that can ensure an adequate level of data protection. The CJEU has made it refreshingly clear that data exports are not just financial decisions, as people’s fundamental rights must also be considered as a matter of priority. This ruling will put an end to the transfer of personal data to the USA for the sake of convenience or to cut costs.”

    The controller is you (not Google), and by transferring data to the US you are at risk of being fined up to €20 million or 4% of your annual worldwide turnover for not being GDPR compliant.

    It’s you who has to take action, not Google or other US companies. The court’s decision has immediate effect. While we assume there will be a grace period, companies should act now, as finding and implementing alternative solutions can take a while.

    Can data no longer be exported outside the EU?

    Data can still be exported outside the EU if an adequate level of data protection is guaranteed. This is the case for some trading partners of the EU such as New Zealand, Japan, Switzerland, and Canada. They have been certified by the EU as having a comparable level of privacy protection and therefore demonstrate adequacy at a country level.

    Necessary data can still flow to countries like the US. This is the case, for example, when someone books a hotel in the US or sends an email to someone in the US. Backups for disaster recovery and most other purposes don’t qualify as necessary.

    In all other cases, you can still send data to countries like the US if you get explicit and informed consent from the user, meaning the user has been informed about all possible risks of sending the data to the US and about who can access it (for example, the US government).

    How this affects Google Analytics and Google Tag Manager users

    If your website is using Google Analytics, the safest bet is to deactivate it immediately. Otherwise, you must ask every visitor for consent and inform them that their data will be processed in the United States under less strict privacy laws, along with all the associated risks. If you don’t, you could be liable for privacy law infringements and face fines for not complying with the GDPR. This also applies to Google Tag Manager, as it transfers the IP address to the US, which is considered personal data under the GDPR.

    Consent needs to be:

    • Freely given (the user must have a choice to not give consent and be able to opt out at any time) 
    • Informed (you need to disclose who is processing the data, what data is processed, where the data will be stored and how to opt out) 
    • Specific (consent is only valid for the specific informed purpose) 
    • Unambiguous (for example pre-ticked boxes or similar aren’t allowed)

    Web analytics that complies with GDPR

    If users don’t give you consent, you are not allowed to track them using Google Analytics or any other US based cloud solution.
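
    If you do keep a US-hosted tool such as Google Analytics, the tracker should only be loaded after that explicit, informed opt-in. Here is a minimal sketch of one way to gate it in plain JavaScript; the opt-in button ID, the localStorage key and the 'UA-XXXXXXX-X' measurement ID are placeholders, not part of the original article, and this is not legal advice:

    // Load the US-based tracker only once the visitor has explicitly opted in.
    const GA_ID = 'UA-XXXXXXX-X'; // placeholder property ID

    function loadGoogleAnalytics() {
      const s = document.createElement('script');
      s.async = true;
      s.src = 'https://www.googletagmanager.com/gtag/js?id=' + GA_ID;
      document.head.appendChild(s);
      window.dataLayer = window.dataLayer || [];
      function gtag() { window.dataLayer.push(arguments); }
      gtag('js', new Date());
      gtag('config', GA_ID);
    }

    // Run the tracker only if consent was freely given and recorded earlier ...
    if (localStorage.getItem('analytics-consent') === 'granted') {
      loadGoogleAnalytics();
    }

    // ... or once the visitor clicks an explicit, un-pre-ticked opt-in button.
    const optIn = document.getElementById('consent-opt-in'); // placeholder button ID
    if (optIn) {
      optIn.addEventListener('click', () => {
        localStorage.setItem('analytics-consent', 'granted');
        loadGoogleAnalytics();
      });
    }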

    Update August 19, 2020

    A month after this ruling, the European privacy campaign group noyb has filed over 100 complaints against websites that continue to send data to the US via Google Analytics or Facebook. It’s clear that Google and Facebook fall under US surveillance laws such as FISA 702, and the court clearly ruled that these companies cannot rely on SCCs to transfer data to the US. Anyone still using Google Analytics is now at risk of facing fines and compensation damages.

    How this affects Matomo users

    Our cloud servers are based in Germany.

    Matomo On-Premise users choose the location of their data themselves. If the servers are located in the EU nothing changes. If the servers are located outside the EU and the website targets EU users and tracks personal data, then you need to assess whether you are required to ask for tracking consent.

    If the data is stored inside the EU, you can use Matomo without asking for any consent, and you can continue tracking users even if they reject a consent screen, which greatly increases the quality of your data.
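
    For reference, this is roughly what an EU-hosted Matomo setup looks like on a page; the analytics.example.org URL and the site ID are placeholders for your own On-Premise or EU cloud instance, and the optional disableCookies call is one way to keep tracking cookie-less:

    var _paq = window._paq = window._paq || [];
    _paq.push(['disableCookies']);      // optional: no tracking cookies at all
    _paq.push(['trackPageView']);
    _paq.push(['enableLinkTracking']);
    (function () {
      var u = 'https://analytics.example.org/';   // placeholder: your EU-hosted Matomo
      _paq.push(['setTrackerUrl', u + 'matomo.php']);
      _paq.push(['setSiteId', '1']);              // placeholder site ID
      var d = document, g = d.createElement('script'), s = d.getElementsByTagName('script')[0];
      g.async = true;
      g.src = u + 'matomo.js';
      s.parentNode.insertBefore(g, s);
    })();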

    Want to avoid informing users about transferring their data to the US and all associated risks?

    Try Matomo now for free! No credit card required.

  • Spotlight: Alwaysdata.com, the company behind Piwik.org web hosting [Interview]

    11 April 2013, by matt — About, Community

    Piwik is the result of the work of many talented individuals and companies. Today we’d like to showcase Alwaysdata.com, the awesome web hosting company providing managed hosting for all Piwik.org websites and services.

    I recently met Cyril, co-founder of Alwaysdata.com and Piwik core developer, and asked him a few questions. Learn more in the interview below!

    What is Alwaysdata?

    We are a French web hosting company created in 2006. If you need to host a website — a Piwik installation, for example — or even your domains/emails, we provide infrastructure and maintenance services.

    Who are your customers and what kind of work do you do?

    We have several types of clients:

    • Individuals who need hosting for their personal site and who benefit from storage space with direct SSH access.
    • Web agencies who need hosting for their clients’ sites.
    • The largest customers, often on dedicated servers, for hosting their site/infrastructure.

    Our work falls into three categories:

    1. Support (via administration, telephone, Twitter, IRC, etc.)
    2. Development (in Python), primarily to add new features
    3. System administration, either for maintenance (e.g. adding servers), or for preparing new features

    What sets Alwaysdata apart from the large web hosting competition?

    Two things:

    • Availability. We are a small team and often know our customers quite well. We are all on IRC, so you can contact us directly if you need any assistance.
    • Features. We are halfway between traditional web hosting and the cloud, combining the advantages of both.

    Are you using Piwik internally or with customers? If so, how are you using Piwik?

    All of our customers can view statistics for their sites via our global Piwik installation, without having to configure anything.

    To provide these analytics reports to our customers, we implemented the import of raw access logs into Piwik. That log import toolkit is now a feature included in Piwik.

    What is the next big thing for Alwaysdata?

    We are going to overhaul our pricing: instead of fixed costs, each of our clients will pay exactly for what they consume. This gives our clients a very high quality service at the lowest possible price.

    We are also going to add native support for more technologies : Java, Node.js, ZeroMQ, etc.

    Thank you for your time and all the best to Alwaysdata for the future!

    Note from Matt, Piwik founder: Cyril and the team at Alwaysdata.com have been consistently great in their system administration work for Piwik.org services, providing a fast and reliable web hosting experience with top-notch support and security practices. They also handled the migration of all services from our old servers, giving us total peace of mind.

    Alwaysdata contributed the popular Log Analytics toolkit to Piwik. They are great software developers and system administrators with a passion for their work. Since 2006, they have been maintaining optimized hosting services for the entire web infrastructure (websites, domains, emails, databases, etc.), from the simplest to the most exotic. We do recommend their managed hosting services.

    Learn more

  • Why does my ffmpeg audio sound slower and deeper - sample rate mismatch

    4 September 2020, by yogesh zinzu

    OK, so this is a Discord bot to record voice chat:
https://hatebin.com/hgjlazacri
    The bot works perfectly fine, but the recorded audio sounds a bit deeper and slower than normal. Why does this happen? How can I make the audio sound 1:1?
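
    The title already points at the likely cause: discord.js’s voice receiver hands over 48 kHz, 16-bit signed little-endian, 2-channel PCM, while the ffmpeg chain in the code below declares the raw input as s32le and never tells ffmpeg the input’s real sample rate or channel count, so the samples get reinterpreted and the result plays back slower and deeper. As a hedged sketch of a fix (not a verified solution), the encoding step in the $record handler could declare the real input format like this:

    ffmpeg(combinedStream)
        .inputFormat('s16le')                   // raw 16-bit PCM, not 32-bit
        .inputOptions(['-ar 48000', '-ac 2'])   // sample rate / channels of the raw input
        .audioCodec('aac')
        .format('adts')
        .on('error', (error) => console.log(error))
        .pipe(outputFile);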

    


    

    

    const Discord = require('discord.js');
const client = new Discord.Client();
const ffmpegInstaller = require('@ffmpeg-installer/ffmpeg');
const ffmpeg = require('fluent-ffmpeg');
ffmpeg.setFfmpegPath(ffmpegInstaller.path);
const fs = require('fs-extra')
const mergeStream = require('merge-stream');
const config = require('./config.json');
const { getAudioDurationInSeconds } = require('get-audio-duration');
const cp = require('child_process');
const path1 = require('path');
const Enmap = require('enmap');
const UserRecords = require("./models/userrecords.js")
const ServerRecords = require("./models/serverrecords.js")
let prefix = `$`
class Readable extends require('stream').Readable { _read() {} }
let recording = false;
let currently_recording = {};
let mp3Paths = [];
// 3840 zero bytes = one 20 ms frame of silence at 48 kHz, 16-bit, 2-channel PCM
const silence_buffer = new Uint8Array(3840);
const express = require('express')
const app = express()
const port = 3000
const publicIP = require('public-ip')
const { program } = require('commander');
const { path } = require('@ffmpeg-installer/ffmpeg');
const version = '0.0.1'
program.version(version);
let debug = false
let runProd = false
let fqdn = "";
const mongoose = require("mongoose");
const MongoClient = require('mongodb').MongoClient;
mongoose.connect('SECRRET',{
  useNewUrlParser: true
}, function(err){
  if(err){
    console.log(err);
  }else{
    console.log("Database connection initiated");
  }
});
require("dotenv").config()
function bufferToStream(buffer) {
    let stream = new Readable();
    stream.push(buffer);
    return stream;
}





client.commands = new Enmap();

client.on('ready', async () => {
    console.log(`Logged in as ${client.user.tag}`);

    let host = "localhost"

    

    let ip = await publicIP.v4();

    let protocol = "http";
    if (!runProd) {
        host = "localhost"
    } else {
        host = `35.226.244.186`;
    }
    fqdn = `${protocol}://${host}:${port}`
    app.listen(port, `0.0.0.0`, () => {
        console.log(`Listening on port ${port} for ${host} at fqdn ${fqdn}`)
    })
});
let randomArr = []
let finalArrWithIds = []
let variable = 0
client.on('message', async message => {
    console.log(`fuck`);
    if(message.content === `$record`){
        mp3Paths = []
        finalArrWithIds = []
        let membersToScrape = Array.from(message.member.voice.channel.members.values());
        membersToScrape.forEach((member) => {
            if(member.id === `749250882830598235`) {
                console.log(`botid`);
            }
            else {
                finalArrWithIds.push(member.id)
            }
            
        })
        const randomNumber = Math.floor(Math.random() * 100)
        randomArr = []
        randomArr.push(randomNumber)
    }
   
    
    // Push a 20 ms silence frame for a member whenever they are not speaking, so the
    // per-member PCM stream stays continuous for the whole recording.
    const generateSilentData = async (silentStream, memberID) => {
        console.log(`recordingnow`)
        while(recording) {
            if (!currently_recording[memberID]) {
                silentStream.push(silence_buffer);
            }
            await new Promise(r => setTimeout(r, 20));
        }
        return "done";
    }
    console.log(generateSilentData, `status`)
    function generateOutputFile(channelID, memberID) {
        const dir = `./recordings/${channelID}/${memberID}`;
        fs.ensureDirSync(dir);
        const fileName = `${dir}/${randomArr[0]}.aac`;
        console.log(`${fileName} ---------------------------`);
        return fs.createWriteStream(fileName);
    }
    
    if (!fs.existsSync("public")) {
        fs.mkdirSync("public");
    }
    app.use("/public", express.static("./public"));
  if (!message.guild) return;

  if (message.content === config.prefix + config.record_command) {
    if (recording) {
        message.reply("bot is already recording");
        return
    }
    if (message.member.voice.channel) {
        recording = true;
        const connection = await message.member.voice.channel.join();
        const dispatcher = connection.play('./audio.mp3');

        connection.on('speaking', (user, speaking) => {
            if (speaking.has('SPEAKING')) {
                currently_recording[user.id] = true;
            } else {
                currently_recording[user.id] = false;
            }
        })


        let members = Array.from(message.member.voice.channel.members.values());
        members.forEach((member) => {

            if (member.id != client.user.id) {
                let memberStream = connection.receiver.createStream(member, {mode : 'pcm', end : 'manual'})

                let outputFile = generateOutputFile(message.member.voice.channel.id, member.id);
                console.log(outputFile, `outputfile here`);
                mp3Paths.push(outputFile.path);
                    

                silence_stream = bufferToStream(new Uint8Array(0));
                generateSilentData(silence_stream, member.id).then(data => console.log(data));
                let combinedStream = mergeStream(silence_stream, memberStream);

                // NOTE: the receiver delivers 16-bit, 48 kHz, stereo PCM; declaring the raw
                // input as s32le (leaving ffmpeg's 44.1 kHz input default) is the suspected
                // cause of the slower, deeper output described above.
                ffmpeg(combinedStream)
                    .inputFormat('s32le')
                    .audioFrequency(44100)
                    .audioChannels(2)
                    .on('error', (error) => {console.log(error)})
                    .audioCodec('aac')
                    .format('adts') 
                    .pipe(outputFile)
                    
            }
        })
    } else {
      message.reply('You need to join a voice channel first!');
    }
  }

  if (message.content === config.prefix + config.stop_command) {

    let date = new Date();
    let dd = String(date.getDate()).padStart(2, '0');
    let mm = String(date.getMonth() + 1).padStart(2, '0'); 
    let yyyy = date.getFullYear();
    date = mm + '/' + dd + '/' + yyyy;





    let currentVoiceChannel = message.member.voice.channel;
    if (currentVoiceChannel) {
        recording = false;
        await currentVoiceChannel.leave();

        let mergedOutputFolder = './recordings/' + message.member.voice.channel.id + `/${randomArr[0]}/`;
        fs.ensureDirSync(mergedOutputFolder);
        let file_name = `${randomArr[0]}` + '.aac';
        let mergedOutputFile = mergedOutputFolder + file_name;
    
        
    let download_path = message.member.voice.channel.id + `/${randomArr[0]}/` + file_name;

        let mixedOutput = new ffmpeg();
        console.log(mp3Paths, `mp3pathshere`);
        mp3Paths.forEach((mp3Path) => {
             mixedOutput.addInput(mp3Path);
            
        })
        console.log(mp3Paths);
        //mixedOutput.complexFilter('amix=inputs=2:duration=longest');
        mixedOutput.complexFilter('amix=inputs=' + mp3Paths.length + ':duration=longest');
        
        let processEmbed = new Discord.MessageEmbed().setTitle(`Audio Processing.`)
        processEmbed.addField(`Audio processing starting now..`, `Processing Audio`)
        processEmbed.setThumbnail(`https://media.discordapp.net/attachments/730811581046325348/748610998985818202/speaker.png`)
        processEmbed.setColor(` #00FFFF`)
        const processEmbedMsg = await message.channel.send(processEmbed)
        async function saveMp3(mixedData, outputMixed) {
            console.log(`${mixedData} MIXED `)
            
            
            
            return new Promise((resolve, reject) => {
                mixedData.on('error', reject).on('progress',
                async (progress) => {
                    
                    let processEmbedEdit = new Discord.MessageEmbed().setTitle(`Audio Processing.`)
                    processEmbedEdit.addField(`Processing: ${progress.targetSize} KB converted`, `Processing Audio`)
                    processEmbedEdit.setThumbnail(`https://media.discordapp.net/attachments/730811581046325348/748610998985818202/speaker.png`)
                    processEmbedEdit.setColor(` #00FFFF`)
                    processEmbedMsg.edit(processEmbedEdit)
                    console.log('Processing: ' + progress.targetSize + ' KB converted');
                }).on('end', () => {
                    console.log('Processing finished !');
                    resolve()
                }).saveToFile(outputMixed);
                console.log(`${outputMixed} IT IS HERE`);
            })
        }
        // mixedOutput.saveToFile(mergedOutputFile);
        await saveMp3(mixedOutput, mergedOutputFile);
        console.log(`${mixedOutput} IN HEREEEEEEE`);
        // We saved the recording, now copy the recording
        if (!fs.existsSync(`./public`)) {
            fs.mkdirSync(`./public`);
        }
        let sourceFile = `${__dirname}/recordings/${download_path}`
        console.log(`DOWNLOAD PATH HERE ${download_path}`)
        const guildName = message.guild.id;
        const serveExist = `/public/${guildName}`
        if (!fs.existsSync(`.${serveExist}`)) {
            fs.mkdirSync(`.${serveExist}`)
        }
        let destionationFile = `${__dirname}${serveExist}/${file_name}`

        let errorThrown = false
        try {
            fs.copySync(sourceFile, destionationFile);
        } catch (err) {
            errorThrown = true
            await message.channel.send(`Error: ${err.message}`)
        }
        const usersWithTag = finalArrWithIds.map(user => `\n <@${user}>`);
        let timeSpent = await getAudioDurationInSeconds(`public/${guildName}/${file_name}`)
        let timesSpentRound = Math.floor(timeSpent)
        let finalTimeSpent = timesSpentRound / 60
        let finalTimeForReal = Math.floor(finalTimeSpent)
        if(!errorThrown){
            //--------------------- server recording save START
            class GeneralRecords {
                constructor(generalLink, date, voice, time) {
                  this.generalLink = generalLink;
                  this.date = date;
                  this.note = `no note`;
                  this.voice = voice;
                  this.time = time
                }
              }
              let newGeneralRecordClassObject = new GeneralRecords(`${fqdn}/public/${guildName}/${file_name}`, date, usersWithTag, finalTimeForReal)
              let checkingServerRecord = await ServerRecords.exists({userid: `server`})
              if(checkingServerRecord === true){
                  existingServerRecord = await ServerRecords.findOne({userid: `server`})
                  existingServerRecord.content.push(newGeneralRecordClassObject)
                  await existingServerRecord.save()
              }
              if(checkingServerRecord === false){
                let serverRecord = new ServerRecords()
                serverRecord.userid = `server`
                serverRecord.content.push(newGeneralRecordClassObject)
                await serverRecord.save()
              }
              //--------------------- server recording save STOP
        }
        
        //--------------------- personal recording section START
        for( member of finalArrWithIds) {

        let personal_download_path = message.member.voice.channel.id + `/${member}/` + file_name;
        let sourceFilePersonal = `${__dirname}/recordings/${personal_download_path}`
        let destionationFilePersonal = `${__dirname}${serveExist}/${member}/${file_name}`
        await fs.copySync(sourceFilePersonal, destionationFilePersonal);
        const user = client.users.cache.get(member);
        console.log(user, `user here`);
        try {
            ffmpeg.setFfmpegPath(ffmpegInstaller.path);
          
            ffmpeg(`public/${guildName}/${member}/${file_name}`)
             .audioFilters('silenceremove=stop_periods=-1:stop_duration=1:stop_threshold=-90dB')
             .output(`public/${guildName}/${member}/personal-${file_name}`)
             .on(`end`, function () {
               console.log(`DONE`);
             })
             .on(`error`, function (error) {
               console.log(`An error occured` + error.message)
             })
             .run();
             
          }
          catch (error) {
          console.log(error)
          }
        

        // ----------------- SAVING PERSONAL RECORDING TO DATABASE START
        class PersonalRecords {
            constructor(generalLink, personalLink, date, time) {
              this.generalLink = generalLink;
              this.personalLink = personalLink;
              this.date = date;
              this.note = `no note`;
              this.time = time;
            }
          }
          let timeSpentPersonal = await getAudioDurationInSeconds(`public/${guildName}/${file_name}`)
          let timesSpentRoundPersonal = Math.floor(timeSpentPersonal)
          let finalTimeSpentPersonal = timesSpentRoundPersonal / 60
          let finalTimeForRealPersonal = Math.floor(finalTimeSpentPersonal)
          let newPersonalRecordClassObject = new PersonalRecords(`${fqdn}/public/${guildName}/${file_name}`, `${fqdn}/public/${guildName}/${member}/personal-${file_name}`, date, finalTimeForRealPersonal)

           let checkingUserRecord = await UserRecords.exists({userid: member})
              if(checkingUserRecord === true){
                  existingUserRecord = await UserRecords.findOne({userid: member})
                  existingUserRecord.content.push(newPersonalRecordClassObject)
                  await existingUserRecord.save()
              }
              if(checkingUserRecord === false){
                let newRecord = new UserRecords()
                newRecord.userid = member
                newRecord.content.push(newPersonalRecordClassObject)
                await newRecord.save()
              }


       
        // ----------------- SAVING PERSONAL RECORDING TO DATABASE END
       

        const endPersonalEmbed = new Discord.MessageEmbed().setTitle(`Your performance was amazing ! Review it here :D`)
        endPersonalEmbed.setColor('#9400D3')
        endPersonalEmbed.setThumbnail(`https://media.discordapp.net/attachments/730811581046325348/745381641324724294/vinyl.png`)
        endPersonalEmbed.addField(`1