
Other articles (53)

  • Updating from version 0.1 to 0.2

    24 June 2013

    Explanation of the notable changes when upgrading MediaSPIP from version 0.1 to version 0.2. What's new?
    Software dependencies: use of the latest FFmpeg releases (>= v1.2.1); installation of the dependencies for Smush; installation of MediaInfo and FFprobe for metadata retrieval; ffmpeg2theora is no longer used; flvtool2 is no longer installed, in favour of flvtool++; ffmpeg-php, which is no longer maintained, is no longer installed (...)

  • Personalise by adding your logo, banner or background image

    5 September 2013

    Some themes support three personalisation elements: adding a logo; adding a banner; adding a background image.

  • Writing a news item

    21 June 2013

    Present the changes in your MediaSPIP, or news about your projects, using the news section.
    In MediaSPIP's default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
    You can customise the news item creation form.
    News item creation form: for a document of the "news item" type, the fields offered by default are: publication date (customise the publication date) (...)

On other sites (4649)

  • Can build & make video call with pjsip and ffmpeg

    10 May 2023, by QViet

    I am trying to build PJSIP with ffmpeg. I followed these steps:

    • Build the needed libraries and place them in the thirdparty folder, named ffmpeg.

    • Set up the library and header links.

    • Run the build with "$ configure --with-ffmpeg=".

    • Add the following to config_site.h:
    


    

    #define PJMEDIA_HAS_OPENH264_CODEC 1
    #define PJMEDIA_HAS_VIDEO 1
    #define PJMEDIA_VIDEO_DEV_HAS_FFMPEG 1
    #define PJMEDIA_HAS_FFMPEG_VID_CODEC 1
    #define PJMEDIA_HAS_FFMPEG 1
    #define PJMEDIA_HAS_FFMPEG_CODEC_H264 1
    #define PJMEDIA_HAS_LIBAVDEVICE 1


    


    


    I found that I have to enable PJMEDIA_HAS_OPENH264_CODEC; if I don't, the build succeeds, but importing fails with this error:

    


    Undefined symbol: _WelsCreateDecoder


    


    because the Wels symbols provided by the openh264 library are missing.

    


    The build eventually succeeds with the above config, but in this call:

    


       pj_status_t status = pjsua_vid_enum_codecs(videoCodecInfo, &videoCodecCount);


    


    the codec info shows only one codec, "H264/97", which is the OpenH264 codec; I can't see ffmpeg here.
When I start a call as normal, the log shows OpenH264 initialising the call and opening the camera.

    


    What steps do I need to take to use ffmpeg? I can't find any docs about it.

    


    Can you help me?

    


    **This is the log call stack:**

    


    2023-04-24 10:17:21.522976+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.523 [SIPSample -[SIPSample startEndpointWithEndpointConfiguration:error:]:272] Creating new PSUASIP Endpoint instance.
10:17:21.525         os_core_unix.c !pjlib 2.13-dev for POSIX initialized
10:17:21.526         sip_endpoint.c  .Creating endpoint instance...
10:17:21.527                  pjlib  .select() I/O Queue created (0x1050a32c8)
10:17:21.527         sip_endpoint.c  .Module "mod-msg-print" registered
10:17:21.527        sip_transport.c  .Transport manager created.
10:17:21.527           pjsua_core.c  .PJSUA state changed: NULL --> CREATED
2023-04-24 10:17:21.528077+0700 PSUAKitSample[83000:15642817] 10:17:21.528         sip_endpoint.c  .Module "mod-pjsua-log" registered

2023-04-24 10:17:21.528204+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.528 [SIPSample void logCallBack(int, const char *, int):1034]         sip_endpoint.c  .Module "mod-PSUA-log" registered
2023-04-24 10:17:21.529375+0700 PSUAKitSample[83000:15642817] 10:17:21.529         sip_endpoint.c  .Module "mod-tsx-layer" registered

2023-04-24 10:17:21.529477+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.529 [SIPSample void logCallBack(int, const char *, int):1034]         sip_endpoint.c  .Module "mod-tsx-layer" registered
2023-04-24 10:17:21.529491+0700 PSUAKitSample[83000:15642817] 10:17:21.529         sip_endpoint.c  .Module "mod-stateful-util" registered

2023-04-24 10:17:21.529592+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.530 [SIPSample void logCallBack(int, const char *, int):1034]         sip_endpoint.c  .Module "mod-stateful-util" registered
2023-04-24 10:17:21.529895+0700 PSUAKitSample[83000:15642817] 10:17:21.529         sip_endpoint.c  .Module "mod-ua" registered

2023-04-24 10:17:21.530024+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.530 [SIPSample void logCallBack(int, const char *, int):1034]         sip_endpoint.c  .Module "mod-ua" registered
2023-04-24 10:17:21.530068+0700 PSUAKitSample[83000:15642817] 10:17:21.530         sip_endpoint.c  .Module "mod-100rel" registered

2023-04-24 10:17:21.530181+0700 PSUAKitSample[83000:15642817] 10:17:21.530         sip_endpoint.c  .Module "mod-pjsua" registered

2023-04-24 10:17:21.530217+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.530 [SIPSample void logCallBack(int, const char *, int):1034]         sip_endpoint.c  .Module "mod-100rel" registered
2023-04-24 10:17:21.530283+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.530 [SIPSample void logCallBack(int, const char *, int):1034]         sip_endpoint.c  .Module "mod-PSUA" registered
2023-04-24 10:17:21.530865+0700 PSUAKitSample[83000:15642817] 10:17:21.530         sip_endpoint.c  .Module "mod-invite" registered

2023-04-24 10:17:21.530970+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.531 [SIPSample void logCallBack(int, const char *, int):1034]         sip_endpoint.c  .Module "mod-invite" registered
2023-04-24 10:17:21.677206+0700 PSUAKitSample[83000:15642817] 10:17:21.677        coreaudio_dev.c  .. dev_id 0: iPhone IO device  (in=1, out=1) 8000Hz

2023-04-24 10:17:21.677497+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.677 [SIPSample void logCallBack(int, const char *, int):1034]        coreaudio_dev.c  .. dev_id 0: iPhone IO device  (in=1, out=1) 8000Hz
2023-04-24 10:17:21.677588+0700 PSUAKitSample[83000:15642817] 10:17:21.677        coreaudio_dev.c  ..core audio initialized

2023-04-24 10:17:21.677804+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.678 [SIPSample void logCallBack(int, const char *, int):1034]        coreaudio_dev.c  ..core audio initialized
2023-04-24 10:17:21.678538+0700 PSUAKitSample[83000:15642817] 10:17:21.678                  pjlib  ..select() I/O Queue created (0x1060684a8)

2023-04-24 10:17:21.678743+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.679 [SIPSample void logCallBack(int, const char *, int):1034]                  pjlib  ..select() I/O Queue created (0x1060684a8)
2023-04-24 10:17:21.683380+0700 PSUAKitSample[83000:15642817] 10:17:21.683            pjsua_vid.c  ..Initializing video subsystem..

2023-04-24 10:17:21.683585+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.683 [SIPSample void logCallBack(int, const char *, int):1034]            PSUA_vid.c  ..Initializing video subsystem..
2023-04-24 10:17:21.684058+0700 PSUAKitSample[83000:15642817] 10:17:21.684             vid_conf.c  ...Created video conference bridge with 32 ports

2023-04-24 10:17:21.684260+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.684 [SIPSample void logCallBack(int, const char *, int):1034]             vid_conf.c  ...Created video conference bridge with 32 ports
2023-04-24 10:17:21.684983+0700 PSUAKitSample[83000:15642817] 10:17:21.684           openh264.cpp  ...OpenH264 codec initialized

2023-04-24 10:17:21.685168+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.685 [SIPSample void logCallBack(int, const char *, int):1034]           openh264.cpp  ...OpenH264 codec initialized
2023-04-24 10:17:21.685237+0700 PSUAKitSample[83000:15642817] 10:17:21.685           opengl_dev.c  ...OpenGL device initialized

2023-04-24 10:17:21.685370+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.685 [SIPSample void logCallBack(int, const char *, int):1034]           opengl_dev.c  ...OpenGL device initialized
2023-04-24 10:17:21.715616+0700 PSUAKitSample[83000:15642817] 10:17:21.715           darwin_dev.m  ...Darwin video initialized with 5 devices:

2023-04-24 10:17:21.715796+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.716 [SIPSample void logCallBack(int, const char *, int):1034]           darwin_dev.m  ...Darwin video initialized with 5 devices:
2023-04-24 10:17:21.715812+0700 PSUAKitSample[83000:15642817] 10:17:21.715           darwin_dev.m  ... 0: [Renderer] iOS - UIView

2023-04-24 10:17:21.715917+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.716 [SIPSample void logCallBack(int, const char *, int):1034]           darwin_dev.m  ... 0: [Renderer] iOS - UIView
2023-04-24 10:17:21.715921+0700 PSUAKitSample[83000:15642817] 10:17:21.715           darwin_dev.m  ... 1: [Capturer] AVF - Front Camera

2023-04-24 10:17:21.716006+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.716 [SIPSample void logCallBack(int, const char *, int):1034]           darwin_dev.m  ... 1: [Capturer] AVF - Front Camera
2023-04-24 10:17:21.716033+0700 PSUAKitSample[83000:15642817] 10:17:21.716           darwin_dev.m  ... 2: [Capturer] AVF - Back Camera

2023-04-24 10:17:21.716137+0700 PSUAKitSample[83000:15642817] 10:17:21.716           darwin_dev.m  ... 3: [Capturer] AVF - Back Dual Camera

2023-04-24 10:17:21.716152+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.716 [SIPSample void logCallBack(int, const char *, int):1034]           darwin_dev.m  ... 2: [Capturer] AVF - Back Camera
2023-04-24 10:17:21.716218+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.716 [SIPSample void logCallBack(int, const char *, int):1034]           darwin_dev.m  ... 3: [Capturer] AVF - Back Dual Camera
2023-04-24 10:17:21.716247+0700 PSUAKitSample[83000:15642817] 10:17:21.716           darwin_dev.m  ... 4: [Capturer] AVF - Back Telephoto Camera

2023-04-24 10:17:21.716375+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.716 [SIPSample void logCallBack(int, const char *, int):1034]           darwin_dev.m  ... 4: [Capturer] AVF - Back Telephoto Camera
2023-04-24 10:17:21.716409+0700 PSUAKitSample[83000:15642817] 10:17:21.716         colorbar_dev.c  ...Colorbar video src initialized with 2 device(s):

2023-04-24 10:17:21.716673+0700 PSUAKitSample[83000:15642817] 10:17:21.716         colorbar_dev.c  ... 0: Colorbar generator

2023-04-24 10:17:21.716764+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.717 [SIPSample void logCallBack(int, const char *, int):1034]         colorbar_dev.c  ...Colorbar video src initialized with 2 device(s):
2023-04-24 10:17:21.716918+0700 PSUAKitSample[83000:15642817] 10:17:21.716         colorbar_dev.c  ... 1: Colorbar-active

2023-04-24 10:17:21.716938+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.717 [SIPSample void logCallBack(int, const char *, int):1034]         colorbar_dev.c  ... 0: Colorbar generator
2023-04-24 10:17:21.717192+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.717 [SIPSample void logCallBack(int, const char *, int):1034]         colorbar_dev.c  ... 1: Colorbar-active
2023-04-24 10:17:21.717528+0700 PSUAKitSample[83000:15642817] 10:17:21.717         sip_endpoint.c  .Module "mod-evsub" registered

2023-04-24 10:17:21.717645+0700 PSUAKitSample[83000:15642975] 💚 DEBUG   10:17:21.718 [SIPSample void logCallBack(int, const char *, int):1034]         sip_endpoint.c  .Module "mod-evsub" registered
2023-04-24 10:17:21.717710+0700 PSUAKitSample[83000:15642817] 10:17:21.717         sip_endpoint.c  .Module "mod-presence" registered


    


  • Using PyAV to encode mono audio to file, params match docs, but still causes Errno 22

    20 February 2023, by andrew8088

    While trying to use PyAV to encode live mono audio from a microphone to a compressed audio stream (using mp2 or flac as encoder), the program kept raising an exception ValueError: [Errno 22] Invalid argument.

    


    To remove the live microphone source as a cause of the problem, and to make the problematic code easier for others to run/test, I have removed the mic source and now just generate a pure tone as a sequence of input buffers.

    


    All attempts to figure out the missing or mismatched or incorrect argument have just resulted in seeing documentation and examples that are the same as my code.

    


    I would like to know from someone who has used PyAV successfully for mono audio what the correct method and parameters are for encoding mono frames into the mono stream.

    


    The package used is av 10.0.0 installed with
pip3 install av --no-binary av
so it uses my package-manager provided ffmpeg library, which is version 4.2.7.
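
    A quick way to confirm which FFmpeg build the installed av package is actually linked against is something like the sketch below (attribute names assumed from recent PyAV releases; worth double-checking against the installed version):

    import av

    print(av.__version__)        # e.g. 10.0.0
    print(av.library_versions)   # linked FFmpeg library versions, e.g. {'libavutil': (56, 31, 100), ...}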

    


    The problematic Python code is:

    


    #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Recreating an error 22 when encoding sound with PyAV.

Created on Sun Feb 19 08:10:29 2023
@author: andrewm
"""
import typing
import sys
import math
import fractions

import av
from av import AudioFrame

""" Ensure some PyAudio constants are still defined without changing 
    the PyAudio recording callback function and without depending 
    on PyAudio simply for reproducing the PyAV bug [Errno 22] thrown in 
    File "av/filter/context.pyx", line 89, in av.filter.context.FilterContext.push
"""
class PA_Stub():
    paContinue = True
    paComplete= False

pyaudio = PA_Stub()


"""Generate pure tone at given frequency with amplitude 0...1.0 at 
   sampling frequency fs and beginning at phase offset 'phase'.
   Returns the new phase after the sinusoid has cycled over the 
   sampling window length.
"""
def generate_tone(
        freq:int, phase:float, amp:float, fs, samp_fmt, buffer:bytearray
) -> float:
    assert samp_fmt == "s16", "Only s16 supported atm"
    samp_size_bytes = 2
    n_samples = int(len(buffer)/samp_size_bytes)
    window = [int(0) for i in range(n_samples)]
    theta = phase
    phase_inc = 2*math.pi * freq / fs
    for i in range(n_samples):
        v = amp * math.sin(theta)
        theta += phase_inc
        s = int((2**15-1)*v)
        window[i] = s
    for sample_i in range(len(window)):
        byte_i = sample_i * samp_size_bytes
        enc = window[sample_i].to_bytes(
                2, byteorder=sys.byteorder, signed=True
        )
        buffer[byte_i] = enc[0]
        buffer[byte_i+1] = enc[1]
    return theta


channels = 1
fs = 44100  # Record at 44100 samples per second
fft_size_samps = 256
chunk_samps = fft_size_samps * 10  # Record in chunks that are multiples of fft windows.

# print(f"fft_size_samps={fft_size_samps}\nchunk_samps={chunk_samps}")

seconds = 3.0
out_filename = "testoutput.wav"

# Store data in chunks for 3 seconds
sample_limit = int(fs * seconds)
sample_len = 0
frames = []  # Initialize array to store frames

ffmpeg_codec_name = 'mp2'  # flac, mp3, or libvorbis make same error.

sample_size_bytes = 2
buffer = bytearray(int(chunk_samps*sample_size_bytes))
chunkperiod = chunk_samps / fs
total_chunks = int(math.ceil(seconds / chunkperiod))
phase = 0.0

### uncomment if you want to see the synthetic data being used as a mic input.
# with open("test.raw","wb") as raw_out:
#     for ci in range(total_chunks):
#         phase = generate_tone(2600, phase, 0.8, fs, "s16", buffer)
#         raw_out.write(buffer)
# print("finished gen test")
# sys.exit(0)
# #---- 

# Using mp2 or mkv as the container format gets the same error.
with av.open(out_filename+'.mp2', "w", format="mp2") as output_con:
    output_con.metadata["title"] = "My title"
    output_con.metadata["key"] = "value"
    channel_layout = "mono"
    sample_fmt = "s16p"

    ostream = output_con.add_stream(ffmpeg_codec_name, fs, layout=channel_layout)
    assert ostream is not None, "No stream!"
    cctx = ostream.codec_context
    cctx.sample_rate = fs
    cctx.time_base = fractions.Fraction(numerator=1,denominator=fs)
    cctx.format = sample_fmt
    cctx.channels = channels
    cctx.layout = channel_layout
    print(cctx, f"layout#{cctx.channel_layout}")
    
    # Define PyAudio-style callback for recording plus PyAV transcoding.
    def rec_callback(in_data, frame_count, time_info, status):
        global sample_len
        global ostream
        frames.append(in_data)
        nsamples = int(len(in_data) / (channels*sample_size_bytes))
        
        frame = AudioFrame(format=sample_fmt, layout=channel_layout, samples=nsamples)
        frame.sample_rate = fs
        frame.time_base = fractions.Fraction(numerator=1,denominator=fs)
        frame.pts = sample_len
        frame.planes[0].update(in_data)
        print(frame, len(in_data))
        
        for out_packet in ostream.encode(frame):
            output_con.mux(out_packet)
        for out_packet in ostream.encode(None):
            output_con.mux(out_packet)
        
        sample_len += nsamples
        retflag = pyaudio.paContinue if sample_len < sample_limit else pyaudio.paComplete
        return (in_data, retflag)

    # Drive the callback with the synthetic tone buffers; reconstructed from the
    # traceback and console output below, since the original post is cut off here.
    print("Beginning")
    for ci in range(total_chunks):
        phase = generate_tone(2600, phase, 0.8, fs, "s16", buffer)
        ret_data, ret_flag = rec_callback(buffer, ci, {}, 1)

    


    If you uncomment the RAW output part you will find the generated data can be imported as PCM s16 Mono 44100Hz into Audacity and plays the expected tone, so the generated audio data does not seem to be the problem.
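
    For reference, a small sketch like the following (assuming NumPy is installed; the file name and format follow the commented-out block above) can check the raw data without Audacity, by estimating the tone frequency from zero crossings:

    import numpy as np

    fs = 44100
    samples = np.fromfile("test.raw", dtype=np.int16)   # s16 mono, native byte order
    signs = np.signbit(samples).astype(np.int8)
    crossings = np.count_nonzero(np.diff(signs))        # two zero crossings per cycle
    duration_s = len(samples) / fs
    print(f"{len(samples)} samples, ~{crossings / (2 * duration_s):.0f} Hz")  # expect ~2600 Hz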

    


    The normal program console output up until the exception is:

    


    mp2 at 0x7f8e38202cf0> layout#4
Beginning
 5120
. 5120


    


    The stack trace is:

    


    Traceback (most recent call last):
      File "Dev/multichan_recording/av_encode.py", line 147, in <module>
        ret_data, ret_flag = rec_callback(buffer, ci, {}, 1)
      File "Dev/multichan_recording/av_encode.py", line 121, in rec_callback
        for out_packet in ostream.encode(frame):
      File "av/stream.pyx", line 153, in av.stream.Stream.encode
      File "av/codec/context.pyx", line 484, in av.codec.context.CodecContext.encode
      File "av/audio/codeccontext.pyx", line 42, in av.audio.codeccontext.AudioCodecContext._prepare_frames_for_encode
      File "av/audio/resampler.pyx", line 101, in av.audio.resampler.AudioResampler.resample
      File "av/filter/graph.pyx", line 211, in av.filter.graph.Graph.push
      File "av/filter/context.pyx", line 89, in av.filter.context.FilterContext.push
      File "av/error.pyx", line 336, in av.error.err_check
    ValueError: [Errno 22] Invalid argument


    Edit: it's interesting that the error happens on the 2nd AudioFrame, as the first one is apparently encoded okay. The frames are given the same attribute values apart from the presentation timestamp (pts), but leaving the pts out and letting PyAV/ffmpeg generate it does not fix the error, so an incorrect PTS does not seem to be the cause.


    After a brief glance at av/filter/context.pyx, the exception must come from a bad return value from res = lib.av_buffersrc_write_frame(self.ptr, frame.ptr).
    Digging into av_buffersrc_write_frame in the ffmpeg source, it is not clear what could be causing this error. The only obvious candidate is a mismatch between channel layouts, but my code sets the same layout on both the Stream and the Frame. That problem had already been found by an old question, "pyav - cannot save stream as mono", and its answer (that one required parameter is undocumented) is the only reason the code now has the layout='mono' argument when creating the stream.


    The program output shows layout #4 is being used, and from https://github.com/FFmpeg/FFmpeg/blob/release/4.2/libavutil/channel_layout.h you can see this is the value of the symbol AV_CH_FRONT_CENTER, which is the only channel in the MONO layout.


    The mismatch is surely some other object property or an undocumented parameter requirement.


    How do you encode mono audio to a compressed stream with PyAV?

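    As a point of comparison, here is a minimal, hedged sketch (not the poster's code; it assumes PyAV 10's AudioFrame.from_ndarray and add_stream(..., rate=..., layout=...) behave as documented) that encodes a generated mono s16 tone to MP2. Unlike the callback above, it only flushes the encoder with encode(None) once, after the input loop:

    import math
    import numpy as np
    import av

    fs = 44100
    freq = 2600.0

    with av.open("tone.mp2", mode="w", format="mp2") as container:
        # 'layout' mirrors the undocumented parameter mentioned above.
        stream = container.add_stream("mp2", rate=fs, layout="mono")

        t = np.arange(int(fs * 3.0))  # 3 seconds of samples
        tone = (0.8 * np.sin(2 * math.pi * freq * t / fs) * (2**15 - 1)).astype(np.int16)

        pts = 0
        chunk = 2560  # samples per frame (5120 bytes of s16), as in the question
        for start in range(0, len(tone), chunk):
            block = tone[start:start + chunk]
            # Packed s16 mono: array shape is (1, n_samples).
            frame = av.AudioFrame.from_ndarray(block.reshape(1, -1), format="s16", layout="mono")
            frame.sample_rate = fs
            frame.pts = pts
            pts += block.shape[0]
            for packet in stream.encode(frame):
                container.mux(packet)

        for packet in stream.encode(None):  # flush once, at the very end
            container.mux(packet)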

  • Matomo’s privacy-friendly web analytics software named best of the year 2022

    25 January 2023, by Erin

    W3Tech names Matomo 'Traffic Analysis Tool of the Year 2022' in its Web Technologies of the Year list, which ranks the technologies that gained the most sites.

    Matomo, a world-leading open-source web analytics platform, is proud to announce that it has received W3Tech’s award for the best web analytics software in its Web Technologies of the Year 2022. Matomo is the first independent, open-source tool named Traffic Analysis Tool of the Year – with previous winners including Google Analytics and Facebook Pixel.


    W3Tech, a trusted source for web technology research, determines winners for its annual Web Technologies of the Year list by technologies that gained the most websites. W3Tech surveys usage across millions of websites globally – comparing the number of sites using a technology on January 1st of one year with the number of sites using it the following year.

    W3Tech, commenting on the Traffic Analysis Tool winners, said: “Matomo, the privacy-focused open source analytics platform, is the traffic analysis tool of the year for the first time, while Google Analytics and the other previous winners all lost a bit of market share in 2022. The Chinese Baidu Analytics ranks second this year. Snowplow, another open source tool, is an unexpected third.”


    Matomo, launched in 2007 as an open-source analytics alternative to Google Analytics, keeps businesses GDPR- and CCPA-compliant. Matomo is trusted by over 1.4 million websites in 220 countries and is translated into over 50 languages.


    Matomo founder Matthieu Aubry says, “As the first independent, open-source traffic analysis tool to receive this recognition, Matomo is humbled and honoured to lead the charge for change. It’s a testament to the hard work of our community, and it’s a clear sign that consumers and organisations are looking for ethical alternatives.


    “This recognition is a major win for the entire privacy movement and proves that the tide is turning against the big tech players who I believe have long prioritised profits over privacy. We are committed to continuing our work towards a more private and secure digital landscape for all.”


    In W3Tech’s Web Technologies of the Year 2022, Matomo was also ranked the third Tag Manager, behind Google Tag Manager and Adobe DTM.


    Matomo helps businesses and organisations track and optimise their online presence, allowing users to easily collect, analyse, and act on their website and marketing data to gain a deeper understanding of their visitors and drive conversions and revenue. With 100% data ownership, customers using the company’s tools get the power to protect their website users’ privacy and to control where their data is stored and what happens to it, without external influence. Furthermore, as the data is not sampled, it maintains accuracy.


    Aubry says its recent award is a positive reminder of how well this solution is performing internationally and is a testament to the exceptional quality and performance of Matomo’s powerful web analytics tools that respect a user’s privacy.


    “In 2020, the CJEU ruled US cloud servers don’t comply with GDPR. Then in 2022, the Austrian Data Protection Authority and French Data Protection Authority (CNIL) ruled that the use of Google Analytics is illegal due to data transfers to the US. With Matomo Cloud, the customer’s data is stored in Europe, and no data is transferred to the US. On the other hand, with Matomo On-Premise, the data is stored in your country of choice.


    “Matomo has also become one of the most popular open-source alternatives to Google Analytics for website owners and marketing teams because it empowers web professionals to make business decisions. Website investment, collateral, and arrangement are enriched by having the full picture and control of the data.”

    [Image: a laptop surrounded by multiple data screens from Matomo]

    About Matomo

    Matomo is a world-leading open-source web analytics platform, trusted by over 1.4 million websites in 220 countries and translated into over 50 languages. Matomo helps businesses and organisations track and optimise their online presence allowing users to easily collect, analyse, and act on their website and marketing data to gain a deeper understanding of their visitors and drive conversions and revenue. Matomo’s vision is to create, as a community, the leading open digital analytics platform that gives every user complete control of their data.

    For more information / press enquiries: Press – Matomo