
On other sites (104)

  • FFMPEG : I tried resizing a video, but got a different resolution than I wanted [closed]

    10 January, by wakanasakai

    I downloaded a video that had some black bars (left & right), so I used the following command in FFmpeg to make various changes to it. I tested it on a 10-second clip to see what the result would look like.

    -ss 00:04:44 -to 00:04:54 -vf "crop=1870:20:20:0","scale=640x480:flags=lanczos","eq=gamma=1.5:saturation=1.3:contrast=1.2"

    The original video is an mp4, with a resolution of 1920 x 1080. Besides trying to crop it & adjust the gamma, saturation, & contrast, I also tried to resize it to 640 x 480. Instead, its resulting resolution is 44880 x 480 ! I have a link to it for anybody who wants to examine it directly. (It's only 487 kb.)

    I've tried using FFmpeg before, & it never did anything so insane. (It cropped it & adjusted the gamma & saturation (I didn't test the contrast until THIS time), but it did not resize it at all.)

    Here is FFmpeg's log file for it. Guesses as to the cause of the insane result, & advice on how to achieve the DESIRED result (in 1 pass, if possible) are requested.

    ffmpeg -hwaccel auto -y -i "/storage/emulated/0/bluetooth/Barbie & the Rockers=1080-Out of this world (1987).mp4" -ss 00:04:44 -to 00:04:54 -vf "crop=1870:20:20:0","scale=640x480:flags=lanczos","eq=gamma=1.5:saturation=1.3:contrast=1.2" "/storage/emulated/0/Movies/Barbie.mp4"

ffmpeg version 6.0 Copyright (c) 2000-2023 the FFmpeg developers
  built with gcc 4.9.x (GCC) 20150123 (prerelease)
  configuration: --enable-version3 --enable-gpl --enable-nonfree --disable-indev=v4l2 --enable-libmp3lame --enable-libx264 --enable-libx265 --enable-libvpx --enable-libvorbis --enable-libtheora --enable-libopus --enable-libfdk-aac --enable-libfreetype --enable-libass --enable-libfribidi --enable-fontconfig --enable-pthreads --enable-libxvid --enable-filters --enable-openssl --enable-librtmp --disable-protocol='udp,udplite' --enable-libopencore-amrwb --enable-libopencore-amrnb --enable-libvo-amrwbenc --enable-libspeex --enable-libsoxr --enable-libwebp --enable-libxml2 --enable-libopenh264 --enable-jni --prefix=/home/silentlexx/AndroidstudioProjects/ffmpeg/ffmpeg/build/arm-api18-r13b --sysroot=/home/silentlexx/Android/android-ndk-r13b/platforms/android-18/arch-arm --arch=arm --disable-shared --enable-static --enable-pic --enable-ffmpeg --disable-ffplay --disable-ffprobe --disable-ffnvcodec --disable-avdevice --disable-debug --disable-doc --disable-htmlpages --disable-manpages --disable-podpages --disable-txtpages --disable-symver --cross-prefix=/home/silentlexx/Android/android-ndk-r13b/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64/bin/arm-linux-androideabi- --target-os=android --enable-cross-compile --pkg-config-flags=--static --extra-libs='-lgnustl_static -lm -lpng -l:libz.so -lpthread' --enable-asm --enable-neon --enable-small
  libavutil      58.  2.100 / 58.  2.100
  libavcodec     60.  3.100 / 60.  3.100
  libavformat    60.  3.100 / 60.  3.100
  libavfilter     9.  3.100 /  9.  3.100
  libswscale      7.  1.100 /  7.  1.100
  libswresample   4. 10.100 /  4. 10.100
  libpostproc    57.  1.100 / 57.  1.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from '/storage/emulated/0/bluetooth/Barbie & the Rockers=1080-Out of this world (1987).mp4':
  Metadata:
    major_brand     : mp42
    minor_version   : 512
    compatible_brands: mp41isomiso2
    creation_time   : 2024-01-04T01:46:07.000000Z
  Duration: 00:45:33.10, start: 0.000000, bitrate: 3404 kb/s
  Stream #0:0[0x1](und): Video: h264 (avc1 / 0x31637661), yuv420p(tv, bt709, progressive), 1920x1080 [SAR 1:1 DAR 16:9], 3272 kb/s, 30 fps, 30 tbr, 15360 tbn (default)
    Metadata:
      creation_time   : 2023-06-25T13:25:03.000000Z
      vendor_id       : [0][0][0][0]
  Stream #0:1[0x2](eng): Audio: aac (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
    Metadata:
      creation_time   : 2023-06-25T13:25:03.000000Z
      vendor_id       : [0][0][0][0]
Stream mapping:
  Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
  Stream #0:1 -> #0:1 (aac (native) -> aac (native))
Press [q] to stop, [?] for help
[libx264 @ 0xf38cd180] using SAR=561/8
[libx264 @ 0xf38cd180] using cpu capabilities: ARMv6 NEON
[libx264 @ 0xf38cd180] profile High, level 3.0, 4:2:0, 8-bit
[libx264 @ 0xf38cd180] 264 - core 158 r2984 3759fcb - H.264/MPEG-4 AVC codec - Copyleft 2003-2019 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=12 lookahead_threads=2 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
Output #0, mp4, to '/storage/emulated/0/Movies/Barbie.mp4':
  Metadata:
    major_brand     : mp42
    minor_version   : 512
    compatible_brands: mp41isomiso2
    encoder         : Lavf60.3.100
  Stream #0:0(und): Video: h264 (avc1 / 0x31637661), yuv420p(tv, bt709, progressive), 640x480 [SAR 561:8 DAR 187:2], q=2-31, 30 fps, 15360 tbn (default)
    Metadata:
      creation_time   : 2023-06-25T13:25:03.000000Z
      vendor_id       : [0][0][0][0]
      encoder         : Lavc60.3.100 libx264
    Side data:
      cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: N/A
  Stream #0:1(eng): Audio: aac (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
    Metadata:
      creation_time   : 2023-06-25T13:25:03.000000Z
      vendor_id       : [0][0][0][0]
      encoder         : Lavc60.3.100 aac
frame=    0 fps=0.0 q=0.0 size=       0kB time=-577014:32:22.77 bitrate=  -0.0kbits/s speed=N/A    
frame=    0 fps=0.0 q=0.0 size=       0kB time=00:00:00.16 bitrate=   2.4kbits/s speed=0.00197x    
frame=    0 fps=0.0 q=0.0 size=       0kB time=00:00:00.71 bitrate=   0.5kbits/s speed=0.00867x    
frame=   13 fps=0.2 q=29.0 size=       0kB time=00:00:01.48 bitrate=   0.3kbits/s speed=0.0178x    
frame=   45 fps=0.5 q=29.0 size=       0kB time=00:00:02.55 bitrate=   0.2kbits/s speed=0.0304x    
frame=   78 fps=0.9 q=29.0 size=       0kB time=00:00:03.66 bitrate=   0.1kbits/s speed=0.0434x    
frame=  114 fps=1.3 q=29.0 size=       0kB time=00:00:04.85 bitrate=   0.1kbits/s speed=0.057x    
frame=  146 fps=1.7 q=29.0 size=       0kB time=00:00:05.92 bitrate=   0.1kbits/s speed=0.0692x    
frame=  178 fps=2.1 q=29.0 size=       0kB time=00:00:07.03 bitrate=   0.1kbits/s speed=0.0817x    
frame=  209 fps=2.4 q=29.0 size=     256kB time=00:00:08.03 bitrate= 261.1kbits/s speed=0.0928x    
frame=  240 fps=2.8 q=29.0 size=     256kB time=00:00:09.07 bitrate= 231.0kbits/s speed=0.104x    
frame=  300 fps=3.4 q=-1.0 Lsize=     445kB time=00:00:09.98 bitrate= 365.2kbits/s speed=0.114x    
video:275kB audio:159kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 2.692692%
[libx264 @ 0xf38cd180] frame I:10    Avg QP:20.34  size:  2434
[libx264 @ 0xf38cd180] frame P:129   Avg QP:21.89  size:  1292
[libx264 @ 0xf38cd180] frame B:161   Avg QP:21.69  size:   555
[libx264 @ 0xf38cd180] consecutive B-frames: 20.0% 18.7% 20.0% 41.3%
[libx264 @ 0xf38cd180] mb I  I16..4: 30.2% 66.5%  3.2%
[libx264 @ 0xf38cd180] mb P  I16..4: 14.3% 17.7%  0.2%  P16..4: 12.7%  2.7%  0.4%  0.0%  0.0%    skip:52.1%
[libx264 @ 0xf38cd180] mb B  I16..4:  2.1%  1.1%  0.0%  B16..8: 21.9%  1.7%  0.0%  direct: 1.5%  skip:71.6%  L0:46.0% L1:53.0% BI: 1.0%
[libx264 @ 0xf38cd180] 8x8 transform intra:54.9% inter:98.2%
[libx264 @ 0xf38cd180] coded y,uvDC,uvAC intra: 10.3% 14.9% 1.5% inter: 2.2% 5.4% 0.0%
[libx264 @ 0xf38cd180] i16 v,h,dc,p: 93%  2%  2%  4%
[libx264 @ 0xf38cd180] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 69%  1% 28%  0%  0%  1%  0%  0%  0%
[libx264 @ 0xf38cd180] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 76%  3% 17%  1%  1%  2%  0%  1%  0%
[libx264 @ 0xf38cd180] i8c dc,h,v,p: 45%  2% 53%  1%
[libx264 @ 0xf38cd180] Weighted P-Frames: Y:0.8% UV:0.8%
[libx264 @ 0xf38cd180] ref P L0: 57.0%  8.7% 24.0% 10.4%
[libx264 @ 0xf38cd180] ref B L0: 79.7% 17.3%  3.0%
[libx264 @ 0xf38cd180] ref B L1: 95.6%  4.4%
[libx264 @ 0xf38cd180] kb/s:224.32
[aac @ 0xf38cd880] Qavg: 457.489
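
    The log already contains the clue. libx264 reports "using SAR=561/8", and the output stream is "640x480 [SAR 561:8 DAR 187:2]" ; players multiply the stored width by the sample aspect ratio, and 640 × 561/8 = 44880, which is exactly the "insane" resolution observed. The root cause is the crop : crop=1870:20:20:0 keeps only 20 pixels of height (most likely a typo for 1080), and scale then preserves that extreme display aspect ratio by setting a non-square SAR instead of stretching the pixels. Below is a minimal sketch of a corrected command, assuming the intended crop height was the full 1080 pixels and that square pixels (setsar=1) are wanted so the file really displays at 640 x 480 ; input.mp4 and output.mp4 are placeholders :

    ffmpeg -hwaccel auto -y -i input.mp4 -ss 00:04:44 -to 00:04:54 -vf "crop=1870:1080:20:0,scale=640:480:flags=lanczos,setsar=1,eq=gamma=1.5:saturation=1.3:contrast=1.2" output.mp4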


  • Unveiling GA4 Issues : 8 Questions from a Marketer That GA4 Can’t Answer

    8 January, by Alex

    It’s hard to believe, but Universal Analytics had a lifespan of 11 years, from its announcement in March 2012. Despite occasional criticism, this service established standards for the entire web analytics industry. Many metrics and reports became benchmarks for a whole generation of marketers. It truly was an era.

    For instance, a lot of marketers got used to starting each workday by inspecting dashboards and standard traffic reports in the Universal Analytics web interface. There were so, so many of those days. They became so accustomed to Universal Analytics that they would enter reports, manipulate numbers, and play with metrics almost on autopilot, without much thought.

    However, six months have passed since the sunset of Universal Analytics – precisely on July 1, 2023, when Google stopped processing data for properties still using the previous version of Google Analytics. The time when data about visitors and their interactions with the website was more clearly structured within the UA paradigm is now in the past. GA4 has brought a plethora of opportunities to marketers, but along with those opportunities came a series of complexities.

    GA4 issues

    Since its initial announcement in 2020, GA4 has been plagued with errors and inconsistencies. It still has poor and sometimes illogical documentation, numerous restrictions, and peculiar interface solutions. But more importantly, the barrier to entry into web analytics has significantly increased.

    If you diligently follow GA4 updates, read the documentation, and have skills in working with data (SQL and basic statistics), you probably won’t run into any problems – you know how to set up a convenient and efficient environment for your product and marketing data. But what if you’re not that proficient ? That’s when the issues arise.

    In this article, we try to address a series of straightforward questions that less experienced users – marketers, project managers, SEO specialists, and others – want answers to. They have no time to delve into the intricacies of GA4 but need the fundamentals that are crucial to their day-to-day work.

    Previously, in Universal Analytics, they could quickly and conveniently address their issues. Now, the situation has become, to put it mildly, more complex. We’ve identified 8 such questions for which the current version of GA4 either fails to provide answers or implies that answers would require significant enhancements. So, let’s dive into them one by one.

    Question 1 : What are the most popular traffic sources on my website ?

    Seemingly a straightforward question. What does GA4 tell us ? It responds with a question : “Which traffic source parameter are you interested in ?”

    GA4 traffic source

    Wait, what ?

    People just want to know which resources bring them the most traffic. Is that really an issue ?

    Unfortunately, yes. In GA4, there are not one, not two, but three traffic source parameters :

    1. Session source.
    2. First User Source – the source of the first session for each user.
    3. Just the source – determined at the event or conversion level.

    If you wanted to open a report and draw conclusions quickly, we have bad news for you. Before you start ranking your traffic sources by popularity, you first have to think about which parameter, and in what context, you are going to look at. And even once you’ve decided, you still need to choose between two standard reports : User Acquisition or Traffic Acquisition.

    Yes, there is a difference between them : the first uses the First User Source parameter, and the second uses the session source. And you need to figure that out too.

    Question 2 : What is my conversion rate ?

    This question concerns everyone, and it should be simple, implying a straightforward answer. But no.

    GA4 conversion rate

    In GA4, there are three conversion metrics (yes, three) :

    1. Session conversion – the percentage of sessions with a conversion.
    2. User conversion – the percentage of users who completed a conversion.
    3. First-time Purchaser Conversion – the share of active users who made their first purchase.

    Setting the last metric aside, GA4 users still have to choose between the remaining two. But what’s next ? Which parameters should be used for comparison ? Session source or user source ? What if you want to see the conversion rate for a specific event ? And how do you do this in explorations rather than in standard reports ?

    In the end, instead of an answer to a simple question, marketers get a bunch of new questions.

    Question 3. Can I trust user and session metrics ?

    Unfortunately, no. This may boggle the mind of those not well-versed in the mechanics of calculating user and session metrics, but it’s the plain truth : the numbers in GA4 and those in reality may and will differ.

    GA4 confidence levels

    The reason is that GA4 uses the HyperLogLog++ statistical algorithm to count unique values. Without delving into details, it’s a mechanism for approximate estimation of a metric with a certain level of error.

    This error level is quite well-documented. For instance, for the Total Users metric, the error level is 1.63% (for a 95% confidence interval). In simple terms, this means that 100,000 users in the GA4 interface equate to 100,000 ± 1.63% in reality.
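
    For a quick sanity check, that interval is easy to compute yourself. A minimal sketch in Python (the 1.63% figure is the documented margin for Total Users ; the 100,000 reading is just an example) :

    # Approximate 95% confidence interval for a GA4 "Total Users" reading,
    # based on the documented 1.63% HyperLogLog++ error margin.
    reported_users = 100_000
    error = 0.0163
    low, high = reported_users * (1 - error), reported_users * (1 + error)
    print(f"{low:,.0f} to {high:,.0f}")  # 98,370 to 101,630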

    Furthermore – but this is no surprise to anyone – GA4 samples data. This means that when the queried data volume is too large, or a large number of parameters is involved, the application will estimate your metrics based on a partial sample – let’s say 5, 10, or 30% of the entire population.

    It’s a reasonable trade-off, but it can (and probably will) surprise marketers – the metrics will deviate from reality. All end-users can do (short of delving into raw-data methodologies) is take this error level into account in their conclusions.

    Question 4. How do I calculate First Click attribution ?

    You can’t. Unfortunately, as of late, GA4 offers only three attribution models available in the Attribution tab : Last Click, Last Click For Google Ads, and Data Driven. First Click attribution is essential for understanding where and when demand is generated. In the previous version of Google Analytics (and until recently, in the current one), users could quickly apply First Click and other attribution models, compare them, and gain insights. Now, this capability is gone.

    GA4 attribution model

    Certainly, you can look at the conversion distribution considering the First User Source parameter – this will be some proxy for First Click attribution. However, comparing it with others in the Model Comparison tab won’t be possible. In the context of the GA4 interface, it makes sense to forget about non-standard attribution models.

    Question 5. How do I account for intra-session traffic ?

    Intra-session traffic essentially refers to a change in traffic sources within a session. Imagine a scenario where a user comes to your site organically from Google and, within a minute, comes from an email campaign. In the previous version of Google Analytics, a new session with the traffic source “e-mail” would be created in such a case. But now, the situation has changed.

    A session now only ends in the case of a timeout – say, 30 minutes without interaction. This means a session will always keep the source it started from. If a user changes the source within a session (clicks on an ad, arrives from an email campaign, and so on), you won’t know anything about it until they convert. This is a significant blow to intra-session traffic analysis, since the contribution of those sources remains virtually unnoticed.

    Question 6. How can I account for users who have not consented to the use of third-party cookies ?

    You can’t. Google Consent Mode settings imply several options when a user rejects the use of 3rd party cookies. In GA4 and BigQuery, depersonalized cookieless pings will be sent. These pings do not contain specific client_id, session_id, or other custom dimensions. As a result, you won’t be able to consider them as users or link the actions of such users together.

    Question 7. How can I compare data in explorations with the previous year ?

    The maximum data retention period for a free GA4 account is 14 months. This means that if the date range is wider, you can only use standard reports. You won’t be able to compare or view cohorts or funnels for periods more than 14 months ago. This makes the product functionality less rich because various report formats in explorations are very convenient for comparing specific metrics in easily digestible reports.

    GA4 data retention

    Of course, you always have the option to connect BigQuery and store raw data without limitations, but this process usually requires the involvement of an advanced analyst. And it is precisely this option that is unavailable to most marketers in small teams.
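
    For those who do have access to an analyst (or some SQL), the export itself is straightforward to query across any timeframe. A minimal sketch in Python, assuming the standard GA4 events_* export tables ; the project and dataset names are placeholders :

    # Count distinct users per day from a raw GA4 BigQuery export.
    # "my-project" and "analytics_123456" are hypothetical names.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")
    sql = """
        SELECT event_date, COUNT(DISTINCT user_pseudo_id) AS users
        FROM `my-project.analytics_123456.events_*`
        WHERE _TABLE_SUFFIX BETWEEN '20230101' AND '20241231'
        GROUP BY event_date
        ORDER BY event_date
    """
    for row in client.query(sql).result():
        print(row.event_date, row.users)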

    Question 8. Is the data for yesterday accurate ?

    Unknown. Google declares that data processing in GA4 takes up to 48 hours. And although the process is usually faster than that, most users still have reason for frustration. And understandably so.

    Data processing time in GA4

    What does “data processing takes 24-48 hours” actually mean ? When will the data in reports be complete ? For yesterday ? Or the day before yesterday ? Or for all days that were more than two days ago ? Unclear. What should marketers tell their managers when asked whether all the data is in this report ? Well, probably all of it… or maybe not… Let’s wait for 48 hours…

    Undoubtedly, computational resources and time are needed for data preprocessing and aggregation. It’s okay that data for today will not be up-to-date. And probably not for yesterday either. But people just want to know when they can trust their data. Are they asking for too much : just a note that this report contains all the data sent and processed by Google Analytics ?

    What should you do ?

    Credit should be given to the Google team – they have done a lot to enable users to answer these questions in one form or another. For example, you can use data streaming in BigQuery and work with raw data. The entry threshold for this functionality has been significantly lowered. In fact, if you are dissatisfied with the GA4 interface, you can organize your export to BigQuery and create your own reports without (almost) any restrictions.

    Another strong option is the widespread launch of GTM Server Side. This allows you to quite freely modify the event model and essentially enrich each hit with various parameters, doing this in a first-party context. This, of course, reduces the harmful impact of most of the limitations described in this text.

    But this is not a solution.

    The users in question – marketers, managers, developers – don’t want, or don’t have the time, to take a deep dive into the issue. They want simple answers to (seemingly) simple questions. And for now, unfortunately, GA4 is more of a professional tool for analysts than a convenient instrument for generating insights for less advanced users.

    Why is this such a serious issue ?

    The thing is – and this is crucial – over the past 10 years, Google has managed to create a sort of GA-bubble for marketers. Many of them have become so accustomed to Google Analytics that when faced with another issue, they don’t venture to explore alternative solutions but attempt to solve it on their own. And almost always, this turns out to be expensive and inconvenient.

    However, with the latest updates to GA4, it is becoming increasingly evident that this application is struggling to address even the most basic questions from users. And these questions are not fantastically complex. Much of what was described in this article is not an unsolvable mystery and is successfully addressed by other analytics services.

    Let’s try to answer some of the questions described from the perspective of Matomo.

    Question 1 : What are the most popular traffic sources ? [Solved]

    In the Acquisition panel, you will find at least three easily identifiable reports – for traffic channels (All Channels), sources (Websites), and campaigns (Campaigns). 

    Channel Type Table

    With these, you can quickly and easily answer the question about the most popular traffic sources, and if needed, delve into more detailed information, such as landing pages.

    Question 2 : What is my conversion rate ? [Solved]

    Under Goals in Matomo, you’ll easily find the overall conversion rate for your site. Below that, you’ll find the conversion rate of each goal you’ve set up in your Matomo instance.

    Question 3 : Can I trust user and session metrics ? [Solved]

    Yes. With Matomo, you’re guaranteed 100% accurate data. Matomo does not apply sampling, approximate statistical algorithms, or any analogue of threshold values. Yes, that is possible, and it’s perfectly normal. If you see a metric in the visits or users field, it represents reality with 100% accuracy.

    Question 4 : How do I calculate First Click attribution ? [Solved]

    You can calculate it in the same section as the other five attribution models available in Matomo – the Multi Attribution section.

    Multi Attribution feature

    You can choose a specific conversion and, in a few clicks, calculate and compare up to 3 marketing attribution models. This means you don’t have to spend several days digging through documentation trying to understand how a particular model is calculated. Have a question – get an answer.

    Question 5 : How do I account for intra-session traffic ? [Solved]

    Matomo creates a new visit when a user changes a campaign. This means that you will accurately capture all relevant traffic if it is adequately tagged. No campaigns will be lost within a visit, as they will have a new utm_campaign parameter.

    This is a crucial point : when the referrer changes, a new visit is not created, so the key lies elsewhere – accounting for all available traffic becomes your responsibility and depends on how you tag it.
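
    For instance, a link in a newsletter might be tagged like this (a hypothetical example ; the parameter values are yours to define) :

    https://example.com/landing-page?utm_source=newsletter&utm_medium=email&utm_campaign=january_promo

    A visitor who arrives organically and later clicks this link within the same session will start a new visit attributed to the january_promo campaign.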

    Question 6 : How can I account for users who have not consented to the use of third-party cookies ? [Solved]

    Google Analytics requires users to accept a cookie consent banner with “analytics_storage=granted” to track them. If users reject cookie consent banners, however, then Google Analytics can’t track these visitors at all. They simply won’t show up in your traffic reports. 

    Matomo doesn’t require cookie consent banners (apart from in the United Kingdom and Germany) and can therefore continue to track visitors even after they have rejected a cookie consent screen. This is achieved through a config_id variable (an equivalent of the user identifier, which is updated once a day).

    Matomo doesn't need cookie consent, so you see a complete view of your traffic

    This means that virtually all of your website traffic will be tracked regardless of whether users accept a cookie consent banner or not.

    Question 7 : How can I compare data in explorations with the previous year ? [Solved]

    There is no limitation on data retention for your aggregated reports in Matomo. The essence of the Matomo experience lies in the reporting data, and consequently, retaining reports indefinitely is a viable option. So you can compare data for any timeframe.

    Date Comparison Selector

  • What is a Cohort Report ? A Beginner’s Guide to Cohort Analysis

    3 January, by Erin

    Handling your user data as a single mass of numbers is rarely conducive to figuring out meaningful patterns you can use to improve your marketing campaigns.

    A cohort report (or cohort analysis) can help you quickly break down that larger audience into sequential segments and contrast and compare based on various metrics. As such, it is a great tool for unlocking more granular trends and insights — for example, identifying patterns in engagement and conversions based on the date users first interacted with your site.

    In this guide, we explain the basics of the cohort report and the best way to set one up to get the most out of it.

    What is a cohort report ?

    In a cohort report, you divide a data set into groups based on certain criteria — typically a time-based cohort metric like first purchase date — and then analyse the data across those segments, looking for patterns.

    Date-based cohort analysis is the most common approach, often creating cohorts based on the day a user completed a particular action — signed up, purchased something or visited your website. Depending on the metric you choose to measure (like return visits), the cohort report might look something like this :

    Example of a basic cohort report

    Note that this is not a universal benchmark or anything of the sort. The above is a theoretical cohort analysis of users who downloaded an app, tracking and comparing retention rates as the days go by.

    The benchmarks will be drastically different depending on the metric you’re measuring and the basis for your cohorts. For example, if you’re measuring returning visitor rates among first-time visitors to your website, expect single-digit percentages even on the second day.

    Your industry will also greatly affect what you consider positive in a cohort report. For example, if you’re a subscription SaaS, you’d expect high continued usage rates over the first week. If you sell office supplies to companies, much less so.
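
    To make the mechanics concrete, here is a minimal sketch of how such a retention table can be computed with pandas (the tiny event log and its column names are invented for illustration) :

    # Build a date-based cohort/retention table from a raw event log.
    import pandas as pd

    events = pd.DataFrame({
        "user_id": [1, 1, 2, 2, 2, 3],
        "date": pd.to_datetime(["2024-05-03", "2024-05-04", "2024-05-03",
                                "2024-05-05", "2024-05-06", "2024-05-04"]),
    })

    # Each user's cohort is the date of their first recorded activity.
    events["cohort"] = events.groupby("user_id")["date"].transform("min")
    events["day"] = (events["date"] - events["cohort"]).dt.days

    # Unique users per cohort per day, as a share of the day-0 cohort size.
    counts = events.pivot_table(index="cohort", columns="day",
                                values="user_id", aggfunc="nunique")
    print(counts.div(counts[0], axis=0))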

    What is an example of a cohort ?

    As we just mentioned, a typical cohort analysis separates users or customers by the date they first interacted with your business — in this case, they downloaded your app. Within that larger analysis, the users who downloaded it on May 3 represent a single cohort.

    Illustration of a specific cohort

    In this case, we’ve chosen behaviour and time — the app download day — to separate the user base into cohorts. That means every specific day denotes a specific cohort within the analysis.

    Diving deeper into an individual cohort may be a good idea for important holidays or promotional events like Black Friday.

    Of course, cohorts don’t have to be based on specific behaviour within certain periods. You can also create cohorts based on other dimensions :

    • Transactional data — revenue per user
    • Churn data — date of churn
    • Behavioural cohort — based on actions taken on your website, app or e-commerce store, like the number of sessions per user or specific product pages visited
    • Acquisition cohort — which channel referred the user or customer

    For more information on different cohort types, read our in-depth guide on cohort analysis.

    How to create a cohort report (and make sense of it)

    Matomo makes it easy to view and analyse different cohorts (without the privacy and legal implications of using Google Analytics).

    Here are a few different ways to set up a cohort report in Matomo, starting with our built-in cohorts report.

    Cohort reports

    With Matomo, cohort reports are automatically compiled based on the first visit date. The default metric is the percentage of returning visitors.

    Screenshot of the cohorts report in Matomo analytics

    Changing the settings allows you to create multiple variations of cohort analysis reports.

    Break down cohorts by different metrics

    The percentage of returning visits can be valuable if you’re trying to improve early engagement in a SaaS app onboarding process. But it’s far from your only option.

    You can also compare performance by conversion, revenue, bounce rate, actions per visit, average session duration or other metrics.

    Cohort metric options in Matomo analytics

    Change the time and scope of your cohort analysis

    Splitting up cohorts by single days may be useless if you don’t have a high volume of users or visitors. If the average cohort size is only a few users, you won’t be able to identify reliable patterns. 

    Matomo lets you set any time period to create your cohort analysis report. Instead of the most recent days, you can create cohorts by week, month, year or custom date ranges. 

    Date settings in the cohorts report in Matomo analytics

    Cohort sizes will depend on your customer base. Make sure each cohort is large enough to reveal reliable patterns, and not so small that it contains only an insignificant handful of customers. Choose a date range that gives you that without scaling out so far that you can no longer identify seasonal trends.

    Cohort analysis can be a great tool if you’ve recently changed your marketing, product offering or onboarding. Set the date range to weekly and look for any impact on conversions and revenue after the changes.

    Using the “compare to” feature, you can also do month-over-month, quarter-over-quarter or any custom date range comparisons. This approach can help you get a rough overview of your campaign’s long-term progress without doing any in-depth analysis.

    You can also use the same approach to compare different holiday seasons against each other.

    If you want to combine time cohorts with segmentation, you can run cohort reports for different subsets of visitors instead of all visitors. This can lead to actionable insights like adjusting weekend or specific seasonal promotions to improve conversion rates.

    Easily create custom cohort reports beyond the time dimension

    If you want to split your audience into cohorts by focusing on something other than time, you will need to create a custom report and choose another dimension. In Matomo, you can choose from a wide range of cohort metrics, including referrers, e-commerce signals like viewed product or product category, form submissions and more.

    Custom report options in Matomo

    Then, you can create a simple table-based report with all the insights you need by choosing the metrics you want to see. For example, you could choose average visit duration, bounce rate and other usage metrics.

    Metrics selected in a Matomo custom report

    If you want more revenue-focused insights, add metrics like conversions, add-to-cart and other e-commerce events.

    Custom reports make it easy to create cohort reports for almost any dimension. You can use any metric within demographic and behavioural analytics to create a cohort. (You can explore the complete list of our possible segmentation metrics.)

    We cover different types of custom reports (and ideas for specific marketing campaigns) in our guide on custom segmentation.

    Create your first cohort report and gain better insights into your visitors

    Cohort reports can help you identify trends and the impact of short-term marketing efforts like events and promotions.

    With Matomo cohort reports, you have the power to create complex custom reports for various cohorts and segments.

    If you’re looking for a powerful, easy-to-use web analytics solution that gives you 100% accurate data without compromising your users’ privacy, Matomo is a great fit. Get started with a 21-day free trial today. No credit card required.