
Other articles (95)

  • Creating farms of unique websites

    13 April 2011

    MediaSPIP platforms can be installed as a farm, with a single "core" hosted on a dedicated server and used by multiple websites.
    This allows (among other things): implementation costs to be shared between several different projects/individuals; rapid deployment of multiple unique sites; creation of groups of like-minded sites, making it possible to browse media in a more controlled and selective environment than the major "open" (...)

  • User profiles

    12 April 2011

    Each user has a profile page where they can edit their personal information. In the default top-of-page menu, a menu item is created automatically when MediaSPIP is initialised, visible only if the visitor is logged in to the site.
    The user can access profile editing from their author page; a "Modifier votre profil" ("Edit your profile") link in the navigation is (...)

  • Configuring language support

    15 November 2010

    Accessing the configuration and adding supported languages
    To configure support for new languages, you need to go to the "Administrer" (administration) section of the site.
    From there, in the navigation menu, you can reach a "Gestion des langues" (language management) section that lets you enable support for new languages.
    Each newly added language can still be disabled as long as no object has been created in that language; otherwise it becomes greyed out in the configuration and (...)

On other sites (6060)

  • Cohort Analysis 101 : How-To, Examples & Top Tools

    13 November 2023, by Erin — Analytics Tips

    Imagine that a farmer is trying to figure out why certain hens are laying large brown eggs and others are laying average-sized white eggs.

    The farmer decides to group the hens into cohorts based on what kind of eggs they lay to make it easier to detect patterns in their day-to-day lives. After careful observation and analysis, she discovers that the hens laying big brown eggs eat more than the roost’s other hens.

    With this cohort analysis, the farmer deduces that a hen’s body weight directly corresponds to egg size. She can now develop a strategy to increase the body weight of her hens to sell more large brown eggs, which are very popular at the weekly farmers’ market.

    Cohort analysis has a myriad of applications in the world of web analytics. Like our farmer, you can use it to better understand user behaviour and reap the benefits of your efforts. This article will discuss the best practices for conducting an effective cohort analysis and compare the top cohort analysis tools for 2024. 

    What is cohort analysis?

    By definition, cohort analysis refers to a technique where users are grouped based on shared characteristics or behaviours and then examined over a specified period.

    Think of it as a marketing superpower, enabling you to comprehend user behaviours, craft personalised campaigns and allocate resources wisely, ultimately resulting in improved performance and better ROI.
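
    As a rough illustration of the mechanics (independent of any particular analytics product), the pandas sketch below groups users into monthly cohorts by the date of their first visit and counts how many of them come back in later months. The data and column names are invented for the example.

    import pandas as pd

    # Hypothetical raw data: one row per visit, with a user id and a visit date.
    visits = pd.DataFrame({
        "user_id": [1, 1, 2, 2, 3, 3, 3],
        "visit_date": pd.to_datetime([
            "2024-01-05", "2024-02-10",                # user 1: first seen in January
            "2024-01-20", "2024-03-02",                # user 2: first seen in January
            "2024-02-01", "2024-02-15", "2024-03-20",  # user 3: first seen in February
        ]),
    })

    # Each user's cohort is the month of their first visit.
    visits["visit_month"] = visits["visit_date"].dt.to_period("M")
    visits["cohort"] = visits.groupby("user_id")["visit_month"].transform("min")

    # Whole months elapsed since the cohort month (0 = the acquisition month itself).
    visits["months_since"] = (visits["visit_month"] - visits["cohort"]).apply(lambda d: d.n)

    # Retention table: unique users per cohort and per month since acquisition.
    retention = (
        visits.groupby(["cohort", "months_since"])["user_id"]
              .nunique()
              .unstack(fill_value=0)
    )
    print(retention)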

    Why does cohort analysis matter?

    In web analytics, a cohort is a group of users who share a certain behaviour or characteristic. The goal of cohort analysis is to uncover patterns and compare the performance and behaviour of different cohorts over time.

    An example of a cohort is a group of users who made their first purchase during the holidays. By analysing this cohort, you could learn more about their behaviour and buying patterns. You may discover that this cohort is more likely to buy specific product categories as holiday gifts — you can then tailor future holiday marketing campaigns to include these categories. 

    Types of cohort analysis

    There are a few notable types of cohorts:

    1. Time-based cohorts are groups of users categorised by a specific time period. The farmer example from the beginning of this article illustrates a time-based cohort.
    2. Acquisition cohorts are users acquired during a specific time frame or through a specific event or marketing channel. Analysing these cohorts can help you determine the value of different acquisition methods.
    3. Behavioural cohorts consist of users who show similar patterns of behaviour, such as frequent purchases in your mobile app or engagement with digital content (a minimal grouping sketch follows this list).
    4. Demographic cohorts share common demographic characteristics like age, gender, education level and income. 
    5. Churn cohorts are buyers who have cancelled a subscription/stopped using your service within a specific time frame. Analysing churn cohorts can help you understand why customers leave.
    6. Geographic cohorts are pretty self-explanatory — you can use them to tailor your marketing efforts to specific regions. 
    7. Customer journey cohorts are based on the buyer lifecycle — from acquisition to adoption to retention. 
    8. Product usage cohorts group buyers by how they use your product/service (think basic users, power users or occasional users).
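
    To make the behavioural-cohort idea above concrete, here is a minimal pandas sketch that labels users by purchase frequency; the column names and thresholds are invented for the illustration and not tied to any particular analytics product.

    import pandas as pd

    # Hypothetical purchase log: one row per purchase event.
    purchases = pd.DataFrame({
        "user_id": [1, 1, 1, 2, 3, 3],
        "amount":  [20, 35, 15, 50, 10, 25],
    })

    # Behavioural cohort: bucket each user by how often they purchase.
    # The thresholds (1 / 2 / 3+ purchases) are arbitrary for the example.
    purchase_counts = purchases.groupby("user_id")["amount"].count()
    behaviour_cohort = pd.cut(
        purchase_counts,
        bins=[0, 1, 2, float("inf")],
        labels=["one-off buyer", "occasional buyer", "power user"],
    )
    print(behaviour_cohort)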

    Best practices for conducting a cohort analysis 

    So, you’ve decided you want to understand your user base better but don’t know how to go about it. Perhaps you want to reduce churn and create a more engaging user experience. In this section, we’ll walk you through the dos and don’ts of conducting an effective cohort analysis. Remember that you should tailor your cohort analysis strategy for organisation-specific goals.

    A line graph depicting product usage cohort data with a blue line for new users and a green line for power users.

    1. Preparing for cohort analysis:

      • First, define specific goals you want your cohort analysis to achieve. Examples include improving conversion rates or reducing churn.
      • Choosing the right time frame will help you compare short-term vs. long-term data trends. 

    2. Creating effective cohorts:

      • Define your segmentation criteria — anything from demographics to location, purchase history or user engagement level. Narrowing in on your specific segments will make your cohort analysis more precise. 
      • It’s important to find a balance between cohort size and similarity. If your cohort is too small and diverse, you won’t be able to find specific behavioural patterns.

    3. Performing cohort analysis:

      • Study retention rates across cohorts to identify patterns in user behaviour and engagement over time. Pay special attention to cohorts with high retention or churn rates.
      • Analysing cohorts can reveal interesting behavioural insights — how do specific cohorts interact with your website? Do they have certain preferences? Why?

    4. Visualising and interpreting data:

      • Visualising your findings can be a great way to reveal patterns. Line charts can help you spot trends, while bar charts can help you compare cohorts (a minimal plotting sketch follows this list).
      • Guide your analytics team on how to interpret patterns in cohort data. Watch for sudden drops or spikes and what they could mean. 

    5. Continue improving:

      • User behaviour is constantly evolving, so be adaptable. Continuous tracking of user behaviour will help keep your strategies up to date. 
      • Encourage iterative analysis optimisation based on your findings. 
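
    As mentioned in step 4, here is a minimal plotting sketch using pandas and matplotlib. The retention numbers are invented for the illustration; each curve is one acquisition cohort, normalised to the cohort's starting size.

    import pandas as pd
    import matplotlib.pyplot as plt

    # Hypothetical retention table: rows = acquisition cohorts, columns = months
    # since acquisition, values = active users (numbers invented for the plot).
    retention = pd.DataFrame(
        {0: [100, 120, 90], 1: [62, 75, 48], 2: [41, 52, 30]},
        index=["2024-01", "2024-02", "2024-03"],
    )

    # Normalise each cohort to its starting size so cohorts of different sizes compare fairly.
    retention_pct = retention.div(retention[0], axis=0) * 100

    for cohort_label, row in retention_pct.iterrows():
        plt.plot(row.index, row.values, marker="o", label=cohort_label)

    plt.xlabel("Months since acquisition")
    plt.ylabel("Retained users (%)")
    plt.legend(title="Cohort")
    plt.title("Retention by acquisition cohort")
    plt.show()
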
    A wrench trying to hammer in a nail, and a hammer trying to drive a screw into a piece of wood.

    The top cohort analysis tools for 2024

    In this section, we’ll go over the best cohort analysis tools for 2024, including their key features, cohort analysis dashboards, pricing, and pros and cons.

    1. Matomo

    A screenshot of a cohorts graph in Matomo

    Matomo is an open-source, GDPR-compliant web analytics solution that offers cohort analysis as a standard feature in Matomo Cloud and as a plugin for Matomo On-Premise. Pairing traditional web analytics with cohort analysis will help you gain even deeper insight into user behaviour over time.

    You can use the data you get from web analytics to identify patterns in user behaviour and target your marketing strategies to specific cohorts. 

    Key features

    • Matomo offers a cohorts table that lets you compare cohorts side-by-side, and it comes with a time series.
      • All core session and conversion metrics are also available in the Cohorts report (see the Reporting API sketch after this list for fetching report data programmatically).
    • Create custom segments based on demographics, geography, referral sources, acquisition date, device types or user behaviour. 
    • Matomo provides retention analysis so you can track how many users from a specific cohort return to your website and when. 
    • Flexibly analyse your cohorts with custom reports. Customise your reports by combining metrics and dimensions specific to different cohorts. 
    • Create cohorts based on events or interactions with your website. 
    • Intuitive, colour-coded data visualisation, so you can easily spot patterns.
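
    For programmatic access, Matomo report data can also be pulled over its standard Reporting API (HTTP requests against index.php with module=API). The sketch below uses Python's requests to fetch a visits summary for a segment of returning visitors; the base URL, site ID and token are placeholders, and the exact method and segment you would use for cohort data are assumptions to check against your instance's API reference.

    import requests

    # Placeholders: substitute your own Matomo base URL, site ID and API token.
    MATOMO_URL = "https://matomo.example.org/index.php"

    params = {
        "module": "API",
        "method": "VisitsSummary.get",        # a standard report method; the Cohorts
                                              # report exposes its own method name
        "idSite": 1,
        "period": "month",
        "date": "2024-01-01,2024-06-30",      # one data point per month in the range
        "segment": "visitorType==returning",  # example segment; adapt to your cohort
        "format": "JSON",
        "token_auth": "YOUR_TOKEN",
    }

    response = requests.get(MATOMO_URL, params=params, timeout=30)
    response.raise_for_status()
    print(response.json())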

    Pros

    • No setup is needed if you use the JavaScript tracker
    • You can fetch cohorts without any limit
    • 100% accurate data: no AI or machine learning data filling, and no data sampling

    Cons

    • Matomo On-Premise (self-hosted) is free, but advanced features come with additional charges
    • Servers and technical know-how are required for Matomo On-Premise. Alternatively, for those not ready for self-hosting, Matomo Cloud presents a more accessible option and starts at $19 per month.

    Price:

    • Matomo Cloud: 21-day free trial, then starts at $19 per month (includes Cohorts).
    • Matomo On-Premise: Free to self-host; Cohorts plugin: 30-day free trial, then $99 per year.

    2. Mixpanel

    Mixpanel is a product analytics tool designed to help teams better understand user behaviour. It is especially well-suited for analysing user behaviour on iOS and Android apps. It offers various cohort analytics features that can be used to identify patterns and engage your users. 

    Key features

    • Create cohorts based on criteria such as sign-up date, first purchase date, referral source, geographic location, device type or another custom event/property. 
    • Compare how different cohorts engage with your app with Mixpanel’s comparative analysis features.
    • Create interactive dashboards, charts and graphs to visualise data.
    • Mixpanel provides retention analysis tools to see how often users return to your product over time. 
    • Send targeted messages and notifications to specific cohorts to encourage user engagement, announce new features, etc. 
    • Track and analyse user behaviours within cohorts — understand how different types of users engage with your product.

    Pros

    • Easily export cohort analysis data for further analysis
    • Combined with Mixpanel reports, cohorts can be a powerful tool for improving your product

    Cons

    • With the free Mixpanel plan, you can’t save cohorts for future use
    • Enterprise-level pricing is expensive
    • Time-consuming cohort creation process

    Price: Free basic version. The Growth version starts at £16/month.

    3. Amplitude

    A screenshot of a cohorts graph in Amplitude

    Amplitude is another product analytics solution that can help businesses track user interactions across digital platforms. Amplitude offers a standard toolkit for in-depth cohort analysis.

    Key features

    • Create cohorts based on criteria such as sign-up date, first purchase date, referral source, geographic location, device type or another custom event/property. 
    • Conduct behavioural, time-based and retention analyses.
    • Create custom reports with custom data.
    • Segment cohorts further based on additional criteria and compare multiple cohorts side-by-side.

    Pros

    • Highly customisable and flexible
    • Quick and simple setup

    Cons

    • Steep learning curve — requires significant training 
    • Slow loading speed
    • High price point compared to other tools

    Price: Free basic version. Plus version starts at £40/month (billed annually).

    4. Kissmetrics

    A screenshot of a cohorts graph in Kissmetrics

    Kissmetrics is a customer engagement automation platform that offers powerful analytics features. Kissmetrics provides behavioural analytics, segmentation and email campaign automation. 

    Key features

    • Create cohorts based on demographics, user behaviour, referral sources, events and specific time frames.
    • The user path tool provides path visualisation so you can identify common paths users take and spot abandonment points. 
    • Create and optimise conversion funnels.
    • Customise events, user properties, funnels, segments, cohorts and more.

    Pros

    • Powerful data visualisation options
    • Highly customisable

    Cons

    • Difficult to install
    • Not well-suited for small businesses
    • Limited integration with other tools

    Price: Starting at £21/month for 10k events (billed monthly).

    Improve your cohort analysis with Matomo

    When choosing a cohort analysis tool, consider factors such as the tool’s ease of integration with your existing systems, data accuracy, the flexibility it offers in defining cohorts, the comprehensiveness of reporting features, and its scalability to accommodate the growth of your data and analysis needs over time. Moreover, it’s essential to confirm GDPR compliance to uphold rigorous privacy standards. 

    If you’re ready to understand your users’ behaviour, take Matomo for a test drive. Paired with web analytics, this powerful combination can advance your marketing efforts. Start your 21-day free trial today — no credit card required.

  • How to record a camera to a file while encoding the stream to a v4l2-loopback device?

    3 February 2021, by Jonatas

    So I have this Logitech C920 camera, 1920x1080 and H.264 capable, and I would like to record the camera to a file while at the same time copying/encoding the stream to a loopback device so the camera can still be used by other apps.
Here is the command I have so far:

    


    ffmpeg -report -f alsa -i hw:CARD=C920,DEV=0 -r 1500 -s 1920x1080 -f v4l2 -vcodec h264 \
-i /dev/video1 -copyinkf -vcodec copy /home/jonatas/Videos/2021-02-01185658.mp4 \ 
-f v4l2 /dev/video0


    


    Error:

    


    Unknown V4L2 pixel format equivalent for yuvj420p
Could not write header for output file #1 (incorrect codec parameters ?): Invalid argument
Error initializing output stream 1:0 -- 


    


    Some info on /dev/video0 (the loopback device):

    


    v4l2-ctl -d /dev/video0

    Device Caps      : 0x05208003
        Video Capture
        Video Output
        Video Memory-to-Memory
        Read/Write
        Streaming
        Extended Pix Format
Priority: 0
Video input : 0 (loopback: ok)
Video output: 0 (loopback in)
Format Video Output:
    Width/Height      : 1280/720
    Pixel Format      : 'YUYV' (YUYV 4:2:2)
    Field             : None
    Bytes per Line    : 2560
    Size Image        : 1843200
    Colorspace        : sRGB
    Transfer Function : Default (maps to sRGB)
    YCbCr/HSV Encoding: Default (maps to ITU-R 601)
    Quantization      : Default (maps to Limited Range)
    Flags             : 
Streaming Parameters Video Capture:
    Frames per second: 30.000 (30/1)
    Read buffers     : 8
Streaming Parameters Video Output:
    Frames per second: 30.000 (30/1)
    Write buffers    : 8

User Controls

                    keep_format 0x0098f900 (bool)   : default=0 value=0
              sustain_framerate 0x0098f901 (bool)   : default=0 value=0
                        timeout 0x0098f902 (int)    : min=0 max=100000 step=1 default=0 value=0
               timeout_image_io 0x0098f903 (bool)   : default=0 value=0


    


    Some info on my camera at /dev/video1:

    


    v4l2-ctl -d /dev/video1 --all
Driver Info:
    Driver name      : uvcvideo
    Card type        : HD Pro Webcam C920
    Bus info         : usb-0000:00:14.0-7.2
    Driver version   : 5.4.78
    Capabilities     : 0x84a00001
        Video Capture
        Metadata Capture
        Streaming
        Extended Pix Format
        Device Capabilities
    Device Caps      : 0x04200001
        Video Capture
        Streaming
        Extended Pix Format
Media Driver Info:
    Driver name      : uvcvideo
    Model            : HD Pro Webcam C920
    Serial           : EC6C336F
    Bus info         : usb-0000:00:14.0-7.2
    Media version    : 5.4.78
    Hardware revision: 0x00000011 (17)
    Driver version   : 5.4.78
Interface Info:
    ID               : 0x03000002
    Type             : V4L Video
Entity Info:
    ID               : 0x00000001 (1)
    Name             : HD Pro Webcam C920
    Function         : V4L2 I/O
    Flags         : default
    Pad 0x01000007   : 0: Sink
      Link 0x0200001f: from remote pad 0x100000a of entity 'Processing 3': Data, Enabled, Immutable
Priority: 2
Video input : 0 (Camera 1: ok)
Format Video Capture:
    Width/Height      : 1280/720
    Pixel Format      : 'H264' (H.264)
    Field             : None
    Bytes per Line    : 2560
    Size Image        : 1843200
    Colorspace        : sRGB
    Transfer Function : Default (maps to sRGB)
    YCbCr/HSV Encoding: Default (maps to ITU-R 601)
    Quantization      : Default (maps to Full Range)
    Flags             : 
Crop Capability Video Capture:
    Bounds      : Left 0, Top 0, Width 1280, Height 720
    Default     : Left 0, Top 0, Width 1280, Height 720
    Pixel Aspect: 1/1
Selection Video Capture: crop_default, Left 0, Top 0, Width 1280, Height 720, Flags: 
Selection Video Capture: crop_bounds, Left 0, Top 0, Width 1280, Height 720, Flags: 
Streaming Parameters Video Capture:
    Capabilities     : timeperframe
    Frames per second: 30.000 (30/1)
    Read buffers     : 0
                     brightness 0x00980900 (int)    : min=0 max=255 step=1 default=128 value=128
                       contrast 0x00980901 (int)    : min=0 max=255 step=1 default=128 value=128
                     saturation 0x00980902 (int)    : min=0 max=255 step=1 default=128 value=128
 white_balance_temperature_auto 0x0098090c (bool)   : default=1 value=1
                           gain 0x00980913 (int)    : min=0 max=255 step=1 default=0 value=255
           power_line_frequency 0x00980918 (menu)   : min=0 max=2 default=2 value=2
                0: Disabled
                1: 50 Hz
                2: 60 Hz
      white_balance_temperature 0x0098091a (int)    : min=2000 max=6500 step=1 default=4000 value=3233 flags=inactive
                      sharpness 0x0098091b (int)    : min=0 max=255 step=1 default=128 value=128
         backlight_compensation 0x0098091c (int)    : min=0 max=1 step=1 default=0 value=0
                  exposure_auto 0x009a0901 (menu)   : min=0 max=3 default=3 value=3
                1: Manual Mode
                3: Aperture Priority Mode
              exposure_absolute 0x009a0902 (int)    : min=3 max=2047 step=1 default=250 value=333 flags=inactive
         exposure_auto_priority 0x009a0903 (bool)   : default=0 value=0
                   pan_absolute 0x009a0908 (int)    : min=-36000 max=36000 step=3600 default=0 value=0
                  tilt_absolute 0x009a0909 (int)    : min=-36000 max=36000 step=3600 default=0 value=0
                 focus_absolute 0x009a090a (int)    : min=0 max=250 step=5 default=0 value=0
                     focus_auto 0x009a090c (bool)   : default=1 value=0
                  zoom_absolute 0x009a090d (int)    : min=100 max=500 step=1 default=100 value=100


    


    If I remove the 3rd line of my command, the camera records the video and sound stream to the file flawlessly. I tried different things for the 3rd line, such as changing -vcodec and setting the pixel format flag to YUYV, without success.
Is it possible to achieve this with just one ffmpeg process?
Will the sound be made available to the loopback device as well?
How do I transcode to the proper pixel format used by the loopback device?
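
    For reference, a commonly suggested shape for this kind of setup is a single ffmpeg process with both inputs and two outputs: stream-copy the camera's H.264 (plus AAC-encoded ALSA audio) into the MP4, and re-encode the same video to raw frames for the loopback, since ffmpeg's v4l2 output expects raw video in a pixel format it can map (which appears to be what the yuvj420p error is about). This is only a sketch, not verified against this exact camera/loopback combination; the device names and output path are taken from the question:

    ffmpeg -f alsa -i hw:CARD=C920,DEV=0 \
           -f v4l2 -input_format h264 -video_size 1920x1080 -i /dev/video1 \
           -map 1:v -map 0:a -c:v copy -c:a aac /home/jonatas/Videos/2021-02-01185658.mp4 \
           -map 1:v -c:v rawvideo -pix_fmt yuv420p -f v4l2 /dev/video0

    Note that v4l2-loopback carries video only, so the audio will not appear on /dev/video0; other applications would still need to open the microphone through ALSA/PulseAudio. If the loopback end insists on the YUYV format shown in the v4l2-ctl output above, swapping -pix_fmt yuv420p for -pix_fmt yuyv422 may be needed.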

    


  • Saying Goodbye To Old Machines

    1 December 2014, by Multimedia Mike — General, powerpc, via

    I recently sent a few old machines off for recycling. Both had relevance to the early days of the FATE testing effort. As is my custom, I photographed them (poorly, of course).

    First, there’s the PowerPC-based Mac Mini I procured thanks to a Craigslist ad in late 2006. I had plans to develop automated FFmpeg building and testing and was already looking ahead toward testing multiple CPU architectures. Again, this was 2006 and PowerPC wasn’t completely on the outs yet– although Apple’s MacTel transition was in full swing, the entire new generation of video game consoles was based on PowerPC.


    PPC Mac Mini pieces



    I remember trying to find a Mac Mini PPC on Craigslist. Many were to be found, but all asked more than the price of even a new Mac Mini Intel, always because the seller was leaving all of last year’s applications and perhaps including a monitor, neither of which I needed. Fortunately, I found this bare Mac Mini. Also fortunate was the fact that it was far easier to install Linux on it than the first PowerPC machine I owned.

    After FATE operation transitioned away from me, I still kept the machine in service as an edge server and automated backup machine. That is, until the hard drive failed on reboot one day. Thus, when it was finally time to recycle the computer, I felt it necessary to disassemble the machine and remove the hard drive for possible salvage and then for destruction.

    If you’ve ever attempted to upgrade or otherwise service this style of Mac Mini, you will no doubt recognize the pictured paint scraper tool as standard kit. I have had that tool since I first endeavored to upgrade the RAM to 1 GB from the standard 1/2 GB. Performing such activities on a Mac Mini is tedious, but only if you care about putting it back together afterwards.

    The next machine is a bit older. I put it together nearly a decade ago, early in 2005. This machine’s original duty was “download agent”– this would be more specifically called a BitTorrent machine in modern tech parlance. Back then, I placed it on someone else’s woefully underutilized home broadband connection (with their permission, of course) when I was too cheap to upgrade from dialup.


    VIA small form factor front



    This is a small form factor system from VIA that was clearly designed with home theater PC (HTPC) use cases in mind. It has a VIA C3 x86-compatible CPU (according to my notes, Centaur VIA Samuel 2 stepping 03, flags: fpu de tsc msr cx8 mtrr pge mmx 3dnow) and 128 MB of RAM (initially; I upgraded it to 512 MB some years later, just for the sake of doing it). And then there was the 120 GB PATA HD for all that downloaded goodness.


    VIA machine small form factor inside



    I have specific memories of a time when my main computer at home wasn’t working correctly for one reason or another. Instead, I logged into this machine remotely via SSH to make several optimizations and fixes on FFmpeg’s VP3/Theora video decoder, all from the terminal, without being able to see the decoded images with my own eyes (which is why I insist that even blind people could work on video codecs).

    By the time I got my own broadband, I had become inspired to attempt the automated build and test system for FFmpeg. This was the machine I used for prototyping early brainstorms of FATE. By the time I put a basic build/test system into place in early 2008, I had much faster computers that could build and test the project– the obvious limitation of this machine being that it could take at least 1/2 hour to build the entire codebase, and that was the project from 8 years ago.

    So the machine got stuffed in a closet somewhere along the line. The next time I pulled it out was in 2010 when I wanted to toy with Dreamcast programming once more (the machine appears in one of the photos in this post). This was the only machine I still owned which still had an RS-232 serial port (I didn’t know much about USB serial converters yet), plus it still had a bunch of pre-compiled DC homebrew binaries (I was having trouble getting the toolchain to work right).

    The next time I dusted off this machine was late last year when I was trying some experiments with the Microsoft Xbox’s IDE drive (a photo in that post also shows the machine; this thing shows up a lot on this blog). The VIA machine was the only machine I still owned which had 40-pin IDE connectors, which was crucial to my experiment.

    At this point, I was trying to make the machine more useful which meant replacing the ancient Gentoo Linux distribution as well as simply interacting with it via a keyboard and mouse. I have a long Evernote entry documenting a comedy of errors revolving around this little box. The interaction troubles were due to the fact that I didn’t have any PS/2 keyboards left and I couldn’t make a USB keyboard work with it. Diego was able to explain that I needed to flip a bit in the BIOS to address this, which worked. As for upgrading the OS, I tried numerous Linux distributions large and small, mostly focusing on the small. None worked. I eventually learned that, while I was trying to use i686 distributions, this machine did not actually qualify as an i686 CPU; installations usually booted but failed because the default kernel required the cmov instruction. I was advised to try i386 distros instead. My notes don’t indicate whether I had any luck on this front before I gave up and moved on.

    I just made the connection that this VIA machine has two 40-pin IDE connectors which means that the thing was technically capable of supporting up to 4 IDE devices. Obviously, the computer couldn’t really accommodate that in terms of space or power. When I wanted to try installing a new OS, I needed to take off the top and connect a rather bulky IDE CD-ROM drive. This computer’s casing was supposed to be able to support a slimline optical drive (perhaps like the type found in laptops), but I could never quite visualize how that was supposed to work, space-wise. When I disassembled the PowerPC Mac Mini, I realized I might be able to repurpose that machine’s optical drive for this computer. Obviously, I thought better of trying since both machines are off to the recycle pile.

    I would still like to work on the Xbox project a bit more, but I procured a different, unused, much more powerful yet still old computer that has a motherboard with 1 PATA connector in addition to 6 SATA connectors. If I ever get around to toying with Linux kernel development, this should be a much more appropriate platform to use.

    I thought about turning this machine into an old Windows XP (and lower, down to Windows 3.1) gaming platform; the capabilities of the machine would probably be perfect for a huge portion of my Windows game collection. But I think the lack of an optical drive renders this idea intractable. External USB drives are likely out of the question since there is very little chance that this motherboard featured USB 2.0 (the specs don’t mention 2.0, so the USB ports are probably 1.1).

    So it is with fond memories that I send off both machines, sans hard drives, to the recycle pile. I’m still deciding on an appropriate course of action for failed hard drives, though.