Advanced search

Media (0)

Keyword: - Tags -/flash

No media matching your criteria is available on the site.

Other articles (34)

  • APPENDIX: The plugins used specifically for the farm

    5 March 2010, by

    The central/master site of the farm needs several additional plugins, beyond those used by the channels, in order to work properly: the Gestion de la mutualisation plugin; the inscription3 plugin, to manage registrations and requests to create a mutualisation instance as soon as users sign up; the verifier plugin, which provides a field-validation API (used by inscription3); the champs extras v2 plugin, required by inscription3 (...)

  • Encoding and processing into web-friendly formats

    13 April 2011, by

    MediaSPIP automatically converts uploaded files to internet-compatible formats.
    Video files are encoded as MP4, OGV and WebM (supported by HTML5), with MP4 also supported by Flash.
    Audio files are encoded as MP3 and OGG (supported by HTML5), with MP3 also supported by Flash.
    Where possible, text is analyzed to extract the data needed for search-engine indexing, and is then exported as a series of image files.
    All uploaded files are stored online in their original format, so you can (...)
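
    As an illustration of the kind of conversion being described, here is a minimal Python sketch that drives ffmpeg to produce those HTML5-friendly formats (the file names and codec options are assumptions for the example, not MediaSPIP’s actual settings):

    import subprocess

    SRC = "upload.mov"  # hypothetical uploaded file

    # One HTML5-friendly target per container, roughly mirroring the list above;
    # codec choices are illustrative, not MediaSPIP's real configuration.
    TARGETS = {
        "out.mp4": ["-c:v", "libx264", "-c:a", "aac"],
        "out.ogv": ["-c:v", "libtheora", "-c:a", "libvorbis"],
        "out.webm": ["-c:v", "libvpx", "-c:a", "libvorbis"],
    }

    for outfile, codecs in TARGETS.items():
        # -y overwrites any earlier encode of the same target
        subprocess.run(["ffmpeg", "-y", "-i", SRC, *codecs, outfile], check=True)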

  • MediaSPIP v0.2

    21 June 2013, by

    MediaSPIP 0.2 is the first MediaSPIP stable release.
    Its official release date is June 21, 2013, and it is announced here.
    The zip file provided here only contains the sources of MediaSPIP in its standalone version.
    To get a working installation, you must manually install all software dependencies on the server.
    If you want to use this archive for an installation in "farm mode", you will also need to proceed to other manual (...)

On other sites (5711)

  • FFMPEG: RTSP re-stream dies randomly

    14 May 2018, by stevendesu

    I have a security camera streaming RTSP, and I wish to re-stream this to an RTMP ingest server. For now I’m using my laptop as an ffmpeg proxy, but eventually I’ll use a Raspberry Pi or something similar (cheap/small).

    Here’s the command I’m using (pretty simple):

    ffmpeg -i rtsp://@10.0.0.16:554/1/h264major -c:v libx264 -c:a none -f flv rtmp://output/camera_stream

    This works, but after a minute or two the stream dies. Here’s the output:

    ffmpeg version N-90057-g7c82e0f Copyright (c) 2000-2018 the FFmpeg developers
     built with gcc 5.4.0 (Ubuntu 5.4.0-6ubuntu1~16.04.6) 20160609
     configuration: --prefix=/home/sbarnett/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/home/sbarnett/ffmpeg_build/include --extra-ldflags=-L/home/sbarnett/ffmpeg_build/lib --extra-libs='-lpthread -lm' --bindir=/home/sbarnett/bin --enable-gpl --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-libspeex --enable-nonfree
     libavutil      56.  7.101 / 56.  7.101
     libavcodec     58. 11.101 / 58. 11.101
     libavformat    58.  9.100 / 58.  9.100
     libavdevice    58.  1.100 / 58.  1.100
     libavfilter     7. 12.100 /  7. 12.100
     libswscale      5.  0.101 /  5.  0.101
     libswresample   3.  0.101 /  3.  0.101
     libpostproc    55.  0.100 / 55.  0.100
    Input #0, rtsp, from 'rtsp://@10.0.0.16:554/1/h264major':
     Metadata:
       title           : h264major
       comment         : h264major
     Duration: N/A, start: 0.360000, bitrate: N/A
       Stream #0:0: Video: h264 (Main), yuvj420p(pc, bt709, progressive), 720x480, 25 fps, 25 tbr, 90k tbn, 50 tbc
    Stream mapping:
     Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
    Press [q] to stop, [?] for help
    [libx264 @ 0x38843c0] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX FMA3 BMI2 AVX2
    [libx264 @ 0x38843c0] profile High, level 3.0
    [libx264 @ 0x38843c0] 264 - core 155 - H.264/MPEG-4 AVC codec - Copyleft 2003-2018 - http://www.videolan.org/x264.html - options: cabac=1 ref=3 deblock=1:0:0 analyse=0x3:0x113 me=hex subme=7 psy=1 psy_rd=1.00:0.00 mixed_ref=1 me_range=16 chroma_me=1 trellis=1 8x8dct=1 cqm=0 deadzone=21,11 fast_pskip=1 chroma_qp_offset=-2 threads=6 lookahead_threads=1 sliced_threads=0 nr=0 decimate=1 interlaced=0 bluray_compat=0 constrained_intra=0 bframes=3 b_pyramid=2 b_adapt=1 b_bias=0 direct=1 weightb=1 open_gop=0 weightp=2 keyint=250 keyint_min=25 scenecut=40 intra_refresh=0 rc_lookahead=40 rc=crf mbtree=1 crf=23.0 qcomp=0.60 qpmin=0 qpmax=69 qpstep=4 ip_ratio=1.40 aq=1:1.00
    Output #0, flv, to 'rtmp://output/camera_stream':
     Metadata:
       title           : h264major
       comment         : h264major
       encoder         : Lavf58.9.100
       Stream #0:0: Video: h264 (libx264) ([7][0][0][0] / 0x0007), yuvj420p(pc), 720x480, q=-1--1, 25 fps, 1k tbn, 25 tbc
       Metadata:
         encoder         : Lavc58.11.101 libx264
       Side data:
         cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
    Past duration 0.999992 too large
       Last message repeated 29 times
    [rtsp @ 0x3847600] max delay reached. need to consume packet
    [rtsp @ 0x3847600] RTP: missed 48 packets
    Past duration 0.999992 too large
       Last message repeated 4 times
    frame=   44 fps=0.0 q=0.0 size=       0kB time=00:00:00.00 bitrate=N/A dup=0 drop=5 speed=   0x    
    frame=   57 fps= 54 q=28.0 size=      43kB time=00:00:00.16 bitrate=2186.4kbits/s dup=0 drop=5 speed=0.153x    
    ... (lots of similar messages) ...  
    frame= 1163 fps= 26 q=28.0 size=    1341kB time=00:00:44.84 bitrate= 245.0kbits/s dup=0 drop=5 speed=0.99x    
    frame= 1177 fps= 26 q=28.0 size=    1353kB time=00:00:45.40 bitrate= 244.2kbits/s dup=0 drop=5 speed=0.99x    
    [rtsp @ 0x3847600] max delay reached. need to consume packet
    [rtsp @ 0x3847600] RTP: missed 2 packets
    frame= 1190 fps= 26 q=28.0 size=    1370kB time=00:00:45.92 bitrate= 244.4kbits/s dup=0 drop=5 speed=0.99x    
    [h264 @ 0x38c08c0] Increasing reorder buffer to 1
    frame= 1201 fps= 26 q=28.0 size=    1381kB time=00:00:46.36 bitrate= 244.0kbits/s dup=0 drop=5 speed=0.989x    
    frame= 1214 fps= 26 q=28.0 size=    1393kB time=00:00:46.88 bitrate= 243.4kbits/s dup=0 drop=5 speed=0.989x    
    ... (lots of similar messages) ...    
    frame= 1761 fps= 25 q=28.0 size=    2030kB time=00:01:08.80 bitrate= 241.7kbits/s dup=0 drop=5 speed=0.993x    
    frame= 1774 fps= 25 q=28.0 size=    2041kB time=00:01:09.32 bitrate= 241.2kbits/s dup=0 drop=5 speed=0.993x    
    [flv @ 0x3884900] Failed to update header with correct duration.
    [flv @ 0x3884900] Failed to update header with correct filesize.
    frame= 1782 fps= 25 q=-1.0 Lsize=    2127kB time=00:01:11.64 bitrate= 243.2kbits/s dup=0 drop=5 speed=1.02x    
    video:2092kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.679417%
    [libx264 @ 0x38843c0] frame I:8     Avg QP:16.89  size: 42446
    [libx264 @ 0x38843c0] frame P:1672  Avg QP:19.54  size:  1065
    [libx264 @ 0x38843c0] frame B:102   Avg QP:23.00  size:   205
    [libx264 @ 0x38843c0] consecutive B-frames: 92.4%  0.0%  0.0%  7.6%
    [libx264 @ 0x38843c0] mb I  I16..4: 12.9% 36.2% 50.9%
    [libx264 @ 0x38843c0] mb P  I16..4:  0.2%  0.2%  0.0%  P16..4: 16.7%  0.7%  1.0%  0.0%  0.0%    skip:81.1%
    [libx264 @ 0x38843c0] mb B  I16..4:  0.1%  0.1%  0.0%  B16..8: 11.7%  0.1%  0.0%  direct: 1.5%  skip:86.5%  L0:62.2% L1:35.3% BI: 2.5%
    [libx264 @ 0x38843c0] 8x8 transform intra:40.8% inter:47.4%
    [libx264 @ 0x38843c0] coded y,uvDC,uvAC intra: 46.5% 53.0% 17.2% inter: 3.9% 8.7% 0.0%
    [libx264 @ 0x38843c0] i16 v,h,dc,p: 21% 56%  8% 15%
    [libx264 @ 0x38843c0] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 23% 33% 31%  1%  2%  3%  2%  2%  3%
    [libx264 @ 0x38843c0] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 25% 39%  9%  3%  3%  4%  5%  3%  8%
    [libx264 @ 0x38843c0] i8c dc,h,v,p: 43% 33% 21%  3%
    [libx264 @ 0x38843c0] Weighted P-Frames: Y:0.0% UV:0.0%
    [libx264 @ 0x38843c0] ref P L0: 88.0%  1.4%  6.6%  4.0%
    [libx264 @ 0x38843c0] ref B L0: 99.4%  0.5%  0.1%
    [libx264 @ 0x38843c0] ref B L1: 99.4%  0.6%
    [libx264 @ 0x38843c0] kb/s:238.73

    The camera is pretty cheap (from China), so it’s likely I’m getting bad data from it or it’s cutting out for a few seconds at a time. Ideally I need ffmpeg to handle this gracefully (ignore bad data, wait as long as necessary for good data to resume encoding).

    Using ffplay to check out the RTSP stream, I get output like the following:

    $> ffplay -i rtsp://@10.0.0.16:554/1/h264major
    ffplay version N-90057-g7c82e0f Copyright (c) 2003-2018 the FFmpeg developers
     built with gcc 5.4.0 (Ubuntu 5.4.0-6ubuntu1~16.04.6) 20160609
     configuration: --prefix=/home/sbarnett/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/home/sbarnett/ffmpeg_build/include --extra-ldflags=-L/home/sbarnett/ffmpeg_build/lib --extra-libs='-lpthread -lm' --bindir=/home/sbarnett/bin --enable-gpl --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-libspeex --enable-nonfree
     libavutil      56.  7.101 / 56.  7.101
     libavcodec     58. 11.101 / 58. 11.101
     libavformat    58.  9.100 / 58.  9.100
     libavdevice    58.  1.100 / 58.  1.100
     libavfilter     7. 12.100 /  7. 12.100
     libswscale      5.  0.101 /  5.  0.101
     libswresample   3.  0.101 /  3.  0.101
     libpostproc    55.  0.100 / 55.  0.100
    Input #0, rtsp, from 'rtsp://@10.0.0.16:554/1/h264major':0B f=0/0
     Metadata:
       title           : h264major
       comment         : h264major
     Duration: N/A, start: 0.320000, bitrate: N/A
       Stream #0:0: Video: h264 (Main), yuvj420p(pc, bt709, progressive), 720x480, 25 fps, 25 tbr, 90k tbn, 50 tbc
    [swscaler @ 0x7f6bbc093180] deprecated pixel format used, make sure you did set range correctly
    [rtsp @ 0x7f6bc0000940] max delay reached. need to consume packet
    [rtsp @ 0x7f6bc0000940] RTP: missed 2 packets
    [h264 @ 0x7f6bc0041080] error while decoding MB 44 28, bytestream -37
    [h264 @ 0x7f6bc0041080] concealing 95 DC, 95 AC, 95 MV errors in I frame
    [rtsp @ 0x7f6bc0000940] max delay reached. need to consume packet
    [rtsp @ 0x7f6bc0000940] RTP: missed 1 packets
    [h264 @ 0x7f6bc0041080] error while decoding MB 43 29, bytestream -49
    [h264 @ 0x7f6bc0041080] concealing 51 DC, 51 AC, 51 MV errors in I frame
    [rtsp @ 0x7f6bc0000940] max delay reached. need to consume packet
    [rtsp @ 0x7f6bc0000940] RTP: missed 2 packets
    [h264 @ 0x7f6bc0041080] Increasing reorder buffer to 1
    [rtsp @ 0x7f6bc0000940] max delay reached. need to consume packet
    [rtsp @ 0x7f6bc0000940] RTP: missed 3 packets
    [h264 @ 0x7f6bc02c3600] error while decoding MB 27 29, bytestream -24
    [h264 @ 0x7f6bc02c3600] concealing 67 DC, 67 AC, 67 MV errors in I frame
    [rtsp @ 0x7f6bc0000940] max delay reached. need to consume packet
    [rtsp @ 0x7f6bc0000940] RTP: missed 2 packets
    [rtsp @ 0x7f6bc0000940] max delay reached. need to consume packet
    [rtsp @ 0x7f6bc0000940] RTP: missed 42 packets
    [rtsp @ 0x7f6bc0000940] max delay reached. need to consume packet
    [rtsp @ 0x7f6bc0000940] RTP: missed 2 packets

    Then eventually the video just freezes. The first time it froze after around 5 minutes, but I wasn’t able to say definitively if it froze the instant 44 packets were dropped or if it froze randomly later. So the second time I stared intently.... for 21 minutes. Then I got bored of it not freezing, turned to pet my cat, and when I looked back 15 seconds later it was frozen. I think it only breaks when no one is watching it.

    What I can say definitively is:

    • While running normally, M-V hovers around 0 (anywhere between -0.01 and +0.01)
    • Once frozen, M-V counts down into negative numbers without stopping, although at a rate slower than -1 per second
    • While running normally, aq is 0 KB and vq is a positive number (I think it was 30 KB or so?)
    • Once frozen, vq is also 0 KB

    It’s a really cheap camera with a crummy power supply that goes out if you breathe on it, so it’s likely the camera is going temporarily offline during this time — but I’d like ffmpeg to wait out a timeout and resume streaming when it sees the camera again.
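
    For what it’s worth, a common mitigation here is to force RTSP over TCP (so lost UDP packets can’t corrupt the decode), set a socket timeout so ffmpeg exits instead of hanging when the camera disappears, and restart it from a small watchdog. A minimal Python sketch of that idea (the timeout value and restart delay are illustrative assumptions, not something tested against this camera; -stimeout is the RTSP socket timeout in builds of this vintage, renamed -timeout in newer ones):

    import subprocess
    import time

    # Same input/output URLs as the command above; only the RTSP options differ.
    CMD = [
        "ffmpeg",
        "-rtsp_transport", "tcp",  # avoid the "RTP: missed N packets" UDP losses
        "-stimeout", "5000000",    # RTSP socket timeout in microseconds (5 s)
        "-i", "rtsp://@10.0.0.16:554/1/h264major",
        "-c:v", "libx264",
        "-an",                     # no audio, as in the original command
        "-f", "flv",
        "rtmp://output/camera_stream",
    ]

    while True:
        result = subprocess.run(CMD)
        print(f"ffmpeg exited with {result.returncode}; restarting in 2 s")
        time.sleep(2)  # brief pause so a dead camera doesn't busy-loop

    TCP transport trades a little latency for reliability, and the timeout makes ffmpeg fail fast instead of freezing, which is what lets the loop recover.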

  • Cohort Analysis 101: How-To, Examples & Top Tools

    13 November 2023, by Erin — Analytics Tips

    Imagine that a farmer is trying to figure out why certain hens are laying large brown eggs and others are laying average-sized white eggs.

    The farmer decides to group the hens into cohorts based on what kind of eggs they lay to make it easier to detect patterns in their day-to-day lives. After careful observation and analysis, she discovers that the hens laying big brown eggs eat more than the rest of the flock.

    With this cohort analysis, the farmer deduces that a hen’s body weight directly corresponds to egg size. She can now develop a strategy to increase the body weight of her hens to sell more large brown eggs, which are very popular at the weekly farmers’ market.

    Cohort analysis has a myriad of applications in the world of web analytics. Like our farmer, you can use it to better understand user behaviour and reap the benefits of your efforts. This article will discuss the best practices for conducting an effective cohort analysis and compare the top cohort analysis tools for 2024. 

    What is cohort analysis?

    By definition, cohort analysis refers to a technique where users are grouped based on shared characteristics or behaviours and then examined over a specified period.

    Think of it as a marketing superpower, enabling you to comprehend user behaviours, craft personalised campaigns and allocate resources wisely, ultimately resulting in improved performance and better ROI.

    Why does cohort analysis matter?

    In web analytics, a cohort is a group of users who share a certain behaviour or characteristic. The goal of cohort analysis is to uncover patterns and compare the performance and behaviour of different cohorts over time.

    An example of a cohort is a group of users who made their first purchase during the holidays. By analysing this cohort, you could learn more about their behaviour and buying patterns. You may discover that this cohort is more likely to buy specific product categories as holiday gifts — you can then tailor future holiday marketing campaigns to include these categories. 
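
    To make this concrete, here is a minimal sketch of how a time-based retention cohort can be computed from a raw order log with Python and pandas (the DataFrame and its column names are hypothetical, for illustration only):

    import pandas as pd

    # Hypothetical order log: one row per purchase.
    orders = pd.DataFrame({
        "user_id": [1, 1, 2, 2, 3, 3, 3],
        "order_at": pd.to_datetime([
            "2023-12-05", "2024-01-10",                # user 1: December cohort, back in January
            "2023-12-20", "2023-12-28",                # user 2: December cohort only
            "2024-01-03", "2024-02-14", "2024-03-01",  # user 3: January cohort
        ]),
    })

    # Cohort = month of each user's first purchase.
    orders["month"] = orders["order_at"].dt.to_period("M")
    orders["cohort"] = orders.groupby("user_id")["month"].transform("min")

    # Whole months elapsed since the cohort month (0 = acquisition month).
    orders["period"] = (orders["month"] - orders["cohort"]).apply(lambda d: d.n)

    # Rows: cohorts; columns: months since first purchase; values: active users.
    retention = (
        orders.groupby(["cohort", "period"])["user_id"].nunique().unstack(fill_value=0)
    )
    print(retention)

    Each row of the resulting table is a cohort (the month of first purchase), and each column counts how many of that cohort’s users were active again that many months later.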

    Types of cohort analysis

    There are a few notable types of cohorts:

    1. Time-based cohorts are groups of users categorised by a specific time. The farmer example from the beginning of this article is a great illustration of a time-based cohort.
    2. Acquisition cohorts are users acquired during a specific time frame, event or marketing channel. Analysing these cohorts can help you determine the value of different acquisition methods. 
    3. Behavioural cohorts consist of users who show similar patterns of behaviour. Examples include frequent purchases with your mobile app or digital content engagement. 
    4. Demographic cohorts share common demographic characteristics like age, gender, education level and income. 
    5. Churn cohorts are buyers who have cancelled a subscription/stopped using your service within a specific time frame. Analysing churn cohorts can help you understand why customers leave.
    6. Geographic cohorts are pretty self-explanatory — you can use them to tailor your marketing efforts to specific regions. 
    7. Customer journey cohorts are based on the buyer lifecycle — from acquisition to adoption to retention. 
    8. Product usage cohorts group buyers by how they use your product/service (think basic users, power users or occasional users). 

    Best practices for conducting a cohort analysis 

    So, you’ve decided you want to understand your user base better but don’t know how to go about it. Perhaps you want to reduce churn and create a more engaging user experience. In this section, we’ll walk you through the dos and don’ts of conducting an effective cohort analysis. Remember to tailor your cohort analysis strategy to your organisation’s specific goals.

    A line graph depicting product usage cohort data with a blue line for new users and a green line for power users.

    1. Preparing for cohort analysis:

      • First, define specific goals you want your cohort analysis to achieve. Examples include improving conversion rates or reducing churn.
      • Choosing the right time frame will help you compare short-term vs. long-term data trends. 

    2. Creating effective cohorts:

      • Define your segmentation criteria — anything from demographics to location, purchase history or user engagement level. Narrowing in on your specific segments will make your cohort analysis more precise. 
      • It’s important to find a balance between cohort size and similarity. If your cohorts are too small or too diverse, you won’t be able to find meaningful behavioural patterns.

    3. Performing cohort analysis:

      • Study retention rates across cohorts to identify patterns in user behaviour and engagement over time. Pay special attention to cohorts with high retention or churn rates. 
      • Analysing cohorts can reveal interesting behavioural insights — how do specific cohorts interact with your website? Do they have certain preferences? Why?

    4. Visualising and interpreting data:

      • Visualising your findings can be a great way to reveal patterns. Line charts can help you spot trends, while bar charts can help you compare cohorts.
      • Guide your analytics team on how to interpret patterns in cohort data. Watch for sudden drops or spikes and what they could mean. 

    5. Continuing to improve:

      • User behaviour is constantly evolving, so be adaptable. Continuous tracking of user behaviour will help keep your strategies up to date. 
      • Encourage iterative analysis optimisation based on your findings. 

    The top cohort analysis tools for 2024

    In this section, we’ll go over the best cohort analysis tools for 2024, including their key features, cohort analysis dashboards, cost and pros and cons.

    1. Matomo

    A screenshot of a cohorts graph in Matomo

    Matomo is an open-source, GDPR-compliant web analytics solution that offers cohort analysis as a standard feature in Matomo Cloud and as a plugin for Matomo On-Premise. Pairing traditional web analytics with cohort analysis will help you gain even deeper insights into user behaviour over time.

    You can use the data you get from web analytics to identify patterns in user behaviour and target your marketing strategies to specific cohorts. 

    Key features

    • Matomo offers a cohorts table that lets you compare cohorts side by side, and it comes with a time series.
      • All core session and conversion metrics are also available in the Cohorts report.
    • Create custom segments based on demographics, geography, referral sources, acquisition date, device types or user behaviour. 
    • Matomo provides retention analysis so you can track how many users from a specific cohort return to your website and when. 
    • Flexibly analyse your cohorts with custom reports. Customise your reports by combining metrics and dimensions specific to different cohorts. 
    • Create cohorts based on events or interactions with your website. 
    • Intuitive, colour-coded data visualisation, so you can easily spot patterns.

    Pros

    • No setup is needed if you use the JavaScript tracker
    • You can fetch cohorts without any limit
    • 100% accurate data: no AI or machine-learning data filling and no data sampling

    Cons

    • Matomo On-Premise (self-hosted) is free, but advanced features come with additional charges
    • Servers and technical know-how are required for Matomo On-Premise. Alternatively, for those not ready for self-hosting, Matomo Cloud presents a more accessible option and starts at $19 per month.

    Price:

    • Matomo Cloud: 21-day free trial, then starts at $19 per month (includes Cohorts).
    • Matomo On-Premise: free to self-host; the Cohorts plugin has a 30-day free trial, then costs $99 per year.

    2. Mixpanel

    Mixpanel is a product analytics tool designed to help teams better understand user behaviour. It is especially well-suited for analysing user behaviour on iOS and Android apps. It offers various cohort analytics features that can be used to identify patterns and engage your users. 

    Key features

    • Create cohorts based on criteria such as sign-up date, first purchase date, referral source, geographic location, device type or another custom event/property. 
    • Use Mixpanel’s comparative analysis features to compare how different cohorts engage with your app.
    • Create interactive dashboards, charts and graphs to visualise data.
    • Mixpanel provides retention analysis tools to see how often users return to your product over time. 
    • Send targeted messages and notifications to specific cohorts to encourage user engagement, announce new features, etc. 
    • Track and analyse user behaviours within cohorts — understand how different types of users engage with your product.

    Pros

    • Easily export cohort analysis data for further analysis
    • Combined with Mixpanel reports, cohorts can be a powerful tool for improving your product

    Cons

    • With the free Mixpanel plan, you can’t save cohorts for future use
    • Enterprise-level pricing is expensive
    • Time-consuming cohort creation process

    Price: Free basic version. The Growth plan starts at £16/month.

    3. Amplitude

    A screenshot of a cohorts graph in Amplitude

    Amplitude is another product analytics solution that can help businesses track user interactions across digital platforms. Amplitude offers a standard toolkit for in-depth cohort analysis.

    Key features

    • Create cohorts based on criteria such as sign-up date, first purchase date, referral source, geographic location, device type or another custom event/property. 
    • Conduct behavioural, time-based and retention analyses.
    • Create custom reports with custom data.
    • Segment cohorts further based on additional criteria and compare multiple cohorts side-by-side.

    Pros

    • Highly customisable and flexible
    • Quick and simple setup

    Cons

    • Steep learning curve — requires significant training 
    • Slow loading speed
    • High price point compared to other tools

    Price: Free basic version. The Plus plan starts at £40/month (billed annually).

    4. Kissmetrics

    A screenshot of a cohorts graph in Kissmetrics

    Kissmetrics is a customer engagement automation platform with powerful analytics features, including behavioural analytics, segmentation and email campaign automation.

    Key features

    • Create cohorts based on demographics, user behaviour, referral sources, events and specific time frames.
    • The user path tool provides path visualisation so you can identify common paths users take and spot abandonment points. 
    • Create and optimise conversion funnels.
    • Customise events, user properties, funnels, segments, cohorts and more.

    Pros

    • Powerful data visualisation options
    • Highly customisable

    Cons

    • Difficult to install
    • Not well-suited for small businesses
    • Limited integration with other tools

    Price: Starting at £21/month for 10k events (billed monthly).

    Improve your cohort analysis with Matomo

    When choosing a cohort analysis tool, consider factors such as the tool’s ease of integration with your existing systems, data accuracy, the flexibility it offers in defining cohorts, the comprehensiveness of reporting features, and its scalability to accommodate the growth of your data and analysis needs over time. Moreover, it’s essential to confirm GDPR compliance to uphold rigorous privacy standards. 

    If you’re ready to understand your users’ behaviour, take Matomo for a test drive. Paired with web analytics, this powerful combination can advance your marketing efforts. Start your 21-day free trial today — no credit card required.

  • Saying Goodbye To Old Machines

    1 December 2014, by Multimedia Mike — General, powerpc, via

    I recently sent a few old machines off for recycling. Both had relevance to the early days of the FATE testing effort. As is my custom, I photographed them (poorly, of course).

    First, there’s the PowerPC-based Mac Mini I procured thanks to a Craigslist ad in late 2006. I had plans to develop automated FFmpeg building and testing and was already looking ahead toward testing multiple CPU architectures. Again, this was 2006 and PowerPC wasn’t completely on the outs yet; although Apple’s MacTel transition was in full swing, the entire new generation of video game consoles was based on PowerPC.


    PPC Mac Mini pieces

    I remember trying to find a Mac Mini PPC on Craigslist. Many were to be found, but they all asked more than the price of even a new Mac Mini Intel, always because the seller was bundling all of last year’s applications and perhaps a monitor, neither of which I needed. Fortunately, I found this bare Mac Mini. Also fortunate was the fact that it was far easier to install Linux on it than the first PowerPC machine I owned.

    After FATE operation transitioned away from me, I still kept the machine in service as an edge server and automated backup machine. That is, until the hard drive failed on reboot one day. Thus, when it was finally time to recycle the computer, I felt it necessary to disassemble the machine and remove the hard drive for possible salvage and then for destruction.

    If you’ve ever attempted to upgrade or otherwise service this style of Mac Mini, you will no doubt recognize the pictured paint scraper tool as standard kit. I have had that tool since I first endeavored to upgrade the RAM to 1 GB from the standard 1/2 GB. Performing such activities on a Mac Mini is tedious, but only if you care about putting it back together afterwards.

    The next machine is a bit older. I put it together nearly a decade ago, early in 2005. This machine’s original duty was “download agent”; in modern tech parlance it would more specifically be called a BitTorrent machine. Back then, I placed it on someone else’s woefully underutilized home broadband connection (with their permission, of course) when I was too cheap to upgrade from dialup.


    VIA small form factor front

    This is a small form factor system from VIA that was clearly designed with home theater PC (HTPC) use cases in mind. It has a VIA C3 x86-compatible CPU (according to my notes, Centaur VIA Samuel 2 stepping 03, flags: fpu de tsc msr cx8 mtrr pge mmx 3dnow) and 128 MB of RAM (initially; I upgraded it to 512 MB some years later, just for the sake of doing it). And then there was the 120 GB PATA HD for all that downloaded goodness.


    VIA machine small form factor inside

    I have specific memories of a time when my main computer at home wasn’t working correctly for one reason or another. Instead, I logged into this machine remotely via SSH to make several optimizations and fixes on FFmpeg’s VP3/Theora video decoder, all from the terminal, without being able to see the decoded images with my own eyes (which is why I insist that even blind people could work on video codecs).

    By the time I got my own broadband, I had become inspired to attempt the automated build and test system for FFmpeg. This was the machine I used for prototyping early brainstorms of FATE. By the time I put a basic build/test system into place in early 2008, I had much faster computers that could build and test the project; the obvious limitation of this machine was that it could take at least half an hour to build the entire codebase, and that was the codebase of 8 years ago.

    So the machine got stuffed in a closet somewhere along the line. The next time I pulled it out was in 2010 when I wanted to toy with Dreamcast programming once more (the machine appears in one of the photos in this post). This was the only machine I still owned which still had an RS-232 serial port (I didn’t know much about USB serial converters yet), plus it still had a bunch of pre-compiled DC homebrew binaries (I was having trouble getting the toolchain to work right).

    The next time I dusted off this machine was late last year when I was trying some experiments with the Microsoft Xbox’s IDE drive (a photo in that post also shows the machine; this thing shows up a lot on this blog). The VIA machine was the only one I still owned with 40-pin IDE connectors, which was crucial to my experiment.

    At this point, I was trying to make the machine more useful, which meant replacing the ancient Gentoo Linux distribution as well as simply interacting with it via a keyboard and mouse. I have a long Evernote entry documenting a comedy of errors revolving around this little box. The interaction troubles were due to the fact that I didn’t have any PS/2 keyboards left and I couldn’t make a USB keyboard work with it. Diego was able to explain that I needed to flip a bit in the BIOS to address this, which worked. As for upgrading the OS, I tried numerous Linux distributions large and small, mostly focusing on the small. None worked. I eventually learned that, while I was trying to use i686 distributions, this machine did not actually qualify as an i686 CPU; installations usually booted but failed because the default kernel required the cmov instruction. I was advised to try i386 distros instead. My notes don’t indicate whether I had any luck on this front before I gave up and moved on.

    I just made the connection that this VIA machine has two 40-pin IDE connectors, which means that the thing was technically capable of supporting up to 4 IDE devices. Obviously, the computer couldn’t really accommodate that in terms of space or power. When I wanted to try installing a new OS, I needed to take off the top and connect a rather bulky IDE CD-ROM drive. This computer’s casing was supposed to be able to support a slimline optical drive (perhaps like the type found in laptops), but I could never quite visualize how that was supposed to work, space-wise. When I disassembled the PowerPC Mac Mini, I realized I might be able to repurpose that machine’s optical drive for this computer. In the end, I thought better of trying, since both machines are off to the recycle pile.

    I would still like to work on the Xbox project a bit more, but I procured a different, unused, much more powerful yet still old computer that has a motherboard with 1 PATA connector in addition to 6 SATA connectors. If I ever get around to toying with Linux kernel development, this should be a much more appropriate platform to use.

    I thought about turning this machine into an old Windows XP (and lower, down to Windows 3.1) gaming platform; the capabilities of the machine would probably be perfect for a huge portion of my Windows game collection. But I think the lack of an optical drive renders this idea intractable. External USB drives are likely out of the question since there is very little chance that this motherboard featured USB 2.0 (the specs don’t mention 2.0, so the USB ports are probably 1.1).

    So it is with fond memories that I send off both machines, sans hard drives, to the recycle pile. I’m still deciding on an appropriate course of action for failed hard drives, though.