
Media (1)
-
Rennes Emotion Map 2010-11
19 October 2011, by
Updated: July 2013
Language: French
Type: Text
Other articles (31)
-
Customize by adding your logo, banner, or background image
5 September 2013, by
Some themes take into account three customization elements: adding a logo; adding a banner; adding a background image.
-
Writing a news item
21 June 2013, by
Present the changes to your MédiaSPIP, or news about your projects, on your MédiaSPIP using the news section.
In MédiaSPIP’s default theme, spipeo, news items are displayed at the bottom of the main page, below the editorials.
You can customize the news item creation form.
News item creation form: in the case of a document of the news type, the default fields offered are: publication date (customize the publication date) (...)
-
Publishing on MédiaSpip
13 June 2013
Can I post content from an iPad tablet?
Yes, if your installed Médiaspip is at version 0.2 or higher. If needed, contact your MédiaSpip administrator to find out.
On other sites (5376)
-
Ode to the Gravis Ultrasound
1 August 2011, by Multimedia Mike — General
WARNING: This post is a bunch of nostalgia. Feel free to follow along if you recall the DOS days of the early-to-mid 1990s.
I finally let go of my Gravis Ultrasound MAX sound card a little while ago. It felt like the end of an era for me, even though I had scarcely used the card in recent memory.
The Beginning
What is the Gravis Ultrasound? Only the finest PC sound card from the classic DOS days. Back in the day (very early 1990s), most consumer PC sound cards were Yamaha OPL FM synthesizers paired with a basic digital-to-analog converter (DAC). Gravis, a company known for game controllers, dared to break with the dominant paradigm of Sound Blaster clones and create a sound card that had 32 digital channels.
I heard about the GUS sometime in 1992 through one of the dominant online services at the time, Prodigy. Through the message boards, I learned of a promotion with Electronic Arts in which customers could pre-order a GUS at a certain discount along with 2 EA games from a selected catalog (with progressive discounts when ordering more games from the list). I know I got the DOS version of PowerMonger; I think the other was Night Shift, though that doesn’t seem to be an EA title.
Anyway, 1992 saw many maddening delays of the GUS hardware. Finally, reports of GUS shipments began to trickle into the Prodigy message forums. Then one day in November 1992, mine arrived. Into the 286 machine it went, and a valiant attempt at software installation was made. A friend and I fought with the software late into the evening, trying to make this thing work reasonably. I remember grabbing a pair of old headphones sitting near the computer that were used for an ancient (even for the time) portable radio. That was the only means of sound reproduction we had available at that moment. And it still sounded incredible.
After graduating to progressively superior headphones, I would later return to that original pair only to feel my ears were being physically assaulted. Strange, they sounded fine that first night I was trying to make the GUS work. I guess this was my first understanding that the degree to which one is a snobby audiophile is all a matter of hard-earned experience.
Technology
The GUS was powered by something called a GF1, which was supposed to use a technology called wavetable synthesis. In the early days, I thought (and I wasn’t alone in this) that this meant the GF1 chip had a bunch of digitized instrument samples stored in the ASIC. That wasn’t it.
However, it did feature 32 digital channels at a time when most PC audio cards had 2 (plus that Yamaha FM synthesizer). There was some hemming and hawing about how the original GUS couldn’t drive all 32 channels at a full 44.1 kHz ("CD quality") playback rate. It’s true: with up to 14 channels enabled, all could be played at 44.1 kHz. Enabling more channels brought progressive degradation, and with all 32 channels, each was only playing at around 19 kHz. Still, from my emerging game programmer perspective, that allowed for 8-channel tracker music and 6 channels of sound effects, all at the vaunted CD level of quality.
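For the curious, those numbers are consistent with a simple relationship: a fixed mixing budget of 14 voices at 44.1 kHz that gets divided among however many voices are active beyond 14. Here is a minimal C sketch of that assumed formula (the constant and the cutoff are inferred from the figures above, not taken from any GF1 documentation):
#include <stdio.h>

/* Estimate the GF1 per-voice playback rate for a given number of active
 * voices, assuming a fixed mixing budget of 14 voices * 44100 Hz that is
 * shared once more than 14 voices are enabled (an assumption inferred
 * from the figures quoted in the text). */
static double gf1_voice_rate_hz(int active_voices)
{
    if (active_voices <= 14)
        return 44100.0;
    return 44100.0 * 14.0 / active_voices;
}

int main(void)
{
    int voices;
    for (voices = 14; voices <= 32; voices += 2)
        printf("%2d voices -> %.0f Hz per voice\n",
               voices, gf1_voice_rate_hz(voices));
    /* 32 voices -> ~19294 Hz, matching the "around 19 kHz" figure above */
    return 0;
}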
Games and Compatibility
The primary reason to have a discrete sound card was for entertainment applications — ahem, games. GUS support was pretty sketchy out of the gate (ostensibly a major reason for the card’s delay). While many sound cards offered Sound Blaster emulation by basically having the same hardware as Sound Blaster cards, the GUS took a software route towards emulating the SB. To do this required a program called the Sound Blaster Operating System, or SBOS.
Oh, how awesome it was to hear the program exclaim "SBOS installed!" And how harshly it grated on your nerves after the 200th time hearing it, due to so many reboots and fiddling with options to make your games work. Also, I’ve always wondered if there’s something special about sampling an ’s’ sound — does it strain the sampling frequency range? Perhaps the phrase was sampled at too low a bitrate, because the ’s’ sounds didn’t come through very clearly, which is something you notice after hundreds of iterations when there are 3 ’s’ sounds in the phrase.
Fortunately, SBOS became less relevant with the advent of Mega-Em, a separate emulator which intercepted calls to Roland MIDI systems and routed them to the very capable GUS. Roland-supporting games sounded beautiful.
Eventually, more and more DOS games were released with native Gravis support, sometimes with the help of The Miles Sound System (from our friends at Rad Game Tools — you know, the people behind Smacker and Bink). The library changelog is quite the trip down PC memory lane.
An important area where the GUS shined brightly was that of demos and music trackers. The emerging PC demo scene embraced the powerful GUS (aided, no doubt, by Gravis’ sponsorship of the community) and the coolest computer art and music of the time natively supported the card.
Programming
At this point in my life, I was a budding programmer in high school and was fairly intent on programming video games. So far, I had figured out how to make a few blips using a borrowed Sound Blaster card. I went to great lengths to learn how to program the Gravis Ultrasound.
Oh, you kids today, with your easy access to information at the tips of your fingers thanks to Google and the broader internet. I had to track down whatever information I could find through a combination of Prodigy message boards and local dialup BBSes and FidoNet message bases. Gravis was initially tight-lipped about programming information for its powerful card, as was de rigueur for hardware companies (something that largely persists to this day). But Gravis eventually saw an opportunity to one-up the incumbent Creative Labs and released a full SDK for the Ultrasound. I wanted the SDK badly.
So it was early-mid 1993. Gravis released an SDK. I heard that it was available on their support BBS. Their BBS with a long distance phone number. If memory serves, the SDK was only in the neighborhood of 1.5 Mbytes. That takes a long time to transfer via a 2400 baud modem at a time when long distance phone charges were still a thing and not insubstantial.
Luckily, they also put the SDK on something called an ’FTP site’. Fortunately, about this time, I had the opportunity to get some internet access via the local university.
Indeed, my entire motivation for initially wanting to get on the internet was to obtain special programming information. Is that nerdy enough for you?
I see that the GUS SDK is still available via the Gravis FTP site. The file GUSDK222.ZIP is dated 1998 and is less than a megabyte.
Next Generation : CD Support
So I had my original GUS by the end of 1992. That was just the first iteration of the Gravis Ultrasound. The next generation was the GUS MAX. When I was ready to get into the CD-ROM era, this was what I wanted in my computer, because the GUS MAX had CD-ROM support. This is odd to think about now, when all optical drives have SATA interfaces, and had (P)ATA interfaces before that: what did CD-ROM compatibility mean back then? I wasn’t quite sure. But in early 1995, I headed over to Computer City (R.I.P.) and bought a new GUS MAX and a Sony double-speed CD-ROM drive to install in the family’s PC.
About the "CD-ROM compatibility": it seems that there were numerous competing interfaces in the early days of CD-ROM technology. The GUS MAX simply integrated 3 different CD-ROM controllers onto the audio card. This was superfluous to me, since the Sony drive came with an appropriate controller card anyway, though I didn’t figure out that the extra controller card was unnecessary until after I installed it. No matter; computers of the day were rife with expansion ports.
The 3 different CD-ROM controllers on the GUS MAX
Explaining The Difference
It was difficult to explain the difference in quality to those who didn’t really care. Sometime during 1995, I picked up a quasi-promotional CD-ROM called "The Gravis Ultrasound Experience" from Babbage’s computer store (remember when that was a thing?). As most PC software had been distributed on floppy discs up until this point, this CD-ROM was an embarrassment of riches: tons of game demos, scene demos, tracker music, and all the latest GUS drivers and support software.
Further, the CD-ROM had a number of red book CD audio tracks that illustrated the difference between Sound Blaster cards and the GUS. I remember loaning this to a tech-savvy coworker who disbelieved how awesome the GUS was. The coworker took it home, listened to it, and wholly agreed that the GUS audio sounded better than the SB audio in the comparison — and was thoroughly confused, because she was hearing this audio emanating from her Sound Blaster. It was the difference between real-time and pre-rendered audio, I suppose, but I failed to convey that message. I imagine the same issue comes up even today regarding real-time video rendering vs., e.g., a pre-rendered HD cinematic posted on YouTube.
Regrettably, I can’t find that CD-ROM anymore which leads me to believe that the coworker never gave it back. Too bad, because it was quite the treasure trove.
Aftermath
According to folklore I’ve heard, Gravis couldn’t keep up as the world changed to Windows and failed to deliver decent drivers. Indeed, I remember trying to keep my GUS in service under Windows 95 well into 1998, but I eventually relented and installed some kind of more appropriate sound card that was better supported under Windows.
Of course, audio output capability has been standard issue for any PC for at least 10 years, and many people aren’t even aware that discrete sound cards still exist. Real-time audio rendering has become less essential as full musical tracks can be composed and compressed into PCM format and delivered with the near limitless space afforded by optical storage.
A few years ago, it was easy to pick up old GUS cards on eBay for cheap. As of this writing, there are only a few and they’re pricy (but perhaps not selling). Maybe I was just viewing during the trough of no value a few years ago.
Nowadays, of course, anyone interested in studying the old GUS or getting a nostalgia fix need only boot up the always-excellent DOSBox emulator which provides remarkable GUS emulation support.
-
CD-R Read Speed Experiments
21 May 2011, by Multimedia Mike — Science Projects, Sega Dreamcast
I want to know how fast I can really read data from a CD-R. Pursuant to my previous musings on this subject, I was informed that it is inadequate to profile reading just any file from a CD-R, since data might be read faster or slower depending on whether the data is closer to the inside or the outside of the disc.
Conclusion / Executive Summary
It is 100% true that reading data from the outside of a CD-R is faster than reading data from the inside. Read on if you care to know the details of how I arrived at this conclusion, and to find out just how much speed advantage there is to reading from the outside rather than the inside.
Science Project Outline
- Create some sample CD-Rs with various properties
- Get a variety of optical drives
- Write a custom program that profiles the read speed
Creating The Test Media
It’s my understanding that not all CD-Rs are created equal. Fortunately, I have 3 spindles of media handy: some plain-looking Memorex discs, some rather flamboyant Maxell discs, and those 80mm TDK discs.
My approach for burning is to create a single file to be burned into a standard ISO-9660 filesystem. The size of the file will be the advertised length of the CD-R minus 1 megabyte for overhead — so, 699 MB for the 120mm discs, and 209 MB for the 80mm disc. The file will contain a repeating sequence of 0..0xFF bytes.
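Here is a minimal C sketch of how such a test file could be generated; the output filename and the size argument handling are my own choices, not part of the original experiment:
#include <stdio.h>
#include <stdlib.h>

/* Write a file containing a repeating 0x00..0xFF byte sequence, sized in
 * whole megabytes, suitable for burning into an ISO-9660 image. */
int main(int argc, char *argv[])
{
    const char *filename = "bigfile";   /* hypothetical output name */
    long size_mb = 699;                 /* e.g., 699 MB for a 120mm disc */
    unsigned char block[256];
    long i, blocks;
    FILE *out;

    if (argc > 1)
        size_mb = atol(argv[1]);

    for (i = 0; i < 256; i++)
        block[i] = (unsigned char)i;

    out = fopen(filename, "wb");
    if (!out)
        return 1;

    /* 1 MB = 1024 * 1024 bytes = 4096 blocks of 256 bytes */
    blocks = size_mb * 4096;
    for (i = 0; i < blocks; i++)
        fwrite(block, 1, sizeof(block), out);

    fclose(out);
    return 0;
}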
Profiling
I don’t want to leave this to the vagaries of any filesystem handling layer, so I will conduct this experiment at the sector level. Profiling program outline:
- Read the CD-ROM TOC and get the number of sectors that comprise the data track
- Profile reading the first 20 MB of sectors
- Profile reading 20 MB of sectors in the middle of the track
- Profile reading the last 20 MB of sectors
Unfortunately, I couldn’t figure out the raw sector reading on modern Linux incarnations (which is annoying since I remember it being pretty straightforward years ago). So I left it to the filesystem after all. New algorithm:
- Open the single, large file on the CD-R and query the file length
- Profile reading the first 20 MB of data, 512 kbytes at a time
- Profile reading 20 MB of sectors in the middle of the track (starting from filesize / 2 - 10 MB), 512 kbytes at a time
- Profile reading the last 20 MB of sectors (starting from filesize - 20MB), 512 kbytes at a time
Empirical Data
I tested the program in Linux using an LG Slim external multi-drive (seen at the top of the pile in this post) and one of my Sega Dreamcast units. I gathered the median value of 3 runs for each area (inner, middle, and outer). I also conducted a buffer flush in between Linux runs (as root: 'sync; echo 3 > /proc/sys/vm/drop_caches').
LG Slim external multi-drive (reading from inner, middle, and outer areas in kbytes/sec):
- TDK-80mm : 721, 897, 1048
- Memorex-120mm : 1601, 2805, 3623
- Maxell-120mm : 1660, 2806, 3624
So the 120mm discs can range from about 10.5X all the way up to a full 24X on this drive. For whatever reason, the 80mm disc fares a bit worse — even at the inner track — with a range of 4.8X - 7X.
Sega Dreamcast (reading from inner, middle, and outer areas in kbytes/sec):
- TDK-80mm : 502, 632, 749
- Memorex-120mm : 499, 889, 1143
- Maxell-120mm : 500, 890, 1156
It’s interesting that the 80mm disc performed comparably to the 120mm discs in the Dreamcast, in contrast to the LG Slim drive. Also, the results are consistent with my previous profiling experiments, which largely only touched the inner area. The read speeds range from 3.3X - 7.7X. The middle of a 120mm disc reads at about 6X.
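The X ratings quoted above follow from the standard 1X CD data rate of roughly 150 kbytes/sec; a tiny C helper makes the conversion explicit (the sample values are copied from the measurements above):
#include <stdio.h>

/* Convert a measured CD read rate in kbytes/sec to an "X" rating,
 * using the nominal 1X CD-ROM data rate of 150 kbytes/sec. */
static double cd_speed_x(double kbytes_per_sec)
{
    return kbytes_per_sec / 150.0;
}

int main(void)
{
    /* a few of the measurements quoted above */
    double rates[] = { 1601, 3624, 502, 1156 };
    int i;
    for (i = 0; i < 4; i++)
        printf("%6.0f kbytes/sec = %.1fX\n", rates[i], cd_speed_x(rates[i]));
    /* e.g., 1601 -> ~10.7X and 3624 -> ~24.2X, matching the 10.5X-24X range */
    return 0;
}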
Implications
A few thoughts regarding these results:
- Since the very definition of 1X is the minimum speed necessary to stream data from an audio CD, then presumably, original 1X CD-ROM drives would have needed to be capable of reading 1X from the inner area. I wonder what the max read speed at the outer edges was? It’s unlikely I would be able to get a 1X drive working easily in this day and age, since the earliest CD-ROM drives required custom controllers.
- I think 24X is the max rated read speed for CD-Rs, at least for this drive. This implies that the marketing literature only cites the best possible numbers. I guess this is no surprise, similar to how monitors and TVs have always been measured by their diagonal dimension.
- Given this data, how do you engineer an ISO-9660 filesystem image so that the timing-sensitive multimedia files live on the outermost track ? In the Dreamcast case, if you can guarantee your FMV files will live somewhere between the middle and the end of the disc, you should be able to count on a bitrate of at least 900 kbytes/sec.
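As a rough planning aid for that last point, one could interpolate the measured Dreamcast numbers by position within the data. This is only a sketch under the assumption that read speed grows roughly linearly from the inner to the outer edge of the data area; the inner and outer constants are taken from the Dreamcast table above:
#include <stdio.h>

/* Crudely estimate Dreamcast read speed (kbytes/sec) for a file located at
 * a given fraction of the way through a full 120mm disc, by interpolating
 * between the measured inner and outer rates quoted above. Assumes the
 * rate varies linearly with position, which is only an approximation. */
static double dc_read_rate(double position /* 0.0 = inner, 1.0 = outer */)
{
    const double inner = 500.0;   /* measured kbytes/sec, inner area */
    const double outer = 1150.0;  /* measured kbytes/sec, outer area */
    return inner + (outer - inner) * position;
}

int main(void)
{
    double pos;
    for (pos = 0.0; pos <= 1.0; pos += 0.25)
        printf("position %.2f -> ~%.0f kbytes/sec\n", pos, dc_read_rate(pos));
    /* note: the measured middle-of-disc rate above (~890 kbytes/sec) beats
       this linear-in-position estimate, since data is laid out at constant
       linear density and the data midpoint sits farther out radially than
       the radial midpoint */
    return 0;
}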
Source Code
Here is the program I wrote for profiling. Note that the filename is hardcoded (#define FILENAME). Compiling for Linux is a simple 'gcc -Wall profile-cdr.c -o profile-cdr'. Compiling for Dreamcast is performed in the standard KallistiOS manner (people skilled in the art already know what they need to know); the only variation is to compile with the '-D_arch_dreamcast' flag, which the default KOS environment adds anyway.
C:
#ifdef _arch_dreamcast
  #include <kos.h>

  /* map I/O functions to their KOS equivalents */
  #define open  fs_open
  #define lseek fs_seek
  #define read  fs_read
  #define close fs_close

  #define FILENAME "/cd/bigfile"
#else
  #include <stdio.h>
  #include <sys/types.h>
  #include <sys/stat.h>
  #include <sys/time.h>
  #include <fcntl.h>
  #include <unistd.h>

  #define FILENAME "/media/Full disc/bigfile"
#endif

/* Get a current absolute millisecond count; it doesn't have to be in
 * reference to anything special. */
unsigned int get_current_milliseconds()
{
#ifdef _arch_dreamcast
  return timer_ms_gettime64();
#else
  struct timeval tv;
  gettimeofday(&tv, NULL);
  return tv.tv_sec * 1000 + tv.tv_usec / 1000;
#endif
}

#define READ_SIZE (20 * 1024 * 1024)
#define READ_BUFFER_SIZE (512 * 1024)

int main()
{
  int i, j;
  int fd;
  char read_buffer[READ_BUFFER_SIZE];
  off_t filesize;
  unsigned int start_time, end_time;

  fd = open(FILENAME, O_RDONLY);
  if (fd == -1)
  {
    return 1;
  }
  filesize = lseek(fd, 0, SEEK_END);

  for (i = 0; i < 3; i++)
  {
    /* seek to the inner, middle, or outer area of the file */
    if (i == 0)
    {
      lseek(fd, 0, SEEK_SET);
    }
    else if (i == 1)
    {
      lseek(fd, (filesize / 2) - (READ_SIZE / 2), SEEK_SET);
    }
    else
    {
      lseek(fd, filesize - READ_SIZE, SEEK_SET);
    }
    /* read 20 MB; 40 chunks of 1/2 MB */
    start_time = get_current_milliseconds();
    for (j = 0; j < (READ_SIZE / READ_BUFFER_SIZE); j++)
      if (read(fd, read_buffer, READ_BUFFER_SIZE) != READ_BUFFER_SIZE)
      {
        break;
      }
    end_time = get_current_milliseconds();
    /* elapsed time is in ms and READ_SIZE is in bytes, so bytes/ms
     * approximates kbytes/sec; the format string is a reconstruction
     * around the surviving argument list */
    printf("%u - %u = %u ms => %u kbytes/sec\n",
      end_time, start_time, end_time - start_time,
      READ_SIZE / (end_time - start_time));
  }

  close(fd);

  return 0;
}
-
The 11th Hour RoQ Variation
12 April 2012, by Multimedia Mike — Game Hacking, dreamroq, Reverse Engineering, roq, Vector Quantization
I have been looking at the RoQ file format almost as long as I have been doing practical multimedia hacking. However, I have never figured out how the RoQ format works on The 11th Hour, which was the game for which the RoQ format was initially developed. When I procured the game years ago, I remember finding what appeared to be RoQ files and shoving them through the open source decoders, but not getting the right images out.
I decided to dust off that old copy of The 11th Hour and have another go at it.
Baseline
The game consists of 4 CD-ROMs. Each disc has a media/ directory that has a series of files bearing the extension .gjd, likely the initials of one Graeme J. Devine. These are resource files which are merely headerless concatenations of other files. Thus, at first glance, one file might appear to be a single RoQ file. So that’s the source of some of the difficulty: sending an apparent RoQ .gjd file through a RoQ player will often cause the program to complain when it encounters the header of another RoQ file.
I have uploaded some samples to the usual place.
However, even the frames that a player can decode (before encountering a file boundary within the resource file) look wrong.
Investigating Codebooks Using dreamroq
I wrote dreamroq last year, an independent RoQ playback library targeted towards embedded systems. I aimed it at a gjd file and quickly hit a codebook error.
RoQ is a vector quantizer video codec that maintains a codebook of 256 2×2-pixel vectors. In the Quake III and later RoQ files, these are transported using a YUV 4:2:0 colorspace: 4 Y samples, a U sample, and a V sample to represent 4 pixels. This totals 6 bytes per vector. A RoQ codebook chunk contains a field that indicates the number of 2×2 vectors as well as the number of 4×4 vectors. The latter vectors are each comprised of 4 2×2 vectors.
Thus, the total size of a codebook chunk ought to be (# of 2×2 vectors) * 6 + (# of 4×4 vectors) * 4.
However, this is not the case with The 11th Hour RoQ files.
Longer Codebooks And Mystery Colorspace
Juggling the numbers for a few of the codebook chunks, I empirically determined that the 2×2 vectors are represented by 10 bytes instead of 6. Now I need to determine what exactly these 10 bytes represent.
I should note that I suspect everything else about these files lines up with successive generations of the format. For example, if a file has 640×320 resolution, that amounts to 40×20 macroblocks. dreamroq iterates through 40×20 8×8 blocks and precisely exhausts the VQ bitstream. So that all looks valid. I’m just puzzled by the codebook format.
Here is an example codebook dump:
ID 0x1002, len = 0x0000014C, args = 0x1C0D
 0 : 00 00 00 00 00 00 00 00 80 80
 1 : 08 07 00 00 1F 5B 00 00 7E 81
 2 : 00 00 15 0F 00 00 40 3B 7F 84
 3 : 00 00 00 00 3A 5F 18 13 7E 84
 4 : 00 00 00 00 3B 63 1B 17 7E 85
 5 : 18 13 00 00 3C 63 00 00 7E 88
 6 : 00 00 00 00 00 00 59 3B 7F 81
 7 : 00 00 56 23 00 00 61 2B 80 80
 8 : 00 00 2F 13 00 00 79 63 81 83
 9 : 00 00 00 00 5E 3F AC 9B 7E 81
10 : 1B 17 00 00 B6 EF 77 AB 7E 85
11 : 2E 43 00 00 C1 F7 75 AF 7D 88
12 : 6A AB 28 5F B6 B3 8C B3 80 8A
13 : 86 BF 0A 03 D5 FF 3A 5F 7C 8C
14 : 00 00 9E 6B AB 97 F5 EF 7F 80
15 : 86 73 C8 CB B6 B7 B7 B7 85 8B
16 : 31 17 84 6B E7 EF FF FF 7E 81
17 : 79 AF 3B 5F FC FF E2 FF 7D 87
18 : DC FF AE EF B3 B3 B8 B3 85 8B
19 : EF FF F5 FF BA B7 B6 B7 88 8B
20 : F8 FF F7 FF B3 B7 B7 B7 88 8B
21 : FB FF FB FF B8 B3 B4 B3 85 88
22 : F7 FF F7 FF B7 B7 B9 B7 87 8B
23 : FD FF FE FF B9 B7 BB B7 85 8A
24 : E4 FF B7 EF FF FF FF FF 7F 83
25 : FF FF AC EB FF FF FC FF 7F 83
26 : CC C7 F7 FF FF FF FF FF 7F 81
27 : FF FF FE FF FF FF FF FF 80 80
Note that 0x14C (the chunk size) = 332, 0x1C and 0x0D (the chunk arguments — count of 2×2 and 4×4 vectors, respectively) are 28 and 13. 28 * 10 + 13 * 4 = 332, so the numbers check out.
Do you see any patterns in the codebook? Here are some things I tried:
- Treating the last 2 bytes as U & V and treating the first 4 as the 4 Y samples:
- Treating the last 2 bytes as U & V and treating the first 8 as 4 16-bit little-endian Y samples:
- Disregarding the final 2 bytes and treating the first 8 bytes as 4 RGB565 pixels (both little- and big-endian, respectively, shown here):
- Based on the type of data I’m seeing in these movies (which appears to be intended as overlays), I figured that some of these bits might indicate transparency; here is 15-bit big-endian RGB which disregards the top bit of each pixel:
These images are taken from the uploaded sample bdpuz.gjd, apparently a component of the puzzle represented in this screenshot.
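For anyone who wants to repeat the pixel-format experiments in the list above, here is a small C sketch of the two RGB interpretations (RGB565, and 15-bit RGB with the top bit discarded). The byte order chosen below is little-endian, which is just one of the variants tried; nothing here is known to be the actual 11th Hour format.
#include <stdint.h>
#include <stdio.h>

/* Expand a 16-bit RGB565 value to 8-bit-per-channel RGB. */
static void rgb565_to_rgb888(uint16_t p, uint8_t *r, uint8_t *g, uint8_t *b)
{
    *r = (uint8_t)(((p >> 11) & 0x1F) * 255 / 31);
    *g = (uint8_t)(((p >>  5) & 0x3F) * 255 / 63);
    *b = (uint8_t)(( p        & 0x1F) * 255 / 31);
}

/* Expand a 15-bit RGB value (top bit ignored, possibly transparency). */
static void rgb555_to_rgb888(uint16_t p, uint8_t *r, uint8_t *g, uint8_t *b)
{
    *r = (uint8_t)(((p >> 10) & 0x1F) * 255 / 31);
    *g = (uint8_t)(((p >>  5) & 0x1F) * 255 / 31);
    *b = (uint8_t)(( p        & 0x1F) * 255 / 31);
}

int main(void)
{
    /* first 8 bytes of codebook entry 12 above, read as 4 little-endian
     * 16-bit values */
    const uint8_t bytes[8] = { 0x6A, 0xAB, 0x28, 0x5F, 0xB6, 0xB3, 0x8C, 0xB3 };
    int i;
    for (i = 0; i < 4; i++) {
        uint16_t p = (uint16_t)(bytes[i * 2] | (bytes[i * 2 + 1] << 8));
        uint8_t r, g, b;
        rgb565_to_rgb888(p, &r, &g, &b);
        printf("pixel %d: RGB565 -> (%3u, %3u, %3u)", i, r, g, b);
        rgb555_to_rgb888(p, &r, &g, &b);
        printf("  15-bit -> (%3u, %3u, %3u)\n", r, g, b);
    }
    return 0;
}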
Unseen Types
It has long been rumored that early RoQ files could contain JPEG images. I finally found one such specimen. One of the files bundled early in the uploaded fhpuz.gjd sample contains a JPEG frame. It’s a standard JFIF file and can easily be decoded after separating the bytes from the resource using 'dd'. JPEGs serve as intraframes in the coding scheme, with successive RoQ frames moving objects on top.
However, a new chunk type showed up as well, one identified by 0x1030. I have never encountered this type. Where could I possibly find data about this? Fortunately, id Software recently posted all of their open sourced games at GitHub. Reading through the code for their official RoQ decoder, I see that this is called a RoQ_PACKET. The name and the code behind it are both supremely unhelpful. The code is basically a no-op. The payloads of the various RoQ_PACKETs from one sample are observed to be either 8784, 14752, or 14760 bytes in length. It’s very likely that this serves the same purpose as the JPEG intraframes.
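Since the .gjd resources are headerless concatenations, locating that JPEG is just a matter of scanning for the JFIF signature and then handing the offset to 'dd'. Here is a minimal C sketch of such a scan; the marker bytes are the standard JPEG SOI/APP0 prefix, and the filename is only an example:
#include <stdio.h>

/* Scan a headerless .gjd resource file for embedded JFIF (JPEG) headers
 * and print their byte offsets, suitable for extraction with dd. */
int main(void)
{
    const char *filename = "fhpuz.gjd";   /* example input */
    /* SOI marker followed by APP0: FF D8 FF E0 */
    const unsigned char sig[4] = { 0xFF, 0xD8, 0xFF, 0xE0 };
    unsigned char window[4] = { 0, 0, 0, 0 };
    long offset = 0;
    int c;
    FILE *f = fopen(filename, "rb");

    if (!f)
        return 1;

    while ((c = fgetc(f)) != EOF) {
        window[0] = window[1];
        window[1] = window[2];
        window[2] = window[3];
        window[3] = (unsigned char)c;
        offset++;
        if (offset >= 4 &&
            window[0] == sig[0] && window[1] == sig[1] &&
            window[2] == sig[2] && window[3] == sig[3])
            printf("possible JFIF header at offset %ld\n", offset - 4);
    }

    fclose(f);
    return 0;
}
The printed offset can then be handed to 'dd' (bs=1 together with skip and count) to carve the frame out of the resource.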
Other Tidbits
I read through the readme.txt on the first game disc and found this nugget:
g) Animations displayed normally or in SPOOKY MODE
SPOOKY MODE is blue-tinted grayscale with color cursors, puzzle
and game pieces. It is the preferred display setting of the
developers at Trilobyte. Just for fun, try out the SPOOKY
MODE.
The MobyGames screenshot page has a number of screenshots labeled as being captured in spooky mode. Color tricks?
Meanwhile, another twist arose as I kept tweaking dreamroq to deal with more RoQ weirdness: after modifying my dreamroq code to handle these 10-byte vectors, it eventually chokes on another codebook. These codebooks happen to have 6-byte vectors again! Fortunately, I was already working on a scheme to automatically detect which codebook is in play (plugging the numbers into a formula and seeing which vector size checks out, as sketched below).
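Here is a minimal C sketch of that detection formula, using the chunk size and the two vector counts from the codebook chunk header; the function name and candidate list are my own, while the chunk layout is as described above:
#include <stdio.h>

/* Given a RoQ codebook chunk's payload size and its two argument counts
 * (number of 2x2 vectors and number of 4x4 vectors), figure out how many
 * bytes each 2x2 vector occupies: 6 for the usual YUV 4:2:0 layout, 10 for
 * The 11th Hour variant. Returns -1 if neither size checks out. */
static int detect_vector_size(unsigned int chunk_size,
                              unsigned int count_2x2,
                              unsigned int count_4x4)
{
    int candidates[2] = { 6, 10 };
    int i;
    for (i = 0; i < 2; i++)
        if (count_2x2 * candidates[i] + count_4x4 * 4 == chunk_size)
            return candidates[i];
    return -1;
}

int main(void)
{
    /* the example dumped above: len = 0x14C, args = 0x1C0D (28 and 13) */
    printf("vector size: %d bytes\n", detect_vector_size(0x14C, 28, 13));
    return 0;
}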