
Other articles (24)
-
The SPIPmotion queue
28 November 2010
A queue stored in the database
Upon installation, SPIPmotion creates a new table in the database named spip_spipmotion_attentes.
This new table is made up of the following fields: id_spipmotion_attente, the unique numeric identifier of the task to be processed; id_document, the numeric identifier of the original document to be encoded; id_objet, the unique identifier of the object to which the encoded document will be attached automatically; objet, the type of object to which (...)
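For illustration only, here is a minimal Python sketch of what one row of this queue table might look like; the field names come from the excerpt above, while the types and the example value are assumptions, since the excerpt is truncated:

# Hypothetical mirror of one spip_spipmotion_attentes row; field names
# are taken from the article excerpt, the types are assumptions.
from dataclasses import dataclass

@dataclass
class SpipmotionAttente:
    id_spipmotion_attente: int  # unique numeric id of the encoding task
    id_document: int            # original document to be encoded
    id_objet: int               # object the encoded document attaches to
    objet: str                  # type of that object, e.g. "article"

-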
Contribute to documentation
13 April 2011
Documentation is vital to the development of improved technical capabilities.
MediaSPIP welcomes documentation from users as well as developers, including: critiques of existing features and functions; articles contributed by developers, administrators, content producers and editors; screenshots to illustrate the above; and translations of existing documentation into other languages.
To contribute, subscribe to the project users' mailing (...) -
Selection of projects using MediaSPIP
2 May 2011
The examples below are representative of MediaSPIP's uses in specific projects.
MediaSPIP farm @ Infini
The non-profit organization Infini develops hospitality activities, an internet access point, training, innovative projects in the field of information and communication technologies, and website hosting. It plays a unique and prominent role in the Brest (France) area and, at the national level, among the half-dozen such associations. Its members (...)
On other sites (3041)
-
Dreamcast Anniversary Programming
10 September 2010, by Multimedia Mike — Game Hacking
This day last year saw a lot of nostalgia posts on the internet regarding the Sega Dreamcast, launched 10 years prior to that day (on 9/9/99). Regrettably, none of the retrospectives that I read really seemed to mention the homebrew potential, which is the aspect that interested me. On the occasion of the DC's 11th anniversary, I wanted to remind myself how to build something for the unit and do so using modern equipment and build tools.
Background
Like many other programmers, I initially gained interest in programming because I desired to program video games. Not content to just plunk out games on a PC, I always had a deep, abiding ambition to program actual video game hardware. That is, I wanted to program a purpose-built video game console. The Sega Dreamcast might be the most ideal candidate to ever emerge for that task. All that was required to run your own software on the unit was the console, a PC, some free software tools, and a special connectivity measure.
The Equipment
Here is the hardware required (ideally) to build software for the DC:
- The console itself (I happen to have 3 of them lying around, as pictured above)
- Some peripherals: such as the basic DC controller, the DC keyboard (flagship title: Typing of the Dead), and the visual memory unit (VMU)
- VGA box: the DC supported 480p gaming via a device that allowed you to connect the console straight to a VGA monitor via 15-pin D-sub. Not required for development, but very useful. I happen to have 3 of them from different third parties.
- Finally, the connectivity measure for hooking the DC to the PC.
There are 2 options here. The first is rare, expensive and relatively fast: a DC broadband adapter. The second is slower but much less expensive and relatively easy to come by: the DC coder's cable. This has a DB-9 adapter on one end, a DC serial adapter on the other, and a circuit in the middle to monkey with voltage levels or some such; I'm no electrical engineer. I procured this model from the notorious Lik Sang, well before that outfit was sued out of business.
Dealing With Legacy
Take a look at that coder's cable again. DB-9? When was the last time you owned a computer with one of those? And then think farther back to the last time you had occasion to plug something into one of those ports (likely a serial mouse).
A few years ago, someone was about to toss out this Belkin USB to DB-9 serial converter when I intervened. I foresaw the day when I would dust off the coder’s cable. So now I can connect a USB serial cable to my Eee PC, which then connects via converter to a different serial cable, one which has its own conversion circuit that alters the connection to yet another type of serial cable.
Bits is bits is bits as far as I’m concerned.
Putting It All Together
Now to assemble all the pieces (plus a monitor) into one development desktop:
The monitor says “dcload 1.0.3, idle…”. That’s a custom boot CD-ROM that is patiently waiting to receive commands, code and data via the serial port.
Getting The Software
Back in the day, homebrew software development on the DC revolved around these components:
- GNU binutils: for building base toolchains for the Hitachi SH-4 main CPU as well as the ARM7-based audio coprocessor
- GNU gcc/g++: for building compilers on top of binutils for the 2 CPUs
- Newlib: a C library intended for embedded systems
- KallistiOS: an open source, real-time OS developed for the DC
The DC was my first exposure to building cross compilers. I developed some software for the DC in the earlier part of the decade. Now, I am trying to figure out how I did it, especially since I think I came up with a few interesting ideas at the time.
Struggling With the Software Legacy
The source for KallistiOS has gone untouched since about 2004 but is still around thanks to Sourceforge. The instructions for properly building the toolchain have been lost to time, or would be were it not for the Internet Archive's copy of a site called Hangar Eleven. Also, KallistiOS makes reference to a program called 'dc-tool' which is needed on the client side for communicating with dcload. I was able to find this binary at the Boob! site (well-known in DC circles).
I was able to build the toolchain using binutils 2.20.1, gcc 4.5.1 and newlib 1.18.0. Building the toolchain is an odd process: it requires building binutils, then the C compiler, then newlib, and then the C compiler again along with the C++ compiler, because the C++ compiler depends on newlib. The sequence is sketched just below.
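For reference, a minimal Python sketch of that four-pass ordering; the sh-elf target triple, the /opt/dc prefix, the version-numbered directory names and the configure flags are assumptions based on conventional GNU cross-toolchain builds, not details from this article:

# Sketch of the toolchain build ordering described above. The target,
# prefix and configure flags are assumptions; each source tree is
# assumed to be unpacked in the current directory.
import os
import subprocess

PREFIX = "/opt/dc"   # hypothetical install location
TARGET = "sh-elf"    # conventional GNU triple for the SH-4 main CPU

def build(srcdir, extra_args):
    # configure, compile and install one component from its source tree
    subprocess.check_call(["./configure", "--prefix=" + PREFIX,
                           "--target=" + TARGET] + extra_args, cwd=srcdir)
    subprocess.check_call(["make"], cwd=srcdir)
    subprocess.check_call(["make", "install"], cwd=srcdir)

# the freshly installed binutils must be on PATH for the gcc builds
os.environ["PATH"] = PREFIX + "/bin:" + os.environ["PATH"]

build("binutils-2.20.1", [])
# pass 1: C compiler only; no C library exists for the target yet
build("gcc-4.5.1", ["--enable-languages=c", "--without-headers",
                    "--with-newlib"])
build("newlib-1.18.0", [])
# pass 2: rebuild the C compiler and add C++, which needs newlib installed
build("gcc-4.5.1", ["--enable-languages=c,c++", "--with-newlib"])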
With some effort, I got the toolchain to build KallistiOS and most of its example programs. I documented most of the tweaks I had to make, several of them exactly the same as this one that I recently discovered while resurrecting a 10-year-old C program (a common construct in C programming of old?).
Moment of Truth
So I had some example programs built as ELF files. I told dc-tool to upload and run them on the waiting console. Unfortunately, the tool would just sort of stall, though some communication had evidently taken place. It has been many years since I have seen this in action, but I recall that something more ought to be happening.
Plan B (Hardware)
This is the point at which I remember that I have been holding onto one rather old little machine that still has a DB-9 serial port. It's not especially ergonomic to set up. I have to run it on my floor because, to connect it to my network, I need to run a 25′ ethernet cable that just barely reaches from the other room. The machine doesn't seem to like USB keyboards, which is a shame since I have long since ditched any PS/2 keyboards. Fortunately, the box still has an old Gentoo distro and is running sshd, a holdover from its former life as a headless box.
Now when I run dc-tool, both the PC and DC report the upload progress while pretty overscan bars oscillate on the DC’s monitor. Now I’m back in business, until…
Plan C (Software)
None of these KallistiOS example programs are working. Some are even reporting catastrophic failures (register dumps) via the serial console. That's when I remember that gcc can be a bit fickle on CPU architectures that are not, shall we say, first-class citizens. Back in the day, gcc 2.95 was a certified no-go for SH-4 development. 3.0.3 or 3.0.4 was called upon at the time. As I'm hosting this toolchain on x86_64 right now, gcc 3.0.4 can't even be built (it predates the architecture).
One last option: as I searched through my old DC project directories, I found that I still have a lot of the resulting binaries, the ones I built 7-8 years ago. I upload a few of those and I finally see homebrew programming at work again, including this old program (described in detail here).
Next Steps
If I ever feel like revisiting this again, I suppose I can try some of the older 4.x series to see if they build valid programs. Alternatively, try building an x86_32-hosted 3.0.4 toolchain which ought to be a known good. And if that fails, search a little bit more to find that there are still active Dreamcast communities out there on the internet which probably have development toolchain binaries ready for download. -
Brute Force Dimensional Analysis
15 July 2010, by Multimedia Mike — Game Hacking, Python
I was poking at the data files of a really bad (is there any other kind?) interactive movie video game known simply by one letter: D. The Sega Saturn version of the game is comprised primarily of Sega FILM/CPK files, about which I wrote the book. The second most prolific file type bears the extension '.dg2'. Cursory examination of sample files revealed an apparently headerless format. Many of the video files are 288x144 in resolution. Multiplying that width by that height and then doubling it (as in, 2 bytes/pixel) yields 82944, which happens to be the size of a number of these DG2 files. Now, if only I had a tool that could take a suspected raw RGB file and convert it to a more standard image format.
Here's the FFmpeg conversion recipe I used:
ffmpeg -f rawvideo -pix_fmt rgb555 -s 288x144 -i raw_file -y output.png
So that covers the files that are suspected to be 288x144 in dimension. But what about other file sizes? My brute force approach was to try all possible dimensions that would yield a particular file size. The Python code for performing this operation is listed at the end of this post.
It's interesting to view the progression as the script converts to different sizes:
That ’D’ is supposed to be red. So right away, we see that rgb555(le) is not the correct input format. Annoyingly, FFmpeg cannot handle rgb555be as a raw input format. But this little project worked well enough as a proof of concept.
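One workaround (a minimal sketch of mine, not from the original post) is to byte-swap the raw file first, so that FFmpeg's little-endian rgb555 reader sees each sample in the expected order; the script name and arguments are placeholders:

# swap-rgb555.py (hypothetical helper): byte-swap a raw big-endian
# RGB555 file so FFmpeg's little-endian rgb555 reader decodes it correctly.
# Usage: swap-rgb555.py <infile> <outfile>; the file length is assumed even.
import sys

with open(sys.argv[1], "rb") as f:
    data = bytearray(f.read())

# exchange the two bytes of every 16-bit pixel in place
data[0::2], data[1::2] = data[1::2], data[0::2]

with open(sys.argv[2], "wb") as f:
    f.write(data)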
If you want to toy around with these files (and I know you do), I have uploaded a selection at : http://multimedia.cx/dg2/.
Here is my quick Python script for converting one of these files to every acceptable resolution.
work-out-resolution.py:

#!/usr/bin/python

import commands
import math
import os
import sys

FFMPEG = "/path/to/ffmpeg"

def convert_file(width, height, filename):
    # shell out to FFmpeg to interpret the file as one raw rgb555 frame
    # of the given dimensions and convert it to PNG
    outfile = "%s-%dx%d.png" % (filename, width, height)
    command = "%s -f rawvideo -pix_fmt rgb555 -s %dx%d -i %s -y %s" % (FFMPEG, width, height, filename, outfile)
    commands.getstatusoutput(command)

if len(sys.argv) < 2:
    print "USAGE: work-out-resolution.py <file>"
    sys.exit(1)

filename = sys.argv[1]
if not os.path.exists(filename):
    print filename + " does not exist"
    sys.exit(1)

# 2 bytes per pixel, so the candidate pixel count is half the file size
filesize = os.path.getsize(filename) / 2

# try every divisor pair (i, filesize/i) as candidate dimensions, in
# both orientations; only even pixel counts are considered
limit = int(math.sqrt(filesize)) + 1
for i in xrange(1, limit):
    if filesize % i == 0 and filesize & 1 == 0:
        convert_file(i, filesize / i, filename)
        convert_file(filesize / i, i, filename)
IJG swings again, and misses
1 February 2010, by Mans — Multimedia
Earlier this month the IJG unleashed version 8 of its ubiquitous libjpeg library on the world. Eager to try out the “major breakthrough in image coding technology” promised in the README file accompanying v7, I downloaded the release. A glance at the README file suggests something major indeed is afoot:
Version 8.0 is the first release of a new generation JPEG standard to overcome the limitations of the original JPEG specification.
The text also hints at the existence of a document detailing these marvellous new features, and a Google search later a copy has found its way onto my monitor. As I read, however, my state of mind shifts from an initial excited curiosity, through bewilderment and disbelief, finally arriving at pure merriment.
Already on the first page it becomes clear no new JPEG standard in fact exists. All we have is an unsolicited proposal sent to the ITU-T by members of the IJG. Realising that even the most brilliant of inventions must start off as mere proposals, I carry on reading. The summary informs me that I am about to witness the introduction of three extensions to the T.81 JPEG format:
- An alternative coefficient scan sequence for DCT coefficient serialization
- A SmartScale extension in the Start-Of-Scan (SOS) marker segment
- A Frame Offset definition in or in addition to the Start-Of-Frame (SOF) marker segment
Together these three extensions will, it is promised, “bring DCT based JPEG back to the forefront of state-of-the-art image coding technologies.”
Alternative scan
The first of the proposed extensions introduces an alternative DCT coefficient scan sequence to be used in place of the zigzag scan employed in most block transform based codecs.
Alternative scan sequence
The advantage of this scan would be that combined with the existing progressive mode, it simplifies decoding of an initial low-resolution image which is enhanced through subsequent passes. The author of the document calls this scheme “image-pyramid/hierarchical multi-resolution coding.” It is not immediately obvious to me how this constitutes even a small advance in image coding technology.
At this point I am beginning to suspect that our friend from the IJG has been trapped in a half-world between interlaced GIF images transmitted down noisy phone lines and today’s inferno of SVC, MVC, and other buzzwords.
(Not so) SmartScale
Disguised behind this camel-cased moniker we encounter a method which, we are told, will provide better image quality at high compression ratios. The author has combined two well-known (to us) properties in a (to him) clever way.
The first property concerns the perceived impact of different types of distortion in an image. When encoding with JPEG, as the quantiser is increased, the decoded image becomes ever more blocky. At a certain point, a better subjective visual quality can be achieved by down-sampling the image before encoding it, thus allowing a lower quantiser to be used. If the decoded image is scaled back up to the original size, the unpleasant, blocky appearance is replaced with a smooth blur.
The second property belongs to the DCT where, as we all know, the top-left (DC) coefficient is the average of the entire block, its neighbours represent the lowest frequency components etc. A top-left-aligned subset of the coefficient block thus represents a low-resolution version of the full block in the spatial domain.
In his flash of genius, our hero came up with the idea of using the DCT for down-scaling the image. Unfortunately, he appears to possess precious little knowledge of sampling theory and human visual perception. Any block-based resampling will inevitably produce sharp artefacts along the block edges. The human visual system is particularly sensitive to sharp edges, so this is one of the most unwanted types of distortion in an encoded image.
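To see why a top-left coefficient subset acts as a downscaler, here is a small numpy sketch (mine, not code from the proposal) that reduces an 8×8 block to 4×4 by keeping only the lowest-frequency quarter of its DCT coefficients:

# Sketch: DCT-domain 2x downscaling of one 8x8 block, using orthonormal
# DCT-II matrices built from the textbook definition.
import numpy as np

def dct_matrix(n):
    # orthonormal DCT-II basis: C[k, m] = s(k) * cos(pi*(2m+1)*k/(2n))
    k = np.arange(n).reshape(-1, 1)
    m = np.arange(n).reshape(1, -1)
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * m + 1) * k / (2 * n))
    c[0, :] = np.sqrt(1.0 / n)
    return c

C8 = dct_matrix(8)
C4 = dct_matrix(4)

block = np.random.rand(8, 8)        # stand-in for an 8x8 pixel block
coeffs = C8 @ block @ C8.T          # forward 8x8 DCT
low = coeffs[:4, :4] * (4.0 / 8.0)  # keep low frequencies, rescale
small = C4.T @ low @ C4             # inverse 4x4 DCT -> 4x4 "pixels"
print(np.round(small, 3))           # approximates a low-pass 2x downscale

The 4/8 factor compensates for the differing normalisation of the 8-point and 4-point orthonormal transforms, so a constant block keeps its value; it is also exactly where the block-edge artefacts described above creep in, since each block is resampled in isolation.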
Despite the obvious flaws in this approach, I decided to give it a try. After all, the software is already written, allowing downscaling by factors of 8/N for N = 8 through 16.
Using a 1280×720 test image, I encoded it with each of the nine scaling options, from unity to half size, each time adjusting the quality parameter for a final encoded file size of no more than 200000 bytes. The following table presents the encoded file size, the libjpeg quality parameter used, and the SSIM metric for each of the images.
Scale   Size     Quality   SSIM
8/8     198462   59        0.940
8/9     196337   70        0.936
8/10    196133   79        0.934
8/11    197179   84        0.927
8/12    193872   89        0.915
8/13    197153   92        0.914
8/14    188334   94        0.899
8/15    198911   96        0.886
8/16    197190   97        0.869

Although the smaller images allowed a higher quality setting to be used, the SSIM value drops significantly. Numbers may of course be misleading, but the images below speak for themselves. These are cut-outs from the full image: the original on the left, unscaled JPEG-compressed in the middle, and JPEG with 8/16 scaling on the right.
Looking at these images, I do not need to hesitate before picking the JPEG variant I prefer.
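Should anyone wish to reproduce this kind of measurement, a minimal sketch using scikit-image follows; the filenames are placeholders, and the article does not say which tool produced its SSIM figures:

# Sketch: SSIM between an original image and a JPEG round-trip,
# computed on the luma channel. Filenames are placeholders.
import numpy as np
from PIL import Image
from skimage.metrics import structural_similarity

original = np.asarray(Image.open("original.png").convert("L"))
decoded = np.asarray(Image.open("decoded.jpg").convert("L"))

# both images must have identical dimensions for the comparison
print("SSIM: %.3f" % structural_similarity(original, decoded))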
Frame offset
The third and final extension proposed is quite simple and also quite pointless: a top-left cropping to be applied to the decoded image. The alleged utility of this feature would be to enable lossless cropping of a JPEG image. In a typical image workflow, however, JPEG is only used for the final published version, so the need for this feature appears quite far-fetched.
The grand finale
Throughout the text, the author makes references to “the fundamental DCT property for image representation.” In his own words:
This property was found by the author during implementation of the new DCT scaling features and is after his belief one of the most important discoveries in digital image coding after releasing the JPEG standard in 1992.
The secret is to be revealed in an annex to the main text. This annex quotes in full a post by the author to the comp.dsp Usenet group, in a thread with the subject “why DCT”. Reading the entire thread proves quite amusing. A few excerpts follow.
The actual reason is much simpler, and therefore apparently very difficult to recognize by complicated-thinking people.
Here is the explanation :
What are people doing when they have a bunch of images and want a quick preview ? They use thumbnails ! What are thumbnails ? Thumbnails are small downscaled versions of the original image ! If you want more details of the image, you can zoom in stepwise by enlarging (upscaling) the image.
So with proper understanding of the fundamental DCT property, the MPEG folks could make their videos more scalable, but, as in the case of JPEG, they are unable to recognize this simple but basic property, unfortunately, and pursue rather inferior approaches in actual developments.
These are just phrases, and they don’t explain anything. But this is typical for the current state in this field : The relevant people ignore and deny the true reasons, and thus they turn in a circle and no progress is being made.
However, there are dark forces in action today which ignore and deny any fruitful advances in this field. That is the reason that we didn’t see any progress in JPEG for more than a decade, and as long as those forces dominate, we will see more confusion and less enlightenment. The truth is always simple, and the DCT *is* simple, but this fact is suppressed by established people who don’t want to lose their dubious position.
I believe a trip to the Total Perspective Vortex may be in order. Perhaps his tin-foil hat will save him.