
Other articles (42)

  • Permissions overridden by plugins

    27 April 2010, by

    Mediaspip core
    autoriser_auteur_modifier() so that visitors are able to modify their information on the authors page

  • Publishing on MediaSPIP

    13 June 2013

    Can I post content from an iPad tablet?
    Yes, if your MediaSPIP installation is at version 0.2 or higher. If needed, contact the administrator of your MediaSPIP to find out.

  • Adding notes and captions to images

    7 February 2011, by

    To be able to add notes and captions to images, the first step is to install the "Légendes" plugin.
    Once the plugin is activated, you can configure it in the configuration area to change the rights for creating, modifying, and deleting notes. By default, only site administrators can add notes to images.
    Changes when adding a media item
    When adding a media item of type "image", a new button appears above the preview (...)

On other sites (5197)

  • IJG swings again, and misses

    1 February 2010, by Mans — Multimedia

    Earlier this month the IJG unleashed version 8 of its ubiquitous libjpeg library on the world. Eager to try out the “major breakthrough in image coding technology” promised in the README file accompanying v7, I downloaded the release. A glance at the README file suggests something major indeed is afoot:

    Version 8.0 is the first release of a new generation JPEG standard to overcome the limitations of the original JPEG specification.

    The text also hints at the existence of a document detailing these marvellous new features, and a Google search later a copy has found its way onto my monitor. As I read, however, my state of mind shifts from an initial excited curiosity, through bewilderment and disbelief, finally arriving at pure merriment.

    Already on the first page it becomes clear no new JPEG standard in fact exists. All we have is an unsolicited proposal sent to the ITU-T by members of the IJG. Realising that even the most brilliant of inventions must start off as mere proposals, I carry on reading. The summary informs me that I am about to witness the introduction of three extensions to the T.81 JPEG format:

    1. An alternative coefficient scan sequence for DCT coefficient serialization
    2. A SmartScale extension in the Start-Of-Scan (SOS) marker segment
    3. A Frame Offset definition in or in addition to the Start-Of-Frame (SOF) marker segment

    Together these three extensions will, it is promised, “bring DCT based JPEG back to the forefront of state-of-the-art image coding technologies.”

    Alternative scan

    The first of the proposed extensions introduces an alternative DCT coefficient scan sequence to be used in place of the zigzag scan employed in most block transform based codecs.

    Alternative scan sequence

    The advantage of this scan would be that, combined with the existing progressive mode, it simplifies decoding of an initial low-resolution image which is enhanced through subsequent passes. The author of the document calls this scheme “image-pyramid/hierarchical multi-resolution coding.” It is not immediately obvious to me how this constitutes even a small advance in image coding technology.
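    For concreteness, the baseline being replaced is the familiar zigzag order. Below is a minimal C sketch of that serialization; the order table is the well-known jpeg_natural_order from libjpeg, while the helper function is merely my own illustration, not code from the proposal.

    /* Standard JPEG zigzag order: entry i is the raster position of the
     * i-th coefficient to be emitted. The proposed alternative scan would
     * simply substitute a different table here. */
    static const int zigzag[64] = {
         0,  1,  8, 16,  9,  2,  3, 10,
        17, 24, 32, 25, 18, 11,  4,  5,
        12, 19, 26, 33, 40, 48, 41, 34,
        27, 20, 13,  6,  7, 14, 21, 28,
        35, 42, 49, 56, 57, 50, 43, 36,
        29, 22, 15, 23, 30, 37, 44, 51,
        58, 59, 52, 45, 38, 31, 39, 46,
        53, 60, 61, 54, 47, 55, 62, 63
    };

    /* Serialize one 8x8 block of DCT coefficients in zigzag order. */
    static void serialize_block(const short block[64], short out[64])
    {
        for (int i = 0; i < 64; i++)
            out[i] = block[zigzag[i]];
    }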

    At this point I am beginning to suspect that our friend from the IJG has been trapped in a half-world between interlaced GIF images transmitted down noisy phone lines and today’s inferno of SVC, MVC, and other buzzwords.

    (Not so) SmartScale

    Disguised behind this camel-cased moniker we encounter a method which, we are told, will provide better image quality at high compression ratios. The author has combined two well-known (to us) properties in a (to him) clever way.

    The first property concerns the perceived impact of different types of distortion in an image. When encoding with JPEG, as the quantiser is increased, the decoded image becomes ever more blocky. At a certain point, a better subjective visual quality can be achieved by down-sampling the image before encoding it, thus allowing a lower quantiser to be used. If the decoded image is scaled back up to the original size, the unpleasant, blocky appearance is replaced with a smooth blur.

    The second property belongs to the DCT where, as we all know, the top-left (DC) coefficient is the average of the entire block, its neighbours represent the lowest-frequency components, and so on. A top-left-aligned subset of the coefficient block thus represents a low-resolution version of the full block in the spatial domain.
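    This property is easy to demonstrate. The sketch below, my own illustration rather than anything from the proposal, inverse-transforms the top-left 4×4 corner of an 8×8 block of orthonormal DCT coefficients to obtain a half-resolution version of the block; the 0.5 factor is the sqrt(M/N)-per-dimension amplitude correction for truncating an 8-point transform to 4 points.

    #include <math.h>

    #ifndef M_PI
    #define M_PI 3.14159265358979323846
    #endif

    /* Naive 2D orthonormal inverse DCT-II of an n x n coefficient block. */
    static void idct2d(int n, const double *coef, double *out)
    {
        for (int y = 0; y < n; y++)
            for (int x = 0; x < n; x++) {
                double s = 0.0;
                for (int v = 0; v < n; v++)
                    for (int u = 0; u < n; u++) {
                        double cu = u ? sqrt(2.0 / n) : sqrt(1.0 / n);
                        double cv = v ? sqrt(2.0 / n) : sqrt(1.0 / n);
                        s += cu * cv * coef[v * n + u]
                           * cos(M_PI * (2 * x + 1) * u / (2.0 * n))
                           * cos(M_PI * (2 * y + 1) * v / (2.0 * n));
                    }
                out[y * n + x] = s;
            }
    }

    /* Reduce one 8x8 DCT coefficient block to a half-resolution
     * 4x4 spatial block. */
    void dct_downscale_8to4(const double coef8[64], double out4[16])
    {
        double sub[16];
        for (int v = 0; v < 4; v++)
            for (int u = 0; u < 4; u++)
                sub[v * 4 + u] = 0.5 * coef8[v * 8 + u]; /* sqrt(4/8) per axis */
        idct2d(4, sub, out4);
    }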

    In his flash of genius, our hero came up with the idea of using the DCT for down-scaling the image. Unfortunately, he appears to possess precious little knowledge of sampling theory and human visual perception. Any block-based resampling will inevitably produce sharp artefacts along the block edges. The human visual system is particularly sensitive to sharp edges, so this is one of the most unwanted types of distortion in an encoded image.

    Despite the obvious flaws in this approach, I decided to give it a try. After all, the software is already written, allowing downscaling by factors from 8/8 through 8/16.

    Using a 1280×720 test image, I encoded it with each of the nine scaling options, from unity to half size, each time adjusting the quality parameter for a final encoded file size of no more than 200000 bytes. The following table presents the encoded file size, the libjpeg quality parameter used, and the SSIM metric for each of the images.

    Scale   Size (bytes)   Quality   SSIM
    8/8     198462         59        0.940
    8/9     196337         70        0.936
    8/10    196133         79        0.934
    8/11    197179         84        0.927
    8/12    193872         89        0.915
    8/13    197153         92        0.914
    8/14    188334         94        0.899
    8/15    198911         96        0.886
    8/16    197190         97        0.869

    Although the smaller images allowed a higher quality setting to be used, the SSIM value drops significantly. Numbers may of course be misleading, but the images below speak for themselves. These are cut-outs from the full image, the original on the left, unscaled JPEG-compressed in the middle, and JPEG with 8/16 scaling to the right.

    Looking at these images, I do not need to hesitate before picking the JPEG variant I prefer.
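    (For anyone wishing to repeat the experiment: SSIM itself is straightforward. Below is a minimal single-window sketch of the index with the customary constants C1 = (0.01·255)² and C2 = (0.03·255)²; the published metric averages this over small local windows rather than computing one global window, so treat this as the core formula only.)

    /* Single-window SSIM over n co-located grayscale samples. */
    double ssim_global(const unsigned char *a, const unsigned char *b, int n)
    {
        const double C1 = 6.5025, C2 = 58.5225;
        double ma = 0, mb = 0, va = 0, vb = 0, cov = 0;

        for (int i = 0; i < n; i++) { ma += a[i]; mb += b[i]; }
        ma /= n; mb /= n;

        for (int i = 0; i < n; i++) {
            va  += (a[i] - ma) * (a[i] - ma);
            vb  += (b[i] - mb) * (b[i] - mb);
            cov += (a[i] - ma) * (b[i] - mb);
        }
        va /= n - 1; vb /= n - 1; cov /= n - 1;

        return ((2 * ma * mb + C1) * (2 * cov + C2))
             / ((ma * ma + mb * mb + C1) * (va + vb + C2));
    }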

    Frame offset

    The third and final extension proposed is quite simple and also quite pointless: a top-left cropping to be applied to the decoded image. The alleged utility of this feature would be to enable lossless cropping of a JPEG image. In a typical image workflow, however, JPEG is only used for the final published version, so the need for this feature appears quite far-fetched.

    The grand finale

    Throughout the text, the author makes references to “the fundamental DCT property for image representation.” In his own words:

    This property was found by the author during implementation of the new DCT scaling features and is after his belief one of the most important discoveries in digital image coding after releasing the JPEG standard in 1992.

    The secret is to be revealed in an annex to the main text. This annex quotes in full a post by the author to the comp.dsp Usenet group in a thread with the subject “why DCT”. Reading the entire thread proves quite amusing. A few excerpts follow.

    The actual reason is much simpler, and therefore apparently very difficult to recognize by complicated-thinking people.

    Here is the explanation:

    What are people doing when they have a bunch of images and want a quick preview? They use thumbnails! What are thumbnails? Thumbnails are small downscaled versions of the original image! If you want more details of the image, you can zoom in stepwise by enlarging (upscaling) the image.

    So with proper understanding of the fundamental DCT property, the MPEG folks could make their videos more scalable, but, as in the case of JPEG, they are unable to recognize this simple but basic property, unfortunately, and pursue rather inferior approaches in actual developments.

    These are just phrases, and they don’t explain anything. But this is typical for the current state in this field: The relevant people ignore and deny the true reasons, and thus they turn in a circle and no progress is being made.

    However, there are dark forces in action today which ignore and deny any fruitful advances in this field. That is the reason that we didn’t see any progress in JPEG for more than a decade, and as long as those forces dominate, we will see more confusion and less enlightenment. The truth is always simple, and the DCT *is* simple, but this fact is suppressed by established people who don’t want to lose their dubious position.

    I believe a trip to the Total Perspective Vortex may be in order. Perhaps his tin-foil hat will save him.

  • Ode to the Gravis Ultrasound

    1 August 2011, by Multimedia Mike — General

    WARNING: This post is a bunch of nostalgia. Feel free to follow along if you recall the DOS days of the early-to-mid 1990s.

    I finally let go of my Gravis Ultrasound MAX sound card a little while ago. It felt like the end of an era for me, even though I had scarcely used the card in recent memory.



    The Beginning
    What is the Gravis Ultrasound? Only the finest PC sound card from the classic DOS days. Back in the day (very early 1990s), most consumer PC sound cards were Yamaha OPL FM synthesizers paired with a basic digital-to-analog converter (DAC). Gravis, a company known for game controllers, dared to break with the dominant paradigm of Sound Blaster clones and create a sound card that had 32 digital channels.

    I heard about the GUS sometime in 1992 through one of the dominant online services at the time, Prodigy. Through the message boards, I learned of a promotion with Electronic Arts in which customers could pre-order a GUS at a certain discount along with 2 EA games from a selected catalog (with progressive discounts when ordering more games from the list). I know I got the DOS version of PowerMonger; I think the other was Night Shift, though that doesn’t seem to be an EA title.

    Anyway, 1992 saw many maddening delays of the GUS hardware. Finally, reports of GUS shipments began to trickle into the Prodigy message forums. Then one day in November, 1992, mine arrived. Into the 286 machine it went and a valiant attempt at software installation was made. A friend and I fought with the software late into the evening, trying to make this thing work reasonably. I remember grabbing a pair of old headphones sitting near the computer that were used for an ancient (even for the time) portable radio. That was the only means of sound reproduction we had available at that moment. And it still sounded incredible.

    After graduating to progressively superior headphones, I would later return to that original pair only to feel my ears were being physically assaulted. Strange, they sounded fine that first night I was trying to make the GUS work. I guess this was my first understanding that the degree to which one is a snobby audiophile is all a matter of hard-earned experience.

    Technology
    The GUS was powered by something called a GF1 which was supposed to use a technology called wavetable synthesis. In the early days, I thought (and I wasn’t alone in this) that this meant that the GF1 chip had a bunch of digitized instrument samples stored in the ASIC. That wasn’t it.

    However, it did feature 32 digital channels at a time when most PC audio cards had 2 (plus that Yamaha FM synthesizer). There was some hemming and hawing about how the original GUS couldn’t drive all 32 channels at a full 44.1 kHz ("CD quality") playback rate. It’s true: if 14 channels were enabled, all could be played at 44.1 kHz. Enabling more channels caused progressive degradation, and with all 32 channels active, each was only playing at around 19 kHz. Still, from my emerging game programmer perspective, that allowed for 8-channel tracker music and 6 channels of sound effects, all at the vaunted CD level of quality.
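    (As I remember it from the GUS documentation, the per-voice rate is simply anchored at 44.1 kHz for 14 voices and scales inversely with the active voice count; the little sketch below is my recollection offered as an illustration, not a quote from the SDK.)

    /* Per-voice GF1 playback rate as a function of active voices (14..32),
     * anchored at 44.1 kHz for 14 voices. 32 voices gives ~19294 Hz,
     * matching the "around 19 kHz" figure above. */
    double gf1_voice_rate(int active_voices)
    {
        if (active_voices < 14) active_voices = 14;  /* hardware floor */
        return 44100.0 * 14 / active_voices;
    }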

    Games and Compatibility
    The primary reason to have a discrete sound card was for entertainment applications — ahem, games. GUS support was pretty sketchy out of the gate (ostensibly a major reason for the card’s delay). While many sound cards offered Sound Blaster emulation by basically having the same hardware as Sound Blaster cards, the GUS took a software route towards emulating the SB. To do this required a program called the Sound Blaster Operating System, or SBOS.

    Oh, how awesome it was to hear the program exclaim "SBOS installed!" And how harshly it grated on your nerves after the 200th time hearing it, due to so many reboots and fiddling with options to make your games work. Also, I’ve always wondered if there’s something special about sampling an ’s’ sound — does it strain the sampling frequency range? Perhaps the phrase was sampled at too low a sample rate, because the ’s’ sounds didn’t come through very clearly, which is something you notice after hundreds of iterations when there are 3 ’s’ sounds in the phrase.

    Fortunately, SBOS became less relevant with the advent of Mega-Em, a separate emulator which intercepted calls to Roland MIDI systems and routed them to the very capable GUS. Roland-supporting games sounded beautiful.

    Eventually, more and more DOS games were released with native Gravis support, sometimes with the help of The Miles Sound System (from our friends at Rad Game Tools — you know, the people behind Smacker and Bink). The library changelog is quite the trip down PC memory lane.

    An important area where the GUS shined brightly was that of demos and music trackers. The emerging PC demo scene embraced the powerful GUS (aided, no doubt, by Gravis’ sponsorship of the community) and the coolest computer art and music of the time natively supported the card.

    Programming
    At this point in my life, I was a budding programmer in high school and was fairly intent on programming video games. So far, I had figured out how to make a few blips using a borrowed Sound Blaster card. I went to great lengths to learn how to program the Gravis Ultrasound.

    Oh you kids today, with your easy access to information at the tips of your fingers thanks to Google and the broader internet. I had to track down whatever information I could find through a combination of Prodigy message boards and local dialup BBSes and FidoNet message bases. Gravis was initially tight-lipped about programming information for its powerful card, as was de rigueur for hardware companies (something that largely persists to this day). But Gravis eventually saw an opportunity to one-up the incumbent Creative Labs and released a full SDK for the Ultrasound. I wanted the SDK badly.

    So it was early-to-mid 1993. Gravis released an SDK. I heard that it was available on their support BBS. Their BBS with a long-distance phone number. If memory serves, the SDK was only in the neighborhood of 1.5 Mbytes. That takes a long time to transfer via a 2400 baud modem at a time when long-distance phone charges were still a thing and not insubstantial.

    Luckily, they also put the SDK on something called an ’FTP site’. As it happened, about this time I had the opportunity to get some internet access via the local university.

    Indeed, my entire motivation for initially wanting to get on the internet was to obtain special programming information. Is that nerdy enough for you?

    I see that the GUS SDK is still available via the Gravis FTP site. The file GUSDK222.ZIP is dated 1998 and is less than a megabyte.

    Next Generation : CD Support
    So I had my original GUS by the end of 1992. That was just the first iteration of the Gravis Ultrasound. The next generation was the GUS MAX. When I was ready to get into the CD-ROM era, this was what I wanted in my computer, because the GUS MAX had CD-ROM support. This is odd to think about now, when all optical drives have SATA interfaces (and (P)ATA interfaces before that). What did CD-ROM compatibility mean back then? I wasn’t quite sure. But in early 1995, I headed over to Computer City (R.I.P.) and bought a new GUS MAX and a Sony double-speed CD-ROM drive to install in the family’s PC.



    About the "CD-ROM compatibility": it seems that there were numerous competing interfaces in the early days of CD-ROM technology. The GUS MAX simply integrated 3 different CD-ROM controllers onto the audio card. This was superfluous to me since the Sony drive came with an appropriate controller card anyway, though I didn’t figure out that the extra controller card was unnecessary until after I installed it. No matter; computers of the day were rife with expansion slots.



    The 3 different CD-ROM controllers on the GUS MAX

    Explaining The Difference
    It was difficult to explain the difference in quality to those who didn’t really care. Sometime during 1995, I picked up a quasi-promotional CD-ROM called "The Gravis Ultrasound Experience" from Babbage’s computer store (remember when that was a thing?). As most PC software had been distributed on floppy disks up until this point, this CD-ROM was an embarrassment of riches: tons of game demos, scene demos, tracker music, and all the latest GUS drivers and support software.

    Further, the CD-ROM had a number of Red Book CD audio tracks that illustrated the difference between Sound Blaster cards and the GUS. I remember loaning this to a tech-savvy coworker who disbelieved how awesome the GUS was. The coworker took it home, listened to it, and wholly agreed that the GUS audio sounded better than the SB audio in the comparison — and was thoroughly confused, because she was hearing this audio emanating from her Sound Blaster. It was the difference between real-time and pre-rendered audio, I suppose, but I failed to convey that message. I imagine the same issue comes up even today regarding real-time video rendering vs., e.g., a pre-rendered HD cinematic posted on YouTube.

    Regrettably, I can’t find that CD-ROM anymore, which leads me to believe that the coworker never gave it back. Too bad, because it was quite the treasure trove.

    Aftermath
    According to folklore I’ve heard, Gravis couldn’t keep up as the world changed to Windows and failed to deliver decent drivers. Indeed, I remember trying to keep my GUS in service under Windows 95 well into 1998 but eventually relented and installed some kind of more appropriate sound card that was better supported under Windows.

    Of course, audio output capability has been standard issue for any PC for at least 10 years, and many people aren’t even aware that discrete sound cards still exist. Real-time audio rendering has become less essential as full musical tracks can be composed, rendered to PCM, and delivered with the near-limitless space afforded by optical storage.

    A few years ago, it was easy to pick up old GUS cards on eBay for cheap. As of this writing, there are only a few, and they’re pricey (but perhaps not selling). Maybe I was just viewing during the trough of no value a few years ago.

    Nowadays, of course, anyone interested in studying the old GUS or getting a nostalgia fix need only boot up the always-excellent DOSBox emulator which provides remarkable GUS emulation support.

  • Introducing WebM, an open web media project

    20 May 2010, by noreply@blogger.com (christosap)

    A key factor in the web’s success is that its core technologies such as HTML, HTTP, TCP/IP, etc. are open and freely implementable. Though video is also now core to the web experience, there is unfortunately no open and free video format that is on par with the leading commercial choices. To that end, we are excited to introduce WebM, a broadly-backed community effort to develop a world-class media format for the open web.

    WebM includes:

    • VP8, a high-quality video codec we are releasing today under a BSD-style, royalty-free license
    • Vorbis, an already open source and broadly implemented audio codec
    • a container format based on a subset of the Matroska media container

    The team that created VP8 have been pioneers in video codec development for over a decade. VP8 delivers high quality video while efficiently adapting to the varying processing and bandwidth conditions found on today’s broad range of web-connected devices. VP8’s efficient bandwidth usage will mean lower serving costs for content publishers and high quality video for end-users. The codec’s relative simplicity makes it easy to integrate into existing environments and requires less manual tuning to produce high quality results. These existing attributes and the rapid innovation we expect through the open-development process make VP8 well suited for the unique requirements of video on the web.

    A developer preview of WebM and VP8, including source code, specs, and encoding tools is available today at www.webmproject.org.

    We want to thank the many industry leaders and web community members who are collaborating on the development of WebM and integrating it into their products. Check out what Mozilla, Opera, Google Chrome, Adobe, and many others below have to say about the importance of WebM to the future of web video.


    Telestream