Summary: While ATI and NVIDIA rule the 3D graphics market today, it wasn't always this way. In this article we've examined four promising graphics chips you might not know about, detailing what made them unique, and ultimately why they failed. Learn more about the history of the 3D graphics market by reading this article!
Sometimes, the best stories are the ones from the past that have been forgotten. If Nintendo had been less cocky, it would have had the Sony PlayStation under its own brand. The Soviet MiG-15s that (initially) ruled the skies over Korea would not have existed if Rolls-Royce had not sold the Soviets the plans to its jet engine (reportedly after the company's head was hustled in a game of pool).
Recently, while cleaning out my hardware closet, I was reminded that the 3D graphics industry is also rich with many stories that have not yet been told.
Some stories you know, such as our world exclusive NVIDIA NV2 article, our feature behind the fall of 3dfx, a look at what happened to the original Hercules, and even the true story of the Bitboys. Some stories will remain untold for a few more years… some stories need storytellers, but today, we'll go back and take a look at the graphics products that deserve a spot in the history books.
We'll be looking at products from PowerVR, S3, Matrox, ATI, and NVIDIA -- we've already written everything we know about 3dfx.
Today, ATI and NVIDIA dominate the graphics industry. It wasn't always like this -- in fact, it was the complete opposite. In this article we're going back to the heyday of 3D graphics to look at the forgotten stories of the era and see how they ultimately shaped the gaming industry of today. Think of it as a window back in time, or a techie version of E! True Hollywood Story.
SIDEBAR: Yamaha made a texture-less 3D graphics accelerator.
The PowerVR PCX and PCX2
Everyone remembers the 3Dfx Voodoo Graphics as the first "real" high-end 3D graphics accelerator, with a blistering fill rate of 45MTexels/sec. The PowerVR PCX graphics chip, developed by VideoLogic and NEC, could have been even faster. With a Pentium II/300, the PowerVR offered fill rates of 66MTexels/sec, and its deferred rendering technology made the effective figure even higher.
Why it could have been a contender
The PowerVR PCX was a dedicated 3D graphics PCI card with 4MB of SDRAM. This memory was used exclusively for textures, because the PowerVR transferred its images directly to the main 2D graphics card's framebuffer. While cards such as the Voodoo Graphics could only deal with 640x480 full-screen 3D, the PowerVR supported windowed 3D and resolutions as high as 1024x768. The PowerVR PCX actually had the fillrate to support these high resolutions, because it was a tile-based scene renderer. Unlike other graphics chips at the time, which drew all of the triangles as they came in, the PowerVR only drew the visible ones. With a baseline fillrate of 66MPixels/sec, it wasn't uncommon to see real-world fillrates approaching the magic 100MPixel/sec mark.
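To get a feel for why hidden-surface removal translates into fill rate, here's a back-of-the-envelope sketch (in Python, with hypothetical overdraw figures -- not measured PowerVR numbers) of how a deferred renderer's 66MPixel/sec baseline stacks up against a conventional renderer that has to texture every submitted pixel:

```python
# Minimal sketch: how deferred (tile-based) rendering turns hidden-surface
# removal into "free" fill rate. The overdraw figures are assumptions for
# illustration, not measured PowerVR numbers.

def effective_fillrate(baseline_mpixels, overdraw):
    """A conventional renderer textures every submitted pixel, so it needs
    baseline * overdraw fill rate to match a deferred renderer that only
    textures the pixels which end up visible."""
    return baseline_mpixels * overdraw

if __name__ == "__main__":
    baseline = 66.0                      # PCX baseline fill rate, MPixels/sec
    for overdraw in (1.25, 1.5, 2.0):    # hypothetical average scene overdraw
        print(f"overdraw {overdraw:.2f}x -> equivalent to "
              f"{effective_fillrate(baseline, overdraw):.0f} MPixels/sec "
              "on an immediate-mode renderer")
```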
So what happened?
Well, one disadvantage of the infinite-plane approach was that it took CPU power to translate OpenGL/Direct3D triangles into infinite planes. However, the PowerVR's first chip, the PCX, was doomed from the very beginning -- it lacked bilinear filtering. The PCX1 offered hardware perspective correction and true-color textures, but without bilinear filtering everything looked blocky.
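For readers who've forgotten what the fuss was about, here's a rough illustration of the difference between point sampling and bilinear filtering -- a toy Python sketch with a made-up 2x2 texture, not how the hardware actually computes it:

```python
# A rough sketch of point sampling versus bilinear filtering on a tiny
# grayscale "texture" (nested lists). Real hardware works in fixed point per
# color channel; this is only meant to show why point sampling looks blocky
# when a texture is magnified.
from math import floor

def point_sample(tex, u, v):
    # Snap to the nearest texel -- neighboring screen pixels that map into
    # the same texel all get the same color, producing visible blocks.
    x = min(int(round(u)), len(tex[0]) - 1)
    y = min(int(round(v)), len(tex) - 1)
    return tex[y][x]

def bilinear_sample(tex, u, v):
    # Blend the four surrounding texels, weighted by distance.
    x0, y0 = floor(u), floor(v)
    x1 = min(x0 + 1, len(tex[0]) - 1)
    y1 = min(y0 + 1, len(tex) - 1)
    fx, fy = u - x0, v - y0
    top = tex[y0][x0] * (1 - fx) + tex[y0][x1] * fx
    bottom = tex[y1][x0] * (1 - fx) + tex[y1][x1] * fx
    return top * (1 - fy) + bottom * fy

texture = [[0, 255], [255, 0]]            # hypothetical 2x2 texture
print(point_sample(texture, 0.4, 0.4))    # 0    (snaps to a single texel)
print(bilinear_sample(texture, 0.4, 0.4)) # ~122 (smooth blend)
```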
Here's where the story gets more interesting. The PCX was supposed to have bilinear filtering. The engineers knew it was an important quality feature, but during chip development the transistor block required for bilinear filtering was inadvertently left out by an engineer. By the time the mistake was caught, the decision was made that it would be too costly and too time-consuming to fix. They had to go with the flawed chip.
UPDATE 1/22/04: The interesting story is actually why the PCX went with per-pixel linear MIP mapping instead of bilinear filtering. (Note that per-pixel, as opposed to per-polygon, MIP mapping was also a key point of the PowerVR design.) We've heard stories of how the PCX could have had bilinear filtering, and that the engineers knew it was possible to add the feature but overlooked it.
In the first run of this article, we erroneously went with the "bilinear filtering was left out" version, but we were corrected by one of the architects of the PowerVR, a trusted resource. It is a fact that per-pixel linear MIP mapping, not bilinear filtering, was always the PCX1's design choice. Our PowerVR stories come from current and ex-VideoLogic engineers as well as board manufacturers courted by PowerVR. In our opinion, the most likely explanation is that an overzealous individual somewhere along the chain between engineering and a third-party board manufacturer exaggerated the story. That said, the fact that the story even exists reflects the desire that many had for PowerVR to succeed.
SIDEBAR: What does SGL in PowerSGL stand for? No one can agree. The VideoLogic team in the UK claims that during development it was called PowerRGL, for "rendering graphics library." When it came out of beta, they simply went to the next letter, S instead of R. The other half of the PowerVR team, NEC, insists that it stood for "Super Graphics Library." It's worth noting that NEC used to have a 16-bit console called the SuperGrafx.
After 3Dfx showed the world the importance of bilinear filtering, there was no way the PCX could sell. The PCX2 was essentially the bug-fixed PCX, offering minor performance tweaks and adding bilinear filtering. True to VideoLogic's word, bilinear filtering on the PCX2 had no performance hit, and the chip even found a design win with the Matrox m3D, a rare case of Matrox going outside its own engineering teams. It's too bad, because the PCX2 is what the original product should have been.
The other issue that PowerVR owners will never forget is src*dst. This is a texture blending technique supported by the Voodoo, which allowed for "colored lighting" and cool explosion effects. Without this hardware, games like Quake II didn't have the same image quality on the PowerVR that was possible on other cards. This wasn't a mistake so much as a failure to correctly predict the future of 3D graphics.
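If you've never seen the blend mode written out, here's a minimal sketch of what src*dst means -- the incoming color is multiplied against what's already in the framebuffer, which is how colored lightmaps tint the textures underneath them. The RGB values are hypothetical:

```python
# A minimal sketch of the src*dst blend mode: the incoming fragment color is
# multiplied with what is already in the framebuffer, per channel. This is the
# operation that lets colored lighting darken and tint an underlying wall
# texture. Values are hypothetical 8-bit RGB triples.

def modulate(src, dst):
    # out = src * dst, per channel, normalized back to the 0..255 range
    return tuple((s * d) // 255 for s, d in zip(src, dst))

wall_texel = (200, 180, 160)   # base texture already in the framebuffer
red_light  = (255, 64, 64)     # colored-lighting pass
print(modulate(red_light, wall_texel))   # -> (200, 45, 40): a red-tinted wall
```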
Why is this historically significant?
The PowerVR Series 2 was supposed to be a major leap in performance and quality, offering 2D/3D/MPEG acceleration in a single chip. It was to combine all of the deferred rendering benefits of the original PowerVR with all of the texture blend effects necessary for 100% game compatibility. The PowerVR Series 2 found its way into the Sega Dreamcast, producing graphics competitive with many PS2 and GameCube games. Of course, the PC version, the PowerVR Series 2 (Neon 250), was also known as perhaps the most delayed piece of PC hardware that eventually shipped. Most of the bugs were in the 2D/VGA component of the graphics chip. After the mistake of rushing out the PCX1, VideoLogic may have gone a bit too far in trying to perfect the Neon 250, respinning the silicon many, many times to fix little mistakes. In Spring '99, the third tape-out reportedly had only one severe bug, in the hardware mouse cursor. In retrospect, things may have turned out better if they had shipped the card with a software mouse cursor and discounted the chips accordingly. When Win2K came out, lots of cards couldn't do the hardware mouse shadow and no one really cared...
NEC would ultimately drop out of the PowerVR partnership, leaving STMicroelectronics to work with VideoLogic to co-develop the PowerVR Series 3 (Kyro). VideoLogic (now Imagination Technologies) will probably never play a big role in high-performance desktop 3D graphics again (although they're rumored to be working on a next-generation high-end part as we speak), but that's not to say they're unsuccessful. Their lightweight 3D, video, and display technology is now licensed for ARM-based designs and powers commercial displays.
The next item I'm featuring is the S3 Savage3D, a card that very few people remember or consider important. S3 used to be the leader of the 2D graphics industry, producing very fast graphics cards with world-class drivers. In the very earliest stages of 3D graphics, the S3 ViRGE was a strong contender. After 3Dfx showed the world what gaming 3D was all about, S3 found itself struggling to produce a competitive chip. They missed a product cycle, but eventually produced the Savage3D, which was designed to compete against the Voodoo2 at a lower cost.
Very few people actually owned a Savage3D, primarily because it was late to market. This was further compounded by the fact that the Savage3D did not have a multitexturing engine; many felt that the Savage3D would offer excellent performance in the games available at the time but would go obsolete quickly. The Savage3D was superior to the Voodoo Banshee in nearly all respects (except for Glide support), and was very competitive with NVIDIA's Riva TNT in Direct3D.
When it came to OpenGL, however, NVIDIA's drivers were vastly superior, and for the Quake II-playing gamer, S3 wasn't a contender. If you had to point to when NVIDIA's reputation for quality drivers began, it was clearly with the Riva TNT.
That said, even though the Savage3D never made a visible impact in the 3D industry, it is a graphics chip that truly deserves a spot in the history books.
A Technological Wonder
When I think about the Savage3D, I think of three technological advantages: S3TC, Void-cluster dithering, and world-class DVD quality.
S3 Texture Compression
The Savage3D was the first graphics card to support S3TC, backed by an optimized AGP 2X implementation. With S3TC, it was possible to have games with large 1024x1024 or even 2048x2048 textures running as fast as a Voodoo2 limited to 256x256 textures. The 64x increase in detail was amazing, and despite having only 8MB of SGRAM, the AGP implementation on the Savage3D worked superbly with large amounts of compressed textures.
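Some quick, back-of-the-envelope math shows why this mattered. S3TC's DXT1-style mode stores each 4x4 block of texels in 64 bits, which works out to 4 bits per texel:

```python
# Back-of-the-envelope math for why S3TC mattered: DXT1-style S3TC stores each
# 4x4 texel block in 64 bits, i.e. 4 bits per texel, versus 16 bits per texel
# for an uncompressed 16-bit texture.

def texture_bytes(width, height, bits_per_texel):
    return width * height * bits_per_texel // 8

uncompressed_256 = texture_bytes(256, 256, 16)    # what a Voodoo2 would hold
compressed_2048  = texture_bytes(2048, 2048, 4)   # the same artwork at 8x the
                                                  # linear resolution, S3TC-compressed
print(f"256x256 @ 16bpp : {uncompressed_256 // 1024} KB")      # 128 KB
print(f"2048x2048 S3TC  : {compressed_2048 // 1024} KB")       # 2048 KB (2 MB)
print(f"detail increase : {2048*2048 // (256*256)}x texels")   # 64x
```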
Although all current graphics cards support S3TC (DXTC) texture compression, we still don't see many games featuring large 2Kx2K textures. Part of the problem stemmed from differing implementations of S3TC from vendor to vendor in the early days (no longer an issue). Today, the biggest problem is that high-resolution textures require a significant financial investment on the game development side.
SIDEBAR: The original Savage3D supported up to 8MB of memory
Void-Cluster Dithering
Back then, most 3D games had to be run in 16-bit color due to performance issues. The Savage3D featured the most advanced 16-bit dithering implementation of its day. Most manufacturers were using a conventional ordered dither, an approach that produces a cross-hatched appearance often called a "cheesecloth effect." Depending on the graphics card manufacturer, the cheesecloth was more or less pronounced.
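Here's a toy example of a conventional ordered dither (quantizing an 8-bit gray value to 5 bits with a standard 4x4 Bayer matrix -- a generic illustration, not any specific vendor's implementation). Because the thresholds repeat every few pixels, the rounding error forms a regular cross-hatch; void-cluster dithering swaps the small repeating matrix for a large, noise-like threshold pattern:

```python
# A toy ordered dither: quantizing an 8-bit gray value down to 5 bits using a
# 4x4 Bayer threshold matrix. Because the thresholds repeat in a fixed 4x4
# pattern, the rounding error shows up as the regular cross-hatch
# ("cheesecloth") pattern described above.

BAYER_4x4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def ordered_dither_5bit(value, x, y):
    """Quantize an 8-bit value (0-255) to 5 bits (0-31) with ordered dither."""
    step = 255 / 31                                    # distance between 5-bit levels
    threshold = (BAYER_4x4[y % 4][x % 4] + 0.5) / 16   # 0..1 threshold
    level = int(value / step + threshold)              # dither decides rounding direction
    return min(level, 31)

# A flat gray of value 103 comes out as an alternating pattern of levels 12
# and 13 rather than a single level -- averaged, it is closer to the true shade.
row = [ordered_dither_5bit(103, x, 0) for x in range(8)]
print(row)   # [12, 13, 12, 13, 12, 13, 12, 13]
```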
In single-textured 16-bit modes, void-cluster dithering was excellent. When it came to multitextured games, however, even the best 16-bit algorithm couldn't compete with the internal multitexturing found in the Riva TNT.
DVD Subpicture Blending
Everyone thinks of ATI when it comes to DVD quality, but the Savage3D was in fact superior to the Rage Pro and Rage 128. The Savage3D was the first graphics accelerator to support hardware subpicture blending, important for reproducing DVD menu highlights and subtitles with the same look as a stand-alone DVD player. In addition, the scaler featured in the Savage3D performed well in both upsample and downsample modes. This was important for 640x480 TV output of DVD movies, and particularly for the anamorphic downsample.
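Conceptually, subpicture blending is just a per-pixel composite of the subtitle/highlight plane over the decoded video frame. The sketch below is a simplification (the 0-15 transparency scale mirrors DVD subpicture contrast values, but the exact hardware path isn't something S3 documented publicly):

```python
# A simplified sketch of subpicture blending: the DVD subpicture (subtitles,
# menu highlights) is composited over the decoded video frame per pixel using
# the subpicture's transparency value. Doing this in hardware is what makes
# menus and subtitles look like they do on a set-top player.

def blend_subpicture(video_px, sub_px, contrast):
    """contrast: 0 = fully transparent subpicture, 15 = fully opaque."""
    return tuple(
        (s * contrast + v * (15 - contrast)) // 15
        for v, s in zip(video_px, sub_px)
    )

video_pixel = (40, 90, 140)        # a pixel from the decoded movie frame
subtitle_pixel = (255, 255, 255)   # white subtitle text
print(blend_subpicture(video_pixel, subtitle_pixel, 12))  # mostly-opaque text
```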
Despite the superb feature set of the Savage3D, the lack of multitexturing and tepid OpenGL performance made the Riva TNT a much stronger product. S3 would redesign the Savage3D into a cost-reduced version, the Savage4, for the next product cycle. The Savage4 dropped the expensive void-cluster dithering but preserved the exceptional DVD quality of the original Savage3D. By then the TNT2 had launched and the Savage4's performance was merely mainstream, but its dirt-cheap price made it a very popular card. Even NVIDIA reps were happy for S3; back then they weren't targeting the low-cost market.
The other Canadian graphics company, Matrox, made their mark with the gaming community in 1999 with the Matrox Millennium G400 and G400 Max. With this card, Matrox briefly held the 3D performance and quality crown. It had environment-mapped bump mapping, fast memory, great image quality, and dual-head capability. In the end, though, it would be the G400's TV-out support that keeps the board in our labs today.
Even today, most TV-out implementations are compared to the G400. Matrox was able to output a clean 10-bit 1024x768 signal through the S-Video output, when others at the time could only do 800x600. The software engineers, recognizing that people usually ran their monitors at higher resolutions, added a downsampling feature to the TV-out.
This meant that if you ran your desktop at 1600x1200, the drivers would automatically downsample the image to 1024x768 before sending it out to the TV chip. You'd be able to maintain full quality on the desktop and see the best possible output on the TV. One thing to know is that the G400's TV-out worked perfectly in DOS and at the BIOS screen; as a cost-cutting measure, the G450's TV-out wasn't as adept outside of Windows. Don't forget, the G400's TV-out chip was so complex that it needed its own heatsink…
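The idea behind the driver-side downsampling is simple: each TV pixel is built from the block of desktop pixels it covers, rather than by cropping or skipping pixels. The box filter below is an assumption for illustration only; Matrox never documented the exact filter they used:

```python
# A rough sketch of driver-side downsampling: every pixel of the TV image is
# produced by averaging the block of desktop pixels that maps onto it, instead
# of just cropping or point-sampling the desktop. The box filter here is an
# assumption -- it simply shows why the TV picture keeps detail from a higher
# resolution desktop.

def downsample(src, src_w, src_h, dst_w, dst_h):
    """src is a flat list of grayscale pixels, row-major."""
    dst = []
    for dy in range(dst_h):
        for dx in range(dst_w):
            # Source rectangle covered by this destination pixel.
            x0 = dx * src_w // dst_w
            x1 = max((dx + 1) * src_w // dst_w, x0 + 1)
            y0 = dy * src_h // dst_h
            y1 = max((dy + 1) * src_h // dst_h, y0 + 1)
            block = [src[y * src_w + x] for y in range(y0, y1) for x in range(x0, x1)]
            dst.append(sum(block) // len(block))
    return dst

# Toy example: an 8x8 "desktop" reduced to 5x5, mirroring the 1600x1200 ->
# 1024x768 case only in spirit.
desktop = [(x + y) * 16 for y in range(8) for x in range(8)]
tv = downsample(desktop, 8, 8, 5, 5)
print(tv[:5])   # first row of the downsampled image
```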
The G400 also had DVDMax, which automatically ran DVDs and any video played through the overlay at full screen on televisions. As a result, it was possible to have a DVD playing in a tiny window at the corner of your screen so that you could use your PC normally while also having a full-quality picture available on the TV. You never had to worry about primary versus secondary displays, and the direct connection between the overlay and TV output chip produced superb quality. This feature took years to become incorporated into video cards from other manufacturers.
So what happened after the G400?
Matrox basically stagnated after the G400. The G450 was no faster than the G400, and by the time the Parhelia was released, it was overpriced and underperforming. Today Matrox has fallen back to the corporate and video markets, and in the years since the G400, both engineers and support staff (marketing, software, etc.) have found greener pastures. Cards such as the NVIDIA Quadro NVS are rapidly encroaching on Matrox's multi-display business, and in the end, Matrox is fading into the shadows of the graphics industry. Although the G400 was a FiringSquad Editor's Choice product, I doubt we'll ever see another such product from Matrox.
My representative card from ATI is the 3D Rage Pro. This was the card that started ATI's unwanted reputation for great hardware with poor drivers. As one of the first AGP accelerators, the Rage Pro's technical specifications read like a dream. It offered a 45Mpixels/sec fill rate (equivalent to the Voodoo1), VQ texture compression, single-pass trilinear filtering, and all the image quality features you'd expect from a Voodoo Graphics. In addition, it featured ATI's robust Mach64 graphics engine with excellent video performance. Unfortunately, gamers frequently experienced graphical glitches, some caused by a lack of good palettized texture support, and some simply caused by the drivers.
A long time ago, drivers were only updated to correct bugs. More often than not, you'd use the drivers that shipped with the product and never think twice. Today, everyone recognizes drivers as a key component of the gaming experience. When most new gamers think about drivers, they think about the NVIDIA Detonator drivers, which have been described by journalists around the world as the gold standard. ATI owners have recently come to look to the Catalyst drivers as their own set of high-performance software.
New and Improved?
It wasn't NVIDIA that started the high-performance driver set, but ATI. Approximately one year after the release of the 3D Rage Pro, ATI announced a new graphics chip, the 3D Rage Pro Turbo. The excitement quickly turned to skepticism when it was realized that the Rage Pro Turbo and Rage Pro graphics chips were one and the same. Instead, ATI was releasing a new set of drivers dubbed the "Turbo" drivers, which promised a 40% speed boost. The new level of performance was supposed to be so significant that it was as if ATI had a new product. Note that there was no change in core clockspeed.
SIDEBAR: The Rage 128 also had its own approach to 16-bit dithering. It ended up looking like noise and resulted in an unpleasant picture.
Since then we've seen NVIDIA take the DirectX 8 crown with the GeForce 3 and GeForce 4 lineup, and ATI catch up in the DirectX 9 era with the Radeon 9700. The stories of today's graphics industry are likely just as interesting as what you've read in the previous pages.
It's also worth noting that besides S3, ATI, Matrox, NVIDIA, and VideoLogic, there was another company that played a large role in the development of 3D graphics: Rendition. Their Vérité series of cards was popular due to its unique combination of price, performance, and features (unlike 3dfx's Voodoo Graphics, Rendition's V1000 combined 2D and 3D functionality in one package). Unfortunately, Rendition couldn't keep up with the brutal product cycles of the 3D graphics market; they were acquired by Micron Technology but never released a follow-up to the Vérité 2200. The history of Rendition would make for an interesting story, and perhaps with a little inside information it can one day be told.
In any case, ATI clearly would not be where they are today if not for the acquisition of ArtX, the developers of the Radeon 9700 architecture, which serves as the foundation for all of ATI's current products. The story of the Radeon 9700 is one we’ll be revealing in the near future.
On the flip side, there are two stories worth telling about NVIDIA. First is the development of the GeForce 3, which would extend NVIDIA's flagship position for two product generations and shape the way the industry looked at GPUs. Perhaps more interesting are the secret stories behind the NV30, the seemingly ill-fated project that combined the creativity and talent of NVIDIA's most elite engineers from the GeForce3 and the best of 3dfx's team.
There's no question that the 0.13-micron gamble was a short-term failure, as it allowed ATI to take the lead with the Radeon 9700. However, the game isn't over. Only after the true next-generation products from ATI and NVIDIA are released will we know who chose poorly.
Long for the good old days of 3D graphics when there were more than two graphics manufacturers? Reminisce with others in the FiringSquad forums!
SIDEBAR: Did you own any of these video cards? What other technologies could also be added to the list? Discuss this topic in the news comments!
© Copyright 2003 FS Media, Inc.