Summary: There's been a gap in NVIDIA's product line for a while now: no answer to the magically delicious GeForce4 Ti 4200, which offered the right price and the right performance for performance-oriented gamers without unlimited budgets. NVIDIA hopes to remedy the lack of such a solution this cycle with the introduction of the GeForce 6600 line of cards. Brandon gets a sneak preview of the card NVIDIA hopes will be the ultimate price/performance solution.
Properly replacing the GeForce4 Ti 4200 has been a tough task for NVIDIA. With its 4x2 DX8 core running at 250MHz and 512MHz memory (in later AGP 8X models), the Ti 4200 offered more than enough horsepower to run all of the games of its generation, and, with a little bit of work, subsequent games as well. NVIDIA’s first direct replacement, the GeForce FX 5600 Ultra, was poorly received at its initial launch early last year. NVIDIA went back to the drawing board and created a faster, second-generation 5600 Ultra, but these cards didn’t hit retail until months after the first round of preview articles went out. By then, anyone interested in upgrading had lost interest.
To stir things up in the mainstream segment, last Fall NVIDIA released their GeForce FX 5700 line. By adapting key technologies found in the GeForce FX 5900 such as UltraShadow, Intellisample HCT, CineFX 2.0, and high-speed DDR2 memory operating at 900MHz, the 5700 and the 5700 Ultra were both really strong parts that were competitive with ATI’s latest offerings, but gamers and enthusiasts still didn’t quite bite, particularly the Ti 4200 owners. NVIDIA decided to up the ante one more time.
Their answer was the GeForce FX 5900 XT. Rather than produce a cut-down derivative of the GeForce FX 5900 like they’d done with the 5700 line, NVIDIA gave the GeForce FX 5900 XT the same NV35 graphics core as the GeForce FX 5900/5900 Ultra, and, just as importantly, the same 256-bit wide memory interface found in those cards. These cards were an instant hit, as they provided all the features of NVIDIA’s highest-end offerings at a much more affordable $200 (or less) price point. It was finally just like the GeForce4 Ti 4200 all over again!
Today NVIDIA is unveiling the GeForce 6600 family, or “The DOOM 3 GPU” as they like to call it. NVIDIA has also announced a new GeForce 6800 SKU, the GeForce 6800 LE.
The 6800 LE is NVIDIA’s eight pixel pipeline derivative of the GeForce 6800, but fortunately it retains the 6800 line’s 256-bit memory interface. The GeForce 6800 LE is the new bottom end of the GeForce 6800 series and should be priced somewhere between $200 and $300, although NVIDIA didn’t give us an exact figure. Final clocks are still up in the air, though we expect most boards to ship with 128MB of DDR1 memory, just like the vanilla GeForce 6800.
We’ve been told that the GeForce 6800 LE will be targeted at OEMs, at least initially. Apparently OEMs want a lower-priced GeForce 6800 card so they can claim to have 6800 graphics inside, although we have a feeling many enthusiasts will pass on this card, just as they did with similar offerings from ATI, such as the RADEON 9800 SE.
One of the features that made ATI’s initial mainstream offering, the RADEON 9500 PRO, such a strong card was its eight pixel pipeline architecture. This gave the RADEON 9500 PRO a fill rate that was quite competitive with the high-end card of the time, the RADEON 9700/9700 PRO. Unfortunately, since the card was essentially a RADEON 9700 in disguise, it was expensive for ATI to produce, and was eventually replaced by the RADEON 9600, which featured half as many pipes.
With the GeForce 6600, the eight pixel pipeline configuration finally makes a return to the mainstream segment, as NVIDIA has chosen an 8x1 configuration for these cards.
Core clock frequency has been increased from 475MHz in the GeForce FX 5700 Ultra to 500MHz in the GeForce 6600 GT. This equates to a peak fill rate of 4.0 Gigatexels/second, roughly double that of the RADEON 9600 XT and GeForce FX 5700 Ultra. Meanwhile, the GeForce 6600’s core runs at 300MHz, a figure which is also higher than previous offerings.
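For those curious where these figures come from, the fill-rate arithmetic is simple: core clock times pipelines times texture units per pipeline. A quick sketch (the helper function and its name are ours, not NVIDIA's; clock and pipeline figures are from this article):

```python
def fill_rate_gtexels(core_mhz, pipes, tex_per_pipe=1):
    """Peak texel fill rate in Gigatexels/second:
    core clock (MHz) x pipelines x texture units per pipe / 1000."""
    return core_mhz * pipes * tex_per_pipe / 1000.0

# GeForce 6600 GT: 8x1 configuration at 500MHz
print(fill_rate_gtexels(500, 8))   # -> 4.0 Gigatexels/sec

# GeForce FX 5700 Ultra: 4 pipes at 475MHz
print(fill_rate_gtexels(475, 4))   # -> 1.9 Gigatexels/sec
```

Note that this is a theoretical peak; real-world throughput depends on memory bandwidth and shader workload, which is exactly why the memory subsystem below matters.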
Of course, having a fast graphics core like the GeForce 6600 GT means nothing if it’s starving for memory bandwidth. To keep the core fed with data, NVIDIA has paired it with 128MB of GDDR3 memory running at 500MHz (1.0GHz effective).
In order to keep manufacturing costs down, the GeForce 6600’s memory interface is 128 bits wide, just like the GeForce FX 5700 Ultra’s, but thanks to its high memory clock we’re looking at a peak bandwidth figure of 16GB/sec. That’s an improvement of 1.6GB/sec over the GeForce FX 5700 Ultra, but it falls well short of the 5900 XT’s 22.4GB/sec, which could give the GeForce FX 5900 XT the edge in some titles at high resolutions, especially with AA/AF enabled. NVIDIA hasn’t determined the final memory clock for the GeForce 6600, but it has confirmed that the card will be outfitted with DDR1 memory.
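The bandwidth comparison above works out from effective memory clock and bus width. Here's the math as a quick sketch (again, the helper is our own illustration; the clock and bus figures are the ones quoted in this article):

```python
def bandwidth_gb_s(effective_mhz, bus_bits):
    """Peak memory bandwidth in GB/sec:
    effective clock (MHz) x bus width (bits) / 8 bits-per-byte / 1000."""
    return effective_mhz * bus_bits / 8 / 1000.0

print(bandwidth_gb_s(1000, 128))  # GeForce 6600 GT (1.0GHz effective, 128-bit) -> 16.0
print(bandwidth_gb_s(900, 128))   # GeForce FX 5700 Ultra (900MHz, 128-bit)     -> 14.4
print(bandwidth_gb_s(700, 256))   # GeForce FX 5900 XT (700MHz, 256-bit)        -> 22.4
```

The 5900 XT's wider 256-bit bus is what lets it beat the 6600 GT on raw bandwidth despite its slower memory clock.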
From a features perspective, the GeForce 6600 GT gives up nothing to its bigger brothers, the GeForce 6800 line. The 6600 is shader model 3.0 compatible, ensuring longevity, and we’ve been told that unlike the GeForce FX 5600 series, NVIDIA hasn’t compromised the AA implementation on the GeForce 6600. The GeForce 6600 borrows the AA engine and algorithms found in GeForce 6800. NVIDIA has also integrated UltraShadow II into GeForce 6600.
So how is NVIDIA able to integrate all of this into a card designed for the mainstream segment? Simple: they shrunk it. The GeForce 6600 is NVIDIA’s first GPU manufactured on TSMC’s 0.11-micron process. The core itself contains 146 million transistors, 64 million more than the GeForce FX 5700, so the move to 0.11-micron was very important for NVIDIA.
If all this sounds promising to you, good. But we haven’t told you the most exciting part yet. PCI Express-based GeForce 6600 GT cards will also support NVIDIA’s scalable link interface technology, more commonly known as SLI. This means that you can purchase one GeForce 6600 GT card now and buy a second card a few months down the road for roughly double the performance! NVIDIA’s SLI announcement earlier this summer was received positively by the press and public alike, so we have a feeling that many of you will like this feature. NVIDIA will also be integrating their video processor into the GeForce 6600.
Unlike ATI’s X600 line, NVIDIA will also be providing AGP versions of the GeForce 6600 and 6600 GT with the same feature set, excluding SLI of course. For these cards, NVIDIA’s HSI interconnect technology is used to convert PCI Express signals to AGP, much like it’s used today to go in the opposite direction for GeForce PCX.
PCI Express and AGP-based GeForce 6600 GT cards should start at around $199 when they begin shipping in mid-to-late September. GeForce 6600 prices should hover right around $150, if not a little lower.
Unfortunately, we can’t answer the performance question for the GeForce 6600/6600 GT today, as a last-minute board design change prevented boards from reaching the press in time for the launch at QuakeCon. Early 6600 GT cards required an external power connection, much like the GeForce FX 5600 Ultra/5700 Ultra, but as you’ve seen in our board shots, external power is no longer necessary. NVIDIA did, however, provide one interesting performance figure: 42 frames per second in DOOM 3’s demo1 at 1024x768x32 with 4xAA and 8xAF in High Quality mode, or at 1600x1200x32 without AA/AF.
As it stands right now, the GeForce 6600 appears well poised to follow up the GeForce FX 5700/5700 Ultra, but it remains to be seen how well it will stack up against the competition from ATI. The GeForce FX 5900 XT may give it a tough time in many situations as well, especially in DX7 and DX8 games. Hopefully we’ll have the answer to this question soon, but as exciting as the GeForce 6600 is today, we’re eager to see what NVIDIA has planned next. NVIDIA CEO Jen-Hsun Huang has gone on record saying that by the end of the year NVIDIA will have a top-to-bottom lineup of Shader Model 3.0 products. That means the remaining months of 2004 should be quite an interesting time for gamers of all budgets!
© Copyright 2003 FS Media, Inc.