Summary: Packing two 55-nm GT200b GPUs and nearly 1.8GB of memory, NVIDIA’s GeForce GTX 295 is designed to unseat the Radeon 4870 X2 as the world’s fastest graphics card. But does the card succeed in its mission? Join us as we take an early look at a preproduction card!
In any case, shifting GPU production from 65-nm to 55-nm is a win-win for NVIDIA and the consumer.
Nowhere is this more imperative than at the high-end segment of the graphics market. NVIDIA’s GT200 GPU is a massive chip with 1.4 billion transistors and a 576mm² die size. In fact, some reports have suggested the manufacturing cost for NVIDIA is over $100 per GT200 die alone. With GeForce GTX 260 prices falling from $400 at launch to nearly $200 today, getting these costs down is obviously important.
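To put that die size in perspective, a standard first-order gross-die estimate (our own back-of-the-envelope math, not a figure from NVIDIA or TSMC) shows how few 576mm² candidates even fit on a 300mm wafer before yield losses are counted:

```python
import math

def gross_dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """First-order gross die estimate: wafer area divided by die area,
    minus a correction term for partial dies lost at the wafer edge."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return math.floor(wafer_area / die_area_mm2 - edge_loss)

# 576 mm^2 GT200 die on a standard 300 mm wafer
print(gross_dies_per_wafer(300, 576))  # 94 gross dies, before any yield loss
```

With well under a hundred gross candidates per wafer, every defective die hurts, which is why shrinking the chip to 55-nm matters so much for NVIDIA's margins.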
Besides production costs, the other problem NVIDIA currently faces with GT200 is its performance. While the chip is considered to be the fastest overall GPU on the market today, it isn’t as significant a performance leap as previous next-generation offerings. The performance jump from the 7800 GTX to the 8800 GTX for instance was greater. In our GeForce GTX 280 Performance Preview article, we found that the GTX 280 was generally outperformed by a pair of GeForce 8800 GTX cards running in SLI.
ATI exploited this reality with the Radeon 4870 X2. Boasting two RV770 GPUs clocked at 750MHz and 2GB of GDDR5 memory, the 4870 X2 is widely regarded as the world’s fastest graphics card, a title NVIDIA had held for nearly two full years. To add insult to injury, the Radeon 4850 X2 2GB recently took the second spot behind the 4870 X2, relegating the GeForce GTX 280 to third place in performance.
Shrinking GT200 to 55-nm helps solve this second problem for NVIDIA.
To tackle this issue, two strategies were available to NVIDIA: they could use the smaller process to add more shaders to GT200 and hope for higher clocks as well, or they could use the smaller process to resurrect the GX2 formula of fusing two GPUs onto one graphics card. Integrating two 65-nm GT200 GPUs onto one board isn’t feasible, as the TDP of such a board would be too high: one GTX 280 card has a max board power of 236 watts.
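The arithmetic behind that claim is straightforward. A PCI Express card can draw at most 300W through the slot plus one 6-pin and one 8-pin auxiliary connector, and two 65-nm GT200s would blow well past that ceiling (the connector budget below comes from the PCIe spec; the 236W figure is NVIDIA's own GTX 280 rating):

```python
# Power budget available to a single PCI Express graphics card
pcie_slot = 75        # watts delivered through the x16 slot
six_pin = 75          # watts from a 6-pin PCIe power connector
eight_pin = 150       # watts from an 8-pin PCIe power connector
board_limit = pcie_slot + six_pin + eight_pin  # 300 W total

gtx280_board_power = 236                 # max board power of one 65-nm GTX 280
two_65nm_gpus = 2 * gtx280_board_power   # 472 W -- far beyond the spec

print(board_limit, two_65nm_gpus)  # 300 472
```

At 472W for two unmodified 65-nm chips, a die shrink (or heavy binning and down-clocking) was the only realistic path to a dual-GPU board.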
While rumors suggested NVIDIA planned to offer enhanced GT206 and GT212 GPUs to combat ATI, ultimately NVIDIA opted for the second strategy of integrating two 55-nm GT200b GPUs onto one card to recapture the #1 position in graphics performance.
The GeForce GTX 295 is the designation NVIDIA has chosen for this beast. Armed with two 55-nm GT200b GPUs and 1.792GB of GDDR3 memory, the GeForce GTX 295 shares traits with the GTX 280 and GTX 260. It’s a hybrid of sorts that’s been designed to dethrone the Radeon 4870 X2. But does it accomplish its mission? Let’s find out!
When the first GeForce GTX 295 rumors surfaced on the internet, it was widely expected that NVIDIA would base the card entirely on the 216-shader GeForce GTX 260. Using this GPU as the card’s foundation made the most sense as a way to keep costs, power consumption, and heat down.
However, NVIDIA ultimately ended up borrowing aspects of both the GTX 260 and the GTX 280 for the GeForce GTX 295.
Like the GeForce GTX 280, the GPUs inside the GTX 295 contain 240 stream processors apiece, for a grand total of 480 stream processors across the card. Unlike the GTX 280, however, the GeForce GTX 295 shares its clock speeds with the GeForce GTX 260: 576MHz for the graphics core, with the stream processors operating at 1242MHz.
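Those figures translate directly into theoretical shader throughput. Assuming NVIDIA's usual MAD+MUL counting of three floating-point operations per shader per clock (our calculation, not a quoted spec):

```python
shaders_per_gpu = 240
shader_clock_ghz = 1.242
flops_per_clock = 3  # MAD (2 ops) + MUL (1 op) per shader per clock

gflops_per_gpu = round(shaders_per_gpu * shader_clock_ghz * flops_per_clock, 2)
print(gflops_per_gpu, round(2 * gflops_per_gpu, 2))  # 894.24 1788.48
```

Nearly 1.8 teraflops of theoretical single-precision throughput on one card, on paper at least.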
The memory subsystem is also carried over unchanged from the GeForce GTX 260, with the GTX 295 featuring a 448-bit memory interface with 1.792GB of GDDR3 memory (896MB per GPU) clocked at 1.0GHz (2.0GHz effective).
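Working out the memory bandwidth from those numbers is simple: a 448-bit bus at a 2.0GHz effective data rate moves 112GB/s per GPU:

```python
bus_width_bits = 448
effective_rate_ghz = 2.0  # 1.0 GHz GDDR3, double data rate

bandwidth_gbps = bus_width_bits / 8 * effective_rate_ghz  # GB/s per GPU
print(bandwidth_gbps, 2 * bandwidth_gbps)  # 112.0 224.0
```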
As a result, its paper specs lie somewhere between the GeForce GTX 280 and the GTX 260. The following chart summarizes the differences among the various GeForce GTX 200 GPUs:
As you can see, despite the graphics horsepower inside the GeForce GTX 295, the board has a respectable power consumption figure: 286W. For those of you keeping score, this is the exact same power rating as the Radeon 4870 X2. NVIDIA’s specifications recommend a 680W PSU for the GeForce GTX 295.
At $499, the GeForce GTX 295 is priced right in the middle of GTX 280 SLI and GTX 260 SLI. In comparison, Radeon 4870 X2 cards start right around $460 on Newegg, which is where all of our pricing data comes from. The GeForce GTX 295 will hit retailers on January 8th of next year, but we were granted access to a preproduction board to run some preliminary benchmarks to whet your appetite ahead of the launch.
This early access did come with a few conditions, however. Namely, the game titles tested were limited to five holiday launch games (Call of Duty: World at War, Dead Space, Fallout 3, Far Cry 2, and Left 4 Dead) plus one title of our own choosing. Since it's still the most demanding game on the market, we naturally chose Crysis.
The final condition was that we couldn’t run power consumption and acoustic tests. Don’t worry though: our preproduction card scored well here, running quietly and drawing a respectable amount of power. NVIDIA said they wanted to save something for us to talk about for the official launch day. How nice of them. NVIDIA didn’t tell us we couldn’t OC the board, which is one of the aspects we were most interested in anyway. First let’s take a peek under the board’s black shroud though…
While the GeForce GTX 295 is NVIDIA’s third generation dual-PCB GX2 board (fourth gen if you count the original card that started it all, the mile-long GeForce 7900 GX2), its board design is quite similar to the card that preceded it, the GeForce 9800 GX2. Like the 9800 GX2, the GeForce GTX 295 relies on a dual-PCB design, with the PCB housing the primary GPU inverted so that it faces the second PCB on the bottom of the card. Sandwiched between the two PCBs is a dual-slot cooler with copper heatpipes and an aluminum heatsink, along with a single blower-style cooling fan.
The top and bottom of this cooler are outfitted with separate heatpipes reserved for each GPU. With copper cooling on both sides of the cooling unit, it’s quite heavy; heavier in fact than the 9800 GX2. You’re definitely going to want to secure this card pretty tightly to your case to prevent it from coming loose accidentally.
Unlike the GeForce 9800 GX2, the GeForce GTX 295 isn’t completely enclosed in metal casing. In fact, the bottom PCB is completely exposed to air. Starting with the GeForce 9800 GX2 and continuing through the GTX 260/280, NVIDIA had adopted the practice of enclosing their cards to protect them from accidental damage during shipping as well as preventing damage from electrostatic discharge.
We have a feeling the GTX 295 is no longer protected this way because it needs the extra ventilation in order to remain as cool as possible. As you can see, the entire top and one side of the card sport a mesh grille, while the other side is partially exposed to air (the majority of the air from the card's fan exhausts here rather than through the card's backplate).
Like the GeForce 9800 GX2, the GTX 295 is also equipped with two LEDs. The first is a power LED: it shines green when the board's power connectors are hooked up properly and red when the card isn't getting enough power. The second LED is blue and denotes the primary graphics card. This is useful in SLI configurations on motherboards that place the primary PCI Express graphics slot in an odd location. In these situations, an inexperienced user may hook their monitor up to the slave GTX 295, and when the display shows nothing, unknowingly return both cards even though they're completely functional, just improperly connected.
With the display LED in place, this situation is avoided: the LED on the primary graphics card shines blue, while the LED on the secondary card remains off.
The following are a collection of shots of our exposed GeForce GTX 295 board:
Out back, the GeForce GTX 295 has two DVI outputs tied to the master GPU and one HDMI output running off the slave GPU. Thanks to the latest ForceWare drivers, the two DVI outputs can be run simultaneously while in SLI mode, but you'll need to disable SLI in order to run all three display outputs at once.
PhysX is another hot topic recently. Fortunately, the GTX 295 can be configured to run PhysX on one of its GPUs, or it can run PhysX in SLI mode, with one GPU handling graphics exclusively, while the second GPU tackles a mixture of graphics and PhysX workloads.
Intel Core i7-965 Extreme Edition
EVGA X58 SLI
3GB Qimonda DDR3-1066
NVIDIA GeForce 9800 GTX+
NVIDIA GeForce 8800 GTX
NVIDIA GeForce GTX 260
EVGA e-GeForce GTX 260 Core 216 (stock GTX 260 clocks)
NVIDIA GeForce GTX 280
NVIDIA GeForce GTX 295
ATI Radeon HD 4870 X2 2GB
ATI Radeon HD 4870 1GB
300GB Western Digital Caviar SE
Windows Vista Ultimate 64-bit w/Service Pack 1
Call of Duty: World at War
Crysis High – Direct3D
Notes: The only snafu we ran into while testing our GTX 295 board occurred here, with Crysis crashing to the desktop under 4xAA at 2560x1600. NVIDIA is aware of the issue and hopes to resolve it in an upcoming driver.
Crysis High – Direct3D
The neat part about the GTX 295 and the NVIDIA utility is that you can set the clock speeds of each GPU independently of each other, just as you would with a conventional two-card SLI setup. This way, if one GPU on your board overclocks significantly better than the other, you can use this to your advantage, pushing the good chip to its maximum while playing it conservative with the second GPU.
To keep things simple for the graphs, we decided to run both GPUs at the same clocks, but you certainly don’t have to do this.
So how far were we able to push both our GPUs? 665MHz core/1130MHz memory/1466MHz shaders. These speeds aren’t too far off from the clocks we’ve hit in the past with individual 65-nm GeForce GTX 260 cards, so we’re pretty confident that 55-nm discrete GTX 260 and GTX 280 cards should be capable OC’ers. We could actually go further on the core and shader speeds in some games, but we couldn’t complete our Far Cry 2 looped testing session where we loop the game back-to-back 10 times to test stability.
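For reference, those overclocks work out to healthy double-digit gains over the card's stock clocks (our arithmetic, using the stock GTX 295 clocks quoted earlier):

```python
stock = {"core": 576, "shader": 1242, "memory": 1000}  # MHz, stock GTX 295 clocks
oc    = {"core": 665, "shader": 1466, "memory": 1130}  # MHz, our stable overclock

for domain in stock:
    gain = (oc[domain] - stock[domain]) / stock[domain] * 100
    print(f"{domain}: +{gain:.1f}%")
# core: +15.5%, shader: +18.0%, memory: +13.0%
```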
Check out the benchmarks at these speeds:
In general the extra shaders allowed the GTX 295 to outrun the GTX 260 SLI by around 4-5% overall, although in shader-heavy cases such as Crysis with very high graphics settings the margin separating the two cards increased to 8%. The least graphically demanding game we tested with, Dead Space, also gave the edge to the GTX 295, but by a narrower 3% margin of victory.
In comparison to the competition from ATI, the GeForce GTX 295 put up a very strong showing, completely outgunning the 4870 X2 in the majority of our benchmarks. Far Cry 2 and Dead Space seem to favor the GeForce architecture the most (with Far Cry 2 seeing a nice boost as a result of ForceWare 180), while our Left 4 Dead testing revealed a clear advantage for the 4870 X2. The GeForce GTX 295 finished 5-9% faster than the 4870 X2 in Fallout 3, and ran up to 13% faster in CoD: World at War.
Crysis, a title which has traditionally favored ATI’s architecture in our benchmarks, is really close between the GTX 295 and the 4870 X2 under the game’s very high graphics settings, but the GTX 295 opens up a comfortable double-digit margin under the less demanding high setting. This is another case where NVIDIA’s driver work in ForceWare 180 has reaped dividends.
While there are still more games we’d like to test, including a UE3-engine title, for right now the GeForce GTX 295 looks like it’s poised to take the 3D performance crown away from ATI.
Priced at $500 though, the card does come with a price premium over the 4870 X2 and the 216-shader GeForce GTX 260. Is the added performance the card offers worth it? Who knows? We’re going to reserve final judgment on that topic until the other rumored 55-nm GT200b parts arrive next year.
© Copyright 2003 FS Media, Inc.