Summary: After taking a look at Call of Duty performance with ATI hardware, today we're here to evaluate performance with this game for NVIDIA users. This time we gathered a dozen GeForce cards, ranging from the GeForce2 Ti to NVIDIA's latest, the GeForce FX 5950 Ultra. Check it all out inside!
Once again we’re out to explore the performance of Infinity Ward’s new shooter, Call of Duty. This week, we’re focusing on performance with NVIDIA hardware. The GeForce line is a well established brand with millions of users, so it was important that we not only include NVIDIA’s latest and greatest cards, but also their older products like the GeForce3. To represent NVIDIA’s DX7 hardware, we also included the GeForce2 Ti. It’s important to note that for this card in particular, anti-aliasing didn’t seem to work, which may not be a bad thing considering the capabilities of the hardware.
Like the ATI performance article, all visual quality settings were cranked up to their maximums, while all sound settings were dialed down to their minimums. Infinity Ward adds a unique “NVIDIA Distance Fog” setting for GeForce owners only. As its name implies, when enabled this setting adds a layer of fog, obscuring objects in the distance (much like you’d see on a console game).
This reduces the rendering workload on the graphics processor and thus enhances performance; the exact boost can vary depending on the complexity of the scene as well as screen resolution. The default setting is “enabled” but we disabled this setting for our testing to ensure an apples-to-apples performance comparison with our ATI results. Also, to help speed things along we ran the GeForce3 and GeForce2 Ti with bilinear filtering rather than trilinear. Here’s the rest of the system setup:
Call of Duty
The GeForce FX 5700 Ultra's results at low resolutions really surprised us; at first we thought we hadn't cranked up the IQ settings. However, a quick check confirmed that everything was set up properly, and even after a complete re-install the numbers remained consistent. Bizarre.
Call of Duty 2xAA
Under the greater demands of 2xAA, the GeForce4 Ti 4600 isn’t able to keep up with the newer GeForce FX 5600 Ultra at higher resolutions. The GeForce FX 5600 Ultra’s 12.8GB/sec of peak bandwidth outweighs the GeForce4 Ti 4600’s superior fill rate.
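As a quick sanity check on that bandwidth figure, theoretical peak memory bandwidth is simply the effective memory clock (transfers per second) multiplied by the bus width in bytes. The sketch below assumes the commonly cited memory specs for these two cards (the 5600 Ultra rev. 2's 800MHz effective DDR and the Ti 4600's 650MHz effective DDR, both on 128-bit buses); those clocks are our assumption, not numbers taken from the benchmarks above.

```python
# Peak memory bandwidth = effective memory clock (transfers/sec) x bus width (bytes).
# Clock figures below are commonly cited board specs, assumed for illustration.

def peak_bandwidth_gb_s(effective_clock_mhz: float, bus_width_bits: int) -> float:
    """Return theoretical peak memory bandwidth in GB/sec."""
    bytes_per_transfer = bus_width_bits / 8
    return effective_clock_mhz * 1e6 * bytes_per_transfer / 1e9

# GeForce FX 5600 Ultra (rev. 2): 800MHz effective DDR on a 128-bit bus
print(peak_bandwidth_gb_s(800, 128))   # -> 12.8 GB/sec, matching the figure above

# GeForce4 Ti 4600: 650MHz effective DDR on a 128-bit bus
print(peak_bandwidth_gb_s(650, 128))   # -> 10.4 GB/sec
```

With anti-aliasing enabled, each frame touches far more framebuffer memory, which is why that roughly 2.4GB/sec bandwidth gap starts to matter more than raw fill rate.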
Call of Duty 2xAA 8xAF
The same patterns we saw with 2xAA continue here, although the transitions occur sooner because of the addition of anisotropic filtering. For example, the GeForce FX 5600 Ultra/GeForce4 Ti 4600 transition occurs at 1024x768, although the gap between both cards isn’t tremendous.
Call of Duty 4xAA
With 4xAA enabled, the gap between the cards with 128-bit memory interfaces and those with 256-bit interfaces widens fairly substantially, although the 5950 Ultra and 5900 Ultra continue to offer similar levels of performance. This shouldn't be too surprising, as the GeForce FX 5950 Ultra offers only a minor core clock bump over the 5900 Ultra.
Call of Duty 4xAA 8xAF
The biggest surprise here is the GeForce FX 5900 versus GeForce FX 5700 Ultra battle. After seeing the wide margin between these cards with 4xAA, we expected that to continue with 8x aniso added on. Instead, the margin between the two cards closes substantially at 1600x1200. Our best guess is that the GeForce FX 5900's 128MB of memory is holding the core back. If you recall our ATI numbers with Call of Duty at these settings, the RADEON 9500 PRO and the RADEON 9800/RADEON 9700 PRO were separated by less than 6 frames per second, while the 256MB RADEON 9800 PRO outperformed the 128MB card by a factor of 1.5!
NVIDIA played it rather conservatively with the GeForce FX 5950 Ultra, and in Call of Duty, it shows. We didn't see any tangible performance difference between the GeForce FX 5950 Ultra and the GeForce FX 5900 Ultra, even at settings as high as 4xAA and 8xAF at 1600x1200.
Therefore, with online prices for the GeForce FX 5900 Ultra starting in the $350-$360 range, enthusiasts may want to save the $50 or $60 and skip the 5950 Ultra. Getting 475MHz out of the 5900 Ultra's core should be a breeze, while the memory is generally good for 900MHz+. You can then put the money you saved toward another game. If you do have the extra money to spend, the 5950 Ultra gives you the peace of mind of knowing your card will operate at 475/950MHz just fine, and you've always got room for more with a little bit of overclocking (something the 5950 Ultra is quite good at).
At the $200 price point many GeForce FX 5900s are going for, they’ve become quite the steal among hardware purchases. This would definitely rank as our choice if you’re looking to maximize your bang-for-the-buck. You’ve got the same NV35 core as the GeForce FX 5900 Ultra, so making up the 50MHz clock speed difference isn’t a problem. And as far as the memory subsystem is concerned, the clock speeds are the same and you’ve got a 256-bit memory interface.
You do miss out on the extra 128MB of RAM if you go with your typical low-end GeForce FX 5900, and it’s beginning to look like next generation games like DOOM 3 and Half-Life 2 will take advantage of the added memory. But of course, we’re also still wondering how those games will perform (and look) with NVIDIA hardware. Call of Duty also favors 256MB cards. Albatron, Gainward, MSI and Leadtek all make a special line of 256MB GeForce FX 5900 cards, but the added memory does come at a price premium. This space may be one to watch as more vendors possibly come onboard.
The GeForce FX 5700 Ultra clearly represents the best performer among the cards with 128-bit memory interfaces, but like its predecessor it's a late arrival. Board manufacturers we spoke to at Comdex are still having a hard time getting sufficient quantities of these chips, so we don't see this changing anytime soon. Quite frankly, though, with GeForce FX 5900s currently pushing the $200 barrier (and slightly breaking it if you're willing to go with an XFX card, according to Price Watch), is the GeForce FX 5700 Ultra really necessary? Keep in mind that this card retails for $220 with a $20 mail-in rebate, so it's pretty much the same price as the GeForce FX 5900, which sports a wider 256-bit memory interface. The choice here is obvious: get the GeForce FX 5900! NVIDIA and its board partners have really priced the GeForce FX 5700 Ultra into obsolescence, even though it's cutting-edge mainstream hardware. At least that's the way things stand now; perhaps in a month or two GeForce FX 5700 Ultra cards will be in the $150 price range.
With GeForce FX 5600 Ultras currently in the $120 price range online, NVIDIA and its board partners are really pushing their DX9 hardware into very low price points. If you're in the market for a new graphics card, say you're coming from a GeForce2 or GeForce3 and don't want to spend a lot of money, there's really no excuse not to set the GeForce FX 5600 Ultra as your bare minimum (assuming you're upgrading to an NVIDIA-based card), with your choice going up from there based on your needs and budget.
Hopefully this series of articles with Call of Duty has helped you sort out where the cards stand from a performance perspective. But also keep in mind that just as important (if not more so) is driver stability and compatibility. This is where our driver reports come in. This is another feature that you’re going to want to watch regularly!
SIDEBAR: Are you surprised by any of the results? What kind of performance are you seeing with Call of Duty? Speak up in the news comments!
© Copyright 2003 FS Media, Inc.