Summary: Based on NVIDIA's GeForce FX 5900 Ultra GPU, the ASUS V9950 Ultra is targeted at high-end enthusiasts and gamers. Unlike other GeForce FX 5900 Ultra cards, however, the V9950 Ultra is a single-slot design! Sounds great for SFF users, but how does this unique cooler (and card) perform? Find out, as we explore the performance of this card with NVIDIA's Detonator 44.71 driver. We also take a closer look at the issues surrounding the GeForce FX 5900 Ultra and UT2K3!
Up to now, the GeForce FX 5900 Ultra card market has been fairly tame. Cards from third-party manufacturers have all been based on the same NVIDIA reference design; in fact, NVIDIA has handled all board production in house. The end result is that consumers have been purchasing the exact same hardware regardless of the card manufacturer chosen.
Because of this, early board manufacturers have split into three camps. You've got eVGA all alone, offering their e-GeForce FX 5900 Ultra at higher clock speeds than the rest (500MHz core/900MHz memory). This gives them a performance advantage over other GeForce FX 5900 Ultra manufacturers that has so far gone unchallenged. The second camp has used its software bundle to gain an advantage over the others; this group includes Gainward and MSI, the latter of which has an over-the-top software bundle with its FX5900-VTD256. The final group is about as no-frills as it gets: these guys include the card and its accessories in the packaging, but no game bundle. This camp consists of the three-letter companies, BFG and PNY, who are currently locked in a retail battle with each other.
Complicating matters have been the early reports of some cards with flickering issues and/or a mysterious squealing noise coming directly from the card itself, an issue we witnessed firsthand with the MSI FX5900-TD128. Clearly things have not come easily for board manufacturers up to this point.
As one of NVIDIA’s oldest board partners, with products dating all the way back to the original RIVA 128, we had no doubt that ASUS would be offering a GeForce FX 5900 Ultra card; the only question was one of timing. It turns out that while other card manufacturers were busy bringing their first-generation GeForce FX 5900 Ultra cards to market, ASUS was playing it more conservatively, quietly working on a card of its own that wasn’t manufactured entirely by NVIDIA.
To be honest, this probably shouldn’t come as a surprise if you’re familiar with ASUS’ previous graphics offerings. Their V8460 Deluxe was one of the first second-generation GeForce4 Ti 4600 cards with video input support, and if you read our V9280S review last winter, you certainly saw how impressed we were with this third-generation GeForce4 Ti 4200 offering. Not only had ASUS integrated a Ti 4200 core on a Ti 4600 printed circuit board (with accompanying BGA memory), they also bumped the core clock to 275MHz and outfitted the board with 128MB of 600MHz memory.
The end result was a Ti 4200 card that outperformed the more expensive GeForce4 Ti 4400, plus it offered built-in video editing as an added bonus! It’s small wonder this card was easily awarded our Editor’s Choice Award; ASUS had really raised the bar with the V9280S’s release.
Now ASUS is out to stir things up again, this time in the GeForce FX 5900 Ultra market. With the V9950 Ultra, they’re offering a 5900 Ultra board that doesn’t consume the PCI slot adjacent to your AGP card!
SIDEBAR: ASUS V9950 Ultra Product Webpage
GeForce FX 5900 Ultra
At the heart of the ASUS V9950 Ultra is NVIDIA’s GeForce FX 5900 Ultra GPU. This graphics core, originally codenamed NV35, is NVIDIA’s follow-up to the unsuccessful GeForce FX 5800/5800 Ultra that was discontinued earlier this year.
The V9950 Ultra card
Upon first inspection, you can easily see the differences between the ASUS V9950 Ultra and the reference boards we’ve received from other GeForce FX 5900 Ultra manufacturers. For starters, ASUS uses an aqua blue PCB for the V9950 Ultra, rather than the green PCB NVIDIA uses on its GeForce FX 5900 Ultra boards. When combined with the gold colored backplate and copper heatsink that adorns the GeForce FX 5900 Ultra core and memory, the V9950 Ultra card has a very distinctive look.
Whenever you see a video card with two fans, it’s only natural to be concerned about the noise level it generates. Fortunately, we can report that noise isn’t an issue with the ASUS V9950 Ultra; in our testing, the card actually operated quieter than NVIDIA’s GeForce FX 5900 Ultra reference card.
Rather than use a single blower-style fan like NVIDIA’s, ASUS has implemented two conventional fans that spin at lower RPMs. This dual-fan cooling approach is fairly popular among GeForce FX 5900 cards, though the V9950 Ultra is the first GeForce FX 5900 Ultra card we’ve encountered that employs it. In fact, ASUS has implemented the same cooler on its GeForce FX 5900 card, the ASUS V9950.
In operation, we witnessed temperatures as high as 63 degrees Celsius; this is quite a bit warmer than other GeForce FX 5900 Ultra cards we’ve tested. We feel that this is likely due to the fans’ operation. Unlike other GeForce FX 5900 Ultra cards, whose fans spin up once a 3D application is launched, the V9950 Ultra’s fans appear to rotate at a constant speed regardless of load. Even while we were running tests with the V9950 Ultra overclocked, the fans never ramped up to higher RPMs. If the fans adjusted their speed with GPU temperature, the core temperature would be easier to keep in check. Hopefully this is an issue that can be resolved in software.
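To illustrate the kind of fix we’re hoping for, here’s a minimal sketch of a temperature-based fan curve. The thresholds and duty cycles are purely hypothetical, chosen only to show how a driver or BIOS update could tie fan speed to core temperature rather than running the fans at a fixed rate:

```python
# Hypothetical fan curve: map GPU core temperature (deg C) to a fan
# duty cycle (percent). All numbers here are illustrative, not ASUS's.
def fan_duty_cycle(temp_c: float) -> int:
    if temp_c < 40:
        return 40              # idle: low RPM, quiet operation
    if temp_c < 60:
        # ramp linearly from 40% at 40 deg C to 80% at 60 deg C
        return int(40 + (temp_c - 40) * 2)
    return 100                 # 60 deg C and up: full speed

print(fan_duty_cycle(35))   # 40
print(fan_duty_cycle(50))   # 60
print(fan_duty_cycle(63))   # 100
```

With a curve like this, the 63-degree readings we saw under load would push the fans to full speed, while idle operation stays as quiet as it is today.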
As far as the rest of the board is concerned, the V9950 Ultra is essentially another GeForce FX 5900 Ultra reference design. Component placement is the same; the only difference is that the V9950 Ultra board is produced by ASUS rather than NVIDIA. From a features perspective however, the V9950 Ultra does have one significant change: ASUS has elected not to use the Philips SAA7108AE video encoder chip present on the GeForce FX 5900 Ultra cards produced by NVIDIA. This should help lower production costs for ASUS, but it remains to be seen if V9950 Ultra prices will be lower than other GeForce FX 5900 Ultra cards.
Rounding out the package are full versions of Gun Metal, Black Hawk Down, and Battle Engine Aquila. ASUS also includes its DVD playback software and a 6-in-1 CD of game demos (consisting of Splinter Cell, Warcraft III, Big Mutha Truckers, BREED, Colin McRae 3, and TOCA Race Driver.)
One area that we’d like to address concerns Unreal Tournament 2003. The current controversy surrounding Unreal Tournament 2003 started shortly after we posted our Unreal Tournament performance article last month.
Dave Baumann of Beyond3D discovered that GeForce FX cards render a form of quasi-trilinear filtering, even with the driver running in quality mode. This is important because mixing bilinear and trilinear filtering enhances performance. When you factor in the significance of Unreal Tournament 2003 not just as a game, but as a performance benchmark used by numerous publications, quite a few people became concerned about the legitimacy of testing results taken on this website as well as countless others.
To get to the heart of the matter, we loaded up Unreal Tournament 2003 and went straight to DM-Insidious, the map on which our custom demo (T2), used for all of our testing, is based. ATI’s RADEON 9800 PRO with CATALYST 3.6 was used to represent the ATI platform, while the ASUS V9950 Ultra with Detonator 44.71 (the driver that shipped with the video card) was used to represent GeForce FX.
First, we cranked up the anisotropic filtering setting on both cards to 8x via their respective control panels (both cards were running in their quality mode). This is the method we use to adjust image quality for all of our performance testing in Unreal Tournament 2003. Let’s look at the RADEON 9800 PRO first:
Now the GeForce FX 5900 Ultra:
As you can see, neither card is truly using trilinear filtering. The transitions between mipmap levels are harsh. If trilinear filtering were in use, the colors (each color represents a mipmap level) would blend together more naturally.
Normally when lesser filtering methods are used, the lower quality mipmap levels stand out pretty dramatically; sometimes causing a shimmering effect to the eye. However, we can see here that with ATI and NVIDIA’s bilinear/trilinear mix, this isn’t the case. The higher detail textures that are closer to the end user blend fairly well with the textures they’re directly adjacent to, and the shimmering effect is not apparent during movement.
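The difference between the two filtering methods comes down to how the mipmap level of detail (LOD) is applied. The sketch below is a deliberately simplified model, where each mipmap level is reduced to a single stand-in value; real hardware blends filtered texels, not whole levels. It shows why pure bilinear filtering produces the harsh level-to-level transitions visible in the screenshots, while true trilinear blends adjacent levels smoothly:

```python
# Simplified model of mipmap selection. Each "level" is a single
# stand-in value; real GPUs blend filtered texels within each level.
def bilinear_mip(lod: float, levels: list) -> float:
    # Bilinear-only filtering samples just the nearest mipmap level,
    # so output jumps abruptly at level boundaries (harsh transitions).
    return levels[min(round(lod), len(levels) - 1)]

def trilinear_mip(lod: float, levels: list) -> float:
    # True trilinear blends the two adjacent levels by the fractional
    # part of the LOD, smoothing the transition between levels.
    lo = min(int(lod), len(levels) - 2)
    frac = lod - lo
    return levels[lo] * (1 - frac) + levels[lo + 1] * frac

levels = [0.0, 1.0, 2.0, 3.0]       # stand-in "colors" for 4 mip levels
print(bilinear_mip(1.5, levels))    # 2.0  (snaps to a whole level)
print(trilinear_mip(1.5, levels))   # 1.5  (smooth blend between levels)
```

A quasi-trilinear mode like the one under discussion sits between these two extremes, blending only near the level boundaries to save texture bandwidth.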
This is where the controversy begins. Both companies are giving end users a good-looking image (in our subjective opinion) that does sacrifice some texture quality, but yields better performance. NVIDIA did, however, tell reviewers that the Quality mode in its Detonator drivers would always apply full trilinear filtering, which is clearly not the case here. But wait, there’s an “Application” setting in 44.71! Let’s try running both cards in their respective application modes, starting with ATI:
As you can see, when application mode is forced on the ATI driver, it does what it’s told and renders an image with trilinear filtering in place. On the other hand, the NVIDIA card continues to use its quasi-trilinear mode, albeit with higher image quality.
This is where the controversy really heats up: the ATI card submits to the desires of the end user, rendering the image as the developer intended, while GeForce FX always forces NVIDIA’s quasi-trilinear mode on its users (in NVIDIA’s defense, we’ve been told that they did have Epic’s approval for this optimization). For the consumer who wants all the eye candy turned on when he sets it in the driver, GeForce FX currently has, for lack of a better word, selective memory, while ATI’s RADEON line gives the end user what he wants. With today’s $400+ video cards all offering more than enough performance for the latest games, eye candy is becoming a more important factor than ever before.
Fortunately, there is light at the end of the tunnel for NVIDIA users. NVIDIA is currently hard at work on a driver that will resolve this issue. This provides little solace for current image quality enthusiasts with GeForce FX cards, but it does appear as if NVIDIA has seen the error of its ways.
SIDEBAR: ASUS also manufactures video cards with video input support under the VideoSuite brand.
Nascar Racing 2003 Season (Bristol custom demo)
IL-2 Sturmovik: FB
Quake III - OpenGL
Unreal Tournament 2003 – Direct3D
Splinter Cell – Direct3D
Performance: NVIDIA’s GeForce FX 5900 Ultra GPU is no slouch when it comes to performance. With the highest core clock frequency in the industry, the GeForce FX 5900 Ultra GPU boasts a fill rate of 3.6 gigatexels/second thanks to its 4x2 architecture, and its 256-bit memory architecture is equally impressive.
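As a quick sanity check, the quoted figures fall straight out of NVIDIA’s published specifications: a 450MHz core clock across a 4x2 (four pipelines, two texture units each) design, and a 256-bit memory bus running at an 850MHz effective DDR data rate:

```python
# Back-of-the-envelope check of the GeForce FX 5900 Ultra's specs,
# using NVIDIA's published 450MHz core clock and 4x2 architecture.
core_clock_hz = 450_000_000
pipelines, tmus_per_pipe = 4, 2

texel_fill_rate = core_clock_hz * pipelines * tmus_per_pipe
print(texel_fill_rate / 1e9)    # 3.6 gigatexels/second

# Memory bandwidth: 256-bit bus at an 850MHz effective (DDR) data rate.
bus_width_bytes = 256 // 8
effective_mem_clock_hz = 850_000_000
bandwidth_gb_s = bus_width_bytes * effective_mem_clock_hz / 1e9
print(bandwidth_gb_s)           # 27.2 GB/s
```

That 27.2GB/s of memory bandwidth is what lets the chip keep all eight texture units fed at high resolutions with antialiasing enabled.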
© Copyright 2003 FS Media, Inc.