Summary: NVIDIA launched its ForceWare driver suite to go along with last month's GeForce FX 5950 Ultra and GeForce FX 5700 Ultra launches. These new drivers promised a lot, with a new compiler and a few other performance optimizations. In this article we take a look at NVIDIA's latest driver; read up on all the new features, performance enhancements, and more inside. If you own a GeForce2/3/4/FX card (or plan on buying one in the near future), this is one article you won't want to miss!
While NVIDIA’s Detonator drivers have officially supported the GeForce FX line of DX9 GPUs for the past several months, NVIDIA’s software team has still been busy catching up to the hardware. We first hinted at this in our MSI GeForce FX5900-TD128 review where we stated: “The Detonator team is definitely under more scrutiny to deliver however. After the controversy surrounding 3DMark, NVIDIA’s driver team will be under more pressure than ever to provide high performance along with the stability and compatibility NVIDIA’s drivers are traditionally known for. It won’t be an easy task, but NVIDIA’s engineers have definitely delivered hardware that is superior to what they had at the beginning of this year. It will now be up to the Detonator team to expose all of NV35’s potential power.” Sounds like a subtle statement, doesn’t it? But the key phrase is “expose all of NV35’s potential power”.
NVIDIA’s Detonator 50 drivers are about more than just a new compiler and scheduler, which solely benefit GeForce FX users. NVIDIA has also incorporated a few new additions that will benefit all NVIDIA card owners. The most obvious changes are the new user interface for the control panel and nView 3.0.
Updated user interface
First, the new interface isn’t as dramatic a change as the one between Detonator 30 and Detonator 40; rather, NVIDIA has modified the look of some of its menus in Detonator 50, as well as adding new settings such as the “image sharpening” option under the color correction menu for GeForce FX users.
The most talked about change is the updated “Performance and Quality Settings” menu. NVIDIA maintains the generic “High Performance”, “Performance”, and “Quality” settings, which will use predefined settings for aspects like texture quality (unfortunately, end users still aren’t able to manually toggle these functions like you can with ATI hardware) with the default setting being “Quality”.
One new feature, however, is the “Application-controlled” setting for anti-aliasing and anisotropic filtering. In theory, this setting leaves both of these functions in the hands of the particular application you’re running. So if you have anisotropic filtering enabled in say, Unreal Tournament 2003, you’ll get anisotropic filtering in UT 2003. Unfortunately, this isn’t the case: as we found in our recent image quality article, Detonator 52.16 still uses its pseudo-trilinear filtering in this app, and, it appears, in all Direct3D applications.
This is particularly disappointing as NVIDIA has stated on multiple occasions that they would be implementing an application mode into Detonator 50 to address these concerns. Well, they put the button for application mode in there, but apparently it’s nothing more than a mask in its current form, leading many to openly joke about the irony of the new “ForceWare” label, as the new driver literally forces NVIDIA’s pseudo-trilinear mode upon its users.
An application setting will become even more important in the future, as upcoming titles shipping this year and into 2004 (including OpenGL titles such as DOOM 3) will offer built-in settings for anisotropic filtering and anti-aliasing. Slowly but surely, manipulating these settings will become more prevalent in the games themselves, rather than in the driver’s control panel. Therefore, getting this feature implemented needs to be a larger priority than it has been to date for NVIDIA.
SIDEBAR: Detonator 50 Release Notes
One of the key changes to nView 3.0 is the new gridlines feature, which lets you control your monitor’s screen real estate more effectively by dividing the display into separate regions.
The size of the grids is totally up to the end user, as they’re defined entirely with the mouse, so you could make two thinner grids at the top of your monitor, and two larger ones at the bottom (or vice versa) or if you’d rather just split your monitor in half you can do that too.
NVIDIA documentation refers to them as sub-monitors, which effectively describes what the grid lines do. You can use them to quickly reposition applications, which is helpful when multitasking; this is accomplished by sending windows to a grid. Another added benefit is that you can quickly resize windows, as the area of the grids is defined when you initially set up the gridlines.
NVIDIA’s nView 3.0 documentation also describes a new pop-up blocker for Internet Explorer users (perfect for those of you like me who like to hit up ESPN.com on a daily basis), but unfortunately we couldn’t find it.
We’ve had a few weeks to play with the new 52.16 drivers, and in our experience, they’re pretty reliable across a broad range of software applications and hardware. That doesn’t mean that they’re perfect though. One FS reader, Bjorn Larsen, sent in a few screenshots of Splinter Cell that show lighting issues in the game (his screenshots were taken with an MSI GeForce FX 5900-TD128 card):
We haven’t run into this with any of our custom Splinter Cell demos, so our test results are unaffected, but it was still something we wanted to report on. We’ve also seen reports of problems with NVIDIA’s new nView software where DualView and/or Windows desktop settings aren’t saved after the system is rebooted.
One issue that was somewhat alarming to us, however, was our inability to get our reference GeForce FX 5200 Ultra card to work properly with ForceWare 52.16. For whatever reason, the GeForce FX 5200 Ultra’s fan does not operate when 52.16 is installed on our testbed system. As a result, after a few minutes of gaming, the card would begin to overheat, frame rates would quickly taper off or the system would behave erratically, and eventually the testbed would either crash or reboot itself. A quick re-install of Detonator 45.23 would resolve this issue, so we’re convinced that the driver is the culprit.
If you recall our GeForce FX 5800 Ultra preview, we ran into similar issues with beta Detonator drivers, where the card would occasionally underclock itself after overheating (particularly in certain software applications) although in that case the FX Flow cooling system was always operating properly and the system would never crash.
This issue is particularly troubling to us because you could potentially damage your graphics card. In NVIDIA’s defense, it’s possible that this may be an isolated case that just affects our reference card, as we haven’t seen this reported anywhere else anecdotally or in online forums. We informed NVIDIA of this issue late last week but haven’t received an official response.
SIDEBAR: ForceWare 52.16 still has the overclocking button first introduced with Detonator 40, just use the Coolbits registry hack to enable overclocking.
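For readers unfamiliar with the Coolbits hack, it is a one-line registry change. The fragment below shows the commonly cited key and value for Detonator/ForceWare-era drivers; as always with registry edits, back up first, and note this is a community tweak rather than an officially documented NVIDIA setting:

```
Windows Registry Editor Version 5.00

; Unlocks the hidden clock frequency (overclocking) tab
; in the NVIDIA display properties control panel.
[HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\NVTweak]
"CoolBits"=dword:00000003
```

Save the text as a .reg file and double-click it to merge, then reopen the display properties to find the new clock frequency page.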
To see what kind of performance enhancements NVIDIA’s ForceWare 52.16 drivers bring, we loaded up ShaderMark 2.0 with anti-detect mode enabled. As you can see, shader performance has increased across the board with the new drivers, substantially so in some cases.
Nascar 2003: OpenGL (for NV cards)
Regrettably, we saw a performance dropoff with the GeForce FX 5900 Ultra/ForceWare 52.16 combination in NASCAR 2003. Fortunately, this was isolated to 800x600; if you own a GeForce FX 5900 Ultra card, chances are you’re going to be playing NASCAR Racing 2003 at 1024x768 or 1280x1024 with the high quality settings we used, so this isn’t a huge issue. The GeForce2 Ti card actually saw a slight performance boost at 800x600 and 1024x768.
IL-2 Sturmovik: FB: OpenGL
Like NASCAR Racing 2003, we saw a slight performance drop in IL-2: Forgotten Battles with the GeForce FX 5900 Ultra card, although this time we saw it persist at the higher resolutions as well. This is a little more worrisome, as the higher resolutions are the settings you’re more likely to use if you own one of these cards. Keep in mind, however, that we haven’t enabled AA or AF, as we did in NASCAR 2003, which will account for some of the difference. The GeForce4 Ti 4600 also sees a slight performance decrease at 800x600.
Quake III - OpenGL
Unlike the previous games, we see some nice performance increases in Quake 3, with the gains coming to the GeForce4 and GeForce FX 5600 and FX 5900 cards. The star next to the GeForce4 MX and GeForce2 denotes that the boards aren’t running with 8xAF, a setting which isn’t supported by older NVIDIA hardware.
Unreal Tournament 2003 – Direct3D
Like Quake 3, we see some slight performance improvements in UT 2003 with 4x anti-aliasing enabled. In this case however, the GeForce FX 5200/5200 Ultra take the place of the GeForce4 cards, which don’t see any gains beyond 800x600.
Unreal Tournament 2003
With AF enabled in addition to 4xAA, we’re still seeing gains, particularly with the GeForce FX 5200 and GeForce FX 5600 families; the margins are also roughly the same.
Splinter Cell – Direct3D
Interestingly enough, we see gains with the GeForce4 DX8 cards, but not the newer GeForce FX cards in Splinter Cell. The boost ranges anywhere from 4-7% depending on the GeForce4 card tested and the resolution used. It’s always nice to see gains in hardware that’s nearly two years old.
Tomb Raider – Direct3D
As was hinted at in our ShaderMark results, the new ForceWare 52.16 drivers bring some significant performance improvements to DX9 applications like Tomb Raider. We’re looking at double-digit gains here folks! The GeForce FX 5900 Ultra’s performance improves by the mid teens, while the GeForce FX 5600 Ultra’s gains land roughly in the high teens. Meanwhile, the GeForce FX 5600 sees gains of up to 20%.
Tomb Raider – Direct3D
Depth-of-field is one setting in Tomb Raider that comes with a huge performance hit, which is why we’ve previously disabled it in our testing, even though it’s enabled by default. We decided to include test results with depth-of-field enabled for this article to see if the performance hit has been reduced. Thankfully it has, although only slightly.
In order to examine the performance benefits of the new ForceWare drivers in DX9 applications, we decided to relax our benchmarking standards a bit and include benchmark results with Halo. In a normal review, we wouldn’t include Halo results, as its benchmark mode is based entirely on letterbox cut scenes taken from various introduction sequences between chapters within the game. This can be a bit misleading, as the demos don’t always consist of actual gameplay. And since the game has no built-in demo recording functionality (yet), the only other way to obtain results is to run through a given level and record framerate with FRAPS. This has the drawback that you’ll never run through the level the same way twice, and is complicated even further if you have any sequences with combat – what if an enemy lobs a grenade at you in one run, but turns and runs (something the AI will do in Halo) in another? Your frame rate will vary even more drastically. And of course, don’t forget that pesky AI, which is always quietly lurking in the background, as well as any physics calculations that may be necessary should one of your foes take a leap due to a well-placed grenade.
Fortunately, Gearbox will be releasing a patch that will enable demo recording for multiplayer games. This will be great when it arrives, but it isn’t here yet. We’ve been patiently waiting for this patch, but we decided to make an exception for this article. If you don’t trust the Halo results because they’re based on cut scenes which don’t always reflect real gameplay, or because the cut scenes themselves could be highly susceptible to questionable driver optimizations similar to the ones seen with 3DMark 03 earlier this year, feel free to click over to the conclusion. Now that we’ve got the disclaimer out of the way, let’s take a look at the results.
Halo – Direct3D
One of the cool features of Halo is that it not only gives you the average frame rate, it can also give you the percentage of time spent at a particular frame rate, and the percentage of frames at a given frame rate. We’ve decided to take the results from three sample points: below 60 frames per second, below 30 frames per second, and below 15 frames per second. Unlike traditional graphs however, lower scores are better, as the less time you spend below a given threshold (such as 15 frames per second), the higher your overall performance.
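The time-below-threshold metric is simple to reason about; the sketch below shows a hypothetical calculation of it (our own illustration, not Halo's actual code), given a list of per-frame fps readings:

```python
def pct_below(frame_rates, threshold):
    """Percentage of sampled frames whose instantaneous frame rate
    falls below the given fps threshold (lower is better)."""
    below = sum(1 for fps in frame_rates if fps < threshold)
    return 100.0 * below / len(frame_rates)

# Hypothetical sample of per-frame fps readings from a benchmark run
sample = [72, 65, 58, 45, 28, 14, 90, 33, 12, 61]
for t in (60, 30, 15):
    print(f"below {t} fps: {pct_below(sample, t):.0f}%")
```

A card that spends 1% of its frames below 15 fps feels far smoother in practice than one that spends 10% there, even if both report similar averages, which is why this breakdown is more informative than a single mean frame rate.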
The GeForce FX 5900 Ultra’s Halo performance is enhanced by 25% at 1024x768, while the GeForce FX 5600 Ultra’s nearly doubles! We see performance gains of 1.5 times for the GeForce FX 5200 and GeForce FX 5600 as well.
The margins at 1280x1024 remain the same across all boards, which is a good thing because we saw some pretty drastic performance improvements. The figures that really stand out are the dramatic reductions below 15 fps: the GeForce FX 5900 Ultra goes from 10% of its frames below that magic number to 1%. The GeForce FX 5600 Ultra sees an improvement of 3X in that category (although keep in mind that we’re still looking at a 16 fps average). The GeForce FX 5600 and especially the FX 5200 are too taxed to see the gains of the 5900 Ultra and 5600 Ultra; 1280x1024 is just too high of a resolution for these cards.
DirectX 7 card owners
Life’s been tough for you guys recently, as more and more DX8, and now DX9, titles are shipping. We figure you’re still holding out because the titles you’re playing run fine with your current hardware. That’s a good argument when DX8-level cards are expensive, but with GeForce4 Ti 4200s selling for under $100 online at some vendors, it’s getting harder each day to live with “okay” graphics, especially with titles such as Deus Ex: Invisible War right around the corner. And as you saw in this driver report, other than a mild speed boost in NASCAR, this driver isn’t going to help your performance.
DirectX 8 card owners
We saw some tangible performance gains with the GeForce4 Ti 4200 and GeForce4 Ti 4600, which was pretty surprising considering NVIDIA has essentially had two years to tweak their drivers for these cards. The performance improvements aren’t as substantial as some of the ones you’ve seen from previous driver releases, but they’re not something to quickly dismiss either; Splinter Cell and Quake 3 both saw some nice gains.
DirectX 9 card owners
Based on the performance improvements we saw in DX9 applications such as Tomb Raider, Halo, and ShaderMark, we’d have to say that these drivers are a must have upgrade. 30% performance boosts are nothing to scoff at after all.
|© Copyright 2003 FS Media, Inc.|