Texturing with UT2K3
One area we’d like to address concerns Unreal Tournament 2003. The current controversy started shortly after we posted our Unreal Tournament performance article last month.
Dave Baumann of Beyond3D discovered that GeForce FX cards render with a form of quasi-trilinear filtering, even with the driver running in quality mode. This matters because mixing bilinear and trilinear filtering improves performance. Given the significance of Unreal Tournament 2003 not just as a game, but as a performance benchmark used by numerous publications, quite a few people became concerned about the legitimacy of testing results published on this website, as well as countless others.
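To make the performance argument concrete, here is a simplified, purely illustrative C++ sketch of the difference (the function names and the 0.25 blend band are our own, not NVIDIA's actual hardware logic). Full trilinear filtering performs two bilinear fetches, eight texels in total, for every pixel and blends them; a bilinear/trilinear mix only pays that cost in a narrow band around each mipmap transition.

```cpp
#include <cmath>

struct Color { float r, g, b; };

// Stand-in for a real 4-texel bilinear fetch from one mip level; here it
// simply returns a flat colour per level so the sketch stays self-contained.
Color SampleBilinear(int mipLevel, float /*u*/, float /*v*/)
{
    static const Color levelColors[] = {
        {1, 0, 0}, {0, 1, 0}, {0, 0, 1}, {1, 1, 0}, {1, 0, 1}
    };
    if (mipLevel < 0) mipLevel = 0;
    if (mipLevel > 4) mipLevel = 4;
    return levelColors[mipLevel];
}

Color Lerp(const Color& a, const Color& b, float t)
{
    return { a.r + (b.r - a.r) * t,
             a.g + (b.g - a.g) * t,
             a.b + (b.b - a.b) * t };
}

// Full trilinear: two bilinear fetches (8 texels) blended by the fractional
// level of detail, for every single pixel.
Color SampleTrilinear(float lod, float u, float v)
{
    int   level = static_cast<int>(std::floor(lod));
    float frac  = lod - static_cast<float>(level);
    return Lerp(SampleBilinear(level, u, v),
                SampleBilinear(level + 1, u, v), frac);
}

// Quasi-trilinear mix: blend only inside a narrow band near each mip
// transition; everywhere else a single 4-texel bilinear fetch suffices.
// The 0.25 band width is purely illustrative.
Color SampleMixed(float lod, float u, float v)
{
    const float band = 0.25f;
    int   level = static_cast<int>(std::floor(lod));
    float frac  = lod - static_cast<float>(level);

    if (frac < 1.0f - band)
        return SampleBilinear(level, u, v);   // cheap path, most pixels

    float t = (frac - (1.0f - band)) / band;  // remap the band to 0..1
    return Lerp(SampleBilinear(level, u, v),
                SampleBilinear(level + 1, u, v), t);
}
```

Since most screen pixels fall outside the transition bands, the mixed mode spends the bulk of its time on the cheaper four-texel path, which is where the performance gain comes from.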
To get to the heart of the matter, we loaded up Unreal Tournament 2003 and went straight to the map our custom demo (T2) we use for all of our testing is based on, DM-Insidious. ATI’s RADEON 9800 PRO with CATALYST 3.6 was used to represent the ATI platform, while the ASUS V9950 Ultra with Detonator 44.71 (the driver that shipped with the video card) was used to represent GeForce FX.
First, we cranked up the anisotropic filtering setting on both cards to 8x via their respective control panels (both cards were running in their quality modes). This is the method we use to adjust image quality for all of our performance testing in Unreal Tournament 2003. Let’s look at the RADEON 9800 PRO first:
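As a quick aside before the screenshots, here is what that 8x setting corresponds to at the API level. The sketch below shows how an application could request trilinear filtering plus up to 8x anisotropy itself in OpenGL; the control panel override simply forces an equivalent setting from the driver's side. This is our own illustration, not how UT2003 or either driver actually implements it.

```cpp
#include <GL/gl.h>

// Enums from GL_EXT_texture_filter_anisotropic; older gl.h headers may
// not define them, so provide the standard values if needed.
#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT     0x84FE
#define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT 0x84FF
#endif

// Request trilinear filtering plus up to 8x anisotropy on the currently
// bound 2D texture. Assumes a valid GL context and that the extension
// string has already been checked for GL_EXT_texture_filter_anisotropic.
void RequestQualityFiltering()
{
    // True trilinear: bilinear within a level, linear blend between levels.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // Clamp our 8x request to whatever the hardware actually supports.
    GLfloat maxAniso = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxAniso);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                    maxAniso < 8.0f ? maxAniso : 8.0f);
}
```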
As you can see, neither card is truly using trilinear filtering. The transitions between mipmap levels are harsh. If trilinear filtering were in use, the colors (each color represents a mipmap level) would blend together more naturally.
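For reference, colored-mipmap shots like these work by replacing every level of a texture's mip chain with a distinct flat color, so whatever level (or blend of levels) the hardware samples becomes directly visible on screen. Below is a minimal OpenGL sketch of the idea; the sizes and colors are our own illustration rather than UT2003's actual mechanism.

```cpp
#include <GL/gl.h>
#include <vector>

// Fill each mip level of the currently bound 2D texture with a different
// flat colour. Assumes a GL context is current, a texture object is bound,
// and the min filter is set to a mipmapped mode.
void UploadColoredMipChain()
{
    const unsigned char colors[][3] = {
        {255, 0, 0}, {0, 255, 0}, {0, 0, 255},
        {255, 255, 0}, {255, 0, 255}, {0, 255, 255},
        {255, 255, 255}, {128, 128, 128}, {0, 0, 0}
    };

    // A 256x256 base level down to 1x1 gives nine mip levels.
    for (int level = 0, size = 256; size >= 1; ++level, size /= 2)
    {
        const unsigned char* c = colors[level];
        std::vector<unsigned char> pixels(size * size * 3);
        for (int i = 0; i < size * size; ++i)
        {
            pixels[i * 3 + 0] = c[0];
            pixels[i * 3 + 1] = c[1];
            pixels[i * 3 + 2] = c[2];
        }
        glTexImage2D(GL_TEXTURE_2D, level, GL_RGB, size, size, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, pixels.data());
    }
}
```

With a chain like this, pure bilinear filtering shows hard bands of solid color, full trilinear shows smooth gradients between them, and a bilinear/trilinear mix shows narrow blended seams between otherwise solid bands.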
Normally, when lesser filtering methods are used, the lower quality mipmap levels stand out pretty dramatically, sometimes producing a visible shimmering effect. However, we can see here that with ATI’s and NVIDIA’s bilinear/trilinear mix, this isn’t the case. The higher detail textures closest to the end user blend fairly well with the textures directly adjacent to them, and the shimmering effect is not apparent during movement.
This is where the controversy begins. Both companies are giving end users a good-looking image (in our subjective opinion) that sacrifices some texture quality but yields better performance. NVIDIA did, however, tell reviewers that the Quality mode in its Detonator drivers would always deliver trilinear filtering, something we don’t see here. But wait, there’s an “Application” setting in 44.71! Let’s try running both cards in their respective application modes, starting with ATI:
As you can see, when application mode is forced on the ATI driver, it does what it’s told and renders an image with trilinear filtering in place. On the other hand, the NVIDIA card continues to use its quasi-trilinear mode, albeit with higher image quality.
This is where the controversy really heats up: in application mode the ATI card submits to the desires of the end user, rendering the image as the developer intended, while the GeForce FX always forces NVIDIA’s quasi-trilinear mode on its user (in NVIDIA’s defense, we’ve been told that they did have Epic’s approval for this optimization). For the consumer who wants all the eye candy turned on when he sets it in the driver, the GeForce FX currently has, for lack of a better word, selective memory, while ATI’s RADEON line gives the end user what he wants. With today’s $400+ video cards all offering more than enough performance for the latest games, eye candy is becoming a more important factor than ever.
Fortunately, there is light at the end of the tunnel for NVIDIA users: NVIDIA is currently hard at work on a driver that will resolve this issue. This provides little solace for image quality enthusiasts with GeForce FX cards today, but it does appear as if NVIDIA has seen the error of its ways.