Summary: Brandon is covering ATI Shader Day in Seattle, and brings you the results from Valve's own internal testing of Half-Life 2. See which comes out on top, the NVIDIA GeForce FX cards or ATI's Radeons.
Shader Day is Upon Us
We're here in Seattle covering the Shader Day event, organized by ATI to show off version 2.0 pixel and vertex shaders. Valve has just finished its slide presentation on the topic and has allowed us to publish, in this article, graphs from its own internal testing of recent ATI and NVIDIA hardware running Half-Life 2. Some of the text you see in this article is taken directly from Valve's PowerPoint slides.
Note: All references to the Radeon 9600 are actually to the PRO version; some of the graphs and text leave out the PRO designation, but that is the card that was tested in the benchmarks.
As you can see from the chart above, the GeForce FX cards lag behind significantly in FPS performance.
Half-Life 2 Meets DX9
Using the new features of DirectX 9, Half-Life 2 will sport High Dynamic Range (HDR) lighting, bump-mapped characters, softer shadows and advanced full-screen anti-aliasing. The engine will use the processing power of current hardware more efficiently, keeping frame rates on par with DirectX 8 but with the added goodness of more features.
The Half-Life 2 DX9 Demo will feature all the advanced features above and videos of it will be available for download or via Steam.
Paying the Price
There are many issues that arise when performing benchmarks, such as visual quality tradeoffs (i.e. eye candy maximized), screen-grab-specific image rendering, lower rendering precision, benchmark-specific drivers that never see the light of day, and unstable optimizations. Given that consumers have purchased hardware based on such benchmarks in the past, and that benchmarks today can be engineered in these ways, the actual frame rates on Joe's home computer are not necessarily going to match the chart he just read in a popular magazine (or website).
Mixed mode for NV3X?
What is mixed mode? Well, Valve has reworked the Half-Life 2 engine to help the performance of the GeForce FX cards by trading texture fetches for pixel shader instruction count, and by using partial-precision registers where the situation calls for it. Doing so helped pull up frame rates on NVIDIA-based cards.
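The partial-precision half of that tradeoff boils down to storing intermediate shader values in 16-bit floats instead of 32-bit ones where the visual result tolerates it. As a rough illustration of how much precision is being given up (this is plain Python, not actual shader code, and the sample value is our own):

```python
import struct

def to_half_and_back(x: float) -> float:
    """Round-trip a float through a 16-bit half-precision value,
    mimicking the rounding that partial-precision (fp16) shader
    registers apply compared to full 32-bit floats."""
    return struct.unpack("<e", struct.pack("<e", x))[0]

# A lighting-style fractional value: half precision keeps only
# about 3 decimal digits, which is often invisible in a final
# pixel color but too coarse for other calculations.
full = 0.123456789
half = to_half_and_back(full)
print(f"full precision: {full:.9f}")
print(f"half precision: {half:.9f}")
```

The error here is on the order of one part in a few thousand, which hints at why this optimization has to be applied shader by shader rather than engine-wide.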
So how much has Valve done to please its NVIDIA-based video-card-owning customers? It has spent five times as long optimizing the NV3X path as it has the DX9 path. Valve themselves were alarmed at the performance difference and went further, saying that ATI hardware did not need such specific optimizations.
The easy thing for Valve to do (and a way to save lots of time) is to treat NV3X as DX8 hardware, meaning it's up to us gamers to turn on DX9 for the game ourselves. There is an upside: running the DX8 path on a GeForce FX 5200/5600 yields playable frame rates. The downside of maintaining a mixed-mode equivalent (two different optimization paths) for any single title is that future developers using this engine will have to tailor their code to both paths, which requires more budget than some studios have.
GeForceFX on DX 8 and 9
This is an early glimpse of Half-Life 2 performance from Valve's perspective. We will be running our own benchmarks with Half-Life 2, and we'll post the results for you on Friday.
This is NVIDIA's official statement: "The Optimizations for Half-Life 2 shaders are in the 50 series of drivers which we made available to reviewers on Monday [Sept. 8, 2003]. Any Half-Life 2 comparison based on the 45 series driver are invalid. NVIDIA 50 series of drivers will be available well before the release of Half-Life 2".
SIDEBAR: What do you think of the results? Do you think you need to jump ship from NVIDIA and get a Radeon to get the full effects of this game? Excited about HL2 in general? Let us know!
© Copyright 2003 FS Media, Inc.