Summary: On the final day of Shader Day we were able to run some hands-on benchmarks of our own. In today's article we bring you performance numbers for GeForce FX 5600, 5600 Ultra, and 5900 Ultra as well as the RADEON 9800 PRO and 9600 PRO. In addition, we tested in DX8.1 mode, DX9 mode, and the mixed mode Valve created specifically for NVIDIA hardware. See how these cards perform with and without anti-aliasing as well as speculation on what kind of system you'll need to run this game properly!
In the first part of our Half-Life 2 Performance Preview, we brought you Half-Life 2 numbers run by Valve for their presentation at ATI Shader Day. Needless to say, these benchmark results caused quite a bit of controversy online.
Half-Life 2 is among the first crop of games to take advantage of the 2.0 pixel and vertex shaders first introduced with DirectX 9, and due to the success of its predecessor, is arguably the most anticipated game (of any genre) for 2003. And based on what we’ve seen so far, Half-Life 2 is living up to all the hype surrounding it. But there was one task remaining for Shader Day attendees: hands-on benchmarking with Half-Life 2!
The following are our test results with Build 5 of Half-Life 2. Keep in mind that this is a game that’s still unreleased, so the numbers we’re presenting will likely be a little different with the final game. For instance, bugs were still present with anti-aliasing enabled on both NVIDIA and ATI hardware, as we were told that Valve’s AA fix had not yet been implemented in our build of the game. Also, the high dynamic range (HDR) lighting present in the DX9 video from Friday wasn’t implemented in Build 5. For now, this feature will be unique to ATI’s DirectX 9 hardware, as GeForce FX cards running in DX9 mode already perform very poorly in full-precision mode even without HDR.
If the performance is there for RADEON users, HDR could be a real selling point for ATI, as it’s definitely one of those features that you’re not going to want to turn off once you’ve seen it in action. Remember the first time you saw the sun in Gran Turismo 3? To the uninitiated, HDR is similar to that effect, only about twice as striking. In addition, the light bounces off of shiny and reflective surfaces. When you combine this with the Half-Life 2 water (the most accurate representation of water we’ve seen in a game to date), can you imagine how good HDR would look at sunset over a large body of water? HDR will also be used for effects like muzzle flashes.
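For a flavor of what HDR rendering involves: the renderer computes scene luminance values well above the displayable 1.0, then compresses them back into range, which is why bright sources like the sun or a muzzle flash can bloom convincingly. The sketch below uses the classic Reinhard tone-mapping curve purely as an illustration; the article doesn’t describe Valve’s actual implementation, and Source may well use a different operator.

```python
# Illustrative HDR tone mapping: compress high-dynamic-range luminance into [0, 1).
# This is the simple Reinhard operator -- an assumption for illustration,
# NOT necessarily what the Source engine uses.

def reinhard(luminance):
    """Map an HDR luminance value (0..inf) to a displayable value in [0, 1)."""
    return luminance / (1.0 + luminance)

# Example luminances: shadow, diffuse wall, bright sky, direct sun glint
for L in (0.5, 1.0, 10.0, 100.0):
    print(f"{L:6.1f} -> {reinhard(L):.3f}")
```

The key property is that no input, however bright, exceeds 1.0, yet very bright values stay distinguishable from merely bright ones instead of all clipping to white.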
All this eye candy comes at a price, however. For maximum fidelity, Half-Life 2 is going to require cutting-edge hardware, and we’re not just talking graphics cards here, folks. You’re also going to need a fast CPU and gobs of system memory. The test machine in Valve’s numbers from last week was outfitted with a gigabyte of RAM, and the gaming rigs we ran our tests on were likewise equipped with 1GB of memory. Based on what we’ve been told, this was no coincidence. Serious gamers are going to want at least 512MB of RAM for optimum performance, while the hardcore will insist on a gigabyte.
Half-Life 2 also craves CPU performance. In many cases, we were CPU-limited with an ATI RADEON 9800 PRO and a 2.8GHz Pentium 4 (800MHz FSB) in our testing! And in case you were wondering, Half-Life 2 doesn’t take advantage of Intel’s Hyper-Threading technology nor AMD’s 64-bit extensions in Athlon 64/Opteron. The jury is still out on which CPU architecture it prefers.
SIDEBAR: The shadows in Half-Life 2’s DX9 mode are soft shadows, whereas the DX8 shadows have hard edges.
Our testing methods
Due to time constraints, ATI limited reviewers to 90-minute benchmarking sessions with Half-Life 2. Each benchmarking run at a given resolution consisted of three separate demos: techdemo, bugbait, and e3_c17_02, so one run could take 20 minutes or more on slower hardware. Because of this, we teamed up with Dave Baumann of Beyond3D, Lars of Tom’s Hardware, and Bob Colayco of GameSpot, pooling all of our numbers together so we could bring you results at multiple resolutions, in multiple rendering modes, and with eye candy such as AA enabled.
Even so, time was still a huge limiting factor, despite the fact that we were running up to four test systems simultaneously. The flat panel displays used on the test systems didn’t support 1600x1200, so 1280x1024 was the maximum resolution we could test at. The RADEON 9800 PRO and GeForce FX 5900 Ultra were running in Dell XPS rigs equipped with Intel’s 800MHz FSB 2.8GHz Pentium 4 and 1GB of DDR SDRAM. The RADEON 9600 PRO, GeForce FX 5600 Ultra (donated by Lars for testing), and GeForce FX 5600 were also running on 2.8GHz systems, but we’re not certain whether the FSB was 800MHz or 533MHz. The ATI systems were using CATALYST 3.7, while the GeForce FX cards utilized Detonator 45.23.
The demos themselves came from Valve’s E3 presentation, so if you’ve seen the video from May, you know what we’re talking about. The first demo, techdemo, starts off towards the beginning of the E3 presentation, inside the cave with the G-Man and the beautiful pixel-shaded water. Bump maps are everywhere; the walls of the cave are composed entirely of them, and you can also see 2.0 pixel shaders in action with the fire.
In our testing, we noticed that the techdemo wasn’t quite as CPU-bound as some of the other demos, which is why we’re able to see a larger gap (30%) between the RADEON 9800 PRO and RADEON 9600 PRO in this demo.
Due to the poor performance of GeForce FX cards running in full-precision DX9 mode, Valve had to create a half-precision mode for these cards, denoted as “FX Mode” in these graphs. As you can see in the graph above, the GeForce FX 5900 Ultra’s performance improved by 25% in FX mode at 1024x768, but it still falls short of the RADEON 9600 PRO running in full-precision DX9. The GeForce FX 5600 Ultra doesn’t come anywhere close to these cards and, interestingly enough, doesn’t offer a substantial improvement over the regular GeForce FX 5600. Only when running in DX8.1 mode does the GeForce FX 5900 Ultra overtake the RADEON 9600 PRO, and even then it still falls short of the RADEON 9800 PRO running in full precision. This was quite a shock to everyone in attendance.
The RADEON 9800 PRO and 9600 PRO take huge hits with 4xAA enabled, nearly 50% at 1024x768. We were told that the 256MB RADEON 9800 PRO performs considerably better than the 128MB card in this scenario, but we didn’t have a card on hand to confirm this. Keep in mind that these numbers weren’t taken with the final AA fix in place so performance could be better in the retail game.
As we mentioned earlier, bugbait is more indicative of actual game performance than techdemo, as it takes place in a real level with combat scenarios similar to what you’d see in the game. As such, we’re very CPU-limited on the RADEON 9800 PRO at 800x600 (and even 1024x768); we recorded a frame rate of 68.7 fps at 640x480. Once again, the GeForce FX cards perform best in DX8.1 mode, and at 1280x1024 we actually see the GeForce FX 5900 Ultra overtake the RADEON 9600 PRO in DX9 mode.
The RADEON cards don’t take the incredible performance hit in bugbait that they did in techdemo; the RADEON 9800 PRO stays just above 60 fps at 1024x768 with 4xAA enabled. The GeForce FX 5900 Ultra, however, takes a massive slide at 1280x1024, where performance drops by half.
The e3_c17_02 demo is even more CPU-bound than bugbait or even techdemo; the RADEON 9800 PRO is CPU-limited all the way up to 1280x1024! Clearly, you’re going to want a 3.0GHz+ Pentium 4 or its AMD equivalent for optimum performance, even if you’re running one of the mainstream cards such as the RADEON 9600 PRO.
Check out the 1024x768 numbers: because we’re CPU-limited, AA is practically free on the RADEON 9800 PRO in this demo. It will be interesting to see how things pan out with the final version of the game; we’re also wondering how many outdoor levels Half-Life 2 will contain. City 17 clearly wants the fastest processors money can buy.
Based on what we’ve seen so far, there are two topics worthy of considerable discussion. Number one (and definitely most important): what happened to the GeForce FX cards? After all, it isn’t every day that you see a $500 graphics card outperformed by a $200 card. Several hypotheses were floated about. One was that the RADEON 9800 PRO is capable of performing many more floating-point operations per clock than the GeForce FX 5900 Ultra, thanks to its eight-pixel-pipeline architecture. With each pixel pipeline capable of performing up to five floating-point operations per clock, the RADEON 9800 PRO can execute up to 40 floating-point operations per clock cycle.
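The pipeline arithmetic above can be sketched as a quick back-of-the-envelope calculation. Note that only the RADEON’s 8 pipelines at up to 5 FP ops each comes from the discussion here; the 380MHz core clock we plug in is our own assumption for illustration, not a figure from Valve’s presentation.

```python
# Back-of-the-envelope peak shader throughput from the pipeline theory above.
# The 8 pipelines x 5 FP ops/clock figure is from the article; the core clock
# is an assumed value for illustration only.

def peak_fp_ops(pipelines, fp_ops_per_pipe_per_clock, core_clock_mhz):
    """Return (FP ops per clock, theoretical peak FP ops per second)."""
    per_clock = pipelines * fp_ops_per_pipe_per_clock
    return per_clock, per_clock * core_clock_mhz * 1e6

# RADEON 9800 PRO: 8 pipelines, up to 5 FP ops each per clock
per_clock, per_second = peak_fp_ops(8, 5, 380)  # 380MHz core clock assumed
print(per_clock)           # 40 FP ops per clock cycle
print(per_second / 1e9)    # ~15.2 billion FP ops/sec at the assumed clock
```

The point of the exercise is that per-clock width, not just clock speed, sets the ceiling: a wider design at a lower clock can still come out ahead in raw shader arithmetic.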
Another theory is that the RADEON 9800 PRO has more register space in its functional units than the GeForce FX 5900 Ultra. While the RADEON 9800 PRO scales well with increased register use, the number of instructions the GeForce FX 5900 Ultra executes per cycle falls off more dramatically. This could explain why Valve cut the precision in half for the mixed mode unique to NVIDIA cards (“FX Mode” in the graphs). When ATI presented this theory to us the night before Shader Day began, our immediate question was, “So then where did all the transistors go?” NVIDIA has kept the specific details of its GeForce FX architecture under tight wraps, so we may never know the answer to this one.
Of course, NVIDIA’s argument is that its Detonator 50 series drivers are more valid for performance comparisons than 45.23 (the driver we used for testing), as development on the 45 series ended months ago. NVIDIA has added optimizations for Half-Life 2 and other new game titles in Detonator 50 that aren’t present in the 45.23 driver we tested with.
Whatever the case, the situation doesn’t look good for NVIDIA right now. According to Valve, ATI’s RADEON cards worked right out of the box with Half-Life 2. In fact, it was even suggested that ATI drivers dating all the way back to the RADEON 9700 PRO launch could work (albeit with limited success) with Half-Life 2. In contrast, when Valve first got their hands on DX9 hardware from NVIDIA earlier this year it didn’t work properly with Half-Life 2. It was at that point that Valve first became concerned with NVIDIA hardware. Ultimately Valve chose to endorse the RADEON 9800 as the card of choice for Half-Life 2:
“When we reviewed all the platforms available for use in the public unveiling of HL2, ATI’s RADEON 9800 PRO was unquestionably the choice for showing Half-Life 2 at its best.” – Gabe Newell, Valve Software
The crux of the problem for NVIDIA is that the NV3x architecture requires specific optimization for optimal performance in the next generation of games. DOOM 3 is another example of an upcoming title that requires optimizations custom-tailored to NVIDIA hardware for best performance. These optimizations aren’t engine-specific (which would mean applying them once to an engine so that all games based on that engine benefit); rather, they’re unique to each specific game. What happens if a development house is too small to optimize specifically for NVIDIA hardware? Based on all current indications, the game will perform poorly.
If you recall the old 3dfx Glide days, this is exactly what happened early on to Epic with Unreal, whose performance under other APIs was significantly slower. That problem was eventually resolved, but ultimately the market moved in the direction of more open APIs like Direct3D and OpenGL. In fact, NVIDIA made quite a name for itself thanks to its balanced blend of blazing performance in both OpenGL and Direct3D titles, something no other manufacturer could offer. NVIDIA has deviated quite a bit from the specs Microsoft laid out for DirectX 9, which could be costing it now.
SIDEBAR: My flight home was cancelled on Thursday, and then my flight on Friday was overbooked. By the time I got back it had been quite an ordeal.
The second factor to keep in mind is the significant demand on the CPU and memory subsystems of your PC. Half-Life 2 will press both of these system components like no other game before it. And regrettably, we have no data at this time on which processor Half-Life 2 prefers. Will Valve’s latest creation prefer Intel’s Pentium 4 architecture, or the floating-point prowess present in AMD’s Athlon XP/Athlon 64? Unfortunately, we won’t know the answer to that question until the end of the month. If one processor offers a clear performance advantage over the other (a la Quake 3 and the Pentium 4) gamers will flock towards it, just as ATI is currently getting all the kudos on the video side.
In addition, those of you with software audio solutions or sound cards that lack dedicated hardware acceleration for DirectSound streams may want to upgrade your audio subsystem for Half-Life 2, as you may not want to give up precious CPU cycles to audio processing. Aureal owners witnessed this firsthand with the original Half-Life.
As far as memory is concerned, we don’t have a lot of solid evidence just yet, as we didn’t have a chance to test with a 512MB 2.8GHz Pentium 4 system. On more than one occasion, however, ATI representatives commented on Half-Life 2’s extreme memory usage. Load times on the 1GB systems we tested with weren’t too bad, but they weren’t lightning fast either. Since ATI isn’t exactly in the business of selling memory, we’ll take them at their word when they say that you’re going to want lots of memory for the best gaming experience with Half-Life 2.
Overall, we continue to be impressed with what we’ve seen so far from Half-Life 2. We were already impressed after the E3 demonstration, and the addition of HDR gives the game an extra level of flair; it now looks even better. The only question is, how much of a performance hit does it bring?
NVIDIA is currently in a really tough position. Right now, it is significantly behind ATI in performance when running in full-precision DX9 mode, yet the customized FX mode offers neither the full precision of DX9 mode nor HDR. Will the upcoming Detonator 50 release be able to catch up with ATI? The pressure is on NVIDIA’s software team to accomplish an enormous task, as Half-Life 2 has thousands of shaders to optimize for. Most likely a smaller number need to be dealt with immediately, while the rest can wait for follow-up driver releases, but this task applies not only to Half-Life 2 but to any other upcoming DX9 game in the pipeline (Halo, which just went gold, is one such title), even if it’s based on Half-Life 2’s Source engine.
Fortunately, the wait for Half-Life 2 is almost over. This is the first title we’ve seen in quite a long time that we can see selling lots of hardware. We’re not just talking graphics cards here either. Half-Life 2 will push every component within your PC no matter how cutting edge it is. Make no mistake about it, Valve will soon own us all.
SIDEBAR: Are you as excited about Half-Life 2 as we are, or do you not believe the hype? Voice your thoughts about this upcoming shooter in the news comments!
|© Copyright 2003 FS Media, Inc.|