BioShock Performance with Budget Cards
Our 3D testing with BioShock continued over the weekend. In part 1 we showed you how the game performed with a variety of high-end graphics cards. In today’s article we’re going to focus on more cost effective sub-$200 mainstream graphics cards.
The world within Rapture can at times be a very spooky place, and as gamers we naturally want to experience it all as the developer intended. Fortunately we can report that with today's latest mainstream cards, you can do just that without making too many graphical compromises. From what we can tell so far, Irrational has done a good job of scaling BioShock to run on older and/or slower hardware, though owners of Shader Model 2.0 (and older) graphics cards, which the game does not support, will understandably disagree.
If you happen to be in this camp, though, we're happy to report that you don't have to put a massive dent in your bank account to upgrade for BioShock. In this article we're going to take a look at the latest budget and mainstream cards from AMD and NVIDIA, ranging from the Radeon HD 2400 XT and GeForce 8400 GS all the way up to the $200 GeForce 8600 GTS. We also dusted off one of the most popular Shader Model 3.0 cards ever, the GeForce 6600 GT, along with its bigger brother, the GeForce 6800 GT, for good measure. This should give you a good idea of how the various cards stack up against one another in terms of in-game performance.
Before we get into that, however, we want to address some of the questions we've received regarding our comments on driver optimizations, starting with the AMD Radeon 2000 series.
Quite frankly, we feel that AMD is still suffering from the delays of R600. Remember that the Radeon HD 2900 XT wasn't readily available until late May at the earliest -- by which time Irrational's work on BioShock was largely done. Meanwhile, the GeForce 8800 has been available since late last year, and as a member of NVIDIA's The Way It's Meant to Be Played program, Irrational has had access to DX10 GeForce hardware for quite some time. In our 3D Performance with World in Conflict Beta article we discussed this program with Massive's Christian Seger, who had this to say about it and about the issues surrounding Radeon 2900 cards during the World in Conflict beta:
Nvidia's program is fantastic! We had great support from them starting as early as last year. We had support from their engineers, and most importantly, help with performance measurements and identifying bottlenecks in our use of the GPU. AMD helped us identify some problems with using their GPUs, and those will be addressed in a future patch, hopefully as close to release as possible. I don't think gamers should be concerned about anything, if you get the latest drivers for your graphics card and download the latest patches for the game.
According to Seger, NVIDIA's program not only seeds developers with hardware, but also helps them optimize performance. We'll assume this is accomplished through tools like NVIDIA's PerfHUD performance analysis utility.
We also know that NVIDIA worked with Irrational on effects such as BioShock's volumetric smoke and fog (including soft shadows), the DX10 soft particles, and the game's water effects.
The bottom line is that DX10 games like BioShock that are part of the NVIDIA program have been tuned and tested against GeForce hardware for some time. Game developers have only just started getting their hands on AMD's DX10 hardware, so that tuning is only now beginning. If you recall, Radeon HD 2900 cards got off to a slow start with Lost Planet under DX10 as well.
Once AMD and game developers have more time to work together, this issue should slowly resolve itself. As you saw in our previous BioShock benchmarks, AMD's Radeon X1950 already scales very well with the game. CrossFire support is another feature AMD plans to add to BioShock in a future driver release.
NVIDIA also has more optimizations in the works for BioShock. They're not happy with anti-aliasing as it stands in the game today, which is part of the reason why AA is disabled by default under Windows Vista in DX10 mode. They're working on improving AA as we speak, and hope to have something available in an upcoming ForceWare driver release. We wouldn't be surprised to see performance improvements from them as well.
With that out of the way, let’s start by looking at the various graphics options in the game. How do they perform with today’s mainstream cards, and how do they actually look? Let’s find out!