Gigabyte 3D1 GeForce 6600 GT SLI Review
December 29, 2004
Summary: Interested in a GeForce 6600 GT SLI setup but don't want to pay full retail for the motherboard and graphics cards? Gigabyte's 3D1 graphics card fuses two GeForce 6600 GT GPUs and 256MB of GDDR3 memory running at 560MHz onto one card! But that's not all: also included is a high-end nForce4 SLI motherboard with support for up to eight Serial ATA hard drives and RAID 5, 802.11g WiFi capability, IEEE-1394b, and 10 USB ports. The price for all this? Gigabyte expects the bundle to retail for $550. See how the 3D1 performs against a single GeForce 6600 GT, 6800 GT, and 6800 Ultra, and how it overclocks, in this article!
Introduction (Page 1 of 16)
First things first: unlike XGI's Volari Duo, Gigabyte's 3D1 actually works, and remarkably well, I might add.
The 3D1 represents the first multi-processor card based on NVIDIA’s SLI technology. It’s a limited-edition product and it obviously appeals exclusively to a particular niche of gamers who are looking for elevated performance characteristics without spending exorbitant amounts of money.
According to some of Gigabyte’s initial performance data, there’s an appreciable gain to be realized by putting two graphics processors on a single PCI Express add-in card. In fact, the company’s synthetic numbers even indicate superiority over ATI’s RADEON X850 XT PE. But before you get too enthused by the prospects of a single-card implementation of SLI, there are a few things you should know about Gigabyte’s design.
The Gigabyte 3D1
While we’d love to see a card armed with two GeForce 6800 Ultra chips, Gigabyte instead chose to use a pair of GeForce 6600 GT processors for the 3D1. That’s an understandable decision since the cost and complexity of two 6800 Ultra or 6800 GT GPUs on one board certainly would have been prohibitive. Nevertheless, representatives at Gigabyte acknowledge that other dual-chip boards are currently in development and may realize retail availability.
Each 6600 GT is mated to 128MB of GDDR3 memory on a 128-bit bus. Of course, all of Gigabyte’s marketing material indicates that the board comes with 256MB of RAM on an effective 256-bit bus, but those numbers aren’t entirely representative of how the card works. Each core runs at 500MHz in 3D mode and 300MHz in 2D mode. The memory subsystem courses along at 560MHz DDR or 1,120MHz. And, since each core boasts eight pixel pipelines, it should be interesting to see how the two combined chips deal with a single GeForce 6800 GT – a 16-pipe contender.
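A quick back-of-the-envelope calculation, using the clock and bus figures above, shows why the marketed "256-bit" number is misleading: each GPU only ever sees its own 128-bit, 128MB pool. The sketch below assumes decimal gigabytes and ignores any real-world efficiency losses.

```python
# Peak memory bandwidth sketch for the 3D1, based on the article's specs:
# each 6600 GT has a 128-bit bus fed by GDDR3 at 560MHz (1,120MHz effective DDR).

def peak_bandwidth_gbps(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth in GB/s (decimal): bytes/transfer x transfers/sec."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_clock_mhz * 1e6 / 1e9

per_gpu = peak_bandwidth_gbps(128, 1120)    # each GPU's private pool
print(f"Per GPU:  {per_gpu:.1f} GB/s")      # ~17.9 GB/s
print(f"Combined: {2 * per_gpu:.1f} GB/s")  # ~35.8 GB/s, but never shared
```

The combined figure is what the marketing math implies; in practice neither GPU can ever draw on the other's 17.9 GB/s, which is why a true 256-bit card like the 6800 GT behaves differently at high resolutions.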
There are also some limitations when it comes to configuring the 3D1. To begin, it only works with Gigabyte motherboards, since a special motherboard BIOS is required to recognize the card, according to Carol Chiou of Gigabyte’s channel marketing. That shouldn’t really matter, though, because Gigabyte is planning to sell the 3D1 in a limited-edition bundle with its K8NXP-SLI nForce4 motherboard anyway. The package is expected to bear a $550 MSRP and to reach retail by the end of January.
More Gigabyte 3D1 (Page 2 of 16)
At least for the time being, Gigabyte says the 3D1 will only work on its nForce4 SLI motherboard. Perhaps it will work with VIA’s K8T890 at some point in the future, but the company isn’t yet ready to solidify the card’s compatibility. And, despite the unique implementation, you won’t need any special software to get the 3D1 up and running. NVIDIA’s reference driver set works just fine.
The card does require a somewhat different motherboard configuration, though. Rather than set the K8NXP-SLI to run in SLI mode, the 3D1 utilizes all 16 PCI Express lanes offered by a single slot and thus needs to operate in “Normal” mode. Nevertheless, Gigabyte still includes the NVIDIA SLI connector with the bundle just in case you decide to adopt more powerful 6800 Ultra cards down the road.
When you turn on SLI multi-GPU rendering in the NVIDIA driver, the single x16 slot is divided into x8 pathways, each of which is dedicated to one 6600 GT processor. While it’s nice to think that you could add a second PCI Express graphics card for even more expansive multi-monitor support, the fact of the matter is that by enabling SLI, you’re harnessing all 16 lanes available for graphics. Even without the feature turned on, it’s possible to run dual monitors through the onboard DVI and VGA outputs, but you can’t add another card to the mix. Gigabyte’s 3D1 will serve as your primary and secondary adapters at all times.
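The lane split described above has a straightforward bandwidth cost. As a rough sketch, first-generation PCI Express carries about 250MB/s per lane in each direction, so halving the link halves each GPU's pipe to the chipset (figures are the nominal spec, not measured throughput):

```python
# PCI Express 1.x nominal bandwidth: ~250 MB/s per lane, per direction.
PCIE1_MB_PER_LANE = 250

def slot_bandwidth_gbps(lanes):
    """One-direction bandwidth of a PCIe 1.x link, in GB/s."""
    return lanes * PCIE1_MB_PER_LANE / 1000

print(slot_bandwidth_gbps(16))  # 4.0 GB/s for the full x16 slot
print(slot_bandwidth_gbps(8))   # 2.0 GB/s for each 6600 GT once SLI splits the link
```

In other words, enabling SLI on the 3D1 gives each 6600 GT the same x8 link it would have in a conventional two-card SLI configuration.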
You should also note that, in SLI mode, all thermal monitoring is disabled. It probably isn’t as important as it would be on a dual GeForce 6800 Ultra system with gratuitous heat dissipation, but it’d still be a useful feature to see Gigabyte add into its V-Tuner software.
The bundled V-Tuner overclocking utility is complemented by Gigabyte’s @VGA BIOS tool, which can update the 3D1 should a fresh BIOS file become available. Also included are Thief: Deadly Shadows, Joint Operations: Typhoon Rising, and an I/O dongle that enables S-video and component output to an HDTV.
While independently setting the clock frequencies of two cards in an SLI setup often yields a modest increase in theoretical performance (though not necessarily real-world results, as seen in Brandon’s SLI preview), putting the two processors on a single board seems to detrimentally impact overclocking potential.
Using NVIDIA’s built-in CoolBits registry modification, we were able to get the 500MHz chips up to 525MHz, though they crashed in Doom 3. Subsequent attempts at overclocking resulted in the board failing NVIDIA’s mandatory speed test. Even Gigabyte’s V-Tuner utility failed to extract extra speed. We’re hardly worried, though; juicing the traditional SLI setup didn’t yield much in the way of tangible gains, so we weren’t expecting much from the 3D1 in that regard.
Gigabyte’s K8NXP-SLI Motherboard (Page 3 of 16)
Gigabyte’s able to add so much extra functionality, in part, because the K8NXP-SLI centers on the compact, single-chip nForce4 SLI solution. NVIDIA’s nForce4 connects to a Socket 939 Athlon 64 or Athlon 64 FX processor through a 1GHz HyperTransport bus and in turn interfaces with up to 4GB of DDR400 memory. The four memory slots are color-coded to indicate the proper placement for dual-channel operation.
Of course, the chipset also features SLI support, meaning it’s able to configure the 16 lanes of PCI Express connectivity according to a small onboard module. In one position, all 16 lanes are devoted to the first PCI Express x16 slot, while eight lanes go to each of the board’s two graphics slots in the other orientation. Remaining PCI Express lanes are used to expose two PCI Express x1 slots and one integrated Gigabit Ethernet controller. A second Gigabit controller integrated into the nForce4 chipset enables an extra RJ-45 port as well.
I/O is another of the board’s highlights. Gigabyte uses the nForce4 chipset to expose four Serial ATA II ports (compatible with the second-generation 300MB/s specification), while a Silicon Image Sil3114 controller offers four more with support for RAID 0, 1, 0+1, and, according to Gigabyte, RAID 5. There are also 10 USB 2.0 ports, featured through a combination of back-panel ports and included headers that occupy expansion slots. Finally, a pair of Texas Instruments controllers enables three IEEE 1394b ports – the 800Mbps variety – for high-speed data transfers from compatible external hard drives. Gigabyte even goes so far as to include an 802.11g wireless networking card and a DPS add-in card to bolster the board’s power handling.
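For readers weighing those RAID modes, here is a minimal sketch of the usable capacity each level leaves you with. It assumes identical drives and ignores controller overhead; the 74GB drive size is just an illustrative figure, not part of Gigabyte's bundle.

```python
def usable_capacity(level, n_drives, drive_gb):
    """Usable capacity (GB) for the RAID levels the Sil3114 offers.
    Simplified model: identical drives, no controller overhead."""
    if level == "0":    # striping: all capacity, no redundancy
        return n_drives * drive_gb
    if level == "1":    # mirroring: one drive's worth survives
        return drive_gb
    if level == "0+1":  # striped mirrors: half the total
        return n_drives * drive_gb / 2
    if level == "5":    # one drive's worth lost to distributed parity
        return (n_drives - 1) * drive_gb
    raise ValueError(f"unsupported level: {level}")

for lvl in ("0", "1", "0+1", "5"):
    print(lvl, usable_capacity(lvl, 4, 74))  # four hypothetical 74GB drives
```

RAID 5's appeal on a four-port controller is clear from the math: you keep three drives' worth of space while tolerating a single-drive failure.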
An integrated ALC850 audio codec makes eight-channel output possible through a series of jacks on the motherboard’s back panel. Those ports are equipped with jack-sensing technology to determine the connected device and arrange configuration accordingly. The sound system is 16-bit throughout and doesn’t quite live up to Intel’s HD-Audio standard, but it should suffice for most applications.
Expect a more thorough look at the K8NXP-SLI and its accompanying hardware and software features in our upcoming SLI roundup.
System Setup (Page 4 of 16)
AMD Athlon 64 FX-53 (939)
Gigabyte K8NXP-SLI nForce4 SLI motherboard
1GB Corsair 2-2-2-5 DDR400 Pro Series Memory (2x512MB)
Gigabyte 3D1 Dual GeForce 6600 GT Graphics Card PCI Express x16
NVIDIA GeForce 6800 Ultra PCI Express x16
NVIDIA GeForce 6800 GT PCI Express x16
NVIDIA GeForce 6600 GT PCI Express x16
34GB Western Digital Raptor (10,000RPM, 8MB cache)
Windows XP Professional with Service Pack 2
Desktop resolution 1024x768, 32-bit color, 85Hz refresh
All power saving options were turned off, as were the Automatic Update and System Restore services. Graphics options under the ‘Performance’ tab were all disabled for maximum performance.
id Software Doom 3 Demo Demo001
Crytek Far Cry v.1.3 Demos Training and Volcano
Valve Software Counter-Strike: Source Custom Demo C
1C: Maddox Games IL2: Forgotten Battles TheBlackDeath Demo
Doom 3 with 4xAA (Page 5 of 16)
id Software Doom 3 with 4xAA
Understanding that anyone with an SLI setup would want to play games with much more detail enabled, we started our tests at 1024x768 with 4xAA as a baseline. And even then, the 3D1 posts impressive scores that are clearly very playable. It offers about 58 percent more performance compared to just a single GeForce 6600 GT card, but the GeForce 6800 GT is still about 12 percent faster at 1024x768.
As the resolution increases, you can see the 3D1 lose the ability to deliver fluid gameplay, while the 6800 GT and 6800 Ultra are still pretty capable at 1600x1200. At that point, however, the 3D1 is offering nearly 63 percent more speed than the single GeForce 6600 GT to which it’s being compared.
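The percentage figures quoted throughout these benchmark pages follow the usual scaling formula: the dual-GPU card's frame rate over the single card's, minus one. A small helper makes the arithmetic explicit (the frame rates below are hypothetical round numbers for illustration, not our measured results):

```python
def scaling_pct(dual_fps, single_fps):
    """Percent advantage of the dual-GPU card over a single card."""
    return (dual_fps / single_fps - 1) * 100

# Illustrative only: a dual card at 63.2 fps vs. a single card at 40 fps
print(round(scaling_pct(63.2, 40.0)))  # -> 58
```

Note that perfect SLI scaling would be 100 percent; the 55 to 77 percent figures in these tests are typical of real-world multi-GPU efficiency.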
Doom 3 with 4xAA and 16xAF (Page 6 of 16)
id Software Doom 3 with 4xAA and 16xAF
At 1024x768, the 3D1 is still pumping out great numbers. In this case, they’re 59 percent faster than the lone 6600 GT and only a few frames behind the 6800 GT. Increasing resolution again takes a toll on the 3D1’s performance, and while 1280x1024 is a reasonable proposition, there’s too much slow-down at 1600x1200 to enjoy the game.
Far Cry: Training with 4xAA (Page 7 of 16)
Crytek’s Far Cry: Training with 4xAA
The Training level is particularly lush, with lots of environmental detail in the form of vegetation and the huts you see at the beginning of the game. The 3D1’s advantage over the 6600 GT shrinks to 25 percent at 1024x768, though it only follows the 6800 GT by about seven frames per second.
By 1600x1200, the 3D1 is still attaining almost 40 frames per second to the 6600 GT’s roughly 25, a 58 percent advantage. The GeForce 6800 Ultra, however, is still posting in excess of 50 frames per second.
Far Cry: Training with 4xAA and 8xAF (Page 8 of 16)
Crytek’s Far Cry: Training with 4xAA and 8xAF
At 1024x768, Gigabyte’s 3D1 hangs right up there with the 6800 GT. This is where we can see a difference between one processor having a dedicated 256-bit memory bus and two processors with dedicated 128-bit buses. As the resolution increases, the 3D1 continues to lead the single 6600 GT by 55 percent, but it also falls off in relation to the 6800-class cards.
Far Cry: Volcano with 4xAA (Page 9 of 16)
Crytek’s Far Cry: Volcano with 4xAA
The Volcano demo takes place both outdoors and indoors, but there isn’t as much detail for the graphics card to manipulate, other than steam rising from magma on the ground. As a result, we see the 3D1 running 50 percent faster than the 6600 GT at 1024x768 and a mere seven percent slower than one 6800 GT card. The 3D1 continues its exceptional performance at 1280x1024, and even at 1600x1200, where it is 77 percent faster than the 6600 GT, it only loses to the 6800 GT by six percent.
Far Cry: Volcano with 4xAA and 8xAF (Page 10 of 16)
Crytek’s Far Cry: Volcano with 4xAA and 8xAF
This time around the 3D1 actually trades places with the 6800 GT, beating it at 1024x768 and 1280x1024. When we reach 1600x1200, the 6800 GT regains the lead by less than two frames, and while the 3D1 is still enabling playable performance, the single 6600 GT isn’t. At that resolution the 3D1 leads by 75 percent.
Half-Life 2 with 4xAA (Page 11 of 16)
Valve’s Half-Life 2 with 4xAA
We’ve established previously that Half-Life 2 is much more taxing on host processors than many other popular titles, and we can see that at 1024x768, where all of the graphics cards perform admirably in a fairly tight pattern. There’s more of a difference at 1600x1200 as the 3D1 outperforms the 6600 GT by 35 percent and loses to the 6800 GT by about 17 frames per second.
Half-Life 2 with 4xAA and 16xAF (Page 12 of 16)
Valve’s Half-Life 2 with 4xAA and 16xAF
Little changes with the addition of anisotropic filtering. The 3D1 nearly matches the 6800 GT at 1024x768, but falls off considerably as the resolution increases. By the time it hits 1600x1200, Gigabyte’s 3D1 is only 22 percent faster than the 6600 GT.
IL:2 with 4xAA (Page 13 of 16)
1C: Maddox Games’ IL:2 with 4xAA
Flight simulators are generally regarded as dependent on processor alacrity, but these tests show that isn’t always the case. Right off the bat, Gigabyte’s 3D1 establishes a 31 percent advantage over the 6600 GT while trailing the 6800 GT by a relatively small margin. At 1600x1200, the 3D1 is faster than the mainstream 6600 GT by about 75 percent.
IL:2 with 4xAA and 16xAF (Page 14 of 16)
1C: Maddox Games’ IL:2 with 4xAA and 16xAF
Ballistics Report: Gigabyte 3D1 (Page 15 of 16)
Performance: It’s hard to argue with a single card that enables resolutions of up to 1600x1200 with anti-aliasing in many situations. The 6600 GT is a powerful eight-pipe card on its own and, matched with a second processor, performance is markedly better in all of the scenarios we measured.
Value: You might be wondering how a $550 motherboard and graphics card bundle could be considered a notable value. However, consider that current prices for SLI boards hover close to $300 and the cheapest 6600 GT cards are $180 apiece. Packaged together, the 3D1 and K8NXP-SLI are priced much more attractively. It’s another story entirely if you were hoping for 6800 GT or 6800 Ultra cards, but Gigabyte claims to be working on ideas for those chips as well.
Ingenuity: Beyond the bottom line, there’s something to be said for Gigabyte’s inventiveness with the 3D1. Though it requires a special motherboard BIOS to work properly, the 3D1 is a great example of how PCI Express is already enabling greater platform flexibility through technologies such as SLI. It doesn’t require any special connectors – you plug the card in, install drivers, enable SLI, and you’re off. It’s a great solution for more mainstream gamers who don’t want to worry about matching graphics cards and so forth.
Constraints: The 3D1 isn’t an add-in card for the masses. If you buy one to drop into an nForce4 Ultra motherboard, you’ll be mighty disappointed when it doesn’t work. It’s only meant to run on an SLI board, and it currently works with Gigabyte’s exclusively.
Just because the 3D1 only uses one PCI Express x16 slot doesn’t mean the other will accommodate an additional graphics card, either. The 3D1 utilizes all 16 lanes made available to it, leaving none for an additional card. Unless you add a PCI graphics card, you won’t get any more than two display outputs, even with SLI mode disabled.
Finally, thermal monitoring is disabled for some reason when SLI is turned on. You can monitor each graphics processor with the technology disabled, so we’re hopeful that Gigabyte can fix this issue somehow.
Availability: The 3D1 bundle isn’t available yet and isn’t expected until the end of January. Even then, understanding that the market for a card like this will be small, Gigabyte is planning to make this a very limited-edition product. If you want one, keep a watchful eye out.
Final Verdict (Page 16 of 16)