Summary: What happens when you take two RV670 GPUs clocked at over 800MHz and combine them onto one board with 1GB of memory? The Radeon HD 3870 X2, of course! This new $449 card is poised to dethrone NVIDIA's mighty GeForce 8800 Ultra, but does it accomplish its mission? Yes and no. Read on for the full results!
Ultimately this approach led us to CPUs with multiple processing cores. The first dual-core CPUs were released in 2005, and today's latest Core 2 and Phenom CPUs sport four processing cores.
This leads us to graphics.
Today's latest high-end graphics processors contain over 650M transistors. In comparison, a quad-core Penryn CPU contains 820M transistors. Unlike a high-end GPU, however, the majority of Penryn's 820M transistors make up its massive 12MB L2 cache, so you can argue that a GPU is the more complicated design. NVIDIA has stated that it spent half a billion dollars in R&D creating the G8x GPU inside the GeForce 8 series, and that the GPU took four years to develop.
Quite simply, because these devices are growing more complicated, developing a cutting-edge GPU like R600 or G80 is taking longer than ever, and both AMD and NVIDIA are spending hundreds of millions of dollars in R&D to bring them to market. As a result, the 6-month product cycle didn't materialize this generation, and it's possible that, other than die shrinks, we may not see it again.
This puts AMD and NVIDIA in a difficult situation: how do you continue to deliver dramatic breakthroughs in performance when a next-generation high-end GPU takes longer and costs more to develop? Simple: you add more GPUs to the graphics card itself.
This is exactly what NVIDIA did with the GeForce 7900 GX2 and GeForce 7950 GX2. By integrating two G71 GPUs onto one board, NVIDIA was able to deliver performance similar to two comparable graphics cards while using only one PCI Express graphics slot. Back in 1999, ATI did something similar with its Rage Fury MAXX card, which consisted of dual Rage 128 Pro chips.
The AMD Radeon 3870 X2 follows this same philosophy. As its name implies, the card essentially consists of two Radeon 3870 GPUs grafted onto one PCB under a single cooler. But AMD has made a couple of tweaks to enhance the performance of each GPU…
The GPU powering the Radeon 3870 X2 is known as R680. R680 is based largely on the RV670 chip already in use on the Radeon 3870: both chips feature 320 stream processors, for instance, and are built on TSMC's 55nm manufacturing process. The key difference is that R680 runs at 825MHz, whereas the RV670 GPU used in the Radeon 3870 topped out at 775MHz. Here are the other key specs of R680:
As you can see, the Radeon 3870 X2 ships with GDDR3 memory modules that run 225MHz slower than the GDDR4 used on the Radeon 3870. However, the GDDR3 memory on the 3870 X2 runs at lower latency than the GDDR4 used on the 3870. The lower latency improves performance, helping to offset the difference in clock speed. In addition, GDDR3 is cheaper than GDDR4. This is important considering that the Radeon 3870 X2 ships with 1GB of memory. (Keep in mind that 512MB of memory is distributed to each GPU.)
Like RV670, R680 sports a 256-bit memory interface and supports DirectX 10.1. Each GPU is paired with 512MB of memory.
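To put that 256-bit interface in perspective, per-GPU memory bandwidth is straightforward to estimate. The memory clocks below are an assumption on our part: a 900MHz GDDR3 base clock (1800MT/s effective) is the commonly reported figure for the 3870 X2, consistent with being 225MHz below the Radeon 3870's GDDR4 clock, but AMD's official specs may differ:

```python
# Back-of-the-envelope per-GPU memory bandwidth for the 3870 X2.
# ASSUMPTION: 900MHz GDDR3 base clock (not stated in this article).
BUS_WIDTH_BITS = 256          # R680's memory interface width
MEM_CLOCK_MHZ = 900           # assumed base clock
EFFECTIVE_MT_S = MEM_CLOCK_MHZ * 2   # GDDR3 is double data rate

# bytes per transfer * million transfers per second -> MB/s -> GB/s
bandwidth_gb_s = (BUS_WIDTH_BITS / 8) * EFFECTIVE_MT_S / 1000
print(f"Per-GPU bandwidth: {bandwidth_gb_s:.1f} GB/s")  # 57.6 GB/s
```

Each GPU gets this full figure to its own 512MB pool; the two memory pools are not shared.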
One interesting aspect of the Radeon 3870 X2 is the 48-lane PCIe 1.1 bridge chip that links the two GPUs. This chip sits between the GPUs, providing a bi-directional x16 link between them, and is the same bridge chip previously used on Radeon HD 2600 X2 cards. AMD initially planned to use an upcoming PCIe 2.0 bridge chip but ultimately settled on the 1.1 part at the request of its board partners, who were eager to see the 3870 X2 hit the market as quickly as possible. Most of today's applications still don't take full advantage of PCIe bandwidth as it is, so it's doubtful a PCIe 2.0 bridge would have had much impact on performance.
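For context, the theoretical throughput of that x16 link follows directly from the PCIe 1.x spec: 2.5GT/s per lane with 8b/10b encoding (8 payload bits per 10 transmitted bits). This is a spec-sheet ceiling, not a measured figure:

```python
# Theoretical per-direction bandwidth of the PCIe 1.1 x16 link between
# the two GPUs on the 3870 X2's bridge chip.
LANES = 16
GT_PER_S = 2.5             # PCIe 1.x signaling rate per lane
ENCODING_EFFICIENCY = 0.8  # 8b/10b: 8 data bits per 10 line bits

# GT/s * efficiency -> payload Gb/s per lane; divide by 8 bits/byte for GB/s
per_direction_gb_s = LANES * GT_PER_S * ENCODING_EFFICIENCY / 8
print(f"PCIe 1.1 x16: {per_direction_gb_s} GB/s per direction")  # 4.0 GB/s
```

A PCIe 2.0 bridge would double the signaling rate to 5.0GT/s, for 8GB/s per direction, which is the headroom AMD passed up by shipping the 1.1 part.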
The 3870 X2 Board
The PCB of the Radeon HD 3870 X2 measures 10.5" in length, the same size as the GeForce 8800 GTX/Ultra PCB. Also like the GeForce 8800 GTX and Ultra, the board requires two power connections. On the right edge of the board you'll find an 8-pin PCIe 2.0 power connector, while a second 6-pin PCIe connector is located just above the card's fan. The board needs both power connectors in order to operate, although, just like the 2900 XT, you don't need a power supply with an 8-pin PCIe 2.0 power connector; the 8-pin connector is only required for overclocking.
Once the Radeon 3870 X2 is installed in your system and the drivers are loaded, the board automatically operates in CrossFire mode; in fact, at this time there is no way to disable CrossFire. The 3870 X2 is compatible with all motherboards, so you can run the board in CrossFire mode regardless of whether your motherboard's chipset actually supports CrossFire. We ran the card on an EVGA nForce 680i motherboard without any problems.
When you're ready to upgrade for even more performance, two Radeon 3870 X2 cards can be linked together to support CrossFire X, AMD's 4-way CrossFire solution. Unfortunately, CrossFire X drivers haven't been released by AMD yet, but we've been told they should be available sometime later this quarter.
Intel Core 2 Extreme X6800
ASUS P5E3 Deluxe (for Radeon cards)
EVGA nForce 680i SLI motherboard (for GeForce cards)
2GB Corsair TWIN2X2048-6400C4
AMD Radeon HD 3870 512MB
AMD Radeon HD 3870 X2 1GB
8.451.2_080123a-058649E-ATI Driver (Catalyst 8.1 derivative)
GeForce 8800 GTX
GeForce 8800 Ultra
300GB Western Digital Caviar SE
Windows Vista 64-bit w/Service Pack 1 RC Refresh
Company of Heroes 1.71 (running DX9)
Crysis Very High (Direct3D)
In DX9, the tables turned in favor of the Radeon HD 3870 X2: with the exception of Lost Planet DX9 (where CrossFire didn't scale at all), the 3870 X2 outran the GeForce 8800 cards in all of our tests. This included games like Call of Duty 4, Oblivion, Half-Life 2: Episode Two, F.E.A.R., and Company of Heroes in DX9.
If you recall our SLI versus CrossFire tests from last year, you'll remember that DX10 performance was an Achilles' heel for the Radeon cards in our testing: they put up competitive numbers in DX9 apps, but once a DX10 game was tested, performance began to suffer. At the time we surmised that AMD's DX10 driver was holding them back. It seems we're seeing this situation repeat itself in 2008 with the Radeon 3870 X2. While AMD's drivers have certainly made progress, and CrossFire is now scaling properly in a wider range of games than it was four months ago, the performance of the DX10 driver is still playing catch-up to DX9, at least based on our testing.
Drivers are going to be the key to the 3870 X2's success. Not only are we looking for a little more performance out of AMD's DX10 driver, but CrossFire support is going to be important as well. After all, if games don't scale properly with CrossFire, the 3870 X2 will essentially perform like a conventional Radeon HD 3870 card, such as the Sapphire 3870 Atomic we just reviewed. AMD's driver team has been working overtime getting CrossFire to scale in games like BioShock and World in Conflict, but going forward it's going to be critical for the 3870 X2 that games scale properly with CrossFire out of the box. Missing a game here or there is acceptable, but from August through December last year we saw multiple titles launch without proper CrossFire support. This situation can't be repeated in 2008 if the Radeon HD 3870 X2 is going to succeed.
With an MSRP of $449, the Radeon HD 3870 X2 is a tempting alternative to the GeForce 8800 GTX and Ultra if you find yourself playing a lot of DX9 titles like Call of Duty 4; in those games, it is unquestionably the fastest card on the market right now. But if you're currently playing a mixture of DX9 and DX10 games, you may want to opt for one of the GeForce 8 cards, or wait and see how AMD's DX10 driver matures.
Meanwhile, NVIDIA is expected to launch its own dual-GPU GX2 card, combining two G92 GPUs onto one board. We don't know how much that board will cost, but considering the performance two GeForce 8800 GT cards running in SLI offer today, expect the upcoming GX2 to be a screamer…
© Copyright 2003 FS Media, Inc.