If you recall the design of GeForce FX 5600, you'll remember that it is a derivative of NV30, NVIDIA's GeForce FX 5800 family. NVIDIA essentially used NV30 as a building block for GeForce FX 5600 in much the same way an auto manufacturer builds multiple car lines off the same platform.
With approximately 125 million transistors, NV30 was expensive for NVIDIA to produce. They needed to come up with an alternative design for GeForce FX 5600 that was cheaper to manufacture in order to serve the mainstream market. To accomplish this, NVIDIA sliced off a TMU: while GeForce FX 5800 offers four pixel pipelines with two texture units per pixel pipeline (4x2), the GeForce FX 5600 family is built on a more conservative architecture of four pixel pipelines with one texture unit per pixel pipeline (4x1). NVIDIA also modified GeForce FX 5600's memory subsystem: GeForce FX 5600 employs dual 64-bit memory controllers versus four 32-bit memory controllers in GeForce FX 5800.
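The cost of dropping one TMU per pipeline shows up directly in peak texel fill rate, which is simply core clock × pipelines × TMUs per pipeline. A quick sketch of that arithmetic follows; the 400MHz clock used here is purely illustrative, not a quoted spec for either part:

```python
def peak_texel_fill_rate(core_mhz, pipelines, tmus_per_pipeline):
    """Peak texel fill rate in megatexels per second."""
    return core_mhz * pipelines * tmus_per_pipeline

# GeForce FX 5800 (NV30): 4 pipelines x 2 TMUs per pipeline (4x2)
# GeForce FX 5600 (NV31): 4 pipelines x 1 TMU per pipeline  (4x1)
# 400 MHz is an illustrative clock for comparison only.
nv30_style = peak_texel_fill_rate(400, 4, 2)  # 3200 Mtexels/s
nv31_style = peak_texel_fill_rate(400, 4, 1)  # 1600 Mtexels/s
print(nv30_style, nv31_style)
```

At equal clocks, the 4x1 design delivers half the multitexturing fill rate of the 4x2 design, which is exactly the trade NVIDIA made to cut transistor count for the mainstream part.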
For GeForce FX 5700 Ultra, NVIDIA went back to this formula; only this time they used the GeForce FX 5900's NV35 core as the new starting point. From there, NVIDIA made some adjustments to the core to make the GeForce FX 5700 Ultra less expensive to manufacture.
[Image: Another shot of the 5600U and 5700U as well as the Ti4600. The 5700U is longer than the 5600U.]
128-bit memory interface
Before you start drooling over the prospects of a mainstream card with a 256-bit memory interface, don't. GeForce FX 5700 Ultra did not adopt the GeForce FX 5900's 256-bit memory interface. Instead, GeForce FX 5700 Ultra relies on a 128-bit memory interface (with two 64-bit memory controllers), just like the GeForce FX 5600 Ultra.
One key difference between the GeForce FX 5700's memory controller and that of its predecessor is its support for new memory types. GeForce FX 5700 supports conventional DDR memory like the GeForce FX 5600 does, but adds support for DDR2 and GDDR3.
GDDR3 (short for graphics double data rate SDRAM) is a new memory type developed specifically for graphics. ATI and NVIDIA worked closely with memory manufacturers in the design and development of GDDR3's specifications. GDDR3 is designed to operate at significantly higher clock frequencies than previous memory types without consuming large amounts of power. Micron in particular made a splash in June of this year when it announced it had shipped its first GDDR3 samples to both ATI and NVIDIA. Its website currently lists modules as high as 700MHz (1.4GHz effective) that will begin shipping this quarter.
In the meantime, GeForce FX 5700 utilizes first generation DDR memory, while the GeForce FX 5700 Ultra employs DDR2, just like NVIDIA's GeForce FX 5800 Ultra.
The GeForce FX 5700 Ultra's DDR2 memory is clocked at 450MHz (900MHz effective), providing the 5700 Ultra core with up to 14.4GB/sec of memory bandwidth. Like the 5600 Ultra, the 5700 Ultra's most popular memory configuration appears to be 128MB, although it's possible a board manufacturer may choose to produce a 256MB card (the core can support up to 1GB of memory).
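The 14.4GB/sec figure falls straight out of the effective memory clock and the bus width: 900MHz effective × 16 bytes per transfer (128 bits). A quick sketch of that calculation:

```python
def memory_bandwidth_gb_s(effective_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s: effective clock (MHz) x bus width in bytes."""
    return effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# GeForce FX 5700 Ultra: 450 MHz DDR2 (900 MHz effective) on a 128-bit bus
bw = memory_bandwidth_gb_s(900, 128)
print(bw)  # 14.4
```

The same formula shows why the 5900's 256-bit bus is so valuable: doubling the bus width doubles bandwidth at the same memory clock.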