GeForce 8800: the DirectX 10 era begins
inflection point: n. A moment of dramatic change, especially in the development of a company, industry, or market.
It has been just over four years since ATI first ushered in the DirectX 9 era of gaming with the Radeon 9700 Pro, and over two years since NVIDIA introduced the first Shader Model 3.0 card, the GeForce 6800 Ultra. Both launches marked huge inflection points in PC graphics, and as a result, the games you and I enjoy today look remarkably more realistic thanks to their arrival. Just think how games would look without the more powerful pixel and vertex shaders (as well as other new features, such as HDR lighting) that were first incorporated into these cards.
Now, in 2006, and more importantly in 2007 when the software arrives, PC graphics and gaming are going to take another huge leap forward with the next generation of DirectX 10 hardware. NVIDIA's GeForce 8800 is the first GPU designed to take advantage of DirectX 10, but NVIDIA has also built in plenty of goodies to make the GeForce 8800 an exciting graphics chip for powering the latest DirectX 9 and OpenGL 2.0 titles today.
Before we go into the details of this new GPU, however, let's first quickly go over what makes DirectX 10 so special.
As we mentioned in our DirectX 10 Preview article, DirectX 10 has been completely redesigned from the ground up: no piece of the API was left untouched on the graphics side. Because of this, DirectX 10 boasts several new features designed to improve not only image quality but also performance. Here are the key highlights:
New driver model: Under DX10 the driver is split into two parts, a user-mode driver and a kernel-mode driver. The kernel-mode driver is kept distinct from the user-mode driver to enhance stability.
A brand-new geometry shader has been added to the middle of the pipeline, between the vertex and pixel shaders.
Increased efficiency, fixing the "small batch problem" (Microsoft claims performance improvements of up to six times that of DirectX 9 hardware running on Windows XP because of this). The reduced per-draw-call overhead offloads work from the CPU to the GPU, making it possible to pump more objects onto the screen, which increases realism and performance in newer games.
Shader Model 4.0 has a broader instruction set, including integer and bitwise instructions, allowing more work to be transferred to the GPU.
The fixed-function pipeline is gone; everything is now programmable via shaders.
Consistency: Capability bits, which were used to tell DirectX which features the GPU did and did not support, are gone. With cap bits gone, hardware manufacturers have fewer ways to deviate from the spec. These stricter feature requirements ensure that all video cards support the same baseline feature set; only a few features, such as multisample anti-aliasing, remain optional. (In the early days of DirectX 9, for example, there was lots of variation in floating-point formats (FP16, FP24, FP32), which led to confusion among software developers.)
HDR lighting: DirectX 10 adds two new floating-point HDR formats for DX10-compliant GPUs that represent HDR data more compactly, making it possible to use HDR more efficiently.
Virtualized memory for the GPU: In the past, the amount of texture storage was limited by the amount of onboard memory on the graphics card. Now textures can also be stored in system memory, eliminating local video memory as a bottleneck on texture size.
Better geometry instancing: Geometry instancing, first introduced to DX9 with Shader Model 3.0, has been enhanced in DirectX 10. The enhancements provide more customization for developers, for example unique animations for objects rendered via instancing, like ground units in an RTS game.
Increased maximum texture size: DirectX 10 raises the maximum texture dimensions from 2048x2048 or 4096x4096 (depending on the hardware) in DirectX 9 to 8192x8192.
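To make the geometry shader's role in the list above concrete: it runs once per input primitive and can emit new primitives, so the GPU itself can amplify geometry. Here is a minimal CPU-side sketch of that idea in plain Python; the point-to-quad expansion and all names are illustrative, not actual D3D10 or HLSL API.

```python
def point_to_quad_gs(point, half_size):
    """Sketch of a geometry-shader-style amplification stage: take one
    input primitive (a point) and emit new geometry (the four corner
    vertices of a screen-aligned quad). A real geometry shader does this
    per primitive on the GPU, between the vertex and pixel stages."""
    x, y = point
    return [
        (x - half_size, y - half_size),
        (x + half_size, y - half_size),
        (x - half_size, y + half_size),
        (x + half_size, y + half_size),
    ]

# One point in, four vertices out -- without the CPU touching the data.
quad = point_to_quad_gs((2.0, 3.0), 0.5)
```

This kind of point-to-quad expansion (for particle sprites, for instance) is one of the classic uses of the new stage.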
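The "small batch problem" in the highlights above boils down to a fixed CPU cost paid on every draw call, regardless of how little that call draws. A toy cost model makes the point; all numbers here are made up for illustration, not measurements of either API.

```python
def frame_cpu_cost(num_draw_calls, per_call_overhead_us, per_call_work_us):
    """Toy model of per-frame CPU cost: every draw call pays a fixed
    API/driver overhead on top of the work of submitting its geometry.
    (Illustrative only -- the microsecond figures are assumptions.)"""
    return num_draw_calls * (per_call_overhead_us + per_call_work_us)

# 2000 tiny batches with a heavyweight per-call overhead, versus the
# same scene with the overhead cut to a fraction of that.
old_api_cost = frame_cpu_cost(2000, per_call_overhead_us=60, per_call_work_us=5)
new_api_cost = frame_cpu_cost(2000, per_call_overhead_us=10, per_call_work_us=5)
```

With many small batches, almost all of the CPU time goes to the fixed overhead, which is why shrinking it frees the processor to push more objects per frame.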
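On the compact HDR formats: one common way to shrink HDR data is to share a single exponent across all three color channels instead of storing a full float per channel. The sketch below follows the RGB9E5-style shared-exponent convention supported by DX10-class hardware; treat the bit widths and bias as assumptions about that format family, not a statement of exactly which two formats DirectX 10 specifies.

```python
import math

# RGB9E5-style layout: 9 mantissa bits per channel, a 5-bit shared
# exponent, and an exponent bias of 15 (conventional values for this
# format family -- assumptions for illustration).
MANTISSA_BITS = 9
EXP_BITS = 5
EXP_BIAS = 15
MAX_EXP = (1 << EXP_BITS) - 1
MANTISSA_MAX = (1 << MANTISSA_BITS) - 1
# Largest representable component value.
SHARED_MAX = MANTISSA_MAX / (1 << MANTISSA_BITS) * 2.0 ** (MAX_EXP - EXP_BIAS)

def encode(r, g, b):
    """Pack three non-negative floats into (mantissa triple, shared exponent)."""
    r, g, b = (min(max(c, 0.0), SHARED_MAX) for c in (r, g, b))
    maxc = max(r, g, b, 2.0 ** (-EXP_BIAS - 1))
    exp = max(-EXP_BIAS - 1, math.floor(math.log2(maxc))) + 1 + EXP_BIAS
    scale = 2.0 ** (exp - EXP_BIAS - MANTISSA_BITS)
    # Bump the exponent if the largest mantissa would overflow after rounding.
    if int(maxc / scale + 0.5) > MANTISSA_MAX:
        exp += 1
        scale *= 2.0
    return tuple(int(c / scale + 0.5) for c in (r, g, b)), exp

def decode(mantissas, exp):
    """Recover the three components from the packed representation."""
    scale = 2.0 ** (exp - EXP_BIAS - MANTISSA_BITS)
    return tuple(m * scale for m in mantissas)
```

Three channels plus the exponent fit in 32 bits, the same footprint as an ordinary 8-bit-per-channel texel, while covering a far wider dynamic range; that compactness is what makes HDR rendering cheaper on bandwidth.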
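Geometry instancing, as described above, amounts to submitting one mesh plus a stream of per-instance data in a single draw call; the hardware then replays the mesh once per instance. A CPU-side sketch of the concept follows; the function and data names are illustrative stand-ins, not D3D calls.

```python
def draw_instanced(mesh, instances):
    """Simulate one instanced draw call: replay `mesh` (a list of (x, y)
    vertices) once per entry in `instances`, applying each instance's
    per-instance data -- here a 2D offset plus a frame index standing in
    for a unique animation state per instance."""
    output = []
    for offset, anim_frame in instances:
        for x, y in mesh:
            # Per-instance customization: translate the shared mesh and
            # tag each vertex with this instance's animation frame.
            output.append((x + offset[0], y + offset[1], anim_frame))
    return output

# One triangle drawn three times with different positions and animation
# states -- conceptually a single draw call instead of three.
triangle = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
units = [((0.0, 0.0), 0), ((5.0, 0.0), 3), ((0.0, 5.0), 7)]
frame = draw_instanced(triangle, units)
```

The per-instance animation index is the kind of customization the DX10 enhancements enable, so a crowd of RTS units can share one mesh yet each play its own animation.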