Jakub: Brandon, listen to yourself. You're clearly being delusional here. A knife to a gunfight? It's more like I'm the one with the SIG, a laser sight, an upgraded trigger and all the goodies, and you came with some flintlock smoothbore pistol from the 17th century. Just which pixel shader version does ATI support again? PS 0.3, or are they up to 0.4 now? I'm sorry, it's just so hard to tell from the lofty heights of Pixel Shader 3.0. You had best remember that half the reason NVIDIA got kicked in the nuts last time around was that they couldn't deliver the latest shader model features.
As for arguing that Far Cry and Tomb Raider are representative of next-generation technology, what's up with that? That's like going to the Detroit Auto Show, picking up the latest and greatest Korean-built econobox and telling people that's the car of the future. No, wait, in Tomb Raider's case, it's more like spit-shining a Ford Pinto. And please, save yourself the embarrassment of arguing anything about Half-Life 2. Between Gabe "I sell Half-Life 2 every time I pitch someone a Radeon" Newell and that overhyped Source engine of Valve's, I think it's safe to discount that game as a worthwhile benchmark. Just what exactly does it do that's special? Ooooh… facial animation. Yeah, that's really going to stress the GPU. Physics? Not the graphics card's department. The lighting is straight out of DirectX Ancient History - lightmaps… pssh. The shader tricks that Valve showed are now appearing in every game. A man made of water was impressive last summer, and might even have been when September 30th came around, but if you've played Painkiller, you'd have seen something way more impressive than a mere man made of water. Face it, the benchmark standard of the future will be from id Software, as it has always been and ever shall be (amen).
I see you've failed to counter any of my remarks regarding driver maturity. No wonder the Radeon X800 does so well with anti-aliasing and anisotropic filtering - it's not so different from its predecessors. You can't say anything like that about the GeForce, on the other hand. This card is brand-spanking new. NVIDIA went from an overstrung, small chip (like that RSX Type S K20 engine of yours) to a big, fat, beefy 16-pipe monster (think: 426 Hemi). So please, counter if you will my point that NVIDIA likely has a lot more optimization - and thus performance headroom - left than ATI does. On top of that, you should know better than anyone else the logo you see whenever you start your benchmark games - The Way It's Meant to Be Played. So don't doubt for a minute that NVIDIA's horde of engineers and programmers is tweaking the living crap out of next-generation games. Where's ATI? Oh right, working on Half-Life 2.