As the successor to the GeForce2 MX series, NVIDIA’s GeForce4 MX had some very big shoes to fill. Not only was the GeForce2 MX the poster child of the value segment, it was the first product truly designed from the ground up to meet the needs of this critical sector. OEMs and gamers alike bought GeForce2 MX cards in droves, and while STMicro and later ATI attempted to dethrone it, NVIDIA countered with additional variants and price cuts. If there were a graphics card hall of fame, the GeForce2 MX would deserve a spot right alongside 3dfx’s Voodoo Graphics chipset, the granddaddy of 3D graphics as we know them today.
Despite concerted attempts by NVIDIA and graphics card manufacturers, the GeForce4 MX just couldn’t win the hearts and minds of gamers the way the GeForce2 MX did. For starters, NVIDIA built the GeForce4 MX largely on the same core as the GeForce2 MX. NVIDIA implemented a quasi-crossbar memory architecture, in the sense that the GeForce4 MX used dual rather than quad memory controllers.
NVIDIA also added its Accuview anti-aliasing engine from the GeForce4 Titanium family, but the chip lacked hardware pixel and vertex shaders, features NVIDIA introduced to the world with the GeForce3 and then enhanced in the GeForce4 Titanium. Given its GeForce4 MX designation, many assumed it was a stripped-down but feature-complete version of the GeForce4 Titanium, just as the GeForce2 MX had been relative to the GeForce2 GTS. This led many gamers to cry foul, accusing NVIDIA of deceptive marketing.
In fact, due to market conditions, the flagship of NVIDIA’s GeForce4 MX line, the GeForce4 MX460, never really got off the ground. The GeForce4 MX460 sported a 300MHz core clock, making it one of the fastest chips in NVIDIA’s stable at the time based on raw clock speed. In addition, the card was outfitted with more expensive BGA (rather than TSOP) DDR memory. The end result was a product that was very hard for card manufacturers to produce affordably; in fact, we’re not aware of a single GeForce4 MX460 card that was ever sold to the public. Instead, graphics card manufacturers chose to go with NVIDIA’s GeForce4 MX440.
The GeForce4 MX440 was cheaper to manufacture, allowing card manufacturers to hit the price points they were shooting for in the value market. Just as important, however, was NVIDIA’s GeForce4 Ti 4200. Due to its manufacturing cost, the GeForce4 MX460 was priced close to the GeForce4 Ti 4200, yet the Ti 4200 offered considerably more performance along with hardware pixel and vertex shaders. There was simply no point in purchasing a GeForce4 MX460.
NVIDIA hopes to have another sales success in the GeForce FX 5200 family. This series of chips boasts full DX9 compliance, but without the killer price tag you’d expect of a DX9 graphics card. Last month we took a look at the GeForce FX 5200 Ultra; today we’re reviewing MSI’s GeForce FX 5200 card, the FX5200-TDR128.