I've never gotten into lossy texture compression either. I want my graphics to look their best, not mangled by compression artifacts. Remember the skies in Q3A? If not, check out the shots below for a refresher. We have great hardware now, thanks to Nvidia and ATI, and we should not have to put up with lossy texture compression anymore. More speed, yes, but not at the cost of image quality, please!
With Texture Compression
Without Texture Compression
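For anyone wondering why those Q3A skies band so badly, here is a rough sketch of what a DXT1/S3TC-style encoder does to a smooth gradient. This is a simplified, grayscale illustration with hypothetical helper names, not a real encoder: each 4x4 block stores only two quantized endpoint colors plus a 2-bit index per texel, so every texel in the block has to snap to one of just four values.

```python
def rgb565_quantize(v):
    """Quantize an 8-bit channel value down to 5 bits and back,
    the way DXT1's RGB565 endpoints truncate red/blue channels."""
    return (v >> 3) << 3

def dxt1_like_block(texels):
    """Compress a flat list of 16 grayscale texels roughly the way a
    DXT1 encoder might: take min/max as endpoints, derive two 1/3 and
    2/3 interpolants, and snap each texel to the nearest palette entry."""
    lo = rgb565_quantize(min(texels))
    hi = rgb565_quantize(max(texels))
    palette = [lo, hi, (2 * lo + hi) // 3, (lo + 2 * hi) // 3]
    return [min(palette, key=lambda p: abs(p - t)) for t in texels]

# A gentle 16-step sky gradient: 16 distinct shades in, at most 4 out.
gradient = list(range(100, 116))
compressed = dxt1_like_block(gradient)
print(len(set(gradient)))    # 16 unique input shades
print(len(set(compressed)))  # at most 4 output shades -> visible banding
```

Sixteen distinct shades collapsing to four per block is exactly the stair-step banding you see in those sky screenshots; detailed, noisy textures hide it, but big smooth gradients cannot.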
Speaking of image quality, good graphics artists are hard to come by. Rich, vibrant, flowing graphics can be a wonder to behold, and I wish I saw them in every single game. Unfortunately, it seems like even the best companies have their troubles now and then. Assuming the 3D architecture and lighting are all good, the biggest impact usually comes from the textures. Some games really pull this together, with intricately detailed and shadowed brick and stone work - the illusion of depth is incredibly well-implemented. Other games, for one reason or another, succeed only in looking a bit flat, lacking that special touch that lends a flat image a more photo-realistic illusion of depth.
The ironic thing is that unlike extra special effects, having high-quality (as in well-done) textures does not add to system requirements or rendering time. It is certainly less taxing than generating bump mapping on the fly, and you get to enjoy it regardless of your graphics card's particular hardware support - it looks the same on a GeForce as it does on a Radeon.
A technology that never seems to pan out is anti-aliasing. I guess it is a nice option to have if you are lucky enough to own one of the high-end cards from Nvidia or ATI, but I'd rather use that extra speed and processing power to crank the resolution as high as it will go. To me, games look a heck of a lot better at 1600x1200x32 with no anti-aliasing than they do at 800x600x32 with 2x or even 4x anti-aliasing. ATI can't seem to get it right at all, and Nvidia is still working on it. The only implementation I've tried that worked as advertised was on the 3dfx Voodoo 5500. That was a cool card, and the fact that you could toggle the anti-aliasing on and off in hardware rocked the house. Still, all that fiddling with Level Of Detail settings to get the image to stop looking fuzzy was a pain and kind of defeated the purpose. I'd rather see developers focus on fast, efficient engines that can be cranked to 1600x1200 or higher, so that we can have rip-roaring frag-fests with crisp, clean lines and richly textured graphics.
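The back-of-the-envelope math actually backs this preference up, at least for supersampling (the kind the Voodoo 5500 did). Supersampling renders N samples for every output pixel, so the fill cost of 800x600 at 4x is roughly the same as 1600x1200 with no AA - but you only get 800x600 worth of actual detail for it. A quick sketch, with an illustrative helper name:

```python
def samples(width, height, aa_factor=1):
    """Rough fill cost in rendered samples for a supersampled frame:
    every output pixel is shaded aa_factor times."""
    return width * height * aa_factor

high_res = samples(1600, 1200)   # 1,920,000 samples, full 1600x1200 detail
aa_4x    = samples(800, 600, 4)  # 1,920,000 samples, only 800x600 of output
aa_2x    = samples(800, 600, 2)  #   960,000 samples

print(high_res, aa_4x, aa_2x)    # 1920000 1920000 960000
```

Same sample budget either way, so given the choice between four blurry samples per low-res pixel and four distinct high-res pixels, I'll take the pixels. (Multisampling changes this accounting somewhat, but on the hardware of the day it was supersampling doing the work.)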