Summary: Today Alan Dang looks at the deinterlacing performance of the NVIDIA GeForce 6600, the ATI Radeon X800, and the XGI Volari 8300. Test-taking skills will tell you that FiringSquad wouldn't have bothered talking about the XGI Volari 8300 unless it had something special to offer...
You may want to review this interview with NVIDIA and this interview with ATI before we begin. We have reprinted the relevant parts of the text from our first look at DVD deinterlacing.
Since we're just covering mainstream playback, I'll be using the HQV Benchmark DVD as my primary source. All of the tests were conducted over a 1280x1024 DVI connection. Since DVD software plays a large role in video quality, we used the best available software for each platform: the NVIDIA PureVideo decoder for the GeForce, CyberLink PowerDVD for the XGI, and ATI's own Multimedia Center (based on CyberLink) for the Radeon. All of the GPUs were configured for auto-detect mode. ForceWare 81.95, Catalyst 5.12, and Reactor 3.03.03 drivers were used.
The screenshots were taken with the Epiphan VGA2USB. Since we're doing a true analog capture, be aware that differences in brightness, color, contrast, or even aspect ratio are faults of the capture device; the images are only meant to provide examples to go along with the text. Make sure you view the HIGH-RES version of each image. Feel free to post in our comments section if it's unclear whether you're looking at a genuine difference in image quality or simply an artifact of how we captured the video.
Color Bar / Vertical Detail
1st place: ATI (10 points)
Tied for 2nd place: NVIDIA and XGI (5 points)
The ATI is likely going to be the best solution for picture slideshows where there's no animation.
The three Jaggies tests evaluate how good a video processor is at deinterlacing objects that are moving. The first "jaggies pattern" is a simple spinning bar, the second is a set of three bars with a waving motion (like waving your hand), and the third test is a real-world shot of the American flag in the wind. These all represent scenes that a deinterlacer should easily recognize as having motion.
In the ATI interview, I hypothesized that they were using some form of edge-adaptive deinterlacing when talking about "vector adaptive" deinterlacing. ATI is reluctant to discuss this any further. XGI, on the other hand, advertises that it uses edge-adaptive deinterlacing. Remember that edge-adaptive deinterlacing works this way:
Suppose that you have these pixels at the odd rows 1 and 3:
The video processor has to figure out what 5 pixels should go in between:
A simple bob would just average the pixels above and below each missing pixel, resulting in this:
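If you'd rather see the arithmetic than squint at diagrams, here's a minimal Python sketch of bob interpolation. The 5-pixel rows are made-up stand-ins for the figures above, not data from any of these cards:

```python
import numpy as np

def bob_deinterlace(above: np.ndarray, below: np.ndarray) -> np.ndarray:
    # Reconstruct each missing pixel as the average of the pixels
    # directly above and below it (widen to uint16 to avoid overflow).
    return ((above.astype(np.uint16) + below.astype(np.uint16)) // 2).astype(np.uint8)

# Rows 1 and 3 survived; row 2 was discarded and must be synthesized.
row1 = np.array([0, 0, 255, 0, 0], dtype=np.uint8)
row3 = np.array([255, 0, 0, 0, 0], dtype=np.uint8)
print(bob_deinterlace(row1, row3))  # [127   0 127   0   0]
```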
A better intrafield method of deinterlacing is edge-adaptive diagonal interpolation, in which you examine several pixels from the rows above and below the discarded pixel instead of just the two directly above and below. If an edge is detected, the missing pixel is interpolated by averaging the pixels along that edge. So first the video processor looks for edges in several directions:
When it detects an edge, it interpolates along that axis.
You can then use the nearest pixel outside of the edge, resulting in:
The difference doesn't look like much until you take a step back and compare the two. With traditional bob you get artifacts that show up as jaggies, but with edge-adaptive diagonal interpolation the result looks exceptionally smooth. (Try squinting for the maximum effect.)
Amazing, isn't it? Diagonal filtering was a feature pioneered by Faroudja. The next question is: how do I know whether the correct "edge" is the red part,
or whether the background happens to be red and the white line is what I'm supposed to interpolate?
Resulting in:
Was the picture supposed to represent a red slash on a white background, or a bold white backslash on a red background? It's impossible to say. This is why deinterlacing isn't a brute-force computational problem; there's an art to video processing. You could have two video processors with diagonal interpolation, but one might be more accurate at guessing the appropriate direction. Likewise, you could have an exceptionally exotic deinterlacing strategy that only works in some conditions.
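To make the mechanics concrete, here's a minimal Python sketch of an edge-adaptive (ELA-style) interpolator. The argmin at its heart is exactly the guess described above: it follows whichever axis looks smoothest, with no guarantee that's the real edge. None of these vendors publish their algorithms, so treat this as the textbook idea rather than any card's implementation:

```python
import numpy as np

def edge_adaptive_line(above: np.ndarray, below: np.ndarray) -> np.ndarray:
    # Interpolate a missing scanline by averaging along the direction
    # of least difference -- the textbook edge-adaptive approach.
    n = len(above)
    a, b = above.astype(np.int32), below.astype(np.int32)
    out = np.empty(n, dtype=np.uint8)
    for x in range(n):
        l, r = max(x - 1, 0), min(x + 1, n - 1)
        # Candidate axes: diagonal "\", vertical "|", diagonal "/"
        pairs = [(a[l], b[r]), (a[x], b[x]), (a[r], b[l])]
        diffs = [abs(p - q) for p, q in pairs]
        p, q = pairs[int(np.argmin(diffs))]  # the guess: smoothest axis wins
        out[x] = (p + q) // 2
    return out

# A diagonal black/white boundary sliding one pixel per row:
above = np.array([0, 0, 255, 255, 255], dtype=np.uint8)
below = np.array([0, 0, 0, 0, 255], dtype=np.uint8)
print(edge_adaptive_line(above, below))  # [  0   0   0 255 255] -- crisp edge
# Plain bob on the same rows would give [0 0 127 127 255]: a gray staircase.
```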
Looking at those same three images, we can see that both ATI and NVIDIA develop jaggies in the yellow zone. This means their deinterlacing isn't able to prevent aliasing at oblique angles.
But wait! Look at the XGI image again:
It's perfect. Since the XGI Volari 8300 has edge-adaptive deinterlacing, it provides the best results in this portion of the test. Part of the reason XGI does well with video is that the development costs required for good video aren't as high as they are for 3D. XGI is putting a substantial part of its effort into improving video quality, and its engineers formerly worked at Trident, a company that now manufactures video processors used in Sony, Samsung, and Metz televisions.
Under Jaggies 2, all three properly smooth the first two bars, but none are able to keep the third line smooth.
Although all three earn the same numerical score, the XGI still does marginally better at preventing aliasing.
Moving on to the waving flag, all three produce images where jaggies on the red and white stripes are minimized. None of the three GPUs completely removes the jaggies, but XGI and NVIDIA do a better job than ATI.
Among the three GPUs, ATI does the worst. NVIDIA and XGI perform similarly, with some portions of the flag looking better on NVIDIA and some looking better on XGI.
1st place: XGI (13 points)
2nd place: NVIDIA (11 points)
3rd place: ATI (6 points)
When it comes to deinterlacing 30 fps content such as TV shows, the $50 XGI bests the GeForce 6600 and Radeon X800 by a substantial margin.
When it comes to detail enhancement, NVIDIA does not implement overlay-specific sharpening tools; XGI does. ATI seems to apply some level of sharpening to the image, but it does not let users adjust the amount the way XGI does; it's possible that the Multimedia Center is responsible for the forced sharpening. XGI's sharpening algorithm does not appear to be particularly advanced, as it creates halos at the highest settings. Still, we like the fact that its sharpening is user configurable.
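XGI hasn't documented its sharpening filter, but a garden-variety unsharp mask is enough to show where halos come from. This is a sketch of the failure mode, not XGI's actual algorithm:

```python
import numpy as np

def unsharp_mask(line: np.ndarray, amount: float) -> np.ndarray:
    # Classic unsharp mask on one scanline: add back the difference
    # between the signal and a blurred copy. Large amounts overshoot
    # on both sides of an edge, which reads on screen as halos.
    padded = np.pad(line.astype(np.float32), 1, mode="edge")
    blurred = np.convolve(padded, np.ones(3) / 3, mode="valid")
    return np.clip(line + amount * (line - blurred), 0, 255).astype(np.uint8)

edge = np.array([64, 64, 64, 192, 192, 192], dtype=np.float32)
print(unsharp_mask(edge, 0.5))  # [ 64  64  42 213 192 192] -- mild over/undershoot
print(unsharp_mask(edge, 3.0))  # [ 64  64   0 255 192 192] -- pinned values = halos
```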
None of the cards have noise reduction.
Tied for 2nd place: ATI and XGI (5 points)
3rd place: NVIDIA (0 points)
By the testing standards, NVIDIA gets 0 while ATI and XGI get 5. However, as I have said before, the numbers can be misinterpreted. A score of zero means that NVIDIA is faithfully reproducing the material on the DVD. Sharpening can be a necessity, but problems occur when it is applied at multiple stages. Some DVDs are already sharpened, meaning you won't want any additional processing; some televisions sharpen, so you won't want the video card doing any extra processing either.
When outputting to a television, you will want the TV to control the level of sharpening whenever possible, because it can take into account your viewing distance and screen size. You will want to send it an unsharpened image, so NVIDIA and XGI are the best choices; since XGI does not have component video out, NVIDIA actually takes the crown.
When outputting to a monitor, you will typically need some level of sharpening, so the XGI is the best option here. ATI doesn't actually benefit, because DVDs that are already sharpened will look better on NVIDIA.
If you recall my proposed scoring mechanism of forced sharpening being worse than no sharpening, we have XGI > NVIDIA > ATI.
The ATI Radeon X800 XL fails miserably on both of these tests. The XGI does reasonably well, although it takes a brief moment to lock onto the 3:2 cadence. NVIDIA PureVideo detects the 3:2 cadences correctly. Interestingly, PureVideo also seems to handle the 3:2:3:2:2 cadence well; this is sometimes used for movies shown on TV. None of the three cards are able to detect unusual cadences.
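None of the vendors disclose how their cadence detectors work, but the core idea behind 3:2 detection is simple to sketch. Telecined 24 fps film repeats one field in every five, so a detector can watch for a near-zero field difference recurring with period 5. The helper below and its field-difference inputs are invented for illustration:

```python
def detect_32_cadence(field_diffs: list[float], threshold: float = 1.0) -> bool:
    # field_diffs[i] is assumed to be the mean absolute difference between
    # field i and field i-2; a repeated field scores near zero. In a clean
    # 3:2 telecine cadence that repeat recurs every 5 fields.
    repeats = [i for i, d in enumerate(field_diffs) if d < threshold]
    gaps = [b - a for a, b in zip(repeats, repeats[1:])]
    # Lock on only after seeing a steady period-5 rhythm.
    return len(gaps) >= 2 and all(g == 5 for g in gaps)

# Hypothetical differences for fields from a telecined 24 fps movie:
diffs = [9.0, 8.5, 0.2, 9.1, 8.8, 9.3, 8.7, 0.3, 9.0, 8.9, 9.2, 8.6, 0.1]
print(detect_32_cadence(diffs))  # True -> weave film frames at full resolution
```

Once a processor loses that lock (a bad edit, or a broadcaster's 3:2:3:2:2 speed-up), it has to fall back to video-mode deinterlacing until it reacquires the rhythm, which is exactly the brief hesitation we saw from the XGI.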
1st place: NVIDIA (20 points)
2nd place: XGI (15 points)
3rd place: ATI (0 points)
The ATI Radeon X800 XL is a poor choice for watching movies. NVIDIA and XGI are equally good for playback of well-mastered DVDs, but NVIDIA does a better job with the vari-speed cadence (3:2:3:2:2) and locks onto a bad 3:2 edit faster than the XGI.
Tied for 2nd place: ATI and NVIDIA (15 points)
3rd place: XGI (0 points)
With the XGI, you may see artifacts when watching movies broadcast on channels like CNN or MSNBC, or during credits sequences. You don't see artifacts with the ATI or NVIDIA cards under such conditions, but you still lose half your resolution.
Total HQV Benchmark Scores:
NVIDIA GeForce 6600: 51
"Still going strong after one year, PureVideo represents the most balanced video solution that provides great image quality with just about everything."
XGI Volari 8300: 38
"The XGI's low score on the benchmark is due to the fact that the Volari doesn't handle uncommon scenarios well. But common things are common, and for Hollywood movies and films, it's actually just as competitive as NVIDIA's PureVideo, sacrificing some cadence detection for a slight improvement in 30 fps de-interlacing."
ATI Radeon X800 XL: 36
"At least with current drivers, it's a poor choice given that it is unable to properly detect 3:2 cadences meaning that it ignores half of the resolution available with Hollywood films."
Real-world tests:
We finally did a real-world test using a copy of the Friends series finale. Again, ignore the differences in color.
This is a test of 3:2 detection and again we see that NVIDIA and XGI are superior to ATI's Radeon X800 XL.
The star of today's round-up is the XGI Volari 8300. Although it doesn't have as robust a cadence detection algorithm as NVIDIA's PureVideo, for most real-world content it's superb. With 3:2 detection, well-produced DVD movies will look just as good on the XGI as they do on the GeForce 6600, and with the added benefit of user-adjustable sharpening, the XGI is a good choice. When it comes to deinterlacing standard 30 fps content, the Volari actually beats NVIDIA's solution thanks to its edge-adaptive deinterlacing (diagonal filtering).
The best way to describe the Volari 8300 and GeForce 6600 is that they have non-overlapping talents when it comes to video. The Volari 8300 does a superb job when it's able to appropriately detect the content, but it doesn't detect content as well as NVIDIA does. Still, remember that the Volari 8300 is a $50 video card that competes against the GeForce 6200. Since the GF6200 doesn't have the same video capabilities as the 6600, the XGI Volari 8300 is the clear choice for video quality in the $50 price range.
It's ATI who has fallen behind the video performance curve. When a $270 Radeon X800 XL fails to keep up with a $50 value card from XGI, you know something is wrong. Still, with AVIVO, ATI is committed to making substantial improvements in video quality across its product line. Like NVIDIA PureVideo, ATI’s video processing hardware is software programmable. Our sources indicate that ATI is testing new drivers that will use more of the video hardware in the X1800 and the X800. The buzz we’re getting from our sources is that this new driver may bring ATI’s video performance into the triple digits in the HQV Benchmark! What about high-definition performance? We'll have to wait and see.