Summary: NVIDIA has developed a more powerful video processor for the GeForce 8500/8600 that is capable of offloading H.264 decoding for Blu-Ray and HD-DVD playback. In this article, Alan goes over all the changes. See what's new inside!
As we approach the halfway point of 2007, the world of PC video is beginning to shift. HD-DVD and Blu-Ray are beginning to attain the critical mass they once lacked. Today, HD-DVD drives can be found for as little as $200, while Blu-Ray burners have finally dropped below the $500 mark. By Christmas, you can expect prices to be half that as more OEMs (Dell, HP, Sony) begin to market PCs with high-definition optical drives… and as the OEMs get on board, the economies of scale drive costs down for the rest of us.
If higher manufacturing volumes would push prices down, why haven’t OEMs ramped up production already? It’s not HDCP. Now that 20” widescreen LCD monitors are mainstream, you can easily find HDCP-enabled monitors for under $300. The problem has largely been computational expense.
With some of today’s ultra-high-bitrate content, even an Intel Core 2 Duo E6600 is pushed close to 100% CPU utilization when performing a full software decode of Blu-Ray/HD-DVD. Enthusiast-grade software such as CoreAVC can help minimize CPU utilization, but this comes at the expense of image quality (even compared to ffdshow), and CoreAVC cannot work with protected content. As a result, OEMs have been slow to adopt high-definition optical media.
Last November, NVIDIA demoed PureVideo HD. With the latest drivers, H.264 acceleration was finally enabled for HD-DVD and Blu-Ray PCs. With the PureVideo Video Processor, those same movies that pushed a Core 2 Duo E6600 to its limit now required only about 60% utilization… With PureVideo HD, high-quality HD-DVD and Blu-Ray playback on the PC was finally possible – you just still needed substantial CPU power.
Now, things have changed.
With the GeForce 8500 and GeForce 8600, NVIDIA has now introduced a second-generation PureVideo HD product. In 2005, I called ATI’s AVIVO the biggest improvement in PC video quality since the original Mach64VT. In 2007, the GeForce 8500 and GeForce 8600 take PC video quality to the next level.
The Old PureVideo Architecture
Although NVIDIA has never provided specific details about the underlying architecture of PureVideo, our best analysis suggests that the GeForce’s video processing pipeline involves several distinct elements.
The New PureVideo HD
The GeForce 8500/8600 now features a second-generation VPU, codenamed VP2. Although architectural details for the VP2 are limited, we do know that NVIDIA claims secondary video streams can also be decoded on-chip (i.e. Picture-in-Picture – useful for certain HD-DVD and Blu-Ray titles). Assuming there’s no increase in efficiency on the software side, the ability to decode secondary streams suggests that the VP2 is twice as powerful as the existing VPU inside the older GeForce GPUs. NVIDIA has a history of running some deinterlacing and quality-enhancing features on the VPU as well as on the shaders. With time, we may see new features added to VP2.
The real innovation is the new H.264 Bitstream Processing Engine. This dedicated unit provides full H.264 decoding, including entropy decoding for both context-adaptive variable-length coding and context-adaptive binary arithmetic coding (CAVLC/CABAC). Entropy decoding is the most computationally intensive step of the process and accounts for more than half of the decode time. Remember those movies that brought a Core 2 Duo to its knees? With the GeForce 8600GT, you’re looking at just 20% CPU utilization on that same system. Want another perspective? That same clip that took the Core 2 Duo to its knees? A GeForce 8600 and a Celeron 347 (3.06GHz Cedar Mill – like a 65nm Prescott) will be fine…
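To get a feel for why this stage is so serial and CPU-hungry, here is a minimal sketch of the unsigned Exp-Golomb decode (ue(v)) that underpins much of H.264’s CAVLC entropy coding. It assumes the bitstream is handed over as a plain list of bits; a real decoder works on packed bytes and interleaves many such reads per macroblock, which is exactly why a dedicated hardware engine pays off.

```python
def decode_ue(bits, pos=0):
    """Decode one unsigned Exp-Golomb codeword starting at bits[pos].

    A codeword is N leading zero bits, a terminating 1, then N info bits;
    its value is 2**N - 1 + info. Returns (value, next_pos).
    """
    zeros = 0
    while bits[pos] == 0:          # count the leading zeros
        zeros += 1
        pos += 1
    pos += 1                       # consume the terminating 1 bit
    info = 0
    for _ in range(zeros):         # read the N info bits
        info = (info << 1) | bits[pos]
        pos += 1
    return (1 << zeros) - 1 + info, pos

# The values 0, 1, 2, 3 encode as "1", "010", "011", "00100"
stream = [1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0]
pos, values = 0, []
while pos < len(stream):
    v, pos = decode_ue(stream, pos)
    values.append(v)
print(values)  # [0, 1, 2, 3]
```

Note how each codeword’s length is only known once its leading zeros have been counted – the decode is inherently bit-serial, which is what makes it such a poor fit for a general-purpose CPU.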
The GeForce 8600’s shaders are also put to good use, supporting nearly all of the quality features found in a GeForce 8800GTX-class machine. You get content-based HD inverse telecine (the ability to recover the full 1080p24 source hidden in a 1080i60 broadcast), NVIDIA’s more advanced spatial-temporal deinterlacing (capable of passing the “guitar string” test in the HQV Benchmark), and even support for unusual SD cadences. The only things currently missing are HD noise reduction and sharpening. (Although given NVIDIA’s history with the 6600GT, we may see new features arrive via drivers over the lifetime of the product.)
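A toy illustration of what inverse telecine undoes: 3:2 pulldown hides 24fps film inside 60 fields/s video by alternately repeating each film frame for three and then two fields. This sketch uses frame labels in place of pixel data and ignores top/bottom field parity – a real cadence detector has to compare actual field content across frames to spot the repeats.

```python
def pulldown_32(frames):
    """3:2 pulldown: emit 3 fields, then 2 fields, per film frame,
    turning 4 film frames (24 fps) into 10 fields (60 fields/s)."""
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

def inverse_telecine(fields):
    """Collapse runs of repeated fields back into the original frames."""
    frames = []
    for f in fields:
        if not frames or frames[-1] != f:
            frames.append(f)
    return frames

film = ["A", "B", "C", "D"]      # four 24 fps film frames
video = pulldown_32(film)        # ten interlaced fields
print(video)                     # ['A','A','A','B','B','C','C','C','D','D']
print(inverse_telecine(video) == film)  # True
```

The hard part in practice is that broadcasts break the cadence at edit points, which is why NVIDIA’s “content-based” detection (rather than blindly assuming a fixed 3:2 pattern) matters.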
Finally, the AES128 engine provides a fixed-function pipeline for accelerating the content protection used in AACS and, potentially, BD+. The GeForce 8600 line now mandates full HDCP support, complete with on-board encryption key support. Importantly, the GeForce 8600 now allows 30” Dell owners to enjoy fully HDCP-protected video.
When it comes to software, NVIDIA no longer requires an NVIDIA-branded PureVideo software decoder. (They haven’t since last year.) Under Windows XP, PureVideo quality features are available to any DXVA application, but H.264 decode will only be enabled through partner software such as InterVideo, CyberLink, and Nero via a proprietary interface. Under Windows Vista, NVIDIA is writing to the Vista-only DXVA2 API. This allows any developer to tap into NVIDIA’s H.264 decode abilities as well as NVIDIA’s video processing features. With the standardized DXVA2 interface, even enthusiasts can write to this API, and I wouldn’t be surprised to see software such as ffdshow or DScaler take advantage of the new technology. Officially, PureVideo quality features are enabled through the built-in Windows Vista MPEG-2 decoder, but I have still had better results with InterVideo’s decoder.
With the GeForce 8500 and 8600, however, NVIDIA has given video enthusiasts a superb mainstream option. With the more powerful VP2 video processor, a bitstream processor, and an AES128 decode engine, these new GPUs offload virtually all of the compute-intensive elements of Blu-Ray and HD-DVD playback. This allows HTPCs to be built around combo HD-DVD/Blu-Ray drives while avoiding the hefty cooling requirements of a flagship CPU. With today’s drivers, the only video-quality difference between the 8800 and the 8600 is HD noise reduction and HD sharpening. With tomorrow’s drivers, who knows…
The 8800 remains the "picture quality" champion thanks to HD sharpening and noise reduction, which matter most when watching broadcast HDTV. These quality features are only available on the GF8800 because they are performed by the GPU shaders.
When watching a well-mastered Blu-Ray or HD-DVD film, HD noise reduction and sharpening matter less -- noise reduction and any necessary sharpening will already have been applied when the digital master was created. In this case, the GF8600 will be the GPU of choice. It is also important to note that the VP2 is not being used to its full potential at this time. In time, it may be possible to convert the HD sharpening and noise reduction algorithms into something that runs on both the GPU shaders and the VP2 processor. If this happens, the GF8600 could have the full set of quality features found in its bigger brother, with the added benefit of the additional H.264 CPU offload. NVIDIA has a good track record of transitioning GPU-shader-based algorithms to the VP1 on the GeForce 6 and GeForce 7 series. The future is bright for NVIDIA's VP2.
Hardcore gamers looking to play games at 1600x1200 or higher resolutions will still need to go with the 8800 series GPUs, but for those of you still running 1280x1024, the 8600GTS becomes an interesting proposition. For non-gaming HTPC enthusiasts, a quietly or passively cooled 8600 is going to be a no-brainer. The only remaining question is which board manufacturer will be the first to implement a full HDMI solution.
Look forward to an updated article with comparisons between NVIDIA’s second-generation PureVideo and ATI’s still-unannounced second-generation AVIVO GPU. Oh yeah, we’ll be adding a new test to our video suite.
Yeah, that’s a Blu-Ray disc.
© Copyright 2003 FS Media, Inc.