Matrox Parhelia-512 Review
June 25, 2002 Bob Colayco
Summary: Triple Head video, Gigacolor, 16x Fragment Anti-Aliasing and more - how does Matrox's new effort stack up? Tuan takes the Parhelia out for a test drive.
While most people staying up late tonight (as this article is being written) are doing so to watch the World Cup, we're staying up because we're enjoying something a little different. It can be argued whether or not I'm having a good time, but once in a while there are some things worth staying up for.
Introduction (Page 1 of 14)
Case in point is Matrox's Parhelia-512, which officially launches today, a few weeks after our initial preview. Anticipation for benchmark numbers was very high, and most people were eagerly waiting for board samples to ship to reviewers so they could make a judgment call on whether their next upgrade would be a Parhelia. Fortunately, we do have benchmarks to satisfy your lust for hardware knowledge.
Standing back on two feet
It's been about two years since we've seen a major release from Matrox, and the Parhelia marks Matrox's re-entry into the high-end graphics market. When we say "high-end", we're referring to not one, but many user segments. Matrox has designed the Parhelia to suit high-end gamers, high-end workstation users and, of course, the everyday enthusiast who must have multiple displays. The Parhelia, however, is much more versatile and isn't limited to just three distinct categories. You can take a feature designed for one area, like professional graphics, and bring it over to gaming. Then there are the many uses for multi-head that Matrox has designed and nurtured.
By now, you probably have read a few things about the Parhelia, and if you haven’t we strongly suggest you go to our Parhelia-512 Preview for a primer on the product before reading on with this review. There are a lot of things that we talked about in the Preview that won’t be necessary to reiterate in this review.
What we will talk about, though, is where the Parhelia stands today and how it is positioned – which is somewhat complicated. Currently, NVIDIA and ATI have offerings that have been out for a couple of months now and contain similar technology on the 3D rendering side of things. Both the GeForce4 from NVIDIA and the Radeon 8500 from ATI have been around long enough to mature their drivers as well as OS support. It's also important to note that while Matrox does have superb technology in the Parhelia, NVIDIA and ATI have both been pouring massive amounts of resources into researching and designing everything 3D. With all that in mind, it's fair to say that Matrox may have overlooked a few things on its long two-year journey back into 3D. It's also fair to say that Matrox has spent all this time researching and developing one main product rather than spreading its focus across other areas.
Whatever the case, we’ll be examining the Parhelia as thoroughly as we can over the next few pages.
SIDEBAR: Parhelia hits the market at $399 for a 128MB DDR card. Other versions will come later.
The Parhelia-512 (Page 2 of 14)
The following is a list of the many features of the Parhelia-512. Again, we strongly urge you to read the Parhelia-512 Preview before continuing so that you can fully understand the scope of all the technologies that the Parhelia contains. Understanding the architecture also makes the benchmark scores easier to interpret.
- True 512-bit GPU
- 80 million transistors on a 0.15-micron fabrication process
- 220MHz Core speed
- 325MHz memory speed (650MHz DDR)
- True 256-bit DDR memory interface
- Up to 20GB/sec. memory bandwidth
- Up to 256MB DDR unified frame buffer
- 10-bit Gigacolor Technology
- 10-bit per channel RGB rendering and output
- Over one billion simultaneously displayed colors
- 10-bit precision for 2D, 3D, DVD and video
- 10-bit frame buffer mode for ARGB (2:10:10:10)
- 10-bit RAMDACs with full gamma correction
- AGP host interface supporting up to AGP 8X
- 8-way parallel DMA streaming engine
- OpenGL 1.3 and DirectX 8.1 compliant
DualHead-HF Display Technology
- Fourth-generation DualHead
- 400MHz 10-bit RAMDAC
- Support for two digital TMDS transmitters
- Integrated 10-bit high fidelity TV/Video encoder
- TripleHead desktop
- 10-bit gamma correction
- Dual independent, gamma correctable hardware overlays
- Support for true multi-display under Windows 2000 and XP
- Hardware accelerated multi-screen OpenGL support
Quad Vertex Shader Array
- Four vertex shader units (DX8 and higher)
- Parallel processing of up to 16 vertices
- 512 instruction on-chip cache
- 256 constant registers
- Quad texturing per pixel, per clock cycle
- 64 super sample texture filtering
- Dynamic allocation of texture units
- 8-sample anisotropic and trilinear filtering on 4 dual-textured pixels/clock
- 16-sample anisotropic filtering on 4 single-textured pixels/clock
36-stage shader array
- 4 pixel pipes
- 4 texturing units per pipe
- 5 pixel shader stages per pixel pipe
- Support for up to 10 pixel stages per pass
- 4 pixels/clock throughput with quad texturing and 5 pixel shader operations
Hardware Displacement Mapping
- Patent-pending depth-adaptive tessellation for continuous level of detail geometry
- Vertex texturing for dynamic generation of geometry using texture maps
- Support for Bezier curves and N-patch (PN-triangle) evaluation
16x Fragment Antialiasing (FAA-16x)
- 16x super sampling quality on edge pixels only
- Avoids blurring of internal pixels
- Low performance overhead
- Support for Full Scene Antialiasing (FSAA)
SIDEBAR: Triple Head technology makes the Parhelia-512 the only major chip that can support three displays all off the same card
Efficiency (Page 3 of 14)
Design and research go a long way
What may surprise many is the fact that the Parhelia currently runs at a much slower clock speed than its competition, and this is because it is the largest consumer graphics processor currently available. Containing roughly 80 million transistors, the Parhelia is one complex chip. There are basically two ways that Matrox can increase the speed of the Parhelia. The first method is to move to a smaller manufacturing process and optimize the internal interconnects. This is basically the "dirty" method of increasing clock speed. The other, more elegant method is to design deeper pipelines – as Intel does in its Pentium 4 processors, although some may argue otherwise – so that increasing speed wouldn't require retooling the manufacturing line.
We also want to touch on the MHz myth. There are times when raw MHz matters, and there are times when other things are more important in determining performance. In the CPU world, the DEC Alpha processor only recently reached 1GHz, yet it remains one of the most powerful processors available. While MHz plays a role in performance, many other things contribute as well, such as cache, pipelines, memory management, I/O performance, op-code efficiency and more.
Clearly the Parhelia doesn't quite have the fill rate of the high-end GeForce4 Ti4600, but at the end of the day, it may not need the extra clock speed to compete.
Memoirs of 3dfx
Back in the day when 3dfx was still around and boasting that speed was king, NVIDIA and ATI set out to show that speed is not king, and that features and quality are the biggest factors in the overall "performance" of a graphics card. However, as NVIDIA and ATI added more and more features that developers were not using, it became apparent that speed was the major competing factor.
Since all the special features being added to graphics cards were too far ahead of their time, developers sometimes seemed unsupportive, simply because they wanted to be sure that their application would run on the widest array of configurations available. Supporting the latest and greatest often means that a developer has to sacrifice support for older technology, leaving users with outdated equipment without hope. Often, people would complain about companies like ATI and NVIDIA releasing one product after another. People were simply upset that as soon as they opened the shrink wrap on their new video card, another new chip had been announced. Suffice it to say that the real culprit was features that couldn't be used out of the box.
This brings us to the Parhelia and the technology it contains, most of which can be used out of the box! From the advanced 16X Fragment Anti-aliasing to TripleHead to Surround Gaming to 10-bit GigaColor, all can be used today, with today's software and games. The Parhelia also contains DirectX 9 features, and although it isn't a complete DX9 solution, it does contain many of the significant DX9 features. Matrox has told us that there are parts of the Parhelia that aren't even enabled today because of the lack of software, and that the Parhelia will be ready for tomorrow's software and games as well. Whether Matrox's claim will hold true remains to be seen, but at least on some counts, we're not stuck with "T&L" hype that no one can yet use.
SIDEBAR: Korea’s World Cup Cinderella story came to an end last night, when they fell 1-0 to the Germans.
The card itself (Page 4 of 14)
The card and its features
It was almost a strange experience to look at the Parhelia because the PCB is a regular green color. These days, we’re so used to multi-colored graphics cards that seeing a “normal” card seems strange. Colors aside, there are many unique features on the board itself.
|<% print_image("01"); %>||<% print_image("02"); %>|
The first thing you'll notice is the strange memory positioning that Matrox has opted for. In fact, this arrangement is far from strange and is more efficient than the typical straight alignment we regularly see, because it allows for shorter traces, which reduces interference, improves signal integrity and ensures that critical memory timings are sustained.
The next visible feature is the pair of DVI-I connectors. These connectors carry both digital and analog signals on the same connector. Should you have a standard DB-15 VGA monitor, simply use the supplied converter to attach it. For the time being, it's probably best to stick to tried and true analog displays, simply because digital displays (CRT and LCD included) don't offer the high resolutions that standard CRTs can reach. Moreover, DVI-enabled CRTs lack the refresh rates required to reduce eyestrain. All these drawbacks of the DVI specification stem from the fact that it is a digital interface with a finite, defined bandwidth. The current DVI-I spec doesn't have enough bandwidth to supply ultra-high resolutions at high refresh rates.
On the back of the card things are also quite interesting as there are 8 vacant areas where the extra 128MB of memory will be placed with the 256MB Parhelia boards coming later. Our Parhelia came stocked with 128MB of DDR 325MHz memory for an effective clock rate of 650MHz – on par with the competition. Throughout this review we want to keep reminding you that the Parhelia is running more than 100MHz slower at the core than the GeForce4 Ti4600. Keep this in mind as you go through the benchmarks and things will make more sense.
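Those clock and bus figures are enough to derive the "up to 20GB/sec." bandwidth number from the spec list. A quick sanity check (a simple sketch; the variable names are ours, purely for illustration):

```python
# Peak memory bandwidth of the Parhelia's 256-bit DDR interface at 325MHz.
bus_width_bits = 256        # true 256-bit DDR memory interface
memory_clock_hz = 325e6     # 325MHz base clock
transfers_per_clock = 2     # DDR: data moves on both clock edges

bytes_per_transfer = bus_width_bits / 8  # 32 bytes per transfer
bandwidth_gb_s = bytes_per_transfer * memory_clock_hz * transfers_per_clock / 1e9
print(f"{bandwidth_gb_s:.1f} GB/s")  # -> 20.8 GB/s
```

That works out to 20.8GB/s peak, which is where Matrox's "up to 20GB/sec." claim comes from, with a bit of rounding down.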
|<% print_image("03"); %>||<% print_image("04"); %>|
Filters are the key
Examining the card more closely, we take a look at Matrox's 5th-order filters on the back of the Parhelia (there is another set on the front of the card for the other DVI connector), which are responsible for the razor-sharp images and unsurpassed image quality that Matrox has been famous for.
SIDEBAR: NVIDIA is often criticized for the lack of 2D quality on its cards – since it relies on 3rd-party OEMs to produce its products, it cannot exert as much control over the quality of the filters used on the cards as ATI and Matrox, who manufacture their own cards.
2D and GigaColor (Page 5 of 14)
Still the king of image quality
Thanks to the design of its analog filters, Matrox is betting that no other card will produce the same high-fidelity images – and to a great extent, it is right. It's no surprise that Matrox makes such a claim, since it has long been the sole supplier of graphics cards tailored to perfectionists and graphic artists alike.
This time around, however, there's another player on the block boasting about its 2D quality as well. If you've been reading FiringSquad regularly, you will have noticed us talk about Leadtek and the filtering system integrated on its GeForce3 and GeForce4 cards. For a blind test consisting of text examination, we obtained the Sony F500 monitor to serve as our de facto standard for image comparison. Arguably the world's sharpest CRT, the F500 rolls in at a retina-cutting 0.22mm dot pitch. The card we used to test against the Parhelia was Leadtek's WinFast A250 Ultra – a GeForce4 Ti4600 with high image quality.
So who won the test? Matrox did, but not by much. Often we found ourselves questioning whether our vision was good enough to see the difference, because it was very, very difficult. It took a lot of staring and squinting to notice any difference at all. Ironically, this says more for Leadtek than it does for Matrox. In terms of 2D quality, you simply expect Matrox to beat the competition, but you would never guess that a GeForce4 (out of the many that are out there) would approach Matrox's quality so closely. The Parhelia possesses serious filtering design, and the Leadtek comes very close. Both the Leadtek WinFast and the Parhelia are sharper than Matrox's previous cards like the G550 and G400.
One of the big features in the Parhelia that people are talking about is GigaColor. The usual color arrangement for 16.7 million colors is 8-8-8-8 for R-G-B-A respectively (A for alpha). GigaColor takes a total of 6 bits from the alpha channel and distributes them evenly over RGB for a 10-10-10-2 arrangement. This gives you a palette of over one billion colors, but leaves the alpha channel with a little something to be desired. Ironically, in Matrox's effort to eliminate banding and increase overall image quality, docking that many bits from the alpha channel can increase banding in the transparency effects you often see in games.
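The two layouts pack into the same 32-bit word; here's a small sketch of the difference (the function names are ours, purely for illustration):

```python
def pack_8888(r, g, b, a):
    """Standard 32-bit ARGB: 8 bits per channel, 256 levels each."""
    return (a << 24) | (r << 16) | (g << 8) | b

def pack_2101010(r, g, b, a):
    """GigaColor-style ARGB 2:10:10:10: 10 bits per color channel
    (1024 levels each), but only 2 bits (4 levels) left for alpha."""
    return (a << 30) | (r << 20) | (g << 10) | b

# Same 32 bits either way, but the color palette grows from 2**24
# (16.7 million) to 2**30 (over one billion) distinct colors.
print(2 ** 24, 2 ** 30)  # -> 16777216 1073741824
```

The trade-off is visible in the alpha channel: 4 opacity levels instead of 256, which is why transparency effects can band under GigaColor.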
To test the effectiveness of GigaColor as an everyday use feature we heavily scrutinized the viewer included with the Parhelia which is currently the only application available that is capable of displaying 10-bit GigaColor. We wanted to ensure that there weren’t any tricks being played with the viewer since the supplied images (used by the viewer for comparing 16.7m versus 1b colors) contained heavy banding to begin with that 16.7m colors could have easily eliminated.
SIDEBAR: 2D Quality is of big concern for artists who work with a lot of images and developers who spend time staring at a lot of text.
GigaColor tests (Page 6 of 14)
The test setup
Image viewers used:
Matrox GigaColor Viewer
- Has two modes of filtering
- Linear or Point
- One must be selected; cannot be disabled
- Image modified because of filters
Test file used
- 16-bit/channel RGB
- Standard TIFF format; no compression
- Original supplied file from Matrox demo CD
Tests and experiences
- Using provided cubes_(1.0).tif file from GigaColor Viewer installation
- Viewed image in standard image viewer (ACDSee)
- Viewed image in GigaColor Viewer (viewer set to 16.8m colors/8-bits per channel with point filtering)
- Viewed image in GigaColor Viewer (viewer set to 1b colors/16-bits per channel with point filtering)
- No banding
- 500% shows no pixelation*
- Viewed image in Photoshop
- Image > Mode > 8 bits/channel
Is GigaColor Viewer simply converting 16-bit/channel TIFF to 8-bits/channel TIFF?
|<% print_image("05"); %>||<% print_image("06"); %>|
- Created grayscale 16-bits/channel TIFF in Photoshop and saved (1024x300/grayscale gradient)
- Image has slight banding in Photoshop
- 500% zoom shows some pixelation
- GigaColor Viewer notes file as being 1 billion colors (10-bits/channel)
- Image has lots of banding in GigaColor Viewer (viewer set to 1b colors)
- Tools > Split Image > Vertical
- 16.8m color side has no banding
- 1b color side still has same banding
- Opened same image in Photoshop
- Image has slight banding (file is 16 bits/channel)
- 500% zoom shows some pixelation
- Image > Mode > 8 bits/channel
- No banding
- 500% zoom shows more pixelation than at 16 bits/channel
- This is because the 8-bit conversion dithers (spackles) the pixels rather than leaving clean steps between shades. You therefore see less banding but more pixelation: when moving down in bits, the dithering makes the edges between shades (the bands) more difficult to see.
*GigaColor Viewer has pixel filtering which we could not disable.
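The banding in our 1024-pixel-wide gradient is easy to explain with a little arithmetic (the widths and bit depths here are illustrative figures, not measurements):

```python
# How wide each visible band is in a 1024px-wide smooth gradient,
# depending on how many distinct levels the output can show.
gradient_width_px = 1024

for bits in (8, 10):
    levels = 2 ** bits
    band_px = gradient_width_px / levels
    print(f"{bits} bits/channel: {levels} levels -> {band_px:g}px bands")
# At 8 bits the gradient collapses into 4px-wide steps (visible banding);
# at 10 bits each band is a single pixel wide, effectively invisible.
```

A dithered 8-bit conversion hides those 4px steps by scattering pixels between adjacent levels, which is exactly the less-banding-but-more-pixelation trade we saw in Photoshop.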
SIDEBAR: World Cup matches at 4AM + Neverwinter Nights = one bleary eyed editor
GigaColor cont. (Page 7 of 14)
We achieved the same visible results (without intense zooming) using Photoshop's 16-bit to 8-bit conversion. The only difference is that when viewing with GigaColor Viewer, we don't see any pixelation. So far, it seems as though GigaColor Viewer is converting 16-bit/channel images down to 8-bit/channel images to achieve these results. When viewing the supplied cubes_(1.0).tif file, there is no pixelation when zoomed in 500% and there is no banding. Is GigaColor Viewer achieving a pixelation-free image by using filtering?
We then used Microsoft's Image Preview (which has pixel filtering) to view the grayscale image we created in Photoshop and converted from 16 bits/channel down to 8 bits/channel (the one that looks exactly like the 10-bit GigaColor sample in the GC Viewer).
Result: edgeless pixelation (a blotchy pattern) when zoomed in 500% – not as smooth as the cubes_(1.0).tif image viewed with GigaColor Viewer. Oddly, GigaColor Viewer indicates that all 16-bit/channel TIFF files are 1-billion-color palette images. Why?
Also, we can't be sure that Microsoft Image Preview is using the same filtering method as GigaColor Viewer. The only fair comparison would be to view the image in ACDSee (no filtering) and in GigaColor Viewer with no filtering enabled – which is not an available option. Banding is also easier to show when an image contains only a few, highly contrasting colors. When viewing a regular photograph in GigaColor Viewer, no difference can be seen in Split mode between the two sides (16.8m colors and 1b colors). There is a slight increase in vibrancy when GigaColor is enabled, but the same effect can be achieved with NVIDIA's Digital Vibrance option in its drivers.
We spoke with Matrox extensively about GigaColor Viewer over the weekend and came to the conclusion that there are things in the Viewer that need updating and fixing. The 12MB data size limit is a big downer, as is a bug that seems to keep pixel filtering on (creating a smoother appearance than the actual image) no matter what setting we chose.
While GigaColor does make a difference (we'd still require a non-filtered version of GigaColor Viewer to be absolutely certain), it does not make a noticeable difference in everyday use, especially because you can't see the difference simply by using any image viewer or surfing around your desktop. GigaColor is mainly aimed at the graphics artist who wants to see what his or her masterpiece looks like in GigaColor and print in GigaColor as well. Microsoft's next-generation OS will support higher color precision out of the box, and Matrox will be ready when the new OS is released.
It's not every day that you stare at 5 or 6 highly contrasting colors on the screen (the only way to clearly see banding); rather, you look at many millions of colors mixed together in games and photographs, making banding virtually impossible to detect. Matrox assures us that it will quickly update the viewer to support large files.
We’re approaching GigaColor from a gamer’s point of view but we have heard others in different areas of the industry like photography and film who think very highly of GigaColor. Don’t get us wrong though. We think GigaColor is a great feature to have, it’s just not that practical for gamers at the moment.
SIDEBAR: Riots have broken out all over the world due to World Cup results. If you’re a foreigner in a country that has been eliminated, we hope you’ve been keeping a low profile (i.e. Koreans living in Italy, Japanese living in Russia, etc.)
TripleHead (Page 8 of 14)
There's not much you can analyze about Surround Gaming except to see the feature in action for yourself. Fortunately, we were able to sit down with three 22" monitors from NEC-Mitsubishi for some intense gaming with Neverwinter Nights. The game is already really good on a single monitor, but once you use three, there's no going back.
|<% print_image("08"); %>||<% print_image("09"); %>|
With the Parhelia, you realize immediate advantages as soon as you unpack the card from its box. Not only is Surround Gaming an amazing feature to have, it works with many current games. With Neverwinter Nights, we simply picked the appropriate resolution (listed only if you're in TripleHead mode) and away we went. You can't see it in the pictures, but performance was also very good. The game played very smoothly and we were able to see much more of the playing field. Other games that work with Surround Gaming include Quake 3 (which likely means Doom 3 will be supported when it arrives), Jedi Knight 2 and many others. The list of games is limited only by the imagination of developers.
TripleHead isn't just for gamers. The benefits of using dual monitors are endless, and adding another simply increases the level of interaction that people have with their computers. Suddenly everything feels much freer, and you're not limited to a particular resolution or monitor. Below you can see that we're already becoming very fond of TripleHead and its use as an everyday tool. We're able to do many more things simultaneously without the fuss of alt-tabbing.
|<% print_image("10"); %>||<% print_image("11"); %>|
|<% print_image("12"); %>||<% print_image("13"); %>||<% print_image("14"); %>|
The Matrox Reef demo was the most impressive thing to watch. It ran smooth as silk at a resolution of 2400x600 with 16X Fragment Anti-aliasing. Another notable feature is being able to utilize multiple displays as a single desktop. Many other multi-monitor solutions only allow the use of two distinct desktop spaces. With TripleHead, you can do that and much more. TripleHead isn't limited to everyday graphics, but extends to video as well. Again, Matrox gives us a multi-monitor solution that we can use, and that's feature-packed. This is no surprise, considering that Matrox has the most advanced and mature multi-monitor solution available.
SIDEBAR: Triple Head works great….if you have three monitors to spare! I think I’m going to break into Tuan’s place and rip off those three monitors he used.
16X Fragment AA (Page 9 of 14)
Anti-aliasing done right
Let's briefly return to the subject of clock speed. There's a real reason why you need fast clock speeds and high fill rates these days: more and more people want to run their games at higher and higher resolutions, because high resolution reduces jaggies (aliasing) on the edges of objects in a game. High fill rate is required to render scenes at those resolutions without dropping below acceptable frame rates.
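To put rough numbers on that demand, here is a back-of-the-envelope fill-rate estimate (the 60fps target and the 3x overdraw factor are our illustrative assumptions, not measured values):

```python
# Approximate pixel fill rate needed at a given resolution and frame rate.
fps_target = 60   # assumed target frame rate
overdraw = 3      # assumed average overdraw per frame

for width, height in ((1024, 768), (1600, 1200)):
    mpixels_per_s = width * height * fps_target * overdraw / 1e6
    print(f"{width}x{height}: ~{mpixels_per_s:.0f} Mpixels/s")
```

Running at 1600x1200 asks for roughly 2.4x the fill rate of 1024x768, which is why getting high-resolution quality out of the lower resolution is so attractive.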
The idea with the Parhelia is that you don't need sky-high fill rates and fast clock speeds. With 16X Fragment Anti-aliasing, the need to run games at super high resolutions is no longer there. Matrox has truly taken FSAA to the next level by applying it only to the pixels that need it. Now you can play your games at 1024x768 (or whichever resolution you like, though 1024 seems to be the most popular these days) and get the same visual results as though you were running at 1600x1200 with 4X FSAA.
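The principle can be sketched in a few lines of code. This is our toy illustration of the edge-only idea, not Matrox's actual hardware algorithm:

```python
def fragment_aa(image, is_edge, supersample):
    """Antialias only edge pixels.

    image: 2D list of pixel values; is_edge(x, y) -> bool marks pixels on
    polygon edges; supersample(x, y) returns a high-quality (e.g. 16-sample)
    value for that pixel. Interior pixels are copied through untouched.
    """
    out = [row[:] for row in image]
    edge_pixels = 0
    for y, row in enumerate(image):
        for x, _ in enumerate(row):
            if is_edge(x, y):
                out[y][x] = supersample(x, y)  # pay the 16x cost here only
                edge_pixels += 1
    return out, edge_pixels
```

In a typical frame only a small fraction of pixels lie on edges, so the expensive supersampling touches a small part of the screen – which is why the performance overhead stays low while interior texture detail is never blurred.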
When Matrox compared its 16X F-FSAA to the competition's methods, we were slightly skeptical about what it could do and how fast it could do it. After playing with it for a while, we can say that we've witnessed the next generation of FSAA. Not only is Matrox's 16X Fragment AA more elegant than other brute-force methods, it looks much better as well. Take a look at the following images:
|<% print_image("15"); %>||<% print_image("16"); %>||<% print_image("17"); %>|
The screenshots (actual game play) above clearly show the superiority of 16X Fragment AA over the competition. The screenshots were taken while playing Quake 3 at 1024x768 using various AA techniques. As you can see, the difference between 4X AA and Matrox’s 16X Fragment method is large. In action, 16X Fragment AA really does give you the quality of playing at 1600x1200 with 4X FSAA but without the huge performance penalties. Quake 3 played smooth as silk and never stalled.
Is 16X Fragment Anti-aliasing available now? Yes! You can enable it in all your games for immediate gains and trust us when we say that it looks damn beautiful – there simply is no comparison.
SIDEBAR: Go back to our Parhelia Preview for a discussion of 16x Fragment AA
System Setup (Page 10 of 14)
AMD Athlon XP 2000
1GB Corsair PC2100 Registered DDR SDRAM – ECC disabled
Matrox Parhelia-512 128MB
Driver version 0225
Leadtek A250 Ultra GeForce4 128MB
Driver version Detonator 28.32
30GB IBM Deskstar DTLA 307030 ATA/100 hard drive
Toshiba 10X DVD-ROM
Windows XP Professional
3DMark 2001 – 32-bit color, 32-bit textures
Quake 3 Arena 1.17 – High Quality
Jedi Knight 2 – High Quality
We apologize for not including ATI's Radeon 8500; we received our Parhelia sample a few days late in the week and ran out of time.
SIDEBAR: We’d like to have included the new Unreal benchmark – thus far, Epic has not been responsive to our requests to get a copy of this software.
3DMark 2001 (Page 11 of 14)
3DMark 2001 – DirectX 8
The Parhelia performs very well considering that it's clocked more than 100MHz slower than the GeForce4. It's slightly unfair to gauge the two cards against each other because of differences in driver maturity and clock speed, but this goes to show that if the Parhelia performs this well while clocked so low and using early drivers, then as it matures and clock speeds improve, it may be even faster than the GeForce4. Bear in mind, however, that NVIDIA won't exactly be standing still, with new parts and refreshes every 6 months.
SIDEBAR: We wonder if someone will write tools to overclock the Parhelia…
Quake3 & Jedi Knight 2 (Page 12 of 14)
Quake 3 Arena 1.17
Jedi Knight 2
In actual games, the Parhelia keeps up very well with the GeForce4 considering that it's clocked slower. What's most impressive is the Parhelia's performance when 16X Fragment FSAA is enabled. You're basically achieving 1600x1200 with 4X FSAA without a major performance cost. 16X Fragment FSAA brings gaming to a whole new level. When we tried 16X F-FSAA with Quake 3, frame rates were high and the image quality was unsurpassed by any other card. Beyond the FSAA, the Parhelia is also quick with anisotropic filtering enabled. Combine the two features and you begin to see the major quality and speed benefits of the Parhelia.
Jedi Knight 2 doesn't scale as well as Quake 3 does, even though both use the same engine. The GeForce4 stays ahead in frame rates thanks to its higher clock speed and fill rate.
SIDEBAR: Remember when we used to benchmark with Expendable?
Ballistics Report (Page 13 of 14)
TripleHead: The Parhelia is the first of its kind: a consumer-level graphics card with support for three simultaneous displays. Having gone through many generations of improvement, Matrox's multi-display technology leads the crowd in quality, functionality and features. While others claim to "have" dual-display support, Matrox holds the crown in usability and in support for a third display.
The possibilities of TripleHead are nearly endless. From Surround Gaming to desktop publishing, you simply can’t go wrong with more than one display. Most people don’t think they have a use for more than one screen, but as soon as they try a multi-display capable system, they quickly realize how useful having more than one screen can be. Packed to the brim with features that even take advantage of TV-out, TripleHead is the de-facto standard for multi-display support.
16X Fragment Anti-aliasing: Matrox has done an incredible job with FSAA and driving FSAA as the standard for graphics. This is FSAA the way it should have been. Intelligently designed, Matrox’s 16X F-FSAA takes into account only the pixels that are actually aliased. Since it only filters fragmented pixels, all other pixels are left untouched and retain complete image quality.
DirectX 9 support: The Parhelia is ready for DX9 and contains many features that still lie dormant. Many parts of the Parhelia's pixel and vertex shading hardware go largely unused today but will be taken full advantage of when DX9 hits the streets.
GigaColor: Covering all the bases, the Parhelia offers publishers and graphics professionals what they need to work accurately. While GigaColor doesn't make a huge impact in all conditions, it does what it's supposed to do, and that's supply 1 billion colors. You won't really notice any improvement in games and regular Windows use – at least not until Microsoft releases its next-generation OS with support for high color precision.
Image Quality: This is one area where Matrox has never failed. Known worldwide for its superb image quality, Matrox has upped the ante with the Parhelia. Besting even its previous cards, Matrox implements the highest quality imaging electronics on its board, so all you have to do is say "wow".
Runs cool: The Parhelia runs much cooler than the competition, thanks to its lower clock speed and efficient architecture.
Some driver issues: There are some driver issues that still need to be ironed out, though it's a given that drivers will mature over time. We also would have liked to see more tweaking options in the drivers; hopefully, more features will be added.
GigaColor limitation: Currently, GigaColor is only viewable using Matrox's GC Viewer, and even the viewer has issues that still need to be resolved. A 12MB data size limit is peanuts when you're a graphics artist working with files that are hundreds of megabytes in size and are unable to see the results as expected.
Speed: Obviously, we didn't get the raw, jaw-dropping speed gains we were hoping for. In retrospect, this probably shouldn't have come as a surprise; with Matrox being out of the game for a couple of years, ATI's and especially NVIDIA's driver teams are obviously going to be more experienced at optimization. This goes back to the previous point about driver issues: it will take time for the drivers to mature, and for Matrox's software team to squeeze all the juice they can out of the Parhelia. One important note – with the Parhelia, you don't need to run games at such high resolutions, because of 16X F-FSAA. You can get better visual results than people running 1600x1200 with 4X FSAA.
Price: Price will always be a concern for new releases, which goes without saying. Currently, the 128MB version of the Parhelia will retail for $400 on the street. This is pretty much in line with current GeForce4 Ti4600s. However, you get a lot more with a Parhelia for the same price.
Final Verdict (Page 14 of 14)