Looking Back Pt. 2: X800 & Catalyst Under The Knife
by Ryan Smith on February 22, 2006 12:05 AM EST - Posted in GPUs
Conclusion
Having now taken a look at two different series of ATI video cards over their lifetimes, we can finally pinpoint some trends in ATI's drivers that weren't so clear after seeing just one card.
First and foremost, there is very little progressive performance improvement in most games. For any given game, unless a driver release specifically addresses it with either a major performance improvement or a bug fix, upgrading drivers isn't going to change anything. Upgrading may still be a good idea overall, since most gamers play more than one game, but on a per-game basis there's little reason to upgrade drivers.
Secondly, what performance boosts do come are almost guaranteed to arrive as significant one-time jumps. Comparing any two drivers, performance may appear to rise or fall due to the natural variation in benchmarking, but only a single game on the R420, Half-Life 2, showed a meaningful improvement rather than that variation.
Thirdly, most significant performance improvements occur either early in the life of a card or early in the life of a game, depending on which is newer. Although Far Cry is an example of this occurring a bit later in life, as was Halo on the 9700 Pro, these are the exceptions rather than the rule. For any game that's been out for more than a couple of months, don't expect ATI to make any further significant performance changes.
Fourthly, and this isn't something we were originally looking at when we started this series, after installing the Catalyst Control Center on our R420 test bed, we're growing increasingly worried about ATI's direction with their driver control utilities. The Catalyst Control Center increased the boot time of our test bed by approximately 10 seconds, using the informal "how long until the hard drive stops working" method. And now that ATI has discontinued their classic control panel, this is the only first-party way of adjusting an ATI card. ATI seems to have learned little since it first launched the Catalyst Control Center over a year ago.
Lastly, ATI seems to have taken a keener interest in 3DMark lately than they did in the days of the 9700 Pro and 3DMark 2003. For whatever reason, their 3DMark 2005 scores have kept increasing while game performance hasn't, once again providing a practical example of how deceiving synthetic benchmarks can be compared to performance in real games.
So, getting back to the primary question at hand: how will this translate into what we can expect from ATI with the upcoming R5xx series? Considering what we've seen with both the R300 and the R420, there seems to be little reason at this moment to expect ATI to deviate from what they've done with their last two generations of products. This won't be a perfect prediction, especially since the R5xx architecture is brand new and deviates further from traditional GPU design with a heavy shift towards pixel shading, but all signs point to ATI continuing to follow the trends above.
But what about NVIDIA, you may ask? Look for the ForceWare drivers to go under the knife in the near future.
24 Comments
lombric - Thursday, February 23, 2006 - link
It may be interesting to see the evolution in CPU usage under various video formats, and in image quality. I know that the introduction of AVIVO in recent drivers was very effective for the X1xx series, but what about the R420? No chance to have similar results?
Egglick - Friday, February 24, 2006 - link
As far as I know, the X1x00 cards are the only ones with AVIVO, or at least the entire feature set. So does that mean that an $80 X1300 has better video playback than an X850XT PE? Yep.
pkw111 - Thursday, February 23, 2006 - link
... but their conclusion is rather boring. True, it may be good solid research, but how about some studies that give colorful results, like comparing the non-official ATI drivers, such as WarCat, Omega, ngo, etc.?

Egglick - Thursday, February 23, 2006 - link
I think it's a little too early to make guesses about the R5xx series right now. Don't forget that both the X800Pro and the 9700Pro are R300 based, and what we're looking at is the culmination of 3+ years of tweaking and optimizing. The R580 has been out for what, a month?

We could still see very radical performance boosts for R5xx based cards, particularly the R580 with its unique shader architecture. It's also possible that performance boosts in new games will be even larger once a successive driver has been optimized for them. Basically, it's a whole new architecture, and what may have been true for both of these R300 based cards may not be true at all for R5xx.
Also, the CCC is garbage. Boo to ATI for forcing us to use it.
DieBoer - Thursday, February 23, 2006 - link
I just wish ATI would stop wasting time on optimizing 3DMark and start with games. No serious gamer would take any notice of 3DMark scores, only the average Joe.

Spoelie - Thursday, February 23, 2006 - link
The most horrifying thing about CCC is the horrendous memory usage. I had been using the normal control panel all this time, but recently formatted and downloaded the latest drivers. Windows XP's memory usage after bootup went from ~70-something MB (not much had been installed yet) to a full-fledged 200MB!! Only from installing the f*cked up driver. After some tweaking (disabling all of ATI's added services and the CCC entry in the registry's startup), I'm back at around ~95MB after startup, which is where I was before the format.

Still find it incredible what kind of default configuration the CCC 'ships' in.
Questar - Thursday, February 23, 2006 - link
You saw things that weren't there - XP's footprint is much larger than 70MB.

Spoelie - Friday, February 24, 2006 - link
Not really; once you start tweaking and don't have all programs installed, around 70MB is really not that much of a stretch with no programs open. Even so, even if the task manager for some reason is lying about the absolute numbers, there was a difference of 130MB just from installing a driver.

abhaxus - Wednesday, February 22, 2006 - link
I find it surprising that you did not run the tests with a dual core CPU to see if the dual core optimizations actually did anything in the new drivers. I know there was a writeup on them a while back, with the 5.12s I believe, but I'd like to see if newer versions got any further improvement.

SonicIce - Wednesday, February 22, 2006 - link
You can create a short 300-frame timedemo for Far Cry and play it back with the Farcry bench tool (http://www.hardwareoc.hu/index.php/p/download/st/....) in screenshot mode. This will give you perfectly consistent results. I did it once to compare the shadows on the weapon.