
Original Link: https://www.anandtech.com/show/1892
Looking Back: ATI's Catalyst Drivers Exposed
by Ryan Smith on December 11, 2005 3:22 PM EST - Posted in
- GPUs
It’s no secret in the hardware world that good software is often just as important as good hardware. The best processor, the best TV tuner, and even the best sound card can only be as good as the software and drivers backing it up. Even a small change in one critical piece of code can result in a massive difference that represents a significant change in performance and sales of a piece of hardware.
Above all, however, this concept is embodied in the realm of video cards, where over the years, we have been spoiled by promises of “A performance improvement between 17 and 24% is noticed in Jedi Knight II” and “up to 25% performance improvement in popular consumer and professional applications”. These days, it’s not just common to see GPU makers find ways to squeeze more performance out of their parts - it’s expected. Finishing the design of a GPU and launching it is just the first step of a much longer process of maximizing the performance of a part, a process that can quite literally take years.
The flexible nature of software, however, has caused a significant shift in the marketing strategies of GPU makers: the war is not over at launch time, but continues throughout the entire product cycle and into the next one, as new optimizations and bug fixes are worked into their drivers, keeping the performance landscape in constant motion. Just because a side did not win the battle at launch doesn’t mean that it can’t still take it later, and just because a side won now doesn’t mean that it will keep its win.
We have seen on more than one occasion that our benchmarks have been turned upside down and inside out, with cases such as ATI’s Catalyst 5.11 drivers suddenly giving ATI a decisive win in OpenGL games, when they were being soundly defeated just a driver version before. However, we have also seen this pressure to win drive all sides to various levels of dishonesty, hoping to capture the lead with driver optimizations that make a product look faster on a benchmark table, but literally look worse on a monitor. Quake3, 3DMark 2003, and similar incidents have shown that there is a fine line between optimizing and cheating, and that as a cost for the flexibility of software, we may sometimes see that line crossed.
That said, when the optimizations, the tweaks, the bug fixes, and the cheats are all said and done, just how much faster has all of this work made a product? Are these driver improvements really as substantial as claimed, or is much of the excitement over what are only minor changes? Do we have any way of predicting what future drivers for new products will do?
Today, we set out to answer these questions by taking a look back at a piece of hardware whose time has come and is nearly gone: ATI’s R300 GPU and the Radeon 9700 Pro.
R300 & The Test
As the patriarch of nearly 3 years' worth of technology from ATI, it is no mistake that we start out with one of the most influential GPUs ever made. The R300 was not only the primary architecture behind ATI's entire 9500-9800 line of video cards starting in late 2002, but also the basis for many of the design elements that went into ATI's R4xx GPUs, and it was only finally replaced by the R5xx series in the latter part of 2005, a testament to the strength of the R300's design.
For these reasons, not to mention the strong sales of both R3xx and R4xx based video cards, the R300 and its host video card, the Radeon 9700 Pro, are a prime example of what developments in device drivers can mean for a product. ATI has now had over 3 years to make the most of the software that drives the 9700 Pro, allowing us to see just how much more performance ATI could get out of the card with later drivers than was evident at the card's launch.
To identify and analyze these improvements, we have taken a 9700 Pro and run it through a modified version of one of our earlier benchmark suites, testing a slew of games and benchmarks in a regression test against a dozen versions of ATI's drivers, taking a quarterly snapshot of performance and image quality. Unfortunately, in spite of the R300 hardware supporting Shader Model 2.0 from the start, ATI did not offer such support in their official drivers until some months after the card shipped, so for our testing purposes, we had to start with the first drivers that offered that support, Catalyst 3.0, which are a couple of versions newer than the first drivers for the 9700 Pro. Still, as we will see, excluding the first few months of the 9700 Pro's life does not cause us to miss the performance improvements that ATI's Catalyst team was able to work out.
The specific games/benchmarks tested were:
- D3DAFTester
- Unreal Tournament 2004
- Jedi Knight: Jedi Academy
- Warcraft 3: The Frozen Throne
- Halo
- 3dMark 2003
The test system:
- AMD Athlon 64 3400+ (S754)
- Abit KV8-MAX3 motherboard
- 2GB DDR400 RAM (2:2:2)
- 120GB Maxtor DiamondMax Plus 9 hard drive
- Antec TruePower 430W power supply
All tests were done at 1280x1024 unless otherwise noted.
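Mechanically, a regression test like this is just a nested loop over driver versions and benchmarks, with several runs averaged per combination. The sketch below is purely illustrative: the driver list matches our quarterly snapshots, but `run_benchmark` is a hypothetical stand-in that fabricates numbers, since the real process involves installing each Catalyst release and rebooting between runs.

```python
# Illustrative sketch of a driver regression harness. run_benchmark is a
# hypothetical stand-in; real testing installs each driver before its runs.
from statistics import mean

DRIVERS = ["3.00", "3.04", "3.06", "3.09", "4.02", "5.11"]  # snapshot versions
GAMES = ["UT2004", "Jedi Academy", "Halo"]

def run_benchmark(driver: str, game: str) -> float:
    """Stand-in for launching a timedemo and reading back its average fps.
    Here we fabricate a number that is stable within a single run."""
    return 60.0 + hash((driver, game)) % 20

def regression_table(drivers, games, runs=3):
    """Average several runs per (driver, game) pair into a results grid."""
    return {d: {g: mean(run_benchmark(d, g) for _ in range(runs))
                for g in games}
            for d in drivers}

table = regression_table(DRIVERS, GAMES)
for d in DRIVERS:
    print(d, {g: round(fps, 1) for g, fps in table[d].items()})
```

The resulting grid (driver version by game) is exactly the shape of the performance tables in this article, with image-quality captures taken alongside each run.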
D3DAFTester
Although it’s not a benchmark in and of itself, D3DAFTester is a great diagnostic tool to see if there has been any sort of global change in anisotropic filtering quality, and if so, what kind of impact that change offers. Before we settle down with the benchmarks, it’s best to see if there have been any global changes made to the drivers that would help or harm overall IQ.
D3DAFTester, Tunnel Mode, 8x AF
D3DAFTester, Plane Mode @ 15x75, 8x AF
Jedi Knight: Jedi Academy
Released in 2003, Jedi Academy represents the pinnacle of what the Quake3 engine could offer. With massive levels, dynamic glow, and lightsabers abound, it's one of the most punishing Quake3 engine games ever made, and a good representation of the vast number of games built on this engine in the early 2000s. As our only OpenGL title in this roundup, it's also our gauge of whether ATI's OpenGL performance changed at all over the 3-year period. That said, even with ATI's traditionally poor OpenGL performance, we still had to increase our testing resolution to 1600x1200 in order to put a sizable dent into our test setup; otherwise, we would continuously hit the 100fps frame rate cap.
Looking at the screen captures, however, we see a very interesting story that the benchmarks do not show, and it's not all performance related.
Catalyst 3.09 versus 3.06 (mouse over to see 3.06)
Overall, however, Jedi Academy shows that other than early improvements and a bug fix, there was little change in performance in this game with the 9700 Pro.
Unreal Tournament 2004
As UT2003 and UT2004 are near-perfect substitutes for each other, we went with Epic's latest version of their best-selling multi-player FPS in order to put the 9700 Pro up against 2004's more refined engine. UT2004 is a good example of a near-modern game, utilizing some SM 1.x features, and its engine was the engine of choice for many more games, including America's Army. With the number of games built on Unreal Engine 2.x, UT2004 represented an important engine to optimize for in its era.
Catalyst 5.11 versus 3.00 (mouse over to see 3.00)
Other than improving AA/AF performance, it seems that ATI had little need to optimize for UT2004. Without a performance or IQ difference, there is little to say about the 9700 Pro with regards to UT2004.
Warcraft 3: The Frozen Throne
Leaving the realm of FPSs for a bit, we take a look at Blizzard's massively popular RTS Warcraft 3 and its expansion pack, The Frozen Throne. As we have mentioned in the past, for all its superb image quality, WC3 is not a terribly performance-intensive game, so we aren't expecting any surprises here. Because WC3 does not have a benchmarking mode, all frame rates are approximate, measured with a custom replay and FRAPS.
Catalyst 5.11 versus 3.00 (mouse over to see 3.00)
Even more than with UT2004, Warcraft 3 is a non-story. ATI did not make any driver changes that significantly impacted either IQ or performance.
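For reference, FRAPS-style measurement reduces to counting frame-present events over a timed window and dividing by the elapsed time. A minimal sketch of that arithmetic, using synthetic timestamps rather than FRAPS's actual API-hooking machinery:

```python
# Minimal sketch of FRAPS-style fps measurement: frames rendered over a
# window, divided by elapsed time. Timestamps here are synthetic.
def average_fps(frame_times):
    """Average fps over a replay; frame_times are the timestamps (in
    seconds) at which successive frames completed."""
    if len(frame_times) < 2:
        return 0.0
    elapsed = frame_times[-1] - frame_times[0]
    return (len(frame_times) - 1) / elapsed

# A synthetic 10-second replay at a steady 50 fps (one frame every 20 ms):
times = [i * 0.02 for i in range(501)]
print(round(average_fps(times), 1))  # → 50.0
```

Because the counter only sees completed frames, a short capture window can visibly quantize the result, which is one reason replay-based numbers like ours are best treated as approximate.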
Halo
Halo represents both a curse and a blessing among possible titles to benchmark. As one of the first FPSs to make good use of SM 2.0 and a very popular title on both the PC and Xbox, it's an important title to use to see what ATI could do with the newest feature of the 9700 Pro; but at the same time, it was still a console port that was in many ways mediocre. While we would have liked to test Gearbox's "Custom Edition" that implements properly optimized shaders, the lack of single player support in that version has limited us to the less optimized original version of the game. The lack of AA support has also limited us to only benchmarking Halo without any advanced features turned on.
Catalyst 4.02 versus 3.09 (mouse over to see 3.09)
As all the other pre-3.09 shots are indistinguishable from our 3.09 shot, and all post-4.02 shots are just like our 4.02 shot, there appears to be no other IQ change other than the flashlight fix. Overall, Halo stands apart as a game that received a constant (if small) improvement in performance, and the second game to receive an IQ-related fix.
3dMark 2003
3dMark is not a benchmark that we routinely bring you here at AnandTech, as our editorial policy is to bring you benchmarks from real-world games and engines, and not synthetic metrics. That said, it would be inappropriate to leave out 3dMark in this case due to the significant cheating incidents with it. And, as a flashy, system-draining benchmark backed by a unique database for comparisons, it's still an important title in the eyes of many consumers, OEMs, and the GPU makers looking for bragging rights.
With 3dMark, its importance in this regression is not so much the performance improvements themselves - the improvements were most certainly exaggerated in part by the synthetic nature of the benchmark - but rather as a demonstration of what can happen when ATI dedicates its resources to a game/benchmark that it considers most important. We should note that ATI has admitted to "cheating" on 3dMark 2003; however, these were what we consider honest shader-replacement optimizations (same mathematical output) that ATI voluntarily removed, though they were apparently re-introduced at some point. We used the latest version of 3dMark 2003, so this "cheat" was not activated in the older drivers.
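To illustrate what a shader replacement with "the same mathematical output" looks like in miniature: a generic power function in a specular-style term can be replaced with a short chain of multiplications when the exponent is a known small integer. This is our own toy example, not ATI's actual driver code:

```python
# Toy illustration of an honest "shader replacement": swap a generic pow()
# for repeated multiplication when the exponent is a known small integer.
# The output is mathematically identical, which is the line ATI drew
# between an optimization and a cheat.
def specular_generic(n_dot_h, exponent=8):
    return n_dot_h ** exponent

def specular_replaced(n_dot_h):
    # pow(x, 8) unrolled as three squarings: ((x^2)^2)^2
    x = n_dot_h * n_dot_h   # x^2
    x = x * x               # x^4
    return x * x            # x^8

samples = [i / 100.0 for i in range(101)]
assert all(abs(specular_generic(s) - specular_replaced(s)) < 1e-12
           for s in samples)
```

A cheat, by contrast, would substitute a shader whose output merely looks similar under the benchmark's fixed camera path while doing less work.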
For these benchmarks, 3dMark was run at its default resolution of 1024x768.
Catalyst 5.11 versus 3.00 (mouse over to see 3.00)
Overall then, 3dMark is much like Halo, a benchmark that received a slow, but steady improvement, without any fixes.
Conclusion
So, now that we have gone through 6 applications and 12 drivers, what have we learned? Not much, if we want to talk about consistency.
In general, there was one significant performance improvement across all games via the driver, and this was the move from the Catalyst 3.00 drivers to the 3.04 drivers. Otherwise, for anyone who was expecting multiple across-the-board improvements, this would be a disappointment.
Breaking down the changes by game, we see an interesting trend in which games had the greatest performance improvement. Jedi Academy, UT2004, and really every non-modern game saw no significant performance improvements that we can isolate to game-specific driver optimizations; there was only the one aforementioned general improvement. However, with our next-gen benchmarks, Halo and 3dMark, we saw a similar constant performance improvement in both, unlike with the other games.
There is also a consistent performance improvement among most of the titles we used that was isolated to when we enabled AA/AF, which is a positive sign to see given just how important AA/AF has become. With the latest cards now capable of running practically everything at a high resolution with AA/AF, it looks like ATI made a good bet in deciding to put some of their time in these kinds of optimizations.
Getting back to the original question then, are drivers all they're cracked up to be? Yes and no. If the 9700 Pro is an accurate indicator, then other cards certainly have the possibility of seeing performance improvements due to drivers, but out of 3 years of drivers, we only saw one general performance improvement, so it seems unreasonable to expect that any given driver will offer a massive performance boost across the board, or even that most titles will be significantly faster in the future. However, if you're going to be playing next-generation games that push the latest features of your hardware to its limits, then it seems likely that you'll find higher performance as time goes on; but again, this will mostly come in small increments, not a night-and-day difference between related sets of drivers.
As for the future, the Radeon 9700 Pro is by no means a crystal ball into ATI's plans, but it does give us some places to look. We've already seen ATI squeeze out a general performance improvement for OpenGL titles on their new X1000 series, and it seems likely that their memory controller is still open enough that there could be one more such improvement. Past that, it seems almost a given that we'll see future performance improvements in the most feature-intensive titles, likely no further game-specific changes for lighter games, and plenty of bug fixes along the way.