Looking Back: ATI’s Catalyst Drivers Exposed

It’s no secret in the hardware world that good software is often just as important as good hardware. The best processor, the best TV tuner, and even the best sound card can only be as good as the software and drivers backing them up. Even a small change in one critical piece of code can make a significant difference in the performance, and ultimately the sales, of a piece of hardware.

Nowhere is this concept more apparent than in the realm of video cards, where over the years we have been spoiled by promises of “A performance improvement between 17 and 24% is noticed in Jedi Knight II” and “up to 25% performance improvement in popular consumer and professional applications”. These days, it’s not just common to see GPU makers find ways to squeeze more performance out of their parts - it’s expected. Finishing the design of a GPU and launching it are just the first steps of a much longer process of maximizing the performance of a part, a process that can quite literally take years.

The flexible nature of software, however, has caused a significant shift in the marketing strategies of GPU makers: the war is not over at launch time, but continues throughout the entire product cycle and into the next one as new optimizations and bug fixes are worked into their drivers, keeping the performance landscape in constant motion. Just because a side did not win the battle at launch doesn’t mean that it can’t still be taken later, and just because a side is winning now doesn’t mean that it will keep its lead.

We have seen on more than one occasion that our benchmarks can be turned upside down and inside out, with cases such as ATI’s Catalyst 5.11 drivers suddenly giving ATI a decisive win in OpenGL games when the company was being soundly defeated just a driver version before. However, we have also seen this pressure to win drive all sides to various levels of dishonesty, hoping to capture the lead with driver optimizations that make a product look faster on a benchmark table, but quite literally look worse on a monitor. The Quake III and 3DMark03 incidents, among others, have shown that there is a fine line between optimizing and cheating, and that as a cost of the flexibility of software, we may sometimes see that line crossed.

That said, when the optimizations, the tweaks, the bug fixes, and the cheats are all said and done, just how much faster has all of this work made a product? Are these driver improvements really as substantial as they seem, or is much of the excitement and distraction over only minor gains? Do we have any way of predicting what future drivers will do for new products?

Today, we set out to answer these questions by taking a look back at a piece of hardware whose time has come and is nearly gone: ATI’s R300 GPU and the Radeon 9700 Pro.

Comments

  • timmiser - Wednesday, December 14, 2005 - link

    That is why they are so good. It shows that the early drivers are already well optimized and that there is not much improvement over the months/years from driver release to driver release.

    Nvidia, on the other hand, will have a driver release (typically around the launch of a competing ATI card) that all of a sudden shows a 25% gain or some ungodly number like that. This shows us that either A) Nvidia didn't do a very good job of optimizing their drivers prior to that big speed increase, or B) held the card back some via the driver so that they could raise the speed upon any threat (new release) by ATI.

    Either way, it reflects poorly on Nvidia.
  • DerekWilson - Monday, December 12, 2005 - link

    lots of people have requested more modern games.

    our goal at the outset was to go back as far as possible with the drivers and select a reasonable set of games to test. most modern games don't run on older drivers, so we didn't consider them.

    for future articles of this nature, we will be including a couple modern games (at the very least, half-life 2 and doom 3). we will handle the driver compatibility issue by starting with the oldest driver that supports the game.

    very new games like FEAR won't be useful because they've only got a driver revision or two under their belt. Battlefield 2 is only about 6 months old and isn't really a suitable candidate either as we can't get a very good look at anything. My opinion is that we need to look at least a year back for our game selection.

    thanks for the feedback. we're listening, and the next article in the series will definitely incorporate some of the suggestions you guys are making.
  • Cygni - Tuesday, December 13, 2005 - link

    I can't believe people missed this point. I thought it was pretty obvious in the text of the article. Throwing the games of the future at a video card running drivers from 1997 is going to have obvious consequences. That doesn't give you any way to measure driver performance increases over time, whatsoever.
  • nserra - Monday, December 12, 2005 - link

    I agree.

    But I think the best candidate for testing would be the R200 (8500),
    since everyone said it was a good card (hardware) with bad drivers (software).

    So a good retro test is how the R200 would stand up with recent drivers vs. the Nvidia GeForce 3/4 with the older games.
    The whole idea is to see if the 8500 could keep up with the GeForce 3/4 if it had good drivers.

    In summary:
    2002/2003 games | radeon8500 card | 2002/2003 driver
    2002/2003 games | geforce3/4 card | 2002/2003 driver

    2002/2003 games | radeon8500 card | 2005 driver
    2002/2003 games | geforce3/4 card | 2005 driver

    2004/2005 games | radeon8500 card | 2005 driver
    2004/2005 games | geforce3/4 card | 2005 driver
  • JarredWalton - Monday, December 12, 2005 - link

    The problem is that the R200 is no longer acceptable for even moderate gaming. If you have a 9700 Pro, you can still get reasonable performance on almost any modern game. Yes, you'll need to drop to medium and sometimes even low quality settings, but a 9700 Pro is still three or four times (or more) as fast as the best current IGP.

    I'm not working on these articles, but personally I have little to no interest in cards that are more than 3 generations old. It might be interesting to study from an academic perspective, but for real-world use there's not much point. If enough people disagree with this, though, I'm sure Ryan could write such an article. :)
  • Hardtarget - Monday, December 12, 2005 - link

    Neat article idea, but I would definitely have thrown in one more game, a modern one. Probably Half-Life 2 - see how it does on the older hardware in general, and see what driver revisions do for it. Would have been pretty interesting.
  • Avalon - Monday, December 12, 2005 - link

    I think Far Cry, HL2, and Doom 3 ought to be tested. I remember running those games on my 9700 Pro. Far Cry and D3 ran just fine at 10x7, and HL2 ran great at 12x9. I'm pretty sure quite a few people were using these cards in these games before upgrading.
  • WileCoyote - Monday, December 12, 2005 - link

    My conclusion after seeing the numbers: ATI prefers directing money/man-power/time/resources towards synthetic benchmarks rather than improving game performance. I consider THAT cheating.
  • Questar - Monday, December 12, 2005 - link

    Explain the image quality increases then.

    Or do you consider Nvidia lowering image quality from generation to generation an improvement?
  • Jedi2155 - Monday, December 12, 2005 - link

    Explain the Halo benchmark then?
