Vacations are tough for me to come by. Planning around tradeshows is easy, but planning around unannounced product launches, new driver releases, bugs and unexpected discoveries is impossible. Last year I threw a dart at the calendar and told myself I was taking 10 days off in May and thankfully, there wasn’t too much that was announced while I was gone.

I did miss one rather important thing: the launch of an OS X version of Steam. I actually contacted Valve ahead of time to see if they’d give me access to a pre-release version so I could do a performance article before I left. I got no response. After reading Ryan’s Mac OS X Portal Performance article when I got back, I understood why.

In the process of porting the Source engine to OS X, a great deal of performance was lost. To Valve’s credit, games like Portal are more than playable at good-looking settings on modern Macs. You’re just better off playing those games in Windows using Boot Camp.

Ryan’s original article used a Hackintosh to compare OS X and Windows performance. Now that 1) I’m back, and 2) Half Life 2 Episode 2 is out for the Mac, I can provide an updated comparison of Steam on both OSes using real Mac hardware as another reference point.

For this comparison I’m using two systems. The first is a Nehalem Mac Pro with an EVGA GeForce GTX 285 Mac Edition.

Testbed System Specifications: Nehalem Mac Pro (Mid 2009)
CPU: 2 x 2.93GHz quad-core Nehalem Xeon processors
Memory: 6 x 1GB DDR3-1066
GPU: EVGA GeForce GTX 285 Mac Edition (1GB GDDR3)
OS: Mac OS X 10.6.3

The second is Apple’s new 2010 13-inch MacBook Pro with a GeForce 320M.

Testbed System Specifications: 13-inch MacBook Pro (Early 2010)
CPU: 2.4GHz Intel Core 2 Duo
Memory: 2 x 2GB DDR3-1066
GPU: NVIDIA GeForce 320M
OS: Mac OS X 10.6.3

I’m running Boot Camp and a clean install of Windows 7 x64 on both Macs for the comparison. I’m using NVIDIA’s 197.45 drivers for the GTX 285 on the Mac Pro and the latest drivers under OS X. Steam was up to date as of 12:47AM this morning.

I’ll start with the 13-inch MacBook Pro:

Half Life 2 Episode 2 Performance: 13-inch MacBook Pro (Early 2010)
1280 x 800: 44.2 fps (Mac OS X 10.6.3) vs. 68.0 fps (Windows 7 x64)

At the panel’s native resolution of 1280 x 800, the 13-inch MacBook Pro is playable at high quality settings with no AA/aniso. Episode 2 runs smoothly on the portable Mac; gaming, albeit with a dated title, is possible under OS X.

Boot into Windows, however, and you get a 54% performance boost. The game goes from definitely playable to butter smooth. In other words, there’s a perceivable difference.
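For anyone curious where percentages like this come from, they are just the ratio of the two average frame rates from the table above; a quick sketch of the arithmetic:

```python
# Windows advantage over OS X, computed from the average frame
# rates in the table above (13-inch MacBook Pro, 1280 x 800).
osx_fps = 44.2
windows_fps = 68.0

speedup = windows_fps / osx_fps - 1.0  # fractional advantage
print(f"Windows advantage: {speedup:.0%}")  # prints "Windows advantage: 54%"
```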

With the additional headroom of the CPU and GPU in the Mac Pro, I ran our benchmark at higher quality settings and at more resolutions. Under OS X you only get 2X and 4X MSAA options compared to NVIDIA’s plethora of AA modes under Windows, so I stuck with 4X MSAA for this comparison. Anisotropic filtering (16X) was enabled and all settings were as high as possible.


OS X HL2ep2 Settings

Multicore rendering is an option under Windows that isn’t adjustable under Steam for OS X, and despite the setting being greyed out as Enabled, it doesn’t appear to actually be active under OS X. In our benchmark with multicore rendering disabled, both versions of the game eat up around 1.5 of the 8 cores in the Mac Pro. Enabling multicore rendering in Windows bumps the average up to 2.4 cores, but drops performance at higher resolutions. I’ve provided both sets of results in the graph below so you can see what happens:

The Windows performance advantage with multicore rendering disabled ranges from 62% all the way up to 103%. Even at its worst, the GTX 285 under OS X is fast enough to make 2560 x 1600 playable, but it is noticeably slower than under Windows.

With multicore rendering enabled, CPU-bound performance goes up around 18%, but we see a drop at more GPU-limited resolutions.
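A note on the “cores” figures above: numbers like “1.5 cores” are just total CPU utilization (summed across all threads) averaged over the run and divided by 100. A minimal sketch of that conversion, using made-up sample values rather than the actual measurements:

```python
# Convert sampled total CPU% (summed across all threads of the
# game process) into "equivalent cores busy". The sample values
# here are illustrative, not the actual benchmark measurements.
samples = [148.0, 155.2, 151.7, 146.3, 149.8]  # total CPU%, one per second

avg_cores = sum(samples) / len(samples) / 100.0
print(f"average cores in use: {avg_cores:.1f}")  # prints "average cores in use: 1.5"
```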

Image Quality: Still Foggy
95 Comments

  • Scali - Wednesday, June 9, 2010 - link

    I wonder if that test was done correctly.
    I know that Doom3 performed quite poorly on my GeForce 8800 GTS as well.
    After experimenting with the driver settings a bit, I found that the problem was the "Multiple display performance mode" setting.
    Somehow this had no measurable effect on D3D apps, but Doom3 (and probably most other OpenGL apps) was much, MUCH faster with "Single display performance mode".
  • Brian Klug - Friday, June 4, 2010 - link

    Honestly I think that has more to do with the fact that most of the new Apple hardware is using NVIDIA rather than AMD, the exception being the iMac lineup.
  • Setsunayaki - Friday, June 4, 2010 - link

    Personally, I side with common sense.

    I don't care about gaming benchmarks for some two-year-old game, OK? I was already able to play it on Windows at release, and anyone who can blow money on buying a Mac, along with iPhones, iPods, iPads (and the subscription fees that go with some of these devices), definitely has $800 - $1000 to spend on building a gaming PC to enjoy the latest games out there... without having to wait two years for their own platform to make a game "playable."

    The largest difference I've seen as a computer scientist and a musician is that Apple offers programs that are not professional programs, but are used by professionals.

    A lot of professionals get stuck using those non-professional programs because the professional programs' user interfaces on both Mac and Windows have a high learning curve. On Windows the same program costs a lot less to buy, but still requires a long time to learn.

    Let's not forget the professional is already a professional in his or her own field. Learning a program means they spend more time learning programs and computers and less time working in their field making a living (until they master the program); call this "downtime" for professionals.

    The main difference I have seen between Windows, Mac, and Linux is:

    Windows = Professional programs exist and are used by professionals on a wide scale, but those programs are not efficient in CPU and memory usage. They take longer to learn than the Mac versions of the same programs, but offer more features through free updates and other measures, and can be bought at lower prices.

    Mac = Professional programs exist, but the costs are so high that very few attempt to learn them or even buy them. Unfortunately, a lot of professionals use non-professional programs in many areas to make things work. It gets them by until they need an extra feature available in the professional program. Not wanting to learn the professional program, they wait for the next version of the non-professional program and pay more for it...

    Linux = Professional programs exist. Linux users learn to master the command-line interface; this is done for speed. Linux has a philosophy of "one program = one major function," which allows for a high level of efficiency. Linux users learn both the command-line interface and the GUIs of their programs. It takes a long time to learn as well, but ultimately they end up with the best resource management and program efficiency. They wait on features like Mac users do, but updates come a lot faster for many of the professional programs out there. The exception is server-side programs, which are light years ahead of Windows or Mac, since that is what Linux is built for. Linux suffers in the long term on availability: its professional programs only emerged in the last 5 - 10 years, while Mac and Windows have had access to professional programs for 10 - 15+ years.

    In the end, rather than fight about it, it's always good to at least own two of the three major OSes and try the one you do not own... not for the sake of argument, but for the sake of reality.

    If you truly love computers, it means you can prove how great you are by learning and trying different operating systems. It means anywhere you find a computer, be it Windows, Mac or Linux, you will be able to access it, use it, and know what it can and cannot do...

    That knowledge goes farther than crucifying one OS in favor of the other.
  • toast70 - Friday, June 4, 2010 - link

    Are there ATI drivers for Mac? I have never seen any (doesn't mean there aren't any, I just haven't run into any).
    I would believe that is why it hasn't been tested this way...
  • Penti - Friday, June 4, 2010 - link

    Are you joking? You have been able to choose an ATI HD 4870 to come installed with the Mac Pro right in the system builder since the Mac Pro 4,1 (March last year). More to the point, every 27" iMac comes with a Mobility HD4850 in the Core iX version and an HD4670 or HD4850 (Mobility) in the Core 2 based version. Plenty of people use ATi cards for hackintoshes on top of that. Drivers are in the OS, except when the vendors didn't have time to include them, like when the GTX 285 was released.

    Before that you had, HD3870, HD2600, Mobility HD2600, Mobility HD2400, X1950, Mobility X1600 and on G5, G4, G3 etc you've had X800 XT, 9800 Pro, 9700 Pro, 9600 Pro, 9200, 9000, 8500, 7500 and ATI Rage and so on.

    And it's not bad that even Apple's integrated graphics handles Valve's Source-based games. It makes every MacBook and MacBook Pro 13, 15 & 17 a gaming machine, albeit a lightweight one. Also, every Mac with a dedicated graphics card from the NVIDIA 8600 series and up, and every ATI card since Apple released Intel Macs, should handle both Valve's games and Starcraft 2. Every iMac with a dedicated graphics card should do fine; it's certainly playable on any 24" and 27" iMac. And those were released when many of the competitors' all-in-ones had something like X3100 graphics. They certainly targeted the casual or old ex-Win/DOS gamers fine. Of course, many people primarily game on consoles nowadays; there aren't many exclusive titles. It's confirmed that it runs fine on Core iX 27" iMacs with the Mobility HD4850. It certainly doesn't hurt that the X1000 series is still supported in 10.6 when official support has been dropped for Windows 7 (I did install Vista drivers). Bugs still need to be ironed out, but once they have been it should run fine for most people. But ATI cards are prevalent.
  • setzer - Friday, June 4, 2010 - link

    One thing that people are ignoring in these Windows vs OS X Source engine comparisons is that although the hardware is the same, the APIs are not: the Windows version of Source runs on DX and the OS X version runs on OGL.
    Those two don't compare in IQ or performance. People keep expecting that an open-source API offers the same visual output/performance as a proprietary one, but I really don't see how you can manage that without code branching, which I suspect Valve doesn't really want to do beyond the point they are at now; they have separated their game logic/engine from the render part.
    If they invest enough time and money in this you might get better performance and on-par IQ, but I don't expect OGL performance to surpass DX.

    A valid benchmark of this issue would be to run both systems using the OpenGL renderer, though I have no idea whether that's possible in Windows. Aside from that, the only other benchmark I can think of is between the Linux client and the OS X client.
  • bobvodka - Friday, June 4, 2010 - link

    Minor nitpick: OpenGL is NOT open source. It is an open standard; the OpenGL driver components are very much closed.
  • CptTripps - Friday, June 4, 2010 - link

    This benchmark is perfectly valid in one sense: which one plays it better.
  • Penti - Saturday, June 5, 2010 - link

    Haha, many vendor-specific extensions, functions and effects were developed on OpenGL. It's purely up to the graphics vendors to backport new features to it, even to older versions of the API, and to support the API and functions they want; it's just an API, and a vendor-extensible one at that. Some extensions are mandated, some are vendor specific, some started out as vendor specific but became standard extensions, and some are backported from newer versions of OGL. The implementation is proprietary. Features get supported back and forth between DX and OGL; many of the most advanced features of DX were originally implemented on top of OGL or GLSL. If you want to try out new hardware features, it's easiest to implement a demo for them on top of OGL. So don't think NVIDIA and ATi don't care about OGL.

    You can get equal IQ, and you can get equal performance. But obviously you have to work on the engine and get driver issues sorted out, like when games are released on Windows. On consoles you have to work out the kinks yourself, and OS X easily surpasses the capabilities of consoles. OGL isn't a static thing. Even if OGL 2.0 is from 2004, it has been extended since by the vendors, and OGL 2.0 as of 2004 is still about equal to both the PS3 and Xbox 360. You've got OpenGL, GLSL, ARB extensions, NVIDIA Cg support, OpenCL, and newer hardware than the consoles. There's no reason why the console ports would run worse on OS X than on the intended consoles; it's the same companies that write the drivers supporting the hardware. Lacking HDR, of course, the picture will look different. HDR itself is supported in OpenGL 2.0, and extensions might be useful to achieve it. It can also be done with vertex and fragment programs (GLSL). It simply hasn't been implemented; remember, this is a new rendering layer, not a translation layer. All the same features are accessible. That's what the APIs are designed to do: be a way to use the hardware. Remember, OpenGL 2.0 is old, but it has access to all the same functions as DX9.0c and most of the functions from DX10, 10.1 and 11 (if the hardware supports them). Nothing more is needed for current-generation engines like Source. In both DX's and OGL's case it's actually the driver that implements the interface. The drivers are responsible for the pipeline, optimizations and performance, not some component from Apple or Microsoft.
  • Ben90 - Thursday, June 10, 2010 - link

    After reading the first article I hypothesized the same conclusion: that the OpenGL implementation wasn't as optimized as the DirectX version. I did some googling and found this article's results mirrored when comparing only OGL across the Windows/Linux/OS X platforms. It's something else.
