10 Dec The ‘Apple’ HD5770
Category: Apple, Personal

I own a Mac Pro, the beefiest and most user-serviceable of all Macs, and I love it for a multitude of reasons. Of all those reasons, my favorite remains being able to replace parts of it myself, just as you would with a tower PC. And when my second Apple-sanctioned Nvidia 8800GT died, I did just that: I took out the old card and stuck in the then-best shipping graphics card that works with Mac OS X: the ATI Radeon HD5770. I got a lot of questions from people about how well it performs, how silent it is, and more, so here’s a little post about the card that can.

Gaming on the Mac is certainly not as common or well-supported as it is on Windows, but the HD5770 handles whatever you throw at it quite well. I still have to adjust to it, though: the HD5770 is not a brand-new top-tier graphics card, unlike the card I use in my Mac Pro for gaming under Windows (the HD5970), and it can sometimes have issues with the latest games at 30″ monitor resolution (2560×1600 pixels *are* a lot to push around).

As for the PC enthusiasts who often sneer at Mac GPUs ( – “What, doesn’t STEVE want you using *illegal* cards in your Mac? Sniff! Why buy the expensive Apple card?!” ), I have to explain that Mac OS X compatible GPUs require EFI / EBC firmware on their ROM chips to be initialized for use under OS X. This is not something you can just ‘hack together’: the card’s ROM chip needs extra space to hold both a (Windows) BIOS-compatible and a (Mac) EFI-compatible firmware, and even then Apple has to write drivers that let the card perform well. Apple would, of course, love it if everyone could just drop a good GPU in there, as it would make the Mac Pro that much more attractive to consumers. Unfortunately, it’s not that simple. On the bright side: you can do exactly that with hard drives, eSATA controllers, USB cards, FireWire cards, most audio cards, and so on.

Despite that, it performs great. Fortunately for Mac owners who enjoy gaming, most titles that run on Macs can be played at full detail on 30″ / 27″ displays using the HD5770. And best of all: it remains almost perfectly silent. You won’t hear it rev up its fans like the old 8800GT did, completely nullifying Apple’s care for acoustics in the rest of the computer (my Mac Pro hardly makes a sound).

It’s easy to connect; the card has my requisite dual-link DVI port (for all ‘typical’ LCDs and the 30″ Cinema Display I use) and two Mini DisplayPort outputs, for the LED Cinema Display and other DisplayPort monitors. It draws extra power through a single 6-pin cable from the Mac Pro motherboard, the same as my old Nvidia 8800GT, although the HD5770 is far more efficient: it draws far less power when idle, for instance.

I have only two issues with the card. The first is the price: it’s about 75 dollars above the ‘street price’ of a PC HD5770, which is a lot to pay for a larger ROM chip, some firmware, and the requisite Mac Pro motherboard cable. I understand ATI may have to produce these cards in smaller runs, but it’s a big chunk of cash on top of what is normally a 135-dollar card. The packaging sort of makes up for it:

(yes, that’s a little Sony Vaio UX UMPC. With OS X on it. Blog post coming? Hell yes!)

The other issue is graphical glitches in Minecraft. Somehow, despite the card’s excellent performance, Java OpenGL rendering is a terrible mess. I suppose this is less an issue with the card than with the Java runtime, but the artifacts are awful.

Overall verdict:

8/10. If you’re in the market for a graphics upgrade, I’d check out how well the now-finally-shipping HD5870 compares in terms of pure bang for your buck. If you’re using all the extra power pins on your Mac Pro motherboard already (check!), and need an affordable replacement for Nvidia’s horrible, unreliable cards and crash-prone drivers (especially in Photoshop – check!), this is a no-brainer. And you can use the box for… well, I don’t know. Storing cats.



6 Responses

  1. Ah, I was waiting for this. Nice run-down, Seb!

  2. Lol, TF2. I’m playing it too. Too bad I don’t have a Mac and desperately want some earbuds.

  3. Thanks a lot for this post.

  4. Nice write-up.

    I’ve got a 2008 Mac Pro with an 8800GT, and I’m starting to see driver problems and crashes, so I’m looking at upgrading. One of the annoying things about my current setup is that whenever you wake the Mac Pro from sleep or boot it up, the fan on the GPU goes at full speed for a couple of seconds. It’s loud, and quite annoying. Does the HD5770 exhibit this behaviour also?

  5. Be aware: if you want to put this card in anything but a mid-2010 Mac Pro and you have any problems with it, you must return it during the 14-day return period. Apple Support will do nothing to help you troubleshoot or swap out a problematic card after that. You’ll just be SOL.

  6. I’ve got a Radeon 5870 in my 2008 Mac Pro (8-core), and I too went from an 8800 to this massive upgrade. I don’t experience the fans going at a crazy speed after the system is woken from sleep. It could be that they do and the card just has far bigger fans, but overall the system works very nicely.

     I’m using 2 DVI monitors without a hitch as well. One is connected via a Mini DisplayPort to DVI converter.
