
Apple And AMD: More Than Just Graphics?

The new MacBook Pro models were unveiled yesterday, revealing the fastest MacBook Pros the world has ever seen. In addition to Thunderbolt and the Sandy Bridge update, which have been discussed extensively, I found it especially interesting that Apple has moved to ATI’s mobile graphics processors.

ATI is owned by Advanced Micro Devices (AMD) – the same company that remains in stiff competition with Intel in the processor race. This being the first time we’ve seen AMD’s graphics processors in a MacBook Pro, I am left to wonder: is this really just about graphics, or is there a deeper method to Apple’s decision?

Since November, members of the Apple community have speculated that AMD’s 8-Core Fusion processors might come to the MacBook Pro. While we didn’t see that happen this time around, the future is open for such a move.

There are many reasons why Apple may not have used the Fusion processors in this update, with the benefits of Light Peak and a long-standing relationship with Intel to consider. So the real question I would pose is this: is it possible that Apple is testing the AMD waters (in terms of hardware support, price negotiations, and long-term loyalty) by including the company’s chipsets in MacBook Pros?

I, for one, believe this is not only possible but likely. Apple has never been afraid to switch processor technologies to take advantage of new developments (witness its early move to PowerPC and its subsequent move away from it to Intel), so why not? AMD has some great new technologies available, and in some respects is genuinely ahead of Intel’s curve.

What do you think? Sound off in the comments!

  1. Jim Flayhan says:

    Not sure I agree completely. Apple left behind NVIDIA graphics, not Intel. Apple still uses Intel integrated graphics in the new MBPs, as it is part of Sandy Bridge.

    Although AMD may jump ahead every half decade or so, Intel remains the consistently better CPU option. Intel has been firmly in the lead since mid-2006. AMD has interesting technology, but they will be behind Intel again very soon, and for a long time.

    1. R V says:

      I have to disagree. The clock speed limit is being reached. While we may see a CPU designed for 4.0 GHz that can run without disabling most of its cores sometime in the near future, I don't think it will come this year. That leaves incremental performance from die shrinks and improvements to the memory controller. What Intel needs to do is develop a new generation, which, if you follow their "tick-tock" concept, will not happen for another year.

      Yes, yes, I know the Westmere Xeon 5698's clock is set at 4.4 GHz. But it generates SO MUCH HEAT that 4 of the 6 cores have to be disabled. That is just a ridiculous product release. It is just an overclocked and crippled 3.4 GHz chip.

      1. R V says:

        continued….

        So I think that we have reached parity on CPU instructions per clock with AMD's release of Llano and Bulldozer. What we will see are more cores per die, more cache per die, and of course the die shrink down to <22nm. Intel, sometime in 2012, will probably launch a CPU designed for 4 GHz, and AMD will follow in 3-5 months. But 80% of the CPU market is not about the fastest clock. So while the speed crown is certainly relevant, it really doesn't matter if all you want is value.

        The real difference will be in graphics, and nobody can rationally argue that Intel has an advantage in GPU design.

        One might argue that while Intel has been a fairly consistent product leader, it is not much of an innovator. It prefers to respond to AMD's pressure rather than set a bar that AMD cannot beat. As for me, I think we should all thank our lucky stars that AMD has been there to compete with Intel, or we'd be paying $1000 for a 1 GHz CPU today.

  2. outfall says:

    I think the defective Nvidia cards cost Apple millions; they replaced my logic board out of warranty for free.

    1. vin says:

      Ditto. I have a 2007 C2D, and that logic board has been replaced twice, at no expense to me. That's got to cost Apple dearly.

  3. Mark says:

    I remember ATI graphics cards in Macs going back to the old iBooks. Granted, they probably were not integrated graphics, but still, AMD did exist in Mac models.

  4. Switcher says:

    This is not the first time Apple has put ATI hardware in its machines… For me, there is no more to it than that.
    Please notice that only the high-end models have received such an upgrade. The models that are likely to sell the most will carry the Intel graphics chip (or whatever you call it).

    In France, we say: "never put all your eggs in the same basket." Apple is doing just that, juggling the Intel / nVidia / AMD trio to remain independent. The IBM / Motorola lesson was learned a while ago.

  5. FGH says:

    I may be missing something, but I'm pretty sure that my 2006 first-gen MacBook Pro has an ATI Radeon X1600 graphics chipset… so is the ATI to which you refer, Mr. Kunzler, a different ATI, or are we simply limiting the analysis to the unibody MacBook Pro?

  6. FGH says:

    My mistake, I misread your post. You are absolutely correct, and I retract my earlier statement.

  7. macboy says:

    I have a mid-2010 MacBook Pro with the NVIDIA GT 330M, and it always crashes randomly (black screen of death). The NVIDIA card in the MacBook Pro is really CRAP & USELESS!!!!

