Stardate
20020722.1547 (Captain's log): Hale writes:
If Motorola can't deliver a suitable chip for a hypothetical super-speedy Mac, why doesn't Apple go to another supplier? I would think that AMD (for instance) would be willing to produce a suitable chip if Apple waved enough cash under their noses (assuming Apple has the cash to wave).
Or am I missing something?
Having someone like AMD design a PPC replacement from scratch would be a non-trivial problem. It would take years. Intel has been working on Itanium for at least six years, for instance. There are also licensing issues involved; either IBM or Motorola (or maybe both) would have to consent and would have to be paid. In particular, Motorola owns the Altivec instruction set, and no one can implement that without paying them. (Not even IBM.)
Frankly, given the potential sales and the number of top-drawer chip designers that AMD has, I seriously doubt they'd be interested. They'd rather apply their people to Hammer and the Athlon. The reality is that Apple would be the only significant customer for the chip, and Apple alone isn't large enough to support the development cost of a modern full-performance CPU. That's why Motorola isn't keeping up; most of its PPC customers (doing embedded) don't need what Apple does, and it isn't cost-effective for Motorola to develop PPCs for Apple's special needs.
AMD would certainly be glad to sell Hammer to Apple for a new Mac. But that would present its own set of difficulties. Switching to a foreign architecture that is not software-compatible with the PPC means that any Mac user upgrading to the new box would have to purchase new versions of all their apps for the new machine. You've completely broken continuity and removed all reward for loyalty. Once that happens, there's no important difference between doing this and switching to the PC, except for the fact that the PC would be cheaper and would have a much larger base of software for it.
There's also a non-trivial bootstrap problem. Software developers aren't going to put extensive effort into creating software for the new machine unless they're sure there will be a lot of customers, and customers won't buy unless the software is already there. Each waits for the other to move first; thus neither will. That's what killed BeOS; the apps never appeared, so there was never a movement of users. There were never many users, so there was no incentive to develop apps for it. It's a Catch-22, and it's one that the industry has been working with, against, and around for 20 years.
The only reasonable answer anyone has ever found for it is compatibility modes. That's what Apple used when it switched from the 68K to the PPC. Apple was actually involved in the architectural design of the PPC and asked that certain changes be made to it so that the PPC could efficiently emulate 68K instructions. The first PPC-based Macs shipped with a 68K simulation environment which could run programs written for the 68K Macs (not to mention most of the OS itself), and once a substantial number of them had been sold, it became attractive for developers to create PPC-native programs, which are now the norm.
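At its core, a compatibility mode like Apple's 68K environment is an interpreter: a loop that fetches the legacy program's instructions one at a time and carries out their effects on the new hardware. As a rough sketch only (the opcodes here are invented for illustration and bear no resemblance to the real 68K instruction set):

```python
# Toy interpreter loop illustrating how a compatibility layer runs "old"
# instructions on a "new" host CPU. Opcodes are made up for this example.
OLD_ADD = 0x01   # hypothetical: add the following operand to the accumulator
OLD_HALT = 0xFF  # hypothetical: stop and return the accumulator

def run_legacy(program: list[int]) -> int:
    """Fetch-decode-execute loop for a tiny invented legacy instruction set."""
    acc = 0  # emulated accumulator register
    pc = 0   # emulated program counter
    while True:
        op = program[pc]
        if op == OLD_ADD:
            acc += program[pc + 1]  # operand immediately follows the opcode
            pc += 2
        elif op == OLD_HALT:
            return acc
        else:
            raise ValueError(f"unknown opcode {op:#x}")

# "Legacy" program: add 5, add 7, halt.
print(run_legacy([OLD_ADD, 5, OLD_ADD, 7, OLD_HALT]))  # 12
```

Every emulated instruction costs several host instructions of overhead, which is why the hardware assists Apple negotiated into the PPC design mattered: anything that lets the interpreter's inner loop do less work pays off on every single instruction executed.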
That transition took about 10 years. (And some people are still using some 68K programs, and it's rumored that some parts of OS9 are still written in 68K code. There may even be some legacy 68K code in OSX itself.)
Microsoft faced a similar problem in trying to bootstrap the WIN32 API as part of its long term plan to convert from the Windows code base to the NT code base.
The solution was Windows 95. It was a major rewrite of Win 3.1, which retained the ability to run Win16 programs but which also implemented the WIN32 API. It was acceptable to customers because it offered a value proposition unrelated to the WIN32 API: it could run their Win16 programs and it offered a much nicer user experience. Once a large installed base of Win95 existed, it became reasonable for developers to create WIN32 programs; they couldn't run on Win16 systems, but they ran better on Win95 systems than Win16 programs did.
That transition took fifteen years.
Apple is trying to bootstrap a new API (Carbon) and it's doing the same thing: OSX will run OS9 apps but provides a better user experience for the customers, and once there is a large base of OSX users it will be more attractive for ISVs to stop making OS9 programs. For a variety of reasons, this will probably be faster. I suspect this transition will take a total of about four years. (We're about one year into it.)
The biggest problem with Hammer is that its design is already mature, and it was designed to be backward compatible with x86, because AMD hopes to get PCs to upgrade. So the Hammer architecture has a number of concessions to the x86 mode, which are incompatible with the PPC.
If Apple were to switch to Hammer, it's extremely unlikely that PPC emulation would work well. Among other issues, Hammer is little-endian and PPC is big-endian, and I'm sure there are other mismatches too; the upshot is that while 68K emulation on the PPC ran at tolerable speeds, PPC emulation on Hammer would crawl. Would Mac users really buy new computers if their existing software ran half as fast as (or worse than) it did on the PPC-based computers they were replacing?
Some might, but a lot more of them would decide that the Mac platform was no longer worth sticking with, and would switch to PCs.
That's the trap Apple is in: unless a miracle happens and M