USS Clueless - A compulsion to revisionism
     
     
 

Stardate 20021020.1915

(On Screen): I hadn't intended to visit this again, but I feel compelled to do so by a confluence of Brian's most recent post and a comment made by "Mr. Capitalist Lion" in response to a post by Mike Hendrix. Mike says that he refuses to become involved in the debate, but the comment thread instantly became exactly such a discussion.

But first, Brian's most recent post, where he asks, perhaps a little bit plaintively, why there should be any need to "gloat" over the dire reality of Apple's peril.

But is that any reason to gloat over the death of the competitor who had the audacity to try to hold itself to a higher standard? I mean, you know that the other computer companies hold Apple in higher regard from a business standpoint than the average consumer does.

Actually, I don't think that Apple's competitors hold it in anything like as high a regard as many Mac users think. The Mac community is soaked with urban legends, and one of those is that every PC company spends all its time watching Cupertino to see what the trendsetter at 1 Infinite Loop will decide is the Next Big Thing™. The reality is that it's normal in high tech for companies to watch what all their competitors are doing, and to copy what's good. Despite what many think, Apple does it too.

But that's beside the point. I don't write this stuff because I want to gloat. (Well, maybe a little.) The main motivation is that I'm fed up with lies and deception. I'm fed up with Macolytic smugness. I'm fed up with the a priori assumption of Mac users that "of COURSE the Mac is better; it's a Mac, isn't it?"

There are ways in which the Mac is good, but both Apple and the Mac have just as many flaws as anyone else. But the Mac user base is pretty inbred intellectually, and it is fertile ground for gross misconceptions which approach the delusional.

For instance, in Mike's comment thread, as part of a smug "of COURSE" posting, Mr. Lion said:

As for graphics cards, I seem to remember the GeForce 2, 3 and 4 being available on the Mac months before they were available for the PC.

No, I'm afraid not. In fact, every NVidia chip has either gone only to PC users, or gotten to PC users first, or arrived simultaneously. Apple has never actually been first; not by months, not even by days.

For reasons I've never understood (and always been curious about – anyone care to explain it to me?) NVidia's early chips were incompatible with the Mac and couldn't be used. There seems to be something that has to be designed in so that a graphics chip can be used in a Mac card, and in the early days NVidia didn't do whatever-it-was. So you saw no Mac cards with the TNT, nor the TNT2, nor the original GeForce.

Nor did you see GeForce 2 GTS cards for the Mac, nor GeForce 2 Pro. The Pro was the result of yield and process improvements to the GTS core, which allowed it to run at a higher clock rate.

NVidia also developed a related chip, the GeForce 2 MX. It shared a lot of the hardware with the GTS/Pro, but had differences. It was intended to be a budget part, and where the GTS and Pro were targeted mainly at hobbyists (i.e. gamers), who had been NVidia's core customer base until that time, the MX was designed for business users. It was the first NVidia chip to support two monitors, and in order to keep the cost down they also reduced the amount of 3D hardware in it. Its 2D performance was comparable, but its 3D frame rate was substantially lower than that of the GTS or Pro. It was also the first chip NVidia ever designed which had the whatever-it-is that was required to make it Mac-compatible, and thus it was, when it came out several months after the GTS, the first NVidia chip which Mac users got to use.

Meanwhile, NVidia sped up the basic GTS core again, and released it as the Ultra, which was not Mac compatible. But thereafter every NVidia chip has been Mac compatible.

The GeForce 3 was highly anticipated (to put it mildly). ATI had managed to get Jobs pissed off at them by prematurely releasing information that he'd been hoping to use as an "Oh, and one other thing" at MacWorld, and NVidia wanted to take advantage of this to try to get in Jobs' good graces. Apple seemed as if it would be a significant customer for NVidia, though not as large as such companies as Elsa and Asus, and so they decided that they'd formally announce the GeForce 3 at a MacWorld. It was a commercially-meaningless step, but it provided Jobs with something exciting for the audience, since they'd be the first to see the specifications and capabilities of the chip which had until that point been the subject of much fevered speculation in both the Mac and PC communities.

And so they stood there and watched in horror as Jobs announced, "We're going to get it first!" They'd told him that the Mac fans would learn about it first, but Steve was announcing that NVidia was going to give Apple parts before anyone else and was going to let Apple ship before such stalwart customers as Asus. So they made their presentation, returned to the office, and the next day issued a press release trying, as best they could, to make clear that Jobs had been wrong and that NVidia was going to deliver GeForce 3 chips to all its customers simultaneously – without making Jobs look like a jackass and getting him just as pissed off at NVidia as he was at ATI.

NVidia has never favored any of its customers. Other competitors have made that mistake, but NVidia's management knows full well that its relationship with all the board makers is very fragile; they know that in part because they watched (and benefited from) 3DFX pissing in the soup in exactly that way, causing most of its best customers to switch over and start using NVidia's parts.

NVidia had some last-minute problems, and was late getting into volume manufacturing. But when they did finally get parts, they did ship them simultaneously to all their customers, including Apple.

But one of the ways that Apple is inferior to its competitors is in its manufacturing process. Some of it is OEM (contract) and some of it belongs to Apple, but all of it works less well than that of any of the major companies in the PC industry.

The reality of the graphics card business for the PC is that most of the companies making such cards are using the same basic components. They will try, to the extent that they can, to differentiate their cards in terms of features (such as amount and speed of RAM, or bundled software), and there are some quality differences, but by and large their products are all interchangeable. So they compete heavily on price, and there's always a rush to market. The first guy out there gets a couple of weeks of sales at a premium price before someone else shows up and the inevitable price cuts begin.

It is a strange thing but a true one that despite its extremely low market share, Apple actually is in many ways a monopoly and acts like one. PC users can buy virtually identical NVidia-based graphics cards from 20 sources, but Mac users at the time could only get them from Apple. So there was less urgency at Apple, and that, coupled with Apple's traditionally inefficient manufacturing process, meant that even though everyone got their chips at the same time, PC users were able to buy and run GeForce 3 cards about three weeks before the first Mac users ever saw one.

But what many Mac users remember is not the reality that PC users actually had cards first, but the image of Jobs saying that the Mac would get them first. Thus Mr. Lion "seems to remember" the Mac getting all those parts months before they showed up for the PC, despite the fact that NVidia has always shipped parts to everyone simultaneously and the highly-competitive PC card makers have always beaten Apple to market.

Since that time, NVidia has released many parts. So far as I know, all of them have been Mac compatible, but Apple hasn't chosen to make cards based on them all. Every one of them (several versions of the GeForce 3, and now several versions of the GeForce 4) has been available on the PC, but Mac users have only been able to buy some of them, and they've never gotten any of them first.

And Apple has shown a marked preference for the MX chips, which have been by far the most common ones sold. It's true that all MX chips support dual-head and none of the others do, but Apple has also used the MX in products which did not support dual-head (such as the iMacs) and the most likely reason is that Apple chose them based on price (they're the cheapest) irrespective of performance (they're the slowest). Among other things, the MX parts have always been designed to work with cheaper (slower) RAM.

Which is part of what grates on me: Apple is presenting itself as a premium brand, but it's using the cheapest components it can find.

That becomes particularly stark with the GeForce 4 family, because the GF4 MX is misnamed. While the GF2MX was a cut-down version of the GF2 GTS/Pro/Ultra, which could do all the same things (though not as rapidly), the 3D hardware in the GF4MX is actually based primarily on that same GF2 core. My buddies at Tech Report put it this way:

The strangest thing about the GeForce4 MX is that its 3D rendering core is ripped directly from its predecessor, the GeForce2 MX. The GF4 MX has two pixel pipelines capable of laying down two pixels per clock, and it has a fixed-function T&L engine. There aren't any pixel or vertex shaders in sight (unless you count the GeForce2's register combiners as primitive pixel shaders, I suppose). In terms of 3D technology, the GF4 MX is significantly less advanced than the GeForce3 or the Radeon 8500.

They're using the "GF4" brand name on it, but it's not really the same. The GF2MX could render anything the GF2 GTS could, only slower. The GF4MX is missing features compared to the standard GF4 because it's actually a generation back. Nonetheless, as always it's the most popular GF4 chip shipped by Apple, because it's cheap.
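
To make that gap concrete, here's a rough fill-rate comparison. This is strictly my own back-of-the-envelope sketch: the GF4 MX pipeline count comes from the Tech Report description quoted above, the core clocks are approximate published figures, and raw fill rate doesn't even capture the bigger deficit, which is the missing programmable shaders.

    # Rough theoretical pixel fill rate = pixel pipelines * core clock.
    # The GF4 MX pipeline count follows the Tech Report description quoted
    # above; the core clocks are approximate published figures, so treat
    # these as ballpark numbers rather than benchmark results.

    chips = {
        # name:             (pixel pipelines, approx. core clock in MHz)
        "GeForce4 MX 440":  (2, 270),
        "GeForce3 Ti 500":  (4, 240),
        "GeForce4 Ti 4600": (4, 300),
    }

    for name, (pipes, mhz) in chips.items():
        fill_mpixels = pipes * mhz  # theoretical peak, megapixels per second
        print(f"{name:>16}: {fill_mpixels:4d} Mpixel/s peak fill rate")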

It's not a bad part; it performs very well as long as you aren't trying to do anything very complicated. But it isn't leading edge; it isn't state of the art. Mac users pride themselves on being at (or beyond) the cutting edge, and they know that Apple is leading the industry and that everyone else is trying to catch up. Mr. Lion says, in that same posting:

As for Apple being a follower in any regard... that's a bit too inflammatory for me to get into, so I'll just say: HAH!

In hardware, Apple has been a follower in nearly every regard for the last five years, and remains one today.

RAM is a good example. Apple has always lagged in RAM technology; it has invariably been on the trailing edge, one or two generations behind mid-high range PCs. Apple was using 66 MHz SDRAM on its top end computers when 100 MHz SDRAM was the standard even on mid-range PCs. It finally moved to 100 MHz SDRAM long after 133 MHz had become standard for PCs.

About the time it finally adopted 133 MHz SDRAM, the majority of mid-high range PCs were using either PC800 RDRAM or 266 MHz DDR-SDRAM. And the Athlons moved up to 333 MHz DDR and are now moving to 400 MHz DDR, while simultaneously you've got PC1066 RDRAM on the other side of the fence.
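
To put rough numbers on how far behind that is, here's my own back-of-the-envelope arithmetic for the memory generations mentioned above. These are theoretical peaks per module or channel, not measured throughput.

    # Peak bandwidth = effective transfer rate (millions of transfers/sec)
    # times bus width. SDRAM and DDR DIMMs have a 64-bit (8-byte) data path;
    # RDRAM channels are 16 bits (2 bytes) wide. Theoretical peaks only.

    def peak_mb_per_sec(mtransfers_per_sec, bus_width_bytes):
        return mtransfers_per_sec * bus_width_bytes

    memories = [
        ("PC66 SDRAM",       66,   8),
        ("PC100 SDRAM",      100,  8),
        ("PC133 SDRAM",      133,  8),
        ("DDR-266 (PC2100)", 266,  8),
        ("DDR-333 (PC2700)", 333,  8),
        ("DDR-400 (PC3200)", 400,  8),
        ("PC800 RDRAM",      800,  2),
        ("PC1066 RDRAM",     1066, 2),
    ]

    for name, rate, width in memories:
        print(f"{name:>18}: {peak_mb_per_sec(rate, width):5d} MB/s peak")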

And so it was, two months ago, that Apple announced its very first computers which would use something faster than 133 MHz SDRAM. With a front-side-bus speed of 167 MHz instead of 133 MHz, and with one or two processors, these new computers used 333 MHz DDR-SDRAM.

But they don't fully use it. The memory bus from the RAM to the chipset is double-pumped at 167 MHz, but the bus from the chipset to the CPU(s) is single-pumped at 167 MHz. Apple can't do anything about that; to change the bus between the CPU and mobo chipset would require Moto to redesign the CPU, and they seem in no hurry to do so.

The only practical way for Apple to even marginally increase memory bandwidth was to raise the clock rate on the single-pumped FSB. Motorola was able to produce CPUs which could operate on a 167 MHz FSB, but no one makes SDRAM that fast. You can buy 150 MHz SDRAM, but no one offers 167 MHz. The only memory Apple could reasonably use which could run at 167 MHz was 333 MHz DDR-SDRAM. But for purposes of supporting the CPU, they're treating it as if it were 167 MHz SDRAM. Meanwhile, the PC industry is moving to DDR-II, which is quad-pumped.
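
Here's a rough sketch of why the DDR buys so little in that design. This is my own arithmetic, assuming a 64-bit data path on both sides of the chipset; theoretical peaks, not benchmarks.

    # Rough arithmetic on the bottleneck described above. Assumes a 64-bit
    # (8-byte) data path on both sides of the chipset.

    BUS_WIDTH_BYTES = 8

    # RAM to chipset: DDR-333 is a 167 MHz clock, double-pumped.
    ram_to_chipset = 167e6 * 2 * BUS_WIDTH_BYTES    # about 2.7 GB/s

    # Chipset to CPU: the G4's front-side bus is 167 MHz, single-pumped.
    chipset_to_cpu = 167e6 * 1 * BUS_WIDTH_BYTES    # about 1.3 GB/s

    print(f"RAM to chipset : {ram_to_chipset / 1e9:.2f} GB/s peak")
    print(f"Chipset to CPU : {chipset_to_cpu / 1e9:.2f} GB/s peak")
    print(f"Share of the DDR bandwidth the CPU can actually see: "
          f"{chipset_to_cpu / ram_to_chipset:.0%}")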

Apple has been behind the curve in graphics much of the time, too. Compared to comparably priced PCs, Macs usually have a lower-priced and less capable graphics chip. But that's not really where it became a serious problem. At about the same time as the now-legendary CPU clock-rate stall, Apple had a different stall going on. For a very long time, the normal graphics chip shipped with Macs was the ATI Rage 128.

Characteristically, they trumpeted it as the fastest graphics chip available. In actuality, the Rage 128 was considered mediocre compared to contemporary chips from Matrox, NVidia and other sources. (A review at the time that tested a Rage prototype found the TNT, already on the market, to be faster. The much improved TNT2 came out two months later.) And long after the Rage 128 ceased to even be sold for the PC (because you could buy commodity graphics cards based on better chips for $30 or less) it remained the standard Apple graphics chip, as generation after generation of faster and better parts became available for the PC, sold for a while, and then became obsolete and were superseded by even faster and more powerful parts.

It wasn't until Apple began offering the GF2MX that the Rage 128 finally went to its rest, but it was still shipping in quantity months later. And the GF2MX continued to be the standard Apple graphics part long after it ceased to be commercially viable in the PC industry in anything except no-name commodity cheap graphics cards.

Better chips have often been available from Apple, but only for certain top-end boxes, and you had to specifically choose the better card and pay more for it. When the default graphics chip on a Mac was the GF2MX, lower-priced PCs usually shipped with much better cards. And if you bought any of the iMacs, you didn't get any choice and couldn't upgrade.

It turns out that Apple has been lagging PCs in hard disk technology, too, but I've already gone into this enough. It's exactly the same story. The reality is that on the hardware level, Apple has been a follower in nearly every regard, adopting many technologies only after they're considered obsolescent in the PC world. Apple has been between one and three generations behind in RAM, in graphics chips, and in hard disk interfaces for years now, and it still is. My suspicion is that the primary reason for this is that Apple adopts them when they become cheap due to high volume and declining demand in the PC business. If you're shipping after-end-of-life graphics chips which were low end to begin with, you can get a hell of a deal on them.

Then you wave your hands really fast on the stage at MacWorld, and prove to the faithful that every single aspect of your computer is faster by running five or six Photoshop filters which don't stress the hard disks, doing operations on data which is carefully tuned to run in the L3 cache so that they don't get clobbered by slow RAM speeds, and which don't use the display heavily and reveal the deficit in the graphics chip, and do your comparison against a deliberately crippled PC whose actual configuration you refuse to reveal.
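
To give a sense of how easy that particular kind of tuning is, here's a back-of-the-envelope sketch. The 2 MB L3 cache figure matches the top-end Power Mac G4s of the time; the image sizes are hypothetical examples of my own, not anything Apple actually demoed.

    # How much image data fits in a 2 MB per-CPU L3 cache? The 2 MB figure
    # matches the top-end Power Mac G4s of the era; the image sizes below are
    # hypothetical examples, not anything Apple actually demoed.

    L3_CACHE_BYTES = 2 * 1024 * 1024
    BYTES_PER_PIXEL = 4  # 8-bit RGBA

    def working_set_bytes(width, height):
        return width * height * BYTES_PER_PIXEL

    for width, height in [(640, 480), (720, 480), (3072, 2048)]:
        size = working_set_bytes(width, height)
        verdict = "fits in L3" if size <= L3_CACHE_BYTES else "spills to main RAM"
        print(f"{width}x{height}: {size / 2**20:5.1f} MB working set -> {verdict}")

Pick a demo image that fits in the cache and the slow RAM never shows up in the numbers; pick the kind of image a real photographer works on and it does.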

I despise Jobs for lying like this, for doing so repeatedly and shamelessly. And I find myself revisiting the subject time after time because I'm astounded that otherwise rational people can drink the kool-aid and not realize that they're being conned.

If you like the Mac, I have no problem with that. If you find it helps you solve your problems better, then it's the tool you should use. But if you try to pretend that it's actually technically superior, then I am no longer capable of standing on the sidelines. And if you try to pretend that Apple is leading the industry in every important regard and that all eyes are on Apple at all times to see where they'll lead next, then you're delusional. For years now Apple's hardware has been industry-trailing crap, and Mac customers have been paying top-drawer prices for equipment that PC users were already throwing away.

That's their privilege, as long as they don't try to pretend that a sow's ear is actually a silk purse. I know better, and I won't put up with those kinds of claims.

Update: Arnold Kling comments.

Update: Mr. Lion responds -- or rather, he doesn't.

Update 20021021: John writes:

Hey, I know the answer to your question about why early NVIDIA TNT cards didn't support Apple:

I once read an article where an NVIDIA spokesman (I think it was CEO Jensen Huang) said that the reason that NVIDIA TNT/TNT2 didn't support Apples was that the TNT/TNT2 didn't support some of Apple's native RGB pixel formats. (Sort of a big endian / little endian issue.)

Apple's original 2D Color QuickDraw software expected to be able to draw to the screen in a particular RGB byte order, which was different than the ones the cards supported. And Apple QuickDraw didn't have a device driver layer that allowed the card's driver to translate between the upper level software's pixel format and the card's pixel format.

Newer NVIDIA cards added native support for the Apple RGB pixel formats.

Good grief.
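
For what it's worth, here's a sketch of the kind of byte-order mismatch John is describing: the same 32-bit pixel laid out as big-endian ARGB (the sort of layout classic QuickDraw expected) versus the BGRA byte order common on little-endian PC hardware. The specific formats are illustrative; I don't know exactly which layouts the TNT did or didn't support.

    # Illustrative sketch of the pixel-format mismatch John describes.
    # The specific formats are examples, not a statement of exactly which
    # layouts the TNT supported.

    import struct

    alpha, red, green, blue = 0xFF, 0x12, 0x34, 0x56

    argb_mac_order = struct.pack("BBBB", alpha, red, green, blue)
    bgra_pc_order  = struct.pack("BBBB", blue, green, red, alpha)

    print("ARGB bytes in memory:", argb_mac_order.hex())   # ff123456
    print("BGRA bytes in memory:", bgra_pc_order.hex())    # 563412ff

    # Without a driver layer to swizzle between the two, the red, blue and
    # alpha channels land in the wrong places -- which is why the missing
    # translation layer in QuickDraw mattered.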

Update: Scott Forbes comments. By the way, yes, I know all about 802.11b and DVD writers and Firewire and I'm unimpressed. Those have nothing to do with the issues I raised, and Apple didn't develop 802.11b or DVD writers. It's just that like the PC graphics card makers who try to be first to market with the latest chip, Apple has been using speculative technology from other vendors as a way of trying to differentiate itself. Apple did develop Firewire and it was quite a technical achievement. They then proceeded to nearly kill it commercially by pigheaded licensing demands.

And Apple didn't develop USB. That came from Intel.

But all this is beside the point: why are Mac fans talking about the fact that their cars have different headlights, when the engine and suspension are thoroughly inferior? Why in hell can't Apple use modern RAM and modern HDs?

Robin Goodfellow comments.

Brian Tiemann comments. He says:

I've been treating as axiomatic the "Apple is a force of good in the industry" line; I've used that as the basis for both my cheerleading and my criticism of Apple here. (If they do something that I find worrisome or disagreeable, I point it out, in the hopes that they'll fix it, in the interest of being a better and more successful company.) But if we can't agree on that-- if we can't agree that having Apple around is good for technology as a whole-- then there's very little common ground to be reached.

Yes, we do disagree on that, because I don't think any company is a "force for good". I think that concept is meaningless in the context, and part of what I'm objecting to here is precisely the casting of a straightforward case of commercial competition as some sort of morality play. That is the core delusion on the part of Mac zealots that I find most objectionable.

