USS Clueless Archives

  USS Clueless

             Voyages of a restless mind


Stardate 20010804.2212 (On Screen): A woman from Detroit who runs a non-profit religious group called Love Your Neighbor Corporation has been bringing actions using the Uniform Domain-Name Dispute-Resolution Policy of ICANN to try to get control over various domain names such as "GodSpeaks.net" and "WuzUpGod.com", and in particular over the names "LoveYourNeighbor.net" and "LoveThyNeighbor.net".

She's apparently a fruitcake; she's had every single one of her attempts summarily dismissed by the arbitrators. Doesn't this woman notice even the faintest bit of irony in the fact that she is using belligerent legal proceedings to wrangle with another non-profit Christian organization over the ownership of the phrase "Love Thy Neighbor"? (discuss)


Stardate 20010804.2110 (On Screen): I've been sitting here all day watching the activity light on my server blinking, and I just got into the server's management package and noticed a really strange anomaly on the referer list, which I've captured and linked to here. (It gives you all a chance to see what the Qube's management menus look like; it's a really nice system.) I reset the statistics this morning (as I do about once a week) and what you're looking at is about 12 hours. Refers from "Disenchanted" and "Cheesedip" are familiar; I get a regular stream of visitors from both of them. (Thanks, folks!) Equally, the refer from about.com is an old friend; it's linked to my CDMA FAQ. But the two which are just labeled "1264" and "252" are very puzzling. In the "most common requesters" frame (not shown here) the 1264-refers seem to be coming predominantly from other RoadRunner sites. I believe that these are "Code Red" attempts to probe my server in hopes that it is running unpatched IIS. It's not happening enough to really make any difference to performance, and of course my Apache server isn't vulnerable. (discuss)
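For anyone curious, probes like this are easy to spot in the raw Apache logs, since Code Red asks for "default.ida" with a long query string. Here's a minimal sketch that tallies probe attempts by source address; the log path is an assumption of mine, so adjust it for your own server.

```python
# Minimal sketch: count Code Red probe attempts in an Apache access log.
# The log path is an assumption; Code Red probes are known to request
# /default.ida with a long query string.
from collections import Counter

LOG_PATH = "/var/log/apache/access_log"  # adjust for your server's layout

probes = Counter()
with open(LOG_PATH) as log:
    for line in log:
        if "default.ida" in line:
            # first field of the common/combined log format is the client IP
            probes[line.split()[0]] += 1

for ip, hits in probes.most_common(10):
    print(f"{hits:5d} probes from {ip}")
```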


Stardate 20010804.1036 (On Screen): I've written before about my admiration for the engineers involved in the European Southern Observatory because of their work on the supremely elegant VLT. The VLT consists of four 8.2-meter scopes which will have the ability to work together, but even when working alone are already among the largest scopes in the world. Antu, all by its lonesome, has already yielded some superb images because of its adaptive optics and other advances in its design. But that's nothing.

The OWL project, if approved, will produce a telescope with a 100 meter main mirror. The mirror and the assembly which holds it and its secondary optics will weigh a whopping 13,500 tons (which is about the size of a USN cruiser). In fact, it will be so huge that some of its secondary mirrors will be as large as the largest scopes in existence now. They're completely serious about this and their work looks practical, albeit immense and spectacularly expensive. It remains to be seen whether they can convince the governments of Europe to fund the project, but given their undeniable success with the VLT, there's a chance. OWL stands for "OverWhelmingly Large" and they mean every word of it. Go get 'em, boys! (discuss)


Stardate 20010804.1002 (On Screen): The blue-noses are pointing their fingers at Yahoo again because Yahoo is still running porn ads and selling sex-oriented objects on its sites in Europe. One of the critics is a spokesman for the "Traditional Values Coalition." Gad, I hate names like that. (Could be worse, I suppose, it could have been "Family" something or other.) I'd like to know if Mr. Aiken can suggest a business proposition for staying "clean". We already know that porn sells and that porn advertisers are still willing to pay for banner ads. How does Mr. Aiken suggest that Yahoo become profitable without their business?

Corporations are not "clean" or "dirty". Corporations are profitable or unprofitable, and that's the only characteristic about them which matters. That happens to be the law, interestingly, and if a corporation operates clean and unprofitable where dirty (legal, but dirty) would make it profitable, its stockholders can sue.

"What I don't understand is why a mainstream company like Yahoo would stay in the business of pornography." That from Patrick Trueman of the American Family Association (oh, well). Surely he knows the answer. (discuss)


Stardate 20010804.0722 (On Screen and console): I'm old enough to actually have memories of the language changing. Words which were once mortal insults (like "bastard") become common banter. Common words change meaning, polite terms become impolite and change back to being polite. Some of that is due to the cognitive disconnect sometimes known as "political correctness", which is the classic confusion of cause and effect. The idea went like this: if a word for something has picked up negative connotations, then substituting and enforcing a different word for the same thing will leave the connotations behind, and everyone will think correctly. It don't work that way; words are just tags we use to access concepts, and changing the words doesn't change the concepts. So changing "colored person" to "Negro" to "black" to "Afro-American" to "African-American" to "people of color" didn't make racism go away. (Racism is going away, slowly, but not because of that.)

But this is about a different word. When I was a kid, "gay" meant "happy". It had nothing to do with sexual persuasion. (When I was a kid no-one was willing to admit in polite society that there was any alternative sexual persuasion.) When I was a young adult, the term came to mean "male homosexual" (and later "any homosexual"), about the time that most people became comfortable with the idea that there were such beasts (without them actually being beasts). But no-one can control language; words mean what they're used to mean and new meanings appear all the time. Language is a work in progress. Kids have now adopted a new meaning for "gay" as a general insult, and I don't think that it has anything to do with sexual persuasion either. It's possible that it derived from that but there's no longer any connection. I'm not sure just where it did come from, really; but in practical use for young people now, saying "that's gay" is similar to saying "that sucks". The two links given above both contain examples of that usage of the word "gay" but I've seen plenty of others. I first noticed this maybe two years ago. I suspect the homosexual community will object to this usage, but there's not really a great deal they can do about it. (discuss)


Stardate 20010804.0514 (On Screen): To some extent, the new anti-piracy approaches being used by the music publishers WRT CDs are relying on deterrence. Sony is experimenting with the so-called Cactus protection system which they claim will make it so that a copied CD will be filled with noise. It's emerged that in some cases this might result in actual damage to the system which plays such a CD. I think it likely that Sony is not displeased with this publicity; it may even have planted the story. "See, pirate our CD and blow up your speakers. It'll serve you right." They can't say that officially because then they'd be responsible, but if word gets around, all the better.

But there are a couple of issues here. First, it may not be true. There are copies and then there are copies. One way of copying a CD involves a "raw copy", which not all systems permit. But if it can be done this way, then it means that the copy is bit-for-bit identical to the original and should perform exactly the same way. If the original sounded fine then the copy will, too. The second issue is whether copying software, even in non-raw mode (i.e. when ripping to MP3), could detect this kind of copy protection and remove it. There are distinct limits on what they can insert before it becomes something that a traditional dumb CD player can't correct, so within limits it should be possible to make an MP3 generation program smart enough to do a filtering pass on the data to remove crap spikes like the ones this system adds. I don't anticipate that it will be long before such programs become available. (discuss)
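To make the idea concrete, here's a toy sketch of such a filtering pass: a median-based despike over PCM samples. I don't know the details of what Cactus actually inserts, so the window size and threshold here are arbitrary assumptions of mine; a real ripper would need to be considerably smarter than this.

```python
# Toy illustration of the filtering pass described above: an isolated
# sample that jumps far from the median of its local window is replaced by
# that median. The window size and threshold are assumptions.
def remove_spikes(samples, threshold=8000):
    cleaned = list(samples)
    for i in range(1, len(samples) - 1):
        window = sorted(samples[i - 1:i + 2])
        median = window[1]
        if abs(samples[i] - median) > threshold:
            cleaned[i] = median   # smooth the outlier away
    return cleaned

# a quiet ramp with one injected spike
print(remove_spikes([0, 100, 200, 30000, 400, 500]))
# -> [0, 100, 200, 400, 400, 500]
```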


Stardate 20010803.2235 (On Screen): So I guess we're seeing it proved yet again that secure crypto isn't guaranteed even in this age of uncrackable ciphers, if you make mistakes. 802.11 is a standard for wireless networking, used in a number of existing products (such as this and this). It was already known that it was vulnerable, and there had been previous cracks. However, the earlier cracks were slow and they required interaction with the system, and therefore the assault could be detected. The new one, however, is devastating.

First, this crack is totally passive. It's possible to crack the crypto simply by listening, which means that there's no way for the network to know it is being attacked. Second, the 40-bit key used by 802.11 can be cracked in fifteen minutes. Third and worst is that the crack scales linearly with the number of bits of the key. In other words, going to a 128 bit key would only increase the crack time to about an hour.

What should have happened is that each base station shipped from the factory with a unique public and private key burned into its flash memory. Then, upon opening a session, the remote would query for the public key, which would be sent in the clear. The remote would calculate a session key, encrypt it with the public key and send it to the base station, which would decrypt it, and then you'd use DES or AES thereafter. The only remaining issue is how randomly the session keys are chosen by the remote, which is a much less serious problem. Even if a given session key were cracked, that would do no good for any other session, since each would use its own. And if any given base station's keypair were cracked, it wouldn't endanger any other base station.
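To make that concrete, here's a rough sketch of the kind of hybrid key exchange I'm describing, using Python's "cryptography" package to stand in for what the hardware would do. To be clear, 802.11 does nothing of the sort; the RSA key size and the choice of AES-GCM here are my own assumptions, purely for illustration.

```python
# A sketch of the hybrid scheme described above. 802.11/WEP does nothing
# like this; key sizes and the AES-GCM choice are assumptions.
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# 1. Burned into the base station's flash at the factory:
base_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
base_public = base_private.public_key()

# 2. The remote picks a random session key and encrypts it with the base
#    station's public key (which was sent in the clear).
session_key = os.urandom(16)
oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped = base_public.encrypt(session_key, oaep)

# 3. The base station unwraps it; both sides now share a key which a
#    purely passive eavesdropper never saw.
assert base_private.decrypt(wrapped, oaep) == session_key

# 4. Traffic for the rest of the session uses a symmetric cipher.
nonce = os.urandom(12)
frame = AESGCM(session_key).encrypt(nonce, b"some 802.11 payload", None)
print(AESGCM(session_key).decrypt(nonce, frame, None))
```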

Of course, that wasn't possible because 802.11 was designed back in the dark days when the US Government was imposing controls on strong encryption, which is why it is using a 40-bit key in the first place. But there's a difference between the size of the key and the actual strength of the encryption. The DVD CSS uses a 40 bit key but because of screwups it's actually only 26 bits strong (and can be cracked by brute force in a couple of seconds). Now it turns out that the 802.11 crypto is even weaker. If there's a lesson to be learned here it's that private citizens really do need strong crypto, and government controls on same do more harm than good. (But we knew that already.) (discuss)


Stardate 20010803.1620 (On Screen): There are times when I regret being an atheist. I've been following the joys and sorrows of a little girl named Fiona Elise online for quite some time now. She just turned two years old, and she is absolutely the center of her father's life (as she should be). What's unusual about her is not just how cleverly her father has brought her to the web, but the fact that she was born with a deformed heart and has had two open heart surgical procedures in her short life. Now she may soon have to have a third. They moved to New Jersey a few months ago but the surgical team is in the Bay area, and now they have to travel back so she can have a cardiac catheterization (which won't involve opening her chest, thank goodness). There's a chance she'll live to adulthood or even have a normal life span. There's a chance she won't. There's no way of knowing what those chances are. It all depends on a contest between the progress of her disease and the talent and skill of her therapists, but so far it's looking pretty good.

I wish I could help, but I don't know how I can. If I were religious I could pray for her and her family. If I were superstitious I could wish them luck. But as a mechanistic atheist I don't believe in those things, so all I can do is sit here and feel dread about the future of a little girl I've never met but care a great deal about. (discuss)


Stardate 20010803.1553 (On Screen via long range sensors): I'm not quite sure whether this is a good thing or a bad thing. TV ratings for the "Miss America" contest have been in decline for years and last year they became intolerably low for the show's producers. So they're going to change to try to catch the current wave. Sigh, they're incorporating aspects of "Survivor" and game shows into the format.

One example of supremely bad taste is going away. (That's good.) But it's transforming itself into an even worse example of bad taste. (That's not.) (discuss)

Update: At least they're not going to clone Bert Parks. (I promise that's the last "clone" joke.)


Stardate 20010803.1056 (On Screen): About 15 years ago I worked for BBN, in the group which was maintaining the hardware which supported the ARPANet. It was near the end of its technological life and was about to be replaced by more modern networking architectures, but for a long time it was state-of-the-art. ARPA wasn't the only customer which used it; there were other companies which at the time needed their own private networks. Airlines were big users, for instance, to connect travel agents and airports to their central reservation systems. Big hotels used them, too. The basic technology was what was known as a "packet switch node" which was connected to between three and six neighbors using 56 kbit lines, the fastest available at the time for any reasonable amount of money. It didn't use "source routing"; rather, a packet travelling from Seattle to Miami might go through San Francisco, Salt Lake City, Denver, Chicago, Atlanta and then reach Miami. At each step, the node would decide based on current traffic conditions in the network where it ought to go next.

Obviously, a network like this can be designed well or badly, and you could get the same performance out of networks 50% different in expense. Customers wanted optimized network designs, and BBN had a group whose job was to design networks for potential customers in hopes of making the low bid. They used genetic programming, a relatively new technique then. They would sit down and manually design a potential network, and then put it into a special program that ran on a dedicated high speed workstation. What that would do is to successively mutate the network, and then evaluate all the children and only retain the best few, then mutate etc. until someone told it to stop. Usually they'd run it for three days or so (representing thousands of generations). There were two invariant results. First, the winning network bore no resemblance to anything a human would design. Second, the winning network would massively outperform the best human design for less money. If you didn't know the mechanistic source of the design, you might describe some of them as "ingenious".

As a problem in genetic programming, this one was quite circumscribed. The "genome" was relatively small, consisting of the number of nodes, their placements (out of no more than a few hundred potential choices most of which were cities) and the interconnections between them. Part of the model would be a description of the expected traffic and that didn't mutate, but it did feed the evaluation heuristic which attempted to determine how well a candidate design would carry that particular traffic pattern. This was about the limit of what was possible with the compute resources available at the time, since the high speed workstation on which this ran used a 25 MHz 68030.
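For the curious, here's a toy version of that mutate-evaluate-retain loop. The genome is just a set of links among a handful of fixed sites, and the fitness function (link cost plus a penalty for a partitioned network) is a made-up stand-in for BBN's traffic-driven evaluation heuristic; the real thing was vastly more sophisticated.

```python
# Toy genetic search over network topologies: mutate, evaluate, keep the
# best few, repeat. Sites, costs and fitness are all made up.
import random
import itertools

SITES = ["SEA", "SFO", "SLC", "DEN", "CHI", "ATL", "MIA"]
LINK_COST = {pair: random.randint(10, 100)
             for pair in itertools.combinations(SITES, 2)}

def connected(links):
    """True if every site can reach every other over the given links."""
    seen, frontier = {SITES[0]}, [SITES[0]]
    while frontier:
        node = frontier.pop()
        for a, b in links:
            other = b if a == node else a if b == node else None
            if other and other not in seen:
                seen.add(other)
                frontier.append(other)
    return len(seen) == len(SITES)

def fitness(links):
    cost = sum(LINK_COST[link] for link in links)
    if not connected(links):
        cost += 10_000            # heavily penalize a partitioned design
    return cost                   # lower is better

def mutate(links):
    child = set(links)
    child.symmetric_difference_update({random.choice(list(LINK_COST))})  # toggle one link
    return frozenset(child)

# seed with the fully connected (expensive) design, then evolve
population = [frozenset(LINK_COST)]
for generation in range(2000):
    children = [mutate(random.choice(population)) for _ in range(20)]
    population = sorted(set(population) | set(children), key=fitness)[:5]

best = population[0]
print(f"best cost {fitness(best)} with {len(best)} links")
```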

But as we all know, the cost and power of computing has come a long way since then, and it's interesting to read that the same essential approach is now being used, with thousands of times more compute power, to actually design computer algorithms. This is really exciting, and it's the first practical example I've ever seen of a program which really can solve a problem for us without us having to tell it how to do so. All programming until now has involved humans doing the design. In this case we're actually seeing working code which no human described ahead of time. It will be interesting to see how widespread this becomes as computers continue to get cheaper and faster. (discuss)


Stardate 20010803.0957 (On Screen): A research project wants to learn how humans think and speak, so they're going to a ready source of utterances from a huge number of people: Usenet news postings. Gad. Is this really the aspect of the human race we want to use in order to teach computers what humans are like? (discuss)


Stardate 20010803.0906 (On Screen): One of the problems with terrorism as a tactic of warfare is that terrorist movements take on a life of their own, and it can be really hard to stop them even after you've achieved your goal (or gotten close enough to it). Some people get involved as much because they like the struggle and the power and the romance (and the fear, and the money) as for the cause itself, and they don't want to give that up. Since terrorist movements are quite often involved in crime anyway as a means of financing, it's not uncommon for the remnants of a terrorist movement to continue on as organized crime. (The Mafia began that way in the 19th century.) It's a lot easier to start a fire than to put one out. (discuss)


Stardate 20010803.0742 (Crew, this is the Captain): I was in a drug store yesterday and while waiting at the checkout counter I spotted a magazine with pictures of three women on the cover. The center one was Britney Spears (no surprise). Only when I looked more closely, it wasn't. It was actually Heather Locklear. The resemblance between them is frightening; you don't suppose that someone has already been working on human cloning, do you? (discuss)


Stardate 20010803.0712 (On Screen): Be corporation just laid off a bunch more people; it has just about run out of money to bleed and will soon be dead. Its alternative OS for the Mac died quietly, taken down by Apple, and it missed its chance to be acquired by Apple and become the basis for OSX because it didn't have Steve Jobs. (I still think that Apple would have been better served with Be than with Next; OSX would have been on the market at least a year sooner. On the other hand, without Jobs Apple would probably have been dead by now.) On the PC BeOS couldn't compete because it never got a sufficient weight of apps, even though the OS itself was always excellent. It threw in the towel (and began to give it away) when it decided to shift to trying to make its OS the basis for a "net appliance" just as that market failed, but it did manage to sell its BeIA OS to Sony for use in a product. Alas, Sony's product hit the market just recently, in the middle of a tech slump and just as a couple of other net appliances from other vendors became commercial flops. Sony's appliance is also not expected to succeed. So Be laid off a bunch of people recently, leaving just its core group of engineers. The resulting organization is not commercially viable as an independent company because it no longer has a sales staff. The speculation is that this was done to prepare the company for acquisition, and that is what this article suggests actually has happened. If so, we'll learn about it in the next few days. I hope it's true.

But the article doesn't say who the buyer is. There's long been speculation that it might be Sony, which would make sense. But that's not the only possibility, especially if Sony has already given up on its network appliance, and I'd like to suggest another. Network appliances are not a viable business, but PDA's are, especially PDAs merged with cell phones. I suggest the following possibility: Be will be purchased by a major cell phone company to produce a competitor for Symbian and Palm and WinCE for use in smart cell phones. None of those three is totally satisfactory; each has its flaws. It is also an emerging business and looks like the wave of the future for the cell phone industry, and it's the only place where a "net appliance" really does make sense. Be has already shown that it is capable of porting its code to different platforms (from PPC to the x86) and should have no difficulty in porting to ARM (the de facto standard for cell phones). Their code is fast and efficient and should have no difficulty running on underpowered portable devices, especially if the screen is small. The GUI would have to be reworked but that's not a serious problem. So my dark horse for an acquisition is my former employer, Qualcomm. (I have no inside information on this; it's been months since I've talked to anyone there. I'm just speculating.) (discuss)


Stardate 20010803.0648 (On Screen): I'm not sure I believe that peer-to-peer (P2P) can ever be a viable business model. The difficulty is that there are fundamental contradictions in the goals which can't be resolved.

A bunch of refugees from Netscape have formed a new company called Kontiki and are working on a P2P system. They are not repeating the "new economy" mistake of punting the business model. So they've concentrated up-front on worrying about how it's going to make money, but I don't think they've thought deeply enough about it. Their answer is that it would be a cut-rate competitor for Akamai. Akamai is a hosting company which has server farms placed all over the US (and probably other places in the world) with high bandwidth net connections, which will host big files for other companies. The idea is to distribute bandwidth problems, but so far it hasn't been a commercial success. The alternative for companies who want to distribute large files (e.g. video files or large flash files) has been to host them on their own servers, but it's possible to really get hammered that way. Kontiki's answer is to use a P2P system to distribute the load, at minimal cost to the content provider. All well and good, except why would individuals want to let their computers (and more important, their bandwidth) be used for distribution this way?

The problem with a P2P network is that its end-users won't pay for it. They have become accustomed to using P2P for free. Also, if they're going to spend long periods on the system (hours per week, which is essential for it to work) then it's going to be because they can trade pretty much anything on it that they want (i.e. MP3's and porn). But if it's a commercial system then it's going to have to be paid for by corporations. Advertising is out; the revenues aren't high enough. Direct sponsorship by corporate content producers would come with the string attached that the system be designed to prevent data piracy. That means that from the point of view of the potential users the system would be crippled because it wouldn't permit them to trade what they want on it. So you can build a system which will be used heavily but which no-one will pay for, or a system which corporations will pay for but which no-one will use. But even if the users of a P2P system were willing to pay for it (obviating the need for corporate sponsors), they'll only use it if it permits unrestricted file sharing, and in that case the system will be sued out of existence.

Which is why the future of P2P is going to be freeware running a distributed model, with no central servers. That means there are no central servers for anyone to pay for and no large targets to sue, so the system can be designed to permit unrestricted content distribution, which will make it popular with the users. With such a system in place, even if inefficient, why would users ever want to adopt a commercial system with data restriction? The essence of successful P2P is copyright violation. (discuss)


Stardate 20010803.0604 (On Screen): In the hands of someone looking for cheap shots, this story presents many opportunities, and you'll see a lot of them in days to come. A comedian in Georgia stood up in front of an audience and confessed on stage to having committed three bank robberies. While the audience laughed, the managers of the theater called the cops. He just got convicted of the crimes and may spend decades in prison because of it.

"Jailed for not being funny." "Anything for a laugh." And so on. My take is simply this: anyone this stupid really belongs in jail. I don't think there's anything humorous about it at all. (discuss)


Stardate 20010802.1909 (On Screen): Before he was arrested, Dmitry Sklyarov was in Las Vegas giving a talk at DefCON. This article reveals what he was talking about. No wonder Adobe was pissed. It turns out that most of the encryption algorithms being used for eBooks are laughable. A couple of them are so weak that they make the DVD CSS look strong. One of them is actually using rot13. The one used by Adobe is reasonably strong, but it includes the key in the file (which rather defeats the point). "Inept" doesn't even begin to describe them. (discuss)
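For anyone who doesn't appreciate just how laughable that is: here is rot13 in its entirety, along with the entire attack (apply it a second time). The plaintext is my own.

```python
# rot13, the "encryption" one of the eBook schemes reportedly relies on.
# Applying it twice is the whole attack.
import string

ROT13 = str.maketrans(
    string.ascii_lowercase + string.ascii_uppercase,
    string.ascii_lowercase[13:] + string.ascii_lowercase[:13] +
    string.ascii_uppercase[13:] + string.ascii_uppercase[:13])

ciphertext = "Attack at dawn".translate(ROT13)
print(ciphertext)                   # Nggnpx ng qnja
print(ciphertext.translate(ROT13))  # Attack at dawn
```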


Stardate 20010802.1642 (On Screen): We're having an interesting discussion about human cloning in Clueless Comments. (discuss)


Stardate 20010802.1526 (On Screen): If there weren't enough good reasons already to avoid WinXP, it turns out that you may not be able to run arbitrary applications on it: WinXP will refuse to install programs which haven't been officially blessed by Microsoft. It's not clear how extensive this is or whether it applies to everything or only to a subset of programs, but even in the latter case it's unacceptable. The purported reason for doing it is to decrease unreliability by not permitting people to install drivers which haven't been thoroughly tested (drivers being the largest source of unreliability in NT and 2K), but it also has the effect of forcing ISVs to pay tribute to Microsoft before publishing their programs. (discuss)


Stardate 20010802.1447 (On Screen): Given the perennial shortage of organs suitable for transplantation, it makes sense that such organs as do become available should be used in the people who will get the greatest benefit out of them. Given a choice between an 85 year old man and a 25 year old man, all other things being equal one would choose the younger patient simply because it's likely that even with the transplant the old person will die soon anyway, whereas a successful transplant in the younger person could extend life fifty years.

By the same token, I would object to a transplantation to any person with incurable cancer. My father died of cancer and among other things it consumed his liver. At that time transplants weren't possible, but even if they had been, a transplant for him would have been useless because his pancreas was also gone and he had cancer elsewhere in his body. He ultimately died because of liver failure, but if it hadn't been that it would have been something else.

So though I have a great deal of sympathy for people who are HIV positive, I also don't think that they are suitable candidates for transplantation surgery. It is true that modern drug treatment can extend life for such people, but even with that they're still going to die sooner (and become disabled sooner) than would someone who doesn't have the disease. It's not that I think that a transplant can't help someone with HIV; it's that I think that same organ would help someone else even more, and there aren't enough to go around. (discuss)


Stardate 20010802.1416 (Crew, this is the Captain): One of the interesting things about owning my own server is the ability to access a lot more detailed information about who is getting in, when, and from where. Cobalt has a nice web-based management system for looking at this kind of stuff, which permits me to see things like total traffic, traffic per day or per hour, or the most common 200 requesters, but I rarely look at those. The one I look at a lot is the referrer list, and that can sometimes be strange. I am mainly watching it looking for places where someone else referenced me, because I'm trying to keep a list of such people (out of courtesy, if for no other reason). Of course, what I'm seeing here is every refer to my server, not just to the web log. So since my CDMA FAQ (about cell phones) also lives here, a lot of the traffic I see is because of that. But some of it is also an indication of people who don't know how to use search engines. I just spotted a refer from Yahoo that was looking for "beautiful+pictures+of+beautiful+women" and got linked to my essay "Beautiful Women". Unfortunately, he (presumably a male) was searching for all those words, not for a phrase, and indeed all those words do appear in that essay. Alas, it has no pictures. (The guy looking for "beautiful+young+breasts" will have been equally disappointed.)

Of course, a "refer" doesn't necessarily mean a page with a link to me. What it sometimes means is the page that you (my loyal crew) were at just before coming here by clicking on the shortcut you have for me, and I must say that some of you have very strange reading habits. I still haven't been able to figure out just what "www.hungryhippo.com" is or why anyone would look there, given that it comes up as a nonexistent page for me. For a while I wondered if someone was just trying to play games with my mind. (Now I bet I get to see a lot of really weird ones. Spare me.)

I reset the statistics every few days, and I last reset them about a week ago. I registered with Google to be on their list of places to crawl, and I've noticed since then that several other crawlers have found me. I know that some people object to that, but I don't mind and I find a lot of hits from search engines now, including some I've never heard of: search.rediff.com, for example. In the last week, I've gotten nearly 140 refers from Google alone, which is fine by me. (I have no idea what they were searching for, though I could probably find out by getting into Linux and actually looking at Apache's log file. I suspect quite a few of them were looking for CDMA information, since my CDMA FAQ is top of the list of hits for the phrase "CDMA FAQ".) But I get a bit of a charge out of finding an all-new refer, and today I got a doozie: My entry on Yahoo has gone live and I've already gotten about 15 refers from there. Of course, once I'm no longer "new" and sink down to the end of the page I'll probably never see another one. Jeeze, that page is long. (Why, oh why, didn't I name this AAA Clueless instead of USS Clueless?) (discuss)


Stardate 20010802.1333 (On Screen): This story describes a way of storing energy which involves using abandoned mines. The mouth of the mine is sealed off, and when energy is plentiful the mine is filled with compressed air at up to 75 atmospheres. When energy is needed, the air is released again and used to generate electricity. My biggest problem with this is that they're going to be flexing the structure of the ground in which the mine was dug. Over the course of hundreds of cycles, there will be weakening, and eventually you're going to get something catastrophic: either a blowout or a ground collapse somewhere. If you really need a significant energy storage system, why not build a reservoir on the top of a big hill and pump water up to it? (discuss)
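For a sense of scale, here's a back-of-the-envelope comparison of the two schemes, treating the air as ideal and the expansion as isothermal. The cavern volume and the reservoir's head are assumptions of mine, not figures from the article.

```python
# Rough comparison of compressed-air and pumped-hydro storage. The cavern
# volume and reservoir head are assumed values, not from the article.
from math import log

ATM = 101_325            # Pa
cavern_volume = 100_000  # m^3 of abandoned mine (assumed)
pressure = 75 * ATM      # Pa, per the article

# Maximum work recoverable from isothermal expansion back to 1 atm:
air_energy = pressure * cavern_volume * log(75)        # joules
print(f"compressed air: {air_energy / 3.6e9:.0f} MWh")

# Pumped hydro storing the same energy with a 300 m head (assumed):
water_mass = air_energy / (9.81 * 300)                 # kg
print(f"equivalent reservoir: {water_mass / 1000:.0f} cubic meters of water")
```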


Stardate 20010802.1251 (On Screen via motion detectors): This apology tries to rationalize that it's completely OK for Apple to only have 3% market share. After all, Porsche only has 0.2% market share and does fine, right? The situations are not analogous. Porsche can survive at that level because its cars burn the same gas as do Chevrolets. What if a Porsche could only burn specialized Porsche-specific fuel? Could there be a reasonable network of Porsche-gas stations with an installed base that small? What kind of price would they have to charge for the fuel, and would even rich people looking for a trophy be willing to pay it? There's a good reason why Porsche doesn't make cars that burn nitromethane.

The comparable thing for Apple is application software. There exist PC makers whose market share is smaller than Apple's (Micron PC, for instance) but their products use the same software as Dell and Compaq and all other PCs. They use the same gas stations. But Apple's product needs its own; its apps are not compatible. This means that Apple, unique among manufacturers of desktop computers, needs an entirely separate retail support structure of stores and of ISVs to create products for those stores. That, in turn, means that the absolute installed base of Apple's computers has to be large enough to create a market big enough to sustain that structure, irrespective of how large any competing systems are. By being different, Apple also has to go it alone. It can't leverage off of overall market mass or share the expense of the support structure with other companies. It may well be (some might claim) that Mac software is nitromethane, but regardless of performance that still makes it rare and expensive, or potentially could make it become rare and expensive. (This is what killed OS/2, for instance.)

Apple is unique among desktop computer makers in that a 3% share really may spell trouble. It remains to be seen whether that really is adequate for an incompatible architecture. The ISVs have been very nervous about it for a long time, and if they abandon the platform then it is walking dead. (discuss)


Stardate 20010802.1212 (On Screen): This article is extremely interesting because it explains what the "Goodwill" entry on corporate quarterly announcements really means. It isn't quite what I thought it was, and in a sense it's even worse. However, his claim that it represents paper losses isn't quite correct. If it is indeed a case of trading new stock for an overpriced acquisition, then it really is important that it appear on the corporate spreadsheet because the new stock dilutes existing stock. It needs to be shown as a loss so that the overall worth of the company is calculated correctly in terms of assets-per-share. (discuss)


Stardate 20010802.1117 (On Screen): The military is an ecology. That sounds strange but it really is true. Is a given animal well adapted? There's no way of knowing without knowing about the environment in which it lives. A camel is a superb desert creature but would perish in days in the Arctic. Equally with weapons: when you look at a weapon, it's impossible to answer the question of whether that weapon is a good one. Good for which army under what circumstances? A weapon which will massively benefit one army may be totally useless to another.

The reason is that modern warfare isn't won by weapons; it's won by logistics. An artillery piece without ammunition is a useless tube of steel. A tank without fuel is a statue. "Logistics" refers to the problem of getting adequate supplies to the front, which sounds easy and isn't. Anyone can field an army, but a modern army without supplies is useless, and modern warfare consumes supplies at a ferocious rate. The miracle of Operation Desert Storm wasn't that we were able to field six divisions or that they did as well as they did, but that we were able to keep them supplied with food, water, fuel and ammunition. But while those divisions only represented about a third of our field strength, it took every bit of logistics support we had, which is worrisome. The only National Guard units which were called up for Desert Storm were transportation units. This is known as the "tooth-to-tail" problem, and a modern army requires a huge tail for every tooth. To field an infantry division of 18,000 men it can take upwards of 100,000 men behind the line (extending all the way back to the mother country) moving supplies and performing other support functions and coordinating everything. And behind that there may be a million civilians working on farms or in factories or driving trucks or operating trains. That's why it is a truism that "amateurs discuss tactics, but professionals discuss logistics." If you've got supply and your enemy doesn't, strategy and tactics become simple.

Is the new Iranian anti-armor missile a good weapon? Probably not. One reason is that high tech weapons like this can't be produced on an as-needed basis. They simply take too long; if you go to war, you fight with what you have, which means you have to stockpile them ahead of time. That is enormously expensive. And once you've stockpiled them, you have to maintain them so that they're still useful when you finally need them. The US can afford to do this, but Iran can't. If this missile is as good as they claim it is (about which I have my doubts) then Iran would have to build and maintain several hundred of them before they became militarily significant, and if they try to do this they will either destroy their economy or will starve the rest of their military.

Also, weapons like this are vulnerable. The backbone of US military doctrine since World War II has been to achieve and maintain air supremacy in any theater where US ground forces operate. This grants you an enormous advantage: you can scout easily and see where your enemy's forces are while preventing him from doing the same to you; you can attack his logistics while keeping your own safe, and you can launch air attacks against his ground forces while keeping your own safe. Desert Storm represented the ultimate expression of this tactic and the lopsided casualty figures speak for themselves.

That's a rich man's war. It's a way of spending money to save soldiers' lives. The US is willing to do this and is one of the few countries capable of doing so. In WWII the Germans learned that their forces would always be smothered by artillery fire during any operation on an American front, because the Americans were willing and able to expend immense amounts of artillery munitions in an effort to save American soldiers. German artillery was just as good as American artillery but never had the amount of ammunition that American artillery had.

Which brings up the fact that a high tech anti-armor missile is a rich man's weapon. The US is quite willing to fire million dollar missiles to take out half-million-dollar tanks (with some missiles not hitting). Iran doesn't have those kinds of resources. If the US in WWII was fighting a rich man's war, then the USSR was using the opposite tactic. For the USSR supplies were dear but men were cheap, so they used tactics which were effective but which resulted in huge casualties to the Red Army. They won against the Germans while taking five casualties for every one they inflicted on the Germans, but that's because they could afford to sustain losses like that and the Germans could not. Equally, the US won against the Germans because the US could afford to spend three times the supply per soldier.

If the Iranians build these missiles they'll have to work out how to protect them against weeks of air assault before any opportunity arises to use them against ground attack. (That's true against Israel, too.) So they have to accept that they'll lose perhaps three quarters of their supply of these missiles without firing them, and that the remaining missiles won't have a 100% success rate. Are they willing or even capable of spending that kind of money on an anti-armor defense? Not a chance. Or are they willing and able to field an air force capable of defending these missiles on the ground? Unlikely. Absent that, these weapons become like the Iraqi Scuds: an annoyance but not militarily significant.

In a logistical war, the goal is to use your logistics advantage in such a way as to present your enemy with an unsolvable problem. That was the reason why in WWII there was fighting in Italy. Italy was a place where an attacking force could not be ignored by the Germans. If a substantial British-American force operated there without opposition, it could move up to the top of the nation and invade France, or Germany itself through Austria. But as long as it was opposed by a substantial German force it was not a threat. So since Germany was up against a manpower limit and the US was much less so, British and American forces in Italy tied down Germans who could have been better used in France or the USSR. And indeed by late 1944, the Germans really were in the situation of being "one army short", which is one of the big reasons they lost. If you can sustain 2500 miles of front but your enemy can only sustain 2000 miles, then you want to have 2500 miles of front, because then your enemy will be undefended somewhere.

A weapon is only useful if the nation which fields it has the logistics to support it. Is this new Iranian missile a good weapon? It may be. Is it a good missile for the Iranian military? Not a chance. Of course, it is impolite to correct an opponent when he's making a mistake. If this missile is a logistics blunder for the Iranians, then the best thing for our military people to do is to act worried about it and to encourage the Iranians to deploy it. (Besides which, this may give our military an excuse to pry more money out of Congress.) (discuss)

Update: It's been pointed out to me that I screwed up. The picture in the news article of the missile lying on a truck bed isn't the anti-armor missile about which I thought I was writing. In fact it doesn't say how big the anti-armor missile is, but it's likely an infantry weapon on the order of TOW. Oh, well; all that work about something they weren't actually doing. A front-line infantry weapon would be much more reasonable for Iran's military situation. It also would be less of a threat for a number of reasons, not least of which is that the US Army has been facing these kinds of weapons from the Warsaw Pact for 25 years and I think probably has worked out doctrine for dealing with them. Also, a small missile armed with a shaped charge warhead (along the lines of TOW) isn't really a threat to a tank equipped with Chobham armor such as the US M1. Chobham armor was designed precisely to defend against shaped charges. That's why NATO tanks fire APDS rounds against enemy tanks now rather than HEAT.


Stardate 20010802.0906 (On Screen): They just won't let it rest, will they? IBM is making another attempt to push "thin client computing", the real point of which is "big iron computing", and guess who gets to sell the big iron? They're going to build a network in Europe originally for "scientists and researchers" but with hopes that it will become the common model there for individuals.

I have several objections to this concept. First, at the rate that cost-of-computing is falling, centralizing it doesn't make sense. For nearly anything except mondo-bit-crunching, it's not that expensive now to buy hardware to solve your problem. Second, any centralized system will be vulnerable to attack and any failure in it will have massive consequences. As an engineer, I don't like single points of failure. Distributed systems are inherently more resilient. Third, I'm afraid that if some poor or politically-motivated choices (e.g. Java) are made in establishing this model, it will institutionalize those choices. IBM's done it before; we're still living with the consequences of their choice of the x86 over the 68K for the original PC. (discuss)


Stardate 20010801.1531 (On Screen): This article describes in pretty great detail how the marvelous new CD copy protection mechanism being sold by a company called Midbar Tech works. Though they're keeping mum, it turns out to be revealed in patent #6208598. It's really pretty pathetic and there should be no difficulty at all defeating it. Is this all the better they can do?

They crow about the fact that they can stay one step ahead of the pirates. Actually, they can't. The reason is that deploying new countermeasures of this kind (such as changing the size and shape of the noise spike which is interjected) is a slow process, whereas changing the code which defeats it is rapid. A new countermeasure is deployed on new CDs which are distributed in physical form in huge quantities, whereas the crack will be software distributed on the internet. (discuss)


Stardate 20010801.1408 (On Screen): Today's nomination for biggest tech non-story to get widespread coverage is the resurgence of the Code Red worm which was predicted for today but doesn't seem to have happened. Apparently a lot of the hysteria began with cries of doom by Steve Gibson, who seems well on his way to squandering years' worth of credibility he's built up. So today, after a great deal of coverage proving that it was a non-event, we get this story from CNet which almost seems to be rooting for it to actually happen after all. I think the reason is that CNet is feeling a bit sheepish for having been taken in by the hype, and is sort of hoping for at least a little bit of a disaster, just so they can feel less foolish. Too bad for them. (discuss)

Update 20010802: Seems it was never really a threat in the first place.


Stardate 20010801.0831 (On Screen): Motorola, one of the biggest high-tech companies on earth, is in deep trouble financially, and the running wound of its operation is its semiconductor division which has lost a whopping $800 million in just the first half of this year. I think what we're really seeing here is the death of the old model of the semiconductor business. The new model, which is becoming increasingly common, is to divide the business into foundries and fabless designers. Taiwan Semiconductor Manufacturing Company (TSMC) is an example of a big-name foundry; they don't design chips, but they do make chips for other companies -- lots of other companies, including ones you've heard of. The foundries concentrate on what they know: how to take masks and make good chips from them. The designers, on the other hand, have the ability to start small and grow as their business justifies it. This means companies like nVidia, Via or Qualcomm, who know a specific business and know it well, and can create the designs but can't afford the immense capital investment involved in owning their own fab. Fifty or a hundred companies like this effectively spread the gargantuan cost of the fab by contracting with the same foundry.

That's the problem: fabs have reached the point where no companies can actually afford to operate them exclusively for their own products. One of the few which does this successfully is AMD, but they've done it by staying starved. AMD makes its own chips, but it also contracts outside for a substantial part of its production, so it is saturating its own organic capacity and even if it faces an economic downturn in business its own fab will remain saturated. Other major semiconductor companies like IBM and Intel create a lot of chips for themselves, but they're also major foundries and because of this can spread the capital expense around.

Until now, Motorola has been designing its own chips for things like cell phones and using them exclusively in-house. Their new plan is to also sell those chips outside, but the industry is skeptical about this kind of conflict of interest (I know about this first hand) and I'm not sure I believe that this is actually going to bring in $billions in new chip business. Equally, the PPC is reaching the end of its useful life, being supplanted in embedded by the ARM (which Moto recently licensed) and on the desktop by the amazingly resilient x86 architecture (which is also at the end of its useful life, soon to be replaced by Itanium and Sledgehammer and ARM). Volume in the PPC isn't high enough to justify the awesome expense which would be required to keep it competitive, so it has been languishing.

If there's any solution to Moto's semiconductor woes, part of that is going to have to be getting into the foundry business big time (as Intel is) to subsidize the capital cost of equipment. It may also include a radical change in business model, and abandonment of entire product lines or even businesses, and a substantial deemphasis on in-house designing. With the recent announcement by Palm that it would port its OS to the ever-present ARM, Moto will no longer have the inside track on the PDA business. Moto is going to make ARM-based PDA chips, but it will be competing with Intel and probably with Palm itself, which could go to foundries and make its own chips. I see no future for Motorola in the PDA chip business and I predict they'll be completely out of it (except as a foundry) within two years.

I don't know what the complete answer might be for Moto's chip business (or even if there is an answer) but they are going to have to completely rethink their business model. Their current business model is the one which was successful in the 1980's, back when all chip companies owned their own fabs. That was possible when a fab cost less than $100 million. When a modern bleeding-edge fab can come in above $3 billion, it's no longer viable. (discuss)


Stardate 20010801.0723 (On Screen): There's a real problem with stock analysts at the big brokerages who trade in the stocks they monitor. The difficulty is one of conflict of interest, because when one of these analysts announces a change in position, it can make that stock move. So the knee-jerk reaction is to make it so that analysts are not permitted to trade in the stocks they monitor.

Is this really all that much better? Do you really want to follow the advice of an analyst who isn't willing or able to invest his own money based on his advice? Perhaps another route could be followed here.

Right now SEC regulations require that all significant "insider" stock trades be published. I think the best answer would be to permit analysts to trade whatever they want with their own money, but to force them to reveal all trades concerning stocks they officially follow as analysts. Their votes with their wallets would be the most sincere analysis they'd be doing; if they issue a "strong buy" on a stock but sell their own holdings, it isn't too difficult to figure out their real opinion. (discuss)


Stardate 20010801.0628 (On Screen): In 1999, a father and his 19-year-old son went on a fishing trip. The son walked away and was never seen again. Extensive searches, including the use of dogs, didn't find anything. Two years later, by a fluke, he's probably been found about two miles from where he was last seen. Without that luck they probably never would have found him. It happens. The world is large and people are small and vulnerable. Despite how vital and important people are to us, a human body is just not that easy to locate in the wide open spaces of the world. In that case there was every reason to believe he was near where he'd last been seen (and he was), yet they still didn't find him. That's how it goes.

In an ideal world there would never be any "missing persons". In an ideal world, everyone who died would instantly be found and their remains instantly identified. We're not living in an ideal world, and in fact hundreds of people in the US go missing every year and are never seen again. With millions of people dying every year, it's simply how things are.

In the UK right now, there is a boy in foster care, attending school. He was discovered wandering the streets. Despite having him right there to interview, the police can't figure out who he is. That's the way it is in the real world; this isn't a mystery novel where some genius figures out every puzzle and solves every problem.

Chandra Levy was last seen about three months ago, and her parents want to know what happened to her. That's completely understandable, and if I were them I would, too. It happens to be the case that they are wealthy, and they've hired a publicist whose job it is to keep her story on the front pages by trickling out new developments about her on a regular basis. They're doing this for three reasons: to embarrass and pressure Representative Condit (who they suspect of being complicit in her disappearance), to keep her picture in the newspapers of America in hopes someone will recognize her and provide a lead, and to light a fire under the DC police so that they'll use a disproportionate amount of their resources on this case. So far only the second one has failed, which is unfortunate because it was the most important one. But they've been notably successful at toasting the DC police, who have undertaken foolish things like a mass search of DC parks even though they don't have any reason to believe that she might be found in them. The DC police are starting to make "probably can never be solved" noises and have called off the search, and that is as it should be. They'll keep working on this case at a reduced level, but if they haven't been able to turn up a lead in the two months they've been working hard on it, then they probably never will by straightforward detective work. Further concerted effort is going to be a waste. It's going to take a fluke; someone will stumble on unknown human remains someday, and tests might identify them as belonging to her.

This is not the answer that her parents want to hear. I'm sorry for them. But Chandra Levy is not the only person in DC who needs the services of the DC police, and it's time for them to get back to work on all the other cases in their jurisdiction. It's also time for the newspapers of America to stop lunging for the bait every time the Levys' publicist tosses another hook in the water. This story is now old until and unless something really important develops -- like discovery of a corpse. (discuss)


Stardate 20010801.0018 (On Screen): I'm getting confused about just what today's House vote on human cloning really means WRT fetal cell research. Most of the coverage I've read suggests that the House vote implicitly covered both issues, but this CNN report suggests that they're being handled separately. Apparently today's vote was on a bill which would make it illegal to "clone" a human, with heavy fines and prison time for violators. I actually support some of that. I think that the idea of trying to create a child who is genetically identical to an adult is a road we do not want to travel. On the other hand, this bill would also make illegal any attempt to develop a process whereby someone's own cells could be cultured to grow a new heart or kidney or other organ which could then be transplanted into them without any danger of rejection. I think that even without a ban it would be a long time before this was possible, but this would ban any attempt to make it happen at all. I think that is wrong.

The other thing this would ban is deliberate attempts to harvest human eggs to be fertilized in order to create cells for fetal research. It would not ban the use of spare fetuses left over from in vitro fertilization clinics, which is the primary source now for fetal cell research. What's not clear to me is how it would affect research on adult stem cells; it may also make that illegal.

The margin of victory in the House on this bill was much greater than the Republican majority, and I know that some Republicans opposed this bill, so a great many Democrats must have voted for it. The Senate Majority leader has stated that he favors this bill. It remains to be seen how many Representatives and Senators will vote the other way when the issue of fetal stem cell research finally comes before them. The indications are that it will not go the same way. In the mean time, our President still hasn't issued a decision on fetal stem cell research, and I wish he'd get on with it. What is he waiting for? (discuss)

Update 20010801: It looks as if the House vote was a show-vote, an opportunity for Representatives to take a stance for the benefit of the voters back home, secure in the knowledge that the other chamber will reject it. This happens now and again, and it makes more sense of this. The Washington Post speculates that there is a good chance this bill will be defeated in the Senate.


Stardate 20010731.1855 (Crew, this is the Captain): I was just sitting in a waiting room leafing through a news magazine and it had a side bar about Lance Armstrong, full of impressive numbers like how much food he was going to be eating during the Tour de France (it was published before the race) and how many calories he would burn and how much water he'd consume, and the last factoid was that he was going to generate 97.2 million watts "enough to power his home town for 2 hours and 28 minutes". 97 megawatts is an impressive power output for a human, don't you think? Puts a whole new meaning to the phrase "explosive breakaway from the pack", don't it? That's comparable to the energy release of burning TNT, except that only lasts for a fraction of a millisecond. (I believe they meant joules, not watts. Like as not someone said "watt-seconds" and the reporter omitted the time period.) (discuss)
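A quick sanity check supports the joules reading: a rider averaging a few hundred watts over the roughly ninety hours of a Tour puts out something on the order of a hundred million joules. (The 300 watt average and 90 hours of riding are rough assumptions on my part.)

```python
# Back-of-the-envelope check that the magazine's figure only makes sense as
# energy (joules, i.e. watt-seconds), not power. The 300 W average and the
# ~90 hours of riding are rough assumptions.
average_power_watts = 300
hours_of_riding = 90

total_joules = average_power_watts * hours_of_riding * 3600
print(f"{total_joules / 1e6:.1f} million joules")   # 97.2 million -- right on the magazine's figure
```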


Stardate 20010731.1112 (On Screen via long range sensors): David Roos writes "I thought we had a deal." We would write content for the web, and you readers would click on the banners to make our advertisers happy. I'm sorry, but there was no deal. Nor is there any such deal in any other medium. Does Mr. Roos read every advertisement in each magazine or newspaper he buys? Does he sit in rapt attention and listen to every advertisement on TV interspersed in the shows he watches? No fair getting up and going to the bathroom, now; do that during the show content so that you won't miss the ads. After all, they're the ones paying for it, right? (discuss)


Stardate 20010731.0901 (On Screen): It's just too easy for an old fart like me to underestimate teenagers. While a fair number of them are airheads, there's depth there, too. Jon Wilcox is (I believe) 17, and this is a link to his site which he recently started after being banished from WankerCounty by someone's parents. (It's not clear whose. WankerCounty is one of my regular reads; it's a collective effort by about six high school guys.) Jon is, among other things, an artist and he's good! It's a fine thing that there is this kind of teenager around; without them we're all doomed. (discuss)


Stardate 20010731.0758 (On Screen): This article describes the new wonderful disposable phones you'll be able to buy soon. Use them for an hour and then toss them. Well, no, not really. Yes, this may happen, but this is not the future of cell phones. The reason is that these phones will be for AMPS, the old-style analog cell system. While it's the most broadly deployed protocol there is, it's also enormously inefficient in its use of spectrum, and all the 800 MHz service providers in the US are gradually switching their 800 MHz spectrum over to one of the digital standards (usually IS-136 or IS-95) because they can make more money this way. (Each MHz of licensed spectrum can carry five to fifteen times as many calls with the digital standards.) It's unlikely that anyone will develop disposable digital phones because they're a lot more expensive to make; who would buy a disposable phone that cost $100? (discuss)
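
(For anyone wondering where a multiplier like that comes from, here's the rough shape of the arithmetic, with channel figures from memory rather than from the article. IS-136's raw gain is about 3x per carrier; the top of the five-to-fifteen range belongs mostly to IS-95, whose spread-spectrum design lets every cell reuse the same frequencies:)

    # Rough calls-per-MHz comparison for the 800 MHz band (my numbers, not the article's).
    amps_calls_per_carrier  = 1      # AMPS: one analog call per 30 kHz channel
    is136_calls_per_carrier = 3      # IS-136: three TDMA time slots in the same 30 kHz

    carriers_per_mhz = 1000 // 30    # about 33 carriers fit in each MHz

    print(carriers_per_mhz * amps_calls_per_carrier)    # ~33 analog calls
    print(carriers_per_mhz * is136_calls_per_carrier)   # ~99 digital calls
    # IS-95 doesn't break down this neatly; its capacity depends on loading,
    # but it's usually credited with roughly ten times AMPS.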


Stardate 20010731.0702 (On Screen): About twenty years ago, there was a schlock SF picture called "Looker" which posited the idea that a studio would hire some actor or actress to come in and permit themselves to be digitized, after which computer synthesis would allow advertisements or movies to be made with synthetic versions of that actor forever. It's a fun film, though not high art.

The future is now, or at least getting there. They are actually working on doing exactly that, at least with voices. Starting with on the order of 40 hours of spoken material from a given person, the technique would piece fragments together to create whatever speech you want in that voice. I'm a little skeptical that they would have the ability to control intonation and expression this way; they might be able to create the word sequences they want (or even to synthesize words which are not among those recorded) but controlling emphasis would be harder. Still, it's not impossible and they're just getting started.
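
(As I understand it -- and this is my sketch, not necessarily how their system actually works -- the core of the trick is lookup-and-splice: label small fragments of the 40 hours of recordings, then assemble new utterances by stringing fragments together. The hard parts, smoothing the seams and controlling intonation, are exactly what this toy leaves out:)

    # Toy concatenative synthesis. Each "fragment" here is a short list of
    # numbers standing in for real audio samples; the names are invented.
    recorded_fragments = {
        "hel": [0.1, 0.3, 0.2],
        "lo":  [0.2, 0.4, 0.1],
        "wor": [0.3, 0.1, 0.5],
        "ld":  [0.2, 0.2, 0.0],
    }

    def synthesize(fragment_names):
        samples = []
        for name in fragment_names:
            samples.extend(recorded_fragments[name])   # splice end to end
        return samples

    print(synthesize(["hel", "lo", "wor", "ld"]))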

There have been several movies now which were completely computer-synthesized, most famously "Toy Story". With the release this year of "Final Fantasy", though, that process has taken a substantial leap because this is the first time that they've actually tried to create realistic-looking protagonists. No one will be fooled, but it's an amazing achievement, and if you look at how far they've come in just ten years it becomes evident that within another ten it will be possible to create synthesized movies indistinguishable from reality. Actually, in some areas they are doing those kinds of things now; computer-synthesized special effects are nearly ubiquitous in action films nowadays, and it's become the tool of choice for many kinds of TV advertisements. In some cases it's become quite difficult to tell where reality ends and synthesis starts.

But until now computerized animated films have always used real human voice-actors. Read the credits for "Toy Story" and you'll see Tom Hanks and Tim Allen. Look at the credits for "Shrek" and you see Mike Myers and Eddie Murphy. Will that be the case in 10 years? One of the problems that producers of family sit-coms have is that child actors grow up. Over the course of a five-year run, adult actors don't really change all that much, but kids will change quite dramatically, and that cute four-year-old becomes an obnoxious nine-year-old. But not if they are synthesized. New Shirley Temple movies, anyone?

There are two future steps to watch for. First will be the emergence of a TV show where the actors are all digitized from real humans. The humans will come in and be paid for an extended digitizing session, where their bodies and movements are captured along with their voices. They'll then be handed a big check and no longer be needed. In this, as in so much else, new technology will demand new law. I think what will eventually emerge is a licensing scheme where the original actor will be paid a royalty each time their synth is used, and may be able to specify within limits what kind of material the synth can be used in, e.g. "no porn".

This will lead to the second step: the emergence of a completely synthesized star, not based on any specific human at all. There's already been interesting work where pictures of a large number of people's faces were captured and averaged, to create a synthetic face not belonging to any of them. Interestingly, if you start with forty faces and do this, the resulting average face is considered "good looking" by people no matter who the original forty faces were. After a reasonable number of these actors have been digitized, someone will average them to create a completely new synthetic person unlike anyone who actually exists (except, perhaps, by coincidence) and may do the same thing with their voices. Or perhaps they'll just hire forty people off the street to come in and be digitized. Issue a casting call for forty Chinese women and create a generic beautiful Chinese synth actress. Or forty African-American men. Do the same thing three times with the same ethnic group and you'll end up with three synths, all different, all attractive. And such a completely synthetic actor could make movies essentially forever without aging or being hurt by doing his own stunts. Of course, advances in the state of the art could make a given version of the mesh obsolete, but you'd retain the original footage used for the digitization and could create new meshes as necessary to keep up, while preserving the distinctive look of your synth.
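
(The averaging step itself is trivial once the faces have been aligned with one another; the alignment is where all the real work is, and I'm waving my hands at it below. The arrays here just stand in for real aligned grayscale photos:)

    import numpy as np

    # Average forty "aligned faces" pixel by pixel to get one synthetic face.
    rng = np.random.default_rng(0)
    faces = [rng.random((128, 128)) for _ in range(40)]   # stand-ins for aligned photos

    average_face = np.stack(faces).mean(axis=0)
    print(average_face.shape)   # (128, 128)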

Makeup and costuming are obviously no problem, and if you need to age the person for a part, you no longer mess with rubber prostheses; just apply an aging algorithm to the mesh and you're set. (For that matter, with proper analysis it might be possible to do the opposite and take an adult and produce a child version.) If you've got a female synth and she isn't built right for the part you've cast her in, just twist a dial and grow the relevant parts. No more need for falsies or creative support clothing. Need a hunchback or a scar? Does the part require an amputation? It's all just ones and zeros. Want to cast your star into an ethnic role they don't match? No problem. Take your Nordic mesh and run the "oriental" algorithm on it.
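
(The "twist a dial" part is not an exaggeration, at least in principle: a morph is just a weighted blend between the base mesh and a target mesh, vertex by vertex. The vertex data here is invented for illustration:)

    import numpy as np

    # Blend-shape morph: slide each vertex part of the way from the base
    # mesh toward its position in a target (say, "aged") mesh.
    base_vertices = np.array([[0.0, 0.0, 0.0],
                              [1.0, 0.0, 0.0],
                              [0.0, 1.0, 0.0]])    # invented base mesh
    aged_vertices = np.array([[0.0, -0.1, 0.0],
                              [1.1,  0.0, 0.0],
                              [0.0,  0.9, 0.1]])   # invented "aged" target

    def morph(base, target, amount):
        # amount = 0.0 is the base, 1.0 is the target, 0.5 is halfway
        return (1.0 - amount) * base + amount * target

    print(morph(base_vertices, aged_vertices, 0.5))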

Which brings up an interesting question about the audience: how much of the experience of enjoying these kinds of entertainments comes from the fact that we grow to know and like specific actors? Will we enjoy these kinds of things as much if the actors in them are different each time, or if they don't exist at all? I think we will. One of the most beloved characters in the history of cinema is Bugs Bunny, who clearly has no reality outside our minds. At the time he was created he had to borrow his voice from the great Mel Blanc, but in future such a character would not need to, because his voice could be synthesized right along with his visage. Who's to say that a completely synthetic character in future could not become just as beloved as Bugs, or Charlie Brown? (And how many kids have you seen carrying around "Woody" and "Buzz" dolls?) I don't think that this will make real human actors and actresses obsolete in film, but synths will arise and rival human actors in popularity within thirty years. Some may be realistic and idealized, some may be caricatures and some may be