USS Clueless - chains of email

Stardate 20040121.1111

(Captain's log): I thought you might be interested in some of what's been appearing in the virtual mail bag lately. These letters are not necessarily representative of the whole, but they are rather distinctive. Given the mixture of quotes from me and various correspondents, I'll color code their words and my words.

Duncan writes:

I'm quoting you here, from yesterday:

"... we can assume there's a conspiracy to spread a big lie? And where we can safely dismiss the opinions of anyone who repeats it?"

That way lies paranoia. You have apparently decided that this is civil war, not just a policy debate. The logic that "anyone who's not with me is against me" leads to solipsism. Opponents sometimes have helpful things to say, sometimes by accident. I don't think you should blithely dismiss their opinions. (In fact, I would think even enemies should be very carefully observed.)

And who are all of these "leftist groups" you see around you? Every honest person in the world has had to admit that Marx's arithmetic doesn't work out, and that the labor theory of value is basically a pretext for theft and tyranny by the powerful. That's an argument that we won quite a while ago. No country that wants to raise capital anymore can even pretend to believe it. (Those that do, like the governments of Venezuela and Haiti, are spiraling down.)

You know, if I were you I would take some time off from this shit, because it's distracting you from your fine technical and scientific writings, which I and thousands of others found very useful. But every time I check back (more and more infrequently) you're going off about the perfidious French or evil "leftists." Neither of these threats is among the highest priorities for me. And I just basically don't find the engineering/analytical approach to foreign policy that useful (no matter how rigorous), because it doesn't allow enough for surprise.

I hope this helps.

It helps to demonstrate that different people have different interests and should read what interests them. It demonstrates that no single piece of writing or individual author has universal appeal. And it shows that Duncan would probably be wise to visit this site less and less frequently, since he seems to take little pleasure in reading it.

But it doesn't demonstrate that I should stop writing about what is important to me. (And as they say, just because you're paranoid doesn't mean they're not out to get you.)

Duncan also demonstrates naïveté in assuming that Marxist principles are universally acknowledged as wrong. Marxism is alive and unwell and living under the assumed name of "Social Democracy". And it's still failing. But to the extent that Social Democracy has made compromises with and has tolerated limited degrees of capitalism, its failure has been slower and less dramatic.

Reader Simon took exception to this section of a post from earlier this month:

It seems as if each new generation of Americans is forced to prove that the fire in our hearts has not gone out, and the steel in our spines has not dissolved.

He wrote:

Uh, the fire you are looking for is in the belly. The life/death center. The center you propose is the center of relations.

In the Gurdjeff/Ouspensky system what you propose would be considered the wrong use of centers.

He wrote again the next day, and I then exchanged a couple of emails with him on the subject:

The proper use is fire in the belly.

You might want to look into Gurdjeff. He had amazing insights into human psychology. Mixed with a lot of other not so useful stuff.

You write what you want to on your site.
I will write what I want to on my site.
The "proper use" is whatever I want to say.

You may not be entirely correct on this point.

Common understandings allow communication of subtle ideas.

I have the right to fail to communicate on my site.

Well sure. If that is what you wish to accomplish I think in this case you have succeeded.

I was under the mistaken impression that people publish to communicate. I will try to keep in mind that opacity is your real purpose.

I guess you have finally caught the pomo disease. My condolences.

It doesn't appear to have occurred to him that I might not care about "proper use", or that I as an author might not think that "fire in the belly" had the same connotations for my audience as "fire in the heart".

Quite frankly, I had never even heard of George Gurdjieff or Peter Demianovich Ouspensky until Simon wrote about them. And now having looked into it, I'm still confused about why it might be that I am somehow obligated to use metaphors established 80 years ago by two Russian mystics. (Or one each from Russia and Armenia.)

The purpose is communication, and I felt that my phrasing communicated the idea I wished to communicate. "Proper use" and communication sometimes come into conflict, and when they do communication must take precedence. Thus I occasionally deliberately misspell words, or write sentences which contain no verbs. Sometimes I use profanity for emphasis. And sometimes I deliberately mangle standard multi-word phrases such as this one.

Common understandings sometimes allow communication of subtle ideas, but sometimes there are ways in which "proper use" results in subtle alteration and distortion of the message.

Simon also missed the deeper point I was trying to make: it's my site, and it's also a web log. This isn't supposed to be about writing polished papers suitable for publishing in academic journals or big-circulation magazines; it's rough and fast and really rather informal.

Readers may find that they disagree with my opinions. I hope they do at least some of the time; that's part of the intellectual attraction of this medium. When they do, some of them write to me, and that's good, too, because sometimes I learn from that and end up changing my own opinions.

But part of why I like this medium is that I don't have to deal with an editor. I can write what I want, whenever I want, and post it here. It's OK for readers to disagree with me on substantive points, but I consider it presumptuous for them to try to tell me how I should have expressed my own point of view.

In general Western usage, references to our hearts and our bellies carry symbolic and metaphoric meanings: we tend to associate the heart with strong feelings, and the belly with concerns relating to mundane survival and self-interest. That's because we get hungry in our bellies, but when we think about that which is most spiritually important to us our hearts beat faster. It's actually all in our brains, of course; the heart is just a pump, and the belly contains organs mostly concerned with processing food.

A "fire in the belly" to me connotes a crass self-interest; the idea that someone is motivated by a "fire in their belly" sounds like they're strongly motivated to defend their good position, and to hell with everyone else. That may not be how Ouspensky and/or Gurdjieff interpreted it, but that's how I thought my readers would interpret it. On the other hand, a "fire in the heart" connotes a strong dedication to a higher principle.

For all I know I'm the first person ever to use the phrase "fire in the heart"; or it may be something that's appeared many other places. That's beside the point. Irrespective of whether that was common use or not, I felt, and still feel, that it better communicated what I wished to say.

But whether it did or did not, the deeper point I tried to make to Simon was that it is my site, and the purpose of it is to let me say what I want, in the way I think best. Maybe I succeed and maybe I fail; but what's important is that I have the freedom to try.

I like receiving letters about substantive disagreements, but this kind of nitpicking is a preposterous waste of time. I don't really want to receive letters informing me of mispelings, or about when I uses bad grammar. Spelling police and grammar poliec and metaphor police, please kindly send your emails to /dev/null.

(Not to mention the simile police, the allegory police, the parable police, the isomorphism police and the analogy police.)

And I damned well don't want to receive letters from readers who try to offer guidelines about what I should and should not write about.

I might mention in passing that while Duncan dislikes my political writing and wishes I would stick to technology, other readers tell me they like the political stuff. In a few cases it goes beyond that. Stephen wrote (in part):

Hello Steven: use your arguments in my classes occasionally -- gratefully.

Good heavens! What do you teach?

I teach in the Humanities -- civilisation history, future studies, & great thinkers in context. Co-incidently, I'm a lesser version of you: not an engineer, but before my Ph.D I was a network systems tech (IT diploma back in 1979).

I link to your posts in the courses' maillists where they match the course that week for the upcoming seminar discussion. They make my prep work a breeze those weeks: I owe you ;--)

[Blush]

Actually, this is not the first time I've had this happen. One of the greatest thrills I've had during the time I have been writing for this site was when I received a letter from an instructor at the Command and General Staff College asking for permission to use this post about "egoless programming" in a class he was teaching.

I was flabbergasted, and incredibly flattered. Naturally, I granted permission immediately. A couple of months later he wrote again:

A couple months ago you graciously gave me permission to use your essay on software development teams in my military staff training course here at the Command and General Staff College (CGSC) at Ft Leavenworth. Just wanted to give you some feedback. That essay generated some of the best professional discussions I have witnessed in 2 years of teaching here. It was everything and more that I had hoped it might be as we compared and contrasted our experiences as staffs with your model for software development teams. Great stuff.

I'm happy to hear that. It's gratifying to know that what I'm writing actually may be of practical value and not just amusement for me and my readers. It is particularly gratifying to know that I may be making some contribution to the military; I owe you guys a lot.

It does seem to me that implementing the kind of thing I'm talking about in a military setting might involve special problems. For one thing, if something like walkthroughs was used, and if the writer of the document was senior to the reviewers, then the criticism process might be interpreted by some as a breakdown of discipline. I would hope that an enlightened officer would understand that this is not true, but I guess you'd have to make a presumption that "permission to speak frankly" is in operation during a walkthrough.

The other point which is critical about the process is that being on the receiving end of a walkthrough not be seen as a threat. It was a vital part of the process that managers not be permitted to come to the design reviews, and that discovery of mistakes and problems not have any potential for affecting someone's next performance review. Officers are just as ambitious as anyone else, and if participating in this kind of process is perceived as damaging anyone's chance of promotion, it's not going to work.

A lot of that is bootstrapping; when you're first initiating this process in a group and none of them have done it before, it's quite uncomfortable. But when everyone's been on the receiving end of it a couple of times, and when they realize just how much it actually strengthens their final result, you get a shift from apprehension to acceptance and even positive feelings about the process. You come to think of the reviewers not as critics but as helpers. And it eventually becomes routine rather than traumatic.

1. Cmd and General Staff College is a year long staff course for our Majors who will head out to be battalion thru division level staff officers. A select number stay for a second year of deepthink at the School for Advanced Military studies (SAMS); these were the so-called "Jedi Knights" of Desert Storm fame. CGSC is one step below our War Colleges where we train senior LTCs and COLs who have successfully commanded battalions and who are being groomed for higher levels of command. There's usually 5-7 years between attendance at CGSC and War College.

2. We use our senior officers more for the "sanity check" and "find the fatal flaw" and "provide guidance role" more so than for the "develop a great idea" role. what I found intriguing was how the software team invested so much in ongoing QC and how important it was to manage the interface between modules. We find that to be troublesome as we typically receive guidance, scurry to our own technical area, develop implementing instructions (for artillery or logistics or signal) and often do not bring the entire order together for integration checks until well along in the process. It's useful to have reusable modules (our standard operating procedures) to facilitate rapid orders production.

3. The discussion was interesting on how to generate a fear free environment and converting the culture to one where we are happy to find errors early as an important part of building a high performance staff. The senior guy sets the tone for this climate and it is crucial that he model the expected behavior when errors are found, especially given the hierarchical nature of our officer corps. It can be done effectively, though. We also do expect our junior technicians to be able to call BS on a bad plan of a senior but to do it in the spirit of shared warriorship and integrity. If done professionally with common commitment to a good product that soldiers will have to execute, it's not a problem.

I receive a lot of letters suggesting topics for posts. I don't mind that, but more often than not I can't really comply. As I wrote here, I can't make a deliberate choice to write about a particular subject. If I'm not inspired and try to force it anyway, the result is invariably total crap.

Yesterday Jason wrote:

I don't know if you read The Long View, but the blogger posted his ideal political platform and that made me wonder what yours would be. Would you be willing to post such a thing?

I don't really have one. There are a lot of political issues on which I don't have very strong opinions, or about which I know very little, or which do not really affect me strongly. I also tend to be somewhat realistic about politics, and I don't ever expect to see a candidate whose stated position includes nothing I disagree with. It's all about compromises, because I have to choose between candidates all of whom I disagree with to a greater or lesser extent. I have to decide which issues are more important to me and which are less, and try to pick the candidate who comes closest on the issues I value most. Right now what's most important to me is the safety of this nation against foreign attack. So even though I feel very uncomfortable with many aspects of Bush's domestic policies, I consider that much less important than the issue of foreign policy and conduct of the war, and on that he gets top marks in my book. Certainly his marks are vastly better than those of any of the Democratic candidates, and that's why it's virtually certain he'll get my vote.

I guess that makes me something of a single-issue voter, even if that issue is a really big one. But that also means I can't really create a comprehensive platform, because I don't actually have comprehensive opinions or a distinct overall ideology.

I mentioned that I like receiving letters which disagree with me, but some of them are less treasured than others, if perhaps more amusing. After "Hesiod Theogeny" linked to this post (where I criticized a WaPo writer for continuing to propagate the "imminent threat" canard) I received several letters expressing deep disagreement which featured shallow arguments. This, from Andrew, was typical:

It seems you are really splitting hairs. Even if the words imminent threat were never uttered it is the impression that EVERYONE had due to the way the administration sold the "new product". Now it turns out that the massive politicization of the intelligence has come back to bite us in the butt. No one believes anything out of the US's mouth and that is causing difficulties in securing Chinese help with NK.

The problem with this argument can be summed up in one word: Qaddafi.

Despite such empirical demonstration that Andrew is wrong, there's another problem with the entire "Bush lied!" argument which has always bothered me: if someone makes a statement that is untrue, are they a liar? Strictly speaking, the answer is "not necessarily".

I think that one of the unsung heroes of western civilization in the last fifty years is Martin Gardner. Back when Scientific American still had a top-notch reputation as a popular journal of science and engineering [before such things as its notorious slash-job on Bjorn Lomborg], Martin Gardner's monthly "Mathematical Games" column, which he began writing in 1956 and continued to write until 1986, was one of the best parts of the magazine. Gardner did a better job of making clear just how fascinating mathematics can be than any other author I have ever encountered, and his columns were always interesting and sometimes immensely challenging and intriguing. I read it every month when I was a teenager, and he helped inspire me and countless thousands of other nascent geeks and nerds. I think you would be hard-pressed to find anyone in hard science or engineering over the age of 40 who had never read or been inspired by Gardner's column, and despite the fact that he stopped writing it 17 years ago I suspect many younger than 40 also are familiar with and were inspired by Gardner's work.

Again and again I find myself flashing back to things he wrote about. That's happening here: Gardner spent a few columns one time exploring various puzzles relating to the "tribe of truth-tellers" and the "tribe of liars", and how to get correct answers out of someone whose tribe was unknown.

One form of the puzzle is this: you face two people and know that one is from the truth-teller tribe and the other from the lying-tribe, but you don't know which is which. You stand at a fork in the road and know one road leads to a village and the other does not, and need to determine which is which because you need to go to the village. If you are only permitted to ask one question of one man, how do you learn which road is the correct one?

While pointing at one of the men, you ask the other this question: "If you were him, which road would you say was the one which led to the village?" Then whichever road gets pointed to, you follow the other one.

Or if you meet only one man at the fork and don't know which tribe he's from, and are only permitted one question, then you ask, "If you were a member of the other tribe, which road would you say led to the village?" and again take the road not pointed to.
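To satisfy my own inner engineer, here is a quick Python sketch of the puzzle. It is my own illustration, not Gardner's; the names and the arbitrary choice of which road leads to the village are mine. It simulates both variants and shows that whichever man you ask, the road he indicates is always the wrong one, so taking the other road always works.

VILLAGE_ROAD = "left"   # arbitrary choice: "left" is the road to the village

def other(road):
    return "right" if road == "left" else "left"

def answer(tribe, true_road):
    # What a member of `tribe` says when asked which road leads to the village.
    return true_road if tribe == "truth-teller" else other(true_road)

def nested_answer(asked, hypothetical):
    # Ask a member of `asked`: "If you were a member of `hypothetical`,
    # which road would you say leads to the village?"
    inner = answer(hypothetical, VILLAGE_ROAD)   # what the hypothetical man would say
    return inner if asked == "truth-teller" else other(inner)

# Two-man variant: point at one man, ask the other.
for asked, pointed_at in [("truth-teller", "liar"), ("liar", "truth-teller")]:
    road = nested_answer(asked, pointed_at)
    print(f"Asked the {asked}: he points {road}; take the {other(road)} road.")

# One-man variant: ask him about "the other tribe", whatever his own tribe is.
for asked in ("truth-teller", "liar"):
    road = nested_answer(asked, "liar" if asked == "truth-teller" else "truth-teller")
    print(f"The lone {asked}: he points {road}; take the {other(road)} road.")

Either way the answer passes through exactly one inversion, which is why it always comes out pointing at the wrong road.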

In response to these solutions, Gardner received and published a fascinating letter from someone who pointed out that there was more than one kind of liar. There was the strict liar, who always expressed the logical negation of the truth, and the creative liar, whose intention was to deceive. The letter pointed out that Gardner's solutions only worked with strict liars; creative liars would recognize the intent of the question and would deliberately point to the road that led to the village, since they would know that you intended to follow the road they did not point to. So if there were three tribes, one of truth-tellers, one of strict liars and one of creative liars, you were stuck.

The letter writer pointed out that to some extent there was no way to guarantee to defeat a creative liar, since you were involved in what amounted to a game with him. Rather than responding literally to your question as a truth-teller or strict liar would, the creative liar would analyze your strategy from the way you formulated the question and would seek to act in a way that defeated it. As such, the best you could hope for was to increase your chance of determining the correct answer, so that you were more likely to determine the right answer than to be deceived.

Gardner's reader suggested the following question as the one which gave the greatest chance of success: "Did you know that they're giving away free beer at the village?" You would ignore any answer, but you would follow him as he then set off along one of the roads.

Asked this question, a truth-teller would say "No", and would then head towards the village. The strict liar would say "Yes", and also head towards the village. And the creative liar would face a problem.

Gardner's reader said that his best answer would be to say, "Ugh – I hate beer" and then head towards the village, hoping that you might not follow him. On the other hand, if his dedication to deception trumped his interest in free beer, he might deliberately follow the wrong road anyway and lead you astray. In that case you'd at least get the satisfaction of knowing that he couldn't be sure he was missing out on free beer.

With respect to the arguments used last year to justify the invasion of Iraq, the question of Iraqi WMDs was only one of several, and was not actually the most important. I have written about that many times here, and in this document I offered an extensive list of arguments for why Iraq was invaded.

There were a number of reasons why the question of Iraqi WMDs occupied such a central place in the political discussion, but there was never a correlation between the amount of attention paid to various arguments and their importance. And there are a number of other points that can be made about the entire question of WMDs and the process of deciding whether to invade. But what I wanted to talk about here was the specific question of whether Bush lied. Is it actually correct to refer to Bush's claims regarding Iraqi WMDs as being "lies"?

Let's assume, for the sake of argument, that WMDs were the sole reason for the invasion, even though that isn't even remotely true. Let's further assume, for the sake of argument, that Iraq had indeed fully destroyed all its WMDs and all its banned equipment, and that Saddam had no intention whatever of reviving Iraq's WMD development programs after the international political heat had been alleviated. Ignore for the moment the fact that there was a nearly universal consensus that Iraq still had some WMDs, a consensus shared by UN agencies, by international opponents of the war such as France, by sundry NGOs, and by the Clinton administration right up until Bush's inauguration.

So if Bush made the claim that Iraq still represented a threat because of its WMDs, did that make him a liar?

Not necessarily. It is not the case that everyone who utters a falsehood is lying. Someone can only be held to be a liar if they knew the truth at the time they spoke. A truth teller knows the truth and makes accurate statements about it. A strict liar knows the truth and makes statements which contradict it. A creative liar knows the truth and makes statements which are intended to deceive the listener. But there are several other possibilities; those three cases are not comprehensive. In particular, a person who is convinced that what they are saying is true is not a liar even if they're wrong.

Someone who is misinformed, and who genuinely believes that misinformation, is not a liar simply because they repeat the misinformation or act on it. They can be accused of many things, such as gullibility, but not of lying.

All of the rhetoric about "lying" obscures the fact that this is an inductive process, not a deductive one, and words like "truth", "falsehood" and "lie" have to be interpreted entirely differently in the hazy world of inductive logic. As a practical matter, no one in the US government (or anywhere else) had conclusive evidence one way or the other about whether Iraq had WMDs or retained means and motivation to continue developing them once it became possible to do so. In fact, after we invaded evidence developed that even Saddam didn't truly know.

All the Bush administration had to work from were hints and calculations and imperfect reports from sources of less-than-ideal credibility; that's how it usually is in intelligence work. It's not crystal clear vision; you're usually trying to identify hazy shapes in the fog.

In other words, at the time Bush made the kinds of statements which my leftist friends have been referring to as "lies", what he had access to were reports which said that Iraq might still have any or all of those things, along with at least some degree of calculation of how likely it was.

And even if those reports and calculations were wrong, or if the calculated probability was low, that doesn't mean that acting on them was wrong.

A simplistic heuristic in this kind of situation would be that one should act on suspicions of threats if the likelihood that the threat exists is better than even money, and not act if it is less than even money. That heuristic is deeply flawed because it doesn't take into account the fact that the consequences of being wrong each way usually are not equivalent.

There's much that can be said about the whole question of false positives and false negatives, but I don't want to fully explore it here. For the moment, let me state a modified heuristic which is far better than the one I offered above: when one has information whose veracity is doubtful, one should act on it if the probability of a false positive multiplied by the cost of acting on a false positive is less than the probability of a false negative multiplied by the cost of inaction in the case of a false negative.

For example, assume that doctors identify some person in a big city as possibly being infected with some terrible and infectious disease, e.g. Ebola. Should they quarantine that person before they can run diagnostic tests which would provide certainty, or not?

If they do quarantine the patient and it turns out he wasn't actually sick with the disease they fear, the consequences are extremely small. On the other hand, if they don't quarantine him and it turns out he actually is infected, he could become the seed point for a pandemic which could kill thousands. So even if the chance is low that he's infected with Ebola, they should quarantine him until they are sure he isn't really infected, and that's what they do. They act on suspicion of infection with a serious disease even if the probability is very low.
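The rule is nothing more than a comparison of expected costs, and it can be written out in a few lines. Here is a minimal Python sketch of it with the quarantine case plugged in; the probability and the costs are numbers I have made up purely to show the shape of the arithmetic, not real estimates.

def should_act(p_threat, cost_if_needless, cost_if_missed):
    # Act if the expected cost of acting on a false positive is less than
    # the expected cost of doing nothing about a false negative.
    return (1 - p_threat) * cost_if_needless < p_threat * cost_if_missed

# Quarantine case: even a small chance of Ebola justifies acting, because a
# needless quarantine is cheap and a missed infection could seed a pandemic.
print(should_act(p_threat=0.01,            # 1% chance he's really infected
                 cost_if_needless=1,       # a few days of unnecessary quarantine
                 cost_if_missed=100_000))  # thousands of deaths
# prints True: quarantine him.

The point is that the answer is driven almost entirely by the lopsided costs, not by the probability.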

On the other hand, consider the case of crewmen in a ballistic missile submarine during the height of the Cold War. That sub was there as part of our nuclear deterrent, and the men on that sub had to be ready to launch their missiles because enemy knowledge that they were ready would make it unnecessary for them to do so. It was a truism among those who served in the boomers that if they had ever actually had to launch, then they would have failed in their mission.

If there had been an order to launch, it would have arrived by radio. Suppose that the crew of such a submarine received what looked like an order to launch, but had doubts about whether it was genuine?

If the order was false, but they treated it as true and launched their missiles, they would start WWIII. On the other hand, if WWIII actually had already started, and if the order was true but they did not obey it, all that would mean is that WWIII would only be a monumental catastrophe instead of a grotesquely monumental catastrophe. In other words, the consequences of acting on a false positive were incomparably worse than the consequences of inaction because of a false negative. Once missiles were already flying, then whether that particular sub joined in ultimately didn't matter much.

So every aspect of the process of transmitting such an order to a missile sub, and of verifying it once it was received, was geared towards reducing the risk of a false positive as much as it could be. If there was any doubt whatever about such an order, the crew of the sub would not launch its missiles.
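Run the same expected-cost comparison for the doubtful launch order and the inequality flips. Again, the numbers below are invented placeholders of mine, meant only to show why the two cases come out opposite ways.

# Same comparison, applied to a missile sub with a doubtful launch order.
p_order_genuine = 0.5          # even a coin-flip chance of a genuine order
cost_false_launch = 1_000_000  # launching on a spurious order starts WWIII
cost_missed_launch = 1         # one sub's missiles added to a war already underway

launch = (1 - p_order_genuine) * cost_false_launch < p_order_genuine * cost_missed_launch
print(launch)
# prints False: with any real doubt, don't launch.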

What if, a year ago, there was thought to be a non-trivial chance that Iraq had WMDs, even though it was not known for certain? If we did not act on the suspicion, there were a large number of ways things might have developed, and in many plausible scenarios eventually there would be widespread death and destruction here in the US, although in some cases it might not happen for a very long time. So the consequences of inaction in case of a false negative were extremely bad. As to the consequences of acting on a false positive, we're apparently seeing them now.

My considered opinion is that the consequences of inaction based on a false negative (i.e. what would happen if we did nothing and it had turned out that Iraq still retained WMDs or the means and intention to develop them) were vastly worse than what has actually happened, and that means that even if there had only been a low chance that Iraq retained such things, it still would have been right to act on it.

This is a case like my quarantine example where the right decision is to act even if the probability is low, rather than like the submarine example where one should refuse to act if there was any doubt. If we had to be wrong, it was better to be wrong in invading than to be wrong in not invading.

So even in the simplistic and reductionist case where one assumes that the only argument for invasion was concern over the threat of Iraqi WMDs, it was still correct to invade.

And in fact, that was far from the only reason why it was the right thing to do.

The entire issue is being manipulated by my leftist friends in a way which makes clear that they are creative liars. They deliberately ignore the fact that it was not just the Bush administration which thought that Iraq retained WMDs or the intention and ability to develop them; UN agencies and other nations also thought so, as did the Clinton administration. They ignore the fact that post-invasion investigations have proved that Saddam had mothballed the development effort and had every intention of restarting development if the sanctions had been lifted. In other words, even if there were no WMDs in Iraq last March when we invaded, there would eventually have been WMDs in Iraq if we had not invaded. And they egregiously ignore the fact that WMDs were not the only reason to invade, or even the most important reason.

And finally, they conflate "the utterance of falsehoods" with "lying", which is wrong, and are trying to apply deductive standards to an inductive process, which is invalid.

And that is why I characterize them as creative liars; they are making statements with the deliberate intention of deceiving. Some of what they say is true, some is false, but all of it is intended to mislead, because they know the truth and seek to obscure it.

Just in passing, this argument is not something I dreamed up afterwards as a rationalization for why attacking was right even when no WMDs turned up. I wrote about this in August of 2002, long before we invaded, and even before Congress began deliberating an authorization for war in Iraq.

Part of induction is to decide not just what the chance is that a given conclusion is wrong, but also what the consequences are for a false positive or a false negative, and decide based on that which way to err. If we are incorrectly optimistic, and do not take adequate action now to prevent development of WMDs, then there's a significant chance that such a weapon would be used against us. I consider that alternative to be far worse than any other outcome scenario I have seen from anyone, an alternative worth almost any price to avoid. I do not want Atlanta turned into a radioactive crater, or any other American city, and I'm not even willing to tolerate a 5% chance of such an event. And given a historical track record of immense efforts on the part of Iraq to develop all kinds of WMDs in militarily useful quantities, and their clear efforts over the course of several years to hide them and impede all efforts to destroy them, I think that there's a high probability anyway that the Iraqi government has not given up on those ambitions.

On the other hand, if we're incorrectly pessimistic, it means we'll fight a war that probably wasn't necessary. That's certainly very bad for whatever nation we attack, but the cost to us of such an outcome is much lower than the effects of having one of our cities nuked.

Update 20040122: George sends the following quote:

"But, nevertheless, the generation that carried on the war has been set apart by its experience. Through our great good fortune, in our youth our hearts were touched with fire. It was given to us to learn at the outset that life is a profound and passionate thing. While we are permitted to scorn nothing but indifference, and do not pretend to undervalue the worldly rewards of ambition, we have seen with our own eyes, beyond and above the gold fields, the snowy heights of honor, and it is for us to bear the report to those who come after us..."

—From Oliver Wendell Holmes' 1884 Memorial Day oration

[I don't need any more examples of prior use of "fire in the heart", thanks.]

Update 20040128: However, I just ran into this case of "fire in the heart" which I think is especially good.

