2007-12-07

The Senate After 2008

Let's assume that there is a tremendous victory in 2008 for the Democrats. That is, a Democratic president and a Democratic majority in both houses of Congress. This sounds great, but what does it actually mean for the country?

The prognostications that I've seen do predict a Democratic victory, but without much change in the House, which is already in Democratic hands, and with at most four new Democratic seats in the Senate. It is this last prediction that bodes ill for the nation.

In the Senate, any senator of either party can require a 60% vote for clôture before ending debate on a bill. Since (obviously) debate must end before the bill can be voted on, this means that any senator can delay any bill in this way. If it were a matter of a traditional filibuster, where a small minority of senators decided to hold up the vote, the 60% clôture procedure would be beneficial. However, what we have seen in recent years is the emergence of a new habit whereby the minority party, as a bloc, uses the filibuster/clôture procedure to prevent action by the Senate. That is, the minority party, even though it would lose in a straight up-and-down vote, can prevent bills from becoming law whenever it wants to.

During the current term, this problem is not as great as it may become after the 2008 elections. In this term, the Republican minority in the Senate can, and does, block most legislation. The only way past them is by pandering to them, compromising strong Democratic programs to the point where (1) they may no longer fulfill the purpose for which they were intended, and (2) they may become so distasteful to House Democrats (who do not have the filibuster/clôture process) that they can no longer pass the House. However, even when a bill does pass both houses of Congress, unless it is truly bipartisan (and therefore usually weak), the President will veto it, so 'twas all for naught.

In the post-2008 world, however (assuming a Democratic sweep), the filibuster/clôture problem will become acute. Even if the Democrats pick up all four seats, there would be 53 Democrats, 2 Democrat-leaning independents, and 45 Republicans. This is nowhere close to the 60 votes required by the filibuster/clôture process, even on bills that Lieberman supports.

The public will thus have spoken overwhelmingly: they want a Democratic government, and they support the Democratic program. Yet, when it comes to passing legislation (and also certain Executive Branch appointments), the minority party in the Senate will be in a position to derail that program.

The problem is exacerbated by the relative unruliness of the Democrats. They are less likely to vote en bloc, and this already weakens the Democratic majority as a cohesive force. However, it is the Senate's filibuster/clôture procedure that will create the largest problem.

This state of affairs has several consequences. For one thing, we should be asking our Democratic presidential candidates how they will deal with this issue. There are basically two ways to do it: (1) compromise with the minority party, or (2) hold the minority party up to public shame for obstructing the will of the people. I think that both methods could work, depending on the goals of the moment, but I confess that the "public shame" approach does appeal to me. However, for it to work, there must be unity among the majorities of both houses and the White House. That is, the onus of explaining why the Senate minority is working to prevent the implementation of the will of the people must be put squarely on them. Furthermore, the majority must be damn sure they've dotted all the i's and crossed all the t's. Making a big push like this for poorly written, pork-filled legislation would backfire tremendously.

Another consequence is that the voters will probably not get what they want. There will probably be gridlock once again in Congress, and, if history is any guide, the Democrats will be blamed for it. This is something that must be discussed up front, during the election debate. The voters must understand the dynamics of the situation, and the candidates must address the issue of how their program will fare as a result. This will probably dampen the enthusiasm of voters, but at least they will know what they are voting for.

2007-11-29

Saving space

On my Macintosh, there are two sources of disk bloat: international language support, and universal binary support. When an application is distributed, there is a resource folder that contains one or more files for each language supported by the application. For example, iChat supports Dutch, English, French, German, Italian, Japanese, Spanish, Danish, Finnish, Korean, Norwegian, Portuguese, Swedish, and two variants of Chinese. That is, of the 18.5 MB taken up for language support, over 90% is for languages other than English. This phenomenon is true of most Macintosh applications, although not all support as many languages as iChat. The sum total of language files in my /Applications folder alone is currently 2.5 GB, roughly 90% of which is for languages other than English.
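As a side note, here is a rough sketch (in Python, which ships with the Mac) of how one might tally this sort of localization usage oneself. The choice of which .lproj folder names count as "English" is my own assumption, since applications use both the long and short naming styles.

    import os

    # Folders I'll count as "English"; every other .lproj folder
    # (French.lproj, ja.lproj, zh_TW.lproj, ...) counts as "other".
    ENGLISH = {"English.lproj", "en.lproj", "en_US.lproj"}

    def lproj_usage(root="/Applications"):
        """Tally disk usage of .lproj localization folders under root."""
        english = other = 0
        for dirpath, _dirs, files in os.walk(root):
            if not dirpath.endswith(".lproj"):
                continue
            size = sum(os.path.getsize(os.path.join(dirpath, f))
                       for f in files
                       if os.path.isfile(os.path.join(dirpath, f)))
            if os.path.basename(dirpath) in ENGLISH:
                english += size
            else:
                other += size
        return english, other

    en, other = lproj_usage()
    print("English: %.1f MB, other languages: %.1f MB"
          % (en / 2.0**20, other / 2.0**20))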

Another source of bloat is the universal binary. This is a method, used for executables and object libraries, of packing several different versions of a compiled program into a single file. For example, in the Macintosh world, there can be PowerPC, 32-bit i386, and 64-bit i386 versions of all compiled programs and libraries. In my /Applications folder, 452 MB are used for executables, with another 82 MB in /usr/lib for dynamic libraries, so just over 0.5 GB for this purpose, with around half or more of that used for non-native CPU architectures.
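For the curious, a universal binary is just a small big-endian header (a magic number and a count) followed by a table of per-architecture entries giving each slice's offset and size. This sketch lists the slices in a given file; the CPU-type names cover just the architectures mentioned above, and the rest is the standard Mach-O "fat" header layout as I understand it.

    import struct

    FAT_MAGIC = 0xcafebabe  # big-endian magic of a Mach-O universal file
    CPU_NAMES = {7: "i386", 7 | 0x01000000: "x86_64",
                 18: "ppc", 18 | 0x01000000: "ppc64"}

    def fat_slices(path):
        """List (architecture, size) for each slice of a universal binary."""
        with open(path, "rb") as f:
            magic, nfat = struct.unpack(">II", f.read(8))
            if magic != FAT_MAGIC:
                return []  # a thin (single-architecture) binary
            slices = []
            for _ in range(nfat):
                # fat_arch: cputype, cpusubtype, offset, size, align
                cputype, _sub, _off, size, _align = \
                    struct.unpack(">5I", f.read(20))
                slices.append((CPU_NAMES.get(cputype, hex(cputype)), size))
            return slices

    print(fat_slices("/bin/ls"))

Summing the non-native slice sizes over /Applications and /usr/lib is how one arrives at figures like the ones I quoted.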

Personally, this doesn't really bother me that much. I have 100 GB on my hard drive, and newer systems generally have much more than that out of the box. Expending 3 GB or so on international and universal CPU support isn't a problem. However, many users are annoyed by this state of affairs, and there have been many hacks proposed to delete foreign language support and to remove non-native CPU support. However, these hacks can mess up the software maintenance process in various ways, and a software update can undo the effects of the hack.

In any case, I think that this issue deserves to be taken seriously at the system design level. In my view, a decent compromise would be to allow users to enable auto-compression of the less-frequently used components of their system. What follows is a proposal for a system-level change that could accomplish this fairly easily.

The first and most important thing would be to build expansion of individual compressed files and folders into the software libraries or frameworks at a low enough level that the process would be transparent to most programs. In effect, there would be a bit right in the inode of a file or directory indicating that it is compressed. (There could optionally be some other bits indicating the compression mode.) That is, the user would see no difference between a compressed and an uncompressed file system resource; the standard frameworks would invisibly expand compressed files. In addition, auto-compressing or uncompressing a file system object should not cause any of the time stamps on the object to change.
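To make the idea concrete, here is a minimal user-space sketch of the intended behavior. A real implementation would live in the kernel or the system frameworks and would use an actual inode flag; since a script can't touch inode bits, the flag is simulated here with a sidecar marker file, but the invisible expansion and the timestamp preservation are the points being illustrated.

    import os
    import zlib

    def is_compressed(path):
        # Stand-in for the proposed inode bit; a real implementation
        # would read a flag from the file's metadata.
        return os.path.exists(path + ".zflag")

    def transparent_read(path):
        """Read a file, invisibly expanding it if its compressed bit is set."""
        with open(path, "rb") as f:
            data = f.read()
        return zlib.decompress(data) if is_compressed(path) else data

    def compress_in_place(path):
        """Compress a file, set its flag, and leave its time stamps alone."""
        st = os.stat(path)
        with open(path, "rb") as f:
            raw = f.read()
        with open(path, "wb") as f:
            f.write(zlib.compress(raw))
        open(path + ".zflag", "wb").close()
        # Auto-compression must not disturb the access/modification times.
        os.utime(path, (st.st_atime, st.st_mtime))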

For reasons of efficiency, a few programs would work with the compressed objects directly; for example, utilities such as find(1), and GUI file system browsers such as the Macintosh Finder, shouldn't require that files or folders be expanded. That is, there should be support built in for shallow access of auto-compressed file system objects in their compressed state.

However, most programs would trigger auto-expansion of the file system objects they touch. If you open a file in a text editor or word processor, it would auto-expand. If you compile a source file, it would expand. If you run an executable, it would expand.

Each file system object keeps track of how long it has been since it was last read or written. A system daemon would scan the file system at low priority in the background, and compress file system objects that have not been read or written recently, where "recently" can be defined programmatically. For example, the idle time required for an object could be a function of individual files or folders, of file types, of file ownership, of the amount of free space on the disk, and so on.
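A single sweep of such a daemon might look like the following sketch, reusing is_compressed and compress_in_place from above; the 30-day idle threshold stands in for whatever policy function the system actually exposes.

    import os
    import time

    IDLE_SECONDS = 30 * 24 * 3600  # placeholder policy: 30 days idle

    def compression_pass(root):
        """One low-priority background sweep over the tree under root."""
        now = time.time()
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                path = os.path.join(dirpath, name)
                if not os.path.isfile(path) or is_compressed(path):
                    continue
                st = os.stat(path)
                if now - max(st.st_atime, st.st_mtime) > IDLE_SECONDS:
                    compress_in_place(path)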

Under this scheme, all applications and files would be installed in their compressed state, and even the system would have everything compressed initially. As the user began to use the system, things would expand. If a resource wasn't used for a while, then (if this functionality were enabled) it would be auto-compressed by the daemon.

A related functionality would be targeted toward the elements of universal binaries. This system would compress those elements that have not been accessed recently, and uncompress them as needed. 

In effect, there would be a trade-off between disk space and execution time. If the compression is set too aggressively, you'll save lots of disk space, but your system will be spending a lot of time compressing and expanding files, and will be slow. However, a correct balance will buy you disk space but cost very little in time. For example, I do not read any of the non-Latin character set languages, so I would rarely access the .lproj folders associated with them; they would probably all stay compressed at all times. On the other hand, the English files would be accessed frequently and would rarely qualify for auto-compression.

The situation would be even simpler for universal binaries, since in almost all cases, only the native architecture would be used on a given machine. The exception would be a file server with clients of different architectures; in that case, several architectures would be expanded.

This change could be done in a fairly straightforward manner, I think. However, I must admit that I would probably not enable it on any of the Macintoshes that I own or administer. As I stated in the introduction, the amount of space used to support internationalization and universal CPU architectures is small as a percentage of modern disk space. If a given system were actually running out of disk space such that this overhead became critical, the correct solution, in my opinion, would simply be to upgrade the hard drive.

2007-11-26

Airlines and laissez-faire

This article in the NY Times describes a two-class system of air travel, where business- and first-class travelers benefit from increasingly lavish treatment, and coach-class travelers suffer in less and less comfort. The explanation for this state of affairs, as laid out in the article, is quite clear. Airlines must compete for the fewer, but deeper-pocketed, premium-class travelers; but since the coach-class traveler buys tickets based solely on cost, and since coach sections are almost always full or even overbooked, all the airlines need to do there is make the ride as cheap as possible. It seems to me that this state of affairs is the expected, necessary result of applying laissez-faire principles to air travel. Since there really are no relevant government standards, airlines are free to do whatever it takes to maximize profits.

The problem is that airlines depend on the government at various levels for many things. It simply isn't possible for laissez-faire to extend to things like air traffic control, noise abatement, safety and interoperability standards, security, and so on. This is why various other countries have opted for full or partial state-ownership of the national airline. I'm not advocating a federally-owned passenger airline, but it illustrates the point that the government is already, necessarily, involved in oversight and control of airlines.

Why doesn't some particular airline take the step of upgrading coach amenities unilaterally? The answer to this is clear: it would be economic suicide. The NY Times piece gives examples of this. Each time, the change reduced profits and was withdrawn. Clearly, under the present circumstances, the situation will only continue to worsen.

Therefore, the question for America as a society is whether we will continue to accept the consequences of relinquishing oversight of passenger comfort. Only a centralized government mandate can improve the situation. This would also be fair, since such a mandate would apply equally to all airlines, giving a competitive advantage to none.

Why don't we adopt standards for such things as:

  • Spacing between seats (possibly as a function of the length of a flight).
  • In-flight meal service.
  • Maintenance of passenger comfort items such as seats, music systems, lights, video systems, and so on.
  • Width of the seats.
When we read about oil-rich sheiks converting airliners into flying pleasure palaces, it becomes clear that air travel doesn't have to be hideously uncomfortable. Coach-class passengers shouldn't expect luxury, but there is a certain basic comfort level that we should be able to count on.

What about cost? Well, yes, the cost will go up. However, airlines will still compete on cost in coach class. Since all of them would be subject to the same passenger comfort standards, this competition would keep ticket prices as low as possible without sacrificing passenger comfort. We will end up paying a little more, but, given a level playing field, not as much more as one might expect. Airplane travel could actually become pleasant, not something to be dreaded. The question is, will we continue only to look at the cost side of the cost-benefit equation, or will we accept that a greater benefit might be worth a greater cost?

2007-11-19

Ideological Terrorism As a Mental Health Issue

This is probably the strangest bit of random philosophizing I've ever done. But I had this idea about terrorism and I thought I'd write it down.

In the US and in many countries around the world, there are laws that allow people to be committed involuntarily as the result of mental illness. In the United States, there are many kinds of safeguards against false or unnecessary imprisonment, which can be summed up as the following basic principles:

  • The person must be found to be mentally incapable by a psychiatrist or panel of psychiatrists.
  • Because of their mental status, the person must be a danger to himself or herself, or to other people.
  • The danger to self or others must be such that only institutionalization can prevent harm.
  • While in the institution, the individual must receive treatment for their disorder.
  • While in the institution, the individual's status must be reviewed frequently (e.g., every six months).
I believe that terrorism--the deliberate slaughtering of innocents--is the product of the worst kind of conduct disorder and delusional mental status. There is currently no "official" diagnosis in the APA manual for "Ideological Terrorism Disorder" (ITD), but I think that one could be well-motivated and clearly defined. Even in the absence of a separate diagnosis, there are components of existing disorders that can be identified with terrorism: delusions of various sorts, paranoia, depression, conduct disorder, and obsessive-compulsive behavior. But the core disorder, it seems to me, is that the individual terrorist has been infected, so to speak, with an ideology that cancels out normal standards of social awareness, and that promotes terrorism, even self-destructive terrorism, by the individual. There have been many such ideologies; this is by no means an attack on the one currently most frequent in cases of terrorism, so-called Islamism.

The question has often been asked, why so few Islamist terrorists? If the ideology intrinsically promotes violence, why is it that only a few individuals actually commit terrorist acts? I believe that this state of affairs is strong evidence for the terrorism-as-mental-illness hypothesis that I am promoting here. That is, as the ideology itself becomes prevalent throughout the community, ITD is triggered only in susceptible individuals, not universally.

If this hypothesis is accepted--and I'll return to an important reason why it will be difficult for this to happen--then it follows that ITD individuals are not responsible for their actions, any more than other mentally deranged individuals are for theirs. Therefore, the issues of guilt and punishment do not apply to them. Instead, the important question is whether those individuals present a danger to themselves or to others that cannot be controlled without institutionalization. If an individual presents with ITD, they are almost by definition a danger without institutionalization, and so in most cases, an ITD diagnosis would lead directly to long-term institutionalization, until periodic review indicates that the person no longer has ITD.

I believe that this approach would be far preferable to the military or criminal approaches that we now apply, for two reasons: I think it would be more effective at preventing harm to the individual or to society, and I think it offers the possibility of treatment, and possibly a return to normal or near-normal life for the ITD individual, through out-patient monitoring or eventual remission and release.

Why isn't this approach already in place? In the face of the horrible acts perpetrated by ITD individuals, why haven't the mental health community and the government responded in the manner outlined above?

There is an existing problem with the diagnosis of delusional disorders: many delusions are so widespread that they cannot be considered abnormal. I may offend many here, but the prime example of this is religious delusions. There are many psychiatric patients whose delusional systems involve religious aspects. Millions of "normal" people accept as true the reports that Jesus, Moses, Muhammed, or Joseph Smith conversed with God or with angels, and that God, angels, or other supernatural entities affect their daily life. Society can accept this kind of delusion as long as it is benign, that is, as long as it doesn't cause antisocial acts. This fact interferes with the diagnosis of some individuals, such as many schizophrenics, whose delusions tend to run along conventional religious tracks. For example, we accept that people somehow hear God telling them what to do, as long as their actions are not harmful to themselves or others and do not fall outside the conventions of normal behavior.

This can also be true of non-religious ideologies. People can become "true believers" in such things as racial or national superiority, or that certain social systems are inherently bad or even evil, as long as they do not harm others or themselves, or become too eccentric in their personal behavior.

This makes the task of diagnosing standard disorders such as schizophrenia more difficult, not to mention a disorder like ITD where the defining symptom, outside an ideology that is shared by millions of "normal" individuals, is the commission, planning, or substantive support of acts of terror. Yet, there seems little doubt that in susceptible individuals, certain ideologies have been demonstrated to trigger antisocial acts, including terrorism.

I do not think that, simply because ITD individuals may share the same ideology as people who do not commit antisocial or terrorist acts, we should fail to recognize the fact of the disorder, or the possibility that the disorder, with proper study, may prove to have brain correlates and may eventually be treatable. If we are to combat terrorism, and if terrorism is the product of a mental disorder, as I believe the facts indicate, then the failure to involve the mental health community, and the system of laws, procedures, and safeguards that exists for it to use, is like fighting with one arm tied behind our back.

The US government has imprisoned hundreds of individuals, and our leadership has basically spun a web of legalistic rationalizations to support their continued imprisonment, in ways that go against our legal tradition and our Constitution. However, if ITD were recognized, along with peer-reviewed, tested standards of diagnosis and treatment, it is my contention that the individuals we have imprisoned could have been treated much more justly using the mechanisms of our mental health system mentioned above. Their rights would have been protected, but also their safety and the safety of society, in an open and compassionate way.

One final thought. What about people like Osama bin Laden? He personally, as far as I know, has committed no terrorist act; certainly no act of suicidal terrorism. Where does he fit into this picture?

I think that bin Laden is a sociopath. He understands very well the processes of ITD, and he works to trigger it in as many susceptible individuals as he can, assisting them in manifesting their disorder by carrying out ideologically motivated terrorist acts. This kind of malevolent parasitism has been seen from time to time in the past, where a sociopath has manipulated mentally ill individuals to carry out acts of violence; but because of the very nature of ITD, it is possible for al-Qaeda to work on a much wider scale, and for a much more focused purpose, than in the historical examples of the deliberate manipulation of schizophrenics.

In summary, I suggest that there is a mental disorder, which I have labeled Ideological Terrorism Disorder (ITD), at the root of the vast wave of terrorism we are currently witnessing; that ITD individuals should be handled via the mental health system rather than solely by the military and criminal justice systems; and that this approach would be more effective in protecting society as well as in protecting the ITD individuals themselves.

2007-11-17

On credibility

We've been hearing a great deal about credibility lately. There are currently almost 1.5 million hits on Google for +bush-administration +credibility +2007. Virtually all the articles say the same thing: that the administration's credibility is either eroding or has already bottomed out. There are differences of emphasis, though. Left-wingers focus mostly on the war and torture (along with quite a few other matters), while right-wingers mostly focus on immigration. But they all agree that the Bush administration has a credibility problem.

We also hear about the credibility of Congress. A lot of people voted for Democrats so they would end the war. However, the war is still going strong, facilitated by legislation passed by a Democratic-led Congress.

What interests me, however, is that there are actually two different issues that tend to get lumped under the credibility rubric: truthfulness and delivery of what was promised.

Under the law, intentions are everything. If you promise something in good faith but cannot deliver because of circumstances that you cannot control and that you couldn't reasonably have been expected to know about, then you are considered blameless. A somewhat similar situation obtains in politics.

If a political leader promises something and doesn't fulfill the promise, there are a couple of possible consequences.

  • Everyone knows that it was just campaign rhetoric ("no child left behind"); no one expected it to be fulfilled literally. In this case, no credibility is lost unless the politician's behavior goes flatly against the entire flow of the campaign. Even then this is a gray area, and it depends in large part on the economy.
  • The individual leader makes a strong attempt to fulfill the promise, but is blocked by other politicians or other factors. This is more serious. No one doubts that the individual told the truth when he made the promise initially, but he still loses credibility in terms of effectiveness.
This highlights the dichotomy. In order to be credible as a politician, you must be perceived as (1) honest, and (2) effective.

The Bush administration had developed a reputation in left-wing circles as dishonest, and as a result had little credibility. However, for the population at large, it wasn't until Hurricane Katrina, when the administration was exposed as grossly ineffective, that the most serious credibility problems began.

It is also true that there is a kind of osmosis here. If you doubt someone's effectiveness, then it becomes easier to doubt their honesty, and vice-versa. Furthermore, when some people with a given label lose credibility, for either reason or both, then it becomes easier to see other people with the same label as having low credibility. This can result in a downward credibility spiral, such as we have seen with the Bush administration and Republicans within the US, and with Americans in general outside the US.

In the domain of politics, it is extremely difficult -- usually impossible -- to rebuild lost credibility. This means that the response we have seen by the Bush administration of ignoring the credibility issue may actually be rational. That is, the credibility game is irreparably lost, so the only things that matter are (1) helping their contributors, and (2) trying to clean up the future historical record by both actions and secrecy.

In the first few years of the Bush administration -- George Lakoff has pointed this out also -- Bush's credibility was actually protected by his "good ol' boy" pseudo-redneck shuffle. That is, if Bush didn't carry through on a promise or made some other error, it was put down to ineffectiveness rather than dishonesty. Even in left-wing circles, where his honesty had been questioned for some time, there was debate about whether such and such a piece of wrongdoing was the result of dishonesty or of foolishness. This is in large part why it took Katrina to make most people realize not only that the administration was ineffective, but, by osmosis, that it was also dishonest. In a way, it was only after Katrina that many people finally understood that the Bush administration never had any significant supporting evidence for the contention that Saddam Hussein either had weapons of mass destruction or had any desire or intention of using whatever weapons he did have against the United States.

Well, it's definitely been a strange almost-seven years since the 2000 elections, and the issue of credibility has been a large part of the strangeness.

2007-11-15

Abolish Veterans Affairs Healthcare?

In the United States, we have government-operated, single-payer healthcare for only a few selected segments of our population: the very poor, through Medicaid; prisoners; the military; and veterans, through the VA healthcare system. The rest of the country must pay their own way, through a mosaic of private deals between employers and individuals on the one hand, and insurance companies of various kinds, almost always for-profit, on the other. This system has always seemed ridiculous to me, in that the argument against "socialized medicine" has been that it would taint our democracy and lead to communism; yet we provide socialized medicine to millions of members of special groups throughout the land.

Up until now, I have been against this arrangement because it seemed unfair to the population at large that they could not share the benefits given to prisoners and military veterans. But today, I read about the study by CBS investigators concerning the suicide rate among veterans, especially young veterans of the "War on Terror". It seems that the rate is about four times higher for those individuals than it is for their age-mates in the general population, with as many as 120 suicides per week among veterans. This is obviously a failure of the VA healthcare system, yet it is difficult to criticize them unduly. The VA must provide healthcare to veterans both during peacetime and during and after combat episodes. This means that there will be long periods of relatively low and stable utilization punctuated by surges of high utilization. It is very difficult for the VA to respond adequately during these surge periods because their resources tend to be optimized for the long periods of inter-war stability.

But what if there were no separate VA healthcare system? What if all Americans could receive the healthcare they needed from government-operated facilities? Over time, the resources of the system would become optimized to cover the needs of all Americans, a much larger and more widely distributed group. This would benefit all Americans in obvious ways: no one would be without healthcare; costs would go down; there would be an increased focus on preventative care, and so overall health would improve. The undoubted benefit to the general population of socialized medicine is something I've believed in for a long time. It clearly would also benefit veterans, because one of the complaints that veterans tend to have about the VA is that, for many of them, it's a long trek to the regional VAMC where they can be treated. Under universal socialized medicine, there would be few if any VA-specific medical facilities, and veterans would largely be treated closer to where they live, which would benefit them greatly.

However, the CBS study suggests another benefit to veterans that would follow the abandonment of VA healthcare in favor of universal socialized medicine: it would help mitigate the wartime surges of healthcare demand -- for the kind of mental health treatment that could reduce the suicide rate among veterans, but also for other kinds of healthcare. All those veterans who must wait a year or more for treatment by the VA would be much more likely to get prompt attention if they could go to any facility in the nation.

So, rather than blame the VA for the suicide epidemic among young veterans, I see it as just another casualty of the greed-based American healthcare system in general. Maybe someday we will be able to move past our individualistic cowboy mentality and understand that there are certain things that simply work better when we operate collectively as a nation "united".

2007-10-25

Some thoughts on Internet-based language learning

Some time ago, I decided I wanted to use the multinational aspect of the Internet to help me improve my French. Here are some comments on the experience so far. My situation: I'm located in California, with no local contact with the French language except through books and other written material. We get TV5, the French cable channel, but we don't watch it as much as we should, and it's not interactive, so if you don't understand something, it's just gone.

The first thing I did, several years ago, was to start a bilingual discussion mailing list, called Freng. It started with free discussion, but it was clear that some kind of structure would improve things, or so I thought, so I came up with the idea of serializing public domain texts, found for example at the Gutenberg Project. For the past several years, we have been doing this, reading English and French texts on alternate days. We are currently well along in Monsieur LeCoq by Émile Gaboriau, and also The Breaking Point by Mary Roberts Rinehart. What happens is that we receive each day a selection, about a screenful, in email. The truth is, discussion is very sporadic, and only a few of the members usually discuss anything. However, it's really great to read a serialized novel in this way. My intention is to change the Freng list over to a web-based discussion group, with the same daily distribution of serialized texts; however, this probably isn't going to happen for at least a few months.

But since I started Freng, blogs have become much more important, and in particular, Corine Lesnes' Big Picture blog, which is a French-based blog (Corine is a journalist with Le Monde), but which includes people of various language backgrounds and has a true international flair. It is really on Big Pic that the kind of discussion I had originally envisioned is truly available. I'm sure there are other venues for this; there are thousands of blogs out there.

Those are the main things I currently do on the Internet to improve my French.

However, there is one more thing I want to mention in this regard. Since my object is to learn, it is not sufficient for me that I merely make myself understood. I also want to get feedback specifically on things like vocabulary, grammar, orthography, and so on. To do this, I invested in a tool that has proven quite useful, although it has quite a few limitations.

The tool is a pair of programs called Ultralingua and Grammatica, both available here. Ultralingua has a variety of modules; the relevant one is English-French. You put your cursor on a word you don't know and type (on the Mac) F1, and a little window pops up with the definition. You can also enter words directly into a bilingual dictionary panel to get definitions. It also has a verb conjugator. Grammatica has several modules as well; the relevant one is, of course, French. This is a grammar checker. You highlight a sentence or a paragraph or a phrase, and type F2. Grammatica goes through it and complains about various kinds of grammar problems, usually offering one or more suggestions. This is great for reminding you about gender agreement, verb concord, accent mark placement, and so on.

There are two more tools I use. I have a Systran dashboard widget (this is on the Mac, I'm sure there are Windows equivalents) that does computer translations to and from a bunch of languages. If I'm not sure I've gotten the meaning quite right on a French sentence, I'll get Systran to translate it and see. This helps with false cognates and some of the details of verb tenses and so on. Sometimes I'll use Systran to translate English to French, but surprisingly, I really don't use that aspect of it very much. In any case, the translations are often pretty awkward. It is what it is.

The other tool is Google. Ultralingua has a panel that allows you to enter a phrase and a context, and it calls Google to find examples. It is useful, and I use it sometimes, but I usually just type the phrase into Google. If, for example, I'm not sure that people say "penser de", I'll just google for that sequence.

The result of this approach is generally pretty good. It's not that it corrects all my mistakes -- I still make plenty -- but it catches a lot of them, and I think that over time I don't make as many. My vocabulary is slowly improving. Grammatica doesn't help much with certain things, like verbal prepositions, and both Grammatica and Ultralingua have a limited wordlist. When Grammatica encounters a word it doesn't know, you can tell it to ignore it, but then the quality of the grammar check goes downhill. A better idea would be to let you enter a word that patterns grammatically about the same as the unknown word; even if the match wasn't perfect, I think that would improve the grammar check. All this means that sometimes what I write may be technically correct but not idiomatic French. Hopefully, as I read more French, this problem will correct itself.

The other problem is that my writing is probably too academic, not as natural as I might want it to be. Sort of "bookish". But, given the approach I've chosen and the methods I'm using, I guess there's not much I can do about that. Obviously, my verbal ability isn't going to improve much with this method, but the truth is, what would I use the ability to speak French for? Who would I speak to?

Well, if anyone reads this and wants to comment or write down things they've tried vis-à-vis using the Internet to help with language learning, vas-y.

2007-09-11

The US invasion is a knife in Iraq's chest

I think that this is a pretty good metaphor for the invasion and occupation of Iraq. If you get stabbed deeply, sometimes the presence of the knife blade can push cut or broken tissues together and slow down the bleeding; that, along with the certainty of even greater damage as the blade is withdrawn, can be an argument for leaving the knife in place, at least for a while. In real life, the knife is generally left in place until the victim (1) is in the O.R. and is medically stabilized enough for the withdrawal, or (2) dies.

Therefore, as much as it pains me to say it, there could be an argument for American troops (the blade of the knife) to stay in the heart of Iraq. I think there is even an effective way to do it, which I'll mention below. However, returning to the metaphor, the issue of how to extract the knife and help the victim to recover is separate from the criminal matter, which is that when you stab someone deeply in the chest, you must generally face criminal charges. In the present case, the facts of the invasion would probably lead to manslaughter charges rather than to murder charges, because the goal was not to destroy Iraq (even though that is pretty close to the consequence of the invasion). However, manslaughter is a serious crime, and the perpetrators--the US and its allies--must face the consequences in terms of international war crimes procedures. I personally feel that along with the individuals in the American government who were directly responsible, the American electorate also shares a significant role. I have no idea what kind of punishment would fit the crime, but at the very minimum, I think that a formal admission of wrongdoing and the sincerest apology would be a good start. As for the ringleaders, they should be thrown from public office and spend the rest of their miserable lives in jail.

As for how American forces can stay effectively in Iraq, I go back to a letter I wrote to the New York Times in 2004: more than ever, we need a voter referendum in Iraq. There really is no other way for us to be seen as a legitimate presence there. The government, created by us, cannot legitimize us. Only the people can, in a direct referendum. We should contribute whatever resources are required for a national referendum, but the voting should be monitored by international observers. The question must be: "Should American forces stay in Iraq for one more year?" Part of the basis of the referendum would be that after a year, another referendum would be held with the same question.

If a majority of Iraqis ask us to stay, then, as the perpetrators of the deadly knife blow, we owe them that much. However, our status would be changed for the better: it's one thing to occupy a country by force; it's quite another to do so at the specific request of the people. On the other hand, if a majority do not want us to stay, then we of course would still have an obligation to do whatever it takes to help the victim of our attack (financially, with technical advising, etc.), but we must withdraw our blade from her heart as quickly and smoothly as possible.

2007-08-09

Emergency foreign worker legislation

We have reached an impasse regarding immigration. We have basically three camps:
  • So-called "patriots" who can't stand the fact that millions of undocumented workers are soiling our...soil.
  • So-called illegal aliens who are doing work that Americans can't or won't do.
  • American employers who can't find citizens or legal resident aliens to do business-critical work.
The three camps are trying to solve this problem with long-term solutions that appeal to partisan ideologues who, let's be honest here, are using the immigration issue as a political football, with much more symbolic than practical value.

But there is another way to look at this situation: undocumented workers are being hired in America because there is a genuine emergency here. Americans in sufficient numbers simply are not available to do hard manual labor at low pay, yet our economy still has a critical need for workers who will do that work. The immigration system that we have in place is oriented to the top end, seeking to find immigrants whom we want to join us as members of an upwardly mobile, consumer-oriented nation. This system may or may not be optimal for that purpose, but it is completely useless in the ongoing employment crisis. That is why the informal, but very widely accepted, system of undocumented guest workers has evolved, and it's why it won't go away.

As is so often the case, the underlying problem here is denial. We simply do not want to admit that there is a "slow crisis" at the lower end of our job scale. The unofficial acceptance of foreigners who have stepped up to help us deal with the emergency feeds into our denial, because those foreigners and those who employ them basically work together to conceal direct evidence of their presence. If we could simply accept that we have a crisis and that without the assistance of millions of foreign workers, our economy would be in much worse shape than it is, then I think we could resolve the immigration issue.

Basically, what we need is legislation that formalizes what is now going on. If an American employer or group of employers wants to hire people to perform certain work at a very low wage, and can't find citizens or holders of work visas to do the work, then they should be allowed to request the needed number of workers through the Emergency Foreign Worker Act (EFWA). There would be a process required to certify that the position(s) cannot be filled by citizens or holders of normal work visas, resulting in a renewable "EFWA certification" for the positions, or for certain classes of position. EFWA certification would allow workers to be brought in, bypassing the normal Immigration Service procedures, to do the work. The program would also gather basic biometric identification of the workers, such as height, weight, and fingerprints, and issue them photo IDs. This information could be used to screen out undesirable candidates. Finally, in gratitude for the willingness of these foreigners to help us deal with our labor crisis, we should give those who have served in this capacity several benefits:

  • First, they should be allowed to move to another EFWA-certified job without leaving the country. However, they would not be allowed to move to non-EFWA jobs.
  • They should be allowed to apply for normal immigrant status even while working in EFWA, and their EFWA record could be cited in favor of their request for normal immigrant status.
  • Using their EFWA ID, they could receive some, but not all, of the social services available to normal immigrants: for example, driver's licenses, bank accounts, and so on.
  • They would be entered into a Social Security escrow system, and they would pay income tax and other taxes.
A word about taxes, Social Security, and minimum wages. The reason why it is important that EFWA workers and their employers pay income and Social Security taxes is that otherwise, employers would be more likely to use the program as a way to bypass American workers. This is not the intention of EFWA.

On the other hand, the status of an EFWA worker is such that their Social Security account is not standard. If they do not progress to standard immigration status, they will not be allowed to stay in the USA as retirees, because that would also be contrary to the purpose of EFWA, which is to deal with the employment crisis. The solution is that the money they pay into the Social Security system would be held in escrow. If they become conventional immigrants, the escrow account would be converted into a normal Social Security account, in effect giving credit for their EFWA work. On the other hand, if and when they withdraw from EFWA and leave the USA, their escrow accounts, plus an amount of interest established by law, would be transferred to the country of origin, based on treaties negotiated with that country (for example, into their equivalent of social security); or, in the absence of such a treaty, into an annuity paid directly to the individual. Note that in either case, the obligation of the Social Security system would be limited to the amount actually paid into the escrow account, plus interest.

An EFWA worker would also be eligible for health insurance and life insurance, including group plans organized by the employers or possibly by the EFWA program itself. This could also be paid for via payroll deductions. The logic of this is the same as for taxes; in fact, payroll-deducted insurance is identical to a tax, except that it enriches a middleman. If health benefits were not allowed for, then once again, it would be too tempting for employers to abuse EFWA in a manner discriminatory against American citizens and normal residents. Finally, the same logic holds for the minimum wage. Employers should be required to pay at least the local minimum wage to EFWA workers, because otherwise abuses would result in discrimination against American workers.

Americans and holders of work visas should be able to challenge EFWA certification. The EFWA certificate would need to be renewed periodically, and during the renewal process, those who want to remove the certification should be allowed to make their case. Basically, the main argument would be that EFWA unfairly discriminates against Americans or visa holders who want to do the work. It is important to point out, by the way, that EFWA certification would not prevent Americans or visa holders from being employed in the job, and in fact, the presence of large numbers of residents in the positions would inherently call the need for EFWA certification into question. Also, the certification process would need to prevent abuses where an employer lowers the wage scale to uncompetitive levels just to qualify for EFWA. If other employers in similar industries pay higher wages and fill their positions without recourse to EFWA, then EFWA certification would not be granted. Also, if there are sufficient numbers of qualified American or visa-holding individuals who are willing to do the work at, or perhaps slightly above, the minimum wage, then EFWA certification would not be granted. It would be very important that a set of fair rules for granting EFWA certification be part of the act, and also that the decision-making process be transparent and completely accessible to the public, because of the risk of abuse. In addition, the renewal process must allow those who feel that there has been an abuse of EFWA to make their case.

I believe that EFWA or something like it would change the dynamic considerably. It is true that some overly greedy employers would complain, because they are probably paying currently undocumented workers less than they would have to pay EFWA workers. But on the other hand, the only requirement made by EFWA is that minimum wage and minimal benefits be paid. As long as foreign workers are available who are willing to work under those conditions, then that segment of the economy would benefit from EFWA. Some of those who object to "illegal immigrants" would also complain, because it is fairly clear that many of those objections are based on racist or exceptionalist viewpoints. Still, the EFWA ID system could screen out undesirable candidates, and so I believe that objections to it from "patriots" would become marginalized.

So, to summarize, what I'm suggesting is that we enact an Emergency Foreign Worker Act to deal openly and explicitly with the crisis of hard manual labor. The costs to the employer for EFWA workers would be as low as possible without unfairly discriminating against citizens and resident aliens. The normal waiting lists of the immigration service, and most of the requirements for visas, would be bypassed on the basis of the labor crisis. However, because the EFWA would be part of the federal government, workers would be minimally documented, and as a result, EFWA status could be denied to undesirable individuals.

I think it could work.

2007-08-04

Let's buy some opium

It hit the news today that Afghan opium production is bigger than ever. Afghanistan now accounts for almost the entire world supply, and opium farming is by far the strongest pillar of its economy. But that's not what this blog entry is about.

The US is funding a remarkably unsuccessful anti-opium program there. The plan for next year involves $475 million. They will spend this on eradicating crops, chasing down traffickers, and perhaps some funding for alternative crops. However, if past years are any indication, this will ruin a number of unlucky small farmers and drive them into the Taleban, and will have no impact on the profits of large-scale opium and heroin traffickers, or on drug addiction in the US. But I decided to put on my bean-counting hat and take a look at the structure of this industry.

The Afghan farmers themselves are the ones who make the perfectly rational decision to grow opium. Now, they don't get all that much money from it. A recent (2006) report gives a total household income of $1700-$1800 per year from growing opium (and only about $250 per capita), on average. In fact, the so-called "farm gate" price of the entire opium production of Afghanistan is only $560 million to $760 million. However, by the time this opium is converted to heroin and hits the streets of the first world, it will have enriched many middlemen, and especially the large-scale traffickers, to the tune of many, many billions of dollars.

Well, what's wrong with this picture? The US is spending around half a billion dollars, without success, on an eradication program, while the farmers are only making about half to three-quarters of a billion selling the crop. These two numbers are not very far apart.

I say, why not send some buyers around the country, backed up with armed US and/or NATO soldiers, with the job of buying up all or as close to all of the Afghan opium crop as possible?

Here's how it could work: the opium would be bought at the average market price, and resold to pharmaceutical companies at a profit. Clearly not all of it could be sold this way; the rest would be destroyed. If a farmer agreed to grow wheat or some other crop, non-opium agricultural support -- for example, wheat seed -- could also be distributed. This would translate into paying individual farmers an average of $1500 or so each, in exchange for the typical annual production of a few kilograms of raw opium resin.

There would be a fairly large reaction from the narcotraffickers, and not only in Afghanistan. This would make a lot of junkies very sick. And, at least in the short run, Afghan farmers would still be growing opium. However, since the US would soon be the only buyer, or at least almost the only buyer, there would be a lot of things that could be done to move production into alternative crops. The goal here would be a steady reduction in opium vis-à-vis other commodities. Building roads, schools, electricity, internet -- all of those things would also help. The most important thing, though, is that we would have ended the illegal opium crop in Afghanistan, and at a cost only maybe 50% more than what we have been spending on our failed eradication program.
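For what it's worth, here is the back-of-the-envelope arithmetic behind that last claim, using the figures quoted above; the implied number of growing households is my own inference, not a number from the reports.

    eradication = 475e6            # next year's anti-opium budget
    farm_gate_low = 560e6          # total farm-gate value of the crop
    farm_gate_high = 760e6
    household_income = 1750        # midpoint of the $1700-$1800 figure

    print("implied opium-growing households: %.0f to %.0f"
          % (farm_gate_low / household_income,
             farm_gate_high / household_income))

    # Buying the whole crop, versus the cost of trying to eradicate it:
    print("buy-up costs %.0f%% to %.0f%% more than eradication"
          % (100 * (farm_gate_low / eradication - 1),
             100 * (farm_gate_high / eradication - 1)))

That is, buying the entire crop at the farm gate would run somewhere between roughly 20% and 60% more than the current eradication budget.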

2007-07-28

Yet another Iraq idea

Things just keep getting worse in Iraq; we seem to be in a rut that goes around and around, repeating the same things, never getting any closer to a conclusion. The primary argument for staying in the rut is that if we started to withdraw, things would get much worse and it would be our fault for leaving. (Never mind that most of the blame for the bad situation is already ours: the thinking here goes along the lines of "all's well that ends well".)

In fact, the plans I've seen for withdrawing all seem to be vague. The differences among them have to do with (1) the speed of the withdrawal, (2) the extent of the withdrawal, and (3) the triggers and conditions modulating the withdrawal. If there is any distinguishing pattern among the various proposals that have been made, it is this: if you are closely identified with the Republican Party and/or the Bush administration, then your withdrawal plan, if you have one, is (1) slow; (2) shallow (i.e., a significant American presence will be maintained indefinitely in Iraq); and (3) conditional on events that are unlikely to take place. On the other hand, if you are strongly against the Republicans and the Bush administration, then your plan for withdrawal is (1) fast (i.e., six months or so); (2) complete (i.e., no American presence will remain unless under UN mandate); and (3) unconditional (i.e., a seamless hand-over to Iraqi forces is desirable, but not mandatory). Plans in between these extremes fall along the same political dimension.

I have a somewhat different idea, sort of a blend of the two plus an empirical test of the assumptions of both sides.

What I propose is that a contiguous region of Iraq, preferably one whose populace reflects the ethnic, religious, and economic diversity of the country at large, be designated the "autonomous zone". That is, American-led forces will draw a line around a fairly large region of the country (probably somewhere in the mid-eastern part), and set up control points on the main entrances and exits. Then, they will withdraw from that region, and pledge to stay completely out, not entering even for humanitarian assistance, for a minimum period of time, such as three or six months. The region would be under the control of the national government; Iraqi army forces could pass freely in and out, as could supplies and so on. The control points would restrict other traffic to people who live in the region and have a legitimate reason to travel in and out.

This region would be presented explicitly to all Iraqis and to the world as a test. If the region dissolves into bloody civil war, then the case of the Republicans and the Bush administration for a slow, shallow, and conditional withdrawal will have been strengthened; if the region is mostly calm and stable, then the case for a rapid, complete, and unconditional withdrawal would have been strengthened, and in my view, should be begun at once.

The worst possible result would be that the autonomous zone becomes a haven for those who are mounting attacks outside of the zone. That is why thorough border checks during the experiment are so important.

Also, this kind of artificial partitioning of a country is unpleasant for the people. However, if the experiment has a known, fairly short duration, and if the benefit is very clear, I believe that people would cooperate with it. In fact, one of the biggest problems would probably be keeping people out of the zone once it starts to develop some degree of stability. Again, if the time period is short, it should be fairly easy to convince people to wait.

There are many technical details that are very important. The zone must have its own infrastructure, such as sources of water and electricity. If this infrastructure is vulnerable to being damaged from outside the zone, then it should be protected to the extent possible by the American forces outside.

However, the experiment should begin without undue delay. The Americans should supply some overall parameters, such as the range of sizes of the zone, what resources must be inside the zone, and the rough population size and composition. But the Iraqi government should decide all of the details. This would be the first time that they would be making decisions regarding law and policy that would not have to be backed up by the Americans.

Either way, this would be positive action: a partial withdrawal that is fast but measured, complete but limited to one zone, and unconditional yet decisive for subsequent further withdrawal.

2007-07-03

Some attributes of careers

These are just some random thoughts about different kinds of careers one might choose, and why some people find certain careers more interesting than other people do. This is not really a very exciting blog entry, so be warned.

There are wide differences in talent and in training and experience that can be used to classify different career paths. Someone who never studied music until after college is unlikely to have a successful career as a classical musician; someone who has poor hand-eye coordination will not succeed easily as a surgeon, and so on. But that's not what I'm talking about here.

I think that there are four very general attributes that can be applied to any possible career. Some careers seem to have one of the four almost to the complete exclusion of the others, while other careers are more of a blend. Here they are:

  • Drifting.
  • Scamming.
  • Maintenance of the present.
  • Focusing on the future.
First off is drifting, which many people wouldn't call a career at all. I include in this both the Skid Row bum and the idle rich. For example, Paris Hilton's career up to now has had a strong "drifting" component. Some people fall into this category naturally, and others are forced into it. Some ordinary jobs can attain a very large component of drifting after a time. Someone who just "puts in the hours" or "keeps the seat warm" is drifting.

Next comes scamming. This kind of career is built on the exploitation of human frailty. Most criminal careers fall into this category, but there are others. If someone spends their time pandering or taking advantage of people's weaknesses, then they are scammers. Some religious figures fall into this category, as do many in the entertainment industry. People with porn websites are basically scammers, as I am using the term. Since societies and laws serve to protect citizens, it is no accident that many scams are illegal, but not all are. Politicians, for example, often scam voters by making a career of exploiting their fears and prejudices in order to keep getting elected. While reprehensible, this is far from illegal.

Next we come to the largest class, those who maintain the present. These jobs are the pillars of society. People who raise food, who prepare food, who sell food--they are in this class. People who make cars and repair cars are too, along with doctors and policemen and firemen. Our large and complex human society requires a lot of maintenance, and there are many different types of maintenance activity. Many maintenance jobs are not widely respected: picking fruit in the field and collecting the trash are two out of many examples. Yet, all maintenance jobs are important.

The final class is those who focus on the future. Research scientists, legislators, inventors, philosophers, and similar workers also play an essential role. Almost all of our technology was developed by members of this class. Virtually all of the various medical procedures, laws, and similarly important aspects of our lives were not created by drifting, scamming, or maintenance, but by individuals whose focus was a bit beyond the needs of their day.

There are a great many careers that blend these four attributes. For example, entertainment frequently has both scamming and maintenance aspects. As I stated above, many careers can be directed primarily at scamming, maintenance, or the future, but can fall into drifting through boredom or constant repetition. Very often, people whose primary careers are involved with maintenance also concern themselves with the future and make contributions in that area. People can be in maintenance or future-oriented businesses, but concern themselves primarily with scamming, that is, profiteering.

I think that applying these four attributes to potential careers could be useful to young people who might be making career choices, and also to older people who may not feel satisfied with their current careers, and might be looking for some kind of change. Anyway, speaking for myself, I've found it interesting to apply them to my career.

2007-07-01

Is it time to reshuffle the family farm?

America has a long tradition of family farms. My own ancestry includes farmers in Indiana and Illinois, as does that of millions of my fellow citizens. The land upon which this farming tradition is based was made available to families willing to farm it, many of whom immigrated to America in order to do it. It is easy to forget the connection between immigrants and the family farm, because a considerable number of generations have passed, blurring its origins.

Today, we have a situation where immigrants still perform a good deal of the farm work in America, but on very different terms. There has been a great consolidation of farms, with fewer and fewer owners running larger and larger agricultural operations, in a manner that would hardly be recognizable to the present owners' great-great grandfathers who immigrated here, broke the land, and raised their families on it. For example, as far as I know, I no longer have any relatives who own and operate family farms. And the immigrants aren't coming here with their families so they can have their own farms--they're coming to be farm labor on other people's farms.

A recent article pointed out an interesting side effect of the great consolidation. It seems that farmers are putting off retirement until well past age 65. In part, this is due to their long habit, and to their use of machinery that allows one man, even an old one, to run a farm. However, another important factor is that relatively few people are staying on the land. Farmers know that if they stopped farming, their land would be sold off, possibly to developers, possibly to other very large commercial operations to create even larger consolidations. But more and more often, there is no longer a connection between the family and the farm.

Now, some would say that this is perfectly desirable as an outcome. After all, the life of a farmer is one that contains a lot of drudgery, frustration, and disappointment. Why not industrialize farms? Perhaps in the end, there will be no more farm consortiums than there are automobile manufacturers or oil companies, that is, perhaps half a dozen or maybe a dozen. They will hire people to work on them using the products of their industrial peers: pesticides and fertilizers from the chemical industry; genetically engineered seed stock and livestock from the biological engineering companies; machinery from the manufacturing industries. The Calval farm corporation will compete with the Gulf farm corporation. Kismet.

On the other hand, there are at least two things that will be lost if this happens: diversity in the food marketplace, and the direct connection to the land by families or small groups that work and live on relatively small farming operations. These are somewhat abstract, and I can't really cite a lot of evidence in direct support of the idea that diversity in eating and a direct connection to the land are better than an ever-increasing consolidation and industrialization of farms, but I actually don't think I need to; it seems rather self-evident to me.

So what do we have? (1) Immigrant farmers were given plots of land which they farmed and lived on, and passed on to their children, resulting in a long-lasting system of family farms; (2) over many decades, more and more families have moved on to other pursuits and their farms have been consolidated into large, semi-industrial, corporate farms; (3) the consolidation process has now reached the point where farmers are afraid to retire, because they know that their farms will not be passed on to their families; (4) thousands of poor farmers in Mexico and other Latin American countries risk their lives to come to America to work on our farms. Doesn't there seem to be a rather obvious, if radical, possible solution to this developing problem?

Why not take back the land? The original land grants were motivated by the concept of the family farm. They weren't intended to create giant agricultural corporations or housing developments. If the family of a farmer who owns agricultural land no longer wants to farm it, it could be taken back--bought back, perhaps, under eminent domain--and then given away to families who will live on and farm the land. Very large farms should be broken up into manageable pieces, and given to people who will live on and work the land, under terms similar to those used in 19th century land grants. If Americans cannot be found who will take up this opportunity, then the same thing will happen in the 21st century that happened more than 100 years ago: people will come into our country and meld with the land. It will become their land to an extent that most native-born Americans have never known.

Reshuffling the farmland deck like this will do two very important things: it will bring back the family farm, the backbone of the American way of life; and it will alleviate the immigration problem by providing a permanent home for a subset of the people who are sneaking under the wire to work other people's farms. A new deal for American agriculture.

Now, let's be clear: I'm not talking about all of the farmland in America. There is a trend, as mentioned in the above-cited article, for farms to be lost due to lack of interest by farming families, but this is surely a minority of the cases. I'm also not talking about "solving the immigration problem" with this idea. There will still be a need for farm hands, just as there was 150 years ago. But at least some of the "unwanted" land ought to be made available to at least some immigrant families, just as it was to my ancestors and their families.

I believe that a rational approach to this would be to create a new designation for agricultural land that was originally granted to farmers by the government and which is in danger of being lost as a family farm (or as a farm, period, as for land sought by developers). The designation would be as a "family farm", and a farmer would be given this land by the government, for him or her to work and to pass on to family members in perpetuity, up until the point where no family member wanted to live on it and farm it; at that point, it would be taken back by the government and made available to some other farmer who was willing to accept it on those terms. One way to think about this would be as a kind of "family farm bank", run by the government, that would give land to farmers willing to work it in the up-close-and-personal, family-oriented way. If one of the products of this program was to allow former undocumented agricultural workers to become American family farmers, so much the better.

2007-06-27

First Thursday in July

There has been a trend over the past couple of decades to normalize holidays to certain more convenient days of the week, generally a Monday or Friday, so that workers can have three-day weekends. One notable exception to this trend is Independence Day, which is still date-based rather than day-of-the-week-based.

I think that we should change how we officially define Independence Day to reflect the calendar of 1776, when it was on a Thursday. That is, we should celebrate the founding of the nation using the calendar of the year when it was founded, on the first Thursday in July.

This would create a situation almost identical to that of Thanksgiving, which is also on a certain Thursday. That is, the official holiday would be on Thursday, but most workers would arrange things so that they could take a four-day weekend.

The biggest question is, would an Independence Day defined as a certain date, the date when the Declaration of Independence was adopted, July 4th, 1776, be more meaningful as a national holiday than one defined in terms of the day of the week when the same Declaration was adopted in that year? It seems to me that the two definitions are completely equivalent in terms of meaningfulness. That is, there would be no loss and no gain in that dimension.

The secondary question is, would it harm the nation to make the switch from a date-based definition to a weekday-based one? This is a bit more complicated, because there are two ways to answer the question. The easy answer is that it will not, because the same transposition has already occurred with, for example, Presidents' Day. This is clear. The more complicated answer concerns whether it would affect things like salary and leave computations. In this case, I believe that it would make things easier, not harder: each year would be the same, because salary and leave are generally calculated on a weekly basis.
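
To make the proposed rule concrete, here is a minimal sketch in Python (the function name is my own, purely for illustration) that computes the first Thursday in July for any year:

    from datetime import date, timedelta

    def independence_day(year):
        """Return the first Thursday in July of the given year."""
        july1 = date(year, 7, 1)
        # In Python, weekday() numbers Monday as 0, so Thursday is 3.
        offset = (3 - july1.weekday()) % 7
        return july1 + timedelta(days=offset)

    print(independence_day(1776))  # 1776-07-04 (the historical date)
    print(independence_day(2007))  # 2007-07-05
    print(independence_day(2008))  # 2008-07-03

Note that the rule always lands on a Thursday between July 1 and July 7, and in 1776 itself it reproduces July 4th exactly.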

In summary, I believe that we could achieve a significant improvement in our enjoyment of Independence Day, with no loss of patriotic meaning, by redefining the day to match the weekday when the Declaration was adopted in 1776 rather than the calendar date. On the other hand, it seems pretty unlikely to me that this will ever happen.

2007-06-24

Vice President vs President of the Senate

President Cheney has made a very interesting point. (By "president" I'm referring to his role in the Senate.) He is claiming that his office--the dual office of Vice President of the United States and President of the Senate--is immune to restrictions placed on the Executive Branch, because it is "also" part of the Legislative Branch. The scare quotes around "also" signal the thesis of this article: I think that on constitutional grounds, the office of the President of the Senate should be primary over that of Vice President of the United States, and therefore, that Cheney's office is wholly in the Legislative Branch unless the President of the United States should die or pass control under the 25th Amendment, at which point it leaves the Legislative Branch and passes to the Executive.

Historically, that is, in the original unamended US Constitution, we have several indications as to the role of the dual office:

The Vice President of the United States shall be President of the Senate, but shall have no Vote, unless they be equally divided. The Senate shall choose their other Officers, and also a President pro tempore, in the Absence of the Vice President, or when he shall exercise the Office of President of the United States.
Also, the original process of the Electoral College included three indications. First, that the term and election are identical to those of the President of the United States:
The executive Power shall be vested in a President of the United States of America. He shall hold his Office during the Term of four Years, and, together with the Vice President, chosen for the same term, be elected, as follows...
and that the President of the Senate has a unique ceremonial role in the process itself:
The President of the Senate shall, in the Presence of the Senate and House of Representatives, open all the Certificates, and the Votes shall then be counted
and that the Vice President/President of the Senate is the person receiving the second highest number of Electoral College votes:
In every Case, after the Choice of the President, the Person having the greatest Number of Votes of the Electors shall be the Vice President. But if there should remain two or more who have equal Votes, the Senate shall choose from them by Ballot the Vice President.
Finally, the rule regarding succession is laid out:
In Case of the Removal of the President from Office, or of his Death, Resignation, or Inability to discharge the Powers and Duties of the said Office, the Same shall devolve on the Vice President...
That's the unamended US Constitution.

Note that in the original version, this office has three roles in the government: to be President of the Senate, to preside over the Electoral College, and to succeed to the presidency should need arise. The 12th Amendment refines this process slightly, and in a way that strengthens the thesis of the present article. It provides that instead of the runner-up to US President becoming the Vice President/President of the Senate, that person must run and be elected explicitly. It also provides that in case of an Electoral College tie, the Vice President/President of the Senate be elected by the Senate. There are quite a few further amendments that affect the office in ways that are not relevant here.

Therefore, it seems to me quite plausible that this dual office may have been conceived originally as yet another way to balance power between two branches of government. By electing the President of the Senate via the Electoral College, the considerations of the people regarding the chief executive would be maximally respected when the Vice President becomes President of the United States. Furthermore, by creating an office of President of the Senate elected by all of the states, and with a term shorter than that of the other senators, there would be a stronger connection of the people of the United States with the upper house of the legislature, whose members would otherwise be responsible only to their individual states. Therefore, what we see today as a fairly useless position of Vice President was probably intended to be an active President of the Senate, the second most powerful position in the federal government.

The problem is that the President of the Senate, being voteless except in case of a tie, was never given any real power in the Senate itself. Or, to say it another way, the power granted the President of the Senate was taken away by Senate rules passed by senators unwilling to be limited by the power-balancing schemes of the Founders. In addition, because the office of Vice President is much "sexier" owing to its potential power upon the death of the President of the United States, it is that role which has been given strong emphasis by those who hold the office. To my knowledge, there has been no President of the Senate who actually exercised its powers, and this has also caused those powers to wither to the point of irrelevance.

However, those historical facts do not change the US Constitution. If we remember our basic logic, which says that if A is identical to B, then we can always substitute A for B and vice versa, we can see that the office of which we are speaking is a single office: to be the President of the Senate and of the Electoral College, and to be the first in line to succeed the President.

Note that the second in line of succession is, interestingly enough, the Speaker of the House of Representatives. Now, because such a sequence was deemed much less likely, a special form of election was not provided for this office, nor was a second name (such as "Second Vice President") invented, although such a name would be perfectly valid. But in spite of this, there is an undeniable symmetry in the fact that the President of the Senate, the upper house, is first in line, and the President of the House (known as the "Speaker", but the offices are basically identical) is second in line. This symmetry is also strong evidence that the office of Vice President aka President of the Senate is not an executive office at all, but a legislative one.

In summary, it appears that the office of the Vice President aka President of the Senate has been grossly misinterpreted during over 200 years of history. It has wrongly been considered an Executive Branch office, with unfortunate consequences, and the Senate has lost the benefit of a president elected by all of the states. The presidents of both houses of Congress are in the direct line of succession to the presidency of the United States, and therefore, they each deserve certain considerations, such as being fully informed regarding the activities of the Executive Branch. However, they also have strong roles to play in their home Legislative Branch, and therefore when either one of them, the President of the Senate or the Speaker of the House, becomes President of the United States, it must, under the Constitution, be seen not as a promotion of a member of the Executive, but as a transition whereby one of the leaders of the Legislative Branch moves to the Executive to assume new duties and powers there.

The one thing that we can do to try to move things in this, the constitutionally correct direction, has to do with names. This office has two names, "Vice President of the United States" and "President of the Senate". For historical reasons alluded to above, the former is seen as more powerful than the latter, and is the more commonly used. However, the apparent relative power of the two positions is an error: the Vice President has no constitutional role in the daily function of the Executive Branch, but a key role in the Senate. Therefore, we should begin by referring to the Vice President as "President (of the Senate)", as I did at the top of this article. Perhaps it isn't too late to mend this hole in the Legislative Branch, and to remove a pernicious growth from the Executive.

For some additional discussion of this thesis, please go here.

2007-05-23

NAFLA

The North American Free Labor Agreement could be a possible title for a solution to the "immigration problem" in the US.

Here's what I was thinking: suppose that we negotiated an agreement among the NAFTA partners, Canada, the US, and Mexico, that extended the free trade zone into a free labor zone. What would happen?

I'm basing this idea on similar agreements among the countries in the European Union. As I imagine it, negotiations would create a system that would allow properly identified citizens of any of the NAFLA countries to live and work in any of the others. Within certain limits, they would have all of the rights regarding residence and employment enjoyed by the citizens of the host country, as well as the right to travel freely back and forth across the various borders.

Let me mention some of the limits that would be present. First, convicted felons would not have the same degree of freedom as other individuals, and in fact, member nations could require such individuals to go through the same work permit process required for citizens of non-NAFLA nations.

Second, certain jobs deemed critical to the nation(s) could be restricted to citizens of the host country, with exceptions granted on a case-by-case basis. For example, jobs in the defense industry that required a security clearance would probably remain inaccessible to all but a very few foreign nationals.

Third, communities could open their local elections to foreign nationals residing there, but state and federal elections would probably remain limited to citizens of the host country. In other words, it makes sense for all of the residents in a community to have a say regarding community services, schools, local surtaxes, and the like, but not for foreign citizens to be involved in decisions regarding state or national laws or policies.

Another issue is social welfare. If you work or have recently worked for a significant period of time in the host country, or are the spouse, child, or dependent parent or sibling of someone who does or has, then under NAFLA, you would qualify for social welfare just like host-country citizens. However, if you aren't a worker or in the immediate family of one, then you would not qualify, unless a local jurisdiction allowed it. And there are probably other similar limits on non-citizen workers that would all be decided during the treaty negotiations.

As for the process that would result from this, all of the member nations would issue ID cards and other documentation that would, as a matter of law, be recognized by all governments and employers in the other nations. It would probably be necessary to adjust the laws of each country to accommodate this, or to spell out rules to make up for mismatches. For example, employers in the US would be required to provide healthcare coverage comparable to that provided by the Canadian and Mexican governments (or to switch to a single-payer government-administered system, which is unlikely). Retirement or social-security type plans would have to be aligned, and laws regarding the pay-out of funds accrued to individuals residing in another NAFLA member would need to be passed. Rules regarding the length of the workweek, vacations, and other benefits would need to be discussed as part of the treaty negotiations, as would the details of rules regarding social welfare. When mismatches remained, they would have to be well publicized to reduce misunderstanding on the part of workers. This is not to underemphasize the difficulty of these negotiations; they would be lengthy and hard. Compromise would be necessary, and the end result would not please everyone. Still, I think that a successful NAFLA treaty that would be acceptable to the NAFTA nations is at least theoretically possible.

So, what would happen if such a treaty could be constructed?

First, in the current immigration debate, the presence of undocumented workers is muddying the water. Apparently, about 10% of all Mexican nationals currently live in the US, most of them without papers. Under NAFLA, all of these individuals would have the same legal relationship to their employers and to the host governments as citizens. For example, they could be issued US Social Security numbers and have ordinary SS accounts; they could receive provincial health care cards in Canada; they could get Mexican driver's licenses, and so on. Taxes would be assessed on them just as for local citizens. They would have the same rights and obligations under the civil and criminal justice system as citizens. One effect of this would be to increase the number of taxpayers in the US to the tune of many millions. There would no longer be a sizable immigration problem here, but there would be a considerable flow of people back and forth. But it is very unlikely that the flow into the US would be larger than it is now, because of the limitations imposed by the treaty. In other words, it is only a free labor agreement, not a new nationality. There would be new administrative burdens, but there would also be new tax revenues and a more vibrant economy to help support them. Many of the security concerns regarding the leaky border would be eliminated because of the requirement for valid, secure identification of individuals who cross the border. In fact, anyone who crossed illegally would either be an idiot or some kind of criminal. The job of patrolling the border for criminal or hostile activity would be made much easier, because the current noise created by illegal but harmless individuals who are merely seeking work would no longer be a factor.

Second, because of the open border, I believe that the issue of "a path to American citizenship" would become largely irrelevant, because I don't think that the majority of Mexicans would continue to want American citizenship. Given free passage back and forth between the two countries and full rights to live and work on either side, I believe that the overwhelming majority of Mexicans would decide to retain their citizenship, to maintain their familial and cultural relationships, and to keep their identities as Mexicans. In short, the intense economic pressure for Mexicans to become American citizens would disappear. It is true that children born and raised in a host country should have the right to choose citizenship there when they reach their majority; this is currently the case and should not be changed. But absent any legal or economic incentive, it seems unlikely to me that the parents would become citizens, and it also seems likely to me that their children would be more likely to claim Mexican rather than American citizenship when they reach adulthood than under our current system of vicious inequity.

If we could somehow shift the energy currently being poured into debates about the dangers of "illegal immigration" and even the "browning" of America into active negotiations targeting a free labor agreement among the NAFTA partners, I believe that we could accomplish something that would increase security in all three countries, improve all three economies, and be a stabilizing influence throughout the hemisphere and the world.

2007-04-27

Iraq Timetable

There's an old joke: you go up to an attractive member of the opposite sex and ask, "Would you sleep with me for $25 million?" If they say yes, then the punch line is something like, "Now that we've established what you are, let's work out a better price." I think that this might be a useful strategy in attempting to bring the Bush administration to heel regarding withdrawal from Iraq.

For example, would the president sign a bill that had a mandatory withdrawal "within the 21st Century"? "Within the 3rd Millennium"?

I would think that his refusal to sign either of those bills would probably lead to his impeachment. I doubt that more than a handful of even the hardest of the hard-core rightwing Iraq hawks would have the stomach to keep up this struggle for 93 more years.

However, if he did sign, then the political battle, just like in the joke, would move to establishing a better "price", or in this case, a better date for withdrawal.

So, let's pursue this thought experiment: what about 50 more years? 20 more years? Or even ten more years? Would any of those timeframes be acceptable? One shudders to think that George Bush might be unwilling to commit himself to a withdrawal before ten more years of this slaughter have passed. My sense is that probably 20 years, or a generation (actually, 24 years counting those already passed), is about what he would be willing to sign.

But time passes very quickly. Think about how fast the past four years have gone by. In fact, a commitment to a 20 year deadline would actually be a substantive improvement over what we have now.

Furthermore, with a commitment to a time certain for withdrawal, even an absurd one like two decades, many things would become possible. For example, it would facilitate a much more concrete discussion about things like what our goals are in Iraq, the status of American military bases there in the long term, and so on. And, it would completely end the absurd "setting a deadline is 'cutting and running'" theme.

Therefore, I humbly suggest that the Democratic leaders submit a new bill to President Bush that is identical to the one he has said he will veto, but with a much longer deadline for withdrawal, for example, "within 20 years". Not only would it greatly enhance the dialog about the war, but it would also be much more sane than the current endless, rootless killing.

2007-04-21

Social Security Numbers: release them all

The situation with identity theft and Social Security Numbers is getting worse. Recently it was announced that several US government agencies' web sites had been displaying thousands of SSNs for more than 10 years. In response, the pages were taken down, and the government has offered free credit monitoring for the individuals whose numbers were exposed. There have been other cases where even millions of SSNs were jeopardized.

But my question as a random philosophizer is, is the problem here that SSNs are being exposed through malfeasance and/or malefaction, or is the problem that these numbers, part of a system created 70+ years ago for reasons having nothing to do with unique personal identifiers, are being used as critical pieces of personal ID?

I think the answer is clearly the latter. SSNs are useful to identity thieves solely because they are a handy way to tell people apart, and because there is an assumption that they are private.

It is very expensive to protect SSNs, or to pay for credit monitoring when they are exposed. And under the situation we are in today, when one is exposed, it clearly does make the individual who holds the number more vulnerable to fraud via identity theft. I think that the whole approach is wrong-headed.

The random philosopher's plan for dealing with SSN exposure is very simple: the government should immediately open a database on one or more of their web sites listing all SSNs, along with the names and DOB of people registered under the numbers. This should be a public web site, with no restrictions on access.

While it sounds rather absurd, what this would do instantly is remove all expectations of privacy with regard to SSNs. It would not interfere with the administration of Social Security: the numbers would still be perfectly useful as Social Security account numbers, which is all they were designed to be. But they would no longer be useful in any way to help someone prove their identity, which would eliminate the problem of SSN fraud and of identity theft based on SSNs.

A minor advantage of this would be that employers could do some basic checks on employees claiming particular SSNs--name, DOB, gender--using the public database. This would help prevent people from using the wrong SSN, whether through error or deliberately.
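
As a rough illustration of how simple such a check could be, here is a minimal sketch in Python; the registry contents, field names, and the sample number are hypothetical stand-ins for whatever the public database would actually expose:

    # Hypothetical public SSN registry: number -> basic registration info.
    PUBLIC_REGISTRY = {
        "123-45-6789": {"name": "Jane Q. Example", "dob": "1950-01-15"},
        # ... millions of entries in the full public database ...
    }

    def ssn_consistent(ssn, name, dob):
        """Check whether a claimed SSN is registered under the given name and DOB.

        This verifies consistency only; it proves nothing about identity,
        which is exactly the point of making the registry public.
        """
        record = PUBLIC_REGISTRY.get(ssn)
        return record is not None and record["name"] == name and record["dob"] == dob

    print(ssn_consistent("123-45-6789", "Jane Q. Example", "1950-01-15"))  # True
    print(ssn_consistent("123-45-6789", "Somebody Else", "1980-06-02"))    # False

An employer's payroll system could run exactly this kind of consistency check before filing wage reports under a claimed number.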

SSNs could still be used as unique identifiers, something they are extremely useful for, but they just wouldn't be useful as proof of identity. Instead, something more useful would have to be developed for this purpose.

Most people who have studied this problem concur that "smart" codes, along with biometric data of some sort (fingerprints, photos, retinal photos) are much more useful for this purpose. Another element of a reasonable system of identification is to have multiple independent sources of identity, so that even if one source is exposed or contaminated, the other sources would continue to be valid. However, the specification of a viable system of identity is beyond the scope of this note.

In summary: Social Security Numbers were never designed to function as proof of identity, and as a result, inadvertent or deliberate exposure of SSNs is a tremendous problem in our society. The problem with SSNs can be solved overnight if the government simply publishes them all, removing all expectation of privacy from them. This action would have several additional advantages, but its primary effect would be to force bureaucracies that have been misusing the SSN as their clients' personal identifiers to find something better for this purpose.

2007-04-16

Text messaging to the rescue

Today, the deadliest mass shooting in American history happened at Virginia Tech. Over 30 people were killed and many more wounded. The police and university officials are receiving a good deal of criticism, because the shootings happened in an initial, less deadly phase, followed two hours later by a second, horrible slaughter. It seems "obvious" to us that the campus should have been shut down during those two hours, in order to protect the university community.

The university used email and had people calling dorm RAs on the phones, but there are 26,000 students, plus faculty and staff, at that school, and most of them were on their way to campus during the critical two hours. So I ask the question: how in the world were the university officials supposed to implement a warning to tens of thousands of people, in a few minutes?

I think that we have reached a point in our cell phone technology where phones can begin to perform a much larger role in emergencies than they do currently. The cell phone system works by interconnecting a vast network of smaller, local "cells", hence the name. When a phone moves from one cell to another, the old cell drops the identifier stored on the phone's SIM from its database, and the new cell picks it up. As a result, the computers that control each cell always have an up-to-date list of all cell phones, by service provider, that are in range.

Therefore, it seems to me that there is no technical reason why, in the case of a mass emergency such as what happened today at VTU--but also other kinds of terrorism, or natural disasters such as fires, floods, and earthquakes--emergency messages couldn't be pushed out from local cell phone towers to all phones in range.

In the worst case, it would take only a few minutes for a text message to be sent to every phone in range, and in most cases, it would take only seconds. Text messages are highly superior to voice messages for this purpose, because they do not require anything like the bandwidth or time of a two-way voice connection.
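
To make the idea concrete, here is a minimal sketch in Python of the dispatch logic. The tower registry and the send_sms hook are hypothetical stand-ins for whatever interfaces the carriers would actually provide:

    # Hypothetical registry: tower id -> identifiers of phones currently in range.
    # In a real network, each tower's controller already maintains such a list
    # as phones register and deregister while moving between cells.
    TOWER_REGISTRY = {
        "tower-campus-north": {"phone-001", "phone-002"},
        "tower-campus-south": {"phone-002", "phone-003"},
    }

    def send_sms(phone_id, message):
        # Stand-in for the carrier's actual delivery mechanism.
        print(f"SMS to {phone_id}: {message}")

    def broadcast_emergency(tower_ids, message):
        """Push a short text alert to every phone registered with the given towers."""
        notified = set()
        for tower in tower_ids:
            for phone in TOWER_REGISTRY.get(tower, set()):
                if phone not in notified:  # a phone may be visible to two towers
                    send_sms(phone, message)
                    notified.add(phone)
        return len(notified)

    count = broadcast_emergency(
        ["tower-campus-north", "tower-campus-south"],
        "EMERGENCY: shooting reported on campus. Stay away and take shelter.",
    )
    print(f"{count} phones notified")

Because the per-cell lists already exist, the only new machinery is the broadcast step itself; everything else is bookkeeping the network performs anyway.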

If all of the towers serving the VTU campus had sent out one or more messages after the initial shootings, they could indeed have shut down campus and warned virtually every member of the community, or someone standing nearby who could have spread the word.

Well, that's my thought on this topic.

Greg Shenaut

2007-03-20

Considering the role of US military force

The neoconservative movement had at its core the principle that in addition to self-defense, a valid use of American military force is to spread American ideals such as democracy, human rights, and justice throughout the world. The movement has now been thoroughly discredited as a result of the misconceived and botched regime change in Iraq (and to a somewhat lesser degree in Afghanistan). However, I believe that it would be a mistake to throw out that core idea without careful consideration.

In fact, there are many grave injustices around the world, cases where entrenched local governments, extremist insurgencies, or endemic regional conflicts are bringing great harm to millions of individuals, and which endanger the unalienable right of all human beings to life, liberty, and the pursuit of happiness, if I can use that phrase. There have been cases where the judicious application of American military force has made a positive difference, especially when Americans have played a major or leading role in multinational interventions under the auspices of the UN or of NATO. Unfortunately, there have also been many cases where American intervention has created a bad situation or worsened an already bad one, in some cases even more so than in Iraq. How can an idea that seems so obviously good cause so much harm? Are we wrong to think that American might can improve the lives of people living under the yoke of tyranny or sinking into chaos?

There are important lessons to be learned from our adventure in Iraq. I think that perhaps they can all be summarized in two small proverbs: "Don't bite off more than you can chew" and "You can lead a horse to water but you can't make it drink". We didn't--and perhaps couldn't--commit adequate forces to the Iraqi occupation/rebuilding/democratization effort. It may be that we do not possess adequate numbers of men and materiel to do this, but certainly our leaders grossly underestimated the magnitude of the effort that would have been required. Also, the essence of democracy (dēmos "the people", kratia "power, rule") indicates that there are inherent contradictions in an external power attempting to forcibly impose democracy on an occupied people. It is possible to impose the forms of democracy, but how can you impose the habits of thought and the view of the world that must underlie any effective practice of democracy? I don't think it's very easy to do that, and it may in fact be impossible. It certainly hasn't worked in Iraq.

I think that the most important lesson about Iraq and Afghanistan, though, is the importance of using our own democracy effectively. There are two really important things that we did not do, which could have prevented a great deal of trouble. First, in a time of national hysteria immediately after the Twin Tower attacks, we rushed into military action without any attempt by our leadership to seek calmness and an informed discussion of our options. That is, the people, basically out of fear, relinquished their role as the dēmos and submerged their thinking to that of the mob.

Calmer voices were available, for example those who assessed the attacks not as an act of war, but as a particularly horrible and disgusting mass murder by a band of psychotic thugs. There is a real difference between those two perspectives. Of course, the advantage of the act-of-war interpretation is that we could spring into action and let the chips fall where they may. (This had tremendous advantages to a political party determined to cement itself into power, and to a presidential administration determined to inflate the power of the executive.) As has often been said, the army isn't a democracy, and therefore, once the army and its commander in chief took the lead, democracy fell by the wayside.

The much more accurate "band of psychotic thugs" perspective, on the other hand, would have required international cooperation to a much greater degree, and it would have required a slower, more thoughtful approach, with things like evidence, laws, jurisdictions, and so on. As I have blogged elsewhere, there was even some indication that that approach would eventually have compelled Mullah Omar to give us bin Laden and the other al Qaeda leaders, if we had hard evidence (Sharia law doesn't like circumstantial evidence very much, and that is all we had during the few weeks leading up to the Afghan invasion).

But the bottom line is that there was no evidence then that another attack was imminent, and in any case, a thorough investigation--looking for evidence, creating diplomatic connections to get access to international police work--would have been a perfectly adequate way to protect ourselves from another 9/11-style attack. We had time to let things calm down, to allow the dēmos to play their role, both individually and through their elected representatives, and to gather the hard evidence needed to convince the naysayers around the world that al Qaeda were indeed the perpetrators. If after this period of consideration, Mullah Omar had still refused to give us access to the murderers of thousands of our people, and if as a result the American Congress had declared war on Afghanistan, then that war would have been on an entirely different footing. Those who opposed it would have had the clear message that a calm, reflective dēmos, which included them, had considered the matter fully and openly, and had adopted a certain clearly specified and limited course of action. This message would also have been clearly understood around the world, and most importantly, by the Mullah. I personally believe that we could have broken up al Qaeda and jailed its leaders without any significant military action, had we followed that route. But we didn't.

The second thing we didn't do, to some extent with Afghanistan, but to a very large extent with Iraq, was to give the dēmos access to the information they needed to make an informed and rational decision. In fact, access to information was limited selectively, to paint a picture that was very inaccurate, and to mislead the people into backing military action in a direction and to an extent that was miserably mismatched with American and Iraqi interests. I don't need to belabor this, because most people have now understood it.

There is an obvious connection between these two points: a calm and involved dēmos can do nothing without free access to the best possible information, and good information is wasted on a hysterical public. It should be one of the foremost roles of government in America to ensure that the people are fully and accurately informed, including information that may not agree with the government's agenda. Furthermore, it should be a central principle that whenever possible, the people as well as Congress be involved in important decisions such as the use of military force, and that their involvement not be detoured by hysteria or demagoguery. For example, government should not fan hysteria, but calm it, whenever possible. (Don't get me wrong: there are emergencies that require immediate response by our military and that cannot wait for calm reflection by the dēmos. But this wasn't the case in Afghanistan, and definitely not the case in Iraq.)

What am I saying here that pertains to the neoconservative notion that America can use her military might to enforce and promote American ideals?

Well, I think that we should consider the principle itself, that is, whether America should ever go to war in the cause of justice and democracy, in the absence of an imminent military threat. By consider I mean that this principle should be the focus of a national debate, in the media, on the Internet, and in Congress. I believe that once all possible sides of the debate are heard, an overwhelming majority of Americans would most likely agree in principle that such an action could be justified, but only under a certain very constrained set of circumstances. It is quite odd that in spite of the fact that America has gotten involved in that manner over many decades, above all during the Cold War in both overt and clandestine actions, this debate has never taken place in a meaningful manner. In my opinion, its importance rises to the level of a Constitutional Amendment. I admit the possibility that the result might be other than what I expect, and I could live with that if the debate were honest and thorough.

However, assuming that the dēmos decided that America could, in principle, use its forces in that way, then the next and even more important step would be to implement a set of supporting rules and facilities, and then to make a decision about specific instances where our military could lend a hand.

First and foremost, we should never make a move unless we are invited by the victims of violent injustice to intervene. Furthermore, the people who invite us must be willing to participate before, during, and after our involvement. Once we have been invited, there must be a period of research, the information must be published for public consumption, and there must be a public debate, and a debate in Congress. There must be an official, constitutional Declaration of War by Congress (or an alternate form if one is created by an amendment as mentioned above). Our involvement must be limited, what the French call a coup de pouce, a timely nudge. It could be large or small in terms of expenses and personnel, but it must always be clearly limited. If things start to go wrong, as they often seem to do, the limits could always be extended, but only after the same kind of honest and thorough public debate as during the initial declaration. Once there has been a national debate followed by a Congressional declaration, then at that point the military force of the nation will be led by their commander in chief to implement their mandate, with little if any further dickering by either the people or Congress (although it is Congress who will have established rules guiding and constraining the military). That is, the people and Congress need to debate the various issues before the flag goes up, and, once they have committed the military, no "micromanaging". On the other hand, if things go wrong and the people lose their appetite for some particular intervention, and time runs out on the declaration, then the military will just have to withdraw, doing so in a manner that reduces the deleterious effects as much as possible.

Anyway, this is all just an attempt to sketch a possible framework where neoconservative-style interventions could be done in a manner consistent with American principles. I think that the interventions we have done recently, in the absence of such a framework and in contravention of the dēmos, have been disasters. I think that our best course, until we can go through some kind of process along the lines I have outlined, is to stick with what has already been established in our Constitution: Congress must declare war, and decisions by the president to go to war without a declaration must involve true emergencies. I think that a fairly good model can be found in Article VI of the Articles of Confederation, which discusses the exigent circumstances under which an individual state can go to war:

No State shall engage in any war without the consent of the United States in Congress assembled, unless such State be actually invaded by enemies, or shall have received certain advice of a resolution being formed by some nation of Indians to invade such State, and the danger is so imminent as not to admit of a delay till the United States in Congress assembled can be consulted; nor shall any State grant commissions to any ships or vessels of war, nor letters of marque or reprisal, except it be after a declaration of war by the United States in Congress assembled, and then only against the Kingdom or State and the subjects thereof, against which war has been so declared, and under such regulations as shall be established by the United States in Congress assembled, unless such State be infested by pirates, in which case vessels of war may be fitted out for that occasion, and kept so long as the danger shall continue, or until the United States in Congress assembled shall determine otherwise.
The key concepts are that "the danger [must be] so imminent as not to admit of a delay till the United States in Congress assembled can be consulted", and that the response must go on "so long as the danger shall continue, or until the United States in Congress assembled shall determine otherwise". (I admit that the context is fairly different, but I believe that the concepts are still relevant.) This is not what we have seen either in Afghanistan or in Iraq. The resolutions for the use of force that were issued by Congress contained so many hedge words and conditionalities that they were almost useless except to eliminate Congress and the dēmos from the process. Basically, what we have is a kludge: an attempt to force the system we have, designed in the 18th Century (note the arcane phrases "some nation of Indians" and "infested by pirates" in the Articles of Confederation extract quoted), to serve the goals articulated by the neoconservatives of engaging militarily in the name of democracy and justice. It was a huge misstep to try to create, on the fly, a framework for such a drastic new role for America.

In summary, I think we need to start the debate about the underlying principle immediately. We need to decide once and for all, as a nation, whether and how our military forces will be used. The result has to be comprehensive and specific legislation, preferably one or more constitutional amendments. Until this happens, we need to disengage as cleanly as possible from all military interventions short of declared warfare or responses to true emergencies.