Tuesday, January 31, 2006

Live by the Boycott...

The small, ordinarily non-troublemaking country of Denmark is in hot water because one of its newspapers printed a series of cartoons depicting the prophet Muhammad. Many Muslims all over the world found them grossly offensive. (Not without reason, as some of them are grossly offensive, not just to Muslims but to anyone who objects to mocking other people’s religious beliefs.) The newspaper published them, its leaders said, to underline the importance of freedom of expression. In particular, it was a response to the inability of a Danish author to find anyone willing to risk violent retaliation if they illustrated the author’s book about Muhammad.

What makes this of interest to me is the reaction of Arla, a company based in Denmark that is a major producer of dairy products. According to a story on the company’s own web site, it is now the subject of a boycott by consumers in Saudi Arabia, Kuwait, Qatar, Bahrain and the United Arab Emirates. Its leadership naturally views this as unfair, with one PR official telling the BBC that the food company is not the newspaper and lamenting that it must pay for the newspaper's deeds. The boycott will also be very costly: Saudi Arabia is the firm’s most important non-European overseas market, and Arla's products have apparently vanished from the shelves in many Persian Gulf countries.

But ironically, it appears that Arla itself may be a longtime practitioner of this type of “unfairness” – it is hard to say for sure yet. According to its annual report, it has operations in Saudi Arabia and the UAE. But it does not have any sales or operations of any kind in Israel, which is by Middle Eastern standards a relatively small but very wealthy market. A search of its website using the term “Saudi” yields over 50 results, and a search using the term “Israel” yields none. This refusal to do business in Israel may be because of the longstanding practice of Arab countries (with the exception of Jordan and Egypt since their peace agreements) of not allowing companies that trade with Israel to trade with them. The Arab League’s official boycott of Israeli imports and of companies that contribute to Israel’s military and economic development is now often not vigorously enforced, but the legacy – a reluctance to do business with Israel for fear of offending Arab nations – remains.

To be fair, there are other possible explanations for Arla's absence from a small but wealthy market. One might not expect dairy products to be a big import for a mostly Jewish state. However, most Israelis are secular, and the International Dairy Foods Association complains about its inability to get greater access to the Israeli market because of Israeli protectionism. Israeli firms themselves certainly produce a large number of dairy products, and intentional avoidance of the Israeli market by foreign firms for fear of alienating Arab consumers has historically been a common practice.

There is no obvious moral objection to a voluntary boycott as long as one doesn’t believe that what is being boycotted is itself moral, although a boycott enforced by a government is a different matter. (Indeed, U.S. law actually punishes U.S. firms that participate in the Arab League boycott of Israel.) But there is a certain comic justice in seeing a company that appears to have unofficially embraced one boycott being damaged by another, and lamenting the injustice of it all. For business reasons, they appear to have elected not to sell to customers in Israel who were willing to pay. But tending to some short-term bottom-line concerns turns out to affect the bottom line in unexpected ways.

Monday, January 30, 2006

Palestine, the Supplicant Society

With the victory of Hamas in the Palestinian elections there are already reports of unrest by armed gangs affiliated with the defeated Fatah movement (the core of the old Palestine Liberation Organization), and predictions of more. To understand why the Fatah gunmen and cronies have so much to lose it is important to understand what kind of government is in charge of the Palestinian pseudo-state. The table below shows the ten nations and territories (out of 160 for which data are available) with the highest ratio of government consumption spending to GDP, according to the World Bank:




Country              Govt. consumption/GDP (%)
Namibia              23.90
Kuwait               26.43
St. Lucia            26.95
Sweden               28.05
Antigua              28.30
Israel               31.17
Botswana             33.28
Suriname             34.54
Eritrea              43.73
West Bank & Gaza     51.74


Government consumption spending is spending for immediate purchases of goods and services. It does not generally include welfare-state transfers, and is the most discretionary type of public spending. The Palestinian territories are thus the biggest nanny state in the world, by a substantial margin. Much of this money comes from foreign governments – Arab states, the EU and the U.S. in particular. Much of it in turn is spent on pure pacification of Palestinians made restless by a dysfunctional economy. Astonishingly, according to the remarks of outgoing Palestinian economy minister Mazen Sinokort paraphrased in the LA Times and The International Herald Tribune, public employees are the “breadwinners for 30 percent of Palestinian families.”
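The gap between the Palestinian territories and everyone else is easier to see with the shares ranked. A minimal Python sketch, using only the figures quoted in the table above:

```python
# Government consumption as a share of GDP, as quoted in the table
# above (World Bank figures cited by the post).
ratios = {
    "Namibia": 23.90,
    "Kuwait": 26.43,
    "St. Lucia": 26.95,
    "Sweden": 28.05,
    "Antigua": 28.30,
    "Israel": 31.17,
    "Botswana": 33.28,
    "Suriname": 34.54,
    "Eritrea": 43.73,
    "West Bank & Gaza": 51.74,
}

# Rank from the highest share of GDP to the lowest.
ranked = sorted(ratios.items(), key=lambda kv: kv[1], reverse=True)

for country, share in ranked:
    print(f"{country:<18} {share:5.2f}")
```

The top entry exceeds the runner-up, Eritrea, by about eight percentage points – the “substantial margin” referred to above.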

Palestine is thus now perhaps the world’s limit case of what Anne Krueger once called the “rent-seeking society,” a society where who prospers and who does not is mostly dependent on who has an in with the bloated state ministries. The eviction of Fatah means that all of those on the PA’s foreign-funded swollen payrolls now risk losing the ability to feed their families. All too many developing countries have fallen into the rent-seeking trap, where people need to curry favor with the state to survive and the state’s decision-makers in turn expand their control over society to give them more leverage to extract bribes. But the difference between the PA and, say, Zimbabwe is that the world has had too much emotional investment in the Palestinian enterprise to let it fail. The economic dysfunction resulting from such a society, in other words, was kept at bay by a massive inflow of foreign funds. Good money has been poured after bad, and the result has become a society of supplicants, dependent first on Arafat and now on Fatah more generally for support.

If, as seems most probable, Western support for Hamas dries up, Hamas will have only two choices – further shakedowns of other Arabs to replace Western aid, or coping with the widespread violence that will result when the foreign-aid pipeline and the slush funds it empties into dry up. Of course, it is possible that the West and particularly Europe will find a way to rationalize continued spending on a cause whose emotional hold is powered by guilt over (and maybe belief of) the Left’s portrayal of Israel as simply another manifestation of Western colonialism. But German chancellor Angela Merkel, whose country is caught between belief in the rightness of the Palestinian cause and the embarrassment (unique perhaps to her country) of funding a Palestinian government now officially committed to the destruction of the Jewish state, has so far indicated that such funding is untenable. If the foreign largesse that has for some time been the only thing standing between the PA and fiscal collapse dries up, expect the rent-seekers cut loose to take it very, very badly.

Thursday, January 26, 2006

The Incumbency Conundrum

R. Emmett Tyrell, Jr., who is a national treasure, has an absolutely hilarious column today on the rudderless performance of Sen. Edward Kennedy during the Judge Alito confirmation hearings, and indeed throughout his recent Senate career. It prompts the very obvious question of why the voters of Massachusetts keep returning a man who is widely acknowledged to drink to excess (and who, by the way, killed a woman) to office term after term. Nothing illustrates how difficult it is to dislodge incumbent politicians better than the continuing presence in office of the distinguished senior senator from the Bay State.

One of the most insidious trends in modern American governance is the growing advantage to incumbency. The American constitutional experiment relies on three prongs to maintain ordered liberty – separation of powers, federalism and elections. The first is subject to ebbs and flows over time, with the President’s authority rising and falling with national crisis, and recent decades having seen the judicial branch more and more being the arbiter of our disputes. The second is also eroding, mainly due to the substantially increased role of the federal government in regulating economic life and redistributing income; its expanded power of the purse also allows it to impose its will on the states by threatening to withhold funds.

But the competitiveness of elections has been in decline for some time. Incumbents in the U.S. House are routinely re-elected at a rate in excess of 90% (over 95% in recent years), and even in the Senate, where incumbents do not benefit from favorable district-drawing, re-election rates are well over 50%. (I have seen no data, but would be surprised if the same were not true in state legislatures.) While unalloyed democracy is by no means sufficient for the protection of liberty, voting serves as one tool among several for restraining a government that has overstepped its bounds, and so the decline of competitive elections is cause for concern.

Why has it happened? In the economic model of democracy, incumbents carry several advantages. Their marginal fund-raising costs are lower, because most of their search costs of finding likely donors are sunk in the course of getting elected the first time. They benefit from subsidies in the form of mailings to constituents and so on. Precisely because they are already elected, incumbents are on average better candidates than challengers. (Statisticians call this selection bias.) And once in office, particularly if they are on important committees or in the majority party, they may allocate goodies to their constituents (although one district’s benefits are someone else’s costs). Incumbents in power probably have cost advantages in bargaining with state legislators who redraw districts after every census at both the state and the federal level.

Some of these effects are clearly more or less constant over time. However, improved technology has made redistricting a much more scientific enterprise. And the rising cost of elections accentuates the advantage of incumbents, and perhaps more critically, incumbents have used outrage over previous scandals to enact entry barriers to potential challengers by restricting fundraising. Given incumbents’ experience, they will always benefit both from shaping the fundraising rules to begin with and having more experience with a donor base that can be made to conform to the new rules. And it is also possible that incumbents are reelected at a higher rate because better polling technology has allowed those who are likely to lose to know ahead of time and hence not run. But the first chart in this study indicates that is unlikely. The rate at which incumbents choose to run in the U.S. House (as of 1994) and overall reelection rates there are as high as they have ever been.

If we accept that incumbents should have to compete for votes, two factors that are mechanically if not necessarily politically easy to change are redistricting and campaign finance. Many advocate “nonpartisan” redistricting, but this is probably a fantasy. There are very few politically knowledgeable people who are “nonpartisan,” and partisans (which is to say, most voters) would probably not trust the results in any event. This may be why the referendums proposing such redistricting, so popular with “good government” types, often lose.

Campaign finance is another matter. In most markets, low-quality producers are displaced through entry. Entry has costs in any market, but there are few legal restrictions on raising funds to open a restaurant or produce a better computer. But political markets are different. Potential entrants must navigate a maze of fundraising restrictions, including the recent and very complex McCain/Feingold legislation. These entry barriers stifle political entrepreneurs. People forget that the late Eugene McCarthy was only able to drive Lyndon Johnson from the race on an anti-war platform because at a critical juncture he got funding from a handful of wealthy men like Stewart Mott and Jack Dreyfus, Jr. Such an insurgent campaign would be impossible today. Despite the hysteria of campaign-finance “reformers,” money – willingness to pay – is how market participants make their views known as surely in political markets as in any other.

And as many have predicted, McCain-Feingold is actually a link in a chain going at least back to 1974, when other “reforms” were enacted in the wake of Watergate. Sen. McCain has pursued the white whale of elections without special interests (and we are in the end all special interests) for years, mainly because he took offense at being caught up in the Keating Five scandal, when five U.S. Senators, including Sen. McCain, were accused of trading favors for money from savings-and-loan fraudster Charles Keating. If you want some chilling reading, look at this prediction about the direction restrictions on speech designed to dislodge incumbents will take in the wake of the Supreme Court’s upholding of McCain-Feingold. More campaign-finance restrictions mean less speech, which means greater incumbency advantages, which means less accountability, which means less freedom in the end. Allowing politicians to set the legal rules of political competition is like allowing Microsoft to set the legal rules of software competition.

Throw the bums out? The polis is increasingly too enfeebled even to lift them off the ground.

Wednesday, January 25, 2006

Tribal Rent-Seeking

Today’s Wall Street Journal has a front-page article (available online via subscription only) called “Textbook Wars: Religion in History.” It describes how various religious groups (Muslims, Sikhs, Jews and Hindus are cited in particular) are pressuring state textbook-adoption agencies to change unflattering descriptions of the histories of their groups. For example, a group lobbied to change a statement in a textbook about ancient Hindu India from “men had many more rights than women” to “Men had different duties…as well as rights than women.”

This is an example of a worrying, growing problem that I have noticed for some time – tribal rent-seeking. Generally, when people think of rent-seeking (or, in James Madison’s old terminology, “factions”) – in other words, the exertion of (often very costly) political pressure to obtain special privileges from the government – we think of purely monetary interests. We think of labor versus management, of domestic producers seeking protection versus consumers, of drug companies, of doctors and so on.

But tribal groups – those based on religious, ethnic, or linguistic identity and, increasingly, sexual orientation – try more and more to get the government to subsidize their ethnic capital – the public perception of their history, their children’s language skills, their ability to obtain rewards strictly on the basis of group identity. This takes different forms in different countries. In the U.S., struggles over textbooks are far from the only example. (Attempts to enforce politically correct speech via cultural rather than state censorship might be another.) In India, the so-called Scheduled Castes and Tribes and the unapologetically termed Other Backward Classes benefit from an immense array of reservations (a severe form of what Americans call “affirmative action”), which have predictably lasted much longer and come to benefit far more Indians than their naïve founders originally predicted. In Belgium, there is conflict over the distribution of resources between Flemish and French speakers. (Soon, if these guys have their way, Belgian Arabs will join these fractious rent-seeking battles.)

This is unfortunate because, while ethnic identity can be shaped by the broader society (“white” and “black” Americans in the 19th century used to distinguish between different percentages of “blackness” – mulattoes versus quadroons versus octoroons), for a given generation it is something unalterable. Apart from religious conversion, you cannot change tribal identity. Social conflict based on these grounds is thus likely to be the most intractable. If the government is unfair to accountants or farmers, the children of these groups always have a chance to try something else. But one can’t really become un-Chinese or un-Arab. The more tribal groups turn to the state to mediate their differences, the more tribal conflict there is going to be. Tribal characteristics are harder to downplay as identity sources than other forms – there is a reason ethnoreligious warfare has been far more common and costly throughout history than class warfare – and so having the state reinforce them is a bad idea. Tribe then becomes destiny, the sole principle for explaining why society is the way it is. Absurdly, just today the BBC reports that a group in France with links to French nationalist groups has begun serving soup with pork in it to homeless people, and calling the dish "Identity Soup." In their estimation, to be French is to abandon Muslim or Jewish religious beliefs, while the first response of the French "anti-racism" activist Bernadette Hatier is to ask the French government to ban the giveaways. The gap between the public and private sphere becomes obliterated, and tribal conflict grows in both.


What to do? A rent-seeking approach to tribal conflict has not in my judgment been given the attention by scholars it deserves (although I am trying to remedy that in my research and on this blog in entries such as this one). But some basic postulates of that literature suggest themselves. First, it is imperative to encourage voluntary cooperation where possible. The more people trade, the less they fight, and so the less burdened the economy is by the dead hand of taxation, regulation and so on, the greater the opportunities for transacting across tribal lines, and the less urgently people run to the state to protect or subsidize their tribal identities. Second, decentralize. If tribal groups are geographically confined, don’t force people in other regions to subsidize tribal-specific capital. This is the approach Switzerland generally takes: the French, Italian and German communities are in charge of (and fully finance) their own education (an area where the subsidy of tribal capital looms large), with the central government confined to truly national tasks – defense, law enforcement, funding the welfare state on tribal-neutral terms, etc. The more a person has to pay to reinforce or raise the returns to someone else’s tribal identity (and therefore lower the relative returns to his own), the angrier he is going to be. More heavily interventionist societies such as many of those in Europe probably have more to worry about in this regard than Americans do, but as the article that began this piece suggests, we have our problems too – bilingual education, for example. Finally (and this is hardly a novel observation), encourage assimilation, not multiculturalism.
“Assimilation” does not mean making everyone into white-bread Episcopalians, but it does emphasize the notion of a common social identity, where investments in and drawing on tribal capital are fine within the home and other arenas of voluntary exchange, but in the public square – in the laws and in their making – the universal features of citizenship are the currency of the realm.

Monday, January 23, 2006

Big Oil as Big Jackpot

The New York Times reports with the usual high dudgeon (as high as dudgeon can afford to be, at any rate, in nominally objective news reporting), that Big Oil is in essence ripping off the taxpayers by unfairly calculating what it “owes” them in royalty payments when it leases drilling rights:

At a time when energy prices and industry profits are soaring, the federal government collected little more money last year than it did five years ago from the companies that extracted more than $60 billion in oil and gas from publicly owned lands and coastal waters.

If royalty payments in fiscal 2005 for natural gas had risen in step with market prices, the government would have received about $700 million more than it actually did, a three-month investigation by The New York Times has found.

But an often byzantine set of federal regulations, largely shaped and fiercely defended by the energy industry itself, allowed companies producing natural gas to provide the Interior Department with much lower sale prices - the crucial determinant for calculating government royalties - than they reported to their shareholders.

As a result, the nation's taxpayers, collectively, the biggest owner of American oil and gas reserves, have missed much of the recent energy bonanza.


The piece cavalierly assumes, without any evident introspection, that the purpose of oil leasing is to maximize the money flowing to government coffers. It is replete with phrases like “losses to taxpayers” – as if “taxpayers” had any claim on the profits that come from the risk-taking that goes on when crude oil buried under thousands of feet of water is transformed into gasoline at the pump available at the prevailing price for use in your car. (Leave aside that when the press is reporting some benevolent use of taxpayer funds, the benevolence never comes from “the taxpayer” but from “the government,” while here it is the former rather than the latter deprived of their just shares of higher oil prices.)

The drilling program apparently allows oil companies to use a highly complex set of rules to calculate the value of the oil they received, and then fork over to the government some fixed share of that value. The rules are as complex as they are because of some long-since-forgotten interest-group struggle involving oil companies who wanted to fork over as little as possible to the government and advocates for extracting money from the oil companies to fund their pet spending causes.
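The arithmetic at the heart of the Times's complaint is simple, whatever the complexity of the rules. A hypothetical sketch – the royalty rate, gas volume and both prices below are invented for illustration, not figures from the article:

```python
# Hypothetical illustration of how a lower reported sale price
# shrinks the royalty owed. All numbers are invented for this
# sketch; none come from the Times story or any actual lease.

ROYALTY_RATE = 1 / 6     # assumed fractional royalty rate (illustrative)
volume_mcf = 1_000_000   # thousand cubic feet of gas sold (invented)

market_price = 8.00      # $/mcf, the price told to shareholders (invented)
reported_price = 6.00    # $/mcf, the price reported to Interior (invented)

royalty_at_market = ROYALTY_RATE * market_price * volume_mcf
royalty_at_reported = ROYALTY_RATE * reported_price * volume_mcf

shortfall = royalty_at_market - royalty_at_reported
print(f"Royalty at market price:   ${royalty_at_market:,.0f}")
print(f"Royalty at reported price: ${royalty_at_reported:,.0f}")
print(f"Shortfall to the Treasury: ${shortfall:,.0f}")
```

Because the royalty is a fixed fraction of the reported value, every dollar shaved off the reported price cuts the government's take proportionally – which is exactly why the reported price is "the crucial determinant" in the quoted passage.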

But why do oil company workers, shareholders and people who use a lot of oil have any unusually large responsibility to fund government? It is not clear to me why high oil prices should translate into more resources available for funding whatever government projects can currently command the most political support. That is a basic violation of the rule of law as Hayek understood it – the idea that, whatever the proper sphere of government is, we all bear equal responsibility for funding it. (This is a specific example of what I once more generally called the Money Tree fallacy.)

I propose an alternate objective to guide us in deciding how to allocate oil-drilling leasing rights – the maximization of gains to trade for oil buyers and sellers. Assuming that the government should be in the position to decide who gets to drill offshore to begin with (more than debatable, but possible if property rights in the ocean are sufficiently costly to establish and enforce), this suggests that the rights should simply be auctioned off. Once those fees have been paid, the government should then leave oil producers alone to work their magic, whatever the prevailing market price of oil happens to be today. When it is lower than expected under an auction system, oil companies will be left holding the bag for having guessed wrong, but if it is high, they get rewarded for risk-taking and guessing right. That, one supposes, will be an incentive to try to guess better. It will also avoid the situation we are in now, where the oil companies are simply a giant grab bag to fund everyone’s special programs. But, oil companies being what they are in the public mind, the politics of that are much more complicated than the economics and the morality.
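The risk allocation under such an auction can be sketched in a few lines. The bid, output and price figures below are all invented for illustration:

```python
# Hypothetical sketch of the auction proposal: the firm pays a fixed
# up-front lease fee (its winning bid), then keeps whatever the oil
# turns out to be worth. All figures are invented.

def expost_profit(upfront_bid, barrels, realized_price, unit_cost):
    """Producer's profit once the lease fee is sunk. The government's
    take is fixed at the bid, so price risk falls entirely on the
    producer."""
    return barrels * (realized_price - unit_cost) - upfront_bid

bid = 50_000_000        # winning auction bid for the lease (invented)
barrels = 2_000_000     # recoverable output (invented)
unit_cost = 30.0        # extraction cost per barrel (invented)

# If prices come in low, the firm is left holding the bag...
low = expost_profit(bid, barrels, realized_price=50.0, unit_cost=unit_cost)
# ...if they come in high, it is rewarded for guessing right.
high = expost_profit(bid, barrels, realized_price=75.0, unit_cost=unit_cost)

print(f"Profit at $50/bbl: ${low:,.0f}")
print(f"Profit at $75/bbl: ${high:,.0f}")
```

Note that at the low price the firm books a loss on the lease, while the government's auction revenue is the same either way – which is the point: the Treasury's take is settled up front, and the gamble belongs to the bidder.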

Thursday, January 19, 2006

The State as Ultimate Problem Solver

"I love Ellen because she is going to do so much for us. With Ellen, anything is possible."
- Pandora Matati, former Liberian warlord gang member

“Ellen” in this instance is Ellen Johnson-Sirleaf, the new president of Liberia. Not without reason, the press has focused on the importance of her being the first female head of state in Africa. But the sentiments expressed by Mr. Matati are to me of far greater interest.

The economist Daniel Klein has written a marvelous essay in the Independent Review called “The People’s Romance: Why People Love Government (As Much as They Do).” (It is available here.) In it, he explores the psychological underpinnings of the desire of people to have the state tackle problems, even if they observe empirically that the state is likely to do a poor job. He attributes this to a psychological imperative to be part of a high-minded collective activity, and says that advocating individual choice will always seem inferior because it ignores this instinctive drive to be part of something bigger and more noble.

The interplay between cognitive psychology and political behavior is still in its infancy, but I think another result is also tenable. That is the one expressed in the remark above – a deep faith, especially but not exclusively in societies emerging from years of tyranny, that government is the Great Problem Solver. I remember that shortly before the first elections in Indonesia after the Suharto dictatorship was overthrown, the BBC interviewed an impoverished street peddler about what she thought about the election. She was almost delirious with excitement, confident that the replacement of Suharto with a democratically elected leader would make her life immeasurably better.

Her answer has always haunted me, because it is indicative of the way we profoundly believe, and indeed want to believe, that government is the way we harness our best instincts for The Public Good. In the U.S. at least it was not always thus. In the founding generation there were schools of thought that echo even now. There was the Jeffersonian vision, which insisted not just on the rights of man to be free – try to take them, if you dare – but on the free society as the good society. The free society would allow, after all, citizens to “pursue happiness.” In modern times some strains of libertarian thought argue in favor of market solutions on these pragmatic grounds – because they work better, and mean that ultimately people are happier.

But then there is the dismal view, with Madison perhaps the best-known proponent. The function of the checks and balances is not to enable happiness but to protect us from one another, and from our government. This has very strong echoes in the modern public choice movement, which has built a vast scholarly foundation for Madisonian ideas by arguing that the people who work for the state are no worse than those they govern, but neither are they any better. Given power, they will use it in their self-interest. Since their power (buttressed ultimately by their power to tax, to imprison and their control of the military) is far greater than that of the citizenry, restraining the state so that it finds it difficult to break out of the fence imposed between the tasks only it can do and those that it must not do becomes imperative.

But this vision, consistent with the conservative view of man as always prone to causing trouble given the opportunity rather than as the raw material for “social progress,” has run aground in recent years. Particularly in developing countries but in the advanced ones too, government has become such an omnipresent force – where so many of us turn for our retirement income, to adjudicate disputes with our fellow citizens, to decide what is ethical and what is not in biomedical research – that we naturally turn to it when we see problems in need of solving. The press is the worst in this regard. If people are living on the streets, or a medicine is tied to adverse side effects, or something else in a complex world full of tradeoffs is not ideal, the reporter always first calls some assistant under-secretary of something or other and asks him what the government is doing about this problem, not questioning whether that is a problem the government should be trying to do something about.

But the empirical record of government as problem-solver does not inspire confidence. And so people like that Indonesian peddler and the retired Liberian soldiers will quickly learn that government as it is, rather than as they imagine it, will not in fact solve all of their problems, but will simply reshape them. A loss of confidence in democracy, rather than in the ability of the state to substitute for voluntary cooperation as a way to resolve problems, will follow quickly. In the advanced nations, particularly in Europe, where government is the provider of health, welfare, and protection from all of life’s nastiness, there is a growing crisis of confidence in government institutions. In other societies (Venezuela, e.g.) there is a supine willingness to transfer power to the Big Man so that he can get things done. When the Big Man fails, something else has to be tried, and the historical record of what happens after that moment arrives does not inspire confidence.

To be fair, Ms. Johnson-Sirleaf is a vast improvement over what came before her – the warlord president Charles Taylor. But that is not setting the bar very high. Ironically, the more she is asked to do, the worse she will do it, and Liberians, if they see their leader as savior, will find themselves no closer to a decent society than the day she took office. There is a lesson in that for all of us.

Tuesday, January 17, 2006

China: The Pot Boils

For the second time in the last six weeks the Western press has reported on a very unusual event in a totalitarian society: a public protest, culminating in violence against the protesters. The society is China, and in this instance dozens were wounded and at least one person killed in fighting between protesters and security forces. On Dec. 6 it was the village of Dongzhou, where clashes resulted in perhaps thirty killed. (A roundup of media reports from the latter is here.) In neither case was the global media allowed much access, but the times being what they are, word tends to get out. (Even Kim Jong-Il can’t keep a secret anymore.)

There is much buzz over the idea that undeniably growing rural unrest over land thefts and environmental damage threatens the Chinese miracle and even the integrity of the vast Chinese state. But historical perspective is helpful. Almost every nation that goes through the sort of transformation China is witnessing also endures tremendous upheaval. Areas emanating farther and farther out from the cities are put in play, long-established social traditions are overwhelmed by the machinations of the new robber barons, and in the end some people make out like bandits while others are swept aside in the interests of modernization. That is what we are seeing in China now – the unavoidable disenfranchisement of groups of people whom the Communist Party historically favored, but whose continued maintenance would upset the remaking of China into a prosperous world power. The peasantry is eminently expendable in a tradeoff such as that, just as it was during the early industrialization of the UK (think of the enclosure movement, which disenfranchised the peasantry and was widely seen as unjust but ultimately unavoidable) and continental Europe. The proletariat in the latter made its stand in 1848, but progress was ultimately irresistible. That the U.S. avoided much of this turmoil during its own modernization is probably a function of easy access to land during those years and a relatively constrained state, which limited the incentive to resort to redistributive rent-seeking in lieu of entrepreneurial gambles.

China is going where Britain, the U.S., Europe, Japan, Taiwan, Korea, Singapore, etc. went before it – the radical remaking of society to enable a dramatically higher standard of living. It is true that history’s castaways in the countryside (justifiably) feel cheated, as their predecessors did in other places, but it is also true that urban Chinese are seeing a dramatic increase in their standards of living and control over their own lives. And that will be the dominant effect on Chinese decision-makers and on any support for a movement (difficult in a police state to generate in any event) that proposes a radical redistribution of rights and privileges. Because of endemic corruption and the bad economic decisions that flow from the hidden information it creates, China may be due for a major short-term correction. But while the whole process of building an East Asian miracle in a totalitarian state is without precedent, it has now been going on for 25 years and is firmly entrenched.

Despite the short-term difficulties (and recognizing that there are some long-term issues that could cause everything to break down, problems such as the growing impact of sex-selection abortion and even infanticide on gender balances), the most relevant historical precedents make me on balance optimistic that the Chinese miracle will, in fits and starts, see its way through to the finish line. The most relevant precedents are the ability of a totalitarian state to ultimately crush even widespread dissent (Cuba, Hungary, Czechoslovakia), the benefits to the winners overcoming the damage to the losers in the political-pressure process (in almost every society that has modernized), the ability of East Asian nations to remake themselves in just a few decades, and the absence (with only one potential exception of which I am aware, Indonesia) of counterexamples of societies that come as far as China has and then see it all fall apart. That is the way to think about China’s boiling pot – as a normal episode on the way to better things. The effects it has on geopolitics, on oil demand, etc. are another matter, but we had best start preparing for it, because historical precedents both near and far suggest that in the long run the Chinese transformative miracle is for real, whatever the papers are saying right now.

Monday, January 16, 2006

Segregation, the South and the State

Slavery, it is sometimes said, is America’s original sin. A society conceived with the idea of fleeing a corrupt continent found, within a few years, that it had brought over and even expanded some of that continent’s worst flaws. The federal holiday in honor of Dr. Martin Luther King, Jr. commemorates a man who was perhaps the most important figure in ending legalized separation of the races. But how the U.S. came to need a martyr such as King is not a story that is as well understood as it should be.

The 13th, 14th and 15th amendments to the U.S. Constitution were passed quickly after the Civil War, and were designed to confer on freed slaves full legal equality. But very quickly after Appomattox white racist opponents of interracial commerce in the South moved to prevent blacks from obtaining control over property, engaging in contracts and taking advantage of the other aspects of economic freedom that are so critical to giving people control over their own destinies. The years 1865-8 saw the enactment of what came to be known as the Black Codes, laws that often began by promising that “freedmen, free negroes and mulattoes” would have full contracting and property rights, before going on to carve out massive exemptions disadvantageous to blacks. Here are excerpts from Mississippi’s:

Section 5. Every freedman, free negro and mulatto shall, on the second Monday of January, one thousand eight hundred and sixty-six, and annually thereafter, have a lawful home or employment, and shall have written evidence thereof as follows, to wit: if living in any incorporated city, town, or village, a license from the mayor thereof; and if living outside of an incorporated city, town, or village, from the member of the board of police of his beat, authorizing him or her to do irregular and job work; or a written contract, as provided in Section 6 of this act; which license may be revoked for cause at any time by the authority granting the same.

Section 6. All contracts for labor made with freedmen, free negroes and mulattoes for a longer period than one month shall be in writing, and a duplicate, attested and read to said freedman, free negro or mulatto by a beat, city or county officer, or two disinterested white persons of the county in which the labor is to be performed, of which each party shall have one: and said contracts shall be taken and held as entire contracts, and if the laborer shall quit the service of the employer before the expiration of his term of service, without good cause, he shall forfeit his wages for that year up to the time of quitting.


This practice of requiring that freed blacks engage in economic activity contingent on licensing by the authorities was in fact a way to maintain control over their activities, so as to alter their terms of trade with respect to whites and therefore force them back into subservience. In the modern context it is well-known that such licensing practices lead to what is now called “rent-seeking,” which enables state officials to reward their supporters and punish their opponents in part by limiting the trading opportunities of the latter. (And this was far from an exclusively Southern phenomenon, as many Northern states imposed similar restrictions. Ohio, for example, had enacted similar legislation in 1804.)

The Black Codes did not last long, with Reconstruction authorities annulling them in short order. And there is some evidence that during this time, gains from trade motivated interracial commerce despite centuries of hostility and fear. Integrated rail cars were used by some adventurous entrepreneurs and the agricultural income accruing to black workers rose sharply.

But the liberal moment did not last. Owing to violent protests and the unwillingness of a weakened Ulysses S. Grant Administration to use federal troops to restore order amid widespread violence, Reconstruction ended. Racists quickly took control of Southern governments and erected the far more familiar restrictions on interracial commerce known as Jim Crow, which limited blacks’ ability to contract and to sell their agricultural produce and often required that businesses segregate. (Rosa Parks, recall, was defying a law that required that municipal buses separate the races.) It was the state interfering in private contracting and telling entrepreneurs how to run their businesses that was the defining feature of the segregated South, although it was restrictions on black voting that drew the most public attention.

Left to their own devices, people are often driven by commercial opportunities to transact across racial lines. This was no less true in the American South than anywhere else. This story – of early attempts to “bind up the nation’s wounds” through trade, frustrated ultimately by racist rent-seekers gaining control of the state – is not well-known. Robert Higgs has written a book outlining the story, and some of it is also found in Richard Epstein’s Forbidden Grounds: The Case Against Employment Discrimination Laws. Indeed, there is much quiet heroism in resistance to segregation by ordinary people pursuing their interests. Rosa Parks is justifiably famous, but few have heard of Sputnik Monroe, a white pro wrestler who helped desegregate Memphis. He often wrestled in an arena with a small section in the highest rows reserved for blacks. He made an effort to appeal to black fans, and ultimately bribed arena employees to undercount black patrons coming through the doors, so that they overflowed the segregated section into “white” seats. In this he resembles (though he gets less credit than) the Freedom Riders, who in part were fighting laws requiring segregated transportation. There were undoubtedly many Sputnik Monroes whose names are lost to history, fighting an opportunity-destroying edifice erected because people with violently held preferences had gained control of the state. The liberal (in the traditional, i.e., free-trade sense) society is the greatest enemy of tribalism. People are black, white, red and yellow, but money is all green, and that foundation of common interest tends to frustrate the desires of those who want their environments to remain racially (or religiously or ideologically) pure. Only by engaging in political pressure and achieving state-mandated rules can such purists frustrate this process.

In that sense Dr. King’s legacy, at least as refracted through President Lyndon Johnson and the contemporary Congress, is mostly but not entirely positive. His fight against state racism and restriction of voting rights, often under literal threat to his own life, is on its own enough to qualify him as a hero. But the legacy of the civil-rights legislation, which limited the right of private employers and landlords to do something competition would probably force most of them to abandon anyway – racially discriminate – set the table for decades of acrimonious litigation over “affirmative action,” “quotas” and other vague notions. Once the state got into the bean-counting business – unavoidable if “anti-discrimination” laws were to be enforced – conflict over what was a bean and what wasn’t, and how ultimately to count them, was inevitable. These laws have inevitably brought about a presumption among tribal pressure groups that all ethnic differences in labor-market outcomes are due to “discrimination,” and have made it almost impossible for people to agree on what a discrimination-free society would ultimately look like.

It would ultimately have been better in the aftermath of the March on Washington to mandate equal access for all races to state-provided resources, particularly education, and then to let the market allow people to pursue their interests. In other words, to prohibit the state from discriminating, but to let individuals trade freely, whether they wish to keep minorities out or (as now) to court them avidly in the cause of “diversity.” Instead, we have a society where zero-sum litigation and legislation drive most discussion of tribal tensions, and that is a shame. The more closely we approximate a truly free society, the less tribal animus we will have.

Friday, January 13, 2006

McSurgery

International economists like to distinguish between “tradeable” and “nontradeable” goods. Tradeable goods are those that can be obtained from other countries, and whose flows are subject to the various theories of international trade. Nontradeables must be produced domestically for technological reasons. The classic example is haircuts: no matter how much transportation costs fall or how globalized we become, haircuts are always bought from a domestic producer.

But one clear effect of globalization – the reaction of producers to falling transport costs – is to convert nontradeables into tradeables. And one of the most interesting manifestations of this phenomenon is the globalization of medicine. Medicine was once a quintessentially nontradeable good. You drove to the family doctor, and then he might have sent some X-rays to be developed on the other side of town. If things were bad enough, your doctor admitted you to a hospital nearby.

But no more. Increasingly MRIs are sent over the Internet to be read overnight in India, so that they are ready to be interpreted by the physician in the U.S. in the morning. Surgery clinics in places like Thailand now cater to Westerners who (if they are Americans) don’t want to pay very high out-of-pocket fees or (if they are Europeans or Canadians) don’t want to be on the waiting list for months or years. The medical care is state-of-the-art, English is the language of interaction, the Westerners can afford to pay a lot, and everyone is satisfied.

This raises the possibility that soon medical care will be a brand-name enterprise, with global medical brands every bit as recognizable as those for athletic shoes, accounting or fast food. One could easily imagine the Mayo Clinic or the M.D. Anderson Cancer Center having a location (or a franchise) in Paris, Shanghai or Dubai. Indeed, another of America’s well-known medical brands, the Cleveland Clinic, has already taken baby steps in this direction, expanding beyond Cleveland to establish two clinics in Florida.

Medicine is tailor-made for the brand name. The economic theory of brands says that they help consumers save search costs. If you are in a city you don’t know and you have the choice of Applebee’s or Joe’s, with Applebee’s you know what you are going to get. If you have a high tolerance for culinary risk and/or a lot of time to look into it, you may prefer to take a flyer on Joe’s. But if you want a known quantity you stick with the brand name. This is why, as long as their use is adequately policed by the owner, trademarks (unlike patents and copyrights) never expire. With the latter two there is a tradeoff between the need to encourage innovation by outlawing cheap copying of a breakthrough that was expensive to develop and the undesirability of legalized monopoly. But trademarks are capable of performing their function of conveying useful information about product traits forever. Brands are most likely when there is uncertainty about the quality of the product, and obviously medical care, which most patients know little about, is a good that has this feature in spades. And it is still true that the reputation of some of the American clinical medical brands is unrivaled.

I predict that these major brands will ultimately be established worldwide, both to serve foreign markets and to serve medical tourists coming from the home countries. That many Indians who have made it big in high technology in the U.S. have gone home is well-known. The huge representation of South Asians in American medicine may yet spur a return of physicians home to establish American medical franchises there. The tricks will be, first, to maintain brand quality and, second, to try to keep rich-country residents paying rich-country prices, because local clinics in Thailand, India, Mexico and so on will compete on the basis of price. The interesting question will be whether these developing-country clinics can establish their own brand identities, in the way some Chinese car companies (e.g., Chery) are trying to do in Western markets. But I would expect to see a Mayo Clinic Bangalore ere long, and some compelling Indian brand penetrating the U.S. clinical market soon after.

Tuesday, January 10, 2006

The Precedent Value of Roe v. Wade

The confirmation hearings for Judge Alito’s nomination to the Supreme Court have begun. The chair of the Senate Judiciary Committee, Arlen Specter, was first out of the box and, predictably, his first series of questions involved the “right to choose.” In his questioning, he ignored the fundamental rightness or wrongness of Roe v. Wade and instead focused on its value as precedent. This basic principle of stare decisis, a great reluctance to overturn judicial precedent, is a cornerstone of Anglo-American jurisprudence. In the hearings there has already even been talk of “precedents,” “super precedents” and “super duper precedents.” And so increasingly this is how the defense of Roe is couched. Below is an excerpt of the opinion by Justices O’Connor, Souter and Kennedy from the 1992 case Planned Parenthood of Southeastern Pa. v. Casey, which upheld Roe and is arguably the high-water mark of the anti-abortion movement's campaign to overturn it:

While neither respondents nor their amici in so many words deny that the abortion right invites some reliance prior to its actual exercise, one can readily imagine an argument stressing the dissimilarity of this case to one involving property or contract. Abortion is customarily chosen as an unplanned response to the consequence of unplanned activity or to the failure of conventional birth control, and except on the assumption that no intercourse would have occurred but for Roe's holding, such behavior may appear to justify no reliance claim. Even if reliance could be claimed on that unrealistic assumption, the argument might run, any reliance interest would be de minimis. This argument would be premised on the hypothesis that reproductive planning could take virtually immediate account of any sudden restoration of state authority to ban abortions.

To eliminate the issue of reliance that easily, however, one would need to limit cognizable reliance to specific instances of sexual activity. But to do this would be simply to refuse to face the fact that, for two decades of economic and social developments, people have organized intimate relationships and made choices that define their views of themselves and their places in society, in reliance on the availability of abortion in the event that contraception should fail. The ability of women to participate equally in the economic and social life of the Nation has been facilitated by their ability to control their reproductive lives.

Richard Posner, who almost single-handedly invented the modern field of law and economics, who writes roughly a book a year along with numerous journal articles, teaches at the University of Chicago Law School and is, by the way, the chief judge of a federal appellate court, has written about an economic model of judicial precedent. Adherence to precedent allows for consistency over time, meaning that people thinking about their options know the legal consequences of those options. To constantly overturn precedent is to make long-term planning impossible, and thus to lower risk-taking and wealth creation generally. On the other hand, society changes over time – the technology governing exchange in particular. And so to allow precedents to last too long is to risk having obsolete rules for a changed society. For example, if large media organizations both gain more market power and come to have more influence over governance, as arguably happened beginning in the 1970s, it becomes more important to subject their claims to scrutiny. Traditional rules governing copyright protections may then be too strict, with efficiency requiring that the fair-use exception to copyright be loosened. (For the record, this is not an argument that has been affirmed nor, as far as I know, even offered in federal jurisprudence over copyright in the Internet age.) Judge Posner ultimately likens judicial precedent to a machine that produces the output of rules, but whose usefulness deteriorates over time just as a machine’s productivity depreciates.
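The machine analogy can be put in toy quantitative form. The sketch below treats the body of useful precedent as a capital stock that depreciates as society changes; the functional form and all the numbers are my own illustrative assumptions, not anything from Posner's work.

```python
# Toy sketch of precedent as a depreciating capital stock: each year's
# decisions add new usable rules, while social and technological change
# erodes the usefulness of the existing stock. Purely illustrative.

def precedent_stock(new_rules_per_year, depreciation_rate, years):
    """Evolve the stock of useful precedent year by year."""
    stock = 0.0
    history = []
    for _ in range(years):
        stock = (1 - depreciation_rate) * stock + new_rules_per_year
        history.append(stock)
    return history

# A slowly changing society: old precedent stays useful, so the stock builds.
stable = precedent_stock(new_rules_per_year=10, depreciation_rate=0.02, years=50)
# A rapidly changing society: precedent obsolesces quickly, so the stock
# plateaus at a much lower level and courts must rely on newer rules.
turbulent = precedent_stock(new_rules_per_year=10, depreciation_rate=0.20, years=50)

print(round(stable[-1], 1), round(turbulent[-1], 1))  # → 317.9 50.0
```

On these assumed numbers, the value of any individual old precedent is far lower in the fast-changing society, which is the sense in which changed circumstances weaken the stare decisis argument.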

So how should Roe be evaluated in this framework? One can sense the difficulty the three Justices in Casey knew they faced in transferring the standard argument for stare decisis to Roe. Arguably the only people whose expectations would be upset are those who happen to be pregnant at the time the decision is overturned. They tackle that argument in the second paragraph of the above excerpt, but it is really the only sensible way to think about the issue. Even if one thinks of all women (and for that matter, men) currently of childbearing age as having “organized intimate relationships and made choices that define their views of themselves and their places in society” (and this is meaningless blather), that too is an effect that passes after one generation.
In fact, the Posner argument means that Roe is less valuable as precedent than before. The improvements in contraceptive technology and (rightly or not) the sharp decline in social stigma (and hence, lost opportunities) attached to out-of-wedlock birth mean that, if anything, unplanned pregnancies are more avoidable and less costly than they have ever been. If one takes the position of Roe that there are conflicting interests to be traded off – those of the fetus in life versus those of the woman in the liberty to have the baby or not – then the current balance favors the former more than it ever did. I have purposely avoided the broader issues of the extent to which women would then choose unsafe abortions, the philosophical issue of what life is (and whether that decision should be left to the woman) and the other more common controversies over the right to abortion, all of which are substantial arguments. But the precedent argument is much weaker for Roe than for other cases that were overturned (Plessy v. Ferguson, most famously). Roe (unlike the jurisprudence of the early 1930s on economic freedom, overturned just a few years later by justices fearful of Franklin Roosevelt) can be more justifiably overturned because of the changes in the nature of the society around it.

Monday, January 09, 2006

The Uncertain Future of War

One of the most strikingly underplayed features of the contemporary world is the decline in both the number of wars and the deaths they cause. Amidst all the chatter about the clash of civilizations and collapsing Arab civilization and the hopelessness of Africa and on and on, the data indicate that since the middle of the 20th century, and particularly since the end of the Cold War, peace has been breaking out everywhere.

In their ongoing series “Peace and Conflict,” Monty G. Marshall and Ted Robert Gurr document the decline in both interstate and intrastate (i.e., “civil war”) conflicts. In their most recent report, Peace and Conflict 2005, they report that the number of wars of all kinds rose more or less steadily after 1945 before peaking in 1982-83 at roughly 180 conflicts. Since then, this number has been in steady retreat, to about 60 conflicts as of this year. To be sure, they identify threats of future wars that still exist – particularly between the Koreas, Taiwan and China, and India and Pakistan. But the data to this point in what is, with perhaps unintentional irony, termed the postwar period cannot be interpreted as anything but positive.

Even ethnic conflicts and civil wars are declining, to the lowest level since 1960. While six new such wars began between 2001 and 2004 in such places as Darfur and the Indonesian province of Aceh (and the newspapers tell us all we want to know and more about such conflicts), thirteen were settled during this period. Indeed, while many predicted that the end of Cold War-era superpower support for ruthless client states would lead to an outbreak of such wars, a brief uptick in the early 1990s was followed by a subsequent, ongoing decline.

The decline of war is by any measure an astonishing (if not necessarily permanent) development. If (a very big if) it persists, it will be perhaps the greatest human achievement in history. What explains it? Profs. Marshall and Gurr attribute it to the collective-security structure begun with the UN after World War II, followed by the greater dominance of the world by responsible democracies. But another possibility, which receives relatively little attention, is the growth of global trade. There are two ways to get what someone else has: to persuade him to give it to you in exchange for something of value that you have and he wants, or to forcibly take it. As the productivity of modern technology grows, the incentive structure may tilt toward trading and away from warring. (Of course, this is a double-edged sword; as the late economist Jack Hirshleifer was fond of noting, technological progress can strengthen relative advantages in fighting as much as the gains from trading.) But it is well-known that nations that trade more fight less, and so the growth in global economic networking may not only make it more costly to fight your trading partners but also strengthen the incentives to preserve the international-security order that makes such trade possible.

Why do we hear so little about this dramatic development? First, it is obviously not universal. Many contend that in many non-industrial societies, and Arab and Muslim-majority countries in particular, the clash between the cultural requirements of globalization and local traditions (and those empowered by those traditions) is great, great enough in fact to prompt a violent backlash. And given that the U.S. is involved in one major ongoing war, Americans are naturally predisposed to a the-world-is-going-to-hell argument. The media also naturally tend to play up bad news of all kinds, even though the economist Julian Simon made a second career out of meticulously documenting (in such works as The Ultimate Resource 2) that despite all the hand-wringing about how things are getting worse, they are by all available measures (life expectancy, health damage from pollution, food production, etc.) getting better, as they have for about 300 years without interruption, and always will. Local television news reporters like to say that “if it bleeds, it leads,” and the media in general will play up bad news not because they are dour or cynical by nature but because it is what we will watch and read (perhaps because we are evolutionarily predisposed to it). But the decline in warfare and deaths from it may qualify as one of the most underreported stories of the last fifty years.

This poses a possible problem for the essential conservative view of man as unchanging, always prone to prey on his fellow man (a view increasingly supported by sociobiologists), a contrast to the leftist view of man as a work in progress, each century’s rollout better than the one before it because of all the hard work of moral uplifting done by political progressives. But in fact man can still be what he is and prosper and succeed, as long as he resides in an institutional environment that strengthens his best instincts and restrains his worst ones. And that framework is something that cannot be designed, but only discovered one mistake at a time. The current combination of the spread of consensual government and of global trade, both tendencies that promote cooperation rather than conflict, may be the culmination of an experiment that has taken centuries to reach this climax.

But we should be careful not to take this line of thought too far. A future Nobel Peace Prize winner once contended that economic interdependence made war obsolete, as those nations that cooperated were those that advanced, and only the rear guard of nations in decline had to resort to territorial conquest to get what their people needed. As trade and technology advanced, war would recede into nothingness. The author was Norman Angell, the book was The Great Illusion, and the year was 1910.

Thursday, January 05, 2006

Why is there a Rose Bowl?

I don’t mean in the sense of, “instead of a playoff system, as in every other NCAA sport.” The convoluted BCS system is easy enough to explain in economic terms. The bowl system has its origins in the time before the NCAA took control of college sports, when Pasadena established the Rose Bowl as an adjunct to its Tournament of Roses promotion. Over time, the gains to cities and individual schools from the proliferation of bowls (almost half of Division I football teams went to one this year) mean that the switching costs of going to a playoff are too great, even for a cartel like the NCAA. The big four bowls have found enough gains from trade in satisfying the fans (gains that will be even greater with Fox Sports taking over next year) to establish the Bowl Championship Series contraption. It will be with us for some time. This may be an example of path dependence, in which historical momentum means that an inferior standard gets locked in at the expense of an alternative that, from the point of view of year zero, is more efficient. Examples cited include the QWERTY keyboard you are using, which is allegedly inferior from the point of view of typing speed to the Dvorak layout. (Compare them here.) But people are trained on QWERTY, and so retraining them on Dvorak is too costly. Similarly, no manufacturer wants to produce a lot of Dvorak keyboards, because not a lot of people are trained to use them. The dominance of Windows over allegedly superior operating systems is another example (although a debatable one). A playoff would be better for the fans, and would certainly provide more confidence that the winner is really the best team, but too much is at stake in the current bowl system.

But the larger mystery is why there is big-time college sports at all. The notion of huge numbers of people watching at the stadiums and arenas, and in the process generating billions of dollars in revenues for universities, is as far as I know peculiar to the United States. In other wealthy countries universities have athletic teams, and occasionally (e.g., rowing competitions in the UK) they generate interest in the broader population, but there is nothing like the frenzy over football and men’s basketball in the U.S. That universities invest so much in sports is all the more striking given the public-relations costs they must bear from athlete misbehavior, the relaxation of academic standards, the occasional point-shaving scandal and so on.

A number of economic theories have been proposed to explain this. One has it that big-time athletics is a form of what economists call signaling. The university is trying to appeal to potential students (and potential faculty too, perhaps) who have a hard time discerning university quality. When product quality is hard to observe but it is possible to spend money conspicuously on something unrelated to that quality, doing so can allow buyers to distinguish higher- from lower-quality producers. High-quality car sellers, for example, will invest in expensive car-loan programs, elaborate dealership facilities, and so on, because doing so persuades the buyer that the dealer has the money to invest in high-quality cars. Similarly, universities that spend a lot on frills such as big-time sports might also have a lot to invest in the quality of instruction and research.
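The signaling logic can be made concrete with a toy separating-equilibrium check: a conspicuous, quality-unrelated expenditure is informative only if the high-quality type can recoup it and the low-quality type cannot. All the numbers below are invented for illustration.

```python
# Toy signaling sketch: a lavish dealership (or a football program) is a
# credible signal only when the high-quality producer can recoup the cost
# through repeat business and the low-quality producer cannot. If both
# types would spend it, the signal conveys nothing. Numbers are invented.

def signal_is_credible(signal_cost, profit_per_repeat_sale,
                       repeats_high_quality, repeats_low_quality):
    """A signal separates the types when only the high type recoups it."""
    high_recoups = profit_per_repeat_sale * repeats_high_quality >= signal_cost
    low_recoups = profit_per_repeat_sale * repeats_low_quality >= signal_cost
    return high_recoups and not low_recoups

# Suppose a high-quality seller expects 10 repeat sales and a low-quality
# seller expects 2. A signal costing 5 profit-units separates the types;
# one costing 1 does not, because even the low-quality seller would pay it.
print(signal_is_credible(5, 1, 10, 2))   # → True
print(signal_is_credible(1, 1, 10, 2))   # → False
```

The same arithmetic hints at why the theory is strained for universities: the signal works only if low-quality schools cannot profitably imitate it, and plenty of mediocre schools field expensive teams.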

But this seems hard to believe. University quality is readily observable. Students generally know high- from low-quality schools, and many of the highest-quality ones (the Ivies, MIT, Cal Tech, Chicago) spend very little on big-time sports. Another theory is that it is a signaling device targeted at alumni, to persuade them that the university is continuing to invest in university quality, which means that the reputation of alumni degrees is maintained. But the farther away you get from graduation the less important your education is relative to your work experience, so this too seems strained.

Murray Sperber, a professor of English at Indiana, very provocatively contended in his book Beer and Circus: How Big-Time College Sports is Crippling Undergraduate Education that athletics is a way to distract undergraduates while resources are diverted from instruction to what the faculty value, which is research. This is a tough hypothesis to test, although some work shows that spending on athletics and membership in a major athletic conference, after controlling for other relevant factors, are associated with higher-quality students, while the Sperber theory might predict the opposite. It is also possible that colleges offer athletics because students like it, although the exceptionalism of the U.S. relative to other rich countries would then remain to be explained – why don’t universities in Germany or France solicit students this way?

Another possibility is the vastness of the U.S. Because it is such a big country (with big travel costs), and because pro sports leagues are inherently limited in size (England, at 130,423 square km of area, has 20 Premiership soccer teams, while the U.S., with 9,161,923 square km, has only 32 NFL teams), vast areas of the country are unavoidably underserved by pro sports leagues. (Until the Colorado Rockies were born in 1993 there was not a single major-league baseball team in the Mountain time zone.) College sports then appeals to the surrounding populace that is not attractive to the pro leagues. (Anyone who has watched people travel across Tennessee or Nebraska on game day with school flags flying out of the windows of their cars can appreciate this.) This theory is also supported by the fact that big-time college football is least popular in the Northeast, the most densely populated part of the country (in terms of both people and pro franchises).
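The arithmetic in the parenthetical is worth making explicit, using the area and team figures quoted above:

```python
# Teams per unit area, using the figures quoted in the text. The point:
# top-flight pro teams are spread roughly forty-four times thinner over
# the U.S. than over England, leaving large regions for college sports.
england_km2, premiership_teams = 130_423, 20
us_km2, nfl_teams = 9_161_923, 32

km2_per_team_england = england_km2 / premiership_teams  # ~6,500 km² per team
km2_per_team_us = us_km2 / nfl_teams                    # ~286,000 km² per team

print(round(km2_per_team_us / km2_per_team_england))    # → 44
```

So on these figures each NFL franchise must "cover" about forty-four times as much territory as a Premiership club, which is the sense in which huge swaths of the country go unserved.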

But I confess this is not an entirely satisfying theory at the intuitive level. Jamming 100,000 people into a stadium, or being transfixed for three weeks by the Division I basketball tournament, to watch university athletes who are for the most part less talented than professionals remains a mystery in need of an explanation.

Tuesday, January 03, 2006

Of Laws, Men and the NSA

President Bush has been using the first person (singular and plural) quite a bit in defense of his authorization of possibly illegal warrantless monitoring of communications within the U.S. by the National Security Agency (historically charged with monitoring overseas communications for intelligence-gathering purposes). Here he is on Sunday, Jan. 1:

"I think most Americans understand the need to find out what the enemy's thinking, and that's what we're doing. They attacked us before, they'll attack us again if they can. And we're going to do everything we can to stop them."

"This program has been reviewed, constantly reviewed, by people throughout my administration. And it still is reviewed. Not only has it been reviewed by Justice Department officials, it's been reviewed by members of the United States Congress. It's a vital, necessary program."

"The NSA program is one that listens to a few numbers. In other words, the enemy is calling somebody and we want to know who they're calling and why."


But in a free society the duty of the citizen is to read that last remark as "In other words, I assert that the enemy is calling somebody and we want to know who they’re calling and why." It is not enough for the President to ask us to trust that he is doing the right thing and to accept it on that basis. Nor is it enough to assert that officials in his administration, or even members of Congress, have reviewed and cleared it. (The latter assertion is probably debatable.) We have known for centuries that a pillar of free societies is that the executive must ultimately be answerable to the judiciary. Even if this President is behaving honorably (and we should not expect his domestic opponents to accept that on his say-so), some, maybe most, presidents will not. To evade judicial oversight is to set a miserable precedent, which is why the public reaction in the 1970s to the Church Committee hearings (the Senate’s investigation of intelligence abuses during the conflict with the Soviet Union, a far more dangerous adversary than the jihad) was so dramatic. It is also why the language of the Fourth Amendment is so uncompromising:
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.


The most troubling thing about the warrantless searches is that obtaining a warrant is, by all accounts, not particularly difficult. Procedures for judicial authorization of interception of communications inside the U.S. were set up in the late 1970s, after the aforementioned hearings. And the special panel of federal judges established to issue such warrants apparently does not set the bar very high, approving almost all requests that come before it. Undoubtedly many supporters imagine a nightmare world in which judicial dithering allows some imminent WMD attack on the U.S. to go forward. But conjured fear such as this is the greatest threat to freedom.

Defenders of the President argue that it’s war, and that the tradeoffs between liberty and security when gathering intelligence during war are different from those in criminal prosecution. But this is a war like no other. It will not end with the clarity of Osama Bin Laden surrendering on the Missouri. As the President himself noted shortly after Sept. 11, it will be a long struggle (the Cold War lasted roughly forty years) against an ideology rather than a nation, a war whose end may not even be clear to us when it happens. It is in circumstances like these that permanent encroachment of the state upon basic liberties is most likely, and that is what makes the President’s appeal to his fundamental decency most troubling. A free people cannot be asked simply to trust the executive, still less to trust all executives who might come down the pike over the next several decades. Even if they do (and some polling indicates there is no great outcry about this program), it is in times of perceived crisis that the terms of trade between liberty and security are most unfavorable, and thus precisely then that diminishment of the former must be most strongly resisted.

The proper task for a citizen of this republic when thinking about the importance of what the President has done is to assess this tool in the hands of a possible future president whom he doesn’t trust. This is why we strive for a society of laws, not men. That standard makes it clear that this expansion of State power is a mistake.

Get a warrant.