Monday, December 12, 2005

The Future of Shame

“Shame is the starting point of ethics.”
- Alain Finkielkraut, Haaretz,
Nov. 17, 2005.

Shame is underrated these days. In modern popular culture there are few more heroic archetypes than the person who overcomes the shame, guilt and stigma imposed by a repressive society and then basks in our highest modern achievement, self-expression.

But shame is historically the way society saves us from ourselves. Long before there was law there was exile, public humiliation and the like. Indeed, some punishment then (think of colonials placed in the stocks) and now is about shaming as much as material deterrence. And while the modern reaction of students reading the fate that befell Hester Prynne is distaste (even Cliff’s Notes impresses upon them how objectionable the "harsh, stark, unbending Puritan social and moral structure" is), what was objectionable was not that she went through it but that her paramour, Rev. Dimmesdale, didn’t. It is the primary function of civilization to deter us from doing that which our instinctive desires otherwise command us to do. The ability of the law to do this is quite properly limited; to conceive a child out of wedlock has been shown by reams of research to be a socially costly act, but no society could call itself free if it imposed formal legal penalties on the parents. Instead, it falls to the rest of us to do that. When (as is often true) the law cannot perform this civilizing function, stigma imposed by others is all we have left. And stigma need not be simple punishment; it can be a force for introspection by the stigmatized, which may in turn lead to better decisions in the future, combined with forgiveness and help from others.

But shame is in decline. Charles Murray has contended that the rise of Big Government is responsible for this. The existence of anti-discrimination and tenant-protection laws makes it far more difficult to discharge or evict people for good reasons, because a court or government agency may wrongly decide that the reason is an unallowable one. Similarly, the subsidy of socially costly habits through transfer payments to single mothers, erratic child enforcement on fathers and the like encourages things we ought to discourage. When people are once again free to hire or fire and to rent or not as they like, and when parents are uniquely responsible for providing for their children (under penalty of losing them if they cannot), then, to paraphrase, to spend months without working is to once again be a bum, to fail to uphold one’s marital vows is once again to be a tramp or a rake, and so on.

There is something to this, but that theory is, I think, insufficient to explain the decline of stigma once imposed against behavior (which, thought about differently, simply means the shrinking of the zone of what is truly aberrant – what the late Sen. Daniel Patrick Moynihan famously called “defining deviancy down”). We are now a highly mobile society, and this mobility, combined with the dominance of work and the rise of forms of time-hogging entertainment that don’t involve face-to-face contact with others (first TV, now the Internet), means that society is simply unable to impose stigma. For stigma to work, the person facing it must care about the views of those imposing it. When you live your whole life surrounded by the same community, and you know them well and they know you well, their opinion matters in both an emotional way (because they are close to you) and a material one (because your livelihood is so closely tied to theirs, and hence their belief that your reputation is bad will hit you in the wallet). When people are constantly drifting from neighborhood to neighborhood and state to state, and when they don’t even know their neighbors’ children’s names, it is difficult for stigma to have any force. If your children are running wild around the neighborhood, what your anonymous neighbors think of it means little to you. The argument is similar to the more famous one by Harvard social scientist Robert Putnam, who in Bowling Alone contends that these same forces erode American civic engagement.

If the same holds for shame, the problems that it is meant to combat, types of behavior that are socially costly but unwise or improper to control via the legal system, begin to spiral out of control. The atomization of modern life (more true in some modern societies such as the U.S., it must be emphasized, than in others such as Japan) is a process that brings tremendous benefits in terms of self-liberation, but unless handled with care it can bring costs too.

Wednesday, December 07, 2005

The Returns to Victimization

One of the most attractive strategies for some pressure groups in the Western world is to achieve status as an Officially Certified Victim (OCV) class. The culture wars are especially prone to this sort of thing. In the quarrel over how to teach biology, I have heard many highly educated people I know talk about a rampaging culture of anti-intellectualism, even though the number of jurisdictions where Darwin is subject to skeptical scrutiny is minuscule, with those who support exposing students to such skepticism often losing their seats at the next election. Secularists are fond of depicting themselves as brave "freethinkers" over whom the ignoramuses run riot with their hostility to truth, science and the American Way. On the other hand, evangelical comic Brad Stine once told The New York Times that "Christians are totally marginalized on television, unless it's a serial killer."

I suspect that the source of this tactic is the increasing reward to having the rest of society perceive your group as a victim, and that those rewards in turn stem from the massive increase in government, in combination with increasing skepticism about the merits of Western society in general and American society in particular. The origins of the victimization template and the incentives it provides in the U.S. are easy to understand, because once upon a time there were genuine victims, particularly Southern blacks, who in 1965 emerged from roughly 70 years of crippling oppression after white racists seized control of Southern state governments following Reconstruction and imposed laws prohibiting interracial commerce.

But efforts to redress this undeniable burden through Big Government – a massive edifice of antidiscrimination laws and enforcement agencies, explicit and later tacit racial quotas – and pressure on private firms to “diversify” their lending, contracting and so on were a bad prescription resulting from misdiagnosis of the problem, which was simply unequal access to education and bans on integrated businesses. And this misdiagnosis has given people a stake in maintaining OCV status, long after the historical circumstances generating the group’s handicaps have ended. Since the EEOC’s powers were substantially expanded in the early 1970s, anti-discrimination lawsuits have increased far more rapidly than the portion of the labor force consisting of those other than white males, even though the most casual perusal of interracial marriage rates, neighborhood tribal diversity and so on makes it clear that racism is a far smaller problem now than in 1965. We thus move from the problem being "racism" as we have always understood that term to being something called "institutional racism," whose elimination has no firm markers to guide us. We have no measures, in other words, for the extent of institutional racism, and thus no way to decide that it has been solved and government efforts to redress it may correspondingly cease.

The rewards to being an OCV also meant that the oppressed-minority model generated new OCVs, as first "Latinos" (artificial though the category was), women and "Asians" eventually had to share the victimization stage with Christians, atheists, Muslims, and Jews, with each demographic subset having pressure groups arguing that they are oppressed by some different conception of Everyone Else.

That governments in the West can now subsidize groups claiming to speak on behalf of OCVs, or can make legal changes increasing the return to being an OCV, has raised the incentive to be so certified. And that attitude has come to dominate all sorts of political pressure, whether imposed on the government or the private sector. The more convincingly you can depict your group as a victim of the dominant culture, the greater the reward you are likely to negotiate. An interesting offshoot of this basic approach is the oppressed-majority template. When your group is clearly a large one, and hence the notion that you are a besieged minority is a tough sell, you might depict it as struggling against a minority that has seized control of the most dominant influences in society – the media, the academy, the entertainment industry and so on.

The returns to victimization are even greater when skepticism about the merits of one’s own culture is an important presence in the marketplace of ideas. This leads decision-makers to force the opponents of OCV certification to fight a presumption of victimization. Minorities, ironically, will ultimately be more successful at playing the OCV game than majorities. The reason is free-riding: it is well-known in public-choice theory that small groups tend to get many more benefits per capita than large ones, because the former are better at controlling free-riding. This is perhaps why majority groups (in the U.S., “whites” or Christians) so often resort to the oppressed-majority argument. Whether you perceive yourself as an oppressed majority or an oppressed minority, the adjective is much more important than the noun. One implication of this approach is that even non-tribal pressure groups – those based on ideology, for example – will increasingly adopt the victimization approach as they organize political pressure. Another is that the bigger government redistribution gets, the more important the OCV becomes in Western political debate. That it has become so attractive to paint gaps in the achievements of demographic subsets of the population as best remedied by equalizing outcomes rather than improving opportunities for trade for those subsets will ultimately be seen as a costly and lengthy diversion.
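The public-choice point about group size can be made concrete with the standard concentrated-benefits, diffuse-costs arithmetic. All numbers below are hypothetical, chosen only to illustrate the asymmetry the post describes:

```python
def per_capita_stake(total_transfer, group_size):
    """Each member's stake in a policy that transfers a fixed
    total amount to (or from) the group."""
    return total_transfer / group_size

# Hypothetical: a $100M program funded by 300M taxpayers but
# captured by a 10,000-member pressure group.
winners = per_capita_stake(100e6, 10_000)       # $10,000 each: worth lobbying for
losers = per_capita_stake(100e6, 300_000_000)   # ~$0.33 each: not worth opposing
print(winners, losers)
```

The small group's per-member stake dwarfs any one taxpayer's, so the small group organizes and the large one free-rides, which is why concentrated groups win these contests even when their total gain equals the majority's total loss.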

Monday, December 05, 2005

The Globalization of Disease

In recent years there has been much alarm over one newly emergent virus after another – Ebola, Marburg, SARS and now avian flu. The fear arises because these diseases are extremely dangerous and because, it is thought, modern transportation patterns threaten to transmit the virus around the world at lightning speed. I suspect these fears are exaggerated.

With an infectious disease one can best think about the danger (for a given level of intrinsic deadliness) by considering which of two sets of factors is more powerful: those that dampen the disease’s spread, and those that accelerate it. Much of the more panicky coverage has emphasized the latter and not thought much about the former. This is not necessarily a bad thing; we want people at places like the CDC to be thinking about worst-case scenarios. But for the rest of us thinking about how much to worry about such things, and whether we should change much of our behavior, some perspective is in order.
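The balance between the two sets of factors can be caricatured with a toy branching-process model: an outbreak grows only if the effective reproduction number – the average number of secondary infections per case, after dampening measures are applied – exceeds one. A minimal sketch, with all parameter values purely hypothetical:

```python
# Toy branching-process model of an outbreak: each case infects
# R_eff others on average, where dampening factors (testing,
# quarantine, treatment) scale down the intrinsic R0.
def r_effective(r0, detection, isolation_efficacy):
    """A fraction `detection` of cases is found and isolated,
    cutting their onward transmission by `isolation_efficacy`."""
    return r0 * (1 - detection * isolation_efficacy)

def expected_cases(r_eff, generations, seed=1):
    """Expected cumulative cases after a number of transmission
    generations, starting from `seed` initial infections."""
    total, current = seed, seed
    for _ in range(generations):
        current *= r_eff
        total += current
    return total

# Hypothetical numbers: the same virus (R0 = 3) facing weak
# 1918-style surveillance versus modern detection and isolation.
weak = expected_cases(r_effective(3.0, detection=0.1, isolation_efficacy=0.5), 10)
strong = expected_cases(r_effective(3.0, detection=0.8, isolation_efficacy=0.9), 10)
print(round(weak), round(strong))  # explosive growth vs. die-out
```

The point of the sketch is the threshold: dampening measures do not need to prevent every transmission, only push the average below one, after which the outbreak dies out geometrically.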

The most prominent factor promoting acceleration is global travel. Someone can get sick with a new virus produced in the toxic brew of highly dense populations and close human-animal contact that prevails in East Asia or Africa and be in New York or London within hours. Indeed, this was the way SARS spread so rapidly from China to Canada and the U.S. But the SARS lesson is actually striking for what did not happen. The disease was frightening when it first emerged, and as it spread from Asia to Toronto. (As an aside, Mark Steyn, in a column requiring free registration at the Western Standard, makes the interesting if debatable point that the spread of single-payer health care systems is an accelerating factor. He claims that the inattention to hygiene plaguing Canadian state care, as opposed to profit-driven American health care, made the disease much worse in the former.) If one compares the ease of global travel and the extent to which people from all parts of the world mingle now with the time of the 1918-1920 Spanish flu, perhaps the most devastating epidemic of the modern era, it is clear that the forces promoting contagion are considerably greater. And yet SARS did not turn into that, despite many contemporaneous predictions that it would.

And that is because the forces promoting dampening are so strong. The ability to detect the disease through quick and easy lab tests is much greater, so that those who might have been exposed can be tested (perhaps because they are ordered to be – and governments giving orders is also a process that is much more efficient now) even if they are not symptomatic. The ability of the pharmaceutical industry to capitalize on eighty-plus years of accumulated knowledge, plus any new knowledge it and the biomedical establishment generate in the course of responding to new diseases, is incomparably greater. This combination of scientists and salesmen, public and private, is a far more formidable force than in previous epidemics. The capacity to develop medicines and vaccines might then be correspondingly greater. And globalization probably helps much more than it hurts – global communications technology and competition among information-providing firms allow (despite government attempts to minimize the danger in societies such as China) much better monitoring of the spread of the disease, which in turn feeds back to public-health authorities.

The sum of the effects on both sides of the ledger is likely to strongly favor containment, as has been the case for all the other potentially catastrophic viral innovations mentioned above. The obvious counter-example is HIV, which was first identified in the early 1980s and has gone on to kill millions. But that disease has several factors promoting acceleration that do not apply to an airborne pathogen. Because it is a venereal and needle-borne disease, spread through the most intimate types of behavior, it is much more difficult for public-health authorities to get truthful answers and monitor its spread. By contrast, no sense of shame attaches in most places to the behavior leading to acquisition of influenza (although the material consequences of quarantine might be substantial). This, combined with the rapid onset of severe symptoms, means that victims become known to public-health authorities soon after getting ill, often because they seek help themselves, rather than harboring the virus for years and spreading it, perhaps unknowingly, through sexual and needle contact. And the response of authorities to this knowledge, especially in the most advanced societies, will be dramatic. And so there is a reason the constant stream of new airborne viruses generated in Africa and Asia has not generated a new sort of Black Death, and is not likely to.

Friday, December 02, 2005

Diversity, Multiculturalism and the 49th Parallel

Canada and the U.S. are currently undergoing one of the most profound social experiments of our era. Both societies are absorbing huge numbers of immigrants from societies with dramatically different social traditions. In absolute numbers the U.S. absorbs by a significant margin the largest number of immigrants of any country. In 1996, 9.3 percent of the U.S. population was foreign-born; in Canada the figure was a somewhat astonishing 17.3 percent – over one in six of all Canadians. In principle this experiment can result in a happy ending, a sort of giant Coke commercial where everyone gets along in blissful post-tribal harmony, or in an increasingly strife-ridden society plagued by tribal warfare, in the political and even the literal sense.

Of course immigration has been a constant for both countries since their formation, but it is different now for several reasons. One way that is obvious to anyone who opens the restaurant section of the Yellow Pages in any decent-sized city in either country is that the national/racial diversity is much greater now than then. But contrary to the claims of the tribal-grievance industry, this is not such a radical transformation. The Italians and Irish of the 1800s were if anything far more scorned in the U.S. than "Hispanics" are today. But public policy has changed in many ways since then, the cultural differences between some immigrants’ native societies and their host countries are larger, and one of the most fascinating things about the immigration-from-everywhere experiment is the different way the two countries are approaching it. To oversimplify somewhat, the Canadians are gambling on multiculturalism and the Americans on assimilation.

First, it is probably best to get two terms, often confused in the public mind, straight. "Diversity" refers to the division of the population among various tribal (religious, ethnic, linguistic, what have you) groups. A society with ten groups, each consisting of ten percent of the population, is much more diverse than one with two groups, one of which is 98 percent of the population. It is simply descriptive, like the distribution of total population among states or provinces. (In truth it is not quite that simple, because the borders of states are undisputed, while how tribes define themselves is a subject of dispute, but it will do.) "Multiculturalism" is not a description of the world, but is instead a policy created in response to perceived diversity. In particular, it is the use of the state to subsidize the preservation of existing tribal identity and practices within the broader society. It might include such steps as bilingual education, laws forbidding the use of some languages in advertising, the provision of tax money to tribal groups for purposes of bolstering their children’s cultural capital, and so on. It is an alternative to either a policy of non-subsidy or of subsidy of the contrary, the adoption of traits of the broader society at the expense of culturally specific attributes, in other words of assimilation. (The banning of religious symbols, especially headscarves, in French schools is an example of the latter.)
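The two hypothetical societies in the definition above can be compared with a standard fractionalization index – the probability that two randomly chosen individuals belong to different groups. This formalization is a common one in the empirical literature, not something the post itself uses; a quick sketch:

```python
def fractionalization(shares):
    """Probability that two randomly drawn individuals belong to
    different groups: 1 minus the sum of squared population shares."""
    assert abs(sum(shares) - 1.0) < 1e-9, "shares must sum to 1"
    return 1 - sum(s * s for s in shares)

# Ten groups of 10% each vs. a 98% majority with one 2% minority.
print(round(fractionalization([0.1] * 10), 4))    # 0.9
print(round(fractionalization([0.98, 0.02]), 4))  # 0.0392
```

The index captures the intuition in the paragraph precisely: the ten-group society scores 0.9 while the two-group society scores under 0.04, even though both are "diverse" in the loose sense of containing more than one group.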

And in Canada the reliance on multiculturalism is much more dramatic than in the U.S. Canadian multicultural efforts at the federal level date back at least to 1971, when, acting on the recommendations of the Royal Commission on Bilingualism and Biculturalism, the federal government adopted a policy meant to achieve the following objectives, according to this Parliament report:

• To assist cultural groups to retain and foster their identity;
• To assist cultural groups to overcome barriers to their full participation in Canadian society; (Thus, the multiculturalism policy advocated the full involvement and equal participation of ethnic minorities in mainstream institutions, without denying them the right to identify with select elements of their cultural past if they so chose.)
• To promote creative exchanges among all Canadian cultural groups;
• To assist immigrants in acquiring at least one of the official languages.

In economic terms, the first and to some extent the second are clearly a subsidy of preexisting culture. The third is hard to dispute as an objective (mutually beneficial exchange is generally to be applauded), but why it needs government “promotion” is not clear. In addition, it supposes that preservation of older cultural forms (instead of their replacement by evolutionarily superior ones through trade and competition) is worthwhile. The fourth is clearly an attempt to achieve assimilation. Subsequent legislation, particularly the Canadian Multiculturalism Act of 1988, further cemented the multicultural approach. Even Section 27 of the Canadian Charter of Rights and Freedoms, the rough analogue to the American Bill of Rights, requires that the Charter “be interpreted in a manner consistent with the preservation and enhancement of the multicultural heritage of Canadians.” Whether Canadians know it or not, the goal of Canadian policy is that tribal-specific practices will be as common within tribal groups thirty years from now as they are today. This is perhaps why the recent attempt to allow Islamic law to be enforced via consensual arbitration in civil contracts in Ontario (including in marriage, where some Canadians felt Muslim traditions conflicted with Canadian values) was so controversial, even though similar privileges had already been extended to Jews.

In the U.S., in contrast, the primary bulwark of tribal policy is the Fourteenth Amendment to the Constitution, which provides equal protection of the laws to individuals. The explicit notion of protecting cultures or groups simply does not exist in U.S. law, although it is true that clever litigation has to some extent allowed the backdoor imposition of quotas. It is unquestionably true that the goal of multiculturalism in the Canadian sense has many fans among U.S. intellectuals, and powers the whole diversity-management industry. But the legal framework within which diversity will be accommodated is without a doubt significantly different in the two countries.

So which approach is likely to work better, if by “work” we mean promoting the ability of individuals to achieve their goals and minimizing the amount of tribal conflict? A basic implication of economic theory is that that which we subsidize, we will see more of. The subsidy of tribal identity should cause people to emphasize that identity more, because it is rewarded. This in turn gives people an incentive to further agitate to obtain more subsidy from the state, which of necessity comes at the expense of everyone else’s subsidy. And so a multiculturalism policy should lead to more and more elaborate tribal pressure groups and more tribal conflict.
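The subsidy argument can be sketched as a toy optimization: an individual chooses how much to invest in tribal identity by weighing its private benefit plus any state subsidy against its private cost. The linear-benefit, quadratic-cost functional form and the numbers are illustrative assumptions of mine, not anything from the policy debate itself:

```python
def chosen_emphasis(private_benefit, subsidy_rate, cost_scale=1.0):
    """Optimal identity emphasis e* when benefit is (b + s) * e and
    cost is (c / 2) * e^2: setting marginal benefit equal to
    marginal cost gives e* = (b + s) / c."""
    return (private_benefit + subsidy_rate) / cost_scale

# Illustrative: the same person under a no-subsidy regime and a
# multicultural regime that subsidizes identity maintenance.
print(chosen_emphasis(1.0, subsidy_rate=0.0))  # baseline emphasis: 1.0
print(chosen_emphasis(1.0, subsidy_rate=0.5))  # 50% more emphasis: 1.5
```

Comparative statics, not realism, is the point: any positive subsidy rate raises the chosen level of identity emphasis, which is exactly the mechanism the paragraph describes.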

Among the empirical implications of this theory, assuming policies in the two societies do not converge, is that in the U.S. we should see more tribal fusion – more intermarriage, more novel intertribal cuisine, more diversity within businesses. In the U.S. the norm might be a firm where Muslim, Chinese, and intertribal (e.g., mixed-race) employees work at adjacent desks. In Canada, it might be more common to see businesses owned and staffed exclusively by Muslims or Chinese, but competing on equal terms. With respect to intermarriage, this paper shows that interracial marriage (which is only one form of inter-tribal marriage) in the U.S. is currently higher than in Canada, but not by much. If this simple economic model of approaches to diversity is useful, we would expect that difference to increase in coming years. We would also expect that public pressure for employment and other tribal quotas would be greater in Canada than in the U.S. The establishment of private schools for those most concerned about preserving their tribal traditions would be more common in the U.S., while government subsidy of such schools (which would cause such preservation to be more common) would be the norm in Canada. Members of tribal groups who wish to abandon tribal practices would find it easier to do so in the U.S. than in Canada, so that orthodox religious observance might be more common in the latter in response to pressure from parents and community elders. David McKintry notes that some of these very predictions are unfolding in Britain, where there is also an official commitment to Canadian-style multiculturalism. Finally, we might expect something that has clearly been borne out: that tribal groups would be able to obtain greater protections against offensive speech and behavior from members of other tribes. Indeed, “protections” against hate speech are becoming more common in Canada, whereas they hardly exist in the U.S. public square apart from public university campuses (and even there they fail more often than they succeed).

In general, in the Canadian model tribal groups would be more likely to attempt to reconcile their conflicting goals via the state, while in the U.S. they would have to make accommodations themselves – by forming private associations, by moving to neighborhoods that reflect their tastes, and by other means that rely on their own initiative and energy rather than government force. My very strong instinct is that the latter approach is considerably more conducive to tribal peace than the former, but (despite this being the feeblest of ways to end a blog post) only time will tell.