Wednesday, May 30, 2007

Numbers Matter

I have come across a fascinating interview at Gates of Vienna with Gunnar Heinsohn, a German historian, about the “Youth Bulge” theory of historical events (along with a lot of apocalyptic commentary about Europe’s likely future). This view holds that large numbers of idle young men cause much of history’s trouble. (A more sober Financial Times review of Prof. Heinsohn’s book, which is unavailable in English, is here.)

Every term when I talk about the sustainability of the Chinese miracle, one of the things I tell my students is that among the youngest Chinese there is already a significant surplus of boys over girls, in proportions far larger than normal. This gap has been growing for some time, a result of the one-child policy of China’s government combined with a cultural preference for boys. I tell my students that this portends social instability in China, because angry young men, especially at the bottom of the socioeconomic ladder, with no wives to socialize them and no jobs to occupy them, are the seeds of social upheaval. And in recent years there has in fact been a huge surge in protests in China over land seizures. The idea that idle young men cause a fair amount of the troubles documented in the world’s history books is the only idea offered up in any of my classes that some of my students greet with outright laughter. (Believing in free markets in ideas, I tolerate this good-naturedly.)

It is Prof. Heinsohn’s provocative claim that most interpretations of the great events of history are bunk, in that the ideological causes that drive war, peace and conquest are not the root causes; demographics are. Any society with enough idle young men will have violence of all sorts, and any ideology accompanying the violence is just the sheen on the phenomenon, not the phenomenon itself.

If so, the violence in the Arab world over Palestine, American troops in Iraq, the lack of an Islamist government in Algeria or whatever is simply going to get worse for a while; there is no pacifying it. The good news is that the jihad may be living on borrowed time; the always-provocative Spengler claimed two years ago that the jihad is going to start running out of fuel some time in the middle of the next decade.

It is a rather dispiriting view of history – the idea that ideas can be ignored because they don’t matter; only raw biological energy does. Human society is just a glorified baboon tribe with fancier tools. But like any view, the only thing that matters is how well it fits the facts.


Tuesday, May 29, 2007

Organic Protectionism

The BBC tells of a possible campaign by the largest UK organic farmers’ group, the “Soil Association,” to strip the “organic” label from organic foods grown in faraway countries and brought in by plane. The argument is that the plane trips create too much CO2 pollution.

Seldom has the conflict between modern environmentalism and prosperity and independence for the world’s poorest people been brought into sharper relief. By all accounts farmers in the countries targeted by the association grow food that has most of the traditionally beneficial attributes of organic food – better taste and (allegedly) better health. But the “organic” label now reveals itself to be not about food but about food grown by a certain kind of people – namely British farmers. The association notes, accurately, that most food flown in is perishable, as we would expect. (Less perishable goods could be shipped more slowly and thus more cheaply.) But, wouldn’t you know it, if highly perishable food from Kenya is punished, the only organic, perishable food left to British consumers is that grown by British organic farmers.

So far the association claims it will try merely to prohibit the organic label for food that in an earlier era clearly would have been thought of as organic, but it reserves the right to campaign for harsher measures. Supermarkets naturally and deferentially claim that “they stock local produce whenever possible,” but what that means is that they stock it unless price considerations dictate otherwise, because price is important to their customers too. Any campaign to which the supermarkets deferred would naturally limit the options for people at both ends of the global organic-food chain – buyers in Britain and sellers in Thailand, Uganda and elsewhere.

The modern environmental movement is increasingly mutating into something crabbed and dangerous – hostile to modern life, hostile to the extension of the opportunities we take for granted (many of which will be used to live the way people in the West do, as much as these campaigners despise it) to most of the world’s population, and bent on imposing the life preferences of a few on all of us, whether we share them or not. The quasi-religious nature of modern environmentalism, with its taboos, original sin, certainties and judgments (apologies to Michael Crichton), makes it a substitute belief system for those for whom the old ones have faded away. Unlike most religious movements, the environmentalist turns first to politics rather than proselytizing. And unlike most religious movements, environmentalism thrives on the destruction of opportunity for other people, on the pulling up of the ladder just as the world’s masses are starting to climb it. These, surely, are its most unpleasant (and futile) features.

Sunday, May 27, 2007

Us and Them

I had an interesting experience at a professional conference this weekend. A paper was being presented on a rather arcane but interesting problem -- what to do about the low degree of competitive balance in European soccer leagues. Year after year, the top places in the league are dominated by a very small number of teams. In any given year, most members of most of these leagues have a vanishingly small shot at the league championship.

The professor presenting the paper, who was from a European country, quickly drew the conclusion that what was needed was European Union regulations to enhance this competitive balance. The audience was probably half American and half European, and the presenter knew that this would be a tough sell with the American audience. Thus, he immediately tried to impress upon them the importance of "non-market" factors to many Europeans -- the value of tradition, of the need for social fairness, harmony and solidarity, etc.

What was striking about his proposal and sales pitch was the ease with which he concluded that the EU, having evidently solved most of its other difficulties, should take on soccer standings as part of its purview. But all of this assumes so much. First, why assume that competitive imbalance is a bad outcome? One could easily come up with many reasons why such leagues might benefit from confining their winners to a small subset of their members. For example, the possibility of relegation to an inferior league for the teams at the bottom already adds extra suspense. When you add to that the idea that transnational European competitions like the Champions League have created many fans in other countries, and not just in Europe, for marquee teams like Manchester United or AC Milan, it might well be that the welfare-maximizing outcome is one in which those teams get a lot of global exposure.

But even if it were true that competitive imbalance was bad for the leagues and their fans, why do the leagues not recognize this problem? Why run to government at all, and if so, why to the highest possible level of government? Leagues have every reason to solve this problem if it exists, because they internalize all the positive and negative consequences of competitive imbalance, and yet they choose it anyway. Perhaps the economist analyzing such problems should first think long and hard about the consequences of his remedy and the possibilities he has ignored. Perhaps his services should be limited to pointing out the problem to a private actor, rather than running not just to his own national government but to a transnational body to impose a single, rigid solution on over a dozen leagues (or, more generally, on hundreds of millions of people).

And this is fundamentally the difference between the attitude of many (but not all) Europeans and many of their American sympathizers and many (but not all) Americans. Europeans are sure they see problems everywhere (lack of “social justice,” “monopoly abuses,” whatever), and default to central planning to solve them. The possibility that the remedy for a particular problem (assuming it is a problem to begin with) might create other, worse problems does not enter their mind with as much force. Americans are sure they see opportunity everywhere, believe that things happen because some people had a good reason for making them happen, and default to individual freedom to solve whatever problems do indeed exist. It is a difference that appears to be growing with time, making the transatlantic ties needed to address honest-to-goodness global problems harder to forge than ever.


Tuesday, May 22, 2007

A Right to Work, But Not for Your Kind

The BBC has the story on an unusual labor dispute in Britain. Remploy is a manufacturing company set up after World War II to employ disabled servicemen. It is still going over sixty years later, but its disabled-only factories are very costly to run, and the company would like to close them and mainstream the workers, something that is occurring more and more all over the industrialized world.

But the union objects, preferring to keep the factories open, even though many pressure groups representing the disabled believe in mainstreaming and are supporting the company’s efforts. The event is interesting because it reminds me of something that is not nearly as well-known as it should be, the role of organized labor in frustrating the advancement of historically disadvantaged groups.

When we think about “employment discrimination” we usually think of actions by employers against ethnic minorities, females, the disabled, etc. Indeed, modern antidiscrimination law was built on the premise that this is the most important kind of labor-market discrimination. But in fact an employer who discriminates is operating against his own interest, that of making as much money as possible. This doesn’t mean no discrimination ever occurs, but that when it does it is costly, and hence more competitive conditions should diminish it.

Unions, on the other hand, are all about restricting the freedom of workers to offer their labor – about restricting labor supply, in other words. Often they gain from segregated workforces, and union contracts have historically allowed union members to indulge their own tastes for discrimination; racial and sexual job segregation often is the easiest way for organized labor (with minorities and women historically often conspicuously excluded or given subordinate status) to facilitate its members’ interests. An effective businessman can’t afford to discriminate, in other words; an effective union often can’t afford not to.

Thus, U.S. history is full of examples of the most atrocious conduct by unions against racial minorities. Attempts by organized labor to legally exclude blacks from certain occupations date all the way back to colonial times. When whites-only unions attempted to shut factories down in the decades after the Civil War, employers often attempted to use black strikebreakers, who were eager to show off their skills so as to earn higher incomes. But whites often broke these strikes with deadly force; violence fomented by the Knights of Labor during a mining strike in 1885 in Rock Springs, Wyoming led to the deaths of 28 Chinese workers. The California constitution of 1879, heavily influenced by organized labor, prohibited state contractors and even corporations chartered in the state from hiring Chinese workers. The Railway Labor Act and later New Deal legislation relegated blacks to low-rung jobs, and it was not until several decades after the Taft-Hartley Act of 1947 limited union ability to interfere with free commerce that labor union discrimination faded away. Many black leaders, ranging ideologically from Frederick Douglass to Booker T. Washington to W.E.B. Du Bois, saw white unions as perhaps the single greatest obstacle to achievement for blacks after the Civil War. (Paul Moreno’s Black Americans and Organized Labor gives a detailed history of these events.)

Undoubtedly the British unions at Remploy's factories are not motivated by prejudice against the disabled, although in an earlier era American unions unquestionably were. But by cartelizing the labor market they make it more difficult for groups like the disabled that have suffered from past discrimination by employers or unions to get ahead. That is an unavoidable effect of unions using special legal privileges to restrict the labor supply, and always will be.


Monday, May 21, 2007

Jerry Falwell

Say what you will about the Reverend Jerry Falwell, he was an American original. I have mixed feelings about his public life. When I was becoming an aware citizen in the late 1970s and early 1980s, he was ascending to the top of his power. Like so many of that age in that age, I found him appalling. Not just in his politics, but in his very appearance and mannerisms. With his expensive suits, excessive waistline, and smooth delivery, he seemed almost the very personification of Elmer Gantry. Years later, I understand that it is important to ask not just whether a person is an Elmer Gantry, but why the cultural industry finds it so easy to generate archetypes like Elmer Gantry – a corrupt, clownish evangelist – in ways that it does not for corrupt people of the left. (Michael Moore, for example, practically cries out for satire, but there are no takers in the temples of the opinion manufacturers.)

Clearly, much of what Reverend Falwell promoted in the political arena is anathema to those who believe in freedom and limited government. The notion that the state should be the tip of the spear for enforcing personal morality is bad both for society and for the success of that morality, a view I largely accept. And yet I feel America is much healthier for his having energized the constituency of evangelical Christians – whether they are an actual “moral majority” or not – who felt helplessly trod upon by a secular culture hostile to all they believed in and stood for, and which used the universities and the courts to promulgate this hostility. In most of Europe, such questions as whether abortion is infanticide and whether promotion of gay marriage is bad for society are completely off the table. To American proponents of such views, this is a sign of European enlightenment. But I think a better characterization is that in Europe the elites have decided these questions for the citizenry, while in America some notion of self-government is still intact. And the popular media was occasionally grossly unfair to him, for example in the controversy over his remarks about that Teletubby with the purple outfit and the purse that was advancing the gay agenda. (It turns out that there were significant references, not least among friends of the gay-rights movement, to Tinky-Winky as a minor-league icon of that movement well before Reverend Falwell made his remarks, for which he later self-deprecatingly apologized.)

He and his success were signs of a civically active society, not yet ground down by the relentless wheels of the cultural elite, who seek to install their own constraints on individual freedom in the name of multiculturalism, enforced secularism, etc. He will be missed, not least for the reasons that once caused me to dislike him so.

Thursday, May 17, 2007

Beauty Will Out

Having evidently satisfactorily addressed all the country’s other problems, the Iranian government has turned to cracking down on women in “bad hijab,” women's head coverings that show a little bit too much. That this particular government makes the effort is unsurprising, but it is doomed to failure just the same.

The human drive to beauty is one of our most powerful. We want to look good not just due to social pressure (though that is important) but because it is an innate desire. And “beautiful” need not connote a particular conception of beauty – an Armani suit on a man or stylishly long hair, for example. It simply requires an effort to dress in a way that, in one’s own mind, leaves a favorable impression on others. From the point of view of the wearer, a ratty pair of jeans or a ripped T-shirt may be beautiful. (I wonder these days whether tattoos can be beautiful under any conception of beauty, but their popularity suggests that evidently they can be.) Even in sartorially totalitarian Iran, the young rebel as best they can – by showing a little more hair, wearing makeup a little more confidently, etc. (Note in the article linked above that men too are subject to Islamic dress restrictions, and they too don’t much like it.)

The effort to throw every Iranian woman into a nondescript, non-expressive, color-free, identical garment so as to drive out every hint of sexual excitement in Iranian men is doomed to fail, like Chairman Mao’s drab eponymous suits before it. Not just because Iranian men will invariably find something to look at and admire no matter the dress code, but because Iranian women will inevitably find ways of distinguishing their appearance; this is what humans do.

In her recent book The Substance of Style, Virginia Postrel explains how expanding markets are giving us more opportunity to remake ourselves according to the imperatives of beauty we all have (by wearing a really good-looking suit, for example). Technological improvements in plastic surgery, chemicals applied to the skin to give it a youthful appearance, more interesting clothing and a host of other innovations are giving people more chances to look better, and we are taking them. This is not something to be condemned as artificial, superficial or objectifying, but as value-enhancing like most anything else that satisfies us. Elsewhere Ms Postrel notes that there is a basic constraint all beauty-seekers must cope with, the universal idea of the beautiful body, which is apparently similar in terms of facial features, body shape, etc., across a wide variety of cultures. But the technology allows us to more closely approximate this ideal, even as we can more easily experiment with temporary fashions to further distinguish ourselves, paradoxically, by trying to look like something else. Any ideology that works against this basic instinct – whether it’s the worker’s paradise, militant religion imposed on non-militant people, dreary feminism or anything else – will in the long run be a poor fit for humanity, and will at least for that reason be rejected.

Tuesday, May 15, 2007

On the Suit

Mark Cuban, the software gazillionaire and owner of the underachieving Dallas Mavericks basketball team, has a post on his blog ripping the suit and expressing bafflement that any man would wear one (hat tip: Wretchard, at whose site some discussion of the article follows):

Why am I such a suit hater ? I'm not a suit hater, I just could never think of any good reason for any sane person to wear a suit in the first place.

Exactly what purpose does a suit serve ? Why in the world are so many people required to wear a suit to work ? Do the clothes make the man or woman in the western world today ? Does wearing a tie make us work harder or smarter ? Is this a conspiracy by the clothing, fabric or dry cleaning industry to take our money ?

Or are we all just lemmings following a standard we all know makes zero sense, but we follow because we are afraid not to ?

I'd like to suggest that suits, and first-rate clothing generally, perform three functions whose importance Mr. Cuban is unwise to underestimate. First, suits allow people to solve a basic asymmetric-information problem. When people are applying for a job, presenting research at a professional conference, or attempting to make a sale, the people to whom they are trying to appeal usually know little about them. This is like the classic problem of trying to sell a used car when the buyer has no way of differentiating high- from low-quality vehicles. One solution is for the seller to offer a warranty. Under reasonable economic models, only sellers of high-quality vehicles will offer such warranties, allowing the consumer to tell the high- and low-quality cars apart, and yet not causing the sellers of high-quality vehicles to incur much extra expense from the offering of a warranty (their cars, after all, seldom need the repairs). The willingness to incur the cost lets the audience know that the seller can afford to offer the warranty, providing a useful signal that he can also afford to offer a high-quality product.

So too the wearer of a suit is signaling his audience. For a job applicant, incurring the expense of a suit (and for an employee, incurring the recurring expense of cleaning existing suits and buying new ones) tells the employer that the applicant is willing to invest in himself, providing some evidence that he will supply more effort on the job. That suits are somewhat uncomfortable to wear and take time to put on only adds to this effect. If suits have a significant informational role, the theory would predict that those whose reputations are already known – say, a scholar at a conference who already has substantial research achievements – needn’t incur this expense. Nor, for that matter, need someone like Mr. Cuban, who has made so much money and has such a well-known reputation that he needs to signal nothing. But when he was in his 20s, I'm guessing, he wore them all the time. (The signaling role of professorial dress, which is often of intentionally low quality, is a problem I have discussed before.)
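The separating-equilibrium logic behind this can be put in a toy model (a hypothetical sketch of standard signaling theory, not something from the post; all the numbers are invented for illustration): a signal like a suit conveys information only when its cost falls between what a high type and a low type each expect to gain from being taken for a high type.

```python
def separates(cost, gain_high, gain_low):
    """A costly signal separates high from low types when the high type
    finds it worth buying and the low type does not."""
    return gain_high > cost > gain_low

# Illustrative numbers: a $500 suit, a $2,000 expected wage premium for a
# genuinely strong applicant, a $300 expected gain for a weak applicant
# who would soon be found out and let go.
assert separates(cost=500, gain_high=2000, gain_low=300)

# If suits were cheap enough that everyone bought one, both types would
# wear them and the employer would learn nothing from seeing one.
assert not separates(cost=100, gain_high=2000, gain_low=300)

# Likewise, a signal too costly even for the high type never gets sent.
assert not separates(cost=2500, gain_high=2000, gain_low=300)
```

This is also why the model predicts that people with established reputations – the senior scholar, or Mr. Cuban himself – rationally skip the expense: for them the information the suit would convey is already public.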

The second reason to wear a suit is to make some sort of endorsement of civilizational common ground. The great scholar Jacques Barzun, in his astonishing and magisterial From Dawn to Decadence: 1500 to the Present: 500 Years of Western Cultural Life, argues toward the end, in his evaluation of the modern individualistic revolution, that Western and especially American dress has drifted toward what he calls the “demotic,” meaning roughly the freedom of all individuals to dress more or less as they please all the time, and often in sympathy with ordinary Joes. As recently as the early postwar years the men at baseball games were invariably wearing suits, ties, and hats, as were the male customers in any decent restaurant. Such a scene is unimaginable now, with shorts and T-shirts being the order of the day not just at the ballpark but on airplanes and (in California) even in first-rate restaurants. To wear a suit nowadays is thus to sign on to the historical continuity of Western culture, in particular with the idea that standards matter -- that individualism can only be carried so far. Of course, individual freedom is a cornerstone of the very Western culture the wearing of a suit might celebrate, so this view has internal contradictions. It would be an interesting experiment to go to two political conferences, one of the hard left and one of the hard right, and document the difference in styles of dress – the conservatives in their restraining garb, and the progressives in their let-it-all-hang-out casualness. The suit nowadays is a way of saying, "Men have always worn suits (and fathered their children in wedlock and gone to church and learned about why the Romans and the Magna Carta are important), and if I have anything to say about it, will keep doing so."

But the most straightforward reason to wear a suit is, frankly, that it looks good. The human drive to beauty, which is worthy of a post in its own right, has always been with us and always will be. And, at least in terms of current aesthetic tastes, suits are the highest everyday fashion statement a man can make. Note that this does not mean that all men must look the same – matching different colors and styles of shirts, coats and ties, wearing neckties or bow ties, employing different kinds of tie knots, and other tactics all provide immense room for individual expression even while confining your wardrobe to coat, slacks, and tie. As long as people value looking good merely for its own sake (and all of us, even if we define "looking good" differently, do), wearing a suit provides a compelling (if uncomfortable) way to do that. Mark Cuban has enough money and fame to make a different statement without consequence, but there will always be enough suit-wearers to leave him flummoxed.

Friday, May 11, 2007


A friend of mine recently gave me (by tearing a paper copy out of the magazine and leaving it in my mail box; he’s charmingly old-fashioned that way) an article from The Atlantic (only available online as a summary, unless you have a subscription) on a growing body of research indicating that intergenerational income mobility in the United States is declining. When you compare the extent to which a person’s income depends on his parents’ income in the US versus, for example, Sweden or Denmark, you find that the correlation is much higher in the US – as much as 40 to 60%. And so Europe, not the US, is increasingly the land of opportunity. If you are born poor in the US, you’re much more likely to stay poor.

But I think that this misconceives what "opportunity" is really about. Opportunity is not about whether everyone can achieve a particular income level, or about how much generations drift between portions of the income distribution. Opportunity is about the ability to achieve as much as other people are willing to grant you in voluntary exchange. And by that standard, the US, while far from perfect, may still be a considerably more mobile society, properly defined.

Suppose that what you earn depends on things transmitted to you directly from your parents – intelligence, attitudes toward work, norms that lead to better and better-paying jobs, etc. – and on things that you get from the state – public education, public health care, etc. Suppose further that there are two societies. One – call it America – is a complete meritocracy, in the sense that you are paid entirely on the basis of what you are worth in the market, but there is no attempt to adjust anyone's endowment of the second input. In the other – call it Europe – the state redistributes the second input so that everyone enters the market with equal amounts of it.

So which society has more opportunity? That depends on the nature of the production function. If the second input matters most, Europe is the opportunity society: the state equalizes the initially unequal endowments, so anyone can move up in the income distribution, while in America the distribution is cemented in place for eternity. But if the first input matters most, America is the opportunity society. In Europe, since the second input contributes little in the market, those with a lot of the first input (those who earn the most in the market) find that most of the proceeds from having it are taxed away to fund redistribution that accomplishes little. Poor European children still end up moving up in the distribution, but high-achieving European parents find that their children move down. So there is lots of measured mobility, but little growth, and little rewarding of the most productive – arguably a better definition of "opportunity." In America everyone can earn his full potential, but there is little measured mobility. If, in addition, high-achieving Europeans emigrate, European mediocrity, slow growth and American achievement at the high end all coincide. This interpretation seems to better fit the facts.
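The two-society model above can be sketched in a few lines of simulation (a hypothetical illustration with invented parameters, not from the article): children's earnings depend on a trait inherited imperfectly from their parents plus a state-provided input, which "Europe" equalizes and "America" ties to parental resources. When the parental input dominates, America mechanically shows the higher parent-child earnings correlation – i.e., lower measured mobility – even though everyone there earns his full market product.

```python
import random, statistics

def corr(xs, ys):
    """Pearson correlation, computed by hand to stay self-contained."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def simulate(equalize_state_input, w_parental=0.8, w_state=0.2,
             n=20000, seed=1):
    """Return the parent-child earnings correlation in a toy society.

    w_parental and w_state are the (assumed) weights on the inherited
    and state-provided inputs in the earnings 'production function'.
    """
    rng = random.Random(seed)
    parents, children = [], []
    for _ in range(n):
        p = rng.gauss(0, 1)                        # parental endowment
        parent_earn = w_parental * p               # meritocratic earnings
        child_trait = 0.6 * p + rng.gauss(0, 0.8)  # imperfect inheritance
        # State input: equal for everyone in "Europe"; proportional to
        # parental resources in "America".
        s = 0.0 if equalize_state_input else parent_earn
        children.append(w_parental * child_trait + w_state * s)
        parents.append(parent_earn)
    return corr(parents, children)

america = simulate(equalize_state_input=False)
europe = simulate(equalize_state_input=True)
print(f"parent-child earnings correlation, America: {america:.2f}")
print(f"parent-child earnings correlation, Europe:  {europe:.2f}")
assert america > europe  # equalizing the state input raises measured mobility
```

The point of the sketch is only that the measured intergenerational correlation is higher in "America" by construction; it says nothing about which society rewards talent or grows faster, which is exactly the distinction the mobility statistics elide.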

Numerous arguments have been put forth over the years to suggest that the US is in fact becoming a highly stratified society in terms of things that might affect market productivity – in other words, a society where earnings depend more and more on parental endowments (i.e. the first input). James Q. Wilson argued that the primary divide in contemporary America is between couples who marry and couples who don’t. Those who marry raise their children to get married, are able to invest more (by virtue of having two parents rather than one always on the scene) in their children's skills and attitudes, and their children go on to live a lifestyle substantially different from those who don’t marry, or who marry but divorce quickly. Some years prior, Richard Herrnstein and Charles Murray notoriously contended that how intelligent you are is a major predictor of how much money you go on to earn, and that America is increasingly undergoing what they called “cognitive stratification” – the tendency of people to have children largely through union (married or not) with people of similar intelligence. The bottom line, then, is that "opportunity" does not necessarily equal "lots of shuffling among generations in the income distribution," nor does extensive income redistribution for the purposes of public investment necessarily improve "opportunity." That depends on the extent to which people are rewarded purely for their talents, and on the extent to which your talent depends on your parents’ talent. And these are very open questions.

It is a simple model, and leaves out a lot. Thomas Sowell has noted that changes in the overall age distribution of the population matter a lot too, and one could also argue that the huge influx of low-skilled immigrants into the US, especially from Latin America, combined with increasing returns to the skill set possessed by people at the top of the income distribution, almost unavoidably makes intergenerational mobility slower. But the notion that whether a society is one of open opportunity should be judged by how likely it is that the children of someone at the low end of the income distribution end up at a much higher percentile (or, equivalently, how likely it is that the children of someone at the top of the distribution end up at the low end) clearly leaves much to be desired.


Tuesday, May 08, 2007

Does the Left Have a Problem with Democracy?

There have been two nights of riots, mostly by the hard left rather than the angry jeunes in the suburbs, in France in reaction to the election of M. Sarkozy. (Le Figaro and No Pasaran currently have roundups.) As the Brussels Journal sarcastically notes, one is hard-pressed to remember riots by the right greeting such left-wing victories as Tony Blair's in 1997, François Mitterrand’s in 1988, Bill Clinton's in 1992 and 1996, etc.

And in the US it always seems to be the left that litigates close election losses rather than the right. Al Gore famously first accepted, then rejected the fact that he lost Florida in 2000, subjecting the American people and the world to over a month of farcical litigation. There were also lawsuits filed over Democratic losses in House races in Ohio and Florida in 2006, both of which were unsuccessful. I am unaware of any close losses by GOP candidates in recent years that have resulted in any similar action. And of course there were the angry accusations of fraud in Ohio in the presidential election in 2004 by the left, mostly involving accusations of purposeful under-delivery of voting machines to heavily Democratic precincts, resulting in long lines and frustrated voters. On the other hand, when South Dakota Senator Tim Johnson was incapacitated after a brain hemorrhage, there was no attempt by Republicans to argue that he should be replaced by the Republican governor of South Dakota, nor (as far as we know) any attempt to get Joe Lieberman to switch parties, throwing the Senate back to the GOP. And George Allen lost a nail-biter of an election in 2006, yet never seriously contemplated litigating it.

To be sure, the right gets angry, sometimes delusionally so, about its defeats. These defeats are often attributed to vote fraud, which has not been documented in recent years (although it was in the 1960 presidential election, when LBJ and Mayor Daley were able to throw Texas and Illinois to John Kennedy). But the right seldom contests its defeats, either in the courts or in the streets. Why? Perhaps the left believes in the inevitability of its own progressive agenda; perhaps the extreme left hates the extreme right more than the reverse. I cannot really say why. But I was surprised that it has seldom been commented upon before now. That it is the left that so extols democracy as the highest form of government (people of the right generally placing greater stock in tradition and, in the American variant of conservatism, in limited government), while so frequently denying the legitimacy of its outcomes, is also ironic.

Friday, May 04, 2007

No One Ever Learned From a Free Laptop

Several years ago, giving away laptop computers to high-school and younger students was all the rage. The economist and former Harvard President Lawrence Summers once said that no one in history ever washed a rented car, and it turns out that people don’t get much use from or take very good care of laptops that they get for free. Alas, for that and other reasons, the big laptop giveaway seems not to have worked out as planned. The New York Times has the scoop:

The students at Liverpool High have used their school-issued laptops to exchange answers on tests, download pornography and hack into local businesses. When the school tightened its network security, a 10th grader not only found a way around it but also posted step-by-step instructions on the Web for others to follow (which they did).

Scores of the leased laptops break down each month, and every other morning, when the entire school has study hall, the network inevitably freezes because of the sheer number of students roaming the Internet instead of getting help from teachers.

So the Liverpool Central School District, just outside Syracuse, has decided to phase out laptops starting this fall, joining a handful of other schools around the country that adopted one-to-one computing programs and are now abandoning them as educationally empty — and worse.

Many of these districts had sought to prepare their students for a technology-driven world and close the so-called digital divide between students who had computers at home and those who did not.

“After seven years, there was literally no evidence it had any impact on student achievement — none,” said Mark Lawson, the school board president here in Liverpool, one of the first districts in New York State to experiment with putting technology directly into students’ hands. “The teachers were telling us when there’s a one-to-one relationship between the student and the laptop, the box gets in the way. It’s a distraction to the educational process.”

The output of education depends on numerous inputs, some of which schools can provide and some of which must be provided outside the school. The inputs that come from the school – lower student/teacher ratios, free laptops, whatever – can be substitutes or complements for the inputs that come from outside the school – parental emphasis on education, neighborhood norms that reinforce or stigmatize learning, etc. The two sets of inputs are generally assumed to be complements, and I think this is reasonable. However, the degree of complementarity is probably very small.
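The substitute/complement distinction can be put in standard production-function notation; this is only a stylized illustration of the paragraph above, with notation of my own choosing, not a model drawn from any cited study:

```latex
% Achievement A produced from school inputs S and home inputs H
A = f(S, H)
% Complementarity: the return to school inputs rises with home inputs
\frac{\partial^2 f}{\partial S \, \partial H} > 0
% A Cobb-Douglas example: when home input H is near zero,
% the marginal product of school inputs is correspondingly small
A = S^{\alpha} H^{\beta}
\quad\Longrightarrow\quad
\frac{\partial A}{\partial S} = \alpha\, S^{\alpha - 1} H^{\beta}
```

On this reading, a free laptop (an increment to S) raises achievement only insofar as its effect is scaled by home inputs; if the cross effect is small, the laptop does little either way.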

An acquaintance of mine who does research on public-school performance once said that 90% of what public schools turn out depends on what goes on in the home. Shattered family structure, lots of television, parents who themselves do not value education much and thus expect the school to do all the teaching – no amount of “school reform” or computer giveaway or increased funding will make up for this. I suspect that if we account for such problems, there are probably a lot of proposed education policies for which there is “literally no evidence it [has] any impact on student achievement.”

Thursday, May 03, 2007

The Rational Delusion

I just came across a nice essay from 2005 by Roger Kimball in The New Criterion to mark the occasion of a new edition of Hayek’s The Road to Serfdom, published by the University of Chicago as part of an ongoing release of Hayek's collected works. In it, Mr. Kimball discusses the rational delusion, the belief that messy human society can be improved upon by a central plan rationally derived and imposed from on high. That the historical record of such plans, from Robespierre to Lenin to Mussolini (who once noted that “the more complicated the forms assumed by civilization, the more restricted the freedom of the individual must become”) to Pol Pot does not inspire confidence does not at all lessen the appeal of controlling human affairs through The Grand Plan, as the drive for a supranational organization to control global warming, among many other trends, makes clear.

Perhaps it is inevitable that in freedom are sown the seeds of freedom’s destruction. Mr. Kimball argues:
Hayek, like Tocqueville, saw that in modern bureaucratic societies threats to liberty often come disguised as humanitarian benefits. If old-fashioned despotism tyrannizes, democratic despotism infantilizes. “It would,” Tocqueville writes,
resemble paternal power if, like that, it had for its object to prepare men for manhood; but on the contrary, it seeks only to keep them fixed irrevocably in childhood; it likes citizens to enjoy themselves provided that they think only of enjoying themselves… . It willingly works for their happiness; but it wants to be the unique agent and sole arbiter of that; it provides for their security, foresees and secures their needs, facilitates their pleasures, conducts their principal affairs, directs their industry, regulates their estates, divides their inheritances; can it not take away from them entirely the trouble of thinking and the pain of living? … [This power] extends its arms over society as a whole; it covers its surface with a network of small, complicated, painstaking, uniform rules through which the most original minds and the most vigorous souls cannot clear a way to surpass the crowd; … it does not tyrannize, it hinders, compromises, enervates, extinguishes, dazes, and finally reduces each nation to being nothing more than a herd of timid and industrious animals of which the government is the shepherd.

Echoing and extending Tocqueville, Hayek argued that one of the most important effects of extensive government control was psychological, “an alteration of the character of the people.” We are the creatures as well as the creators of the institutions we inhabit. “The important point,” he concluded, “is that the political ideals of a people and its attitude toward authority are as much the effect as the cause of the political institutions under which it lives.”

This analysis is confirmed everywhere, from the hectoring grandees of the EU seeking to centralize more and more of European life under its umbrella, to the continuing public importance of Al Gore (who once told The New Yorker that he saw American society as an exercise in parallel processing, with individuals and then local governments the subcomputers, all feeding information up to the main computer of the U.S. government, which would use it to improve the lives of the American people), to the attempts of both major U.S. political parties to win elections by promising more and more government stuff to appropriate away life’s little vicissitudes. I mean, really: is there no longer a constituency for arguing that the proper approach to farmers’ or small businessmen’s woes is to make them, and the rest of us, once again free men by eliminating the vast government apparatus that expropriates productive wealth to redistribute to those who persuade enough of us that their claims are worthy? Is there no audience in modern America for the idea that gigantic levels of government spending and regulation, with the enforcement powers needed to prop it up and whip the disobedient people subject to it into shape, are not just expensive, or destructive to economic growth, but an actual danger to the foundations of a free society?

Hayek died in 1992, having lived just long enough to see the fall of the last great Western totalitarianism. I wonder what he would have made of the relentless growth in government that has occurred since the Wall fell, not in the poor countries once throttled by big government but now moving toward freedom, but in the very cradle of limited government – the West, and Anglo-America in particular. The societies that have most thoroughly solved history's major problems – starvation, poverty, pestilence – are those where the problems remaining to be solved are ever more minor (how many days of vacation – the concept itself unknown for most of history – we should get as a “right,” e.g.), and yet the government machinery allegedly needed to solve them grows ever more elaborate and intrusive. It is a sort of democratic race to the bottom – an attempt to win elections by having the state trample on more and more of the private sphere to make life easier and easier. All the while, our tax rate moves ever farther beyond the rate paid by actual serfs, who used to fork over a third or so of their crops to the manor. ‘Twould be a sad irony if the very prosperity enabled by a free society made freedom itself intolerable to the people who once knew it.


Can American Universities Stay on Top?

The American university has for years been the envy of the world. It draws students from everywhere, its scientists dominate the Nobel prizes, it is said to generate tremendous technological spinoffs for American society, and hardly a day goes by in which some important scientific breakthrough does not emanate from one of its campuses. Such things are of course always subjective, but American universities dominate such world rankings as exist. Shanghai Jiao Tong University publishes a list every year, and this year’s has U.S. universities as eight of the top 10. The Times Higher Education Supplement now also publishes such a list, and its 2006 rankings, which rely on subjective peer review rather than things you can count, are less heavily tilted toward US universities; even so, they rank US institutions as seven of the top 10 worldwide.

Can this high quality continue? In answering, it is first worth noting that university quality is not a zero-sum game. If American universities and universities in the rest of the world are simultaneously getting better, the world is the winner. And if one nation’s universities are generally the best, it is only natural to expect the forces of competitive convergence to cause the universities of other societies to close the gap. But it is still useful to think about the likely future of the American university; it has both weaknesses and strengths.


First, the weaknesses.

1. Raw material. American students come into college more poorly prepared than students in most countries. The dirty little secret of many large American universities, even those with extensive international prestige because of their research competence, is that a very large majority of those who apply as undergraduates get in. Some significant fraction of these require remedial work, and their lack of preparation slows down the rate at which classes can move. This is the flip side of making college so accessible; it becomes open to people who otherwise might choose to do something else, some of whom impose significant negative externalities on their classmates. In some essentially open-admission universities, an administrator can, not entirely without reason, argue that it is the job of the faculty to reconcile themselves to the students as they find them, but in universities that proclaim higher standards, this disjunction between what is supposed to come out and what actually comes in is fundamentally dishonest.

2. Nonsense. The humanities departments at far too many American universities are crippled by their devotion to trendy academic fads. When one reads about people with doctorates in education or English teaching courses in “whiteness studies,” a discipline devoted entirely to the notion that whiteness is an invented category and that white Americans benefit from invisible privileges that they know nothing about, one wonders first what such a field has to do with the training the professors received (shouldn’t English professors be doing literary criticism?), and then why the scarce time of students isn’t assumed to be a little more valuable than that. So too with the whole postmodern detour, where knowledge itself – the discovery and transmission of which is supposed to be the very mission of the university – is assumed to be fuzzy. For all their problems, university students in China or India do not have to waste their time on this kind of stuff. Fortunately, on balance, I suspect that most students who go through this kind of coursework emerge relatively undamaged, and as far as I know these ideas have yet to seriously harm what goes on in the natural sciences.


Now, the strengths.

1. Immigration-friendliness. Cutting-edge knowledge can obviously be generated by people from anywhere. A great university simply must draw its faculty from all over the world. The United States will always have an edge in this department, simply because of our greater receptiveness to immigration, especially skilled immigration. It is one thing for a university in, say, Sweden, to say that it wants to internationalize its faculty and draw the best and the brightest from everywhere. But if the faculty get there and find that it is very difficult for them to become citizens, and that even if they do they are not treated as real Swedes, Sweden becomes less attractive as a destination. That we are now in an age when Americans are drawn from the entire world population makes it much easier for us to attract the brightest minds. For countries like China or Japan, which have no history at all of immigration, defining citizenship instead entirely by blood, this is a tremendous disadvantage. A genius born in Botswana can work in Germany, but he will never truly be German. If he comes to America he will be a full equal.

2. Lack of academic incest. It is common in universities in Europe and East Asia for people to work where they got their Ph.D., and for their career to be supervised by the person who supervised their dissertation. In Japan, a research agenda can be implicitly directed by the senior in a junior-senior relationship. This retards independent inquiry, causes people’s thinking to become undisciplined, and helps make sure that universities that start out mediocre stay that way. In American universities it is relatively rare for a department to grant a Ph.D. and then hire the new graduate. Multiple perspectives are believed, accurately, to generate more interesting ideas. (In fairness, this is a deficiency that foreign universities are starting to recognize.)

3. Self-direction. Closely related is the way professors are supposed to relate to students. While American professors are often unjustly criticized for simply asking their students to repeat back what the professor told them in class, students in the US are often expected (see here for an example) to engage in more independent thinking than in other countries. The professor may give a lecture or an assignment, but the student is often required to do quite a bit of work in terms of collecting information, organizing his thoughts on his own terms, etc. Of course, some foreign universities have the practice of giving a comprehensive examination to undergraduates, which requires a significant amount of self-directed preparation. Assuming U.S. students would tolerate this (doubtful), it is a practice we might do well to adopt.

4. Humility. The full-of-himself professor is, justifiably, a common stereotype in the American university. But in fact, American professors are on the whole probably more approachable than those in many societies, where they are treated as authority figures to be submitted to blindly rather than as colleagues to be worked with in the pursuit of knowledge. I often find it ego-gratifying when students from foreign countries treat me with profound respect merely because I am a professor, but if all they get out of the experience is that I am to be obeyed, their time in my classroom will largely have been wasted. The students who talk back and ask difficult questions are always the most interesting and enjoyable to teach, and the more of them there are in class, the better.

I think that on balance the strengths outweigh the weaknesses, but I worry about the potential of the weaknesses ultimately to capsize the ship. Still, all in all, in 2007 things could be worse. With any luck, the rising global competition for high-quality students will force American universities to think hard about what makes them strong and what holds them back.