Shocking Numbers, But Do They Add Up?

John Lott,
American Enterprise Institute

From the Wall Street Journal, Aug. 30, 2001

It seems that hardly a day goes by without some alarming set of statistics appearing in a newspaper article or TV report and causing, well, alarm. But a little skepticism might be the better response. Actually a lot of skepticism. Both "It Ain't Necessarily So" (Rowman & Littlefield, 249 pages, $24.95) and "Damned Lies and Statistics" (California, 190 pages, $19.95) argue that the media are very good at trumpeting the "shocking results" of various studies and very bad at spotting bias or examining data in detail.

The authors of "It Ain't Necessarily So" — David Murray, Joel Schwartz and Robert Lichter — provide several case studies of media alarmism. Are sperm counts falling? Is stored nuclear waste at risk of exploding? Does the northward movement of the checkerspot butterfly provide evidence of global warming? In each case, the media answer was "yes," and the real answer was "no" or "not proven."

Take the mystery of falling sperm counts. The claim was made, in the 1996 book "Our Stolen Future," that man-made chemicals attack masculine development in the womb, affecting sperm production later. No less a person than Al Gore wrote the introduction to the book, and media outlets everywhere — including U.S. News and World Report and Business Week ("the last endangered species could be us") — shouted the news.

But the news was dubious. A crucial Danish study, heavily relied upon by the authors of "Our Stolen Future," purported to measure sperm counts over 50 years, but it had very little data before 1970, and the post-1970 data, taken by themselves, actually showed an increase in sperm counts. What is more, as Messrs. Murray, Schwartz and Lichter note, it was hard to generalize from the study's samples because "[studies of] sperm counts will tend to have data on those with reason to believe that their counts are high or low." In short: The counts come from a self-selected, not a representative, sample.

What about the study linking the migration of the checkerspot butterfly to rising global temperatures? The study's author, a biologist named Camille Parmesan, noticed that the butterfly (found in the American West) had migrated about 100 miles north. She found this suggestive of "climate change" but did acknowledge that more studies were needed. The press, however, grabbed hold of her findings as if they were nearly conclusive. As Messrs. Murray, Schwartz and Lichter observe wryly: "If one swallow does not a summer make, it is at least as true that one butterfly does not a global warming prove." Among other problems, the temperature changes Ms. Parmesan noticed were caused by urbanization, and "a warming that has not affected rural southern California cannot explain the butterfly's departure from sites there."

"It Ain't Necessarily So" details how many of the "facts" that drive sensational claims derive from how numbers are defined. A dramatic jump in sexual assaults, reported breathlessly in 1992, meant only that "an altered interviewing procedure was bringing to light [hitherto unreported] incidents.” And what about the claim, from a 1987 survey, that 27.5% of college women had been raped since the age of 14? (It was reported in the New York Times.) It's "true" only if you define rape so broadly as to include women who voluntarily had alcoholic drinks or took a drug that lowered their "inhibitions" and led to consensual intercourse that they later regretted. Just eliminating this single category of rape reduces the percentage to about 3.8.

Is a liberal media bias to blame for this inexact reporting? Partly, perhaps. But there is more to it than that. What becomes news is what fits the conventional wisdom of reporters and their editors, and that "wisdom" can shift as perceptions do. For instance, the Tiananmen Square massacre prompted a surge of media attention to China's human-rights abuses that simply had not existed before, although the abuses certainly had.

In "Damned Lies and Statistics," Joel Best argues, among other things, that social activists, as they compile their own statistics, favor broad definitions. Thus advocates for the homeless "suggest that people who stay in the homes of friends or relatives — but who have no homes of their own — ought to be counted as homeless.” Perhaps so, but such sweeping latitude had better be noted by whoever is reporting the number. A similar sort of broadness showed up in the church-arson "epidemic" of 1996, which relied on a vague definition of "suspicious fires."

Mr. Best also provides some examples of what he calls "mutant statistics," error-prone numbers that get reused in different contexts. Alfred Kinsey's claim about the incidence of homosexuality in the population (10%), although regarded as too high by more precise researchers, has been used by some advocates to calculate, for example, that a third of teen suicides involve gay or lesbian adolescents. A cruder sort of mutation was evident when certain feminists, seizing on a claim that 150,000 women were anorexic, went on to claim that each year "150,000 women died from anorexia."

In his conclusion, Mr. Best comes up with a taxonomy of statistical mind-sets. The Awestruck bow reverentially before the "magical power" of numbers, understanding them hardly at all. The Naïve "know something about percentages, rates, and the like" but basically accept and disseminate the numbers they receive. The Cynical are suspicious of statistics and view them as efforts to manipulate opinion. The Critical, finally, assume (rightly) that no statistics are perfect but that flaws vary: Some are fatal and some are not. The key, writes Mr. Best, endorsing the Critical view, is to "evaluate numbers" and "distinguish between good statistics and bad statistics."

It's safe to say that Mr. Best, along with Messrs. Murray, Schwartz and Lichter, wishes that journalists would move from Naïve to Critical, and soon.
