10. The Use (Not Abuse) of Statistics

Lynn Scarlett, Former Executive Director, Reason Public Policy Institute

 

 

Economist Thomas Sowell quipped in his book Knowledge and Decisions that information is everywhere, but knowledge is rare. Sowell may have had in mind the welter of statistical data that accompanies so many public policy discussions. No matter what the issue, no matter what the philosophical or political perspective, combatants in policy debates arm themselves with data.

 

Data constitute information; they do not constitute knowledge. Knowledge results from organizing data and other information into useful sets and situating that information into a broader interpretive context. Statistical analysis is a tool for finding patterns amid informational “noise.” How meaningful those patterns are and whether the purported patterns actually exist depend on the methodological care with which an analyst proceeds. Data can be diced, sliced, merged, correlated, averaged, and extrapolated in endless ways. Not all these efforts are equally illuminating.

 

Statistical data are seductive. They lend an aura of credibility to what would otherwise appear as simple matters of opinion. But statistics can mislead. Even casual observers of public policy debates need to be wary of the perils that lurk behind numbers. Six perils stand out.

 

 

Beware the Timeframe.

 

Policy analysts often proclaim success or failure of public policies, or point to the emergence over time of new problems. These proclamations require showing how circumstances have changed from one moment in time to another. We see claims that things are getting worse—illegitimate births are increasing, the weather is hotter, students are performing worse than a generation ago. Or, we see claims that some policy has made things worse or better. Comparison of circumstances across time is a legitimate exercise. But not any old time-frame will do. Which time-frame is appropriate depends upon the topic.

 

Ten years of data may be an acceptable time-frame for assessing whether California’s mandatory class-size reduction has improved student performance. A month, six months, or even a year all present time-frames too short to draw any conclusions about class size and student performance. But a year may be an appropriate time-frame for discerning whether, say, privatizing waste collection systems is increasing or decreasing program costs. Seemingly impressive statistics can quickly become misleading or meaningless if the data cover a time-span unsuited to distinguishing real from spurious patterns.
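To make the time-frame point concrete, here is a minimal sketch in Python using invented numbers, not real test data: a series with a small genuine upward trend plus year-to-year noise can show almost any slope, even a negative one, when the fitting window is too short.

```python
# Toy illustration (hypothetical scores, assumed trend of 0.5 points per year):
# fit a least-squares line over short and long windows of the same noisy series.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(2000, 2020)                        # 20 years of annual data
true_trend = 0.5                                     # assumed points gained per year
scores = 500 + true_trend * (years - years[0]) + rng.normal(0, 5, size=years.size)

def fitted_slope(x, y):
    """Least-squares slope of y on x."""
    return np.polyfit(x, y, 1)[0]

print("slope over last 2 years:  %+.2f pts/yr" % fitted_slope(years[-2:], scores[-2:]))
print("slope over last 10 years: %+.2f pts/yr" % fitted_slope(years[-10:], scores[-10:]))
print("slope over all 20 years:  %+.2f pts/yr" % fitted_slope(years, scores))
# The two-year estimate can even carry the wrong sign; the longer windows
# recover something close to the assumed 0.5 points per year.
```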

 

 

Beware of Apples and Oranges.

 

How many times have we heard the “apples and oranges” caveat? Yet would-be pundits often ignore it, rushing to find meaning by comparing unlike situations. We often see this sleight of hand. Student test scores are getting worse, say pundits comparing SAT scores of the 1980s with those of the 1950s. But are we talking about comparable sets of students? A much larger percentage of students takes the SAT today than 30 years ago. Perhaps only those most likely to do well took the test 30 years ago. If so, mean test scores then would have been higher than today’s, even if “average” student performance was no better. Students may, on average, be performing worse today than 30 years ago, but reporting aggregate SAT scores, with no adjustment to compare matched groups of students, cannot confirm this conclusion.
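The selection effect is easy to demonstrate with a minimal Python sketch; the score distribution and participation rates below are invented, not actual SAT figures.

```python
# One ability distribution serves both "eras"; only the share of students who
# sit the exam changes, yet the mean score of test-takers drops.
import numpy as np

rng = np.random.default_rng(1)
ability = rng.normal(500, 100, size=100_000)          # same student population

def mean_score_of_test_takers(share_tested):
    """Assume the strongest students are the ones most likely to sit the exam."""
    cutoff = np.quantile(ability, 1 - share_tested)
    return ability[ability >= cutoff].mean()

print("mean score when 30% of students are tested:", round(mean_score_of_test_takers(0.30)))
print("mean score when 60% of students are tested:", round(mean_score_of_test_takers(0.60)))
# The second mean is lower even though no student's ability changed; aggregate
# scores alone cannot separate selection effects from real decline.
```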

 

 

Beware of Total Context, or Confounding Variables.

 

In a world of so much specialization, scientists often “know” a great deal about a narrow set of phenomena. They may be excellent scientists; they may control for many variables that they can conceive might be important. But their scope of knowledge may exclude critical factors. An air chemist may know little about astrophysics; a biologist may know little about nuclear physics; a meteorologist may know nothing about biology. This challenge is especially important when observed correlations among different phenomena are, though statistically significant, still rather weak. What else is going on? The problem of confounding variables can haunt even fairly simple problems. Does the advent of mandated recycling account for recent reductions in per capita waste disposal? Or are reductions the consequence of ever-improved use of materials that cuts down on the amount of stuff used per item consumed? Perhaps the introduction of user fees or waste education programs accounts for the decline. Well-done studies attempt to account for the different possible explanations of perceived events.
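A small simulation makes the hazard visible. The numbers below are invented: towns that mandate recycling are also assumed to be the towns most likely to adopt per-bag user fees, and only the fees actually reduce disposal.

```python
# Confounding sketch: a naive comparison credits recycling mandates with a
# reduction that a regression with controls attributes to user fees instead.
import numpy as np

rng = np.random.default_rng(5)
n = 2_000
user_fees = rng.binomial(1, 0.5, size=n)                    # town charges per bag?
recycling = rng.binomial(1, 0.2 + 0.6 * user_fees)          # mandates cluster with fees
waste = 1000 - 150 * user_fees + rng.normal(0, 40, size=n)  # only fees cut disposal

naive_gap = waste[recycling == 1].mean() - waste[recycling == 0].mean()
X = np.column_stack([np.ones(n), recycling, user_fees])
coef, *_ = np.linalg.lstsq(X, waste, rcond=None)

print(f"naive 'effect' of recycling mandates: {naive_gap:.0f} lbs per capita")
print(f"effect after controlling for user fees: {coef[1]:.0f} lbs per capita")
print(f"estimated effect of user fees: {coef[2]:.0f} lbs per capita")
# The naive gap is large and negative; the controlled estimate is near zero.
```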

 

 

Beware: Correlation Is Not Causation.

 

Incidence of heat stroke correlates highly with ice cream consumption. The higher the consumption, the more cases of heat stroke. Ice cream consumption also correlates with outdoor temperature—the greater the consumption, the higher the temperature. So, does eating ice cream cause heat stroke or high temperatures? Silly example. We know these correlations of different phenomena do not mean the one caused the other. Maybe they did; maybe not. Maybe causation is in the opposite direction: high temperatures may cause people to want more ice cream. In fact, these correlations simply cannot tell us anything about causation.
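The point can be checked with a minimal sketch in Python; the temperatures, sales, and case counts below are invented solely to reproduce the structure of the example.

```python
# Temperature drives both series, so they correlate strongly with each other
# even though neither causes the other.
import numpy as np

rng = np.random.default_rng(2)
temp = rng.uniform(10, 40, size=365)                       # daily temperature, deg C
ice_cream = 2.0 * temp + rng.normal(0, 5, size=365)        # cones sold rise with heat
heat_stroke = 0.3 * temp + rng.normal(0, 2, size=365)      # cases rise with heat

print("corr(ice cream, temperature):   %.2f" % np.corrcoef(ice_cream, temp)[0, 1])
print("corr(heat stroke, temperature): %.2f" % np.corrcoef(heat_stroke, temp)[0, 1])
print("corr(ice cream, heat stroke):   %.2f" % np.corrcoef(ice_cream, heat_stroke)[0, 1])
# All three correlations come out large and positive, and none of them, by
# itself, says anything about which variable (if any) causes which.
```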

 

Yet, in less obvious contexts, we see observations of correlation translated into statements of causation all the time. Raising speed limits may be correlated with fewer accidents, but did the higher speeds cause the improved safety? Or was it drunk-driving pullover programs? Better driving habits? Improved roads? Citing a single correlative relationship is insufficient to draw conclusions about cause and effect.

 

 

Beware the Relativity Problem.

 

How many times do we see pundits announce that Town X privatized its trash collection system and saved $750,000, or $1 million, or whatever? Great data? Interesting data? Important data? I don’t know. These pronouncements give me no context. The $750,000 might be a lot; it might represent minimal savings.

 

Unless I know how much was previously spent on the trash system, I have no idea whether $750,000 is a 1 percent savings, a 10 percent savings, or a 90 percent savings. And unless I have some idea of savings ranges achieved by other practices or for other services, I don’t even know if a 1 percent savings is noteworthy or not.
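The arithmetic is trivial but worth writing down. Using made-up baseline budgets, the same $750,000 spans the whole range just described.

```python
# The same dollar saving against three hypothetical baseline budgets.
savings = 750_000
for baseline in (75_000_000, 7_500_000, 833_333):
    pct = 100 * savings / baseline
    print(f"${savings:,} saved on a ${baseline:,} budget = {pct:.0f}% savings")
# Roughly 1%, 10%, and 90% -- the raw dollar figure alone says almost nothing.
```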

 

There is, of course, another “relativity” problem. We repeatedly see announcements that “Study Y found a 3 percent (or 4, 8, or whatever percent) increase in tumors among frogs (for example) exposed to some chemical.” Sounds impressive, but is it a statistically significant finding? Without more information, we don’t know. We need to know the size of the population tested; we need to know how much random variation in tumors occurs in given frog populations.
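A quick simulation shows why the sample size matters. The tumor rates below are hypothetical: a 10 percent baseline rate and an observed 3-percentage-point excess in the exposed group.

```python
# How often does pure chance produce a 3-point gap between two *unexposed*
# groups of frogs? The answer depends almost entirely on group size.
import numpy as np

rng = np.random.default_rng(3)

def chance_of_spurious_gap(n_frogs, base_rate=0.10, observed_gap=0.03, trials=20_000):
    """Simulate two unexposed groups of n_frogs and count gaps >= observed_gap."""
    a = rng.binomial(n_frogs, base_rate, size=trials) / n_frogs
    b = rng.binomial(n_frogs, base_rate, size=trials) / n_frogs
    return np.mean((b - a) >= observed_gap)

for n in (50, 500, 5000):
    print(f"{n:>4} frogs per group: chance the gap is pure noise ~ {chance_of_spurious_gap(n):.2f}")
# With 50 frogs per group a 3-point gap arises from noise alone quite often;
# with 5,000 it almost never does, so the same percentage can mean very
# different things depending on the population tested.
```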

 

 

Beware the Effects of Dynamism.

 

Policy pundits often warn, “if present trends continue,” some outcome will ensue. Human population will explode to the point of crisis; traffic congestion will snarl city roads, slowing us to perpetual jams in which we move along at five miles per hour. Extrapolating from present trends to future outcomes is tantalizing and often produces specters of future disaster. But, of course, present trends often don’t continue. In modern bumper sticker parlance, “stuff happens.” Birth rates fall; people move, change their driving hours, hop on a train, carpool, telecommute—they react and respond, and trends alter. Thus, a challenge for the analyst is to recognize the pitfalls of extrapolation.
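A toy model of the trap, with invented numbers: “reality” below follows a logistic curve in which growth slows as crowding rises, while the analyst extrapolates the straight line observed over one decade.

```python
# Naive linear extrapolation of a trend that actually saturates.
import numpy as np

years = np.arange(0, 50)
K, r, p0 = 10.0, 0.15, 1.0                                 # assumed capacity, growth rate, start (millions)
population = K / (1 + (K / p0 - 1) * np.exp(-r * years))   # feedback-driven "reality"

# An analyst standing at year 20 projects the trend of the previous decade forward.
slope, intercept = np.polyfit(years[10:21], population[10:21], 1)
projected_year_49 = slope * 49 + intercept

print(f"straight-line projection for year 49: {projected_year_49:.1f} million")
print(f"what the feedback-driven model does:  {population[49]:.1f} million")
# The "if present trends continue" forecast keeps climbing; the system it
# describes responds to crowding and levels off well below it.
```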

 

 

Think Strategically.

 

Data are seductive. Policy analysis requires careful acquisition and review of data. But data are not knowledge, and not all uses of data are equally illuminating. Indeed, facts and figures sometimes deceive.

Many data perils challenge us. But there is an even larger consideration for analysts. Jane Jacobs, author of a path-breaking critique of city planning, The Death and Life of Great American Cities, said: “one of the main things to know is what kind of problem cities pose, for all problems cannot be thought about in the same way.” Jacobs added, “Which avenues of thinking are apt to be useful or to help yield truth depend not on how we might prefer to think about a subject, but rather on the inherent nature of the subject itself.” Mental methods or strategies for thinking are, she concluded, central.

 

Jacobs pointed to three kinds of problems: 1) problems of simplicity; 2) problems of disorganized complexity; and 3) problems of organized complexity. The first are two-variable problems: gas pressure, for example, depends mainly on gas volume. “The essential character of these problems rests in the fact that … the behavior of the first quantity can be described to a useful degree of accuracy by taking into account only its dependence upon the second quantity and by neglecting the minimal influence of other factors.”

 

Problems of disorganized complexity involve billions of variables, as Jacobs noted. Predicting motion of a single billiard ball on a table may be a simple problem, but analyzing ten or fifteen balls on the table at once quickly becomes intractable. However, what one can do is ask “on average” types of questions: on average, how many balls will hit the rail at any particular moment; on average, how far does a ball move before being hit; and so on. This kind of problem can be seen as one of disorganized complexity. Individual variables may be impossible to analyze, but the system can nonetheless be characterized in ways that make some analysis possible. Probability theory and statistical mechanics become useful tools of “truth-searching,” albeit constrained by many methodological caveats.
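A generic illustration (not Jacobs’ own billiard calculation) of the “on average” strategy: in the Python sketch below, no single random walker’s final position is predictable, yet averages over the whole collection are stable and analyzable.

```python
# Disorganized complexity in miniature: 100,000 random walkers, 100 steps each.
import numpy as np

rng = np.random.default_rng(4)
steps = rng.choice([-1, 1], size=(100_000, 100))     # each walker steps left or right
positions = steps.sum(axis=1)                        # final position of every walker

print("one walker's final position:", positions[0])                    # effectively unpredictable
print("average final position:", round(float(positions.mean()), 2))    # close to 0
print("average distance from start:", round(float(np.abs(positions).mean()), 1))
# The last figure sits near sqrt(2 * 100 / pi) ~ 8: a stable aggregate fact
# about a crowd whose individual members cannot be predicted at all.
```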

 

But not all problems are suitable subjects for probability theory. A third category, problems of organized complexity, carries a fundamental challenge in that it is not the sheer number of variables but the degree of interdependence among variables that poses analytical difficulties. Jane Jacobs saw the “problem” of understanding how cities function as just this sort of problem of organized complexity. Dozens of factors in the city are all “varying simultaneously and in subtly connected ways.”

 

Jacobs viewed problems of organized complexity as requiring three different “habits of thought.” Illumination of these problems necessitates, she proposed, thinking about processes; working inductively from the details or particulars to the general; and seeking “unaverage” clues involving very small quantities, which reveal the way larger and more “average” quantities operate. These “unaverage” clues are akin to what Nobel laureate economist F.A. Hayek referred to as the knowledge of time, place, and circumstance.

 

Cities are just one example of organized complexity. Many human actions and human institutions involve matters of organized complexity, where feedback loops and interconnections produce complex order. Treating these problems as matters of disorganized complexity, involving large numbers of randomly acting variables, will mislead the analyst into thinking, as Jacobs put it, that “any specific malfunction [of a city] could be corrected by opening or filling a new file drawer”—for example, a new housing program or a new recreation program.

 

Knowledge requires discerning, first and foremost, what kind of problem we are analyzing, so that we apply the appropriate analytical tools. Statistical analysis may be enormously useful in helping us understand questions of disorganized complexity; process analysis, induction, and knowledge of specifics—“unaverage” clues—may be more illuminating tools for understanding matters of organized complexity.

 

 

Lynn Scarlett, former deputy secretary and chief operating officer of the U.S. Department of the Interior, is a visiting scholar at Resources for the Future in Washington, D.C., and an environmental advisor working on climate change adaptation, ecosystem services, water, landscape-scale conservation, and science and decision-making. In 2009, she served as a visiting lecturer on climate change at the University of California Bren School of Environmental Science and Management and is co-teaching a course on climate change at the Massachusetts Institute of Technology. She is a fellow of the National Academy of Public Administration.

 

From 2005 to January 2009, she served as deputy secretary and chief operating officer of the U.S. Department of the Interior, a post she took on after four years as the Department’s assistant secretary for Policy, Management and Budget. She served as acting secretary of the Department for two months in 2006. Scarlett chaired the Department’s Climate Change Task Force and now serves on the national Commission on Climate and Tropical Forests. Scarlett is author of numerous publications on incentive-based environmental policies. Her most recent publication, “Green, Clean, and Dollar Smart,” on urban greening, was published in February 2010 by the Environmental Defense Fund. Scarlett serves on the board of the American Hiking Society, the Continental Divide Trail Alliance, and RESOLVE (a nonprofit environmental dispute-resolution organization), and is a trustee emeritus of the Udall Foundation. She received her B.A. and M.A. in political science from the University of California, Santa Barbara, where she also completed her Ph.D. coursework and exams in political science and political economy. She is an avid hiker, canoe enthusiast, and birder.

 
