Take time to deliberate, but when the time for action has arrived, stop thinking and go in.
- Napoleon

Tuesday, December 5, 2006

Thinking Critically About Statistics

Every day we hear the television bark out statistics in order to sell a product, an idea or an ideology. Very rarely do we stop to consider what those statistics actually mean.

Anyone with a point to prove can use, abuse, and misuse statistics in the attempt to prove that point. The trick for us is determining whether the statistics being presented are factual or fictional. Fortunately, there is no trick to it. Evaluating statistics is just a matter of critical thinking: investigating facts and determining whether they support the conclusions being offered.

Take, for instance, the so-called problem of deforestation. According to the Food and Agriculture Organization (FAO) of the United Nations, in the year 2000 there were only 22,599,300 acres of forest in the U.S.[i] However, statistics from the United States Forest Service for the same year state that there were 503,664,000 acres of federally owned forest here.[ii] Furthermore, the U.N. presents its statistics in a manner that makes them appear even worse: it lists the numbers in hectares rather than acres, so the figure appears to be only 225,993.
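
To compare figures like these, a useful first step is to put them in the same unit. Here is a minimal sketch in Python; the conversion factor (1 hectare = 2.47105 acres) is the standard one, but the sample figures are hypothetical placeholders, not the FAO's or the Forest Service's actual numbers.

```python
# A minimal sketch: put area figures from different sources into a
# common unit before comparing them. 1 hectare = 2.47105 acres is the
# standard conversion; the figures below are hypothetical placeholders.

ACRES_PER_HECTARE = 2.47105

def hectares_to_acres(hectares: float) -> float:
    """Convert an area in hectares to acres."""
    return hectares * ACRES_PER_HECTARE

source_a_hectares = 1_000_000   # hypothetical figure reported in hectares
source_b_acres = 2_000_000      # hypothetical figure reported in acres

source_a_acres = hectares_to_acres(source_a_hectares)
print(f"Source A:   {source_a_acres:,.0f} acres")
print(f"Source B:   {source_b_acres:,.0f} acres")
print(f"Difference: {abs(source_a_acres - source_b_acres):,.0f} acres")
```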

From here, one might investigate the reason for these differences in data. Perhaps the United Nations and the U.S. Forest Service have different ideas of what counts as forestland. The FAO held a summit in 2002 at which it decided that the definition of ‘forest’ would include all natural growths of trees but “excludes stands of trees established primarily for agricultural production, for example fruit tree plantations. It also excludes trees planted in agroforestry systems.”[iii] Under that definition, paper forests, citrus groves, orchards, and lumbering sites are all excluded. The citrus groves in Florida alone amounted to 832,426 acres of trees[iv] left uncounted by the FAO. It is certainly easier to convince the public of deforestation if only naturally occurring forests are counted.
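
To see how much a definition can drive the numbers, here is a small sketch that totals the same land inventory under two definitions of ‘forest’. Only the citrus-grove figure (832,426 acres) comes from the text above; every other value is a hypothetical placeholder.

```python
# The same land inventory totaled under two definitions of "forest".
# Only the citrus-grove figure is quoted in the text; the rest are
# hypothetical placeholders.

land_acres = {
    "natural forest": 5_000_000,        # hypothetical
    "citrus groves": 832_426,           # Florida citrus acreage quoted above
    "fruit orchards": 400_000,          # hypothetical
    "pulpwood plantations": 1_200_000,  # hypothetical
}

# An FAO-style definition excludes stands planted for agricultural production.
excluded_by_fao = {"citrus groves", "fruit orchards", "pulpwood plantations"}

broad_total = sum(land_acres.values())
fao_style_total = sum(acres for use, acres in land_acres.items()
                      if use not in excluded_by_fao)

print(f"All tree cover:     {broad_total:,} acres")
print(f"FAO-style 'forest': {fao_style_total:,} acres")
```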

Another instance of statistical skewing occurs when we try to investigate the numbers on mortality. According to the National Center for Health Statistics (NCHS), a division of the Department of Health and Human Services, “In 2000 a total of 2,403,351 deaths occurred in the United States.” However, if you add up the figures from the same site’s list of the Top 20 Leading Causes of Death, you get 4,143,878 deaths for that year.[v] Looking further into the NCHS death statistics, we find that 1,435,952 people died of heart disease in 2000.[vi] Yet American Heart Association (AHA) figures for the same year show only 945,836 deaths from cardiovascular disease (CVD).[vii] Now one could dig deeper to see whether the NCHS considers heart disease and CVD to be different things. Perhaps the NCHS has included deaths from other chest-related diseases. Maybe the AHA wants the world to think it is doing its job so well that fewer people are dying. Maybe the NCHS wants the world to think that more people are dying of heart disease so that a bill outlawing fast food can be passed. Upon further investigation, the facts behind mortality statistics aren’t so bleak.
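
A quick first check on numbers like these is whether the category counts are even consistent with the reported total. The sketch below uses only the two aggregate figures quoted above; the individual cause-of-death counts are not reproduced here.

```python
# Consistency check: do the category counts sum to the reported total?
# Both figures are the totals quoted in the text. If the categories sum
# to more than the total, some deaths are being counted under more than
# one cause, or the figures come from different tables.

reported_total_deaths = 2_403_351  # NCHS total deaths in 2000 (quoted above)
sum_of_top20_causes = 4_143_878    # sum of the Top 20 causes (quoted above)

excess = sum_of_top20_causes - reported_total_deaths
if excess > 0:
    print(f"Categories exceed the total by {excess:,} deaths; "
          "investigate overlap or mismatched definitions.")
elif excess < 0:
    print(f"Categories fall short of the total by {-excess:,} deaths.")
else:
    print("Categories exactly match the total.")
```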

Statistics can be innocuous (4 out of 5 dentists recommend this brand of gum; 9 out of 10 doctors prefer that type of hemorrhoid cream), or they can be worrying: “Every three minutes a woman in the United States is diagnosed with breast cancer”[viii], or “41,821 people were killed in auto accidents in 2000”[ix]. The point here is that in order to understand any statistic, one must research its sources, consider the causes behind it, and judge whether the conclusions offered are valid. One must be prepared to investigate statistics and the assertions being made with them rather than accepting them blindly or dismissing them outright.
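
Rate-style claims can be checked the same way, by converting them into an annual count that can be weighed against published figures. A small sketch; the three-minute interval comes from the quote above, and the rest is plain arithmetic.

```python
# Convert a "one every N minutes" claim into an implied annual count so
# it can be compared against published incidence figures.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def implied_annual_count(interval_minutes: float) -> float:
    """Events per year implied by a 'one every N minutes' claim."""
    return MINUTES_PER_YEAR / interval_minutes

# "Every three minutes a woman ... is diagnosed with breast cancer"
print(f"Implied diagnoses per year: {implied_annual_count(3):,.0f}")  # ~175,200
```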

Indeed, 10 out of 10 critical thinkers recommend it.

(Please note: this essay was written in 2003, so some of the links below may no longer work as they did at that time.)
[i] http://www.fao.org/DOCREP/005/Y7581E/y7581e16.htm#TopOfPage
[ii] http://www.ncrs.fs.fed.us/pubs/gtr/gtr_nc219.pdf
[iii] http://www.fao.org/DOCREP/005/Y4171E/Y4171E10.htm#P1208_84442
[iv] p. 29, http://www.nass.usda.gov/fl/citrus/cs00/cs0001.pdf
[v] http://www.cdc.gov/nchs/default.htm
[vi] http://www.cdc.gov/nchs/default.htm
[vii] http://www.americanheart.org/presenter.jhtml?identifier=107
[viii] http://www.breastcancer.org/press_cancer_facts.html
[ix] http://www.car-accidents.com/pages/stats.html

2 comments:

Janimé said...

Indeed. I would add that one must also investigate the methods the researchers used to gather those statistics.

B.E. Sanderson said...

Exactly. Use everything in your power to make sure the info you're relying on is correct. Thanks Jan.