Find the evidence
10:38 pm
Mon June 23, 2014

Commentary: Teaching People To Search For Statistical Truth

About four years ago I realized that, although my business statistics students were understanding the math fairly well, they didn't “get” the statistics part. When asked to apply their knowledge, they were fairly clueless. My challenge was to change that. But how?

That challenge had another prompt: In their book “Academically Adrift,” Richard Arum and Josipa Roksa provide strong evidence that students experience only small to nonexistent gains in critical thinking, complex reasoning and writing skills over their entire undergraduate education.


Not surprisingly, these are the very skills that society and business expect new graduates to master. A poll of employers released by the Association of American Colleges and Universities in 2013 found “broad support for the idea that students should learn to think critically, communicate clearly, and solve complex problems,” or what the association described as "a 21st-century liberal education."

Critical thinking skills are imperative for all ages, not just college graduates, to actively participate in our 21st-century democracy. In this age of shouted punditry on the right and left, an informed citizenry must be able to look behind the sound bites and factoids that pass for evidence in our so-called public discussions.

Even though I had been teaching statistics for 20 years, it didn’t dawn on me until recently that statistics is an ideal vehicle for teaching these essential critical thinking and communication skills. Carefully examining the use of statistics provides more than ample examples of both good and bad evidence-based thinking. So to tackle both challenges outlined above, I developed a series of homework assignments called Statistics as Evidence.

The charges I give my students should be relevant to all:

Context: All too often, stand-alone numbers are used when telling a story, whether it be economic or political or scientific. Yet, without context these numbers have little or no meaning. For example, the U.S. unemployment rate was 6.3 percent in April. Is that good or is that bad?

That number is calculated by the Bureau of Labor Statistics and its monthly history goes back to 1948. Now you have context! An interactive graphing program even allows everyone to see the same information. Is it high? Will it come down? Those questions can be addressed, albeit with a definite dash of uncertainty, but at least everyone has the same starting point.
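“Adding context” can be sketched in a few lines of code: compare one number against its own history. The rates below are a made-up sample for illustration, not actual BLS data; the April figure is the one cited above.

```python
# A minimal sketch of putting a single number in context by comparing
# it against a sample of historical values. The historical rates here
# are invented for illustration; real figures come from the Bureau of
# Labor Statistics' monthly series going back to 1948.
historical_rates = [3.5, 4.0, 4.6, 5.0, 5.8, 6.1, 7.2, 8.5, 9.6, 10.0]
current = 6.3  # the April rate cited in this commentary

# What share of the sample history falls below the current value?
below = sum(r < current for r in historical_rates)
percentile = 100 * below / len(historical_rates)
print(f"{current}% exceeds {percentile:.0f}% of the sample history")
```

Against this toy sample, 6.3 percent sits in the middle of the range; with the full 1948-to-present series, the same comparison gives everyone the same starting point for arguing whether the rate is high or low.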

Definition: In this example, what do the numbers actually stand for? The rate measures only those who are actively looking for work, not everyone who is unemployed. Even though this is a flawed measure, as every measure is in some dimension, the good news here is that you know what it does measure and what it doesn’t. We can infer, then, that the published unemployment rate understates the true extent of unemployment.
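The definition can be made concrete with a short sketch. All counts below are invented for illustration (they are not BLS figures); the point is that discouraged workers who have stopped searching are excluded from both the numerator and the denominator, so including them would push the rate up.

```python
# Sketch of the headline unemployment-rate definition: unemployed
# people actively looking for work, divided by the labor force
# (employed + actively looking). All counts are hypothetical.

def unemployment_rate(actively_looking, employed):
    """Published rate: counts only those actively seeking work."""
    labor_force = employed + actively_looking
    return 100 * actively_looking / labor_force

employed = 145_000_000        # hypothetical employed count
actively_looking = 9_750_000  # hypothetical unemployed, actively searching
discouraged = 2_000_000       # hypothetical discouraged workers, NOT counted

official = unemployment_rate(actively_looking, employed)

# A broader measure that also counts discouraged workers in both the
# numerator and the labor force comes out higher:
broader = 100 * (actively_looking + discouraged) / (
    employed + actively_looking + discouraged)

print(f"official rate: {official:.1f}%")  # 6.3% with these numbers
print(f"broader rate:  {broader:.1f}%")   # 7.5% -- official understates
```

The broader rate always exceeds the official one whenever there are discouraged workers, which is the arithmetic behind the claim that the published rate is an underestimate.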

Advocacy: You’ve heard the refrain, “There are three kinds of lies: Lies, Damn Lies and Statistics,” but what does it mean? Can we trust any statistics?

To address the critical issue of advocacy and its effects, I have students visit the websites of two diametrically opposed organizations: the National Rifle Association and the Brady Campaign to Prevent Gun Violence.

I tell my students that I don’t really care what their personal views are regarding guns. Even though many have strong opinions on either side, I ask them to try to step back and look at the numbers. Where do they come from? Are the sources trustworthy? Have the advocates used the data in the appropriate context?

It takes some real unlearning for students to adopt a more evidence-based approach, and it doesn’t happen right away; I recognize this. When I grade their homework, I am looking for arguments, and the data to back up those arguments. I am not looking for positions! A statistical truth is out there, but we must search for it in many and sometimes uncomfortable places.

This approach to looking at data (adopting evidence-based thinking) is being supported by some new outlets and websites. One of the most prominent is Nate Silver’s new FiveThirtyEight site. And a newish movement called data journalism tries to start with the data, and then draw conclusions. For more information, check out the Tow Center for Digital Journalism at Columbia Journalism School.

My hope is that we are seeing a new path forward that emphasizes evidence and data, taken in proper context, over punditry, leading to sounder conclusions and better decision making in both the public and private arenas. The work is just beginning.

Mark Ferris is an associate professor in the John Cook School of Business at Saint Louis University. He tweets at @ferristician

Listen to the shorter radio version of this commentary.
