During my days at Uni, I was fortunate to take a few elective classes from the School of Philosophy. Many of these classes were lessons in how to think and of questionable use in practical day-to-day scenarios – in my Ethics class, for example, we were taught how to put forth the argument for why children and people with disabilities should be granted the same rights as a leafy vegetable or a garden rock – but you’ll be pleased to hear that we were also equipped with the counter-arguments.

There was, however, one class that has proven invaluable in the field in which I find myself – the field of research. That class was Critical Thinking, and it gave me a grounding in how to use the mental processes of discernment, analysis and evaluation to form a solid judgement that reconciles scientific evidence with common sense, i.e. separates valid arguments from B.S. (and that’s not Baruch Spinoza, one of the great rationalists of 17th-century philosophy). It also outlined the main logical fallacies people use in argument every day. A good example of the ‘loaded question’ fallacy is to ask someone, “Have you stopped acting like a moron? Yes or no!”, which presupposes the person has been acting like a moron. The respondent looks bad no matter what.

In research, the ability to interpret data with a fistful of common sense can mean the difference between a marketing campaign that works and one that delivers a poor return on your investment.

“Oh, people can come up with statistics to prove anything. 14% of people know that.” – Homer J. Simpson

The truth is… statistics lie! Or, more to the point, they are open to interpretation and, at worst, they can lead you in completely the opposite direction. Here is an example of how a poll in the US could be used to show the opposite of the truth.

—–

Princeton’s Office of Public Opinion Research once did a poll of people to determine racial attitudes. People were asked if they felt that blacks had as good a chance to get a job as a white.

Other questions were asked to determine the racial attitudes of the person. Two-thirds of those who were sympathetic to blacks said that blacks had a poorer chance of getting a job. Two-thirds of those showing prejudice felt blacks had an equal chance of getting a job.

Thus, the poll could be done during a time of relative racial harmony, then again during a time of racial strife. As racial attitudes worsened, the poll could show “Blacks have an increasing likelihood of being able to get jobs.”

Source: Bad Use of Statistics and Polling by Dr. Fred Worth

—–
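To make the excerpt’s arithmetic concrete, here is a minimal sketch in Python. Only the two-thirds splits come from the excerpt above; the mix of sympathetic versus prejudiced respondents is a number I’ve made up for illustration.

```python
# Rough arithmetic behind the poll paradox.
# From the excerpt: 1/3 of sympathetic respondents and 2/3 of prejudiced
# respondents say blacks have an "equal chance" of getting a job.
def share_saying_equal_chance(sympathetic_fraction):
    return (1 / 3) * sympathetic_fraction + (2 / 3) * (1 - sympathetic_fraction)

# Illustrative (made-up) population mixes:
print(round(share_saying_equal_chance(0.7), 2))  # relative harmony    -> 0.43
print(round(share_saying_equal_chance(0.3), 2))  # worsening attitudes -> 0.57
# As attitudes worsen, the headline "equal chance" figure actually goes up.
```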

How about this statistic –

More doctors smoke brand X than any other brand.

That may very well be true, but does that mean that smoking brand X is a healthy choice?

And we’ve all heard this one –

3 out of 4 dentists choose brand Y toothpaste.

That’s an easy claim to make: just find three dentists who prefer brand Y and one who doesn’t. Does this mean that using brand Y toothpaste is the better choice?
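Part of the problem is sample size. Here’s a minimal sketch, using a hand-rolled Wilson score interval and sample sizes I’ve made up, of how little a “3 out of 4” result tells you on its own:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """Approximate 95% confidence interval for a proportion (Wilson score)."""
    p_hat = successes / n
    denom = 1 + z ** 2 / n
    centre = (p_hat + z ** 2 / (2 * n)) / denom
    half_width = z * math.sqrt(p_hat * (1 - p_hat) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half_width, centre + half_width

# "3 out of 4 dentists": with n=4 the true preference could plausibly be anywhere
# from roughly 30% to 95% -- the claim is almost meaningless.
print(wilson_interval(3, 4))

# With 300 out of 400 the interval tightens to roughly 70%-79%, which actually says something.
print(wilson_interval(300, 400))
```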

I’ll give you another example of how misinterpreting research results leads to an invalid argument. Suppose I ran a mini golf centre and wanted to study the average age of my customers. I survey 63 of my customers over the weekend and find that their average age is around 20. Good conclusion? Do I focus my marketing efforts on 20-year-olds? Well, it turns out that my customers are really fathers bringing along their children, so hardly anyone on the course is actually 20. The frame of reference is vital to proper interpretation of statistics.
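Here’s a minimal sketch of that averaging trap in Python. The ages are numbers I’ve invented to mimic the fathers-and-kids mix; the point is only that the overall mean describes nobody who actually plays:

```python
import statistics

# Hypothetical weekend sample: each group is a father (around 40) with a couple of kids (around 8-12).
ages = [41, 9, 11, 38, 8, 12, 44, 10, 9, 39, 11, 8, 43, 10, 12]

print(round(statistics.mean(ages)))  # -> 20, the "average golfer" who doesn't exist

# Split the sample into adults and children and the real picture appears.
adults = [a for a in ages if a >= 18]
kids = [a for a in ages if a < 18]
print(round(statistics.mean(adults)), round(statistics.mean(kids)))  # -> 41 10
```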

As researchers, we are constantly given data on which to base our recommendations, and we would be in trouble if we didn’t question the context, methodology and validity of the data in our research – and make sure we don’t draw conclusions that defy common sense.