There is nothing more deceptive than an obvious fact. Evidence is a very tricky thing; it may seem to point straight to one thing, but if you shift your own point of view you may find it pointing in an equally uncompromising manner to something entirely different.

- Arthur Conan Doyle


In a recent survey, 90 percent of respondents said they were much less biased than the average person, and 80 percent also believed they were much better informed. In another survey, almost all respondents thought they either “understood” or “thoroughly understood” a particular topic, despite it having highly technical details on which knowledgeable experts are few and far between. The number who admitted they did not understand was no greater than the survey's own margin of error.


Almost all of us sincerely believe we are both remarkably fair-minded and especially well informed. Social scientists have documented the overconfidence effect: tests reveal that the greater a person's confidence that they are correct, the more likely it is they are mistaken. One interpretation of this extraordinary finding is that confidence is a significant disadvantage for anyone who aspires to hold correct opinions. Uncertainty, or humility, it appears, is an asset.


How can this be reconciled? In the quotation above, from the creator of literature's celebrated detective Sherlock Holmes, being misled is simply a matter of looking at things from the wrong perspective.


Philosopher of science Karl Popper (1902–1994) takes bias a step further than Holmes: “Whenever a theory appears to you as the only possible one, take this as a sign that you have neither understood the theory nor the problem which it was intended to solve.” Popper is suggesting that it is exceptionally easy to misunderstand issues, and that human intransigence affects our attitudes to such a degree that the door will always be open for biases to enter. Patricia Grace Devine, professor of psychology at the University of Wisconsin–Madison, has studied bias by focusing on racial prejudice. She proposes a three-step program to solve the problem of bias: 1] become aware of one's inherent biases; 2] recognize the consequences of those biases and empathize with the people they affect; 3] replace the biased responses with ones that reflect conscious, non-prejudiced beliefs. This approach, with step three being ill-defined, seems based on the perception that biases are bad habits that can be overcome with some awareness and focus. That opinion is endlessly repeated in other research, but observations and anecdotes indicate the approach merely replaces one bias with a more complex and intransigent one.


Common definitions of bias are imperfect because they rely on emotional or ambiguous terms such as “preconceived,” “lack of fairness,” and “narrow viewpoint.” An effective definition should allow explicit evaluations and clear-cut conclusions. This is a more useful description: an over-emphasis on, or focus upon, the negative (or positive) aspects of an object, together with an under-statement or disregard of its positive (or negative) aspects, as compared to other objects (specified or implied). Here an object can be a person, an idea, a product, or a specified group of people.


A number of observations follow from this. Bias is not a problem that can be solved by fiat, or by a set of prescribed beliefs, because no person or group is qualified to make such judgments. We need to acknowledge certain realities about the bias puzzle. To start, we must admit that bias itself is not the real problem: the real problem is the stubborn belief that “other” people are biased while you, and those who agree with you, are not. We are all influenced by our biases; the big difference between people is that some are aware of this while others are not.


The study of bias addresses these important questions.