The Bias Problem and Opinions


This page shows how our personal biases play a major role when we form opinions based upon inadequately considered facts - especially when we believe we are relying on scientific and technical knowledge.


Many people form their opinions based upon the findings of science. This is highly commendable, provided one is aware that technically complex* topics need to be looked at from multiple perspectives before a rational opinion can be formed. This investigative step is rarely done well, so biases find their way into our values and beliefs. I call this "The Bias Problem". Many people, including scientific and other experts, claim they can readily detach themselves from those biases. But is this true?

 

My career as a forensics investigator (a topic expert who assists lawyers and courts in understanding engineering and scientific issues) has been a fascinating journey in how to deal with "The Bias Problem". That I managed to uncover helpful steps to deal with bias was certainly not because I was clever. Over time, I simply became conscious of my many shortcomings; and along with this awareness came an irresistible urge to eliminate them. Forensics opened my eyes because, once designated an "expert", I was exposed to relentless cross-examination by sharp lawyers from opposing sides, ably assisted by subject-matter experts from a multitude of related technical fields. Not many people are subjected to this level of technical interrogation of their opinions, so few are given the opportunity to learn from it.

 

No matter how open-minded and objective you believe your opinion to be, there will almost certainly be another impartial and honest person who believes the exact opposite 

 

Before starting in this line of work I was vaguely aware of the perils of the job. I also knew that statistical studies were unanimous: experts' opinions are often less reliable than blind chance. So, I dug into the findings of the social sciences and the philosophy of science to learn why critical and logical thinking can so readily go off the rails. My first finding was that merely being aware of the various biases and logical fallacies is insufficient to avoid being wrong. What is needed, as a minimum, is to reflect deeply on the potential ways bias can affect one's thinking, on a moment-by-moment basis; and few people are capable of that degree of awareness and self-control.

 

What also surprised me was that many colleagues in forensics, and those in academia, did not seem to share my passion for discerning good conclusions from bad. They appeared to assume that by focusing on their own specialist knowledge, and always using what they perceived to be quality procedures, their academic training and environment would effectively immunize them, and their colleagues, from biases and logical mistakes. They failed to recognize that facts about complicated topics always require interpretation; and it is during this step that bias infiltrates our thinking processes.

 

Scientists with this attitude of invulnerability are subjected to another powerful influence: they are trained, encouraged, and often financed to address issues which are, in some way, restricted to a particular point of view (never a good idea for intricate scientific work) - what could be described as the academically correct (AC) one. Such scientists then find it virtually impossible to recognize their critical errors; and, worse, they resent those who point out those mistakes.

 

The field of science assumes the reasoning processes of the trained human brain are dependable for its purpose; but rarely asks how a brain - formed by chance and survival strategies alone - could turn out to accurately discern truth

 

To demonstrate this issue I will briefly address three current topics - not to provide a specific opinion but to demonstrate how, without any bias, an issue should be framed. Each topic, however, is almost always framed entirely differently by the media and academia, with the important perspectives disguised or hidden. The approach focuses on logic and the construction of a bias-free (as far as practical) statement that all sides can accept as a starting point for any resolution - a Preferred Perspective Statement. The following are proposed:

 

1] In the debate over the origin of life and the subsequent development of species, the preferred perspective is that a fundamental and necessary component of "life" has now been discovered. Over the last fifty years scientists have distilled this discovery to be: an extraordinarily compressed and multi-layered, digitally coded "information system" of immeasurable complexity and precision, which evolution theory has yet to adequately explain.

2] In the discussion about our changing climate, the preferred perspective is that the physics, the chemistry, and the biology that naturally affect the climate must be fully and accurately quantified, using the more important concept of global heat content (not an average global air temperature), before it will be possible to get to grips with the effects of human activities on the climate.

3] On the issue of a reliable and efficient public energy supply, the preferred perspective is that every component of each current public energy type - its efficiency, its reserves, its production, equipment, and infrastructure lead-times, its transportation, its storage, its required back-up systems (typical/extreme), its waste recycling, its packaging/distribution, and its expected life - must be determined before the issues of ecology, sustainability, health & safety, overall carbon dioxide release, and all-inclusive costs can be calculated and evaluated.

 

If you take issue with these, write your own preferred perspective, analyse it for effectiveness, bias, and logic, and I would be delighted to consider it.

 

Bias results from allowing one's emotions to accept facts which are likely selfish, generally comfortable, and mostly simplistic; and to reject facts which are annoying, embarrassing, anomalous, or difficult to accept

 

We need to know why this blindness among experts occurs so often. The social sciences have extensively studied biases and poor thinking. Jonah Lehrer, author of How We Decide (2009), concludes: the brain is awash in feelings… even when we try to be reasonable and restrained, these emotional impulses secretly influence our judgement. Dr. D. Kahneman of Princeton University, author of Thinking, Fast and Slow, states: I am generally not optimistic about the potential for personal control of biases. And Dr. K. Stanovich of the University of Toronto, in his book What Intelligence Tests Miss, presents research which shows: the correlation between unbiased reasoning and intelligence (is) nearly zero.

 

A 2010 study* carried out at the University of Neuchâtel in Switzerland provides a powerful clue to understanding the bias problem. The authors conclude: Humans (including scientists) do not use reasoning as a way to expand knowledge and make better decisions, as is generally assumed. Rather, human (including scientific) reason is no more than a means to construct arguments for the purpose of persuading others. Of course, the additions in parentheses are mine; but this astonishingly new way of looking at human reason needs to be rephrased: Humans are intrinsically biased to such an extent that they will essentially never see their own biases, even as they describe them in others. In truth, their goal is simply to win arguments and debates.

 

Professor Owen M. Williamson at the University of Texas at El Paso provides a perfect example of this type of academic bias: he developed a list of 146 logical fallacies, or what he calls junk cognition. What is interesting is that his purpose for the list is not to educate people about these thinking traps so they can improve their personal thought processes. Rather, he developed the list for "intellectual self-defence", so his supporters could recognize the "fake or deceptive" arguments of others. The unspoken assumption is that he, and presumably his approving colleagues, are so perfect in their thinking that they have no need for the list, being invulnerable to any type of thinking disorder.

 

From the above we see that regular folk will be plagued with biased opinions. But the same seems equally true for scientists and experts. That fact corresponds with our daily experiences: we have all come across lawyers, book and movie reviewers, accountants, and news pundits who frequently hold poor opinions. There are reports that as much as 90% of the medical knowledge of the recent past has now been judged to be substantially or even completely wrong. A 20-year study of the work of political experts found their predictions were no better than the spin of a coin. As this is likely true for scientists as well, they cannot - as they do now - rely on peer review and simply assert they are without bias; they have a primary responsibility to draw attention to the steps they have taken, at every stage of their research, to ensure that their work has addressed The Bias Problem.

 

Those whose views are narrow-minded cannot simply be reasoned out of their biases, because it was not reason that caused their bias in the first place

 

Findings suggest that a large share of human bias is caused by our egos - what I call Egoistic Bias. This is not to be confused with Egocentric Bias, which simply describes everyone's personal and legitimate points of view. An egoist, on the other hand, has little to no regard for the experiences, interests, beliefs, or attitudes of others - acting in ways that appear similar to many of the higher animals. These attitudes combine conceit, lack of humility, greed, and even a lack of respect for people whose experiences differ from their own. Egoistic Bias is such that it can entirely take control of a person's thought processes when one of the following is involved: a strong belief in an idea, a need to influence or judge others, a need for affirmation, and, especially, a fear that their beliefs may be wrong.

 

My observations regarding ego imply a prevailing mind-set or attitude in which one or more of the following exist: my needs are likely more important than yours, my emotions are likely more authentic than yours, my ideas are likely more valuable than yours, my empathy is likely more sensitive than yours, my group of people is likely more knowledgeable than yours - and certainly my assumptions will be more correct than yours. The conviction of the chronic egoist is, therefore: you are biased, and I am not. And, judging by their language, many experts, researchers, and distinguished university professors do hold these beliefs. Egoistic attitudes certainly account for the verbal insults that flourish in so many discussions and debates.

 

A dramatic example: philosopher and author J.D. Trout, in his book The Empathy Gap, discussing ways to control biased people, claims: Self-control doesn’t work. We can only overcome our biases… by (government) policies that regulate our behaviour from a distance and control our options. He predictably fails to explain how a group of people (political thinkers), themselves subject to Group-Think, or the Bandwagon Effect, could be free of bias. A dangerous example of a personal ego mind-set.

 

So, what are the steps required to construct a proposition that is most likely to be true? The first is to bury one's ego, along with any investment one may have in the outcome. This is difficult to achieve, and impossible if one's image and career depend upon a particular outcome. Next, the question should be inspected from each of five independent lines of reasoning:

1: Has logical thinking (for a scientific issue, this means the original description of the scientific method) been scrupulously applied and subjected to rigorous falsification?

2: Have all assumptions been examined and compared, with all conceivable points of view evaluated?

3: Have all logical consistencies been examined and compared, without circular or self-refuting reasoning being involved?

4: Have all mathematical consistencies been examined and compared, tested by scrupulously applied mathematics (not theoretical math), and do any probability calculations consider all relevant variables?

5: Have all philosophical consistencies been examined and compared, considering all of the known human biases?
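For those who like to keep track of such a review systematically, the following is a minimal, purely illustrative sketch (in Python) of how the five lines of reasoning could be recorded as a checklist for a given proposition. The names and structure are hypothetical and are not part of the method described above; they simply make the point that an opinion should not be held until every line of reasoning has been honestly addressed.

```python
# Illustrative sketch only: a simple checklist for the five lines of reasoning.
from dataclasses import dataclass, field

LINES_OF_REASONING = [
    "Logical thinking scrupulously applied and subjected to rigorous falsification",
    "All assumptions examined, with all conceivable points of view evaluated",
    "Logical consistency checked, with no circular or self-refuting reasoning",
    "Mathematical consistency checked, with all relevant variables in any probability calculation",
    "Philosophical consistency checked against the known human biases",
]

@dataclass
class PropositionReview:
    proposition: str
    # Each line of reasoning starts as unaddressed (False).
    checks: dict = field(default_factory=lambda: {q: False for q in LINES_OF_REASONING})

    def mark_addressed(self, question: str) -> None:
        if question not in self.checks:
            raise KeyError(f"Unknown line of reasoning: {question}")
        self.checks[question] = True

    def outstanding(self) -> list:
        """Lines of reasoning still waiting for an honest answer."""
        return [q for q, done in self.checks.items() if not done]

    def ready_to_hold_opinion(self) -> bool:
        return not self.outstanding()

if __name__ == "__main__":
    review = PropositionReview("Example proposition under examination")
    review.mark_addressed(LINES_OF_REASONING[0])
    print("Still outstanding:", review.outstanding())
    print("Ready to hold an opinion:", review.ready_to_hold_opinion())
```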

  

Bias: the dogmatic refusal to honestly consider opposing points of view

 

In most cases, input on each question will be required from people holding a variety of opinions. This does not mean relying upon social media, with its emphasis on feelings, or upon the popular search engines and other media. These sources are driven by sophisticated misinformation techniques. It is therefore sensible to ask: have I honestly studied and analysed the topic adequately, without relying upon typical internet sources (e.g. Google, Wikipedia, Snopes#)?

To quote Richard Feynman, one of the most renowned physicists of the 20th century: The first principle (of knowing truth) is that you must not fool yourself; and you are the easiest person to fool.

  

Science and reason that are prejudiced are the worst form of bias, because science and reason are the instruments for liberation from bias

Paraphrased from Allan Bloom – The Closing of the American Mind


* Complex Topic - one that cannot, or has not, been tested or observed and fully resolved in a tightly controlled experiment - typically in a laboratory - with all variables realistically represented. See Bias in Science for the differences between Complex Science and its counterpart, Foundational Science.

 

# Google, Wikipedia, Snopes - the reason typical internet sources need to be used cautiously is that they respond to one or more of the following: Popularity, Promotion, Presuppositions. Biases are hidden in all three.

 


Other Links Connected to this Topic

The Big Three Biases

 

Each of The Big Three Biases is influenced by Egoistic Bias in some way, but the Overconfidence Effect - one of the Big Three - is directly related to Egoistic Bias

 

Learn more about forensic engineering and science at the Bias & Forensics page

The Three Bias Reflexes

 

Each of The Three Bias Reflexes causes and influences our biases; but it is the third, called Egoistic Bias, that tends to cause the most problems in our thinking





Biases start with human nature, but they are energized and directed by a lack of control of one's emotions - often egoistic emotions