"Most (people) when they are thinking are merely rearranging their prejudices."

– William James, psychologist


"The subjective human dimension of science can never be divorced from the objective, because everyone operates with presuppositions (biases)."

– William Whewell, originator of the word "scientist" in 1833


Bias Aware Thinking (BAT) is a way to enhance critical thinking by being aware that human biases will affect people’s thinking processes - even the thinking of the most intelligent and sensitive.



BAT provides a practical approach to what is called Evidence-Based Thinking. It touches on the often theoretical topics of the theory of knowledge and the nature of evidence. Evidence, discussed further below, is information that supports or challenges a claim, theory, or argument.



Bias Aware Thinking is based upon how people learn and how they think. We learn using similar methods, but we often end up thinking in entirely different ways, because the thinking process interprets what we have learned according to our subconscious assumptions. It is this internal discussion we all have with ourselves that forms our beliefs, attitudes, and opinions.


Learning basic facts is the straightforward part: it is achieved by four interrelated faculties of the human brain: direct perception (from our senses), direct input (from the learning of others), induction (from our personal experiences), and deduction (from the use of our logic). However, this learning provides little assurance, to ourselves and especially to others, that the facts we have learned are actually true.


But our thoughts, understandings, and beliefs are formed by a subtly connected system. They follow from the totality of what we have learned (and presumably remembered), supplemented by two additional factors: our trust in, or disbelief of, what we have learned, combined with our inner sense of what we think it is right to believe. These subconscious trusts and convictions focus our attention, form our beliefs, and build our biases. If we are not careful we can become overconfident as we juggle the data we accept against the skepticism we should retain - bearing in mind the personal way we gathered our "evidence". Put more simply: we develop our understandings from some combination of our observations, our feelings, and our trust in a selected "expert".


The brightest and most scholarly possess no magical way of learning, nor are they immune to the biased knowledge and subconscious assumptions they bring to the "thinking table". They are regular folk like everyone else. Indeed, of all the ways of learning - other than from the output of Foundational Science (see Bias in Science) - none guarantees rational and true beliefs. Scientists do, however, have very specific knowledge of certain facts and methods. The learning of scientists can include: formal logic; advanced mathematics (e.g. statistics, partial differential equations and their solution methods, and tensor analysis)*; the use of test and experimental equipment, instruments, and data acquisition*; quality procedures*; specifications and reporting*; and collaboration with peers*.


* I have studied each of these specialist topics with respect to engineering. They increase one's knowledge in certain areas but provide no assistance in verifying evidence or the truth of a belief. Experience as a forensic investigator is more helpful.


But what we learn is secondary to the Evidence we build (see below), which may include the trusts and convictions we acquire. Bias Aware Thinking recognizes these issues and provides techniques to resolve many of the problems that result. See the sections that follow on this page.



Evidence is not a precise commodity. Whether in law, complex science, or everyday life, it comprises an assortment of objects, data, testimony - indeed any sort of clue that supports or challenges a claim, theory, or argument. With complicated issues a single piece of evidence can rarely, by itself, establish proof. In intricate cases evidence comes as a diverse package, which a philosopher of science would call an "Evidence Base". Because individuals, and even monolithic groups, almost always possess different evidence bases, they may reasonably hold different opinions. This would be true even if each had an equal amount of rationality and logic.


Given the idea of an evidence base, fully rational belief is not as straightforward as weighing the direct evidence. No person or group possesses irrefutable logic, or has access to every piece of evidence that could apply to the matter at hand. People also fail to disassociate themselves from their peripheral or apparently unrelated knowledge, and from the trusts they have developed over their lifetimes. So they end up considering all the evidence at their disposal, not just the immediate evidence involved in the question. This is not censure but a statement about reality. It is why honest and rational people sometimes need to agree to differ.


Western culture currently has a problem in not recognizing this issue. It is claimed that only selected parts of academia and science possess an acceptable evidence base for establishing incontrovertible truths. But this opinion presupposes that the acceptable evidence base is logical, complete, and without bias. Bias Aware Thinking rationally rejects those assumptions, because biases, lapses in formal logic, and unawareness of certain information pervade all thinking, including the ideas and beliefs of scholars and scientists.


If you are comfortable with the idea that human biases have been conquered by our society's scholars, and that bias is not an important issue, then you can reject the concept of "Bias Aware Thinking". But first you should consider that scientists repeatedly tell the consumers of their work that the public must not trust their own observations. Yet when a scientist "observes", the observation may include any type of scientific data, even when it involves inferences, conjectures, and judgments. Until academia can establish what allows its members to make flawless judgments while everyone else is at the mercy of human frailties, the discussion over evidence will continue. At the moment scientists cannot even recognize the variation in the quality of modern-day scientific practices.




Another way to frame the question above is to ask: why do our evidence bases vary so much? We address that question here:


We are all surrounded by our personal experiences and our social environment. A student's environment will differ from that of someone who works on a farm, in a factory, a school, an office building, or on a construction site - or of someone who has no job. If we live in a small town our environment will differ from that of a city dweller. We are encircled by people - family, friends, and mentors - and by other influences: the newspapers, internet sites, and books we choose to read. The information from these people and environments will vary: some of it will be true, some of it false; but most of it will be unclear and almost certainly contradictory.


Out of this farrago of "evidence", ideas, and thoughts, you will form your beliefs and opinions. And you will do so based upon the reflexive degree of trust (and mistrust) you have in the source of the data - almost certainly not upon the intrinsic quality of the data. (Foundational Science is an important exception - see Bias in Science.)


It is not surprising that we rely on trust for our opinions, as we also rely on it for our actions: we buy a book based on our trust in the reviews or the reputation of the author; we take a job trusting we will be paid. We even sign contracts trusting the other side will generally fulfill the details described in the legal document. Ultimately everything in life is based upon trust.


So of course we will be astonishingly biased - especially those involved in "groups" consisting of people with common beliefs and understandings.





Two principles follow from BAT:

1: Biases spring from individual minds as well as the prevailing groups, institutions, and culture(s) in which we live;

2: Other than in Classical or Foundational Science (see Bias in Science), the idea of certainty is an illusion.


A number of different approaches have been devised over the years to attain the Holy Grail of clear, accurate, and reliable thoughts. They go by names such as Critical Thinking, Rational Thinking, Integrative Thinking, Free-thought, Philosophical Thinking, and The Scientific Method. Each approach may have helpful features for certain purposes, but none of them, to our knowledge, treats the practical behavior of bias as a fundamental issue. As William Whewell observed nearly two centuries ago, "the human dimension" can never be divorced from even the most carefully considered thoughts.


Bias Management provides a way of implementing BAT. 

Everyone's (including groups' and experts') convictions and understandings are shaped by their unconscious assumptions - and these assumptions form their biases.

See the May 2016 blog for more information about Bias Aware Thinking.