Oct 09, 2014 | By George Gantz
The Challenge of Bias to Rationality
In our discussion of rationality, we began (September 15) by reviewing the three elements of rational decision making: reliance on evidence, focus on desired outcomes and logical reasoning. The following week (September 22) we questioned whether the universe was rational (it does not seem to be). Last week (October 1) we looked at emotions and concluded that they are integral to rationality, both as evidence and more importantly as the guide by which we weigh and choose among desired outcomes. Our consideration of emotions was, however, incomplete.
Emotions are necessary to rationality but they can also be disruptive. Emotions, along with a number of other factors, can be hidden from our conscious view, but still influence our thinking. In this case, as we mentioned in the overview discussion, these “hidden impulses, which are more appropriately labeled as ‘biases’ … can distort rational decision-making”.
What is a bias? While we could define bias as a non-rational influence on otherwise rational decisions, this leads to a bit of circularity. Better, perhaps, to define bias as any factor that influences decision-making without our conscious recognition that it is doing so. These influences may be unexamined or subconscious; being invisible, they remain below the level of our conscious awareness and control. In some cases, the consequences of a bias may be minimal. But in others, as in the case of the totalitarian mind described by Kurt Vonnegut (see text box), the consequences can be enormous.
For example, an emotional reaction can be a factor that we are aware of and capable of thinking about. If I hate spiders, and know that I hate spiders, it would seem perfectly rational to choose to kill one that crawls onto my shoe with a piece of paper. If, however, I see the spider and react in fear by hitting it (and my shoe) with a hammer, one might conclude that my response was irrational. The difference is that the first behavior is a conscious choice and the second a pre-rational reaction driven by fear.
The important role that pre-conscious factors can have in our decision-making process is dealt with at some length in Jonathan Haidt’s book, The Righteous Mind (2012). The evidence suggests that we usually make moral judgments on the basis of intuitions that are grounded in our biological and cultural heritage, rather than on reasoning. These intuitions are largely pre-conscious and invisible – they are just “how we see things”. We make a moral judgment on the basis of our pre-conscious intuitions, and only then does our rational mind engage in a thinking process to unpack why we made that judgment.
In a sense, once we have made a decision we have to justify that choice to ourselves, using our rational faculties and any factors we can identify that support our decision. Rather than directing the evaluation leading to a judgment, the rational mind is recruited to support the judgment we already made.
Jonathan Haidt is not the only researcher to point out the importance of pre-conscious factors in our decision-making. There is widespread recognition (for example, see Daniel Goleman’s book Emotional Intelligence, 1996) that while the human brain has evolved highly sophisticated cognitive capabilities in the neocortex, it still retains its instinctive “lizard brain”, which includes the limbic system that powers and controls our emotional reactions. Notably, our “fight or flight” response is triggered in the human body very quickly, before we are consciously aware of the triggering stimuli. Arguably this is good for survival in an environment with large predators, but it complicates the question of whether we can truly be rational.
Taken to an extreme, the idea that we largely decide things on the basis of subconscious factors could support a form of determinism and the conclusion that we really do not have free will! (see Neuroscience.) While Haidt does not quite go that far, he is somewhat pessimistic about how much influence our rational mind has over our moral decision-making, and he points out why this tends to lead to intractable and unproductive “dialogue” when dealing with politics and religion, areas of human discourse that are significantly moral in character.
If pre-conscious emotional reactions and moral judgments play such a significant role in human decision-making, we have to admit there are significant practical challenges, in addition to the more theoretical ones, to the goal of being rational. There is considerable research that would seem to bear this out: an extensive literature on cognitive bias, biologically based gender bias, and implicit bias (see: http://www.biasproject.org), as well as new findings on belief superiority and the moral behavior of ethics professors. Recent articles in Nautilus explore the brain chemistry and perceptions that keep people engaged in the rigged game of state lotteries, and the cultural factors leading to the rise of sacred beliefs in religion and football. Most of what this literature describes is evidence of pre-conscious or subconscious factors that significantly drive human decision-making.
The research gives new meaning to the concept of blind justice. Indeed, we are, much of the time, blind to the influences that affect our decisions, including our moral judgments. If our goal is, as I think it should be, to be fully aware and rational in making decisions, then perhaps we should strive to take the blindfold off.
Indeed, the only way we can preserve the goal of rationality is to expose and understand, as much as we possibly can, the “hidden factors” that influence our decisions. As inscribed over the ancient Oracle of Delphi – “Know thyself.”