Breaking Out of Bias
Cognitive scientists have identified up to 150 different biases. A bias is an unconscious mechanism that influences our view of the world: a natural evolutionary tool that helps our brains organize information in order to make faster, more efficient judgements and decisions. However, these same mechanisms also blind us to new information and alternative options that might lead to better judgements and decisions.
While we can logically recognize that biases exist, in the heat of a decision it is almost impossible to identify the bias(es) influencing us. For everyone, bias is part of every decision, from hiring the right candidate to choosing a job for ourselves, setting strategic initiatives, picking a vacation destination, or deciding what to have for dinner. That is how the brain works. It seems that the more important the decision, and the more time we spend thinking about it, the harder it is to identify our bias about the choice we have made.
David Rock and his peers at the NeuroLeadership Institute have grouped the plethora of well-studied biases into five primary categories (Similarity, Expedience, Experience, Distance, and Safety), called the SEEDS model.
We’ve all probably been called out on our biases before. Having words to describe our seemingly invisible biases is a first step. But how do we break out of them? As David Rock puts it, “It would be like asking our bodies to watch out for how much insulin our body is producing.”
Just as regulating insulin requires more than willpower, we cannot rely entirely on self-awareness to manage bias. In an organization, breaking out of bias is aided by structure and process. Here are a few suggestions I have found useful.
Create a list of commonalities you have with every candidate. Don’t try to fight the natural inclination; rather, build a broader affiliation with new people or groups. The desire for safety is the source of this bias. It is not logical or real, only the brain’s desire to keep things simple.
Create decision timelines that allow for slower decisions. We are not all firefighters, and even emergency personnel practice for rapid decision-making situations. Have each member of the group take on an alternative perspective through role-playing: acting as devil’s advocate, or conducting a pre-mortem (imagining the decision turned out to be the wrong one and asking what went wrong). The perceived need for speed is the source of this bias.
Ask for a lot of opinions. It takes a disproportionate amount of contrary data to counteract what we believe to be true. We have evolved to believe that all we see, and have seen, is all there is to see, and that it is accurate. This bias is particularly harmful when we believe our perception is reality, because then the only remaining explanation is that anyone who disagrees is wrong or lying.
Take a trip in time or space. We tend to overvalue things that are close to us in time or space. When making a decision, it is valuable to mentally travel to a set time in the future, describe the reality of that future independent of the decision, and then discuss how today’s decision fits within that future. If the decision is about another place, it is useful to go there, if possible, to make the decision. If that is not possible, bring items or knowledge (especially cultural knowledge) of that place to you in order to counterbalance all that currently surrounds you.
Recast events to force taking different perspectives. The attraction of potentially avoiding a loss is greater than the attraction of potentially achieving a gain of equal value. Approach the decision-making process from the point of view of an advisor to the group, or describe what success looks like and how you got there. (Our brains are already listing the ways a decision can fail.) Safety is about social as well as physical safety: we do not like to be wrong among our peers.
All of these practices engage the brain in a different way and effectively slow down naturally reflexive thinking. One cannot control or identify every bias. In your group, identify just one class of bias that might be in play and create a process to slow down the thinking.