A bias is any inclination toward a particular belief or perspective, often one that is ill-supported by reason or evidence. When we call another person “biased,” we usually mean that they are incapable of looking objectively at the facts. They are too stuck in their own worldview, too attached to their own prejudices, and usually unwilling to have an open mind about certain issues.
Psychologists claim that many of our biases are evolved mental processes which at one point may have been adaptive to our environment. Because the mind is not a perfectly calculating machine, it uses many different heuristics (or “rules of thumb”) that help guide the decision-making process.
Although this process isn’t perfect, it often gets the job done in terms of survival and reproduction. Many of these biases may still be functional in today’s world, but others can greatly inhibit us from making rational and intelligent decisions. I hope to discuss some of these biases with you today. And perhaps, by being more aware of them, you can avoid some faulty thinking and make smarter decisions in the future.
Bias Blind Spot
The bias blind spot is our tendency to think we are less biased than others. That’s right, as you read this article, you may think that all of these biases are interesting phenomena, but they don’t describe you, right? You’re special. You’re smart. You’re better. BUT – if you actually believe this, you may have already discovered your first personal bias. Many social psychologists find that people on average think they are above average, when of course they aren’t (everyone being above average is obviously a statistical impossibility). This effect is sometimes called “illusory superiority.”
Solution: Try to be more modest in estimating your own intelligence and admit that you are subject to biases like everyone else.
Confirmation Bias
Another bias many of us face is the tendency to favor information that supports our preconceptions. When we come across information that supports our theories and hypotheses, we are much more likely to pay attention to it and remember it. But when we find something that challenges our assumptions, we often search for ways to invalidate it or ignore it. This tendency can keep us stuck on old beliefs even after we have been presented with evidence that disproves them.
Solution: Try your best to set equal standards for all evidence regardless of whether or not it supports or challenges your current view. Always be open to changing your mind.
The Framing Effect
The Framing Effect describes how presenting the same information from a different perspective can dramatically alter the decisions we make. For example, research by psychologists Amos Tversky and Daniel Kahneman shows that when participants are presented with a hypothetical scenario in which they must save a population from a deadly disease, they are more likely to avoid risk when the options are framed positively (lives saved) but to seek risk when the options are framed negatively (lives lost).
Picture yourself in the following scenario. You need to come up with a disease preventive strategy in order to save as many lives as possible. You are presented two options:
A) Save 200 people’s lives.
B) There is a one-in-three chance of saving all 600 people and a two-in-three chance of saving no one.
In this case, Kahneman & Tversky found that 72% of participants chose option A.
Now consider a different two options:
C) 400 people die.
D) There is a one-in-three chance that no one dies and a two-in-three chance that all 600 people die.
Between these two options, only 22% of participants chose option C (the equivalent of option A) over option D (the equivalent of option B). So even though the two pairs of options are statistically identical, the way the situation was framed (gains vs. losses) changed which option people chose.
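The equivalence of the four options is easy to verify with a quick expected-value calculation. The sketch below is a minimal illustration (not part of the original study); the one-third and two-thirds probabilities follow the scenario described above:

```python
# Expected lives saved under each framing of the 600-person scenario.

def expected_saved(outcomes):
    """outcomes: list of (probability, lives_saved) pairs."""
    return sum(p * saved for p, saved in outcomes)

# "Gain" framing
option_a = expected_saved([(1.0, 200)])            # save 200 for certain
option_b = expected_saved([(1/3, 600), (2/3, 0)])  # gamble on saving all 600

# "Loss" framing, restated as lives saved out of 600
option_c = expected_saved([(1.0, 600 - 400)])      # 400 die for certain
option_d = expected_saved([(1/3, 600), (2/3, 0)])  # gamble on no one dying

print(option_a, option_b, option_c, option_d)  # all four equal 200.0
```

Every option has the same expected value of 200 lives saved; only the framing differs.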
Solution: Always look at both sides of a situation (both costs and benefits) and don’t get caught up in only the perspective that was presented to you.
The Gambler’s Fallacy
If I toss a coin five times and each time it lands on tails, do you think there is a greater likelihood that future tosses will land on heads? If so, you have just committed the gambler’s fallacy: the erroneous assumption that the odds of something with a fixed probability increase or decrease depending upon recent occurrences. In fact, even if the coin lands on tails a hundred times in a row, the chance that it lands on tails on the next toss is still 50/50. It is always 50/50.
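This is easy to check by simulation. The sketch below (illustrative Python with an arbitrary seed, not from any study) flips many fair coins and records what happens on the flip immediately after a run of five tails:

```python
import random

random.seed(42)  # arbitrary seed so the run is reproducible

# After observing a run of 5 tails, record the outcome of the very next flip.
next_flips = []
while len(next_flips) < 10_000:
    flips = [random.choice("HT") for _ in range(6)]
    if flips[:5] == ["T"] * 5:       # a run of 5 tails just occurred
        next_flips.append(flips[5])  # what happened on flip 6?

heads_rate = next_flips.count("H") / len(next_flips)
print(f"P(heads after five tails) ~ {heads_rate:.3f}")  # stays near 0.5
```

The observed frequency of heads hovers around 50%, no matter how long the preceding streak of tails was: each flip is independent.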
Solution: Don’t let recent random occurrences dictate future decision-making.
The Halo Effect
The Halo Effect is the reason so many advertisers put celebrities in their commercials. We falsely assume that if a person excels at one thing, they must also be trustworthy in something completely unrelated. Why is Dr. Dre telling me to drink Dr. Pepper? Sure, he is a great rap producer, but does that have anything to do with knowing good soda? Why do people listen at all to what Jenny McCarthy says about a supposed causal link between vaccines and autism, even though there is no medical evidence to support her claims and she is obviously not a medical professional? People are willing to believe others simply because they like them overall:
- “Edward L. Thorndike was the first to support the halo effect with empirical research. In a psychology study published in 1920, Thorndike asked commanding officers to rate their soldiers; he found high cross-correlation between all positive and all negative traits. People seem not to think of other individuals in mixed terms; instead we seem to see each person as roughly good or roughly bad across all categories of measurement.”
Solution: Adopt the belief that just because a person may be good in one thing doesn’t necessarily mean they are good at something else. Try to understand individuals in more complex and distinct terms.
Mere Exposure Effect
The mere exposure effect is our tendency to like things merely because we are familiar with them. This effect holds for objects as well as people: the more we interact with someone, the more likely we are to become friends or intimate partners. While this may seem like common sense, the bias can be dangerous when it keeps us stuck in negative patterns because we never expose ourselves to anything new.
Solution: When you find yourself discontent with an area in your life, seek exposure to new things. Maybe the grass is actually greener on the other side.
The Planning Fallacy
The planning fallacy is our tendency to underestimate how much time we need to complete a task. In a study done in 1994, researchers found that students expected to complete their senior theses in 27.4 days if everything went as well as it possibly could, and in 48.6 days if everything went as poorly as it possibly could. It turns out they completely missed the mark: the average time students actually took was 55.5 days, and only 30% of students finished within the time they had predicted. Lovallo and Kahneman (2006) later expanded the planning fallacy to also include underestimating the costs and risks of a project while overestimating its benefits.
Solution: Be more mindful and lenient when setting deadlines. Also, plan modestly; we have a tendency to overestimate our ability to plan correctly for the future. It might be better to focus on smaller and more foreseeable goals (especially ones that you suspect will lead to big changes over longer periods of time).
Post-Purchase Rationalization
Many marketers base their strategies on the theory that consumers often buy a product for emotional reasons and then rationalize the decision later. This “post-purchase rationalization” is a response to the cognitive dissonance behind buyer’s remorse, that feeling of guilt we sometimes get after buying something expensive (like a new house or car). To get over this guilt, we search for reasons to justify our purchase, even though we never had those reasons in mind when we actually made the decision.
Solution: Don’t commit yourself to a decision unless you have identified the real reasons you think it will benefit you.
We like to think of ourselves as rational beings, but modern psychology has shown us again and again the mistakes we tend to make. Unfortunately there are many other biases that I haven’t been able to cover in this short article, but Wikipedia has a great list if you want to learn some more ways your mind tricks itself.
Understanding our imperfections is vital to being able to correct them in the future. We will probably never be able to get rid of all our biases (many of them make us who we are), but when we try to notice these principles playing out in the real world we can act on them with better clarity.
Discover more tools for daily growth in the digital guide The Science of Self Improvement.