Based on the original article by Buster Benson.
This is the list the author created to group the 175 biases listed on Wikipedia. It states four problems and the types of thinking associated with each of them. I have tried to replicate the article and summarize it even further. We will start with the main takeaways, the whole list in a chart, and the breakdown of each problem.
- Information overload sucks, so we aggressively filter. Noise becomes signal.
- Lack of meaning is confusing, so we fill in the gaps. Signal becomes a story.
- Need to act fast lest we lose our chance, so we jump to conclusions. Stories become decisions.
- This isn’t getting easier, so we try to remember the important bits. Decisions inform our mental models of the world.
In order to avoid drowning in information overload, our brains need to skim and filter insane amounts of information and quickly, almost effortlessly, decide which few things in that firehose are actually important and call those out.
In order to construct meaning out of the bits and pieces of information that come to our attention, we need to fill in the gaps, and map it all to our existing mental models. In the meantime we also need to make sure that it all stays relatively stable and as accurate as possible.
In order to act fast, our brains need to make split-second decisions that could impact our chances for survival, security, or success, and feel confident that we can make things happen.
And in order to keep doing all of this as efficiently as possible, our brains need to remember the most important and useful bits of new information and inform the other systems so they can adapt and improve over time, but no more than that.
Problem 1: Too much information.
There is just too much information in the world, so we have no choice but to filter almost all of it out. Our brains use a few simple tricks to pick out the bits of information that are most likely to be useful in some way.
A simple rule: our brains are more likely to notice things related to stuff that has recently been loaded into memory.
We also tend to weigh the significance of a new value by the direction of the change (positive or negative) more than by re-evaluating the new value as if it had been presented alone. The same applies when we compare two similar things.
We are drawn to details that confirm our existing beliefs. This is a big one. As is the corollary: we tend to ignore details that contradict those beliefs.
Problem 2: Not enough meaning.
The world is very confusing, and we end up only seeing a tiny sliver of it, but we need to make some sense of it in order to survive. Once the reduced stream of information comes in, we connect the dots, fill in the gaps with stuff we already think we know and update our mental models of the world.
Since we only get a tiny sliver of the world’s information, and also filter out almost everything else, we never have the luxury of having the full story. This is how our brain reconstructs the world to feel complete inside our heads.
When we have partial information about a specific thing that belongs to a group of things we are pretty familiar with, our brain has no problem filling in the gaps with best guesses or what other trusted sources provide. Conveniently, we then forget which parts were real and which were filled in.
Similar to the above, but the filled-in bits generally also include built-in assumptions about the quality and value of the thing we're looking at.
- Halo effect
- In-group bias
- Out-group homogeneity bias
- Cross-race effect
- Cheerleader effect
- Well-traveled road effect
- Not invented here
- Reactive devaluation
- Positivity effect
Our subconscious mind is terrible at math and generally gets all kinds of things wrong about the likelihood of something happening if any data is missing.
In some cases, this means we assume they know what we know; in other cases, we assume they're thinking about us as much as we are thinking about ourselves. Either way, we are modeling their minds after our own (or, in some cases, after a much less complicated mind than our own).
We also project our current mindset and assumptions onto the past and future, magnified by the fact that we're not very good at imagining how quickly or slowly things will happen or change over time.
Problem 3: Need to act fast.
We’re constrained by time and information, and yet we can’t let that paralyze us. Without the ability to act fast in the face of uncertainty, we surely would have perished as a species long ago. With every piece of new information, we need to do our best to assess our ability to affect the situation, apply it to decisions, simulate the future to predict what might happen next, and otherwise act on our new insight.
In reality, most of this confidence can be classified as overconfidence, but without it, we might not act at all.
- Overconfidence effect
- Egocentric bias
- Optimism bias
- Social desirability bias
- Third-person effect
- Forer effect, Barnum effect
- Illusion of control
- False consensus effect
- Dunning-Kruger effect
- Hard-easy effect
- Illusory superiority
- Lake Wobegon effect
- Self-serving bias
- Actor-observer bias, Fundamental attribution error
- Defensive attribution hypothesis
- Trait ascription bias
- Effort justification
- Risk compensation, Peltzman effect
- Armchair fallacy
We value stuff more in the present than in the future, and relate more to stories of specific individuals than to anonymous individuals or groups. I'm surprised there aren't more biases found under this one, considering how much it impacts how we think about the world.
The behavioral economist’s version of Newton’s first law of motion: an object in motion stays in motion. This helps us finish things, even if we come across more and more reasons to give up.
If we must choose, we tend to choose the option that is perceived as the least risky or that preserves the status quo. Better the devil you know than the devil you do not.
We’d rather do the quick, simple thing than the important complicated thing, even if the important complicated thing is ultimately a better use of time and energy.
Problem 4: What should we remember?
There's too much information in the universe. We can only afford to keep around the bits that are most likely to prove useful in the future. We need to make constant bets and trade-offs around what we try to remember and what we forget. For example, we prefer generalizations over specifics because they take up less space. When there are lots of irreducible details, we pick out a few standout items to save and discard the rest. What we save here is what is most likely to inform our filters related to Problem 1's information overload, as well as inform what comes to mind during the processes mentioned in Problem 2 around filling in incomplete information. It's all self-reinforcing.
We do this out of necessity, but the impact of implicit associations, stereotypes, and prejudice results in some of the most glaringly bad consequences from our full set of cognitive biases.
It’s difficult to reduce events and lists to generalities, so instead, we pick out a few items to represent the whole.
Our brains will only encode information that they deem important at the time, but this decision can be affected by other circumstances (what else is happening, how the information presents itself, whether we can easily find it again if we need to, etc.) that have little to do with the information's value.
Keep in mind.
In addition to the four problems, it would be useful to remember these four truths about how our solutions to these problems have problems of their own:
- We don’t see everything. Some of the information we filter out is actually useful and important.
- Our search for meaning can conjure illusions. We sometimes imagine details that were filled in by our assumptions and construct meaning and stories that aren’t really there.
- Quick decisions can be seriously flawed. Some of the quick reactions and decisions we jump to are unfair, self-serving, and counter-productive.
- Our memory reinforces errors. Some of the stuff we remember for later just makes all of the above systems more biased, and more damaging to our thought processes.
By keeping in mind the four problems with the world and the four consequences of our brain's strategies to solve them, the availability heuristic (and, specifically, the Baader-Meinhof phenomenon) will ensure that we notice our own biases more often. If you visit this page to refresh your memory every once in a while, the spacing effect will help underline some of these thought patterns so that our bias blind spot and naïve realism are kept in check.
Nothing we do can make the four problems go away (until we have a way to expand our minds' computational power and memory storage to match that of the universe), but if we accept that we are permanently biased and that there's room for improvement, confirmation bias will continue to help us find evidence that supports this, which will ultimately lead us to a better understanding of ourselves.