“All models are wrong, but some are useful.” – George Box
One of my favorite models explaining how we make decisions, evaluate things, and process information is the scale. When we have no opinion about something, the scale is empty, like so:
Learning about something is a process of collecting evidence. Each piece of information we get about something is like a marble: negative information goes on one side, and positive information goes on the other side. If we see a pile of free money, it’d be all positive information:
If we saw a quarter on the floor, that would be a piece of positive information, but not quite as positive, so you’d get a slight tilt:
The kind of information we learn changes the size of the marble that we put on the scale. One big piece of information can make it tip over right away.
“You’re collecting evidence for one option or another,” says Brown University’s Amitai Shenhav. “Marbles are collecting, more marbles are collecting, the more marbles come to mind, and the size of those marbles scales with the value of those attributes.”
When the scale is clearly leaning to one side, we’ve accumulated enough information to decide what to do, or whether or not we like something. We’ve crossed the threshold needed to come to a conclusion:
When do we spend more time collecting information?
- If the decision is complex
- If we’re cautious or risk averse
- If the cost of changing our mind or backing out is high
- If we get lots of conflicting information, and the scale tips back and forth
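The process above, marbles piling up until the scale tips past a threshold, can be sketched as a toy simulation. Everything here is an illustrative assumption, not something from the original text: marble sizes are drawn from a Gaussian, and the threshold value stands in for caution. A cautious decider (higher threshold) needs more marbles before concluding than a hasty one.

```python
import random

def decide(threshold, marbles):
    """Accumulate signed 'marbles' of evidence until the scale tips.

    Positive marbles favor one side, negative the other; bigger marbles
    move the scale more. Returns the conclusion and how many marbles
    it took to reach it.
    """
    total = 0.0
    count = 0
    for count, marble in enumerate(marbles, start=1):
        total += marble
        if abs(total) >= threshold:
            return ("for" if total > 0 else "against"), count
    return "undecided", count

random.seed(0)
# Evidence with a slight positive tilt (mean 0.2), varying marble sizes.
stream = [random.gauss(0.2, 1.0) for _ in range(1000)]

hasty = decide(threshold=1.0, marbles=iter(stream))     # low bar: quick call
cautious = decide(threshold=5.0, marbles=iter(stream))  # high bar: more marbles
print(hasty, cautious)
```

On the same evidence stream, the cautious threshold can never be crossed before the hasty one, which is the sense in which risk aversion means spending more time collecting information.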
When the scale is already leaning over in one direction, we’ve started accumulating information that favors that thing. A leaning scale means that we’ve already started building up an implicit association, an evaluation that’s so subtle, we may not even realize it.
When the scale starts leaning to one side, information supporting that side becomes easier to process. We begin selectively collecting the marbles, the pieces of evidence, that back up the side that’s already leaning. This information is more fluent: it confirms, and is consistent with, our beliefs.
“Our judgment is strongly influenced, unconsciously, by which side we want to win — and this is ubiquitous.”
When people already have an unconscious belief—even if they don’t realize it—they selectively expose themselves to information or news that confirms their beliefs—giving them more ammunition to present those hunches as fact.
The more we’re exposed to certain kinds of information, the more practice we get processing it; information that we process faster “feels” right. It’s what we’re used to. The ease with which we can process information makes it easy to mistake fluency for truth.
Our implicit beliefs are the things that we believe deep in our core, even when we don’t realize it; the famous Implicit Association Test is merely a test of how quickly we connect ideas, like “white/black/our ingroup” and “good.”
Information that is in line with our pre-existing beliefs becomes easier to process, so we begin actively seeking it out and ignoring contradictory information. In other words, certain marbles are always easier for us to pick up. If we actively dislike “the Army/feminists/vanilla,” the second we see anything that reminds us of it, we disregard it.
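This feedback loop, where marbles on the leaning side become easier to pick up, can also be sketched in simulation. The numbers here are illustrative assumptions: marbles are drawn with no true tilt at all, and a `fluency_boost` factor makes confirming marbles count for more. Even on perfectly neutral evidence, that small boost lets an early lean snowball.

```python
import random

def run_scale(steps, fluency_boost, seed):
    """Accumulate evidence 'marbles'; once the scale leans one way,
    confirming marbles are more fluent, so they weigh more.
    The Gaussian marbles and the boost factor are illustrative
    assumptions, not from the original text."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(steps):
        marble = rng.gauss(0.0, 1.0)  # evidence with no real tilt
        if total != 0 and (marble > 0) == (total > 0):
            marble *= fluency_boost   # confirming marble: easier to pick up
        total += marble
    return total

def mean_lean(fluency_boost, trials=200, steps=300):
    """Average final |tilt| of the scale across many independent runs."""
    return sum(abs(run_scale(steps, fluency_boost, seed=s))
               for s in range(trials)) / trials

neutral = mean_lean(fluency_boost=1.0)  # every marble weighed fairly
biased = mean_lean(fluency_boost=1.5)   # confirming marbles weigh 1.5x
print(f"fair: {neutral:.1f}  biased: {biased:.1f}")
```

With the boost turned on, the average final lean is far larger than without it, even though the underlying evidence favors neither side. That is the implicit association hardening into a conviction.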
One key to overcoming your decision-making biases is to have a scout mindset rather than a soldier mindset (explained in this video): seek to learn, not to defend.
Viewing the opposing viewpoint in very simplistic terms is a good sign that someone hasn’t really examined it.
For example, people who roll their eyes the moment they hear a term are probably much less informed about that field than they realize. I used to roll my eyes at “women’s studies,” “gender studies,” and anything related to weightlifting. Immediate emotional reactions like these are usually signs of bias.