What are implicit beliefs? The fluency bias, and why some information is easier to process
“All models are wrong, but some are useful.” – George Box
One of the most useful models explaining how we make decisions, evaluate things, and process information is the scale. When we have no opinion on something, the scale is empty, like so:
Learning about something is a process of collecting evidence, and each piece of evidence can be construed as either positive or negative. Each piece of information we get about something is like a marble: negative information goes on one side of the scale, and positive information goes on the other. If we see a pile of free money, the information is all very positive:
If we saw a quarter on the floor, that would also be positive information, but not quite as positive, so the scale would tilt only slightly:
How good, bad, or noticeable the information is changes the size of the marble that we put on the scale: one big piece of information can make it tip over right away.
“You’re collecting evidence for one option or another,” says Brown University’s Amitai Shenhav. “Marbles are collecting, more marbles are collecting, the more marbles come to mind, and the size of those marbles scales with the value of those attributes.”
When the scale is clearly leaning to one side, we’ve accumulated enough information to come to a decision about what to do, or whether or not we like something:
When do we spend more time collecting information?
- If the decision is complex
- If we’re more cautious or risk averse
- If the cost of reversing or backing out of the choice is high
- If we get lots of conflicting information, and the scale tips back and forth
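The scale metaphor above resembles the evidence-accumulation models used in decision research: signed evidence piles up until it crosses a decision threshold. Here’s a minimal sketch in Python; the function name, weights, and threshold are all illustrative assumptions, not a model from the research itself:

```python
def decide(marbles, threshold=5.0):
    """Accumulate signed evidence ("marbles") until the scale tips
    past a decision threshold. Positive weights favor one option,
    negative weights the other. Returns the leaning side and how
    many marbles it took to decide (illustrative sketch only)."""
    total = 0.0
    for count, weight in enumerate(marbles, start=1):
        total += weight
        if abs(total) >= threshold:
            return ("positive" if total > 0 else "negative", count)
    # Not enough evidence yet: keep collecting.
    return ("undecided", len(marbles))

# One big marble (a pile of free money) tips the scale right away.
print(decide([6.0]))                              # ('positive', 1)

# A quarter on the floor: a slight tilt, not enough to decide.
print(decide([0.5]))                              # ('undecided', 1)

# Conflicting information tips the scale back and forth,
# so we keep collecting marbles longer before deciding.
print(decide([2.0, -1.5, 1.0, -1.0, 2.5, 2.5]))   # ('positive', 6)
```

Raising the `threshold` captures the list above: a more complex decision, a more cautious decider, or a costlier-to-reverse choice all mean collecting more marbles before the scale counts as tipped.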
But when the scale is already leaning in one direction, we’ve started accumulating information that favors one thing over another: say, we’re gathering more evidence in favor of staying in to watch TV instead of having dinner with a coworker. When the scale starts leaning over, we start building up an implicit association, that “Netflix = good.”
When the scale starts leaning to one side, information supporting that side becomes easier to process. We begin selectively collecting certain marbles—and pieces of evidence—that back up the side that’s already leaning over. This information is simply easier for us to process. It’s more fluent. It confirms and is consistent with our beliefs.
When people already hold an implicit belief, even one they don’t realize they have, they selectively expose themselves to information or news that confirms it, giving them more ammunition to present those hunches as fact. In one study, this happened over the span of weeks, as subjects read news about the possible expansion of a U.S. Army base in Italy.
The more we’re exposed to certain kinds of information, the more practice we get processing it; information that we process faster “feels” right. It’s what we’re used to. And the more easily we can process or understand a piece of information, the easier it is to mistake it for the truth.
Our implicit beliefs are the things that we believe deep in our core, even when we don’t realize it; the famous Implicit Association Test is merely a test of how quickly we connect ideas, like “caramel/Italian army base/our ingroup” and “good.”
Information that is in line with our pre-existing beliefs becomes easier to process, so we begin actively seeking it out and ignoring contradictory information. In other words, certain marbles are always easier for us to pick up. If we actively dislike “the Army/feminists/vanilla,” the second we see anything that reminds us of it, we disregard it.