I’ve written a few articles for the local paper over the past couple of years (the finance department takes turns writing a weekly column), and this week it’s my turn again. My articles usually focus on the behavioral side of finance, but with the election upon us, it seems a good opportunity to show that behavioral biases affect all kinds of decisions. The article is set to run this Sunday, November 4.
I’ve used this space before to discuss how behavioral biases lead us to suboptimal investment choices; by raising awareness of these issues, I hope we can make better financial decisions. Our biases affect other decisions, too. With an election just a couple of days away, this is a good opportunity to pause and reflect on how we process information and how we might improve our decision making. Evaluating evidence and forming opinions on a topic is mentally taxing, but it’s worth taking the time to do it right.
We like hearing that our opinions are right, and we like reducing the effort required to evaluate new information. When we buy a share of stock, we love seeing confirmation that our decision was smart (analysts say the stock remains undervalued…YES!) while dismissing or rationalizing evidence that our pick was poor (this analyst knows nothing…it’ll bounce back). This confirmation bias shows up in politics, too.
We search for information that reinforces what we already believe while limiting our exposure to sources that might cause us to rethink our positions. Worse, we actively dismiss contrary evidence. Conservatives are more likely to watch Fox News than MSNBC (and vice versa for liberals) because we like hearing evidence that confirms our existing beliefs. It takes less mental effort to hear what already agrees with us than to reevaluate our positions in the face of new evidence. Ultimately, we gather information in a biased way, which pushes us toward greater polarization.
If you’ve ever tried convincing family members that their beliefs are wrong, you’ve probably watched them settle into a more entrenched, less persuadable position: the backfire effect in action. This bias leads people to reject evidence that contradicts their beliefs and to move further toward their initial positions. In other words, not only is it very difficult to sway someone, it’s hard not to push them farther away from your position!
This is particularly frustrating when a belief hinges on a falsehood. Research finds that people are more likely to trust a statement, even a false one, if they’ve heard it before; a familiar claim, regardless of its validity, feels more trustworthy than an unfamiliar one. Repeat something often enough, even something factually wrong, and people tend to think it’s right. You don’t have to know whether a claim is valid for it to stick: mere repetition makes us more likely to believe it.
That makes it hard to debunk a frequently repeated falsehood and change minds. And because we seek out evidence that confirms our pre-existing beliefs, we avoid and dismiss contrary evidence; even when presented with valid, contrary evidence, we often become more entrenched in our existing positions. Our minds play plenty of tricks on us, limiting our ability to process information efficiently and accurately.
This weekend, spend an extra moment revisiting your positions. Seek out contrary evidence and consider the validity of new information. At family gatherings, beware the backfire effect: it’s tough to convince someone unless they’re ready to be swayed, and you’d probably prefer not to push them even farther away.