Lies stick around in politics. Run an implausible attack ad, call your opponent a monster, and even after you’ve been thoroughly debunked, the negative feelings will linger. Fact-checkers and reporters can busy themselves correcting the record and calling out exaggerations all they want, but it’s impossible to fully undo the effects of a well-placed fib.
There have been plenty of studies about the way erroneous negative information about a politician creates feelings that last well after that information has been discredited. In one 2007 study by political scientist John Bullock, test subjects read about a fictional candidate with unpopular views on the environment and education. Later, they were told that the information was false—that the experimenter had simply made it up. Despite this, the participants still maintained a disapproving attitude towards the candidate. It didn’t matter that the facts had changed; the emotions remained.
This kind of reaction goes far beyond confirmation bias, in which people seek out information that supports their preconceived beliefs and reject facts that are less convenient. Here, facts are unequivocally debunked, we know the truth, and yet, the negative cloud persists. It’s a phenomenon psychologists call “belief perseverance.” When you learn a fact about someone, you create explanations for his or her actions in your mind. That politician must have lousy environmental policies because he’s a greedy jerk in the pocket of the oil industry, you tell yourself. These explanations sit in your brain, remaining easily accessible long after the facts have been cleared up.
The lasting effects of false negative information have been studied extensively, but practically no attention has been paid to false positive information—that is, what happens when a politician's boast about, say, saving a billion dollars is repeatedly discredited? Does belief perseverance mean that positive feelings will linger, too?
A recent study by Michael D. Cobb, Brendan Nyhan, and Jason Reifler, published in the journal Political Psychology, examined this question. The researchers divided undergraduate students at a public university in the South into three groups. The control group read short biographies of Harold and Michael Davis, two fictional, unrelated Nevada state senators with similarly positive résumés. The second group read the bios, as well as a mock news article about Michael Davis. According to the article, Davis had sponsored the “College Cost Relief Bill,” which provided free tuition and board to students—a fake piece of legislation designed to appeal to undergraduates. The final group read the bios and the article, but was also asked to read a second mock article, a retraction that apologized for misstating the name of the bill’s actual sponsor, Senator Harold Davis.
At the end of the study, each group was asked to rate the politicians. As expected, the group that had read the article but not the retraction rated Michael Davis much more highly than the control group did. The reaction of the group that had read the retraction, however, was surprising: they reported feeling far colder towards the politician than the control group had, and were much less likely to feel that Michael Davis cared about people like them. The warm feelings hadn’t lingered. Instead, debunking the false accomplishment had left participants feeling sour about the politician.
The reason for this, the researchers argue, is that our brains just aren’t that good at accounting for new information. In this case, the researchers say that people were simply overcorrecting. “We argue people attempt to adjust for the perceived influence of the false claim when the information is discredited,” the authors write. We want to judge people properly, and we know we have to revise our feelings about a politician in light of this new knowledge, but we go too far, overestimating how much our judgment has been affected by the good news. Like the NBA referee who, desperate not to bend to the will of the crowd, begins calling weak fouls on the home team, we overcompensate.
The results, while disheartening if you like to think your judgments are rational and well-considered, are at least encouraging for the state of politics. They suggest that résumé-padding and accomplishment-boosting can have consequences, provided there’s someone there to call out the lie. “Allowing one’s achievements to be exaggerated or misrepresented is perilous,” the researchers argue. In a broader sense, though, the study also highlights the tough position politicians are in: “Both discredited positive and negative misinformation cause politicians to be viewed more unfavorably—a result that may help to explain why they are held in such low regard.”