Here at Sempringham we've been trying to figure out how to enlighten people who are clearly misguided on policy issues.
In Feelings vs Facts, we looked at the insights of cognitive scientist George Lakoff, who believes people can be usefully categorized by their attitudes toward child rearing. Conservatives, in his model, are people who instinctively adhere to a family structure with a strict father. Liberals, on the other hand, believe in a "nurturant" [hate that word – why not just say "nurturing"?] parent for whom discipline is not a critical focus.
Lakoff believes we see events through these "frames," and, to put words in his mouth, this explains why liberals are more likely to see Black Lives Matter as people who are seeking justice while conservatives are more likely to see them as people who are misbehaving.
David Ignatius brings another dish to the party in this morning's Washington Post. Ignatius cites the work of some social scientists who have demonstrated "that attempts to refute false information often backfire and lead people to hold on to their misperceptions even more strongly."
Trying to correct misperceptions can actually reinforce them .... [Researchers] documented what they called a “backfire effect” by showing the persistence of the belief that Iraq had weapons of mass destruction in 2005 and 2006, after the United States had publicly admitted that they didn’t exist. “The results show that direct factual contradictions can actually strengthen ideologically grounded factual belief,” they wrote.
...[A]ttempts to debunk myths can reinforce them, simply by repeating the untruth. [Researcher Christopher Graves] cited a 2005 study in the Journal of Consumer Research on “How Warnings about False Claims Become Recommendations.” It seems that people remember the assertion and forget whether it’s a lie. The authors wrote: “The more often older adults were told that a given claim was false, the more likely they were to accept it as true after several days have passed.”
When critics challenge false assertions — say, Trump’s claim that thousands of Muslims cheered in New Jersey when the twin towers fell on Sept. 11, 2001 — their refutations can threaten people, rather than convince them. Graves noted that if people feel attacked, they resist the facts all the more.

It seems a shame that you have to be so manipulative in order to help someone understand what the Kochs are doing to him.
...The study showed two interesting things: People are more likely to accept information if it’s presented unemotionally, in graphs; and they’re even more accepting if the factual presentation is accompanied by “affirmation” that asks respondents to recall an experience that made them feel good about themselves.
...The final point that emerged from Graves’s survey is that people will resist abandoning a false belief unless they have a compelling alternative explanation. That point was made in an article called “The Debunking Handbook,” by Australian researchers John Cook and Stephan Lewandowsky. They wrote: “Unless great care is taken, any effort to debunk misinformation can inadvertently reinforce the very myths one seeks to correct.”