A recent NY Times article, "I Don't Want to Be Right," provides some insight. The article explores why people refuse to change false beliefs, even when presented with clear factual evidence and even after acknowledging that the evidence is true. Consider, for example, someone who says, "I know that the world is round, but I'm going to believe it is flat." Why would this happen?
Consider this scenario:
...not all errors are created equal. Not all false information goes on to become a false belief—that is, a more lasting state of incorrect knowledge—and not all false beliefs are difficult to correct. Take astronomy. If someone asked you to explain the relationship between the Earth and the sun, you might say something wrong: perhaps that the sun rotates around the Earth, rising in the east and setting in the west. A friend who understands astronomy may correct you. It’s no big deal; you simply change your belief.
But imagine living in the time of Galileo, when understandings of the Earth-sun relationship were completely different, and when that view was tied closely to ideas of the nature of the world, the self, and religion. What would happen if Galileo tried to correct your belief? The process isn't nearly as simple. The crucial difference between then and now, of course, is the importance of the misperception. When there's no immediate threat to our understanding of the world, we change our beliefs. It's when that change contradicts something we've long held as important that problems occur (Maria Konnikova, NY Times, "I Don't Want to Be Right").

So why would a false belief persist? It persists when the belief is tied strongly to your sense of self, when changing it would force you to revise beliefs about a whole range of other things, or when it threatens something you stand for with conviction. "A man with a conviction is a hard man to change. Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point," wrote the celebrated Stanford University psychologist Leon Festinger (see related article: Chris Mooney, Mother Jones, "The Science of Why We Don't Believe Science").
What can scientists and physicians do about the vaccine-autism misbelief, then? The answer is not clear, since the topic has become one of deep conviction for parents and caregivers, heavily invested with emotion. But at least it is not an inherently ideological one. The best we can do is try to prevent it from becoming more so, and that applies to all of our politics and debates.
And that, ultimately, is the final, big piece of the puzzle: the cross-party, cross-platform unification of the country's élites, those we perceive as opinion leaders, can make it possible for messages to spread broadly. The campaign against smoking is one of the most successful public-interest fact-checking operations in history. But if smoking had been an issue just for Republicans or just for Democrats, change would have been far less likely. It's only after ideology is put to the side that a message itself can take hold, decoupled from notions of self-perception.
Vaccines, fortunately, aren't political. "They're not inherently linked to ideology," Nyhan said. "And that's good. That means we can get to a consensus." Ignoring vaccination, after all, can make people of every political party and every religion just as sick (Maria Konnikova, NY Times, "I Don't Want to Be Right").