False information is pervasive and difficult to eradicate, but scientists are developing new strategies such as “de-biasing,” a method that focuses on facts, to help spread the truth
A recurring red herring in the current presidential campaign is the veracity of President Barack Obama’s birth certificate. Although the president has made this document public, and records of his 1961 birth in Honolulu have been corroborated by newspaper announcements, a vocal segment of the population continues to insist that Obama’s birth certificate proving U.S. citizenship is a fraud, making him legally ineligible to be president. A 2011 Politico survey found that a majority of Republican primary voters shared this clearly false belief.
Scientific issues can be just as vulnerable to misinformation campaigns. Plenty of people still believe that vaccines cause autism and that human-caused climate change is a hoax. Science has thoroughly debunked these myths, but the misinformation persists in the face of overwhelming evidence. Worse, straightforward attempts at correction can backfire: a paper published on September 18 in Psychological Science in the Public Interest (PSPI) reports that efforts to fight the problem frequently have the opposite effect.
“You have to be careful when you correct misinformation that you don’t inadvertently strengthen it,” says Stephan Lewandowsky, a psychologist at the University of Western Australia in Perth and one of the paper’s authors. “If the issues go to the heart of people’s deeply held world views, they become more entrenched in their opinions if you try to update their thinking.”
Psychologists call this reaction belief perseverance: maintaining your original opinions in the face of overwhelming data that contradicts your beliefs. Everyone does it, but we are especially vulnerable when invalidated beliefs form a key part of how we narrate our lives. Researchers have found that stereotypes, religious faiths and even our self-concept are especially vulnerable to belief perseverance. A 2008 study in the Journal of Experimental Social Psychology found that people are more likely to continue believing incorrect information if it enhances their self-image. For example, an individual who has become known in her community for claiming that vaccines cause autism might build her self-identity around helping other parents prevent autism by avoiding vaccination. Admitting that the original study linking autism to the MMR (measles–mumps–rubella) vaccine was ultimately deemed fraudulent would diminish that self-concept.
In this circumstance, it is easier to continue believing that autism and vaccines are linked, according to Dartmouth College political science researcher Brendan Nyhan. “It’s threatening to admit that you’re wrong,” he says. “It’s threatening to your self-concept and your worldview.” It’s why, Nyhan says, so many examples of misinformation are from issues that dramatically affect our lives and how we live.
via Scientific American – Carrie Arnold