Nov 22, 2008

Why It's Hard to Change People's Minds

By Sean Gonsalves. From Alternet.org:

A long time ago, Mark Twain told us: "It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so."

Entwined in Twain's train of thought is an implicit -- and important -- distinction: the difference between being uninformed and being misinformed.

Today, there's scholarship to back up Twain's theory that being ignorant isn't as troublesome as being certain about something that "just ain't so."

Ignorance can be educated. But what's the antidote to misinformation? Correct information?

Not exactly -- according to political scientists Brendan Nyhan and Jason Reifler, co-authors of one of the few academic studies on the subject, "When Corrections Fail: The Persistence of Political Misperceptions."

While it may seem like common sense to think misinformation can be countered by giving people the real 411, Nyhan and Reifler's research indicates that correct information often fails to reduce misperceptions among the ideologically committed, particularly doctrinaire conservatives.

That's something many readers of this column understand intuitively, having seen false claims like Obama-is-a-Muslim refuted over and over again, only to watch them, unbelievably, persist anyway.

There's lots of research on citizen ignorance, but only a handful of studies focus on misinformation and the effect it has on political opinions. Nyhan and Reifler's work adds to what Yale University political scientist John Bullock has found: it's possible to correct and change misinformed political opinions, but the truth (small 't') ain't enough.

In Bullock's experimental study, participants were shown the transcript of an ad created by a pro-choice group opposing the Supreme Court nomination of John Roberts. The ad falsely accused Roberts of "supporting violent fringe groups and a convicted clinic bomber."

What Bullock found was that 56 percent of the Democratic participants disapproved of Roberts before hearing the misinformation. After seeing the attack ad, it jumped to 80 percent.

When they were shown an ad that refuted the misinformation and were also told the pro-choice group had withdrawn the original ad, the disapproval rating didn't drop back down to 56 percent but to 72 percent.

Nyhan and Reifler conducted a series of studies in which subjects were presented with mock news articles on "hot button" issues containing demonstrably false assertions: that Iraq possessed WMD immediately before the U.S. invasion; that tax cuts lead to economic growth; and that Bush banned stem cell research, as Sens. Kerry and Kennedy claimed during the 2004 presidential campaign.

With the Iraq-possessed-WMD-immediately-before-the-invasion assertion, participants were shown mock news articles supporting the unfounded Bush administration claim and then given the refutation by way of the Duelfer Report, which authoritatively documented the absence of WMD, or even an active production program, in Iraq just before the invasion.

But instead of changing the minds of ideologically committed war-backers, Nyhan and Reifler found a "backfire effect": Iraq invasion-supporters clung to the misinformation, only slightly modifying their view to say that "Saddam Hussein was able to hide or destroy these weapons right before U.S. forces arrived." Sigh.

Nyhan and Reifler attribute that kind of "thinking" to the effects of "motivated reasoning," which can distort how people process information.

"As a result (of motivated reasoning), the corrections fail to reduce misperceptions for the most committed participants. Even worse, they actually strengthen misperceptions among ideological subgroups."

Now you know why those back-and-forth online debates so often prove fruitless. Unfortunately, neither Bullock nor Nyhan and Reifler suggest a way to successfully counter misinformation clung to by those who hold their political opinions with an air of certitude.

Washington Post columnist Shankar Vedantam suggests wrapping refutations in language that enhances the self-esteem of the misinformed.

Whatever you do, just don't forget Twain's timeless advice: "tell the truth or trump -- but get the trick."

Sean Gonsalves is a syndicated columnist and news editor with the Cape Cod Times.