13 August 2013
Why Arguing With A Wing-Nut Is Actually More Than Just A Waste of Time
Posted by Dan Satterfield
You almost certainly know that it’s a waste of time to argue with wing-nuts. If not, you’ll soon learn. There is actually solid scientific evidence that arguing not only does no good, it makes the conspiracy lover even more certain of their belief. It’s called the “back-fire” effect.
I am reading an excellent book, You Are Now Less Dumb, by David McRaney. He goes deeply into the backfire effect, which he sums up this way on his very popular blog, You Are Not So Smart:
The Misconception: When your beliefs are challenged with facts, you alter your opinions and incorporate the new information into your thinking.
The Truth: When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger.
Physicist John Cook (who writes the excellent blog Skeptical Science about climate change myths) wrote about the back-fire effect in his Debunking Handbook. Over the last couple of days alone, I’ve seen at least three crazy internet posts about HAARP and chemtrails, and watched a congressman declare climate change a giant hoax. Oh, and I’ll let you read about Donald Trump on ABC Sunday.
So, now that you are likely convinced that trying to argue with crazy is crazy, you might want to know what you should do. John Cook’s handbook has great ideas, but one thing you should think about is preventing others from taking a course in WING-NUT 101. Newspaper editors long ago figured out that they did not have an obligation to publish every nutty letter to the editor, and you will not get your letter published in any major newspaper I know of without a valid name and address. Editing a blog means you get to be the one who decides what comments are worthwhile, and I exercise that as responsibly as I can here and on my social media pages.
Are some people more susceptible to believing in conspiracy theories? It’s not foolproof, but from what I’ve seen, a big defense can be summed up in one word: EDUCATION. There is evidence to support this from the research of Dr. Bob Altemeyer on authoritarian personalities. In the meantime, don’t waste your time arguing with wing-nuts, but keep in mind that you can do your part to make sure they are not contagious! Once someone is caught in the conspiracy information bubble, the back-fire effect makes it unlikely they will escape. Telling someone they are losing touch with reality may be mean-spirited, but making it socially unacceptable to profess crazy beliefs is probably a powerful deterrent to others.
One last thing. WE ALL suffer from the back-fire effect. Knowing that is at least some defense against it. The scientific method is, of course, the greatest defense. It always eventually finds the truth because it relies only on observation, experiment, and testable theories. It truly is mankind’s greatest invention.
Actually, the ultimate source of the ‘back-fire’ effect lies in group affiliation. When we undergo trauma or intense stress, neurotransmitters are released that block the ability of our executive function to overrule subconsciously driven behavior cues. We tighten up who we consider our ‘in-group’ and trust them more than logic dictates. We search for evidence that people seek to do us harm, and become open to seeing patterns in unconnected events; as a result, we end up distrusting our out-groups more than logic dictates.

Starting arguments with people who view you as part of an out-group ends with them moving even closer to their in-group, because the argument continues the fear arousal and in turn validates the sense of threat. It is especially useless to argue with people you see as part of your own out-group, because your mind manipulates your interpretation of what they are saying and why. Your mind’s goal is to find a way to keep out-group people away from resources, power, and influence, since that reduces competition and makes your life better; to that end, it works really hard to alter your ability to think clearly about people in your out-groups. They seem crazier, dumber, less open to reason, and worthy of keeping away from any important decisions.

The key to making progress is to start by catching and shutting down thinking that validates other people’s role as out-group members. It’s really hard, but work to find a way to accept them as part of your in-group. If that means treating each person as a beloved uncle who frustrates the hell out of you, but whom you continue to respect and try to find common ground with, start there.
This is why I choose to argue with them in public, and challenge them to bet on their nonsense.
I don’t want to change the wingnut mind. That would be impossible. Instead, I just want to convince the people listening that there is no wingnut mind. There is truth, and there is bullshit.* And if they’re not willing to bet, then they’re peddling the latter.
Often I think the problem with fighting people who use passion over logic is that they tend to incite the same passion in those who would attempt to change their behavior. Our brains tend to birth emotional states when we sense other strong emotions in the room with us. So attacking someone’s beliefs head-on will often fail, because the attacker’s own emotions will come out and begin to cause a feedback loop.
Instead, if the desire is to change someone’s opinion, then this should be done in parallel with their own thought process. Start with facts that do not run contrary to their beliefs, but steer them toward other points that the person may not feel so strongly about. This allows for the re-engaging of logical thought, which can then be used to guide the conversation.
The other benefit to this is we can reteach the person how to move logically from data point to data point as the conversation moves. The head on approach, even if it works, does nothing in a preventative sense to stop the next emotionally driven conspiracy acceptance.
Given the AGU’s stance on anthropogenic responsibility for climate change, which I very much applaud, and their recommendation that scientists get involved, I really don’t know what to do with the back-fire effect, as detailed in Cook’s Debunking Handbook. When I engage non-scientific audiences and recommend actions to mitigate effects, or to set up networks to anticipate and deflect them, I get reactions that I am being alarmist, that I’m expecting people to move too fast, and that rather than updating their homes to be more energy efficient, people should be encouraged to bring their own coffee cups to church. Really? And all the time I have in mind the latest projections, per http://pubclimate.ch.mm.st/RunningOutOfTime.png.
I don’t really know what to do. I try, at http://hypergeometric.wordpress.com, but more direct approaches really do seem to back-fire.