For years in the 1980s and ’90s, U.S. evangelicals, above nearly any other group, warned about what would happen when people abandoned absolute truth (which they located in the Bible), arguing that the idea of relative truth would lead people to believe whatever confirmed their own inward hunches. But suspicion of big government, questioning of scientific consensus (on evolution, for example) and a rejection of the morals of Hollywood and liberal elites took hold among millennial Christians, many of whom feel politically alienated and beaten up by mainstream media. They are natural targets for QAnon…
“Why would we listen to my friend Joe … who’s telling me about Jesus who also thinks that Communists are taking over America and operating a pedophile ring out of a pizza restaurant? … Why would we be believed?”
This idea of a gullible, pliable populace is, of course, nothing new. Voltaire said, “Those who can make you believe absurdities can make you commit atrocities.” But no, says Mercier, Voltaire had it backwards: “It is wanting to commit atrocities that makes you believe absurdities”…
If someone says Obama is a Muslim, their primary reason may be to indicate that they are a member of the group of people who co-ordinate around that statement. When a social belief and a true belief are in conflict, Klintman says, people will opt for the belief that best signals their social identity – even if it means lying to themselves…
Such a “belief” – being largely performative – rarely translates into action. It remains what Mercier calls a reflective belief, with no consequences on one’s behaviour, as opposed to an intuitive belief, which guides decisions and actions.
In 2014, a grad student made a joke video about a celestial body coming to destroy Earth, and got way more than he bargained for.
NYTimes Op-Doc exploring the psychological basis of our need for certainty and its pitfalls. Narrated by psychologist Arie Kruglanski who coined the term “cognitive closure.”
From 2016 but still offers meaningful insight into our current moment in politics.
“People who are anxious because of the uncertainty that surrounds them are going to be attracted to messages that offer them certainty. The need for closure is the need for certainty. To have clear-cut knowledge. You feel that you need to stop processing too much information, stop listening to a variety of information and zero in on what, to you, appears to be the truth. The need for closure is absolutely essential but it can also be extremely dangerous.”
We reveal how one of the biggest fake news stories ever concocted — the 1984 AIDS-is-a-biological-weapon hoax — went viral in the pre-Internet era. Meet the KGB cons who invented it, and the “truth squad” that quashed it. For a bit.
There are further episodes linked there as well.
Good Q&A that breaks down conspiratorial thinking. At the bottom is a link to the really well done “Conspiracy Theory Handbook.”
Conspiratorial videos and websites about COVID-19 are going viral. Here’s how one of the authors of “The Conspiracy Theory Handbook” says you can fight back. One big takeaway: Focus your efforts on people who can hear evidence and think rationally.
How do we prevent the spread of conspiracy theories?
By trying to inoculate the public against them. Telling the public ahead of time: Look, there are people who believe these conspiracy theories. They invent this stuff. When they invent it they exhibit these characteristics of misguided cognition. You can go through the traits we mention in our handbook, like incoherence, immunity to evidence, overriding suspicion and connecting random dots into a pattern. The best thing to do is tell the public how they can spot conspiracy theories and how they can protect themselves.
The Conspiracy Theory Handbook
Interesting cartoon that explains the dangers of fake news and how to combat them in your own mind. Unfortunately, I am skeptical about the value of laying out such processes to deal with this problem: how can you stop someone from being “fooled” into believing something they already believe, something that confirms and conforms to their deeper worldview? The deeper issue is motivated reasoning, not ignorance of how to handle new information. All that said, this is a fun cartoon; there is more than just the one panel featured below, so click on the image for the full cartoon.
“Human cognition is inseparable from the unconscious emotional responses that go with it.”
In theory, resolving factual disputes should be relatively easy: Just present the evidence of a strong expert consensus. This approach succeeds most of the time when the issue is, say, the atomic weight of hydrogen.
But things don’t work that way when the scientific consensus presents a picture that threatens someone’s ideological worldview. In practice, it turns out that one’s political, religious, or ethnic identity quite effectively predicts one’s willingness to accept expertise on any given politicized issue.
“It’s not about foreign trolls, filter bubbles or fake news. Technology encourages us to believe we can all have first-hand access to the ‘real’ facts – and now we can’t stop fighting about it.
“Contrary to initial hype surrounding big data, the explosion of information available to us is making it harder, not easier, to achieve consensus on truth. As the quantity of information increases, the need to pick out bite-size pieces of content rises accordingly. In this radically sceptical age, questions of where to look, what to focus on and who to trust are ones that we increasingly seek to answer for ourselves, without the help of intermediaries. This is a liberation of sorts, but it is also at the heart of our deteriorating confidence in public institutions.”