This idea of a gullible, pliable populace is, of course, nothing new. Voltaire said, “Those who can make you believe absurdities can make you commit atrocities.” But no, says Mercier: Voltaire had it backwards. “It is wanting to commit atrocities that makes you believe absurdities”…
If someone says Obama is a Muslim, their primary reason may be to indicate that they are a member of the group of people who co-ordinate around that statement. When a social belief and a true belief are in conflict, Klintman says, people will opt for the belief that best signals their social identity – even if it means lying to themselves…
Such a “belief” – being largely performative – rarely translates into action. It remains what Mercier calls a reflective belief, with no consequences for one’s behaviour, as opposed to an intuitive belief, which guides decisions and actions.
NYTimes Op-Doc exploring the psychological basis of our need for certainty and its pitfalls. Narrated by psychologist Arie Kruglanski who coined the term “cognitive closure.”
From 2016 but still offers meaningful insight into our current moment in politics.
“People who are anxious because of the uncertainty that surrounds them are going to be attracted to messages that offer them certainty. The need for closure is the need for certainty. To have clear-cut knowledge. You feel that you need to stop processing too much information, stop listening to a variety of information and zero in on what, to you, appears to be the truth. The need for closure is absolutely essential but it can also be extremely dangerous.”
We reveal how one of the biggest fake news stories ever concocted — the 1984 AIDS-is-a-biological-weapon hoax — went viral in the pre-Internet era. Meet the KGB cons who invented it, and the “truth squad” that quashed it. For a bit.
Good Q and A that breaks down conspiratorial thinking. At the bottom is a link for the really well done “Conspiracy Theory Handbook.”
Conspiratorial videos and websites about COVID-19 are going viral. Here’s how one of the authors of “The Conspiracy Theory Handbook” says you can fight back. One big takeaway: Focus your efforts on people who can hear evidence and think rationally.
How do we prevent the spread of conspiracy theories?
By trying to inoculate the public against them. Telling the public ahead of time: Look, there are people who believe these conspiracy theories. They invent this stuff. When they invent it, they exhibit these characteristics of misguided cognition. You can go through the traits we mention in our handbook, like incoherence, immunity to evidence, overriding suspicion and connecting random dots into a pattern. The best thing to do is tell the public how they can spot conspiracy theories and how they can protect themselves.
Interesting cartoon that explains the dangers of fake news and how to combat it in your own mind. Unfortunately, I am skeptical about the value of laying out such processes to deal with this problem. How can you stop someone from being “fooled” into believing something they already believe – something that confirms and conforms to their deeper worldview? The deeper issue is motivated reasoning rather than an ignorance of how to deal with new information. All that being said, this is a fun cartoon. There is more than just the one panel featured below; click on the image for the full cartoon.
“Human cognition is inseparable from the unconscious emotional responses that go with it.”
In theory, resolving factual disputes should be relatively easy: Just present the evidence of a strong expert consensus. This approach succeeds most of the time when the issue is, say, the atomic weight of hydrogen.
But things don’t work that way when the scientific consensus presents a picture that threatens someone’s ideological worldview. In practice, it turns out that one’s political, religious, or ethnic identity quite effectively predicts one’s willingness to accept expertise on any given politicized issue.
“It’s not about foreign trolls, filter bubbles or fake news. Technology encourages us to believe we can all have first-hand access to the ‘real’ facts – and now we can’t stop fighting about it.
“Contrary to initial hype surrounding big data, the explosion of information available to us is making it harder, not easier, to achieve consensus on truth. As the quantity of information increases, the need to pick out bite-size pieces of content rises accordingly. In this radically sceptical age, questions of where to look, what to focus on and who to trust are ones that we increasingly seek to answer for ourselves, without the help of intermediaries. This is a liberation of sorts, but it is also at the heart of our deteriorating confidence in public institutions.”
Developing deepfake detection technology is important, but it’s only part of the solution. It is the human factor – weaknesses in our psychology, not the fakes’ technical sophistication – that makes deepfakes so effective. New research hints at how foundational the problem is.
The biggest threat of deepfakes isn’t the deepfakes themselves
“Deepfakes do pose a risk to politics in terms of fake media appearing to be real, but right now the more tangible threat is how the idea of deepfakes can be invoked to make the real appear fake,” says Henry Ajder, one of the authors of the report. “The hype and rather sensational coverage speculating on deepfakes’ political impact has overshadowed the real cases where deepfakes have had an impact.”
Science communication has lost its sense of empathy and misunderstands how fear can alter a person’s belief system.
When we feel so fundamentally disenfranchised, it’s comforting to concoct a fictional universe that systemically denies us the right cards. It gives us something to fight against and restores a sense of self-determination.
It provides an “us and them” narrative that lets us cast ourselves as a little David raging against a haughty, intellectual establishment Goliath.