This is a rich topic that raises lots of questions worth discussing in the knowledge and knower unit. I’m posting a few of the stories here but will put together lessons around this topic next school year.
But the goal of journalism shouldn’t be to craft the most culturally sensitive or partisan narrative. The goal of journalism is to seek the truth. The consequences of telling the truth should be secondary to getting the truth out there in the first place, even if it makes the Trump administration or Republican Senators look good or the Chinese government look bad.
Good journalism, like good science, should follow evidence, not narratives. It should pay as much heed to intelligent gadflies as it does to eminent authorities. And it should never treat honest disagreement as moral heresy.
The Media’s Lab Leak Debacle Shows Why Banning ‘Misinformation’ Is a Terrible Idea: How a debate about COVID-19’s origins exposed a dangerous hubris
But Facebook’s concession that the lab leak story it once viewed as demonstrably false is actually possibly true should put to rest the idea that banning or regulating misinformation should be a chief public policy goal.
It’s one thing to discuss, debate, and correct wrong ideas, and both tech companies and media have roles to play in fostering healthy public dialogue. But Team Blue’s recent obsession with rendering unsayable anything that clashes with its preferred narrative is the height of hubris. The conversation should not be closed by the government and its yes-men in journalism, in tech, or even in public health.
For years in the 1980s and ’90s, U.S. evangelicals, above nearly any other group, warned what would happen when people abandoned absolute truth (which they located in the Bible), saying the idea of relative truth would lead to people believing whatever confirmed their own inward hunches. But suspicion of big government, questioning of scientific consensus (on evolution, for example) and a rejection of the morals of Hollywood and liberal elites took hold among millennial Christians, many of whom feel politically alienated and beat up by mainstream media. They are natural targets for QAnon…
“Why would we listen to my friend Joe … who’s telling me about Jesus who also thinks that Communists are taking over America and operating a pedophile ring out of a pizza restaurant? … Why would we be believed?”
This idea of a gullible, pliable populace is, of course, nothing new. Voltaire said, “Those who can make you believe absurdities can make you commit atrocities.” But no, says Mercier, Voltaire had it backwards: “It is wanting to commit atrocities that makes you believe absurdities”…
If someone says Obama is a Muslim, their primary reason may be to indicate that they are a member of the group of people who co-ordinate around that statement. When a social belief and a true belief are in conflict, Klintman says, people will opt for the belief that best signals their social identity – even if it means lying to themselves…
Such a “belief” – being largely performative – rarely translates into action. It remains what Mercier calls a reflective belief, with no consequences on one’s behaviour, as opposed to an intuitive belief, which guides decisions and actions.
NYTimes Op-Doc exploring the psychological basis of our need for certainty and its pitfalls. Narrated by psychologist Arie Kruglanski who coined the term “cognitive closure.”
From 2016 but still offers meaningful insight into our current moment in politics.
“People who are anxious because of the uncertainty that surrounds them are going to be attracted to messages that offer them certainty. The need for closure is the need for certainty. To have clear-cut knowledge. You feel that you need to stop processing too much information, stop listening to a variety of information and zero in on what, to you, appears to be the truth. The need for closure is absolutely essential but it can also be extremely dangerous.”
We reveal how one of the biggest fake news stories ever concocted — the 1984 AIDS-is-a-biological-weapon hoax — went viral in the pre-Internet era. Meet the KGB cons who invented it, and the “truth squad” that quashed it. For a bit.
Good Q and A that breaks down conspiratorial thinking. At the bottom is a link for the really well done “Conspiracy Theory Handbook.”
Conspiratorial videos and websites about COVID-19 are going viral. Here’s how one of the authors of “The Conspiracy Theory Handbook” says you can fight back. One big takeaway: Focus your efforts on people who can hear evidence and think rationally.
How do we prevent the spread of conspiracy theories?
By trying to inoculate the public against them. Telling the public ahead of time: Look, there are people who believe these conspiracy theories. They invent this stuff. When they invent it they exhibit these characteristics of misguided cognition. You can go through the traits we mention in our handbook, like incoherence, immunity to evidence, overriding suspicion and connecting random dots into a pattern. The best thing to do is tell the public how they can spot conspiracy theories and how they can protect themselves.
Interesting cartoon that explains the dangers of fake news and how to combat it in your own mind. Unfortunately, I am skeptical about the value of laying out such processes to deal with this problem. How can you stop someone from being “fooled” into believing something they already believe, something that confirms and conforms to their deeper worldviews? The deeper issue is motivated reasoning rather than an ignorance of how to handle new information. All that being said, this is a fun cartoon. There is more than just the one panel featured below; click on the image for the full cartoon.
“Human cognition is inseparable from the unconscious emotional responses that go with it.”
In theory, resolving factual disputes should be relatively easy: Just present the evidence of a strong expert consensus. This approach succeeds most of the time when the issue is, say, the atomic weight of hydrogen.
But things don’t work that way when the scientific consensus presents a picture that threatens someone’s ideological worldview. In practice, it turns out that one’s political, religious, or ethnic identity quite effectively predicts one’s willingness to accept expertise on any given politicized issue.