Below are a few different resources from author Jonathan Rauch discussing concepts of truth, knowledge, misinformation and the roles of institutions in producing knowledge. His work covers a lot of important ground related to TOK.
When Americans think about how we find truth amid a world full of discordant viewpoints, we usually turn to a metaphor, that of the marketplace of ideas. It is a good metaphor as far as it goes, yet woefully incomplete. It conjures up an image of ideas being traded by individuals in a kind of flea market, or of disembodied ideas clashing and competing in some ethereal realm of their own. But ideas in the marketplace do not talk directly to each other, and for the most part neither do individuals. Rather, our conversations are mediated through institutions like journals and newspapers and social-media platforms.
Persuasion Podcast: Don’t Give Up on Truth
These statements reflect a real problem of vaccine advocacy. Proponents of the vaccine are unwilling or unable to understand the thinking of vaccine skeptics — or even admit that skeptics may be thinking at all. Their attempts to answer or understand skepticism are poisoned by condescension, which ends up reinforcing it.
Here’s why your efforts to convince anti-vaxxers aren’t working
People don’t listen to outsiders. They need enlightened insiders to offer them a ladder to climb down
What our society is really suffering from is myside bias: People evaluate evidence, generate evidence, and test hypotheses in a manner biased toward their own prior beliefs, opinions, and attitudes. That we are facing a myside bias problem and not a calamitous societal abandonment of the concept of truth is perhaps good news in one sense, because the phenomenon of myside bias has been extensively studied in cognitive science. The bad news, however, is that what we know is not necessarily encouraging.
Science suggests we’re hardwired to delude ourselves. Can we do anything about it?
If I had to single out a particular bias as the most pervasive and damaging, it would probably be confirmation bias. That’s the effect that leads us to look for evidence confirming what we already think or suspect, to view facts and ideas we encounter as further confirmation, and to discount or ignore any piece of evidence that seems to support an alternate view. Confirmation bias shows up most blatantly in our current political divide, where each side seems unable to allow that the other side is right about anything.
Attached are some passages from the book Cribsheet by Emily Oster, an economist who wrote a data-driven guide to parenting. I put together some interesting passages from the introduction and from one chapter that does a nice job contextualizing data-driven decision making: what a good study is, the limits of those studies, and the ultimate uncertainty of all knowledge produced using data.
Meaningful connections to constructing knowledge and data collection in the human sciences (particularly economics), natural sciences, and cognitive biases. Also deals well with problems of sorting out the differences between correlation and causation.
Generally great book for parenting, not just for its TOK connections.
How can you identify a good study? This is a hard question. Some things you can see directly. Certain approaches are better than others – randomized trials, for example, are usually more compelling than other designs. Large studies tend, on average, to be better. More studies confirming the same thing tends to increase confidence, although not always – sometimes they all have the same biases in their results.
Passages from Cribsheet by Emily Oster
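The point about large studies can be made concrete with a quick simulation. The sketch below (my own illustration, not from the book; the effect size and noise level are made-up numbers) runs many hypothetical "studies" of different sizes measuring the same true effect, and shows that small studies scatter widely around the truth while large ones cluster tightly — which is why any single small study deserves less confidence.

```python
import random
import statistics

random.seed(42)  # fixed seed so the simulation is repeatable

def simulate_study(n, true_effect=0.5, noise=2.0):
    """One hypothetical study: the average of n noisy measurements
    of an underlying true effect (values here are illustrative)."""
    return statistics.mean(true_effect + random.gauss(0, noise) for _ in range(n))

def spread_of_estimates(n, trials=500):
    """Run many studies of size n and report how much their
    results vary around the truth (lower spread = more reliable)."""
    estimates = [simulate_study(n) for _ in range(trials)]
    return statistics.stdev(estimates)

small = spread_of_estimates(n=25)     # many small studies
large = spread_of_estimates(n=2500)   # many large studies
print(f"spread of results with n=25:   {small:.3f}")
print(f"spread of results with n=2500: {large:.3f}")
```

Running this shows the spread shrinking roughly with the square root of the sample size — the large studies disagree with each other far less. It also illustrates Oster's caveat: if every study shared the same bias (say, every measurement shifted upward), they would all cluster tightly around the *wrong* answer, and replication alone wouldn't reveal it.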
The reason is simple: most of us, even those of us who are scientists ourselves, lack the relevant scientific expertise needed to adequately evaluate that research on our own. In our own fields, we are aware of the full suite of data, of how those puzzle pieces fit together, and of where the frontiers of our knowledge are…
There’s an old saying that I’ve grown quite fond of recently: you can’t reason someone out of a position they didn’t reason themselves into. When most of us “research” an issue, what we are actually doing is:
formulating an initial opinion the first time we hear about something,
evaluating everything we encounter after that through the lens of that gut instinct,
finding reasons to think positively about the portions of the narrative that support or justify our initial opinion,
and finding reasons to discount or otherwise dismiss the portions that detract from it.
NYTimes Op-Doc exploring the psychological basis of our need for certainty and its pitfalls. Narrated by psychologist Arie Kruglanski who coined the term “cognitive closure.”
From 2016 but still offers meaningful insight into our current moment in politics.
“People who are anxious because of the uncertainty that surrounds them are going to be attracted to messages that offer them certainty. The need for closure is the need for certainty. To have clear-cut knowledge. You feel that you need to stop processing too much information, stop listening to a variety of information and zero in on what, to you, appears to be the truth. The need for closure is absolutely essential but it can also be extremely dangerous.”
So, it could be that the effect is all in your head. It could be that the effect is real, whether it’s placebo pain relief or measurable weight loss. But either way, if your experience flies in the face of research results, you’re probably going to go with your experience. And Hitchcock says that could be a completely rational decision. If the cost of continuing (say, paying for a supplement) is small compared to the risk of discontinuing (and potentially giving up the perceived benefit), it makes sense to keep on keeping on.
Here are some other articles related to natural sciences and diet.
“Human cognition is inseparable from the unconscious emotional responses that go with it.”
In theory, resolving factual disputes should be relatively easy: Just present the evidence of a strong expert consensus. This approach succeeds most of the time when the issue is, say, the atomic weight of hydrogen.
But things don’t work that way when the scientific consensus presents a picture that threatens someone’s ideological worldview. In practice, it turns out that one’s political, religious, or ethnic identity quite effectively predicts one’s willingness to accept expertise on any given politicized issue.