The story of the laptop (what was on it, and how the story was handled and blocked) has been around for almost two years now, but it is still worth exploring in a TOK context, with connections to several themes (Knower, Technology, Politics). How do our prior beliefs affect how we interpret new information? How do we decide whether a claim is credible? What responsibility do social media companies have to decide what is true? What are the consequences of so few companies having so much power over the spread of information?
I like this topic because it pushes my students to confront their own discomfort with the potential weaponization of the concept of "fake news," but in a direction that suits their politics. The Twitter video of Sam Harris at the bottom, ironically, communicates what many people actually believe.
Here are a few articles that explain the controversy:
This was from a recent conversation with Sam Harris, for whom I normally have great respect. His defense of wide-ranging conspiracies to generate politically desirable outcomes is interesting. This is a good example of consequentialist ethics.
“Our brains are not built for the truth,” David Linden, a professor of neuroscience at the Johns Hopkins University School of Medicine, told me earlier this year. “Our brains weren’t even built to read. Our brains weren’t. Evolution is a very slow process. It takes many, many, many, many, many generations. And the change in technology and particularly in information is so rapid that there’s no way for evolution to keep up.”…
We choose who to believe, we choose who to trust, often before we realize we are doing it. It is no wonder our disinformation battles can feel so personal, especially within families.
When a French dictionary included the gender-nonspecific “iel” for the first time, a virulent reaction erupted over “wokisme” exported from American universities.
Charles Bimbenet, its director-general, posted a statement rejecting the minister’s charge of militancy. “The mission of the Robert is to observe the evolution of a French language that is in motion and diverse, and take account of that,” he wrote. “To define the words that describe the world is to aid better comprehension of it.”
France, a country where it is illegal for the state to compile racial statistics, is particularly on edge over the rise of American gender and race politics. President Emmanuel Macron has warned that “certain social science theories entirely imported from the United States” may be a threat. Mr. Blanquer has identified “an intellectual matrix” in American universities bent on undermining a supposedly colorblind French society of equal men and women through the promotion of identity-based victimhood.
Scientists corrode public trust when they pretend to have authority on social and political matters.
Science operates by a process of criticism. Scientists don’t experience divine revelations, they propose hypotheses that they and others test. This rigorous process of testing gives science the persuasiveness that mere journalism lacks. If a scientific periodical expels editors or peer reviewers because they don’t accept some prevailing theory, that process has been short-circuited. Those who call for such expulsions have missed the whole point of how science works. They are the true deniers, far more dangerous to science than a religious fundamentalist who believes the world is 6,000 years old.
To doubt a scientist is not to doubt science. Quite the contrary, personal authority is precisely what science dispenses with, as much as possible…
At a time of anxiety about fake news and conspiracy theories, philosophy can contribute to our most urgent cultural and political questions about how we come to believe what we think we know.
Democracies are especially vulnerable to epistemic threats because in needing the deliberative participation of their citizens, they must place a special value on truth….Indeed, a striking feature of our current political landscape is that we disagree not just over values (which is healthy in a democracy), and not just over facts (which is inevitable), but over our very standards for determining what the facts are. Call this knowledge polarization, or polarization over who knows—which experts to trust, and what is rational and what isn’t.
Below are a few different resources from author Jonathan Rauch discussing concepts of truth, knowledge, and misinformation, and the roles of institutions in producing knowledge. His work covers a lot of important ground related to TOK.
When Americans think about how we find truth amid a world full of discordant viewpoints, we usually turn to a metaphor, that of the marketplace of ideas. It is a good metaphor as far as it goes, yet woefully incomplete. It conjures up an image of ideas being traded by individuals in a kind of flea market, or of disembodied ideas clashing and competing in some ethereal realm of their own. But ideas in the marketplace do not talk directly to each other, and for the most part neither do individuals. Rather, our conversations are mediated through institutions like journals and newspapers and social-media platforms.
What our society is really suffering from is myside bias: People evaluate evidence, generate evidence, and test hypotheses in a manner biased toward their own prior beliefs, opinions, and attitudes. That we are facing a myside bias problem and not a calamitous societal abandonment of the concept of truth is perhaps good news in one sense, because the phenomenon of myside bias has been extensively studied in cognitive science. The bad news, however, is that what we know is not necessarily encouraging.
Trump, a former reality-TV host, beauty pageant organizer and businessman, once called African nations “shithole countries.” But he is now taking a page from African dictators who spread bogus health remedies, like Yahya Jammeh of Gambia, who claimed he could cure AIDS with bananas and herbal potions and pushed his treatments onto the population, resulting in deaths. Trump appeared to suggest injecting bleach and using sunlight to kill the coronavirus. He has also said he has taken hydroxychloroquine, a drug derived from quinine, a long-known jungle remedy for malaria. Doctors have advised against using the treatment to prevent or treat the coronavirus.
This article does a fascinating job of evaluating what the author calls “common knowledge,” similar to the TOK concept of shared knowledge, as a way to discuss the role of communities in forming beliefs and how modern technologies change the nature of common knowledge.
It’s only with the growth of communities of people interacting that most people gain such courage in their convictions to defy that which authoritative sources (media, political, corporate) deem to be acceptable narratives and acceptable norms. These communities generate more than validation of one’s preexisting beliefs. They generate the common knowledge that I know that many others feel the same as I do, others to whom I am joined in a community.
‘Battlefield maps’ show continent under attack from hostile invaders.
See, maps have a problem. They appear neutral, objective, authoritative. But that’s exactly all that they’re not. Each map reflects the many choices the cartographer has made, consciously or not, both in terms of content and form.
And so, without us even noticing it, maps can confirm bias, entrench prejudice and perpetuate injustice.