Centralized social media, as Jack Dorsey wrote, was a grand experiment in collective global human consciousness. It was a modern-day Tower of Babel, the Human Instrumentality project from Neon Genesis Evangelion. Yes it was a way to make some people rich, but it was also an experiment in uniting the human race. Perhaps if we could all just get in one room and talk to each other, if we could just get rid of our echo chambers and our filter bubbles, we would eventually reach agreement, and the old world of war and hate and misunderstanding would melt into memory.
That experiment failed. Humanity does not want to be a global hive mind. We are not rational Bayesian updaters who will eventually reach agreement; when we receive the same information, it tends to polarize us rather than unite us. Getting screamed at and insulted by people who disagree with you doesn’t take you out of your filter bubble — it makes you retreat back inside your bubble and reject the ideas of whoever is screaming at you. No one ever changed their mind from being dunked on; instead they all just doubled down and dunked harder. The hatred and toxicity of Twitter at times felt like the dying screams of human individuality, being crushed to death by the hive mind’s constant demands for us to agree with more people than we ever evolved to agree with.
The story of the laptop, what was on it, and how the story was handled (and blocked) has been around for almost two years now, but it is still worth exploring in a TOK context, with connections to several themes (Knower, Technology, Politics). How do our prior beliefs affect how we interpret new information? How do we decide whether a claim is credible? What responsibility do social media companies have to decide what is true? What are the consequences of so few companies having so much power over the spread of information?
I like this topic because it pushes my students to confront their own discomfort with the potential weaponization of the concept of fake news, but in a direction that suits their politics. The Twitter video of Sam Harris at the bottom, ironically, communicates what many people actually believe.
Here are a few articles that explain the controversy:
This was from a recent conversation with Sam Harris, whom I normally have great respect for. His defense of wide-ranging conspiracies to generate politically desirable outcomes is interesting, and it is a good example of consequentialist ethics.
In which, Sam Harris says that it's ok to conspire against Trump getting elected, because he was the equivalent of an asteroid headed towards earth. Literally literally. Worth watching just for Francis' reaction at the end. Omg. pic.twitter.com/hVH7IPAx1t
There is a podcast clip circulating that seems to be confusing many people about my views on Trump (which is understandable because I wasn’t speaking very clearly). So, for what it’s worth, here is what I was trying to say: 1/6
Fascinating reflections on the power of imagery to change our consciousness about ourselves.
“Once a photograph of the Earth, taken from outside, is available…a new idea as powerful as any in history will be let loose.” — Astronomer Fred Hoyle, 1948
“A photograph would do it — a color photograph from space of the earth,” Brand said. “There it would be for all to see, the earth complete, tiny, adrift, and no one would ever perceive things the same way.” — Stewart Brand
“Here we came all this way to the Moon, and yet the most significant thing we’re seeing is our own home planet, the Earth.” — Astronaut Bill Anders
“You develop an instant global consciousness, a people orientation, an intense dissatisfaction with the state of the world, and a compulsion to do something about it. From out there on the moon, international politics look so petty. You want to grab a politician by the scruff of the neck and drag him a quarter of a million miles out and say, ‘Look at that, you son of a bitch.’” — Astronaut Edgar Mitchell
“Our brains are not built for the truth,” David Linden, a professor of neuroscience at the Johns Hopkins University School of Medicine, told me earlier this year. “Our brains weren’t even built to read. Our brains weren’t. Evolution is a very slow process. It takes many, many, many, many, many generations. And the change in technology and particularly in information is so rapid that there’s no way for evolution to keep up.”…
We choose who to believe, we choose who to trust, often before we realize we are doing it. It is no wonder our disinformation battles can feel so personal, especially within families.
What Afghanistan shows is that we need a new definition of expertise, one that relies more on proven track records and healthy cognitive habits, and less on credentials and the narrow forms of knowledge that are too often rewarded. In an era of populism and declining trust in institutions, such a project is necessary to put expertise on a stronger footing.
Tetlock and the Taliban
How a humiliating military loss proves that so much of our so-called “expertise” is fake, and the case against specialization and intellectual diversity
The American-led coalition had countless experts with backgrounds pertaining to every part of the mission on their side: people who had done their dissertations on topics like state building, terrorism, military-civilian relations, and gender in the military…Meanwhile, the Taliban did not have a Western PhD among them.
Throughout the pandemic, Americans have grappled with, and largely failed to make sense of, COVID-19 statistics. One major reason for this failure is that the public has found itself at the mercy of commentators who simultaneously report and interpret the math for them. Too often, these interpretations are skewed to support a narrative that resonates with their audiences, either painting a drastic scenario about the risks (school is dangerous for children!) or one that minimizes these same risks (COVID-19 is just another flu!).
It is essential that we use better, more thoughtful COVID-19 math so we can get an accurate idea of the real risks of COVID-19, and of the potential downsides of interventions.
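One way to make this point concrete in class (a minimal sketch with invented numbers, not actual COVID-19 data) is to show how the same figures sound very different under relative versus absolute risk framing:

```python
# Hypothetical illustration with invented numbers (not real COVID-19 data):
# the same figures can support an alarming headline or a reassuring one,
# depending on whether they are framed as relative or absolute risk.

baseline_risk = 2 / 100_000   # assumed risk of a bad outcome without the exposure
exposed_risk = 4 / 100_000    # assumed risk with the exposure

relative_increase = (exposed_risk - baseline_risk) / baseline_risk  # 1.0, i.e. "risk doubles!"
absolute_increase = exposed_risk - baseline_risk                    # 0.00002, i.e. 2 extra cases per 100,000

print(f"Relative framing: risk increases by {relative_increase:.0%}")
print(f"Absolute framing: {absolute_increase * 100_000:.0f} extra cases per 100,000 people")
```

Both statements are arithmetically true of the same (made-up) numbers; the narrative work is done entirely by the choice of framing.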
At a time of anxiety about fake news and conspiracy theories, philosophy can contribute to our most urgent cultural and political questions about how we come to believe what we think we know.
Democracies are especially vulnerable to epistemic threats because in needing the deliberative participation of their citizens, they must place a special value on truth….Indeed, a striking feature of our current political landscape is that we disagree not just over values (which is healthy in a democracy), and not just over facts (which is inevitable), but over our very standards for determining what the facts are. Call this knowledge polarization, or polarization over who knows—which experts to trust, and what is rational and what isn’t.
Below are a few different resources from author Jonathan Rauch discussing concepts of truth, knowledge, misinformation, and the roles of institutions in producing knowledge. His work covers a lot of important ground related to TOK.
When Americans think about how we find truth amid a world full of discordant viewpoints, we usually turn to a metaphor, that of the marketplace of ideas. It is a good metaphor as far as it goes, yet woefully incomplete. It conjures up an image of ideas being traded by individuals in a kind of flea market, or of disembodied ideas clashing and competing in some ethereal realm of their own. But ideas in the marketplace do not talk directly to each other, and for the most part neither do individuals. Rather, our conversations are mediated through institutions like journals and newspapers and social-media platforms.
Regardless of what is most needed in the world at any given moment—regardless of whether the conditions call for more orthodoxy or more heterodoxy—there always needs to be an avenue for discussion. Both orthodox and heterodox ideas always need to be publicly discussable. Otherwise, whoever holds the most power when censorship begins—at the point at which people begin hiding their thoughts and conversations—will gain ever more power. The powerful will shape the governing orthodoxy—and it will always be an orthodoxy, even if its central ideas were heterodox just yesterday—and will crack down ever harder on those who dissent.
These statements reflect a real problem of vaccine advocacy. Proponents of the vaccine are unwilling or unable to understand the thinking of vaccine skeptics — or even admit that skeptics may be thinking at all. Their attempts to answer skepticism or understand it are poisoned by condescension, and end up reinforcing it.