Cautionary Conversation: Flying on Empty (Followed by a great conversation on the utility and danger of relying on math)

A great podcast episode on how disasters can happen when we get the math wrong. This particular story is about a unit conversion error; it is followed by a wide-ranging conversation about the nature of math.

“A metre is longer than a yard. An ounce is heavier than a gram. We harmlessly mix them up sometimes, but a “unit conversion error” when you’re filling up the fuel tanks of an airliner can be fatal. Which is exactly what happened to Air Canada Flight 143.”
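Flight 143's fuel load was worked out with a pounds-based conversion factor where a kilograms-based one was needed. A minimal sketch of that failure mode in code (the figures below are illustrative, not the flight's actual paperwork):

```python
# If a fuel figure intended as kilograms is computed in pounds,
# the tanks end up with less than half the needed fuel.
LB_PER_KG = 2.20462  # pounds per kilogram

def fuel_loaded_kg(required_kg: float) -> float:
    """Kilograms actually loaded when the required mass is
    mistakenly treated as a figure in pounds."""
    return required_kg / LB_PER_KG

required = 22_300  # kg of fuel needed for the trip (illustrative)
loaded = fuel_loaded_kg(required)
print(f"needed {required} kg, loaded {loaded:.0f} kg "
      f"({loaded / required:.0%} of requirement)")
```

The error is invisible to a casual check because the number on the paperwork looks plausible; only the units are wrong, and the shortfall is a factor of 2.2.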

The episode itself is fascinating, but the conversation with the author upon whose work the story is based is equally rewarding. Below is a quote from that conversation that explains some of the utility and dangers of math.

“The underlying issue is that as humans we’re not naturally good at mathematics…the human brain doesn’t do maths natively…because we have maths we can do so much more than our brains could do intuitively. We don’t have to make a building by eyeballing it and super-overengineering it; we can do the mathematics and figure out exactly what we need and how it is going to work. Using maths we can do far more than the human brain was ever designed to do. The cost, however, is that we are beyond our intuition and have to do the maths, and do it very carefully.”

I have the link below queued to the section quoted above.

https://omny.fm/shows/cautionary-tales-with-tim-harford/cautionary-conversation-flying-on-empty/embed?t=27m59s

Full episode linked below

Prenatal Test False Positives: Bayes’ rule is still very, very important

Posted here is a NYT article about genetic testing companies and the value of the results they provide, along with an additional explanation of the math behind the issues.

“On its face, this sounds like a really good test! It detects 75% of cases, with only a 0.5% false positive rate. That seems like it should be helpful.

Altogether, there are 415 positive tests: the 15 true positives, and 400 false positives. So if you get a positive test result, the chance that the fetus is actually affected is about 3.6%.”
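The arithmetic behind those numbers is a direct application of Bayes' rule. A quick sketch, using the article's sensitivity and false positive rate and back-solving the population split (20 affected, 80,000 unaffected) from the quoted counts of 15 true and 400 false positives:

```python
# Sensitivity and false positive rate as quoted; the population
# split is inferred from the stated counts, not given directly.
affected, unaffected = 20, 80_000
sensitivity = 0.75           # detects 75% of cases
false_positive_rate = 0.005  # 0.5%

true_positives = affected * sensitivity              # 15
false_positives = unaffected * false_positive_rate   # 400
total_positives = true_positives + false_positives   # 415

# P(affected | positive test)
posterior = true_positives / total_positives
print(f"{total_positives:.0f} positives; "
      f"chance the fetus is affected given a positive: {posterior:.1%}")
# -> 415 positives; chance the fetus is affected given a positive: 3.6%
```

The counterintuitive part is that the rarer the condition, the more the false positives swamp the true ones, no matter how impressive the test's headline accuracy looks.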

https://emilyoster.substack.com/p/prenatal-test-false-positives

Further discussion from this author about Bayes’ rule

https://emilyoster.substack.com/p/bayes-rule-is-my-faves-rule

How to Fix Our Broken Relationship With COVID Math

Four rules to improve reporting about risk.

Throughout the pandemic, Americans have grappled with, and largely failed to make sense of, COVID-19 statistics. One major reason for this failure is that the public has found itself at the mercy of commentators who simultaneously report and interpret the math for them. Too often, these interpretations are skewed to support a narrative that resonates with their audiences, either painting a drastic scenario about the risks (school is dangerous for children!) or one that minimizes these same risks (COVID-19 is just another flu!).

It is essential that we use better, more thoughtful COVID-19 math so we can get an accurate idea of the real risks of COVID-19, and of the potential downsides of interventions. 

https://www.persuasion.community/p/how-to-fix-our-broken-relationship

Is the Schrödinger Equation True? Just because a mathematical formula works does not mean it reflects reality

How real are the equations with which we represent nature?


Physicists’ theories work. They predict the arc of planets and the flutter of electrons, and they have spawned smartphones, H-bombs and—well, what more do we need? But scientists, and especially physicists, aren’t just seeking practical advances. They’re after Truth. They want to believe that their theories are correct—exclusively correct—representations of nature. Physicists share this craving with religious folk, who need to believe that their path to salvation is the One True Path.

But can you call a theory true if no one understands it?

https://www.scientificamerican.com/article/is-the-schroedinger-equation-true1/

Data Analytics and Sports. Game 6 of the 2020 World Series

The conflict over the proper role of data analytics has been an ongoing story in modern sports for many years now. The controversy only intensified with the way the final game of the baseball World Series played out. Lots of angry sports talking heads sounded off about how much they hated the Rays’ decision to pull their starting pitcher, and how much they hate the role analytics now play in sports. It’s an interesting contrast between what the “data says” and what your gut says.
 
This first link summarizes the situation, and the videos at the top of the linked page show a few of the angry talk-show hosts sounding off.
 
World Series 2020: Why the Tampa Bay Rays took Blake Snell out while he was mowing down the Los Angeles Dodgers

“Championships are not won by guidebooks”

Host Colin Cowherd makes some interesting points here about what he thinks are the appropriate times to approach the sport analytically.

Kevin Cash’s decision to pull Blake Snell, explained: How analytics overruled World Series context clues and cost the Rays

The long view of analytics points to reasons why pulling Snell was the right move, but it also ignores the individual context clues that Snell’s Game 6 domination provided.
Here are a few more resources on the topic; depending on your students and their interests, some may be more useful than others.

Sports closer to art than science

CHANGING THE GAME- The Rise of Sports Analytics

Are super-nerds really ruining US sports? (The Guardian)

 

How do we know that math is real, and who came up with it in the first place?

Around 300 B.C., the Greek mathematician Euclid famously tried to construct the principles of geometry starting with axioms—basic truths that are taken as too fundamental to prove. He then asked what conclusions must follow. This is how a mathematical theory is built, and logic tells us that a theory has to be true whenever the axioms are true.

https://www.wsj.com/articles/a-viral-video-asks-a-deep-question-11599757497

Here is another article that responded to the same video:

https://theconversation.com/is-mathematics-real-a-viral-tiktok-video-raises-a-legitimate-question-with-exciting-answers-145244

And one more:

This TikTok User Asked If Numbers Are Real, And Accidentally Started 2020’s Biggest Argument

Citizens need to know numbers

The episode demonstrates both the power and weakness of statistics: they can be used to amplify an entire worldview, and yet they often do not stand up to scrutiny. This is why statistical literacy is so important – in an age in which data plays an ever-more prominent role in society, the ability to spot ways in which numbers can be misused, and to be able to deconstruct claims based on statistics, should be a standard civic skill.

Statistics are not cold hard facts – as Nate Silver writes in The Signal and the Noise (2012): ‘The numbers have no way of speaking for themselves. We speak for them. We imbue them with meaning.’ Not only has someone used extensive judgment in choosing what to measure, how to define crucial ideas, and to analyse them, but the manner in which they are communicated can utterly change their emotional impact.

https://aeon.co/essays/good-citizenship-depends-on-basic-statistical-literacy

Modeling the impact of climate change on human migration: “Where Will Everyone Go?”

Extremely ambitious project from Propublica and the NY Times Magazine. What I really appreciate about this work is that it does not simply talk about alarmist conclusions and predictions but discusses at length its assumptions, methods, and different possible outcomes along the way. I think the TOK value here is less about climate change and its impacts and more about the ways in which we predict the future. What is the role of models? How do different disciplines build them? What is the value of interdisciplinary work? What are the limitations of these predictions?

In all, we fed more than 10 billion data points into our model. Then we tested the relationships in the model retroactively, checking where historical cause and effect could be empirically supported, to see if the model’s projections about the past match what really happened. Once the model was built and layered with both approaches — econometric and gravity — we looked at how people moved as global carbon concentrations increased in five different scenarios, which imagine various combinations of growth, trade and border control, among other factors. (These scenarios have become standard among climate scientists and economists in modeling different pathways of global socioeconomic development.)

The results are built around a number of assumptions about the relationships between real-world developments that haven’t all been scientifically validated. The model also assumes that complex relationships — say, how drought and political stability relate to each other — remain consistent and linear over time (when in reality we know the relationships will change, but not how). Many people will also be trapped by their circumstances, too poor or vulnerable to move, and the models have a difficult time accounting for them.

All this means that our model is far from definitive. But every one of the scenarios it produces points to a future in which climate change, currently a subtle disrupting influence, becomes a source of major disruption, increasingly driving the displacement of vast populations.
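As a toy illustration of the “gravity” approach the passage mentions: in such models, predicted flows scale with the sizes of the origin and destination and fall off with distance. Everything below (populations, distance, constants, the drought multiplier) is invented for illustration and is not ProPublica's actual model:

```python
def gravity_flow(pop_origin: float, pop_dest: float,
                 distance_km: float, k: float = 1e-6,
                 beta: float = 2.0) -> float:
    """Predicted flow between two places: proportional to both
    populations, decaying with distance to the power beta."""
    return k * pop_origin * pop_dest / distance_km ** beta

# A climate shock can be wired in as a simple multiplier on the
# origin's "push" -- exactly the kind of assumed-linear
# relationship the article flags as a limitation.
baseline = gravity_flow(1_000_000, 5_000_000, 800)
with_drought = baseline * 1.3
print(f"baseline flow: {baseline:.1f}, with drought push: {with_drought:.1f}")
```

Even this toy version makes the article's caveat concrete: the constants k, beta, and the 1.3 multiplier are judgment calls, and the model's outputs inherit whatever those judgments get wrong.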

https://features.propublica.org/climate-migration/model-how-climate-refugees-move-across-continents/

Related article:

The Truth about Scientific Models

They don’t necessarily try to predict what will happen—but they can help us understand possible futures

Scientists rely on models, which are simplified, mathematical representations of the real world. Models are approximations and omit details, but a good model will robustly output the quantities it was developed for.

Models do not always predict the future. This does not make them unscientific, but it makes them a target for science skeptics.

https://www.scientificamerican.com/article/the-truth-about-scientific-models/

Does the news reflect what we die from?

This is a page from the excellent site, “Our World in Data” which describes its mission as:

We believe that a key reason why we fail to achieve the progress we are capable of is that we do not make enough use of this existing research and data: the important knowledge is often stored in inaccessible databases, locked away behind paywalls and buried under jargon in academic papers.

The goal of our work is to make the knowledge on the big problems accessible and understandable. As we say on our homepage, Our World in Data is about “Research and data to make progress against the world’s largest problems.”

This helps us evaluate how we acquire knowledge about the world and how reliable it is. Specifically, this helps us explore the contrast between emotion and reason as ways of acquiring knowledge and gives us some specific materials around the utility of math and data in learning about the world.

Among many questions they ask is whether the news reflects what we die from and why it matters. Great data and visuals along with discussions of the key issues involved.

https://ourworldindata.org/does-the-news-reflect-what-we-die-from

[Chart: Over- and underrepresentation of deaths in media]

The short history of global living conditions and why it matters that we know it

From the same site an exploration of macro world trends regarding global living conditions and our misperceptions of them.

https://ourworldindata.org/a-history-of-global-living-conditions-in-5-charts

[Chart: Two centuries of the world as 100 people]

Here is a simple handout I made from the relevant text of the links above, along with the appropriate graphs and images. I’ve added nothing to these; I just cut and pasted what I thought was most relevant for a class. I haven’t used it yet.

Download Perception vs Truth

Related handouts:

Download Assessing Risk Handout

Here is the article that accompanies the handout above.

Download Ten Ways We Get the Odds Wrong

Some different questions that get at the same point.

Download Alternative Assessing Risk

Here is yet another interesting example from Swedish physician Hans Rosling. He came up with a simple quiz assessing people’s knowledge of human development statistics. Even development experts were woefully unaware. You can take the quiz here:

https://factfulnessquiz.com/ 

Why the ‘Unreasonable Effectiveness’ of Mathematics?

What is it about mathematics that it can describe so accurately the world around us? From quantum physics, the very smallest features and forces of the foundations of matter and energy, to cosmology, the very largest structures and forces of the beginning and evolution of the universe, mathematics is the language of description. Why does the physical world follow so faithfully equations of abstract symbols and variables?

https://youtu.be/uqGbn4b3LPM?t=144

 

Max Tegmark – Is Mathematics Invented or Discovered?

Mathematics describes the real world of atoms and acorns, stars and stairs, with remarkable precision. So is mathematics invented by humans just like chisels and hammers and pieces of music? Or is mathematics discovered—always out there, somewhere, like mysterious islands waiting to be found? Whatever mathematics is will help define reality itself.

https://www.youtube.com/watch?v=ybIxWQKZss8

You can find other great videos and interviews from this series linked on their website; they cover a wide set of topics.

https://www.closertotruth.com/about/content-guide