Practice Analytically, Perform Intuitively

Seeing the errors in how people intuitively think about the golf swing made Bryson question how other parts of the game were played. Having majored in physics at college, he operates like a scientist. He subscribes to Charles Dickens’ famous line from Great Expectations: “Take nothing on its looks; take everything on evidence. There’s no better rule.”…

Trusting empirical data over intuition was one of the defining ideas of the Enlightenment. Through paradigm shifts like the Copernican Revolution, which found that humans weren’t the center of the universe, people began trusting instruments over their senses. That isn’t to say that science is always correct, but ever since the Enlightenment, it’s been obviously foolish to ignore it. Yet, that’s exactly what golfers did—for decades…

Like aspects of Bryson’s swing, some of the computer’s most effective chess moves are ugly to the human eye because they violate our intuition for what a good chess move looks like. But if you spend enough time watching the computer move, you can incorporate those tactics into your intuitive game and become a stronger player. Intuition isn’t as static as we think. With the right tools, it can improve over time.

https://perell.com/essay/practice-analytically-perform-intuitively/

Is the Schrödinger Equation True? Just because a mathematical formula works does not mean it reflects reality

How real are the equations with which we represent nature?


Physicists’ theories work. They predict the arc of planets and the flutter of electrons, and they have spawned smartphones, H-bombs and—well, what more do we need? But scientists, and especially physicists, aren’t just seeking practical advances. They’re after Truth. They want to believe that their theories are correct—exclusively correct—representations of nature. Physicists share this craving with religious folk, who need to believe that their path to salvation is the One True Path.

But can you call a theory true if no one understands it?

https://www.scientificamerican.com/article/is-the-schroedinger-equation-true1/

Data Analytics and Sports: Game 6 of the 2020 World Series

The conflict over the proper role of data analytics has been an ongoing story in modern sports for years now. The controversy only intensified with the way the final game of the 2020 baseball World Series played out. Plenty of angry sports talking heads sounded off about how much they hated the Rays’ decision to pull their starting pitcher, and about the role analytics now plays in sports. It’s an interesting contrast between what the “data says” and what your gut says.
 
This first link summarizes the situation, and the videos at the top of the linked page feature a few of those talk show hosts sounding off.
 
World Series 2020: Why the Tampa Bay Rays took Blake Snell out while he was mowing down the Los Angeles Dodgers
 
 
 
 
“Championships are not won by guidebooks”
 
Host Colin Cowherd makes some interesting points here about what he thinks are the appropriate times to approach the sport analytically.
 

 
 
Kevin Cash’s decision to pull Blake Snell, explained: How analytics overruled World Series context clues and cost the Rays
 
The long view of analytics points to reasons why pulling Snell was the right move, but it also ignores the individual context clues that Snell’s Game 6 domination provided.
 
 

Here are a few more resources on the topic. Depending on your students and their interests, you may want to share one or more of these.

Sports closer to art than science

CHANGING THE GAME: The Rise of Sports Analytics

Are super-nerds really ruining US sports? | Sport | The Guardian

 

Making people aware of their implicit biases doesn’t usually change minds. But here’s what does work

Click here for other articles tagged “Implicit Bias”

Do the diversity or implicit bias training programs used by companies and institutions like Starbucks and the Oakland Police Department help reduce bias?

I’m at the moment very skeptical about most of what’s offered under the label of implicit bias training, because the methods being used have not been tested scientifically to indicate that they are effective. And they’re using it without trying to assess whether the training they do is achieving the desired results.

I see most implicit bias training as window dressing that looks good both internally to an organization and externally, as if you’re concerned and trying to do something. But it can be deployed without actually achieving anything, which makes it in fact counterproductive. After 10 years of doing this stuff and nobody reporting data, I think the logical conclusion is that if it was working, we would have heard about it.

https://www.pbs.org/newshour/nation/making-people-aware-of-their-implicit-biases-doesnt-usually-change-minds-but-heres-what-does-work

Plane travel only feels like it’s dangerous


Psychologists have long acknowledged that humans tend to “fear the rare”. Some 160,000 Americans die of heart disease each year, but few are paralyzed by the fear of clogged arteries. Instead, statistically unlikely events like earthquakes have an unreasonable grip on our imagination. While plane crash coverage may not give you a new phobia, experts regularly postulate that extensive news coverage of plane crashes may distort our perception of risk.

https://www.popsci.com/plane-risk-safest-travel/

Here’s another great chart


Coronavirus ‘Hits All the Hot Buttons’ for How We Misjudge Risk

When you encounter a potential risk, your brain does a quick search for past experiences with it. If it can easily pull up multiple alarming memories, then your brain concludes the danger is high. But it often fails to assess whether those memories are truly representative.

A classic example is airplane crashes.

If two happen in quick succession, flying suddenly feels scarier — even if your conscious mind knows that those crashes are a statistical aberration with little bearing on the safety of your next flight.

Art and reality: How accurately does “Euphoria” portray real teens’ lives? Does it matter?

The central point here is that the show Euphoria inaccurately portrays teenagers’ lives, which raises the question: Is there a responsibility that comes with creating artwork? Must it be accurate? Who decides?

The claim that the show is inaccurate is backed up with statistics, which raises the question: How can math/statistics help us acquire knowledge (or understand reality)?

People’s perceptions of teens’ behaviors seem to be generally inaccurate beyond what this show portrays. If presented with this article and the relevant statistics, would people change their minds or their perceptions of these issues? I’m not sure they would, which leads us to the question: What is the role of intuition in acquiring knowledge? Can mathematical knowledge overcome intuitive beliefs?

This reminded me of an earlier article from the New York Times:

“The Kids Are More Than All Right”

https://well.blogs.nytimes.com/2012/02/02/the-kids-are-more-than-all-right/

Does mentoring “at risk” youth do more harm than good? The fascinating “Cambridge-Somerville Youth Study”

The “Cambridge-Somerville Youth Study” was a fascinating exercise in constructing knowledge in the human sciences, but more importantly, in using scientific methods to reach conclusions that seem completely counterintuitive: that mentorship programs can do more harm than not intervening at all in the lives of children considered at risk. The Freakonomics episode linked below gets into great detail about this.

Freakonomics Podcast: When Helping Hurts

Jump ahead to the 6 minute mark to hear about the “Cambridge-Somerville Youth Study”

http://freakonomics.com/podcast/when-helping-hurts/

Charities aren’t doing enough to determine if they’re really making a difference

First do no harm. It’s a basic tenet of medicine. When intervening in people’s lives – even with good intentions – we need to check whether we are doing them any damage. But sadly, this key principle from the medical profession has not been taken to heart by charities.

https://theconversation.com/charities-arent-doing-enough-to-determine-if-theyre-really-making-a-difference-95110

Similar to an older post “How do we measure the effectiveness of charitable giving?”

https://toktopics.com/2015/02/22/how-can-we-measure-the-effectiveness-in-charitable-giving/

 

The “Smirk seen ’round the world”

Interesting situation from a TOK perspective. Below is a collection of articles about the topic. They raise a lot of interesting questions about how we acquire knowledge and the relationships among the various ways of knowing. It also lends itself to asking about the primacy of some WOKs over others.

 

Download Lesson plan on “the smirk”

Download smirk articles handout

TOK Day 31 (daily student worksheet)

What’s also interesting is how impactful the image was. The image seemed to be a perfect representation of how many people view the current moment in the United States. It fit perfectly into prior assumptions about the world and spoke to a deeper truth. Interpreting and explaining this image, and fitting it into preexisting mental schemas, seemed pretty easy.

Once more and more videos started to emerge and the greater context became known, there were some interesting developments. Some people …