When you encounter a potential risk, your brain does a quick search for past experiences with it. If it can easily pull up multiple alarming memories, then your brain concludes the danger is high. But it often fails to assess whether those memories are truly representative.
A classic example is airplane crashes.
If two happen in quick succession, flying suddenly feels scarier — even if your conscious mind knows that those crashes are a statistical aberration with little bearing on the safety of your next flight.
People seem to believe that kids today engage in risky behavior at far greater rates than previous generations did, but the research shows the opposite is true. Teens today do drugs, drink alcohol, get pregnant, and smoke cigarettes at lower rates than teens have at any point in the past thirty years. Yet people don’t believe it. Why is that?
Part of the answer is media attention, which creates an availability bias. With 24-hour news coverage on multiple channels, plus social media driving news consumption, sensational stories stand out in our minds and cause us to misperceive actual trends.
This is tied to the same factors that make humans bad at judging risk in general: scary stories overwhelm us and lead us to believe things that aren’t true.
Below are some interesting resources that skip the sensationalism and present actual data on the subject.
Today’s Teens are more than alright
Today’s teens use less…than you did
The rapid decline in teen births is a huge public health success story
Teens doing better: Why don’t adults believe it?
“One reason people’s fears don’t line up with actual risks is that our brains are wired by evolution to make fast judgements which are not always backed up by logical reasoning. “Our emotions push us to make snap judgments that once were sensible—but may not be anymore,” Maia Szalavitz, a child psychiatrist, wrote in 2008 in Psychology Today.”
“Intuition can encourage opinions that are contrary to the facts.”
Strongest opponents of GM foods know the least but think they know the most
“The extremists are more poorly calibrated. If you don’t know much, it’s hard to assess how much you know,” Fernbach added. “The feeling of understanding that they have then stops them from learning the truth. Extremism can be perverse in that way.”
The finding has echoes of the Dunning-Kruger effect, the observation from social psychology that incompetence prevents the incompetent from recognising their incompetence.
“The basic problem is this: The human brain evolved so that we systematically misjudge risks and how to respond to them. Our visceral fear of terrorism has repeatedly led us to adopt policies that are expensive and counterproductive, such as the invasion of Iraq.”
“As for the headline claim that half of all children will be autistic by 2025, this claim blithely ignores the broad consensus that the increasing prevalence of autism is largely due to increasing rates of diagnosis and – as a new study has recently demonstrated – changes in how autism is diagnosed. The baseless assumption that rates of autism diagnosis will continue into the stratosphere is dumbfounding.”
“For as often as we hear about murders, the suicide rate is actually much higher than the homicide rate. Nearly a third more people die at their own hands than at other people’s (the murder rate in America is about 6 per 100,000; for suicides it’s about 10.8).”
What determines how much we fear something? Is it based on the actual risks posed? Or do our emotions lead us to fear the wrong things and weigh risks differently than we should?
An interesting piece comparing the relative risks of swimming pools and guns, and how much we fear each.
This is a clever program that analyzes every 4th down play in every professional football game. Based on mathematical expected value, it determines whether teams should go for it, punt, or kick a field goal, and it breaks down the math behind each decision. What’s interesting is how often the mathematically optimal calls are not the ones followed by the people on the field. Who is right in a case like this? What happens when the “common sense” approach is different from the mathematically “true” approach?
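The expected-value logic behind a decision like this can be sketched in a few lines. To be clear, the probabilities and point values below are made-up placeholders for illustration, not numbers from the actual program or from real NFL data:

```python
# Sketch of an expected-value comparison for a 4th-down decision.
# All probabilities and point values are hypothetical placeholders,
# chosen only to show how the three options get compared.

def expected_points(success_prob, success_value, failure_value):
    """EV = P(success) * value(success) + P(failure) * value(failure)."""
    return success_prob * success_value + (1 - success_prob) * failure_value

# Hypothetical 4th-and-2 near midfield:
go_for_it  = expected_points(0.60, 2.5, -1.5)  # convert and keep the drive, or turn it over
punt       = expected_points(0.95, 0.5, -0.5)  # pin the opponent deep, or shank the kick
field_goal = expected_points(0.40, 3.0, -1.5)  # a long attempt: 3 points, or a short field for them

best = max([("go for it", go_for_it), ("punt", punt), ("field goal", field_goal)],
           key=lambda option: option[1])
print(f"go for it: {go_for_it:.2f}, punt: {punt:.2f}, field goal: {field_goal:.2f}")
print(f"highest expected value: {best[0]}")
```

With these invented inputs the aggressive choice wins, which mirrors the article’s point: the math often disagrees with the conservative call coaches actually make on the field.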