“The rule is straightforward, but its implications are subtle. If journalists are encouraged to report extreme events, they guide both elite and public attitudes, leading many people, including experts, to feel like extreme events are more common than they actually are. By reporting on only the radically novel, the press can feed a popular illusion that the world is more terrible than it actually is.”
“Even in an industry where minority workers sometimes appear to be favored for highly desirable jobs,” the two concluded, “employers may still fall prey to symbolic discrimination, relying on deeply embedded stereotypes about minority groups during the interview process.”
“Intuition can encourage opinions that are contrary to the facts.”
Strongest opponents of GM foods know the least but think they know the most
“The extremists are more poorly calibrated. If you don’t know much, it’s hard to assess how much you know,” Fernbach added. “The feeling of understanding that they have then stops them from learning the truth. Extremism can be perverse in that way.”
The finding has echoes of the Dunning-Kruger effect, the observation from social psychology that incompetence prevents the incompetent from recognising their incompetence.
Interesting article about how we acquire and spread information: how we close ourselves off to voices we disagree with, and how the frequency with which information is shared is not necessarily validation of its truthfulness.
“The problem is that social media is also a great way to spread misinformation, too. Millions of Americans shape their ideas on complex and controversial scientific questions – things like personal genetic testing, genetically modified foods and their use of antibiotics – based on what they see on social media. Even many traditional news organizations and media outlets report incomplete aspects of scientific studies, or misinterpret the findings and highlight unusual claims. Once these items enter into the social media echo chamber, they’re amplified. The facts become lost in the shuffle of competing information, limited attention or both.”
We’re not born with racial prejudices. We may never even have been “taught” them. Rather, explains Nosek, prejudice draws on “many of the same tools that help our minds figure out what’s good and what’s bad.” In evolutionary terms, it’s efficient to quickly classify a grizzly bear as “dangerous.” The trouble comes when the brain uses similar processes to form negative views about groups of people.
Though it’s easy to pick on Donald Trump and his supporters, this cognitive bias is evident in humans generally and shows up in all sorts of situations. Below is one article, followed by an amusing video mocking Bernie Sanders supporters.
“Graves’s article examined the puzzle of why nearly one-third of U.S. parents believe that childhood vaccines cause autism, despite overwhelming medical evidence that there’s no such link. In such cases, he noted, “arguing the facts doesn’t help — in fact, it makes the situation worse.” The reason is that people tend to accept arguments that confirm their views and discount facts that challenge what they believe.”
Interesting set of videos showing the limitations of what we can learn from body cameras on police officers. It also raises questions about how our prior knowledge, expectations, and experiences shape what we see when we interpret a given situation.
“This confirms what Professor Stoughton has found in his own presentations with judges, lawyers and students: What we see in police video footage tends to be shaped by what we already believe.
“‘Our interpretation of video is just as subject to cognitive biases as our interpretation of things we see live,’ Professor Stoughton said. ‘People disagree about policing and will continue to disagree about exactly what a video shows.’
“Race can also play a role. While Professor Stoughton’s work did not seek to determine how the race of the driver affected viewers’ conclusions, numerous studies have shown that some sort of conscious or unconscious bias is present in all of us, including law enforcement.”
Why smart people sometimes do dumb things
“No doubt you know several folks with perfectly respectable IQs who repeatedly make poor decisions. The behavior of such people tells us that we are missing something important by treating intelligence as if it encompassed all cognitive abilities. I coined the term ‘dysrationalia’ (analogous to ‘dyslexia’), meaning the inability to think and behave rationally despite having adequate intelligence, to draw attention to a large domain of cognitive life that intelligence tests fail to assess. Although most people recognize that IQ tests do not measure every important mental faculty, we behave as if they do. We have an implicit assumption that intelligence and rationality go together—or else why would we be so surprised when smart people do foolish things?”