Do the diversity or implicit bias training programs used by companies and institutions like Starbucks and the Oakland Police Department help reduce bias?
At the moment I’m very skeptical about most of what’s offered under the label of implicit bias training, because the methods being used have not been scientifically tested to show that they are effective. And organizations are using the training without trying to assess whether it is achieving the desired results.
I see most implicit bias training as window dressing that looks good both internally to an organization and externally, as if you’re concerned and trying to do something. But it can be deployed without actually achieving anything, which in fact makes it counterproductive. After 10 years of this kind of training with nobody reporting data, I think the logical conclusion is that if it were working, we would have heard about it.
Below is a link to the first in a series of New York Times videos examining the subject. It is related to the idea of intuition and how we acquire and process knowledge and information.
“Even in an industry where minority workers sometimes appear to be favored for highly desirable jobs,” the two concluded, “employers may still fall prey to symbolic discrimination, relying on deeply embedded stereotypes about minority groups during the interview process.”
We’re not born with racial prejudices. We may never even have been “taught” them. Rather, explains Nosek, prejudice draws on “many of the same tools that help our minds figure out what’s good and what’s bad.” In evolutionary terms, it’s efficient to quickly classify a grizzly bear as “dangerous.” The trouble comes when the brain uses similar processes to form negative views about groups of people.
An interesting set of videos that shows the limitations of what we can learn from body cameras on police officers. It also raises questions about how our prior knowledge, expectations, and experiences shape what we see when we interpret a given situation.
“This confirms what Professor Stoughton has found in his own presentations with judges, lawyers and students: What we see in police video footage tends to be shaped by what we already believe.
“‘Our interpretation of video is just as subject to cognitive biases as our interpretation of things we see live,’ Professor Stoughton said. ‘People disagree about policing and will continue to disagree about exactly what a video shows.’
“Race can also play a role. While Professor Stoughton’s work did not seek to determine how the race of the driver affected viewers’ conclusions, numerous studies have shown that some sort of conscious or unconscious bias is present in all of us, including law enforcement.”
“Racist stereotypes, at their root, come from quite a fundamental learning mechanism. Humans are able to learn and adapt so quickly because they are excellent at making generalisations about the world based on very limited experience. Take dogs, for example – a toddler might reasonably conclude after meeting just two or three that all dogs are furry, bark and have tails that should be treated with some caution.”
“This elegant experiment follows in a tradition of audit testing, in which social scientists have sent testers of different races to, for example, bargain over the price of new cars or old baseball cards. But the Australian study is the first, to my knowledge, to focus on discretionary accommodations. It’s less likely these days to find people in positions of authority, even at lower levels of decision making, consciously denying minorities rights. But it is easier to imagine decision makers, like the bus drivers, granting extra privileges and accommodations to nonminorities. Discriminatory gifts are more likely than discriminatory denials.”
A really amazing two-part podcast about policing in the United States. Across its segments, we hear from police departments and officers around the country about how they’re dealing with the challenges they face. What’s fascinating is the role of perspective: how different experiences affect the way people see the same situation. Part 2, Act 2 discusses the implicit association test and what one police department is doing to deal with implicit bias while policing. Part 2’s Prologue is a short, interesting segment about a reporter watching the Eric Garner video with a friend who is a police officer; the two of them see completely different things and interpret the video in very different ways.
Below are links for the full episodes.
“Most white Americans demonstrate bias against blacks, even if they’re not aware of or able to control it. It’s a surprisingly little-discussed factor in the anguishing debates over race and law enforcement that followed the shootings of unarmed black men by white police officers. Such implicit biases — which, if they were to influence split-second law enforcement decisions, could have life or death consequences — are measured by psychological tests, most prominently the computerized Implicit Association Test, which has been taken by over two million people online at the website Project Implicit.”
“You think of yourself as a person who strives to be unprejudiced, but you can’t control these split-second reactions. As the milliseconds are being tallied up, you know the tale they’ll tell: When negative words and black faces are paired together, you’re a better, faster categorizer. Which suggests that racially biased messages from the culture around you have shaped the very wiring of your brain.”