Implicit bias is a phrase for the idea that we’re all probably at least a little bit biased, even if we really don’t want to be.
It’s a term that describes what’s happening when, despite our best intentions and without our awareness, stereotypes and assumptions creep into our minds and affect our actions.
The phrase has been used lately regarding race, especially by those arguing that slight "microaggressions" are hurtful to racial minorities. The right has pushed back against implicit bias, charging that it's just another "woke" notion.
Besides politics, implicit bias slips into just about every crack and crevice of life, including areas like criminal justice. It also affects our thoughts about many other qualities such as age, gender, and ethnicity/national origin.
Yet 30 years of neuroscience and cognitive psychology research show that implicit bias influences the way we see and treat others, even when we're absolutely determined to be, and believe we are being, fair and objective. Think of it as "thoughts about people we didn't know we had."
Besides seeping into everything from education to health to policing, it affects how we treat people with facial differences. I've written many blogs about a related term, "lookism," and you can find just a few of 'em here, here, and here. The last of those is a good companion piece to this one.
Whence come implicit biases towards people with facial differences?
Rings of Power and Privilege
A recent viewing of The Rings of Power, the latest streaming installment of J.R.R. Tolkien's epic fantasy, brought a disquiet, one to which I was insensitive half a century ago when I first encountered the story. That was before I became a scientist.
Psychological and neural responses to facial anomalies
For better or worse, our looks matter. Psychologists have long documented a beauty-is-good stereotype: attractive people receive all sorts of unearned privileges in life. They are hired more readily, paid more, and punished more lightly for infractions than their less attractive counterparts.
We found an analogous facial-anomaly-is-bad bias, in which people who have scars, birthmarks, and developmental differences are viewed as being less intelligent, hard-working, trustworthy, and competent (Jamrozik et al., 2017).
Many people harbor such biases implicitly; our brains track them even if we don’t (Hartung et al., 2019). Based on our research, three brain structures are critical for these effects (Workman et al., 2021).
They include areas of visual cortex called the fusiform gyri, along with the left amygdala. Parts of the fusiform gyrus are tuned to processing faces, while the amygdala is implicated in processing emotional states, especially fear and anxiety, and in gauging their social relevance.
Neural activity in these areas correlated with the degree of implicit bias held by people enrolled in the study. The left amygdala activity was further modulated by individual differences in people’s dispositions.
Those who reported feeling less empathy towards others, and who also believe that the world is generally just (i.e., that people get what they deserve), showed a stronger amygdala response to faces with such differences.
A standard interpretation of such a seemingly wired-in-the-brain response is that we evolved a mechanism of pathogen sensitivity. Pathogens, whether in plants, animals, or humans, can cause morphological disfigurements.
Because we benefit from avoiding pathogens, humans developed unconscious, reflexive responses that steer us away from potential carriers, or from people whose appearance signals compromised health that leaves them susceptible to infection.
Is this pathogen sensitivity account the whole story?
Pathogen sensitivity presumably was coded into our brains during the Pleistocene epoch, when we lived in small bands of nomadic hunter-gatherers. If this avoidance mechanism were necessary to preserve health in small populations, we would expect a similar bias to be conserved in modern-day hunter-gatherers.
The Hadza are a hunter-gatherer people who live in small nomadic groups around Lake Eyasi in the central Rift Valley. Using digital methods, we created face images based on Hadza physiognomy and rendered scars on some of them. Conducting field research with the Hadza is nothing like enrolling college undergraduates in a computer-based experiment for course credit.
The Hadza in our study were shown pictures and asked whether they thought the person was a good hunter-gatherer and whether they thought the person had a good heart. These questions targeted judgments of competence and warmth, two factors important to how we form stereotypes.
Looking at the data, it first appeared that the Hadza also formed stereotypes of people with scars. On closer examination, the pattern of their responses was far more interesting. Some Hadza have minimal contact with the outside world.
The degree of contact can be assessed with questions like "Is the name Nelson Mandela familiar?" or "Do you know words in Swahili?" When we factored exposure to non-Hadza culture into the analysis, all hints of a stereotype vanished.
As best we can tell, biases against faces with anomalies were absent in this hunter-gatherer group, and they only start to emerge with outside cultural contact. We are teaching our brains the anomalous-is-bad bias.
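The logic of that analysis can be sketched in miniature. The following is a hypothetical illustration with simulated data, not the study's actual dataset or code: by construction, only raters "exposed" to outside culture penalize scarred faces, so a pooled comparison shows an apparent bias that disappears once you look only at low-exposure raters.

```python
import random

random.seed(0)

def rating(scarred, exposed):
    """Simulated warmth rating on a rough 1-5 scale, with noise.
    Only exposed raters apply a penalty to scarred faces (by construction)."""
    penalty = 1.0 if (scarred and exposed) else 0.0
    return 4.0 - penalty + random.gauss(0, 0.3)

# 50 raters exposed to outside culture, 50 with minimal contact
raters = [True] * 50 + [False] * 50

def mean_gap(group):
    """Average rating gap (unscarred minus scarred) within a group of raters."""
    unscarred = [rating(False, e) for e in group]
    scarred = [rating(True, e) for e in group]
    return sum(unscarred) / len(unscarred) - sum(scarred) / len(scarred)

pooled_gap = mean_gap(raters)                           # bias seems present
isolated_gap = mean_gap([e for e in raters if not e])   # bias disappears

print(f"pooled gap: {pooled_gap:.2f}")        # substantial
print(f"low-exposure gap: {isolated_gap:.2f}")  # near zero
```

The point of the sketch is only that a group-level effect can be an artifact of a covariate: pooling hides the fact that the bias lives entirely in the exposed subgroup, which is the shape of the result described above.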
In the video series Rings of Power, the elves are contrasted with the orcs. The elves are good, immortal, golden, and beautiful. The orcs are evil, undead, dark, and distorted.
Half a billion dollars were spent on the visual imagery depicting good and evil. Our popular media is replete with similar tropes. Just about every villain in the Bond, Marvel, and Star Wars universes is depicted with facial anomalies.
Our children are fed the same visual diet of good and evil in movies like Beauty and the Beast and The Lion King.
One need not bother with developing much of a backstory for why a character becomes bad. Their face tells the story.