Why must educators explore implicit bias? | Berry Street

One year ago, I moved to Australia to become a Senior Trainer with the Berry Street Education Model. As an American citizen, now an Australian resident, and a former New York City public school educator, I have been closely following the recent Black Lives Matter events in both Australia and the United States. Because this movement has been covered prominently in world news, I’ve had many conversations with Australians who have expressed shock and disbelief that racism is still one of America’s biggest battles. Interestingly though, I can see that Australia has its own story when it comes to the ongoing prevalence of racism. The ways both countries have historically treated, and currently treat, people of colour significantly impact the young people with whom we work and, as such, this is a critical subject to address.

Educators have the unique power to create safe spaces where social change is possible. If we ignore the societal issue of racism in schools, we are ignoring our young people of colour, who don’t have the privilege of turning a blind eye. That said, we must first reflect on our own understandings before we can mindfully begin these crucial conversations. This blog series will provide tools for educators to facilitate this work. This first post explores a subtle and unconscious form of racism known as implicit bias.

What is Implicit Bias?

Implicit bias refers to “the attitudes or stereotypes that affect our understanding, actions, and decisions in an unconscious manner” (The Ohio State University’s Kirwan Institute for the Study of Race and Ethnicity [Kirwan Institute], 2015). Implicit biases can be favourable or unfavourable and become deeply, involuntarily, and unknowingly ingrained in our subconscious. Once lodged, these biases can influence our behaviour towards members of particular social groups, our interactions with them, and the decisions we make, even though we remain oblivious to their influence. As a result, these biases can significantly affect our work with our students, colleagues, and peers.

Anchors and implicit bias

Implicit biases begin to form at a very young age from exposure to direct and indirect messages from family, carers, media, early life experiences, and society in general. Our brains absorb these messages and save them for later, when we need to make predictions and snap judgements, in a process called “anchoring” (Tversky & Kahneman, 1974). During anchoring, our minds start from whatever information is immediately available as a reference point, then adjust as we collect more.

Banaji and Greenwald (2016) describe why we create these anchors. They posit that our ‘survival brains’ have evolved to pay special attention to others of our kind (humans) and to predict what might be going on in their minds. Research further suggests that specific brain regions are active when we make these predictions. More precisely, two different clusters of neurons are engaged, and which cluster is activated depends on how much we identify with the person or group we are thinking about. In other words, during this process our brains are unconsciously drawing on different “anchors” when thinking about different people and groups.

The concern, then, is that once anchors have formed, our minds don’t always perceive things as they actually are. This can easily cause us to make errors in judgement. These errors, or “mindbugs” (Banaji & Greenwald, 2016), are ingrained habits of thought in how we perceive, remember, reason, and make decisions. Banaji and Greenwald (2016) illustrate how this works by instructing us to look at a picture of any two strangers and ask ourselves:

  • Which one of these people seems more trustworthy?
  • Which will be more competent on the job?
  • Which is likely to dominate the other?

How we answer these questions depends on the subconscious anchoring we’ve done. Banaji and Greenwald further explain how easy it is to make judgements from a single static image, and how much more work it takes to avoid making them, even though they may be quite wrong. Additionally, the Kirwan Institute (2015) has gathered research showing that we generally hold implicit biases that favour our own ingroups (those who share similarities with us). So, while this kind of judgement was once a survival skill, it may now lead us to form and hold biases against other groups that are not only inaccurate but also harmful.

What can we do about our implicit biases?

The good news is that our implicit biases, like our brains, are malleable. As the Kirwan Institute (2015) explains, the implicit associations that we’ve formed can be gradually unlearned through a variety of debiasing techniques. The next post in this series will explore how we can identify and engage with our implicit biases to make positive changes in our thinking. In the meantime, reflect on the three questions from Banaji and Greenwald above, or mindfully bring awareness to the messages you are receiving in the media, and how these messages are adding to the anchors your brain already holds.

Banaji, M. R., & Greenwald, A. G. (2016). Blindspot: Hidden biases of good people. Bantam.

Staats, C., Capatosto, K., Wright, R. A., & Contractor, D. (2015). State of the science: Implicit bias review 2015 (Vol. 3). Kirwan Institute for the Study of Race and Ethnicity, The Ohio State University.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124-1131.

Joanne Olsen

Senior Trainer, Berry Street Education Model

Master of Science (English Education) | Bachelor of Arts (English) | Certificate in Positive Education