
Why Do Stereotypes Get Such a Bad Rap?

Thinking about schemas, stereotypes, and prejudice in unconscious bias.

Key points

  • Schemas are cognitive constructs that act as mental frameworks that help us interpret and guide us through our (social) worlds.
  • Schemas permit us to make quick, split-second decisions without effort or cognitive burden.
  • In a novel situation where we lack an appropriate schema, we may over-generalise and rely on incomplete schema construction, that is, on stereotypes.

Recently, I posted about how many courses on unconscious bias fail to convey or emphasise contextual clarity within the training and that, without such clarity, such initiatives are unlikely to succeed. I received some interesting feedback on the discussion of schemas, along with questions about how they are linked with related biases and some of their negative outcomes. I continue that discussion in this post.

Schemas are cognitive constructs that act as mental frameworks that help us interpret and guide us through our (social) worlds. As I noted in that previous post, schemas are a good thing: they are evolutionarily advantageous, permitting us to make quick, split-second decisions without effort or cognitive burden. However, though intuitive judgment is quick, easy, and often right, we shouldn't rely on it when making important decisions; when it's wrong, it's often very wrong. When we care about our decisions, we should, as critical thinkers, engage in reflective judgment (Dwyer, 2017).

Schemas manifest in different ways: they provide protocols for acting in contextually appropriate ways and cues for what to anticipate; for how to use or operate objects and mechanisms; for how to engage people (whom to engage and for what purpose); and even for how we conceptualise notions about ourselves. Everything we encounter fits into some personalised schema, based, for example, on second-hand education or first-person experience.

If we are faced with a novel situation and don't have an appropriate schema, we often take a pre-existing one and adapt it to the novel scenario, person, place, or thing in an effort to construct a new one. Such adaptation implies incompleteness, and so we often fill in the gaps with extant, over-generalised information; what we wind up using is a type of schema known as a stereotype.

A stereotype, which is often associated with implicit or unconscious bias, is an outcome of incomplete schema construction. That is, in lieu of a thoroughly formed schema, we wind up relying on what little we know and/or what we think we know about the situation and using that as the basis for decision-making.

Stereotypes get a bad rap because they are generally only identified in real-world decision-making when the stereotype is incorrect and someone has misapplied a schema. If the stereotype-schema happens to be correct (for example, unknowingly matching empirical evidence), it's treated as factual, and no one's the wiser.

Contrary to the aforementioned bad rap, stereotypes are often correct and consistent with other intuition-based decision strategies; however, they can also be disastrously wrong, often due to over-generalisation. Why risk being disastrously wrong? Well, given our cognitive laziness (for example, see Kahneman, 2011), if the conclusion you draw isn't of great personal impact, why wouldn't you go with the quick, easy solution in an effort to stave off cognitive burden? We all do it many times a day. Why pretend that the person who uses stereotypes is a bad person?

Herein lies the issue: if the conclusion one draws is of importance or impact (for example, for one's self or another person), one should avoid using intuitive cognitive processes like schemas and stereotypes in isolation and should instead apply reflective judgment. Another reason for the negative connotation is that stereotypes can be resistant to new information; even when a person is provided with information that could enhance their schema construction for a particular topic, if they have "invested" in their stereotyped belief system, it's not easy to change that.

When a stereotype is resistant to new information, prejudice is more likely. As we've been discussing, the application of stereotypes is prejudice–another term that gets a bad rap–a judgment based on some thought or belief about the attributes of some person, group, place, situation, or thing.

To simplify, a stereotype is a piece of information (be it true or false), and prejudice is using that information to make a judgment, decision, or conclusion (without certainty about the truth of that information). Prejudice is akin to implicit bias in that the former refers to a pre-judgment and the latter refers to a pre-reflective attribution. Similar, right?

The common conception of prejudice is that of disliking a person or group based on some negative belief about them; again, prejudice has a negative connotation because of its associations with racism, sexism, and other forms of discrimination—to which prejudice can lead.

Though this phenomenon most certainly exists, it is not a complete description of the term. Prejudice can often have an affirmative slant; for example, we perceive more attractive people to be healthier, smarter, more successful, and so on (Dion et al., 1972). We pre-judge many things in our everyday lives. "That food looks disgusting." "I don't think I'll be able to hop that fence."

Your prejudice in these scenarios is useful–you don't want bad food, so you eat somewhere else; you don't want to fall, so you walk around the fence. Regardless of the positive or negative slant associated with a prejudice, that slant doesn't mean the prejudice is correct, and that is the main reason we should choose not to rely on prejudices when making important decisions.

Of course, because of incomplete schemas, prejudices often coincide with biases, which essentially boil down to a person's preferences (for example, in this pre-judged situation, I've assumed this outcome; and as I don't like this outcome, I will remove myself from the situation). We will always favour familiar things and situations, as well as ourselves or people like ourselves, consistent with in-group vs. out-group bias. Recognising that is key, and, in fairness, that's the point of most unconscious bias training courses.

It doesn't matter how woke you are: stereotypes and prejudice are cognitions we all process. What matters is whether or not you let these processes guide "important" decision-making and actions. If you do, there's an increased chance that you act, or will at some stage act, in a discriminatory fashion (that is, exhibit an unjustifiable negative behaviour toward a person, group, place, or thing, often in light of stereotyped or prejudicial thinking). The severity of such behaviours, of course, will vary–but that's another conversation altogether. Nevertheless, stereotypes and prejudices are not necessarily bad things; however, they can most certainly lead to bad things.

In my previous post on unconscious bias, I pointed out that the problem with such training is that oversimplifying the concepts can facilitate misinterpretation of the core messages. The goal of this follow-up has been to elaborate on some of the relevant key concepts and facilitate clarity. With that in mind, accounting for the context in which schemas, stereotypes, and prejudices operate is key to the decision-making process.

References

Dion, K., Berscheid, E., & Hatfield, E. (1972). What is beautiful is good. Journal of Personality and Social Psychology, 24, 285-290.

Lewandowsky, S., Cook, J., Ecker, U., Albarracin, D., Amazeen, M., Kendeou, P., Lombardi, D., Newman, E., Pennycook, G., Porter, E., Rand, D., Rapp, D., Reifler, J., Roozenbeek, J., Schmid, P., Seifert, C., Sinatra, G., Swire-Thompson, B., van der Linden, S., Vraga, E., Wood, T., & Zaragoza, M. (2020). The Debunking Handbook 2020.
