Domain Generality vs. Specificity

Another critical thinking debate.

Posted Nov 27, 2017

Critical thinking is purposeful, self-regulatory reflective judgment that, through specific skills (i.e. analysis, evaluation, and inference) and dispositions (e.g. open-mindedness, inquisitiveness, and perseverance), increases the chances of producing a logical conclusion to an argument or solution to a problem (Dwyer, 2017; Dwyer et al., 2016; Dwyer, Hogan & Stewart, 2014). Though many definitions and descriptions of critical thinking (CT) have been conceptualized over the years, they all generally point to these components and outcomes. Critical thinking is also often conceptualized as domain-general, in that it can be applied to any topic in any field. For example, the five "real-world" applications of CT (argumentation, verbal reasoning, hypothesis testing, problem-solving, and judging likelihood and uncertainty) are quite broad in their own right. However, debate over the domain-generality of CT remains.

Over the years, it has occurred to me that the primary argument for the domain-specificity of CT (i.e. CT ability conceptualized as being specific to a particular area or areas) stems from the simple observation that many people are better at thinking critically about some topics than about others. Even research conducted by my colleagues and me suggests the possibility of domain-specificity (Dwyer, Boswell & Elliott, 2015). For example, our research found that though CT is domain-general in theory, CT in domain-specific scenarios is facilitated by matching domain-specific expertise: although there was no statistically significant difference between those with and without business experience on domain-general CT, those with business experience scored significantly higher on a business-related CT assessment than those without. This was further elaborated by the finding that those with business expertise (i.e. five or more years working in a business-related field) scored significantly higher on business-related CT than novices (i.e. less than five years working in a business-related field), as well as those without any business experience.

To a certain extent, this should come as no surprise, given that research indicates that efforts to promote knowledge construction through training can aid in the reduction of cognitive load (Pollock, Chandler & Sweller, 2002; van Merriënboer, Kirschner & Kester, 2003). That is, those who are trained in a specific domain are given the opportunity to develop expertise in that domain through the construction of relevant knowledge during training (Chi, Glaser & Rees, 1982; Kotovsky, Hayes & Simon, 1985); and thus, they are better equipped to assimilate complex information than those who do not possess the relevant pre-existing knowledge (Pollock, Chandler & Sweller, 2002; Sweller, 2010). Simply put, those who have sufficient knowledge in a particular area will be less burdened by the cognitive processes associated with CT than those who must also actively search for relevant information.

This perspective is further supported by research on post-Piagetian frameworks of cognitive development. For example, according to Kurt Fischer’s (1980) Dynamic Skill Theory, in which the individual’s knowledge is conceptualized as either concrete (i.e. facts and procedures) or abstract (i.e. concepts and principles), skill development is often domain-specific. That is, skills develop independently of one another and at different rates, and different skills draw upon different knowledge.

It might seem that I am arguing for the domain-specific perspective of CT; however, there are two important issues that require consideration before we bury the notion of domain-generality. The first is what I refer to as the Experience-Expertise Dilemma. For context, the cognitive biases and heuristics that stem from reliance on intuitive judgment (i.e. the antithesis of CT) are influenced by the individual’s familiarity with a topic and the topic’s salience (Tversky & Kahneman, 1974); and, in turn, they influence the manner in which individuals analyse, evaluate, and infer conclusions and judgments. Interestingly, the use of these heuristics and biases is not limited to naïve subjects; experienced individuals have been found to make similar errors in judgment, which perhaps explains why individuals in business and finance often fall prey to erroneous intuitive judgments. Moreover, experience has often been observed to be unrelated to judgment accuracy, and is sometimes negatively correlated with accuracy (Goldberg, 1990; Hammond, 1996; Kahneman, 2011; Stewart, Heideman, Moninger, & Reagan-Cirincione, 1992), perhaps as a result of overconfidence (Kahneman, 2011) or of experience in doing the wrong thing (Hammond, 1996). However, individuals who are expert in a particular domain tend to use logic rather than intuition (Kahneman & Frederick, 2002) and tend to avoid making the elementary errors, such as the gambler’s fallacy, that inexperienced people tend to make. Consistent with this perspective, research has found that individuals with expertise in a particular field perform better on problem-solving, informal reasoning and CT tasks specific to that field (Cheung, Rudowicz, Kwan, & Yue, 2002; Chiesi, Spilich, & Voss, 1979; Graham & Donaldson, 1999; Voss, Blais, Means, Greene, & Ahwesh, 1986), perhaps because greater domain-specific knowledge enables them to better evaluate the strengths and weaknesses of a given perspective.

Though research indicates that the benefits of expertise in this context help support the domain-specificity argument, the caveat to this argument is that expertise is not particularly easy to obtain. In addition, it would need to be the relevant expertise, specific to the particular domain. Thus, if expertise (and the right expertise at that) were necessary to be a good critical thinker, then it seems reasonable to suggest that there are genuinely few good critical thinkers. What seems more likely, however, is that CT may be conducted at different levels of ability; for example, an individual may be a good critical thinker with respect to some domains, but a much better critical thinker in a specific domain, provided they are expert in it.

The second major consideration is that of explicit CT training. Individuals who are expert in a particular field enjoy the luxury of having knowledge relevant to that field. However, the manner in which they were taught this information will also affect how well they analyze, evaluate, or infer conclusions about it. Take the topic of literature, for example. Let’s say a literature teacher "immerses" CT into her instruction. She will present author names and publication dates for rote learning, but might also instruct her class in how to analyze and evaluate plots, characters, and settings; and subsequently, how to infer themes, motifs, stylistic choices and perhaps more personal motivations. On the other hand, a history teacher in the same school, with the same class, might choose to teach using lists of facts, dates and general descriptions of events, with no need for further consideration. Students may thus become more proficient at thinking critically about literature than about history because of the way in which they were taught. However, even though these students have been taught CT skills through their literature class, they may not readily be able to transfer these skills to other domains, given that the skills were only introduced in the context of literature, not explicitly as something that could be applied in a more general manner.

The perspective in this example is supported by research on CT instructional approaches. Specifically, Ennis (1989) developed a framework of CT instructional methodologies, including general, infusion, immersion, and mixed approaches, to better elucidate how CT can be taught and learned. In the general approach to CT training, CT skills and dispositions themselves "are learning objectives, without specific subject matter content" (Abrami et al., 2008, p. 1105). The infusion of CT into a course requires specific subject matter content upon which CT skills are practiced, and the objective of teaching CT within the course content is made explicit. The immersion approach, like the infusion approach, requires specific course content upon which CT skills are practiced; however, CT objectives are not made explicit. Finally, in the mixed approach, CT is taught as a separate thread, independent of the specific subject matter content of the course, in combination with either the infusion or immersion approach. Comparing the four CT course types, results from Abrami et al.’s (2008) meta-analysis revealed that courses using the mixed approach had the largest effect on CT performance (g+ = 0.94), followed by the infusion approach (g+ = 0.54), the general approach (g+ = 0.38) and the immersion approach (g+ = 0.09). It is important to note that the immersion approach, which had the smallest effect, is the only approach that does not make CT objectives explicit to students. This finding indicates that making CT objectives and requirements clear to students is vital to any course design aimed at increasing CT ability (Abrami et al., 2008).

What makes these findings of utmost importance in the context of the current debate is that the domain-general nature of CT application may not be fully realized until the individuals in question have been explicitly trained in CT. This creates a chicken-or-egg style debate: it would be unfair to expect individuals to think critically about something if they have never been taught how to think critically, even though they may already think critically within a specific field without knowing that they do! However, in order to see CT develop in action, explicit CT training is necessary. I repeat: explicit CT training is necessary if educators want to see CT improve and flourish across domains.

Similar to the argument related to cognitive load above, a vast body of research (e.g. Gadzella, Ginther & Bryant, 1996; Hitchcock, 2004; Reed & Kromrey, 2001; Rimiene, 2002; Solon, 2007) indicates that training in CT yields better CT performance than performance prior to training, regardless of domain. Moreover, lacking specialist knowledge in a particular domain does not restrict one from thinking critically about that topic (as is commonly argued in favor of a domain-specific conceptualization). Of course, individuals lacking such knowledge have an intellectual responsibility to make efforts to learn more about that domain (particularly if they’re tasked with thinking critically about it), but this is not a hindrance per se; rather, the accompanying uncertainty is invaluable when assessing one’s approach to making important decisions (e.g. it may decrease the potential for unwarranted overconfidence in a particular topic, such as that seen when an individual is experienced but not expert). The importance of this uncertainty, and its association with Reflective Judgment, will be the focus of my next blog post.

References

Abrami, P. C., Bernard, R. M., Borokhovski, E., Wade, A., Surkes, M. A., Tamim, R., & Zhang, D. (2008). Instructional interventions affecting critical thinking skills and dispositions: A stage 1 meta-analysis. Review of Educational Research, 78, 4, 1102–1134.

Chi, M. T. H., Glaser, R., & Rees, E. (1982). Expertise in problem solving. In R. J. Sternberg (Ed.), Advances in the psychology of human intelligence (pp. 7–77). Hillsdale, NJ: Erlbaum.

Chiesi, H. L., Spilich, G. J., & Voss, J. F. (1979). Acquisition of domain-related information in relation to high and low domain knowledge. Journal of Verbal Learning and Verbal Behavior, 18, 257–273.

Cheung, C., Rudowicz, E., Kwan, A. S. F., & Yue, X. D. (2002). Assessing university students’ general and specific critical thinking. College Student Journal, 36, 504–522.

Dwyer, C.P. (2017). Critical thinking: Conceptual perspectives and practical guidelines. Cambridge, UK: Cambridge University Press.

Dwyer, C.P., Boswell, A. & Elliott, M.A. (2015). An evaluation of critical thinking competencies in business settings. Journal of Education for Business, 90, 5, 260–269.

Dwyer, C.P., Hogan, M.J., Harney, O.M. & Kavanagh, C. (2016). Facilitating a student-educator conceptual model of dispositions towards critical thinking through interactive management. Educational Technology Research and Development, 65, 1, 47–73.

Dwyer, C.P., Hogan, M.J. & Stewart, I. (2014). An integrated critical thinking framework for the 21st century. Thinking Skills and Creativity, 12, 43–52.

Ennis, R. H. (1989). Critical thinking and subject specificity: Clarification and needed research. Educational Researcher, 18, 4–10.

Fischer, K. W. (1980). A theory of cognitive development: The control and construction of hierarchies of skills. Psychological Review, 87, 477–531.

Gadzella, B. M., Ginther, D. W., & Bryant, G. W. (1996). Teaching and learning critical thinking skills. Paper presented at the 26th International Congress of Psychology, Montreal, August 19.

Goldberg, M. (1990). A quasi-experiment assessing the effectiveness of TV advertising directed to children. Journal of Marketing Research, 27, 445–454.

Graham, S., & Donaldson, J. F. (1999). Adult students’ academic and intellectual development in college. Adult Education Quarterly, 49, 147–161.

Hammond, K. R. (1996). Upon reflection. Thinking & Reasoning, 2, 2–3, 239–248.

Hitchcock, D. (2004). The effectiveness of computer-assisted instruction in critical thinking. Informal Logic, 24, 3, 183–218.

Kahneman, D. (2011). Thinking, fast and slow. Great Britain: Penguin.

Kahneman, D. & Frederick, S. (2002). Representativeness revisited: Attribute substitution in intuitive judgment. In T. Gilovich, D. Griffin, & D. Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment (pp. 49–81). New York: Cambridge University Press.

Kotovsky, K., Hayes, J. R., & Simon, H. A. (1985). Why are some problems hard? Evidence from the Tower of Hanoi. Cognitive Psychology, 17, 2, 248–294.

Pollock, E., Chandler, P., & Sweller, J. (2002). Assimilating complex information. Learning & Instruction, 12, 61–86.

Reed, J. H. & Kromrey, J. D. (2001). Teaching critical thinking in a community college history course: Empirical evidence from infusing Paul’s model. College Student Journal, 35, 2, 201–215.

Rimiene, V. (2002). Assessing and developing students’ critical thinking. Psychology Learning & Teaching, 2, 1, 17–22.

Solon, T. (2007). Generic critical thinking infusion and course content learning in Introductory Psychology. Journal of Instructional Psychology, 34, 2, 95–109.

Stewart, T. R., Heideman, K. F., Moninger, W. R., & Reagan-Cirincione, P. (1992). Effects of improved information on the components of skill in weather forecasting. Organizational Behavior and Human Decision Processes, 53, 2, 107–134.

Sweller, J. (2010). Cognitive load theory: Recent theoretical advances. In J. L. Plass, R. Moreno, & R. Brünken (Eds.), Cognitive load theory (pp. 29–47). New York: Cambridge University Press.

Tversky, A. & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 4157, 1124–1131.

van Merriënboer, J. J. G., Kirschner, P. A., & Kester, L. (2003). Taking the load off a learner’s mind: Instructional design for complex learning. Educational Psychologist, 38, 5–13.

Voss, J. F., Blais, J., Means, M. L., Greene, T. R., & Ahwesh, E. (1986). Informal reasoning and subject matter knowledge in the solving of economics problems by naive and novice individuals. Cognition and Instruction, 3, 269–302.