The last thirty years have seen the development of a wide range of diagnostic tests that can be administered to students to gauge their level of learning and thereby guide teachers in their selection of content. Starting with the seminal Force Concept Inventory (Hestenes et al., 1992), there are now inventories in all the major areas of undergraduate physics, along with many such inventories in other sciences. Two relatively new, validated diagnostic instruments are the Biology Lab Inventory of Critical Thinking for Ecology (Eco-BLIC) and the Physics Lab Inventory of Critical Thinking (PLIC). These two tests aim to activate student critical thinking by presenting a number of different laboratory scenarios and asking students to make judgements about the quality of the experimental design.

The paper, published in the journal PLoS ONE, defines critical thinking as “the ways in which one uses data and evidence to make decisions about what to trust and what to do”. The authors analysed answers on two previously validated diagnostic tests of critical thinking in physics and biology with a view to determining which types of question best activate critical thinking.

The Eco-BLIC was administered to 1620 students across 26 separate courses at 11 higher education establishments, whilst the PLIC was administered to 1839 students across 21 separate courses at 11 higher education establishments. Tests were administered online. In the first two phases of each test, students were asked to think critically about two separate experimental studies; in the third phase, they were asked to compare and contrast the two studies from phases one and two. Since the researchers already knew that the most critical thinking was elicited in this final compare-and-contrast phase, they split the students into two groups. Depending on their discipline, the control group completed either the Eco-BLIC or the PLIC as normal. For the second, research group, however, the first two phases, in which the individual studies were evaluated separately, were omitted.

As expected, comparing and contrasting studies elicited the highest levels of critical thinking in both groups. However, the critical thinking elicited by the compare-and-contrast questions was just as strong in the research group, even though the first two phases, in which the individual studies were critiqued separately, had been removed.

Since the results for the research and control groups were equivalent, the authors suggest that compare-and-contrast questions alone are sufficient to elicit critical thinking. They therefore recommend that teachers consider using contrasting studies in their teaching and assessment in order to both encourage and evaluate the critical thinking skills of their students.

Comment: One of the interesting aspects of critical thinking is that it can take quite different forms across disciplines. Questions that are valid in one setting can make little or no sense in another. For example, the constructs of reliability and validity in quantitative research have quite different criteria from the corresponding terms in qualitative studies. Learning what constitutes valid critique in a particular discipline is an important part of becoming a disciplinary literate individual.

The authors of this research study found that comparing and contrasting experimental designs was the most effective way to elicit critical thinking. They therefore recommend that lecturers adopt this technique. But perhaps their finding is not so surprising after all? I would suggest that what may actually be happening when one compares studies is that the aspects that differ between the studies are noticed and reflected on. Marton and Booth (1997) have described this phenomenon in their seminal work Learning and Awareness. Put simply, they claim that in any given situation there are thousands of aspects that one could potentially focus on, but as humans we tend only to notice things that change. Aspects that vary come into focal awareness, whilst everything that does not vary moves to the background of our attention.

Marton and Booth’s variation theory is a powerful tool for teaching and learning, and in this case it suggests to me that we should focus on what differs between the studies rather than simply recommending the use of compare-and-contrast questions. Variation theory predicts that when studies vary in some respect, comparing and contrasting will lead students to reflect critically on precisely those aspects that differ. I therefore agree with the authors that compare and contrast provides an excellent way to activate critical thinking about experimental design, but I would go further and suggest that we should be asking ourselves as teachers which aspects of a situation we want students to critically reflect on. It is only by making sure that these aspects are handled differently in the studies to be compared that we can leverage the power of compare and contrast for student learning.

Text: John Airey, Department of Teaching and Learning

The study
Heim AB, Walsh C, Esparza D, Smith MK, Holmes NG (2022) What influences students’ abilities to critically evaluate scientific investigations? PLoS ONE 17(8): e0273337.

Keywords: Critical thinking, Undergraduate biology, Undergraduate physics, Compare and contrast, Variation theory.

References
Hestenes, D., Wells, M., & Swackhamer, G. (1992). Force concept inventory. The Physics Teacher, 30(3), 141–158.
Marton, F., & Booth, S. (1997). Learning and awareness. Mahwah, NJ: Lawrence Erlbaum Associates.