Distillates: David Read looks at some recent chemical education research

Students taking exams

Many observers have expressed concern in recent years that excessive testing, and the authorities’ over-reliance on the results of national tests in judging teaching standards and school performance, have been harmful to teaching and learning. Furthermore, many students across the range of academic ability fail to demonstrate their potential in exam results, and arguments about how best to assess performance are ongoing. In a US study, Noble et al have investigated the correlation between students’ conceptual understanding and their ability to answer exam questions correctly.

While the study focuses on grade 5 science tests, any teacher whose students regularly sit exams will be interested in the outcomes. Most teachers will have students who clearly grasp the scientific principles underlying a particular concept, but are then unable to produce the response needed to gain the marks in an exam. The aim of this study was to determine whether the knowledge and skills that students report using to answer questions are the same as those the questions were intended to test. Students completed a six-question science test, followed by an interview that probed their actual level of understanding of each concept. This produced some interesting results, particularly when students from different social groups were compared.

A total of 36 students took part in the study: 12 came from low-income households, 12 were ‘English language learners’ (ELLs) and the remaining 12 were native English speakers (NES), largely from middle-class backgrounds. The authors first considered the students who had demonstrated a good understanding of a question’s target knowledge in their interview. Low-income students and ELLs were more likely than NES students to have given an incorrect answer in the exam despite having a good grasp of the concept being tested. Conversely, among students who did not demonstrate a grasp of a question’s target knowledge, NES students were considerably more likely to submit a correct answer than those from the other groups.

The investigation indicated that the language used in the question text was the main issue affecting the performance of low-income students and ELLs. This comparative disadvantage was compounded by the high incidence of NES students who gave correct answers despite having a poor understanding of the concept, raising serious questions about the use of such exams to measure attainment. The authors conclude that alternative measures of scientific knowledge, responsive to students’ backgrounds and experience, are essential if all students are to demonstrate their true abilities, a suggestion that will ring true for educators in many countries across the globe.