ABSTRACT Concept inventories (CIs) are assessment instruments designed to measure students' conceptual understanding of fundamental concepts in particular fields. CIs utilise multiple-choice questions (MCQs), with specifically designed response options, to help identify misconceptions. One shortcoming of this assessment instrument is that it provides no evidence of the causes of the misconceptions, or of the nature of students' conceptual understanding. In this article, we present the results of conducting textual analysis on students' written explanations in order to enable better judgements about their conceptual understanding. We compared students' MCQ scores on Signals and Systems Concept Inventory questions with a textual analysis of their written explanations that utilised vector analysis approaches. The analysis of the textual data was able to detect answers that students themselves identified as 'guessed' responses; however, it was unable to detect whether conceptually correct ideas existed within those 'guessed' responses. The presented approach can be used as a framework for analysing assessment instruments that elicit short, textual responses, and is best suited to the restricted conditions imposed by the short-answer structure.
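To illustrate the kind of vector analysis of short-answer text described in the abstract, the sketch below builds term-frequency vectors for hypothetical student explanations and scores each against a hypothetical reference explanation using cosine similarity. This is only one common vector-space approach and is not presented as the study's actual pipeline; the reference string, response strings, and the idea of flagging low-similarity responses (such as admitted guesses) are illustrative assumptions.

    # Minimal sketch (assumes scikit-learn is installed): TF-IDF vectors plus
    # cosine similarity as one possible vector analysis of short-answer text.
    # All strings below are hypothetical, not data from the study.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # Hypothetical instructor reference explanation for one CI question.
    reference = ("convolution in the time domain corresponds to "
                 "multiplication in the frequency domain")

    # Hypothetical student short-answer explanations for the same question.
    responses = [
        "multiplying the spectra is the same as convolving the signals in time",
        "I guessed this one",
        "the output is the input shifted by the impulse response",
    ]

    # Fit the TF-IDF vocabulary over the reference and all responses together.
    vectorizer = TfidfVectorizer()
    vectors = vectorizer.fit_transform([reference] + responses)

    # Cosine similarity of each response to the reference explanation;
    # low scores could flag responses (e.g. admitted guesses) for manual review.
    similarities = cosine_similarity(vectors[0], vectors[1:]).ravel()
    for response, score in zip(responses, similarities):
        print(f"{score:.2f}  {response}")

In this sketch, a response that paraphrases the reference receives a higher similarity score than an unrelated or guessed response, which is consistent with the abstract's finding that such analysis can detect self-identified guesses but cannot, on its own, judge whether a guess nonetheless contains conceptually correct ideas.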