Abstract:
The integrated use of text, graphics, animation, video and sound is becoming relatively commonplace in the presentation phases of computer-based learning (CBL) material. Less common, however, is the use of different data types for questions in the test phases of CBL. This paper discusses possibilities for "multimedia" test items and illustrates their use in projects under development.
Questions have been a key element in CBL virtually from its inception, as researchers and developers capitalised on the machine's capacity to provide automatic scoring, to generate tests from item banks or from algorithms, and to provide detailed item analyses. More advanced systems use adaptive testing. In the main, however, test items tend to be screen presentations of conventional, text-based, completion/short-answer or multiple-choice pencil-and-paper questions. Often, too, the computerised version offers the user less flexibility than its non-computerised counterpart (Ring, 1993). That, along with the need to develop techniques by which extended (non-multiple-choice) answers can be computer-scored, remains an issue.
The focus of the present paper, however, is on possibilities for additional item types. Developments in using multimedia for test items include: audio, applicable in many fields (e.g., language learning, music, medicine); high-quality colour graphics; greater variety in response modes (e.g., direct manipulation of screen objects, as in assembling or using apparatus); items involving animation and video clips (e.g., in simulations); time-controlled responses; items which require the student to "call up" and employ a particular tool or procedure; and items for which help or other "look-up" resources are available.