Curtin University of Technology presently uses two computer testing systems run through a central testing laboratory. The mainframe-based package has been in operation since 1987 and the PC-based package was first piloted in 1998. Both packages allow students to sit a practice test before their assessed test. Previous studies (Sly, 1999; Sly & Rennie, 1999a; Sly & Rennie, 1999b) demonstrated that students who completed a practice test scored higher marks on the subsequent assessed test. This effect was seen for students from a number of different subject disciplines, with effect sizes ranging from 0.5 to 2.1.
This paper reports students' usage rates of the computer generated practice test as well as students' responses to survey and interview questions. Students' preference for the use of computerised testing, and the impact that feedback from computerised multiple choice tests can have on study prior to an assessed test, are described. Students know the computer testing packages as computer managed learning (CML) systems, and this terminology is used in the surveys.
This research aimed to answer three main questions. Firstly, do students use computerised testing, that is, when offered an optional practice test do students elect to sit it? Secondly, do students want computerised testing? This second issue was addressed as a series of smaller questions. Thirdly, how do students use the feedback they receive from the computerised tests?
|Study||Number of students||Sat practice test||Did not sit practice test|
|1||277||152 (55%)||125 (45%)|
|2||190||78 (41%)||112 (59%)|
|3||369||147 (40%)||216 (60%)|
|4||69||62 (90%)||7 (10%)|
Students in Studies 2 and 3 responded to a survey question that asked why they did not sit the practice test. Seventy-five students (39.5%) from Study 2 and 199 students (55.3%) from Study 3 completed the survey. Approximately half the students in each group reported not sitting the practice test. The most frequent reason given in each group was lack of time, reported by 11/38 (28.2%) Study 2 students and 34/112 (30.4%) Study 3 students. The next most frequent responses from Study 2 students were forgetting (12/38, 30.8%) and not thinking the test would be helpful (6/38, 15.4%). Only one student gave previous use of the CML system as a reason. In the Study 3 group, 32 students (28.6%) reported that they did not think the practice test would be helpful, and 27 (24.1%) that they had used CML previously. The fourth most common response was forgetting, with 10 students (8.9%) selecting this option. Some Study 3 students gave more than one response, so the percentages do not total 100%.
Do students:
2a. want computerised testing as one part of their assessment?
2b. consider particular features of CML systems to be the best?
2c. like CML testing as a form of assessment?
2d. feel that CML testing measures their understanding?
2e. find CML testing helpful?
Students from Studies 1, 2 and 3 participated. Questions 2a and 2b were answered by students from Study 1, while questions 2c, 2d and 2e were answered by students in Studies 2 and 3. Completed surveys were received from 213 students (76.9%) from Study 1, 75 students (39.5%) from Study 2 and 199 students (55.3%) from Study 3.
2a. Do students want computerised testing as one part of their assessment?
Students from Study 1 were asked to select assessment options for twenty percent of their unit mark. Table 2 reports student responses to the choice of assessment method. Students were required to rank their preferences but were not obliged to rank all options supplied.
|Assessment option||Ranked first (n)||%||Ranked lower (n)||%|
|Mid semester test||9||4.2||38||17.8|
Students ranked CML tests well ahead of other options - a mid semester test, a second essay (the unit already had one essay), no preference, or "other".
2b. Students' opinions about the best features of CML
Students (Study 1) were able to select any one or more of six options they believed were the best features of CML, and were asked to add any additional information they wished. As Table 3 shows, most students selected "immediate feedback" as the most important feature, followed by the freedom to select a convenient testing time. About two thirds of students said they liked the removal of the concentrated pressure of a single mid semester examination, while 56% of students thought they should be offered the opportunity to resit a test if they wished to improve their mark. Both the chance to take their own remedial action and the ability to revise on an ongoing basis were considered valuable by students.
In addition, students were asked if they would like an optional practice test before each test, and 153 students (78.9%) said they would.
2c. Do students like CML testing as a form of assessment?
Both groups (Study 2 and 3) rated the computer managed learning system highly as a form of assessment, as shown in Table 4. More than half of the students in both groups selected 4 or 5 on the five point scale. Very few responded that they hated the system.
2d. Do students feel that CML testing measures their understanding?
Responses to the second question in Table 4 show that students from both groups believed the CML system was able to accurately measure their understanding of the unit.
|Question||Study||1a||2||3||4||5b|
|How much do you like CML as a form of assessment?||2||2.7||14.7||29.3||34.7||18.7|
|How well do you think CML assessment measures your own understanding of this unit?||2||2.7||17.3||33.3||34.7||12.0|
|Notes: Values are percentages of respondents on a five point scale. a is "hate it" and "not very well" respectively. b is "like it a lot" and "very well".|
2e. Is CML testing helpful?
Students from Study 2 and Study 3 were asked if they found the practice test a useful preparation for the assessed test. Table 5 reports the results of the 49.3% (37/75) of Study 2 survey respondents and the 47.2% (94/199) of the Study 3 survey respondents who sat the practice test. Most students reported that sitting the practice test "was helpful... in preparing for the next CML test", with 62.1% and 71.2%, respectively, marking response 4 or 5.
|Question||Study||1a||2||3||4||5b|
|Do you think that the practice test was helpful to you in preparing for the next CML test?||2||2.7||16.2||18.9||24.3||37.8|
|Notes: Values are percentages of respondents on a five point scale. a is "not at all". b is "very helpful".|
Summary of responses on students' preferences for the use of computer generated tests
Students from Study 1 rated the immediate feedback offered by the CML system, as well as the convenience of selecting their own testing time, as important features of the testing system. When offered other choices for 20% of their mark, they were strongly in favour of retaining the CML tests. In both Study 2 and Study 3, approximately half of the students rated the CML system highly, with only a very small proportion expressing dislike. Students from these groups were in favour of CML as a form of testing and felt that it generally measured their understanding well. In addition, a clear majority of each group (62.1% in Study 2 and 71.2% in Study 3) said they found the practice test helpful.
Only 14 (37.8%) Study 2 students wrote comments. Nine of these students reported that the practice test gave them an idea of what the assessed test was like. Thirty-nine (54.9%) of the 71 Study 3 students who wrote comments also reported that the practice test gave them an idea of what the assessed test was like and so what to prepare for. Five students (7%) said it showed them weak areas or areas to work on while two students made a general comment about the item type and standard. Five students (7%) said the practice test helped to make them less nervous about computers while seven students (9.8%) said it did not reflect the assessed test, as it was either harder or easier. Table 6 reports students' comments from Studies 5, 6 and 7.
|Q. How do you use the information about incorrect answers that you get from the CML system to help your preparation for the next CML test?|
|Response||Study 5 (n = 119)||Study 6 (n = 46)||Study 7 (n = 296)|
|Identifies error areas||58 (48.7%)||28 (60.9%)||86 (29.1%)|
|Identifies key content areas||20 (16.8%)|| ||19 (6.4%)|
|Item specific revision||28 (23.5%)||9 (19.6%)||95 (32.1%)|
|General revision||15 (12.6%)||5 (10.9%)||12 (4.1%)|
|Hard to remember||7 (5.9%)||3 (6.5%)||24 (8.1%)|
|Do not use|| || ||22 (7.4%)|
|Acts as a motivator|| || ||135 (45.6%)|
|Identifies how item asked|| || ||26 (8.8%)|
|Notes: Multiple responses were allowed. Percent represents percentage of cases.|
Students were selected at random across all studies and asked questions related to the computer testing in their unit. Twenty-six students were interviewed. The interview data were in general agreement with the responses to the student surveys reported previously. Students found a practice test useful, and used it to alert themselves to their own error areas as well as to important content areas. In the majority of cases they did little study before this test.
Sly, L. (1999). Practice tests as formative assessment improve student performance on computer managed learning assessments. Assessment and Evaluation in Higher Education, 24(3), 339-343.
Sly, L. & Rennie, L. J. (1999a). Computer managed learning: Its use in formative as well as summative assessment. In M. Danson, & R. Sherrat (Eds), Proceedings of Flexible Learning: Third annual computer assisted assessment conference (pp. 179-189). Loughborough University.
Sly, L. & Rennie, L. J. (1999b). Computer managed learning as an aid to formative assessment in higher education. In S. Brown, J. Bull, & P. Race (Eds), Computer Assisted Assessment of Students (pp. 113-120). London: Kogan Page.
|Please cite as: Sly, L. (2001). Are we listening to our students? In A. Herrmann and M. M. Kulski (Eds), Expanding Horizons in Teaching and Learning. Proceedings of the 10th Annual Teaching Learning Forum, 7-9 February 2001. Perth: Curtin University of Technology. http://lsn.curtin.edu.au/tlf/tlf2001/sly.html|