
Are we listening to our students?

Leith Sly
Centre for Educational Advancement
Curtin University of Technology

Introduction

All undergraduates spend a considerable amount of time engaged in some form of assessment. Organising assessment also takes a large proportion of the lecturer's time and resources, and it affects students in ways that may be positive or negative. Because assessment can have such critical consequences for students, it is essential that all assessment is well planned and tailored to its particular purpose. For any given assessment technique there will be some students who perform well and others who do not. Partly for this reason, Brown and Knight (1994) argue that the use of multiple techniques is essential to good assessment practice, because no single group of learners is then disadvantaged.

Curtin University of Technology presently uses two computer testing systems run through a central testing laboratory. The mainframe-based package has been in operation since 1987 and the PC-based package was initially piloted in 1998. Both packages allow students to sit a practice test before their assessed test. Previous studies (Sly, 1999; Sly & Rennie, 1999a; Sly & Rennie, 1999b) demonstrated that students who completed a practice test scored higher marks on the subsequent assessed test. This effect was seen for students from a number of different subject disciplines, with effect sizes ranging from 0.5 to 2.1.
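The cited papers report the effect sizes but do not restate the formula. As a point of reference only, the sketch below shows one common way a standardised mean difference (Cohen's d with a pooled standard deviation) is computed from the marks of the two groups; this particular variant is an assumption here, and the marks are invented purely to make the example runnable.

    # A minimal sketch, not the authors' code: Cohen's d with a
    # pooled standard deviation. All marks below are invented.
    from statistics import mean, stdev

    def cohens_d(group_a, group_b):
        """Standardised mean difference between two groups of test marks."""
        n_a, n_b = len(group_a), len(group_b)
        s_a, s_b = stdev(group_a), stdev(group_b)
        # Pooled standard deviation across both groups
        pooled_sd = (((n_a - 1) * s_a**2 + (n_b - 1) * s_b**2)
                     / (n_a + n_b - 2)) ** 0.5
        return (mean(group_a) - mean(group_b)) / pooled_sd

    # Hypothetical assessed-test marks for practice and no-practice groups
    practice = [72, 80, 65, 77, 70]
    no_practice = [60, 66, 58, 71, 62]
    print(f"d = {cohens_d(practice, no_practice):.2f}")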

This paper reports students' usage rates of the computer generated practice test as well as students' responses to survey and interview questions. It describes students' preference for the use of computerised testing and the impact that feedback from computerised multiple choice tests can have on study prior to an assessed test. Students know the computer testing packages as computer managed learning (CML) systems, and this terminology is used in the surveys.

Method

Seven groups of first year undergraduates studying units in six different subject disciplines participated. All computer generated tests contained multiple choice questions. Students in Studies 1, 2, 3 and 4 were offered an optional computerised practice test prior to their computerised assessed test. Data taken directly from the CML systems for students in Studies 1, 2, 3 and 4 were used to analyse how many students in each group chose to sit the practice test. Students from Studies 1, 2, 3, 5, 6 and 7 answered survey questions related to their utilisation of computerised tests for some part of their unit assessment. Specifically, they answered questions relating to their use of the feedback supplied from these tests to direct their study.

This research aimed to answer three main questions. Firstly, do students use computerised testing, that is, when offered an optional practice test do students elect to sit it? Secondly, do students want computerised testing? This second issue was addressed as a series of smaller questions. Thirdly, how do students use the feedback they receive from the computerised tests?

Results

The results section is divided into three parts, each addressing one of the research questions. In Part A, the rate of use of the practice test was taken directly from the computer testing system for students from Studies 1, 2, 3 and 4, with additional information obtained from survey results (Studies 2 and 3). Part B and Part C report survey results: those in Part B relate to students' preference for computer testing (Studies 1, 2 and 3) and those in Part C to students' use of feedback (Studies 2, 3, 5, 6 and 7). Part C also reports interview data collected from students across all studies.

Part A: Do students use optional computer testing?

Students in Studies 1, 2, 3 and 4 were offered an optional practice test prior to their CML assessed test. Table 1 reports the use of the practice test across the studies. Each group of students was from a different subject discipline. While the rate of participation varied across the groups, on average 57% of students chose to sit the practice test.

Table 1: Selection of optional practice test by subject discipline

Study    n      Practice test: Yes    Practice test: No
1        277    152 (55%)             125 (45%)
2        190    78 (41%)              112 (59%)
3        369    147 (40%)             216 (60%)
4        69     62 (90%)              7 (10%)
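For readers who want to reproduce the participation figures, the following sketch recomputes the group rates from Table 1. All numbers are taken directly from the table; the code itself is purely illustrative.

    # A minimal sketch, illustrative only, of the participation arithmetic
    # in Table 1; the (sat, n) pairs are taken directly from the table.
    groups = {1: (152, 277), 2: (78, 190), 3: (147, 369), 4: (62, 69)}

    for study, (sat, n) in groups.items():
        print(f"Study {study}: {sat}/{n} = {sat / n:.0%} sat the practice test")

    # Averaging the rounded group percentages (55, 41, 40, 90) gives the
    # "on average 57%" quoted in the text; pooling all students instead
    # weights the larger groups more heavily and gives a lower figure.
    pooled = sum(s for s, _ in groups.values()) / sum(n for _, n in groups.values())
    print(f"Pooled rate: {pooled:.0%}")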

Students in Studies 2 and 3 responded to a survey question that asked why they did not sit the practice test. Seventy-five students (39.5%) from Study 2 and 199 students (55.3%) from Study 3 completed the survey. Approximately half the students in each group reported not sitting the practice test. Lack of time was among the most frequent reasons in each group, reported by 11/38 (28.2%) of Study 2 students and 34/112 (30.4%) of Study 3 students. Other frequent responses from the Study 2 students were forgetting (12/38, 30.8%) and not thinking the test would be helpful (6/38, 15.4%); only one student gave previous use of the CML system as a reason. For the Study 3 group, 32 students (28.6%) reported that they did not think the practice test would be helpful, and 27 (24.1%) that they had used CML previously. The fourth most common response for this group was forgetting, with 10 students (8.9%) selecting this option. Some Study 3 students gave more than one response, so the percentages do not total 100%.

Part B: Do students want computer generated tests to contribute to their unit assessment?

This second question was addressed as a number of smaller questions:

  2a.  Do students want computerised testing as one part of their assessment?
  2b.  What do students consider the best features of CML systems?
  2c.  Do students like CML testing as a form of assessment?
  2d.  Do students feel that CML testing measures their understanding?
  2e.  Do students find CML testing helpful?

Students from Studies 1, 2 and 3 participated. Questions 2a and 2b were answered by students from Study 1, and questions 2c, 2d and 2e by students in Studies 2 and 3. Completed surveys were received from 213 students (76.9%) from Study 1, 75 students (39.5%) from Study 2 and 199 students (55.3%) from Study 3.

2a. Do students want computerised testing as one part of their assessment?

Students from Study 1 were asked to select assessment options for twenty percent of their unit mark. Table 2 reports student responses to the choice of assessment method. Students were required to rank their preferences but were not obliged to rank all options supplied.

Table 2: Student options for 20% of mark (Study 1)

Option              First preference     Second preference
CML tests           147 (69.0%)          6 (2.8%)
Mid semester test   9 (4.2%)             38 (17.8%)
Another essay       10 (4.7%)            22 (10.3%)
No preference       19 (8.9%)            4 (1.9%)
Other               3 (1.4%)             1 (0.5%)

Students ranked CML tests well ahead of other options - a mid semester test, a second essay (the unit already had one essay), no preference, or "other".

2b. Students' opinions about the best features of CML

Students (Study 1) were able to select one or more of six options they believed were the best features of CML, and were asked to add any additional information they wished. As Table 3 shows, most students selected "immediate feedback", followed by the freedom to select a convenient testing time, as the most important features. About two thirds of students said they liked the removal of the concentrated pressure of a single mid semester examination, while 56% thought they should be offered the opportunity to resit a test if they wished to improve their mark. Both the chance to take their own remedial action and the ability to revise on an ongoing basis were considered valuable.

Table 3: Which are the best features of the CML system? (Study 1)

Option                 n      Percent
Immediate feedback     185    86.9
Convenient time        166    77.9
Removes pressure       139    65.3
Resit opportunity      119    55.9
Remedial action        72     33.8
Continuous revision    70     32.9

In addition, students were asked if they would like an optional practice test before each test, and 153 students (78.9%) said they would.

2c. Do students like CML testing as a form of assessment?

Both groups (Studies 2 and 3) rated the computer managed learning system highly as a form of assessment, as shown in Table 4. More than half of the students in each group selected 4 or 5 on the five point scale, and very few responded that they hated the system.

2d. Do students feel that CML testing measures their understanding?

The responses to the second question in Table 4 show that students from both groups believed the CML system was able to accurately measure their understanding of the unit.

Table 4: Students' attitude to the CML system (Study 2 and Study 3)

Question                                        Study   1(a)   2      3      4      5(b)
How much do you like CML as a form of             2     2.7    14.7   29.3   34.7   18.7
assessment?                                       3     7.0    11.6   27.1   28.1   26.1
How well do you think CML assessment measures     2     2.7    17.3   33.3   34.7   12.0
your own understanding of this unit?              3     15.6   23.1   26.1   26.6   8.5

Notes: Response values are percentages. (a) is "hate it" and "not very well" respectively; (b) is "like it a lot" and "very well".
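The "more than half" summaries in the text are simple top-two-box figures, i.e. the sum of the percentages for scale points 4 and 5. A minimal sketch, using the Table 4 rows for the first question (the percentages are taken from the table; the code is purely illustrative):

    # A minimal sketch, illustrative only: sum the top two scale points
    # of each Table 4 distribution to get the "4 or 5" share.
    like_cml = {2: [2.7, 14.7, 29.3, 34.7, 18.7],   # Study 2, scale points 1..5
                3: [7.0, 11.6, 27.1, 28.1, 26.1]}   # Study 3

    for study, dist in like_cml.items():
        top_two = dist[3] + dist[4]  # percentages for responses 4 and 5
        print(f"Study {study}: {top_two:.1f}% selected 4 or 5")

The same arithmetic applied to Table 5 yields the 62.1% and 71.2% figures quoted in section 2e.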

2e. Is CML testing helpful?

Students from Study 2 and Study 3 were asked if they found the practice test a useful preparation for the assessed test. Table 5 reports the results of the 49.3% (37/75) of Study 2 survey respondents and the 47.2% (94/199) of the Study 3 survey respondents who sat the practice test. Most students reported that sitting the practice test "was helpful... in preparing for the next CML test", with 62.1% and 71.2%, respectively, marking response 4 or 5.

Table 5: Responses from the practice test group regarding use of the practice test
(Study 2 and Study 3)

Question                                        Study   1(a)   2      3      4      5(b)
Do you think that the practice test was           2     2.7    16.2   18.9   24.3   37.8
helpful to you in preparing for the next          3     8.5    6.4    13.8   37.2   34.0
CML test?

Notes: Response values are percentages. (a) is "not at all"; (b) is "very helpful".

Summary of students' preferences for the use of computer generated tests

Students from Study 1 rated the immediate feedback offered by the CML system, as well as the convenience of selecting their own testing time, as important features of the testing system. When offered other choices for 20% of their mark, they were strongly in favour of retaining the CML tests. In both Study 2 and Study 3, more than half of the students rated the CML system highly, with only a very small proportion expressing dislike. Students from these groups were in favour of CML as a form of testing and felt that it generally measured their understanding well. In addition, approximately 70% of the practice test takers in each group (Studies 2 and 3) said they found the practice test helpful.

Part C: How do students use the feedback they receive from the computerised tests?

Students from Studies 2 and 3 were asked to comment on the usefulness of the practice test as a preparation for the assessed test and students from Studies 5, 6 and 7 were asked directly about their use of the feedback supplied on incorrect answers.

Only 14 Study 2 students (37.8% of those who sat the practice test) wrote comments. Nine of these students reported that the practice test gave them an idea of what the assessed test was like. Thirty-nine (54.9%) of the 71 Study 3 students who wrote comments also reported that the practice test gave them an idea of what the assessed test was like, and so what to prepare for. Five students (7%) said it showed them weak areas or areas to work on, while two made a general comment about the item type and standard. Five students (7%) said the practice test helped to make them less nervous about computers, while seven (9.8%) said it did not reflect the assessed test, being either harder or easier. Table 6 reports students' comments from Studies 5, 6 and 7.

Table 6: Student comments related to the use of CML information
on incorrect responses across Studies 5, 6 and 7

Q. How do you use the information about incorrect answers that you get from the CML system to help your preparation for the next CML test?

Comment                         Study 5 (n=119)   Study 6 (n=46)   Study 7 (n=296)
Identifies error areas          58 (48.7%)        28 (60.9%)       86 (29.1%)
Identifies key content areas    20 (16.8%)        -                19 (6.4%)
Item specific revision          28 (23.5%)        9 (19.6%)        95 (32.1%)
General revision                15 (12.6%)        5 (10.9%)        12 (4.1%)
Hard to remember                7 (5.9%)          3 (6.5%)         24 (8.1%)
Do not use                      -                 -                22 (7.4%)
Acts as a motivator             -                 -                135 (45.6%)
Identifies how item asked       -                 -                26 (8.8%)

Notes: Multiple responses were allowed. Percent represents percentage of cases.
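Because respondents could select several comments, the "percentage of cases" figures divide each count by the number of respondents rather than by the total number of responses, which is why a column can total more than 100%. A minimal sketch using the Study 7 column (counts taken directly from Table 6; the code is illustrative only):

    # A minimal sketch, illustrative only, of "percentage of cases":
    # each count is divided by the number of respondents (cases), so
    # the column total exceeds 100% when multiple responses are allowed.
    study7 = {
        "Identifies error areas": 86,
        "Identifies key content areas": 19,
        "Item specific revision": 95,
        "General revision": 12,
        "Hard to remember": 24,
        "Do not use": 22,
        "Acts as a motivator": 135,
        "Identifies how item asked": 26,
    }
    n_cases = 296  # Study 7 respondents

    for comment, count in study7.items():
        print(f"{comment}: {count}/{n_cases} = {count / n_cases:.1%}")
    print(f"Column total: {sum(study7.values()) / n_cases:.1%}")  # exceeds 100%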

Interview data

Students were selected at random across all studies and asked questions related to the computer testing in their unit. Twenty-six students were interviewed. The interview data were in general agreement with the survey responses reported previously. Students found a practice test useful, and used it to alert themselves to their own error areas as well as to important content areas. In the majority of cases they did little study before this test.

Conclusion

Students are happy using computerised multiple choice tests as one component of their unit assessment. In addition, students like to be given the option to sit a practice test and they use the feedback in a variety of ways. Students use the information in the revision process, as a motivating factor and as an identifier of the way in which questions will be asked, that is, they are using the practice test as a formative tool and are exercising some control over their learning.

References

Brown, S. & Knight, P. (1994). Assessing learners in higher education. London: Kogan Page.

Sly, L. (1999). Practice tests as formative assessment improve student performance on computer managed learning assessments. Assessment and Evaluation in Higher Education, 24(3), 339-343.

Sly, L. & Rennie, L. J. (1999a). Computer managed learning: Its use in formative as well as summative assessment. In M. Danson, & R. Sherrat (Eds), Proceedings of Flexible Learning: Third annual computer assisted assessment conference (pp. 179-189). Loughborough University.

Sly, L. & Rennie, L. J. (1999b). Computer managed learning as an aid to formative assessment in higher education. In S. Brown, J. Bull, & P. Race (Eds), Computer Assisted Assessment of Students (pp. 113-120). London: Kogan Page.

Please cite as: Sly, L. (2001). Are we listening to our students? In A. Herrmann and M. M. Kulski (Eds), Expanding Horizons in Teaching and Learning. Proceedings of the 10th Annual Teaching Learning Forum, 7-9 February 2001. Perth: Curtin University of Technology. http://lsn.curtin.edu.au/tlf/tlf2001/sly.html

