
Practice tests improve student performance on computer managed learning assessments

Leith Sly
Computing Centre
and
Dave Western
Economics and Finance
Curtin University of Technology
Curtin University operates a computer managed learning package that gives students the option of undertaking a practice test before the unit assessment. This facility imposes an extra load on the Computing Centre but is thought to offer students an advantage. The unit assessment marks of the 417 students who took the optional practice test (PT group) were compared with those of the 197 students who opted not to (non-PT group). The PT group's mean mark improved significantly from the practice test to the first unit test (p<0.001), and the PT group significantly outperformed the non-PT group on both unit tests, although only the first was preceded by a practice test (p<0.001). This was despite apparently weaker students opting to sit the practice test. In short, students who sat the practice test improved their marks on the assessment tests and performed better on those tests than students who did not. Based on these data, practice tests should be offered before all unit assessments despite the implications for departmental resources.

Introduction

Curtin University operates a computer managed learning (CML) package (Learning Management System, CBTS) which stores questions in test banks. Tests are generated from the bank according to parameters set in the course map; as questions are randomly selected, each test is unique. The system also gives students the option of undertaking a practice test before the unit assessment. While this facility imposes an extra load on the Computing Centre, the resources required could be justified if the practice test offers students an advantage. To determine whether the practice test improved student performance on unit assessments, results on the first two unit assessments were compared between students who chose to take the practice test and those who chose not to.

Method

Students enrolled in a first year economics unit were studied. They had the option of taking a practice test (P01) before their first Unit assessment test (S01); they later undertook a second Unit assessment test (S02) with no option of a prior practice test. P01 drew questions randomly from the same testbank as S01, using test parameters identical to those used to generate the Unit assessment test. S02 was also randomly generated, but from a different part of the testbank, so students would not have encountered questions on the topics examined previously.
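The CBTS selection step itself is not documented in this paper, so the following is only an illustrative sketch of the mechanism described above: a bank of questions grouped by topic, per-topic counts standing in for the course map parameters, and random sampling so that every generated test is unique. All names and numbers below are invented.

    import random

    # Invented question bank: questions grouped by topic.
    bank = {
        "supply and demand": [f"SD-{i:02d}" for i in range(1, 41)],
        "elasticity":        [f"EL-{i:02d}" for i in range(1, 41)],
    }

    # Invented stand-in for the course map parameters: questions per topic.
    params = {"supply and demand": 10, "elasticity": 10}

    def draw_test(bank, params):
        """Sample the requested number of questions from each topic at
        random, so each drawn test is a unique selection from the bank."""
        test = []
        for topic, n in params.items():
            test.extend(random.sample(bank[topic], n))
        random.shuffle(test)
        return test

    p01 = draw_test(bank, params)  # practice test: same bank, same parameters
    s01 = draw_test(bank, params)  # Unit test: an independent random draw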

Results were obtained from a survey handed to students when they visited the Computer Managed Learning Lab to sit their second Unit assessment test (S02), and from data obtained directly from the CML system. The survey asked questions on gender, TEE score, TEE Economics score, expected result on S02, and whether the student had sat the optional practice test. Marks obtained by each student for the practice test (P01) and both Unit assessment tests (S01 and S02) were retrieved directly from the CML system.

From the 614 students enrolled, 592 completed forms were received (96%). There were several potential reasons for students not handing in forms: they may not have sat the particular Unit assessment test, may have sat it outside the allocated time frame, may have been missed by lab staff, or may not have handed back the survey. However, the high response rate suggests the data are reliable. The question on gender was answered by 570 students, of whom 287 (50.4%) were female.

Survey data were entered into SPSS, and paired t-tests, unpaired t-tests, or chi-square tests were used where appropriate. Statistical significance was accepted at the 5% level.
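For readers who wish to reproduce this style of analysis outside SPSS, the same tests are available in, for example, Python's scipy.stats. The sketch below uses invented placeholder marks rather than the study's raw data; only the choice of tests follows the paper.

    from scipy import stats

    alpha = 0.05  # significance accepted at the 5% level

    # Invented placeholder marks standing in for the per-student data
    # held in the CML system.
    p01     = [55, 60, 71, 48, 66, 73]  # practice-test marks (PT group)
    s01_pt  = [63, 72, 78, 59, 70, 80]  # S01 marks for the same students
    s01_non = [61, 70, 55, 68, 74]      # S01 marks for the non-PT group

    # Paired t-test: did the same students improve from P01 to S01?
    t_rel, p_rel = stats.ttest_rel(s01_pt, p01)

    # Unpaired t-test: did the PT group outscore the non-PT group on S01?
    t_ind, p_ind = stats.ttest_ind(s01_pt, s01_non)

    for name, p in [("paired t-test", p_rel), ("unpaired t-test", p_ind)]:
        verdict = "significant" if p < alpha else "not significant"
        print(f"{name}: p = {p:.3f} ({verdict} at the 5% level)")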

Results

Of the study sample, 417 students chose to do the practice test and 197 did not. All 614 sat S01, and 609 sat S02.

Those students who opted to sit P01 showed a significant increase in their mark between P01 and S01 (p<0.001). Both tests were randomly drawn from the same testbank and covered the same material (Table 1).

Table 1: Practice and Unit Test

                          N     Mean    S.D.
Practice test (P01)      417    62.18   15.34
First Unit test (S01)    417    72.72   15.66

Those students who opted to sit P01 performed significantly better on S01 than those who did not (p<0.001) (Table 2).

Table 2: First Unit Test (S01)

          N     Mean    S.D.
non-PT   197    67.56   15.68
PT       417    72.72   15.66
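Because Table 2 publishes n, mean and S.D. for both groups, the reported comparison can be checked directly from the summary statistics; the sketch below (not the authors' SPSS run) uses scipy's ttest_ind_from_stats. Tables 3 and 4 report the same columns and can be checked in the same way.

    from scipy import stats

    # Summary statistics copied from Table 2 (first Unit test, S01).
    result = stats.ttest_ind_from_stats(
        mean1=72.72, std1=15.66, nobs1=417,   # PT group
        mean2=67.56, std2=15.68, nobs2=197)   # non-PT group
    print(result)  # t is roughly 3.81, p roughly 0.0002: consistent with p<0.001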

The improved performance was also seen on S02, even though there was no practice test before this Unit test and its questions were drawn from a different part of the testbank (p=0.001) (Table 3).

Table 3: Second Unit Test (S02)

          N (S01)   N (S02)   Mean    S.D.
non-PT      197       194     62.24   15.45
PT          417       415     66.88   16.76

To investigate whether the brighter students were the ones who chose to do P01, we compared each student's first test from that part of the testbank: P01 for the PT group and S01 for the non-PT group. Students who opted not to do P01 performed significantly better on their first test (p<0.001) (Table 4).

Table 4: First Test Comparison

                        N     Mean    S.D.
S01 mark (non-PT)      197    67.56   15.68
P01 mark (PT)          417    62.18   15.34

Neither gender nor TEE score influenced students' choice to do P01.
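The gender comparison amounts to a 2x2 contingency test. The survey gives the marginal totals (287 female and 283 male respondents) but not the gender-by-choice cross-tabulation, so the cell counts below are invented purely to show the shape of the test.

    from scipy import stats

    # Hypothetical 2x2 table: rows = female/male, columns = sat P01 / did not.
    # Row totals match the survey (287 female, 283 male); the column split
    # is invented, as the cross-tabulation is not published.
    table = [[195,  92],   # female
             [185,  98]]   # male
    chi2, p, dof, expected = stats.chi2_contingency(table)
    print(f"chi-square = {chi2:.2f}, p = {p:.3f}")  # p > 0.05 matches the null result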

Discussion

These data suggest that the practice test has a significant bearing on subsequent Unit assessment test results. The group who sat the practice test improved their mark from the practice test to the first Unit assessment test (S01), scored significantly higher on S01 than the group who did not, and also did significantly better on the second Unit assessment test (S02).

While we do not have definitive proof, the students who chose to do P01 did not appear to be the better students: when we examined the first test each student sat from that part of the testbank (P01 or S01), those who sat P01 scored significantly lower than those who did not (on S01). One possible explanation for the apparent benefit of the practice test is familiarity with the computer system. However, the tests are not performed on a computer: when a student draws a test, it is printed, and the student writes the answers on the paper before entering them into the computer. The student has the option to verify and correct the answers before the computer marks the test. The student's interaction with the technology during the test is therefore minimal, and familiarity with the system is unlikely to explain the improvement among those who sat P01. An alternative explanation is increased familiarity with the nature of the test and the type of questions set by the lecturer, together with the "revision" value of the practice test.

Conclusions

The data from the present study suggest that students who opt for a practice test perform better on subsequent tests. Practice tests should be offered to all students and perhaps should be compulsory.

Please cite as: Sly, L. and Western, D. (1998). Practice tests improve student performance on computer managed learning assessments. In Black, B. and Stanley, N. (Eds), Teaching and Learning in Changing Times, 310-312. Proceedings of the 7th Annual Teaching Learning Forum, The University of Western Australia, February 1998. Perth: UWA. http://lsn.curtin.edu.au/tlf/tlf1998/sly.html

