Curtin University operates a computer managed learning package that gives students the option of undertaking a practice test before the unit assessment. This facility imposes an extra load on the Computing Centre but is thought to offer students an advantage. A total of 417 students took the optional practice test (PT), and their marks on the unit assessments were compared with those of the students (n=197) who opted not to take it (non-PT). The PT group's mean mark improved significantly from the practice test to the first unit test (P<0.001). The PT group also significantly outperformed the non-PT group: they were significantly better on both unit tests, although only the first unit test was preceded by a practice test (P<0.001). This was despite apparently weaker students opting to sit the practice test. In short, students who sat the practice test both improved their marks on the assessment tests and performed better on those tests than students who did not. Based on these data, practice tests should be offered before all unit assessments despite the implications for departmental resources.
Results were obtained from a survey handed to students when they visited the Computer Managed Learning Lab to take their second unit assessment test (S02), together with data obtained directly from the CML system. The survey asked about gender, TEE score, TEE Economics score, expected result on S02, and whether the student had sat the optional practice test. Each student's marks on the practice test (P01) and on both unit assessment tests (S01 & S02) were retrieved directly from the CML system.
From the 614 students enrolled, 592 completed forms were received (96%). There were several possible reasons why some students did not hand in forms: they may not have sat the particular unit assessment test, may have sat it outside the allocated time frame, may have been missed by lab staff, or may simply not have handed the survey back. However, the high response rate suggests the data are representative. The question on gender was answered by 570 students, of whom 287 (50.4%) were female.
Survey data were entered into SPSS, and paired t-tests, unpaired t-tests, or chi-square tests were used where appropriate. Statistical significance was accepted at the 5% level.
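As an illustration of the paired comparison reported below, a minimal sketch in Python (the original analysis was run in SPSS); the marks here are hypothetical stand-ins, since the individual paired scores are not reproduced in the paper:

```python
# Sketch of the paired t-test used for the P01 -> S01 comparison.
# The marks below are HYPOTHETICAL; the real analysis was run in SPSS
# on each student's actual pair of scores.
from scipy import stats

p01 = [55, 60, 48, 70, 62, 58, 65, 50]  # hypothetical practice-test marks
s01 = [68, 71, 55, 80, 70, 66, 74, 61]  # same students' first unit-test marks

# Paired t-test: each student is their own control.
t, p = stats.ttest_rel(s01, p01)
print(f"t = {t:.2f}, p = {p:.4f}")  # significant at the 5% level if p < 0.05
```

A paired design is appropriate here because the same 417 students sat both P01 and S01, so between-student variation cancels out of the comparison.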
Those students who opted to sit P01 showed a significant increase in their marks between P01 and S01 (p<0.001). The two tests are drawn randomly from the same test bank and cover the same material (Table 1).
| Test | N | Mean | S.D. |
|------|---|------|------|
| Practice test (P01) | 417 | 62.18 | 15.34 |
| First Unit test (S01) | 417 | 72.72 | 15.66 |
Those students who opted to sit P01 performed significantly better on S01 than those who did not (p<0.001) (Table 2).
The improved performance was also seen on S02, even though there was no practice test before this unit test and the questions were drawn from a different part of the test bank (p=0.001) (Table 3).
[Tables 2 and 3: N, mean and S.D. on S01 and S02 for the PT and non-PT groups; the data rows are not reproduced here.]
To investigate whether the brighter students were the ones who chose to do P01, results on the first test each student sat (P01 for the PT group, S01 for the non-PT group), both drawn from the same part of the test bank, were compared. Students who opted not to do P01 performed significantly better on their first test (p<0.001) (Table 4).
| First test | N | Mean | S.D. |
|------------|---|------|------|
| P01 mark - PT | 417 | 62.18 | 15.34 |
| S01 mark - non-PT | 197 | 67.56 | 15.68 |
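The first-test comparison can be checked directly from the reported summary statistics alone; a sketch using scipy's unpaired t-test from summary data (assuming, as scipy does by default, pooled variances, which is reasonable here since the two S.D.s are nearly equal):

```python
# Re-deriving the first-test comparison from the reported summary statistics.
# The PT group's first test was P01; the non-PT group's first test was S01.
from scipy import stats

t, p = stats.ttest_ind_from_stats(
    mean1=67.56, std1=15.68, nobs1=197,  # S01 mark, non-PT group
    mean2=62.18, std2=15.34, nobs2=417,  # P01 mark, PT group
)
print(f"t = {t:.2f}, p = {p:.5f}")  # consistent with the reported p<0.001
```

With a difference of 5.38 marks and a pooled standard error of about 1.34, t is roughly 4, well past the p<0.001 threshold reported above.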
Neither gender nor TEE score influenced students' choice of whether to do P01.
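A sketch of the kind of chi-square test this implies, using scipy in place of SPSS; the cross-tabulated counts below are hypothetical (only the 287/283 gender split among responders comes from the survey):

```python
# Chi-square test of independence: gender vs. choice to sit P01.
# The cell counts are HYPOTHETICAL; only the row totals (287 female,
# 283 male respondents) are taken from the survey data.
from scipy.stats import chi2_contingency

#            sat P01, did not sit P01
observed = [[205, 82],   # female (hypothetical split)
            [198, 85]]   # male   (hypothetical split)

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```

For a 2x2 table scipy applies Yates' continuity correction by default; with proportions this similar the test would, as the paper reports, find no significant association.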
While we do not have definitive proof, the students who chose to do P01 did not appear to be the stronger students: when we compared the first test each group sat (P01 for the PT group, S01 for the non-PT group), both drawn from the same part of the test bank, those who sat P01 scored significantly worse. One possible explanation for the apparent benefit of the practice test is familiarity with the computer system. However, the tests are not actually performed on a computer. When a student draws a test, it is printed; the student writes the answers on the paper and then enters them into the computer, with the option to verify and correct the answers before the computer marks the test. The student's interaction with the computer technology is therefore minimal, and familiarity with the system is unlikely to explain the improved performance of those who sat P01. A more plausible explanation is increased familiarity with the nature of the test and the type of questions set by the lecturer, together with the "revision" value of the practice test.
Please cite as: Sly, L. and Western, D. (1998). Practice tests improve student performance on computer managed learning assessments. In Black, B. and Stanley, N. (Eds), Teaching and Learning in Changing Times, 310-312. Proceedings of the 7th Annual Teaching Learning Forum, The University of Western Australia, February 1998. Perth: UWA. http://lsn.curtin.edu.au/tlf/tlf1998/sly.html