Curtin University of Technology operates a computer managed learning (CML) package that gives students the option of undertaking a practice test before the unit assessment. This test is designed predominantly to contribute to student learning and, as such, is considered formative. This presentation focuses both on current use of the practice test at Curtin University of Technology and on research into its benefits. Currently some lecturers make the practice test compulsory while others make it optional; some allow multiple attempts at the practice test before students proceed to the unit test; and often the practice test covers only part of the content of the assessed test. Research to date has focused on how the practice test can affect performance on subsequent CML assessed tests, and on which aspects of the practice test contribute to this effect. Students who sit the practice test improve their marks on the subsequent assessment tests. These data suggest that practice tests should be incorporated into all CML assessments.
CML systems usually have the ability to hold many banks of questions, each of which can be categorized into smaller components, such as topics, learning outcomes or objectives. Often questions can also be coded on a variety of characteristics, for example, degree of difficulty and cognitive content.
The key function of a CML system is to generate tests. It has a template or coursemap where the parameters used to generate any test are stored, and the options for test generation can vary. Some tests are generated using the same questions for all students; others are set to select questions randomly from specific parts of the question bank. In the latter case, students are likely to receive unique tests, depending on the number of questions available for selection.
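To make the random-selection option concrete, here is a minimal sketch in Python. The bank structure, topic names and function are hypothetical illustrations of the idea described above, not the actual CML system's data model or API.

```python
import random

# Hypothetical question bank, NOT the actual CML system's data model:
# questions grouped by topic, as described above.
BANK = {
    "topic1": [f"t1-q{i}" for i in range(1, 21)],
    "topic2": [f"t2-q{i}" for i in range(1, 21)],
    "topic3": [f"t3-q{i}" for i in range(1, 21)],
}

def generate_test(bank, questions_per_topic, seed=None):
    """Draw questions at random from the requested parts of the bank.
    With enough questions per topic, two students are unlikely to
    receive the same test."""
    rng = random.Random(seed)
    test = []
    for topic, n in questions_per_topic.items():
        test.extend(rng.sample(bank[topic], n))
    return test

# Two students requesting the same template usually get different tests.
student_a = generate_test(BANK, {"topic1": 5, "topic2": 5})
student_b = generate_test(BANK, {"topic1": 5, "topic2": 5})
print(len(student_a))  # 10
```

The more questions the bank holds per topic, the lower the chance of two students seeing the same selection, which matches the observation that uniqueness depends on the number of questions available.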
Most systems also have the ability to give students immediate feedback on incorrect answers, and supply them with the correct response. Because systems retain students' answers to questions, they are able to produce a report of the distribution of question responses and students' marks. This feedback can alert lecturers to problem areas, both from the point of view of question quality and also from the perspective of student performance.
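A distribution report of this kind can be sketched in a few lines. The response data and function below are invented for illustration; they are not the actual system's reporting interface.

```python
from collections import Counter

# Invented stored responses: question id -> option chosen by each student.
responses = {
    "q1": ["A", "A", "C", "A", "B", "A", "D", "A"],
    "q2": ["B", "B", "B", "C", "B", "B", "B", "B"],
}

def response_distribution(responses):
    """Per-question counts of each chosen option: the kind of report that
    can flag a badly worded question or a common student misconception."""
    return {qid: dict(Counter(choices)) for qid, choices in responses.items()}

report = response_distribution(responses)
# A heavily chosen wrong option points to a misconception; a question
# almost everyone answers correctly may be too easy or have weak distractors.
print(report["q1"]["A"])  # 5
```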
The computer managed learning system in use at Curtin University uses predominantly multiple-choice questions, drawn at random from a test bank according to parameters requested by the lecturer. Although advantages may exist for using questions which require written answers, i.e. constructed response items (Birenbaum, 1987), increasing student numbers and lecturer workloads have been associated with the use of multiple-choice questions which are marked by the system.
Previous research (Sly & Western, 1998; Sly, in press) has shown that the practice test contributes to an increase in performance. In this presentation I am reporting investigations designed to assess two possible mechanisms by which the practice test could contribute to increased performance: 1) increasing familiarity with content to be examined and 2) decreasing test anxiety.
The anxiety component of the studies was designed around self-evaluation questionnaires. The instrument used was the State-Trait Anxiety Inventory for Adults by Charles D. Spielberger (1983). It comprises separate self-report scales for measuring state and trait anxiety. The State (S)-Anxiety scale has twenty statements that are used to evaluate how the person feels "right now" and the Trait (T)-Anxiety scale has twenty statements that assess how the person feels "generally".
For the purposes of this analysis, the assessment test is divided into two equal parts. The first half is designated as A and covers the two topics previously examined by the practice test, while B, the second half, covers two topics not previously examined.
Table 1: Percentage mean scores, Group 1 - Psychology

| Test | n | Mean (%) | SD |
|------|---|----------|----|
| Assessment 1 (total) | 323 | 75.76 | 12.18 |
The percentage mean scores in Table 1 show that students performed better on the assessment test than they did on the practice test. Further, they performed significantly better on part A of the assessment test (i.e. the topics which were included in the practice test) than on part B, which was new work. Both of these differences are statistically significant (t=-5.343 p<.001 and t=4.012 p<.001).
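The comparisons above appear to be paired t-tests, since each student contributes a score to both conditions (practice vs assessment, part A vs part B). A minimal sketch of the statistic, computed on invented scores rather than the study's data:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(x, y):
    """Paired-samples t statistic: t = d_bar / (s_d / sqrt(n)),
    where d is the list of within-student score differences."""
    d = [a - b for a, b in zip(x, y)]
    return mean(d) / (stdev(d) / sqrt(len(d)))

# Invented percentage scores for 8 students (NOT the study data):
# part A (practised topics) vs part B (new work) of one assessment.
part_a = [78, 82, 75, 90, 68, 85, 79, 88]
part_b = [70, 75, 72, 84, 60, 80, 74, 81]

t = paired_t(part_a, part_b)
print(round(t, 3))  # positive t: part A scores exceed part B
```

The sign convention matches the tables: a negative t arises when the first condition scores below the second, as with the practice-versus-assessment comparisons reported here.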
Group 2 - Accounting
For the purposes of this analysis, the assessment test is again divided into two parts, A and B. The assessment test covered material from four topic areas, three of which were covered by the practice test. Part A covers the topics previously examined by the practice test, that is, topics 1, 2 and 3. Part B covers the topic not previously examined, that is, topic 4. Part A contained 15 questions and part B contained 5, giving a total of 20 questions.
Table 2: Percentage mean scores, Group 2 - Accounting (all students)

| Test | n | Mean (%) | SD |
|------|---|----------|----|
| Assessment 1 (total) | 215 | 68.67 | 16.55 |
Table 3: Percentage mean scores, Group 2 - Accounting (practice test group)

| Test | n | Mean (%) | SD | t | p |
|------|---|----------|----|---|---|
| Assessment 1 (total) | 95 | 73.16 | 15.09 | | |
| Assessment 1 - A | 95 | 75.93 | 17.21 | -4.833 | .001 |
| Assessment 1 - B | 95 | 64.84 | 21.18 | 4.542 | .001 |
Tables 2 and 3 show percentage mean scores for the entire group and for the practice test group, respectively. Those students who elected to sit the optional practice test performed better on the assessment test than they did on the practice test. Further, they performed significantly better on part A of the assessment test (i.e. the topics which were included in the practice test) than on part B, which was new work. Both of these differences are statistically significant (t=-4.833 p=.001 and t=4.542 p=.001).
STATEE1 was the state anxiety measure taken before the assessment test. Students were asked to indicate how they felt immediately prior to their test. There was no difference on STATEE1 (Student's t test, t=0.164, p>0.05) between the students who sat the practice test (mean = 45.38, SD = 11.97) and those who did not (mean = 45.69, SD = 12.19).
In order to test whether anxiety affected later performance on CML tests, bivariate correlations were computed using Pearson's correlation. The group was treated as a whole and the continuous data examined pairwise. The findings indicated no relationship between anxiety and performance. The level of the students' state anxiety (STATEP1) before the practice test did not appear to influence their performance on the practice test (r = -.076, p = .174), and neither did their trait anxiety (TRAITP1) (r = -.066, p = .242). Performance on the assessment test was not affected by either the level of the students' state anxiety (STATEE1) (r = -.027, p = .655) or trait anxiety (TRAITE1) (r = -.008, p = .899) before this assessment test.
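Pearson's r, as used in these analyses, can be computed directly from its definition. The anxiety scores and marks below are invented for illustration and bear no relation to the study's data:

```python
from math import sqrt
from statistics import mean

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented data for 6 students: a strongly negative relationship,
# unlike the near-zero correlations actually reported above.
anxiety = [32, 45, 51, 38, 60, 42]
marks = [80, 72, 70, 78, 65, 74]
r = pearson_r(anxiety, marks)
print(round(r, 2))  # close to -1
```

An r near zero, as found for the practice-test groups here, indicates no linear relationship between anxiety scores and test marks.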
Group 2 - Accounting
For those students who sat the practice test, the level of trait anxiety (TRAITP1) before the practice test had no influence on performance on the practice test (r = -.133, p = .283). Performance on the assessment test was not affected by the level of the students' state anxiety (STATEE1) before this assessment test (r = -.100, p = .373). However, for the group of students who did not sit the practice test there was a small but statistically significant relationship between state anxiety (STATEE1) and performance on the assessment test (r = -.205, p = .037).
As no student received a question on the assessment test that they had previously received on the practice test, familiarity with content from doing the practice test may not be the only reason for superior performance on part A of the assessment test. Students who sat the practice test received feedback on their responses, in the form of the correct answer to any question they had answered incorrectly. This feedback may have been a factor in subsequent improvement if students used it to correct their own misconceptions and hence to build a knowledge structure of these particular areas. The organization and structure they had created for themselves would help to increase their performance when retested, even though they received different questions on the topics. As students had received no practice or feedback for part B questions, they had no opportunity to test their own knowledge and revise weak areas. This could account for the difference in performance on parts A and B of the assessment test. So, while content appears to be an important factor, it may be closely connected to feedback, with both factors instrumental in making students restructure and organize their knowledge.
The practice test could also have a beneficial effect by familiarizing students with aspects of the process, such as the test format or the CML system itself. However, this seems a less likely explanation for improvement, as any such effect would be expected to be consistent across parts A and B of the unit assessment test, and this was not the case.
Analysis of the anxiety data showed no relationship between anxiety and performance for the Psychology group who had a compulsory practice test or for those Accounting students who sat the optional practice test. However, a small relationship was found for the Accounting group who chose not to sit the optional practice test. The influence of anxiety needs further investigation.
The reason for improvement on content covered in the practice test is not certain, but two possible explanations are: first, that students use the feedback they receive on incorrect responses to assist their learning of the topic; and second, that increased familiarity with the CML system may contribute to better performance next time.
Sly, L. and Western, D. (1998). Practice tests improve student performance on computer managed learning assessments. In Black, B. and Stanley, N. (Eds), Teaching and Learning in Changing Times, 310-312. Proceedings of the 7th Annual Teaching Learning Forum, The University of Western Australia, February 1998. Perth: UWA. http://cleo.murdoch.edu.au/asu/pubs/tlf/tlf98/sly.html
Sly, L. (in press). Practice tests as formative assessment improve student performance on computer managed learning assessments. Assessment and Evaluation in Higher Education.
Spielberger, C.D. (1983). State-Trait Anxiety Inventory for Adults (Form Y). Mind Garden.
Please cite as: Sly, L. (1999). Computer testing options and benefits of a practice test. In K. Martin, N. Stanley and N. Davison (Eds), Teaching in the Disciplines/Learning in Context, 387-390. Proceedings of the 8th Annual Teaching Learning Forum, The University of Western Australia, February 1999. Perth: UWA. http://lsn.curtin.edu.au/tlf/tlf1999/sly.html