Teaching and Learning Forum 99

Computer testing options and benefits of a practice test

Leith Sly
Centre for Educational Advancement
Curtin University of Technology
Curtin University of Technology operates a computer managed learning (CML) package that gives students the option of undertaking a practice test before the unit assessment. This test is designed predominantly to contribute to student learning and, as such, is considered formative. This presentation focuses both on current use and on research into the benefits of the practice test at Curtin University of Technology. Currently some lecturers make the practice test compulsory while others make it optional, and some allow multiple attempts at the practice test before students proceed to the unit test. Often the practice test covers only part of the content of the assessed test. Research to date has focused on how the practice test can affect performance on subsequent computer managed learning assessed tests and on which aspects of the practice test contribute to this. Students who sit the practice test improve their marks on the subsequent assessment tests. These data suggest that practice tests should be incorporated into all CML assessments.

Introduction

With large numbers of student assessments to be managed, there are advantages for lecturers in using computer managed learning (CML) systems for assessment and student tracking. If lecturers are to use CML as one part of their assessment program, it is important that they use all of the available information effectively. Computer managed learning systems are software packages with several common features, including the ability to generate tests from banks of questions, mark the tests generated, analyse the results statistically, and keep records of students' marks and progress. Feedback to students on their responses is also common.

CML systems usually have the ability to hold many banks of questions, each of which can be categorized into smaller components, such as topics, learning outcomes or objectives. Often questions can also be coded on a variety of characteristics, for example, degree of difficulty and cognitive content.

The key function of a CML system is to generate tests. Each test is defined by a template or coursemap in which the parameters used to generate it are stored, and the options for generation can vary. A test may be generated using the same questions for all students, or it may be set to select questions randomly from specific parts of the bank of questions. In the latter case, students are more likely to receive unique tests, depending on the number of questions available for selection.
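As an illustration only, the following sketch (in Python) shows the kind of template-driven random selection described above. The question bank, topic names and template structure are invented for the example; this is not drawn from the Curtin CML software.

    # Illustrative sketch only: a question bank grouped by topic and a test
    # template specifying how many questions to draw at random from each topic.
    # All names and data are invented for the example.
    import random

    question_bank = {
        "topic_1": ["Q1", "Q2", "Q3", "Q4", "Q5"],
        "topic_2": ["Q6", "Q7", "Q8", "Q9"],
    }

    template = {"topic_1": 3, "topic_2": 2}   # questions to draw per topic

    def generate_test(bank, template, seed=None):
        """Return a test satisfying the template, with questions drawn at random."""
        rng = random.Random(seed)
        test = []
        for topic, n_questions in template.items():
            test.extend(rng.sample(bank[topic], n_questions))
        rng.shuffle(test)
        return test

    print(generate_test(question_bank, template))

Because each run draws its own random sample, two students are unlikely to receive the same test when the bank holds many questions per topic, which is the point made above.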

Most systems also have the ability to give students immediate feedback on incorrect answers, and supply them with the correct response. Because systems retain students' answers to questions, they are able to produce a report of the distribution of question responses and students' marks. This feedback can alert lecturers to problem areas, both from the point of view of question quality and also from the perspective of student performance.
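A minimal, invented example of how stored responses could be tallied into such a report is sketched below; it assumes a simple mapping of questions to recorded answers and is not the reporting facility of any particular CML package.

    # Illustrative sketch only: tally the distribution of responses per question
    # and the proportion answering correctly, from invented stored answers.
    from collections import Counter

    responses = {
        "Q1": ["A", "A", "C", "A", "B"],   # recorded answers for question Q1
        "Q2": ["D", "B", "B", "B", "B"],   # recorded answers for question Q2
    }
    correct = {"Q1": "A", "Q2": "C"}       # keyed correct options

    for question, answers in responses.items():
        distribution = Counter(answers)
        proportion_correct = distribution[correct[question]] / len(answers)
        print(question, dict(distribution),
              f"proportion correct = {proportion_correct:.2f}")

A question that most students answer incorrectly would stand out in such a report, alerting the lecturer either to a problem with the question or to a weak area of student understanding.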

The computer managed learning system in use at Curtin University uses predominantly multiple-choice questions, drawn at random from a test bank according to parameters requested by the lecturer. Although advantages may exist for using questions which require written answers, i.e. constructed response items (Birenbaum & Tatsuoka, 1987), increasing student numbers and lecturer workloads have been associated with the use of multiple-choice questions, which are marked by the system.

Previous research (Sly & Western, 1998; Sly, in press) has shown that the practice test contributes to an increase in performance. In this presentation I report investigations designed to assess two possible mechanisms by which the practice test could contribute to increased performance: 1) increasing familiarity with content to be examined, and 2) decreasing test anxiety.

Study design

Two subject areas were investigated, Psychology and Accounting. To examine the effect of prior exposure to content, student performance on a unit assessment test that had been preceded by a practice test was investigated. The Psychology group had a compulsory practice test while the Accounting group had an optional practice test. To examine the influence of the practice test on anxiety, state and trait anxiety were measured before the practice test and before the unit assessment test.

The anxiety component of the studies was designed around self-evaluation questionnaires. The instrument used was the State-Trait Anxiety Inventory for Adults by Charles D. Spielberger (1983). It comprises separate self-report scales for measuring state and trait anxiety. The State (S)-Anxiety scale has twenty statements that are used to evaluate how the person feels "right now" and the Trait (T)-Anxiety scale has twenty statements that assess how the person feels "generally".

Results

1. Content

Group 1 - Psychology

For the purposes of this analysis, the assessment test is divided into two equal parts. The first half is designated as A and covers the two topics previously examined by the practice test, while B, the second half, covers two topics not previously examined.

Table 1: Mean scores on the CML assessment component - Psychology

Test component parts       N     Mean    SD
Practice                   323   70.90   16.44
Assessment 1 (total)       323   75.76   12.18
Assessment 1 A             321   77.51   12.95
Assessment 1 B             321   73.98   15.97

The percentage mean scores in Table 1 show that students performed better on the assessment test than they did on the practice test. Further, they performed significantly better on part A of the assessment test (i.e. the topics which were included in the practice test) than on part B, which was new work. Both of these differences are statistically significant (t=-5.343, p<.001 and t=4.012, p<.001).
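For readers unfamiliar with the procedure, the comparisons reported here are paired (repeated measures) t tests on the same students' scores. A minimal sketch of how such a test could be computed is given below; the score lists are invented placeholders, not the study data.

    # Sketch of a paired t test of the kind reported in Table 1, using scipy.
    # The score lists are invented placeholders, not the study data.
    from scipy import stats

    practice_scores   = [65, 72, 58, 80, 70, 77, 63, 74]   # same students, practice test
    assessment_scores = [70, 78, 66, 85, 72, 80, 69, 79]   # same students, assessment test

    t_statistic, p_value = stats.ttest_rel(practice_scores, assessment_scores)
    print(f"t = {t_statistic:.3f}, p = {p_value:.3f}")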

Group 2 - Accounting

For the purposes of this analysis, the assessment test is again divided into two parts, A and B. The assessment test covered material from four topic areas, three of which were covered by the practice test. Part A covers the topics previously examined by the practice test (topics 1, 2 and 3), while part B covers the topic not previously examined (topic 4). Part A contained 15 questions and part B contained 5, giving a total of 20 questions.

Table 2: Mean scores on the CML assessment component - Accounting

Test component parts       N     Mean    SD
Practice                   96    65.00   18.97
Assessment 1 (total)       215   68.67   16.55
Assessment 1 A             215   71.57   17.68
Assessment 1 B             215   60.00   23.44

Table 3: Practice test group - Accounting

Task                       N    Mean    SD      t        p
Practice                   95   65.16   19.01
Assessment 1 (total)       95   73.16   15.09
Assessment 1 A             95   75.93   17.21   -4.833   .001
Assessment 1 B             95   64.84   21.18   4.542    .001

Tables 2 and 3 show percentage mean scores for the entire group and for the practice test group, respectively. Those students who elected to sit the optional practice test performed better on the assessment test than they did on the practice test. Further, they performed significantly better on part A of the assessment test (i.e. the topics which were included in the practice test) than on part B, which was new work. Both of these differences are statistically significant (t=-4.833, p=.001 and t=4.542, p=.001).

2. Anxiety

Group 1 - Psychology

STATEE1 was the state anxiety measure used before the assessment test. Students were asked to indicate how they felt at the time immediately prior to their test. There was no difference on STATEE1 (Student's t test, t=0.164, p>0.05) between the students who sat the practice test (M=45.38, SD=11.97) and those who did not (M=45.69, SD=12.19).

In order to test whether anxiety affected later performance on CML tests, bivariate correlations were computed using Pearson's correlation coefficient. The group was treated as a whole and the continuous data examined pairwise. The findings indicated no relationship between anxiety and performance. The level of the students' state anxiety (STATEP1) before the practice test did not appear to influence their performance on the practice test (r=-.076, p=.174). The level of the students' trait anxiety (TRAITP1) before the practice test also had no influence on performance on the practice test (r=-.066, p=.242). Performance on the assessment test was not affected by either the level of the students' state anxiety (STATEE1) (r=-.027, p=.655) or trait anxiety (TRAITE1) (r=-.008, p=.899) before this assessment test.
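As an illustration of the procedure only (not the study data), the following sketch computes a Pearson correlation between an anxiety score and a test mark for a small set of invented values.

    # Sketch of the bivariate (Pearson) correlation used in the anxiety analysis.
    # scipy.stats.pearsonr returns the coefficient r and its two-tailed p value.
    # All values below are invented for the example.
    from scipy import stats

    state_anxiety  = [35, 42, 50, 38, 61, 45, 55, 40]   # e.g. STATEP1-style scores
    practice_marks = [78, 70, 65, 80, 58, 72, 60, 75]   # e.g. practice test marks

    r, p = stats.pearsonr(state_anxiety, practice_marks)
    print(f"r = {r:.3f}, p = {p:.3f}")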

Group 2 - Accounting

For those students who sat the practice test, the level of trait anxiety (TRAITP1) before the practice test had no influence on performance on the practice test (r=-.133, p=.283). Performance on the assessment test was not affected by the level of the students' state anxiety (STATEE1) (r=-.100, p=.373) before this assessment test. However, for the group of students who did not sit the practice test there was a small but statistically significant relationship between state anxiety (STATEE1) and performance on the assessment test (r=-.205, p=.037).

Discussion

Students who sat a practice test improved their performance on the part of the assessment test that covered content that had previously been examined by the practice test. In the Psychology group this was on the first half of the assessed test while in the Accounting group this was on the first three-quarters of the assessed test.

As no student received a question on the assessment test that they had previously received on the practice test, familiarity with content from doing the practice test may not be the only reason for superior performance on part A of the assessment test. Students who sat the practice test received feedback on their responses, in the form of the correct answer to any question they had answered incorrectly. This feedback may have been a factor in subsequent improvement if students used it to correct their own misconceptions and hence to build a knowledge structure of these particular areas. The organization and structure they had created for themselves would help to increase their performance when retested, even when they received different questions on the topics. As students had received no practice or feedback for part B questions, they had no opportunity to test their own knowledge and revise weak areas. This could account for the difference in performance on parts A and B of the assessment test. So, while content appears to be an important factor, it may be closely connected to feedback, with both factors instrumental in prompting students to restructure and organize their knowledge.

The practice test could also have a beneficial effect by familiarizing students with parts of the process, such as the test format or the CML system itself. However, this seems a less likely explanation for the improvement, since any such effect would be expected to be consistent across parts A and B of the unit assessment test, and this was not the case.

Analysis of the anxiety data showed no relationship between anxiety and performance for the Psychology group who had a compulsory practice test or for those Accounting students who sat the optional practice test. However, a small relationship was found for the Accounting group who chose not to sit the optional practice test. The influence of anxiety needs further investigation.

Conclusion

The results of the studies suggest that the practice test makes a difference, and that prior exposure to content helps to improve student performance on subsequent CML tests. Anxiety appears to have a small effect on CML test performance in one specific case.

The reason for the improvement on content covered in the practice test is not certain, but two possible explanations are, first, that students use the feedback they receive on incorrect responses to assist their learning of the topic and, second, that increased familiarity with the CML system may contribute to better performance the next time.

Bibliography

Birenbaum, M., & Tatsuoka, K. K. (1987). Open-ended versus multiple-choice response formats - it does make a difference for diagnostic purposes. Applied Psychological Measurement, 11, 385-395.

Sly, L., & Western, D. (1998). Practice tests improve student performance on computer managed learning assessments. In B. Black & N. Stanley (Eds), Teaching and Learning in Changing Times, 310-312. Proceedings of the 7th Annual Teaching Learning Forum, The University of Western Australia, February 1998. Perth: UWA. http://cleo.murdoch.edu.au/asu/pubs/tlf/tlf98/sly.html

Sly, L. (in press). Practice tests as formative assessment improve student performance on computer managed learning assessments. Assessment and Evaluation in Higher Education.

Spielberger, C.D. (1983). State-Trait Anxiety Inventory for Adults (Form Y). Mind Garden.

Please cite as: Sly, L. (1999). Computer testing options and benefits of a practice test. In K. Martin, N. Stanley and N. Davison (Eds), Teaching in the Disciplines/ Learning in Context, 387-390. Proceedings of the 8th Annual Teaching Learning Forum, The University of Western Australia, February 1999. Perth: UWA. http://lsn.curtin.edu.au/tlf/tlf1999/sly.html

