This paper contains a brief review of some CAL initiatives in the UK, with a focus on engineering and assessment. The objectives, funding and outcomes of the Australian (CAUT) programme are compared with those of the UK TLTP programme. In particular, the use of computers in assessment and diagnosis is considered. Such assessment may be formative, may evaluate process as well as product, and may be used to identify students at risk because of deficient study skills.
In contrast, the equivalent activity in the UK is significantly different. The TLTP (Teaching and Learning Technology Programme) commenced in 1992, and its first phase had funds of £7.5 million a year over three years. The stated aim was to "make teaching and learning more productive and efficient by harnessing modern technology". One of the largest grants was £1,000,000 over three years to a consortium led by the University of Birmingham for a project to develop mathematics modules, and several other projects were given similar funding (CTISS, 1993a). Forty-three projects were funded, with first-year funding ranging from £35,000 to £435,000. A second phase of funding (CTISS, 1993b) has increased the number of projects to seventy-six. It is evident that each project has significantly more funding than its Australian equivalent, and for three years. The main motivation appears to come from the reduction in funds available for education and the consequent need to become more efficient. The projects are also usually awarded to consortia; this is similar to the way in which the National Science Foundation in the USA ensures that any new material will be used on more than one campus. Consortium members have to agree in advance that they will all use the material produced.
During the latter part of 1994 the author was able to visit many of the universities involved in the UK TLTP programme (26 person/project visits). The aim of this "study tour" was to investigate computer aided assessment and become familiar with the engineering, mathematics and physics materials being produced by the TLTP. This study tour was funded by the University of Western Australia from its 1994 "Quality Money" and arose because of the author's interest in computer aided learning and computer based assessment (Devenish et al, 1993; Devenish et al, 1994a and 1994b; Lyons et al, 1994; Lwin et al, 1994; Scott et al, 1994a and 1994b; Stone, 1994).
Before focusing on assessment it is appropriate to comment on the quality of the teaching material being produced. A general conclusion was that many of those who had not previously prepared any computer aided teaching material were making heavy weather of their projects. However, the few who had several years' experience were well advanced and producing high quality material. There also seemed to be some unnecessary constraints imposed by the need to have the software used on several campuses and on both PC and Macintosh computers. As an example, the QUEST (QUality in Engineering through Simulation Technology) project, involving nine campuses, has chosen to write all its simulation programs in LabVIEW because it is transportable across platforms. This means that on each campus there is a significant learning curve with respect to LabVIEW before software may be produced. Notwithstanding this, the project claims some significant productivity and efficiency gains:
"In January 1993, a 10-week inter-departmental simulation-based course in Solids & Materials at the University of Surrey, delivered to over 80 engineering students, demonstrated gains over traditional methods amounting to a 23% reduction in student contact time and an even greater 50% reduction in staff delivery requirement. Further, the associated shift from passive lectures to tutor-supported computer-based workshops represented a real enhancement in the quality of provision" (Cartwright, 1993).

When it comes to assessment, much excellent work has been published in the UK, for example Gibbs and Habershaw (1990), Ellison (1992) and Gibbs (1992). It was therefore expected that assessment would be a major thrust of TLTP projects, and many of the projects did include some multiple choice questions. However, these questions were embedded in software that never changed and were subject to the normal limitations of this form of assessment. The author was looking for more advanced and helpful approaches using diagnostic assessment (see Devenish et al, 1994a), and there were only a few of these. However, two projects that at first sight did not appear relevant were to prove very interesting in providing a different perspective on assessment.
This is significantly different from the objective of the work being undertaken by the author and his colleagues, which aims to use an Intelligent Computer Tutor to allow students to assess their own understanding. The questions are not multiple choice, and an incorrect answer produces a suggestion of the possible misunderstanding that produced it. Immediate feedback is thus given and appropriate help provided. The assessment with immediate diagnosis thus becomes an integral part of the learning process, i.e. it is formative assessment.
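The idea can be illustrated with a minimal sketch, assuming numeric answers; the function names, the spring question and the distractor values here are hypothetical, not taken from the author's tutor. Each anticipated wrong answer is linked to the misconception that commonly produces it, so the feedback is diagnostic rather than a bare right/wrong.

```python
# Minimal sketch of diagnostic (formative) assessment -- a hypothetical
# illustration, not the author's Intelligent Computer Tutor. Each
# anticipated wrong answer is mapped to the misconception that
# typically produces it.

def diagnose(question, answer, tol=1e-6):
    """Return diagnostic feedback for a numeric answer."""
    if abs(answer - question["correct"]) < tol:
        return "Correct."
    for wrong, misconception in question["distractors"].items():
        if abs(answer - wrong) < tol:
            return "Incorrect: " + misconception
    return "Incorrect: no specific diagnosis recorded for this answer."

# Hypothetical dynamics question: natural frequency of a 4 kg mass on a
# 100 N/m spring; the correct answer is sqrt(k/m) = 5 rad/s.
question = {
    "correct": 5.0,
    "distractors": {
        25.0: "you appear to have used k/m instead of sqrt(k/m).",
        0.2: "you appear to have used sqrt(m/k); the ratio is inverted.",
    },
}
```

Because each distractor carries its own explanation, an incorrect student receives an immediate suggestion of the likely misunderstanding, which is the essence of the formative approach described above.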
The most advanced TLTP project using diagnostic assessment is called DIAGNOSYS, a computer based test for basic mathematics skills (Appleby, 1994). A prototype was used in the autumn of 1993 involving 650 students at three of the north-east universities, and the current version is intended for release in the spring of 1995. The main objective is to test large numbers of students and provide a qualitative assessment of basic mathematical skills. The test runs under DOS 3.0, with a time limit if requested, and various question styles are available, including multiple choice and algebraic entry. A results file is produced, and four processing programs are used to produce: profiles of an individual student and of a group of students; a ranking of a group of students; and a text file suitable for use with a spreadsheet package. The individual profile includes the list of skills which the student is judged to understand; the skills judged not to be understood; skills not tested due to the time limit; and directions to resources for assistance. Similar information on a group can be obtained. It is anticipated that the package will be particularly useful where the quality of the incoming students is below average and it is important to diagnose any deficiencies in knowledge and understanding as early as possible.
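The profiling step can be sketched as follows; this is a hypothetical reconstruction of the style of processing described above, not DIAGNOSYS code, and all names and skill labels are invented. From a results file recording which tested skills were answered correctly, the skills are partitioned into understood, not understood and untested.

```python
# Hypothetical sketch of skill profiling in the style described for
# DIAGNOSYS (not its actual code): partition skills into understood,
# not understood, and untested (e.g. skipped due to the time limit).

def skill_profile(results, all_skills):
    """results maps each *tested* skill to True/False (judged
    understood or not); skills absent from results were untested."""
    return {
        "understood": sorted(s for s, ok in results.items() if ok),
        "not_understood": sorted(s for s, ok in results.items() if not ok),
        "untested": sorted(set(all_skills) - set(results)),
    }

def group_profile(students, skill):
    """Fraction of tested students judged to understand one skill,
    or None if nobody in the group was tested on it."""
    tested = [r[skill] for r in students if skill in r]
    return sum(tested) / len(tested) if tested else None
```

The same per-student dictionaries can be pooled to give the group profile and ranking mentioned in the text, and written out as delimited text for a spreadsheet package.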
This is achieved in various ways, but essentially all keystrokes are recorded for any assigned task; thus the process by which a student achieves the product is known. The current version of the software package, called MacCATS, typically uses one window with a model document and another with the candidate's document, which is to be edited into the same form as the model. The model and candidate files include all the information about each character, so an overall count of differences may be made and the product assessed. For process assessment it is possible to count the various keystrokes made and return comparative statistics using multiple correct solution paths; this was (September 1994) in the early stages of implementation. One of the perceived problems is what they call "noise", for example a typing error which is subsequently noted and corrected: the product is correct but the process is affected by this noise. Sophisticated methods are being investigated to distinguish such noise from a genuine process fault. The interested reader is strongly advised to read the paper by Dowsing et al (1995).
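The product/process distinction can be sketched as follows; this is a hypothetical reconstruction, not MacCATS code. The product is scored by counting character differences between candidate and model documents (here a standard edit distance), while the process is scored by comparing the keystrokes actually used against the length of a minimal correct solution path.

```python
# Hypothetical sketch of product vs process assessment (not MacCATS
# code). Product: count character differences between candidate and
# model (Levenshtein edit distance). Process: compare keystrokes used
# against a minimal correct solution path.

def product_differences(model, candidate):
    """Minimum number of single-character edits between the texts."""
    prev = list(range(len(candidate) + 1))
    for i, mc in enumerate(model, 1):
        cur = [i]
        for j, cc in enumerate(candidate, 1):
            cur.append(min(prev[j] + 1,                 # delete
                           cur[j - 1] + 1,              # insert
                           prev[j - 1] + (mc != cc)))   # substitute
        prev = cur
    return prev[-1]

def process_stats(keystrokes, minimal_path_length):
    """Surplus strokes may be genuine process faults or mere 'noise'
    (a typo immediately corrected, leaving the product intact)."""
    return {"used": len(keystrokes),
            "surplus": len(keystrokes) - minimal_path_length}
```

A correct product can thus coexist with a non-zero surplus of keystrokes, which is exactly the "noise" problem the project was investigating: deciding whether surplus strokes reflect a corrected slip or a fault in the process itself.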
The first is a set of cards constituting a questionnaire (a sheet of the questions is also available, the responses then having to be entered into the computer subsequently). The questionnaire is used with first year students after their first five weeks and is aimed at giving an early warning of problems, rather than waiting until the end of semester examinations. The questions are carefully designed, as described by Entwistle and others (Entwistle & Ramsden, 1983; Entwistle & Tait, 1990). As an aside, it has been found that just filling in the questionnaire has the benefit of making students think, and an important outcome is that students realise that they are not unique in having difficulties.
The second package takes the results of the questionnaires and produces an output called "Student View" for staff. This gives 3D representations of the class profile with respect to Deep, Surface and Strategic approaches to learning, and may be used to show all the students, a group or an individual; information can also be presented as 2D graphs. Students can also enter comments, and the lecturer can immediately observe class or individual difficulties and take appropriate action.
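The aggregation behind such a display can be sketched as follows; this is a hypothetical illustration, not the Edinburgh code, and the scoring scale is invented. Each student's questionnaire yields a score on the Deep, Surface and Strategic dimensions named above, and the class (or subgroup) profile is the per-dimension summary.

```python
# Hypothetical sketch of aggregating questionnaire scores into a class
# profile on the Deep/Surface/Strategic dimensions named in the text
# (not the actual "Student View" code).

DIMENSIONS = ("deep", "surface", "strategic")

def class_profile(students):
    """Mean score per dimension over a class or any subgroup of it."""
    n = len(students)
    return {d: sum(s[d] for s in students) / n for d in DIMENSIONS}

# Two invented students on an arbitrary 1-5 scale:
class_scores = class_profile([
    {"deep": 4.0, "surface": 2.0, "strategic": 3.0},
    {"deep": 2.0, "surface": 4.0, "strategic": 3.0},
])
```

The same function applied to a single-element list gives an individual profile, mirroring the all-class/group/individual views described above.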
The third package is a HyperCard-based study skills advisor called "Student Advisor". Each student can import their own survey results and obtain specific help. The Student Advisor has 400 cards of appropriate advice, and students can note selected cards for subsequent printing.
It is also important to avoid being short-sighted and assessing only after misunderstandings have been acquired. The work at Edinburgh, in that it assesses whether study skills are adequate, is potentially very significant. We need to know our class profile near the commencement of a course so that we may modify our teaching accordingly and provide help to students at risk.
The author has benefited greatly from the interactions with academics in the UK and is currently implementing formative assessment in first year engineering at the University of Western Australia.
Cartwright, A. (1993). QUEST newsheet November 1993, available from Dr. Tony Cartwright, Centre for Educational Technology, University of Surrey.
CTISS (1993a). Report on 43 Projects funded by the UFC and DENI under the Teaching and Learning Technology Programme. Published by CTISS on behalf of the HEFCE in conjunction with SHEFC, HEFCW and DENI. Available from External Relations Team, HEFCE, Northavon House, Coldharbour Lane, Bristol BS16 1QD.
CTISS (1993b). A report on 33 additional projects, Phase II, funded by HEFCE, SHEFC, HEFCW and DENI. Available from External Relations Team, HEFCE, Northavon House, Coldharbour Lane, Bristol BS16 1QD.
Devenish, D. G., Entwistle, R. D. and Stone, B. J. (1993). An interactive course in Engineering Dynamics. Teaching Forum, Sharing Quality Practice, Curtin University of Technology, February 1993, pp311-319.
Devenish, D. G., Entwistle, R. D., Scott, N. and Stone, B. J. (1994a). Computer-Based Assessment In Engineering Teaching. Australasian Association for Engineering Education Conference, Sydney. pp682-686.
Devenish, D. G., Entwistle, R. D., Scott, N. and Stone, B. J. (1994b). A Computer Package For Teaching Curvilinear Motion. Australasian Association for Engineering Education Conference, Sydney. pp718-721.
Dowsing, R. D., Long, S. and Sleep, M. R. (1995). An interactive system for assessing word processing skills. To be published.
Ellison, E. G. (1992). Assessment Methods in Engineering Degree Courses. Discussion Document prepared by the (UK) Engineering Professors' Conference, No. 5, December 1992.
Entwistle, N. J. and Ramsden, P. (1983). Understanding Student Learning. London: Croom Helm.
Entwistle, N. J. and Tait, H. (1990). Approaches to learning, evaluations of teaching, and preferences for contrasting environments. Higher Education, 19, 169-194.
Gibbs, G. and Habershaw, T. (1990). An introduction to assessment. Induction pack IV. Published by The Standing Conference on Educational Development, Birmingham UK.
Gibbs, G. (1992). Assessing more students. No. 4 in Series from the Teaching More Students Project, The Polytechnics and Colleges Funding Council.
Lwin, D. T., Chua, P. and Stone, B. J. (1994). A Computer Assisted Learning Package for Sequential Control. IEEE First International Conference on Multi-Media Engineering Education, The University of Melbourne, pp256-262.
Lyons, M., Li, X., Scott, N. and Stone, B. J. (1994). A Student Controlled Vibration Laboratory. Australasian Association for Engineering Education Conference, Sydney. pp385-388.
Scott, N., Devenish, D. G., Entwistle, R. D. and Stone, B. J. (1994a). Dynamic Teaching Solutions. IEEE First International Conference on Multi-Media Engineering Education, The University of Melbourne, pp419-427.
Scott, N., Devenish, D. G., Entwistle, R. D. and Stone, B. J. (1994b). Computer-based error detection in engineering dynamics education. Proceedings ASEE '94 Conference. Edmonton, Canada, pp2481-2485.
Stone, B. J. (1994). Experiences in the Use of Computers for Teaching. Australasian Association for Engineering Education Conference, Sydney. pp511-516.
Please cite as: Stone, B. J. (1995). A review of some CAL initiatives in the UK (with particular reference to assessment). In Summers, L. (Ed), A Focus on Learning, p245-249. Proceedings of the 4th Annual Teaching Learning Forum, Edith Cowan University, February 1995. Perth: Edith Cowan University. http://lsn.curtin.edu.au/tlf/tlf1995/stone.html