Teaching and Learning Forum 95

A review of some CAL initiatives in the UK

(with particular reference to assessment)

Brian Stone
Department of Mechanical and Materials Engineering
The University of Western Australia
This paper contains a brief review of some CAL initiatives in the UK with a focus on engineering and assessment. A comparison is made of the objectives, funding and outcomes of the Australian (CAUT) programme and the UK TLTP programme. In particular the use of computers in assessment and diagnosis is considered. This assessment may be formative, evaluate process as well as product and may be used to assess students at risk from deficient study skills.


Computers are being used more and more in education, and some governments are committing large sums of money to enable computer aided learning (CAL) materials to be written. The declared objectives vary from country to country. In Australia the Committee for the Advancement of University Teaching (CAUT) has, since 1993, been awarding approximately $4m per annum in grants of up to $50,000 (approximately 80 grants awarded each year). The main objective for CAUT has been the improvement of university teaching. It has specifically not supported teaching research but has looked for "products" as outcomes. The fourth round of applications for CAUT is about to commence.

In contrast, the equivalent activity in the UK is significantly different. The TLTP (Teaching and Learning Technology Programme) commenced in 1992 and the first phase had funds of £7.5 million a year over three years. The stated aim was to "make teaching and learning more productive and efficient by harnessing modern technology". One of the largest grants was for £1,000,000 over three years to a consortium led by the University of Birmingham for a project to develop mathematics modules. There were several other projects given similar funding (CTISS, 1993a). Forty-three projects were funded, with first year funding ranging from £35,000 to £435,000. A second phase of funding (CTISS, 1993b) has increased the number of projects to seventy-six. From the above it is evident that each project has significantly more funding than the Australian equivalent and is for three years. The main motivation appears to come from the reduction in funds available for education and the need to become more efficient. The projects are also usually awarded to consortia; this is similar to the way in which the National Science Foundation in the USA ensures that any new material will be used on more than one campus. Consortium members have to agree in advance that they will all use the material produced.

During the latter part of 1994 the author was able to visit many of the universities involved in the UK TLTP programme (26 person/project visits). The aim of this "study tour" was to investigate computer aided assessment and become familiar with the engineering, mathematics and physics materials being produced by the TLTP. This study tour was funded by the University of Western Australia from its 1994 "Quality Money" and arose because of the author's interest in computer aided learning and computer based assessment (Devenish et al, 1993; Devenish et al, 1994a and 1994b; Lyons et al, 1994; Lwin et al, 1994; Scott et al, 1994a and 1994b; Stone, 1994).

Before focusing on assessment it is appropriate to comment on the quality of the teaching material being produced. A general conclusion was that many of those who had not previously prepared any computer aided teaching material were making heavy weather of their projects. However, the few who had several years' experience were well advanced and producing high quality material. There also seemed to be some unnecessary constraints imposed by the need to have the software used on several campuses and on both PC and Macintosh computers. As an example, the QUEST (QUality in Engineering through Simulation Technology) project, involving nine campuses, has chosen to write all its simulation programs in LabVIEW because it is transportable across platforms. This means that on each campus there is a significant learning curve with respect to LabVIEW before software may be produced. Notwithstanding this, the project claims some significant productivity and efficiency gains.

"In January 1993, a 10-week inter-departmental simulation-based course in Solids & Materials at the University of Surrey, delivered to over 80 engineering students, demonstrated gains over traditional methods amounting to a 23% reduction in student contact time and an even greater 50% reduction in staff delivery requirement. Further, the associated shift from passive lectures to tutor-supported computer-based workshops represented a real enhancement in the quality of provision" (Cartwright, 1993).
When it comes to assessment, much excellent work has been published in the UK, for example Gibbs and Habershaw, 1990; Ellison, 1992; and Gibbs, 1992. It was therefore expected that assessment would be a major thrust of TLTP projects, and many of the projects did include some multiple choice questions. However, these questions were embedded in software that never changed and were subject to the normal limitations of this form of assessment. The author was looking for more advanced and helpful approaches using diagnostic assessment (see Devenish et al, 1994a) and there were only a few of these. However, there were two projects that at first sight did not appear relevant but were to prove very interesting in providing a different perspective on assessment.

Diagnostic assessment

During the course of the visits it was apparent that there was a wide range of understanding of the phrase "diagnostic assessment". The most common understanding was that the diagnosis resulting from the assessment would indicate whether a student did or did not understand particular topics. It was not expected that any indication would be given of the misunderstandings that cause the incorrect answer.

This is significantly different from the objective of the work being undertaken by the author and his colleagues, which aims to use an Intelligent Computer Tutor to allow students to assess their own understanding. The questions are not multiple choice, and an incorrect answer produces a suggestion of the possible misunderstanding that produced it. Immediate feedback is thus given and appropriate help provided. The assessment with immediate diagnosis thus becomes an integral part of the learning process; that is, it is formative assessment.
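The diagnostic step described above can be illustrated with a minimal sketch. The question, the anticipated wrong answers and the feedback messages here are all hypothetical; the point is only that each anticipated wrong answer is pre-computed from a known misunderstanding, so an incorrect response can be matched to its probable cause.

```python
# Minimal sketch of misconception-based diagnosis (hypothetical example).
# Each anticipated wrong answer is pre-computed from a known misunderstanding.

def diagnose(student_answer, correct, misconceptions, tol=1e-3):
    """Return feedback: confirmation, a diagnosed misunderstanding,
    or a generic prompt when the cause is not recognised."""
    if abs(student_answer - correct) < tol:
        return "Correct."
    for wrong_value, explanation in misconceptions:
        if abs(student_answer - wrong_value) < tol:
            return "Check this: " + explanation
    return "Incorrect, but the cause is not recognised. Try again."

# Hypothetical question: v = u + a*t with u = 2, a = 3, t = 4
correct = 2 + 3 * 4          # 14
misconceptions = [
    ((2 + 3) * 4, "you may have added u and a before multiplying by t"),
    (3 * 4, "you may have omitted the initial velocity u"),
]

print(diagnose(20.0, correct, misconceptions))  # diagnoses the precedence error
```

The essential design point is that feedback is immediate and specific: the student learns not just that the answer is wrong, but what misunderstanding probably produced it.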

The most advanced TLTP project using diagnostic assessment is called DIAGNOSYS, a computer based test of basic mathematics skills (Appleby, 1994). A prototype was used in the autumn of 1993 with 650 students at three of the north-east universities. The current version is intended for release in the spring of 1995. The main objective is to be able to test large numbers of students and provide a qualitative assessment of basic mathematical skills. The test runs under DOS 3.0, with a time limit if requested. Various question styles are available, including multiple choice, algebraic entry, etc. A results file is produced and four processing programs are used to produce: profiles of individual students and of a group of students; a ranking of a group of students; and a text file suitable for use with a spreadsheet package. The individual profile includes the list of skills which the student is judged to understand; the skills judged not to be understood; skills not tested due to the time limit; and directions to resources for assistance. Similar information on a group can be obtained. It is anticipated that the package will be particularly useful where the quality of the incoming students is below average and it is important to diagnose any deficiencies in knowledge and understanding as early as possible.
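The kind of individual profile described above can be sketched in a few lines. The skill names and the results format below are invented for illustration and are not the actual DIAGNOSYS file format.

```python
# Illustrative sketch of turning test results into an individual skill
# profile of the kind DIAGNOSYS is described as producing.
# Skill names and the results representation are hypothetical.

def profile(results, all_skills):
    """results maps skill -> True (answered correctly) or False;
    skills absent from results were not reached before the time limit."""
    understood = [s for s in all_skills if results.get(s) is True]
    not_understood = [s for s in all_skills if results.get(s) is False]
    untested = [s for s in all_skills if s not in results]
    return understood, not_understood, untested

all_skills = ["fractions", "indices", "linear equations", "trigonometry"]
results = {"fractions": True, "indices": False, "linear equations": True}
print(profile(results, all_skills))
# understood: fractions, linear equations; not understood: indices;
# untested (time limit): trigonometry
```

A group profile is then just an aggregation of these individual lists, which is what makes the approach practical for large classes.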

Assessment of transferable skills

One of the unexpected bonuses of the study tour was the visit to the University of East Anglia and the project "Computer aided Assessment of Transferable Skills (CATS)". This is a single institution project under the leadership of Dr Roy Dowsing (see Dowsing et al, 1995). The transferable skill being assessed is word processing, and the motivation came from the large number of students being taught and the desire to give them a formal qualification. This project was of great interest to the author as it was the only one found to be considering process as well as product. Most assessment software examines an answer or answers (the product) and is not able to examine the method and steps used (the process). At East Anglia a very well informed project team (see Dowsing et al, 1995, for an excellent list of relevant literature) is making substantial progress towards assessing process as well as product skills in word processing.

This is achieved in various ways, but essentially all keystrokes are recorded for any assigned task. Thus the process by which a student achieves the product is known. The current version of the software package, called MacCATS, typically uses a window with a model document and another with the candidate's document, which is required to be edited to the same form as the model. The model and candidate files include all the information about each character. An overall count of differences may be made and the product assessed. When it comes to process assessment, it is possible to count the various keystrokes made and return comparative statistics using multiple correct solution paths. This process is (as at September 1994) in the early stages of implementation. One of the perceived problems is what the team call "noise": for example, a typing error which is subsequently noticed and corrected. The product is correct but the process is affected by this noise. Sophisticated methods are being investigated to distinguish this noise from a process fault. The interested reader is strongly advised to read the paper by Dowsing et al (1995).
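The product/process distinction above can be made concrete with a minimal sketch. This is not the MacCATS implementation: the scoring functions, the single "minimal solution path" and the backspace representation are all assumptions for illustration.

```python
# Illustrative sketch (not MacCATS): product quality as remaining
# differences from the model document; process quality as keystrokes
# used relative to an assumed minimal solution path.

def product_score(candidate, model):
    """Count character positions at which the candidate differs from
    the model, plus any length difference. Zero means a perfect product."""
    diffs = sum(1 for a, b in zip(candidate, model) if a != b)
    return diffs + abs(len(candidate) - len(model))

def process_score(keystrokes, minimal_keystrokes):
    """Ratio of keystrokes actually used to the fewest needed; values
    well above 1.0 suggest an inefficient process, or 'noise' such as
    a typing error that was later corrected."""
    return len(keystrokes) / minimal_keystrokes

model = "The quick brown fox"
candidate = "The quick brown fox"   # final product matches the model
# The student typed "browm", backspaced once, then finished correctly:
keystrokes = list("The quick browm") + ["BS"] + list("n fox")

print(product_score(candidate, model))          # 0: the product is correct
print(process_score(keystrokes, len(model)))    # > 1.0: 'noise' in the process
```

Even this toy version shows the "noise" problem the team describe: the corrected typo leaves the product perfect while inflating the process measure, and distinguishing such noise from a genuine process fault is the hard part.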

Causes of misconceptions

One of the earliest projects visited was also to prove one of the most interesting. The project being undertaken at the Centre for Research into Learning and Instruction at the University of Edinburgh is entitled "Identifying and advising students at risk from deficient study skills: a computer based package for departments". This was particularly appropriate as it starts one stage back from the assessment of knowledge and understanding gained on a particular course. The objective is to test students to determine if they have weaknesses in a range of basic skills, such as taking notes in lectures and making proper use of tutorials, laboratories, etc. Unlike all the other projects visited, most of the software has been written by a professional programmer. Three packages using HyperCard on the Macintosh are available and are appropriate for the whole range of subjects taught in first year at university.

The first is a set of cards constituting a questionnaire (a sheet of the questions is also available, with the responses then entered into the computer subsequently). This questionnaire is used with first year students after their first five weeks and is aimed at giving an early warning of problems, rather than waiting until the end of semester examinations. The questions are carefully designed, as described by Entwistle and others (Entwistle & Ramsden, 1983; Entwistle & Tait, 1990). As an aside, it is important to note that just filling in the questionnaire has the benefit of making students think, and an important outcome is that students realise that they are not unique in having difficulties.

The second package takes the results of the questionnaires and produces an output called "Student View" for staff. This produces 3D representations of the class profile with respect to Deep, Surface and Strategic approaches to learning. This may be used to show all the students, a group, or an individual, and information can also be presented as 2D graphs. Students can also enter comments, and the lecturer can immediately observe class or individual difficulties and take appropriate action.
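The underlying aggregation can be sketched simply. The item-to-scale mapping, the item names and the 1-5 response scale below are assumptions for illustration; the actual instrument is the carefully validated one described by Entwistle and colleagues.

```python
# Hypothetical sketch: averaging questionnaire responses into the
# Deep / Surface / Strategic profile that "Student View" displays.
# Item names, scale groupings and the 1-5 rating scale are invented.

SCALES = {
    "Deep": ["relates ideas", "looks for meaning"],
    "Surface": ["memorises facts", "fears failure"],
    "Strategic": ["organises time", "targets assessment"],
}

def student_profile(responses):
    """responses maps item name -> rating on a 1-5 scale."""
    return {scale: sum(responses[item] for item in items) / len(items)
            for scale, items in SCALES.items()}

def class_profile(all_responses):
    """Average the individual profiles to give the class profile."""
    profiles = [student_profile(r) for r in all_responses]
    return {scale: sum(p[scale] for p in profiles) / len(profiles)
            for scale in SCALES}

responses = {"relates ideas": 5, "looks for meaning": 4,
             "memorises facts": 2, "fears failure": 1,
             "organises time": 3, "targets assessment": 3}
print(student_profile(responses))  # high Deep, low Surface, middling Strategic
```

A class profile built this way is exactly the kind of early-warning summary the paper argues for: it can be inspected before misunderstandings are acquired, not after.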

The third package is a HyperCard study skills based advisor called "Student Advisor". Each student can import their own survey results and get specific help. The Student Advisor has 400 cards with appropriate advice. Students can note selected cards for subsequent printing.


Conclusions

The use of computers in teaching, learning and assessment is sure to increase with the amount of funding being provided for the development of software. As pressure on education funding increases, it seems inevitable that some teaching functions will, to a certain degree, be replaced by computers. The area that appears to have the greatest potential is assessment, particularly formative assessment utilising immediate feedback. This assessment should be designed to test for process as well as product. In this respect the work at East Anglia is potentially very important.

It is also important to avoid being short-sighted and assessing only after misunderstandings have been acquired. The work at Edinburgh, in that it assesses whether study skills are adequate, is potentially very significant. We need to know our class profile near the commencement of a course so that we may modify our teaching accordingly and provide help to students at risk.

The author has benefited greatly from the interactions with academics in the UK and is currently implementing formative assessment in first year engineering at the University of Western Australia.


Acknowledgements

The author wishes to thank the University of Western Australia for funding the study tour of the UK; Sarah Turpin, the Coordinator of the TLTP programme, for assistance in arranging visits; the many academics in the UK who were hospitable and informative hosts; CAUT for a grant allowing the investigation of diagnostic assessment; David Devenish and Rodney Entwistle for their collaboration on the CAUT grants; and, finally, Nathan Scott, who turns ideas into realities.


References

Appleby, J. (1994). The DIAGNOSYS System, a product of the North-East Mathematics Project. Available from Dr J. Appleby, Department of Engineering Mathematics, Newcastle University.

Cartwright, A. (1993). QUEST newsheet November 1993, available from Dr. Tony Cartwright, Centre for Educational Technology, University of Surrey.

CTISS (1993a). Report on 43 Projects funded by the UFC and DENI under the Teaching and Learning Technology Programme. Published by CTISS on behalf of the HEFCE in conjunction with SHEFC, HEFCW and DENI. Available from External Relations Team, HEFCE, Northavon House, Coldharbour Lane, Bristol BS16 1QD.

CTISS (1993b). A report on 33 additional projects, Phase II, funded by HEFCE, SHEFC, HEFCW and DENI. Available from External Relations Team, HEFCE, Northavon House, Coldharbour Lane, Bristol BS16 1QD.

Devenish, D. G., Entwistle, R. D. and Stone, B. J. (1993). An interactive course in Engineering Dynamics. Teaching Forum, Sharing Quality Practice, Curtin University of Technology, February 1993, pp311-319.

Devenish, D. G., Entwistle, R. D., Scott, N. and Stone, B. J. (1994a). Computer-Based Assessment In Engineering Teaching. Australasian Association for Engineering Education Conference, Sydney. pp682-686.

Devenish, D. G., Entwistle, R. D., Scott, N. and Stone, B. J. (1994b). A Computer Package For Teaching Curvilinear Motion. Australasian Association for Engineering Education Conference, Sydney. pp718-721.

Dowsing, R. D., Long, S. and Sleep, M. R. (1995). An interactive system for assessing word processing skills. To be published.

Ellison, E. G. (1992). Assessment Methods in Engineering Degree Courses. Discussion Document prepared by the (UK) Engineering Professors' Conference, No. 5, December 1992.

Entwistle, N. J. and Ramsden, P. (1983). Understanding Student Learning. London: Croom Helm.

Entwistle, N. J. and Tait, H. (1990). Approaches to learning, evaluations of teaching, and preferences for contrasting environments. Higher Education, 19, 169-194.

Gibbs, G. and Habershaw, T. (1990). An introduction to assessment. Induction pack IV. Published by The Standing Conference on Educational Development, Birmingham UK.

Gibbs, G. (1992). Assessing more students. No. 4 in Series from the Teaching More Students Project, The Polytechnics and Colleges Funding Council.

Lwin, D. T., Chua, P. and Stone, B. J. (1994). A Computer Assisted Learning Package for Sequential Control. IEEE First International Conference on Multi-Media Engineering Education, The University of Melbourne, pp256-262.

Lyons, M., Li, X., Scott, N. and Stone, B. J. (1994). A Student Controlled Vibration Laboratory. Australasian Association for Engineering Education Conference, Sydney. pp385-388.

Scott, N., Devenish, D. G., Entwistle, R. D. and Stone, B. J. (1994a). Dynamic Teaching Solutions. IEEE First International Conference on Multi-Media Engineering Education, The University of Melbourne, pp419-427.

Scott, N., Devenish, D. G., Entwistle, R. D. and Stone, B. J. (1994b). Computer-based error detection in engineering dynamics education. Proceedings ASEE '94 Conference. Edmonton, Canada, pp2481-2485.

Stone, B. J. (1994). Experiences in the Use of Computers for Teaching. Australasian Association for Engineering Education Conference, Sydney. pp511-516.

Please cite as: Stone, B. J. (1995). A review of some CAL initiatives in the UK (with particular reference to assessment). In Summers, L. (Ed), A Focus on Learning, p245-249. Proceedings of the 4th Annual Teaching Learning Forum, Edith Cowan University, February 1995. Perth: Edith Cowan University. http://lsn.curtin.edu.au/tlf/tlf1995/stone.html
