Since March 1995 all of our tutorial classes in Engineering Dynamics have been held in large computer rooms, using a monitored, diagnostic, online tutorial system. The form of this system has changed over the years but the basic ideas have remained the same:
- We set the students a long sequence of carefully chosen problems;
- Each student has slightly different numerical parameters for the problems;
- Students must enter answers of the form "Number Units", e.g. 3.2 m/s;
- The software records all student activity and keeps staff informed about the state of the class;
- As far as possible we provide diagnostic feedback when an incorrect answer is entered;
- There is an online messaging system attached to each problem.
We have published widely on this system and it is now attracting attention from around the world; the idea has also been taken up and used in several other courses. In this demonstration session we invite you to visit a large teaching space at UWA and try the software for yourself.
Venue: The Maths Computing Laboratory, first floor, Mathematics Department
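To make the "Number Units" answer format concrete, here is a minimal sketch (not the actual UWA code; the tolerance, parameter ranges, and seeding scheme are assumptions) of how such an answer might be parsed and checked, and how each student could receive slightly different numerical parameters:

```python
import random


def check_answer(submitted: str, expected_value: float, expected_units: str,
                 rel_tol: float = 0.01) -> bool:
    """Return True if an answer like "3.2 m/s" has the right units
    and is within a relative tolerance of the expected value."""
    parts = submitted.strip().split(maxsplit=1)
    if len(parts) != 2:
        return False  # must be "Number Units", separated by a space
    number_text, units = parts
    try:
        value = float(number_text)
    except ValueError:
        return False
    if units.strip() != expected_units:
        return False
    return abs(value - expected_value) <= rel_tol * abs(expected_value)


def expected_speed(student_id: int) -> float:
    """Hypothetical per-student parameters: seeding the generator with the
    student's ID makes each student's numbers different but reproducible."""
    rng = random.Random(student_id)
    radius = rng.uniform(0.5, 1.5)   # m
    omega = rng.uniform(2.0, 4.0)    # rad/s
    return radius * omega            # v = r * omega, in m/s
```

Seeding by student ID means the server need not store each student's parameters; they can be regenerated on demand.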
Figure 1: (a) A screen shot of the login page for Dynamics 100; (b) A typical engineering problem presented by the system, showing the numerical answer.
In Dynamics 100 students had to complete about 200 problems, of which 37 were assessed. The problems were divided into 18 problem sets. Students had to complete the non-assessed problems in each set before they could attempt the assessed problems, and each set had to be completed before the next set could be started.
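The gating rules just described can be sketched as follows (the data model is assumed, not taken from the actual system): a problem may be attempted only if all earlier sets are complete and, for an assessed problem, all non-assessed problems in its own set are complete.

```python
def may_attempt(problem, completed, problem_sets):
    """problem = (set_index, is_assessed, problem_id);
    completed = set of completed problem ids;
    problem_sets = list of sets, each a list of such problem tuples."""
    set_index, is_assessed, _ = problem
    # Rule 1: every earlier set must be fully complete.
    for earlier in problem_sets[:set_index]:
        if any(pid not in completed for (_, _, pid) in earlier):
            return False
    # Rule 2: assessed problems require all non-assessed
    # problems in the same set to be complete first.
    if is_assessed:
        for (_, assessed, pid) in problem_sets[set_index]:
            if not assessed and pid not in completed:
                return False
    return True
```

For example, with two sets where set 0 holds a non-assessed problem "a1" and an assessed problem "a2", "a2" is locked until "a1" is done, and set 1 is locked until both are done.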
The introduction of this kind of computer-based tutorial led to a new kind of classroom culture. Students worked in a large computer room in the Mathematics Department (although the system could equally be used from other sites). The regular deadlines, along with the 20% credit for the assignments, meant that students worked consistently throughout the year. In turn this meant that students seemed to understand more in the lectures.
Figure 2: A visual display maintained by the server, showing the progress of each student.
When students made the errors we had expected them to make, the computer system responded with a useful diagnostic message, so students did not need to ask a human tutor about those errors. The learning atmosphere in the computer room was one of active collaboration. Students worked closely together while solving the problems, and for most students this was a productive and valid learning activity. Some students abused this approach by taking too much knowledge from friends, but this happens even under traditional tutorial methods. These two features of the system, the automated diagnostics and the collaborative environment, together led to remarkable efficiencies in teaching. A tutor was present in the computer room during advertised tutorial hours, but they were not always in high demand even when there were 50 or more students present.
If students were unable to get useful help from peers, from the automatic diagnostics, or from the tutors, there was an additional source of help. In 1997 each problem had an anchor or "hot link" that would take the student to the "forum" for the problem. Each forum was a text-only Web page which students could read or add text to. See Figure 3.
Wednesday, 3 September 1997; 2:13:58 PM; Adrian Norris
is the distance AO the same as OC, even though it does not look like it? If not, what is the distance AO?

Wednesday, 3 September 1997; 2:16:02 PM; Nathan Scott
AO is NOT equal to OC. OC is the radius of the wheel!

Wednesday, 3 September 1997; 3:05:34 PM; Gareth Blakey
why isn't AO the radius of the wheel

Wednesday, 3 September 1997; 4:40:22 PM; Nathan Scott
The problem clearly states that OA is [given value] and that the radius is [a different, larger value]. point A is NOT on the edge of the wheel but somewhat closer to the centre.

You can add text to this file: type your message in the box below and then click the button marked "Send". Remember that staff read all messages posted here.
Figure 3: Part of the Forum for the problem of Figure 1b.
The principle of the Forum was very simple but it proved to be an effective learning resource. It should be noted that students were anonymous to one another: if a student viewed the forum of Figure 3, the names associated with the messages (except for Nathan Scott, a staff member) would appear simply as "Anon.". Generally the pattern of interaction was [student question, staff response], but there were also cases where one student answered the question of another. In either case the forum became an enduring record of the conceptual difficulties students had had with the engineering problem. Essentially, staff often needed to answer a given question only once. One student even commented that "[in the forum] I saw whole aspects of the problem that I did not realise were important". In other words, by seeing the conceptual difficulties of other students, this student had come to know about "holes" in her own understanding that she had not been aware of.
It is important to note that each Forum was a separate record. If a student decided to visit a Forum this meant that he or she would only see information pertinent to a specific problem, and not a confusing mixture of (possibly irrelevant) ideas.
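The two key properties of the Forum, anonymity between students and one forum per problem, can be sketched as follows (a toy model; the staff list and data shapes are assumptions, not the original implementation):

```python
STAFF = {"Nathan Scott", "Brian Stone"}   # assumed staff list


def display_name(author: str, viewer_is_staff: bool) -> str:
    """Staff names are always shown; student names are hidden
    from other students and shown as "Anon."."""
    if viewer_is_staff or author in STAFF:
        return author
    return "Anon."


def render_forum(messages, viewer_is_staff=False):
    """messages: list of (timestamp, author, text) belonging to ONE
    problem's forum, so readers never see unrelated discussion."""
    lines = []
    for when, author, text in messages:
        lines.append(f"{when}; {display_name(author, viewer_is_staff)}")
        lines.append(text)
    return "\n".join(lines)
```

Because each forum holds only one problem's messages, the rendering function never needs to filter by topic; the scoping is built into the storage.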
In 1995 (the first year of computer tutorials) there was an unusually high failure rate, but in 1996 the rate was back to its historical level. We attribute this high failure rate to early system difficulties, coupled with difficult examinations. At the time of writing the examination marks for 1997 were not yet available.
For many years we have also taken anonymous surveys of student opinion. These surveys show that (even in 1995) student approval of the computer system was high.
Figure 4: Failure rates in Engineering 100 (Dynamics) (from Faye 1997)
Figure 5: Diagnostic software in Calculus. (a) a problem in differentiation; (b) the result of clicking the "Show" button; (c) automatic diagnosis of the student's attempt (b).
The power of these diagnostics should not be underestimated. The software is able to give the student precise feedback, in plain English, about errors in an expression, even if there are several errors or they are deeply embedded (see Figure 5). It has been used in several successful courses in both calculus and statistics, and it is clearly a powerful tool for learning. Students can 'drill' mathematical skills at quite a high level and receive instant diagnostic feedback; this is normally achievable only with intensive human tuition.
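To illustrate the general idea of term-by-term diagnosis (this toy sketch is in no way the actual CalMaeth engine, which handles far richer expressions): represent a polynomial as a dict mapping powers to coefficients, differentiate it, and compare the student's attempt against the result, flagging a classic error such as multiplying by the old power without reducing it.

```python
def differentiate(poly):
    """d/dx of {power: coeff}; e.g. {2: 3} (3x^2) -> {1: 6} (6x)."""
    return {p - 1: c * p for p, c in poly.items() if p != 0}


def diagnose(poly, attempt):
    """Compare a student's attempted derivative with the correct one
    and return a plain-English diagnostic message."""
    correct = differentiate(poly)
    if attempt == correct:
        return "Correct."
    messages = []
    for power, coeff in correct.items():
        got = attempt.get(power)
        if got is None:
            # Classic slip: multiplied by the power, kept the old exponent.
            if attempt.get(power + 1) == coeff:
                messages.append(
                    f"For the x^{power + 1} term you multiplied by the old "
                    f"power but forgot to reduce the power by one.")
            else:
                messages.append(f"You are missing a {coeff}*x^{power} term.")
        elif got != coeff:
            messages.append(
                f"The coefficient of x^{power} should be {coeff}, not {got}.")
    for power in attempt:
        if power not in correct:
            messages.append(f"There should be no x^{power} term.")
    return " ".join(messages)
```

For example, differentiating 3x^2 as 6x^2 (i.e. `diagnose({2: 3}, {2: 6})`) produces a message pointing out that the power was not reduced, rather than a bare "wrong".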
We will demonstrate Kevin's software during the session in the Maths Lab. The URL is http://calmaeth.maths.uwa.edu.au/
For further information contact Kevin: firstname.lastname@example.org
We are now of the view that our first-year students actually needed a more structured tutorial environment; essentially that they needed 'parental' monitoring and guidance throughout the year. Our effort in producing animations and other 'multimedia objects' was not entirely wasted since they form the core of a valuable printed textbook for the course, and we use them as illustrations in lectures.
Efficient computer teaching can only occur when the system meets the real needs of the students.
|Please cite as: Nathan Scott and Brian Stone (1998). Web-based tutorial systems in use at UWA. In Black, B. and Stanley, N. (Eds), Teaching and Learning in Changing Times, 300-304. Proceedings of the 7th Annual Teaching Learning Forum, The University of Western Australia, February 1998. Perth: UWA. http://lsn.curtin.edu.au/tlf/tlf1998/scott.html|