Teaching and Learning Forum 99

Do errors in teaching material enhance/hinder learning?

Shri Rai
School of Information Technology
Murdoch University
We go to great lengths to remove errors from the teaching material we use, perhaps in the belief that error-free material enables students to learn better. It was discovered (serendipitously) that if there were some errors in the worked tutorial exercises, and instead of being given help to fix the errors the students were directed to the lecture notes and the theoretical material to fix the problems themselves, the students were better able to cope with questions that tested their understanding of the material. Unfortunately (or perhaps fortunately), making things convenient for the students did not help them. It appears that students want "instant gratification", yet learning is a chore for most people. Do teachers have to give in to this desire for instant gratification, knowing that the students may not have really understood, and keeping in mind the implications of making things difficult for them? It appeared that the errors in the teaching material had to be domain specific (within the same discipline) to have any learning benefit. Is there any information on what learning process(es) is/are involved within each knowledge domain?


I had prepared unit material and a study guide for a database unit under a huge time constraint. Some parts of the unit material had errors in the worked examples. Some of these errors were in the syntax of the database query language SQL, and some were in the logic of the query itself. There were also some spelling and grammatical errors. There were not many errors - only a handful in certain topics. Students were expected to study the worked examples by typing the examples out and observing the results of each query. On completion of the worked examples, the students had to complete a set of exercises that were assessed.
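To make the setting concrete, here is a sketch of the kind of erroneous worked example described above. The table, names and queries are my own invention (the original unit material is not reproduced here), and SQLite is used via Python purely so the example can be run:

```python
import sqlite3

# Hypothetical recreation of a worked example containing a deliberate
# logic error (schema and data invented for illustration).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE student (id INTEGER, name TEXT, age INTEGER)")
cur.executemany("INSERT INTO student VALUES (?, ?, ?)",
                [(1, "Ann", 19), (2, "Ben", 22), (3, "Cat", 25)])

# Worked example as printed: "list students aged between 20 and 24".
# The logic error: the bounds are reversed, so the predicate can never
# hold and the query silently returns no rows.
rows_wrong = cur.execute(
    "SELECT name FROM student WHERE age > 24 AND age < 20").fetchall()
print(rows_wrong)   # [] -- the query runs, but the logic is wrong

# The correction a student must reason out from the theory:
rows_right = cur.execute(
    "SELECT name FROM student WHERE age >= 20 AND age <= 24").fetchall()
print(rows_right)   # [('Ben',)]
```

The point of such an error is that typing the example in produces no error message - only an unexpected (empty) result that the student must explain.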

Some students approached me for help when they ran into trouble whilst working through the examples. I corrected the errors for them on an individual basis; the errors were not announced to the rest of the class. A number of students fixed the problems themselves. When the exercises were submitted to me for assessment, I noticed that the students who had found and corrected the errors themselves had no problems answering the exercises related to the erroneous examples. The students who had received help with the erroneous examples had problems with the related exercises. All of them had problems with the harder exercises, which were related to the error-free examples.

I decided to locate all the errors in the unit materials and to observe more closely the way students worked on the examples and then the exercises. I provided help to students who asked for it with the examples that had errors. I noticed that students would just "zoom" through the worked examples and only slowed down to think when an example did not work as expected. Those who approached me for help just corrected the example and went on to the next one. These students were also the ones who finished all the examples early, giving a false impression that they knew what they were doing. It was only after the exercises were assessed that I could see that they had not understood. The other students, who had corrected the errors on their own, appeared to understand what they were doing, because they did not have many problems with the related exercises.

After a number of similar observations of the students' behaviour, I decided that instead of removing the errors that had been discovered in the unit materials, I would introduce a few more deliberate errors. The other staff member involved in teaching the unit did not appreciate this, so the following semester we came to an arrangement whereby I would take over the unit completely. I also decided not to provide explicit help when students approached me about the examples: I directed them to the theory on which the examples were based and asked them to correct the examples themselves. I found that the students' understanding of the material improved, as shown by their attempts at the exercises.

For the next two editions of the study guide, I spent my time fine-tuning the nature and the placement of deliberate errors.


I believe that people "learn by doing". But my experience described above indicates that "mindless doing", as illustrated by working through correct examples, is not effective. I have seen and worked through a number of interactive CD-ROM based course materials as well as interactive web based courses. What I got from many of them was the "feel-good" factor - the feeling that something had been learnt. Yet I could always think of some serious question that would upset that good feeling.

I do not know how important this "feel-good" factor is. I noticed that many students were frustrated when examples that were expected to work did not work as expected. They were made to think of reasons why the examples were not working, and thinking was not generally popular amongst the students. I wonder whether continuous frustration of this sort hinders the desire to learn, even though the students got better at fixing problems on their own. There was a small number of students for whom nothing seemed to work; they would give up at the slightest obstacle.

I noticed that the "feel-good" factor appears to influence some teachers too. I was also teaching a unit called Data Structures, which required students to have successfully completed a prerequisite unit and to have some familiarity with the programming language C. Many students (even some Computer Science majors) dislike C, and students in the Data Structures unit would run into problems with very trivial tasks. I discovered later that students in the prerequisite unit were being "spoon-fed": the exercises and assignments they were doing were too simple to develop effective problem solving skills. The students did well in the prerequisite unit, so both the teacher and the students felt good about that unit - while the students were actually learning next to nothing.

Because I was making things "difficult" for the students, some students may have complained about the teaching material and my methods. As I was teaching at a private institution, the institution's management had no qualms about asking teachers to leave because of student complaints. I was lucky that I was the only one there with an intimate knowledge of the discipline and the software used, so if there were any complaints I did not hear about them, and I could do pretty much as I pleased. It was probably in my favour that the final results for the database unit showed an improvement over previous semesters.

It appeared that the "best" errors were errors in concept: the examples appeared to be correct, but gave wrong results when attempted. Typing and grammatical errors did not produce good learning outcomes and were generally just annoying. I had to spend time working out the placement of the errors. I had to pinpoint the exact concept that I wanted the students to learn, design a worked exercise that illustrated that concept, and then select some part of the worked exercise in which to introduce an error that would not be detected on a cursory reading. The erroneous exercises had to be preceded by simpler but correct exercises that illustrated the concepts to be learnt. Placing the errors required some expertise with the material being taught - domain specific knowledge was needed. I was fortunate enough to have this experience; without it, designing the erroneous worked examples would have been very difficult, and it would have been easier to provide only correct worked examples.
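A sketch of what such a conceptual error might look like (the schema and queries here are my own invention, not the original unit's, with SQLite used via Python so the example runs): the erroneous query parses and executes without complaint, but a one-to-many join silently duplicates rows, so the aggregate result is wrong.

```python
import sqlite3

# Invented schema: departments and their staff (one-to-many).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dept (dno INTEGER, budget INTEGER)")
cur.execute("CREATE TABLE emp (dno INTEGER, name TEXT)")
cur.executemany("INSERT INTO dept VALUES (?, ?)", [(1, 100), (2, 200)])
cur.executemany("INSERT INTO emp VALUES (?, ?)",
                [(1, "Ann"), (1, "Ben"), (2, "Cat")])

# Erroneous worked example: "total budget of departments with staff".
# The join looks plausible, but every employee row repeats its
# department's budget, inflating the sum.
total_wrong = cur.execute(
    "SELECT SUM(budget) FROM dept JOIN emp ON dept.dno = emp.dno"
).fetchone()[0]
print(total_wrong)   # 400 -- not the expected 300

# Corrected reasoning: count each qualifying department's budget once.
total_right = cur.execute(
    "SELECT SUM(budget) FROM dept WHERE dno IN (SELECT dno FROM emp)"
).fetchone()[0]
print(total_right)   # 300
```

An error of this shape survives a cursory reading - the query "looks right" - and only the wrong result forces the student back to the theory of joins and aggregation.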

After completing the worked exercises, the students had to solve other (assessed) exercises that fell within the following framework:

Generalisation: The solution to the assessed exercise is similar to one of the worked exercises, with only the names being different - the solution algorithm would be the same as in the worked exercise.
Specialisation: The solution to the assessed exercise is similar to one of the worked exercises, except that some part(s) of the solution is (are) different - the solution algorithm from one of the worked solutions would have to be modified (eg. instead of calculating the total age, calculate the average age).
Composition: The solution to the assessed exercise is built up using techniques from several of the worked exercises. This may include generalisation and specialisation of one or more worked exercises.
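The "total age / average age" illustration above can be made concrete. In this sketch (my own tables and queries, run through Python's sqlite3 for the sake of a runnable example), the specialisation changes exactly one part of the worked solution - the aggregate function:

```python
import sqlite3

# Invented data for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE student (name TEXT, age INTEGER)")
cur.executemany("INSERT INTO student VALUES (?, ?)",
                [("Ann", 20), ("Ben", 22), ("Cat", 24)])

# Worked example the student has already seen: the total age.
total_age = cur.execute("SELECT SUM(age) FROM student").fetchone()[0]
print(total_age)   # 66

# Specialisation: the same solution algorithm with one part changed
# (the aggregate function), giving the average age instead.
avg_age = cur.execute("SELECT AVG(age) FROM student").fetchone()[0]
print(avg_age)     # 22.0
```

A generalisation of the same worked example would keep the query shape and change only the table or column names; a composition would combine, say, this aggregation with a join from another worked example.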

Understanding the learning processes involved in performing tasks in the above categories would help teachers to teach students to perform such tasks better. I have noticed that when assessed exercises fell outside this framework, the students I encountered had much difficulty doing them. Is there a problem with this approach? I am not well informed on learning processes, so I am hoping that others better informed can enlighten me on what is involved. I would also appreciate pointers on learning processes within the computing domain. In the computing area I am involved in, computers are used to build artificial models of reality. For example, writing a program or using a spreadsheet to do budgeting mimics the "conventional" tools and processes involved in that task. The budget does not exist on the computer, but the computer provides the illusion - and for most people the illusion is the reality.

Entwistle and Entwistle (1992), cited in Biggs (1996), describe five levels of understanding:

  1. Reproduces content from lecture notes without any clear structure.

  2. Reproduces the content within the structure used by the lecturer.

  3. Develops own structure, but solely to generate answers to anticipated exam questions.

  4. Adjusts structures from strategic reading to represent personal understanding, but also to control examination requirements.

  5. Develops an individual conception of the discipline from wide reading and reflection.

Level 1 is the shallowest and level 5 the deepest level of understanding. Students in their first or second year are not generally mature enough in the discipline to be at level 5; the better students would be at level 3 and the exceptional ones at level 4. In designing the framework for the questions, my aim was to enable understanding at least at level 3. The three categories of questions should lead (I believe) to an optimal extent of recoding (Biggs and Telfer, 1987), and thus to "positive intrinsic motivation". The degree of mismatch between what the students have learnt through the worked examples (including the incorrect ones) and the assessed exercises should be sufficiently large to be challenging (and/or intriguing), but not so large that the exercises are perceived as undoable. Because the optimal level of mismatch differs from person to person, the correct worked exercises followed by the incorrect ones were used as a tool to equalise it across the students.


Entwistle, A. & Entwistle, N. (1992). Experiences of understanding in revising for degree examinations. Learning and Instruction, 2, 1-22.

Biggs, J. (1996). Assessing learning quality: Reconciling institutional, staff and educational demands. Assessment & Evaluation in Higher Education, 21, 5-15.

Biggs, J. & Telfer, R. (1987). The process of learning. Prentice Hall Australia, p. 114.

Please cite as: Rai, S. (1999). Do errors in teaching material enhance/hinder learning? In K. Martin, N. Stanley and N. Davison (Eds), Teaching in the Disciplines/ Learning in Context, 337-340. Proceedings of the 8th Annual Teaching Learning Forum, The University of Western Australia, February 1999. Perth: UWA. http://lsn.curtin.edu.au/tlf/tlf1999/rai.html
