Teaching and Learning Forum 95

Masqued meanings: Student evaluation of teaching

Fran Crawford and Sabina Leitmann
School of Social Work
Curtin University
What does it mean when 81% of students state that there was a lot of pressure on them in our course, while only 48% thought that the workload was too heavy? This was one of the results obtained from the Course Experience Questionnaire (Ramsden, 1991) recently completed by first year Social Work students at Curtin University of Technology.

The use of standardised student surveys is being encouraged by key decision makers in DEET and the universities as an evaluative tool for assessing and improving teaching and course quality. This strategy resonates with current organisational wisdom that evaluating the quality of a service or product requires the involvement of customers.

Such surveys are arguably a valid and reliable index for comparing the quality of teaching across degree programs. Their standardised nature, however, limits their usefulness in taking action for improvement: they provide no more than a global index of perceived teaching quality and are blunt tools for enhancing academics' understanding of what is required to improve their teaching.

This paper reports and reflects on how two lecturers from the School of Social Work at Curtin University of Technology used the CEQ results as a springboard for a collaborative process of inquiry with students. The range of meanings embedded in student evaluations is mapped as we seek to make explicit understandings of 'good teaching and good courses' with regard to this particular course. From this mapping we have become conscious of the "differing reasonings" students use in evaluating their course experience. Using a frame developed by Kemmis (1994), we suggest that practitioner course development needs to take account of these differing reasonings in changing for quality.


Introduction

The Course Experience Questionnaire (Ramsden, 1991) has been administered by Curtin University to first year students since 1993 as a standard assessment of teaching quality across university courses. Our project used the results of the CEQ as a springboard for a focus group in which first year social work students detailed the meanings behind the scores they gave. We hoped thereby to map student perceptions of what facilitates and constrains teaching and learning, in order to improve the quality of teaching in the first year of the social work course at Curtin.

Standardised questionnaires measuring the quality of teaching of individual lecturers, particular degree programs or whole institutions are commonly used (Marsh, 1987). In consequence, some have concluded that an objective, context-free index of teaching quality exists. Rather, as Ramsden (1993: 93) cautions, students' assessments of teaching effectiveness must always be contextualised: they are perceptions, not objective ratings. Measures of teaching quality need to be sensitive to a range of factors, including the discipline being taught, the degree of familiarity the learners have with that discipline and the nature of the student body (Nightingale and O'Neil, 1994; Neumann, 1994).

Acknowledging that "quality is a messy variable" (Cross, 1994: 1), this project aims to unpack some of the reasoning behind the evaluations a group of first year social work students gave of their course experience. Listening to students in this focused manner could strengthen our ongoing reflective practice of constructing, in concert with other stakeholders, a quality course experience.

Quality assurance and the use of performance indicators: The CEQ

The Higher Education Council has declared that the 1990s will be 'the decade of quality' (1992: 7). Analysts of such rhetoric (Nightingale and O'Neil, 1994; Becher, 1994; Curtis, 1994; Sachs, 1994) argue that the quality agenda is primarily driven by the economic and political requirements of the Federal government and has concerned itself more with meeting the demands of external accountability than with internal quality improvement. Responding to this demand, there has been a growth in the development of quantitative performance indicators, particularly those measuring the quality of teaching. Historically, research has been the focus of quality assessment in Australian higher education (Becher, 1994). Expanded access to university education in recent years, together with a strong agenda of economic rationalism, has produced the current political focus on the quality of university teaching (Cross, 1994).

The Australian Performance Indicators in Higher Education Research Project (Linke, 1991), funded by DEET, recommended a national trial of the CEQ as part of its program of testing potential performance indicators for standardised application across institutions. The CEQ was designed to measure the quality of teaching of academic organisational units, whether these be degree programs, departments or faculties (Ramsden, 1991). Respondents indicate their agreement or disagreement with 30 statements on a five point scale. The statements fall into one of five scales identified in previous research as reflecting dimensions of effective instruction within higher education: Good Teaching; Clear Goals and Standards; Appropriate Workload; Appropriate Assessment; and Emphasis on Student Independence and Choice (Appendix 1).
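
To illustrate how the instrument is put together, the sketch below (our illustration in Python; it is not part of the CEQ documentation or of the original study) encodes the item groupings for the four scales reproduced in Appendix 2 and computes a mean score per scale for one hypothetical respondent, flipping the reverse-keyed items (those prefixed "R" in Appendix 2) so that a higher score is always the more favourable rating.

    # A minimal scoring sketch (ours, not the official CEQ procedure).
    # Item groupings are the four scales reproduced in Appendix 2;
    # reverse-keyed items (prefixed "R" there) are flipped so that a
    # higher score always indicates a more favourable rating.

    SCALES = {
        "Good Teaching":             [3, 7, 15, 17, 18, 20],
        "Clear Goals and Standards": [1, 6, 13, 24],
        "Appropriate Workload":      [4, 14, 21, 23],
        "Appropriate Assessment":    [8, 12, 19],
    }
    REVERSE_KEYED = {4, 8, 12, 13, 19, 21, 23}   # items marked "R" in Appendix 2

    def scale_scores(responses):
        """responses maps item number -> rating from 1 (strongly disagree)
        to 5 (strongly agree); returns the mean score for each scale."""
        scores = {}
        for scale, items in SCALES.items():
            keyed = [6 - responses[i] if i in REVERSE_KEYED else responses[i]
                     for i in items if i in responses]
            scores[scale] = round(sum(keyed) / len(keyed), 2)
        return scores

    # A hypothetical respondent, for illustration only.
    example = {3: 4, 7: 2, 15: 4, 17: 3, 18: 4, 20: 5,
               1: 2, 6: 2, 13: 4, 24: 2,
               4: 4, 14: 4, 21: 5, 23: 4,
               8: 1, 12: 2, 19: 2}
    print(scale_scores(example))
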

Ramsden (1991: 133) states that the 'CEQ's guiding principle ... was a requirement to produce - as economically as possible - quantitative data which permit ordinal ranking of units in different institutions, within comparable subject areas, in terms of perceived quality teaching.' For reasons of validity, Ramsden (1991: 145) cautions against the use of the CEQ as a global measurement of teaching performance between institutions or between different fields of study. The CEQ is best used to provide feedback to institutional decision makers on the performance of each internal department relative to equivalent departments in other institutions.

The study methodology

In 1994 Curtin University surveyed all first year students across all degree programs. The results were incorporated in the quality portfolio submitted to the Committee for Quality Assurance in Higher Education. Additionally, the results were forwarded to each School, with no general requirement that Schools respond or take action on the data. This stands in sharp contrast to Ramsden's (1991: 149) conclusion that 'evidence of how a course or department responds to the data ... might be regarded as one of the most important indexes of its educational effectiveness.'

While the primary concern of administration may have been to satisfy the scrutiny of external powerholders, the provision of this data to our School seemed a fortuitous opportunity for research from a locally controlled quality improvement perspective (Sachs, 1994). In 1994 we independently administered the CEQ to 48 students enrolled in the first year of the Social Work program (61% of total student enrolment in the Behavioural Science unit - a core social work unit). All 48 respondents were then invited to attend a focus group to explore the survey results in greater depth. It was a measure of student interest in the project that 25 students (24 females and 1 male) indicated a willingness to attend this focus group, to be held after the semester finished. Sixteen female students actually participated in the group, which was jointly facilitated by the two researchers. Both of us have had extensive experience as social work practitioners and educators.

The group ran for some three hours. Students were given a copy of the CEQ results and asked to respond to a series of open ended questions grouped around each of the teaching quality scales. The ethos of the group was one of enthusiastic and engaged participation. With the students' permission, the discussion was taped and transcribed for further analysis by the researchers.

The course and its context

The Bachelor of Social Work is a four year vocational degree program, with two semesters of field practicums. On graduation, students readily find employment as social workers (or in related positions) in the public sector or in non-government agencies.

Each year approximately 70 students enrol, representing 50% of eligible applicants. Characteristically, students arrive with a strong motivation to undertake the course. Social work remains a predominantly female profession, with men making up only 20% of the total student body. Given the nature of the work undertaken by social workers, it is not surprising that only 25% of our student intake is drawn from school leavers. In fact our admission procedure, by weighting for employment and life experiences related to social work, indirectly favours non-school leavers. Students come from diverse ethnic and minority backgrounds. Some have an employment history within the human services, others are making a career shift, and yet others are at the beginning of their professional journey after having been full-time carers for a number of years.

Like the students, the academic staff come from a variety of lived experiences and hold a range of positions (sometimes compatible, sometimes not) on what constitutes the domain of social work. A range of teaching styles is represented. Academic staff differ from the student body in sex ratio: 50% of teaching staff are men, and 60% of senior lecturers are male.

Over the last decade there has been a decrease in funding allocated to the School, with a resultant increase in staff workloads. All staff agree that this has had primarily negative consequences for the quality of teaching. For example, tutorial classes have grown from 12 to 16 students and the time allocated to students for professional development has been reduced.

What constitutes a good course experience in the first year of the BSW

The School of Social Work scored 83% overall satisfaction with the course, compared with an average of 72% for all Curtin courses. None of the social work group were dissatisfied, compared with 13% of the Curtin-wide survey sample. The School performed at or well above the university profile in four of the five CEQ scale areas (see Appendix 2). In contrast, on all four questions in the Clear Goals and Standards Scale, students rated the course below the level attained by the university as a whole. Additionally, there were specific questions on which we scored poorly or the results seemed incongruent. In response to question 7 within the Good Teaching Scale, some 72% of students indicated that they did not believe that staff put a lot of time into commenting on their work. Even though this score was the same for the university as a whole, to us it was a matter of concern in seeking to improve teaching. A lack of congruence emerged in comparing responses within the Appropriate Workload Scale. While 81% agreed that there was a lot of pressure on them as students in this course (Q. 21), only 48% thought the workload was too heavy (Q. 4). Within the same scale, 70% of social work students agreed that the sheer volume of work in the course meant that it couldn't be thoroughly comprehended, compared to 52% of all students (Q. 23).
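
To make the arithmetic behind these comparisons explicit, the sketch below (our illustration in Python, not part of the original analysis) recomputes the figures quoted above from the Appendix 2 distributions for the social work course in 1994 and for all Curtin respondents in 1993, on the assumption that 'agreement' means the sum of the two most favourable response categories and 'disagreement' the sum of the two least favourable.

    # Sketch only: reproducing the headline figures from the Appendix 2
    # distributions (percentages, strongly disagree through to strongly agree).

    sw_1994 = {                            # Social Work course, 1994
        "Q25 overall satisfaction":    [0, 0, 17, 64, 19],
        "Q21 pressure on students":    [0, 15, 21, 66, 15],
        "Q4 workload too heavy":       [2, 31, 19, 40, 8],
        "Q23 volume not comprehended": [0, 13, 17, 40, 30],
    }
    curtin_1993_q25 = [2, 11, 14, 56, 16]  # all Curtin respondents, overall satisfaction

    def agree(dist):
        return dist[3] + dist[4]       # agree + strongly agree

    def disagree(dist):
        return dist[0] + dist[1]       # strongly disagree + disagree

    for label, dist in sw_1994.items():
        print(f"SW 1994  {label}: {agree(dist)}% agree, {disagree(dist)}% disagree")
    print(f"Curtin   Q25 overall satisfaction: {agree(curtin_1993_q25)}% agree, "
          f"{disagree(curtin_1993_q25)}% disagree")

Run as written, this yields 83% satisfaction (and no dissatisfaction) for the social work course against 72% satisfaction and 13% dissatisfaction Curtin-wide, together with the 81%, 48% and 70% figures for the workload items.
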

In analysing the dialogue from the focus group around the above concerns, it was often difficult to gain a clear sense of direction for improvement. Unambiguous directions usually related to organisational matters such as access to texts. In the overall picture, however, a pattern emerged in which what was cited as a strength of the course would reappear re-framed as a weakness. This re-framing occurred both within an individual student's comments and in the ongoing interaction between students.

Contradictory student reasoning

Bad teaching is:

repetitive ... dull, sparse ... too much ... just rattling it off ... most of the time she had an overhead and she wasn't talking to the students, she was just rattling things off. We just didn't feel a connection or a bond with her at all.

I was really interested ... but ... I missed lots because it was just all going through like a machine ... It was presented more as a science than a humanity ... just facts with no interaction which was difficult.

Good teaching is:

He had such a passion that I don't think anyone could sit through those lectures and not feel motivated to do their best work. And if you compare that say ... (where) there was no passion and there was a big difference.

I think that there's such a difference between people who just want to transfer knowledge and lecturers who want you to know for a higher reason ...

You're actually other than just absorbing knowledge.

You came out of those lectures feeling moved and changed, altered. (Quotes from Focus Group Tape, Nov. x 1994)

The researchers, being long-standing practitioners of social work as a process-centred, value-based discipline of caring for the casualties of modernism and scientific progress, resonated with such student rejection of objectivism in teaching practice. However, the tape also undeniably recorded many instances in which students demanded more objectivism. Much ambiguity, contradiction and contestation emerged as to what students meant by the measured scores. On teaching, a dissenting position was voiced on the machine-like presentation style of the teacher described above:
I can't really fault her... that's what we're there for, it is just to consume it... she rarely had time for people... asking questions because she was just so busy.
Turning from teaching per se, different expectations emerged across a range of teaching related issues. With reference to assessment:
There should be a standard between the lecturers and tutors.
was a very common theme, unmasking some of the meaning behind our lower scores on the Clear Goals and Standards Scale. On the other hand, a competing meaning around assessment was offered by other students, as in the following statement:
People want the answers, they want the answers to everything, so they're giving their tutors a hard time and saying "well what's the answer?". There is no answer. . there is no one answer. . . I felt valued from my personal life experience viewpoint and the sense that I've made of stuff that's been presented in the course.
For the students, 'what was learnt' in the course proved an important yardstick in evaluating their course experience. In the record of the dialogue there are numerous opinions expressed on this theme of 'what was learnt'.
To be strong about your beliefs and that you are a worthwhile person and that your opinion is worthwhile and should be listened to. So it makes you much smarter, I found that I grew up a lot in the course.

The content of the course and the skills I've learnt in it have helped me to defend my indignance... Whereas before I was just indignant, now I can defend them...

Knowledge makes you stronger... and the knowledge that you get here does make you strong.

Beliefs, values and process are as important in these statements as the actual content learnt. Recognising this helped us as researchers to connect and make sense of students' wanting both reflexive creativity and clear goals and standards. The students made this connection overtly when they cited their experience of a specific exam as a "horrible display of what was not explained". Though the exam was designed to allow for creativity, students condemned the lack of clear guidelines and set standards. Assessment that disregards an instrumental approach to learning leaves open the possibility that teaching staff can decide on what counts as an acceptable performance as they please.
(Some staff) feel that they are in a position of power and that they are going to use that power, then that's really intimidating.
In talking of what was learnt and the stress involved, it emerged that for many, becoming a university student was highly stressful and ran up against deeply held convictions that they did not belong at university. Students spoke of losing hair, periods and husbands in the first months of wondering whether they could meet expectations -- theirs, ours and others'. The workload was stressful at times, but for many the greatest stress lay in applying ideas learnt in the course to self-reflection.
Some of the things you discover about yourself you might not really like... it makes you fragile for a while.
The workload per se was just part of this stress; processing it, and its application to and interplay with the rest of their lives, was as significant. There was general agreement that as students become more practised in balancing the competing demands of their lives, they feel more positive about the course and their ability to manage. A valued part of learning to manage was the many discussions with other students that built a sense of collaborative student power.

Diverse but connected patterns of student reasoning

Reflecting on this issue of the play of power among stakeholders in higher education, we noted how little has been written about student powerlessness in the current debate on quality. Stephen Kemmis (1994) identifies differing reasonings used by educators in debating the nature of educating for teaching practice. He offers a useful frame for understanding the diversity of our students' comments on their experience of being educated for social work practice. The frame highlights shifting power relationships among stakeholders in any setting. Drawing on the work of the philosopher Alasdair MacIntyre (1988, 1990), Kemmis suggests that there are three different types of reasoning that inform social expectations. Each form implies different patterns of social relationship between the reasoners and those reasoned about. Kemmis (1994) characterises the three reasonings as instrumental (or technical), practical and critical. All three types of reasoning can be identified in the focus group dialogue.

The technical reasoner adopts an objectifying stance towards others in a setting. Others are colloquially identified as "Them". This reasoning is evident in the expressed student wish that all lecturer interaction with them over marks and assessment be standardised, objective and publicly known. Clear goals and standards provide a power base from which students can negotiate their way through the course. While this is an area in which the School could improve the equity of teaching, more instrumentalism alone did not seem to be the answer. Rather, the task is to achieve a balance between the three reasonings, always considering the context in which each form is expressed.

Practical reasoning, in contrast to technical reasoning, adopts an interpretive stance. It argues for the importance of lived experience in understanding the human condition. Other people involved in a setting are knowing subjects who are "autonomous and responsible agents" (Kemmis, 1994: 6). These others are called "You". Practical reasoning was evidenced in our focus group in the many comments made about how the course enabled students to develop as knowing, autonomous and responsible individuals. They valued teachers bringing their reflected lived experience and passions into class.

Critical reasoning "treats the others involved in the setting as co-participants, who, through their participation in the practices which daily constitute and reconstitute the setting both as system and as life-world, can work together collaboratively to change the ways in which they constitute it, and thus change both system and life-world" (Kemmis, 1994: 6). The word used here about others is "Us". In the focus group this form of reasoning pervaded talk of student interaction, but it also included us in the deconstruction of the power issues involved in the lack of exam guidelines. The exam story was told in the spirit of us working together to prevent a recurrence of what students experienced as an injustice. Such critical reasoning surfaced again when students raised anxieties over practicums. They questioned whether the costs involved, financial and social, made the social work course elitist, in contrast to our professed values. It is noticeable on the tape how we as researchers used our power to steer the conversation back to what we considered to be research concerns.

Kemmis (1994) argues that it is a key task of university educators to recognise the existence and value of all three reasonings. Personal resonance and/or political imperatives can lead those in power (for various reasons) to suppress or amplify one or more of these forms of reasoning. As noted above, in the focus group we suppressed critical discussion about course changes, shifting defensively to an instrumental discussion at that point. Arguably we were reinscribing the style of much higher education discourse, in which those in power endeavour to impose the form of reasoning to be used. Kemmis argues that premature closure, by those in power, of discussion among all stakeholders prevents fully reflected and informed decision-making.

Where to from here

This framing from Kemmis helped us to unmask the seeming messiness of the contradictory comments social work students made about quality. A wealth of ideas on how to improve teaching in the social work course has emerged from the exercise. The first step seems to be to keep the dialogue going between staff, students and other stakeholders, without imposing premature closure on the discussion of how to improve the quality of education. At the same time there are a number of tasks that can be addressed immediately to improve the quality of teaching in the social work course, such as ensuring ready access to texts, clearer guidelines for assessment tasks and uniformity in exam procedures.

More generally, this study demonstrates that quantitative measurements of teaching quality, while providing a useful overview, do not offer the deeper evaluation that takes into account specific disciplinary and pedagogical matters operating at the local level. For that, open and interactive communication is needed.

References

Becher, T. (1994). Quality assurance and disciplinary differences. The Australian Universities' Review, 37(1), 4-7.

Cross, P. K. (1994). An American perspective on Transition: Issues of Quality and Access. Learning Matters Curtin, 2(3), 1-2.

Curtis, S. (1994). Higher Education in Australia: When organisations for learning need to become learning organisations. Journal of Institutional Research in Australia, 3(1), 39-50.

Higher Education Council, National Board of Education, Employment and Training (1992). Higher Education: Achieving Quality. Canberra: Australian Government Publishing Service.

Kemmis, S. (1994). Control and Crisis in Teacher Education. Paper prepared for the inaugural Harry Penney Lecture, University of South Australia, Adelaide, April 11.

Linke, R. (Chair) (1991). Performance Indicators in Higher Education: Report of a Trial Evaluation Study. Commissioned by the Commonwealth Department of Employment, Education and Training. Canberra: Australian Government Publishing Service.

MacIntyre, A. (1988). Whose Justice? Which Rationality? London: Duckworth.

MacIntyre, A. (1990). Three Rival Versions of Moral Enquiry: Encyclopaedia, Genealogy and Tradition. London: Duckworth.

Marsh, H. W. (1987). Students' evaluations of university teaching: Research findings, methodological issues, and directions for future research. International Journal of Educational Research, 11, 253-388.

Neumann, R. (1994). Valuing quality teaching through recognition of context specific skills. The Australian Universities' Review, 37(1), 8-13.

Nightingale, P. and O'Neil, M. (1994). Achieving Quality Learning in Higher Education. London: Kogan Page.

Ramsden, P. (1991). A performance indicator of teaching quality in higher education: The Course Experience Questionnaire. Studies in Higher Education, 16(2), 129-149.

Ramsden, P. (1993). Theories of learning and teaching and the practice of excellence in higher education. Higher Education Research and Development, 12(1), 87-97.

Sachs, J. (1994). Strange yet compatible bedfellows: Quality assurance and quality improvement. The Australian Universities' Review, 37, 22-25.

Appendix 1

Appendix 2

Curtin University of Technology
Institutional Research Office

The 1993 & 1994 First Year CEQ Results

Course: 678 Title: Bachelor of Social Work

For each item, the five percentages run from strongly disagree to strongly agree. The first line (SW '93) gives results for the social work course in 1993; the second line, in parentheses (Curtin '93), gives results for all Curtin respondents in 1993; the third line (SW '94) gives results for the social work course in 1994. No. is the number of respondents.

The Good Teaching Scale

3. The teaching staff of this course motivated me to do my best work.
   SW '93         8    24     8    48    12    No. 25
   Curtin '93    (5)  (24)  (22)  (40)   (9)   No. 1617
   SW '94         2     8    25    58     6    No. 48

7. The staff put a lot of time into commenting on my work.
   SW '93        16    48     4    32     0    No. 25
   Curtin '93   (11)  (36)  (21)  (26)   (5)   No. 1608
   SW '94        10    29    33    23     4    No. 48

15. The staff made a real effort to understand.
   SW '93        13    25    25    29     8    No. 24
   Curtin '93    (6)  (25)  (26)  (33)  (10)   No. 1608
   SW '94         0    29    21    42     8    No. 48

17. The teaching staff normally gave me helpful feedback on how I was going.
   SW '93        12    28     8    40    12    No. 25
   Curtin '93    (6)  (29)  (21)  (35)   (8)   No. 1612
   SW '94         0    30    30    35     6    No. 48

18. My lecturers were extremely good at explaining things.
   SW '93         0    16    24    52     8    No. 25
   Curtin '93    (5)  (22)  (25)  (40)   (7)   No. 1607
   SW '94         0    15    27    56     6    No. 48

20. The teaching staff worked hard to make their subjects interesting.
   SW '93         4     8     8    63    17    No. 24
   Curtin '93    (5)  (20)  (21)  (45)  (10)   No. 1610
   SW '94         2     4    23    56    15    No. 48

The General Skills Scale

2. My studies developed my problem-solving skills.
   SW '93         8     8    25    42    17    No. 24
   Curtin '93    (2)  (12)  (22)  (48)  (16)   No. 1612
   SW '94         0     2    25    54    19    No. 48

5. My studies sharpened my analytical skills.
   SW '93         0     4    20    52    24    No. 25
   Curtin '93    (2)   (9)  (23)  (47)  (20)   No. 1607
   SW '94         0     4    13    48    35    No. 48

9. My studies helped me develop my ability to work as a member of a team.
   SW '93        16     8    12    40    24    No. 25
   Curtin '93    (8)  (21)  (24)  (35)  (12)   No. 1614
   SW '94         6    19    21    46     8    No. 48

10. As a result of my studies, I feel confident about tackling unfamiliar problems.
   SW '93         4     8    13    67     8    No. 24
   Curtin '93    (3)  (14)  (27)  (46)  (10)   No. 1610
   SW '94         -     9    21    57    19    No. 47

11. My studies improved my skills in written communication.
   SW '93         8    12    12    32    36    No. 25
   Curtin '93    (4)  (15)  (17)  (44)  (21)   No. 1611
   SW '94         0     6    15    40    40    No. 48

22. My studies helped me to develop the ability to plan my own work.
   SW '93         8    20    12    52     8    No. 25
   Curtin '93    (3)  (10)  (18)  (50)  (19)   No. 1597
   SW '94         2    24    13    52     9    No. 46

The Clear Goals and Standards Scale

1. It was always easy to know the standard of work expected.
   SW '93         8    52     0    36     4    No. 25
   Curtin '93    (5)  (28)  (13)  (47)   (7)   No. 1616
   SW '94         6    32    32    30     4    No. 47

6. I usually had a clear idea of where I was going and what was expected of me in this course.
   SW '93        12    32    16    36     4    No. 25
   Curtin '93    (6)  (23)  (17)  (44)  (11)   No. 1615
   SW '94         0    35    21    40     2    No. 48

R13. It was often hard to discover what was expected of me in this course.
   SW '93         8    40     8    24    20    No. 25
   Curtin '93    (7)  (36)  (20)  (30)   (7)   No. 1605
   SW '94         4    30    28    32     6    No. 47

24. The staff made it clear right from the start what they expected from students.
   SW '93        16    48     4    20    12    No. 25
   Curtin '93    (6)  (26)  (20)  (38)  (11)   No. -
   SW '94        15    31    21    23     4    No. 48

The Appropriate Workload Scale (R)

R4. The work-load was too heavy.
   SW '93         4    36    12    32    16    No. 25
   Curtin '93    (5)  (31)  (20)  (30)  (14)   No. 1605
   SW '94         2    31    19    40     8    No. 48

14. I was generally given enough time to understand the things I had to learn.
   SW '93         4    13     8    63    13    No. 24
   Curtin '93    (8)  (25)  (17)  (43)   (8)   No. 1606
   SW '94        11    26    20    40     2    No. 46

R21. There was a lot of pressure on me as a student in this course.
   SW '93         4    38    25    13    21    No. 24
   Curtin '93    (5)  (22)  (16)  (35)  (23)   No. 1615
   SW '94         0    15    21    66    15    No. 48

R23. The sheer volume of work in this course meant that it couldn't be thoroughly comprehended.
   SW '93         4    36    16    28    16    No. 25
   Curtin '93    (5)  (21)  (19)  (33)  (22)   No. 1608
   SW '94         0    13    17    40    30    No. 47

The Appropriate Assessment Scale (R)

R8. To do well in this course all you really needed was a good memory.
   SW '93        60    24     4    12     0    No. 25
   Curtin '93   (26)  (34)  (12)  (21)   (8)   No. 1612
   SW '94        44    33    10     8     2    No. 48

R12. The staff seemed more interested in testing what I had memorised than what I had understood.
   SW '93        46    38     8     4     4    No. 24
   Curtin '93   (16)  (35)  (23)  (19)   (7)   No. 1613
   SW '94        15    40    27    15     4    No. 48

R19. Too often teaching staff asked me questions just about facts.
   SW '93        21    42    21    17     0    No. 24
   Curtin '93    (7)  (35)  (42)  (14)   (2)   No. 1607
   SW '94         4    60    23     9     4    No. 47

The Type of Course (R)

R16. The units were overly theoretical and abstract.
   SW '93        16    36     4    36     8    No. 25
   Curtin '93   (13)  (38)  (22)  (20)   (7)   No. 1613
   SW '94         6    49    17    26     2    No. 47

Overall Satisfaction

25. Overall, I was satisfied with the quality of this course.
   SW '93         4    12     4    64    16    No. 25
   Curtin '93    (2)  (11)  (14)  (56)  (16)   No. 1614
   SW '94         0     0    17    64    19    No. 47

Please cite as: Crawford, F. and Leitmann, S. (1995). Masqued meanings: Student evaluation of teaching. In Summers, L. (Ed), A Focus on Learning, p42-52. Proceedings of the 4th Annual Teaching Learning Forum, Edith Cowan University, February 1995. Perth: Edith Cowan University. http://lsn.curtin.edu.au/tlf/tlf1995/crawford.html

