Academic productivity of Australian academics and higher degree research students: What can we learn from the facts

Dora Marinova
Institute for Sustainability and Technology Policy
Murdoch University

It has become usual practice for the Australian Federal Government to shape the country's research priorities to better reflect the needs of its economy and society. The funding mechanisms for university research and research training have also changed, with the latest system introduced in 2001. A new model, the Research Quality Framework (RQF), is currently being discussed, shaped along the lines of the British Research Assessment Exercise and New Zealand's Performance-Based Research Fund. These are also times when the performance of Australian universities is under attack, with open calls for them to prove that they are worth the taxpayers' money.

The paper analyses the productivity of the Australian academic sector between 1992 and 2005 in comparison with New Zealand and the UK, and then uses the case study of the Institute for Sustainability and Technology Policy (ISTP), Murdoch University to demonstrate the changes in research quality. Its main argument is that the constantly improving performance of Australian universities is not being acknowledged; instead, a false public image of lack of productivity is being created. This can potentially alienate prospective new researchers and PhD students, driving them to seek alternative career paths or countries of education.


Introduction

Although subject to some minor changes (such as discipline clusters or weightings), the funding for teaching undergraduate students in Australia by the Department of Education, Science and Training (DEST) has been relatively stable and consistently provided on the basis of the number of students taught. This, however, has not been the case when it comes to research training, i.e. the supervision of research students. The models have changed dramatically, from ones based entirely on teaching load (i.e. the number of research students supervised) to a complex mixture of research outcomes, which in some cases are completely disconnected from the research supervision process.

The current funding model (introduced in 2001) is performance based and rewards research completions, publications and research income, with research student load playing a very minor part in the funding equation. Delivered as a one-line budget to the Australian universities, the research funding has been used internally according to a variety of models reflecting differences in teaching and research philosophy, strategic thinking and everyday practice. For example, in some cases the schools, departments or centres where the actual research supervision occurs receive the full amount allocated by Canberra and are then charged for services provided by the central administration. In others, central administration withholds a share of the DEST money (in some cases as high as 72%) and the teaching units have to provide the supervision within the remaining budget.

Whatever the existing practices across the Australian academic world, the current debate surrounding the introduction of the new Research Quality Framework (RQF) marks the first time in Australia's history that universities will be funded according to a research quality ranking. It is also the first time that they are being publicly attacked for not delivering expected research outcomes and not being productive. The criticisms do not specifically target research supervision, but they challenge the overall research performance of the tertiary sector, which has significant ramifications for the supervision of PhD and other higher research degree students.

The RQF paper produced by DEST in March 2005 claims that "it is difficult to assure stakeholders that public funds for research are being invested in the highest quality endeavours. Without this assurance, the argument for further public investment in research is not as persuasive as it should be" (DEST, 2005, p. 7). This paper (one of a series of RQF publications) asserts that "a consistent approach to measure research quality and impact across the breadth of the Australian research landscape" (DEST, 2005, p. 7) would make it easier to convince taxpayers that investing in Australian research capabilities is worth their dollar.

The aim of this conference paper is to put to the test the assumptions behind the current Australian Government position on publicly funded research, including the quality of research supervision. To do this, it uses a macro-level analysis of academic productivity in Australia (particularly in comparison with New Zealand and the UK) and a case study of the Institute for Sustainability and Technology Policy (ISTP) to examine changes in research quality and research supervision. The main argument is that the constantly improving performance of Australian universities is not being acknowledged and instead a false picture of wastefulness of taxpayer money is being created.

The writing of this paper is in itself an act of reflection, necessary for sustaining the enthusiasm and commitment behind the development of research skills in Australia. It comes from an Institute where 8 staff have the privilege of supervising 79 (or around 45 full time equivalent) postgraduate research students in the interdisciplinary area of sustainability. The passion that these academics have for their work is shared by many around Australia, and the negative image portrayed by DEST is an alarming issue that needs to be addressed.

Publications as evidence of research productivity

The DEST RQF papers claim that refereed publications[1], as currently used in the university funding formula, "do not sufficiently encourage a focus on research quality" (DEST, 2005, p. 7). The main argument for changing research funding in Australia is influenced by the schemes recently introduced in the UK, the Research Assessment Exercise (Harnad et al., 2003), and in New Zealand, the Performance-Based Research Fund (Goldfinch, 2003).

Australia, however, has outperformed both of these countries in terms of research output (Marinova and Newman, 2005). The internationally recognised and widely accepted journal listings of the Institute for Scientific Information (ISI) cover around 10-12% of all refereed journals (Dale and Goldfinch, 2005; Monastersky, 2005; Smith and Marinova, 2005), under the belief that a core "small number of journals accounts for the bulk of significant scientific results" (Garfield, 1996, p. 13). In the last three years, namely since 2003, Australia has outperformed both the UK and New Zealand in the number of ISI papers on a per capita basis (see Table 1). The estimated figure for 2005 is 182 papers per 100,000 population, compared with 176 for New Zealand and 172 for the UK. For Australia, the increase since 1992 has been dramatic, namely 72% (or around 5.5% per annum). The respective figures are 64% (or around 5% per annum) for New Zealand and 44% (or around 4% per annum) for the UK. Moreover, during the 1992-2005 period, only Australia of the three countries has consistently improved its share of total ISI refereed papers (see Figure 1), reaching around 2.5%.
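
For readers who wish to check the arithmetic behind these figures, the short sketch below re-derives them from Table 1. It assumes the quoted per-annum rates are simple averages (total percentage increase divided by the 13-year span), which matches the figures in the text better than compound growth; the function names and the approximate 1992 population are illustrative assumptions, not part of the original analysis.

```python
# Sketch: re-deriving the per capita and growth figures quoted above.
# The per-annum rate is treated as the total percentage increase divided
# by the 13-year span (an assumption consistent with the quoted numbers).

def per_capita(papers: int, population: int) -> float:
    """ISI papers per 100,000 population."""
    return papers / population * 100_000

def simple_annual_growth(start_rate: float, end_rate: float, years: int) -> float:
    """Average yearly percentage growth: total increase spread over the span."""
    total_increase_pct = (end_rate - start_rate) / start_rate * 100
    return total_increase_pct / years

# Australia (Table 1): 18,612 papers in 1992, population roughly 17.5 million
print(round(per_capita(18_612, 17_500_000)))          # ~106 per 100,000
# Per capita rate rises from 106 (1992) to 182 (2005)
print(round((182 - 106) / 106 * 100))                 # ~72% total increase
print(round(simple_annual_growth(106, 182, 13), 1))   # ~5.5% per annum
```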

Table 1: ISI refereed paper publications by Australia, New Zealand and UK, 1992-2005

Year | UK: ISI papers | UK: per 100,000 | NZ: ISI papers | NZ: per 100,000 | Aus: ISI papers | Aus: per 100,000
1992 | 68,921 | 119 | 3,692 | 107 | 18,612 | 106
1993 | 69,961 | 121 | 3,708 | 107 | 19,427 | 110
1994 | 74,140 | 127 | 4,109 | 117 | 20,770 | 116
1995 | 81,526 | 140 | 4,414 | 124 | 23,112 | 128
1996 | 85,378 | 146 | 4,612 | 127 | 23,838 | 130
1997 | 84,062 | 143 | 4,828 | 131 | 24,819 | 134
1998 | 89,253 | 151 | 5,397 | 145 | 26,477 | 141
1999 | 90,097 | 152 | 5,358 | 142 | 27,053 | 143
2000 | 91,436 | 154 | 5,505 | 144 | 26,882 | 140
2001 | 91,067 | 152 | 5,524 | 143 | 28,087 | 145
2002 | 85,928 | 143 | 5,418 | 139 | 27,631 | 141
2003 | 95,344 | 159 | 5,962 | 151 | 32,589 | 165
2004 | 90,677 | 150 | 5,732 | 144 | 30,425 | 153
2005* | 103,848 | 172 | 7,108 | 176 | 36,587 | 182
Notes: * The 2005 figures are extrapolated based on data until September 2005 (inclusive).
Source: Data extracted from ISI Web of Science, 30 September 2005.
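
A note on the 2005 estimates: the paper does not state how the part-year counts were projected to a full year. A minimal sketch under the assumption of linear pro-rating (scaling the January to September count to twelve months) is given below; the year-to-date count used is the hypothetical input implied by the table, not a figure from the source.

```python
# Sketch: one plausible way to extrapolate a part-year count to a full year.
# Linear pro-rating is an assumption; the paper does not state its method.

def extrapolate_full_year(count_to_date: int, months_elapsed: int) -> int:
    """Scale a January-to-date publication count to a 12-month estimate."""
    return round(count_to_date * 12 / months_elapsed)

# e.g. a hypothetical 27,440 Australian papers by the end of September (month 9)
print(extrapolate_full_year(27_440, 9))  # ~36,587, matching the Table 1 estimate
```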

Australian universities have been the biggest contributors to these changes. Firstly, the university sector in all three countries has been pushed to become the main contributor to the pool of ISI refereed papers, which is confirmed by the ever-increasing university shares in the total number of ISI papers (see Figure 2). In the case of Australia, the share of universities reached as high as 85% in 2005. Secondly, the productivity of Australian universities (measured as the number of ISI refereed papers per 100,000 population) has been consistently higher than that of New Zealand for the entire 1992-2005 period (see Table 2 and Figure 3), and higher than that of the UK since 2001. The gap between Australian and British/New Zealand academic productivity increased significantly in the last three years, which broadly coincides with the introduction of their respective new university funding models.

It is also important to note that many of the ISI refereed papers have been a direct result of collaboration between research students and their supervisors, and on many occasions (predominantly in the social sciences) research students have also been encouraged to publish without including the supervisor's name. The production of refereed papers is part of the research training process and is often regarded as being as important as the writing of the research thesis itself. The learning that occurs during the conquering of the ISI world is challenging, but rewarding when it comes to future career prospects.


Figure 1: Percentages of total ISI papers for Australia, New Zealand and UK, 1992-2005
Source: Data extracted from ISI Web of Science, 30 September 2005.


Figure 2: Percentage shares of university papers in total national ISI
refereed papers for Australia, New Zealand and UK, 1992-2005
Source: Data extracted from ISI Web of Science, 30 September 2005.

Table 2: ISI refereed paper publications by university sector in Australia, New Zealand and UK, 1992-2005

Year | UK: ISI univ papers | UK: per 100,000 | NZ: ISI univ papers | NZ: per 100,000 | Aus: ISI univ papers | Aus: per 100,000
1992 | 42,890 | 74 | 2,485 | 72 | 13,074 | 75
1993 | 44,689 | 77 | 2,470 | 71 | 13,642 | 77
1994 | 49,515 | 85 | 2,740 | 78 | 14,860 | 83
1995 | 56,563 | 97 | 2,968 | 83 | 17,050 | 94
1996 | 60,553 | 103 | 3,227 | 89 | 17,979 | 98
1997 | 60,417 | 103 | 3,354 | 91 | 19,009 | 102
1998 | 64,479 | 109 | 3,863 | 104 | 20,767 | 111
1999 | 65,841 | 111 | 3,895 | 103 | 21,378 | 113
2000 | 68,182 | 115 | 3,964 | 104 | 21,696 | 113
2001 | 69,058 | 116 | 4,060 | 105 | 22,730 | 117
2002 | 66,371 | 111 | 4,125 | 106 | 22,606 | 116
2003 | 73,461 | 122 | 4,605 | 117 | 27,196 | 138
2004 | 71,593 | 119 | 4,516 | 113 | 25,650 | 129
2005* | 81,005 | 134 | 5,752 | 143 | 30,963 | 154
Notes: * The 2005 figures are extrapolated based on data until September 2005 (inclusive).
Source: Data extracted from ISI Web of Science, 30 September 2005.


Figure 3: Research productivity of academics in Australia, New Zealand and UK
(represented by ISI papers per 100,000 population), 1992-2005
Source: Data extracted from ISI Web of Science, 30 September 2005.

Against this outstanding performance by Australian university researchers, it is misleading for the Federal Government to imply that there are problems with how taxpayers' money is used to support research and research training. There is a clear indication that the research productivity of Australian universities has been increasing consistently. This, however, has by no means been matched by appropriate increases in their research funding.

The ISI evidence of productivity shows that Australian universities have been producing world class research that is widely accepted by the top refereed journals, in an environment which generally undervalued the importance of publications[2] and did not directly encourage publishing in ISI journals. It is therefore completely wrong to create an image of underperformance for the Australian university sector. This can potentially alienate prospective talented new PhD students and young academics, and push them to look for research opportunities overseas, including in countries which do not compare favourably with Australia (the two countries analysed here are a perfect example).

Citations as evidence of research quality

Citation rates are a major component in the British Research Assessment Exercise as well as in the Performance-Based Research Fund in New Zealand. Although they have not been part of current or past university funding models in Australia, they are likely to be given a heavy weighting in the proposed RQF. In anticipation of this development, there has been a resurgence of interest in studies that rank and compare university departments, from the econometricians' obsession with rankings (Baltagi, 2003) to citation-based comparisons of Australian political science units (Dale and Goldfinch, 2005).

Because of limitations in the citation search engines, it is impossible to obtain aggregated citation rates other than for individual academics. It is therefore a very labour-intensive exercise to estimate the citation rates of all Australian academics for comparison with their counterparts in the UK or New Zealand. Instead, a case study analysis was undertaken of the academics within the Institute for Sustainability and Technology Policy (ISTP) at Murdoch University in Western Australia. Since 1995, the ISTP has maintained the same size of 8 full time equivalent academics. Three different citation search engines were used, namely ISI, scholar.google.com and Scopus[3]. It is interesting to note that in the case of ISTP there was very little overlap between ISI and Google, as well as between Scopus and Google; there was some overlap between ISI and Scopus. The latter includes a wider range of journals, some of which are also covered by ISI, but there are also journals included in ISI and not covered by Scopus. Unlike Google, neither ISI nor Scopus includes books. Another specific characteristic of ISI is that the citation counts relate only to the first author[4]; the Scopus and Google search engines retrieve all citations, irrespective of the position of the author's family name in the order of authorship.
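
As an illustration of the kind of cross-database comparison involved, the sketch below checks overlap between citation records exported from three engines, keyed by a crudely normalised title. All titles and set contents are hypothetical placeholders, not ISTP data, and the matching rule is an assumption; real exports would need more careful matching (authors, years, DOIs where available).

```python
# Sketch: checking overlap between citation records from three databases.
# Titles below are hypothetical; each set stands in for the citing papers
# retrieved for one academic from one search engine.

def normalise(title: str) -> str:
    """Crude key for matching the same citing paper across databases."""
    return "".join(ch for ch in title.lower() if ch.isalnum())

isi = {normalise(t) for t in ["Urban form and transport energy",
                              "Bibliometric modelling for policy"]}
scopus = {normalise(t) for t in ["Bibliometric modelling for policy",
                                 "Sustainability indicators in practice"]}
google = {normalise(t) for t in ["Teaching sustainability online",
                                 "Urban form and transport energy"]}

print(len(isi & scopus))            # some overlap between ISI and Scopus
print(len(scopus & google))         # very little overlap, as observed for ISTP
print(len(isi | scopus | google))   # combined, de-duplicated citation count
```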

Table 3 shows ISTP's citation rates per academic staff member for 1995-2005, using the ISI citation index. Despite some ups and downs (triggered mainly by the small size of the unit), the period averages show a distinctive trend towards increased citation rates. The latest five year annual average of citations per academic staff member, namely 3.05, is more than 4 times higher than the first five year average (1995-1999; see Table 3). Likewise, the number of citations per paper for 2001-2005, namely 4.437, is about one and a half times that for the 1995-1999 period[5].

Table 3: ISI citations for ISTP academics, 1995-2005

Year | ISI citations per academic | ISI citations per paper
1995 | 0.000 | 0.000
1996 | 0.250 | 0.667
1997 | 0.375 | 0.375
1998 | 1.375 | 11.000
1999 | 1.625 | 3.250
2000 | 1.750 | 2.800
2001 | 2.125 | 2.833
2002 | 1.875 | 3.750
2003 | 3.000 | 6.000
2004 | 2.250 | 3.600
2005* | 6.000 | 6.000
1995-1999 average | 0.725 | 3.058
1995-2002 average | 1.172 | 3.084
1995-2005 average | 1.875 | 3.661
2001-2005 average | 3.050 | 4.437
Notes: * The 2005 figures are extrapolated based on data for 2005 until October (inclusive).
Source: Data obtained from ISI Web of Science, 31 October 2005.
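
The period averages can be re-derived from the yearly rates in Table 3. The sketch below does so for citations per academic; it reproduces both the 0.725 and 3.05 figures (confirming that the first period average covers the five years 1995-1999) and the "more than 4 times" comparison in the text. The function name is illustrative only.

```python
# Sketch: re-computing the period averages in Table 3 from its yearly rates.
# Values are transcribed from the table; 2005 is the extrapolated figure.

citations_per_academic = {
    1995: 0.000, 1996: 0.250, 1997: 0.375, 1998: 1.375, 1999: 1.625,
    2000: 1.750, 2001: 2.125, 2002: 1.875, 2003: 3.000, 2004: 2.250,
    2005: 6.000,
}

def period_average(rates: dict[int, float], start: int, end: int) -> float:
    """Mean yearly citation rate over the inclusive year range [start, end]."""
    years = [y for y in rates if start <= y <= end]
    return sum(rates[y] for y in years) / len(years)

first = period_average(citations_per_academic, 1995, 1999)  # 0.725
last = period_average(citations_per_academic, 2001, 2005)   # 3.050
print(round(last / first, 1))  # ~4.2, i.e. "more than 4 times higher"
```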

The same trends are apparent in the Scopus citation rates (Table 4): citation rates have increased significantly in more recent years. The overall figures are consistently higher due to the larger journal coverage of this database. A similar picture emerges using Google's citation search engine (Table 5). Yet again, the figures differ from those obtained from ISI and Scopus because of the different coverage and the overall larger number of websites visited by Google's crawlers.

Consequently, irrespective of which citation tool is used to assess the quality of ISTP's academic output (Figure 4 clearly shows the mismatch between the three citation databases), the changes witnessed in the last decade are a clear signal of the increased quality of output by its academics. Hence, again there appears to be no justification for concerns about the world class quality of Australia's research. The question should be not how the system should be changed to punish academics for not performing, but how to further encourage an extremely positive trend. It is also important to make this trend known to the wider academic, professional and overall Australian community, to give confidence to new academics as well as potential PhD candidates.

Table 4: Scopus citations for ISTP academics, 1995-2005

Year | Scopus citations per academic | Scopus citations per paper
1995 | 0.000 | 0.000
1996 | 4.500 | 6.000
1997 | 3.625 | 14.500
1998 | 4.250 | 6.800
1999 | 6.250 | 5.556
2000 | 5.250 | 7.000
2001 | 5.625 | 15.000
2002 | 7.625 | 12.200
2003 | 9.875 | 15.800
2004 | 7.500 | 6.667
2005* | 8.000 | 10.667
1995-1999 average | 3.725 | 6.571
1995-2002 average | 4.641 | 8.382
1995-2005 average | 5.827 | 9.108
2001-2005 average | 8.045 | 12.067
Notes: * The 2005 figures are extrapolated based on data for 2005 until October (inclusive).
Source: Data obtained from Scopus, 31 October 2005.

Table 5: Google citations for ISTP academics, 1995-2005

Year | Google citations per academic | Google citations per paper
1995 | 1.750 | 2.800
1996 | 1.125 | 0.692
1997 | 1.375 | 1.100
1998 | 1.250 | 1.667
1999 | 2.125 | 1.700
2000 | 2.250 | 3.000
2001 | 4.875 | 3.250
2002 | 7.500 | 3.158
2003 | 12.875 | 4.292
2004 | 8.875 | 5.917
2005* | 5.000 | 4.286
1995-1999 average | 1.525 | 1.592
1995-2002 average | 2.781 | 2.171
1995-2005 average | 4.455 | 2.896
2001-2005 average | 7.825 | 4.180
Notes: * The 2005 figures are extrapolated based on data for 2005 until September (inclusive).
Source: Data obtained from Google Scholar, 30 September 2005.

Figure 4: Comparison between ISI, Scopus and Google for ISTP academics, 1995-2005:
citations per ISTP academic. * The 2005 figures are an estimate.

Figure 5: Comparison between ISI, Scopus and Google for ISTP academics, 1995-2005:
citations per ISTP article. * The 2005 figures are an estimate.

Completion time as evidence of PhD related performance

Supervision of higher degree research students is closely linked to the research performance of Australian academics. In the early 1990s, the Federal Government introduced a limit of 10 years on the time a PhD student can take to finish his or her research. At the time, this was perceived as a significant conceptual change, particularly in the social sciences and humanities where talented researchers often took longer than 10 years to complete what they (and others) regarded as seminal work. In the early 2000s, DEST introduced completion time as a component of the research funding model. This changed attitudes across university campuses, with supervisors and students starting to work together in a more focussed way towards producing high quality theses in shorter periods of time. In many instances, this meant concentrating on writing and commenting on PhD chapters instead of publishing refereed articles. The benefits of conference participation were also judged against this background of completion pressure.

Table 6 presents some empirical evidence on PhD completion times from Murdoch University and ISTP in particular. The completion time for PhD students at ISTP was reduced by 12 months (nearly a quarter) between 2001 and 2004. A similar trend is observed for Murdoch University since 2002. On average, it now takes just over 3 years for a PhD to be completed, including the refereeing process and any finalising changes. The message this sends is again one of commitment and high achievement by academics and PhD candidates in Australia, something that would be difficult to match in other parts of the world.

Table 6: Completion time (months) for PhD students at ISTP and Murdoch University, 2001-2004

Year | ISTP | Murdoch
2001 | 52 | 46
2002 | 46 | 52
2003 | 45 | 45
2004 | 40 | 40
Source: Data available from http://www.murdoch.edu.au/

The Institute for Sustainability and Technology Policy may not be the average academic unit - it is amongst the highest performing units at Murdoch (for more information about ISTP see Hossain et al., 2005; Marinova and McGrath, 2004 and 2005; McGrath, Marinova & Newman, 2005) - but it is still representative of the pressure that the Federal Government has put on Australian universities. There are many other such units across Australia doing exemplary work that goes unrecognised. Moreover, research performance in academia has not been adequately rewarded[6], and the public, including the average taxpayer, should be given the true picture. Creating knowledge and capabilities for future generations is the most important role universities play. Building a misleading picture of the Australian universities risks diminishing the aspirations of potentially capable new researchers, leading them to seek employment and career options in other sectors or countries. Australian academics and PhD researchers are proud of their achievements, and their satisfaction is well justified.

The full picture?

The relationships between supervisors and PhD students go far beyond the publishing of articles and the completion of research theses. The training and learning cover a wide range of professional activities that prepare the new generation for an active presence in the scientific world. The range of functions that researchers undertake within society is large: it includes teaching and tutoring, public seminars, academic refereeing, membership of professional and editorial bodies, administrative duties, community service, and marketing and commercial activities, to mention a few (e.g. Smith and Marinova, 2005; CHASS, 2005). A study by Smith (2003) of Australian geoscientists who have become part of centres with partial industry funding (e.g. Cooperative Research Centres) reveals that these academics have continued with their professional activities and commitments irrespective of the additional pressure on them to tap into non-government money. All of this is part of the teaching and learning process for their research students, who are themselves involved in a number of these activities.

The full picture of the academic productivity of Australian researchers will never be complete if we forget Freire's (1998, p. 3) words that it is impossible to be on this job "without a forged, invented, and well-thought out capacity to love".

Conclusion

According to Phillimore (1989), academic performance is a complex concept for which no objective indicators exist: "the context and process through which indicators of performance are arrived at, and the subsequent use to which they are put, are judged to be as important as the information which each indicator conveys" (Phillimore, 1989, p. 255). It is therefore imperative to place the Federal Government's attempts to find "a more consistent and comprehensive approach to assessing the quality and impact of publicly funded research" (DEST, 2005, p. 7) in their right context: one of excellent academic performance.

The new research funding model proposed by DEST, namely the RQF, is based on a 20th century concept of professional achievement which encourages actors in universities and government research organisations to move physically to larger centres, to specialise rather than diversify, and to move upwards through hierarchies of power and privilege whose apexes decide what counts and what should be rewarded (Chambers, 1993 and 1997). Instead of giving a fair go to all Australian universities, it will encourage concentration and specialisation of research funding, including research student supervision. The evidence is that Australian academics have achieved an excellent performance record in a climate that allowed for diversity, complexity and interdisciplinarity, and did not target the building of hierarchical rankings. The Australian university sector has not been rewarded for its accomplishments. Moreover, there is also the risk of creating a negative image of the valuable work academics and their research students are doing.

There cannot be a definitive answer as to the best way to measure research productivity and quality, because of the complexity and diversity of the academic world. The proposed RQF model incorporates a ranking system that would determine the research funding for universities. Any funding model, including one based around ranking, is by definition a simplification of the real world in which certain aspects of reality are better represented (emphasised or valued) than others. Consequently, with a shift from one model to another, some are winners and some are losers. Trying to find the "best fit", or an approach that will be consistent and comprehensive in assessing research quality and impact (DEST, 2005, p. 7), is a statistical illusion when it comes to investing in Australia's future.

There are at least two necessary pre-conditions for Australia to have a healthy, strong and world class university research sector. Firstly, adequate resources should be provided to match its current achievements, including financially facilitating the transition to any new funding model. Secondly, the funding model used[7] should allow for diversity and flexibility, to properly reflect the complexity of the academic world.

Endnotes

  1. Refereed publications include books, book chapters, refereed journal articles, and full papers in refereed published conference proceedings. Books are weighted at 5 while the remaining three categories are weighted at 1 (i.e. the weighted count is 5 x books + chapters + articles + conference papers).
  2. The weighting of publications in the current research funding model is only at 10%.
  3. Scopus is a registered trademark of Elsevier B.V. It is the world's largest abstract and indexing database, which offers access to 14,200 peer reviewed titles from more than 4,000 international publishers [http://www.info.scopus.com/, viewed 31 Oct 2005]. By comparison, the ISI database in 2004 included 8,700 refereed journals [http://www.isinet.com/essays/selectionofmaterialforcoverage/199701.html/, viewed 26 Sep 2005; not found 23 Jan 2006, see replacement http://scientific.thomson.com/free/essays/selectionofmaterial/journalselection/].
  4. What this means is that if an academic from ISTP has published as second, third, etc. co-author, with the first author not being from ISTP, these citations are not included in the analysis.
  5. The ISTP 1995-2002 citation averages also compare favourably with the averages of the top political science units in Australia and New Zealand (see Dale and Goldfinch, 2005).
  6. For example, university funding from DEST has been consistently indexed below the CPI (consumer price index).
  7. Some very good recommendations are made by CHASS, 2005.

Acknowledgment

The author acknowledges the financial support of the Australian Research Council. She also wants to thank the academics who have refereed this paper for their comments as well as for their commitment and love for the work they do.

References

Baltagi, B. H. (2003). Worldwide institutional and individual rankings in econometrics over the period 1989-1999: An update. Econometric Theory, 19(6), 165-224.

Chambers, R. (1993). Challenging the professions frontiers for rural development. London: Intermediate Technology Publications.

Chambers, R. (1997). Whose reality counts? Putting the first last. London: Intermediate Technology Publications.

CHASS (Council for the Humanities, Arts and Social Sciences) (2005). Measures of quality and impact of publicly-funded research in the humanities, arts and social sciences. Canberra: CHASS. [viewed 22 Dec 2005] http://www.chass.org.au/

Dale, T. & Goldfinch, S. (2005). Article citation rates and productivity of Australian political science units 1995-2002. Australian Journal of Political Science, 40(3), 425-434.

DEST (Department of Education, Science and Training) (2005). Research Quality Framework: Assessing the Quality and Impact of Research in Australia. Advanced Approaches Paper. Canberra: Government of Australia. http://www.dest.gov.au/sectors/research_sector/policies_issues_reviews/key_issues/research_quality_framework/issues_paper.htm

Freire, P. (1998). Teachers as cultural workers: Letters to those who dare to teach. Boulder, CO: Westview.

Garfield, E. (1996). The significant scientific literature appears in a small core of journals. The Scientist, 10(17), 13.

Goldfinch, S. (2003). Investing in excellence? The performance-based research fund and its implications for political science departments in New Zealand. Political Science, 55(2), 39-53.

Harnad, S., Carr, L., Brody, T., & Oppenheim, C. (2003). Mandated online RAE CVs linked to university eprint archives: Enhancing UK research impact and assessment. Ariadne, 35. http://www.ariadne.ac.uk/issue35/harnad/

Hossain, A., Hossain, P. & Marinova, D. (2005). Recognising sustainability as an area of thriving demand. In Tertiary education: Surviving or thriving - forging the way in a new landscape. Proceedings of the Tertiary Education Management (TEM) Conference [CD]. Perth, Western Australia.

Marinova, D. & McGrath, N. (2004). A transdisciplinary approach to teaching and learning sustainability: A pedagogy for life. In Seeking educational excellence. Proceedings of the 13th Annual Teaching Learning Forum, 9-10 February 2004. Perth: Murdoch University. http://lsn.curtin.edu.au/tlf/tlf2004/marinova.html

Marinova, D., & McGrath, N. (2005). Transdisciplinarity in teaching and learning sustainability. In G. Banse, I. Hronszky & G. Nelson (Eds), Rationality in an uncertain world. Berlin: Edition Sigma: 275-285.

Marinova, D. & Newman, P. (2005). Academic productivity and the changing research funding models in Australia: What is the true picture? In A. Zerger & R. M. Argent (Eds), MODSIM 2005 International Congress on Modelling and Simulation. Modelling and Simulation Society of Australia and New Zealand: 1063-1069. [viewed 5 Dec 2005] http://www.mssanz.org.au/modsim05/proceedings/papers/marinova_2.pdf

McGrath, N., Marinova, D. & Newman, P. (2005). Crossing borders through reflective and participatory practice: Learning, researching, teaching and facilitating sustainability. In The reflective practitioner. Proceedings of the 14th Annual Teaching Learning Forum, 3-4 February 2005. Perth: Murdoch University. http://lsn.curtin.edu.au/tlf/tlf2005/refereed/mcgrath.html

Monastersky, R. (2005). The number that's devouring science. The Chronicle of Higher Education, 52(8), A12-20.

Phillimore, J. (1989). University research performance indicators in practice: The University Grants Committee's evaluation of British universities, 1985-86. Research Policy, 18(5), 255-271.

Smith, K. (2003). Performance measurement of Australian geoscientific minerals researchers in the changing funding regimes. PhD thesis. Perth, Western Australia: Murdoch University.

Smith, K. & Marinova, D. (2005). Use of bibliometric modelling for policy making. Mathematics and Computers in Simulation, 69(1-2), 177-187.

Author: Dora Marinova is an Associate Professor and Head of ISTP, Murdoch University where she teaches in the areas of demography and women and development. She is currently supervising 14 PhD students on topics related to sustainability. Her research interests cover technology policy and development, sustainable business and partnerships. She has published over 60 refereed journal articles and book chapters and has conducted research for Western Australian and Commonwealth Government departments.

Dora Marinova, Institute for Sustainability and Technology Policy (ISTP), Murdoch University, Murdoch, WA 6150, Australia. Email D.Marinova@murdoch.edu.au

Please cite as: Marinova, D. (2006). Academic productivity of Australian academics and higher degree research students: What can we learn from the facts. In Experience of Learning. Proceedings of the 15th Annual Teaching Learning Forum, 1-2 February 2006. Perth: The University of Western Australia. http://lsn.curtin.edu.au/tlf/tlf2006/refereed/marinova.html

Copyright 2006 Dora Marinova. The author assigns to the TL Forum and not for profit educational institutions a non-exclusive licence to reproduce this article for personal use or for institutional teaching and learning purposes, in any format (including website mirrors), provided that the article is used and cited in accordance with the usual academic conventions.

