Insights Blog

2010 Excellence in Research for Australia Exercise: What has been learned about discipline quality and diversity?

24 August 2011, by Professor Emeritus Frank Larkins

 

Frank Larkins is Professor Emeritus in the School of Chemistry at the University of Melbourne. He is a former Deputy Vice Chancellor at that university, Dean of Science and Dean of Land and Food Resources. He has published more than 200 scientific papers. His current interests are in research, education and energy policy developments. He has recently published a book entitled Australian Higher Education Research Policies and Performance 1987-2010 (MUP 2011).

 

 


Australia has considerable diversity in its research activities, as measured by the contributions of universities to the disciplines assessed in the ERA exercise. In terms of quality of performance, for 18 of the 25 disciplines evaluated, at least half of the universities submitting research outputs were deemed to be at or above world standard. However, in some of the remaining 7 disciplines fewer than one third of the universities assessed were at world standard. Overall, science-related disciplines are rated more highly against world standards than humanities and social sciences disciplines. The impact of these findings on the quality of Australian universities' teaching and research programs, and on the government's proposal to reform higher education, requires much more debate.

Introduction

The Commonwealth Government, through the Australian Research Council (ARC), conducted the first Excellence in Research for Australia (ERA) exercise in 2010 (1). The responsible minister, Senator Kim Carr, Minister for Innovation, Industry, Science and Research, has described the ERA exercise as 'a key element of the government's agenda for the reform of Australia's higher education system' (1). The cumulative cost of the 2010 exercise to all parties involved was at least $100 million for data collection, reporting and evaluation. A further evaluation exercise will be conducted in 2012 to refine the ERA methodology.

Among the key questions foremost on the agenda of strategic research policy personnel in universities and in relevant government departments are:

  • What has been learnt by all parties from the 2010 exercise?
  • How should the information be used within universities to manage the diversity and quality of an institution’s future research profile and its impact on teaching programs?
  • How will the Federal Government use the ERA discipline quality information to allocate funding on a performance basis to universities in the future?
  • How should a university prepare for the 2012 exercise in view of the 2010 outcomes?

These questions provide the framework for a series of articles analysing various aspects of the Federal Government's ERA policy initiative. This article discusses the information obtained on the diversity and quality of the various research discipline areas, in aggregate, across Australian universities. The second article in the series will review the performance of individual universities. In an earlier article in the LH Martin Institute Newsletter, Pettigrew (2) analysed some of the ERA discipline outcomes in relation to Total Research Block Grant Income (TRBGI) (3).

ERA data were collected from all institutions and evaluated in eight multidisciplinary clusters using the two- and four-digit Fields of Research (FOR) codes (4). The codes and cluster designations are given in Appendix 1. A rating scale from 1 (well below world standard) to 5 (well above world standard) was used. The performance assessment standards, which inevitably varied with discipline, and the world-standard benchmarks have not been disclosed. Consequently, the transparency of the ERA exercise, and the capacity of non-ARC parties to independently appraise the validity of the evaluations made, have been seriously compromised.

The publicly available information also does not enable one to assess the absolute amount of research activity (the volume factor) undertaken in each discipline or by each university; it reveals only that a submitting university reached a defined threshold of activity in a given FOR discipline area. Consequently, any data pertaining to averages must be interpreted with some caution, although averages can nevertheless provide a useful insight (within limits) into relative performances within and between disciplines.

The task was a very substantial one, with more than 2,400 Units of Evaluation (UoE) assessed. Universities were provided with their assessments at both the two-digit and four-digit FOR levels. This means that a university could potentially have a quality rating in as many as 25 discipline categories (three of the 22 FORs, Technology, Medical and Health Sciences and Built Environment and Design, were split between two clusters) and 156 discipline sub-categories. Individual universities decided in how many disciplines their research outputs would be evaluated. No university was assessed in all sub-categories. All 39 Australian universities submitted some research outputs that were examined at the two- and four-digit FOR levels*. The ERA assessment data required for the present appraisal at the two-digit FOR level are presented in Appendix 2.

* The Melbourne College of Divinity submitted data for assessment in only one FOR and the Batchelor College of Indigenous Tertiary Education did not submit.

Discipline Diversity

One measure of discipline diversity is the number of universities submitting research for assessment in the various FOR categories. These data are listed in Appendix 2 and shown graphically in Figure 1. The information is colour-coded according to the research evaluation cluster to which the discipline was assigned. All 39 universities made a submission for Education (code 13) and for Commerce, Management, Tourism and Services (code 15), while only one university (the University of Queensland) made a submission for Technology (code 10B).

Some 21 (54%) of Australia's 39 universities made submissions in 22 (88%) or more of the 25 discipline categories. The three other codes (10A, 10B and 12B) attracted few entries. The 13 most researched disciplines were practised in 31 (79%) of Australia's universities. Nine of these disciplines are related to the humanities and social sciences. The most practised science and technology disciplines were in the Medical and Health Sciences, followed by the Biological Sciences. The Physical and Mathematical Sciences disciplines are practised in only around three in every five Australian universities. The lower level of activity reflects in part the costs associated with researching in these disciplines and the different teaching profiles of universities. In many Australian universities there is a higher demand for humanities and social science undergraduate teaching programs than for science-based teaching programs.

These findings highlight the breadth and diversity of discipline research activities in Australian universities. In addition to what has been reported, some universities will be active in disciplines they did not report because their level of activity was below the volume threshold established for assessment by the research evaluation committees. The data do provide the basis for an informed policy discussion as to whether Australia has an appropriate research discipline mix in its universities, given national economic, social and cultural requirements and expectations. A more detailed analysis will require a sector profile assessment at the four-digit discipline code level, along with knowledge of the volume of research activity for each FOR code. The Australian Research Council has these data.

Quality of Performance

The quality of performance is at least as important as the overall level of activity in the various disciplines. The numbers of 5, 4 and 3 ratings for each two-digit FOR are given in Appendix 2 (column 4). The information is also shown graphically in Figure 2.


The Medical and Health Sciences discipline component (code 11A) in the Biomedical and Clinical Health Sciences cluster has the largest number of universities rated well above world standard (9 universities at 5). The other outstanding discipline performances rated 5 are the Biological Sciences (code 06, with 8 universities), the Physical Sciences (code 02, with 8 universities) and the Earth Sciences (code 04, with 6 universities). The best discipline in the Social, Behavioural and Economic Sciences cluster was Psychology and Cognitive Sciences (code 17, with 3 universities rated 5). History and Archaeology (code 21) was the best performing discipline in the Humanities and Creative Arts cluster, with 4 universities deemed to be outstanding. In 11 of the 25 two-digit FOR disciplines (44%), the majority of universities are rated at or above world standard. Twenty universities (51%) are performing at this high level. Eight of these 11 disciplines are science-based (codes 02, 03, 05, 06, 07, 09, 11A, 11B) and three are based in the humanities and creative arts (including codes 21 and 24). There are no disciplines from the Social, Behavioural and Economic Sciences cluster in this premier group.

In addition to considering absolute performance, one can examine the percentage of universities submitting research outcomes in each discipline category that were assessed, for each two-digit FOR, at or above world standard (a rating of 3 or higher). The data are listed in Appendix 2 (column 5) and presented graphically in Figure 3, ordered according to the research cluster to which each two-digit FOR belongs. There is a wide variation in the percentage of universities receiving a rating of world standard or above. The Chemical Sciences achieved the most outstanding result (100%), with all 26 submitting universities rated at or above world standard. The Technology code 10B in the Biological Sciences and Biotechnology cluster also received this distinction, but only one university, the University of Queensland, submitted in this category.
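
The percentage measure itself is straightforward to reproduce from the Appendix 2 tallies. The minimal sketch below (in Python) shows the arithmetic behind the Figure 3 percentages; the rating tallies are illustrative placeholders rather than actual ERA data, apart from the Chemical Sciences result reported above.

```python
# Sketch: deriving the 'percentage at or above world standard' measure
# used in Figure 3 from per-discipline rating tallies.
# Tallies are illustrative, not ERA data, except Chemical Sciences
# (code 03), for which the article reports all 26 submitting
# universities were rated at or above world standard.

# ratings[code] = ERA ratings (1-5), one per submitting university
ratings = {
    "03": [3] * 10 + [4] * 8 + [5] * 8,  # 26 universities, all >= 3
    "13": [1, 2, 2, 2, 3, 3, 4],         # hypothetical discipline with a weak tail
}

for code, scores in ratings.items():
    at_or_above = sum(1 for s in scores if s >= 3)
    pct = 100 * at_or_above / len(scores)
    print(f"FOR {code}: {at_or_above}/{len(scores)} at or above world standard ({pct:.0f}%)")
```

Applied across the full Appendix 2 data, the same calculation would reproduce the ordering shown in Figure 3.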

Viewed from this perspective, Australia's depth in quality performance is not so impressive. In only six fields are more than 80 percent of the submitting universities assessed to be at or above world standard, and they are all in the sciences (codes 02, 03, 04, 05, 07, 10B). In 18 of the 25 disciplines evaluated (72%), at least 50% of the universities evaluated were deemed to be at or above world standard. In some of the remaining 7 disciplines fewer than one third of the universities were at world standard. A policy discussion is required to establish whether this is an acceptable outcome, but such a discussion can be of only limited value until the world-standard benchmarks used, and the levels of research activity by university and FOR, are made available. What is striking about the data presented in Figure 3 is that, on a system-wide basis, the research outcomes of all the Social, Behavioural and Economic Sciences disciplines (codes 13, 14, 15, 16, 17) were assessed to be significantly below world standard in the majority of universities. This outcome must be of concern, as these are the disciplines in which a significant amount of undergraduate teaching is undertaken, and many academics advocate that research-led teaching is essential to maintaining world-class teaching and learning standards.

The individual discipline performances can be aggregated into the eight multidisciplinary clusters to provide a more consolidated view, at a level more manageable for research policy development purposes. The percentages of universities at or above world standard at the cluster level are given in Figure 4. The outcomes reflect the primary data discussed earlier, with the standout performances being mainly in the sciences.

Average Discipline Ratings

Caution must be exercised in averaging the discipline ratings, because the data on the level of research activity associated with each discipline that would permit a more complete analysis are not available. Nevertheless, simple averaging of the rating scores does provide some insight into the distribution within a discipline. In particular, it reveals whether there are low scores that counterbalance the high scores and therefore depress the raw average. The discipline and cluster averages are given in Appendix 2 (columns 6 and 7). The averages by discipline are presented in Figure 5.
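
The caveat about volume can be made concrete with a small sketch. Both lists below are invented purely for illustration, since ERA volume data are not public: an unweighted average is dragged below world standard by a tail of small, weak units, whereas a volume-weighted average, were the activity data available, could tell quite a different story.

```python
# Sketch: unweighted vs. (hypothetical) volume-weighted discipline averages.
# Both lists are invented for illustration; ERA volume data are not public.

ratings = [5, 4, 4, 3, 2, 2, 1, 1]       # one rating per submitting university
volumes = [90, 60, 55, 30, 10, 8, 5, 4]  # hypothetical research-output volumes

simple_avg = sum(ratings) / len(ratings)
weighted_avg = sum(r * v for r, v in zip(ratings, volumes)) / sum(volumes)

print(f"simple average:   {simple_avg:.2f}")   # 2.75 -- below world standard
print(f"weighted average: {weighted_avg:.2f}") # 3.99 -- dominated by large, strong units
```

On the unweighted figure the hypothetical discipline looks below world standard; weighting by activity would suggest the opposite, which is precisely why the averages in Figure 5 must be read with care.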

When all the ratings are taken into consideration, only 12 of the 25 disciplines have an average rating at or above world standard. Only two of these disciplines are in the Humanities and Creative Arts cluster, and none are in the Social, Behavioural and Economic Sciences cluster. While the data should not be over-interpreted, they do indicate that in a number of disciplines there is a 'long tail' of weak performances counterbalancing the good performances of other universities. This is a matter that needs to be addressed at the policy level. To a large extent, the conclusions to be drawn from the data presented in Figure 5 are similar to those obtained from the percentage performance data in Figure 3, if somewhat less supportive of overall system-wide quality.

Conclusion

The ERA exercise has provided valuable insight into the discipline spread across Australian universities and their quality standings, assessed against non-transparent world standards. It is evident that there is considerable room for research improvement in most disciplines, especially the humanities and social sciences. The next article will examine the performance of individual universities across the various disciplines.

The setting of research priorities for future investments in many universities will undoubtedly be influenced by the ERA outcomes. It is essential that Australia maintains a diverse and internationally competitive higher education research profile. Universities have a major responsibility to support the world-class skills base required to maintain Australia's economic competitiveness with our major trading partners. Key objectives of the ERA exercise should be to strengthen Australia's international research standing and to underpin world-competitive teaching programs. It will be several years before the full reforming impact of the ERA exercise can be reliably evaluated.
 

References

1. ERA outcomes, http://www.arc.gov.au/era/outcomes_2010.htm
2. Alan Pettigrew, 'Excellence in Research – where is it?', LH Martin Institute, The University of Melbourne, 6 April 2011, http://www.lhmartininstitute.edu.au/insights-blog/2011/03/43-excellence-in-research-where-is-it
3. Research Block Grants, http://www.innovation.gov.au/Research/ResearchBlockGrants/Pages/default.aspx
4. Australian and New Zealand Standard Research Classification, http://www.arc.gov.au/era/ANZSRC.htm
 
