Measuring the effects of the research funding changes


DATA AND METHOD

Scope of analysis

This report examines the performance of seven of the eight Centres of Research Excellence (CoREs) across five areas: strategic impact, research networks and collaboration, research quality, the academic impact of research, and knowledge transfer (including human capital development). As CoRE reporting evolves, we expect to be able to report on performance more regularly.
We focus on the performance of the CoREs between 2002 and 2010.

Strategic impact

This section describes some of the most important impacts of each CoRE's research. The CoREs were selected for their research excellence and their strategic importance to New Zealand, so we look at the contributions each CoRE has made to wider New Zealand society and the economy, beyond producing research papers and beyond its education role. The focus is on how each CoRE's research has been used in society and in business, and how New Zealand's society and economy have been influenced by that research.

Research networks and collaboration

Using the publication lists compiled for the principal investigators in each CoRE, we constructed co-author network diagrams for each year. A co-author relationship between a pair of authors is recorded whenever both appear in the author list of the same publication in a given year. By representing authors as nodes and connecting co-authors by lines, we can draw co-authorship network diagrams. These diagrams give an indication of the degree of collaboration occurring between principal investigators within the CoRE and may show how collaboration changes over time as the CoRE develops.
It is to be expected that principal investigators will co-author articles with people who are not principal investigators: in some instances these co-authors will not be associated with the CoRE, while in other cases they may be affiliated with the CoRE in other ways. We have retained all co-authors in the network diagram, whether principal investigators or not, as we recognise that collaboration can be mediated by individuals who are not principal investigators (eg co-supervised research students or postdoctoral fellows, or associate investigators of the CoRE).
For each CoRE, we have constructed co-authorship diagrams to illustrate trends in collaboration between researchers. In this report, we reproduce two such diagrams for each CoRE: one covering joint publications over a two-year period at the start of its funding (2003/04 for most CoREs) and a second covering 2009/10. This allows us to depict the early- and late-stage collaboration networks that have arisen within each CoRE.
Examples of these network diagrams and how to interpret them are presented below.
In year a, the principal investigators (represented by blue dots) are not collaborating with each other. In year b, however, two of the principal investigators are directly linked through a co-authored paper, while two others are indirectly linked through a shared co-author.
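
To make the construction concrete, the sketch below shows how such a network could be assembled in Python with the networkx library, assuming each publication record has been reduced to a simple list of author names. The author names, and the use of networkx itself, are illustrative rather than a description of the tooling actually used for this report.

    from itertools import combinations
    import matplotlib.pyplot as plt
    import networkx as nx

    # Illustrative author lists, one per publication in the two-year window
    # being drawn (eg 2003/04 or 2009/10).
    publications = [
        ["PI A", "PI B", "Student X"],
        ["PI B", "Postdoc Y"],
        ["PI C"],
    ]
    principal_investigators = {"PI A", "PI B", "PI C"}

    graph = nx.Graph()
    for authors in publications:
        graph.add_nodes_from(authors)
        # A co-authorship tie is recorded whenever two authors appear on the
        # same publication; repeated ties increase the edge weight.
        for a, b in combinations(sorted(set(authors)), 2):
            if graph.has_edge(a, b):
                graph[a][b]["weight"] += 1
            else:
                graph.add_edge(a, b, weight=1)

    # Principal investigators (the blue dots in the diagrams) are coloured
    # differently from other co-authors when the network is drawn.
    colours = ["blue" if n in principal_investigators else "grey" for n in graph.nodes]
    nx.draw(graph, node_color=colours, with_labels=True)
    plt.show()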

Research quality

To rate the quality of the peer-reviewed journal publications from the CoREs, we have assigned each publication a rating reflecting the quality of the journal in which it was published. We used the journal quality ranking list produced by the Australian Research Council (ARC) as part of the Australian university research quality assessment system – Excellence in Research for Australia (ERA).
The ARC determined the journal rankings through a comprehensive consultation process, with several discipline-representative bodies assisting in their compilation, so the rankings reflect a consensus among Australian researchers. By their nature, however, they retain an element of subjectivity, and they were determined with publication by Australian authors in mind. Publications in New Zealand-based journals may therefore receive a lower ranking than they would if a similar exercise were carried out in New Zealand.
Over 20,000 journals were ranked by the ARC into four tiers. The definitions of the four tiers are presented in Table 3. The ARC applied a 'not ranked' category to new journals. In addition, a number of journals that published articles by CoRE researchers did not appear on the list, either because Australian authors did not publish in them and so they were not ranked by the ARC, or because the journals had only recently been created. We have allocated these journals to the 'not ranked' category.
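
As a minimal sketch of this allocation step, assume the ARC list has been loaded as a lookup from journal title to tier, with any journal missing from the lookup falling into the 'not ranked' category. The journal titles and tiers below are placeholders, not entries from the actual ARC list.

    # Hypothetical excerpt of the ARC/ERA journal ranking list (title -> tier).
    era_rankings = {
        "Example Journal of Molecular Evolution": "A*",
        "Example Applied Ecology Letters": "A",
        "New Zealand Journal of Example Studies": "B",
    }

    def journal_tier(journal_title):
        # Journals absent from the ARC list (eg very new journals, or journals
        # in which Australian authors did not publish) are treated as not ranked.
        return era_rankings.get(journal_title, "not ranked")

    # Each CoRE publication is then rated by the tier of its journal.
    core_publications = [
        {"title": "Paper 1", "journal": "Example Journal of Molecular Evolution"},
        {"title": "Paper 2", "journal": "Unlisted Conference Annals"},
    ]
    ratings = [journal_tier(p["journal"]) for p in core_publications]
    print(ratings)  # ['A*', 'not ranked']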

Academic impact of research

This analysis looked at articles published by researchers designated as principal investigators at each of the CoREs. We used the Thomson Reuters Web of Science to examine the number of citations per publication for each of the CoREs and compared that with the same measure for the publications in a similar field produced by a top-ranked institution in Australia.
For each of the six science-focused CoREs, the lists of investigators were used to compile publication lists for each year through author searches of the Science Citation Index Expanded and the Conference Proceedings Citation Index – Science databases in the Thomson Reuters Web of Science. The searches were restricted to articles, proceedings articles, notes and letters. For each CoRE, impact factors (IF) for each year from 2005 to 2011 (from 2006 for Gravida and from 2010 for the Riddet Institute) were computed from the citations recorded in the Web of Science according to the formula

    IF(Year) = C(Year, {P(Year−1)} ∪ {P(Year−2)}) / N({P(Year−1)} ∪ {P(Year−2)})

where P(Year) is the list of publications in a given Year, N({P}) is the number of publications in the set {P} and C(Year, {P}) is the number of citations received in a given Year by the publications in the set {P}. This is essentially the same measure of impact as that commonly used for journals (the journal impact factor, or JIF).
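
A minimal sketch of this computation is given below, assuming each publication record has been reduced to its publication year and a per-year citation count taken from the Web of Science; the field names and figures are illustrative only.

    def impact_factor(year, publications):
        # Citations received in `year` by publications from the two preceding
        # years, divided by the number of those publications (the JIF-style
        # measure defined above).
        window = [p for p in publications if p["year"] in (year - 1, year - 2)]
        if not window:
            return 0.0
        citations = sum(p["citations_by_year"].get(year, 0) for p in window)
        return citations / len(window)

    # Illustrative CoRE publication records.
    core_publications = [
        {"year": 2008, "citations_by_year": {2009: 4, 2010: 7}},
        {"year": 2009, "citations_by_year": {2010: 3, 2011: 6}},
    ]
    print(impact_factor(2010, core_publications))  # (7 + 3) / 2 = 5.0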
Because there are different rates of citation in different subject disciplines, the multi-disciplinary nature of some of the CoREs in this analysis means that the impact factors need to be treated with an element of caution. In particular, it is not valid to compare the performance of the CoREs against each other. Rather, in this measure, we compare each CoRE against a leading benchmark institution.
As it proved difficult to locate an institution that had an inter-disciplinary mix that was comparable with that of the Maurice Wilkins Centre, that CoRE was excluded from this part of the analysis.
Ngā Pae o te Māramatanga (NPM) operates across a range of disciplines, with a particular focus on the emerging field of indigenous studies. The impact factor analysis described above is not applicable to the social sciences because publication conventions differ between the natural and health sciences on the one hand and the social sciences on the other, so we have not used this approach for NPM.
To select benchmark institutions, we used the results of the recent Excellence in Research for Australia (ERA) initiative. Each CoRE in the study was asked to nominate, from the Australian and New Zealand Standard Research Classification, the fields of research most relevant to its work. The Australian university with the highest aggregate ERA score in each CoRE's nominated fields of research was then chosen as the corresponding benchmark institution (a sketch of this selection rule follows the list below). In some cases, where the benchmark institution did not adequately cover the subject areas of the CoRE's journal publications, a further benchmark was added to cover those areas. This procedure led to the following choices of benchmark institutions:

  • The Allan Wilson Centre: University of Queensland and Stony Brook University (an American university)
  • The Bio-Protection Research Centre: University of Adelaide
  • The MacDiarmid Institute: University of Queensland
  • Gravida: University of Queensland and University of Adelaide
  • The Riddet Institute: University of Queensland, University of Adelaide and University of New South Wales.
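
As noted above, the selection rule can be sketched as follows, assuming ERA scores are available for each institution and field of research; the institutions, field-of-research codes and scores shown are placeholders rather than actual ERA results.

    # Hypothetical ERA scores keyed by (institution, field-of-research code).
    era_scores = {
        ("University A", "0603"): 5,
        ("University A", "0607"): 4,
        ("University B", "0603"): 4,
        ("University B", "0607"): 3,
    }

    def top_benchmark(nominated_fields):
        # Pick the institution with the highest aggregate ERA score across the
        # fields of research nominated by the CoRE.
        institutions = {inst for inst, _ in era_scores}
        totals = {inst: sum(era_scores.get((inst, field), 0) for field in nominated_fields)
                  for inst in institutions}
        return max(totals, key=totals.get)

    print(top_benchmark(["0603", "0607"]))  # University A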

Although this means we are comparing a CoRE’s performance with the entire research output in the selected subject areas of an Australian university, the research performance of that university was assessed as being of world class and hence provides a useful benchmark.
A list of publications with authors from each of the benchmark institutions was then compiled by searching for all publications from these institutions that fell in the top eight most frequent Web of Science subject area classifications for the corresponding CoRE publication lists. Where multiple benchmark institutions were used (for Gravida and Riddet), an aggregated impact factor was computed from a weighted sum of impact factors for different subject areas at the different benchmark institutions. The weights in this sum were chosen to match the aggregate distribution of Web of Science subject areas that made up each particular CoRE's publication list. This procedure produced a representative list of publications from the benchmark institution(s) that overlapped substantially in subject classification with the corresponding list from each CoRE. The impact factor of this representative list was then computed using the formula above.
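
Where two or three benchmark institutions were combined, the aggregation can be sketched as follows, assuming per-subject-area impact factors have already been computed for the benchmark publications with the formula above; the subject areas, shares and values below are illustrative only.

    # Share of each Web of Science subject area in the CoRE's own publication
    # list; these shares act as the weights in the aggregation.
    core_subject_shares = {"Food Science": 0.5, "Nutrition": 0.3, "Biochemistry": 0.2}

    # Impact factors for those subject areas, each taken from whichever
    # benchmark institution covers that area.
    benchmark_impact_factors = {"Food Science": 6.1, "Nutrition": 4.8, "Biochemistry": 7.3}

    # Weighted sum matching the CoRE's subject-area mix.
    aggregate_if = sum(share * benchmark_impact_factors[subject]
                       for subject, share in core_subject_shares.items())
    print(round(aggregate_if, 2))  # 0.5*6.1 + 0.3*4.8 + 0.2*7.3 = 5.95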

Knowledge transfer

This analysis describes some of the ways in which the CoREs in this study transfer knowledge beyond the production of academic publications, including the number of postgraduate research students they support and their commercialisation and outreach activities.
