
Goalposts on the move


The impact profile of recent UK geoscience research is set to change, say Nick Petford* & Jonathan Adams#

 

Geoscientist Online and Geoscientist February 2008


It is generally accepted that the number of citations a paper receives is a useful measure of research impact1. Indeed, citation counts, renormalised to a common benchmark, provide compelling evidence that UK science is truly world class - second only to the USA in terms of citation impact2. In an article published recently in the journal Scientometrics, researchers at Evidence Ltd, a consultancy specialising in research performance analysis (www.evidence.co.uk), have added a new layer of sophistication to the citation analysis game. The paper3 looks in detail at UK bibliometric journal impact profiles from 1995 to 2004, and presents a new methodology with important implications for the way citations might be used to assess research performance in the metrics-driven world set to follow the 2008 Research Assessment Exercise4.

Rather than reporting the average citation rate (for a research group or individual) as a measure of research impact, as is common practice, their analysis takes research significance into account by profiling and comparing impact factor distributions. For example, while citations per paper are clearly and demonstrably related to impact (and easy to understand), they absorb outlier values and hide the fact that research performance data of any kind are rarely normally distributed. Where data are skewed, the "average" can differ significantly from the median and mode. This means that a static value of average impact factor (citations per paper, itself a ratio), while useful as a measure of research quality, will not paint a full picture of research activity and could even be misleading5.
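To see why a single average can mislead when the underlying data are skewed, consider a small illustrative calculation (Python; the citation counts below are invented for the purpose of the example, not drawn from the Evidence dataset):

    from statistics import mean, median

    # Hypothetical, right-skewed citation counts for ten papers: most papers
    # attract few citations while a couple of outliers attract many.
    citations = [0, 0, 1, 1, 2, 2, 3, 4, 12, 45]

    print(mean(citations))    # 7.0 - pulled up by the two outliers
    print(median(citations))  # 2.0 - closer to a 'typical' paper

The mean of seven citations per paper says little about the typical output in this toy example; a profile of the whole distribution retains that information.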

Citation profiles and Rebased impact

In order to move away from reporting a single value as a proxy for quality, the Evidence team have developed a method based on citation profiles. The primary data underpinning the analysis were drawn from 750,376 individual UK journal articles covering the major science, technology, engineering and mathematics (STEM) subjects, held on the Thomson Scientific Inc. database. Using total UK publication data covering a 10-year period, the team split citations per paper into categories based on a reference value called the Rebased Impact (RBI) index. This normalises citation counts to the world average for the year and discipline category of publication. In this model, an uncited article has an RBI of 0, while an article with RBI > 1 is cited above the world average. An RBI of 0.5 corresponds to a normalised citation count of half the world average, while RBI = 2 is twice the world average. An exceptional RBI > 8 corresponds to the top-cited 1-2% of UK research papers and overlaps with Thomson's 'Highly Cited' papers, which comprise the top 1% in a given field.
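In code, the rebasing and binning steps might look something like the sketch below (Python, for illustration only: the world-average lookup and the exact bin edges are assumptions based on the categories described above, not the Evidence implementation):

    # Sketch of the Rebased Impact (RBI) calculation and profile binning.
    # world_average is assumed to be a lookup of the world mean citations per
    # paper for a given publication year and Thomson journal category.

    def rebased_impact(citations, year, category, world_average):
        """RBI = citation count normalised to the world average for the
        paper's publication year and subject category."""
        return citations / world_average(year, category)

    def rbi_bin(rbi):
        """Assign an RBI value to a profile category (bin edges assumed)."""
        if rbi == 0:
            return "uncited"
        for upper, label in [(0.5, ">0-0.5"), (1, "0.5-1"), (2, "1-2"),
                             (4, "2-4"), (8, "4-8")]:
            if rbi <= upper:
                return label
        return ">8"

    # e.g. a paper with 6 citations in a category whose world average is 4.0
    # has RBI = 1.5 and falls in the "1-2" bin.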

Results

Using this method, it is possible to analyse differences in performance by subject area nationally, by university and even at research centre level. Table 1 summarises research impact profiles by journal categories that equate broadly to subject disciplines, drawn from Adams et al. (2007).

The category marked Geosciences contains 22,939 UK outputs over the period 1995-2004; the table gives the percentage of those outputs in each RBI category, together with performance relative to the world average. Also included for comparison are data from the journal categories for Chemistry and Physics. These three subject areas together comprise Main Panel D in RAE2008 and thus make natural comparators for the analysis presented here (see also Fig. 1).


Table 1. Summary of UK research impact (1995-2004) by Thomson Scientific subject areas for Geosciences, Chemistry and Physics (from Adams et al., 2007).

Without delving into too much detail, and ignoring for now the potentially thorny problem of a precise definition of "Geosciences", some broad conclusions can be drawn. Firstly, 21.4% of all geoscience articles go uncited over the review period (similar to the UK average of 21.8%). Secondly, 42.2% of UK geoscience journal articles are cited but fall below the world average (RBI < 1). Similar performance in these categories is mirrored in Physics and Chemistry. When the data are aggregated as a whole, it is apparent that while the UK RBI average is 1.33, some two-thirds of all outputs are at or below the world average. This is strong evidence of the potential for misunderstanding that would arise from superficial indices derived from skewed metrics, especially if those then affect funding. For the geosciences, broadly defined, 63.6% of all publications are either never cited or are cited less than the world rebased average. However, at the upper end of the citation scale, Geosciences compares favourably with its Panel D comparators and the total UK average. For example, the percentage of outputs with RBI > 8 in Geosciences is 1.7% (UK average = 1.5%), compared to 1.8% for Chemistry and 3.0% for Physics. The percentage of outputs cited above the world average in Geosciences (36.4%) is only just behind Chemistry (36.5%) and marginally ahead of Physics (34.2%), despite the fact that these subject areas produced nearly three times the research output over the census period.


Fig 1. RBI scores for Geosciences over the period 1995-2004, along with impact profiles for Chemistry and Physics as comparators (see Table 1). All three subject areas display similar RBI citation profiles.

The profiling methodology allows the UK Geosciences rebased impact to be tracked over time. For this analysis Evidence has applied moving five-year windows (which help to smooth annual volatility) for the period from the early 1990s through to the most recent five years for which data are available. Fig. 2 shows how the RBI has risen from a value of around 1.17 at the time of the 1996 RAE to its present-day value of 1.33. The steepest rate of change followed the 2001 RAE, when the RBI climbed rapidly over three consecutive years. The hint of recent decline is probably transitory.
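A moving-window calculation of this kind is straightforward to reproduce. The sketch below assumes a mapping from publication year to that year's aggregate RBI (illustrative only; the published analysis works from paper-level data and may aggregate differently):

    # Sketch of the moving five-year windows used to smooth annual volatility.
    # rbi_by_year is assumed to map publication year -> aggregate RBI of the
    # UK Geoscience papers published in that year.

    def five_year_windows(rbi_by_year, start, end):
        """Yield (window label, mean RBI) for each five-year window from
        start to end inclusive, e.g. 1993-1997 through 2002-2006."""
        for first in range(start, end - 3):
            years = range(first, first + 5)
            window_mean = sum(rbi_by_year[y] for y in years) / 5.0
            yield "{0}-{1}".format(first, first + 4), window_mean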


Fig. 2. UK Geoscience average rebased impact (RBI) for a sequence of five-year windows: 1993-1997 through 2002-2006.

The power of the profiling methodology is seen in Fig. 3, where the distribution of Geoscience outputs for the most recent 10 years is grouped into finer RBI bins. Data presented in this way allow subject areas and research performance to be compared against national or international benchmarks much more informatively than simple averages would allow. Here, it is clear that, as a subject group, Geosciences outperform the UK RBI baseline across the bins from 0.5-1.0 to 2.0-4.0 and have a lower percentage of uncited papers than the national average.

Such high-level profiling is useful for assessing the relative research strength of whole subject areas, but could also be applied to universities and research centres. Benchmarks could be based on internal metrics or on comparator organisations working in the same field. It is now easy to benchmark a university against the UK subject area as a whole. Research managers will of course be looking for enhanced performance at the upper end of the RBI scale and a lower percentage of uncited papers compared with the benchmark value. They can also analyse the distribution of authorship across their highly cited papers.
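As a sketch of that kind of benchmarking, the comparison below contrasts the share of a unit's papers in each RBI bin with a baseline profile such as the UK subject-area distribution (bin labels and inputs are illustrative, not the Evidence implementation):

    # Compare a unit's RBI profile (percentage of papers per bin) against a
    # benchmark profile such as the UK subject-area baseline.

    BINS = ["uncited", ">0-0.5", "0.5-1", "1-2", "2-4", "4-8", ">8"]

    def rbi_profile(bin_labels):
        """Percentage of papers in each RBI bin, given one bin label per
        paper (as assigned by the binning step sketched earlier)."""
        counts = {b: 0 for b in BINS}
        for label in bin_labels:
            counts[label] += 1
        return {b: 100.0 * counts[b] / len(bin_labels) for b in BINS}

    def compare_to_benchmark(unit, benchmark):
        """Percentage-point difference from the benchmark in each bin; positive
        values in the high-RBI bins, and a negative value for 'uncited',
        indicate stronger relative performance."""
        return {b: unit[b] - benchmark[b] for b in BINS}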


Fig 3. Profiled 1997-2006 citation impact of 24,574 Geoscience papers (blue curve) compared with the total UK citation impact (740,389 papers). The distribution shows the Geosciences 'outperform' the UK baseline with fewer uncited papers, less activity in the low-citation region (RBI < 1) and higher activity where RBI > 1. At the very top end (the 4-8 and > 8 RBI bins), however, the UK baseline has a slight edge.

Definition of "Geosciences"

But before we all get too excited, we should pause to consider what exactly is meant by Geosciences as defined currently by Thomson Scientific. The journal category Geosciences contains 439 journals covering a very wide range of subject matter indeed (see www.in-cites.com/journal-list/index.html). As well as our very own Journal of the Geological Society, it also includes the entire Journal of Geophysical Research family, and publications on remote sensing, climatology, oceanography, glaciology – nearly everything, in fact, to do not just with the solid Earth, but atmospheres and oceans as well. This is a very broad definition, with its boundaries blurred into physical geography, meteorology, environmental sciences and ecology; single honours geology it is not! And with this wide spread of categories comes a rather awkward question: "Which bit (if any) of Geosciences sensu lato dominates the 'high impact' end?" Put another way, could the sustained greater-than-average performance since 1997 (Fig. 2) be due to consistently high performance in a specific subset of Geosciences as presently defined?

Institutional ranking based on RBI metrics

Perhaps one way to answer this question is to look at the top ten UK institutions ranked by article count where RBI > 8. This is shown in Table 2 for the period 1997-2006 used in Fig. 3. As a proxy for those institutions that retain a grounding in traditional geological subjects, the table also shows where F600 (undergraduate BSc Geology) is offered.


Table 2. Top UK institutions ranked by Geoscience article count where RBI > 8. Two of the eight HEIs do not offer single honours Geology. The table also includes two research institutes (the Meteorological Office and the European Centre for Medium-Range Weather Forecasts) specialising in atmospheric processes.

While we should not read too much into rankings compiled this way (the table summarises data for institutions, not specific geoscience departments), some interesting points emerge. The most immediate is that the big jump in RBI > 8 count (24) between the élite universities and the Meteorological Office points clearly to the fact that, within Thomson's Geosciences category, papers in atmospheric sciences/meteorology dominate the high-impact end of the spectrum. This is further supported by the fact that the European Centre for Medium-Range Weather Forecasts outperforms three traditional universities (Edinburgh, Manchester and Southampton). We have no information yet as to the specific content of these high-scoring papers, but purely in volume terms they contribute significantly to the above-world-average performance of UK Geosciences. An interesting open question is this: were the criteria for Geosciences revised to exclude meteorology (but keep oceanography), would the field still perform consistently above the world average?

Implications for UK Geoscience

Academics can expect to see a growth in the profiling of institutions, as collective research output is ranked increasingly against national and international benchmarks. This could easily be extended to research group level and even the individual. Indeed, US research looking into publication behaviour based on a sample of AGU Fellows clearly identifies the majority of researchers as "prolific" (meaning that their work is cited more than 21 times)6. Combining these gross behavioural traits with new and more subtle metrics as a measure of research "quality" may become the norm, especially at the high-impact end.

More contentiously, institutional profiling (Table 2) suggests that, given the current wide definition of Geosciences, the highest scoring departments in a metrics-driven QR funding environment will be those that have a strong atmospheric sciences/meteorology research component. The community will therefore wish to evaluate HEFCE's proposals4 with care and engage closely with them in debating exactly how the metrics process is developed, structured and weighted by discipline. Universities UK's recent report5 provides useful background to this debate.

References

  1. Garfield, E. (1955), Citation Indexes for Science: A New Dimension in Documentation through Association of Ideas, Science, 122, 108-111.
  2. King, D.A. (2004), The scientific impact of nations, Nature, 430, 311-316.
  3. Adams, J., Gurney, K. & Marshall, S. (2007), Profiling citation impact: a new methodology, Scientometrics, 72, 325-344.
  4. HEFCE (2007), Consultation on the Research Excellence Framework, http://www.hefce.ac.uk/Research/assessment/reform/
  5. Evidence (2007), The Use of Bibliometrics to Measure Research Quality in UK Higher Education Institutions, Report to Universities UK, ISBN 978 1 84036 165 4.
  6. Laird, J.D. & Bell, R.E. (2007), EOS, 88(38), 370-371.

* University Executive Group, Bournemouth University, Fern Barrow, Dorset BH12
# Evidence Ltd, 103 Clarendon Road, Leeds LS2 9DF