Society News

EPS response to REF consultation on UoA4 costing exercise

As a Learned Society, the EPS was invited to consider the current REF consultation exercise and to submit a response to a variety of questions; the deadline was 15 October 2018. The Officers do their best to respond to consultations in a way that fits their understanding of Members’ interests and attitudes, and of the field of experimental psychology as a whole, bearing in mind the constraints on how such exercises are timed. The broader strategy of the Society can be, and often is, actively debated at the AGM.

At a number of points in the REF cycle, the EPS has attempted to offer constructive suggestions and views to inform decision making at the discipline level, whilst refraining from commenting on issues that fall within the remit of institutional decisions or practices. The EPS committee agreed to comment specifically on the proposal that REF introduce a costing exercise for UoA 4. For transparency, we report the EPS submission below.

12a. How feasible do you consider to be the approach set out at paragraphs 267 to 271 for capturing information on the balance of research activity of different costs within submitting units in UOA 4? (300 word limit)

The Experimental Psychology Society opposes the costing of research outputs from UoA 4. Our concerns about feasibility relate to measurement, coherence, and risk of perverse incentives.

We recognise that research costs vary across projects. This is the case in many UoAs, and therefore it is not clear why costing of research outputs should be applied idiosyncratically in UoA 4.

The consultation document proposes costing of the methods used in research, rather than the actual total cost of a piece of research, in the sense of its Full Economic Costing (FEC). It is easy to think of experiments with low-cost methods but high FEC, or vice versa. The UK science funding framework is based on FEC, and it is hard to defend using different costing frameworks for research funding (FEC) and research assessment. Two key factors that affect the FEC of research are not mentioned: researcher time investment and dataset size. Both have a positive effect on research quality, and particularly on research reproducibility. The current proposal risks rewarding small, irreproducible studies with high infrastructure costs, and penalising careful, reproducible studies with lower infrastructure costs. This would be a major scientific mistake, and runs contrary to the current consensus on improving the reliability of research.

REF focuses on the evaluation of research outputs; it is not tuned for the assessment of cost, an input measure, or for the assessment of outputs relative to inputs. We endorse this focus on outputs, and see a serious risk that conflation of input and output measurement will encourage institutions to believe they can gain by driving up the cost base of research. It is likely to encourage the use of expensive research methodologies, which is not the same as the REF’s stated aim of encouraging excellent research.

12b. Are the examples of high cost and other research activity sufficiently clear to guide classification? (300 word limit)

Based on the examples, we do not have confidence that classification will provide valid and reliable indicators of cost. One problem relates to change in costs over time. If an institution invests in expensive equipment, the costs of the first output (based on the initial capital) will differ from those of the next (in which the initial capital is no longer directly relevant). Another problem relates to distribution of effort. Use of expensive methods often involves collaborative teams, often working internationally. If we understand the consultation document correctly, a UK researcher who submits a paper with cutting-edge fMRI data collected at another university will bring money into the university that employs them, not the university that bore the cost of the neuroimaging facility. (The Experimental Psychology Society advised against such ‘portability’ in a prior REF consultation.) We see no reliable way of directing the rewards towards the institutions that actually incurred the costs, which risks a perception of unfairness.

Published papers will not generally provide enough information to make accurate estimates of the cost of the methods used. Misconceptions abound: brain stimulation, for example, is mentioned as a high-cost method, yet sample sizes are small, analyses are often simple, and the equipment can be cheap. In animal studies, stains, reagents and vectors vary dramatically in cost, but the cost will probably be known only to those who buy them. Assessors may not have the information to make accurate estimates of cost, and should not therefore be asked to do so.

12c. Please provide feedback on any specific points in the guidance text as well as the overall clarity of the guidance. (300 word limit)

Para 270 proposes a classification into three bands based on the percentage of research activity that is classed as high cost. This may encourage departments to select outputs based not on the quality of the research but on the cost of the infrastructure in order to ensure classification in a preferred band. This has the potential to be divisive, and runs counter to the central aim of the REF: to reward excellence.

The Experimental Psychology Society agrees that advanced research methods are important, and REF should encourage rather than discourage investment in key research infrastructures, such as animal labs and neuroimaging. In our view, this should be done through the *environment* assessment, rather than through output assessment. That way, funding for high-cost infrastructures is guaranteed to go to the HEI that bears the cost, which is not the case for output assessment. We recommend that the REF team consider how evaluation of methods-based facilities in the environment assessment can take account of the research productivity of such facilities, as well as their existence. HEIs should be rewarded for facilities that produce useful research, not for facilities that are poorly used.

Society News

Small grants and study visits: open for open science!

The EPS committee have agreed to make explicit in the guidance for small grant and study visit applicants that we welcome proposals that specify relevant open science practices. These awards are used for a wide variety of purposes, and without being restrictive, we simply encourage applicants to frame their proposals in ways that make clear their potential value to the applicant and the wider community.