Following feedback from the sector, the REF team has increased the initial threshold for impact to 20 FTE (from 15 FTE).
Updated requirements are as follows:
FTE of submitted staff    Number of case studies
Up to 19.99               2
20 to 34.99               3
35 to 49.99               4
50 to 64.99               5
65 to 79.99               6
80 to 94.99               7
95 to 109.99              8
110 to 159.99             9
160 or more               10, plus one case study per additional 50 FTE
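As a quick sketch, the table above can be expressed as a lookup function (the function name and the rounding-down of partial 50-FTE increments are my assumptions, not part of the official guidance):

```python
def required_case_studies(fte: float) -> int:
    """Number of impact case studies required for a Unit of Assessment,
    following the updated thresholds above (illustrative sketch only)."""
    if fte < 20:
        return 2
    if fte < 110:
        # 15-FTE bands: 20-34.99 -> 3, 35-49.99 -> 4, ... 95-109.99 -> 8
        return 3 + int((fte - 20) // 15)
    if fte < 160:
        return 9
    # 160 or more: 10, plus one per additional complete 50 FTE
    return 10 + int((fte - 160) // 50)
```

For example, a unit submitting 42 FTE would need 4 case studies, while one submitting 260 FTE would need 12.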
In communications released over the summer period and in late November, HEFCE have confirmed many important aspects of the requirements for the next Research Excellence Framework, REF 2021.
Outcomes and weightings
- Five-point scale: 4* – Unclassified
- Weightings: Outputs – 60%; Impact – 25%; Environment – 15%
HEFCE are implementing Lord Stern’s recommendations to decouple staff and outputs. The number of outputs submitted will be determined by the FTE of staff submitted in each Unit of Assessment, with flexibility provided by minimum and maximum numbers of outputs per staff member:
- Minimum 1 and maximum 5 outputs per individual
- Average of 2.5 outputs per FTE
- For REF2021, outputs may be submitted both by the institution employing a researcher on the census date (31 July 2020), and by the institution where the researcher was previously employed when the output was demonstrably generated
- ‘Demonstrably generated’ – when the output was first made publicly available
- Impact remains with institution where research was generated
- Impact must be underpinned by excellent research of minimum 2* quality
- 1 January 2000 – 31 December 2020 for underpinning research; 1 August 2013 – 31 July 2020 for impacts
- Definitions of impact will be broadened
- Impact on teaching at the submitting institution will be counted
- Case studies continued from examples submitted in 2014 will be eligible
- Minimum 2 case studies per Unit of Assessment
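The output-volume rules in the list above (an average of 2.5 outputs per FTE, with each individual contributing between 1 and 5) can be sketched as follows. The function names are mine, and the rounding convention for the unit total is an assumption, since the guidance summarised here does not specify one:

```python
def required_outputs(submitted_fte: float) -> int:
    """Total outputs for a Unit of Assessment: an average of 2.5 per FTE
    (rounding convention assumed; check the official guidance)."""
    return round(2.5 * submitted_fte)

def clamp_individual(n_outputs: int) -> int:
    """Each submitted individual contributes a minimum of 1 and a maximum
    of 5 outputs."""
    return max(1, min(5, n_outputs))
```

So a unit submitting 20 FTE would return 50 outputs in total, however those outputs are distributed across individuals within the 1–5 limits.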
Submission of staff
All staff with significant responsibility for research will be returned to REF 2021.
The starting point is to identify the ‘total pool of Category A eligible staff’:
- 0.2 full-time equivalent (FTE) or greater
- primary employment function is to undertake either ‘research only’ or ‘teaching and research’
- substantive connection with the submitting institution
- and they must be independent researchers (i.e. not research assistants)
In institutions that are confident that all Category A Eligible staff have ‘significant responsibility’, 100% of those staff should be submitted. In institutions where the basic criteria for ‘Category A Eligible’ staff do not accurately identify only those staff with significant responsibility for research, a smaller group of ‘Category A Submitted’ staff can be identified—those staff with ‘significant responsibility for research’.
Significant responsibility for research: ‘those for whom explicit time and resources are made available to engage actively in independent research, and that is an expectation of their job role.’ (REF 2017/04)
The criteria for determining who should be included in this category should be determined by each HEI. The process must be developed collaboratively in consultation with staff, and relate to standard ways of working at the institution. It should be written into a Code of Practice.
This webinar will look at public engagement and how to measure impact in arts research. The session will give you an opportunity to consider impact and engagement in your research and what to consider at the start of each project.
Charlotte Medland is the Impact and Evaluation Officer for the Humanities Division at Oxford University and has worked previously on the evaluation of public engagement projects at the University of Southampton.
Interested? – email firstname.lastname@example.org and we will send you your link to join this webinar.
All you need is a computer (or mobile phone) with headphones.
RDP Webinar – 11th October 1.00pm ‘Impact and Public Engagement in the Arts’ with Charlotte Medland
Other forthcoming RDP webinars can be seen on the UCA Research Development pages, along with links to recordings of past webinars.
Latest news on REF 2021
- outputs – 60 per cent
- impact – 25 per cent
- environment – 15 per cent
Read full details on the initial decisions here:
Webinar – Friday, Jul 14 2017 @ 10:00am – 11:00am
What does 4* impact evidence look like?
Vertigo Ventures are running a one-hour intensive session aimed at supporting those who are developing impact case studies for the next REF. Using our experience of working with universities to write impact case studies and analysing cases for good practice, this session will share insights for those currently grappling with impact evidence collection.
To sign up click here
Who should attend?
- Researchers looking to submit impact case studies
- REF Managers, Impact Officers, Research Support Officers
- Heads of academic departments seeking to understand and develop the impact in their departments
The webinar will answer the following questions:
- What does impact evidence in 4* case studies look like?
- How is this evidence used effectively?
- What is the difference between panel and impact types?
- What should researchers and research managers be doing to develop good practice?
Delivered in a convenient and concise format, the presentation will leave you better equipped to identify and support the evidencing of case studies.
At the recent Solent Research Conference, we attended a session from Southampton University on ‘Public Engagement with Research’.
They have developed a toolkit, open to all researchers, based on a three-step process. You can check this out here: http://www.southampton.ac.uk/per/2017/evaluation-planning.page
As many arts research outputs involve engaging with the public, this may be a useful source of information for planning the evaluation and impact of projects. Questions to consider include:
- What are you trying to achieve with the project?
- How should this be measured?
- What do you need to measure for your funder?
- How will your work contribute to any required reporting to your funder?
- Can you measure impact?
The AHRC recommend using a logic model for engagement and evaluation planning, and they use the Kirkpatrick Model for levels of potential impact. You can access an example of the logic model in step 1 on the website above.
You can check out our recent blog on Impact Case Studies for an example of UCA public engagement in research – Lost In Lace.