Retrospective storytelling alone cannot evidence research impact. That is a core point underlined in the Research Excellence Framework (REF) 2029, which highlights that impact is assessed on whether reach and significance can be clearly evidenced and traced back to the underpinning research, and whether institutions can demonstrate a clear, credible connection between research activity and real-world change.
Moreover, analysis of REF 2021 impact case studies reinforces an important reality: impact is rarely a linear, end-of-project activity. Instead, it is built through engagement, partnerships and co-production across the research lifecycle. REF 2029 now makes this explicit, recognising these non-linear and engagement-led pathways to impact.
“For universities, quite often the challenge is not generating the impact itself but having access to a joined-up evidence-based view of where and how impact is already beginning to emerge… early enough to be recognised, sustained and built upon,” says Ann Campbell, Director of Research Impact & Comparative Analytics at Digital Science, drawing on her experience supporting REF submissions in a previous university role as Research Systems and Data Manager.
Impact often starts with research embedded beyond academia
“Analysing the REF 2021 Impact Case Study data helps clarify where impact really begins,” says Campbell. Of the 6,361 impact case studies submitted, 6,045 included underpinning research publications that could be identified and matched in Dimensions using DOIs, ISBNs and bibliographic metadata. “This allowed us to explore patterns in authorship, collaboration, and external engagement,” she explains.
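As an illustration of how such identifier-based matching can work, here is a minimal Python sketch; the records, field names and helper functions are hypothetical, and this is not the actual Dimensions matching pipeline, which also draws on wider bibliographic metadata.

```python
# Hypothetical sketch of identifier-based matching between impact case
# studies and a publication index; not the actual Dimensions pipeline.
def normalise_doi(doi: str) -> str:
    """Lower-case a DOI and strip common URL prefixes."""
    doi = doi.strip().lower()
    for prefix in ("https://doi.org/", "http://dx.doi.org/", "doi:"):
        if doi.startswith(prefix):
            doi = doi[len(prefix):]
    return doi

def match_case_study(case_study_refs: list[dict], index: dict) -> list[dict]:
    """Return index records whose DOI or ISBN matches a case-study reference."""
    matches = []
    for ref in case_study_refs:
        key = None
        if ref.get("doi"):
            key = ("doi", normalise_doi(ref["doi"]))
        elif ref.get("isbn"):
            key = ("isbn", ref["isbn"].replace("-", ""))
        if key and key in index:
            matches.append(index[key])
    return matches

# Toy data: one case-study reference and a one-entry publication index.
refs = [{"doi": "https://doi.org/10.1234/example"}]
index = {("doi", "10.1234/example"): {"title": "Example paper", "year": 2019}}
print(match_case_study(refs, index))  # -> [{'title': 'Example paper', 'year': 2019}]
```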
Within these 6,045 case studies:
- 1,193 were underpinned by research co-authored with healthcare partners
- 1,040 involved government partners
- 913 involved industry
- 866 involved non-profit organisations
“This shows that a substantial proportion of REF impact is built on research already embedded in external systems, not simply disseminated afterwards,” she notes. Across disciplines, different pathways are visible: health impact is dominated by healthcare-embedded research; technological impact often emerges from industry-embedded R&D; and environmental impact is closely linked to government and NGO networks.
Interconnected data in Dimensions helps universities see where their research is already connected to the world beyond academia, whether through hospitals, government bodies, industry or public organisations. This makes it possible to identify areas of research that are naturally well positioned to generate impact, early in the cycle.
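For institutions with API access, this kind of exploration can also be scripted. The sketch below uses the dimcli client for the Dimensions Analytics API; the API key, GRID identifier and year range are placeholders, and the query should be adapted to each institution's own setup.

```python
# Sketch using the dimcli client for the Dimensions Analytics API.
# The API key, GRID ID and year range below are placeholders.
import dimcli

dimcli.login(key="YOUR_API_KEY", endpoint="https://app.dimensions.ai")
dsl = dimcli.Dsl()

# Facet one institution's publications by collaborating research
# organisations, to surface external partners (hospitals, agencies, firms).
result = dsl.query("""
    search publications
        where research_orgs.id = "grid.0000.0"
        and year in [2018:2024]
    return research_orgs limit 20
""")
for org in result.research_orgs:
    print(org.get("name"), org.get("count"))
```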
“It is, however, important to emphasise that this kind of analysis does not select impact case studies or predict REF outcomes. Rather, it supports better understanding of where strong impact is already emerging, so institutions can recognise it sooner and support it more effectively over time,” says Campbell. “Crucially, this kind of analysis also helps institutions identify case studies that already have the building blocks of a REF-ready narrative – where links between underpinning research, external uptake, and corroborating evidence are clearer, traceable and easier to articulate.”
Engagement needs to be captured early, not reconstructed later
Even when research is externally embedded, evidence of engagement is often fragmented or lost, particularly around advisory roles, policy input and informal collaborations. Over time, staff movement, changing roles and organisational turnover can erode institutional memory, making it harder to reconstruct how engagement unfolded and how impact developed. This creates risk when institutions later need to rebuild timelines and narratives under time pressure.
REF 2029 explicitly requires units to describe how engagement and partnerships enabled impact, not just the outcomes. This means engagement activity should be recorded as it happens, not retrospectively inferred.
This is why systems such as Symplectic Elements become important: they provide structured capture of engagement activity and link it to related data such as people, publications and grants. In practice, Elements acts as an institutional memory layer, supporting traceability and consistency without assessing impact itself.
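As a minimal sketch of what a structured engagement record might look like, the schema below is illustrative only and is not Symplectic Elements' actual data model:

```python
# Illustrative schema for a structured engagement record; not the
# actual Symplectic Elements data model.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class EngagementRecord:
    activity_type: str                 # e.g. "advisory role", "policy input"
    description: str
    start_date: date
    partner_org: str                   # external organisation involved
    people_ids: list[str] = field(default_factory=list)        # linked researchers
    publication_dois: list[str] = field(default_factory=list)  # linked outputs
    grant_ids: list[str] = field(default_factory=list)         # linked funding

# Captured at the time of the activity, not reconstructed years later.
record = EngagementRecord(
    activity_type="advisory role",
    description="Member of national clinical guideline committee",
    start_date=date(2023, 5, 1),
    partner_org="Example Health Agency",
    people_ids=["researcher-042"],
    publication_dois=["10.1234/example"],
)
print(record)
```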
Impact relies on materials, not just publications
Impact pathways differ sharply by discipline. Health impact is evidenced through clinical guidelines and policy; technological impact through patents and translation; environmental impact through policy reports and media; and social and cultural impact through public and professional discourse.
Many of these pathways rely on outputs beyond the traditional journal article, including datasets, reports, briefings, tools and other materials that support engagement and use. In many cases, these sit alongside conventional publications as part of a broader impact narrative. Yet such materials are often poorly preserved, uncitable or disconnected from the research record.
Research repository services such as Figshare support institutions by providing persistent access to impact-supporting materials, assigning DOIs and stable landing pages, and enabling transparency and reuse. Depending on their infrastructure and needs, institutions may use Figshare either as a dedicated repository for non-traditional research outputs, or as a full institutional repository capturing traditional publications, datasets and wider research outputs alike.
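As an illustrative sketch of depositing an impact-supporting material and reserving a DOI via Figshare's public v2 REST API: the token and metadata below are placeholders, error handling is simplified, and endpoint details should be checked against the current API documentation.

```python
# Simplified sketch against the Figshare v2 REST API; the token and
# metadata are placeholders, and details should be verified against
# the current API documentation.
import requests

BASE = "https://api.figshare.com/v2"
HEADERS = {"Authorization": "token YOUR_PERSONAL_TOKEN"}

# Create a draft item for a non-traditional output (e.g. a policy briefing).
item = {"title": "Policy briefing: example underpinning evidence",
        "description": "Material supporting an impact narrative."}
resp = requests.post(f"{BASE}/account/articles", headers=HEADERS, json=item)
resp.raise_for_status()
article_id = resp.json()["entity_id"]

# Reserve a DOI for the draft, so the material is citable before publication.
doi_resp = requests.post(f"{BASE}/account/articles/{article_id}/reserve_doi",
                         headers=HEADERS)
doi_resp.raise_for_status()
print("Reserved DOI:", doi_resp.json()["doi"])
```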
Making external uptake visible
REF 2021 impact narratives and their underpinning research leave clear, traceable signals of real-world use, from policy and clinical guidelines to patents, Wikipedia and news. The mix of these signals varies by discipline, revealing different pathways from research to real-world change.
Altmetric reveals where research is being cited in policy and guidelines, taken up in patents, and discussed in media and public discourse. It does more than count mentions: it provides external confirmation of engagement and uptake, aligned with how REF panels assess reach and significance.
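As a small sketch of checking uptake signals for a single output via Altmetric's public v1 details endpoint: the DOI below is a placeholder, and the fields available vary by record and access level.

```python
# Sketch against Altmetric's public v1 API; the DOI is a placeholder and
# available fields vary by record and access level.
import requests

doi = "10.1234/example"  # placeholder DOI
resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}")

if resp.status_code == 200:
    data = resp.json()
    # Use .get() because not every record carries every attention source.
    print("News mentions:     ", data.get("cited_by_msm_count", 0))
    print("Wikipedia mentions:", data.get("cited_by_wikipedia_count", 0))
    print("Altmetric score:   ", data.get("score", 0))
elif resp.status_code == 404:
    print("No Altmetric record for this DOI yet.")
else:
    resp.raise_for_status()
```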
From reactive reporting to impact readiness
REF impact is built through sustained interaction with the world beyond academia. Institutions that invest early in understanding, capturing and evidencing engagement are better placed to meet REF expectations with confidence. Digital Science supports this readiness by strengthening evidence and decision-making while leaving judgement firmly with institutions and REF panels.
REF readiness is about leading, not lagging. It means planning ahead and building on early engagement signals, not relying on retrospective evidence gathering at the end of the research cycle.
Institutions that prepare most effectively:
- understand where their research connects with the world, using interconnected data solutions such as Dimensions
- capture and structure engagement activity as it happens, linking people, publications and grants through systems such as Symplectic Elements
- preserve the materials that support impact claims by investing in a repository, such as Figshare, that mints DOIs for datasets and non-traditional research outputs (NTROs)
- validate reach and uptake beyond the institution using tools such as Altmetric
Together, these become the building blocks of REF-ready impact narratives: clearer to articulate, easier to evidence, and more credible to defend.