Persistent identifiers – or PIDs – are long-lasting references to digital resources. In other words, they are unique labels for entities: a person, place, or thing. PIDs work by redirecting the user to the online resource, even if the location of that resource changes. Each PID also has associated metadata that describes the entity and links to other PIDs. For example, many scholars already populate their ORCID records, linking themselves to their research outputs through Crossref and DataCite DOIs. As the PID ecosystem matures to include PIDs for grants (Crossref grant IDs), projects (RAiD), and organisations (ROR), the connections between PIDs form a graph that describes the research landscape. In this post, Phill Jones describes the work the MoreBrains cooperative has been doing to show the value of a connected PID-based infrastructure.
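To make the graph idea concrete, here is a minimal sketch in Python. All identifiers and metadata below are invented for illustration; real PID metadata lives in registries such as Crossref, DataCite, ORCID, and ROR, but the principle – typed records whose metadata links to other PIDs, forming a traversable graph – is the same.

```python
# Toy PID "registry": each record's metadata links to other PIDs.
# All identifiers and values here are made up for illustration only.
registry = {
    "orcid:0000-0001-2345-6789": {
        "type": "person",
        "name": "A. Researcher",
        "links": ["doi:10.1234/article.1", "ror:01abc2d34"],
    },
    "doi:10.1234/article.1": {
        "type": "output",
        "title": "An Example Article",
        "links": ["orcid:0000-0001-2345-6789", "doi:10.9876/grant.42"],
    },
    "doi:10.9876/grant.42": {
        "type": "grant",
        "funder": "Example Funder",
        "links": ["doi:10.1234/article.1"],
    },
    "ror:01abc2d34": {
        "type": "organisation",
        "name": "Example University",
        "links": ["orcid:0000-0001-2345-6789"],
    },
}

def connected(pid, entity_type):
    """Walk the PID graph from a starting identifier and return all
    reachable records of the requested type (e.g. every grant linked,
    directly or indirectly, to a person)."""
    seen = set()
    frontier = [pid]
    found = []
    while frontier:
        current = frontier.pop()
        if current in seen:
            continue
        seen.add(current)
        record = registry.get(current, {})
        if record.get("type") == entity_type:
            found.append(current)
        frontier.extend(record.get("links", []))
    return found

# Which grants are connected to this researcher, via their outputs?
grants = connected("orcid:0000-0001-2345-6789", "grant")
```

Because every link is an explicit identifier rather than a name-string match, questions like "which funders supported this person's outputs?" become simple graph traversals instead of fuzzy matching exercises.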
Over the past year or so, we at MoreBrains have been working with a number of national-level research-supporting organisations to develop national persistent identifier (PID) strategies: Jisc in the UK; the Australian Research Data Commons (ARDC) and Australian Access Federation (AAF) in Australia; and the Canadian Research Knowledge Network (CRKN), Digital Research Alliance of Canada (DRAC), and Canadian Persistent Identifier Advisory Committee (CPIDAC) in Canada. In all three cases, we’ve been investigating the value of developing PID-based research infrastructures, using data from various sources, including Dimensions, to quantify that value. In our most recent analysis, we found that investing in five priority PIDs could save the Australian research sector as much as 38,000 person-days of work per year, equivalent to $24 million (AUD), purely in direct time savings from not rekeying information into institutional research management systems.
Investing in infrastructure makes a lot of sense, whether you’re building roads, railways, or research infrastructure. But wise investors also want evidence that their investment is worthwhile – that the infrastructure is needed, that it will be used, and, ideally, that there will be a return of some kind on their investment. Sometimes, all of this is easy to measure; sometimes, it’s not.
In the case of PID infrastructure, there has long been a sense that investment would be worthwhile. In 2018, in his advice to the UK government, Adam Tickell recommended:
Jisc to lead on selecting and promoting a range of unique identifiers, including ORCID, in collaboration with sector leaders and relevant partner organisations
In Australia, Minister for Education Jason Clare made a similar point in a 2022 letter to the Australian Research Council (ARC):
Streamlining the processes undertaken during National Competitive Grant Program funding rounds must be a high priority for the ARC… I ask that the ARC identify ways to minimise administrative burden on researchers
In the same letter, Minister Clare even suggested that preparations for the 2023 Excellence in Research for Australia (ERA) exercise be discontinued until a plan to make the process easier has been developed. While he didn’t explicitly mention PIDs in the letter, organisations like ARDC, AAF, and the ARC see persistent identifiers as a big part of the solution to this problem.
A problem of chickens and eggs?
With all the modern information technology available to us, it seems strange that, in 2022, we’re still hearing calls to develop basic research management infrastructure. Why hasn’t it already been built? Part of the problem is that very little work has been done to quantify the value of research infrastructure in general, or of PID-based infrastructure in particular. Organisations like Crossref, DataCite, and ORCID are clear success stories but, apart from a few notable exceptions, little has been done to make the benefits of investment clear at a policy level – until now.
It’s very difficult to analyse the costs and benefits of PID adoption without being able to easily measure what’s happening in the scholarly ecosystem. So, in these recent analyses that we were commissioned to do, we asked questions like:
How many research grants were awarded to institutions within a given country?
How many articles have been published based on work funded by those grants?
What proportion of researchers within a given country have ORCID IDs?
How many research projects are active at any given time?
All these questions proved challenging to answer because, fundamentally, it’s extremely difficult to quantify the scale of research activity and the connections between research entities in the absence of universally adopted PIDs. In other words, we need a well-developed network of PIDs in order to easily quantify the benefits of investing in PIDs in the first place! (See Figure 1.)
Luckily, the story doesn’t end there. Thanks to data donated by Digital Science, and other organisations including ORCID, Crossref, Jisc, ARDC, AAF, and several research institutions in the UK, Canada, and Australia, we were able to piece together estimates for many of our calculations.
Take, for example, the Digital Science Dimensions database, which provided the data we needed for our Australian and UK use cases. It uses large-scale computation and machine learning to build a graph of research entities: people, grants, publications and other outputs, institutions, and so on. While other similar graphs exist, some of which are open and free to use – for example, the DataCite PID graph (accessed through DataCite Commons), OpenAlex, and the Research Graph Foundation – the Dimensions graph is the most complete and accessible so far. It enabled us to estimate total research activity in both the UK and Australia.
However, all our estimates are… estimates, because they involve making an automated best guess of the connections between research entities, where those connections are not already explicit. If the metadata associated with PIDs were complete and freely available in central PID registries, we could easily and accurately answer questions like ‘How many active researchers are there in a given country?’ or ‘How many research articles were based on funding from a specific funder or grant program?’
The five priority PIDs
As a starting point towards making these types of questions easy to answer, we recommend that policy-makers work with funders, institutions, publishers, PID organisations, and other key stakeholders around the world to support the adoption of five priority PIDs:
DOIs for funding grants
DOIs for outputs (e.g. publications, datasets, etc.)
ORCIDs for people
RAiDs for projects
ROR for research-performing organisations
We prioritised these PIDs based on research done in 2019, sponsored by Jisc and in response to the Tickell report, to identify the key PIDs needed to support open access workflows in institutions. Since then, thousands of hours of research and validation across a range of countries and research ecosystems have verified that these PIDs are critical not just for open access but also for improving research workflows in general.
Going beyond administrative time savings
In our work, we have focused on direct savings from a reduction in administrative burden because those benefits are the most easily quantifiable; they’re easiest for both researchers and research administrators to relate to, and they align with established policy aims. However, the actual benefits of investing in PID-based infrastructure are likely far greater.
Evidence given to the UK House of Commons Science and Technology Committee in 2017 stated that every £1 spent on research and innovation in the UK results in a total benefit of £7 to the UK economy. The same is likely to be true for other countries, so the benefit to national industrial strategies of increased efficiency in research is potentially huge.
Going a step further, the universal adoption of the five priority PIDs would also enable institutions, companies, funders, and governments to make much better research strategy decisions. At the moment, bibliometric and scientometric analyses to support research strategy decisions are expensive and time-consuming; they rely on piecing together information based on incomplete evidence. By using PIDs for entities like grants, outputs, people, projects, and institutions, and ensuring that the associated metadata links to other PIDs, it’s possible to answer strategically relevant questions by simply extracting and combining data from PID registries.
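As a toy illustration (with entirely invented identifiers and records, not drawn from any real registry), once outputs carry explicit, linked funder metadata, a strategic question like "which outputs did this funder support?" reduces to a simple filter over registry records rather than a bespoke bibliometric study:

```python
# Invented sample of output records, as they might be extracted from
# PID registries with complete, linked metadata. "ror:funder0001" and
# the DOIs below are placeholders, not real identifiers.
outputs = [
    {"doi": "10.1234/a1", "funder_ror": "ror:funder0001", "year": 2021},
    {"doi": "10.1234/a2", "funder_ror": "ror:funder0001", "year": 2022},
    {"doi": "10.1234/a3", "funder_ror": "ror:other0002", "year": 2022},
]

def outputs_funded_by(outputs, funder_ror):
    """Return the DOIs of outputs whose grant metadata names this funder."""
    return [o["doi"] for o in outputs if o["funder_ror"] == funder_ror]

funded = outputs_funded_by(outputs, "ror:funder0001")
```

The hard part today is not this query, which is trivial, but the incompleteness of the underlying metadata; universal PID adoption is what makes the query trustworthy.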
Final thoughts
According to UNESCO, global spending on R&D has reached US$1.7 trillion per year, and with commitments from countries to address the UN sustainable development goals, that figure is set to increase. Given the size of that investment and the urgency of the problems we face, building and maintaining the research infrastructure makes sound sense. It will enable us to track, account for, and make good strategic decisions about how that money is being spent.
Phill is a product innovator, business strategist, and highly qualified research scientist. He is a co-founder of the MoreBrains Cooperative, a consultancy working at the forefront of scholarly infrastructure and research dissemination. Phill has been the CTO at Emerald Publishing, Director of Publishing Innovation at Digital Science, and the Editorial Director at JoVE. In a previous career, he was a biophysicist at Harvard Medical School, and he holds a PhD in Physics from Imperial College London.
The MoreBrains Cooperative is a team of consultants who specialise in, and share the values of, open research, with a focus on scholarly communications and on research information management, policy, and infrastructure. They work with funders, national research-supporting organisations, institutions, publishers, and startups. Examples of their open reports can be found here: morebrains.coop/repository
For nearly three decades the UN has been bringing together countries from around the globe to hold climate summits on how to address the growing climate crisis. Last year’s Conference of the Parties (COP) in Glasgow (delayed by a year due to the pandemic) took major steps toward addressing the climate crisis, but failed to deliver the national commitments required to limit global warming to the 1.5°C target laid out in the Paris Agreement.
After a year of extreme weather events, from record heatwaves to disastrous flooding, this year’s COP27 in Sharm el-Sheikh, Egypt, will be crucial as the world seeks to take steps together toward mitigating and preventing the worst impacts of climate change.
The United Nations’ Sustainable Development Goals (SDGs) are designed to be a blueprint for achieving a better and more sustainable future for all by addressing the global challenges we face. The SDGs are at the centre of the UN’s 2030 Agenda for Sustainable Development and represent an urgent call for action by all countries – both developed and developing – in global partnership. They recognise that ending poverty and other deprivations must go hand-in-hand with strategies that improve health and education, reduce inequality, and spur economic growth – all while tackling climate change and working to preserve our oceans and forests.
Tracking and reporting on SDGs in Elements
Since the SDGs were first introduced, there has been growing interest in tracking, analysing, and showcasing the ways in which researchers are contributing to achieving these goals, and in demonstrating global research impact at an institutional level. This can be clearly seen in the increasing number of institutions participating in the Times Higher Education (THE) Impact Rankings, which in 2022 showed a 23% increase from 2021. THE Impact Rankings are the only global performance tables that assess universities against the SDGs, and currently show participation across 110 countries and regions.
This year we introduced simple but powerful functionality into Elements, allowing institutions to track which research outputs, publications, and activities connect back to the 17 SDGs via a new label scheme. SDG labels can be applied to any items captured in Elements (e.g. publications, grants, professional and teaching activities, and records of impact).
Labels can be applied manually, in bulk via the Elements API, or automatically through our Dimensions integration. Dimensions uses machine learning to analyse publications and grants and map them to relevant SDGs; it has mapped SDG labels to over 12.9 million publications and hundreds of thousands of grants, with more records being analysed and mapped all the time. For institutions licensed to use Dimensions as a data source, these labels are harvested into Elements automatically, together with the other metadata on Dimensions records.
Once collected, SDG data can be used for powerful reporting purposes, whether at an individual, school, or institutional level. We have introduced stock dashboards to support initial reporting on SDG labels. These tools can help research institutions demonstrate which individuals, schools or groups are focused most on specific SDGs, analyse gaps and areas of further necessary investment, and even demonstrate return on investment for funding.
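The reporting step is, at heart, a grouping exercise. As an illustration only (the schools, label counts, and record layout below are invented, not real Elements data or its API), aggregating SDG labels by organisational unit might look like this:

```python
from collections import Counter

# Invented sample of SDG-labelled records, as they might be exported
# from a research information system. Not real institutional data.
records = [
    {"school": "Engineering",   "sdgs": ["SDG 7", "SDG 13"]},
    {"school": "Engineering",   "sdgs": ["SDG 13"]},
    {"school": "Public Health", "sdgs": ["SDG 3"]},
    {"school": "Public Health", "sdgs": ["SDG 3", "SDG 10"]},
]

def sdg_counts_by_school(records):
    """Count how many labelled records each school has per SDG,
    the kind of tally a reporting dashboard would visualise."""
    counts = {}
    for rec in records:
        school_counts = counts.setdefault(rec["school"], Counter())
        school_counts.update(rec["sdgs"])
    return counts

counts = sdg_counts_by_school(records)
```

From a table like this, a dashboard can rank schools by activity on a given goal, or flag SDGs with little coverage as candidates for further investment.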
Sustainable Development Goals (SDGs) labels on publications
Labels can also be applied to user profiles and surfaced in public profiles within the Discovery Module add-on to Elements, helping external researchers, members of the press and other stakeholders identify specialists working toward particular sustainability goals (see examples of public profiles showcasing SDG labelling at Oklahoma State University or Lincoln University). This can help drive discoverability of research, open up opportunities for greater collaboration and innovation, and support the public understanding and availability of science by connecting the media to knowledgeable scientific sources.
Users can search and filter by specific SDGs they are interested in to find researchers specialising in that field, while the researchers themselves can showcase their work within their own profiles.
Applying the SDG framework to Elements facilitates and supports both internal and external collaboration and innovation, advancing global efforts to achieve the 2030 Agenda for Sustainable Development.
SDG Case Study: Carnegie Mellon University
Carnegie Mellon University (CMU) is a private, global research university which stands among the world’s most renowned educational institutions. CMU acquired Elements in 2017 and now uses the platform across a wide range of use cases, including “service tracking, faculty annual reviews, publications and monitoring, public directory, custom reporting, data visualization and analysis, data feeds to external websites, open access research and scholarship, data migration from historical systems, researcher identity management, and mapping faculty research to Sustainable Development Goals”. Read more.
During 2021, the University Libraries worked alongside the Provost Office’s Sustainability Initiative to conduct the Sustainable Development Goal mapping with a set of early adopters.
A recent news post on the Carnegie Mellon libraries blog on their ongoing expansion of Elements across campus explains how Director of Sustainability Initiatives Alexandra Hiniker utilised Elements to support faculty in thinking critically about how their work aligns with the 2030 Agenda.
“One thing I’ve heard consistently from students, faculty, staff, and external partners that I work with here in Pittsburgh, across the country, and around the world, is that they want to know what our CMU community is doing on the range of sustainable development goals that cover everything from poverty and hunger, to good health and wellbeing, peaceful, just and strong institutions, reducing inequalities, and of course, climate action,” explains Hiniker in a recent video interview published by the university. “There’s so much great work going on across CMU but it’s hard to pull out all of that information, and share it with all of these different people who are interested in collaboration.
“As part of my role linking students, staff, and faculty across the campus to sustainability efforts, I heard from them that the most important thing was to connect to different parts of the university to which they usually didn’t have access,” Hiniker explained. “Elements is a way for people to quickly access information about what researchers are doing, so that they can help contribute to finding solutions to some of the world’s greatest challenges.”
Elements is now providing a centralized space for CMU’s campus researchers to record which SDGs are associated with their research outputs and other academic activities. The Libraries’ Elements reporting and data visualization team worked with the Sustainability Initiatives Office to build reporting dashboards which surface data on how faculty initiatives and research across campus are supporting specific SDGs.
You can hear more from Hiniker directly in this short interview:
Find out more or get support
Elements can help you track and report on how your researchers are contributing towards the United Nations Sustainable Development Goals as we all work towards achieving a better and more sustainable future for all. Not only does this make participation in the THE Impact Rankings far simpler, it also helps you demonstrate your commitment to global progress to researchers and faculty, prospective students, funders, and other key stakeholders. If you’d like to learn more about Elements, or if you’re a current client who’d like more information on how to integrate Dimensions as a data source or surface SDG labels in public profiles, please get in touch.
The Digital Science Consultancy team can also produce tailored analysis for non-profits, governments, funders, research institutions and STEM publishers to inform strategy to meet organisational goals. We can help you relate the influence and impact that your organisation has to research on the UN’s Sustainable Development Goals (SDGs).
Natalie Guest
About the Author
Natalie Guest, Marketing Director | Symplectic
Natalie Guest works in pursuit of the advancement of knowledge by delivering flexible research solutions that help universities, institutions and funding organisations achieve their research goals. She has 10 years’ experience in B2B technology marketing, focusing predominantly on the scholarly publishing, research and information management sector.