34th Edition of International Research Awards on New Science Inventions
14th July 2025 | Webinar
Visit Our Website 🌐: nesin.sciencefather.com
Nomination👍: https://new-science-inventions.sciencefather.com/young-scientist-research-awards/
Contact us ✉️: nesinsupport@sciencefather.com

Wednesday 16 July 2025
Scholarly publishers can now fully integrate research integrity checks into their editorial and submission workflows, thanks to Digital Science’s new Dimensions Author Check API, which launches today.
Built on Dimensions – the world’s largest interconnected global research database – Dimensions Author Check evaluates researchers’ publication and collaboration histories within seconds, delivering reliable, concise, structured insights.
For the first time, the new Dimensions Author Check API enables publishers to embed this functionality directly into their own workflows, without the need to switch to an outside platform.
Dr Leslie McIntosh, Vice President of Research Integrity at Digital Science, said Dimensions Author Check API is designed to support consistent and confident editorial decision-making.
“By highlighting key indicators of research integrity – such as retractions, tortured phrases, or unusual co-authorship patterns – the Dimensions Author Check API helps to rapidly identify potential issues for concern. These include continuously improving indicators that will identify paper mills and increase trust in science,” Dr McIntosh said.
“Importantly, the Author Check API can do this at scale, giving publishers the ability to screen multiple researchers per request. This makes it ideal for high-volume manuscript processing and broader editorial oversight.”
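To illustrate how such batch screening might be embedded in a submission workflow, here is a minimal sketch of a request that screens several researchers at once. It is purely illustrative: the endpoint URL, authentication scheme, payload shape and response fields are assumptions made for the sake of example, not the documented Dimensions Author Check API contract.

```python
# Illustrative sketch only: endpoint, credentials, payload and response fields
# below are hypothetical placeholders, not the documented Author Check API.
import requests

API_URL = "https://example.dimensions.ai/author-check"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"                                 # hypothetical credential

# Screen several submitting authors in a single request (batch screening).
payload = {
    "researchers": [
        {"name": "Jane Doe", "orcid": "0000-0002-1825-0097"},
        {"name": "John Smith", "affiliation": "Example University"},
    ]
}

response = requests.post(
    API_URL,
    json=payload,
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=30,
)
response.raise_for_status()

# Hypothetical response shape: one integrity summary per researcher, flagging
# indicators such as retractions or unusual co-authorship patterns.
for result in response.json().get("results", []):
    print(result.get("name"), "->", result.get("flags", []))
```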
Key benefits of the new Dimensions Author Check API include:
Note to editors: The Dimensions Author Check dashboard was originally announced in December last year. This announcement is specific to the Dimensions Author Check API, which launches today.
About Dimensions
Part of Digital Science, Dimensions hosts the largest collection of interconnected global research data, re-imagining research discovery with access to grants, publications, clinical trials, patents and policy documents all in one place. Follow Dimensions on Bluesky, X and LinkedIn.
About Digital Science
Digital Science is an AI-focused technology company providing innovative solutions to complex challenges faced by researchers, universities, funders, industry and publishers. We work in partnership to advance global research for the benefit of society. Through our brands – Altmetric, Dimensions, Figshare, IFI CLAIMS Patent Services, metaphacts, OntoChem, Overleaf, ReadCube, Symplectic, and Writefull – we believe when we solve problems together, we drive progress for all. Visit digital-science.com and follow Digital Science on Bluesky, on X or on LinkedIn.
Media contact
David Ellis, Press, PR & Social Manager, Digital Science: Mobile +61 447 783 023, d.ellis@digital-science.com
The post Digital Science to strengthen research integrity in publishing with new Dimensions Author Check API appeared first on Digital Science.
Executive Order 14303 requires every federal research agency to document reproducibility, transparency, and conflict of interest (COI) management by August 22, 2025.
Digital Science solutions give you the platforms and insights to address these demands confidently.
Dimensions is a powerful platform that uniquely and transparently facilitates data collection and analysis, providing access to grants, the publications they funded and the supporting research data, alongside analytics tools that can efficiently gather and report on this information.
What you’ll learn in our EO 14303 Readiness Guide:
Your mission demands integrity; EO 14303 requires it.
The post Federal research excellence appeared first on Digital Science.
We believe it is the role of Digital Science to help the scientometrics community access information it needs to develop open, transparent research indicators. In Barcelona: A beautiful horizon, our CEO Daniel Hook describes the history and vision of this commitment. Researchers can use Altmetric and Dimensions data to study how research is funded, communicated, commercialized, and how it makes an impact in the world.
At Digital Science, we believe that by taking risks, being innovative and pushing boundaries so that clients gain real value and significant benefit from our offerings, there should be an opportunity for an appropriate return on investment.
Through the Scientometric Researcher Access to Data (SRAD) program, we offer no-cost access to Altmetric and Dimensions for non-commercial scientometric research projects. Participants get access to Altmetric Explorer, Dimensions Analytics and our APIs (the Altmetric Explorer API, the Altmetric Details Page API, and the Dimensions Analytics API). We are now also expanding research access to Dimensions data by offering the Dimensions on Google BigQuery (GBQ) dataset through Google’s Analytics Hub.
By expanding access to Dimensions on GBQ, we are excited to help researchers answer complex questions with big data, make connections between more data points, and connect Dimensions data to other open datasets!
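For researchers granted SRAD access, a query against Dimensions on GBQ might look like the minimal sketch below, which counts Indonesian-affiliated publications by year. The table path and field name are assumptions for illustration; consult the Dimensions on BigQuery documentation for the actual schema.

```python
# Minimal sketch, assuming SRAD access to Dimensions on Google BigQuery via
# Google's Analytics Hub. The table path and field name are assumptions for
# illustration; check the Dimensions on GBQ documentation for the real schema.
from google.cloud import bigquery

client = bigquery.Client(project="your-gcp-project")  # your own billing project

query = """
SELECT
  year,
  COUNT(*) AS publication_count
FROM `dimensions-ai.data_analytics.publications`         -- assumed table path
WHERE 'Indonesia' IN UNNEST(research_org_country_names)  -- assumed field name
GROUP BY year
ORDER BY year
"""

for row in client.query(query).result():
    print(row.year, row.publication_count)
```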
Access supports university-affiliated scientometrics researchers with clearly defined research projects that aim to publish their results. Scientometric research involves the quantitative study of science, technology, and innovation; it aims to measure and evaluate the impact, patterns and trends in scientific research and its influence across disciplines and institutions. We also welcome projects that support the development, testing, or comparison of bibliometric or scientometric indicators. Innovative research is particularly encouraged; duplicative research questions are not (with the exception of replication studies, which are encouraged). Access is not intended for use in commercial products, for self-assessment, or for non-scientometric research.
Special consideration is given to members of the International Society for Scientometrics and Informetrics and to associates of the Research on Research Institute.
Join the program
In addition to access, scientometric researchers meet monthly to share information about Digital Science datasets (Altmetric & Dimensions), access methods (APIs, BigQuery, Google Analytics Hub), and using these data to study science.
Find a meeting
For information on how we process personal information, please refer to the Dimensions privacy policy. Email addresses are used to monitor usage and ensure compliance with terms of use. See our Terms and Conditions to learn more.
For years, research has shown that inclusive datasets like Dimensions are essential for understanding the global research landscape, whether the interest is open access measurement or publishing diversity. We are proud to make these data available for the study of science.
We are proud of the results achieved using our data, which include the following outputs:
We are also particularly proud that our support powers the Problematic Paper Screener, introduced here.
The post Introducing SRAD appeared first on Digital Science.
Thursday 10 July 2025
China is outstripping the rest of the world in artificial intelligence research at a time when AI is becoming a “strategic asset” akin to energy or military capability, according to a new report released today by research technology company Digital Science.
The report – entitled DeepSeek and the New Geopolitics of AI: China’s ascent to research pre-eminence in AI – has been authored by Digital Science CEO Dr Daniel Hook based on data from Dimensions, the world’s largest and most comprehensive database describing the global research ecosystem.
Dr Hook has analyzed AI research data from the year 2000 to 2024, tracking trends in research collaborations and placing these within geopolitical, economic, and technological contexts.
His report says AI research has grown at an “impressive rate” globally since the turn of the millennium – from just under 10,000 publications in 2000, to 60,000 publications in 2024.
Dr Hook’s key findings include:
“AI is no longer neutral – governments are using it as a strategic asset, akin to energy or military capability, and China is actively leveraging this advantage,” Dr Hook says.
“Governments need to understand the local, national and geostrategic implications of AI, with the underlying concern that lack of AI capability or capacity could be damaging from economic, political, social, and military perspectives.”
Dr Hook says China is “massively and impressively” growing its AI research capacity. Unlike Western nations with clustered AI hubs, he says China boasts 156 institutions publishing more than 50 AI papers each in 2024, supporting a nationwide innovation ecosystem. In addition, “China’s AI workforce is young, growing fast, and uniquely positioned for long-term innovation.”
He says one sign of China’s rapidly developing capabilities is its release of the DeepSeek chatbot in January this year. “The emergence of DeepSeek is not merely a technological innovation – it is a symbol of a profound shift in the global AI landscape,” Dr Hook says.
“DeepSeek exemplifies China’s technological independence. Its cost-efficient, open-source LLM demonstrates the country’s ability to innovate around US chip restrictions and dominate AI development at scale.”
Dr Hook’s report comments further on the AI research landscape in the US, UK and EU.
He says the UK remains “small but globally impactful”. “Despite its modest size, the UK consistently punches above its weight in attention-per-output metrics.”
However, the EU “risks falling behind in translation and visibility”. “The EU shows weaker international collaboration beyond its borders and struggles to convert research into applied outputs (e.g., patents), raising concerns about its future AI competitiveness.”
About Dimensions
Part of Digital Science, Dimensions hosts the largest collection of interconnected global research data, re-imagining research discovery with access to grants, publications, clinical trials, patents and policy documents all in one place. Follow Dimensions on Bluesky, on X and LinkedIn.
About Digital Science
Digital Science is an AI-focused technology company providing innovative solutions to complex challenges faced by researchers, universities, funders, industry and publishers. We work in partnership to advance global research for the benefit of society. Through our brands – Altmetric, Dimensions, Figshare, IFI CLAIMS Patent Services, metaphacts, OntoChem, Overleaf, ReadCube, Symplectic, and Writefull – we believe when we solve problems together, we drive progress for all. Visit digital-science.com and follow Digital Science on Bluesky, on X or on LinkedIn.
Media contact
David Ellis, Press, PR & Social Manager, Digital Science: Mobile +61 447 783 023, d.ellis@digital-science.com
The post New report shows China dominates in AI research – and is western world’s leading collaborator on AI appeared first on Digital Science.
33rd Edition of International Research Awards on New Science Inventions
18th June 2025 | Webinar
Prof. Maciej Sosnowski | Medical University of Silesia, Poland | Best Researcher Award
Visit Our Website 🌐: nesin.sciencefather.com
Nomination👍: https://new-science-inventions.sciencefather.com/young-scientist-research-awards/
Contact us ✉️: nesinsupport@sciencefather.com
33rd Edition of International Research Awards on New Science Inventions
14th June 2025 | Webinar
Dr. Kalyan Banerjee | SRM University AP | India | Best Researcher Award
Visit Our Website 🌐: nesin.sciencefather.com
Nomination👍: https://new-science-inventions.sciencefather.com/young-scientist-research-awards/
Contact us ✉️: nesinsupport@sciencefather.com
Making research Open Access (OA) is one major step in the process, but how do we know if OA research is having its intended impact? Ann Campbell and Katie Davison share the results of their investigations and some lessons for the future of OA.
One of the principal aims of Open Access (OA) has always been to democratize knowledge by making research free to read; however, that should be the starting point, not the ultimate goal. Perhaps it’s time to step back and ask ourselves, “Are we in danger of becoming preoccupied with the ‘access’ aspect of open – neglecting the other components that make research successful?”
In our rush to remove paywalls and ‘financial barriers’, could it be that we are simply equating ‘freely available’ to ‘truly accessible’? How valuable is making research content accessible without it being discoverable? And how beneficial is it for an end user to find content if they don’t see its relevance, or if they can’t act on it?
Access alone isn’t enough. If research isn’t discoverable, understandable, or actionable for the people who need it (policymakers, practitioners, researchers across regions and community organizations), then OA has fallen short of its full potential.
The ability to get research into the hands of those who can fully capitalize on it is a crucial factor in research success, but in practice, significant gaps and disconnects are evident – particularly from a data and systems perspective. We have made huge progress in terms of the volume of research that is technically ‘open’; however, we now need to find out who is actually benefiting.
The current narrative suggests that OA articles are more likely to be cited – but our data suggest this isn’t universally true, or at least that there is more to the story. In addition, citations alone don’t tell us who’s engaging with the content or whether it’s reaching communities outside of academia.
If equity in research means the ability to publish and participate in research fairly (regardless of location, career stage or discipline), should we accept that the measure of success is whether an article has been published OA? Or should we be measuring success based on whether the research achieves its intended aims, reaches its intended audience, and enables meaningful participation across global research communities?
This blog will look at what ‘access’, taken in isolation, is and what it isn’t. Using data from Dimensions, extracted from the Dimensions on GBQ environment alongside World Bank data on GBQ, we challenge the notion that emphasis on publishing OA is enough to ensure equitable participation. We explore what happens when we focus on access without discoverability. We assess whether research participation is happening in a balanced way or whether there are barriers to journal publication – including but not limited to Article Processing Charges (APCs) – and engagement.
To help us with this, we have conducted a benchmarking and data interpretation exercise to understand the wider problem of participation in research.
Let’s begin with a common assumption: that publishing is the ultimate goal for a researcher, and that lower-middle and low-income countries struggle to publish OA at the same rate as upper-middle and high-income countries due to the financial challenges associated with APCs.
The visual on the left (in Chart 1) shows us the number of gold OA articles published in 2023. This view alone might suggest that lower-income countries are being prevented from publishing OA compared to upper-income countries. However, benchmarking against the overall amount of research from these regions shows the reverse – low-income (LIC) and lower-middle-income countries (LMIC) are producing proportionately more OA content.
With this data in mind, we set aside the notion that a general analysis of open participation will drive further insight and shift to participation at journal level. For this analysis, it is useful to consider participation in these terms: where there is intent to contribute to a research topic, is that intent being met or prevented through journal selection and traditional impact measures?
To see this in action, we decided to focus this case study on Indonesian researchers’ contribution to SDG 4, Quality Education.
In a world where participation in global research were truly balanced and contributions to knowledge were reflected proportionally, if Indonesia contributed 10% of overall research on quality education, we would hope to see that 10% Indonesian representation at journal level as well.
To view this, we analyzed journals publishing the most research articles aligned with SDG 4 and benchmarked them against common markers for citation impact and attention. We then assessed the representation of Indonesian research within these journals. Specifically, we calculated the proportion of SDG 4-aligned research with at least one Indonesian-affiliated researcher, aiming for a 10% representation rate. The results are shown in the visual below (Chart 3).
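As a rough sketch of this calculation, the snippet below computes the share of each journal’s SDG 4-aligned articles with at least one Indonesian-affiliated author, ready to be compared against the ~10% expected under proportional representation. The DataFrame, column names and example values are hypothetical stand-ins for data extracted from Dimensions.

```python
# Minimal sketch of the journal-level benchmarking step; the DataFrame, column
# names and example values are hypothetical stand-ins for Dimensions data.
import pandas as pd

articles = pd.DataFrame(
    {
        "journal": ["Journal A", "Journal A", "Journal B", "Journal B", "Journal B"],
        "doi": ["10.1/a1", "10.1/a2", "10.1/b1", "10.1/b2", "10.1/b3"],
        "countries": [["Indonesia", "UK"], ["US"], ["Indonesia"], ["US"], ["UK"]],
    }
)

# Flag articles with at least one Indonesian-affiliated author.
articles["has_indonesian_author"] = articles["countries"].apply(
    lambda cs: "Indonesia" in cs
)

# Share of each journal's SDG 4 output with Indonesian representation (in %),
# to be compared against the ~10% expected under proportional representation.
representation = (
    articles.groupby("journal")["has_indonesian_author"].mean().mul(100).round(1)
)
print(representation)
```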
Our journal-level analysis revealed that the desired 10% participation rate was not met. There was an imbalance within the journals in the level of Indonesian research present. Notably, this imbalance occurred across varying access types and associated publication fees. At the top, Education and Information Technologies – our highest-cited journal and a hybrid title – showed ~2% Indonesian representation. Education Sciences, a gold title with a middling citation average, showed less than 1%. The largest portion of Indonesian research appeared at the bottom left, in two diamond-access regional titles with lower average scores for both citation and attention.
One barrier may therefore be APCs, which are usually higher for market-leading, established journals. (We’d highlight that Cogent Education is the closest to meeting the 10% participation rate; it charges an APC but also offers waivers for LIC and LMIC countries.) However, this is just one of many potential barriers to equitable participation, and one addressed by programs like Research4Life and publisher-led global discounting practices. Our focus here was viewing the research holistically, taking into account how open practices have supported or hindered participation through both journal selection and research impact.
This view (Chart 3) highlights the challenge seasoned publishers face in balancing publication preferences – what motivates or prevents a researcher from selecting a journal – with readership habits, which encompass both accessibility and discoverability, the kind of discoverability established journals typically offer. The low metrics for the diamond OA journals (bottom left, Chart 3) illustrate the challenge journals face in ensuring research reaches readers.
To look more closely at the two sides publishers must mediate between to ensure research meets its potential, we first focus on publication preferences. Many publishers aim to remove participation barriers so that quality research can be shared in a balanced, fully representational way. How can publishers work to ensure this proportional representation?
One approach is reducing the cost of APCs; another is raising awareness. Emerald Publishing uses Dimensions data to benchmark where research in each journal’s subject area comes from and to balance Editorial Advisory Board (EAB) selection proportionally. This practice informs publishers and editors about where the research is coming from, without compromising EAB selection quality, and addresses the issue at journal level regardless of access type or other unintended barriers.
The other aspect of this publisher mediation, and the one crucial to ensuring research is seen by the intended audience, is understanding reader habits. It is important to understand the benefits of making research openly accessible versus accessible, findable, and usable. Access in isolation, without the presence of discoverability to ensure the work reaches the end user, is not enough.
Below we can see the average citations for the top 100 most productive countries by access type (Table 1). We conclude from this brief view that hybrid titles generate more citation activity, as they are established journals with an established readership base.
Table 1: Average and median citations by access type.

| Citation calculation | Closed | Hybrid | Gold (APC charge) | Gold (no APC charge) |
|----------------------|--------|--------|-------------------|----------------------|
| Average              | 1.9    | 3.0    | 1.8               | 1.1                  |
| Median               | 1.8    | 2.9    | 1.8               | 1.0                  |
It is probable that the imbalance in Indonesian representation is shaped by the age and prestige of journals themselves. For the most part, Open Access journals are younger than their subscription-based closed counterparts, and because Journal Impact Factors (JIFs) are based on a two-year citation window, newer journals (both open and closed access) are naturally disadvantaged.
As a result, newer journals that cover emerging or interdisciplinary areas, such as research aligned with the Sustainable Development Goals (SDGs), may find it difficult to achieve similar visibility and ‘reputation’. This creates a compounding effect: newer OA journals may be more inclusive and open to geographically diverse contributions, yet they lack the discoverability and citation momentum of older, established titles.
In turn, researchers from countries like Indonesia are more likely to publish in regional, Diamond OA journals – which remain under-recognized in global research metrics despite playing a crucial role in local knowledge and research ecosystems.
This echoes the concerns raised in the Budapest Open Access Initiative 20th anniversary recommendations (BOAI20), which call for a more equitable and inclusive approach to Open Access – one that recognizes the value of diverse publication venues, fosters participation from underrepresented communities, and moves beyond outdated prestige indicators.
This points to a deeper issue: when discoverability and prestige are unequally distributed across journals, people may judge research quality based on where it’s published, rather than on the actual quality of the research.
This brings us to consider further the practice of prioritizing access above all else: how it may perpetuate bias in a system that assesses research quality based on potential reach, and how that reach can be hindered by the journal itself.
We examined the quality of Indonesian research in high-output titles and found that when venue and discoverability practices align, Indonesian research citations are above average, dispelling any assumption about overall ‘quality’ that may arise from most Indonesian researchers prioritizing access when selecting a journal (Chart 4).
This prompted a further question: Even when quality is demonstrable, is it being recognized globally? A parallel analysis examining citation practices across all low-income countries allowed us to test whether the patterns we observed with Indonesian research reflect broader systemic issues. We found a consistent pattern: research from low-income countries is often overlooked in citation practices, even when it is highly relevant and well-aligned with global priorities and even when it aligns closely with the focus of the citing publication.
The parallel analysis examined global research output from 2013 to 2023, focusing on contributions to Sustainable Development Goals (SDGs), excluding SDG 3 (Good Health and Well-Being) given its high proportion of research. Using author affiliations from the Dimensions database, we categorized publications by author country and matched them to World Bank income group classifications. This allowed us to compare research priorities between high-income and low-income countries over this time.
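A minimal sketch of that classification step is shown below, joining publication-level author countries to World Bank income groups. The column names and example rows are assumptions for illustration rather than the actual Dimensions or World Bank schemas.

```python
# Minimal sketch of matching author countries to World Bank income groups;
# column names and example rows are illustrative assumptions, not real schemas.
import pandas as pd

publications = pd.DataFrame(
    {
        "publication_id": ["p1", "p2", "p3"],
        "author_country": ["Ethiopia", "Germany", "Indonesia"],
        "sdg": ["SDG 2: Zero Hunger", "SDG 7: Affordable and Clean Energy", "SDG 4: Quality Education"],
    }
)

income_groups = pd.DataFrame(
    {
        "country": ["Ethiopia", "Germany", "Indonesia"],
        "income_group": ["Low income", "High income", "Lower middle income"],
    }
)

# Attach an income group to each publication via its author country.
classified = publications.merge(
    income_groups, left_on="author_country", right_on="country", how="left"
)

# Compare thematic focus (publication counts per SDG) across income groups.
print(classified.groupby(["income_group", "sdg"]).size())
```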
As shown in the chart below, there are clear differences in thematic focus. Researchers in low-income countries disproportionately prioritize areas like SDG 2: Zero Hunger and SDG 6: Clean Water and Sanitation – topics that directly reflect the urgent, lived realities in these regions. In contrast, high-income countries show a stronger focus on SDGs such as Affordable and Clean Energy and Partnerships for the Goals. These differing priorities demonstrate the local expertise and indigenous knowledge embedded in lower-income regions – expertise that, as shown in our citation analysis, is not being adequately acknowledged or cited in global research outputs.
In critical areas such as Zero Hunger and Clean Water and Sanitation – topics where low-income countries often hold deep, practical expertise – our citation analysis reveals minimal inclusion of their work by researchers in high-income countries. Specifically, just 0.2% of references in high-income country publications on these SDGs cite publications where authors are based solely in low-income countries. In contrast, over 70% of the references come from publications with authors affiliated exclusively with high-income institutions (74% for Zero Hunger and 71% for Clean Water and Sanitation).
Even when we broaden the scope to include any contribution from a low-income country, the numbers remain stark: 1.41% for Zero Hunger and 1.22% for Clean Water and Sanitation. This is despite the fact that these regions face the most urgent realities tied to these challenges and are actively publishing in these areas.
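The share calculations behind these figures can be sketched as follows, assuming a hypothetical reference-level table where each row records the income group of the citing publication and the income-group composition of the cited publication’s author affiliations; the toy values are illustrative only.

```python
# Minimal sketch of the citation-share calculation; the table structure and toy
# values are illustrative assumptions, not the underlying Dimensions data.
import pandas as pd

references = pd.DataFrame(
    {
        "citing_income_group": ["High income"] * 6,
        "cited_author_groups": [
            ["High income"],
            ["High income"],
            ["High income", "Low income"],
            ["Low income"],
            ["High income"],
            ["Upper middle income"],
        ],
    }
)

hic_refs = references[references["citing_income_group"] == "High income"]

# Share of references citing work authored solely by low-income-country authors.
solely_lic = hic_refs["cited_author_groups"].apply(lambda g: set(g) == {"Low income"})
# Share of references citing work with any low-income-country contribution.
any_lic = hic_refs["cited_author_groups"].apply(lambda g: "Low income" in g)

print(f"Solely LIC-authored: {solely_lic.mean():.1%}")
print(f"Any LIC contribution: {any_lic.mean():.1%}")
```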
These findings point to a clear disconnect between where expertise exists and where it is recognized. In both Zero Hunger and Clean Water and Sanitation – areas where low-income countries have direct, practical experience – we see that their research is vastly under-cited by high-income country publications. This underrepresentation suggests a missed opportunity to draw on locally grounded knowledge that could meaningfully shape global solutions.
This isn’t about a lack of relevant research. It’s about discoverability, visibility, and deeply embedded citation habits. Open Access isn’t just about making research available; it’s about making sure that research is seen, used, and respected within the global knowledge ecosystem.
Emerald has recently launched the Open Lab, which looks at the research ecosystem and how open practices impact it. Its goal is to find real solutions to some of the problems not yet addressed by open practices and some of the problems created by them.
We hope this analysis encourages thoughtful discussion on where the focus should shift, allowing us to effectively evaluate the success of Open Access and helping ensure that all research can meet its full potential.
Authors:
Ann Campbell, Technical Solutions Manager, Digital Science
Katie Davison, Insights Analyst, Emerald Publishing
The post Access vs Engagement – is OA enough? appeared first on Digital Science.