How Experts are Redefining Research Visibility Beyond Traditional Metrics

On-Demand Webinar: The Future of Research Visibility: Beyond Traditional Metrics

Introduction

Success in scientific publishing has long been measured by citations and impact factors. Yet in today’s Medical Affairs landscape, the definition of value is shifting rapidly. This article recaps insights from the recent panel discussion The Future of Research Visibility: Beyond Traditional Metrics, where experts from across the field explored how publication success is evolving, which new measures of impact matter most, and how digital transformation and AI are reshaping the game.

Bringing a wealth of diverse perspectives, the panel featured Shehla Sheikh, Head of Medical Communication & Publications at Kyowa Kirin; Kim Della Penna, Scientific Communications Director for Lymphoma, Myeloid, and Multiple Myeloma at Johnson & Johnson; Myriam Cherif, Founder of Kalyx Medical and former Regional Medical Director at GSK; and Carlos Areia, Senior Data Scientist at Digital Science. The discussion was moderated by Natalie Jonk, Enterprise Marketing Segment Lead, who guided the conversation through the critical challenges and opportunities shaping the future of research visibility.

Success: Still a Moving Target

Defining success remains one of the greatest challenges. For some organizations, it’s still as simple as getting the data published. For others, success means shaping clinical guidelines or influencing real-world decision-making.

Kim explained:

“A lot of these tools help us see who is engaging with our publication. Are they sharing the publication, did they find it important enough to share? Where is the data being incorporated? Is it being used in policy and guidelines, cost data, real-world healthcare data or by population health decision makers for access?”

Myriam emphasized how the lens has broadened over the past decade:

“A decade ago, people just looked at impact factors and citations. Now, we discuss with HCPs how data applies to patients. Sometimes a paper may be more practical for certain regions. We’ve moved toward a more holistic approach.”

Metrics Beyond the Traditional

Today, a wealth of data is available, but the challenge is deciding which metrics are truly meaningful. Downloads, mentions, and social media shares are only part of the story.

Carlos noted the complexity:

“Things are changing quite fast with data. How do you track success when different publications have different goals? Sometimes the goal is to see how quickly new studies get into clinical guidelines. Other times, it’s about reaching a very specific group of oncologists in one country.”

Sentiment analysis is also emerging as a key tool:

“We can now see if a publication has been well or badly received by, for example, a group of cardiologists. Medical Affairs is adapting rapidly to what real-time data can offer,” Carlos added.

The Discoverability Dilemma

Shehla raised a critical issue: ensuring publications are findable by the right stakeholders.

“Discoverability is super important. A lot of data ends up in supplementary indices, which aren’t always accessible. If it’s not directly available through the paper, that’s problematic. It raises the question: how much do we include in the main publication versus holding back for supplementary materials?”

The difficulty, she argued, isn’t just in publishing but in making materials trackable. Without DOIs or identifiers, measuring performance across channels becomes impossible.

Carlos emphasized that when any content type, including supplementary data, infographics, and plain language summaries, is uploaded to Figshare and assigned a DOI, it becomes both accessible and trackable. Several Digital Science customers are already taking this step to monitor and demonstrate the impact of their materials and to gain deeper insight into who is engaging with their content.
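To make that workflow concrete, here is a minimal Python sketch that creates a draft item and reserves a DOI through the public Figshare API v2. The token, title, and metadata values are hypothetical placeholders, and the endpoint paths follow the published API documentation; treat it as an illustrative sketch rather than a verified integration.

```python
import requests

# Assumptions: a Figshare account with a personal API token; endpoint paths and
# response fields follow the public Figshare API v2 documentation (illustrative).
BASE = "https://api.figshare.com/v2"
TOKEN = "YOUR_PERSONAL_TOKEN"  # hypothetical placeholder
HEADERS = {"Authorization": f"token {TOKEN}"}

def create_item_with_doi(title: str, description: str, item_type: str = "figure") -> dict:
    """Create a draft item (e.g. an infographic or plain language summary)
    and reserve a DOI so the output becomes citable and trackable."""
    # 1. Create a draft item with basic metadata.
    resp = requests.post(
        f"{BASE}/account/articles",
        headers=HEADERS,
        json={"title": title, "description": description, "defined_type": item_type},
        timeout=30,
    )
    resp.raise_for_status()
    item_id = resp.json()["entity_id"]

    # 2. Reserve a DOI for the draft; the DOI is minted when the item is published.
    doi_resp = requests.post(
        f"{BASE}/account/articles/{item_id}/reserve_doi", headers=HEADERS, timeout=30
    )
    doi_resp.raise_for_status()
    return {"item_id": item_id, "doi": doi_resp.json().get("doi")}

if __name__ == "__main__":
    record = create_item_with_doi(
        "Plain language summary: Study XYZ",  # hypothetical title
        "Patient-friendly summary of the primary endpoint results.",
    )
    print(record)
```

Once the item is published, the reserved DOI becomes the persistent identifier against which downloads, citations, and Altmetric attention can then be tracked across channels.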

Formats and Channels that Resonate

Visual and digital formats are transforming scientific communication. With tools like Altmetric and Figshare, it’s now possible to track which content resonates with different audiences – for example, whether visual abstracts work best for patients, short videos for junior doctors, or news platforms such as Medscape for senior clinicians.

Key takeaways from the discussion included:

  • Infographics and visual abstracts help make complex data more digestible for both HCPs and patients.
  • Social media engagement, accelerated since COVID-19, has expanded the demographic reach of publications.
  • Podcasts, YouTube, and blogs are emerging as alternative channels for research dissemination.

Shehla summarized the opportunity:

“Data visualization has been a game changer. It helps people understand complex results without dumbing them down. But it has to be a true representation of the data.”

Strategic Decision-Making with Engagement Data

Engagement data is no longer just descriptive – it’s strategic.

Myriam explained:

“This data helps us know which publications to amplify and in what format. If a subgroup analysis is relevant for Asia or South America, we integrate it into the regional strategy. Affiliates want to know how to use this data locally, whether in slides or field medical materials.”

Carlos added an example of reverse engineering success:

“We worked with a partner who had two trials presented at the same congress. One made it into a guideline in a specific country much faster than the other. By looking back at the local attention it had on social media, news and others, we tried to understand why.”

The Future: AI, Social Media, and Trust

Looking ahead, AI and digital platforms are set to further disrupt how success is measured.

Myriam highlighted new challenges:

“Citations and downloads will matter less. AI tools are already being used by HCPs to answer questions on diseases and treatments. But a recent study showed less than 15% overlap in references across Google, ChatGPT, and Perplexity when asked the same question. Metadata and referencing are going to be critical to ensure our publications are being picked up correctly.”
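The overlap figure Myriam cites can be made concrete with a simple set comparison. The sketch below computes pairwise Jaccard overlap between reference lists returned by different tools; the DOIs are invented for illustration and are not drawn from the study she mentions.

```python
from itertools import combinations

# Hypothetical reference lists (DOIs) returned by three tools for the same question.
# These values are invented for illustration only.
references = {
    "Google":     {"10.1000/a1", "10.1000/a2", "10.1000/a3", "10.1000/a4"},
    "ChatGPT":    {"10.1000/a1", "10.1000/b1", "10.1000/b2", "10.1000/b3"},
    "Perplexity": {"10.1000/b1", "10.1000/c1", "10.1000/c2", "10.1000/c3"},
}

def jaccard(a: set, b: set) -> float:
    """Share of references two tools have in common (intersection over union)."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

for (name_a, refs_a), (name_b, refs_b) in combinations(references.items(), 2):
    print(f"{name_a} vs {name_b}: {jaccard(refs_a, refs_b):.0%} overlap")
# With the toy data above, every pairwise overlap comes out below 15%.
```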

Kim added:

“We need to optimize what we create so AI can pick up data through correct tagging. Who is engaging, what types of data they’re engaging with, and what channel they use – these are all factors we have to plan for.”

Carlos cautioned on the risks:

“AI is a wonderful tool if used correctly – but like computer scientists used to say: it’s ‘garbage in, garbage out’. AI is very confident even when it’s wrong. The real value comes from using the right data together with AI to help people understand it better and extract the needed insights from it, whilst mitigating its potential for misuse and misinformation.”

Conclusion: Toward a Holistic, Dynamic View of Impact

As the panel made clear, measuring publication performance can no longer be reduced to a single number. Success is multi-dimensional, context-specific, and evolving alongside technology and stakeholder expectations.

Traditional metrics such as citations and impact factors remain useful, but they are no longer sufficient. Engagement data, sentiment, and discoverability are now central to understanding whether a publication truly resonates and reaches its intended audience. At the same time, AI, social media, and new digital formats are reshaping how, and by whom, research is consumed. And sometimes, the most meaningful measures are the informal ones: when medical science liaisons hear healthcare professionals discussing a paper, when KOLs reference it unprompted, or when data directly influences patient care.

A Call to Reframe Success

The future of publication success will depend on Medical Affairs teams embracing this broader, more dynamic definition of impact. By combining rigorous traditional metrics with innovative digital measures, and by ensuring content is discoverable, trackable, and presented in accessible formats, organizations can create lasting value. Most importantly, reframing success around real-world influence and patient outcomes ensures that research doesn’t just get published; it makes a difference.

Continue the Conversation

At Digital Science, we’re committed to helping Medical Affairs professionals thrive in an era where research visibility and impact are being redefined. To deepen the insights shared in this panel, we invite you to explore our latest white paper, “Empowering Medical Affairs in the Digital Age,” authored by thought leader Mary Ellen Bates. Inside, you’ll find practical strategies to navigate evolving challenges, demonstrate value, and drive measurable outcomes.

Mary Ellen Bates will also be leading our upcoming webinar, “From Data Chaos to Strategic Impact: Transforming Medical Affairs in the Digital Age” (Tuesday 28 October 2025).


From data to decisions: Accelerating public sector outcomes in Singapore

Singapore’s public sector has long been recognised as a leader in evidence-based policymaking.

But fragmented systems and manual review processes are still slowing down critical insights.

This case study explores how Dimensions is helping agencies in Singapore to:

  • Unify grants, publications, patents, collaborators, and policy into a single secure platform
  • Cut analysis cycles from weeks to hours, freeing staff for higher-value work
  • Strengthen accountability and transparency with audit-ready records
  • Deliver better alignment with national initiatives such as Smart Nation, RIE2025, and the Digital Government Blueprint

Unlock the full case study to see how Singapore agencies are making data work harder, faster, and smarter.


Get the case study


Digital Science investigation shows millions of taxpayers’ money has been awarded to researchers associated with fictitious network

Thursday 4 September 2025 – London, UK and Chicago, USA

Researchers associated with a fictitious research network and funding source have collectively netted millions of dollars of taxpayers’ money from the United States, Japan, Ireland, and other nations to fund their current studies. That’s according to investigations led by Digital Science’s VP of Research Integrity, Dr Leslie McIntosh.

The results of her investigations raise serious concerns about the lack of accountability for those involved in questionable research publications.

“This example illustrates how weaknesses in research and publishing systems can be systematically exploited, so that researchers can game the system for their own benefit,” Dr McIntosh says.

Dr McIntosh – one of the co-founders of the Forensic Scientometrics (FoSci) movement – has presented her analysis at this week’s 10th International Congress on Peer Review and Scientific Publication in Chicago, in a talk entitled: Manufactured Impact: How a Non-existent Research Network Manipulated Scholarly Publishing.

While not naming the individual researchers involved, Dr McIntosh’s presentation was centered on a group known as the Pharmakon Neuroscience Network, a non-existent body listed on more than 120 research publications from 2019–2022 until being exposed as fictitious. These publications involved 331 unique authors and were associated with 232 organizations and institutions across 40 countries.

Research network raised multiple red flags

The Pharmakon Neuroscience Network functioned as a loosely organized collaboration of predominantly early-career researchers, such as postdoctoral researchers and PhD students, whose publications included:

  • Funding acknowledgments citing unverifiable organizations
  • Questionable or unverifiable institutional affiliations
  • Suspiciously high citation counts within a short timeframe
  • Global connectedness despite the authors’ young publication age

“Despite clear concerns about the legitimacy of their work, only three papers have been formally retracted to date,” Dr McIntosh says.

Using Digital Science’s research solutions Dimensions and Altmetric, Dr McIntosh and colleagues have tracked the progress of the authors connected with this network.

“Once the Pharmakon Neuroscience Network was exposed as being fake in 2022, it no longer appeared on publications, but many of the researchers associated with it have continued to publish and attract significant funding for their work,” she says.

Millions in funding for current research

Of the initial 331 researchers associated with the Pharmakon Neuroscience Network’s publications, Dr McIntosh has established that more than 20 currently have funding either as a Principal Investigator or a Co-Principal Investigator from sources where the grant commenced in 2022 or later. During this time, those researchers have collectively been awarded the equivalent of at least US$6.5 million from the US, Japan, Ireland, France, Portugal, and Croatia, plus an undisclosed sum from Russia, spanning seven countries in all.

One researcher with more than US$50 million in funding has authorship on one of the Pharmakon papers. It is not clear whether he knowingly participated in the network or whether the paper stemmed from the activity of a former student.

“Many of the researchers had grants before and after Pharmakon. This is legitimate, taxpayer money in most instances that are funding very unethical practices,” Dr McIntosh says.

“One aspect we need more time to vet is the possibility that a few of these researchers do not know they were authors on papers within this network. We are still completing this work.”

Of the funded researchers, five had never previously received funding for their research, but following their involvement with the Pharmakon Neuroscience Network they have since been awarded grants from the following sources (US$ equivalent):

  • Science Foundation Ireland – $649,891
  • Ministry of Science, Technology and Higher Education (Portugal) – $538,904 total
  • Croatian Science Foundation – $206,681
  • Russian Science Foundation – undisclosed sum

“Here we have evidence that some authors have secured legitimate funding, including large sums of taxpayers’ money, following their participation in questionable research and publication activity,” Dr McIntosh says.

“We can presume that their publication portfolio, no matter how it was obtained, helped in securing this funding from legitimate sources.”

Dr McIntosh says this case has implications across the research system and emphasizes the need for stronger verification, monitoring, and cooperation.

“Although most of these publications remain in circulation and have been cited widely, corrective actions have been limited. This highlights the challenge of addressing such networks once their work is embedded in the scholarly record,” she says.

Recommendations

Dr McIntosh recommends the following:

  • Oversight should be reinforced by requiring the use of verified institutional identifiers, such as GRID or ROR, in all publications to ensure affiliations are legitimate and traceable (a minimal identifier-lookup sketch follows this list).
  • Transparency should be mandated through clearer author contribution statements and verified funding acknowledgments, creating a more reliable and accountable record of how research is conducted and supported.
  • Monitoring mechanisms should be improved by supporting the adoption of forensic scientometrics, which can detect unusual collaboration patterns or questionable authorship practices before they become systemic.
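As a small illustration of the first recommendation, the sketch below queries the public ROR API with an author-supplied affiliation string and returns candidate organizations with their persistent identifiers. The affiliation string is a hypothetical example and the response fields follow the documented v1 affiliation-matching endpoint; treat it as a sketch rather than production verification logic.

```python
import requests

ROR_API = "https://api.ror.org/organizations"

def verify_affiliation(affiliation: str) -> list:
    """Look up an author-supplied affiliation string against the ROR registry
    and return candidate organizations with their persistent identifiers."""
    resp = requests.get(ROR_API, params={"affiliation": affiliation}, timeout=30)
    resp.raise_for_status()
    candidates = []
    for item in resp.json().get("items", []):
        org = item.get("organization", {})
        candidates.append({
            "ror_id": org.get("id"),     # persistent identifier, e.g. an https://ror.org/... URL
            "name": org.get("name"),
            "score": item.get("score"),  # matching confidence reported by the API
            "chosen": item.get("chosen"),  # True when ROR is confident in the match
        })
    return candidates

if __name__ == "__main__":
    # Hypothetical affiliation string as it might appear on a manuscript.
    for match in verify_affiliation("Department of Neurology, Harvard University"):
        print(match)
```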

“By addressing these gaps, governments, publishers and research institutions alike can help protect the integrity of the research system and ensure that trust in science is maintained,” Dr McIntosh says.

See further detail about this investigation in Dr McIntosh’s blog post: From Nefarious Networks to Legitimate Funding.

About Digital Science

Digital Science is an AI-focused technology company providing innovative solutions to complex challenges faced by researchers, universities, funders, industry and publishers. We work in partnership to advance global research for the benefit of society. Through our brands – Altmetric, Dimensions, Figshare, IFI CLAIMS Patent Services, metaphacts, OntoChem, Overleaf, ReadCube, Symplectic, and Writefull – we believe when we solve problems together, we drive progress for all. Visit digital-science.com and follow Digital Science on Bluesky, on X or on LinkedIn.

Media contact

David Ellis, Press, PR & Social Manager, Digital Science: Mobile +61 447 783 023, d.ellis@digital-science.com


Altmetric adds Sentiment Analysis to social media tracking

AI-powered Sentiment Analysis to provide deeper insights into how research is being received

Tuesday 2 September 2025

Digital Science is pleased to announce that Altmetric, which captures the online attention of research, has introduced a new AI-powered Sentiment Analysis feature to provide research teams with deeper insights into the public response to, and impact of, their work on selected social media platforms.

Now available in Altmetric Explorer, Altmetric’s AI-powered Sentiment Analysis has been rigorously refined to capture sentiment towards the use of research, thanks to the work of Digital Science Senior Data Scientist Dr Carlos Areia and Head of Data Insights Mike Taylor, in consultation with the research community.

Mike Taylor said: “Impactful research deserves the best possible insights. Our new Sentiment Analysis feature gives some meaning to numbers, leveraging advanced technology to interpret and visualize the sentiment behind mentions on key social media platforms, and brings the potential to turn raw data into actionable insights for members of the research community.”

Using AI to assign scores to individual mentions, Altmetric builds a spectrum of sentiment for a given research output. By capturing the full range of reactions and discourse on social media, Sentiment Analysis helps research teams better understand how their work is being received and engaged with online across different audiences.
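The announcement does not describe the underlying model, but the general shape of the approach can be illustrated with a toy sketch: assign each mention a numeric score and bucket the scores into named bands from strong negative to strong positive. The scores and band boundaries below are invented for illustration and are not Altmetric’s.

```python
from collections import Counter

# Hypothetical sentiment scores (-1.0 to +1.0) that a model might assign to
# individual social media mentions of one research output. Values are invented.
mention_scores = [0.82, 0.41, 0.05, -0.12, -0.67, 0.93, 0.30, -0.04, 0.55, -0.85]

# Illustrative band boundaries; the real categories and thresholds may differ.
BANDS = [
    ("strong negative", -1.00, -0.60),
    ("negative",        -0.60, -0.20),
    ("neutral",         -0.20,  0.20),
    ("positive",         0.20,  0.60),
    ("strong positive",  0.60,  1.00),
]

def band(score: float) -> str:
    """Map a numeric sentiment score onto a named band of the spectrum."""
    for name, low, high in BANDS:
        if low <= score <= high:
            return name
    raise ValueError(f"score out of range: {score}")

# Count mentions per band to produce a simple sentiment breakdown.
spectrum = Counter(band(s) for s in mention_scores)
for name, _, _ in BANDS:
    print(f"{name:>15}: {spectrum.get(name, 0)} mentions")
```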

“There are many potential benefits from these new insights, including the opportunity for research teams to refine their approach to research publication, communication and dissemination plans,” Taylor said.

Key Features of Altmetric Sentiment Analysis

  • Sentiment Scoring: Automatically assigns a sentiment score to individual social media mentions (ranging from strong negative to strong positive).
  • Sentiment Breakdown Charts: Visualize sentiment trends with clear and concise graphical representations. Research teams can quickly identify changes in perception and respond accordingly.
  • Filtering by Sentiment: Narrow down results in the Altmetric Explorer by sentiment type, allowing users to focus on specific aspects of discussions most relevant to their strategy or goals.

Amye Kenall, Chief Product Officer, Digital Science, said: “The inclusion of Sentiment Analysis into Altmetric data is an important step in helping users get real insight from Altmetric data, enabling researchers and organizations to understand how their publications are being received, discussed and used. Digital Science is committed to using AI responsibly and ethically in ways that drive more value to our users but also protect the community we serve. We’re pleased to bring this feature to our Altmetric Explorer users.

“Medical affairs professionals, academic researchers, scholarly publishers, and R&D specialists alike can fully explore the ‘how and why’ behind their impact, leveraging these insights to maximize the visibility and effectiveness of their published research.”

About Altmetric

Altmetric is a leading provider of alternative research metrics, helping everyone involved in research gauge the impact of their work. We serve diverse markets including universities, institutions, government, publishers, corporations, and those who fund research. Our powerful technology searches thousands of online sources, revealing where research is being shared and discussed. Teams can use our powerful Altmetric Explorer application to interrogate the data themselves, embed our dynamic ‘badges’ into their webpages, or get expert insights from Altmetric’s consultants. Altmetric is part of the Digital Science group, dedicated to making the research experience simpler and more productive by applying pioneering technology solutions. Find out more at altmetric.com and follow @altmetric on X and @altmetric.com on Bluesky.

About Digital Science

Digital Science is an AI-focused technology company providing innovative solutions to complex challenges faced by researchers, universities, funders, industry and publishers. We work in partnership to advance global research for the benefit of society. Through our brands – Altmetric, Dimensions, Figshare, IFI CLAIMS Patent Services, metaphacts, OntoChem, Overleaf, ReadCube, Symplectic, and Writefull – we believe when we solve problems together, we drive progress for all. Visit digital-science.com and follow Digital Science on Bluesky, on X or on LinkedIn.

Media Contact

David Ellis, Press, PR & Social Manager, Digital Science: Mobile +61 447 783 023, d.ellis@digital-science.com

