Thursday, September 04, 2025

CRIMINAL CAPITALI$M

Digital Science investigation shows millions in taxpayers’ money have been awarded to researchers associated with a fictitious network




Digital Science

Image: Dr Leslie McIntosh, VP of Research Integrity, Digital Science. Credit: Digital Science.

Researchers associated with a fictitious research network and funding source have collectively netted millions of dollars of taxpayers’ money for current studies from the United States, Japan, Ireland, and other nations. That’s according to investigations led by Digital Science’s VP of Research Integrity, Dr Leslie McIntosh.

The results of her investigations raise serious concerns about the lack of accountability for those involved in questionable research publications.

“This example illustrates how weaknesses in research and publishing systems can be systematically exploited, so that researchers can game the system for their own benefit,” Dr McIntosh says.

Dr McIntosh – one of the co-founders of the Forensic Scientometrics (FoSci) movement – has presented her analysis at this week’s 10th International Congress on Peer Review and Scientific Publication in Chicago, in a talk entitled: Manufactured Impact: How a Non-existent Research Network Manipulated Scholarly Publishing.

While not naming the individual researchers involved, Dr McIntosh’s presentation centered on a group known as the Pharmakon Neuroscience Network, a non-existent body listed on more than 120 research publications from 2019 to 2022, until it was exposed as fictitious. These publications involved 331 unique authors and were associated with 232 organizations and institutions across 40 countries.

Research network raised multiple red flags

The Pharmakon Neuroscience Network functioned as a loosely organized collaboration of predominantly early-career researchers, such as postdoctoral researchers and PhD students, whose publications included:

  • Funding acknowledgments citing unverifiable organizations
  • Use of questionable or unverifiable institutional affiliations
  • Suspiciously large citation counts accumulated in a short timeframe
  • Globally connected collaboration patterns despite the authors’ young publication age

“Despite clear concerns about the legitimacy of their work, only three papers have been formally retracted to date,” Dr McIntosh says.

Using Digital Science’s research solutions Dimensions and Altmetric, Dr McIntosh and colleagues have tracked the progress of the authors connected with this network.

“Once the Pharmakon Neuroscience Network was exposed as being fake in 2022, it no longer appeared on publications, but many of the researchers associated with it have continued to publish and attract significant funding for their work,” she says.

Millions in funding for current research

Of the initial 331 researchers associated with the Pharmakon Neuroscience Network’s publications, Dr McIntosh has established that more than 20 currently hold funding as either a Principal Investigator or a Co-Principal Investigator on grants that commenced in 2022 or later. During this time, those researchers have collectively been awarded the equivalent of at least US$6.5 million from the US, Japan, Ireland, France, Portugal and Croatia, plus an undisclosed sum from Russia – seven countries in total.

One researcher with more than US$50 million in funding has authorship on one of the Pharmakon papers. It is not clear whether he knowingly participated in the network or whether his involvement stemmed from the activity of a former student.

“Many of the researchers had grants before and after Pharmakon. In most instances this is legitimate taxpayer money that is funding very unethical practices,” Dr McIntosh says.

“One aspect we need more time to vet is the possibility that a few of these researchers do not know they were authors on papers within this network. We are still completing this work.”

Of the funded researchers, five had never previously received funding for their research; following their involvement with the Pharmakon Neuroscience Network, they have been awarded grants from the following sources (US$ equivalent):

  • Science Foundation Ireland – $649,891
  • Ministry of Science, Technology and Higher Education (Portugal) – $538,904 total
  • Croatian Science Foundation – $206,681
  • Russian Science Foundation – undisclosed sum

“Here we have evidence that some authors have secured legitimate funding, including large sums of taxpayers’ money, following their participation in questionable research and publication activity,” Dr McIntosh says.

“We can presume that their publication portfolio, no matter how it was obtained, helped in securing this funding from legitimate sources.”

Dr McIntosh says this case has implications across the research system and emphasizes the need for stronger verification, monitoring, and cooperation.

“Although most of these publications remain in circulation and have been cited widely, corrective actions have been limited. This highlights the challenge of addressing such networks once their work is embedded in the scholarly record,” she says.

Recommendations

Dr McIntosh recommends the following:

  • Reinforce oversight by requiring the use of verified institutional identifiers, such as GRID or ROR, in all publications, so that affiliations are legitimate and traceable (a minimal sketch of such a check follows this list).
  • Mandate transparency through clearer author contribution statements and verified funding acknowledgments, creating a more reliable and accountable record of how research is conducted and supported.
  • Improve monitoring mechanisms by supporting the adoption of forensic scientometrics, which can detect unusual collaboration patterns or questionable authorship practices before they become systemic.
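
To illustrate the first recommendation, here is a minimal sketch of how a publisher or funder might check a free-text affiliation string against the Research Organization Registry (ROR). It assumes the public ROR affiliation-matching endpoint (https://api.ror.org/organizations?affiliation=...) and is not the tooling used in Dr McIntosh’s investigation:

```python
# A minimal sketch (assumption: the public ROR affiliation-matching endpoint),
# not the method used in the investigation described above.
import requests

ROR_MATCH_URL = "https://api.ror.org/organizations"

def verify_affiliation(affiliation: str):
    """Return the ROR record that ROR marks as a confident match, or None."""
    resp = requests.get(ROR_MATCH_URL, params={"affiliation": affiliation}, timeout=10)
    resp.raise_for_status()
    for item in resp.json().get("items", []):
        if item.get("chosen"):  # ROR flags at most one high-confidence match
            org = item["organization"]
            return {"ror_id": org["id"], "name": org["name"], "score": item["score"]}
    return None  # no verifiable match: a red flag worth manual review

# An affiliation that cannot be resolved to a ROR identifier, such as the
# fictitious network's name, would be surfaced for review rather than accepted.
print(verify_affiliation("Pharmakon Neuroscience Network"))
```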

“By addressing these gaps, governments, publishers and research institutions alike can help protect the integrity of the research system and ensure that trust in science is maintained,” Dr McIntosh says.

See further detail about this investigation in Dr McIntosh’s blog post: From Nefarious Networks to Legitimate Funding.


About Digital Science

Digital Science is an AI-focused technology company providing innovative solutions to complex challenges faced by researchers, universities, funders, industry and publishers. We work in partnership to advance global research for the benefit of society. Through our brands – Altmetric, Dimensions, Figshare, IFI CLAIMS Patent Services, metaphacts, OntoChem, Overleaf, ReadCube, Symplectic, and Writefull – we believe when we solve problems together, we drive progress for all. Visit digital-science.com and follow Digital Science on Bluesky, on X or on LinkedIn.


Media contact

David Ellis, Press, PR & Social Manager, Digital Science: Mobile +61 447 783 023, d.ellis@digital-science.com


 

Fraudsters use fake stars to game GitHub




Carnegie Mellon University

Image: fake GitHub stars. Credit: Carnegie Mellon University.

Millions of users of GitHub, the premier online platform for sharing open-source software, rely on stars to establish their software product's credibility. But new research from Carnegie Mellon University's School of Computer Science shows this star-based system has grown star-crossed.

Researchers in the Software and Societal Systems Department found that users increasingly buy or trade for fake stars. Some use their ill-gotten cachet to quickly build up the reputation of repositories, the GitHub homes for software projects. More diabolically, other bad actors use fake stars to attract unsuspecting users so they can steal their cryptocurrency, swipe their credentials or trick them into downloading malicious software.

From July 2019 to December 2024, the CMU researchers counted six million stars on GitHub that appear to be fakes.

"We're not the first people to talk about this, so we weren't surprised to find fake stars that were associated with scams," said Bogdan Vasilescu, an S3D associate professor. "But we were surprised by their volume."

Social media platforms have long used popularity signals — such as stars, likes, followers or retweets — to establish user reputations. And for just as long, some users have boosted their profiles by buying or otherwise acquiring fake, unearned stars or other signals.

"One way to think of the GitHub ecosystem as a whole is as an attention economy, much like social media," Vasilescu said.

GitHub is organized such that each software project resides in its own repository. There are millions of repositories, and they all seek to attract users who might download their software or help further develop a project. In such a competitive enterprise, accumulating stars can be critical.

Fake stars might seem like a shortcut to such acceptance, and various vendors have cropped up to sell them. (Try Googling "buy GitHub stars.") But it's also possible to trade for stars on exchange sites. The use of fake stars on GitHub began growing in 2022 and surged in 2024. At their peak — to date, in July 2024 — more than 16 percent of GitHub repositories were associated with fake star campaigns.

The CMU researchers found that fake star campaigns intended simply to boost the popularity of a repository usually don't work for long. Unfortunately, most fake star campaigns are associated with malicious repositories that pose a real security threat.

Some of these scam repositories try to entice someone to download a bit of software — a cheat for a videogame, for example. Launching the software activates a bit of hidden, nasty software. For example, the researchers cite a file that appeared to be a blockchain application but actually stole cryptocurrency from the user.

A more ambitious approach, called a software supply chain attack, involves a piece of malicious code attached to legitimate, widely used software.

"No software these days is written from scratch," Vasilescu said. "We reuse things as much as possible. Every bit of software is built upon other bits of software."

These packages of reused and overlapping bits of software constitute the software supply chain. A bit of nefarious software inserted in the right place can affect a substantial number of applications downstream in the supply chain.

Last year, computer scientists uncovered a software supply chain attack that came to be known as the XZ Backdoor. A fraudster gained access to XZ Utils, a package of compression/decompression software that has been incorporated into many computer systems. The fraudster added hidden software that created a "backdoor" that enabled unauthorized access to any computer system using this modified software.

"XZ Backdoor was the most famous attack of this kind to date and also the most elaborate," Vasilescu said.

The perpetrator spent about two years earning the trust of the people who controlled the XZ Utils repository until he was given authorization to modify the code. The scheme was cut short, however, when a Microsoft software engineer discovered the backdoor while investigating the cause of unusual software test results.

While the XZ Backdoor supply chain attack was not propped up by fake stars, it illustrates how vulnerable open-source software can be when nefarious actors are involved. Fake stars muddy the waters when trying to differentiate the good from the bad.

To study the fake star phenomenon, the researchers — including Hao He, a Ph.D. student in software engineering, and Christian Kästner, an associate professor in S3D — created a tool called StarScout that scans GitHub activity looking for anomalous behaviors.

The behaviors of interest fall into two categories. One involves accounts with little activity, often with empty profiles and default avatars. The other involves large groups of accounts that seem to act in lockstep, all awarding stars within a short time span.

This latter category seems to identify accounts linked with star vendors, Vasilescu said.

"It's necessarily the case that if you're one of these merchants, the delivery of fake stars happens quickly because otherwise you would have a dissatisfied customer," he explained.

What's to be done about the fake star problem? The researchers suggest that reducing the reliance on stars for GitHub's reputation system would make sense. Another approach might be to not count everyone's stars equally. Perhaps only users who have held accounts for a long time or have otherwise established their own reputations should be allowed to issue stars.
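
As a rough illustration of that last idea, a reputation-weighted count might scale each star by the age of the account that issued it. The one-year saturation point below is an arbitrary assumption, not a proposal from the researchers:

```python
# A sketch of "not counting everyone's stars equally": weight each star by the
# age of the starring account at the time it starred, saturating after a year.
# The saturation period is an assumption chosen purely for illustration.
from datetime import datetime, timezone

def star_weight(account_created_at: datetime, starred_at: datetime,
                saturation_days: int = 365) -> float:
    """Return a weight in [0, 1]; brand-new accounts contribute almost nothing."""
    age_days = (starred_at - account_created_at).days
    return min(max(age_days, 0) / saturation_days, 1.0)

# A repository's adjusted star count is then the sum of these weights.
created = datetime(2024, 7, 1, tzinfo=timezone.utc)
starred = datetime(2024, 7, 2, tzinfo=timezone.utc)
print(round(star_weight(created, starred), 3))  # 0.003: a day-old account barely counts
```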

Vasilescu said that regular use of a tool such as StarScout would also be advisable. In this case, the people who run GitHub have advantages in using such tools. Unlike the CMU researchers, who could only access public data, GitHub operators can access private data, such as user IP addresses.

The research team's report on StarScout and GitHub fake stars has been accepted for the 2026 International Conference on Software Engineering. In addition to He, Vasilescu and Kästner, the team included Haoqin Yang, an undergraduate CMU computer science major; Alexandros Kapravelos, a computer scientist at North Carolina State University; and Philipp Burckhardt, a data scientist with Socket Inc., a cybersecurity firm that specializes in software supply chain security.


