Research, Attention, and Manipulation
Questionable articles can quietly move into Wikipedia and news outlets without notice.
Forensic scientometrics isn’t about finding fraudulent papers. It’s about tracing the signals of manipulation and learning how trust in science is quietly shaped, sometimes distorted, in public view.
How might attention be used strategically – even manipulatively – to amplify questionable science from within the scholarly sphere? We’re not talking about political gaming of scientific information by people outside the research itself. This investigation is about people within the academic sphere – credentialed, appointed, seemingly trustworthy scholars – whose work gets attention beyond the published paper. Given Leslie’s work reviewing a network of people seemingly participating in authorship-for-sale schemes, we started there.
The Changing Landscape of Attention
When we talk about scientific attention, we often focus on citations, impact factors, or journal prestige. But in an increasingly online world, scientific articles travel far beyond the bounds of academia, gaining traction – and, crucially, legitimacy – through social media, Wikipedia edits, and inclusion in news articles, blogs, and SEO-chasing clickbait.
What happens when a potentially illegitimate paper travels these paths, specifically Wikipedia and news articles? How can we see who’s engaging with it online and what they’re saying about it? To better understand the attention being received by such a paper, we’ve used Altmetric, which gauges the online attention of research by tracking a range of sources.
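As a rough illustration of the kind of lookup involved, Altmetric exposes a free v1 API keyed by DOI; the sketch below assumes its standard response fields (`cited_by_msm_count` for mainstream media, `cited_by_wikipedia_count`, `cited_by_posts_count`) and is a minimal example, not the workflow we actually used in Altmetric Explorer:

```python
import json
import urllib.request

ALTMETRIC_API = "https://api.altmetric.com/v1/doi/"

def summarize(data: dict) -> dict:
    """Reduce an Altmetric API response to the source-level counts
    relevant here: news coverage, Wikipedia citations, social posts."""
    return {
        "news": data.get("cited_by_msm_count", 0),        # mainstream media
        "wikipedia": data.get("cited_by_wikipedia_count", 0),
        "social": data.get("cited_by_posts_count", 0),    # social media posts
    }

def attention_counts(doi: str) -> dict:
    """Fetch and summarize attention counts for one DOI."""
    with urllib.request.urlopen(ALTMETRIC_API + doi) as resp:
        return summarize(json.load(resp))
```

A paper with more news and Wikipedia mentions than social posts – the pattern that caught our eye – would show up directly in this summary.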
This post is an exploration, not a conclusion. We’re examining one specific article – published in the Food and Chemical Toxicology journal – that popped up in Altmetric Explorer with more mentions in Wikipedia and News than in social media conversations. So, why focus on these two sources, and why this paper as an example?
Wikipedia and news sources are important information venues. They are trusted by many people and move more slowly, with more checks, than social media channels. But both venues risk misdirection – unintentionally, when Wikipedia editors and news journalists treat all scholarly publications as equally credible, or through intentional manipulation. On Wikipedia, contributors are vetted over time but may still be anonymous, which adds a layer of privacy but also potential cover for bad actors. In news organizations, the influence of editorial decisions, funding sources, and institutional biases creates a complex landscape where misinformation can slip through under the guise of credibility.
This paper – Potential health benefits of carotenoid lutein: An updated review – is also tied to broader concerns about “authorship-for-sale” networks (here and here), where organizations or individuals sell bylines on scientific articles, often without legitimate contribution or peer review. So, how does a paper like that find its way into public-facing channels people tend to trust?
One of the most unsettling aspects of authorship-for-sale is how convincingly these articles mimic real science. They’re published in peer-reviewed journals, often as open access (in this case, for a fee of USD 4,670). They can have DOIs, ethical statements, and inclusion in publisher platforms like ScienceDirect.
And once published, they become citable, naturally.
From there, attention metrics kick in – citations appear in Wikipedia, links in news articles boost search-engine visibility. Before long, visibility becomes a proxy for validity, even when the underlying science is shaky.
That’s the digital scaffolding of legitimacy. But what if someone’s manipulating the scaffolding?
Wikipedia – Trusted by Many, Edited by Anyone
Let’s begin with the spinach.
A version of this article appears as reference #22 on the Romanian Wikipedia page for spinach (spanac). The edit was made in May 2023 by a contributor who describes themselves as a professor and scientist with a clear interest in editing vegetable-related Wikipedia pages.
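Finding every Wikipedia page that cites a given paper can be done with the standard MediaWiki Action API, searching wikitext with an `insource:` filter. The sketch below is a minimal version of that check; the DOI passed in is a placeholder, not the paper’s actual identifier:

```python
import json
import urllib.parse
import urllib.request

def search_url(doi: str, lang: str = "ro") -> str:
    """Build a MediaWiki Action API URL that searches a language edition's
    wikitext for pages containing the DOI string (insource: filter)."""
    params = urllib.parse.urlencode({
        "action": "query",
        "list": "search",
        "srsearch": f'insource:"{doi}"',
        "format": "json",
    })
    return f"https://{lang}.wikipedia.org/w/api.php?{params}"

def pages_citing(doi: str, lang: str = "ro") -> list[str]:
    """Return titles of articles on <lang>.wikipedia.org whose source cites the DOI."""
    with urllib.request.urlopen(search_url(doi, lang)) as resp:
        data = json.load(resp)
    return [hit["title"] for hit in data["query"]["search"]]
```

Running this against the Romanian edition (`lang="ro"`) is how one could confirm, and monitor over time, which pages carry a given reference.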
Is this a case of a well-meaning academic trying to improve public science communication? It might be. The editor explains that they prefer to cite peer-reviewed articles and scientific sources. But they also note their primary area of expertise isn’t botany or nutrition.
That leads us to consider two possibilities:
Naïve Citation – The editor trusted the article simply because it appeared in a “respectable” journal. If so, that’s another reminder of the trust placed in publications and the need to vet them. Even scientists are not immune to the veneer of legitimacy.
Deliberate Propagation – The editor knew the article’s origins and inserted it intentionally to boost its visibility.
Both scenarios are troubling. The first reveals a structural vulnerability in how we curate public knowledge. The second raises the possibility of an organized attempt to plant citations in high-trust spaces.
Which is it? We don’t know. And that’s a problem.
News Media – A Mirage of Reach
The article also shows up in nine news stories. Some of the sources are recognizably health-focused (Verywell Health) or general news with a health section (Lagranepoca.com). Others are not as clearly categorized. By bringing together multiple news sources that give attention to the same research paper, Altmetric helped us spot similarities worth a deeper look.
Six of the news mentions appear to be the same article syndicated across multiple sites or scraped from one site and repeated on others, with broken links and minimal traceability. A few sites, like TodayHeadline or Nouvelles Du Monde, act as content aggregators. Their articles have since disappeared.
That’s not unusual in itself. Web content dies all the time – due to maintenance, expired domains, or takedown requests. But in this case, the pattern raises questions:
Why was the same article published (or copied) in multiple languages? Could the rise of machine translation be playing a part in boosting syndicated content more broadly and at a quicker pace?
Who authored the original? A journalist, a marketer, or a machine?
Is there an SEO strategy at play – one designed to artificially boost the article’s Google ranking?
As of July 10, 2025, the ScienceDirect article ranks in the top 5 results for searches like health benefits of lutein. Coincidence? Possibly. Or maybe someone is optimizing visibility deliberately.
The Limits of Transparency
Here’s what makes this case difficult to untangle.
Wikipedia editors can be pseudonymous or unverifiable.
News sites may not list bylines or maintain archives. The article may also have authors that do not match real people (h/t David Ellis).
Aggregated or translated content further muddies the trail.
And so, while one article doesn’t prove a conspiracy, it hints at something worth investigating. Visibility without vetting creates the conditions for manipulation; research legitimacy can be quietly rewritten in the margins.
The takeaway for readers is that questionable articles can quietly move into news without notice. In this case, the publication has a mention on PubPeer, which is a starting point to check for potential questions about an article. Science needs better defense mechanisms: researchers, journalists, and the public deserve tools to distinguish scientific rigor from its imitations.
Disclosure from Leslie
I don’t hide where I work or what my company does – Digital Science is a research technology company. But the FoSci project lives outside that perimeter. These posts aren’t about promoting products; they’re about understanding how trust in science is built or broken.
Still, I can’t ignore the fact that the tools I have at my disposal allow me to dig. And sometimes, a good investigation begins with a simple ask. This one started with: Do you have anything where you use Altmetric (looking at scholarly attention) forensically? Strictly speaking, no. But in truth, I’ve been circling the deep end of this question for a while, and thanks to experts like Federica, I was able to make progress on this topic.
Fully agree with "Science needs better defense mechanisms: researchers, journalists, and the public deserve tools to distinguish scientific rigor from its imitations." I've been exploring this issue over the past year and had some neat insights from a metascience workshop earlier this year. This is an excellent space for tool-builders and science communicators to innovate in.
To make the message stick, I'd love for this to turn into a recurring blog activity spanning disciplines. We need to show just how far poor quality research travels.