Fake Disease Hoax Exposes Internet's Gullibility

Until a few years ago, no one had heard of bixonimania. Then, in 2024, a group of scientists posted findings online announcing the condition, which they claimed affected the eyes after computer use. However, the scientists had made it all up: not just the work, but the authors' names, affiliations, locations and funding sources, which they attributed to the University of Fellowship of the Ring and the Galactic Triad.

Large language models like ChatGPT and Gemini treated it as real anyway, and in doing so, helped turn a fictional disease into a legitimate-sounding health concern.

Bixonimania is not an isolated case. Being deceived - whether you are a person or an AI model - is concerningly common, in science and beyond. Whether we're talking about AI hallucinations, state-backed disinformation or just everyday lies, humans have a remarkable knack for naivety, owing to our biases and increasing need to outsource learning to others. These are problems we - individually and collectively - urgently need to better understand and overcome.

Our shared fascination with deception may help explain the popularity of The Traitors, a TV programme built around the tension between trust and suspicion, where contestants must decide who among them is deceiving the group.

The show captures something intrinsic to being human: the persistent uncertainty over whether we're placing our trust wisely. Yet in the modern era of mass digital communication and AI, we're almost constantly faced with a similar threat, often without realising it.

At a recent event at the Cambridge Festival, we aimed to highlight this risk through a Traitors-themed science session. Four panellists presented work, all of which could have been a lie. The audience was asked to vote on which of the presenters was deceiving them and why.

We deliberately made the presenters and their work outlandish. From their varying backgrounds and with varying accents, the panellists presented their work in global health, climate, media and astrophysics. Some dressed formally, while one - a Nigerian researcher presenting her work on immigration in a healthcare context - wore clothes linked to her ethnic identity.

We were interested in exploring which of these signals - accent, gender, ethnicity, dress and presentation style - influenced the audience's decisions. Both content and presentation swayed them, but the signals they relied on led them astray: they rated the traitors as more credible than the honest researchers.

The ones who received the most votes were the two "faithful" researchers (to use the language of The Traitors) - Ada, from the non-profit Development Media Initiative, and Sarah, an astrophysicist working in galactic archaeology.

Ada's team had saved lives by sharing health information with communities in the global south through running ten radio broadcasts daily. The audience thought the results were implausibly impressive.

"Ada's data is too good to be true," one person reported in our questionnaire. She was also presenting work she hadn't personally contributed to. Even though this is common in large collaborations, this distance led to perceptions of a lack of confidence, undermining her credibility.

Sarah, an astrophysicist, had presented her subfield of galactic archaeology - the study of the Milky Way's formation history through the chemical signatures of ancient stars. Yet with only four minutes to speak, she was unable to convey significant depth. The audience read that as a lack of understanding.

The outlandishness of her field's name also harmed perceptions of her legitimacy. "Galaxy [sic] archaeology is too cool a name to exist," one audience member wrote.

By contrast, the two traitors, Jack and Joyce, received the fewest votes. Jack was an actor who created the persona of a climate researcher specialising in rain. Joyce presented her own work but falsified the results.

Interestingly, Joyce's personal connection to her work - she is a Nigerian woman conducting research into Nigerian communities - helped to convince the audience of her authenticity. "Joyce's presentation sounded very considered and genuine - the process of her research and recounts of her personal experiences sounded like she had lots of interest in the area," one person wrote.

The event was meant to be fun and engaging. Yet we also wanted to illustrate the many ways people can misrepresent themselves, whether in science or beyond. Our traitors showed that lies don't just have to be about who you are (Jack is an actor, not a researcher) but about what you say (Joyce is a researcher but falsified her results).

Misinformation has always existed. What's new is the speed at which it spreads, the tools that generate it, and how convincingly it mimics the real thing.

Why maths isn't enough

Our collective capacity to recognise false information is also at risk. This is because, as a society, we continue to promote the importance of hard science subjects at the expense of the critical thinking skills derived from studies of the arts, humanities and social sciences.

This can be seen, for example, in the 2023 UK governmental push to require all school students to take maths until age 18. No such push exists to promote and develop the critical thinking skills of young people. It's easy to see how increasingly convincing falsehoods like bixonimania's existence can be accepted as truth, especially when touted by AI models.

Tools are helpful. AI is a tool, the internet is a tool, the media is a tool. But it's up to us to ensure that we are using them and not being manipulated by them.

In The Traitors, we have little to go on to determine what is true. Yet in the real world, we have the ability to check the truth of claims. With effective caution and critical thinking, it is entirely possible to determine what is trustworthy, but it requires thinking for ourselves. Trust is ours to give, and we need to learn to give it wisely.

The Conversation

Jonathan R. Goodman receives funding from the National Institute for Health Research, the Wellcome Sanger Institute, and the Wellcome Trust.

Mariam Rashid receives funding from the Isaac Newton Trust and the Kavli Foundation.
