On a sunny afternoon, I was scrolling through social media when I came across a video of a young woman tossing her sunscreen into a bin. "I don't trust this stuff anymore," she said to the camera, holding the bottle up like a piece of damning evidence.
The clip had been viewed over half a million times, with commenters applauding her for "ditching chemicals" and recommending homemade alternatives like coconut oil and zinc powder.
In my research on the effect of digital technology on health, I've seen how posts like this can shape real-world behaviour. And anecdotally, dermatologists have reported seeing more patients with severe sunburns or suspicious moles who say they stopped using sunscreen after watching similar videos.
Sunscreen misinformation created by social media influencers is spreading, and this isn't just a random trend. It's being fuelled by the very platforms designed to host influencer content.
In my book, The Digital Health Self, I explain how social media platforms are not neutral arenas for sharing information. They are commercial ecosystems engineered to maximise engagement and time spent online - metrics that directly drive advertising revenue.
Content that sparks emotion - outrage, fear, inspiration - is boosted to the top of your feed. That's why posts questioning or rejecting science often spread further than measured, evidence-based advice.
Health misinformation thrives in this environment. A personal story about throwing out sunscreen performs well because it's dramatic and emotionally charged. Algorithms reward such content with higher visibility: likes, shares and comments all signal popularity.
Each second a user spends watching or reacting gives the platform more data - and more opportunities to serve targeted ads. This is how health misinformation becomes profitable.
In my work, I describe social media platforms as "unregulated public health platforms". They influence what users see and believe about health, but unlike public health institutions, they're not bound by standards for accuracy or harm reduction.
If an influencer claims sunscreen is toxic, that message won't be factchecked or flagged - it will often be amplified. Why? Because controversy fuels engagement.
I call this environment "the credibility arena": a space where trust is built not through expertise, but through performance and aesthetic appeal. As I write in my book: "Trust is earned not by what is known, but by how well one narrates suffering, recovery, and resilience."
A creator crying on camera about "toxins" can feel more authentic to viewers than a calm, clinical explanation of ultraviolet radiation from a medical expert.
This shift has real consequences. Ultraviolet rays are invisible, constant and damaging. They penetrate cloud cover and harm skin even on cool days.
Decades of research, especially in countries like Australia with high skin cancer rates, show that regular use of broad-spectrum sunscreen dramatically reduces risk. And yet, myths spreading online are urging people to do the opposite: to abandon sunscreen as dangerous or unnecessary.
This trend isn't driven solely by individual creators. It's embedded in how content is designed, framed and presented. Algorithms prioritise short, emotionally charged videos. Interfaces highlight trending sounds and hashtags. Recommendation systems push users toward extreme or dramatic content.
These features all shape what we see and how we interpret it. The "For You" page isn't neutral. It's engineered to keep you scrolling, and shock value outperforms nuance every time.
That's why videos about "ditching chemicals" thrive, even as posts on other aspects of women's health are shadowbanned or suppressed. Shadowbanning refers to when a platform limits the visibility of content - making it harder to find, without informing the user - often due to vague or inconsistently applied moderation rules.
The system rewards spectacle, not science. Once creators discover that a particular format, like tossing products into a bin, boosts engagement, it's replicated over and over again. Visibility isn't organic. It's manufactured.
Those who throw away their sunscreen often believe they're doing the right thing. They're drawn to creators who feel relatable, sincere and independent - especially when official health campaigns seem cold, patronising or out of touch. But the consequences can be serious. Sun damage accumulates silently, raising skin cancer risk with every hour spent unprotected.
Sunscreen isn't perfect. It needs to be reapplied properly and paired with shade and protective clothing. But the evidence for its effectiveness is clear and robust.
The real danger lies in a system that not only allows misinformation to spread, but also incentivises it. A system in which false claims can boost an influencer's reach and a platform's revenue.
To resist harmful health trends, we need to understand the systems that promote them. In the case of sunscreen, rejecting protection isn't just a personal decision - it's a symptom of a digital culture that turns health into content, and often profits from the harm it causes.
Rachael Kent does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.