Eighty-one years after Adolf Hitler died by his own hand in a Berlin bunker, a viral video on TikTok shows an AI-generated vision of the Nazi dictator standing in Antarctica, shoulders broad and face smiling, sipping a White Monster Energy drink while Men at Work's iconic song Down Under plays.
It's an absurd image, but one that makes sense in the context of the "Agartha" trend on TikTok, which is quietly bringing white supremacist narratives into the mainstream to be seen by millions of users.
The modern myth of Agartha, a supposed utopia hidden inside the hollow Earth, was constructed from older pieces by esoteric authors after the second world war. It blends "Aryan" white supremacist themes with ideas of an occult SS and Third Reich spaceships.
Literal belief in hollow-Earth myths or Nazi UFOs is not the point. Instead, it's an aesthetic - one that can host both coded far-right messages and explicit ones, fused with pop-cultural references such as the White Monster Energy meme.
Mainstreaming through borderline content
Agartha videos on social media are "awful but lawful": the content is objectionable but legal. This allows extremists to embed their narratives into mainstream social media spaces without triggering moderation or outright rejection by the audience. As a result, they can reach large, young audiences.
To understand how the underlying esoteric myths are used, we analysed a network of more than 43,000 Agartha-related TikTok videos and closely studied selected examples. This analysis is part of an ongoing project led by researchers at Neu-Ulm University of Applied Sciences in Germany. The goal is to understand how extremists abuse platform features to carry radical narratives into the mainstream.
We identified four key mechanisms which far-right actors use to push radical narratives to unsuspecting audiences: aesthetic camouflage, dog-whistles and split-second glimpses, network building, and weaponised irony. Let's unpack each of these terms.
Aesthetic camouflage
Moderation systems on social media platforms aim to remove overt extremist propaganda. These systems work imperfectly, but overt propaganda is unlikely to reach mass audiences before being removed.
Instead, far-right actors often use generative AI to mask racial ideology behind seemingly benign tropes from science fiction and fantasy. This allows a "de-demonising" of their ideas. Elf-like depictions of the "Aryan" inhabitants of Agartha, or footage of an underground utopia, make the idea of a white ethnostate seem palatable.
The engaging aesthetic keeps people watching longer. In turn, this triggers the algorithm to push the video to wider audiences.
Dog whistles and split-second provocations
Far-right actors infuse their content with dog whistles that communicate to a certain audience without triggering moderation.
Our research identified recurring visual symbols in Agartha videos. "Raw milk" signals white supremacy. The number 271, which appears in hundreds of the videos, is a code for Holocaust denial.
Agartha videos also sometimes include overt extremist markers such as the Hakenkreuz (the Nazi symbol of the "hooked cross" or swastika). They are often flashed for only fractions of a second.
This is a calculated provocation. It tests and pushes platform boundaries, normalising the presence of far-right markers and slowly desensitising viewers. At the same time, successful inclusion of forbidden symbols in videos with viral reach serves as a badge of honour within the in-group.
Building network bridges
The Agartha community is not an isolated echo chamber. Our analysis shows the Agartha network connects to others on TikTok. Around 87% of these connections come via the inclusion of mundane, mainstream hashtags on Agartha videos.
Extremists may deliberately "hijack" benign, high-traffic hashtags, such as #roblox or #loganpaul, to push their content into everyday feeds. Agartha is also strongly connected to gym and fitness content. Hashtags such as #gymtok serve as a bridge, potentially funnelling users towards radical narratives.
Here, we also observed targeted appeals. Agartha videos would commonly include hashtags associated with "looksmaxxing" - a trend for extreme physical self-optimisation - to push their videos towards audiences seen as potentially susceptible, such as insecure young men. Well-known looksmaxxers were also sometimes depicted in Agartha videos.
Weaponised irony
As the example of Adolf Hitler drinking a Monster Energy drink illustrates, Agartha videos rely heavily on absurd situations, frequently co-opting mainstream social media trends.
Because White Monster Energy is a popular meme within non-extremist spaces, users familiar with the trend are particularly likely to be algorithmically recommended versions of it fused with extremist narratives.
Similarly, actors superimpose mainstream figures into Agartha to force association, such as YouTuber Logan Paul or actor Mads Mikkelsen.
Such co-opting is also done through music, with many hollow-Earth edits set to Men at Work's 1980s hit Down Under. The track serves as a darkly ironic nod to the subterranean utopia.
This increases the chances for algorithmic amplification while providing creators with plausible deniability. Criticisms can be dismissed as a misunderstanding of dark humour or current trends.
Recognising the threat
Agartha is more than just a digital resurfacing of fringe occultism. It is a blueprint for how to design extremist content for algorithmically curated short-video platforms.
It's also a reminder that extremist content on social media does not exist in isolation. Instead, it lives in what researchers call "hybridised spaces", where users move in and out of extremist discourse. In such spaces, borderline content, outright extremism, mundane trends and humour blend seamlessly - and participants may find their mainstream interests lead them to radical narratives.
Marten Risius receives funding through the Distinguished Professorship Program via the Bavarian Hightech Agenda from the Bavarian Ministry of Sciences and Arts.
Christopher David does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.