A new study of online communities suggests that their interaction dynamics can amplify small, local imbalances in opinions, rapidly turning initially mixed-opinion communities into highly polarized ones—even without the algorithms and homogeneity-seeking behaviors typically blamed for sculpting echo chambers. Petter Törnberg of the University of Amsterdam, The Netherlands, presents these findings in the open-access journal PLOS One on May 6, 2026.
Prior research has linked online echo chambers with radicalization, the spread of misinformation, rising polarization, and other issues. The emergence of online echo chambers is often blamed on users seeking like-minded communities and on deliberate algorithmic personalization by online platforms. However, research on such possible mechanisms has been limited. To gain deeper insight, Törnberg ran simulations using computational models of online communities.
Simulated users were randomly assigned one of two opposing opinions and then interacted with randomly selected members of their community. If the proportion of community members holding the opposite opinion exceeded a certain threshold, the user left and relocated to a different community. This mirrors the real-world tendency of users to exit an online community once their interactions exceed their personal tolerance for disagreement, even if they do not demand complete conformity. Importantly, the simulations included no algorithmic personalization, and relocation was random, with no preference for like-minded communities.
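The relocation dynamic described above can be sketched in a few lines of code. This is a toy reconstruction, not the paper's actual model: every parameter value (community count, sample size, the threshold) is an illustrative assumption, and the threshold check is applied to a small random sample of interaction partners, one possible reading of "interacted with randomly selected members."

```python
import random

def simulate(n_communities=10, users_per_community=40, threshold=0.6,
             sample_size=5, steps=150, seed=0):
    """Toy version of the relocation dynamic described above.

    All parameter values are illustrative assumptions, not taken from
    the paper. Each user holds opinion 0 or 1. Each step, every user
    "interacts" with a random sample of fellow members; if the share of
    the sample holding the opposite opinion exceeds `threshold`, the
    user relocates to a uniformly random other community (no preference
    for like-minded destinations, and no algorithmic personalization).
    """
    rng = random.Random(seed)
    # communities[i] is the list of opinions held by community i's members
    communities = [[rng.randint(0, 1) for _ in range(users_per_community)]
                   for _ in range(n_communities)]
    for _ in range(steps):
        moves = []  # (community index, member index), decided on a snapshot
        for ci, members in enumerate(communities):
            for ui, opinion in enumerate(members):
                others = members[:ui] + members[ui + 1:]
                if len(others) < sample_size:
                    continue
                sample = rng.sample(others, sample_size)
                disagree = sum(1 for o in sample if o != opinion)
                if disagree / sample_size > threshold:
                    moves.append((ci, ui))
        # Pop in reverse index order so earlier indices stay valid.
        for ci, ui in sorted(moves, reverse=True):
            opinion = communities[ci].pop(ui)
            dest = rng.choice([j for j in range(n_communities) if j != ci])
            communities[dest].append(opinion)
    return communities

def mean_majority_share(communities):
    """Average share of the locally dominant opinion: values near 0.5
    mean mixed communities, values near 1.0 mean echo chambers."""
    shares = [max(sum(m), len(m) - sum(m)) / len(m)
              for m in communities if m]
    return sum(shares) / len(shares)
```

In runs of this sketch, the average share of each community's majority opinion tends to climb well above its initial near-0.5 level as users relocate, even though no user ever seeks out like-minded company.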
Törnberg found that even when disagreement thresholds were high, meaning users tolerated a substantial share of opposing opinions before leaving, small, random imbalances were amplified until random interactions became increasingly likely to exceed those thresholds. This pushed more and more users to relocate, potentially seeding imbalances in other communities, so that initially mixed-opinion communities rapidly polarized. These findings suggest that online echo chambers can arise unintentionally, even when users are fairly tolerant of opposing opinions and in the absence of algorithmic nudging or homogeneity-seeking behavior. In additional simulations that did incorporate algorithmic personalization, Törnberg found that such features can, in some cases, slow relocation and reduce echo-chamber formation, keeping the overall community more heterogeneous.
Törnberg also analyzed a real-world online "manosphere" Reddit echo chamber, r/MensRights, finding that users were more likely to exit the community if their posts were linguistically farther from the group's linguistic "center of gravity."
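The release does not spell out how linguistic distance was measured. One common, simple way to operationalize a group's linguistic "center of gravity" is the mean term-frequency vector of its posts, with cosine distance measuring how far an individual post sits from it; everything below (the tokenization, the vector representation, the distance metric) is an assumption for illustration, not the paper's method.

```python
from collections import Counter
import math

def tf_vector(text):
    """Term-frequency vector for a post (lowercased whitespace tokens)."""
    words = text.lower().split()
    n = len(words)
    return {w: c / n for w, c in Counter(words).items()}

def mean_vector(vectors):
    """Component-wise mean of term-frequency vectors: a simple
    'center of gravity' for a community's language."""
    total = Counter()
    for v in vectors:
        total.update(v)
    return {w: s / len(vectors) for w, s in total.items()}

def cosine_distance(a, b):
    """1 minus cosine similarity; 0 = identical direction, 1 = no overlap."""
    dot = sum(a[w] * b.get(w, 0.0) for w in a)
    na = math.sqrt(sum(x * x for x in a.values()))
    nb = math.sqrt(sum(x * x for x in b.values()))
    if na == 0 or nb == 0:
        return 1.0
    return 1.0 - dot / (na * nb)
```

Under this kind of measure, a post sharing much of the group's characteristic vocabulary sits close to the center, while a post with little overlapping vocabulary sits near the maximum distance of 1.0.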
These findings could help researchers understand the dynamics of online communities and could inform efforts to reduce polarization within them.
Törnberg adds: "The key insight is that echo chambers are not just designed or chosen—they can emerge from the basic architecture of how online interaction is organized."
"Online polarization may be less about what people want or what platforms do, and more about the feedback loops built into digital social life."
"What surprised me most was the finding that the very algorithms often blamed for creating echo chambers can, under some conditions, do the opposite—by keeping people comfortable enough to stay, they can actually preserve diversity."
In your coverage, please use this URL to provide access to the freely available article in PLOS One: https://plos.io/4tx5nIF
Citation: Törnberg P (2026) Echo chambers can emerge without algorithmic personalization or a preference for homogeneity. PLoS One 21(5): e0347207. https://doi.org/10.1371/journal.pone.0347207
Author countries: The Netherlands.
Funding: Dutch Research Council (NWO) VIDI Grant VI.Vidi.231S.089.