Online Debates: Who Gets Heard?

Max Planck Institute for Human Development

In a group-based field experiment, the researchers set up six private subreddits, each with up to 100 participants. Over a four-week period, the groups exchanged views on 20 political topics, knowing that they were being observed by the researchers. The analysis covered 5,819 comments posted in response to the discussion prompts, a further 62,000 comments made by participants in other Reddit communities, and survey responses provided before, during, and after the study period.

The results are clear: Those who perceived a discussion environment to be toxic, disrespectful, or highly polarized tended to remain silent. Surprisingly, however, those same perceptions predicted higher comment counts among active users. In other words, a heated environment may actually motivate the active minority to comment even more. The most active participants tended to be male, highly interested in politics, and self-described as likely to comment online.

The researchers also tested various interventions designed to motivate lurkers to contribute actively to the discussion. A financial incentive of $2 for each day on which participants wrote at least one serious comment boosted participation, but only moderately reduced the dominance of power users. Appeals to norms such as "Please stay respectful" had little effect, whereas positive feedback in the form of upvotes was associated with increased activity on the following day.

For Lisa Oswald, lead author of the study and a researcher in the Center for Adaptive Rationality at the Max Planck Institute for Human Development, the goal isn't to "save" online debates. "We can't expect everyone to participate online," she says. She is more interested in understanding why a small minority are highly active while the majority remain silent, and what that means for perceptions of public opinion. The participation gap gives readers a distorted view of what the general public thinks. Anyone reading the comments section under an online news article, for example, and assuming that those comments reflect public opinion will often be sorely mistaken.

The study offers concrete recommendations for platform operators and community managers: A combination of nonmonetary rewards for first-time and high-quality contributions, clear rules to reduce toxicity that are consistently enforced, and caps on comment counts may reduce participation inequality and help to make perceptions of public opinion more realistic.

"You can't just press a button to achieve more participation," Oswald says. "But you can create conditions that make it easier for people to speak up—especially those who have been lurking in the background."

The study was funded by Horizon Europe's Social Media for Democracy (Some4Dem) project. The study data and code are publicly available in anonymized form, offering rare insights into the dynamics of discussions on social media platforms.

At a glance:

  • Online discourse is driven by a minority of highly active users; the majority remain silent—which distorts perceptions of public opinion and can fuel polarization.
  • Field experiment (6 subreddits, 520 people, 4 weeks) shows: Those who found the discussion toxic, disrespectful, or polarized were more likely to remain silent. Surprisingly, however, those same perceptions predicted higher comment counts among active users. The most prolific users tended to be male, highly interested in politics, and self-described as likely to comment online.
  • Interventions: Financial incentives broaden participation but only moderately reduce inequality; appealing to norms without enforcement has little effect (in this environment, which was generally low in toxicity). Visible positive feedback (more upvotes than downvotes) was associated with increased future participation.
  • Implications for platform design: No one-size-fits-all solution, but several potential points of intervention: nonmonetary rewards for first-time and high-quality contributions, clear rules to reduce toxicity that are consistently enforced, and caps on comment counts to reduce the dominance of extremely active users.