Americans perceive small juries of content experts as the most legitimate moderators of potentially misleading content on social media, according to a survey, but they rate large, nationally representative or politically balanced juries with minimum knowledge qualifications as comparably legitimate.

Social media content moderation policies tend to attract criticism: some call for more aggressive removal of harmful and misleading content, while others decry moderation as censorship and accuse expert moderators of political bias. Less clear is what the general public actually wants from content moderation.

Cameron Martel and colleagues surveyed 3,000 US residents through the survey platform YouGov in the summer of 2023. Each respondent was asked to assess the legitimacy of moderators in a scenario where the respondent disagreed with the moderators' decision about whether a piece of content was harmfully misleading. The nine options presented to respondents included juries of professional fact-checkers, professional journalists, or domain experts; juries of randomly selected nationally representative non-experts, randomly selected users of the platform, or a politically balanced group of non-experts; a coin flip; the head of the social media company; and a computer algorithm. Expert juries were fixed at three members, while layperson juries were randomly assigned sizes of 3, 30, or 3,000 members. Layperson juries also varied in the qualifications required for participation (e.g., minimum news knowledge). Some juries were described as evaluating content independently, others as deciding through group discussion.

Overall, participants preferred domain experts over non-experts or algorithms. However, large nationally representative or politically balanced juries with some minimal qualification requirement, described as deciding through discussion, were rated as favorably as small juries of independent experts. Republicans perceived experts as less legitimate than Democrats did, but still preferred experts to non-experts when layperson juries were small and had no knowledge qualifications. Respondents were unenthusiastic about heads of social media companies making moderation decisions, rating that option about as attractive as a coin flip. According to the authors, the results have practical implications for social media platforms and regulators who wish to design content moderation systems that users will trust.
Public Opinion: Who Should Moderate Content?
PNAS Nexus