A study led by Pompeu Fabra University (UPF) confirms the rise in ideological polarization and in biased or false news posted on Facebook. This groundbreaking research analyzed over 6 million news-related URLs – from 1,231 different domains in the United States – shared on Facebook between 2017 and 2020.
The news stories from these four years covered such significant events as the COVID pandemic, the 2020 US presidential elections (which culminated in the attack on Congress following Trump's loss) and the 2018 midterm elections, in which all seats in the House of Representatives and one third of Senate seats were up for election.
The research shows that the upward trend in ideological polarization and in the dissemination of false or biased news coincides with changes to the platform that altered how information shown to users is presented and attached greater weight to certain types of interactions in the algorithm used to rank that content. These changes also coincided with shifts in how users engage with media content (engagement patterns).
The analysis compared engagement with posts containing over 6 million news-related URLs, over a span of four years, against user ideology.
The results of the study were recently published in an article appearing in the journal EPJ Data Science. Emma Fraxanet, a researcher in computational social sciences in the UPF Department of Engineering, is the principal author of this study, part of a research project supervised by Vicenç Gómez, member of the Artificial Intelligence and Machine Learning Research Group. Co-authors include Andreas Kaltenbrunner (UOC) and Fabrizio Germano (UPF and BSE).
The research team analyzed the level of engagement with posts containing over 6 million news-related URLs, calculated with a combination of metrics (clicks, shares, likes, comments and other reactions), which they characterized in relation to user ideology and news source quality. This approach made it possible to observe an ideological gap in the news consumed by conservatives and liberals and track its evolution over time.
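The study aggregates several interaction counts into a single engagement measure. As a rough sketch of this idea, the snippet below sums the interaction types named in the article; the equal weighting and the `engagement_score` helper are assumptions for illustration, not the paper's actual aggregation.

```python
# Hypothetical sketch: combining interaction counts into one engagement
# score, mirroring the metrics the study names (clicks, shares, likes,
# comments and other reactions). Equal weighting is an assumption.

def engagement_score(post):
    metrics = ("clicks", "shares", "likes", "comments", "reactions")
    return sum(post.get(m, 0) for m in metrics)

post = {"clicks": 120, "shares": 15, "likes": 80, "comments": 9, "reactions": 4}
print(engagement_score(post))  # → 228
```

In the study itself, scores like this are then characterized against two further dimensions per URL: the ideology of the users engaging and the quality of the news source.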
An in-depth analysis was also conducted on two significant changes to the platform, implemented in 2018 and 2020, which were revealed in a data leak that sparked a public outcry over Facebook's algorithm. In both instances, the changes were followed by variations in engagement patterns and an increase in ideological polarization and in the sharing of low-quality news, with subtle differences in each case. However, according to the researchers, a direct causal relationship cannot be established in either case, neither in 2018 nor in 2020; this issue must be explored further in future studies.
To help explain these engagement patterns, the researchers describe them as U-shaped: engagement is higher among users with more extreme leanings than among moderates. Because content with higher levels of engagement tended to gain visibility following the changes of 2018 and 2020, this could explain the upward trend in more extreme ideological content, which differs depending on user profile. In addition, the most biased content also comes from the lowest-quality sources.
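The U-shape can be pictured as engagement rising toward both ideological extremes. The toy model below illustrates only the qualitative shape the study describes; the quadratic form, the `expected_engagement` function and its coefficients are invented for illustration and do not come from the paper.

```python
# Hypothetical illustration of a U-shaped engagement curve: users at
# the ideological extremes (±1) engage more than moderates (0).
# The quadratic form and coefficients are assumptions, not fitted values.

def expected_engagement(ideology, base=1.0, slope=3.0):
    """ideology in [-1, 1]: -1 = far left, 0 = moderate, +1 = far right."""
    return base + slope * ideology ** 2

for x in (-1.0, -0.5, 0.0, 0.5, 1.0):
    print(f"ideology {x:+.1f} -> engagement {expected_engagement(x):.2f}")
```

Under this toy model, engagement at either extreme (4.0) is four times that of a moderate (1.0), which is the kind of asymmetry that, combined with engagement-driven ranking, could amplify extreme content.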
What changes did Facebook implement in 2018 and 2020?
In 2018, the platform decided to decrease the weight attached to likes and increase that of shares and comments in the ranking algorithm, to foster more meaningful interactions between family and friends. The study shows that, in the wake of this change, user ideologies became more polarized and engagement with lower quality content increased.
In 2020, Facebook decreased the weight of shares and increased that of comments in the ranking algorithm. The platform's reasons for doing so are not altogether clear, although it appears that the intention was to limit toxic or low-quality content. Following this change, despite the rise in both ideological polarization and engagement with lower-quality content, overall activity on the platform dropped significantly.
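The two changes described above can be sketched as shifts in the weights of a simple linear ranking score. The numeric weights below are invented for illustration; the article reports only the direction of each change (2018: likes down, shares and comments up; 2020: shares down, comments up), not Facebook's actual values.

```python
# Hypothetical sketch of reweighting interaction types in a ranking
# score. Weight values are invented; only the direction of change
# matches the article's description of the 2018 and 2020 updates.

WEIGHTS = {
    "pre_2018":  {"likes": 1.0, "shares": 1.0, "comments": 1.0},
    "post_2018": {"likes": 0.5, "shares": 2.0, "comments": 2.0},  # likes down, shares/comments up
    "post_2020": {"likes": 0.5, "shares": 1.0, "comments": 3.0},  # shares down, comments up
}

def ranking_score(post, era):
    w = WEIGHTS[era]
    return sum(w[k] * post[k] for k in w)

post = {"likes": 100, "shares": 20, "comments": 10}
for era in WEIGHTS:
    print(f"{era}: {ranking_score(post, era):.1f}")
```

The point of the sketch is that the same post can rank very differently depending on which interaction types the algorithm rewards, which is why a weight change alone can shift which content gains visibility.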
Beyond the possible effects of the algorithms, the study notes that another reason for such polarization is user behaviour, with users tending to consume content related to their ideological leaning. The findings of the study suggest that, following the changes implemented by the platform, the differences between the news diets of liberals and conservatives have grown, making it harder to find common ground for democratic debate.
Cited paper: