Social Media Algorithms Adapted to Reduce Polarisation

King’s College London

Algorithms which promote positive interactions on social media could be used to increase trust between users.

[Image: a hand on a mobile phone, with graphics of networked avatars of people]

A new model for algorithms which promote positive interactions on social media could be used to increase trust between users and reduce destructive conflict online.

After exploring how content that increases mutual understanding between users can be incentivised by the algorithms that influence what we see on social media, researchers from King's College London and Harvard University have created a framework for understanding the structure of algorithmic attention-allocators.

Social media posts that promote outrage and sensationalism – which are often prioritised by current algorithms due to high engagement levels – can exacerbate societal divisions and reduce our capacity to collaboratively address large-scale societal challenges, argue researchers Luke Thorburn, a PhD researcher in the Department of Informatics, King's College London, and Aviv Ovadya, an affiliate at the Berkman Klein Center at Harvard University and a visiting scholar at the Leverhulme Centre for the Future of Intelligence, University of Cambridge.

In a paper published last week, Thorburn and Ovadya offer an alternative approach to the current model. They propose 'bridging-based ranking', which would see content that fosters positive debate, deliberation or cooperation prioritised on platforms.

Luke Thorburn said:

"Algorithms which influence how we allocate our attention are underpinned implicitly or explicitly by a formal model of what is attention-worthy, and because these filter and rank content on our feeds, this impacts how we see the world.

"By being more thoughtful about how we define what is worthy of attention, the hope is that we can positively influence how we collectively relate to each other. The idea is not to reduce everyone to believing the same thing, but rather supporting pluralism through establishing positive and constructive debates and discussions."

Bridging-based ranking relies on selecting content which creates space for productive conflict, deliberation, or cooperation. Content aimed at building positive interactions, for example, could be ranked more highly, from posts that build bonds over cat videos to those that foster exchanges about the shared struggles of daily life.
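
The paper does not prescribe a single scoring formula, but the intuition behind bridging-based ranking can be illustrated with a minimal sketch. In the hypothetical Python below, the bridging_score function, the group labels and the approval figures are all invented for illustration rather than taken from the paper:

```python
# Illustrative sketch only: the scoring rule, group labels and approval figures
# are assumptions for illustration, not the ranking the paper prescribes.

def bridging_score(approval_by_group: dict[str, float]) -> float:
    """Reward content that is well received across groups: the least-approving
    group caps the score, so one-sided content ranks low."""
    return min(approval_by_group.values())

posts = {
    "cat_video":      {"group_a": 0.8, "group_b": 0.7},  # broadly liked
    "outrage_take":   {"group_a": 0.9, "group_b": 0.1},  # engaging but divisive
    "daily_struggle": {"group_a": 0.6, "group_b": 0.6},  # shared experience
}

ranked = sorted(posts, key=lambda p: bridging_score(posts[p]), reverse=True)
print(ranked)  # ['cat_video', 'daily_struggle', 'outrage_take']
```

Under a rule like this, content that appeals strongly to only one group scores poorly even if it generates a great deal of engagement within that group.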

Aviv Ovadya said:

"The systems that help allocate our attention on these platforms have significant impacts on what kinds of behaviours are incentivised on social media and beyond, including in politics, journalism, entertainment, and even in our families and friendships.

"Through incorporating concepts of bridging, we aim to shift the conversation from content distribution or algorithmic amplification based on engagement to attention-allocation that focuses on what actually matters — whether content is actually seen and valued."

The framework proposed by the researchers interrogates how to decide what content is attention-worthy in a reality where we each have only a finite quantity of attention to allocate.

Currently, the algorithms behind recommender systems on platforms such as Twitter and Facebook select which items of content, from a large pool, should be shown to a user. Recommender systems largely seek to maximise attention using engagement-based ranking, selecting the content most likely to elicit clicks, likes, comments, reshares, and other behaviours the platform can measure.

Through measuring this data, the platform can claim it is promoting the content people most want to see, but experts warn this can also incentivise the creation of content that is misleading, sensational, outrageous, or addictive.
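
As a rough sketch of this engagement-based approach, a feed might order candidate posts by a weighted sum of predicted engagement probabilities. The weights, post names and probabilities below are made up for illustration and do not reflect any real platform's values:

```python
# Rough sketch of engagement-based ranking; weights and predicted probabilities
# are invented for illustration and do not reflect any real platform.

ENGAGEMENT_WEIGHTS = {"click": 1.0, "like": 2.0, "comment": 4.0, "reshare": 8.0}

def engagement_score(predicted: dict[str, float]) -> float:
    """Weighted sum of predicted probabilities of measurable behaviours."""
    return sum(ENGAGEMENT_WEIGHTS[action] * p for action, p in predicted.items())

candidates = {
    "measured_explainer": {"click": 0.10, "like": 0.05, "comment": 0.01, "reshare": 0.01},
    "outrage_bait":       {"click": 0.30, "like": 0.10, "comment": 0.08, "reshare": 0.05},
}

feed = sorted(candidates, key=lambda c: engagement_score(candidates[c]), reverse=True)
print(feed)  # ['outrage_bait', 'measured_explainer']: the sensational post wins
```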

Bridging is not new; qualitative, human-facilitated processes for deliberation and conflict resolution are a well-established form of bridging. Algorithmic tools have also been developed and have been used to conduct civic forums and successfully surface common ground in national political debates.

Moreover, the first large-scale, publicly acknowledged deployment of a bridging system was the Community Notes feature on Twitter. It involves crowdsourcing notes that add context to misleading tweets, and publishing only those notes which are rated as helpful by people who normally disagree.
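
Twitter's production algorithm is more sophisticated, but the stated publication rule, that a note only goes live if it is rated helpful by people who normally disagree, can be sketched in a toy form. The rater groups, example ratings and 0.7 threshold below are assumptions for illustration only:

```python
# Toy illustration of the stated rule; not Twitter's actual algorithm. The rater
# groups, example ratings and threshold are assumptions for illustration.

def should_publish(ratings: list[tuple[str, bool]], threshold: float = 0.7) -> bool:
    """Publish only if every rater group finds the note helpful often enough."""
    groups = {group for group, _ in ratings}
    for group in groups:
        helpful = [rated for g, rated in ratings if g == group]
        if sum(helpful) / len(helpful) < threshold:
            return False  # the note must clear the bar with raters on every side
    return True

ratings = [("left", True), ("left", True), ("right", True), ("right", False), ("right", True)]
print(should_publish(ratings))  # False: right-leaning raters are only at 2/3
```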

The deliberate development of bridging systems, however, remains rare. Thorburn and Ovadya hope their work exploring opportunities for implementing bridging in algorithmically mediated spaces can change this.

Thorburn and Ovadya conclude:

"Without sufficient bridging, destructive conflict may undermine our relationships and prevent us from cooperating effectively to respond to societal challenges. Our social spaces should not default to divisive. Bridging is a core part of healthy social fabrics and the systems that allocate attention within them, whether human or machine, implicit or explicit. But these attention-allocators are not irredeemable - we can improve them to support a more deliberative, peaceful, and pluralistic future."

– Luke Thorburn and Aviv Ovadya

Bridging Systems: Open Problems for Countering Destructive Divisiveness across Ranking, Recommenders, and Governance was presented at the inaugural conference of the Plurality Research Network, held at the University of California, Berkeley, on Friday 13 January.

To learn more about bridging, you can view the paper. This project additionally benefitted from the support of Professors Maria Polukarov and Carmine Ventre in the Department of Informatics.
