Influence of Facebook algorithms on political polarization tested
A landmark collaboration shows that Facebook’s news feed disproportionately delivers partisan political news to users who share those views. But changing the feed algorithm to reduce exposure to like-minded content does not reduce political polarization.
Voters in many countries are polarized to the point at which they feel greater aversion to people with different political views than affection for like-minded people [1]. A common concern is that news-feed algorithms on social media could have a role in generating polarization by exposing individuals to more information from politically like-minded sources than they would otherwise see, thus creating ‘echo chambers’ and ‘filter bubbles’ [2].
An oft-proposed solution is to change these algorithms to reduce exposure to content from sources that agree with a user’s political views. Four papers, one by Nyhan et al. in Nature [3] and three in Science [4–6], now describe the findings of the U.S. 2020 Facebook and Instagram Election Study, a remarkable collaboration between researchers at Meta (the company that owns Facebook and Instagram) and independent academics.
The results shed light on the influence of social media on polarization, and expose the limits of changing news-feed algorithms as a way to depolarize political attitudes.
D. Garcia, Influence of Facebook algorithms on political polarization tested, Nature 620, 39–41 (2023).