General Discussion
Study: The political effects of X's feed algorithm

https://www.nature.com/articles/s41586-026-10098-2

Abstract
Feed algorithms are widely suspected to influence political attitudes. However, previous evidence from switching off the algorithm on Meta platforms found no political effects1. Here we present results from a 2023 field experiment on Elon Musk's platform X shedding light on this puzzle. We assigned active US-based users randomly to either an algorithmic or a chronological feed for 7 weeks, measuring political attitudes and online behaviour. Switching from a chronological to an algorithmic feed increased engagement and shifted political opinion towards more conservative positions, particularly regarding policy priorities, perceptions of criminal investigations into Donald Trump and views on the war in Ukraine. In contrast, switching from the algorithmic to the chronological feed had no comparable effects. Neither switching the algorithm on nor switching it off significantly affected affective polarization or self-reported partisanship. To investigate the mechanism, we analysed users' feed content and behaviour. We found that the algorithm promotes conservative content and demotes posts by traditional media. Exposure to algorithmic content leads users to follow conservative political activist accounts, which they continue to follow even after switching off the algorithm, helping explain the asymmetry in effects. These results suggest that initial exposure to X's algorithm has persistent effects on users' current political attitudes and account-following behaviour, even in the absence of a detectable effect on partisanship.
Main
Social media platforms have fundamentally transformed human lives: a large and growing share of the global population connects with others, gets entertained and learns about the world through social media2. These platforms have also become increasingly important for political news consumption. A quarter of US adults report social media as their primary news source, and one half say they at least sometimes get news from these platforms3. Typically, platforms use feed algorithms to select and order content in personalized feeds for each user4. Before algorithms were introduced, users saw a simple chronological feed that displayed posts from followed accounts, with the most recent posts appearing at the top.
Public intellectuals and scholars have raised concerns about the potential adverse effects of social media, particularly feed algorithms, on social cohesion, trust and democracy5,6,7,8. These concerns arise from the spread of misinformation9,10,11, the promotion of toxic and inflammatory content12,13,14 and the creation of filter bubbles with increasingly polarized content15,16,17,18. There is substantial rigorous quantitative evidence that internet access and social media indeed have important negative effects19,20,21,22. Research on search engine rankings also shows that the order in which information is presented can influence user behaviour and political beliefs23. However, previous literature on the effects of social media feed algorithms reports zero political effects. A large study of Facebook and Instagram, conducted by academics in cooperation with Meta during the 2020 US election, found that experimentally replacing the algorithmically curated feed with a chronological feed did not lead to any detectable effects on users' polarization or political attitudes, despite causing a substantial change in political content and lowering user engagement with the platforms1. Similarly, studies on Google's search engine and YouTube algorithms found little evidence of filter bubbles24,25,26,27. Studies of Meta platforms linking content to user behaviour and attitudes also found no impact, despite prevalent like-minded content and amplified political news28,29,30.
Yet, the fact that switching off a feed algorithm does not affect users political attitudes does not mean that algorithms have no political impact. If the initial exposure to the algorithm has a persistent effect on political outcomes, switching off the algorithm might show no effects despite its importance. For instance, this could happen because people start following accounts suggested by the algorithm and continue following them when the algorithm is switched off. In addition, different platforms may have different effects, for instance, due to different informational environments or the different objectives of their owners31,32,33,34.
*snip*
7 replies
#2 · Jack Valentino · Wednesday
I believe that 'algorithms' have more effects on people who begin without any strong
#7 · Jack Valentino · Thursday
neither should we give up the field without any resistance, to the enemy....