
New research shows how Meta’s algorithms shaped users’ 2020 election feeds

by admin

New research findings have been published regarding Meta’s partnership with independent researchers to study the influence of Facebook and Instagram on the 2020 election. Four peer-reviewed papers in Science and Nature offer new insights into how the algorithms of these platforms impacted users’ feeds during the election period.

While Meta claims the research shows that its platforms do not significantly cause harmful polarization or meaningfully affect political attitudes and behaviors, the initial findings present a more complex view. One Nature study examined "echo chambers," in which users are predominantly exposed to like-minded sources, and found that reducing such content decreased engagement but had no substantial effect on users' beliefs or attitudes.

Another Nature study compared chronological feeds with algorithmically generated ones and found that the latter strongly shaped users' experiences. The algorithmic feed led to more time spent on the platform, a different mix of content, and increased exposure to political content, including some from untrustworthy sources. However, it produced no detectable change in users' downstream political attitudes or offline behavior.

As for reshared content, removing it from feeds significantly decreased the amount of political news users saw, particularly from untrustworthy sources, but did not substantially affect political polarization or individual-level political attitudes.

Researchers also analyzed the political news appearing in users’ feeds based on their ideology and found substantial ideological segregation on Facebook, with conservative users being more likely to see content from “untrustworthy” sources and false articles.

While some findings are favorable for Meta, indicating that political content makes up only a small share of what most users see, the research also suggests there are no straightforward fixes for polarization on social media: the platforms are neither its sole cause nor its solution.

Frequently Asked Questions (FAQs) about algorithmic influence

Q: What is the research about and who conducted it?

A: The research examines the impact of Facebook and Instagram algorithms on users’ 2020 election feeds. It was conducted by Meta in partnership with more than a dozen independent researchers.

Q: Where can I find the results of the research?

A: The first results of the research are available in four peer-reviewed papers published in the journals Science and Nature.

Q: What did the research findings reveal about echo chambers?

A: The research confirmed that many US users see content from "like-minded" sources, though not all of it is explicitly political or news-related. Reducing exposure to "like-minded" content decreased engagement but had little effect on users' beliefs or attitudes.

Q: How did algorithmic feeds differ from chronological feeds in the study?

A: Algorithmic feeds on Facebook and Instagram strongly shaped users' experiences, leading to more time spent on the platforms, a different content mix, and increased exposure to political content. However, they did not significantly change users' political attitudes or behaviors.

Q: Did removing reshared content have any impact on political polarization?

A: Removing reshared content decreased the amount of political news, including content from untrustworthy sources, but did not substantially affect political polarization or individual-level political attitudes.

Q: What did the research reveal about ideological segregation on Facebook?

A: The research found substantial ideological segregation, with conservative users more likely to see content from “untrustworthy” sources and false articles.

Q: What does Meta’s policy chief say about the research?

A: Meta’s policy chief, Nick Clegg, suggests that the research adds to existing evidence showing that Meta’s platforms do not significantly cause harmful polarization or have meaningful effects on key political attitudes, beliefs, or behaviors.

Q: Are there clear solutions for addressing polarization on social media?

A: The research indicates there are no obvious solutions for addressing polarization on social media, suggesting the platforms are neither the sole cause of the problem nor its solution.

