# Incident I00115: How Facebook shapes your feed
* **Summary:** <i>Facebook’s news feed algorithm has been blamed for fanning sectarian hatred, steering users toward extremism and conspiracy theories, and incentivizing politicians to take more divisive stands. It’s in the spotlight thanks to waves of revelations from the Facebook Papers and testimony from whistleblower Frances Haugen, who argues it’s at the core of the company’s problems.<br><br>But how exactly does it work, and what makes it so influential?<br><br>While the phrase “the algorithm” has taken on sinister, even mythical overtones, it is, at its most basic level, a system that decides a post’s position on the news feed based on predictions about each user’s preferences and tendencies. The details of its design determine what sorts of content thrive on the world’s largest social network, and what types languish — which in turn shapes the posts we all create, and the ways we interact on its platform.<br><br>Facebook doesn’t release comprehensive data on the actual proportions of posts in any given user’s feed, or on Facebook as a whole. And each user’s feed is highly personalized to their behaviors. But a combination of internal Facebook documents, publicly available information and conversations with Facebook insiders offers a glimpse into how different approaches to the algorithm can dramatically alter the categories of content that tend to flourish.</i>
* **incident type**:
* **Year started:**
* **Countries:**
* **Found via:**
* **Date added:**

| Reference | Pub Date | Authors | Org | Archive |
| --------- | -------- | ------- | --- | ------- |
| [https://www.washingtonpost.com/technology/interactive/2021/how-facebook-algorithm-works/](https://www.washingtonpost.com/technology/interactive/2021/how-facebook-algorithm-works/) | 2021/10/26 | Will Oremus, Chris Alcantara, Jeremy B. Merrill, Artur Galocha | The Washington Post | [https://web.archive.org/web/20211026142012/https://www.washingtonpost.com/technology/interactive/2021/how-facebook-algorithm-works/](https://web.archive.org/web/20211026142012/https://www.washingtonpost.com/technology/interactive/2021/how-facebook-algorithm-works/) |

| Technique | Description given for this incident |
| --------- | ------------------------- |
| [T0151.001 Social Media Platform](../../generated_pages/techniques/T0151.001.md) | IT00000471 This 2021 report by The Washington Post explains the mechanics of Facebook’s algorithm (T0151.001: Social Media Platform, T0153.006: Content Recommendation Algorithm):<br><br><i>In its early years, Facebook’s algorithm prioritized signals such as likes, clicks and comments to decide which posts to amplify. Publishers, brands and individual users soon learned how to craft posts and headlines designed to induce likes and clicks, giving rise to what came to be known as “clickbait.” By 2013, upstart publishers such as Upworthy and ViralNova were amassing tens of millions of readers with articles designed specifically to game Facebook’s news feed algorithm.<br><br>Facebook realized that users were growing wary of misleading teaser headlines, and the company recalibrated its algorithm in 2014 and 2015 to downgrade clickbait and focus on new metrics, such as the amount of time a user spent reading a story or watching a video, and incorporating surveys on what content users found most valuable. Around the same time, its executives identified video as a business priority, and used the algorithm to boost “native” videos shared directly to Facebook. By the mid-2010s, the news feed had tilted toward slick, professionally produced content, especially videos that would hold people’s attention.<br><br>In 2016, however, Facebook executives grew worried about a decline in “original sharing.” Users were spending so much time passively watching and reading that they weren’t interacting with each other as much. Young people in particular shifted their personal conversations to rivals such as Snapchat that offered more intimacy.<br><br>Once again, Facebook found its answer in the algorithm: It developed a new set of goal metrics that it called “meaningful social interactions,” designed to show users more posts from friends and family, and fewer from big publishers and brands. In particular, the algorithm began to give outsize weight to posts that sparked lots of comments and replies.<br><br>The downside of this approach was that the posts that sparked the most comments tended to be the ones that made people angry or offended them, the documents show. Facebook became an angrier, more polarizing place. It didn’t help that, starting in 2017, the algorithm had assigned reaction emoji — including the angry emoji — five times the weight of a simple “like,” according to company documents.<br><br>[...]<br><br>Internal documents show Facebook researchers found that, for the most politically oriented 1 million American users, nearly 90 percent of the content that Facebook shows them is about politics and social issues. Those groups also received the most misinformation, especially a set of users associated with mostly right-leaning content, who were shown one misinformation post out of every 40, according to a document from June 2020.<br><br>One takeaway is that Facebook’s algorithm isn’t a runaway train. The company may not directly control what any given user posts, but by choosing which types of posts will be seen, it sculpts the information landscape according to its business priorities. Some within the company would like to see Facebook use the algorithm to explicitly promote certain values, such as democracy and civil discourse. Others have suggested that it develop and prioritize new metrics that align with users’ values, as with a 2020 experiment in which the algorithm was trained to predict what posts they would find “good for the world” and “bad for the world,” and optimize for the former.</i> |
| [T0153.006 Content Recommendation Algorithm](../../generated_pages/techniques/T0153.006.md) | IT00000472 This 2021 report by The Washington Post explains the mechanics of Facebook’s algorithm (T0151.001: Social Media Platform, T0153.006: Content Recommendation Algorithm):<br><br><i>In its early years, Facebook’s algorithm prioritized signals such as likes, clicks and comments to decide which posts to amplify. Publishers, brands and individual users soon learned how to craft posts and headlines designed to induce likes and clicks, giving rise to what came to be known as “clickbait.” By 2013, upstart publishers such as Upworthy and ViralNova were amassing tens of millions of readers with articles designed specifically to game Facebook’s news feed algorithm.<br><br>Facebook realized that users were growing wary of misleading teaser headlines, and the company recalibrated its algorithm in 2014 and 2015 to downgrade clickbait and focus on new metrics, such as the amount of time a user spent reading a story or watching a video, and incorporating surveys on what content users found most valuable. Around the same time, its executives identified video as a business priority, and used the algorithm to boost “native” videos shared directly to Facebook. By the mid-2010s, the news feed had tilted toward slick, professionally produced content, especially videos that would hold people’s attention.<br><br>In 2016, however, Facebook executives grew worried about a decline in “original sharing.” Users were spending so much time passively watching and reading that they weren’t interacting with each other as much. Young people in particular shifted their personal conversations to rivals such as Snapchat that offered more intimacy.<br><br>Once again, Facebook found its answer in the algorithm: It developed a new set of goal metrics that it called “meaningful social interactions,” designed to show users more posts from friends and family, and fewer from big publishers and brands. In particular, the algorithm began to give outsize weight to posts that sparked lots of comments and replies.<br><br>The downside of this approach was that the posts that sparked the most comments tended to be the ones that made people angry or offended them, the documents show. Facebook became an angrier, more polarizing place. It didn’t help that, starting in 2017, the algorithm had assigned reaction emoji — including the angry emoji — five times the weight of a simple “like,” according to company documents.<br><br>[...]<br><br>Internal documents show Facebook researchers found that, for the most politically oriented 1 million American users, nearly 90 percent of the content that Facebook shows them is about politics and social issues. Those groups also received the most misinformation, especially a set of users associated with mostly right-leaning content, who were shown one misinformation post out of every 40, according to a document from June 2020.<br><br>One takeaway is that Facebook’s algorithm isn’t a runaway train. The company may not directly control what any given user posts, but by choosing which types of posts will be seen, it sculpts the information landscape according to its business priorities. Some within the company would like to see Facebook use the algorithm to explicitly promote certain values, such as democracy and civil discourse. Others have suggested that it develop and prioritize new metrics that align with users’ values, as with a 2020 experiment in which the algorithm was trained to predict what posts they would find “good for the world” and “bad for the world,” and optimize for the former.</i> |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
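Note: the technique entries above describe the core mechanic of T0153.006 (Content Recommendation Algorithm) as a weighted engagement score, with the report stating that from 2017 reaction emoji were weighted roughly five times a plain “like”. The sketch below is a minimal, hypothetical illustration of that kind of scoring, not Facebook’s actual implementation: the 5:1 reaction-to-like ratio comes from the report, while the `Post` structure, function names, and the comment/reshare weights are invented for the example.

```python
# Hypothetical sketch of engagement-weighted feed ranking.
# NOT Facebook's code; only the 5x reaction weight is taken from the report.
from dataclasses import dataclass, field

# Illustrative weights: "like" = 1 and "reaction" = 5 reflect the reported
# 2017-era ratio; the comment and reshare weights are invented placeholders.
SIGNAL_WEIGHTS = {
    "like": 1.0,
    "reaction": 5.0,   # angry, love, haha, etc.
    "comment": 15.0,   # placeholder value, not from the report
    "reshare": 30.0,   # placeholder value, not from the report
}

@dataclass
class Post:
    post_id: str
    signals: dict = field(default_factory=dict)  # e.g. {"like": 120, "reaction": 40}

def engagement_score(post: Post) -> float:
    """Sum each engagement signal multiplied by its weight."""
    return sum(SIGNAL_WEIGHTS.get(name, 0.0) * count
               for name, count in post.signals.items())

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order candidate posts by descending engagement score."""
    return sorted(posts, key=engagement_score, reverse=True)

if __name__ == "__main__":
    candidates = [
        Post("calm_update", {"like": 200, "comment": 3}),
        Post("outrage_bait", {"like": 40, "reaction": 60, "comment": 25}),
    ]
    for p in rank_feed(candidates):
        print(p.post_id, engagement_score(p))
```

Under weights like these, the post that provokes many reactions and comments ("outrage_bait" in the toy data) outranks the post with far more plain likes, which is the dynamic the Facebook Papers documents describe.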