# Incident I00114: Carol’s Journey: What Facebook knew about how it radicalized users
* **Summary:** <i>In summer 2019, a new Facebook user named Carol Smith signed up for the platform, describing herself as a politically conservative mother from Wilmington, North Carolina. Smith’s account indicated an interest in politics, parenting and Christianity and followed a few of her favorite brands, including Fox News and then-President Donald Trump.<br><br>Though Smith had never expressed interest in conspiracy theories, in just two days Facebook was recommending she join groups dedicated to QAnon, a sprawling and baseless conspiracy theory and movement that claimed Trump was secretly saving the world from a cabal of pedophiles and Satanists.<br><br>Smith didn’t follow the recommended QAnon groups, but whatever algorithm Facebook was using to determine how she should engage with the platform pushed ahead just the same. Within one week, Smith’s feed was full of groups and pages that had violated Facebook’s own rules, including those against hate speech and disinformation.<br><br>Smith wasn’t a real person. A researcher employed by Facebook invented the account, along with those of other fictitious “test users” in 2019 and 2020, as part of an experiment in studying the platform’s role in misinforming and polarizing users through its recommendations systems.<br><br>That researcher said Smith’s Facebook experience was “a barrage of extreme, conspiratorial, and graphic content.”</i>
* **Incident type:**
* **Year started:** 2019
* **Countries:** United States
* **Found via:**
* **Date added:**

| Reference | Pub Date | Authors | Org | Archive |
| --------- | -------- | ------- | --- | ------- |

| Technique | Description given for this incident |
| --------- | ------------------------- |
| [T0097.208 Social Cause Persona](../../generated_pages/techniques/T0097.208.md) |  IT00000468 This report examines internal Facebook communications which reveal employees’ concerns about how the platform’s algorithm was recommending users join extremist conspiracy groups.<br><br><i>In summer 2019, a new Facebook user named Carol Smith signed up for the platform, describing herself as a politically conservative mother from Wilmington, North Carolina. Smith’s account indicated an interest in politics, parenting and Christianity and followed a few of her favorite brands, including Fox News and then-President Donald Trump.<br><br>Though Smith had never expressed interest in conspiracy theories, in just two days Facebook was recommending she join groups dedicated to QAnon, a sprawling and baseless conspiracy theory and movement that claimed Trump was secretly saving the world from a cabal of pedophiles and Satanists.<br><br>Smith didn’t follow the recommended QAnon groups, but whatever algorithm Facebook was using to determine how she should engage with the platform pushed ahead just the same. Within one week, Smith’s feed was full of groups and pages that had violated Facebook’s own rules, including those against hate speech and disinformation.<br><br>Smith wasn’t a real person. A researcher employed by Facebook invented the account, along with those of other fictitious “test users” in 2019 and 2020, as part of an experiment in studying the platform’s role in misinforming and polarizing users through its recommendations systems.<br><br>That researcher said Smith’s Facebook experience was “a barrage of extreme, conspiratorial, and graphic content.”</i><br><br>Facebook’s algorithm suggested users join groups which supported the QAnon movement (T0151.001: Social Media Platform, T0151.002: Online Community Group, T0153.006: Content Recommendation Algorithm, T0097.208: Social Cause Persona).<br><br>Further investigation by Facebook uncovered that its advertising platform had been used to promote QAnon narratives (T0146: Account, T0114: Deliver Ads, T0153.005: Online Advertising Platform):<br><br><i>For years, company researchers had been running experiments like Carol Smith’s to gauge the platform’s hand in radicalizing users, according to the documents seen by NBC News.<br><br>This internal work repeatedly found that recommendation tools pushed users into extremist groups, findings that helped inform policy changes and tweaks to recommendations and news feed rankings. Those rankings are a tentacled, ever-evolving system widely known as “the algorithm” that pushes content to users. But the research at that time stopped well short of inspiring any movement to change the groups and pages themselves.<br><br>That reluctance was indicative of “a pattern at Facebook,” Haugen told reporters this month. “They want the shortest path between their current policies and any action.”<br><br>[...]<br><br>By summer 2020, Facebook was hosting thousands of private QAnon groups and pages, with millions of members and followers, according to an unreleased internal investigation.<br><br>A year after the FBI designated QAnon as a potential domestic terrorist threat in the wake of standoffs, alleged planned kidnappings, harassment campaigns and shootings, Facebook labeled QAnon a “Violence Inciting Conspiracy Network” and banned it from the platform, along with militias and other violent social movements.<br><br>A small team working across several of Facebook’s departments found its platforms had hosted hundreds of ads on Facebook and Instagram worth thousands of dollars and millions of views, “praising, supporting, or representing” the conspiracy theory.<br><br>[...]<br><br>For many employees inside Facebook, the enforcement came too late, according to posts left on Workplace, the company’s internal message board.<br><br>“We’ve known for over a year now that our recommendation systems can very quickly lead users down the path to conspiracy theories and groups,” one integrity researcher [...]</i> |
| [T0114 Deliver Ads](../../generated_pages/techniques/T0114.md) |  IT00000469 This report examines internal Facebook communications which reveal employees’ concerns about how the platform’s algorithm was recommending users join extremist conspiracy groups.<br><br><i>In summer 2019, a new Facebook user named Carol Smith signed up for the platform, describing herself as a politically conservative mother from Wilmington, North Carolina. Smith’s account indicated an interest in politics, parenting and Christianity and followed a few of her favorite brands, including Fox News and then-President Donald Trump.<br><br>Though Smith had never expressed interest in conspiracy theories, in just two days Facebook was recommending she join groups dedicated to QAnon, a sprawling and baseless conspiracy theory and movement that claimed Trump was secretly saving the world from a cabal of pedophiles and Satanists.<br><br>Smith didn’t follow the recommended QAnon groups, but whatever algorithm Facebook was using to determine how she should engage with the platform pushed ahead just the same. Within one week, Smith’s feed was full of groups and pages that had violated Facebook’s own rules, including those against hate speech and disinformation.<br><br>Smith wasn’t a real person. A researcher employed by Facebook invented the account, along with those of other fictitious “test users” in 2019 and 2020, as part of an experiment in studying the platform’s role in misinforming and polarizing users through its recommendations systems.<br><br>That researcher said Smith’s Facebook experience was “a barrage of extreme, conspiratorial, and graphic content.”</i><br><br>Facebook’s algorithm suggested users join groups which supported the QAnon movement (T0151.001: Social Media Platform, T0151.002: Online Community Group, T0153.006: Content Recommendation Algorithm, T0097.208: Social Cause Persona).<br><br>Further investigation by Facebook uncovered that its advertising platform had been used to promote QAnon narratives (T0146: Account, T0114: Deliver Ads, T0153.005: Online Advertising Platform):<br><br><i>For years, company researchers had been running experiments like Carol Smith’s to gauge the platform’s hand in radicalizing users, according to the documents seen by NBC News.<br><br>This internal work repeatedly found that recommendation tools pushed users into extremist groups, findings that helped inform policy changes and tweaks to recommendations and news feed rankings. Those rankings are a tentacled, ever-evolving system widely known as “the algorithm” that pushes content to users. But the research at that time stopped well short of inspiring any movement to change the groups and pages themselves.<br><br>That reluctance was indicative of “a pattern at Facebook,” Haugen told reporters this month. “They want the shortest path between their current policies and any action.”<br><br>[...]<br><br>By summer 2020, Facebook was hosting thousands of private QAnon groups and pages, with millions of members and followers, according to an unreleased internal investigation.<br><br>A year after the FBI designated QAnon as a potential domestic terrorist threat in the wake of standoffs, alleged planned kidnappings, harassment campaigns and shootings, Facebook labeled QAnon a “Violence Inciting Conspiracy Network” and banned it from the platform, along with militias and other violent social movements.<br><br>A small team working across several of Facebook’s departments found its platforms had hosted hundreds of ads on Facebook and Instagram worth thousands of dollars and millions of views, “praising, supporting, or representing” the conspiracy theory.<br><br>[...]<br><br>For many employees inside Facebook, the enforcement came too late, according to posts left on Workplace, the company’s internal message board.<br><br>“We’ve known for over a year now that our recommendation systems can very quickly lead users down the path to conspiracy theories and groups,” one integrity researcher [...]</i> |
| [T0151.001 Social Media Platform](../../generated_pages/techniques/T0151.001.md) |  IT00000465 This report examines internal Facebook communications which reveal employees’ concerns about how the platform’s algorithm was recommending users join extremist conspiracy groups.<br><br><i>In summer 2019, a new Facebook user named Carol Smith signed up for the platform, describing herself as a politically conservative mother from Wilmington, North Carolina. Smith’s account indicated an interest in politics, parenting and Christianity and followed a few of her favorite brands, including Fox News and then-President Donald Trump.<br><br>Though Smith had never expressed interest in conspiracy theories, in just two days Facebook was recommending she join groups dedicated to QAnon, a sprawling and baseless conspiracy theory and movement that claimed Trump was secretly saving the world from a cabal of pedophiles and Satanists.<br><br>Smith didn’t follow the recommended QAnon groups, but whatever algorithm Facebook was using to determine how she should engage with the platform pushed ahead just the same. Within one week, Smith’s feed was full of groups and pages that had violated Facebook’s own rules, including those against hate speech and disinformation.<br><br>Smith wasn’t a real person. A researcher employed by Facebook invented the account, along with those of other fictitious “test users” in 2019 and 2020, as part of an experiment in studying the platform’s role in misinforming and polarizing users through its recommendations systems.<br><br>That researcher said Smith’s Facebook experience was “a barrage of extreme, conspiratorial, and graphic content.”</i><br><br>Facebook’s algorithm suggested users join groups which supported the QAnon movement (T0151.001: Social Media Platform, T0151.002: Online Community Group, T0153.006: Content Recommendation Algorithm, T0097.208: Social Cause Persona).<br><br>Further investigation by Facebook uncovered that its advertising platform had been used to promote QAnon narratives (T0146: Account, T0114: Deliver Ads, T0153.005: Online Advertising Platform):<br><br><i>For years, company researchers had been running experiments like Carol Smith’s to gauge the platform’s hand in radicalizing users, according to the documents seen by NBC News.<br><br>This internal work repeatedly found that recommendation tools pushed users into extremist groups, findings that helped inform policy changes and tweaks to recommendations and news feed rankings. Those rankings are a tentacled, ever-evolving system widely known as “the algorithm” that pushes content to users. But the research at that time stopped well short of inspiring any movement to change the groups and pages themselves.<br><br>That reluctance was indicative of “a pattern at Facebook,” Haugen told reporters this month. “They want the shortest path between their current policies and any action.”<br><br>[...]<br><br>By summer 2020, Facebook was hosting thousands of private QAnon groups and pages, with millions of members and followers, according to an unreleased internal investigation.<br><br>A year after the FBI designated QAnon as a potential domestic terrorist threat in the wake of standoffs, alleged planned kidnappings, harassment campaigns and shootings, Facebook labeled QAnon a “Violence Inciting Conspiracy Network” and banned it from the platform, along with militias and other violent social movements.<br><br>A small team working across several of Facebook’s departments found its platforms had hosted hundreds of ads on Facebook and Instagram worth thousands of dollars and millions of views, “praising, supporting, or representing” the conspiracy theory.<br><br>[...]<br><br>For many employees inside Facebook, the enforcement came too late, according to posts left on Workplace, the company’s internal message board.<br><br>“We’ve known for over a year now that our recommendation systems can very quickly lead users down the path to conspiracy theories and groups,” one integrity researcher [...]</i> |
| [T0151.002 Online Community Group](../../generated_pages/techniques/T0151.002.md) |  IT00000466 This report examines internal Facebook communications which reveal employees’ concerns about how the platform’s algorithm was recommending users join extremist conspiracy groups.<br><br><i>In summer 2019, a new Facebook user named Carol Smith signed up for the platform, describing herself as a politically conservative mother from Wilmington, North Carolina. Smith’s account indicated an interest in politics, parenting and Christianity and followed a few of her favorite brands, including Fox News and then-President Donald Trump.<br><br>Though Smith had never expressed interest in conspiracy theories, in just two days Facebook was recommending she join groups dedicated to QAnon, a sprawling and baseless conspiracy theory and movement that claimed Trump was secretly saving the world from a cabal of pedophiles and Satanists.<br><br>Smith didn’t follow the recommended QAnon groups, but whatever algorithm Facebook was using to determine how she should engage with the platform pushed ahead just the same. Within one week, Smith’s feed was full of groups and pages that had violated Facebook’s own rules, including those against hate speech and disinformation.<br><br>Smith wasn’t a real person. A researcher employed by Facebook invented the account, along with those of other fictitious “test users” in 2019 and 2020, as part of an experiment in studying the platform’s role in misinforming and polarizing users through its recommendations systems.<br><br>That researcher said Smith’s Facebook experience was “a barrage of extreme, conspiratorial, and graphic content.”</i><br><br>Facebook’s algorithm suggested users join groups which supported the QAnon movement (T0151.001: Social Media Platform, T0151.002: Online Community Group, T0153.006: Content Recommendation Algorithm, T0097.208: Social Cause Persona).<br><br>Further investigation by Facebook uncovered that its advertising platform had been used to promote QAnon narratives (T0146: Account, T0114: Deliver Ads, T0153.005: Online Advertising Platform):<br><br><i>For years, company researchers had been running experiments like Carol Smith’s to gauge the platform’s hand in radicalizing users, according to the documents seen by NBC News.<br><br>This internal work repeatedly found that recommendation tools pushed users into extremist groups, findings that helped inform policy changes and tweaks to recommendations and news feed rankings. Those rankings are a tentacled, ever-evolving system widely known as “the algorithm” that pushes content to users. But the research at that time stopped well short of inspiring any movement to change the groups and pages themselves.<br><br>That reluctance was indicative of “a pattern at Facebook,” Haugen told reporters this month. “They want the shortest path between their current policies and any action.”<br><br>[...]<br><br>By summer 2020, Facebook was hosting thousands of private QAnon groups and pages, with millions of members and followers, according to an unreleased internal investigation.<br><br>A year after the FBI designated QAnon as a potential domestic terrorist threat in the wake of standoffs, alleged planned kidnappings, harassment campaigns and shootings, Facebook labeled QAnon a “Violence Inciting Conspiracy Network” and banned it from the platform, along with militias and other violent social movements.<br><br>A small team working across several of Facebook’s departments found its platforms had hosted hundreds of ads on Facebook and Instagram worth thousands of dollars and millions of views, “praising, supporting, or representing” the conspiracy theory.<br><br>[...]<br><br>For many employees inside Facebook, the enforcement came too late, according to posts left on Workplace, the company’s internal message board.<br><br>“We’ve known for over a year now that our recommendation systems can very quickly lead users down the path to conspiracy theories and groups,” one integrity researcher [...]</i> |
| [T0153.005 Online Advertising Platform](../../generated_pages/techniques/T0153.005.md) |  IT00000470 This report examines internal Facebook communications which reveal employees’ concerns about how the platform’s algorithm was recommending users join extremist conspiracy groups.<br><br><i>In summer 2019, a new Facebook user named Carol Smith signed up for the platform, describing herself as a politically conservative mother from Wilmington, North Carolina. Smith’s account indicated an interest in politics, parenting and Christianity and followed a few of her favorite brands, including Fox News and then-President Donald Trump.<br><br>Though Smith had never expressed interest in conspiracy theories, in just two days Facebook was recommending she join groups dedicated to QAnon, a sprawling and baseless conspiracy theory and movement that claimed Trump was secretly saving the world from a cabal of pedophiles and Satanists.<br><br>Smith didn’t follow the recommended QAnon groups, but whatever algorithm Facebook was using to determine how she should engage with the platform pushed ahead just the same. Within one week, Smith’s feed was full of groups and pages that had violated Facebook’s own rules, including those against hate speech and disinformation.<br><br>Smith wasn’t a real person. A researcher employed by Facebook invented the account, along with those of other fictitious “test users” in 2019 and 2020, as part of an experiment in studying the platform’s role in misinforming and polarizing users through its recommendations systems.<br><br>That researcher said Smith’s Facebook experience was “a barrage of extreme, conspiratorial, and graphic content.”</i><br><br>Facebook’s algorithm suggested users join groups which supported the QAnon movement (T0151.001: Social Media Platform, T0151.002: Online Community Group, T0153.006: Content Recommendation Algorithm, T0097.208: Social Cause Persona).<br><br>Further investigation by Facebook uncovered that its advertising platform had been used to promote QAnon narratives (T0146: Account, T0114: Deliver Ads, T0153.005: Online Advertising Platform):<br><br><i>For years, company researchers had been running experiments like Carol Smith’s to gauge the platform’s hand in radicalizing users, according to the documents seen by NBC News.<br><br>This internal work repeatedly found that recommendation tools pushed users into extremist groups, findings that helped inform policy changes and tweaks to recommendations and news feed rankings. Those rankings are a tentacled, ever-evolving system widely known as “the algorithm” that pushes content to users. But the research at that time stopped well short of inspiring any movement to change the groups and pages themselves.<br><br>That reluctance was indicative of “a pattern at Facebook,” Haugen told reporters this month. “They want the shortest path between their current policies and any action.”<br><br>[...]<br><br>By summer 2020, Facebook was hosting thousands of private QAnon groups and pages, with millions of members and followers, according to an unreleased internal investigation.<br><br>A year after the FBI designated QAnon as a potential domestic terrorist threat in the wake of standoffs, alleged planned kidnappings, harassment campaigns and shootings, Facebook labeled QAnon a “Violence Inciting Conspiracy Network” and banned it from the platform, along with militias and other violent social movements.<br><br>A small team working across several of Facebook’s departments found its platforms had hosted hundreds of ads on Facebook and Instagram worth thousands of dollars and millions of views, “praising, supporting, or representing” the conspiracy theory.<br><br>[...]<br><br>For many employees inside Facebook, the enforcement came too late, according to posts left on Workplace, the company’s internal message board.<br><br>“We’ve known for over a year now that our recommendation systems can very quickly lead users down the path to conspiracy theories and groups,” one integrity researcher [...]</i> |
| [T0153.006 Content Recommendation Algorithm](../../generated_pages/techniques/T0153.006.md) |  IT00000467 This report examines internal Facebook communications which reveal employees’ concerns about how the platform’s algorithm was recommending users join extremist conspiracy groups.<br><br><i>In summer 2019, a new Facebook user named Carol Smith signed up for the platform, describing herself as a politically conservative mother from Wilmington, North Carolina. Smith’s account indicated an interest in politics, parenting and Christianity and followed a few of her favorite brands, including Fox News and then-President Donald Trump.<br><br>Though Smith had never expressed interest in conspiracy theories, in just two days Facebook was recommending she join groups dedicated to QAnon, a sprawling and baseless conspiracy theory and movement that claimed Trump was secretly saving the world from a cabal of pedophiles and Satanists.<br><br>Smith didn’t follow the recommended QAnon groups, but whatever algorithm Facebook was using to determine how she should engage with the platform pushed ahead just the same. Within one week, Smith’s feed was full of groups and pages that had violated Facebook’s own rules, including those against hate speech and disinformation.<br><br>Smith wasn’t a real person. A researcher employed by Facebook invented the account, along with those of other fictitious “test users” in 2019 and 2020, as part of an experiment in studying the platform’s role in misinforming and polarizing users through its recommendations systems.<br><br>That researcher said Smith’s Facebook experience was “a barrage of extreme, conspiratorial, and graphic content.”</i><br><br>Facebook’s algorithm suggested users join groups which supported the QAnon movement (T0151.001: Social Media Platform, T0151.002: Online Community Group, T0153.006: Content Recommendation Algorithm, T0097.208: Social Cause Persona).<br><br>Further investigation by Facebook uncovered that its advertising platform had been used to promote QAnon narratives (T0146: Account, T0114: Deliver Ads, T0153.005: Online Advertising Platform):<br><br><i>For years, company researchers had been running experiments like Carol Smith’s to gauge the platform’s hand in radicalizing users, according to the documents seen by NBC News.<br><br>This internal work repeatedly found that recommendation tools pushed users into extremist groups, findings that helped inform policy changes and tweaks to recommendations and news feed rankings. Those rankings are a tentacled, ever-evolving system widely known as “the algorithm” that pushes content to users. But the research at that time stopped well short of inspiring any movement to change the groups and pages themselves.<br><br>That reluctance was indicative of “a pattern at Facebook,” Haugen told reporters this month. “They want the shortest path between their current policies and any action.”<br><br>[...]<br><br>By summer 2020, Facebook was hosting thousands of private QAnon groups and pages, with millions of members and followers, according to an unreleased internal investigation.<br><br>A year after the FBI designated QAnon as a potential domestic terrorist threat in the wake of standoffs, alleged planned kidnappings, harassment campaigns and shootings, Facebook labeled QAnon a “Violence Inciting Conspiracy Network” and banned it from the platform, along with militias and other violent social movements.<br><br>A small team working across several of Facebook’s departments found its platforms had hosted hundreds of ads on Facebook and Instagram worth thousands of dollars and millions of views, “praising, supporting, or representing” the conspiracy theory.<br><br>[...]<br><br>For many employees inside Facebook, the enforcement came too late, according to posts left on Workplace, the company’s internal message board.<br><br>“We’ve known for over a year now that our recommendation systems can very quickly lead users down the path to conspiracy theories and groups,” one integrity researcher [...]</i> |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
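
Note: the technique rows above tag T0153.006 (Content Recommendation Algorithm) as the mechanism that surfaced QAnon groups to the fictitious test account. As a rough intuition aid only, the Python sketch below is a deliberately simplified, hypothetical co-membership recommender over invented data. It is not Facebook's system; the account names, group names, and scoring rule are assumptions made for illustration. It only shows how a recommender that ranks groups purely by overlap with what similar accounts engage with can walk a new account from mainstream interests to fringe groups in a few rounds.

```python
# Minimal, hypothetical sketch of a co-membership group recommender.
# This is NOT Facebook's algorithm; accounts, groups, and scoring below are
# invented purely to illustrate the general idea behind T0153.006:
# ranking candidate groups by engagement overlap alone can surface fringe
# groups to a new account within a few recommendation rounds.

from collections import Counter

# Fabricated engagement data: existing accounts -> groups they engage with.
USER_GROUPS = {
    "user_a": {"conservative_moms", "fox_news_fans", "parenting_tips"},
    "user_b": {"conservative_moms", "fox_news_fans", "qanon_awakening"},
    "user_c": {"fox_news_fans", "qanon_awakening", "deep_state_watch"},
    "user_d": {"parenting_tips", "qanon_awakening", "deep_state_watch"},
}


def recommend_groups(followed, k=2):
    """Rank unjoined groups by how often they co-occur with the groups the
    account already follows (a crude item-based collaborative filter)."""
    scores = Counter()
    for groups in USER_GROUPS.values():
        overlap = len(groups & followed)
        if overlap == 0:
            continue  # ignore accounts with nothing in common
        for candidate in groups - followed:
            scores[candidate] += overlap
    return [group for group, _ in scores.most_common(k)]


# A new "Carol Smith"-style account starts with only mainstream interests,
# then joins whatever the recommender surfaces each round.
carol = {"conservative_moms", "fox_news_fans"}
for day in (1, 2, 3):
    recs = recommend_groups(carol)
    print(f"day {day}: recommended {recs}")
    carol |= set(recs)
```

With this toy data, the conspiracy-themed group is the top suggestion on the first round and the remaining fringe group follows on the second, loosely mirroring the two-day/one-week escalation described in the report. The point is the feedback loop (the account joins what it is shown, then gets shown more of the same), not the specific scoring rule.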