# Technique T0153.005: Online Advertising Platform

* **Summary**: Google Ads, Facebook Ads, and LinkedIn Marketing Solutions are examples of Online Advertising Platforms.<br><br>Online Advertising Platforms are online platforms which allow people to create Accounts that they can use to upload and deliver adverts to people online.

* **Belongs to tactic stage**: TA07


| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00097 Report: Not Just Algorithms](../../generated_pages/incidents/I00097.md) | <i>This report explores the role of four systems (recommender systems, content moderation systems, ad approval systems and ad management systems) in creating risks around eating disorders.<br><br>[...]<br><br>Ad approval systems can create risks. We created 12 ‘fake’ ads that promoted dangerous weight loss techniques and behaviours. We tested to see if these ads would be approved to run, and they were. This means dangerous behaviours can be promoted in paid-for advertising. (Requests to run ads were withdrawn after approval or rejection, so no dangerous advertising was published as a result of this experiment.)<br><br>Specifically: On TikTok, 100% of the ads were approved to run; On Facebook, 83% of the ads were approved to run; On Google, 75% of the ads were approved to run.<br><br>Ad management systems can create risks. We investigated how platforms allow advertisers to target users, and found that it is possible to target people who may be interested in pro-eating disorder content.<br><br>Specifically: On TikTok: End-users who interact with pro-eating disorder content on TikTok, download advertisers’ eating disorder apps or visit their websites can be targeted; On Meta: End-users who interact with pro-eating disorder content on Meta, download advertisers’ eating disorder apps or visit their websites can be targeted; On X: End-users who follow pro- eating disorder accounts, or ‘look’ like them, can be targeted; On Google: End-users who search specific words or combinations of words (including pro-eating disorder words), watch pro-eating disorder YouTube channels and probably those who download eating disorder and mental health apps can be targeted.</i><br><br>Advertising platforms managed by TikTok, Facebook, and Google approved adverts to be displayed on their platforms. These platforms enabled users to deliver targeted advertising to potentially vulnerable platform users (T0018: Purchase Targeted Advertisements, T0153.005: Online Advertising Platform). |
| [I00108 How you thought you support the animals and you ended up funding white supremacists](../../generated_pages/incidents/I00108.md) | <i>This article examines the white nationalist group Suavelos’ use of Facebook to draw visitors to its website without overtly revealing their racist ideology:<br><br>Suavelos uses Facebook and other platforms to amplify its message. In order to bypass the platforms’ community standards and keep their public pages active, Facebook pages such as “I support the police” are a good vehicle to spread a specific agenda without claiming to be racist. In looking back at this Facebook page, we followed Facebook’s algorithm for related pages and found suggested Facebook pages<br><br>[...]<br><br>This amplification strategy on Facebook is successful, as according to SimilarWeb figures, it attracts around 111,000 visits every month on the Suavelos.eu website.<br><br>[...]<br><br>Revenue through online advertisements can be achieved by different platforms through targeted advertisements, like Google Adsense or Doubleclick, or related and similar sponsored content, such as Taboola. Accordingly, Suavelos.eu uses both of these websites to display advertisements and consequently receives funding from such advertisements.<br><br>Once visitors are on the website supporting its advertisement revenue, Suavelos’ goal is to then turn these visitors into regular members of Suavelos network through donations or fees, or have them continue to support Suavelos. </i><br><br>Suavelos created a variety of pages on Facebook which presented as centring on prosocial causes. Facebook’s algorithm helped direct users to these pages (T0092: Build Network, T0151.001: Social Media Platform, T0153.006: Content Recommendation Algorithm, T0151.003: Online Community Page, T0143.208: Social Cause Persona).<br><br>Suavelos used these pages to generate traffic for their WordPress site (T0122: Direct Users to Alternative Platforms, T0152.003: Website Hosting Platform, T0152.004: Website), which used accounts on a variety of online advertising platforms to host adverts (T0146: Account, T0153.005: Online Advertising Platform). |
| [I00114 ‘Carol’s Journey’: What Facebook knew about how it radicalized users](../../generated_pages/incidents/I00114.md) | This report examines internal Facebook communications which reveal employees’ concerns about how the platform’s algorithm was recommending users join extremist conspiracy groups.<br><br><i>In summer 2019, a new Facebook user named Carol Smith signed up for the platform, describing herself as a politically conservative mother from Wilmington, North Carolina. Smith’s account indicated an interest in politics, parenting and Christianity and followed a few of her favorite brands, including Fox News and then-President Donald Trump.<br><br>Though Smith had never expressed interest in conspiracy theories, in just two days Facebook was recommending she join groups dedicated to QAnon, a sprawling and baseless conspiracy theory and movement that claimed Trump was secretly saving the world from a cabal of pedophiles and Satanists.<br><br>Smith didn’t follow the recommended QAnon groups, but whatever algorithm Facebook was using to determine how she should engage with the platform pushed ahead just the same. Within one week, Smith’s feed was full of groups and pages that had violated Facebook’s own rules, including those against hate speech and disinformation.<br><br>Smith wasn’t a real person. A researcher employed by Facebook invented the account, along with those of other fictitious “test users” in 2019 and 2020, as part of an experiment in studying the platform’s role in misinforming and polarizing users through its recommendations systems.<br><br>That researcher said Smith’s Facebook experience was “a barrage of extreme, conspiratorial, and graphic content.”</i><br><br>Facebook’s algorithm suggested users join groups which supported the QAnon movement (T0151.001: Social Media Platform, T0151.002: Online Community Group, T0153.006: Content Recommendation Algorithm, T0097.208: Social Cause Persona).<br><br>Further investigation by Facebook uncovered that its advertising platform had been used to promote QAnon narratives (T0146: Account, T0114: Deliver Ads, T0153.005: Online Advertising Platform):<br><br><i>For years, company researchers had been running experiments like Carol Smith’s to gauge the platform’s hand in radicalizing users, according to the documents seen by NBC News.<br><br>This internal work repeatedly found that recommendation tools pushed users into extremist groups, findings that helped inform policy changes and tweaks to recommendations and news feed rankings. Those rankings are a tentacled, ever-evolving system widely known as “the algorithm” that pushes content to users. But the research at that time stopped well short of inspiring any movement to change the groups and pages themselves.<br><br>That reluctance was indicative of “a pattern at Facebook,” Haugen told reporters this month. “They want the shortest path between their current policies and any action.”<br><br>[...]<br><br>By summer 2020, Facebook was hosting thousands of private QAnon groups and pages, with millions of members and followers, according to an unreleased internal investigation.<br><br>A year after the FBI designated QAnon as a potential domestic terrorist threat in the wake of standoffs, alleged planned kidnappings, harassment campaigns and shootings, Facebook labeled QAnon a “Violence Inciting Conspiracy Network” and banned it from the platform, along with militias and other violent social movements.<br><br>A small team working across several of Facebook’s departments found its platforms had hosted hundreds of ads on Facebook and Instagram worth thousands of dollars and millions of views, “praising, supporting, or representing” the conspiracy theory.<br><br>[...]<br><br>For many employees inside Facebook, the enforcement came too late, according to posts left on Workplace, the company’s internal message board.<br><br>“We’ve known for over a year now that our recommendation systems can very quickly lead users down the path to conspiracy theories and group</i> |


| Counters | Response types |
| -------- | -------------- |


DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW