Technique T0151.002: Online Community Group
-
Summary: Some online platforms allow people with Accounts to create Online Community Groups. Groups are usually created around a specific topic or locality, and allow users to post content to the group, and interact with other users’ posted content.
For example, Meta’s Social Media Platform Facebook allows users to create a “Facebook group”. This feature is not exclusive to Social Media Platforms; the Microblogging Platform X (prev. Twitter) allows users to create “X Communities”, groups based on particular topics which users can join and post to; the Software Delivery Platform Steam allows users to create Steam Community Groups.
Online Community Groups can be open or gated (for example, groups can require admin approval before users can join).
Belongs to tactic stage: TA07
| Incident | Descriptions given for this incident |
| --- | --- |
| I00105 Gaming the System: The Use of Gaming-Adjacent Communication, Game and Mod Platforms by Extremist Actors | In this report, researchers examine online platforms commonly used by people who play videogames, and how these platforms can contribute to the radicalisation of gamers: Indie DB [is a platform that serves] to present indie games, which are titles from independent, small developer teams, which can be discussed and downloaded [Indie DB]. [...] [On Indie DB we] found antisemitic, Islamist, sexist and other discriminatory content during the exploration. Both games and comments were located that made positive references to National Socialism. Radicalised users seem to use the opportunities to network, create groups, and communicate in forums. In addition, a number of member profiles with graphic propaganda content could be located. We found several games with propagandistic content advertised on the platform, which caused apparently radicalised users to form a fan community around those games. For example, there are titles such as the antisemitic Fursan Al-Aqsa: The Knights of the Al-Aqsa Mosque, which has won the Best Hardcore Game award at the Game Connection America 2024 Game Development Awards and which has received antisemitic praise on its review page. In the game, players need to target members of the Israeli Defence Forces, and attacks by Palestinian terrorist groups such as Hamas or Lions’ Den can be re-enacted. These include bomb attacks, beheadings and the re-enactment of the terrorist attack in Israel on 7 October 2023. We, therefore, deem Indie DB to be relevant for further analyses of extremist activities in digital gaming spaces. Indie DB is an online software delivery platform on which users can create groups and participate in discussion forums (T0152.009: Software Delivery Platform, T0151.002: Online Community Group, T0151.009: Legacy Online Forum Platform). The platform hosted games which allowed players to re-enact terrorist attacks (T0147.001: Game Asset). |
| I00114 ‘Carol’s Journey’: What Facebook knew about how it radicalized users | This report examines internal Facebook communications which reveal employees’ concerns about how the platform’s algorithm was recommending users join extremist conspiracy groups. In summer 2019, a new Facebook user named Carol Smith signed up for the platform, describing herself as a politically conservative mother from Wilmington, North Carolina. Smith’s account indicated an interest in politics, parenting and Christianity and followed a few of her favorite brands, including Fox News and then-President Donald Trump. Though Smith had never expressed interest in conspiracy theories, in just two days Facebook was recommending she join groups dedicated to QAnon, a sprawling and baseless conspiracy theory and movement that claimed Trump was secretly saving the world from a cabal of pedophiles and Satanists. Smith didn’t follow the recommended QAnon groups, but whatever algorithm Facebook was using to determine how she should engage with the platform pushed ahead just the same. Within one week, Smith’s feed was full of groups and pages that had violated Facebook’s own rules, including those against hate speech and disinformation. Smith wasn’t a real person. A researcher employed by Facebook invented the account, along with those of other fictitious “test users” in 2019 and 2020, as part of an experiment in studying the platform’s role in misinforming and polarizing users through its recommendation systems. That researcher said Smith’s Facebook experience was “a barrage of extreme, conspiratorial, and graphic content.” Facebook’s Algorithm suggested users join groups which supported the QAnon movement (T0151.001: Social Media Platform, T0151.002: Online Community Group, T0153.006: Content Recommendation Algorithm, T0097.208: Social Cause Persona). Further investigation by Facebook uncovered that its advertising platform had been used to promote QAnon narratives (T0146: Account Asset, T0114: Deliver Ads, T0153.005: Online Advertising Platform): For years, company researchers had been running experiments like Carol Smith’s to gauge the platform’s hand in radicalizing users, according to the documents seen by NBC News. This internal work repeatedly found that recommendation tools pushed users into extremist groups, findings that helped inform policy changes and tweaks to recommendations and news feed rankings. Those rankings are a tentacled, ever-evolving system widely known as “the algorithm” that pushes content to users. But the research at that time stopped well short of inspiring any movement to change the groups and pages themselves. That reluctance was indicative of “a pattern at Facebook,” Haugen told reporters this month. “They want the shortest path between their current policies and any action.” [...] By summer 2020, Facebook was hosting thousands of private QAnon groups and pages, with millions of members and followers, according to an unreleased internal investigation. A year after the FBI designated QAnon as a potential domestic terrorist threat in the wake of standoffs, alleged planned kidnappings, harassment campaigns and shootings, Facebook labeled QAnon a “Violence Inciting Conspiracy Network” and banned it from the platform, along with militias and other violent social movements. A small team working across several of Facebook’s departments found its platforms had hosted hundreds of ads on Facebook and Instagram worth thousands of dollars and millions of views, “praising, supporting, or representing” the conspiracy theory. [...] For many employees inside Facebook, the enforcement came too late, according to posts left on Workplace, the company’s internal message board. “We’ve known for over a year now that our recommendation systems can very quickly lead users down the path to conspiracy theories and groups,” one integrity researcher, whose name had been redacted, wrote in a post announcing she was leaving the company. “This fringe group has grown to national prominence, with QAnon congressional candidates and QAnon hashtags and groups trending in the mainstream. We were willing to act only *after* things had spiraled into a dire state.” While Facebook’s ban initially appeared effective, a problem remained: The removal of groups and pages didn’t wipe out QAnon’s most extreme followers, who continued to organize on the platform. “There was enough evidence to raise red flags in the expert community that Facebook and other platforms failed to address QAnon’s violent extremist dimension,” said Marc-André Argentino, a research fellow at King’s College London’s International Centre for the Study of Radicalisation, who has extensively studied QAnon. Believers simply rebranded as anti-child-trafficking groups or migrated to other communities, including those around the anti-vaccine movement. [...] These conspiracy groups had become the fastest-growing groups on Facebook, according to the report, but Facebook wasn’t able to control their “meteoric growth,” the researchers wrote, “because we were looking at each entity individually, rather than as a cohesive movement.” A Facebook spokesperson told BuzzFeed News it took many steps to limit election misinformation but that it was unable to catch everything. |
| I00123 The Extreme Right on Steam | Analysis of communities on the gaming platform Steam showed that groups who are known to have engaged in acts of terrorism used Steam to host social communities (T0152.009: Software Delivery Platform, T0151.002: Online Community Group): The first is a Finnish-language group which was set up to promote the Nordic Resistance Movement (NRM). NRM are the only group in the sample examined by ISD known to have engaged in terrorist attacks. Swedish members of the group conducted a series of bombings in Gothenburg in 2016 and 2017, and several Finnish members are under investigation in relation to both violent attacks and murder. The NRM Steam group does not host content related to gaming, and instead seems to act as a hub for the movement. The group’s overview section contains a link to the official NRM website, and users are encouraged to find like-minded people to join the group. The group is relatively small, with 87 members, but at the time of writing, it appeared to be active and in use. Interestingly, although the group is in Finnish language, it has members in common with the English language channels identified in this analysis. This suggests that Steam may help facilitate international exchange between right-wing extremists. ISD conducted an investigation into the usage of social groups on Steam. Steam is an online platform used to buy and sell digital games, and includes the Steam community feature, which “allows users to find friends and join groups and discussion forums, while also offering in-game voice and text chat”. Actors have used Steam’s social capabilities to enable online harm campaigns: One function of these Steam groups is the organisation of ‘raids’ – coordinated trolling activity against their political opponents. An example of this can be seen in a white power music group sharing a link to an Israeli Steam group, encouraging other members to “help me raid this juden [German word for Jew] group”. The comments section of said target group shows that neo-Nazi and antisemitic comments were consistently posted in the group just two minutes after the instruction had been posted in the extremist group, highlighting the swiftness with which racially motivated harassment can be directed online. Threat actors used social groups on Steam to organise harassment of targets (T0152.009: Software Delivery Platform, T0151.002: Online Community Group, T0049.005: Conduct Swarming, T0048: Harass). A number of groups were observed encouraging members to join conversations on outside platforms. These include links to Telegram channels connected to white supremacist marches, and media outlets, forums and Discord servers run by neo-Nazis. [...] This off-ramping activity demonstrates how, rather than sitting in isolation, Steam fits into the wider extreme right-wing online ecosystem, with Steam groups acting as hubs for communities and organizations which span multiple platforms. Accordingly, although the platform appears to fill a specific role in the building and strengthening of communities with similar hobbies and interests, it is suggested that analysis seeking to determine the risk of these communities should focus on their activity across platforms. Social Groups on Steam were used to drive new people to other neo-Nazi controlled community assets (T0122: Direct Users to Alternative Platforms, T0152.009: Software Delivery Platform, T0151.002: Online Community Group). |
| I00128 #TrollTracker: Outward Influence Operation From Iran | [Meta removed a network of assets for coordinated inauthentic behaviour. One page] in the network, @StopMEK, was promoting views against the People’s Mujahedin of Iran (MEK), the largest and most active political opposition group against the Islamic Republic of Iran Leadership. The content on the page drew narratives showing parallels between the Islamic State of Iraq and Syria (ISIS) and the MEK. Apart from images and memes, the @StopMEK page shared a link to an archived report on how the United States was monitoring the MEK’s movement in Iran in the mid-1990s. The file was embedded as a QR code on one of the page’s images. In this example a Facebook page presented itself as focusing on a political cause (T0097.208: Social Cause Persona, T0151.001: Social Media Platform, T0151.002: Online Community Group). Within the page it embedded a QR code (T0122: Direct Users to Alternative Platforms, T0153.004: QR Code Asset), which took users to a document hosted on another website (T0152.004: Website Asset). |
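The two-minute gap between the raid instruction and the first antisemitic comments in I00123 illustrates a timing signal analysts can use when looking for Conduct Swarming (T0049.005) activity in group comment data. Below is a minimal, hypothetical sketch of such a burst check; the function name, timestamps, window, and threshold are illustrative assumptions, not part of the DISARM framework or any platform's API:

```python
from datetime import datetime, timedelta

def flag_swarm(instruction_time, comment_times, window_minutes=10, threshold=3):
    """Flag a burst of comments landing shortly after a coordinating post.

    A cluster of hostile comments arriving inside a short window after an
    off-group instruction (e.g. the two-minute gap reported in the Steam
    raids) is a weak but useful coordination signal, to be combined with
    content review rather than used alone.
    """
    window = timedelta(minutes=window_minutes)
    # Keep only comments posted between the instruction and the window's end.
    burst = [t for t in comment_times
             if instruction_time <= t <= instruction_time + window]
    return len(burst) >= threshold, burst

# Hypothetical timestamps for illustration only.
instruction = datetime(2021, 5, 1, 12, 0)
comments = [instruction + timedelta(minutes=m) for m in (2, 2, 3, 45)]
flagged, burst = flag_swarm(instruction, comments)
# Three comments fall within ten minutes of the instruction, so flagged is True.
```

In practice the window and threshold would be tuned per platform, and the instruction timestamp would come from monitoring the coordinating group rather than the target.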
| Counters | Response types |
| --- | --- |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW