Mirror of https://github.com/DISARMFoundation/DISARMframeworks.git (synced 2025-02-13 22:01:27 -05:00)
Put back BELOW THE LINE etc. since pages were being duplicated without it
parent ba341c9379
commit eb80ad2560
@@ -589,7 +589,7 @@ class Disarm:
         print('Updating {}'.format(datafile))
         with open(datafile, 'w') as f:
             f.write(metatext)
-            #f.write(warntext)
+            f.write(warntext)
             f.write(usertext)
             f.close()
         return
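For context on what this one-line change does: the generator writes each page as regenerated content (metatext), then the warning line (warntext), then any analyst notes below it (usertext). Per the commit message, pages were being duplicated when the warning line was not written back out. Below is a minimal sketch of that round trip, assuming the generator recovers the notes section by splitting existing pages on the warning text; the function and variable names are illustrative, not the repository's actual API.

# Sketch only: preserving analyst notes below the warning line across regeneration.
# Names (split_user_notes, write_page) are illustrative, not the repository's API.
import os

WARNTEXT = 'DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW\n'

def split_user_notes(datafile):
    """Return any notes found below the warning line in an existing page."""
    if not os.path.exists(datafile):
        return ''
    with open(datafile) as f:
        text = f.read()
    # Everything after the warning line was added by hand and must survive regeneration.
    _, _, usertext = text.partition(WARNTEXT)
    return usertext

def write_page(datafile, metatext):
    usertext = split_user_notes(datafile)
    print('Updating {}'.format(datafile))
    with open(datafile, 'w') as f:
        f.write(metatext)   # regenerated content goes above the line
        f.write(WARNTEXT)   # the marker line; this is the write the commit restores
        f.write(usertext)   # hand-written notes are appended back below the line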
@@ -29,3 +29,4 @@
| [T0150.005 Compromised Asset](../../generated_pages/techniques/T0150.005.md) | <i>“The August 17 [2022] Telegram post [which contained a falsified letter from the Ukrainian Minister of Foreign Affairs asking Poland to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII] also contained screenshots of Facebook posts that appeared on two Facebook accounts belonging to Polish nationals Piotr Górka, an expert in the history of the Polish Air Force, and Dariusz Walusiak, a Polish historian and documentary maker. The Górka post suggested that he fully supported the Polish government’s decision to change Belwederska Street to Stepan Bandera Street.<br><br> “In a statement to the DFRLab, Górka said his account was accessed without his consent. “This is not my post loaded to my Facebook page,” he explained. “My site was hacked, some days ago.” At the time of publishing, Piotr Górka’s post and his Facebook account were no longer accessible.<br><br> “The post on Górka’s Facebook page was shared by Dariusz Walusiak’s Facebook account; the account also reposted it on the Facebook walls of more than twenty other Facebook users, including Adam Kalita, currently working at Krakow branch of the Institute of National Remembrance; Jan Kasprzyk, head of the Office for War Veterans and Victims of Oppression; and Alicja Kondraciuk, a Polish public figure living in Krakow.<br><br> “Walusiak’s Facebook account is also no longer accessible. Given his work on Polish history and identity, it seems highly unlikely he would support the Bandera measure; the DFRLab has also reached out to him for comment.<br><br> “The fact that Joker DPR’s Telegram post included screenshots of their Facebook posts raises the strong possibility that both Facebook accounts were compromised, and that hackers planted false statements on their pages that would seem out of character for them in order to gain further attention to the forged documents.”</I><br><br> In this example, threat actors used compromised accounts of Polish historians who have enough relevant knowledge to plausibly weigh in on the forged letter’s narrative (T0143.003: Impersonated Persona, T0097.101: Local Persona, T0097.108: Expert Persona, T0146: Account Asset, T0150.005: Compromised Asset, T0151.001: Social Media Platform). <br><br> This matches T0097.108: Expert Persona because the impersonation exploited Górka and Walusiak’s existing personas as experts in Polish history. |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -30,3 +30,4 @@
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | <i>“The Black Matters Facebook Page [operated by Russia’s Internet Research Agency] explored several visual brand identities, moving from a plain logo to a gothic typeface on Jan 19th, 2016. On February 4th, 2016, the person who ran the Facebook Page announced the launch of the website, blackmattersus[.]com, emphasizing media distrust and a desire to build Black independent media; [“I DIDN’T BELIEVE THE MEDIA / SO I BECAME ONE”]”</i><br><br> In this example an asset controlled by Russia’s Internet Research Agency began to present itself as a source of “Black independent media”, claiming that the media could not be trusted (T0097.208: Social Cause Persona, T0097.202: News Outlet Persona, T0143.002: Fabricated Persona). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -30,3 +30,4 @@
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying their previous posts (T0084.002: Plagiarise Content). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -26,3 +26,4 @@
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia was driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona nor T0143.002: Fabricated Persona is used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -30,3 +30,4 @@
| [T0143.001 Authentic Persona](../../generated_pages/techniques/T0143.001.md) | <i>“On January 4 [2017], a little-known news site based in Donetsk, Ukraine published an article claiming that the United States was sending 3,600 tanks to Europe as part of “the NATO war preparation against Russia”.<br><br> “Like much fake news, this story started with a grain of truth: the US was about to reinforce its armored units in Europe. However, the article converted literally thousands of other vehicles — including hundreds of Humvees and trailers — into tanks, building the US force into something 20 times more powerful than it actually was.<br><br> “The story caught on online. Within three days it had been repeated by a dozen websites in the United States, Canada and Europe, and shared some 40,000 times. It was translated into Norwegian; quoted, unchallenged, by Russian state news agency RIA Novosti; and spread among Russian-language websites.<br><br> “It was also an obvious fake, as any Google news search would have revealed. Yet despite its evident falsehood, it spread widely, and not just in directly Kremlin-run media. Tracking the spread of this fake therefore shines a light on the wider question of how fake stories are dispersed.”</i><br><br> Russian state news agency RIA Novosti presents itself as a news outlet (T0097.202: News Outlet Persona). RIA Novosti is a real news outlet (T0143.001: Authentic Persona), but it did not carry out the basic investigation into the veracity of the narrative it published that is implicitly expected of institutions presenting themselves as news outlets.<br><br> We can’t know how or why this narrative ended up being published by RIA Novosti, but we know that it presented a distorted reality as authentic information (T0023: Distort Facts), claiming that the US was sending 3,600 tanks, instead of 3,600 vehicles which included ~180 tanks. |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -27,3 +27,4 @@
| [T0145.007 Stock Image Account Imagery](../../generated_pages/techniques/T0145.007.md) | <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br>“Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler noun|linking verb|noun/verb/adjective|,” which appears to reveal the formula used to write Twitter bios for the accounts.”</I><br><br>This behaviour matches T0145.007: Stock Image Account Imagery because the account was identified as using a stock image as its profile picture. |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -31,3 +31,4 @@
| [T0145.006 Attractive Person Account Imagery](../../generated_pages/techniques/T0145.006.md) | <i>“In the wake of the Hamas attack on October 7th, the Israel Defense Forces (IDF) Information Security Department revealed a campaign of Instagram accounts impersonating young, attractive Israeli women who were actively engaging Israeli soldiers, attempting to extract information through direct messages.<br><br> [...]<br><br> “Some profiles underwent a reverse-image search of their photos to ascertain their authenticity. Many of the images searched were found to be appropriated from genuine social media profiles or sites such as Pinterest. When this was the case, the account was marked as confirmed to be inauthentic. One innovative method involves using photos that are initially frames from videos, which allows for evading reverse searches in most cases . This is seen in Figure 4, where an image uploaded by an inauthentic account was a screenshot taken from a TikTok video.”</i><br><br> In this example accounts associated with an influence operation used account imagery showing <i>“young, attractive Israeli women”</i> (T0145.006: Attractive Person Account Imagery), with some of these assets taken from existing accounts not associated with the operation (T0145.001: Copy Account Imagery). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -27,3 +27,4 @@ Alex Scroxton | ComputerWeekly | [https://web.archive.org/web/20240405154259/htt
| [T0152.006 Video Platform](../../generated_pages/techniques/T0152.006.md) | The Microsoft Threat Analysis Centre (MTAC) published a report documenting the use of AI by pro-Chinese threat actors:<br><br><i>On 13 January, Spamouflage [(a Pro-Chinese Communist Party actor)] posted audio clips to YouTube of independent candidate [for Taiwan’s Jan 2024 presidential election] Terry Gou – who also founded electronics giant Foxconn – in which Gou endorsed another candidate in the race. This clip was almost certainly AI-generated, and it was swiftly removed by YouTube. A fake letter purporting to be from Gou, endorsing the same candidate, had already circulated – Gou had of course made no such endorsement.</i><br><br>Here Spamouflage used an account on YouTube to post AI-generated audio impersonating an electoral candidate (T0146: Account Asset, T0152.006: Video Platform, T0115: Post Content, T0088.001: Develop AI-Generated Audio (Deepfakes), T0143.003: Impersonated Persona, T0097.110: Party Official Persona).<br><br><i>Spamouflage also exploited AI-powered video platform CapCut – which is owned by TikTok backers ByteDance – to generate fake news anchors which were used in a variety of campaigns targeting the various presidential candidates in Taiwan.</i><br><br>Spamouflage created accounts on CapCut, which it used to create AI-generated videos of fabricated news anchors (T0146: Account Asset, T0154.002: AI Media Platform, T0087.001: Develop AI-Generated Video (Deepfakes), T0143.002: Fabricated Persona, T0097.102: Journalist Persona). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -26,3 +26,4 @@
| [T0154.002 AI Media Platform](../../generated_pages/techniques/T0154.002.md) | <i>Last October, British writer Helen was alerted to a series of deepfakes on a porn site that appear to show her engaging in extreme acts of sexual violence. That night, the images replayed themselves over and over in horrific nightmares and she was gripped by an all-consuming feeling of dread. “It’s like you’re in a tunnel, going further and further into this enclosed space, where there’s no light,” she tells Vogue. This feeling pervaded Helen’s life. Whenever she left the house, she felt exposed. On runs, she experienced panic attacks. Helen still has no idea who did this to her.<br><br>[...]<br><br>Amnesty International has been investigating the effects of abuse against women on Twitter, specifically in relation to how they act online thereafter. According to the charity, abuse creates what they’ve called “the silencing effect” whereby women feel discouraged from participating online. The same can be said for victims of deepfakes.<br><br>Helen has never been afraid to use her voice, writing deeply personal accounts of postnatal depression. But the deepfakes created a feeling of shame so strong she thought she’d be carrying this “dirty secret” forever, and so she stopped writing.<br><br>[...]<br><br>Meanwhile, deepfake ‘communities’ are thriving. There are now dedicated sites, user-friendly apps and organised ‘request’ procedures. Some sites allow you to commission custom deepfakes for £25, while on others you can upload a woman’s image and a bot will strip her naked.<br><br>“This violation is not something that should be normalised,” says Gibi, an ASMR artist with 3.13 million YouTube subscribers. Gibi has given up trying to keep tabs on the deepfakes of her. For Gibi, the most egregious part of all of this is the fact that people are “profiting off my face, doing something that I didn’t consent to, like my suffering is your livelihood.” She’s even been approached by a company offering to remove the deepfakes — for £500 a video. This has to end. But how?</i><br><br>A website hosting pornographic content provided users the ability to create deepfake content (T0154.002: AI Media Platform, T0086.002: Develop AI-Generated Images (Deepfakes)). <br><br>Another website enabled users to commission custom deepfakes (T0152.004: Website Asset, T0148.004: Payment Processing Capability, T0086.002: Develop AI-Generated Images (Deepfakes), T0155.005: Paid Access Asset). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -26,3 +26,4 @@
| [T0151.011 Community Sub-Forum](../../generated_pages/techniques/T0151.011.md) | This report looks at changes in content posted to communities on Reddit (called Subreddits) after teams of volunteer moderators were replaced with what appear to be pro-Russian voices:<br><br><i>The r/antiwar subreddit appears to be a very recent takeover target. With 12,900 members it is not the largest community on Reddit, but this does place it squarely within the top 5% of all communities in terms of membership.<br><br>Three months ago a new moderator team was instated by subreddit head u/democracy101. Any posts documenting Russian aggression in Ukraine are now swiftly removed, while the board has been flooded with posts about how Ukraine is losing, or how American “neocons wrecked” the country.<br><br>The pinned post from moderator u/n0ahbody proclaims: “People who call for an end to Russian aggressions but not the Western aggressions Russia is reacting to don’t really want peace.” This user takes the view that any negative opinion about Russia is “shaped by what the fanatically Russophobic MSM wants you to think,” and that the United States is not threatened by its neighbors. Russia is.”<br><br>When u/n0ahbody took over the sub, the user posted a triumphant and vitriolic diatribe in another pro-Russia subreddit with some 33,500 members, r/EndlessWar. “We are making progress. We are purging the sub of all NAFO and NAFO-adjacent elements. Hundreds of them have been banned over the last 24 hours for various rule infractions, for being NAFO or NAFO-adjacent,” the user said, referencing the grassroots, pro-Ukrainian North Atlantic Fella Organization (NAFO) meme movement.<br><br>Several former users have reported they have indeed been banned from r/antiwar since the change in moderators. “If this subreddit cannot be explicitly against the invasion of Ukraine it will never truly be anti-war,” wrote one user Halcyon_Rein, in the antiwar subreddit on September 6. They then edited the post to say, “Edit: btw, I got f**king banned for this 💀💀💀”</i><br><br>A community hosted on Reddit was taken over by new moderators (T0151.011: Community Sub-Forum, T0150.005: Compromised Asset). These moderators removed content posted to the community which favoured Ukraine over Russia (T0146.004: Administrator Account Asset, T0151.011: Community Sub-Forum, T0124: Suppress Opposition). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -26,3 +26,4 @@
| [T0151.012 Image Board Platform](../../generated_pages/techniques/T0151.012.md) | <i>On April 27, 2019, at around 11:30 a.m. local time, a young man with a semi-automatic rifle walked into the Chabad of Poway Synagogue in Poway, California. He opened fire, killing one worshipper and wounding three others. In the hours since the shooting, a manifesto, believed to be written by the shooter, began circulating online. Evidence has also surfaced that, like the Christchurch Mosque shooter, this killer began his rampage with a post on 8chan’s /pol/ board.<br><br>Although both of these attacks may seem different, since they targeted worshippers of different faiths, both shooters were united by the same fascist ideology. They were also both radicalized in the same place: 8chan’s /pol/ board.<br><br>This has been corroborated by posts on the board itself, where “anons,” as the posters call themselves, recirculated the shooter’s since-deleted post. In it, the alleged shooter claims to have been “lurking” on the site for a year and a half. He includes a link to a livestream of his rampage — which thankfully does not appear to have worked — and he also includes a pastebin link to his manifesto.<br><br>The very first response to his announcement was another anon cheering him on and telling him to “get the high score,” AKA, kill a huge number of people.</i><br><br>Before carrying out a mass shooting, the shooter posted a thread to 8chan’s /pol/ board. The post directed users to a variety of different platforms (T0146.006: Open Access Platform, T0151.012: Image Board Platform, T0115: Post Content, T0122: Direct Users to Alternative Platforms); a Facebook account on which the shooter attempted to livestream the shooting (T0146: Account Asset, T0151.001: Social Media Platform); and a manifesto they had written hosted on pastebin (T0146.006: Open Access Platform, T0152.005: Paste Platform, T0115: Post Content) and uploaded to the file sharing platform Mediafire (T0152.010: File Hosting Platform, T0085.004: Develop Document).<br><br>The report looks deeper into 8chan’s /pol/ board:<br><br><i>8chan is a large website, which includes a number of different discussion groups about everything from anime to left-wing politics. /pol/ is one particularly active board on the website, and it is best described as a gathering place for extremely online neo-Nazis.<br><br>[...]<br><br>I’ve browsed /pol/ on an almost daily basis since the Christchurch shooting. It has not been difficult to find calls for violence. On Monday, March 25 of this year, I ran across evidence of anons translating the Christchurch shooter’s manifesto into other languages in an attempt to inspire more shootings across the globe.<br><br>This tactic can work, and today’s shooting is proof. The Poway Synagogue shooter directly cited the Christchurch shooter as his inspiration, saying he decided to carry out his attack roughly two weeks after that shooting. On /pol/, many anons refer to the Christchurch shooter, Brenton Tarrant, as “Saint Tarrant,” complete with medieval-inspired iconography.</i><br><br>Manifestos posted to 8chan are translated and reshared by other platform users (T0101: Create Localised Content, T0146.006: Open Access Platform, T0151.012: Image Board Platform, T0115: Post Content, T0084.004: Appropriate Content).<br><br><i>When I began looking through /pol/ right after the Poway Synagogue shooting, I came across several claims that the shootings had been a “false flag” aimed at making the message board look bad.<br><br>When Bellingcat tweeted out a warning about shitposting and the shooter’s manifesto, in the immediate wake of the attack, probable anons even commented on the Tweet in an attempt to deny that a channer had been behind the attack.<br><br>This is a recognizable pattern that occurs in the wake of any crimes committed by members of the board. While the initial response to the Christchurch shooter’s massacre thread was riotous glee, in the days after the shooting many anons began to claim the attack had been a false flag. This actually sparked significant division and debate between the members of /pol/. In the below image, a user mocks other anons for being unable to “believe something in your favor is real.” Another anon responds, “As the evidence comes out, its [sic] quite clear that this was a false flag.”<br><br>In his manifesto, the Poway Synagogue shooter even weighed in on this debate, accusing other anons who called the Christchurch and Tree of Life Synagogue shootings “false flags” to merely have been scared: “They can’t fathom that there are brave White men alive who have the willpower and courage it takes to say, ‘Fuck my life—I’m willing to sacrifice everything for the benefit of my race.’”</i><br><br>Platform users deny that their platform has been used by mass shooters to publish their manifestos (T0129.006: Deny Involvement). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -26,3 +26,4 @@
| [T0154.002 AI Media Platform](../../generated_pages/techniques/T0154.002.md) | <i>“I seriously don't understand why I have to constantly put up with these dumbasses here every day.”<br><br>So began what appeared to be a long tirade from the principal of Pikesville High School, punctuated with racist, antisemitic and offensive tropes. It sounded like it had been secretly recorded.<br><br>The speaker went on to bemoan “ungrateful black kids” and Jewish people in the community.<br><br>The clip, first posted in [January 2024], went viral nationally. But it really struck a nerve in the peaceful, leafy suburb of Pikesville, which has large black and Jewish populations, and in the nearby city of Baltimore, Maryland. Principal Eric Eiswert was put on paid administrative leave pending an investigation.<br><br>[...]<br><br>But what those sharing the clip didn’t realise at the time was that another bombshell was about to drop: the clip was an AI-generated fake.<br><br>[...]<br><br>[In April 2024], Baltimore Police Chief Robert McCullough confirmed they now had “conclusive evidence that the recording was not authentic”.<br><br>And they believed they knew who made the fake.<br><br>Police charged 31-year-old Dazhon Darien, the school’s athletics director, with several counts related to the fake video. Charges included theft, retaliating against a witness and stalking.<br><br>He was arrested at the airport, where police say he was planning to fly to Houston, Texas.<br><br>Police say that Mr Darien had been under investigation by Principal Eiswert over an alleged theft of $1,916 (£1,460) from the school. They also allege there had been “work performance challenges” and his contract was likely not to be renewed.<br><br>Their theory was that by creating the deepfake recording, he hoped to discredit the principal before he could be fired.<br><br>Investigators say they traced an email used to send the original video to a server connected to Mr Darien, and allege that he used Baltimore County Public Schools' computer network to access AI tools. He is due to stand trial in December 2024.</i><br><br>By associating Mr Darien with the server used to email the original AI-generated audio, investigators link Darien to the fabricated content (T0149.005: Server Asset, T0088.001: Develop AI-Generated Audio (Deepfakes)). They also assert that Darien used computers owned by the school to access platforms used to generate the audio (T0146: Account Asset, T0154.002: AI Media Platform). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -26,3 +26,4 @@
| [T0152.005 Paste Platform](../../generated_pages/techniques/T0152.005.md) | <i>A massive trove of documents purporting to contain thousands of emails and other files from the [2017 presidential] campaign of Emmanuel Macron—the French centrist candidate squaring off against right-wing nationalist Marine Le Pen—was posted on the internet Friday afternoon. The Macron campaign says that at least some of the documents are fake. The document dump came just over a day before voting is set to begin in the final round of the election and mere hours before candidates are legally required to stop campaigning.<br><br>At about 2:35 p.m. ET, a post appeared on the 4chan online message board announcing the leak. The documents appear to include emails, internal memos, and screenshots of purported banking records.<br><br>“In this pastebin are links to torrents of emails between Macron, his team and other officials, politicians as well as original documents and photos,” the anonymous 4chan poster wrote. “This was passed on to me today so now I am giving it to you, the people. The leak is massvie and released in the hopes that the human search engine here will be able to start sifting through the contents and figure out exactly what we have here.”<br><br>The Macron campaign issued a statement Friday night saying it was the victim of a “massive and coordinated” hacking attack. That campaign said the leak included some fake documents that were intended “to sow doubt and misinformation.”</i><br><br>Actors posted to 4chan a link (T0151.012: Image Board Platform, T0146.006: Open Access Platform, T0115: Post Content, T0122: Direct Users to Alternative Platforms) to text content hosted on pastebin (T0152.005: Paste Platform, T0146.006: Open Access Platform, T0115: Post Content), which contained links to download stolen and fabricated documents. |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -26,3 +26,4 @@
| [T0151.003 Online Community Page](../../generated_pages/techniques/T0151.003.md) | <i>As families desperately seek to find missing loved ones and communities grapple with immeasurable losses of both life and property in the wake of [2024’s] Hurricane Helene, AI slop scammers appear to be capitalizing on the moment for personal gain.<br><br>A Facebook account called "Coastal Views" usually shares calmer AI imagery of nature-filled beachside scenes. The account's banner image showcases a signpost reading "OBX Live," OBX being shorthand for North Carolina's Outer Banks islands.<br><br>But starting this weekend, the account shifted its approach dramatically, as first flagged by a social media user on X.<br><br>Instead of posting "photos" of leaping dolphins and sandy beaches, the account suddenly started publishing images of flooded mountain neighborhoods, submerged houses, and dogs sitting on top of roofs.<br><br>But instead of spreading vital information to those affected by the natural disaster, or at the very least sharing real photos of the destruction, the account is seemingly trying to use AI to cash in on all the attention the hurricane has been getting.<br><br>The account links to an Etsy page for a business called" OuterBanks2023," where somebody who goes by "Alexandr" sells AI-generated prints of horses touching snouts with sea turtles, Santa running down the shoreline with a reindeer, and sunsets over ocean waves.</i><br><br>A Facebook page which presented itself as being associated with North Carolina, and which had posted AI-generated imagery of beachside scenes, changed to posting AI-generated images of hurricane damage after Hurricane Helene hit North Carolina (T0151.003: Online Community Page, T0151.001: Social Media Platform, T0115: Post Content, T0086.002: Develop AI-Generated Images (Deepfakes), T0068: Respond to Breaking News Event or Active Crisis). <br><br>The account included links (T0122: Direct Users to Alternative Platforms) to an account on Etsy, which sold prints of AI-generated images (T0146: Account Asset, T0148.007: eCommerce Platform). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -26,3 +26,4 @@
| [T0152.003 Website Hosting Platform](../../generated_pages/techniques/T0152.003.md) | <i>The Moscow firm Social Design Agency (SDA) has been attributed as being behind a Russian disinformation project known as Doppelganger:<br><br>The SDA’s deception work first surfaced in 2022, likely almost immediately after Doppelganger got off the ground. In April of that year, Meta, the parent company of Facebook and Instagram, disclosed in a quarterly report that it had removed from its platforms “a network of about 200 accounts operated from Russia.” By August 2022, German investigative journalists revealed that they had discovered forgeries of about 30 news sites, including many of the country’s biggest media outlets—Frankfurter Allgemeine, Der Spiegel, and Bild—but also Britain’s Daily Mail and France’s 20 Minutes. The sites had deceptive URLs such as www-dailymail-co-uk.dailymail.top. </i><br><br>As part of the SDA’s work, they created many websites which impersonated existing media outlets. Sites used domain impersonation tactics to increase perceived legitimacy of their impersonations (T0097.202: News Outlet Persona, T0143.003: Impersonated Persona, T0152.003: Website Hosting Platform, T0149.003: Lookalike Domain). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -26,3 +26,4 @@
| [T0153.006 Content Recommendation Algorithm](../../generated_pages/techniques/T0153.006.md) | <i>This article examines the white nationalist group Suavelos’ use of Facebook to draw visitors to its website without overtly revealing their racist ideology:<br><br>Suavelos uses Facebook and other platforms to amplify its message. In order to bypass the platforms’ community standards and keep their public pages active, Facebook pages such as “I support the police” are a good vehicle to spread a specific agenda without claiming to be racist. In looking back at this Facebook page, we followed Facebook’s algorithm for related pages and found suggested Facebook pages<br><br>[...]<br><br>This amplification strategy on Facebook is successful, as according to SimilarWeb figures, it attracts around 111,000 visits every month on the Suavelos.eu website.<br><br>[...]<br><br>Revenue through online advertisements can be achieved by different platforms through targeted advertisements, like Google Adsense or Doubleclick, or related and similar sponsored content, such as Taboola. Accordingly, Suavelos.eu uses both of these websites to display advertisements and consequently receives funding from such advertisements.<br><br>Once visitors are on the website supporting its advertisement revenue, Suavelos’ goal is to then turn these visitors into regular members of Suavelos network through donations or fees, or have them continue to support Suavelos. </i><br><br>Suavelos created a variety of pages on Facebook which presented as centring on prosocial causes. Facebook’s algorithm helped direct users to these pages (T0092: Build Network, T0151.001: Social Media Platform, T0153.006: Content Recommendation Algorithm, T0151.003: Online Community Page, T0097.208: Social Cause Persona).<br><br>Suavelos used these pages to generate traffic for their WordPress site (T0122: Direct Users to Alternative Platforms, T0152.003: Website Hosting Platform, T0152.004: Website Asset), which used accounts on a variety of online advertising platforms to host adverts (T0146: Account Asset, T0153.005: Online Advertising Platform). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -29,3 +29,4 @@
| [T0152.004 Website Asset](../../generated_pages/techniques/T0152.004.md) | This report examines the white nationalist group Suavelos’ use of Facebook to draw visitors to its website without overtly revealing their racist ideology. This section of the report looks at the Suavelos website, and the content it links out to.<br><br><i>In going back to Suavelos’ main page, we also found: A link to a page on a web shop: alabastro.eu; A link to a page to donate money to the founders through Tipee and to the website through PayPal; [and] a link to a private forum that gathers 3.000 members: oppidum.suavelos.eu;</i><br><br>Suavelos linked out to an online store which it controlled (T0152.004: Website Asset, T0148.004: Payment Processing Capability), and to accounts on payment processing platforms PayPal and Tipee (T0146: Account Asset, T0148.003: Payment Processing Platform). <br><br>The Suavelos website also hosted a private forum (T0151.009: Legacy Online Forum Platform, T0155: Gated Asset), and linked out to a variety of assets it controlled on other online platforms: accounts on Twitter (T0146: Account Asset, T0151.008: Microblogging Platform), YouTube (T0146: Account Asset, T0152.006: Video Platform), Instagram and VKontakte (T0146: Account Asset, T0151.001: Social Media Platform). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -29,3 +29,4 @@
| [T0152.012 Subscription Service Platform](../../generated_pages/techniques/T0152.012.md) | The EU Disinfo Lab produced a report into disinformation published on crowdfunding platforms:<br><br><i>More worrisome is the direct monetisation of disinformation happening on crowdfunding platforms: on Kickstarter, we found a user openly raising money for a documentary project suggesting that COVID-19 is a conspiracy.</i><br><br>A Kickstarter user attempted to use the platform to fund production of a documentary (T0017: Conduct Fundraising, T0087: Develop Video-Based Content, T0146: Account Asset, T0148.006: Crowdfunding Platform).<br><br><i>On Patreon, we found several instances of direct monetisation of COVID-19 disinformation, including posts promoting a device allegedly protecting against COVID-19 and 5G, as well as posts related to the “Plandemic” conspiracy video, which gained attention on YouTube before being removed by the platform.<br><br>We also found an account called “Stranger than fiction” entirely dedicated to disinformation, which openly states that their content was “Banned by screwtube and fakebook, our videos have been viewed over a billion times.”</i><br><br>The “Stranger than fiction” user presented itself as an alternative news source which had been banned from other platforms (T0146: Account Asset, T0097.202: News Outlet Persona, T0121.001: Bypass Content Blocking, T0152.012: Subscription Service Platform).<br><br><i>On the US-based crowdfunding platform IndieGogo, EU DisinfoLab found a successful crowdfunding campaign of €133.903 for a book called Revolution Q. This book, now also available on Amazon, claims to be “Written for both newcomers and long-time QAnon fans alike, this book is a treasure-trove of information designed to help everyone weather The Storm.”</i><br><br>An IndieGogo account was used to gather funds to produce a book on QAnon (T0017: Conduct Fundraising, T0085.005: Develop Book, T0146: Account Asset, T0148.006: Crowdfunding Platform), with the book later sold on Amazon marketplace (T0148.007: eCommerce Platform). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -28,3 +28,4 @@
| [T0152.012 Subscription Service Platform](../../generated_pages/techniques/T0152.012.md) | In this article VICE News discusses a report produced by Advance Democracy on people who use Patreon to spread the false claim that an impending ice age will reverse the harms of the ongoing climate crisis:<br><br><i>“The spread of climate misinformation is prolific on social media, as well as on sites like Patreon, where users are actually financially compensated through the platform for spreading falsehoods,” Daniel Jones, president of Advance Democracy, told VICE News.<br><br>“Companies hosting and promoting climate misinformation have a responsibility to take action to reduce dangerous misinformation, as falsehoods about climate science are every bit as dangerous as lies about vaccinations and disinformation about our elections.”<br><br>Patreon did not respond to VICE News’ request for comment on the report’s findings.<br><br>One of the biggest accounts spreading climate conspiracies is ADAPT 2030, which is run by David DuByne, who has 1,100 followers on Patreon. He is currently making over $3,500 every month from his subscribers.<br><br>[The science DuByne relies on does not support his hypothesis. However,] this has not prevented DuByne and many others from preying on people’s fears about climate change to spread conspiracies about an impending ice age, which they say will miraculously fix all of earth’s climate problems.<br><br>DuByne offers seven different membership levels for supporters, beginning at just $1 per month.<br><br>The most expensive costs $100 a month, and gives patrons “a private 20-minute call with David DuByne once per month, to discuss your particular preparedness issues or concerns.” So far just two people are paying this amount.<br><br>The researchers also found at least eight other accounts on Patreon that have spread climate change conspiracy theories as part of wider conspiracy sharing, including baseless claims about COVID-19 and the legitimacy of Joe Biden’s presidency. Some of these accounts are earning over $600 per month.</i><br><br>David DuByne created an account on Patreon, which he uses to post text, videos, and podcasts for his subscribers to discuss (T0085: Develop Text-Based Content, T0087: Develop Video-Based Content, T0088: Develop Audio-Based Content, T0146: Account Asset, T0115: Post Content, T0152.012: Subscription Service Platform, T0151.014: Comments Section, T0155.006: Subscription Access Asset). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -32,3 +32,4 @@
| [T0151.008 Microblogging Platform](../../generated_pages/techniques/T0151.008.md) | Researchers at Mozilla examined influence operations targeting Kenyan citizens on Twitter in 2021, providing “a grim window into the booming and shadowy industry of Twitter influencers for political hire here in Kenya”, and giving insight into how the operations were run:<br><br><i>In our interviews with one of the influencers, they informed us of the agile tactics they use to organize and avoid detection. For example, when it’s time to carry out the campaign the influencers would be added to a Whatsapp group. Here, they received direction about what to post, the hashtags to use, which tweets to engage with and who to target. Synchronizing the tweets was also incredibly important for them. It’s what enables them to achieve their goal of trending on Twitter and gain amplification.<br><br>[...]<br><br>They revealed to us that those participating in the exercise are paid roughly between $10 and $15 to participate in three campaigns per day. Each campaign execution involves tweeting about the hashtags of the day until it appears on the trending section of Twitter. Additionally, some individuals have managed to reach retainer level and get paid about $250 per month. Their job is to make sure the campaigns are executed on a day-by-day basis with different hashtags.</i><br><br>An M-PESA account (T0148.002: Bank Account Asset, T0148.001: Online Banking Platform) was used to pay campaign participants.<br><br>Participants were organised in WhatsApp groups (T0129.005: Coordinate on Encrypted/Closed Networks, T0151.007: Chat Broadcast Group, T0151.004: Chat Platform), in which they planned how to get campaign content trending on Twitter (T0121: Manipulate Platform Algorithm, T0151.008: Microblogging Platform). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
File diff suppressed because one or more lines are too long
@@ -29,3 +29,4 @@
| [T0150.001 Newly Created Asset](../../generated_pages/techniques/T0150.001.md) | <i>Consumers who complain of poor customer service on X are being targeted by scammers after the social media platform formerly known as Twitter changed its account verification process.<br><br>Bank customers and airline passengers are among those at risk of phishing scams when they complain to companies via X. Fraudsters, masquerading as customer service agents, respond under fake X handles and trick victims into disclosing their bank details to get a promised refund.<br><br>They typically win the trust of victims by displaying the blue checkmark icon, which until this year denoted accounts that had been officially verified by X.<br><br>Changes introduced this year allow the icon to be bought by anyone who pays an £11 monthly fee for the site’s subscription service, renamed this month from Twitter Blue to X Premium. Businesses that pay £950 a month receive a gold tick. X’s terms and conditions do not state whether subscriber accounts are pre-vetted.<br><br>Andrew Thomas was contacted by a scam account after posting a complaint to the travel platform Booking.com. “I’d been trying since April to get a refund after our holiday flights were cancelled and finally resorted to X,” he said.<br><br>“I received a response asking me to follow them, and DM [direct message] them with a contact number. They then called me via WhatsApp asking for my reference number so they could investigate. Later they called back to say that I would be refunded via their payment partner for which I’d need to download an app.”<br><br>Thomas became suspicious and checked the X profile. “It looked like the real thing, but I noticed that there was an unexpected hyphen in the Twitter handle and that it had only joined X in July 2023,” he said.</i><br><br>In this example a paid account was newly created on X and used to direct users to other platforms (T0146.002: Paid Account Asset, T0146.003: Verified Account Asset, T0146.005: Lookalike Account ID, T0097.205: Business Persona, T0122: Direct Users to Alternative Platforms, T0143.003: Impersonated Persona, T0151.008: Microblogging Platform, T0150.001: Newly Created Asset). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -28,3 +28,4 @@
| [T0151.009 Legacy Online Forum Platform](../../generated_pages/techniques/T0151.009.md) | <i>In an effort to prove that the developers behind a popular multiplayer vehicle combat game had made a mistake, a player went ahead and published classified British military documents about one of the real-life tanks featured in the game.<br><br>This truly bizarre turn of events recently occurred in the public forum for War Thunder, a free-to-player multiplayer combat sim featuring modern land, air, and sea craft. Getting a small detail wrong on a piece of equipment might not be a big deal for the average gamer, but for the War Thunder crowd it sure as hell is. With 25,000 devoted players, the game very much bills itself as the military vehicle combat simulator.<br><br>A player, who identified himself as a British tank commander, claimed that the game’s developers at Gaijin Entertainment had inaccurately represented the Challenger 2 main battle tank used by the British military.<br><br>The self-described tank commander’s bio listed his location as Tidworth Camp in Wiltshire, England, according to the UK Defense Journal, which reported that the base is home to the Royal Tank Regiment, which fields Challenger 2 tanks.<br><br>The player, who went by the handle Pyrophoric, reportedly shared an image on the War Thunder forum of the tank’s specs that were pulled from the Challenger 2’s Army Equipment Support Publication, which is essentially a technical manual. <br><br>[...]<br><br>A moderator for the forum, who’s handle is “Templar_”, explained that the developer had removed the material after they received confirmation from the Ministry of Defense that the document is still in fact classified.</i><br><br>A user of War Thunder’s forums posted confidential documents to win an argument (T0089.001: Obtain Authentic Documents, T0146: Account Asset, T0097.105: Military Personnel Persona, T0115: Post Content, T0143.001: Authentic Persona, T0151.009: Legacy Online Forum Platform). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -29,3 +29,4 @@
| [T0153.001 Email Platform](../../generated_pages/techniques/T0153.001.md) | <i>An American journalist who runs an independent newsletter published a document [on 26 Sep 2024] that appears to have been stolen from Donald Trump’s presidential campaign — the first public posting of a file that is believed to be part of a dossier that federal officials say is part of an Iranian effort to manipulate the [2024] U.S. election.<br><br>The PDF document is a 271-page opposition research file on former President Donald Trump’s running mate, Sen. JD Vance, R-Ohio.<br><br>For more than two months, hackers who the U.S. says are tied to Iran have tried to persuade the American media to cover files they stole. No outlets took the bait.<br><br>But on Thursday, reporter Ken Klippenstein, who self-publishes on Substack after he left The Intercept this year, published one of the files.<br><br>[...]<br><br>Reporters who have received the documents describe the same pattern: An AOL account emails them files, signed by a person using the name “Robert,” who is reluctant to speak to their identity or reasons for wanting the documents to receive coverage.<br><br>NBC News was not part of the Robert persona’s direct outreach, but it has viewed its correspondence with a reporter at another publication.<br><br> One of the emails from the Robert persona previously viewed by NBC News included three large PDF files, each corresponding to Trump’s three reported finalists for vice president. The Vance file appears to be the one Klippenstein hosts on his site.</i><br><br>In this example hackers attributed to Iran used the Robert persona to email journalists hacked documents (T0146: Account Asset, T0097.100: Individual Persona, T0153.001: Email Platform).<br><br>The journalist Ken Klippenstein used his existing blog on Substack to host a link to download the document (T0089: Obtain Private Documents, T0097.102: Journalist Persona, T0115: Post Content, T0143.001: Authentic Persona, T0152.001: Blogging Platform, T0152.002: Blog Asset, T0150.003: Pre-Existing Asset). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -27,3 +27,4 @@
| [T0151.008 Microblogging Platform](../../generated_pages/techniques/T0151.008.md) | Ahead of the 2019 UK Election during a leader’s debate, the Conservative party rebranded their “Conservative Campaign Headquarters Press” account to “FactCheckUK”:<br><br><i>The evening of the 19th November 2019 saw the first of three Leaders’ Debates on ITV, starting at 8pm and lasting for an hour. Current Prime Minister and leader of the Conservatives, Boris Johnson faced off against Labour party leader, Jeremy Corbyn. Plenty of people will have been watching the debate live, but a good proportion were “watching” (er, “twitching”?) via Twitter. This is something I’ve done in the past for certain shows. In some cases I just can’t watch or listen, but I can read, and in other cases, the commentary is far more interesting and entertaining than the show itself will ever be. This, for me, is just such a case. But very quickly, all eyes turned upon a modestly sized account with the handle @CCHQPress. That’s short for Conservative Campaign Headquarters Press. According to their (current!) Twitter bio, they are based in Westminster and they provide “snippets of news and commentary from CCHQ” to their 75k followers.<br><br>That is, until a few minutes into the debate.<br><br>All at once, like a person throwing off their street clothes to reveal some sinister new identity underneath, @CCHQPress abruptly shed its name, blue Conservative logo, Boris Johnson banner, and bio description. Moments later, it had entirely reinvented itself.<br><br>The purple banner was emblazoned with white font that read “✓ factcheckUK [with a “FROM CCQH” subheading]”.<br><br>The matching profile picture was a white tick in a purple circle. The bio was updated to: “Fact checking Labour from CCHQ”. And the name now read factcheckUK, with the customary Twitter blue (or white depending on your phone settings!) validation tick still after it</i><br><br>In this example an existing verified social media account on Twitter was repurposed to inauthentically present itself as a Fact Checking service (T0151.008: Microblogging Platform, T0150.003: Pre-Existing Asset, T0146.003: Verified Account Asset, T0097.203: Fact Checking Organisation Persona, T0143.002: Fabricated Persona). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -28,3 +28,4 @@
| [T0151.006 Chat Room](../../generated_pages/techniques/T0151.006.md) | Discord is an example of a T0151.004: Chat Platform, which allows users to create their own T0151.005: Chat Community Server. The Institute for Strategic Dialog (ISD) conducted an investigation into the extreme right’s usage of Discord servers:<br><br><i>Discord is a free service accessible via phones and computers. It allows users to talk to each other in real time via voice, text or video chat and emerged in 2015 as a platform designed to assist gamers in communicating with each other while playing video games. The popularity of the platform has surged in recent years, and it is currently estimated to have 140 million monthly active users.<br><br>Chatrooms – known as servers - in the platform can be created by anyone, and they are used for a range of purposes that extend far beyond gaming. Such purposes include the discussion of extreme right-wing ideologies and the planning of offline extremist activity. Ahead of the far-right Unite the Right rally in Charlottesville, Virginia, in August 2017, organisers used Discord to plan and promote events and posted swastikas and praised Hitler in chat rooms with names like “National Socialist Army” and “Führer’s Gas Chamber”.</i><br><br>In this example a Discord server was used to organise the 2017 Charlottesville Unite the Right rally. Chat rooms in the server were used to discuss different topics related to the rally (T0057: Organise Events, T0126.002: Facilitate Logistics or Support for Attendance, T0151.004: Chat Platform, T0151.005: Chat Community Server, T0151.006: Chat Room).<br><br><i>Another primary activity engaged in the servers analysed are raids against other servers associated with political opponents, and in particular those that appear to be pro-LGBTQ. Raids are a phenomenon in which a small group of users will join a Discord server with the sole purpose of spamming the host with offensive or incendiary messages and content with the aim of upsetting local users or having the host server banned by Discord. On two servers examined here, raiding was their primary function.<br><br>Among servers devoted to this activity, specific channels were often created to host links to servers that users were then encouraged to raid. Users are encouraged to be as offensive as possible with the aim of upsetting or angering users on the raided server, and channels often had content banks of offensive memes and content to be shared on raided servers.<br><br>The use of raids demonstrates the gamified nature of extremist activity on Discord, where use of the platform and harassment of political opponents is itself turned into a type of real-life video game designed to strengthen in-group affiliation. This combined with the broader extremist activity identified in these channels suggests that the combative activity of raiding could provide a pathway for younger people to become more engaged with extremist activity.</i><br><br>Discord servers were used by members of the extreme right to coordinate harassment of targeted communities (T0048: Harass, T0049.005: Conduct Swarming, T0151.004: Chat Platform, T0151.005: Chat Community Server). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -31,3 +31,4 @@
| [T0152.009 Software Delivery Platform](../../generated_pages/techniques/T0152.009.md) | ISD conducted an investigation into the usage of social groups on Steam. Steam is an online platform used to buy and sell digital games, and includes the Steam community feature, which “allows users to find friends and join groups and discussion forums, while also offering in-game voice and text chat”. Actors have used Steam’s social capabilities to enable online harm campaigns:<br><br><i>A number of groups were observed encouraging members to join conversations on outside platforms. These include links to Telegram channels connected to white supremacist marches, and media outlets, forums and Discord servers run by neo-Nazis. <br><br>[...]<br><br>This off-ramping activity demonstrates how rather than sitting in isolation, Steam fits into the wider extreme right wing online ecosystem, with Steam groups acting as hubs for communities and organizations which span multiple platforms. Accordingly, although the platform appears to fill a specific role in the building and strengthening of communities with similar hobbies and interests, it is suggested that analysis seeking to determine the risk of these communities should focus on their activity across platforms</i><br><br>Social Groups on Steam were used to drive new people to other neo-Nazi controlled community assets (T0122: Direct Users to Alternative Platforms, T0152.009: Software Delivery Platform, T0151.002: Online Community Group). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -27,3 +27,4 @@
| [T0151.008 Microblogging Platform](../../generated_pages/techniques/T0151.008.md) | In 2014 threat actors attributed to Russia spread the false narrative that a local chemical plant had leaked toxic fumes. This report discusses aspects of the operation:<br><br><i>[The chemical plant leak] hoax was just one in a wave of similar attacks during the second half of last year. On Dec. 13, two months after a handful of Ebola cases in the United States touched off a minor media panic, many of the same Twitter accounts used to spread the Columbian Chemicals hoax began to post about an outbreak of Ebola in Atlanta. [...] Again, the attention to detail was remarkable, suggesting a tremendous amount of effort. A YouTube video showed a team of hazmat-suited medical workers transporting a victim from the airport. Beyoncé’s recent single “7/11” played in the background, an apparent attempt to establish the video’s contemporaneity. A truck in the parking lot sported the logo of the Hartsfield-Jackson Atlanta International Airport.</i><br><br>Accounts which previously presented as Louisiana locals were repurposed for use in a different campaign, this time presenting as locals of Atlanta, a place over 500 miles away from Louisiana and in a different timezone (T0146: Account Asset, T0097.101: Local Persona, T0143.002: Fabricated Persona, T0151.008: Microblogging Platform, T0150.004: Repurposed Asset). <br><br>A video was created which appeared to support the campaign’s narrative (T0087: Develop Video-Based Content), with great attention given to small details which made the video appear more legitimate. |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -28,3 +28,4 @@
| [T0149.002 Email Domain Asset](../../generated_pages/techniques/T0149.002.md) | <i>The target of the recently observed [highly targeted spearphishing attack by “Charming Kitten”, a hacker group attributed to Iran] had published an article related to Iran. The publicity appears to have garnered the attention of Charming Kitten, who subsequently created an email address to impersonate a reporter of an Israeli media organization in order to send the target an email. Prior to sending malware to the target, the attacker simply asked if the target would be open to reviewing a document they had written related to US foreign policy. The target agreed to do so, since this was not an unusual request; they are frequently asked by journalists to review opinion pieces relating to their field of work.<br><br>In an effort to further gain the target’s confidence, Charming Kitten continued the interaction with another benign email containing a list of questions, to which the target then responded with answers. After multiple days of benign and seemingly legitimate interaction, Charming Kitten finally sent a “draft report”; this was the first time anything opaquely malicious occurred. The “draft report” was, in fact, a password-protected RAR file containing a malicious LNK file. The password for the RAR file was provided in a subsequent email.</i><br><br>In this example, threat actors created an email address on a domain which impersonated an existing Israeli news organisation, and used it to impersonate a reporter who worked there (T0097.102: Journalist Persona, T0097.202: News Outlet Persona, T0143.003: Impersonated Persona, T0149.003: Lookalike Domain, T0149.002: Email Domain Asset) in order to convince the target to download a document containing malware (T0085.004: Develop Document, T0147.003: Malware Asset). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -28,3 +28,4 @@
| [T0153.004 QR Code Asset](../../generated_pages/techniques/T0153.004.md) | <i>[Meta removed a network of assets for coordinated inauthentic behaviour. One page] in the network, @StopMEK, was promoting views against the People’s Mujahedin of Iran (MEK), the largest and most active political opposition group against the Islamic Republic of Iran Leadership.<br><br>The content on the page drew narratives showing parallels between the Islamic State of Iraq and Syria (ISIS) and the MEK.<br><br>Apart from images and memes, the @StopMEK page shared a link to an archived report on how the United States was monitoring the MEK’s movement in Iran in the mid-1990’s. The file was embedded as a QR code on one of the page’s images.</i><br><br>In this example a Facebook page presented itself as focusing on a political cause (T0097.208: Social Cause Persona, T0151.001: Social Media Platform, T0151.002: Online Community Group). Within the page it embedded a QR code (T0122: Direct Users to Alternative Platforms, T0153.004: QR Code Asset), which took users to a document hosted on another website (T0152.004: Website Asset). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -28,3 +28,4 @@
| [T0151.008 Microblogging Platform](../../generated_pages/techniques/T0151.008.md) | <i>An 18-year-old hacker who pulled off a huge breach in 2020, infiltrating several high profile Twitter accounts to solicit bitcoin transactions, has agreed to serve three years in prison for his actions.<br><br>Graham Ivan Clark, of Florida, was 17 years old at the time of the hack in July, during which he took over a number of major accounts including those of Joe Biden, Bill Gates and Kim Kardashian West.<br><br>Once he accessed them, Clark tweeted a link to a bitcoin address and wrote “all bitcoin sent to our address below will be sent back to you doubled!” According to court documents, Clark made more than $100,000 from the scheme, which his lawyers say he has since returned.<br><br>Clark was able to access the accounts after convincing an employee at Twitter he worked in the company’s information technology department, according to the Tampa Bay Times.</i><br><br>In this example a threat actor gained access to Twitter’s customer service portal through social engineering (T0146.004: Administrator Account Asset, T0150.005: Compromised Asset, T0151.008: Microblogging Platform), which they used to take over accounts of public figures (T0146.003: Verified Account Asset, T0143.003: Impersonated Persona, T0150.005: Compromised Asset, T0151.008: Microblogging Platform).<br><br>The threat actor used these compromised accounts to trick their followers into sending bitcoin to their wallet (T0148.009: Cryptocurrency Wallet). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -24,55 +24,4 @@
| [C00084 Modify disinformation narratives, and rebroadcast them](../../generated_pages/counters/C00084.md) | D03 |

# Technique T0002: Facilitate State Propaganda

**Summary**: Organise citizens around pro-state messaging. Coordinate paid or volunteer groups to push state propaganda.

**Tactic**: TA02 Plan Objectives

| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |
| [C00029 Create fake website to issue counter narrative and counter narrative through physical merchandise](../../generated_pages/counters/C00029.md) | D03 |
| [C00030 Develop a compelling counter narrative (truth based)](../../generated_pages/counters/C00030.md) | D03 |
| [C00031 Dilute the core narrative - create multiple permutations, target / amplify](../../generated_pages/counters/C00031.md) | D03 |
| [C00082 Ground truthing as automated response to pollution](../../generated_pages/counters/C00082.md) | D03 |
| [C00084 Modify disinformation narratives, and rebroadcast them](../../generated_pages/counters/C00084.md) | D03 |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -21,49 +21,4 @@
| [C00081 Highlight flooding and noise, and explain motivations](../../generated_pages/counters/C00081.md) | D03 |

# Technique T0003: Leverage Existing Narratives

**Summary**: Use or adapt existing narrative themes, where narratives are the baseline stories of a target audience. Narratives form the bedrock of our worldviews. New information is understood through a process firmly grounded in this bedrock. If new information is not consistent with the prevailing narratives of an audience, it will be ignored. Effective campaigns will frame their misinformation in the context of these narratives. Highly effective campaigns will make extensive use of audience-appropriate archetypes and meta-narratives throughout their content creation and amplification practices.

**Tactic**: TA14 Develop Narratives

| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |
| [C00080 Create competing narrative](../../generated_pages/counters/C00080.md) | D03 |
| [C00081 Highlight flooding and noise, and explain motivations](../../generated_pages/counters/C00081.md) | D03 |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -20,47 +20,4 @@
| [C00042 Address truth contained in narratives](../../generated_pages/counters/C00042.md) | D04 |

# Technique T0004: Develop Competing Narratives

**Summary**: Advance competing narratives connected to the same issue, i.e. on one hand deny the incident while at the same time dismissing it. Suppressing or discouraging narratives already spreading requires an alternative. The most simple set of narrative techniques in response would be the construction and promotion of contradictory alternatives centred on denial, deflection, dismissal, counter-charges, excessive standards of proof, bias in prohibition or enforcement, and so on. These competing narratives allow loyalists cover, but are less compelling to opponents and fence-sitters than campaigns built around existing narratives or highly explanatory master narratives. Competing narratives, as such, are especially useful in the "firehose of misinformation" approach.

**Tactic**: TA14 Develop Narratives

| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |
| [C00042 Address truth contained in narratives](../../generated_pages/counters/C00042.md) | D04 |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -30,67 +30,4 @@
| [C00203 Stop offering press credentials to propaganda outlets](../../generated_pages/counters/C00203.md) | D03 |

# Technique T0010: Cultivate Ignorant Agents

**Summary**: Cultivate propagandists for a cause, the goals of which are not fully comprehended, and who are used cynically by the leaders of the cause. Independent actors use social media and specialised web sites to strategically reinforce and spread messages compatible with their own. Their networks are infiltrated and used by state media disinformation organisations to amplify the state’s own disinformation strategies against target populations. Many are traffickers in conspiracy theories or hoaxes, unified by a suspicion of Western governments and mainstream media. Their narratives, which appeal to leftists hostile to globalism and military intervention and nationalists against immigration, are frequently infiltrated and shaped by state-controlled trolls and altered news items from agencies such as RT and Sputnik. Also known as "useful idiots" or "unwitting agents".

**Tactic**: TA15 Establish Assets

| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |
| [C00009 Educate high profile influencers on best practices](../../generated_pages/counters/C00009.md) | D02 |
| [C00046 Marginalise and discredit extremist groups](../../generated_pages/counters/C00046.md) | D04 |
| [C00048 Name and Shame Influencers](../../generated_pages/counters/C00048.md) | D07 |
| [C00051 Counter social engineering training](../../generated_pages/counters/C00051.md) | D02 |
| [C00111 Reduce polarisation by connecting and presenting sympathetic renditions of opposite views](../../generated_pages/counters/C00111.md) | D04 |
| [C00130 Mentorship: elders, youth, credit. Learn vicariously.](../../generated_pages/counters/C00130.md) | D07 |
| [C00162 Unravel/target the Potemkin villages](../../generated_pages/counters/C00162.md) | D03 |
| [C00169 develop a creative content hub](../../generated_pages/counters/C00169.md) | D03 |
| [C00195 Redirect searches away from disinformation or extremist content](../../generated_pages/counters/C00195.md) | D02 |
| [C00200 Respected figure (influencer) disavows misinfo](../../generated_pages/counters/C00200.md) | D03 |
| [C00203 Stop offering press credentials to propaganda outlets](../../generated_pages/counters/C00203.md) | D03 |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |

# Technique T0014.001: Raise Funds from Malign Actors

**Summary**: Raising funds from malign actors may include contributions from foreign agents, cutouts or proxies, shell companies, dark money groups, etc.

**Tactic**: TA15 Establish Assets **Parent Technique:** T0014 Prepare Fundraising Campaigns

| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |

# Technique T0014.002: Raise Funds from Ignorant Agents

**Summary**: Raising funds from ignorant agents may include scams, donations intended for one stated purpose but then used for another, etc.

**Tactic**: TA15 Establish Assets **Parent Technique:** T0014 Prepare Fundraising Campaigns

| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -22,51 +22,4 @@
| [C00216 Use advertiser controls to stem flow of funds to bad actors](../../generated_pages/counters/C00216.md) | D02 |

# Technique T0014: Prepare Fundraising Campaigns

**Summary**: Fundraising campaigns refer to an influence operation’s systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while raising a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns (see: Develop Information Pathways) to promote operation messaging while raising money to support its activities.

**Tactic**: TA15 Establish Assets

| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |
| [C00059 Verification of project before posting fund requests](../../generated_pages/counters/C00059.md) | D02 |
| [C00155 Ban incident actors from funding sites](../../generated_pages/counters/C00155.md) | D02 |
| [C00216 Use advertiser controls to stem flow of funds to bad actors](../../generated_pages/counters/C00216.md) | D02 |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |

# Technique T0015.001: Use Existing Hashtag

**Summary**: Use a dedicated, existing hashtag for the campaign/incident. This Technique covers behaviours previously documented by T0104.005: Use Hashtags, which has since been deprecated.

**Tactic**: TA06 Develop Content **Parent Technique:** T0015 Create Hashtags and Search Artefacts

| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |

# Technique T0015.002: Create New Hashtag

**Summary**: Create a campaign/incident specific hashtag. This Technique covers behaviours previously documented by T0104.006: Create Dedicated Hashtag, which has since been deprecated.

**Tactic**: TA06 Develop Content **Parent Technique:** T0015 Create Hashtags and Search Artefacts

| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -21,49 +21,4 @@
| [C00066 Co-opt a hashtag and drown it out (hijack it back)](../../generated_pages/counters/C00066.md) | D03 |

# Technique T0015: Create Hashtags and Search Artefacts

**Summary**: Create one or more hashtags and/or hashtag groups. Many incident-based campaigns will create hashtags to promote their fabricated event. Creating a hashtag for an incident can have two important effects: 1. Create a perception of reality around an event. Certainly only "real" events would be discussed in a hashtag. After all, the event has a name!, and 2. Publicise the story more widely through trending lists and search behaviour. Asset needed to direct/control/manage "conversation" connected to launching new incident/campaign with new hashtag for applicable social media sites.

**Tactic**: TA06 Develop Content

| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00086 #WeAreNotSafe – Exposing How a Post-October 7th Disinformation Network Operates on Israeli Social Media](../../generated_pages/incidents/I00086.md) | In this report accounts were identified as part of “a sophisticated and extensive coordinated network orchestrating a disinformation campaign targeting Israeli digital spaces since October 7th, 2023”, which posted hashtags alongside campaign content (T0015: Create Hashtags and Search Artefacts):<br><br><i>“The accounts post generic images to fill their account feed to make the account seem real. They then employ a hidden hashtag in their posts, consisting of a seemingly random string of numbers and letters.<br><br>“The hypothesis regarding this tactic is that the group orchestrating these accounts utilizes these hashtags as a means of indexing them. This system likely serves a dual purpose: firstly, to keep track of the network’s expansive network of accounts and unique posts, and secondly, to streamline the process of boosting engagement among these accounts. By searching for these specific, unique hashtags, the group can quickly locate posts from their network and engage with them using other fake accounts, thereby artificially inflating the visibility and perceived authenticity of the fake account.”</i> |

| Counters | Response types |
| -------- | -------------- |
| [C00066 Co-opt a hashtag and drown it out (hijack it back)](../../generated_pages/counters/C00066.md) | D03 |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -25,57 +25,4 @@
| [C00178 Fill information voids with non-disinformation content](../../generated_pages/counters/C00178.md) | D04 |

# Technique T0016: Create Clickbait

**Summary**: Create attention grabbing headlines (outrage, doubt, humour) required to drive traffic & engagement. This is a key asset.

**Tactic**: TA05 Microtarget

| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00079 Three thousand fake tanks](../../generated_pages/incidents/I00079.md) | <i>“On January 4 [2017], however, the Donbas News International (DNI) agency, based in Donetsk, Ukraine, and (since September 2016) an official state media outlet of the unrecognized separatist Donetsk People’s Republic, ran an article under the sensational headline, “US sends 3,600 tanks against Russia — massive NATO deployment under way.” DNI is run by Finnish exile Janus Putkonen, described by the Finnish national broadcaster, YLE, as a “Finnish info warrior”, and the first foreigner to be granted a Donetsk passport.<br><br>“The equally sensational opening paragraph ran, “The NATO war preparation against Russia, ‘Operation Atlantic Resolve’, is in full swing. 2,000 US tanks will be sent in coming days from Germany to Eastern Europe, and 1,600 US tanks is deployed to storage facilities in the Netherlands. At the same time, NATO countries are sending thousands of soldiers in to Russian borders.”<br><br>“The report is based around an obvious factual error, conflating the total number of vehicles with the actual number of tanks, and therefore multiplying the actual tank force 20 times over. For context, military website globalfirepower.com puts the total US tank force at 8,848. If the DNI story had been true, it would have meant sending 40% of all the US’ main battle tanks to Europe in one go.<br><br>“Could this have been an innocent mistake? The simple answer is “no”. The journalist who penned the story had a sufficient command of the details to be able to write, later in the same article, “In January, 26 tanks, 100 other vehicles and 120 containers will be transported by train to Lithuania. Germany will send the 122nd Infantry Battalion.” Yet the same author apparently believed, in the headline and first paragraph, that every single vehicle in Atlantic Resolve is a tank. To call this an innocent mistake is simply not plausible.<br><br>“The DNI story can only realistically be considered a deliberate fake designed to caricaturize and demonize NATO, the United States and Germany (tactfully referred to in the report as having “rolled over Eastern Europe in its war of extermination 75 years ago”) by grossly overstating the number of MBTs involved.”</i><br><br>This behaviour matches T0016: Create Clickbait because the person who wrote the story is shown to be aware of the fact that there were non-tank vehicles later in their story, but still chose to give the article a sensationalist headline claiming that all vehicles being sent were tanks. |

| Counters | Response types |
| -------- | -------------- |
| [C00073 Inoculate populations through media literacy training](../../generated_pages/counters/C00073.md) | D02 |
| [C00076 Prohibit images in political discourse channels](../../generated_pages/counters/C00076.md) | D02 |
| [C00105 Buy more advertising than misinformation creators](../../generated_pages/counters/C00105.md) | D03 |
| [C00106 Click-bait centrist content](../../generated_pages/counters/C00106.md) | D03 |
| [C00178 Fill information voids with non-disinformation content](../../generated_pages/counters/C00178.md) | D04 |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |

# Technique T0017.001: Conduct Crowdfunding Campaigns

**Summary**: An influence operation may Conduct Crowdfunding Campaigns on platforms such as GoFundMe, GiveSendGo, Tipeee, Patreon, etc.

**Tactic**: TA10 Drive Offline Activity **Parent Technique:** T0017 Conduct Fundraising

| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -22,51 +22,4 @@
| [C00067 Denigrate the recipient/ project (of online funding)](../../generated_pages/counters/C00067.md) | D03 |

# Technique T0017: Conduct Fundraising

**Summary**: Fundraising campaigns refer to an influence operation’s systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while raising a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns to promote operation messaging while raising money to support its activities.

**Tactic**: TA10 Drive Offline Activity

| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00110 How COVID-19 conspiracists and extremists use crowdfunding platforms to fund their activities](../../generated_pages/incidents/I00110.md) | The EU Disinfo Lab produced a report into disinformation published on crowdfunding platforms:<br><br><i>More worrisome is the direct monetisation of disinformation happening on crowdfunding platforms: on Kickstarter, we found a user openly raising money for a documentary project suggesting that COVID-19 is a conspiracy.</i><br><br>A Kickstarter user attempted to use the platform to fund production of a documentary (T0017: Conduct Fundraising, T0087: Develop Video-Based Content, T0146: Account Asset, T0148.006: Crowdfunding Platform).<br><br><i>On Patreon, we found several instances of direct monetisation of COVID-19 disinformation, including posts promoting a device allegedly protecting against COVID-19 and 5G, as well as posts related to the “Plandemic” conspiracy video, which gained attention on YouTube before being removed by the platform.<br><br>We also found an account called “Stranger than fiction” entirely dedicated to disinformation, which openly states that their content was “Banned by screwtube and fakebook, our videos have been viewed over a billion times.”</i><br><br>The “Stranger than fiction” user presented itself as an alternative news source which had been banned from other platforms (T0146: Account Asset, T0097.202: News Outlet Persona, T0121.001: Bypass Content Blocking, T0152.012: Subscription Service Platform).<br><br><i>On the US-based crowdfunding platform IndieGogo, EU DisinfoLab found a successful crowdfunding campaign of €133.903 for a book called Revolution Q. This book, now also available on Amazon, claims to be “Written for both newcomers and long-time QAnon fans alike, this book is a treasure-trove of information designed to help everyone weather The Storm.”</i><br><br>An IndieGogo account was used to gather funds to produce a book on QAnon (T0017: Conduct Fundraising, T0085.005: Develop Book, T0146: Account Asset, T0148.006: Crowdfunding Platform), with the book later sold on Amazon marketplace (T0148.007: eCommerce Platform). |

| Counters | Response types |
| -------- | -------------- |
| [C00058 Report crowdfunder as violator](../../generated_pages/counters/C00058.md) | D02 |
| [C00067 Denigrate the recipient/ project (of online funding)](../../generated_pages/counters/C00067.md) | D03 |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -21,49 +21,4 @@
| [C00065 Reduce political targeting](../../generated_pages/counters/C00065.md) | D03 |

# Technique T0018: Purchase Targeted Advertisements

**Summary**: Create or fund advertisements targeted at specific populations

**Tactic**: TA05 Microtarget

| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00097 Report: Not Just Algorithms](../../generated_pages/incidents/I00097.md) | <i>This report explores the role of four systems (recommender systems, content moderation systems, ad approval systems and ad management systems) in creating risks around eating disorders.<br><br>[...]<br><br>Ad approval systems can create risks. We created 12 ‘fake’ ads that promoted dangerous weight loss techniques and behaviours. We tested to see if these ads would be approved to run, and they were. This means dangerous behaviours can be promoted in paid-for advertising. (Requests to run ads were withdrawn after approval or rejection, so no dangerous advertising was published as a result of this experiment.)<br><br>Specifically: On TikTok, 100% of the ads were approved to run; On Facebook, 83% of the ads were approved to run; On Google, 75% of the ads were approved to run.<br><br>Ad management systems can create risks. We investigated how platforms allow advertisers to target users, and found that it is possible to target people who may be interested in pro-eating disorder content.<br><br>Specifically: On TikTok: End-users who interact with pro-eating disorder content on TikTok, download advertisers’ eating disorder apps or visit their websites can be targeted; On Meta: End-users who interact with pro-eating disorder content on Meta, download advertisers’ eating disorder apps or visit their websites can be targeted; On X: End-users who follow pro- eating disorder accounts, or ‘look’ like them, can be targeted; On Google: End-users who search specific words or combinations of words (including pro-eating disorder words), watch pro-eating disorder YouTube channels and probably those who download eating disorder and mental health apps can be targeted.</i><br><br>Advertising platforms managed by TikTok, Facebook, and Google approved adverts to be displayed on their platforms. These platforms enabled users to deliver targeted advertising to potentially vulnerable platform users (T0018: Purchase Targeted Advertisements, T0153.005: Online Advertising Platform). |

| Counters | Response types |
| -------- | -------------- |
| [C00065 Reduce political targeting](../../generated_pages/counters/C00065.md) | D03 |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -20,47 +20,4 @@
| [C00090 Fake engagement system](../../generated_pages/counters/C00090.md) | D05 |

# Technique T0020: Trial Content

**Summary**: Iteratively test incident performance (messages, content etc), e.g. A/B test headline/content engagement metrics; website and/or funding campaign conversion rates

**Tactic**: TA08 Conduct Pump Priming

| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |
| [C00090 Fake engagement system](../../generated_pages/counters/C00090.md) | D05 |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |

# Technique T0022.001: Amplify Existing Conspiracy Theory Narratives

**Summary**: An influence operation may amplify an existing conspiracy theory narrative that aligns with its incident or campaign goals. By amplifying existing conspiracy theory narratives, operators can leverage the power of the existing communities that support and propagate those theories without needing to expend resources creating new narratives or building momentum and buy in around new narratives.

**Tactic**: TA14 Develop Narratives **Parent Technique:** T0022 Leverage Conspiracy Theory Narratives

| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |


# Technique T0022.002: Develop Original Conspiracy Theory Narratives

**Summary**: While this requires more resources than amplifying existing conspiracy theory narratives, an influence operation may develop original conspiracy theory narratives in order to achieve greater control and alignment over the narrative and its campaign goals. Prominent examples include the USSR's Operation INFEKTION disinformation campaign run by the KGB in the 1980s to plant the idea that the United States had invented HIV/AIDS as part of a biological weapons research project at Fort Detrick, Maryland. More recently, Fort Detrick featured prominently in new conspiracy theory narratives around the origins of the COVID-19 outbreak and pandemic.

**Tactic**: TA14 Develop Narratives **Parent Technique:** T0022 Leverage Conspiracy Theory Narratives

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -24,55 +24,4 @@
| [C00164 compatriot policy](../../generated_pages/counters/C00164.md) | D03 |


# Technique T0022: Leverage Conspiracy Theory Narratives

**Summary**: "Conspiracy narratives" appeal to the human desire for explanatory order, by invoking the participation of poweful (often sinister) actors in pursuit of their own political goals. These narratives are especially appealing when an audience is low-information, marginalised or otherwise inclined to reject the prevailing explanation. Conspiracy narratives are an important component of the "firehose of falsehoods" model.

**Tactic**: TA14 Develop Narratives

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |
| [C00096 Strengthen institutions that are always truth tellers](../../generated_pages/counters/C00096.md) | D07 |
| [C00119 Engage payload and debunk.](../../generated_pages/counters/C00119.md) | D07 |
| [C00156 Better tell your country or organisation story](../../generated_pages/counters/C00156.md) | D03 |
| [C00161 Coalition Building with stakeholders and Third-Party Inducements](../../generated_pages/counters/C00161.md) | D07 |
| [C00164 compatriot policy](../../generated_pages/counters/C00164.md) | D03 |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |


# Technique T0023.001: Reframe Context

**Summary**: Reframing context refers to removing an event from its surrounding context to distort its intended meaning. Rather than deny that an event occurred, reframing context frames an event in a manner that may lead the target audience to draw a different conclusion about its intentions.

**Tactic**: TA06 Develop Content **Parent Technique:** T0023 Distort Facts

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |


# Technique T0023.002: Edit Open-Source Content

**Summary**: An influence operation may edit open-source content, such as collaborative blogs or encyclopaedias, to promote its narratives on outlets with existing credibility and audiences. Editing open-source content may allow an operation to post content on platforms without dedicating resources to the creation and maintenance of its own assets.

**Tactic**: TA06 Develop Content **Parent Technique:** T0023 Distort Facts

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -20,47 +20,4 @@
| -------- | -------------- |


# Technique T0023: Distort Facts

**Summary**: Change, twist, or exaggerate existing facts to construct a narrative that differs from reality. Examples: images and ideas can be distorted by being placed in an improper context.

**Tactic**: TA06 Develop Content

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00079 Three thousand fake tanks](../../generated_pages/incidents/I00079.md) | <i>“On January 4 [2017], a little-known news site based in Donetsk, Ukraine published an article claiming that the United States was sending 3,600 tanks to Europe as part of “the NATO war preparation against Russia”.<br><br> “Like much fake news, this story started with a grain of truth: the US was about to reinforce its armored units in Europe. However, the article converted literally thousands of other vehicles — including hundreds of Humvees and trailers — into tanks, building the US force into something 20 times more powerful than it actually was.<br><br> “The story caught on online. Within three days it had been repeated by a dozen websites in the United States, Canada and Europe, and shared some 40,000 times. It was translated into Norwegian; quoted, unchallenged, by Russian state news agency RIA Novosti; and spread among Russian-language websites.<br><br> “It was also an obvious fake, as any Google news search would have revealed. Yet despite its evident falsehood, it spread widely, and not just in directly Kremlin-run media. Tracking the spread of this fake therefore shines a light on the wider question of how fake stories are dispersed.”</i><br><br> Russian state news agency RIA Novosti presents themselves as a news outlet (T0097.202: News Outlet Persona). RIO Novosti is a real news outlet (T0143.001: Authentic Persona), but it did not carry out a basic investigation into the veracity of the narrative they published implicitly expected of institutions presenting themselves as news outlets.<br><br> We can’t know how or why this narrative ended up being published by RIA Novosti, but we know that it presented a distorted reality as authentic information (T0023: Distort Facts), claiming that the US was sending 3,600 tanks, instead of 3,600 vehicles which included ~180 tanks. |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -24,55 +24,4 @@
| [C00123 Remove or rate limit botnets](../../generated_pages/counters/C00123.md) | D03 |


# Technique T0029: Online Polls

**Summary**: Create fake online polls, or manipulate existing online polls. Data gathering tactic to target those who engage, and potentially their networks of friends/followers as well

**Tactic**: TA07 Select Channels and Affordances

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |
| [C00044 Keep people from posting to social media immediately](../../generated_pages/counters/C00044.md) | D03 |
| [C00097 Require use of verified identities to contribute to poll or comment](../../generated_pages/counters/C00097.md) | D02 |
| [C00101 Create friction by rate-limiting engagement](../../generated_pages/counters/C00101.md) | D04 |
| [C00103 Create a bot that engages / distract trolls](../../generated_pages/counters/C00103.md) | D05 |
| [C00123 Remove or rate limit botnets](../../generated_pages/counters/C00123.md) | D03 |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -23,53 +23,4 @@
| [C00160 find and train influencers](../../generated_pages/counters/C00160.md) | D02 |


# Technique T0039: Bait Influencer

**Summary**: Influencers are people on social media platforms who have large audiences.<br /> <br />Threat Actors can try to trick Influencers such as celebrities, journalists, or local leaders who aren’t associated with their campaign into amplifying campaign content. This gives them access to the Influencer’s audience without having to go through the effort of building it themselves, and it helps legitimise their message by associating it with the Influencer, benefitting from their audience’s trust in them.

**Tactic**: TA17 Maximise Exposure

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |
| [C00087 Make more noise than the disinformation](../../generated_pages/counters/C00087.md) | D04 |
| [C00114 Don't engage with payloads](../../generated_pages/counters/C00114.md) | D02 |
| [C00154 Ask media not to report false information](../../generated_pages/counters/C00154.md) | D02 |
| [C00160 find and train influencers](../../generated_pages/counters/C00160.md) | D02 |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -20,47 +20,4 @@
| [C00112 "Prove they are not an op!"](../../generated_pages/counters/C00112.md) | D02 |


# Technique T0040: Demand Insurmountable Proof

**Summary**: Campaigns often leverage tactical and informational asymmetries on the threat surface, as seen in the Distort and Deny strategies, and the "firehose of misinformation". Specifically, conspiracy theorists can be repeatedly wrong, but advocates of the truth need to be perfect. By constantly escalating demands for proof, propagandists can effectively leverage this asymmetry while also priming its future use, often with an even greater asymmetric advantage. The conspiracist is offered freer rein for a broader range of "questions" while the truth teller is burdened with higher and higher standards of proof.

**Tactic**: TA14 Develop Narratives

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |
| [C00112 "Prove they are not an op!"](../../generated_pages/counters/C00112.md) | D02 |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |


# Technique T0042: Seed Kernel of Truth

**Summary**: Wrap lies or altered context/facts around truths. Influence campaigns pursue a variety of objectives with respect to target audiences, prominent among them: 1. undermine a narrative commonly referenced in the target audience; or 2. promote a narrative less common in the target audience, but preferred by the attacker. In both cases, the attacker is presented with a heavy lift. They must change the relative importance of various narratives in the interpretation of events, despite contrary tendencies. When messaging makes use of factual reporting to promote these adjustments in the narrative space, they are less likely to be dismissed out of hand; when messaging can juxtapose a (factual) truth about current affairs with the (abstract) truth explicated in these narratives, propagandists can undermine or promote them selectively. Context matters.

**Tactic**: TA08 Conduct Pump Priming

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -21,49 +21,4 @@
| [C00118 Repurpose images with new text](../../generated_pages/counters/C00118.md) | D04 |


# Technique T0044: Seed Distortions

**Summary**: Try a wide variety of messages in the early hours surrounding an incident or event, to give a misleading account or impression.

**Tactic**: TA08 Conduct Pump Priming

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |
| [C00086 Distract from noise with addictive content](../../generated_pages/counters/C00086.md) | D04 |
| [C00118 Repurpose images with new text](../../generated_pages/counters/C00118.md) | D04 |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -21,49 +21,4 @@
| [C00184 Media exposure](../../generated_pages/counters/C00184.md) | D04 |


# Technique T0045: Use Fake Experts

**Summary**: Use the fake experts that were set up during Establish Legitimacy. Pseudo-experts are disposable assets that often appear once and then disappear. Give "credibility" to misinformation. Take advantage of credential bias.

**Tactic**: TA08 Conduct Pump Priming

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |
| [C00113 Debunk and defuse a fake expert / credentials.](../../generated_pages/counters/C00113.md) | D02 |
| [C00184 Media exposure](../../generated_pages/counters/C00184.md) | D04 |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -20,47 +20,4 @@
| [C00117 Downgrade / de-amplify so message is seen by fewer people](../../generated_pages/counters/C00117.md) | D04 |


# Technique T0046: Use Search Engine Optimisation

**Summary**: Manipulate content engagement metrics (e.g. on Reddit and Twitter) to influence/impact news search results (e.g. Google); this also elevates RT and Sputnik headlines into Google news alert emails. Also known as "black-hat SEO".

**Tactic**: TA08 Conduct Pump Priming

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |
| [C00117 Downgrade / de-amplify so message is seen by fewer people](../../generated_pages/counters/C00117.md) | D04 |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -20,47 +20,4 @@
| [C00120 Open dialogue about design of platforms to produce different outcomes](../../generated_pages/counters/C00120.md) | D07 |


# Technique T0047: Censor Social Media as a Political Force

**Summary**: Use political influence or the power of the state to stop critical social media comments. Government-requested/driven content takedowns (see Google Transparency reports).

**Tactic**: TA18 Drive Online Harms

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |
| [C00120 Open dialogue about design of platforms to produce different outcomes](../../generated_pages/counters/C00120.md) | D07 |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |


# Technique T0048.001: Boycott/"Cancel" Opponents

**Summary**: Cancel culture refers to the phenomenon in which individuals collectively refrain from supporting an individual, organisation, business, or other entity, usually following a real or falsified controversy. An influence operation may exploit cancel culture by emphasising an adversary’s problematic or disputed behaviour and presenting its own content as an alternative.

**Tactic**: TA18 Drive Online Harms **Parent Technique:** T0048 Harass

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |


# Technique T0048.002: Harass People Based on Identities

**Summary**: Examples include social identities like gender, sexuality, race, ethnicity, religion, ability, nationality, etc. as well as roles and occupations like journalist or activist.

**Tactic**: TA18 Drive Online Harms **Parent Technique:** T0048 Harass

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |


# Technique T0048.003: Threaten to Dox

**Summary**: Doxing refers to online harassment in which individuals publicly release private information about another individual, including names, addresses, employment information, pictures, family members, and other sensitive information. An influence operation may dox its opposition to encourage individuals aligned with operation narratives to harass the doxed individuals themselves or otherwise discourage the doxed individuals from posting or proliferating conflicting content.

**Tactic**: TA18 Drive Online Harms **Parent Technique:** T0048 Harass

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |


# Technique T0048.004: Dox

**Summary**: Doxing refers to online harassment in which individuals publicly release private information about another individual, including names, addresses, employment information, pictures, family members, and other sensitive information. An influence operation may dox its opposition to encourage individuals aligned with operation narratives to harass the doxed individuals themselves or otherwise discourage the doxed individuals from posting or proliferating conflicting content.

**Tactic**: TA18 Drive Online Harms **Parent Technique:** T0048 Harass

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -20,49 +20,4 @@
| -------- | -------------- |


# Technique T0048: Harass

**Summary**: Threatening or harassing believers of opposing narratives refers to the use of intimidation techniques, including cyberbullying and doxing, to discourage opponents from voicing their dissent. An influence operation may threaten or harass believers of the opposing narratives to deter individuals from posting or proliferating conflicting content.

**Tactic**: TA18 Drive Online Harms

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00122 The Extreme Right on Discord](../../generated_pages/incidents/I00122.md) | Discord is an example of a T0151.004: Chat Platform, which allows users to create their own T0151.005: Chat Community Server. The Institute for Strategic Dialog (ISD) conducted an investigation into the extreme right’s usage of Discord servers:<br><br><i>Discord is a free service accessible via phones and computers. It allows users to talk to each other in real time via voice, text or video chat and emerged in 2015 as a platform designed to assist gamers in communicating with each other while playing video games. The popularity of the platform has surged in recent years, and it is currently estimated to have 140 million monthly active users.<br><br>Chatrooms – known as servers - in the platform can be created by anyone, and they are used for a range of purposes that extend far beyond gaming. Such purposes include the discussion of extreme right-wing ideologies and the planning of offline extremist activity. Ahead of the far-right Unite the Right rally in Charlottesville, Virginia, in August 2017, organisers used Discord to plan and promote events and posted swastikas and praised Hitler in chat rooms with names like “National Socialist Army” and “Führer’s Gas Chamber”.</i><br><br>In this example a Discord server was used to organise the 2017 Charlottesville Unite the Right rally. Chat rooms such in the server were used to discuss different topics related to the rally (T0057: Organise Events, T0126.002: Facilitate Logistics or Support for Attendance, T0151.004: Chat Platform, T0151.005: Chat Community Server, T0151.006: Chat Room).<br><br><i>Another primary activity engaged in the servers analysed are raids against other servers associated with political opponents, and in particular those that appear to be pro-LGBTQ. Raids are a phenomenon in which a small group of users will join a Discord server with the sole purpose of spamming the host with offensive or incendiary messages and content with the aim of upsetting local users or having the host server banned by Discord. On two servers examined here, raiding was their primary function.<br><br>Among servers devoted to this activity, specific channels were often created to host links to servers that users were then encouraged to raid. Users are encouraged to be as offensive as possible with the aim of upsetting or angering users on the raided server, and channels often had content banks of offensive memes and content to be shared on raided servers.<br><br>The use of raids demonstrates the gamified nature of extremist activity on Discord, where use of the platform and harassment of political opponents is itself turned into a type of real-life video game designed to strengthen in-group affiliation. This combined with the broader extremist activity identified in these channels suggests that the combative activity of raiding could provide a pathway for younger people to become more engaged with extremist activity.</i><br><br>Discord servers were used by members of the extreme right to coordinate harassment of targeted communities (T0048: Harass, T0049.005: Conduct Swarming, T0151.004: Chat Platform, T0151.005: Chat Community Server). |
| [I00123 The Extreme Right on Steam](../../generated_pages/incidents/I00123.md) | ISD conducted an investigation into the usage of social groups on Steam. Steam is an online platform used to buy and sell digital games, and includes the Steam community feature, which “allows users to find friends and join groups and discussion forums, while also offering in-game voice and text chat”. Actors have used Steam’s social capabilities to enable online harm campaigns:<br><br><i>One function of these Steam groups is the organisation of ‘raids’ – coordinated trolling activity against their political opponents. An example of this can be seen in a white power music group sharing a link to an Israeli Steam group, encouraging other members to “help me raid this juden [German word for Jew] group”. The comments section of said target group show that neo-Nazi and antisemitic comments were consistently posted in the group just two minutes after the instruction had been posted in the extremist group, highlighting the swiftness with which racially motivated harassment can be directed online.</i><br><br>Threat actors used social groups on Steam to organise harassment of targets (T0152.009: Software Delivery Platform, T0151.002: Online Community Group, T0049.005: Conduct Swarming, T0048: Harass). |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |


# Technique T0049.001: Trolls Amplify and Manipulate

**Summary**: Use trolls to amplify narratives and/or manipulate narratives. Fake profiles/sockpuppets operating to support individuals/narratives from the entire political spectrum (left/right binary). Operating with increased emphasis on promoting local content and promoting real Twitter users generating their own, often divisive political content, as it's easier to amplify existing content than create new/original content. Trolls operate wherever there's a socially divisive issue (issues that can be or are politicized).

**Tactic**: TA17 Maximise Exposure **Parent Technique:** T0049 Flood Information Space

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |


# Technique T0049.002: Flood Existing Hashtag

**Summary**: Hashtags can be used by communities to collate information they post about particular topics (such as their interests, or current events) and users can find communities to join by exploring hashtags they’re interested in.<br /> <br />Threat actors can flood an existing hashtag to try to ruin hashtag functionality, posting content unrelated to the hashtag alongside it, making it a less reliable source of relevant information. They may also try to flood existing hashtags with campaign content, with the intent of maximising exposure to users.<br /> <br />This Technique covers cases where threat actors flood existing hashtags with campaign content.<br /> <br />This Technique covers behaviours previously documented by T0019.002: Hijack Hashtags, which has since been deprecated. This Technique was previously called Hijack Existing Hashtag.

**Tactic**: TA17 Maximise Exposure **Parent Technique:** T0049 Flood Information Space

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |


# Technique T0049.003: Bots Amplify via Automated Forwarding and Reposting

**Summary**: Automated forwarding and reposting refer to the proliferation of operation content using automated means, such as artificial intelligence or social media bots. An influence operation may use automated activity to increase content exposure without dedicating the resources, including personnel and time, traditionally required to forward and repost content. Use bots to amplify narratives above algorithm thresholds. Bots are automated/programmed profiles designed to amplify content (ie: automatically retweet or like) and give appearance it's more "popular" than it is. They can operate as a network, to function in a coordinated/orchestrated manner. In some cases (more so now) they are an inexpensive/disposable assets used for minimal deployment as bot detection tools improve and platforms are more responsive.
|
||||
|
||||
**Tactic**: TA17 Maximise Exposure **Parent Technique:** T0049 Flood Information Space
|
||||
|
||||
|
||||
| Associated Technique | Description |
|
||||
| --------- | ------------------------- |
|
||||
|
||||
|
||||
|
||||
| Incident | Descriptions given for this incident |
|
||||
| -------- | -------------------- |
|
||||
|
||||
|
||||
|
||||
| Counters | Response types |
|
||||
| -------- | -------------- |
|
||||
|
||||
|
||||
# Technique T0049.003: Bots Amplify via Automated Forwarding and Reposting
|
||||
|
||||
**Summary**: Automated forwarding and reposting refer to the proliferation of operation content using automated means, such as artificial intelligence or social media bots. An influence operation may use automated activity to increase content exposure without dedicating the resources, including personnel and time, traditionally required to forward and repost content. Use bots to amplify narratives above algorithm thresholds. Bots are automated/programmed profiles designed to amplify content (ie: automatically retweet or like) and give appearance it's more "popular" than it is. They can operate as a network, to function in a coordinated/orchestrated manner. In some cases (more so now) they are an inexpensive/disposable assets used for minimal deployment as bot detection tools improve and platforms are more responsive.
|
||||
|
||||
**Tactic**: TA17 Maximise Exposure
|
||||
|
||||
|
||||
| Associated Technique | Description |
|
||||
| --------- | ------------------------- |
|
||||
|
||||
|
||||
|
||||
| Incident | Descriptions given for this incident |
|
||||
| -------- | -------------------- |
|
||||
|
||||
|
||||
|
||||
| Counters | Response types |
|
||||
| -------- | -------------- |
|
||||
|
||||
|
||||
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
|
@ -19,45 +19,4 @@
|
||||
| -------- | -------------- |
|
||||
|
||||
|
||||
# Technique T0049.004: Utilise Spamoflauge
|
||||
|
||||
**Summary**: Spamoflauge refers to the practice of disguising spam messages as legitimate. Spam refers to the use of electronic messaging systems to send out unrequested or unwanted messages in bulk. Simple methods of spamoflauge include replacing letters with numbers to fool keyword-based email spam filters, for example, "you've w0n our jackp0t!". Spamoflauge may extend to more complex techniques such as modifying the grammar or word choice of the language, casting messages as images which spam detectors cannot automatically read, or encapsulating messages in password protected attachments, such as .pdf or .zip files. Influence operations may use spamoflauge to avoid spam filtering systems and increase the likelihood of the target audience receiving operation messaging.

**Tactic**: TA17 Maximise Exposure **Parent Technique:** T0049 Flood Information Space

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

# Technique T0049.004: Utilise Spamoflauge

**Summary**: Spamoflauge refers to the practice of disguising spam messages as legitimate. Spam refers to the use of electronic messaging systems to send out unrequested or unwanted messages in bulk. Simple methods of spamoflauge include replacing letters with numbers to fool keyword-based email spam filters, for example, "you've w0n our jackp0t!". Spamoflauge may extend to more complex techniques such as modifying the grammar or word choice of the language, casting messages as images which spam detectors cannot automatically read, or encapsulating messages in password protected attachments, such as .pdf or .zip files. Influence operations may use spamoflauge to avoid spam filtering systems and increase the likelihood of the target audience receiving operation messaging.

**Tactic**: TA17 Maximise Exposure

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
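
Note (added below the line, not part of the generated page): the Spamoflauge summary above describes simple character substitution such as "you've w0n our jackp0t!" being used to slip past keyword-based filters. The snippet below is a minimal illustrative sketch of that mechanism from the defensive side; the substitution map and keyword list are assumed examples, not part of the DISARM framework.

```python
# Minimal sketch (assumed example data, not part of the framework):
# a naive keyword filter misses substituted text, while a simple
# normalisation pass that maps common character substitutions back
# to letters recovers the match.

SUBSTITUTIONS = str.maketrans(
    {"0": "o", "1": "i", "3": "e", "4": "a", "5": "s", "7": "t", "@": "a", "$": "s"}
)
KEYWORDS = {"won", "jackpot", "prize"}  # assumed example keyword list


def naive_filter(message: str) -> bool:
    """Flag the message only if a keyword appears verbatim as a word."""
    return any(keyword in message.lower().split() for keyword in KEYWORDS)


def normalising_filter(message: str) -> bool:
    """Undo common character substitutions before matching keywords."""
    normalised = message.lower().translate(SUBSTITUTIONS)
    return any(keyword in normalised.split() for keyword in KEYWORDS)


message = "you've w0n our jackp0t!"
print(naive_filter(message))        # False: "w0n" evades the verbatim match
print(normalising_filter(message))  # True: "w0n" normalises back to "won"
```

This is only meant to make the evasion described in the summary concrete; real spam filters use far more robust normalisation and scoring.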

@ -20,49 +20,4 @@
| -------- | -------------- |

# Technique T0049.005: Conduct Swarming

**Summary**: Swarming refers to the coordinated use of accounts to overwhelm the information space with operation content. Unlike information flooding, swarming centres exclusively around a specific event or actor rather than a general narrative. Swarming relies on “horizontal communication” between information assets rather than a top-down, vertical command-and-control approach.

**Tactic**: TA17 Maximise Exposure **Parent Technique:** T0049 Flood Information Space

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00122 The Extreme Right on Discord](../../generated_pages/incidents/I00122.md) | Discord is an example of a T0151.004: Chat Platform, which allows users to create their own T0151.005: Chat Community Server. The Institute for Strategic Dialog (ISD) conducted an investigation into the extreme right’s usage of Discord servers:<br><br><i>Discord is a free service accessible via phones and computers. It allows users to talk to each other in real time via voice, text or video chat and emerged in 2015 as a platform designed to assist gamers in communicating with each other while playing video games. The popularity of the platform has surged in recent years, and it is currently estimated to have 140 million monthly active users.<br><br>Chatrooms – known as servers - in the platform can be created by anyone, and they are used for a range of purposes that extend far beyond gaming. Such purposes include the discussion of extreme right-wing ideologies and the planning of offline extremist activity. Ahead of the far-right Unite the Right rally in Charlottesville, Virginia, in August 2017, organisers used Discord to plan and promote events and posted swastikas and praised Hitler in chat rooms with names like “National Socialist Army” and “Führer’s Gas Chamber”.</i><br><br>In this example a Discord server was used to organise the 2017 Charlottesville Unite the Right rally. Chat rooms such in the server were used to discuss different topics related to the rally (T0057: Organise Events, T0126.002: Facilitate Logistics or Support for Attendance, T0151.004: Chat Platform, T0151.005: Chat Community Server, T0151.006: Chat Room).<br><br><i>Another primary activity engaged in the servers analysed are raids against other servers associated with political opponents, and in particular those that appear to be pro-LGBTQ. Raids are a phenomenon in which a small group of users will join a Discord server with the sole purpose of spamming the host with offensive or incendiary messages and content with the aim of upsetting local users or having the host server banned by Discord. On two servers examined here, raiding was their primary function.<br><br>Among servers devoted to this activity, specific channels were often created to host links to servers that users were then encouraged to raid. Users are encouraged to be as offensive as possible with the aim of upsetting or angering users on the raided server, and channels often had content banks of offensive memes and content to be shared on raided servers.<br><br>The use of raids demonstrates the gamified nature of extremist activity on Discord, where use of the platform and harassment of political opponents is itself turned into a type of real-life video game designed to strengthen in-group affiliation. This combined with the broader extremist activity identified in these channels suggests that the combative activity of raiding could provide a pathway for younger people to become more engaged with extremist activity.</i><br><br>Discord servers were used by members of the extreme right to coordinate harassment of targeted communities (T0048: Harass, T0049.005: Conduct Swarming, T0151.004: Chat Platform, T0151.005: Chat Community Server). |
| [I00123 The Extreme Right on Steam](../../generated_pages/incidents/I00123.md) | ISD conducted an investigation into the usage of social groups on Steam. Steam is an online platform used to buy and sell digital games, and includes the Steam community feature, which “allows users to find friends and join groups and discussion forums, while also offering in-game voice and text chat”. Actors have used Steam’s social capabilities to enable online harm campaigns:<br><br><i>One function of these Steam groups is the organisation of ‘raids’ – coordinated trolling activity against their political opponents. An example of this can be seen in a white power music group sharing a link to an Israeli Steam group, encouraging other members to “help me raid this juden [German word for Jew] group”. The comments section of said target group show that neo-Nazi and antisemitic comments were consistently posted in the group just two minutes after the instruction had been posted in the extremist group, highlighting the swiftness with which racially motivated harassment can be directed online.</i><br><br>Threat actors used social groups on Steam to organise harassment of targets (T0152.009: Software Delivery Platform, T0151.002: Online Community Group, T0049.005: Conduct Swarming, T0048: Harass). |

| Counters | Response types |
| -------- | -------------- |

# Technique T0049.005: Conduct Swarming

**Summary**: Swarming refers to the coordinated use of accounts to overwhelm the information space with operation content. Unlike information flooding, swarming centres exclusively around a specific event or actor rather than a general narrative. Swarming relies on “horizontal communication” between information assets rather than a top-down, vertical command-and-control approach.

**Tactic**: TA17 Maximise Exposure

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00122 The Extreme Right on Discord](../../generated_pages/incidents/I00122.md) | Discord is an example of a T0151.004: Chat Platform, which allows users to create their own T0151.005: Chat Community Server. The Institute for Strategic Dialog (ISD) conducted an investigation into the extreme right’s usage of Discord servers:<br><br><i>Discord is a free service accessible via phones and computers. It allows users to talk to each other in real time via voice, text or video chat and emerged in 2015 as a platform designed to assist gamers in communicating with each other while playing video games. The popularity of the platform has surged in recent years, and it is currently estimated to have 140 million monthly active users.<br><br>Chatrooms – known as servers - in the platform can be created by anyone, and they are used for a range of purposes that extend far beyond gaming. Such purposes include the discussion of extreme right-wing ideologies and the planning of offline extremist activity. Ahead of the far-right Unite the Right rally in Charlottesville, Virginia, in August 2017, organisers used Discord to plan and promote events and posted swastikas and praised Hitler in chat rooms with names like “National Socialist Army” and “Führer’s Gas Chamber”.</i><br><br>In this example a Discord server was used to organise the 2017 Charlottesville Unite the Right rally. Chat rooms such in the server were used to discuss different topics related to the rally (T0057: Organise Events, T0126.002: Facilitate Logistics or Support for Attendance, T0151.004: Chat Platform, T0151.005: Chat Community Server, T0151.006: Chat Room).<br><br><i>Another primary activity engaged in the servers analysed are raids against other servers associated with political opponents, and in particular those that appear to be pro-LGBTQ. Raids are a phenomenon in which a small group of users will join a Discord server with the sole purpose of spamming the host with offensive or incendiary messages and content with the aim of upsetting local users or having the host server banned by Discord. On two servers examined here, raiding was their primary function.<br><br>Among servers devoted to this activity, specific channels were often created to host links to servers that users were then encouraged to raid. Users are encouraged to be as offensive as possible with the aim of upsetting or angering users on the raided server, and channels often had content banks of offensive memes and content to be shared on raided servers.<br><br>The use of raids demonstrates the gamified nature of extremist activity on Discord, where use of the platform and harassment of political opponents is itself turned into a type of real-life video game designed to strengthen in-group affiliation. This combined with the broader extremist activity identified in these channels suggests that the combative activity of raiding could provide a pathway for younger people to become more engaged with extremist activity.</i><br><br>Discord servers were used by members of the extreme right to coordinate harassment of targeted communities (T0048: Harass, T0049.005: Conduct Swarming, T0151.004: Chat Platform, T0151.005: Chat Community Server). |
| [I00123 The Extreme Right on Steam](../../generated_pages/incidents/I00123.md) | ISD conducted an investigation into the usage of social groups on Steam. Steam is an online platform used to buy and sell digital games, and includes the Steam community feature, which “allows users to find friends and join groups and discussion forums, while also offering in-game voice and text chat”. Actors have used Steam’s social capabilities to enable online harm campaigns:<br><br><i>One function of these Steam groups is the organisation of ‘raids’ – coordinated trolling activity against their political opponents. An example of this can be seen in a white power music group sharing a link to an Israeli Steam group, encouraging other members to “help me raid this juden [German word for Jew] group”. The comments section of said target group show that neo-Nazi and antisemitic comments were consistently posted in the group just two minutes after the instruction had been posted in the extremist group, highlighting the swiftness with which racially motivated harassment can be directed online.</i><br><br>Threat actors used social groups on Steam to organise harassment of targets (T0152.009: Software Delivery Platform, T0151.002: Online Community Group, T0049.005: Conduct Swarming, T0048: Harass). |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |

# Technique T0049.006: Conduct Keyword Squatting

**Summary**: Keyword squatting refers to the creation of online content, such as websites, articles, or social media accounts, around a specific search engine-optimized term to overwhelm the search results of that term. An influence operation may keyword squat to increase content exposure to target audience members who query the exploited term in a search engine and to manipulate the narrative around the term.

**Tactic**: TA17 Maximise Exposure **Parent Technique:** T0049 Flood Information Space

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

# Technique T0049.006: Conduct Keyword Squatting

**Summary**: Keyword squatting refers to the creation of online content, such as websites, articles, or social media accounts, around a specific search engine-optimized term to overwhelm the search results of that term. An influence operation may keyword squat to increase content exposure to target audience members who query the exploited term in a search engine and to manipulate the narrative around the term.

**Tactic**: TA17 Maximise Exposure

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |

# Technique T0049.007: Inauthentic Sites Amplify News and Narratives

**Summary**: Inauthentic sites circulate and cross-post stories, amplifying narratives. Often these sites have no masthead, bylines, or attribution.

**Tactic**: TA17 Maximise Exposure **Parent Technique:** T0049 Flood Information Space

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

# Technique T0049.007: Inauthentic Sites Amplify News and Narratives

**Summary**: Inauthentic sites circulate and cross-post stories, amplifying narratives. Often these sites have no masthead, bylines, or attribution.

**Tactic**: TA17 Maximise Exposure

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |

# Technique T0049.008: Generate Information Pollution

**Summary**: Information Pollution occurs when threat actors attempt to ruin a source of information by flooding it with lots of inauthentic or unreliable content, intending to make it harder for legitimate users to find the information they’re looking for.<br /> <br />This sub-technique’s objective is to reduce exposure to target information, rather than promoting exposure to campaign content, for which the parent Technique T0049 can be used.<br /> <br />Analysts will need to infer what the motive for flooding an information space was when deciding whether to use T0049 or T0049.008 to tag a case when an information space is flooded. If such inference is not possible, default to T0049.<br /> <br />This Technique previously used the ID T0019.

**Tactic**: TA17 Maximise Exposure **Parent Technique:** T0049 Flood Information Space

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

# Technique T0049.008: Generate Information Pollution

**Summary**: Information Pollution occurs when threat actors attempt to ruin a source of information by flooding it with lots of inauthentic or unreliable content, intending to make it harder for legitimate users to find the information they’re looking for.<br /> <br />This sub-technique’s objective is to reduce exposure to target information, rather than promoting exposure to campaign content, for which the parent Technique T0049 can be used.<br /> <br />Analysts will need to infer what the motive for flooding an information space was when deciding whether to use T0049 or T0049.008 to tag a case when an information space is flooded. If such inference is not possible, default to T0049.<br /> <br />This Technique previously used the ID T0019.

**Tactic**: TA17 Maximise Exposure

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -20,47 +20,4 @@
| [C00131 Seize and analyse botnet servers](../../generated_pages/counters/C00131.md) | D02 |

# Technique T0049: Flood Information Space

**Summary**: Flooding sources of information (e.g. Social Media feeds) with a high volume of inauthentic content.<br /> <br />This can be done to control/shape online conversations, drown out opposing points of view, or make it harder to find legitimate information.<br /> <br />Bots and/or patriotic trolls are effective tools to achieve this effect.<br /> <br />This Technique previously used the name Flooding the Information Space.

**Tactic**: TA17 Maximise Exposure

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |
| [C00131 Seize and analyse botnet servers](../../generated_pages/counters/C00131.md) | D02 |

# Technique T0049: Flood Information Space

**Summary**: Flooding sources of information (e.g. Social Media feeds) with a high volume of inauthentic content.<br /> <br />This can be done to control/shape online conversations, drown out opposing points of view, or make it harder to find legitimate information.<br /> <br />Bots and/or patriotic trolls are effective tools to achieve this effect.<br /> <br />This Technique previously used the name Flooding the Information Space.

**Tactic**: TA17 Maximise Exposure

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |
| [C00131 Seize and analyse botnet servers](../../generated_pages/counters/C00131.md) | D02 |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |

# Technique T0057.001: Pay for Physical Action

**Summary**: Paying for physical action occurs when an influence operation pays individuals to act in the physical realm. An influence operation may pay for physical action to create specific situations and frame them in a way that supports operation narratives, for example, paying a group of people to burn a car to later post an image of the burning car and frame it as an act of protest.

**Tactic**: TA10 Drive Offline Activity **Parent Technique:** T0057 Organise Events

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

# Technique T0057.001: Pay for Physical Action

**Summary**: Paying for physical action occurs when an influence operation pays individuals to act in the physical realm. An influence operation may pay for physical action to create specific situations and frame them in a way that supports operation narratives, for example, paying a group of people to burn a car to later post an image of the burning car and frame it as an act of protest.

**Tactic**: TA10 Drive Offline Activity

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |

# Technique T0057.002: Conduct Symbolic Action

**Summary**: Symbolic action refers to activities specifically intended to advance an operation’s narrative by signalling something to the audience, for example, a military parade supporting a state’s narrative of military superiority. An influence operation may use symbolic action to create falsified evidence supporting operation narratives in the physical information space.

**Tactic**: TA10 Drive Offline Activity **Parent Technique:** T0057 Organise Events

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

# Technique T0057.002: Conduct Symbolic Action

**Summary**: Symbolic action refers to activities specifically intended to advance an operation’s narrative by signalling something to the audience, for example, a military parade supporting a state’s narrative of military superiority. An influence operation may use symbolic action to create falsified evidence supporting operation narratives in the physical information space.

**Tactic**: TA10 Drive Offline Activity

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -20,49 +20,4 @@
| [C00129 Use banking to cut off access](../../generated_pages/counters/C00129.md) | D02 |

# Technique T0057: Organise Events

**Summary**: Coordinate and promote real-world events across media platforms, e.g. rallies, protests, gatherings in support of incident narratives.

**Tactic**: TA10 Drive Offline Activity

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00122 The Extreme Right on Discord](../../generated_pages/incidents/I00122.md) | Discord is an example of a T0151.004: Chat Platform, which allows users to create their own T0151.005: Chat Community Server. The Institute for Strategic Dialog (ISD) conducted an investigation into the extreme right’s usage of Discord servers:<br><br><i>Discord is a free service accessible via phones and computers. It allows users to talk to each other in real time via voice, text or video chat and emerged in 2015 as a platform designed to assist gamers in communicating with each other while playing video games. The popularity of the platform has surged in recent years, and it is currently estimated to have 140 million monthly active users.<br><br>Chatrooms – known as servers - in the platform can be created by anyone, and they are used for a range of purposes that extend far beyond gaming. Such purposes include the discussion of extreme right-wing ideologies and the planning of offline extremist activity. Ahead of the far-right Unite the Right rally in Charlottesville, Virginia, in August 2017, organisers used Discord to plan and promote events and posted swastikas and praised Hitler in chat rooms with names like “National Socialist Army” and “Führer’s Gas Chamber”.</i><br><br>In this example a Discord server was used to organise the 2017 Charlottesville Unite the Right rally. Chat rooms such in the server were used to discuss different topics related to the rally (T0057: Organise Events, T0126.002: Facilitate Logistics or Support for Attendance, T0151.004: Chat Platform, T0151.005: Chat Community Server, T0151.006: Chat Room).<br><br><i>Another primary activity engaged in the servers analysed are raids against other servers associated with political opponents, and in particular those that appear to be pro-LGBTQ. Raids are a phenomenon in which a small group of users will join a Discord server with the sole purpose of spamming the host with offensive or incendiary messages and content with the aim of upsetting local users or having the host server banned by Discord. On two servers examined here, raiding was their primary function.<br><br>Among servers devoted to this activity, specific channels were often created to host links to servers that users were then encouraged to raid. Users are encouraged to be as offensive as possible with the aim of upsetting or angering users on the raided server, and channels often had content banks of offensive memes and content to be shared on raided servers.<br><br>The use of raids demonstrates the gamified nature of extremist activity on Discord, where use of the platform and harassment of political opponents is itself turned into a type of real-life video game designed to strengthen in-group affiliation. This combined with the broader extremist activity identified in these channels suggests that the combative activity of raiding could provide a pathway for younger people to become more engaged with extremist activity.</i><br><br>Discord servers were used by members of the extreme right to coordinate harassment of targeted communities (T0048: Harass, T0049.005: Conduct Swarming, T0151.004: Chat Platform, T0151.005: Chat Community Server). |

| Counters | Response types |
| -------- | -------------- |
| [C00129 Use banking to cut off access](../../generated_pages/counters/C00129.md) | D02 |

# Technique T0057: Organise Events

**Summary**: Coordinate and promote real-world events across media platforms, e.g. rallies, protests, gatherings in support of incident narratives.

**Tactic**: TA10 Drive Offline Activity

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00122 The Extreme Right on Discord](../../generated_pages/incidents/I00122.md) | Discord is an example of a T0151.004: Chat Platform, which allows users to create their own T0151.005: Chat Community Server. The Institute for Strategic Dialog (ISD) conducted an investigation into the extreme right’s usage of Discord servers:<br><br><i>Discord is a free service accessible via phones and computers. It allows users to talk to each other in real time via voice, text or video chat and emerged in 2015 as a platform designed to assist gamers in communicating with each other while playing video games. The popularity of the platform has surged in recent years, and it is currently estimated to have 140 million monthly active users.<br><br>Chatrooms – known as servers - in the platform can be created by anyone, and they are used for a range of purposes that extend far beyond gaming. Such purposes include the discussion of extreme right-wing ideologies and the planning of offline extremist activity. Ahead of the far-right Unite the Right rally in Charlottesville, Virginia, in August 2017, organisers used Discord to plan and promote events and posted swastikas and praised Hitler in chat rooms with names like “National Socialist Army” and “Führer’s Gas Chamber”.</i><br><br>In this example a Discord server was used to organise the 2017 Charlottesville Unite the Right rally. Chat rooms such in the server were used to discuss different topics related to the rally (T0057: Organise Events, T0126.002: Facilitate Logistics or Support for Attendance, T0151.004: Chat Platform, T0151.005: Chat Community Server, T0151.006: Chat Room).<br><br><i>Another primary activity engaged in the servers analysed are raids against other servers associated with political opponents, and in particular those that appear to be pro-LGBTQ. Raids are a phenomenon in which a small group of users will join a Discord server with the sole purpose of spamming the host with offensive or incendiary messages and content with the aim of upsetting local users or having the host server banned by Discord. On two servers examined here, raiding was their primary function.<br><br>Among servers devoted to this activity, specific channels were often created to host links to servers that users were then encouraged to raid. Users are encouraged to be as offensive as possible with the aim of upsetting or angering users on the raided server, and channels often had content banks of offensive memes and content to be shared on raided servers.<br><br>The use of raids demonstrates the gamified nature of extremist activity on Discord, where use of the platform and harassment of political opponents is itself turned into a type of real-life video game designed to strengthen in-group affiliation. This combined with the broader extremist activity identified in these channels suggests that the combative activity of raiding could provide a pathway for younger people to become more engaged with extremist activity.</i><br><br>Discord servers were used by members of the extreme right to coordinate harassment of targeted communities (T0048: Harass, T0049.005: Conduct Swarming, T0151.004: Chat Platform, T0151.005: Chat Community Server). |

| Counters | Response types |
| -------- | -------------- |
| [C00129 Use banking to cut off access](../../generated_pages/counters/C00129.md) | D02 |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |

# Technique T0059: Play the Long Game

**Summary**: Play the long game refers to two phenomena: 1. To plan messaging and allow it to grow organically without conducting your own amplification. This is methodical and slow and requires years for the message to take hold. 2. To develop a series of seemingly disconnected messaging narratives that eventually combine into a new narrative.

**Tactic**: TA11 Persist in the Information Environment

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

# Technique T0059: Play the Long Game

**Summary**: Play the long game refers to two phenomena: 1. To plan messaging and allow it to grow organically without conducting your own amplification. This is methodical and slow and requires years for the message to take hold. 2. To develop a series of seemingly disconnected messaging narratives that eventually combine into a new narrative.

**Tactic**: TA11 Persist in the Information Environment

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -22,51 +22,4 @@
| [C00147 Make amplification of social media posts expire (e.g. can't like/ retweet after n days)](../../generated_pages/counters/C00147.md) | D03 |

# Technique T0060: Continue to Amplify

**Summary**: Continue narrative or message amplification after the main incident work has finished.

**Tactic**: TA11 Persist in the Information Environment

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |
| [C00138 Spam domestic actors with lawsuits](../../generated_pages/counters/C00138.md) | D03 |
| [C00143 (botnet) DMCA takedown requests to waste group time](../../generated_pages/counters/C00143.md) | D04 |
| [C00147 Make amplification of social media posts expire (e.g. can't like/ retweet after n days)](../../generated_pages/counters/C00147.md) | D03 |

# Technique T0060: Continue to Amplify

**Summary**: Continue narrative or message amplification after the main incident work has finished.

**Tactic**: TA11 Persist in the Information Environment

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |
| [C00138 Spam domestic actors with lawsuits](../../generated_pages/counters/C00138.md) | D03 |
| [C00143 (botnet) DMCA takedown requests to waste group time](../../generated_pages/counters/C00143.md) | D04 |
| [C00147 Make amplification of social media posts expire (e.g. can't like/ retweet after n days)](../../generated_pages/counters/C00147.md) | D03 |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -20,47 +20,4 @@
| -------- | -------------- |

# Technique T0061: Sell Merchandise

**Summary**: Sell merchandise refers to getting the message or narrative into physical space in the offline world while making money.

**Tactic**: TA10 Drive Offline Activity

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00109 Coordinated Facebook Pages Designed to Fund a White Supremacist Agenda](../../generated_pages/incidents/I00109.md) | This report examines the white nationalist group Suavelos’ use of Facebook to draw visitors to its website without overtly revealing their racist ideology. This section of the report looks at technical indicators associated with the Suavelos website, and attributions which can be made as a consequence:<i><br><br>[The Google AdSense tag set up on Suavelos.eu was also found on the following domains, indicating that they are controlled by the same actor;] Alabastro.eu: an online shop to buy “white nationalists” t-shirts [and] ARPAC.eu: the website of a registered non-profit organisation advocating to lift regulation on gun control in France.<br><br>Other domains attributed to Suavelos (T0149.001: Domain Asset) reveal a website set up to sell merchandise (T0152.004: Website Asset, T0148.004: Payment Processing Capability, T0061: Sell Merchandise), and a website hosting a registered French non-profit (T0152.004: Website Asset, T0097.207: NGO Persona).<br><br>To learn more about the suavelos.eu domain, we collected the following data: The domain is hosted on OVH; The owner’s identity is protected; The IP Address of the server is 94.23.253.173, which is shared with 20 other domains. <br><br>The relative low number of websites hosted on this IP address could indicate that they all belong to the same people, and are hosted on the same private server.</i><br><br>Suavelos registered a domain using the web hosting provider OVH (T0149.001: Domain Asset, T0152.003: Website Hosting Platform, T0150.006: Purchased). The site’s IP address reveals a server hosting other domains potentially owned by the actors (T0149.005: Server Asset, T0149.006: IP Address Asset). |

| Counters | Response types |
| -------- | -------------- |

# Technique T0061: Sell Merchandise

**Summary**: Sell merchandise refers to getting the message or narrative into physical space in the offline world while making money.

**Tactic**: TA10 Drive Offline Activity

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00109 Coordinated Facebook Pages Designed to Fund a White Supremacist Agenda](../../generated_pages/incidents/I00109.md) | This report examines the white nationalist group Suavelos’ use of Facebook to draw visitors to its website without overtly revealing their racist ideology. This section of the report looks at technical indicators associated with the Suavelos website, and attributions which can be made as a consequence:<i><br><br>[The Google AdSense tag set up on Suavelos.eu was also found on the following domains, indicating that they are controlled by the same actor;] Alabastro.eu: an online shop to buy “white nationalists” t-shirts [and] ARPAC.eu: the website of a registered non-profit organisation advocating to lift regulation on gun control in France.<br><br>Other domains attributed to Suavelos (T0149.001: Domain Asset) reveal a website set up to sell merchandise (T0152.004: Website Asset, T0148.004: Payment Processing Capability, T0061: Sell Merchandise), and a website hosting a registered French non-profit (T0152.004: Website Asset, T0097.207: NGO Persona).<br><br>To learn more about the suavelos.eu domain, we collected the following data: The domain is hosted on OVH; The owner’s identity is protected; The IP Address of the server is 94.23.253.173, which is shared with 20 other domains. <br><br>The relative low number of websites hosted on this IP address could indicate that they all belong to the same people, and are hosted on the same private server.</i><br><br>Suavelos registered a domain using the web hosting provider OVH (T0149.001: Domain Asset, T0152.003: Website Hosting Platform, T0150.006: Purchased). The site’s IP address reveals a server hosting other domains potentially owned by the actors (T0149.005: Server Asset, T0149.006: IP Address Asset). |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |

# Technique T0065: Prepare Physical Broadcast Capabilities

**Summary**: Create or co-opt broadcast capabilities (e.g. TV, radio, etc.).

**Tactic**: TA15 Establish Assets

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

# Technique T0065: Prepare Physical Broadcast Capabilities

**Summary**: Create or co-opt broadcast capabilities (e.g. TV, radio, etc.).

**Tactic**: TA15 Establish Assets

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |

# Technique T0066: Degrade Adversary

**Summary**: Plan to degrade an adversary’s image or ability to act. This could include preparation and use of harmful information about the adversary’s actions or reputation.

**Tactic**: TA02 Plan Objectives

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

# Technique T0066: Degrade Adversary

**Summary**: Plan to degrade an adversary’s image or ability to act. This could include preparation and use of harmful information about the adversary’s actions or reputation.

**Tactic**: TA02 Plan Objectives

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,47 +19,4 @@
| -------- | -------------- |

# Technique T0068: Respond to Breaking News Event or Active Crisis

**Summary**: Media attention on a story or event is heightened during a breaking news event, where unclear facts and incomplete information increase speculation, rumours, and conspiracy theories, which are all vulnerable to manipulation.

**Tactic**: TA14 Develop Narratives

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00106 Facebook Is Being Flooded With Gross AI-Generated Images of Hurricane Helene Devastation](../../generated_pages/incidents/I00106.md) | <i>As families desperately seek to find missing loved ones and communities grapple with immeasurable losses of both life and property in the wake of [2024’s] Hurricane Helene, AI slop scammers appear to be capitalizing on the moment for personal gain.<br><br>A Facebook account called "Coastal Views" usually shares calmer AI imagery of nature-filled beachside scenes. The account's banner image showcases a signpost reading "OBX Live," OBX being shorthand for North Carolina's Outer Banks islands.<br><br>But starting this weekend, the account shifted its approach dramatically, as first flagged by a social media user on X.<br><br>Instead of posting "photos" of leaping dolphins and sandy beaches, the account suddenly started publishing images of flooded mountain neighborhoods, submerged houses, and dogs sitting on top of roofs.<br><br>But instead of spreading vital information to those affected by the natural disaster, or at the very least sharing real photos of the destruction, the account is seemingly trying to use AI to cash in on all the attention the hurricane has been getting.<br><br>The account links to an Etsy page for a business called" OuterBanks2023," where somebody who goes by "Alexandr" sells AI-generated prints of horses touching snouts with sea turtles, Santa running down the shoreline with a reindeer, and sunsets over ocean waves.</i><br><br>A Facebook page which presented itself as being associated with North Carolina which posted AI generated images changed to posting AI generated images of hurricane damage after Hurricane Helene hit North Carolina (T0151.003: Online Community Page, T0151.001: Social Media Platform, T0115: Post Content, T0086.002: Develop AI-Generated Images (Deepfakes), T0068: Respond to Breaking News Event or Active Crisis). <br><br>The account included links (T0122: Direct Users to Alternative Platforms) to an account on Etsy, which sold prints of AI generated images (T0146: Account Asset, T0148.007: eCommerce Platform). |

| Counters | Response types |
| -------- | -------------- |

# Technique T0068: Respond to Breaking News Event or Active Crisis

**Summary**: Media attention on a story or event is heightened during a breaking news event, where unclear facts and incomplete information increase speculation, rumours, and conspiracy theories, which are all vulnerable to manipulation.

**Tactic**: TA14 Develop Narratives

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00106 Facebook Is Being Flooded With Gross AI-Generated Images of Hurricane Helene Devastation](../../generated_pages/incidents/I00106.md) | <i>As families desperately seek to find missing loved ones and communities grapple with immeasurable losses of both life and property in the wake of [2024’s] Hurricane Helene, AI slop scammers appear to be capitalizing on the moment for personal gain.<br><br>A Facebook account called "Coastal Views" usually shares calmer AI imagery of nature-filled beachside scenes. The account's banner image showcases a signpost reading "OBX Live," OBX being shorthand for North Carolina's Outer Banks islands.<br><br>But starting this weekend, the account shifted its approach dramatically, as first flagged by a social media user on X.<br><br>Instead of posting "photos" of leaping dolphins and sandy beaches, the account suddenly started publishing images of flooded mountain neighborhoods, submerged houses, and dogs sitting on top of roofs.<br><br>But instead of spreading vital information to those affected by the natural disaster, or at the very least sharing real photos of the destruction, the account is seemingly trying to use AI to cash in on all the attention the hurricane has been getting.<br><br>The account links to an Etsy page for a business called" OuterBanks2023," where somebody who goes by "Alexandr" sells AI-generated prints of horses touching snouts with sea turtles, Santa running down the shoreline with a reindeer, and sunsets over ocean waves.</i><br><br>A Facebook page which presented itself as being associated with North Carolina which posted AI generated images changed to posting AI generated images of hurricane damage after Hurricane Helene hit North Carolina (T0151.003: Online Community Page, T0151.001: Social Media Platform, T0115: Post Content, T0086.002: Develop AI-Generated Images (Deepfakes), T0068: Respond to Breaking News Event or Active Crisis). <br><br>The account included links (T0122: Direct Users to Alternative Platforms) to an account on Etsy, which sold prints of AI generated images (T0146: Account Asset, T0148.007: eCommerce Platform). |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |

# Technique T0072.001: Geographic Segmentation

**Summary**: An influence operation may target populations in a specific geographic location, such as a region, state, or city. An influence operation may use geographic segmentation to Create Localised Content (see: Establish Legitimacy).

**Tactic**: TA13 Target Audience Analysis **Parent Technique:** T0072 Segment Audiences

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

# Technique T0072.001: Geographic Segmentation

**Summary**: An influence operation may target populations in a specific geographic location, such as a region, state, or city. An influence operation may use geographic segmentation to Create Localised Content (see: Establish Legitimacy).

**Tactic**: TA13 Target Audience Analysis

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@
| -------- | -------------- |

# Technique T0072.002: Demographic Segmentation

**Summary**: An influence operation may target populations based on demographic segmentation, including age, gender, and income. Demographic segmentation may be useful for influence operations aiming to change state policies that affect a specific population sector. For example, an influence operation attempting to influence Medicare funding in the United States would likely target U.S. voters over 65 years of age.

**Tactic**: TA13 Target Audience Analysis **Parent Technique:** T0072 Segment Audiences

| Associated Technique | Description |
| --------- | ------------------------- |

| Incident | Descriptions given for this incident |
| -------- | -------------------- |

| Counters | Response types |
| -------- | -------------- |

# Technique T0072.002: Demographic Segmentation

**Summary**: An influence operation may target populations based on demographic segmentation, including age, gender, and income. Demographic segmentation may be useful for influence operations aiming to change state policies that affect a specific population sector. For example, an influence operation attempting to influence Medicare funding in the United States would likely target U.S. voters over 65 years of age.
|
||||
|
||||
**Tactic**: TA13 Target Audience Analysis
|
||||
|
||||
|
||||
| Associated Technique | Description |
|
||||
| --------- | ------------------------- |
|
||||
|
||||
|
||||
|
||||
| Incident | Descriptions given for this incident |
|
||||
| -------- | -------------------- |
|
||||
|
||||
|
||||
|
||||
| Counters | Response types |
|
||||
| -------- | -------------- |
|
||||
|
||||
|
||||
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@

# Technique T0072.003: Economic Segmentation

**Summary**: An influence operation may target populations based on their income bracket, wealth, or other financial or economic division.

**Tactic**: TA13 Target Audience Analysis **Parent Technique:** T0072 Segment Audiences


| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
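Note: a hypothetical sketch of segmentation by income bracket as described above; the income values and bracket boundaries are invented for illustration.

```python
# Hypothetical sketch of economic audience segmentation (all data invented).
import pandas as pd

audience = pd.DataFrame({
    "user_id": [1, 2, 3, 4, 5],
    "income":  [18_000, 42_000, 95_000, 250_000, 61_000],
})

# Bucket users into invented income brackets.
audience["income_bracket"] = pd.cut(
    audience["income"],
    bins=[0, 30_000, 75_000, 150_000, float("inf")],
    labels=["low", "middle", "upper-middle", "high"],
)
print(audience.groupby("income_bracket", observed=True)["user_id"].apply(list))
```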

@ -19,45 +19,4 @@

# Technique T0072.004: Psychographic Segmentation

**Summary**: An influence operation may target populations based on psychographic segmentation, which uses audience values and decision-making processes. An operation may gather psychographic data directly, using its own surveys or collection tools, or purchase it externally from social media companies or from online surveys such as personality quizzes.

**Tactic**: TA13 Target Audience Analysis **Parent Technique:** T0072 Segment Audiences


| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
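Note: a hypothetical sketch of psychographic segmentation as described above, clustering invented survey scores into segments with similar value profiles; the data, the three value dimensions, and the choice of two clusters are assumptions made purely for illustration.

```python
# Hypothetical sketch of psychographic audience segmentation (all data invented).
import numpy as np
from sklearn.cluster import KMeans

# Invented survey scores: one row per respondent, one column per value
# dimension (e.g. answers to a personality-quiz-style questionnaire).
survey_scores = np.array([
    [0.9, 0.1, 0.4],
    [0.8, 0.2, 0.5],
    [0.1, 0.9, 0.6],
    [0.2, 0.8, 0.7],
    [0.5, 0.5, 0.5],
])

# Cluster respondents into segments with similar value profiles.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(survey_scores)
print(kmeans.labels_)  # segment label per respondent
```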

@ -19,45 +19,4 @@

# Technique T0072.005: Political Segmentation

**Summary**: An influence operation may target populations based on their political affiliations, especially when aiming to manipulate voting or change policy.

**Tactic**: TA13 Target Audience Analysis **Parent Technique:** T0072 Segment Audiences


| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
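Note: a hypothetical sketch of segmentation by political affiliation as described above; the affiliation labels and user ids are invented for illustration.

```python
# Hypothetical sketch of political audience segmentation (all data invented).
import pandas as pd

audience = pd.DataFrame({
    "user_id":     [1, 2, 3, 4, 5, 6],
    "affiliation": ["party_a", "party_b", "independent",
                    "party_a", "independent", "party_b"],
})

# Group users by declared political affiliation.
political_segments = audience.groupby("affiliation")["user_id"].apply(list)

# An operation aiming to sway a vote might single out the undecided segment.
undecided = political_segments.get("independent", [])
print(political_segments)
print(undecided)
```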

@ -19,45 +19,4 @@

# Technique T0072: Segment Audiences

**Summary**: Create audience segmentations by features of interest to the influence campaign, including political affiliation, geographic location, income, demographics, and psychographics.

**Tactic**: TA13 Target Audience Analysis


| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
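Note: a hypothetical sketch combining the segmentation dimensions listed in the summary above (geography, demographics, political affiliation) into one helper; all names, bands, and values are invented for illustration and not a prescribed implementation.

```python
# Hypothetical sketch of multi-feature audience segmentation (all data invented).
import pandas as pd


def segment(df: pd.DataFrame, features: list) -> pd.Series:
    """Return lists of user ids keyed by each combination of the given features."""
    return df.groupby(features, observed=True)["user_id"].apply(list)


audience = pd.DataFrame({
    "user_id":     [1, 2, 3, 4, 5, 6],
    "region":      ["midwest", "midwest", "south", "south", "west", "west"],
    "age":         [22, 68, 45, 71, 33, 59],
    "affiliation": ["a", "b", "a", "b", "independent", "a"],
})
audience["age_band"] = pd.cut(
    audience["age"], bins=[0, 40, 65, 120], labels=["18-40", "41-65", "65+"]
)

# Segments combining geography, demographics, and political affiliation.
print(segment(audience, ["region", "age_band", "affiliation"]))
```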

@ -19,45 +19,4 @@

# Technique T0073: Determine Target Audiences

**Summary**: Determining the target audiences (segments of the population) who will receive campaign narratives and artefacts intended to achieve the strategic ends.

**Tactic**: TA01 Plan Strategy


| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@

# Technique T0074.001: Geopolitical Advantage

**Summary**: Favourable position on the international stage in terms of great power politics or regional rivalry. Geopolitics plays out in the realms of foreign policy, national security, diplomacy, and intelligence. It involves nation-state governments, heads of state, foreign ministers, intergovernmental organisations, and regional security alliances.

**Tactic**: TA01 Plan Strategy **Parent Technique:** T0074 Determine Strategic Ends


| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@

# Technique T0074.002: Domestic Political Advantage

**Summary**: Favourable position vis-à-vis national or sub-national political opponents such as political parties, interest groups, politicians, candidates.

**Tactic**: TA01 Plan Strategy **Parent Technique:** T0074 Determine Strategic Ends


| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@

# Technique T0074.003: Economic Advantage

**Summary**: Favourable position domestically or internationally in the realms of commerce, trade, finance, industry. Economics involves nation-states, corporations, banks, trade blocs, industry associations, cartels.

**Tactic**: TA01 Plan Strategy **Parent Technique:** T0074 Determine Strategic Ends


| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@

# Technique T0074.004: Ideological Advantage

**Summary**: Favourable position domestically or internationally in the market for ideas, beliefs, and world views. Competition plays out among faith systems, political systems, and value systems. It can involve sub-national, national or supra-national movements.

**Tactic**: TA01 Plan Strategy **Parent Technique:** T0074 Determine Strategic Ends


| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@

# Technique T0074: Determine Strategic Ends

**Summary**: These are the long-term end-states the campaign aims to bring about. They typically involve an advantageous position vis-à-vis competitors in terms of power or influence. The strategic goal may be to improve or simply to hold one’s position. Competition occurs in the public sphere in the domains of war, diplomacy, politics, economics, and ideology, and can play out between armed groups, nation-states, political parties, corporations, interest groups, or individuals.

**Tactic**: TA01 Plan Strategy


| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@

# Technique T0075.001: Discredit Credible Sources

**Summary**: Plan to delegitimize the media landscape and degrade public trust in reporting by discrediting credible sources. This makes it easier to promote influence operation content.

**Tactic**: TA02 Plan Objectives **Parent Technique:** T0075 Dismiss


| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@

# Technique T0075: Dismiss

**Summary**: Push back against criticism by dismissing your critics. This might mean arguing that the critics apply a different standard to you than to other actors or to themselves, or arguing that their criticism is biased.

**Tactic**: TA02 Plan Objectives


| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@

# Technique T0076: Distort

**Summary**: Twist the narrative. Take information, or artefacts like images, and change the framing around them.

**Tactic**: TA02 Plan Objectives


| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

@ -19,45 +19,4 @@

# Technique T0077: Distract

**Summary**: Shift attention to a different narrative or actor, for instance by accusing critics of the same activity that they’ve accused you of (e.g. police brutality).

**Tactic**: TA02 Plan Objectives


| Associated Technique | Description |
| --------- | ------------------------- |


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW