# DISARM Techniques:
<table border="1">
<tr>
<th>disarm_id</th>
<th>name</th>
<th>summary</th>
<th>tactic_id</th>
</tr>
<tr>
<td><a href="techniques/T0002.md">T0002</a></td>
<td>Facilitate State Propaganda</td>
<td>Organise citizens around pro-state messaging. Coordinate paid or volunteer groups to push state propaganda.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0003.md">T0003</a></td>
<td>Leverage Existing Narratives</td>
<td>Use or adapt existing narrative themes, where narratives are the baseline stories of a target audience. Narratives form the bedrock of our worldviews. New information is understood through a process firmly grounded in this bedrock. If new information is not consistent with the prevailing narratives of an audience, it will be ignored. Effective campaigns will frame their misinformation in the context of these narratives. Highly effective campaigns will make extensive use of audience-appropriate archetypes and meta-narratives throughout their content creation and amplification practices.</td>
<td>TA14</td>
</tr>
<tr>
<td><a href="techniques/T0004.md">T0004</a></td>
<td>Develop Competing Narratives</td>
<td>Advance competing narratives connected to the same issue, ie: on the one hand denying an incident while at the same time dismissing it. Suppressing or discouraging narratives already spreading requires an alternative. The simplest set of narrative techniques in response would be the construction and promotion of contradictory alternatives centred on denial, deflection, dismissal, counter-charges, excessive standards of proof, bias in prohibition or enforcement, and so on. These competing narratives allow loyalists cover, but are less compelling to opponents and fence-sitters than campaigns built around existing narratives or highly explanatory master narratives. Competing narratives, as such, are especially useful in the "firehose of misinformation" approach.</td>
<td>TA14</td>
</tr>
<tr>
<td><a href="techniques/T0010.md">T0010</a></td>
<td>Cultivate Ignorant Agents</td>
<td>Cultivate propagandists for a cause, the goals of which are not fully comprehended, and who are used cynically by the leaders of the cause. Independent actors use social media and specialised web sites to strategically reinforce and spread messages compatible with their own. Their networks are infiltrated and used by state media disinformation organisations to amplify the state’s own disinformation strategies against target populations. Many are traffickers in conspiracy theories or hoaxes, unified by a suspicion of Western governments and mainstream media. Their narratives, which appeal to leftists hostile to globalism and military intervention and nationalists against immigration, are frequently infiltrated and shaped by state-controlled trolls and altered news items from agencies such as RT and Sputnik. Also known as "useful idiots" or "unwitting agents".</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0014.md">T0014</a></td>
<td>Prepare Fundraising Campaigns</td>
<td>Fundraising campaigns refer to an influence operation’s systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while raising a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipeee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns (see: Develop Information Pathways) to promote operation messaging while raising money to support its activities.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0014.001.md">T0014.001</a></td>
<td>Raise Funds from Malign Actors</td>
<td>Raising funds from malign actors may include contributions from foreign agents, cutouts or proxies, shell companies, dark money groups, etc.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0014.002.md">T0014.002</a></td>
<td>Raise Funds from Ignorant Agents</td>
<td>Raising funds from ignorant agents may include scams, donations intended for one stated purpose but then used for another, etc.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0015.md">T0015</a></td>
<td>Create Hashtags and Search Artefacts</td>
<td>Create one or more hashtags and/or hashtag groups. Many incident-based campaigns will create hashtags to promote their fabricated event. Creating a hashtag for an incident can have two important effects: 1. Create a perception of reality around an event. Certainly only "real" events would be discussed in a hashtag. After all, the event has a name! 2. Publicise the story more widely through trending lists and search behaviour. An asset is needed to direct/control/manage the "conversation" connected to launching a new incident/campaign with a new hashtag on applicable social media sites.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0015.001.md">T0015.001</a></td>
<td>Use Existing Hashtag</td>
<td>Use a dedicated, existing hashtag for the campaign/incident. This Technique covers behaviours previously documented by T0104.005: Use Hashtags, which has since been deprecated.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0015.002.md">T0015.002</a></td>
<td>Create New Hashtag</td>
<td>Create a campaign/incident-specific hashtag. This Technique covers behaviours previously documented by T0104.006: Create Dedicated Hashtag, which has since been deprecated.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0016.md">T0016</a></td>
<td>Create Clickbait</td>
<td>Create attention-grabbing headlines (outrage, doubt, humour) required to drive traffic & engagement. This is a key asset.</td>
<td>TA05</td>
</tr>
<tr>
<td><a href="techniques/T0017.md">T0017</a></td>
<td>Conduct Fundraising</td>
<td>Fundraising campaigns refer to an influence operation’s systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while raising a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipeee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns to promote operation messaging while raising money to support its activities.</td>
<td>TA10</td>
</tr>
<tr>
<td><a href="techniques/T0017.001.md">T0017.001</a></td>
<td>Conduct Crowdfunding Campaigns</td>
<td>An influence operation may Conduct Crowdfunding Campaigns on platforms such as GoFundMe, GiveSendGo, Tipeee, Patreon, etc.</td>
<td>TA10</td>
</tr>
<tr>
<td><a href="techniques/T0018.md">T0018</a></td>
<td>Purchase Targeted Advertisements</td>
<td>Create or fund advertisements targeted at specific populations.</td>
<td>TA05</td>
</tr>
<tr>
<td><a href="techniques/T0020.md">T0020</a></td>
<td>Trial Content</td>
<td>Iteratively test incident performance (messages, content etc), e.g. A/B test headline/content engagement metrics; website and/or funding campaign conversion rates.</td>
<td>TA08</td>
</tr>
<tr>
<td><a href="techniques/T0022.md">T0022</a></td>
<td>Leverage Conspiracy Theory Narratives</td>
<td>"Conspiracy narratives" appeal to the human desire for explanatory order, by invoking the participation of powerful (often sinister) actors in pursuit of their own political goals. These narratives are especially appealing when an audience is low-information, marginalised or otherwise inclined to reject the prevailing explanation. Conspiracy narratives are an important component of the "firehose of falsehoods" model.</td>
<td>TA14</td>
</tr>
<tr>
<td><a href="techniques/T0022.001.md">T0022.001</a></td>
<td>Amplify Existing Conspiracy Theory Narratives</td>
<td>An influence operation may amplify an existing conspiracy theory narrative that aligns with its incident or campaign goals. By amplifying existing conspiracy theory narratives, operators can leverage the power of the existing communities that support and propagate those theories without needing to expend resources creating new narratives or building momentum and buy-in around new narratives.</td>
<td>TA14</td>
</tr>
<tr>
<td><a href="techniques/T0022.002.md">T0022.002</a></td>
<td>Develop Original Conspiracy Theory Narratives</td>
<td>While this requires more resources than amplifying existing conspiracy theory narratives, an influence operation may develop original conspiracy theory narratives in order to achieve greater control and alignment over the narrative and their campaign goals. Prominent examples include the USSR's Operation INFEKTION disinformation campaign run by the KGB in the 1980s to plant the idea that the United States had invented HIV/AIDS as part of a biological weapons research project at Fort Detrick, Maryland. More recently, Fort Detrick featured prominently in new conspiracy theory narratives around the origins of the COVID-19 outbreak and pandemic.</td>
<td>TA14</td>
</tr>
<tr>
<td><a href="techniques/T0023.md">T0023</a></td>
<td>Distort Facts</td>
<td>Change, twist, or exaggerate existing facts to construct a narrative that differs from reality. Examples: images and ideas can be distorted by being placed in an improper context.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0023.001.md">T0023.001</a></td>
<td>Reframe Context</td>
<td>Reframing context refers to removing an event from its surrounding context to distort its intended meaning. Rather than deny that an event occurred, reframing context frames an event in a manner that may lead the target audience to draw a different conclusion about its intentions.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0023.002.md">T0023.002</a></td>
<td>Edit Open-Source Content</td>
<td>An influence operation may edit open-source content, such as collaborative blogs or encyclopaedias, to promote its narratives on outlets with existing credibility and audiences. Editing open-source content may allow an operation to post content on platforms without dedicating resources to the creation and maintenance of its own assets.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0029.md">T0029</a></td>
<td>Online Polls</td>
<td>Create fake online polls, or manipulate existing online polls. Data gathering tactic to target those who engage, and potentially their networks of friends/followers as well.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0039.md">T0039</a></td>
<td>Bait Influencer</td>
<td>Influencers are people on social media platforms who have large audiences.<br /> <br />Threat Actors can try to trick Influencers such as celebrities, journalists, or local leaders who aren’t associated with their campaign into amplifying campaign content. This gives them access to the Influencer’s audience without having to go through the effort of building it themselves, and it helps legitimise their message by associating it with the Influencer, benefitting from their audience’s trust in them.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0040.md">T0040</a></td>
<td>Demand Insurmountable Proof</td>
<td>Campaigns often leverage tactical and informational asymmetries on the threat surface, as seen in the Distort and Deny strategies, and the "firehose of misinformation". Specifically, conspiracy theorists can be repeatedly wrong, but advocates of the truth need to be perfect. By constantly escalating demands for proof, propagandists can effectively leverage this asymmetry while also priming its future use, often with an even greater asymmetric advantage. The conspiracist is offered freer rein for a broader range of "questions" while the truth teller is burdened with higher and higher standards of proof.</td>
<td>TA14</td>
</tr>
<tr>
<td><a href="techniques/T0042.md">T0042</a></td>
<td>Seed Kernel of Truth</td>
<td>Wrap lies or altered context/facts around truths. Influence campaigns pursue a variety of objectives with respect to target audiences, prominent among them: 1. undermine a narrative commonly referenced in the target audience; or 2. promote a narrative less common in the target audience, but preferred by the attacker. In both cases, the attacker is presented with a heavy lift. They must change the relative importance of various narratives in the interpretation of events, despite contrary tendencies. When messaging makes use of factual reporting to promote these adjustments in the narrative space, they are less likely to be dismissed out of hand; when messaging can juxtapose a (factual) truth about current affairs with the (abstract) truth explicated in these narratives, propagandists can undermine or promote them selectively. Context matters.</td>
<td>TA08</td>
</tr>
<tr>
<td><a href="techniques/T0044.md">T0044</a></td>
<td>Seed Distortions</td>
<td>Try a wide variety of messages in the early hours surrounding an incident or event, to give a misleading account or impression.</td>
<td>TA08</td>
</tr>
<tr>
<td><a href="techniques/T0045.md">T0045</a></td>
<td>Use Fake Experts</td>
<td>Use the fake experts that were set up during Establish Legitimacy. Pseudo-experts are disposable assets that often appear once and then disappear. Give "credibility" to misinformation. Take advantage of credential bias.</td>
<td>TA08</td>
</tr>
<tr>
<td><a href="techniques/T0046.md">T0046</a></td>
<td>Use Search Engine Optimisation</td>
<td>Manipulate content engagement metrics (ie: on Reddit & Twitter) to influence/impact news search results (e.g. Google); this also elevates RT & Sputnik headlines into Google news alert emails. Aka "black-hat SEO".</td>
<td>TA08</td>
</tr>
<tr>
<td><a href="techniques/T0047.md">T0047</a></td>
<td>Censor Social Media as a Political Force</td>
<td>Use political influence or the power of the state to stop critical social media comments. Government-requested/driven content takedowns (see Google Transparency reports).</td>
<td>TA18</td>
</tr>
<tr>
<td><a href="techniques/T0048.md">T0048</a></td>
<td>Harass</td>
<td>Threatening or harassing believers of opposing narratives refers to the use of intimidation techniques, including cyberbullying and doxing, to discourage opponents from voicing their dissent. An influence operation may threaten or harass believers of the opposing narratives to deter individuals from posting or proliferating conflicting content.</td>
<td>TA18</td>
</tr>
<tr>
<td><a href="techniques/T0048.001.md">T0048.001</a></td>
<td>Boycott/"Cancel" Opponents</td>
<td>Cancel culture refers to the phenomenon in which individuals collectively refrain from supporting an individual, organisation, business, or other entity, usually following a real or falsified controversy. An influence operation may exploit cancel culture by emphasising an adversary’s problematic or disputed behaviour and presenting its own content as an alternative.</td>
<td>TA18</td>
</tr>
<tr>
<td><a href="techniques/T0048.002.md">T0048.002</a></td>
<td>Harass People Based on Identities</td>
<td>Examples include social identities like gender, sexuality, race, ethnicity, religion, ability, nationality, etc. as well as roles and occupations like journalist or activist.</td>
<td>TA18</td>
</tr>
<tr>
<td><a href="techniques/T0048.003.md">T0048.003</a></td>
<td>Threaten to Dox</td>
<td>Doxing refers to online harassment in which individuals publicly release private information about another individual, including names, addresses, employment information, pictures, family members, and other sensitive information. An influence operation may dox its opposition to encourage individuals aligned with operation narratives to harass the doxed individuals themselves or otherwise discourage the doxed individuals from posting or proliferating conflicting content.</td>
<td>TA18</td>
</tr>
<tr>
<td><a href="techniques/T0048.004.md">T0048.004</a></td>
<td>Dox</td>
<td>Doxing refers to online harassment in which individuals publicly release private information about another individual, including names, addresses, employment information, pictures, family members, and other sensitive information. An influence operation may dox its opposition to encourage individuals aligned with operation narratives to harass the doxed individuals themselves or otherwise discourage the doxed individuals from posting or proliferating conflicting content.</td>
<td>TA18</td>
</tr>
<tr>
<td><a href="techniques/T0049.md">T0049</a></td>
<td>Flood Information Space</td>
<td>Flooding sources of information (e.g. Social Media feeds) with a high volume of inauthentic content.<br /> <br />This can be done to control/shape online conversations, drown out opposing points of view, or make it harder to find legitimate information.<br /> <br />Bots and/or patriotic trolls are effective tools to achieve this effect.<br /> <br />This Technique previously used the name Flooding the Information Space.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0049.001.md">T0049.001</a></td>
<td>Trolls Amplify and Manipulate</td>
<td>Use trolls to amplify narratives and/or manipulate narratives. Fake profiles/sockpuppets operating to support individuals/narratives from the entire political spectrum (left/right binary). Operating with increased emphasis on promoting local content and promoting real Twitter users generating their own, often divisive political content, as it's easier to amplify existing content than create new/original content. Trolls operate wherever there's a socially divisive issue (issues that can be/are politicised).</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0049.002.md">T0049.002</a></td>
<td>Flood Existing Hashtag</td>
<td>Hashtags can be used by communities to collate information they post about particular topics (such as their interests, or current events) and users can find communities to join by exploring hashtags they’re interested in.<br /> <br />Threat actors can flood an existing hashtag to try to ruin hashtag functionality, posting content unrelated to the hashtag alongside it, making it a less reliable source of relevant information. They may also try to flood existing hashtags with campaign content, with the intent of maximising exposure to users.<br /> <br />This Technique covers cases where threat actors flood existing hashtags with campaign content.<br /> <br />This Technique covers behaviours previously documented by T0019.002: Hijack Hashtags, which has since been deprecated. This Technique was previously called Hijack Existing Hashtag.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0049.003.md">T0049.003</a></td>
<td>Bots Amplify via Automated Forwarding and Reposting</td>
<td>Automated forwarding and reposting refer to the proliferation of operation content using automated means, such as artificial intelligence or social media bots. An influence operation may use automated activity to increase content exposure without dedicating the resources, including personnel and time, traditionally required to forward and repost content. Use bots to amplify narratives above algorithm thresholds. Bots are automated/programmed profiles designed to amplify content (ie: automatically retweet or like) and give the appearance it's more "popular" than it is. They can operate as a network, to function in a coordinated/orchestrated manner. In some cases (more so now) they are inexpensive/disposable assets used for minimal deployment as bot detection tools improve and platforms are more responsive.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0049.004.md">T0049.004</a></td>
<td>Utilise Spamoflauge</td>
<td>Spamoflauge refers to the practice of disguising spam messages as legitimate. Spam refers to the use of electronic messaging systems to send out unrequested or unwanted messages in bulk. Simple methods of spamoflauge include replacing letters with numbers to fool keyword-based email spam filters, for example, "you've w0n our jackp0t!". Spamoflauge may extend to more complex techniques such as modifying the grammar or word choice of the language, casting messages as images which spam detectors cannot automatically read, or encapsulating messages in password protected attachments, such as .pdf or .zip files. Influence operations may use spamoflauge to avoid spam filtering systems and increase the likelihood of the target audience receiving operation messaging.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0049.005.md">T0049.005</a></td>
<td>Conduct Swarming</td>
<td>Swarming refers to the coordinated use of accounts to overwhelm the information space with operation content. Unlike information flooding, swarming centres exclusively around a specific event or actor rather than a general narrative. Swarming relies on “horizontal communication” between information assets rather than a top-down, vertical command-and-control approach.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0049.006.md">T0049.006</a></td>
<td>Conduct Keyword Squatting</td>
<td>Keyword squatting refers to the creation of online content, such as websites, articles, or social media accounts, around a specific search engine-optimized term to overwhelm the search results of that term. An influence operation may keyword squat to increase content exposure to target audience members who query the exploited term in a search engine and manipulate the narrative around the term.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0049.007.md">T0049.007</a></td>
<td>Inauthentic Sites Amplify News and Narratives</td>
<td>Inauthentic sites circulate and cross-post stories and amplify narratives. Often these sites have no masthead, bylines or attribution.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0049.008.md">T0049.008</a></td>
<td>Generate Information Pollution</td>
<td>Information Pollution occurs when threat actors attempt to ruin a source of information by flooding it with lots of inauthentic or unreliable content, intending to make it harder for legitimate users to find the information they’re looking for.<br /> <br />This sub-technique’s objective is to reduce exposure to target information, rather than promoting exposure to campaign content, for which the parent Technique T0049 can be used.<br /> <br />Analysts will need to infer what the motive for flooding an information space was when deciding whether to use T0049 or T0049.008 to tag a case when an information space is flooded. If such inference is not possible, default to T0049.<br /> <br />This Technique previously used the ID T0019.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0057.md">T0057</a></td>
<td>Organise Events</td>
<td>Coordinate and promote real-world events across media platforms, e.g. rallies, protests, gatherings in support of incident narratives.</td>
<td>TA10</td>
</tr>
<tr>
<td><a href="techniques/T0057.001.md">T0057.001</a></td>
<td>Pay for Physical Action</td>
<td>Paying for physical action occurs when an influence operation pays individuals to act in the physical realm. An influence operation may pay for physical action to create specific situations and frame them in a way that supports operation narratives, for example, paying a group of people to burn a car to later post an image of the burning car and frame it as an act of protest.</td>
<td>TA10</td>
</tr>
<tr>
<td><a href="techniques/T0057.002.md">T0057.002</a></td>
<td>Conduct Symbolic Action</td>
<td>Symbolic action refers to activities specifically intended to advance an operation’s narrative by signalling something to the audience, for example, a military parade supporting a state’s narrative of military superiority. An influence operation may use symbolic action to create falsified evidence supporting operation narratives in the physical information space.</td>
<td>TA10</td>
</tr>
<tr>
<td><a href="techniques/T0059.md">T0059</a></td>
<td>Play the Long Game</td>
<td>Play the long game refers to two phenomena: 1. To plan messaging and allow it to grow organically without conducting your own amplification. This is methodical and slow and requires years for the message to take hold. 2. To develop a series of seemingly disconnected messaging narratives that eventually combine into a new narrative.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0060.md">T0060</a></td>
<td>Continue to Amplify</td>
<td>Continue narrative or message amplification after the main incident work has finished.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0061.md">T0061</a></td>
<td>Sell Merchandise</td>
<td>Sell merchandise refers to getting the message or narrative into physical space in the offline world while making money.</td>
<td>TA10</td>
</tr>
<tr>
<td><a href="techniques/T0065.md">T0065</a></td>
<td>Prepare Physical Broadcast Capabilities</td>
<td>Create or coopt broadcast capabilities (e.g. TV, radio etc).</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0066.md">T0066</a></td>
<td>Degrade Adversary</td>
<td>Plan to degrade an adversary’s image or ability to act. This could include preparation and use of harmful information about the adversary’s actions or reputation.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0068.md">T0068</a></td>
<td>Respond to Breaking News Event or Active Crisis</td>
<td>Media attention on a story or event is heightened during a breaking news event, where unclear facts and incomplete information increase speculation, rumours, and conspiracy theories, which are all vulnerable to manipulation.</td>
<td>TA14</td>
</tr>
<tr>
<td><a href="techniques/T0072.md">T0072</a></td>
<td>Segment Audiences</td>
<td>Create audience segmentations by features of interest to the influence campaign, including political affiliation, geographic location, income, demographics, and psychographics.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0072.001.md">T0072.001</a></td>
<td>Geographic Segmentation</td>
<td>An influence operation may target populations in a specific geographic location, such as a region, state, or city. An influence operation may use geographic segmentation to Create Localised Content (see: Establish Legitimacy).</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0072.002.md">T0072.002</a></td>
<td>Demographic Segmentation</td>
<td>An influence operation may target populations based on demographic segmentation, including age, gender, and income. Demographic segmentation may be useful for influence operations aiming to change state policies that affect a specific population sector. For example, an influence operation attempting to influence Medicare funding in the United States would likely target U.S. voters over 65 years of age.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0072.003.md">T0072.003</a></td>
<td>Economic Segmentation</td>
<td>An influence operation may target populations based on their income bracket, wealth, or other financial or economic division.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0072.004.md">T0072.004</a></td>
<td>Psychographic Segmentation</td>
<td>An influence operation may target populations based on psychographic segmentation, which uses audience values and decision-making processes. An operation may individually gather psychographic data with its own surveys or collection tools or externally purchase data from social media companies or online surveys, such as personality quizzes.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0072.005.md">T0072.005</a></td>
<td>Political Segmentation</td>
<td>An influence operation may target populations based on their political affiliations, especially when aiming to manipulate voting or change policy.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0073.md">T0073</a></td>
<td>Determine Target Audiences</td>
<td>Determining the target audiences (segments of the population) who will receive campaign narratives and artefacts intended to achieve the strategic ends.</td>
<td>TA01</td>
</tr>
<tr>
<td><a href="techniques/T0074.md">T0074</a></td>
<td>Determine Strategic Ends</td>
<td>These are the long-term end-states the campaign aims to bring about. They typically involve an advantageous position vis-à-vis competitors in terms of power or influence. The strategic goal may be to improve or simply to hold one’s position. Competition occurs in the public sphere in the domains of war, diplomacy, politics, economics, and ideology, and can play out between armed groups, nation-states, political parties, corporations, interest groups, or individuals.</td>
<td>TA01</td>
</tr>
<tr>
<td><a href="techniques/T0074.001.md">T0074.001</a></td>
<td>Geopolitical Advantage</td>
<td>Favourable position on the international stage in terms of great power politics or regional rivalry. Geopolitics plays out in the realms of foreign policy, national security, diplomacy, and intelligence. It involves nation-state governments, heads of state, foreign ministers, intergovernmental organisations, and regional security alliances.</td>
<td>TA01</td>
</tr>
<tr>
<td><a href="techniques/T0074.002.md">T0074.002</a></td>
<td>Domestic Political Advantage</td>
<td>Favourable position vis-à-vis national or sub-national political opponents such as political parties, interest groups, politicians, candidates.</td>
<td>TA01</td>
</tr>
<tr>
<td><a href="techniques/T0074.003.md">T0074.003</a></td>
<td>Economic Advantage</td>
<td>Favourable position domestically or internationally in the realms of commerce, trade, finance, industry. Economics involves nation-states, corporations, banks, trade blocs, industry associations, cartels.</td>
<td>TA01</td>
</tr>
<tr>
<td><a href="techniques/T0074.004.md">T0074.004</a></td>
<td>Ideological Advantage</td>
<td>Favourable position domestically or internationally in the market for ideas, beliefs, and world views. Competition plays out among faith systems, political systems, and value systems. It can involve sub-national, national or supra-national movements.</td>
<td>TA01</td>
</tr>
<tr>
<td><a href="techniques/T0075.md">T0075</a></td>
<td>Dismiss</td>
<td>Push back against criticism by dismissing your critics. This might be arguing that the critics use a different standard for you than for other actors or themselves; or arguing that their criticism is biased.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0075.001.md">T0075.001</a></td>
<td>Discredit Credible Sources</td>
<td>Plan to delegitimize the media landscape and degrade public trust in reporting, by discrediting credible sources. This makes it easier to promote influence operation content.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0076.md">T0076</a></td>
<td>Distort</td>
<td>Twist the narrative. Take information, or artefacts like images, and change the framing around them.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0077.md">T0077</a></td>
<td>Distract</td>
<td>Shift attention to a different narrative or actor, for instance by accusing critics of the same activity that they’ve accused you of (e.g. police brutality).</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0078.md">T0078</a></td>
<td>Dismay</td>
<td>Threaten the critic or narrator of events. For instance, threaten journalists or news outlets reporting on a story.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0079.md">T0079</a></td>
<td>Divide</td>
<td>Create conflict between subgroups, to widen divisions in a community.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0080.md">T0080</a></td>
<td>Map Target Audience Information Environment</td>
<td>Mapping the target audience information environment analyses the information space itself, including social media analytics, web traffic, and media surveys. Mapping the information environment may help the influence operation determine the most realistic and popular information channels to reach its target audience. Mapping the target audience information environment aids influence operations in determining the most vulnerable areas of the information space to target with messaging.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0080.001.md">T0080.001</a></td>
<td>Monitor Social Media Analytics</td>
<td>An influence operation may use social media analytics to determine which factors will increase the operation content’s exposure to its target audience on social media platforms, including views, interactions, and sentiment relating to topics and content types. The social media platform itself or a third-party tool may collect the metrics.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0080.002.md">T0080.002</a></td>
<td>Evaluate Media Surveys</td>
<td>An influence operation may evaluate its own or third-party media surveys to determine what type of content appeals to its target audience. Media surveys may provide insight into an audience’s political views, social class, general interests, or other indicators used to tailor operation messaging to its target audience.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0080.003.md">T0080.003</a></td>
<td>Identify Trending Topics/Hashtags</td>
<td>An influence operation may identify trending hashtags on social media platforms for later use in boosting operation content. A hashtag refers to a word or phrase preceded by the hash symbol (#) on social media used to identify messages and posts relating to a specific topic. All public posts that use the same hashtag are aggregated onto a centralised page dedicated to the word or phrase and sorted either chronologically or by popularity.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0080.004.md">T0080.004</a></td>
<td>Conduct Web Traffic Analysis</td>
<td>An influence operation may conduct web traffic analysis to determine which search engines, keywords, websites, and advertisements gain the most traction with its target audience.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0080.005.md">T0080.005</a></td>
<td>Assess Degree/Type of Media Access</td>
<td>An influence operation may survey a target audience’s Internet availability and degree of media freedom to determine which target audience members will have access to operation content and on which platforms. An operation may face more difficulty targeting an information environment with heavy restrictions and media control than an environment with independent media, freedom of speech and of the press, and individual liberties.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0081.md">T0081</a></td>
<td>Identify Social and Technical Vulnerabilities</td>
<td>Identifying social and technical vulnerabilities determines weaknesses within the target audience information environment for later exploitation. Vulnerabilities include decisive political issues, weak cybersecurity infrastructure, search engine data voids, and other technical and non-technical weaknesses in the target information environment. Identifying social and technical vulnerabilities facilitates the later exploitation of the identified weaknesses to advance operation objectives.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0081.001.md">T0081.001</a></td>
<td>Find Echo Chambers</td>
<td>Find or plan to create areas (social media groups, search term groups, hashtag groups etc) where individuals only engage with people they agree with.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0081.002.md">T0081.002</a></td>
<td>Identify Data Voids</td>
<td>A data void refers to a word or phrase that results in little, manipulative, or low-quality search engine data. Data voids are hard to detect and relatively harmless until exploited by an entity aiming to quickly proliferate false or misleading information during a phenomenon that causes a high number of individuals to query the term or phrase. In the Plan phase, an influence operation may identify data voids for later exploitation in the operation. A 2019 report by Michael Golebiewski identifies five types of data voids. (1) “Breaking news” data voids occur when a keyword gains popularity during a short period of time, allowing an influence operation to publish false content before legitimate news outlets have an opportunity to publish relevant information. (2) An influence operation may create a “strategic new terms” data void by creating their own terms and publishing information online before promoting their keyword to the target audience. (3) An influence operation may publish content on “outdated terms” that have decreased in popularity, capitalising on most search engines’ preferences for recency. (4) “Fragmented concepts” data voids separate connections between similar ideas, isolating segment queries to distinct search engine results. (5) An influence operation may use “problematic queries” that previously resulted in disturbing or inappropriate content to promote messaging until mainstream media recontextualizes the term.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0081.003.md">T0081.003</a></td>
<td>Identify Existing Prejudices</td>
<td>An influence operation may exploit existing racial, religious, demographic, or social prejudices to further polarise its target audience from the rest of the public.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0081.004.md">T0081.004</a></td>
<td>Identify Existing Fissures</td>
<td>An influence operation may identify existing fissures to pit target populations against one another or facilitate a “divide-and-conquer" approach to tailor operation narratives along the divides.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0081.005.md">T0081.005</a></td>
<td>Identify Existing Conspiracy Narratives/Suspicions</td>
<td>An influence operation may assess preexisting conspiracy theories or suspicions in a population to identify existing narratives that support operational objectives.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0081.006.md">T0081.006</a></td>
<td>Identify Wedge Issues</td>
<td>A wedge issue is a divisive political issue, usually concerning a social phenomenon, that divides individuals along a defined line. An influence operation may exploit wedge issues by intentionally polarising the public along the wedge issue line and encouraging opposition between factions.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0081.007.md">T0081.007</a></td>
<td>Identify Target Audience Adversaries</td>
<td>An influence operation may identify or create a real or imaginary adversary to centre operation narratives against. A real adversary may include certain politicians or political parties while imaginary adversaries may include falsified “deep state” actors that, according to conspiracies, run the state behind public view.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0081.008.md">T0081.008</a></td>
<td>Identify Media System Vulnerabilities</td>
<td>An influence operation may exploit existing weaknesses in a target’s media system. These weaknesses may include existing biases among media agencies, vulnerability to false news agencies on social media, or existing distrust of traditional media sources. An existing distrust among the public in the media system’s credibility holds high potential for exploitation by an influence operation when establishing alternative news agencies to spread operation content.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0082.md">T0082</a></td>
<td>Develop New Narratives</td>
<td>Actors may develop new narratives to further strategic or tactical goals, especially when existing narratives inadequately align with the campaign goals. New narratives provide more control in terms of crafting the message to achieve specific goals. However, new narratives may require more effort to disseminate than adapting or adopting existing narratives.</td>
<td>TA14</td>
</tr>
<tr>
<td><a href="techniques/T0083.md">T0083</a></td>
<td>Integrate Target Audience Vulnerabilities into Narrative</td>
<td>An influence operation may seek to exploit the preexisting weaknesses, fears, and enemies of the target audience for integration into the operation’s narratives and overall strategy. Integrating existing vulnerabilities into the operational approach conserves resources by exploiting already weak areas of the target information environment instead of forcing the operation to create new vulnerabilities in the environment.</td>
<td>TA14</td>
</tr>
<tr>
<td><a href="techniques/T0084.md">T0084</a></td>
<td>Reuse Existing Content</td>
<td>When an operation recycles content from its own previous operations or plagiarises from external operations. An operation may launder information to conserve resources that would have otherwise been utilised to develop new content.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0084.001.md">T0084.001</a></td>
<td>Use Copypasta</td>
<td>Copypasta refers to a piece of text that has been copied and pasted multiple times across various online platforms. A copypasta’s final form may differ from its original source text as users add, delete, or otherwise edit the content as they repost the text.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0084.002.md">T0084.002</a></td>
<td>Plagiarise Content</td>
<td>An influence operation may take content from other sources without proper attribution. This content may be either misinformation content shared by others without malicious intent but now leveraged by the campaign as disinformation or disinformation content from other sources.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0084.003.md">T0084.003</a></td>
<td>Deceptively Labelled or Translated</td>
<td>An influence operation may take authentic content from other sources and add deceptive labels or deceptively translate the content into other languages.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0084.004.md">T0084.004</a></td>
<td>Appropriate Content</td>
<td>An influence operation may take content from other sources with proper attribution. This content may be either misinformation content shared by others without malicious intent but now leveraged by the campaign as disinformation or disinformation content from other sources. Examples include the appropriation of content from one inauthentic news site to another inauthentic news site or network in ways that align with the originator's licensing or terms of service.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0085.md">T0085</a></td>
<td>Develop Text-Based Content</td>
<td>Creating and editing false or misleading text-based artefacts, often aligned with one or more specific narratives, for use in a disinformation campaign.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0085.001.md">T0085.001</a></td>
<td>Develop AI-Generated Text</td>
<td>AI-generated text refers to synthetic text composed by computers using text-generating AI technology. Autonomous generation refers to content created by a bot without human input, also known as bot-created content generation. Autonomous generation represents the next step in automation after language generation and may lead to automated journalism. An influence operation may use read fakes or autonomous generation to quickly develop and distribute content to the target audience.<br><br><b>Associated Techniques and Sub-techniques:</b><br><b>T0085.008: Machine Translated Text:</b> Use this sub-technique when AI has been used to generate a translation of a piece of text.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0085.003.md">T0085.003</a></td>
<td>Develop Inauthentic News Articles</td>
<td>An influence operation may develop false or misleading news articles aligned to their campaign goals or narratives.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0085.004.md">T0085.004</a></td>
<td>Develop Document</td>
<td>Produce text in the form of a document.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0085.005.md">T0085.005</a></td>
<td>Develop Book</td>
<td>Produce text content in the form of a book. <br /> <br />This technique covers both e-books and physical books; however, the former is more easily deployed by threat actors given the lower cost to develop.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0085.006.md">T0085.006</a></td>
<td>Develop Opinion Article</td>
<td>Opinion articles (aka “Op-Eds” or “Editorials”) are articles or regular columns flagged as “opinion” posted to news sources, and can be contributed by people outside the organisation. <br /> <br />Flagging articles as opinions allows news organisations to distinguish them from the typical expectations of objective news reporting while distancing the presented opinion from the organisation or its employees.<br /> <br />The use of this technique is not by itself an indication of malicious or inauthentic content; Op-eds are a common format in media. However, threat actors exploit op-eds to, for example, submit opinion articles to local media to promote their narratives. <br /> <br />Examples from the perspective of a news site involve publishing op-eds from perceived prestigious voices to give legitimacy to an inauthentic publication, or supporting causes by hosting op-eds from actors aligned with the organisation’s goals.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0085.007.md">T0085.007</a></td>
<td>Create Fake Research</td>
<td>Create fake academic research. Example: fake social science research is often aimed at hot-button social issues such as gender, race and sexuality. Fake science research can target the climate science debate or pseudoscience like anti-vaxx.<br /> <br />This Technique previously used the ID T0019.001.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0085.008.md">T0085.008</a></td>
<td>Machine Translated Text</td>
<td>Text which has been translated into another language using machine translation tools, such as AI.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0086.md">T0086</a></td>
<td>Develop Image-Based Content</td>
<td>Creating and editing false or misleading visual artefacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include photographing staged real-life situations, repurposing existing digital images, or using image creation and editing technologies.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0086.001.md">T0086.001</a></td>
<td>Develop Memes</td>
<td>Memes are one of the most important single artefact types in all of computational propaganda. Memes in this framework denote the narrow image-based definition. But that naming is no accident, as these items have most of the important properties of Dawkins' original conception as a self-replicating unit of culture. Memes pull together reference and commentary; image and narrative; emotion and message. Memes are a powerful tool and the heart of modern influence campaigns.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0086.002.md">T0086.002</a></td>
<td>Develop AI-Generated Images (Deepfakes)</td>
<td>Deepfakes refer to AI-generated falsified photos, videos, or soundbites. An influence operation may use deepfakes to depict an inauthentic situation by synthetically recreating an individual’s face, body, voice, and physical gestures.<br><br> <b>Associated Techniques and Sub-techniques:</b><br> <b>T0145.002: AI-Generated Account Imagery:</b> Analysts should use this sub-technique to document use of AI generated imagery in accounts’ profile pictures or other account imagery.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0086.003.md">T0086.003</a></td>
<td>Deceptively Edit Images (Cheap Fakes)</td>
<td>Cheap fakes utilise less sophisticated measures of altering an image, video, or audio, for example slowing, speeding, or cutting footage to create a false context surrounding an image or event.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0086.004.md">T0086.004</a></td>
<td>Aggregate Information into Evidence Collages</td>
<td>Image files that aggregate positive evidence (Joan Donovan)</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0087.md">T0087</a></td>
<td>Develop Video-Based Content</td>
<td>Creating and editing false or misleading video artefacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include staging videos of purportedly real situations, repurposing existing video artefacts, or using AI-generated video creation and editing technologies (including deepfakes).</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0087.001.md">T0087.001</a></td>
<td>Develop AI-Generated Videos (Deepfakes)</td>
<td>Deepfakes refer to AI-generated falsified photos, videos, or soundbites. An influence operation may use deepfakes to depict an inauthentic situation by synthetically recreating an individual’s face, body, voice, and physical gestures.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0087.002.md">T0087.002</a></td>
<td>Deceptively Edit Video (Cheap Fakes)</td>
<td>Cheap fakes utilise less sophisticated measures of altering an image, video, or audio, for example slowing, speeding, or cutting footage to create a false context surrounding an image or event.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0088.md">T0088</a></td>
<td>Develop Audio-Based Content</td>
<td>Creating and editing false or misleading audio artefacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include creating completely new audio content, repurposing existing audio artefacts (including cheap fakes), or using AI-generated audio creation and editing technologies (including deepfakes).</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0088.001.md">T0088.001</a></td>
<td>Develop AI-Generated Audio (Deepfakes)</td>
<td>Deepfakes refer to AI-generated falsified photos, videos, or soundbites. An influence operation may use deepfakes to depict an inauthentic situation by synthetically recreating an individual’s face, body, voice, and physical gestures.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0088.002.md">T0088.002</a></td>
<td>Deceptively Edit Audio (Cheap Fakes)</td>
<td>Cheap fakes utilise less sophisticated measures of altering an image, video, or audio, for example slowing, speeding, or cutting footage to create a false context surrounding an image or event.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0089.md">T0089</a></td>
<td>Obtain Private Documents</td>
<td>Procuring documents that are not publicly available, by whatever means -- whether legal or illegal, highly-resourced or less so. These documents can include authentic non-public documents, authentic non-public documents that have been altered, or inauthentic documents intended to appear as if they are authentic non-public documents. All of these types of documents can be "leaked" during later stages in the operation.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0089.001.md">T0089.001</a></td>
<td>Obtain Authentic Documents</td>
<td>Procure authentic documents that are not publicly available, by whatever means -- whether legal or illegal, highly-resourced or less so. These documents can be "leaked" during later stages in the operation.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0089.003.md">T0089.003</a></td>
<td>Alter Authentic Documents</td>
<td>Alter authentic documents (public or non-public) to achieve campaign goals. The altered documents are intended to appear as if they are authentic and can be "leaked" during later stages in the operation.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0091.md">T0091</a></td>
<td>Recruit Malign Actors</td>
<td>Operators recruit malign actors by paying, recruiting, or exerting control over individuals, including trolls, partisans, and contractors.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0091.001.md">T0091.001</a></td>
<td>Recruit Contractors</td>
<td>Operators recruit paid contractors to support the campaign.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0091.002.md">T0091.002</a></td>
<td>Recruit Partisans</td>
<td>Operators recruit partisans (ideologically-aligned individuals) to support the campaign.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0091.003.md">T0091.003</a></td>
<td>Enlist Troll Accounts</td>
<td>An influence operation may hire trolls, or human operators of fake accounts that aim to provoke others by posting and amplifying content about controversial issues. Trolls can serve to discredit an influence operation’s opposition or bring attention to the operation’s cause through debate. Classic trolls refer to regular people who troll for personal reasons, such as attention-seeking or boredom. Classic trolls may advance operation narratives by coincidence but are not directly affiliated with any larger operation. Conversely, hybrid trolls act on behalf of another institution, such as a state or financial organisation, and post content with a specific ideological goal. Hybrid trolls may be highly advanced and institutionalised or less organised and work for a single individual.</td>
<td>TA15</td>
</tr>
|
||
<tr>
|
||
<td><a href="techniques/T0092.md">T0092</a></td>
|
||
<td>Build Network</td>
|
||
<td>Operators build their own network, creating links between accounts -- whether authentic or inauthentic -- in order amplify and promote narratives and artefacts, and encourage further growth of ther network, as well as the ongoing sharing and engagement with operational content.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0092.001.md">T0092.001</a></td>
<td>Create Organisations</td>
<td>Influence operations may establish organisations with legitimate or falsified hierarchies, staff, and content to structure operation assets, provide a sense of legitimacy to the operation, or provide institutional backing to operation activities.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0092.002.md">T0092.002</a></td>
<td>Use Follow Trains</td>
<td>A follow train is a group of people who follow each other on a social media platform, often as a way for an individual or campaign to grow its social media following. Follow trains may be a violation of platform Terms of Service. They are also known as follow-for-follow groups.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0092.003.md">T0092.003</a></td>
<td>Create Community or Sub-Group</td>
<td>When there is not an existing community or sub-group that meets a campaign's goals, an influence operation may seek to create a community or sub-group.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0093.md">T0093</a></td>
<td>Acquire/Recruit Network</td>
<td>Operators acquire an existing network by paying, recruiting, or exerting control over the leaders of the existing network.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0093.001.md">T0093.001</a></td>
<td>Fund Proxies</td>
<td>An influence operation may fund proxies, or external entities that work for the operation. An operation may recruit/train users with existing sympathies towards the operation’s narratives and/or goals as proxies. Funding proxies serves various purposes including: - Diversifying operation locations to complicate attribution - Reducing the workload for direct operation assets</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0093.002.md">T0093.002</a></td>
<td>Acquire Botnets</td>
<td>A botnet is a group of bots that can function in coordination with each other.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0094.md">T0094</a></td>
<td>Infiltrate Existing Networks</td>
<td>Operators deceptively insert social assets into existing networks as group members in order to influence the members of the network and the wider information environment that the network impacts.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0094.001.md">T0094.001</a></td>
<td>Identify Susceptible Targets in Networks</td>
<td>When seeking to infiltrate an existing network, an influence operation may identify individuals and groups that might be susceptible to being co-opted or influenced.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0094.002.md">T0094.002</a></td>
<td>Utilise Butterfly Attacks</td>
<td>Butterfly attacks occur when operators pretend to be members of a certain social group, usually a group that struggles for representation. An influence operation may mimic a group to insert controversial statements into the discourse, encourage the spread of operation content, or promote harassment among group members. Unlike astroturfing, butterfly attacks aim to infiltrate and discredit existing grassroots movements, organisations, and media campaigns.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0095.md">T0095</a></td>
<td>Develop Owned Media Assets</td>
<td>An owned media asset refers to an agency or organisation through which an influence operation may create, develop, and host content and narratives. Owned media assets include websites, blogs, social media pages, forums, and other platforms that facilitate the creation and organisation of content.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0096.md">T0096</a></td>
<td>Leverage Content Farms</td>
<td>Using the services of large-scale content providers for creating and amplifying campaign artefacts at scale.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0096.001.md">T0096.001</a></td>
<td>Create Content Farms</td>
<td>An influence operation may create an organisation for creating and amplifying campaign artefacts at scale.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0096.002.md">T0096.002</a></td>
<td>Outsource Content Creation to External Organisations</td>
<td>An influence operation may outsource content creation to external companies to avoid attribution, increase the rate of content creation, or improve content quality, e.g., by employing an organisation that can create content in the target audience’s native language. Employed organisations may include marketing companies for tailored advertisements or external content farms for high volumes of targeted media.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0097.md">T0097</a></td>
<td>Present Persona</td>
<td>This Technique contains different types of personas commonly taken on by threat actors during influence operations.<br><br>Analysts should use T0097’s sub-techniques to document the type of persona which an account is presenting. For example, an account which describes itself as being a journalist can be tagged with T0097.102: Journalist Persona.<br><br>Personas presented by individuals include:<br><br>T0097.100: Individual Persona<br>T0097.101: Local Persona<br>T0097.102: Journalist Persona<br>T0097.103: Activist Persona<br>T0097.104: Hacktivist Persona<br>T0097.105: Military Personnel Persona<br>T0097.106: Recruiter Persona<br>T0097.107: Researcher Persona<br>T0097.108: Expert Persona<br>T0097.109: Romantic Suitor Persona<br>T0097.110: Party Official Persona<br>T0097.111: Government Official Persona<br>T0097.112: Government Employee Persona<br><br>This Technique also houses institutional personas commonly taken on by threat actors:<br><br>T0097.200: Institutional Persona<br>T0097.201: Local Institution Persona<br>T0097.202: News Outlet Persona<br>T0097.203: Fact Checking Organisation Persona<br>T0097.204: Think Tank Persona<br>T0097.205: Business Persona<br>T0097.206: Government Institution Persona<br>T0097.207: NGO Persona<br>T0097.208: Social Cause Persona<br><br>By using a persona, a threat actor is adding the perceived legitimacy of the persona to their narratives and activities.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.100.md">T0097.100</a></td>
<td>Individual Persona</td>
<td>This sub-technique can be used to indicate that an entity is presenting itself as an individual. If the person is presenting themselves as having one of the personas listed below then these sub-techniques should be used instead, as they indicate both the type of persona they presented and that the entity presented itself as an individual:<br><br>T0097.101: Local Persona<br>T0097.102: Journalist Persona<br>T0097.103: Activist Persona<br>T0097.104: Hacktivist Persona<br>T0097.105: Military Personnel Persona<br>T0097.106: Recruiter Persona<br>T0097.107: Researcher Persona<br>T0097.108: Expert Persona<br>T0097.109: Romantic Suitor Persona<br>T0097.110: Party Official Persona<br>T0097.111: Government Official Persona<br>T0097.112: Government Employee Persona</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.101.md">T0097.101</a></td>
<td>Local Persona</td>
<td>A person with a local persona presents themselves as living in a particular geography or having local knowledge relevant to a narrative.<br><br>While presenting as a local is not an indication of inauthentic behaviour, an influence operation may have its narratives amplified by people presenting as local to a target area. Threat actors can fabricate locals (T0143.002: Fabricated Persona, T0097.101: Local Persona) to add credibility to their narratives, or to misrepresent the real opinions of locals in the area.<br><br>People who are legitimate locals (T0143.001: Authentic Persona, T0097.101: Local Persona) can use their persona for malicious purposes, or be exploited by threat actors. For example, someone could take money for using their position as a local to provide legitimacy to a false narrative or be tricked into doing so without their knowledge.<br><br><b>Associated Techniques and Sub-techniques</b><br><b>T0097.201: Local Institution Persona:</b> Analysts should use this sub-technique to catalogue cases where an institution is presenting as a local, such as a local news organisation or local business.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.102.md">T0097.102</a></td>
<td>Journalist Persona</td>
<td>A person with a journalist persona presents themselves as a reporter or journalist delivering news, conducting interviews, investigations etc.<br><br>While presenting as a journalist is not an indication of inauthentic behaviour, an influence operation may have its narratives amplified by people presenting as journalists. Threat actors can fabricate journalists to give the appearance of legitimacy, justifying the actor’s requests for interviews, etc (T0143.002: Fabricated Persona, T0097.102: Journalist Persona).<br><br>People who have legitimately developed a persona as a journalist (T0143.001: Authentic Persona, T0097.102: Journalist Persona) can use it for malicious purposes, or be exploited by threat actors. For example, someone could take money for using their position as a trusted journalist to provide legitimacy to a false narrative or be tricked into doing so without the journalist’s knowledge.<br><br><b>Associated Techniques and Sub-techniques</b><br><b>T0097.202: News Outlet Persona:</b> People with a journalist persona may present as being part of a news outlet.<br><b>T0097.101: Local Persona:</b> People with a journalist persona may present themselves as local reporters.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.103.md">T0097.103</a></td>
<td>Activist Persona</td>
<td>A person with an activist persona presents themselves as an activist; an individual who campaigns for a political cause, organises related events, etc.<br><br>While presenting as an activist is not an indication of inauthentic behaviour, an influence operation may have its narratives amplified by people presenting as activists. Threat actors can fabricate activists to give the appearance of popular support for an evolving grassroots movement (see T0143.002: Fabricated Persona, T0097.103: Activist Persona).<br><br>People who are legitimate activists can use this persona for malicious purposes, or be exploited by threat actors. For example, someone could take money for using their position as an activist to provide visibility to a false narrative or be tricked into doing so without their knowledge (T0143.001: Authentic Persona, T0097.103: Activist Persona).<br><br><b>Associated Techniques and Sub-techniques</b><br><b>T0097.104: Hacktivist Persona:</b> Analysts should use this sub-technique to catalogue cases where an individual is presenting themselves as someone engaged in activism who uses technical tools and methods, including building technical infrastructure and conducting offensive cyber operations, to achieve their goals.<br><b>T0097.207: NGO Persona:</b> People with an activist persona may present as being part of an NGO.<br><b>T0097.208: Social Cause Persona:</b> Analysts should use this sub-technique to catalogue cases where an online account is presenting as posting content related to a particular social cause, while not presenting as an individual.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.104.md">T0097.104</a></td>
<td>Hacktivist Persona</td>
<td>A person with a hacktivist persona presents themselves as an activist who conducts offensive cyber operations or builds technical infrastructure for political purposes, rather than the financial motivations commonly attributed to hackers; hacktivists are hacker activists who use their technical knowledge to take political action.<br><br>Hacktivists can build technical infrastructure to support other activists, including secure communication channels and surveillance and censorship circumvention. They can also conduct DDoS attacks and other offensive cyber operations, aiming to take down digital assets or gain access to proprietary information. An influence operation may use hacktivist personas to support their operational narratives and legitimise their operational activities.<br><br>Fabricated Hacktivists are sometimes referred to as “Faketivists”.<br><br><b>Associated Techniques and Sub-techniques</b><br><b>T0097.103: Activist Persona:</b> Analysts should use this sub-technique to catalogue cases where an individual is presenting themselves as someone engaged in activism but doesn’t present themselves as using technical tools and methods to achieve their goals.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.105.md">T0097.105</a></td>
<td>Military Personnel Persona</td>
<td>A person with a military personnel persona presents themselves as a serving member or veteran of a military organisation operating in an official capacity on behalf of a government.<br><br>While presenting as military personnel is not an indication of inauthentic behaviour, an influence operation may have its narratives amplified by people presenting as military personnel. Threat actors can fabricate military personnel (T0143.002: Fabricated Persona, T0097.105: Military Personnel Persona) to pose as experts on military topics, or to discredit geopolitical adversaries by pretending to be one of their military personnel and spreading discontent.<br><br>People who have legitimately developed a military persona (T0143.001: Authentic Persona, T0097.105: Military Personnel Persona) can use it for malicious purposes, or be exploited by threat actors. For example, someone could take money for using their position as a member of the military to provide legitimacy to a false narrative or be tricked into doing so without their knowledge.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.106.md">T0097.106</a></td>
<td>Recruiter Persona</td>
<td>A person with a recruiter persona presents themselves as a potential employer or provider of freelance work.<br><br>While presenting as a recruiter is not an indication of inauthentic behaviour, threat actors fabricate recruiters (T0143.002: Fabricated Persona, T0097.106: Recruiter Persona) to justify asking for personal information from their targets or to trick targets into working for the threat actors (without revealing who they are).<br><br><b>Associated Techniques and Sub-techniques</b><br><b>T0097.205: Business Persona:</b> People with a recruiter persona may present as being part of a business which they are recruiting for.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.107.md">T0097.107</a></td>
<td>Researcher Persona</td>
<td>A person with a researcher persona presents themselves as conducting research (e.g. for academic institutions, or think tanks), or having previously conducted research.<br><br>While presenting as a researcher is not an indication of inauthentic behaviour, an influence operation may have its narratives amplified by people presenting as researchers. Threat actors can fabricate researchers (T0143.002: Fabricated Persona, T0097.107: Researcher Persona) to add credibility to their narratives.<br><br>People who are legitimate researchers (T0143.001: Authentic Persona, T0097.107: Researcher Persona) can use their persona for malicious purposes, or be exploited by threat actors. For example, someone could take money for using their position as a Researcher to provide legitimacy to a false narrative or be tricked into doing so without their knowledge.<br><br><b>Associated Techniques and Sub-techniques</b><br><b>T0097.204: Think Tank Persona:</b> People with a researcher persona may present as being part of a think tank.<br><b>T0097.108: Expert Persona:</b> People who present as researching a given topic are likely to also present as having expertise in the area.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.108.md">T0097.108</a></td>
<td>Expert Persona</td>
<td>A person with an expert persona presents themselves as having expertise or experience in a field. Commonly the persona’s expertise will be called upon to add credibility to a given narrative.<br><br>While presenting as an expert is not an indication of inauthentic behaviour, an influence operation may have its narratives amplified by people presenting as experts. Threat actors can fabricate experts (T0143.002: Fabricated Persona, T0097.108: Expert Persona) to add credibility to their narratives.<br><br>People who are legitimate experts (T0143.001: Authentic Persona, T0097.108: Expert Persona) can make mistakes, use their persona for malicious purposes, or be exploited by threat actors. For example, someone could take money for using their position as an expert to provide legitimacy to a false narrative or be tricked into doing so without their knowledge.<br><br><b>Associated Techniques and Sub-techniques</b><br><b>T0097.107: Researcher Persona:</b> People who present as experts may also present as conducting or having conducted research into their specialist subject.<br><b>T0097.204: Think Tank Persona:</b> People with an expert persona may present as being part of a think tank.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.109.md">T0097.109</a></td>
<td>Romantic Suitor Persona</td>
<td>A person with a romantic suitor persona presents themselves as seeking a romantic or physical connection with another person.<br><br>While presenting as seeking a romantic or physical connection is not an indication of inauthentic behaviour, threat actors can use dating apps, social media channels or dating websites to fabricate romantic suitors to lure targets they can blackmail, extract information from, deceive or trick into giving them money (T0143.002: Fabricated Persona, T0097.109: Romantic Suitor Persona).<br><br>Honeypotting in espionage and Pig Butchering in scamming are commonly associated with romantic suitor personas.<br><br><b>Associated Techniques and Sub-techniques</b><br><b>T0151.017: Dating Platform:</b> Analysts can use this sub-technique for tagging cases where an account has been identified as using a dating platform.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.110.md">T0097.110</a></td>
<td>Party Official Persona</td>
<td>A person who presents as an official member of a political party, such as leaders of political parties, candidates standing to represent constituents, and campaign staff.<br><br>Presenting as an official of a political party is not an indication of inauthentic behaviour, however threat actors may fabricate individuals who work in political parties to add credibility to their narratives (T0143.002: Fabricated Persona, T0097.110: Party Official Persona). They may also impersonate existing officials of political parties (T0143.003: Impersonated Persona, T0097.110: Party Official Persona).<br><br>Legitimate members of political parties could use their persona for malicious purposes, or be exploited by threat actors (T0143.001: Authentic Persona, T0097.110: Party Official Persona). For example, an electoral candidate could take money for using their position to provide legitimacy to a false narrative, or be tricked into doing so without their knowledge.<br><br><b>Associated Techniques and Sub-techniques</b><br><b>T0097.111: Government Official Persona:</b> Analysts should use this sub-technique to catalogue cases where an individual is presenting as a member of a government. <br><br>Some party officials will also be government officials. For example, in the United Kingdom the head of government is commonly also the head of their political party.<br><br>Some party officials won’t be government officials. For example, members of a party standing in an election, or party officials who work outside of government (e.g. campaign staff).</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.111.md">T0097.111</a></td>
<td>Government Official Persona</td>
<td>A person who presents as an active or previous government official has the government official persona. These are officials serving in government, such as heads of government departments, leaders of countries, and members of government selected to represent constituents.<br><br> Presenting as a government official is not an indication of inauthentic behaviour, however threat actors may fabricate individuals who work in government to add credibility to their narratives (T0143.002: Fabricated Persona, T0097.111: Government Official Persona). They may also impersonate existing members of government (T0143.003: Impersonated Persona, T0097.111: Government Official Persona).<br><br> Legitimate government officials could use their persona for malicious purposes, or be exploited by threat actors (T0143.001: Authentic Persona, T0097.111: Government Official Persona). For example, a government official could take money for using their position to provide legitimacy to a false narrative, or be tricked into doing so without their knowledge.<br><br> <b>Associated Techniques and Sub-techniques</b><br> <b>T0097.110: Party Official Persona:</b> Analysts should use this sub-technique to catalogue cases where an individual is presenting as a member of a political party. <br><br> Not all government officials are political party officials (such as outside experts brought into government) and not all political party officials are government officials (such as people standing for office who are not yet working in government).<br><br> <b>T0097.206: Government Institution Persona:</b> People presenting as members of a government may also represent a government institution which they are associated with.<br><br> <b>T0097.112: Government Employee Persona:</b> Analysts should use this sub-technique to document people presenting as professionals hired to serve in government institutions and departments, not officials selected to represent constituents, or assigned official roles in government (such as heads of departments).</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.112.md">T0097.112</a></td>
<td>Government Employee Persona</td>
<td>A person who presents as an active or previous civil servant has the government employee persona. These are professionals hired to serve in government institutions and departments, not officials selected to represent constituents, or assigned official roles in government (such as heads of departments).<br><br> Presenting as a government employee is not an indication of inauthentic behaviour, however threat actors may fabricate individuals who work in government to add credibility to their narratives (T0143.002: Fabricated Persona, T0097.112: Government Employee Persona). They may also impersonate existing government employees (T0143.003: Impersonated Persona, T0097.112: Government Employee Persona).<br><br> Legitimate government employees could use their persona for malicious purposes, or be exploited by threat actors (T0143.001: Authentic Persona, T0097.112: Government Employee Persona). For example, a government employee could take money for using their position to provide legitimacy to a false narrative, or be tricked into doing so without their knowledge.<br><br> <b>Associated Techniques and Sub-techniques</b><br> <b>T0097.111: Government Official Persona:</b> Analysts should use this technique to document people who present as an active or previous government official, such as heads of government departments, leaders of countries, and members of government selected to represent constituents.<br> <b>T0097.206: Government Institution Persona:</b> People presenting as members of a government may also present a government institution which they are associated with.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.200.md">T0097.200</a></td>
<td>Institutional Persona</td>
<td>This Technique can be used to indicate that an entity is presenting itself as an institution. If the organisation is presenting itself as having one of the personas listed below then these Techniques should be used instead, as they indicate both that the entity presented itself as an institution, and the type of persona they presented:<br><br> T0097.201: Local Institution Persona<br> T0097.202: News Outlet Persona<br> T0097.203: Fact Checking Organisation Persona<br> T0097.204: Think Tank Persona<br> T0097.205: Business Persona<br> T0097.206: Government Institution Persona<br> T0097.207: NGO Persona<br> T0097.208: Social Cause Persona</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.201.md">T0097.201</a></td>
<td>Local Institution Persona</td>
<td>Institutions which present themselves as operating in a particular geography, or as having local knowledge relevant to a narrative, are presenting a local institution persona.<br><br> While presenting as a local institution is not an indication of inauthentic behaviour, threat actors may present themselves as such (T0143.002: Fabricated Persona, T0097.201: Local Institution Persona) to add credibility to their narratives, or misrepresent the real opinions of locals in the area.<br><br> Legitimate local institutions could use their persona for malicious purposes, or be exploited by threat actors (T0143.001: Authentic Persona, T0097.201: Local Institution Persona). For example, a local institution could take money for using their position to provide legitimacy to a false narrative, or be tricked into doing so without their knowledge.<br><br> <b>Associated Techniques and Sub-techniques</b><br> <b>T0097.101: Local Persona:</b> Institutions presenting as local may also present locals working within the organisation.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.202.md">T0097.202</a></td>
<td>News Outlet Persona</td>
<td>An institution with a news outlet persona presents itself as an organisation which delivers news to its target audience.<br><br> While presenting as a news outlet is not an indication of inauthentic behaviour, an influence operation may have its narratives amplified by news organisations. Threat actors can fabricate news organisations (T0143.002: Fabricated Persona, T0097.202: News Outlet Persona), or they can impersonate existing news outlets (T0143.003: Impersonated Persona, T0097.202: News Outlet Persona).<br><br> Legitimate news organisations could use their persona for malicious purposes, or be exploited by threat actors (T0143.001: Authentic Persona, T0097.202: News Outlet Persona).<br><br> <b>Associated Techniques and Sub-techniques</b><br> <b>T0097.102: Journalist Persona:</b> Institutions presenting as news outlets may also present journalists working within the organisation.<br> <b>T0097.201: Local Institution Persona:</b> Institutions presenting as news outlets may present as being a local news outlet.<br> <b>T0097.203: Fact Checking Organisation Persona:</b> Institutions presenting as news outlets may also deliver a fact checking service (e.g. The UK’s BBC News has the fact checking service BBC Verify). When an actor presents as the fact checking arm of a news outlet, they are presenting both a News Outlet Persona and a Fact Checking Organisation Persona.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.203.md">T0097.203</a></td>
<td>Fact Checking Organisation Persona</td>
<td>An institution with a fact checking organisation persona presents itself as an organisation which produces reports which assess the validity of others’ reporting / statements.<br><br> While presenting as a fact checking organisation is not an indication of inauthentic behaviour, an influence operation may have its narratives amplified by fact checking organisations. Threat actors can fabricate fact checking organisations (T0143.002: Fabricated Persona, T0097.203: Fact Checking Organisation Persona), or they can impersonate existing fact checking outlets (T0143.003: Impersonated Persona, T0097.203: Fact Checking Organisation Persona).<br><br> Legitimate fact checking organisations could use their persona for malicious purposes, or be exploited by threat actors (T0143.001: Authentic Persona, T0097.203: Fact Checking Organisation Persona).<br><br> <b>Associated Techniques and Sub-techniques</b><br> <b>T0097.102: Journalist Persona:</b> Institutions presenting as fact checking organisations may also present journalists working within the organisation.<br> <b>T0097.202: News Outlet Persona:</b> Fact checking organisations may present as operating as part of a larger news outlet (e.g. The UK’s BBC News has the fact checking service BBC Verify). When an actor presents as the fact checking arm of a news outlet, they are presenting both a News Outlet Persona and a Fact Checking Organisation Persona.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.204.md">T0097.204</a></td>
<td>Think Tank Persona</td>
<td>An institution with a think tank persona presents itself as a think tank; an organisation that aims to conduct original research and propose new policies or solutions, especially for social and scientific problems.<br><br> While presenting as a think tank is not an indication of inauthentic behaviour, think tank personas are commonly used by threat actors as a front for their operational activity (T0143.002: Fabricated Persona, T0097.204: Think Tank Persona). They may be created to give legitimacy to narratives and allow them to suggest politically beneficial solutions to societal issues.<br><br> Legitimate think tanks could have a political bias that they may not be transparent about, they could use their persona for malicious purposes, or they could be exploited by threat actors (T0143.001: Authentic Persona, T0097.204: Think Tank Persona). For example, a think tank could take money for using their position to provide legitimacy to a false narrative, or be tricked into doing so without their knowledge.<br><br> <b>Associated Techniques and Sub-techniques</b><br> <b>T0097.107: Researcher Persona:</b> Institutions presenting as think tanks may also present researchers working within the organisation.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.205.md">T0097.205</a></td>
<td>Business Persona</td>
<td>An institution with a business persona presents itself as a for-profit organisation which provides goods or services for a price.<br><br> While presenting as a business is not an indication of inauthentic behaviour, business personas may be used by threat actors as a front for their operational activity (T0143.002: Fabricated Persona, T0097.205: Business Persona).<br><br> Threat actors may also impersonate existing businesses (T0143.003: Impersonated Persona, T0097.205: Business Persona) to exploit their brand or cause reputational damage.<br><br> Legitimate businesses could use their persona for malicious purposes, or be exploited by threat actors (T0143.001: Authentic Persona, T0097.205: Business Persona). For example, a business could take money for using their position to provide legitimacy to a false narrative, or be tricked into doing so without their knowledge.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.206.md">T0097.206</a></td>
<td>Government Institution Persona</td>
<td>Institutions which present themselves as governments, or government ministries, are presenting a government institution persona.<br><br> While presenting as a government institution is not an indication of inauthentic behaviour, threat actors may impersonate existing government institutions as part of their operation (T0143.003: Impersonated Persona, T0097.206: Government Institution Persona), to add legitimacy to their narratives, or discredit the government.<br><br> Legitimate government institutions could use their persona for malicious purposes, or be exploited by threat actors (T0143.001: Authentic Persona, T0097.206: Government Institution Persona). For example, a government institution could be used by elected officials to spread inauthentic narratives.<br><br> <b>Associated Techniques and Sub-techniques</b><br> <b>T0097.111: Government Official Persona:</b> Institutions presenting as governments may also present officials working within the organisation.<br> <b>T0097.112: Government Employee Persona:</b> Institutions presenting as governments may also present employees working within the organisation.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.207.md">T0097.207</a></td>
<td>NGO Persona</td>
<td>Institutions which present themselves as an NGO (Non-Governmental Organisation), an organisation which provides services or advocates for public policy (while not being directly affiliated with any government), are presenting an NGO persona.<br><br> While presenting as an NGO is not an indication of inauthentic behaviour, NGO personas are commonly used by threat actors (such as intelligence services) as a front for their operational activity (T0143.002: Fabricated Persona, T0097.207: NGO Persona). They are created to give legitimacy to the influence operation and potentially infiltrate grassroots movements.<br><br> Legitimate NGOs could use their persona for malicious purposes, or be exploited by threat actors (T0143.001: Authentic Persona, T0097.207: NGO Persona). For example, an NGO could take money for using their position to provide legitimacy to a false narrative, or be tricked into doing so without their knowledge.<br><br> <b>Associated Techniques and Sub-techniques:</b><br> <b>T0097.103: Activist Persona:</b> Institutions presenting as activist groups may also present activists working within the organisation.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0097.208.md">T0097.208</a></td>
<td>Social Cause Persona</td>
<td>Online accounts which present themselves as focusing on a social cause are presenting the Social Cause Persona. Examples include accounts which post about current affairs, such as discrimination faced by minorities.<br><br> While presenting as an account invested in a social cause is not an indication of inauthentic behaviour, such personas have been used by threat actors to exploit peoples’ legitimate emotional investment regarding social causes that matter to them (T0143.002: Fabricated Persona, T0097.208: Social Cause Persona).<br><br> Legitimate accounts focused on a social cause could use their persona for malicious purposes, or be exploited by threat actors (T0143.001: Authentic Persona, T0097.208: Social Cause Persona). For example, the account holders could take money for using their position to provide legitimacy to a false narrative, or be tricked into doing so without their knowledge.<br><br> <b>Associated Techniques and Sub-techniques:</b><br> <b>T0097.103: Activist Persona:</b> Analysts should use this sub-technique to catalogue cases where an individual is presenting themselves as an activist related to a social cause. Accounts with social cause personas do not present themselves as individuals, but may have activists controlling the accounts.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0098.md">T0098</a></td>
<td>Establish Inauthentic News Sites</td>
<td>Modern computational propaganda makes use of a cadre of imposter news sites spreading globally. These sites, sometimes motivated by concerns other than propaganda--for instance, click-based revenue--often have some superficial markers of authenticity, such as naming and site design. But many can be quickly exposed with reference to their ownership, reporting history, and advertising details.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0098.001.md">T0098.001</a></td>
<td>Create Inauthentic News Sites</td>
<td>Create Inauthentic News Sites</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0098.002.md">T0098.002</a></td>
<td>Leverage Existing Inauthentic News Sites</td>
<td>Leverage Existing Inauthentic News Sites</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0100.md">T0100</a></td>
<td>Co-Opt Trusted Sources</td>
<td>An influence operation may co-opt trusted sources by infiltrating or repurposing a source to reach a target audience through existing, previously reliable networks. Co-opted trusted sources may include: - National or local news outlets - Research or academic publications - Online blogs or websites</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0100.001.md">T0100.001</a></td>
<td>Co-Opt Trusted Individuals</td>
<td>Co-Opt Trusted Individuals</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0100.002.md">T0100.002</a></td>
<td>Co-Opt Grassroots Groups</td>
<td>Co-Opt Grassroots Groups</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0100.003.md">T0100.003</a></td>
<td>Co-Opt Influencers</td>
<td>Co-opt Influencers</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0101.md">T0101</a></td>
<td>Create Localised Content</td>
<td>Localised content refers to content that appeals to a specific community of individuals, often in defined geographic areas. An operation may create localised content using local language and dialects to resonate with its target audience and blend in with other local news and social media. Localised content may help an operation increase legitimacy, avoid detection, and complicate external attribution.</td>
<td>TA05</td>
</tr>
<tr>
<td><a href="techniques/T0102.md">T0102</a></td>
<td>Leverage Echo Chambers/Filter Bubbles</td>
<td>An echo chamber refers to an internet subgroup, often along ideological lines, where individuals only engage with “others with which they are already in agreement.” A filter bubble refers to an algorithm's placement of an individual in content that they agree with or regularly engage with, possibly entrapping the user into a bubble of their own making. An operation may create these isolated areas of the internet by matching existing groups, or aggregating individuals into a single target audience based on shared interests, politics, values, demographics, and other characteristics. Echo chambers and filter bubbles help to reinforce similar biases and content to the same target audience members.</td>
<td>TA05</td>
</tr>
<tr>
<td><a href="techniques/T0102.001.md">T0102.001</a></td>
<td>Use Existing Echo Chambers/Filter Bubbles</td>
<td>Use existing Echo Chambers/Filter Bubbles</td>
<td>TA05</td>
</tr>
<tr>
<td><a href="techniques/T0102.002.md">T0102.002</a></td>
<td>Create Echo Chambers/Filter Bubbles</td>
<td>Create Echo Chambers/Filter Bubbles</td>
<td>TA05</td>
</tr>
<tr>
<td><a href="techniques/T0102.003.md">T0102.003</a></td>
<td>Exploit Data Voids</td>
<td>A data void refers to a word or phrase that results in little, manipulative, or low-quality search engine data. Data voids are hard to detect and relatively harmless until exploited by an entity aiming to quickly proliferate false or misleading information during a phenomenon that causes a high number of individuals to query the term or phrase. In the Plan phase, an influence operation may identify data voids for later exploitation in the operation. A 2019 report by Michael Golebiewski identifies five types of data voids. (1) “Breaking news” data voids occur when a keyword gains popularity during a short period of time, allowing an influence operation to publish false content before legitimate news outlets have an opportunity to publish relevant information. (2) An influence operation may create a “strategic new terms” data void by creating their own terms and publishing information online before promoting their keyword to the target audience. (3) An influence operation may publish content on “outdated terms” that have decreased in popularity, capitalising on most search engines’ preferences for recency. (4) “Fragmented concepts” data voids separate connections between similar ideas, isolating segment queries to distinct search engine results. (5) An influence operation may use “problematic queries” that previously resulted in disturbing or inappropriate content to promote messaging until mainstream media recontextualizes the term.</td>
<td>TA05</td>
</tr>
<tr>
<td><a href="techniques/T0107.md">T0107</a></td>
<td>Bookmarking and Content Curation</td>
<td>Platforms for searching, sharing, and curating content and media. Examples include Pinterest, Flipboard, etc.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0109.md">T0109</a></td>
<td>Consumer Review Networks</td>
<td>Platforms for finding, reviewing, and sharing information about brands, products, services, restaurants, travel destinations, etc. Examples include Yelp, TripAdvisor, etc.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0110.md">T0110</a></td>
<td>Formal Diplomatic Channels</td>
<td>Leveraging formal, traditional, diplomatic channels to communicate with foreign governments (written documents, meetings, summits, diplomatic visits, etc). This type of diplomacy is conducted by diplomats of one nation with diplomats and other officials of another nation or international organisation.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0111.md">T0111</a></td>
<td>Traditional Media</td>
<td>Examples include TV, Newspaper, Radio, etc.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0111.001.md">T0111.001</a></td>
<td>TV</td>
<td>TV</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0111.002.md">T0111.002</a></td>
<td>Newspaper</td>
<td>Newspaper</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0111.003.md">T0111.003</a></td>
<td>Radio</td>
<td>Radio</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0113.md">T0113</a></td>
<td>Employ Commercial Analytic Firms</td>
<td>Commercial analytic firms collect data on target audience activities and evaluate the data to detect trends, such as content receiving high click-rates. An influence operation may employ commercial analytic firms to facilitate external collection on its target audience, complicating attribution efforts and better tailoring the content to audience preferences.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0114.md">T0114</a></td>
<td>Deliver Ads</td>
<td>Delivering content via any form of paid media or advertising.</td>
<td>TA09</td>
</tr>
<tr>
<td><a href="techniques/T0114.001.md">T0114.001</a></td>
<td>Social Media</td>
<td>Social Media</td>
<td>TA09</td>
</tr>
<tr>
<td><a href="techniques/T0114.002.md">T0114.002</a></td>
<td>Traditional Media</td>
<td>Examples include TV, Radio, Newspaper, and billboards.</td>
<td>TA09</td>
</tr>
<tr>
<td><a href="techniques/T0115.md">T0115</a></td>
<td>Post Content</td>
<td>Delivering content by posting via owned media (assets that the operator controls).</td>
<td>TA09</td>
</tr>
<tr>
<td><a href="techniques/T0115.001.md">T0115.001</a></td>
<td>Share Memes</td>
<td>Memes are one of the most important single artefact types in all of computational propaganda. In this framework, “memes” denotes the narrow image-based definition. But that naming is no accident, as these items have most of the important properties of Dawkins' original conception as a self-replicating unit of culture. Memes pull together reference and commentary; image and narrative; emotion and message. Memes are a powerful tool and the heart of modern influence campaigns.</td>
<td>TA09</td>
</tr>
<tr>
<td><a href="techniques/T0115.002.md">T0115.002</a></td>
<td>Post Violative Content to Provoke Takedown and Backlash</td>
<td>Post Violative Content to Provoke Takedown and Backlash.</td>
<td>TA09</td>
</tr>
<tr>
<td><a href="techniques/T0115.003.md">T0115.003</a></td>
<td>One-Way Direct Posting</td>
<td>Direct posting refers to a method of posting content via a one-way messaging service, where the recipient cannot directly respond to the poster’s messaging. An influence operation may post directly to promote operation narratives to the target audience without allowing opportunities for fact-checking or disagreement, creating a false sense of support for the narrative.</td>
<td>TA09</td>
</tr>
<tr>
<td><a href="techniques/T0116.md">T0116</a></td>
<td>Comment or Reply on Content</td>
<td>Delivering content by replying or commenting via owned media (assets that the operator controls).</td>
<td>TA09</td>
</tr>
<tr>
<td><a href="techniques/T0116.001.md">T0116.001</a></td>
<td>Post Inauthentic Social Media Comment</td>
<td>Use government-paid social media commenters, astroturfers, or chat bots (programmed to reply to specific keywords/hashtags) to influence online conversations, product reviews, and website comment forums.</td>
<td>TA09</td>
</tr>
<tr>
<td><a href="techniques/T0117.md">T0117</a></td>
<td>Attract Traditional Media</td>
<td>Deliver content by attracting the attention of traditional media (earned media).</td>
<td>TA09</td>
</tr>
<tr>
<td><a href="techniques/T0118.md">T0118</a></td>
<td>Amplify Existing Narrative</td>
<td>An influence operation may amplify existing narratives that align with its own to support operation objectives.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0119.md">T0119</a></td>
<td>Cross-Posting</td>
<td>Cross-posting refers to posting the same message to multiple internet discussions, social media platforms or accounts, or news groups at one time. An influence operation may post content online in multiple communities and platforms to increase the chances of content exposure to the target audience.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0119.001.md">T0119.001</a></td>
<td>Post across Groups</td>
<td>An influence operation may post content across groups to spread narratives and content to new communities within the target audiences or to new target audiences.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0119.002.md">T0119.002</a></td>
<td>Post across Platform</td>
<td>An influence operation may post content across platforms to spread narratives and content to new communities within the target audiences or to new target audiences. Posting across platforms can also remove opposition and context, helping the narrative spread with less opposition on the cross-posted platform.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0119.003.md">T0119.003</a></td>
<td>Post across Disciplines</td>
<td>Post Across Disciplines</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0120.md">T0120</a></td>
<td>Incentivize Sharing</td>
<td>Incentivizing content sharing refers to actions that encourage users to share content themselves, reducing the need for the operation itself to post and promote its own content.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0120.001.md">T0120.001</a></td>
<td>Use Affiliate Marketing Programmes</td>
<td>Use Affiliate Marketing Programmes</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0120.002.md">T0120.002</a></td>
<td>Use Contests and Prizes</td>
<td>Use Contests and Prizes</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0121.md">T0121</a></td>
<td>Manipulate Platform Algorithm</td>
<td>Manipulating a platform algorithm refers to conducting activity on a platform in a way that intentionally targets its underlying algorithm. After analysing a platform’s algorithm (see: Select Platforms), an influence operation may use a platform in a way that increases its content exposure, avoids content removal, or otherwise benefits the operation’s strategy. For example, an influence operation may use bots to amplify its posts so that the platform’s algorithm recognises engagement with operation content and further promotes the content on user timelines.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0121.001.md">T0121.001</a></td>
<td>Bypass Content Blocking</td>
<td>Bypassing content blocking refers to actions taken to circumvent network security measures that prevent users from accessing certain servers, resources, or other online spheres. An influence operation may bypass content blocking to proliferate its content on restricted areas of the internet. Common strategies for bypassing content blocking include: - Altering IP addresses to avoid IP filtering - Using a Virtual Private Network (VPN) to avoid IP filtering - Using a Content Delivery Network (CDN) to avoid IP filtering - Enabling encryption to bypass packet inspection blocking - Manipulating text to avoid filtering by keywords - Posting content on multiple platforms to avoid platform-specific removals - Using local facilities or modified DNS servers to avoid DNS filtering</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0122.md">T0122</a></td>
<td>Direct Users to Alternative Platforms</td>
<td>Directing users to alternative platforms refers to encouraging users to move from the platform on which they initially viewed operation content and engage with content on alternate information channels, including separate social media channels and inauthentic websites. An operation may drive users to alternative platforms to diversify its information channels and ensure the target audience knows where to access operation content if the initial platform suspends, flags, or otherwise removes original operation assets and content.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0123.md">T0123</a></td>
<td>Control Information Environment through Offensive Cyberspace Operations</td>
<td>Controlling the information environment through offensive cyberspace operations uses cyber tools and techniques to alter the trajectory of content in the information space to either prioritise operation messaging or block opposition messaging.</td>
<td>TA18</td>
</tr>
<tr>
<td><a href="techniques/T0123.001.md">T0123.001</a></td>
<td>Delete Opposing Content</td>
<td>Deleting opposing content refers to the removal of content that conflicts with operational narratives from selected platforms. An influence operation may delete opposing content to censor contradictory information from the target audience, allowing operation narratives to take priority in the information space.</td>
<td>TA18</td>
</tr>
<tr>
<td><a href="techniques/T0123.002.md">T0123.002</a></td>
<td>Block Content</td>
<td>Content blocking refers to actions taken to restrict internet access or render certain areas of the internet inaccessible. An influence operation may restrict content based on both network and content attributes.</td>
<td>TA18</td>
</tr>
<tr>
<td><a href="techniques/T0123.003.md">T0123.003</a></td>
<td>Destroy Information Generation Capabilities</td>
<td>Destroying information generation capabilities refers to actions taken to limit, degrade, or otherwise incapacitate an actor’s ability to generate conflicting information. An influence operation may destroy an actor’s information generation capabilities by physically dismantling the information infrastructure, disconnecting resources needed for information generation, or redirecting information generation personnel. An operation may destroy an adversary’s information generation capabilities to limit conflicting content exposure to the target audience and crowd the information space with its own narratives.</td>
<td>TA18</td>
</tr>
<tr>
<td><a href="techniques/T0123.004.md">T0123.004</a></td>
<td>Conduct Server Redirect</td>
<td>A server redirect, also known as a URL redirect, occurs when a server automatically forwards a user from one URL to another using server-side or client-side scripting languages. An influence operation may conduct a server redirect to divert target audience members from one website to another without their knowledge. The redirected website may pose as a legitimate source, host malware, or otherwise aid operation objectives.</td>
<td>TA18</td>
</tr>
<tr>
<td><a href="techniques/T0124.md">T0124</a></td>
<td>Suppress Opposition</td>
<td>Operators can suppress the opposition by exploiting platform content moderation tools and processes, for example by reporting non-violative content to platforms for takedown, or by goading opposition actors into taking actions that result in platform action or target audience disapproval.</td>
<td>TA18</td>
</tr>
<tr>
<td><a href="techniques/T0124.001.md">T0124.001</a></td>
<td>Report Non-Violative Opposing Content</td>
<td>Reporting opposing content refers to notifying and providing an instance of a violation of a platform’s guidelines and policies for conduct on the platform. In addition to simply reporting the content, an operation may leverage copyright regulations to trick social media and web platforms into removing opposing content by manipulating the content to appear in violation of copyright laws. Reporting opposing content facilitates the suppression of contradictory information and allows operation narratives to take priority in the information space.</td>
<td>TA18</td>
</tr>
<tr>
<td><a href="techniques/T0124.002.md">T0124.002</a></td>
<td>Goad People into Harmful Action (Stop Hitting Yourself)</td>
<td>Goad people into taking actions that violate terms of service or that will lead to their content or accounts being taken down.</td>
<td>TA18</td>
</tr>
<tr>
<td><a href="techniques/T0124.003.md">T0124.003</a></td>
<td>Exploit Platform TOS/Content Moderation</td>
<td>Exploit Platform TOS/Content Moderation</td>
<td>TA18</td>
</tr>
<tr>
<td><a href="techniques/T0125.md">T0125</a></td>
<td>Platform Filtering</td>
<td>Platform filtering refers to the decontextualization of information as claims cross platforms (from Joan Donovan https://www.hks.harvard.edu/publications/disinformation-design-use-evidence-collages-and-platform-filtering-media-manipulation)</td>
<td>TA18</td>
</tr>
<tr>
<td><a href="techniques/T0126.md">T0126</a></td>
<td>Encourage Attendance at Events</td>
<td>The operation encourages attendance at an existing real-world event.</td>
<td>TA10</td>
</tr>
<tr>
<td><a href="techniques/T0126.001.md">T0126.001</a></td>
<td>Call to Action to Attend</td>
<td>Call to action to attend an event</td>
<td>TA10</td>
</tr>
<tr>
<td><a href="techniques/T0126.002.md">T0126.002</a></td>
<td>Facilitate Logistics or Support for Attendance</td>
<td>Facilitate logistics or support for travel, food, housing, etc.</td>
<td>TA10</td>
</tr>
<tr>
<td><a href="techniques/T0127.md">T0127</a></td>
<td>Physical Violence</td>
<td>Physical violence refers to the use of force to injure, abuse, damage, or destroy. An influence operation may conduct or encourage physical violence to discourage opponents from promoting conflicting content or draw attention to operation narratives using shock value.</td>
<td>TA10</td>
</tr>
<tr>
<td><a href="techniques/T0127.001.md">T0127.001</a></td>
<td>Conduct Physical Violence</td>
<td>An influence operation may directly Conduct Physical Violence to achieve campaign goals.</td>
|
||
<td>TA10</td>
|
||
</tr>
|
||
<tr>
|
||
<td><a href="techniques/T0127.002.md">T0127.002</a></td>
|
||
<td>Encourage Physical Violence</td>
|
||
<td>An influence operation may Encourage others to engage in Physical Violence to achieve campaign goals.</td>
|
||
<td>TA10</td>
|
||
</tr>
|
||
<tr>
<td><a href="techniques/T0128.md">T0128</a></td>
<td>Conceal Information Assets</td>
<td>Conceal the identity or provenance of campaign information assets such as accounts, channels, pages, etc. to avoid takedown and attribution.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0128.001.md">T0128.001</a></td>
<td>Use Pseudonyms</td>
<td>An operation may use pseudonyms, or fake names, to mask the identity of operational accounts, channels, pages, etc., publish anonymous content, or otherwise use falsified personas to conceal the identity of the operation. An operation may coordinate pseudonyms across multiple platforms, for example, by writing an article under a pseudonym and then posting a link to the article on social media on an account, channel, or page with the same falsified name.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0128.002.md">T0128.002</a></td>
<td>Conceal Network Identity</td>
<td>Concealing network identity aims to hide the existence of an influence operation’s network completely. Unlike concealing sponsorship, concealing network identity denies the existence of any sort of organisation.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0128.003.md">T0128.003</a></td>
<td>Distance Reputable Individuals from Operation</td>
<td>Distancing reputable individuals from the operation occurs when enlisted individuals, such as celebrities or subject matter experts, actively disengage themselves from operation activities and messaging. Individuals may distance themselves from the operation by deleting old posts or statements, unfollowing operation information assets, or otherwise detaching themselves from the operation’s timeline. An influence operation may want reputable individuals to distance themselves from the operation to reduce operation exposure, particularly if the operation aims to remove all evidence.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0128.004.md">T0128.004</a></td>
<td>Launder Information Assets</td>
<td>Laundering occurs when an influence operation acquires control of previously legitimate information assets such as accounts, channels, pages, etc. from third parties through sale or exchange, often in contravention of terms of use. Influence operations use laundered assets to reach target audience members from within an existing information community and to complicate attribution.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0128.005.md">T0128.005</a></td>
<td>Change Names of Information Assets</td>
<td>Changing names or brand names of information assets such as accounts, channels, pages, etc. An operation may change the names or brand names of its assets throughout an operation to avoid detection or alter the names of newly acquired or repurposed assets to fit operational narratives.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0129.md">T0129</a></td>
<td>Conceal Operational Activity</td>
<td>Conceal the campaign's operational activity to avoid takedown and attribution.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0129.001.md">T0129.001</a></td>
<td>Conceal Network Identity</td>
<td>Concealing network identity aims to hide the existence of an influence operation’s network completely. Unlike concealing sponsorship, concealing network identity denies the existence of any sort of organisation.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0129.002.md">T0129.002</a></td>
<td>Generate Content Unrelated to Narrative</td>
<td>An influence operation may mix its own operation content with legitimate news or external unrelated content to disguise operational objectives, narratives, or existence. For example, an operation may generate "lifestyle" or "cuisine" content alongside regular operation content.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0129.003.md">T0129.003</a></td>
<td>Break Association with Content</td>
<td>Breaking association with content occurs when an influence operation actively separates itself from its own content. An influence operation may break association with content by unfollowing, unliking, or unsharing its content, removing attribution from its content, or otherwise taking actions that distance the operation from its messaging. An influence operation may break association with its content to complicate attribution or regain credibility for a new operation.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0129.004.md">T0129.004</a></td>
<td>Delete URLs</td>
<td>URL deletion occurs when an influence operation completely removes its website registration, rendering the URL inaccessible. An influence operation may delete its URLs to complicate attribution or remove online documentation that the operation ever occurred.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0129.005.md">T0129.005</a></td>
<td>Coordinate on Encrypted/Closed Networks</td>
<td>Coordinate on encrypted/closed networks.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0129.006.md">T0129.006</a></td>
<td>Deny Involvement</td>
<td>Without "smoking gun" proof (and even with proof), the incident creator can or will deny involvement. This technique also leverages the attacker advantages outlined in "Demand insurmountable proof", specifically the asymmetric disadvantage for truth-tellers in a "firehose of misinformation" environment.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0129.007.md">T0129.007</a></td>
<td>Delete Accounts/Account Activity</td>
<td>Deleting accounts and account activity occurs when an influence operation removes its online social media assets, including social media accounts, posts, likes, comments, and other online artefacts. An influence operation may delete its accounts and account activity to complicate attribution or remove online documentation that the operation ever occurred.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0129.009.md">T0129.009</a></td>
<td>Remove Post Origins</td>
<td>Removing post origins refers to the elimination of evidence that indicates the initial source of operation content, often to complicate attribution. An influence operation may remove post origins by deleting watermarks, renaming files, or removing embedded links in its content.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0129.010.md">T0129.010</a></td>
<td>Misattribute Activity</td>
<td>Misattributed activity refers to incorrectly attributed operation activity. For example, a state-sponsored influence operation may conduct operation activity in a way that mimics another state so that external entities misattribute activity to the incorrect state. An operation may misattribute their activities to complicate attribution, avoid detection, or frame an adversary for negative behaviour.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0130.md">T0130</a></td>
<td>Conceal Infrastructure</td>
<td>Conceal the campaign's infrastructure to avoid takedown and attribution.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0130.001.md">T0130.001</a></td>
<td>Conceal Sponsorship</td>
<td>Concealing sponsorship aims to mislead or obscure the identity of the hidden sponsor behind an operation, rather than the entity publicly running the operation. Operations that conceal sponsorship may maintain visible falsified groups, news outlets, non-profits, or other organisations, but seek to mislead or obscure the identity of those sponsoring, funding, or otherwise supporting these entities. Influence operations may use a variety of techniques to mask the location of their social media accounts to complicate attribution and conceal evidence of foreign interference. Operation accounts may set their location to a false place, often the location of the operation’s target audience, and post in the region’s language.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0130.002.md">T0130.002</a></td>
<td>Utilise Bulletproof Hosting</td>
<td>Hosting refers to services through which storage and computing resources are provided to an individual or organisation for the accommodation and maintenance of one or more websites and related services. Services may include web hosting, file sharing, and email distribution. Bulletproof hosting refers to services provided by an entity, such as a domain hosting or web hosting firm, that allows its customer considerable leniency in use of the service. An influence operation may utilise bulletproof hosting to maintain continuity of service for suspicious, illegal, or disruptive operation activities that stricter hosting services would limit, report, or suspend.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0130.003.md">T0130.003</a></td>
<td>Use Shell Organisations</td>
<td>Use Shell Organisations to conceal sponsorship.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0130.004.md">T0130.004</a></td>
<td>Use Cryptocurrency</td>
<td>Use Cryptocurrency to conceal sponsorship. Examples include Bitcoin, Monero, and Ethereum.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0130.005.md">T0130.005</a></td>
<td>Obfuscate Payment</td>
<td>Obfuscate Payment</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0131.md">T0131</a></td>
<td>Exploit TOS/Content Moderation</td>
<td>Exploiting weaknesses in platforms' terms of service and content moderation policies to avoid takedowns and platform actions.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0131.001.md">T0131.001</a></td>
<td>Legacy Web Content</td>
<td>Make incident content visible for a long time, e.g. by exploiting platform terms of service, or placing it where it's hard to remove or unlikely to be removed.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0131.002.md">T0131.002</a></td>
<td>Post Borderline Content</td>
<td>Post Borderline Content</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0132.md">T0132</a></td>
<td>Measure Performance</td>
<td>A metric used to determine the accomplishment of actions. “Are the actions being executed as planned?”</td>
<td>TA12</td>
</tr>
<tr>
<td><a href="techniques/T0132.001.md">T0132.001</a></td>
<td>People Focused</td>
<td>Measure the performance of individuals in achieving campaign goals.</td>
<td>TA12</td>
</tr>
<tr>
<td><a href="techniques/T0132.002.md">T0132.002</a></td>
<td>Content Focused</td>
<td>Measure the performance of campaign content.</td>
<td>TA12</td>
</tr>
<tr>
<td><a href="techniques/T0132.003.md">T0132.003</a></td>
<td>View Focused</td>
<td>View Focused</td>
<td>TA12</td>
</tr>
<tr>
<td><a href="techniques/T0133.md">T0133</a></td>
<td>Measure Effectiveness</td>
<td>A metric used to measure a current system state. “Are we on track to achieve the intended new system state within the planned timescale?”</td>
<td>TA12</td>
</tr>
<tr>
<td><a href="techniques/T0133.001.md">T0133.001</a></td>
<td>Behaviour Changes</td>
<td>Monitor and evaluate behaviour changes from misinformation incidents.</td>
<td>TA12</td>
</tr>
<tr>
<td><a href="techniques/T0133.002.md">T0133.002</a></td>
<td>Content</td>
<td>Measure current system state with respect to the effectiveness of campaign content.</td>
<td>TA12</td>
</tr>
<tr>
<td><a href="techniques/T0133.003.md">T0133.003</a></td>
<td>Awareness</td>
<td>Measure current system state with respect to the effectiveness of influencing awareness.</td>
<td>TA12</td>
</tr>
<tr>
<td><a href="techniques/T0133.004.md">T0133.004</a></td>
<td>Knowledge</td>
<td>Measure current system state with respect to the effectiveness of influencing knowledge.</td>
<td>TA12</td>
</tr>
<tr>
<td><a href="techniques/T0133.005.md">T0133.005</a></td>
<td>Action/Attitude</td>
<td>Measure current system state with respect to the effectiveness of influencing action/attitude.</td>
<td>TA12</td>
</tr>
<tr>
<td><a href="techniques/T0134.md">T0134</a></td>
<td>Measure Effectiveness Indicators (or KPIs)</td>
<td>Ensuring that Key Performance Indicators are identified and tracked, so that the performance and effectiveness of campaigns, and elements of campaigns, can be measured, during and after their execution.</td>
<td>TA12</td>
</tr>
<tr>
<td><a href="techniques/T0134.001.md">T0134.001</a></td>
<td>Message Reach</td>
<td>Monitor and evaluate message reach in misinformation incidents.</td>
<td>TA12</td>
</tr>
<tr>
<td><a href="techniques/T0134.002.md">T0134.002</a></td>
<td>Social Media Engagement</td>
<td>Monitor and evaluate social media engagement in misinformation incidents.</td>
<td>TA12</td>
</tr>
<tr>
<td><a href="techniques/T0135.md">T0135</a></td>
<td>Undermine</td>
<td>Weaken, debilitate, or subvert a target or their actions. An influence operation may be designed to disparage an opponent; sabotage an opponent’s systems or processes; compromise an opponent’s relationships or support system; impair an opponent’s capability; or thwart an opponent’s initiative.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0135.001.md">T0135.001</a></td>
<td>Smear</td>
<td>Denigrate, disparage, or discredit an opponent. This is a common tactical objective in political campaigns with a larger strategic goal. It differs from efforts to harm a target through defamation. If there is no ulterior motive and the sole aim is to cause harm to the target, then choose sub-technique “Defame” of technique “Cause Harm” instead.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0135.002.md">T0135.002</a></td>
<td>Thwart</td>
<td>Prevent the successful outcome of a policy, operation, or initiative. Actors conduct influence operations to stymie or foil proposals, plans, or courses of action which are not in their interest.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0135.003.md">T0135.003</a></td>
<td>Subvert</td>
<td>Sabotage, destroy, or damage a system, process, or relationship. The classic example is the Soviet strategy of “active measures” involving deniable covert activities such as political influence, the use of front organisations, the orchestration of domestic unrest, and the spread of disinformation.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0135.004.md">T0135.004</a></td>
<td>Polarise</td>
<td>To cause a target audience to divide into two completely opposing groups. This is a special case of subversion. To divide and conquer is an age-old approach to subverting and overcoming an enemy.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0136.md">T0136</a></td>
<td>Cultivate Support</td>
<td>Grow or maintain the base of support for the actor, ally, or action. This includes hard-core recruitment, managing alliances, and generating or maintaining sympathy among a wider audience, including reputation management and public relations. Sub-techniques assume support for the actor (self) unless otherwise specified.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0136.001.md">T0136.001</a></td>
<td>Defend Reputation</td>
<td>Preserve a positive perception in the public’s mind following an accusation or adverse event. When accused of a wrongful act, an actor may engage in denial, counter accusations, whataboutism, or conspiracy theories to distract public attention and attempt to maintain a positive image.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0136.002.md">T0136.002</a></td>
<td>Justify Action</td>
<td>To convince others to exonerate you of a perceived wrongdoing. When an actor finds it untenable to deny doing something, they may attempt to exonerate themselves with disinformation which claims the action was reasonable. This is a special case of “Defend Reputation”.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0136.003.md">T0136.003</a></td>
<td>Energise Supporters</td>
<td>Raise the morale of those who support the organisation or group. Invigorate constituents with zeal for the mission or activity. Terrorist groups, political movements, and cults may indoctrinate their supporters with ideologies that are based on warped versions of religion or cause harm to others.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0136.004.md">T0136.004</a></td>
<td>Boost Reputation</td>
<td>Elevate the estimation of the actor in the public’s mind. Improve their image or standing. Public relations professionals use persuasive overt communications to achieve this goal; manipulators use covert disinformation.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0136.005.md">T0136.005</a></td>
<td>Cultivate Support for Initiative</td>
<td>Elevate or fortify the public backing for a policy, operation, or idea. Domestic and foreign actors can use artificial means to fabricate or amplify public support for a proposal or action.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0136.006.md">T0136.006</a></td>
<td>Cultivate Support for Ally</td>
<td>Elevate or fortify the public backing for a partner. Governments may interfere in other countries’ elections by covertly favouring a party or candidate aligned with their interests. They may also mount an influence operation to bolster the reputation of an ally under attack.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0136.007.md">T0136.007</a></td>
<td>Recruit Members</td>
<td>Motivate followers to join or subscribe as members of the team. Organisations may mount recruitment drives that use propaganda to entice sympathisers to sign up.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0136.008.md">T0136.008</a></td>
<td>Increase Prestige</td>
<td>Improve personal standing within a community. Gain fame, approbation, or notoriety. Conspiracy theorists, those with special access, and ideologues can gain prominence in a community by propagating disinformation, leaking confidential documents, or spreading hate.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0137.md">T0137</a></td>
<td>Make Money</td>
<td>Profit from disinformation, conspiracy theories, or online harm. In some cases the sole objective is financial gain; in other cases the objective is both financial and political. Making money may also be a way to sustain a political campaign.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0137.001.md">T0137.001</a></td>
<td>Generate Ad Revenue</td>
<td>Earn income from digital advertisements published alongside inauthentic content. Conspiratorial, false, or provocative content drives internet traffic. Content owners earn money from impressions of, clicks on, or conversions of ads published on their websites, social media profiles, or streaming services, or ads published when their content appears in search engine results. Fraudsters simulate impressions, clicks, and conversions, or they spin up inauthentic sites or social media profiles just to generate ad revenue. Conspiracy theorists and political operators generate ad revenue as a byproduct of their operation or as a means of sustaining their campaign.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0137.002.md">T0137.002</a></td>
<td>Scam</td>
<td>Defraud a target or trick a target into doing something that benefits the attacker. A typical scam is where a fraudster convinces a target to pay for something without the intention of ever delivering anything in return. Alternatively, the fraudster may promise benefits which never materialise, such as a fake cure. Criminals often exploit a fear or crisis or generate a sense of urgency. They may use deepfakes to impersonate authority figures or individuals in distress.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0137.003.md">T0137.003</a></td>
<td>Raise Funds</td>
<td>Solicit donations for a cause. Popular conspiracy theorists can attract financial contributions from their followers. Fighting back against the establishment is a popular crowdfunding narrative.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0137.004.md">T0137.004</a></td>
<td>Sell Items under False Pretences</td>
<td>Offer products for sale under false pretences. Campaigns may hijack or create causes built on disinformation to sell promotional merchandise. Or charlatans may amplify victims’ unfounded fears to sell them items of questionable utility such as supplements or survival gear.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0137.005.md">T0137.005</a></td>
<td>Extort</td>
<td>Coerce money or favours from a target by threatening to expose or corrupt information. Ransomware criminals typically demand money. Intelligence agencies demand national secrets. Sexual predators demand favours. The leverage may be critical, sensitive, or embarrassing information.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0137.006.md">T0137.006</a></td>
<td>Manipulate Stocks</td>
<td>Artificially inflate or deflate the price of stocks or other financial instruments and then trade on these to make profit. The most common securities fraud schemes are called “pump and dump” and “poop and scoop”.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0138.md">T0138</a></td>
<td>Motivate to Act</td>
<td>Persuade, impel, or provoke the target to behave in a specific manner favourable to the attacker. Some common behaviours are joining, subscribing, voting, buying, demonstrating, fighting, retreating, resigning, and boycotting.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0138.001.md">T0138.001</a></td>
<td>Encourage</td>
<td>Inspire, animate, or exhort a target to act. An actor can use propaganda, disinformation, or conspiracy theories to stimulate a target to act in its interest.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0138.002.md">T0138.002</a></td>
<td>Provoke</td>
<td>Instigate, incite, or arouse a target to act. Social media manipulators exploit moral outrage to propel targets to spread hate, take to the streets to protest, or engage in acts of violence.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0138.003.md">T0138.003</a></td>
<td>Compel</td>
<td>Force a target to take an action or to stop taking an action it has already started. Actors can use the threat of reputational damage alongside military or economic threats to compel a target.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0139.md">T0139</a></td>
<td>Dissuade from Acting</td>
<td>Discourage, deter, or inhibit the target from actions which would be unfavourable to the attacker. The actor may want the target to refrain from voting, buying, fighting, or supplying.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0139.001.md">T0139.001</a></td>
<td>Discourage</td>
<td>To make a target disinclined or reluctant to act. Manipulators use disinformation to cause targets to question the utility, legality, or morality of taking an action.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0139.002.md">T0139.002</a></td>
<td>Silence</td>
<td>Intimidate or incentivise a target into remaining silent, or prevent the target from speaking out. A threat actor may cow a target into silence as a special case of deterrence. Or they may buy the target’s silence. Or they may repress or restrict the target’s speech.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0139.003.md">T0139.003</a></td>
<td>Deter</td>
<td>Prevent a target from taking an action for fear of the consequences. Deterrence occurs in the mind of the target, who fears they will be worse off if they take an action than if they don’t. When making threats, aggressors may bluff, feign irrationality, or engage in brinksmanship.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0140.md">T0140</a></td>
<td>Cause Harm</td>
<td>Persecute, malign, or inflict pain upon a target. The objective of a campaign may be to cause fear or emotional distress in a target. In some cases, harm is instrumental to achieving a primary objective, as in coercion, repression, or intimidation. In other cases, harm may be inflicted for the satisfaction of the perpetrator, as in revenge or sadistic cruelty.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0140.001.md">T0140.001</a></td>
<td>Defame</td>
<td>Attempt to damage the target’s personal reputation by impugning their character. This can range from subtle attempts to misrepresent or insinuate, to obvious attempts to denigrate or disparage, to blatant attempts to malign or vilify. Slander applies to oral expression. Libel applies to written or pictorial material. Defamation is often carried out by online trolls. The sole aim here is to cause harm to the target. If the threat actor uses defamation as a means of undermining the target, then choose sub-technique “Smear” of technique “Undermine” instead.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0140.002.md">T0140.002</a></td>
<td>Intimidate</td>
<td>Coerce, bully, or frighten the target. An influence operation may use intimidation to compel the target to act against their will. Or the goal may be to frighten or even terrify the target into silence or submission. In some cases, the goal is simply to make the victim suffer.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0140.003.md">T0140.003</a></td>
<td>Spread Hate</td>
<td>Publish and/or propagate demeaning, derisive, or humiliating content targeting an individual or group of individuals with the intent to cause emotional, psychological, or physical distress. Hate speech can cause harm directly or incite others to harm the target. It often aims to stigmatise the target by singling out immutable characteristics such as colour, race, religion, national or ethnic origin, gender, gender identity, sexual orientation, age, disease, or mental or physical disability. Thus, promoting hatred online may involve racism, antisemitism, Islamophobia, xenophobia, sexism, misogyny, homophobia, transphobia, ageism, ableism, or any combination thereof. Motivations for hate speech range from group preservation to ideological superiority to the unbridled infliction of suffering.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0143.md">T0143</a></td>
<td>Persona Legitimacy</td>
<td>This Technique contains sub-techniques which analysts can use to assert whether an account is presenting an authentic, fabricated, impersonated, or parody persona:<br><br> T0143.001: Authentic Persona<br> T0143.002: Fabricated Persona<br> T0143.003: Impersonated Persona<br> T0143.004: Parody Persona</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0143.001.md">T0143.001</a></td>
<td>Authentic Persona</td>
<td>An individual or institution presenting a persona that legitimately matches who or what they are is presenting an authentic persona.<br><br> For example, an account which presents as being managed by a member of a country’s military, and is legitimately managed by that person, would be presenting an authentic persona (T0143.001: Authentic Persona, T0097.105: Military Personnel).<br><br> Sometimes people can authentically present themselves as who they are while still participating in malicious/inauthentic activity; a legitimate journalist (T0143.001: Authentic Persona, T0097.102: Journalist Persona) may accept bribes to promote products, or they could be tricked by threat actors into sharing an operation’s narrative.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0143.002.md">T0143.002</a></td>
<td>Fabricated Persona</td>
<td>An individual or institution pretending to have a persona without any legitimate claim to that persona is presenting a fabricated persona, such as a person who presents themselves as a member of a country’s military without having worked in any capacity with the military (T0143.002: Fabricated Persona, T0097.105: Military Personnel).<br><br> Sometimes real people can present entirely fabricated personas; they can use real names and photos on social media while also pretending to have credentials or traits they don’t have in real life.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0143.003.md">T0143.003</a></td>
<td>Impersonated Persona</td>
<td>Threat actors may impersonate existing individuals or institutions to conceal their network identity, add legitimacy to content, or harm the impersonated target’s reputation. This Technique covers situations where an actor presents themselves as another existing individual or institution.<br><br> This Technique was previously called Prepare Assets Impersonating Legitimate Entities and used the ID T0099.<br><br> <b>Associated Techniques and Sub-techniques</b><br> <b>T0097: Presented Persona:</b> Analysts can use the sub-techniques of T0097: Presented Persona to categorise the type of impersonation. For example, a document developed by a threat actor which falsely presented as a letter from a government department could be documented using T0085.004: Develop Document, T0143.003: Impersonated Persona, and T0097.206: Government Institution Persona.<br> <b>T0145.001: Copy Account Imagery:</b> Actors may take existing accounts’ profile pictures as part of their impersonation efforts.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0143.004.md">T0143.004</a></td>
<td>Parody Persona</td>
<td>Parody is a form of artistic expression that imitates the style or characteristics of a particular work, genre, or individual in a humorous or satirical way, often to comment on or critique the original work or subject matter. People may present as parodies to create humour or make a point by exaggerating or altering elements of the original, while still maintaining recognisable elements.<br><br> The use of parody is not an indication of inauthentic or malicious behaviour; parody allows people to present ideas or criticisms in a comedic or exaggerated manner, softening the impact of sensitive or contentious topics. Because parody is often protected as a form of free speech or artistic expression, it provides a legal and social framework for discussing controversial issues.<br><br> However, parody personas may be perceived as authentic personas, leading to people mistakenly believing that a parody account’s statements represent the real opinions of a parodied target. Threat actors may also use the guise of parody to spread campaign content. Parody personas may disclaim that they are operating as a parody, however this is not always the case, and is not always given prominence.<br><br> <b>Associated Techniques and Sub-techniques</b><br> <b>T0097: Presented Persona:</b> Analysts can use the sub-techniques of T0097: Presented Persona to categorise the type of parody. For example, an account presenting as a parody of a business could be documented using T0097.205: Business Persona and T0143.004: Parody Persona.<br> <b>T0145.001: Copy Account Imagery:</b> Actors may copy existing accounts’ imagery as part of their parody efforts.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0144.md">T0144</a></td>
<td>Persona Legitimacy Evidence</td>
<td>This Technique contains behaviours which might indicate whether a persona is legitimate, a fabrication, or a parody.<br><br> For example, the same persona being consistently presented across platforms is consistent with how authentic users behave on social media. However, threat actors have also displayed this behaviour as a way to increase the perceived legitimacy of their fabricated personas (aka “backstopping”).</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0144.001.md">T0144.001</a></td>
<td>Present Persona across Platforms</td>
<td>This sub-technique covers situations where analysts have identified the same persona being presented across multiple platforms.<br><br> Having multiple accounts presenting the same persona is not an indicator of inauthentic behaviour; many people create accounts and present as themselves on multiple platforms. However, threat actors are known to present the same persona across multiple platforms, benefiting from an increase in perceived legitimacy.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0144.002.md">T0144.002</a></td>
<td>Persona Template</td>
<td>Threat actors have been observed following a template when filling their accounts’ online profiles. This may be done to enable account holders to quickly present themselves as a real person with a targeted persona.<br><br> For example, an actor may be instructed to create many fabricated local accounts for use in an operation using a template of “[flag emojis], [location], [personal quote], [political party] supporter” in their account’s description.<br><br> <b>Associated Techniques and Sub-techniques</b><br> <b>T0143.002: Fabricated Persona:</b> The use of a templated account biography in a collection of accounts may be an indicator that the personas have been fabricated.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0145.md">T0145</a></td>
<td>Establish Account Imagery</td>
<td>Introduce visual elements to an account where a platform allows this functionality (e.g. a profile picture, a cover photo, etc.).<br><br> Threat actors who don’t want to use pictures of themselves in their social media accounts may use alternate imagery to make their account appear more legitimate.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0145.001.md">T0145.001</a></td>
<td>Copy Account Imagery</td>
<td>Account imagery copied from an existing account.<br><br> Analysts may use reverse image search tools to try to identify previous uses of account imagery (e.g. a profile picture) by other accounts.<br><br> Threat actors have been known to copy existing accounts’ imagery to impersonate said accounts, or to provide imagery for unrelated accounts which aren’t intended to impersonate the original assets’ owner.<br><br> <b>Associated Techniques and Sub-techniques</b><br> <b>T0143.003: Impersonated Persona:</b> Actors may copy existing accounts’ imagery in an attempt to impersonate them.<br> <b>T0143.004: Parody Persona:</b> Actors may copy existing accounts’ imagery as part of a parody of that account.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0145.002.md">T0145.002</a></td>
<td>AI-Generated Account Imagery</td>
<td>AI-generated images used in account imagery.<br><br> An influence operation might flesh out its account by uploading account imagery (e.g. a profile picture), increasing its perceived legitimacy. By using an AI-generated picture for this purpose, they are able to present themselves as a real person without compromising their own identity, or risking detection by taking a real person’s existing profile picture.<br><br> <b>Associated Techniques and Sub-techniques</b><br> <b>T0086.002: Develop AI-Generated Images (Deepfakes):</b> Analysts should use this sub-technique to document AI-generated imagery used to support narratives.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0145.003.md">T0145.003</a></td>
<td>Animal Account Imagery</td>
<td>Animal used in account imagery.<br><br> An influence operation might flesh out its account by uploading a profile picture, increasing its perceived authenticity.<br><br> People sometimes legitimately use images of animals as their profile pictures (e.g. of their pets), and threat actors can mimic this behaviour to avoid the risk of detection associated with stealing or AI-generating profile pictures (see T0145.001: Copy Account Imagery and T0145.002: AI-Generated Account Imagery).<br><br> This Technique is often used by Coordinated Inauthentic Behaviour accounts (CIBs). A collection of accounts displaying the same behaviour using similar account imagery can indicate the presence of CIB.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0145.004.md">T0145.004</a></td>
<td>Scenery Account Imagery</td>
<td>Scenery or nature used in account imagery.<br><br> An influence operation might flesh out its account by uploading account imagery (e.g. a profile picture), increasing its perceived authenticity.<br><br> People sometimes legitimately use images of scenery as their profile picture, and threat actors can mimic this behaviour to avoid the risk of detection associated with stealing or AI-generating profile pictures (see T0145.001: Copy Account Imagery and T0145.002: AI-Generated Account Imagery).<br><br> This Technique is often used by Coordinated Inauthentic Behaviour accounts (CIBs). A collection of accounts displaying the same behaviour using similar account imagery can indicate the presence of CIB.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0145.005.md">T0145.005</a></td>
<td>Illustrated Character Account Imagery</td>
<td>A cartoon/illustrated/anime character used in account imagery.<br><br> An influence operation might flesh out its account by uploading account imagery (e.g. a profile picture), increasing its perceived authenticity.<br><br> People sometimes legitimately use images of illustrated characters as their profile picture, and threat actors can mimic this behaviour to avoid the risk of detection associated with stealing or AI-generating profile pictures (see T0145.001: Copy Account Imagery and T0145.002: AI-Generated Account Imagery).<br><br> This Technique is often used by Coordinated Inauthentic Behaviour accounts (CIBs). A collection of accounts displaying the same behaviour using similar account imagery can indicate the presence of CIB.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0145.006.md">T0145.006</a></td>
<td>Attractive Person Account Imagery</td>
<td>Attractive person used in account imagery.<br><br> An influence operation might flesh out its account by uploading account imagery (e.g. a profile picture), increasing its perceived authenticity.<br><br> Pictures of physically attractive people can benefit threat actors by increasing attention given to their posts.<br><br> People sometimes legitimately use images of attractive people as their profile picture, and threat actors can mimic this behaviour to avoid the risk of detection associated with stealing or AI-generating profile pictures (see T0145.001: Copy Account Imagery and T0145.002: AI-Generated Account Imagery).<br><br> This Technique is often used by Coordinated Inauthentic Behaviour accounts (CIBs). A collection of accounts displaying the same behaviour using similar account imagery can indicate the presence of CIB.<br><br> <b>Associated Techniques and Sub-techniques</b><br> <b>T0097.109: Romantic Suitor Persona:</b> Accounts presenting as a romantic suitor may use an attractive person in their account imagery.<br> <b>T0151.017: Dating Platform:</b> Analysts can use this sub-technique for tagging cases where an account has been identified as using a dating platform.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0145.007.md">T0145.007</a></td>
<td>Stock Image Account Imagery</td>
<td>Stock images used in account imagery.<br><br> Stock image websites produce photos of people in various situations. Threat actors can purchase or appropriate these images for use in their account imagery, increasing perceived legitimacy while avoiding the risk of detection associated with stealing or AI-generating profile pictures (see T0145.001: Copy Account Imagery and T0145.002: AI-Generated Account Imagery).<br><br> Stock images tend to include physically attractive people, and this can benefit threat actors by increasing attention given to their posts.<br><br> This Technique is often used by Coordinated Inauthentic Behaviour accounts (CIBs). A collection of accounts displaying the same behaviour using similar account imagery can indicate the presence of CIB.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0146.md">T0146</a></td>
<td>Account Asset</td>
<td>An Account is a user-specific profile that allows access to the features and services of an online platform, typically requiring a username and password for authentication.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0146.001.md">T0146.001</a></td>
<td>Free Account Asset</td>
<td>Many online platforms allow users to create free accounts on their platform. A Free Account is an Account which does not require payment at account creation and is not subscribed to paid platform features.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0146.002.md">T0146.002</a></td>
<td>Paid Account Asset</td>
<td>Some online platforms afford accounts extra features, or other benefits, if the user pays a fee. For example, as of September 2024, content posted by a Paid Account on X (previously Twitter) is prioritised in the platform’s algorithm.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0146.003.md">T0146.003</a></td>
<td>Verified Account Asset</td>
<td>Some online platforms apply badges of verification to accounts which meet certain criteria.<br><br>On some platforms (such as dating apps) a verification badge signifies that the account has passed the platform’s identity verification checks. On some platforms (such as X (previously Twitter)) a verification badge signifies that an account has paid for the platform’s service.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0146.004.md">T0146.004</a></td>
<td>Administrator Account Asset</td>
<td>Some accounts have special privileges or are in control of the Digital Community Hosting Asset; for example, the Admin of a Facebook Page, a Moderator of a Subreddit, etc.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0146.005.md">T0146.005</a></td>
<td>Lookalike Account ID</td>
<td>Many platforms which host online communities require creation of a username (or another unique identifier) when an Account is created.<br><br>Sometimes people create usernames which are visually similar to other existing accounts’ usernames. While this is not necessarily an indicator of malicious behaviour, actors can create Lookalike Account IDs to support Impersonations or Parody.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0146.006.md">T0146.006</a></td>
<td>Open Access Platform</td>
<td>Some online platforms allow users to take advantage of the platform’s features without creating an account. Examples include the Paste Platform Pastebin, and the Image Board Platforms 4chan and 8chan.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0146.007.md">T0146.007</a></td>
<td>Automated Account Asset</td>
<td>An Automated Account is an account which is displaying automated behaviour, such as republishing or liking other accounts’ content, or publishing their own content.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0147.md">T0147</a></td>
<td>Software Asset</td>
<td>Software is a program developed to run on computers or devices that helps users achieve specific goals, such as improving productivity, automating tasks, or having fun.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0147.001.md">T0147.001</a></td>
<td>Game Asset</td>
<td>A Game is Software which has been designed for interactive entertainment, where users take on challenges set by the game’s designers.<br><br>While Online Game Platforms allow people to play with each other, Games are designed for single-player experiences.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0147.002.md">T0147.002</a></td>
<td>Game Mod Asset</td>
<td>A Game Mod is a modification which can be applied to a Game or Multiplayer Online Game to add new content or functionality to the game.<br><br>Users can Modify Games to introduce new content to the game. Modified Games can be distributed on Software Delivery Platforms such as Steam or can be distributed within the Game or Multiplayer Online Game.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0147.003.md">T0147.003</a></td>
<td>Malware Asset</td>
<td>Malware is Software which has been designed to cause harm or facilitate malicious behaviour on electronic devices.<br><br>DISARM recommends using the [MITRE ATT&CK Framework](https://attack.mitre.org/) to document malware types and their usage.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0147.004.md">T0147.004</a></td>
<td>Mobile App Asset</td>
<td>A Mobile App is an application which has been designed to run on mobile operating systems, such as Android or iOS.<br><br>Mobile Apps can enable access to online platforms (e.g. Facebook’s mobile app) or can provide software which users can run offline on their device.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0148.md">T0148</a></td>
<td>Financial Instrument</td>
<td>A Financial Instrument is a platform or software that facilitates the sending, receiving, and management of money, enabling financial transactions between users or organisations.<br><br>Threat actors can deploy financial instruments legitimately to manage their own finances or illegitimately to support fraud schemes.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0148.001.md">T0148.001</a></td>
<td>Online Banking Platform</td>
<td>Online Banking Platforms are spaces provided by banks for their customers to manage their Bank Account online.<br><br>The Online Banking Platforms available differ by country. In the United Kingdom, examples of banking institutions which provide Online Banking Platforms include Lloyds, Barclays, and Monzo. In the United States, examples include Citibank, Chase, and Capital One.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0148.002.md">T0148.002</a></td>
<td>Bank Account Asset</td>
<td>A Bank Account is a financial account that allows individuals or organisations to store, manage, and access their money, typically for saving, spending, or investment purposes.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0148.003.md">T0148.003</a></td>
<td>Payment Processing Platform</td>
<td>Stripe, PayPal, Apple Pay, Chargebee, Recurly, and Zuora are examples of Payment Processing Platforms.<br><br>Payment Processing Platforms produce programs providing Payment Processing or Subscription Processing capabilities which actors can use to set up online storefronts, or to take donations.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0148.004.md">T0148.004</a></td>
<td>Payment Processing Capability</td>
<td>A Payment Processing Capability is a feature of online platforms or software which enables the processing of one-off payments (e.g. an online checkout, or donation processing page).<br><br>Payment Processing Capabilities can enable platform users to purchase products or services or can facilitate donations to a given cause.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0148.005.md">T0148.005</a></td>
<td>Subscription Processing Capability</td>
<td>A Subscription Processing Capability is a feature of online platforms or software which enables the processing of recurring payments.<br><br>Subscription Processing Capabilities are typically used to enable recurring payments in exchange for continued access to products or services.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0148.006.md">T0148.006</a></td>
<td>Crowdfunding Platform</td>
<td>Kickstarter and GoFundMe are examples of Crowdfunding Platforms.<br><br>Crowdfunding Platforms enable users with Accounts to create projects for other platform users to finance, usually in exchange for access to the fruits of the project.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0148.007.md">T0148.007</a></td>
<td>eCommerce Platform</td>
<td>Amazon, eBay, and Etsy are examples of eCommerce Platforms.<br><br>eCommerce Platforms enable users with Accounts to create online storefronts from which other platform users can purchase goods or services.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0148.008.md">T0148.008</a></td>
<td>Cryptocurrency Exchange Platform</td>
<td>Coinbase and Kraken are examples of Cryptocurrency Exchange Platforms.<br><br>Cryptocurrency Exchange Platforms provide users a digital marketplace where they can buy, sell, and trade cryptocurrencies, such as Bitcoin or Ethereum.<br><br>Some Cryptocurrency Exchange Platforms allow users to create a Cryptocurrency Wallet.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0148.009.md">T0148.009</a></td>
<td>Cryptocurrency Wallet</td>
<td>A Cryptocurrency Wallet is a digital tool that allows users to store, send, and receive cryptocurrencies. It manages private and public keys, enabling secure access to a user's crypto assets.<br><br>An influence operation might use cryptocurrency to conceal that they are conducting operational activities, building assets, or sponsoring aligning entities.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0149.md">T0149</a></td>
<td>Online Infrastructure</td>
<td>Online Infrastructure consists of technical assets which enable online activity, such as domains, servers, and IP addresses.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0149.001.md">T0149.001</a></td>
<td>Domain Asset</td>
<td>A Domain is a web address (such as “google[.]com”), used to navigate to Websites on the internet.<br><br>Domains differ from Websites in that Websites are considered to be developed web pages which host content, whereas Domains do not necessarily host public-facing web content.<br><br>A threat actor may register a new domain to bypass the old domain being blocked.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0149.002.md">T0149.002</a></td>
<td>Email Domain Asset</td>
<td>An Email Domain is a Domain (such as “meta[.]com”) which has the ability to send emails (e.g. from an @meta[.]com address).<br><br>Any Domain which has an MX (Mail Exchange) record and configured SMTP (Simple Mail Transfer Protocol) settings can send and receive emails, and is therefore an Email Domain.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0149.003.md">T0149.003</a></td>
<td>Lookalike Domain</td>
<td>A Lookalike Domain is a Domain which is visually similar to another Domain, with the potential for web users to mistake one domain for the other.<br><br>Threat actors who want to impersonate organisations’ websites have been observed using a variety of domain impersonation methods. For example, actors wanting to create a domain impersonating netflix.com may use methods such as typosquatting (e.g. n3tflix.com), combosquatting (e.g. netflix-billing.com), or TLD swapping (e.g. netflix.top).</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0149.004.md">T0149.004</a></td>
<td>Redirecting Domain Asset</td>
<td>A Redirecting Domain is a Domain which has been configured to redirect users to another Domain when visited.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0149.005.md">T0149.005</a></td>
<td>Server Asset</td>
<td>A Server is a computer which provides resources, services, or data to other computers over a network. There are different types of servers, such as web servers (which serve web pages and applications to users), database servers (which manage and provide access to databases), and file servers (which store and share files across a network).</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0149.006.md">T0149.006</a></td>
<td>IP Address Asset</td>
<td>An IP Address is a unique numerical label assigned to each device connected to a computer network that uses the Internet Protocol for communication. IP addresses are commonly a part of any online infrastructure.<br><br>IP addresses can be in IPv4 dotted-decimal (x.x.x.x) or IPv6 colon-separated hexadecimal (y:y:y:y:y:y:y:y) formats.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0149.007.md">T0149.007</a></td>
<td>VPN Asset</td>
<td>A VPN (Virtual Private Network) is a service which creates secure, encrypted connections over the internet, allowing users to transmit data safely and access network resources remotely. It masks IP Addresses, enhancing privacy and security by preventing unauthorised access and tracking. VPNs are commonly used for protecting sensitive information, bypassing geographic restrictions, and maintaining online anonymity.<br><br>VPNs can also allow a threat actor to pose as if they are located in one country while in reality being based in another. By doing so, they can try to either misattribute their activities to another actor or better hide their own identity.</td>
<td>TA15</td>
</tr>
<tr>
|
||
<td><a href="techniques/T0149.008.md">T0149.008</a></td>
|
||
<td>Proxy IP Address Asset</td>
|
||
<td>A Proxy IP Address allows a threat actor to mask their real IP Address by putting a layer between them and the online content they’re connecting with. <br><br>Proxy IP Addresses can hide the connection between the threat actor and their online infrastructure.</td>
|
||
<td>TA15</td>
|
||
</tr>
|
||
<tr>
|
||
<td><a href="techniques/T0149.009.md">T0149.009</a></td>
|
||
<td>Internet Connected Physical Asset</td>
|
||
<td>An Internet Connected Physical Asset (sometimes referred to as IoT (Internet of Things)) is a physical asset which has internet connectivity to support online features, such as digital signage, wireless printers, and smart TVs.</td>
|
||
<td>TA15</td>
|
||
</tr>
|
||
<tr>
|
||
<td><a href="techniques/T0150.md">T0150</a></td>
|
||
<td>Asset Origin</td>
|
||
<td>Asset Origin contains a list of ways that an actor can obtain an asset. For example, they can create new accounts on online platforms, or they can compromise existing accounts or websites.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0150.001.md">T0150.001</a></td>
<td>Newly Created Asset</td>
<td>A Newly Created Asset is an asset which has been created and used for the first time in a documented potential incident.<br><br>For example, analysts who can identify a recent creation date for Accounts participating in the spread of a new narrative can assert these are Newly Created Assets.<br><br>Analysts should use Dormant if the asset was created and lay dormant for an extended period of time before activity.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0150.002.md">T0150.002</a></td>
<td>Dormant Asset</td>
<td>A Dormant Asset is an asset which was inactive for an extended period before being used in a documented potential incident.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0150.003.md">T0150.003</a></td>
<td>Pre-Existing Asset</td>
<td>Pre-Existing Assets are assets which existed before the observed incident and have not been Repurposed; i.e. they are still being used for their original purpose.<br><br>An example could be an Account which presented itself with a Journalist Persona prior to and during the observed potential incident.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0150.004.md">T0150.004</a></td>
<td>Repurposed Asset</td>
<td>Repurposed Assets are assets which have been identified as being used previously, but are now being used for different purposes, or have new Presented Personas.<br><br>Actors have been documented compromising assets, and then repurposing them to present Inauthentic Personas as part of their operations.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0150.005.md">T0150.005</a></td>
<td>Compromised Asset</td>
<td>A Compromised Asset is an asset which was originally created or belonged to another person or organisation, but which an actor has gained access to without their consent.<br><br>See also MITRE ATT&CK T1078: Valid Accounts.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0150.006.md">T0150.006</a></td>
<td>Purchased Asset</td>
<td>A Purchased Asset is an asset whose ownership an actor has paid for.<br><br>For example, threat actors have been observed selling compromised social media accounts on dark web marketplaces, which can be used to disguise operation activity.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0150.007.md">T0150.007</a></td>
<td>Rented Asset</td>
<td>A Rented Asset is an asset which actors are temporarily renting or subscribing to. <br><br>For example, threat actors have been observed renting temporary access to legitimate accounts on online platforms in order to disguise operation activity.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0150.008.md">T0150.008</a></td>
<td>Bulk Created Asset</td>
<td>A Bulk Created Asset is an asset which was created alongside many other instances of the same asset.<br><br>Actors have been observed bulk creating Accounts on Social Media Platforms such as Facebook. Indicators of bulk asset creation include assets’ creation dates, their naming conventions, their configuration (e.g. templated personas, visually similar profile pictures), or their activity (e.g. post timings, narratives posted). A minimal sketch flagging two of these indicators appears after this table.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0151.md">T0151</a></td>
<td>Digital Community Hosting Asset</td>
<td>A Digital Community Hosting Asset is an online asset which can be used by actors to provide spaces for users to interact with each other.<br><br>Sub-techniques categorised under Digital Community Hosting Assets can include Content Hosting and Content Delivery capabilities; however, their nominal primary purpose is to provide a space for community interaction.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0151.001.md">T0151.001</a></td>
<td>Social Media Platform</td>
<td>Examples of popular Social Media Platforms include Facebook, Instagram, and VK.<br><br>Social Media Platforms allow users to create Accounts, which they can configure to present themselves to other platform users. This typically involves Establishing Account Imagery and Presenting a Persona.<br><br>Social Media Platforms typically allow the creation of Online Community Groups and Online Community Pages.<br><br>Accounts on Social Media Platforms are typically presented with a feed of content posted to the platform. The content that populates this feed can be aggregated by the platform’s proprietary Content Recommendation Algorithm, or users can “friend” or “follow” other accounts to add their posts to their feed.<br><br>Many Social Media Platforms also allow users to send direct messages to other users on the platform. </td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0151.002.md">T0151.002</a></td>
<td>Online Community Group</td>
<td>Some online platforms allow people with Accounts to create Online Community Groups. Groups are usually created around a specific topic or locality, and allow users to post content to the group, and interact with other users’ posted content. <br><br>For example, Meta’s Social Media Platform Facebook allows users to create a “Facebook group”. This feature is not exclusive to Social Media Platforms; the Microblogging Platform X (prev. Twitter) allows users to create “X Communities”, groups based on particular topics which users can join and post to; the Software Delivery Platform Steam allows users to create Steam Community Groups.<br><br>Online Community Groups can be open or gated (for example, groups can require admin approval before users can join).</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0151.003.md">T0151.003</a></td>
<td>Online Community Page</td>
<td>A Facebook Page is an example of an Online Community Page.<br><br>Online Community Pages allow Administrator Accounts to post content to the page, which other users can interact with. Pages can be followed or liked by other users, but these users can’t initiate new posts to the page.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0151.004.md">T0151.004</a></td>
<td>Chat Platform</td>
<td>Examples of popular Chat Platforms include WhatsApp, WeChat, Telegram, and Signal; Slack, Mattermost, and Discord; Zoom, GoTo Meeting, and WebEx.<br><br>Chat Platforms allow users to engage in text, audio, or video chats with other platform users.<br><br>Different Chat Platforms afford users different capabilities. Examples include Direct Messaging, Chat Rooms, Chat Broadcast Channels, and Chat Community Servers.<br><br>Some Chat Platforms enable encrypted communication between platform users.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0151.005.md">T0151.005</a></td>
<td>Chat Community Server</td>
<td>Chat Platforms such as Discord, Slack, and Microsoft Teams allow users to create their own Chat Community Servers, which they can invite other platform users to join.<br><br>Chat Community Servers are online communities made up of Chat Rooms (or “Channels”) in which users can discuss the given group’s topic. Groups can either be public (shown in the server’s browsable list of channels, available for any member to view and join) or Gated (users must be added to the chat group by existing members to participate).<br><br>Some Chat Community Servers allow users to create Chat Broadcast Groups, in which only specific members (e.g. server administrators) of the chat are able to post new content to the group.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0151.006.md">T0151.006</a></td>
<td>Chat Room</td>
<td>Many platforms which enable community interaction allow users to create Chat Rooms; a room in which members of the group can talk to each other via text, audio, or video.<br><br>Most Chat Rooms are Gated; users must be added to the chat group before they can post to the chat group, or view its content. For example, on WhatsApp a user can create a Chat Room containing other WhatsApp users whose contact information they have. At this point the user who created the Chat Room has an Administrator Account; they are uniquely able to add other users to the Chat Room.<br><br>However, Chat Rooms made on Chat Community Servers such as Discord can be Gated or open. If left open, anyone on the server can view the Chat Room (“channel”), read its contents, and choose to join it.<br><br>Examples of Platforms which allow creation of Chat Rooms include:<br>Instagram, Facebook, X (prev. Twitter) (Group Direct Messaging)<br>WhatsApp, Telegram, WeChat, Signal (Group Chats)<br>Discord, Slack, Mattermost, Microsoft Teams (Channels)</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0151.007.md">T0151.007</a></td>
<td>Chat Broadcast Group</td>
<td>A Chat Broadcast Group is a type of Chat Group in which only specific members can send content to the channel (typically administrators, or approved group members). Members of the channel may be able to react to content, or comment on it, but can’t directly push new content to the channel.<br><br>Examples include:<br>WhatsApp, Telegram, Discord: Chat Groups in which only admins are able to post new content.<br>X (prev. Twitter): Spaces (an audio discussion hosting feature) in which admins control who can speak at a given moment.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0151.008.md">T0151.008</a></td>
<td>Microblogging Platform</td>
<td>Examples of Microblogging Platforms include TikTok, Threads, Bluesky, Mastodon, QQ, Tumblr, and X (formerly Twitter).<br><br>Microblogging Platforms allow users to create Accounts, which they can configure to present themselves to other platform users. This typically involves Establishing Account Imagery and Presenting a Persona.<br><br>Accounts on Microblogging Platforms are able to post short-form text content alongside media.<br><br>Content posted to the platforms is aggregated into different feeds and presented to the user. Typical feeds include content posted by other Accounts which the user follows, and content promoted by the platform’s proprietary Content Recommendation Algorithm. Users can also search or use hashtags to discover new content.<br><br>Mastodon is open-source decentralised software which allows anyone to create their own Microblogging Platform that can communicate with other platforms within the “fediverse” (similar to how different email platforms can send emails to each other). Meta’s Threads is a Microblogging Platform which can interact with the fediverse.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0151.009.md">T0151.009</a></td>
<td>Legacy Online Forum Platform</td>
<td>Examples of Legacy Online Forum Platforms include Something Awful (SA Forums), Ars Technica forums, NeoGAF, and the forums available on the Mumsnet and War Thunder websites.<br><br>Legacy Online Forum Platforms are a type of message board (using software such as vBulletin or phpBB) popular in the early 2000s for online communities. They are often used to provide spaces for a community to exist around a given website or topic.<br><br>Legacy Online Forum Platforms allow users to create Accounts to join in discussion threads posted to any number of Forums and Sub-Forums on the platform. Forums and Sub-Forums can be Gated, allowing access to approved users only. They can vary in size: some are larger platforms that host a wider set of topics and communities, while others are smaller in scope and size.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0151.010.md">T0151.010</a></td>
<td>Community Forum Platform</td>
<td>Reddit, Lemmy and Tildes are examples of Community Forum Platforms.<br><br>Community Forum Platforms are exemplified by users’ ability to create their own sub-communities (Community Sub-Forums) which other platform users can join. <br><br>Platform users can view aggregated content from all Community Sub-Forums they subscribe to, or they can view all content from a particular Community Sub-Forum.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0151.011.md">T0151.011</a></td>
<td>Community Sub-Forum</td>
<td>Community Forum Platforms are made up of many Community Sub-Forums. Sub-Forums provide spaces for platform users to create a community based around any topic.<br><br>For example, Reddit (a popular Community Forum Platform) has over 138,000 “subreddits” (Community Sub-Forums), including 1082 unique cat-based communities.<br><br>Typically, Sub-Forums allow users to post text, images, or videos, which other platform users can up/downvote or comment on. Sub-Forums may have their own extra rules alongside the platform’s global rules, enforced by community moderators.<br><br>While most Sub-Forums are made by users with Accounts on the Community Forum Platform, Sub-Forums can also be created by the platform itself.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0151.012.md">T0151.012</a></td>
<td>Image Board Platform</td>
<td>4chan and 8chan are examples of Image Board Platforms.<br><br>Image Board Platforms provide individual boards on which users can start threads related to the board’s topic. For example, 4chan’s /pol/ board provides a space for users to talk about politics. <br><br>Most Image Board Platforms allow users to post without creating an account. Posts are typically made anonymously, although users can choose to post under a pseudonym.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0151.013.md">T0151.013</a></td>
<td>Question and Answer Platform</td>
<td>Quora, Stack Overflow, and Yahoo Answers are examples of Question and Answer Platforms.<br><br>Question and Answer Platforms allow users to create Accounts, letting them post questions to the platform community and respond to other platform users’ questions.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0151.014.md">T0151.014</a></td>
<td>Comments Section</td>
<td>Many platforms enable community interaction via Comments Sections on posted content. Comments Sections allow platform users to comment on content posted by other users. <br><br>On some platforms Comments Sections are the only place available for community interaction, such as news websites which provide a Comments Section to discuss articles posted to the website.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0151.015.md">T0151.015</a></td>
<td>Online Game Platform</td>
<td>Roblox, Minecraft, Fortnite, League of Legends, and World of Warcraft are examples of Online Game Platforms.<br><br>Online Game Platforms allow users to create Accounts which they can use to access Online Game Sessions; i.e. an individual instance of a multiplayer online game.<br><br>Many Online Game Platforms support text or voice chat within Online Game Sessions.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0151.016.md">T0151.016</a></td>
<td>Online Game Session</td>
<td>Online Game Sessions are instances of a game played on an Online Game Platform. Examples of Online Game Sessions include a match in Fortnite or League of Legends, or a server in Minecraft, Fortnite, or World of Warcraft.<br><br>Some Online Game Platforms (such as Fortnite, League of Legends, and World of Warcraft) host Online Game Sessions on their own Servers, and don’t allow other actors to host Online Game Sessions.<br><br>Some Online Game Platforms (such as Roblox and Minecraft) allow users to host instances of Online Game Sessions on their own Servers.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0151.017.md">T0151.017</a></td>
<td>Dating Platform</td>
<td>Tinder, Bumble, Grindr, Tantan, Badoo, Plenty of Fish, Hinge, LOVOO, OkCupid, happn, and Mamba are examples of Dating Platforms.<br><br>Dating Platforms allow users to create Accounts, letting them connect with other platform users with the purpose of developing a physical/romantic relationship.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0152.md">T0152</a></td>
<td>Digital Content Hosting Asset</td>
<td>Digital Content Hosting Assets are online assets which are primarily designed to allow actors to upload content to the internet.<br><br>Sub-techniques categorised under Digital Content Hosting Assets can include Community Hosting and Content Delivery capabilities; however, their nominal primary purpose is to host content online.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0152.001.md">T0152.001</a></td>
<td>Blogging Platform</td>
<td>Medium and Substack are examples of Blogging Platforms. <br><br>By creating an Account on a Blogging Platform, people are able to create their own Blog.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0152.002.md">T0152.002</a></td>
<td>Blog Asset</td>
<td>Blogs are a collection of posts centred on a particular topic, author, or group of authors.<br><br>Some platforms are designed to support users in hosting content online, such as Blogging Platforms like Substack which allow users to create Blogs, but other online platforms can also be used to produce a Blog; a Paid Account on X (prev. Twitter) is able to post long-form text content to its timeline in the style of a blog.<br><br>Actors may create Accounts on Blogging Platforms to create a Blog, or make their own Blog on a Website.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0152.003.md">T0152.003</a></td>
<td>Website Hosting Platform</td>
<td>Examples of Website Hosting Platforms include Wix, Webflow, Weebly, and WordPress.<br><br>Website Hosting Platforms help users manage the online infrastructure required to host a website, such as securing IP Addresses and Domains.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0152.004.md">T0152.004</a></td>
<td>Website Asset</td>
<td>A Website is a collection of related web pages hosted on a server and accessible via a web browser. Websites have an associated Domain and can host various types of content, such as text, images, videos, and interactive features. <br><br>When a Website is fleshed out, it Presents a Persona to site visitors. For example, the Domain “bbc.co.uk/news” hosts a Website which uses the News Outlet Persona.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0152.005.md">T0152.005</a></td>
<td>Paste Platform</td>
<td>Pastebin is an example of a Paste Platform.<br><br>Paste Platforms allow people to upload unformatted text to the platform, which they can share via a link. Some Paste Platforms are Open Access Platforms which allow users to upload content without creating an Account first.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0152.006.md">T0152.006</a></td>
<td>Video Platform</td>
<td>YouTube, Vimeo, and LiveLeak are examples of Video Platforms.<br><br>Video Platforms allow people to create Accounts which they can use to upload video content for people to watch on the platform.<br><br>The ability to host videos is not exclusive to Video Platforms; many online platforms allow users with Accounts to upload video content. However, Video Platforms’ primary purpose is to be a place to host and view video content.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0152.007.md">T0152.007</a></td>
<td>Audio Platform</td>
<td>SoundCloud, Spotify, and YouTube Music; Apple Podcasts, Podbean, and Captivate are examples of Audio Platforms.<br><br>Audio Platforms allow people to create Accounts which they can use to upload audio content to the platform.<br><br>The ability to host audio is not exclusive to Audio Platforms; many online platforms allow users with Accounts to upload audio content. However, Audio Platforms’ primary purpose is to be a place to host and listen to audio content.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0152.008.md">T0152.008</a></td>
<td>Live Streaming Platform</td>
<td>Twitch.tv and Whatnot are examples of Live Streaming Platforms. <br><br>Live Streaming Platforms allow people to create Accounts and stream live content (video or audio). A temporary open Group Chat is created alongside live streamed content for viewers to discuss the stream. Some Live Streaming Platforms allow users to archive streamed content for later non-live viewing.<br><br>The ability to stream live media is not exclusive to Live Streaming Platforms; many online platforms allow users with Accounts to stream content (such as the Video Platform YouTube’s “YouTube Live”, and the Social Media Platform Facebook’s “Facebook Live”). However, Live Streaming Platforms’ primary purpose is to be a place for people to stream content live.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0152.009.md">T0152.009</a></td>
<td>Software Delivery Platform</td>
<td>Apple’s App Store, Google’s Google Play Store, and Valve’s Steam are examples of Software Delivery Platforms.<br><br>Software Delivery Platforms are designed to enable users to download programmes uploaded to the platform. Software can be purchased, or downloaded for free. <br><br>Some Software Delivery Platforms require users to have an Account before they can download software, and software they acquire becomes associated with the account (i.e. the account owns a licence to download the software). Some platforms don’t require users to make accounts before downloading software.<br><br>Actors may create their own Software Delivery Platform on a Domain they own.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0152.010.md">T0152.010</a></td>
<td>File Hosting Platform</td>
<td>Dropbox and Google Drive are examples of File Hosting Platforms.<br><br>File Hosting Platforms allow people to create Accounts which they can use to host files on another server, enabling access to content on any machine, and the ability to easily share files with anyone online.<br><br>Actors may also create their own File Hosting Platform on a Website or Server they control.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0152.011.md">T0152.011</a></td>
<td>Wiki Platform</td>
<td>Wikipedia, Fandom, Ruwiki, TV Tropes, and the SCP Foundation are examples of Wiki Platforms.<br><br>Wikis use wiki software to allow platform users to collaboratively create and maintain an encyclopedia of information related to a given topic. </td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0152.012.md">T0152.012</a></td>
<td>Subscription Service Platform</td>
<td>Patreon, Fansly, and OnlyFans are examples of Subscription Service Platforms.<br><br>Subscription Service Platforms enable users with Accounts to host online content which other platform users can subscribe to access. Content typically requires a Paid Subscription to access; however, open content is often also supported.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0153.md">T0153</a></td>
<td>Digital Content Delivery Asset</td>
<td>Digital Content Delivery Assets are assets which support the delivery of content to users online.<br><br>Sub-techniques categorised under Digital Content Delivery Assets can include Community Hosting and Content Hosting capabilities; however, their nominal primary purpose is to support the delivery of content to users online.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0153.001.md">T0153.001</a></td>
<td>Email Platform</td>
<td>Gmail, iCloud mail, and Microsoft Outlook are examples of Email Platforms.<br><br>Email Platforms are online platforms which allow people to create Accounts that they can use to send and receive emails to and from other email accounts.<br><br>Instead of using an Email Platform, actors may set up their own Email Domain, letting them send and receive emails on a custom domain.<br><br>Analysts should default to Email Platform if they cannot confirm whether an email was sent from a privately operated Email Domain or via an account on a public email platform (for example, in situations where analysts are coding third party reporting which does not specify the type of email used).</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0153.002.md">T0153.002</a></td>
<td>Link Shortening Platform</td>
<td>Bitly and TinyURL are examples of Link Shortening Platforms.<br><br>Link Shortening Platforms are online platforms which allow people to create Accounts that they can use to convert existing URLs into Shortened Links, or into QR Codes.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0153.003.md">T0153.003</a></td>
<td>Shortened Link Asset</td>
<td>A Shortened Link is a short custom URL which redirects visitors to a longer destination URL when visited. Shortened Links are typically produced using a Link Shortening Platform. A toy sketch of the slug-to-URL mapping appears after this table.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0153.004.md">T0153.004</a></td>
<td>QR Code Asset</td>
<td>A QR Code is a machine-readable image which encodes data such as a URL. People can scan a QR Code with their smartphone camera to open the encoded URL. A minimal generation sketch appears after this table.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0153.005.md">T0153.005</a></td>
<td>Online Advertising Platform</td>
<td>Google Ads, Facebook Ads, and LinkedIn Marketing Solutions are examples of Online Advertising Platforms.<br><br>Online Advertising Platforms are online platforms which allow people to create Accounts that they can use to upload and deliver adverts to people online.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0153.006.md">T0153.006</a></td>
<td>Content Recommendation Algorithm</td>
<td>Many online platforms have Content Recommendation Algorithms, which promote content posted to the platform to users based on metrics the platform operators are trying to meet. Algorithms typically surface platform content which the user is likely to engage with, based on how they and other users have behaved on the platform. A toy ranking sketch appears after this table.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0153.007.md">T0153.007</a></td>
<td>Direct Messaging</td>
<td>Many online platforms allow users to contact other platform users via Direct Messaging; private messaging which can be initiated by a user with other platform users.<br><br>Examples include messaging on WhatsApp, Telegram, and Signal; direct messages (DMs) on Facebook or Instagram.<br><br>Some platforms’ Direct Messaging capabilities provide users with Encrypted Communication.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0154.md">T0154</a></td>
<td>Digital Content Creation Asset</td>
<td>Digital Content Creation Assets are Platforms or Software which help actors produce content for publication online.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0154.001.md">T0154.001</a></td>
<td>AI LLM Platform</td>
<td>OpenAI’s ChatGPT, Google’s Bard, Microsoft’s Turing-NLG, Google’s T5 (Text-to-Text Transfer Transformer), and Facebook’s BART are examples of AI LLM (Large Language Model) Platforms.<br><br>AI LLM Platforms are online platforms which allow people to create Accounts that they can use to interact with the platform’s AI Large Language Model, to produce text-based content.<br><br>LLMs can create hyper-realistic synthetic text that is both scalable and persuasive. LLMs can largely automate content production, reducing the overhead in persona creation, and generate culturally appropriate outputs that are less prone to exhibiting conspicuous signs of inauthenticity.<br><br>Some platforms implement protections against misuse of AI by their users. Threat Actors have been observed bypassing these protections using prompt injections, poisoning, jailbreaking, or integrity attacks.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0154.002.md">T0154.002</a></td>
<td>AI Media Platform</td>
<td>AI Media Platforms are online platforms that allow people to create Accounts which they can use to produce image, video, or audio content (also known as “deepfakes”) using the platform’s AI Software.<br><br>Midjourney, DALL-E, Stable Diffusion, and Adobe Firefly are examples of AI Media Platforms which allow users to Develop AI-Generated Images, AI-Generated Videos and AI-Generated Account Imagery. <br><br>Similarly, Reface, Zao, FaceApp, and Wombo are mobile apps which offer features for creating AI-Generated videos, gifs, or trending memes.<br><br>AI-Generated Audio such as text-to-speech and voice cloning have revolutionised the creation of synthetic voices that closely mimic human speech. AI Media Platforms such as Descript, Fliki, Murf AI, PlayHT, and Resemble AI can be used to generate synthetic voice. <br><br>Some platforms implement protections against misuse of AI by their users. Threat Actors have been observed bypassing these protections using prompt injections, poisoning, jailbreaking, or integrity attacks.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0155.md">T0155</a></td>
<td>Gated Asset</td>
<td>Some assets are Gated: closed communities or platforms which can’t be accessed openly. They may be password protected or require admin approval for entry. Many different digital assets can be gated. This technique contains sub-techniques with methods used to gate assets. Analysts can use T0155: Gated Asset if the method of gating is unclear.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0155.001.md">T0155.001</a></td>
<td>Password Gated Asset</td>
<td>A Password Gated Asset is an online asset which requires a password to gain access. <br><br>Examples include password protected Servers set up to be a File Hosting Platform, or password protected Community Sub-Forums.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0155.002.md">T0155.002</a></td>
<td>Invite Gated Asset</td>
<td>An Invite Gated Asset is an online asset which requires an existing user to invite other users for access to the asset.<br><br>Examples include Chat Groups in which Administrator Accounts are able to add or remove users, or File Hosting Platforms which allow users to invite other users to access their files.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0155.003.md">T0155.003</a></td>
<td>Approval Gated Asset</td>
<td>An Approval Gated Asset is an online asset which requires approval from Administrator Accounts for access to the asset.<br><br>Examples include Online Community Groups on Facebook, which can be configured to require questions and approval before access, and Accounts on Social Media Platforms such as Instagram, which allow users to set their accounts as visible to approved friends only.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0155.004.md">T0155.004</a></td>
<td>Geoblocked Asset</td>
<td>A Geoblocked Asset is an online asset which cannot be accessed in specific geographical locations.<br><br>Assets can be Geoblocked by choice of the platform, or can have Geoblocking mandated by regulators, and enforced through Internet Service Providers.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0155.005.md">T0155.005</a></td>
<td>Paid Access Asset</td>
<td>A Paid Access Asset is an online asset which requires a single payment for permanent access to the asset.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0155.006.md">T0155.006</a></td>
<td>Subscription Access Asset</td>
<td>A Subscription Access Asset is an online asset which requires a continued subscription for access to the asset.<br><br>Examples include the Blogging Platform Substack, which affords Blogs hosted on their platform the ability to produce subscriber-only posts, and the Subscription Service Platform Patreon.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0155.007.md">T0155.007</a></td>
<td>Encrypted Communication Channel</td>
<td>Some online platforms support encrypted communication between platform users, for example the Chat Platforms Telegram and Signal.</td>
<td>TA07</td>
</tr>
</table>
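
## Illustrative Sketches

The sketches below illustrate some of the mechanics described in the table above. They are minimal, illustrative examples written for this document, not part of the DISARM framework: the domain names, URLs, data records, and thresholds in them are assumptions.

T0149.003: Lookalike Domain names three impersonation methods: typosquatting, combosquatting, and TLD swapping. This sketch shows how each method produces candidate lookalike domains, reusing the netflix.com example from the technique summary (the homoglyph table and keyword list are assumed):

```python
# Candidate lookalike domains via the three methods named in T0149.003.
# TARGET, HOMOGLYPHS, and the keyword/TLD lists are illustrative assumptions.

TARGET = "netflix"                           # second-level domain being impersonated
HOMOGLYPHS = {"e": "3", "i": "1", "o": "0"}  # assumed visually similar character swaps

def typosquats(name: str) -> set[str]:
    """Swap visually similar characters (e.g. netflix -> n3tflix.com)."""
    return {name[:i] + HOMOGLYPHS[c] + name[i + 1:] + ".com"
            for i, c in enumerate(name) if c in HOMOGLYPHS}

def combosquats(name: str) -> set[str]:
    """Append plausible keywords (e.g. netflix -> netflix-billing.com)."""
    return {f"{name}-{kw}.com" for kw in ("billing", "support", "login")}

def tld_swaps(name: str) -> set[str]:
    """Keep the name, change the TLD (e.g. netflix.com -> netflix.top)."""
    return {f"{name}.{tld}" for tld in ("top", "xyz", "info")}

if __name__ == "__main__":
    for domain in sorted(typosquats(TARGET) | combosquats(TARGET) | tld_swaps(TARGET)):
        print(domain)
```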
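
T0149.004: Redirecting Domain describes a Domain configured to send visitors to another Domain. A minimal sketch of that behaviour using only the Python standard library; the destination URL and port are assumptions:

```python
# Minimal redirect server: any request to this host is answered with an
# HTTP 301 pointing at DESTINATION, as described in T0149.004.
from http.server import BaseHTTPRequestHandler, HTTPServer

DESTINATION = "https://example.com"  # assumed redirect target

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(301)  # 301 = moved permanently
        self.send_header("Location", DESTINATION + self.path)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), RedirectHandler).serve_forever()
```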
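
T0149.006: IP Address notes that addresses come in IPv4 dotted decimal and IPv6 colon-separated hexadecimal formats. A short standard-library sketch which parses both (the example addresses are drawn from reserved documentation ranges):

```python
# Parse and classify the two IP address formats named in T0149.006.
import ipaddress

for raw in ("192.0.2.44", "2001:db8::8a2e:370:7334"):
    addr = ipaddress.ip_address(raw)     # accepts IPv4 or IPv6 strings
    print(f"IPv{addr.version}: {addr}")  # .version distinguishes the formats
```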
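
T0150.008: Bulk Created Asset lists indicators such as clustered creation dates and templated naming conventions. A minimal sketch of how an analyst might flag those two indicators; the account records and the threshold of three matches are assumptions:

```python
# Flag two bulk-creation indicators from T0150.008: shared creation dates
# and templated names. Records and thresholds are illustrative assumptions.
from collections import Counter
from datetime import date
import re

accounts = [  # hypothetical observed accounts
    {"name": "maria_real_news_0012", "created": date(2024, 3, 1)},
    {"name": "maria_real_news_0013", "created": date(2024, 3, 1)},
    {"name": "maria_real_news_0014", "created": date(2024, 3, 1)},
    {"name": "jon.hiker.photos",     "created": date(2021, 7, 9)},
]

def template(name: str) -> str:
    """Reduce a name to its template by masking digits."""
    return re.sub(r"\d+", "#", name)

by_date = Counter(a["created"] for a in accounts)
by_template = Counter(template(a["name"]) for a in accounts)

for a in accounts:
    flags = []
    if by_date[a["created"]] >= 3:             # assumed threshold
        flags.append("clustered creation date")
    if by_template[template(a["name"])] >= 3:  # assumed threshold
        flags.append("templated name")
    print(a["name"], "->", flags or ["no bulk indicators"])
```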
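
T0153.003: Shortened Link maps a short slug to a destination URL. A toy in-memory sketch of that mapping; real Link Shortening Platforms add persistence, collision handling, and analytics (the shortener domain is an assumption):

```python
# Toy link shortener: derive a slug from the URL, store the mapping,
# and resolve the slug back to the destination (T0153.003).
import hashlib

SHORT_DOMAIN = "https://sho.rt"  # assumed shortener domain
links: dict[str, str] = {}       # slug -> destination URL

def shorten(url: str) -> str:
    slug = hashlib.sha256(url.encode()).hexdigest()[:7]
    links[slug] = url
    return f"{SHORT_DOMAIN}/{slug}"

def resolve(short_url: str) -> str:
    # A real service would answer with an HTTP redirect to this URL.
    return links[short_url.rsplit("/", 1)[-1]]

short = shorten("https://example.com/some/very/long/path?utm=campaign")
print(short, "->", resolve(short))
```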
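
T0153.004: QR Code encodes a URL as a scannable image. A minimal generation sketch, assuming the third-party qrcode package is installed (`pip install qrcode[pil]`):

```python
# Encode a URL as a QR Code image (T0153.004); scanning the image with a
# smartphone camera opens the encoded URL. Requires the third-party
# "qrcode" package with Pillow support.
import qrcode

img = qrcode.make("https://example.com")  # assumed example URL
img.save("example-qr.png")
```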
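
T0153.006: Content Recommendation Algorithm describes ranking content by predicted engagement, personalised by user behaviour. A toy ranking sketch; the posts, engagement signals, affinities, and weights are all assumptions:

```python
# Toy feed ranking in the spirit of T0153.006: blend global engagement
# signals with a user's inferred topic affinities, then surface top posts.
# All data and weights below are illustrative assumptions.

posts = [
    {"id": 1, "topic": "cats",     "likes": 120, "shares": 30},
    {"id": 2, "topic": "politics", "likes": 400, "shares": 250},
    {"id": 3, "topic": "cooking",  "likes": 80,  "shares": 5},
]
user_topic_affinity = {"cats": 0.9, "politics": 0.2, "cooking": 0.6}

def score(post: dict) -> float:
    engagement = post["likes"] + 3 * post["shares"]  # assumed share weighting
    return engagement * user_topic_affinity.get(post["topic"], 0.1)

feed = sorted(posts, key=score, reverse=True)
print([p["id"] for p in feed])  # order in which content would be surfaced
```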