# DISARM Techniques:
<table border="1">
<tr>
<th>disarm_id</th>
<th>name</th>
<th>summary</th>
<th>tactic_id</th>
</tr>
<tr>
<td><a href="techniques/T0002.md">T0002</a></td>
<td>Facilitate State Propaganda</td>
<td>Organize citizens around pro-state messaging. Coordinate paid or volunteer groups to push state propaganda.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0003.md">T0003</a></td>
<td>Leverage Existing Narratives</td>
<td>Use or adapt existing narrative themes, where narratives are the baseline stories of a target audience. Narratives form the bedrock of our worldviews. New information is understood through a process firmly grounded in this bedrock. If new information is not consistent with the prevailing narratives of an audience, it will be ignored. Effective campaigns will frame their misinformation in the context of these narratives. Highly effective campaigns will make extensive use of audience-appropriate archetypes and meta-narratives throughout their content creation and amplification practices.</td>
<td>TA14</td>
</tr>
<tr>
<td><a href="techniques/T0004.md">T0004</a></td>
<td>Develop Competing Narratives</td>
<td>Advance competing narratives connected to the same issue, e.g. denying an incident while at the same time dismissing it. Suppressing or discouraging narratives already spreading requires an alternative. The simplest set of narrative techniques in response would be the construction and promotion of contradictory alternatives centered on denial, deflection, dismissal, counter-charges, excessive standards of proof, bias in prohibition or enforcement, and so on. These competing narratives allow loyalists cover, but are less compelling to opponents and fence-sitters than campaigns built around existing narratives or highly explanatory master narratives. Competing narratives, as such, are especially useful in the "firehose of misinformation" approach.</td>
<td>TA14</td>
</tr>
<tr>
<td><a href="techniques/T0007.md">T0007</a></td>
<td>Create Inauthentic Social Media Pages and Groups</td>
<td>Create key social engineering assets needed to amplify content, manipulate algorithms, and fool the public and/or specific incident/campaign targets. Computational propaganda depends substantially on false perceptions of credibility and acceptance. By creating fake users and groups with a variety of interests and commitments, attackers can ensure that their messages both come from trusted sources and appear more widely adopted than they actually are.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0009.md">T0009</a></td>
<td>Create fake experts</td>
<td>Stories planted or promoted in computational propaganda operations often make use of experts fabricated from whole cloth, sometimes specifically for the story itself.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0010.md">T0010</a></td>
<td>Cultivate ignorant agents</td>
<td>Cultivate propagandists for a cause, the goals of which are not fully comprehended, and who are used cynically by the leaders of the cause. Independent actors use social media and specialised web sites to strategically reinforce and spread messages compatible with their own. Their networks are infiltrated and used by state media disinformation organisations to amplify the state’s own disinformation strategies against target populations. Many are traffickers in conspiracy theories or hoaxes, unified by a suspicion of Western governments and mainstream media. Their narratives, which appeal to leftists hostile to globalism and military intervention and nationalists against immigration, are frequently infiltrated and shaped by state-controlled trolls and altered news items from agencies such as RT and Sputnik. Also known as "useful idiots" or "unwitting agents".</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0011.md">T0011</a></td>
<td>Compromise legitimate accounts</td>
<td>Hack or take over legitimate accounts to distribute misinformation or damaging content.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0013.md">T0013</a></td>
<td>Create inauthentic websites</td>
<td>Create media assets to support inauthentic organizations (e.g. think tank), people (e.g. experts) and/or serve as sites to distribute malware/launch phishing operations.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0014.md">T0014</a></td>
<td>Prepare fundraising campaigns</td>
<td>Fundraising campaigns refer to an influence operation’s systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while raising a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns (see: Develop Information Pathways) to promote operation messaging while raising money to support its activities.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0015.md">T0015</a></td>
<td>Create hashtags and search artifacts</td>
<td>Create one or more hashtags and/or hashtag groups. Many incident-based campaigns will create hashtags to promote their fabricated event. Creating a hashtag for an incident can have two important effects: 1. Create a perception of reality around an event. Certainly only "real" events would be discussed in a hashtag. After all, the event has a name! 2. Publicize the story more widely through trending lists and search behavior. An asset is needed to direct/control/manage the "conversation" connected to launching a new incident/campaign with a new hashtag for applicable social media sites.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0016.md">T0016</a></td>
<td>Create Clickbait</td>
<td>Create attention-grabbing headlines (outrage, doubt, humor) required to drive traffic & engagement. This is a key asset.</td>
<td>TA05</td>
</tr>
<tr>
<td><a href="techniques/T0017.md">T0017</a></td>
<td>Conduct fundraising</td>
<td>Fundraising campaigns refer to an influence operation’s systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while raising a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns to promote operation messaging while raising money to support its activities.</td>
<td>TA10</td>
</tr>
<tr>
<td><a href="techniques/T0018.md">T0018</a></td>
<td>Purchase Targeted Advertisements</td>
<td>Create or fund advertisements targeted at specific populations.</td>
<td>TA05</td>
</tr>
<tr>
<td><a href="techniques/T0019.md">T0019</a></td>
<td>Generate information pollution</td>
<td>Flood social channels; drive traffic/engagement to all assets; create aura/sense/perception of pervasiveness/consensus (for or against or both simultaneously) of an issue or topic. "Nothing is true, but everything is possible." Akin to astroturfing campaign.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0020.md">T0020</a></td>
<td>Trial content</td>
<td>Iteratively test incident performance (messages, content, etc.), e.g. A/B test headline/content engagement metrics; website and/or funding campaign conversion rates (a minimal evaluation sketch appears after this table).</td>
<td>TA08</td>
</tr>
<tr>
<td><a href="techniques/T0022.md">T0022</a></td>
<td>Leverage Conspiracy Theory Narratives</td>
<td>"Conspiracy narratives" appeal to the human desire for explanatory order, by invoking the participation of powerful (often sinister) actors in pursuit of their own political goals. These narratives are especially appealing when an audience is low-information, marginalized or otherwise inclined to reject the prevailing explanation. Conspiracy narratives are an important component of the "firehose of falsehoods" model.</td>
<td>TA14</td>
</tr>
<tr>
<td><a href="techniques/T0023.md">T0023</a></td>
<td>Distort facts</td>
<td>Change, twist, or exaggerate existing facts to construct a narrative that differs from reality. For example, images and ideas can be distorted by being placed in an improper context.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0029.md">T0029</a></td>
<td>Online polls</td>
<td>Create fake online polls, or manipulate existing online polls. Data gathering tactic to target those who engage, and potentially their networks of friends/followers as well.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0039.md">T0039</a></td>
<td>Bait legitimate influencers</td>
<td>Credibility in a social media environment is often a function of the size of a user's network. "Influencers" are so-called because of their reach, typically understood as: 1) the size of their network (i.e. the number of followers, perhaps weighted by their own influence); and 2) the rate at which their comments are re-circulated (these two metrics are related). Add traditional media players at all levels of credibility and professionalism to this, and the number of potential influential carriers available for unwitting amplification becomes substantial. By targeting high-influence people and organizations in all types of media with narratives and content engineered to appeal to their emotional or ideological drivers, influence campaigns are able to add perceived credibility to their messaging via saturation and adoption by trusted agents such as celebrities, journalists and local leaders.</td>
<td>TA08</td>
</tr>
<tr>
<td><a href="techniques/T0040.md">T0040</a></td>
<td>Demand insurmountable proof</td>
<td>Campaigns often leverage tactical and informational asymmetries on the threat surface, as seen in the Distort and Deny strategies, and the "firehose of misinformation". Specifically, conspiracy theorists can be repeatedly wrong, but advocates of the truth need to be perfect. By constantly escalating demands for proof, propagandists can effectively leverage this asymmetry while also priming its future use, often with an even greater asymmetric advantage. The conspiracist is offered freer rein for a broader range of "questions" while the truth teller is burdened with higher and higher standards of proof.</td>
<td>TA14</td>
</tr>
<tr>
<td><a href="techniques/T0042.md">T0042</a></td>
<td>Seed Kernel of truth</td>
<td>Wrap lies or altered context/facts around truths. Influence campaigns pursue a variety of objectives with respect to target audiences, prominent among them: 1. undermine a narrative commonly referenced in the target audience; or 2. promote a narrative less common in the target audience, but preferred by the attacker. In both cases, the attacker is presented with a heavy lift. They must change the relative importance of various narratives in the interpretation of events, despite contrary tendencies. When messaging makes use of factual reporting to promote these adjustments in the narrative space, they are less likely to be dismissed out of hand; when messaging can juxtapose a (factual) truth about current affairs with the (abstract) truth explicated in these narratives, propagandists can undermine or promote them selectively. Context matters.</td>
<td>TA08</td>
</tr>
<tr>
<td><a href="techniques/T0043.md">T0043</a></td>
<td>Chat apps</td>
<td>Direct messaging via chat apps is an increasingly common method of delivery. These messages are often automated, and new delivery and storage methods make them anonymous, viral, and ephemeral. This is a difficult space to monitor, but also a difficult space to build acclaim or notoriety.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0044.md">T0044</a></td>
<td>Seed distortions</td>
<td>Try a wide variety of messages in the early hours surrounding an incident or event, to give a misleading account or impression.</td>
<td>TA08</td>
</tr>
<tr>
<td><a href="techniques/T0045.md">T0045</a></td>
<td>Use fake experts</td>
<td>Use the fake experts that were set up during Establish Legitimacy. Pseudo-experts are disposable assets that often appear once and then disappear. They give "credibility" to misinformation and take advantage of credential bias.</td>
<td>TA08</td>
</tr>
<tr>
<td><a href="techniques/T0046.md">T0046</a></td>
<td>Use Search Engine Optimization</td>
<td>Manipulate content engagement metrics (e.g. Reddit & Twitter) to influence/impact news search results (e.g. Google); this also elevates RT & Sputnik headlines into Google news alert emails. Also known as "black-hat SEO".</td>
<td>TA08</td>
</tr>
<tr>
<td><a href="techniques/T0047.md">T0047</a></td>
<td>Censor social media as a political force</td>
<td>Use political influence or the power of the state to stop critical social media comments. Government-requested/driven content takedowns (see Google Transparency reports).</td>
<td>TA18</td>
</tr>
<tr>
<td><a href="techniques/T0048.md">T0048</a></td>
<td>Harass</td>
<td>Threatening or harassing believers of opposing narratives refers to the use of intimidation techniques, including cyberbullying and doxing, to discourage opponents from voicing their dissent. An influence operation may threaten or harass believers of the opposing narratives to deter individuals from posting or proliferating conflicting content.</td>
<td>TA18</td>
</tr>
<tr>
<td><a href="techniques/T0049.md">T0049</a></td>
<td>Flooding the Information Space</td>
<td>Flooding and/or mobbing social media channels, feeds, and/or hashtags with an excessive volume of content to control/shape online conversations and/or drown out opposing points of view. Bots and/or patriotic trolls are effective tools to achieve this effect.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0057.md">T0057</a></td>
<td>Organize Events</td>
<td>Coordinate and promote real-world events across media platforms, e.g. rallies, protests, gatherings in support of incident narratives.</td>
<td>TA10</td>
</tr>
<tr>
<td><a href="techniques/T0059.md">T0059</a></td>
<td>Play the long game</td>
<td>Play the long game refers to two phenomena: 1. To plan messaging and allow it to grow organically without conducting your own amplification. This is methodical and slow and requires years for the message to take hold. 2. To develop a series of seemingly disconnected messaging narratives that eventually combine into a new narrative.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0060.md">T0060</a></td>
<td>Continue to Amplify</td>
<td>Continue narrative or message amplification after the main incident work has finished.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0061.md">T0061</a></td>
<td>Sell Merchandise</td>
<td>Sell merchandise refers to getting the message or narrative into physical space in the offline world while making money.</td>
<td>TA10</td>
</tr>
<tr>
<td><a href="techniques/T0065.md">T0065</a></td>
<td>Prepare Physical Broadcast Capabilities</td>
<td>Create or co-opt broadcast capabilities (e.g. TV, radio, etc.).</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0066.md">T0066</a></td>
<td>Degrade Adversary</td>
<td>Plan to degrade an adversary’s image or ability to act. This could include preparation and use of harmful information about the adversary’s actions or reputation.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0068.md">T0068</a></td>
<td>Respond to Breaking News Event or Active Crisis</td>
<td>Media attention on a story or event is heightened during a breaking news event, where unclear facts and incomplete information increase speculation, rumors, and conspiracy theories, which are all vulnerable to manipulation.</td>
<td>TA14</td>
</tr>
<tr>
<td><a href="techniques/T0072.md">T0072</a></td>
<td>Segment Audiences</td>
<td>Create audience segmentations by features of interest to the influence campaign, including political affiliation, geographic location, income, demographics, and psychographics.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0073.md">T0073</a></td>
<td>Determine Target Audiences</td>
<td>tbd</td>
<td>TA01</td>
</tr>
<tr>
<td><a href="techniques/T0074.md">T0074</a></td>
<td>Determine Strategic Ends</td>
<td>tbd</td>
<td>TA01</td>
</tr>
<tr>
<td><a href="techniques/T0075.md">T0075</a></td>
<td>Dismiss</td>
<td>Push back against criticism by dismissing your critics. This might be arguing that the critics apply a different standard to you than to other actors or to themselves, or arguing that their criticism is biased.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0076.md">T0076</a></td>
<td>Distort</td>
<td>Twist the narrative. Take information, or artifacts like images, and change the framing around them.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0077.md">T0077</a></td>
<td>Distract</td>
<td>Shift attention to a different narrative or actor, for instance by accusing critics of the same activity that they’ve accused you of (e.g. police brutality).</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0078.md">T0078</a></td>
<td>Dismay</td>
<td>Threaten the critic or narrator of events. For instance, threaten journalists or news outlets reporting on a story.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0079.md">T0079</a></td>
<td>Divide</td>
<td>Create conflict between subgroups to widen divisions in a community.</td>
<td>TA02</td>
</tr>
<tr>
<td><a href="techniques/T0080.md">T0080</a></td>
<td>Map Target Audience Information Environment</td>
<td>Mapping the target audience information environment analyzes the information space itself, including social media analytics, web traffic, and media surveys. Mapping the information environment may help the influence operation determine the most realistic and popular information channels to reach its target audience.

Mapping the target audience information environment aids influence operations in determining the most vulnerable areas of the information space to target with messaging.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0081.md">T0081</a></td>
<td>Identify Social and Technical Vulnerabilities</td>
<td>Identifying social and technical vulnerabilities determines weaknesses within the target audience information environment for later exploitation. Vulnerabilities include decisive political issues, weak cybersecurity infrastructure, search engine data voids, and other technical and non-technical weaknesses in the target information environment.

Identifying social and technical vulnerabilities facilitates the later exploitation of the identified weaknesses to advance operation objectives.</td>
<td>TA13</td>
</tr>
<tr>
<td><a href="techniques/T0082.md">T0082</a></td>
<td>Develop New Narratives</td>
<td>tbd</td>
<td>TA14</td>
</tr>
<tr>
<td><a href="techniques/T0083.md">T0083</a></td>
<td>Integrate Target Audience Vulnerabilities into Narrative</td>
<td>An influence operation may seek to exploit the preexisting weaknesses, fears, and enemies of the target audience for integration into the operation’s narratives and overall strategy. Integrating existing vulnerabilities into the operational approach conserves resources by exploiting already weak areas of the target information environment instead of forcing the operation to create new vulnerabilities in the environment.</td>
<td>TA14</td>
</tr>
<tr>
<td><a href="techniques/T0084.md">T0084</a></td>
<td>Reuse Existing Content</td>
<td>Reusing existing content refers to when an operation recycles content from its own previous operations or plagiarizes from external operations. An operation may launder information to conserve resources that would have otherwise been utilized to develop new content.</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0085.md">T0085</a></td>
<td>Develop Text-based Content</td>
<td>tbd</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0086.md">T0086</a></td>
<td>Develop Image-based Content</td>
<td>tbd</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0087.md">T0087</a></td>
<td>Develop Video-based Content</td>
<td>tbd</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0088.md">T0088</a></td>
<td>Develop Audio-based Content</td>
<td>tbd</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0089.md">T0089</a></td>
<td>Obtain Private Documents</td>
<td>tbd</td>
<td>TA06</td>
</tr>
<tr>
<td><a href="techniques/T0090.md">T0090</a></td>
<td>Create Inauthentic Accounts</td>
<td>tbd</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0091.md">T0091</a></td>
<td>Recruit bad actors</td>
<td>tbd</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0092.md">T0092</a></td>
<td>Build Network</td>
<td>tbd</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0093.md">T0093</a></td>
<td>Acquire/ recruit Network</td>
<td>tbd</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0094.md">T0094</a></td>
<td>Infiltrate Existing Networks</td>
<td>tbd</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0095.md">T0095</a></td>
<td>Develop Owned Media Assets</td>
<td>An owned media asset refers to an agency or organization through which an influence operation may create, develop, and host content and narratives. Owned media assets include websites, blogs, social media pages, forums, and other platforms that facilitate the creation and organization of content.</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0096.md">T0096</a></td>
<td>Leverage Content Farm</td>
<td>tbd</td>
<td>TA15</td>
</tr>
<tr>
<td><a href="techniques/T0097.md">T0097</a></td>
<td>Create personas</td>
<td>tbd</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0098.md">T0098</a></td>
<td>Establish Inauthentic News Sites</td>
<td>Modern computational propaganda makes use of a cadre of imposter news sites spreading globally. These sites, sometimes motivated by concerns other than propaganda--for instance, click-based revenue--often have some superficial markers of authenticity, such as naming and site design. But many can be quickly exposed with reference to their ownership, reporting history and advertising details.</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0099.md">T0099</a></td>
<td>Prepare Assets Impersonating Legitimate Entities</td>
<td>An influence operation may prepare assets impersonating legitimate entities to further conceal its network identity and add a layer of legitimacy to its operation content. Users will more likely believe and less likely fact-check news from recognizable sources rather than unknown sites. Legitimate entities may include authentic news outlets, public figures, organizations, or state entities.

An influence operation may use a wide variety of cyber techniques to impersonate a legitimate entity’s website or social media account. Typosquatting is the deliberate registration of a domain name with purposeful variations of the impersonated domain name through intentional typos, top-level domain (TLD) manipulation, or punycode. Typosquatting facilitates the creation of falsified websites by creating similar domain names in the URL box, leaving it to the user to confirm that the URL is correct (an illustrative domain-variant sketch appears after this table).</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0100.md">T0100</a></td>
<td>Co-opt Trusted Sources</td>
<td>An influence operation may co-opt trusted sources by infiltrating or repurposing a source to reach a target audience through existing, previously reliable networks. Co-opted trusted sources may include:

- National or local news outlets
- Research or academic publications
- Online blogs or websites</td>
<td>TA16</td>
</tr>
<tr>
<td><a href="techniques/T0101.md">T0101</a></td>
<td>Create Localized Content</td>
<td>Localized content refers to content that appeals to a specific community of individuals, often in defined geographic areas. An operation may create localized content using local language and dialects to resonate with its target audience and blend in with other local news and social media. Localized content may help an operation increase legitimacy, avoid detection, and complicate external attribution.</td>
<td>TA05</td>
</tr>
<tr>
<td><a href="techniques/T0102.md">T0102</a></td>
<td>Leverage Echo Chambers/Filter Bubbles</td>
<td>An echo chamber refers to an internet subgroup, often along ideological lines, where individuals only engage with “others with which they are already in agreement.” A filter bubble refers to an algorithm's placement of an individual in content that they agree with or regularly engage with, possibly entrapping the user into a bubble of their own making. An operation may create these isolated areas of the internet by matching existing groups, or by aggregating individuals into a single target audience based on shared interests, politics, values, demographics, and other characteristics. Echo chambers and filter bubbles help to reinforce similar biases and content to the same target audience members.</td>
<td>TA05</td>
</tr>
<tr>
<td><a href="techniques/T0103.md">T0103</a></td>
<td>Livestream</td>
<td>tbd</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0104.md">T0104</a></td>
<td>Social Networks</td>
<td>tbd</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0105.md">T0105</a></td>
<td>Media Sharing Networks</td>
<td>Media sharing networks refer to services whose primary function is the hosting and sharing of specific forms of media. Examples include Instagram, Snapchat, TikTok, Youtube, SoundCloud.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0106.md">T0106</a></td>
<td>Discussion Forums</td>
<td>Platforms for finding, discussing, and sharing information and opinions. Examples include Reddit, Quora, Digg, message boards, interest-based discussion forums, etc.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0107.md">T0107</a></td>
<td>Bookmarking and Content Curation</td>
<td>Platforms for searching, sharing, and curating content and media. Examples include Pinterest, Flipboard, etc.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0108.md">T0108</a></td>
<td>Blogging and Publishing Networks</td>
<td>Examples include WordPress, Blogger, Weebly, Tumblr, Medium, etc.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0109.md">T0109</a></td>
<td>Consumer Review Networks</td>
<td>Platforms for finding, reviewing, and sharing information about brands, products, services, restaurants, travel destinations, etc. Examples include Yelp, TripAdvisor, etc.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0110.md">T0110</a></td>
<td>Formal Diplomatic Channels</td>
<td>tbd</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0111.md">T0111</a></td>
<td>Traditional Media</td>
<td>Examples include TV, Newspaper, Radio, etc.</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0112.md">T0112</a></td>
<td>Email</td>
<td>tbd</td>
<td>TA07</td>
</tr>
<tr>
<td><a href="techniques/T0113.md">T0113</a></td>
<td>Employ Commercial Analytic Firms</td>
<td>Commercial analytic firms collect data on target audience activities and evaluate the data to detect trends, such as content receiving high click-rates. An influence operation may employ commercial analytic firms to facilitate external collection on its target audience, complicating attribution efforts and better tailoring the content to audience preferences.</td>
<td>TA08</td>
</tr>
<tr>
<td><a href="techniques/T0114.md">T0114</a></td>
<td>Deliver Ads</td>
<td>Delivering content via any form of paid media or advertising.</td>
<td>TA09</td>
</tr>
<tr>
<td><a href="techniques/T0115.md">T0115</a></td>
<td>Post Content</td>
<td>Delivering content by posting via owned media (assets that the operator controls).</td>
<td>TA09</td>
</tr>
<tr>
<td><a href="techniques/T0116.md">T0116</a></td>
<td>Comment or Reply on Content</td>
<td>Delivering content by replying or commenting via owned media (assets that the operator controls).</td>
<td>TA09</td>
</tr>
<tr>
<td><a href="techniques/T0117.md">T0117</a></td>
<td>Attract Traditional Media</td>
<td>Deliver content by attracting the attention of traditional media (earned media).</td>
<td>TA09</td>
</tr>
<tr>
<td><a href="techniques/T0118.md">T0118</a></td>
<td>Amplify Existing Narrative</td>
<td>An influence operation may amplify existing narratives that align with its narratives to support operation objectives.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0119.md">T0119</a></td>
<td>Cross-Posting</td>
<td>Cross-posting refers to posting the same message to multiple internet discussions, social media platforms or accounts, or news groups at one time. An influence operation may post content online in multiple communities and platforms to increase the chances of content exposure to the target audience.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0120.md">T0120</a></td>
<td>Incentivize Sharing</td>
<td>Incentivizing content sharing refers to actions that encourage users to share content themselves, reducing the need for the operation itself to post and promote its own content.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0121.md">T0121</a></td>
<td>Manipulate Platform Algorithm</td>
<td>Manipulating a platform algorithm refers to conducting activity on a platform in a way that intentionally targets its underlying algorithm. After analyzing a platform’s algorithm (see: Select Platforms), an influence operation may use a platform in a way that increases its content exposure, avoids content removal, or otherwise benefits the operation’s strategy. For example, an influence operation may use bots to amplify its posts so that the platform’s algorithm recognizes engagement with operation content and further promotes the content on user timelines.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0122.md">T0122</a></td>
<td>Direct Users to Alternative Platforms</td>
<td>Direct users to alternative platforms refers to encouraging users to move from the platform on which they initially viewed operation content and engage with content on alternate information channels, including separate social media channels and inauthentic websites. An operation may drive users to alternative platforms to diversify its information channels and ensure the target audience knows where to access operation content if the initial platform suspends, flags, or otherwise removes original operation assets and content.</td>
<td>TA17</td>
</tr>
<tr>
<td><a href="techniques/T0123.md">T0123</a></td>
<td>Control Information Environment through Offensive Cyberspace Operations</td>
<td>Controlling the information environment through offensive cyberspace operations uses cyber tools and techniques to alter the trajectory of content in the information space to either prioritize operation messaging or block opposition messaging.</td>
<td>TA18</td>
</tr>
<tr>
<td><a href="techniques/T0124.md">T0124</a></td>
<td>Suppress Opposition</td>
<td>tbd</td>
<td>TA18</td>
</tr>
<tr>
<td><a href="techniques/T0125.md">T0125</a></td>
<td>Platform Filtering</td>
<td>Platform filtering refers to the decontextualization of information as claims cross platforms (from Joan Donovan https://www.hks.harvard.edu/publications/disinformation-design-use-evidence-collages-and-platform-filtering-media-manipulation)</td>
<td>TA18</td>
</tr>
<tr>
<td><a href="techniques/T0126.md">T0126</a></td>
<td>Encourage Attendance at Events</td>
<td>Operation encourages attendance at existing real-world events.</td>
<td>TA10</td>
</tr>
<tr>
<td><a href="techniques/T0127.md">T0127</a></td>
<td>Physical Violence</td>
<td>Physical violence refers to the use of force to injure, abuse, damage, or destroy. An influence operation may conduct or encourage physical violence to discourage opponents from promoting conflicting content or draw attention to operation narratives using shock value.</td>
<td>TA10</td>
</tr>
<tr>
<td><a href="techniques/T0128.md">T0128</a></td>
<td>Conceal People</td>
<td>Conceal the identity or provenance of the account and people assets.</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0129.md">T0129</a></td>
<td>Conceal Operational Activity</td>
<td>tbd</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0130.md">T0130</a></td>
<td>Conceal Infrastructure</td>
<td>tbd</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0131.md">T0131</a></td>
<td>Exploit TOS/Content Moderation</td>
<td>tbd</td>
<td>TA11</td>
</tr>
<tr>
<td><a href="techniques/T0132.md">T0132</a></td>
<td>Measure Performance</td>
<td>tbd</td>
<td>TA12</td>
</tr>
<tr>
<td><a href="techniques/T0133.md">T0133</a></td>
<td>Measure Effectiveness</td>
<td>tbd</td>
<td>TA12</td>
</tr>
<tr>
<td><a href="techniques/T0134.md">T0134</a></td>
<td>Measure Effectiveness Indicators (or KPIs)</td>
<td>tbd</td>
<td>TA12</td>
</tr>
</table>
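
## Illustrative sketches (not part of the framework)

T0020 (Trial content) describes iteratively A/B testing headlines and content against engagement and conversion metrics. As a minimal sketch of what evaluating such a trial involves — illustrative only, not anything prescribed by DISARM — the snippet below compares the click-through rates of two headline variants with a standard two-proportion z-test. The function name and the counts are hypothetical, made up for this example.

```python
from math import erf, sqrt

def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
    """Two-proportion z-test: does variant B's click-through rate differ
    from variant A's? Returns (z statistic, two-sided p-value).
    Illustrative sketch only."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

if __name__ == "__main__":
    # Hypothetical engagement numbers for two headline variants.
    z, p = two_proportion_z(clicks_a=120, views_a=4000, clicks_b=165, views_b=4000)
    print(f"variant B vs A: z = {z:.2f}, p = {p:.4f}")
```

A small p-value only says the two variants differ in click-through rate; selecting and iterating on the "winning" content is the behaviour T0020 describes.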
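T0099 (Prepare Assets Impersonating Legitimate Entities) mentions typosquatting: registering domains that vary an impersonated domain through deliberate typos or TLD swaps. The sketch below is likewise illustrative and not part of the framework; the `generate_variants` helper is a hypothetical name. It enumerates the kind of single-character and TLD variants of a domain that defenders might generate when hunting for impersonation sites.

```python
import string

def generate_variants(domain: str, tlds=("com", "net", "org", "info")):
    """Enumerate simple typosquatting-style variants of a domain:
    single-character substitutions, adjacent-character swaps, and
    top-level-domain (TLD) swaps. Illustrative sketch only."""
    name, _, tld = domain.rpartition(".")
    variants = set()

    # Single-character substitutions (e.g. examp1e.com for example.com).
    for i, ch in enumerate(name):
        for sub in string.ascii_lowercase + string.digits:
            if sub != ch:
                variants.add(f"{name[:i]}{sub}{name[i + 1:]}.{tld}")

    # Adjacent-character swaps (e.g. exmaple.com).
    for i in range(len(name) - 1):
        swapped = name[:i] + name[i + 1] + name[i] + name[i + 2:]
        variants.add(f"{swapped}.{tld}")

    # Top-level-domain manipulation (e.g. example.org for example.com).
    for alt in tlds:
        if alt != tld:
            variants.add(f"{name}.{alt}")

    variants.discard(domain)
    return sorted(variants)

if __name__ == "__main__":
    for v in generate_variants("example.com")[:10]:
        print(v)
```

Punycode and homoglyph substitutions, also mentioned under T0099, extend the same idea but are omitted here for brevity.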