DISARM Techniques:
disarm_id | name | summary | tactic_id |
---|---|---|---|
T0001 | Determine Target Audiences | | TA01 |
T0002 | Determine Strategic Ends | | TA01 |
T0003 | Dismiss | Push back against criticism by dismissing your critics. This might mean arguing that the critics apply a different standard to you than to other actors or to themselves, or arguing that their criticism is biased. | TA02 |
T0004 | Distort | Twist the narrative. Take information, or artifacts like images, and change the framing around them. | TA02 |
T0005 | Distract | Shift attention to a different narrative or actor, for instance by accusing critics of the same activity that they’ve accused you of (e.g. police brutality). | TA02 |
T0006 | Dismay | Threaten the critic or narrator of events. For instance, threaten journalists or news outlets reporting on a story. | TA02 |
T0007 | Divide | Create conflict between subgroups, to widen divisions in a community | TA02 |
T0008 | Degrade Adversary | Plan to degrade an adversary’s image or ability to act. This could include preparation and use of harmful information about the adversary’s actions or reputation. | TA02 |
T0009 | Facilitate State Propaganda | Organize citizens around pro-state messaging. Coordinate paid or volunteer groups to push state propaganda. | TA02 |
T0010 | Map Target Audience Information Environment | Mapping the target audience information environment analyzes the information space itself, including social media analytics, web traffic, and media surveys. Mapping the information environment may help the influence operation determine the most realistic and popular information channels to reach its target audience. Mapping the target audience information environment aids influence operations in determining the most vulnerable areas of the information space to target with messaging. | TA03 |
T0011 | Identify Social and Technical Vulnerabilities | Identifying social and technical vulnerabilities determines weaknesses within the target audience information environment for later exploitation. Vulnerabilities include decisive political issues, weak cybersecurity infrastructure, search engine data voids, and other technical and non-technical weaknesses in the target information environment. Identifying social and technical vulnerabilities facilitates the later exploitation of the identified weaknesses to advance operation objectives. | TA03 |
T0012 | Segment Audiences | Create audience segmentations by features of interest to the influence campaign, including political affiliation, geographic location, income, demographics, and psychographics. | TA03 |
T0013 | Develop New Narratives | | TA04 |
T0014 | Leverage Existing Narratives | Use or adapt existing narrative themes, where narratives are the baseline stories of a target audience. Narratives form the bedrock of our worldviews. New information is understood through a process firmly grounded in this bedrock. If new information is not consistent with the prevailing narratives of an audience, it will be ignored. Effective campaigns will frame their misinformation in the context of these narratives. Highly effective campaigns will make extensive use of audience-appropriate archetypes and meta-narratives throughout their content creation and amplification practices. | TA04 |
T0015 | Develop Competing Narratives | Advance competing narratives connected to the same issue, e.g. denying an incident while at the same time dismissing it. Suppressing or discouraging narratives already spreading requires an alternative. The simplest set of narrative techniques in response would be the construction and promotion of contradictory alternatives centered on denial, deflection, dismissal, counter-charges, excessive standards of proof, bias in prohibition or enforcement, and so on. These competing narratives allow loyalists cover, but are less compelling to opponents and fence-sitters than campaigns built around existing narratives or highly explanatory master narratives. Competing narratives, as such, are especially useful in the "firehose of misinformation" approach. | TA04 |
T0016 | Leverage Conspiracy Theory Narratives | "Conspiracy narratives" appeal to the human desire for explanatory order by invoking the participation of powerful (often sinister) actors in pursuit of their own political goals. These narratives are especially appealing when an audience is low-information, marginalized or otherwise inclined to reject the prevailing explanation. Conspiracy narratives are an important component of the "firehose of falsehoods" model. | TA04 |
T0017 | Integrate Target Audience Vulnerabilities into Narrative | An influence operation may seek to exploit the preexisting weaknesses, fears, and enemies of the target audience for integration into the operation’s narratives and overall strategy. Integrating existing vulnerabilities into the operational approach conserves resources by exploiting already weak areas of the target information environment instead of forcing the operation to create new vulnerabilities in the environment. | TA04 |
T0018 | Respond to Breaking News Event or Active Crisis | Media attention on a story or event is heightened during a breaking news event, where unclear facts and incomplete information increase speculation, rumors, and conspiracy theories, which are all vulnerable to manipulation. | TA04 |
T0019 | Demand insurmountable proof | Campaigns often leverage tactical and informational asymmetries on the threat surface, as seen in the Distort and Deny strategies, and the "firehose of misinformation". Specifically, conspiracy theorists can be repeatedly wrong, but advocates of the truth need to be perfect. By constantly escalating demands for proof, propagandists can effectively leverage this asymmetry while also priming its future use, often with an even greater asymmetric advantage. The conspiracist is offered freer rein for a broader range of "questions" while the truth teller is burdened with higher and higher standards of proof. | TA04 |
T0020 | Reuse Existing Content | An operation may recycle content from its own previous operations or plagiarize from external operations. An operation may launder information to conserve resources that would have otherwise been utilized to develop new content. | TA05 |
T0021 | Develop Text-based Content | | TA05 |
T0022 | Develop Image-based Content | | TA05 |
T0023 | Develop Video-based Content | | TA05 |
T0024 | Develop Audio-based Content | | TA05 |
T0025 | Generate information pollution | Flood social channels; drive traffic/engagement to all assets; create aura/sense/perception of pervasiveness/consensus (for or against or both simultaneously) of an issue or topic. "Nothing is true, but everything is possible." Akin to astroturfing campaign. | TA05 |
T0026 | Distort facts | Change, twist, or exaggerate existing facts to construct a narrative that differs from reality. Examples: images and ideas can be distorted by being placed in an improper context. | TA05 |
T0027 | Create hashtags and search artifacts | Create one or more hashtags and/or hashtag groups. Many incident-based campaigns will create hashtags to promote their fabricated event. Creating a hashtag for an incident can have two important effects: 1. Create a perception of reality around an event. Certainly only "real" events would be discussed in a hashtag; after all, the event has a name! 2. Publicize the story more widely through trending lists and search behavior. An asset is needed to direct/control/manage the "conversation" connected to launching a new incident/campaign with a new hashtag on applicable social media sites. | TA05 |
T0028 | Obtain Private Documents | | TA05 |
T0029 | Create Inauthentic Accounts | | TA06 |
T0030 | Recruit bad actors | | TA06 |
T0031 | Cultivate ignorant agents | Cultivate propagandists for a cause, the goals of which are not fully comprehended, and who are used cynically by the leaders of the cause. Independent actors use social media and specialised web sites to strategically reinforce and spread messages compatible with their own. Their networks are infiltrated and used by state media disinformation organisations to amplify the state’s own disinformation strategies against target populations. Many are traffickers in conspiracy theories or hoaxes, unified by a suspicion of Western governments and mainstream media. Their narratives, which appeal to leftists hostile to globalism and military intervention and nationalists against immigration, are frequently infiltrated and shaped by state-controlled trolls and altered news items from agencies such as RT and Sputnik. Also known as "useful idiots" or "unwitting agents". | TA06 |
T0032 | Build Network | | TA06 |
T0033 | Acquire/ recruit Network | | TA06 |
T0034 | Infiltrate Existing Networks | | TA06 |
T0035 | Create Inauthentic Social Media Pages and Groups | Create key social engineering assets needed to amplify content, manipulate algorithms, fool public and/or specific incident/campaign targets. Computational propaganda depends substantially on false perceptions of credibility and acceptance. By creating fake users and groups with a variety of interests and commitments, attackers can ensure that their messages both come from trusted sources and appear more widely adopted than they actually are. | TA06 |
T0036 | Create inauthentic websites | Create media assets to support inauthentic organizations (e.g. think tank), people (e.g. experts) and/or serve as sites to distribute malware/launch phishing operations. | TA06 |
T0037 | Prepare Physical Broadcast Capabilities | Create or coopt broadcast capabilities (e.g. TV, radio etc). | TA06 |
T0038 | Develop Owned Media Assets | An owned media asset refers to an agency or organization through which an influence operation may create, develop, and host content and narratives. Owned media assets include websites, blogs, social media pages, forums, and other platforms that facilitate the creation and organization of content. | TA06 |
T0039 | Prepare fundraising campaigns | Fundraising campaigns refer to an influence operation’s systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while raising a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns (see: Develop Information Pathways) to promote operation messaging while raising money to support its activities. | TA06 |
T0040 | Leverage Content Farm | | TA06 |
T0041 | Compromise legitimate accounts | Hack or take over legitimate accounts to distribute misinformation or damaging content. | TA07 |
T0042 | Create fake experts | Stories planted or promoted in computational propaganda operations often make use of experts fabricated from whole cloth, sometimes specifically for the story itself. | TA07 |
T0043 | Create personas | | TA07 |
T0044 | Establish Inauthentic News Sites | Modern computational propaganda makes use of a cadre of imposter news sites spreading globally. These sites, sometimes motivated by concerns other than propaganda--for instance, click-based revenue--often have some superficial markers of authenticity, such as naming and site design. But many can be quickly exposed with reference to their ownership, reporting history and advertising details. | TA07 |
T0045 | Prepare Assets Impersonating Legitimate Entities | An influence operation may prepare assets impersonating legitimate entities to further conceal its network identity and add a layer of legitimacy to its operation content. Users will more likely believe and less likely fact-check news from recognizable sources rather than unknown sites. Legitimate entities may include authentic news outlets, public figures, organizations, or state entities. An influence operation may use a wide variety of cyber techniques to impersonate a legitimate entity’s website or social media account. Typosquatting is the intentional registration of a domain name with purposeful variations of the impersonated domain name through intentional typos, top-level domain (TLD) manipulation, or punycode (see the illustrative sketch after this table). Typosquatting facilitates the creation of falsified websites by creating similar domain names in the URL box, leaving it to the user to confirm that the URL is correct. | TA07 |
T0046 | Co-opt Trusted Sources | An influence operation may co-opt trusted sources by infiltrating or repurposing a source to reach a target audience through existing, previously reliable networks. Co-opted trusted sources may include: - National or local news outlets - Research or academic publications - Online blogs or websites | TA07 |
T0047 | Create Clickbait | Create attention grabbing headlines (outrage, doubt, humor) required to drive traffic & engagement. This is a key asset. | TA08 |
T0048 | Purchase Targeted Advertisements | Create or fund advertisements targeted at specific populations | TA08 |
T0049 | Create Localized Content | Localized content refers to content that appeals to a specific community of individuals, often in defined geographic areas. An operation may create localized content using local language and dialects to resonate with its target audience and blend in with other local news and social media. Localized content may help an operation increase legitimacy, avoid detection, and complicate external attribution. | TA08 |
T0050 | Leverage Echo Chambers/Filter Bubbles | An echo chamber refers to an internet subgroup, often along ideological lines, where individuals only engage with “others with which they are already in agreement.” A filter bubble refers to an algorithm's placement of an individual in content that they agree with or regularly engage with, possibly entrapping the user into a bubble of their own making. An operation may create these isolated areas of the internet by matching existing groups, or aggregating individuals into a single target audience based on shared interests, politics, values, demographics, and other characteristics. Echo chambers and filter bubbles help to reinforce similar biases and content to the same target audience members. | TA08 |
T0051 | Chat apps | Direct messaging via chat app is an increasing method of delivery. These messages are often automated and new delivery and storage methods make them anonymous, viral, and ephemeral. This is a difficult space to monitor, but also a difficult space to build acclaim or notoriety. | TA09 |
T0052 | Livestream | | TA09 |
T0053 | Social Networks | | TA09 |
T0054 | Media Sharing Networks | Media sharing networks refer to services whose primary function is the hosting and sharing of specific forms of media. Examples include Instagram, Snapchat, TikTok, YouTube, SoundCloud. | TA09 |
T0055 | Discussion Forums | Platforms for finding, discussing, and sharing information and opinions. Examples include Reddit, Quora, Digg, message boards, interest-based discussion forums, etc. | TA09 |
T0056 | Bookmarking and Content Curation | Platforms for searching, sharing, and curating content and media. Examples include Pinterest, Flipboard, etc. | TA09 |
T0057 | Blogging and Publishing Networks | Examples include WordPress, Blogger, Weebly, Tumblr, Medium, etc. | TA09 |
T0058 | Consumer Review Networks | Platforms for finding, reviewing, and sharing information about brands, products, services, restaurants, travel destinations, etc. Examples include Yelp, TripAdvisor, etc. | TA09 |
T0059 | Online polls | Create fake online polls, or manipulate existing online polls. Data gathering tactic to target those who engage, and potentially their networks of friends/followers as well | TA09 |
T0060 | Formal Diplomatic Channels | | TA09 |
T0061 | Traditional Media | Examples include TV, Newspaper, Radio, etc. | TA09 |
T0062 | | | TA09 |
T0063 | Trial content | Iteratively test incident performance (messages, content etc), e.g. A/B test headline/content engagement metrics; website and/or funding campaign conversion rates | TA10 |
T0064 | Bait legitimate influencers | Credibility in a social media environment is often a function of the size of a user's network. "Influencers" are so-called because of their reach, typically understood as: 1) the size of their network (i.e. the number of followers, perhaps weighted by their own influence); and 2) the rate at which their comments are re-circulated (these two metrics are related). Add traditional media players at all levels of credibility and professionalism to this, and the number of potential influential carriers available for unwitting amplification becomes substantial. By targeting high-influence people and organizations in all types of media with narratives and content engineered to appeal to their emotional or ideological drivers, influence campaigns are able to add perceived credibility to their messaging via saturation and adoption by trusted agents such as celebrities, journalists and local leaders. | TA10 |
T0065 | Seed Kernel of truth | Wrap lies or altered context/facts around truths. Influence campaigns pursue a variety of objectives with respect to target audiences, prominent among them: 1. undermine a narrative commonly referenced in the target audience; or 2. promote a narrative less common in the target audience, but preferred by the attacker. In both cases, the attacker is presented with a heavy lift. They must change the relative importance of various narratives in the interpretation of events, despite contrary tendencies. When messaging makes use of factual reporting to promote these adjustments in the narrative space, they are less likely to be dismissed out of hand; when messaging can juxtapose a (factual) truth about current affairs with the (abstract) truth explicated in these narratives, propagandists can undermine or promote them selectively. Context matters. | TA10 |
T0066 | Seed distortions | Try a wide variety of messages in the early hours surrounding an incident or event, to give a misleading account or impression. | TA10 |
T0067 | Use fake experts | Use the fake experts that were set up during Establish Legitimacy. Pseudo-experts are disposable assets that often appear once and then disappear. This gives "credibility" to misinformation and takes advantage of credential bias. | TA10 |
T0068 | Use Search Engine Optimization | Manipulate content engagement metrics (e.g. on Reddit & Twitter) to influence/impact news search results (e.g. Google); this also elevates RT & Sputnik headlines into Google news alert emails. Also known as "black-hat SEO". | TA10 |
T0069 | Employ Commercial Analytic Firms | Commercial analytic firms collect data on target audience activities and evaluate the data to detect trends, such as content receiving high click-rates. An influence operation may employ commercial analytic firms to facilitate external collection on its target audience, complicating attribution efforts and better tailoring the content to audience preferences. | TA10 |
T0070 | Deliver Ads | Delivering content via any form of paid media or advertising. | TA11 |
T0071 | Post Content | Delivering content by posting via owned media (assets that the operator controls). | TA11 |
T0072 | Comment or Reply on Content | Delivering content by replying or commenting via owned media (assets that the operator controls). | TA11 |
T0073 | Attract Traditional Media | Deliver content by attracting the attention of traditional media (earned media). | TA11 |
T0074 | Flooding the Information Space | Flooding and/or mobbing social media channels, feeds, and/or hashtags with an excessive volume of content to control/shape online conversations and/or drown out opposing points of view. Bots and/or patriotic trolls are effective tools to achieve this effect. | TA12 |
T0075 | Amplify Existing Narrative | An influence operation may amplify existing narratives that align with its narratives to support operation objectives. | TA12 |
T0076 | Cross-Posting | Cross-posting refers to posting the same message to multiple internet discussions, social media platforms or accounts, or news groups at one time. An influence operation may post content online in multiple communities and platforms to increase the chances of content exposure to the target audience. | TA12 |
T0077 | Incentivize Sharing | Incentivizing content sharing refers to actions that encourage users to share content themselves, reducing the need for the operation itself to post and promote its own content. | TA12 |
T0078 | Manipulate Platform Algorithm | Manipulating a platform algorithm refers to conducting activity on a platform in a way that intentionally targets its underlying algorithm. After analyzing a platform’s algorithm (see: Select Platforms), an influence operation may use a platform in a way that increases its content exposure, avoids content removal, or otherwise benefits the operation’s strategy. For example, an influence operation may use bots to amplify its posts so that the platform’s algorithm recognizes engagement with operation content and further promotes the content on user timelines. | TA12 |
T0079 | Direct Users to Alternative Platforms | Direct users to alternative platforms refers to encouraging users to move from the platform on which they initially viewed operation content and engage with content on alternate information channels, including separate social media channels and inauthentic websites. An operation may drive users to alternative platforms to diversify its information channels and ensure the target audience knows where to access operation content if the initial platform suspends, flags, or otherwise removes original operation assets and content. | TA12 |
T0080 | Harass | Threatening or harassing believers of opposing narratives refers to the use of intimidation techniques, including cyberbullying and doxing, to discourage opponents from voicing their dissent. An influence operation may threaten or harass believers of the opposing narratives to deter individuals from posting or proliferating conflicting content. | TA13 |
T0081 | Censor social media as a political force | Use political influence or the power of the state to stop critical social media comments. Government-requested/driven content takedowns (see Google Transparency reports). | TA13 |
T0082 | Control Information Environment through Offensive Cyberspace Operations | Controlling the information environment through offensive cyberspace operations uses cyber tools and techniques to alter the trajectory of content in the information space to either prioritize operation messaging or block opposition messaging. | TA13 |
T0083 | Suppress Opposition | | TA13 |
T0084 | Platform Filtering | Platform filtering refers to the decontextualization of information as claims cross platforms (from Joan Donovan https://www.hks.harvard.edu/publications/disinformation-design-use-evidence-collages-and-platform-filtering-media-manipulation) | TA13 |
T0085 | Encourage Attendance at Events | Operation encourages attendance at existing real world event. | TA14 |
T0086 | Organize Events | Coordinate and promote real-world events across media platforms, e.g. rallies, protests, gatherings in support of incident narratives. | TA14 |
T0087 | Conduct fundraising | Fundraising campaigns refer to an influence operation’s systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while raising a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns to promote operation messaging while raising money to support its activities. | TA14 |
T0088 | Physical Violence | Physical violence refers to the use of force to injure, abuse, damage, or destroy. An influence operation may conduct or encourage physical violence to discourage opponents from promoting conflicting content or draw attention to operation narratives using shock value. | TA14 |
T0089 | Merchandising/ Advertising | Merchandising/Advertising refers to getting the message or narrative into physical space in the offline world | TA14 |
T0090 | Conceal People | Conceal the identity or provenance of the account and people assets. | TA15 |
T0091 | Conceal Operational Activity | | TA15 |
T0092 | Conceal Infrastructure | | TA15 |
T0093 | Exploit TOS/Content Moderation | | TA15 |
T0094 | Play the long game | Play the long game refers to two phenomena: 1. To plan messaging and allow it to grow organically without conducting your own amplification. This is methodical and slow and requires years for the message to take hold. 2. To develop a series of seemingly disconnected messaging narratives that eventually combine into a new narrative. | TA15 |
T0095 | Measure Performance | | TA16 |
T0096 | Measure Effectiveness | | TA16 |
T0097 | Measure Effectiveness Indicators (or KPIs) | | TA16 |
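
The typosquatting mechanics summarised under T0045 (intentional typos and top-level domain manipulation) can be made concrete with a short sketch. The following is a minimal, hypothetical Python example and is not part of the DISARM framework itself: the placeholder domain `example.com`, the substitution rules, and the TLD list are assumptions chosen only for illustration. It shows how lookalike domain candidates can be enumerated, for instance by analysts monitoring for impersonation assets.

```python
# Hypothetical illustration for T0045: enumerate simple typo-variant and
# TLD-variant domain names of the kind described in the table above, e.g. so
# defenders can monitor for lookalike sites. The seed domain, substitution
# rules, and TLD list are illustrative assumptions, not DISARM data.

def typo_variants(domain: str, tlds=("com", "net", "org", "co")):
    """Return a sorted list of lookalike candidates for a registered domain."""
    name, _, tld = domain.rpartition(".")
    candidates = set()

    # Character omission: "exmple.com"
    for i in range(len(name)):
        candidates.add(name[:i] + name[i + 1:])

    # Adjacent-character transposition: "examlpe.com"
    for i in range(len(name) - 1):
        chars = list(name)
        chars[i], chars[i + 1] = chars[i + 1], chars[i]
        candidates.add("".join(chars))

    # Visually similar substitutions: "examp1e.com"
    for src, dst in (("l", "1"), ("o", "0"), ("i", "1"), ("m", "rn")):
        if src in name:
            candidates.add(name.replace(src, dst))

    candidates.discard(name)

    # TLD manipulation: keep the real second-level name but swap the TLD
    variants = {f"{c}.{tld}" for c in candidates}
    variants |= {f"{name}.{t}" for t in tlds if t != tld}
    return sorted(variants)


if __name__ == "__main__":
    for v in typo_variants("example.com"):
        print(v)
```

Punycode and homoglyph variants, also mentioned under T0045, would follow the same pattern but require a table of confusable characters, so they are omitted from this sketch.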