DISARM Techniques:
disarm_id | name | summary | tactic_id |
---|---|---|---|
T0073 | Determine Target Audiences | tbd | TA01 |
T0074 | Determine Strategic Ends | tbd | TA01 |
T0080 | Map Target Audience Information Environment | Mapping the target audience information environment analyzes the information space itself, including social media analytics, web traffic, and media surveys. Mapping the information environment may help the influence operation determine the most realistic and popular information channels to reach its target audience. Mapping the target audience information environment aids influence operations in determining the most vulnerable areas of the information space to target with messaging. | TA13 |
T0081 | Identify Social and Technical Vulnerabilities | Identifying social and technical vulnerabilities determines weaknesses within the target audience information environment for later exploitation. Vulnerabilities include decisive political issues, weak cybersecurity infrastructure, search engine data voids, and other technical and non-technical weaknesses in the target information environment. Identifying social and technical vulnerabilities facilitates the later exploitation of the identified weaknesses to advance operation objectives. | TA13 |
T0082 | Develop New Narratives | tbd | TA14 |
T0083 | Integrate Target Audience Vulnerabilities into Narrative | An influence operation may seek to exploit the preexisting weaknesses, fears, and enemies of the target audience for integration into the operation’s narratives and overall strategy. Integrating existing vulnerabilities into the operational approach conserves resources by exploiting already weak areas of the target information environment instead of forcing the operation to create new vulnerabilities in the environment. | TA14 |
T0084 | Reuse Existing Content | When an operation recycles content from its own previous operations or plagiarizes from external operations. An operation may launder information to conserve resources that would have otherwise been utilized to develop new content. | TA06 |
T0085 | Develop Text-based Content | tbd | TA06 |
T0086 | Develop Image-based Content | tbd | TA06 |
T0087 | Develop Video-based Content | tbd | TA06 |
T0088 | Develop Audio-based Content | tbd | TA06 |
T0089 | Obtain Private Documents | tbd | TA06 |
T0090 | Create Inauthentic Accounts | tbd | TA15 |
T0091 | Recruit bad actors | tbd | TA15 |
T0092 | Build Network | tbd | TA15 |
T0093 | Acquire/Recruit Network | tbd | TA15 |
T0094 | Infiltrate Existing Networks | tbd | TA15 |
T0095 | Develop Owned Media Assets | An owned media asset refers to an agency or organization through which an influence operation may create, develop, and host content and narratives. Owned media assets include websites, blogs, social media pages, forums, and other platforms that facilitate the creation and organization of content. | TA15 |
T0096 | Leverage Content Farm | tbd | TA15 |
T0097 | Create personas | tbd | TA16 |
T0098 | Establish Inauthentic News Sites | Modern computational propaganda makes use of a cadre of imposter news sites spreading globally. These sites, sometimes motivated by concerns other than propaganda, such as click-based revenue, often have some superficial markers of authenticity, such as naming and site design. But many can be quickly exposed with reference to their ownership, reporting history, and advertising details. | TA16 |
T0099 | Prepare Assets Impersonating Legitimate Entities | An influence operation may prepare assets impersonating legitimate entities to further conceal its network identity and add a layer of legitimacy to its operation content. Users are more likely to believe and less likely to fact-check news from recognizable sources than from unknown sites. Legitimate entities may include authentic news outlets, public figures, organizations, or state entities. An influence operation may use a wide variety of cyber techniques to impersonate a legitimate entity’s website or social media account. Typosquatting is the intentional registration of a domain name with purposeful variations of the impersonated domain name through intentional typos, top-level domain (TLD) manipulation, or punycode (see the typosquatting sketch after this table). Typosquatting facilitates the creation of falsified websites by producing similar-looking domain names in the URL box, leaving it to the user to confirm that the URL is correct. | TA16 |
T0100 | Co-opt Trusted Sources | An influence operation may co-opt trusted sources by infiltrating or repurposing a source to reach a target audience through existing, previously reliable networks. Co-opted trusted sources may include national or local news outlets, research or academic publications, and online blogs or websites. | TA16 |
T0101 | Create Localized Content | Localized content refers to content that appeals to a specific community of individuals, often in defined geographic areas. An operation may create localized content using local language and dialects to resonate with its target audience and blend in with other local news and social media. Localized content may help an operation increase legitimacy, avoid detection, and complicate external attribution. | TA05 |
T0102 | Leverage Echo Chambers/Filter Bubbles | An echo chamber refers to an internet subgroup, often along ideological lines, where individuals only engage with “others with which they are already in agreement.” A filter bubble refers to an algorithm's placement of an individual in content that they agree with or regularly engage with, possibly entrapping the user into a bubble of their own making. An operation may create these isolated areas of the internet by matching existing groups, or aggregating individuals into a single target audience based on shared interests, politics, values, demographics, and other characteristics. Echo chambers and filter bubbles help to reinforce similar biases and content to the same target audience members. | TA05 |
T0103 | Livestream | tbd | TA07 |
T0104 | Social Networks | tbd | TA07 |
T0105 | Media Sharing Networks | Media sharing networks refer to services whose primary function is the hosting and sharing of specific forms of media. Examples include Instagram, Snapchat, TikTok, YouTube, SoundCloud. | TA07 |
T0106 | Discussion Forums | Platforms for finding, discussing, and sharing information and opinions. Examples include Reddit, Quora, Digg, message boards, interest-based discussion forums, etc. | TA07 |
T0107 | Bookmarking and Content Curation | Platforms for searching, sharing, and curating content and media. Examples include Pinterest, Flipboard, etc. | TA07 |
T0108 | Blogging and Publishing Networks | Examples include WordPress, Blogger, Weebly, Tumblr, Medium, etc. | TA07 |
T0109 | Consumer Review Networks | Platforms for finding, reviewing, and sharing information about brands, products, services, restaurants, travel destinations, etc. Examples include Yelp, TripAdvisor, etc. | TA07 |
T0110 | Formal Diplomatic Channels | tbd | TA07 |
T0111 | Traditional Media | Examples include TV, Newspaper, Radio, etc. | TA07 |
T0112 | Email | tbd | TA07 |
T0113 | Employ Commercial Analytic Firms | Commercial analytic firms collect data on target audience activities and evaluate the data to detect trends, such as content receiving high click-rates. An influence operation may employ commercial analytic firms to facilitate external collection on its target audience, complicating attribution efforts and better tailoring the content to audience preferences. | TA08 |
T0114 | Deliver Ads | Delivering content via any form of paid media or advertising. | TA09 |
T0115 | Post Content | Delivering content by posting via owned media (assets that the operator controls). | TA09 |
T0116 | Comment or Reply on Content | Delivering content by replying or commenting via owned media (assets that the operator controls). | TA09 |
T0117 | Attract Traditional Media | Deliver content by attracting the attention of traditional media (earned media). | TA09 |
T0118 | Amplify Existing Narrative | An influence operation may amplify existing narratives that align with its narratives to support operation objectives. | TA17 |
T0119 | Cross-Posting | Cross-posting refers to posting the same message to multiple internet discussions, social media platforms or accounts, or news groups at one time. An influence operation may post content online in multiple communities and platforms to increase the chances of content exposure to the target audience. | TA17 |
T0120 | Incentivize Sharing | Incentivizing content sharing refers to actions that encourage users to share content themselves, reducing the need for the operation itself to post and promote its own content. | TA17 |
T0121 | Manipulate Platform Algorithm | Manipulating a platform algorithm refers to conducting activity on a platform in a way that intentionally targets its underlying algorithm. After analyzing a platform’s algorithm (see: Select Platforms), an influence operation may use a platform in a way that increases its content exposure, avoids content removal, or otherwise benefits the operation’s strategy. For example, an influence operation may use bots to amplify its posts so that the platform’s algorithm recognizes engagement with operation content and further promotes the content on user timelines. | TA17 |
T0122 | Direct Users to Alternative Platforms | Directing users to alternative platforms refers to encouraging users to move from the platform on which they initially viewed operation content and engage with content on alternate information channels, including separate social media channels and inauthentic websites. An operation may drive users to alternative platforms to diversify its information channels and ensure the target audience knows where to access operation content if the initial platform suspends, flags, or otherwise removes original operation assets and content. | TA17 |
T0123 | Control Information Environment through Offensive Cyberspace Operations | Controlling the information environment through offensive cyberspace operations uses cyber tools and techniques to alter the trajectory of content in the information space to either prioritize operation messaging or block opposition messaging. | TA18 |
T0124 | Suppress Opposition | tbd | TA18 |
T0125 | Platform Filtering | Platform filtering refers to the decontextualization of information as claims cross platforms (from Joan Donovan https://www.hks.harvard.edu/publications/disinformation-design-use-evidence-collages-and-platform-filtering-media-manipulation) | TA18 |
T0126 | Encourage Attendance at Events | Operation encourages attendance at existing real-world events. | TA10 |
T0127 | Physical Violence | Physical violence refers to the use of force to injure, abuse, damage, or destroy. An influence operation may conduct or encourage physical violence to discourage opponents from promoting conflicting content or draw attention to operation narratives using shock value. | TA10 |
T0131 | Exploit TOS/Content Moderation | tbd | TA11 |
T0132 | Measure Performance | tbd | TA12 |
T0133 | Measure Effectiveness | tbd | TA12 |
T0134 | Measure Effectiveness Indicators (or KPIs) | tbd | TA12 |
T0002 | Facilitate State Propaganda | Organize citizens around pro-state messaging. Coordinate paid or volunteer groups to push state propaganda. | TA02 |
T0003 | Leverage Existing Narratives | Use or adapt existing narrative themes, where narratives are the baseline stories of a target audience. Narratives form the bedrock of our worldviews. New information is understood through a process firmly grounded in this bedrock. If new information is not consistent with the prevailing narratives of an audience, it will be ignored. Effective campaigns will frame their misinformation in the context of these narratives. Highly effective campaigns will make extensive use of audience-appropriate archetypes and meta-narratives throughout their content creation and amplification practices. | TA14 |
T0004 | Develop Competing Narratives | Advance competing narratives connected to the same issue, i.e. on the one hand denying an incident while at the same time dismissing it. Suppressing or discouraging narratives already spreading requires an alternative. The simplest set of narrative techniques in response would be the construction and promotion of contradictory alternatives centered on denial, deflection, dismissal, counter-charges, excessive standards of proof, bias in prohibition or enforcement, and so on. These competing narratives allow loyalists cover, but are less compelling to opponents and fence-sitters than campaigns built around existing narratives or highly explanatory master narratives. Competing narratives, as such, are especially useful in the "firehose of misinformation" approach. | TA14 |
T0007 | Create Inauthentic Social Media Pages and Groups | Create key social engineering assets needed to amplify content, manipulate algorithms, fool public and/or specific incident/campaign targets. Computational propaganda depends substantially on false perceptions of credibility and acceptance. By creating fake users and groups with a variety of interests and commitments, attackers can ensure that their messages both come from trusted sources and appear more widely adopted than they actually are. | TA15 |
T0009 | Create fake experts | Stories planted or promoted in computational propaganda operations often make use of experts fabricated from whole cloth, sometimes specifically for the story itself. | TA16 |
T0010 | Cultivate ignorant agents | Cultivate propagandists for a cause, the goals of which are not fully comprehended, and who are used cynically by the leaders of the cause. Independent actors use social media and specialised web sites to strategically reinforce and spread messages compatible with their own. Their networks are infiltrated and used by state media disinformation organisations to amplify the state’s own disinformation strategies against target populations. Many are traffickers in conspiracy theories or hoaxes, unified by a suspicion of Western governments and mainstream media. Their narratives, which appeal to leftists hostile to globalism and military intervention and nationalists against immigration, are frequently infiltrated and shaped by state-controlled trolls and altered news items from agencies such as RT and Sputnik. Also known as "useful idiots" or "unwitting agents". | TA15 |
T0011 | Compromise legitimate accounts | Hack or take over legitimate accounts to distribute misinformation or damaging content. | TA16 |
T0013 | Create inauthentic websites | Create media assets to support inauthentic organizations (e.g. think tank), people (e.g. experts) and/or serve as sites to distribute malware/launch phishing operations. | TA15 |
T0014 | Prepare fundraising campaigns | Fundraising campaigns refer to an influence operation’s systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while raising a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns (see: Develop Information Pathways) to promote operation messaging while raising money to support its activities. | TA15 |
T0015 | Create hashtags and search artifacts | Create one or more hashtags and/or hashtag groups. Many incident-based campaigns will create hashtags to promote their fabricated event. Creating a hashtag for an incident can have two important effects: 1. Create a perception of reality around an event. Certainly only "real" events would be discussed in a hashtag; after all, the event has a name. 2. Publicize the story more widely through trending lists and search behavior. This is an asset needed to direct/control/manage the "conversation" connected to launching a new incident/campaign with a new hashtag for applicable social media sites. | TA06 |
T0016 | Create Clickbait | Create attention-grabbing headlines (outrage, doubt, humor) required to drive traffic & engagement. This is a key asset. | TA05 |
T0017 | Conduct fundraising | Fundraising campaigns refer to an influence operation’s systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while raising a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns to promote operation messaging while raising money to support its activities. | TA10 |
T0018 | Purchase Targeted Advertisements | Create or fund advertisements targeted at specific populations | TA05 |
T0019 | Generate information pollution | Flood social channels; drive traffic/engagement to all assets; create aura/sense/perception of pervasiveness/consensus (for or against or both simultaneously) of an issue or topic. "Nothing is true, but everything is possible." Akin to astroturfing campaign. | TA06 |
T0020 | Trial content | Iteratively test incident performance (messages, content, etc.), e.g. A/B test headline/content engagement metrics, or website and/or funding campaign conversion rates (see the A/B testing sketch after this table). | TA08 |
T0022 | Leverage Conspiracy Theory Narratives | "Conspiracy narratives" appeal to the human desire for explanatory order, by invoking the participation of powerful (often sinister) actors in pursuit of their own political goals. These narratives are especially appealing when an audience is low-information, marginalized or otherwise inclined to reject the prevailing explanation. Conspiracy narratives are an important component of the "firehose of falsehoods" model. | TA14 |
T0023 | Distort facts | Change, twist, or exaggerate existing facts to construct a narrative that differs from reality. Examples: images and ideas can be distorted by being placed in an improper context. | TA06 |
T0029 | Online polls | Create fake online polls, or manipulate existing online polls. Data gathering tactic to target those who engage, and potentially their networks of friends/followers as well | TA07 |
T0039 | Bait legitimate influencers | Credibility in a social media environment is often a function of the size of a user's network. "Influencers" are so-called because of their reach, typically understood as: 1) the size of their network (i.e. the number of followers, perhaps weighted by their own influence); and 2) the rate at which their comments are re-circulated (these two metrics are related). Add traditional media players at all levels of credibility and professionalism to this, and the number of potential influential carriers available for unwitting amplification becomes substantial. By targeting high-influence people and organizations in all types of media with narratives and content engineered to appeal to their emotional or ideological drivers, influence campaigns are able to add perceived credibility to their messaging via saturation and adoption by trusted agents such as celebrities, journalists and local leaders. | TA08 |
T0040 | Demand insurmountable proof | Campaigns often leverage tactical and informational asymmetries on the threat surface, as seen in the Distort and Deny strategies, and the "firehose of misinformation". Specifically, conspiracy theorists can be repeatedly wrong, but advocates of the truth need to be perfect. By constantly escalating demands for proof, propagandists can effectively leverage this asymmetry while also priming its future use, often with an even greater asymmetric advantage. The conspiracist is offered freer rein for a broader range of "questions" while the truth teller is burdened with higher and higher standards of proof. | TA14 |
T0042 | Seed Kernel of truth | Wrap lies or altered context/facts around truths. Influence campaigns pursue a variety of objectives with respect to target audiences, prominent among them: 1. undermine a narrative commonly referenced in the target audience; or 2. promote a narrative less common in the target audience, but preferred by the attacker. In both cases, the attacker is presented with a heavy lift. They must change the relative importance of various narratives in the interpretation of events, despite contrary tendencies. When messaging makes use of factual reporting to promote these adjustments in the narrative space, they are less likely to be dismissed out of hand; when messaging can juxtapose a (factual) truth about current affairs with the (abstract) truth explicated in these narratives, propagandists can undermine or promote them selectively. Context matters. | TA08 |
T0043 | Chat apps | Direct messaging via chat app is an increasing method of delivery. These messages are often automated and new delivery and storage methods make them anonymous, viral, and ephemeral. This is a difficult space to monitor, but also a difficult space to build acclaim or notoriety. | TA07 |
T0044 | Seed distortions | Try a wide variety of messages in the early hours surrounding an incident or event, to give a misleading account or impression. | TA08 |
T0045 | Use fake experts | Use the fake experts that were set up during Establish Legitimacy. Pseudo-experts are disposable assets that often appear once and then disappear. Give "credibility" to misinformation. Take advantage of credential bias. | TA08 |
T0046 | Use Search Engine Optimization | Manipulate content engagement metrics (e.g. on Reddit & Twitter) to influence/impact news search results (e.g. Google); this also elevates RT & Sputnik headlines into Google news alert emails. Also known as "black-hat SEO". | TA08 |
T0047 | Censor social media as a political force | Use political influence or the power of the state to stop critical social media comments. Government-requested/driven content takedowns (see Google Transparency reports). | TA18 |
T0048 | Harass | Threatening or harassing believers of opposing narratives refers to the use of intimidation techniques, including cyberbullying and doxing, to discourage opponents from voicing their dissent. An influence operation may threaten or harass believers of the opposing narratives to deter individuals from posting or proliferating conflicting content. | TA18 |
T0049 | Flooding the Information Space | Flooding and/or mobbing social media channels, feeds, and/or hashtags with an excessive volume of content to control/shape online conversations and/or drown out opposing points of view. Bots and/or patriotic trolls are effective tools to achieve this effect. | TA17 |
T0057 | Organize Events | Coordinate and promote real-world events across media platforms, e.g. rallies, protests, gatherings in support of incident narratives. | TA10 |
T0059 | Play the long game | Play the long game refers to two phenomena: 1. To plan messaging and allow it to grow organically without conducting your own amplification. This is methodical and slow and requires years for the message to take hold. 2. To develop a series of seemingly disconnected messaging narratives that eventually combine into a new narrative. | TA11 |
T0060 | Continue to Amplify | Continue narrative or message amplification after the main incident work has finished. | TA11 |
T0061 | Sell Merchandise | Selling merchandise refers to getting the message or narrative into physical space in the offline world while making money. | TA10 |
T0065 | Prepare Physical Broadcast Capabilities | Create or co-opt broadcast capabilities (e.g. TV, radio, etc.). | TA15 |
T0066 | Degrade Adversary | Plan to degrade an adversary’s image or ability to act. This could include preparation and use of harmful information about the adversary’s actions or reputation. | TA02 |
T0068 | Respond to Breaking News Event or Active Crisis | Media attention on a story or event is heightened during a breaking news event, where unclear facts and incomplete information increase speculation, rumors, and conspiracy theories, which are all vulnerable to manipulation. | TA14 |
T0072 | Segment Audiences | Create audience segmentations by features of interest to the influence campaign, including political affiliation, geographic location, income, demographics, and psychographics. | TA13 |
T0075 | Dismiss | Push back against criticism by dismissing your critics. This might be arguing that the critics use a different standard for you than with other actors or themselves; or arguing that their criticism is biased. | TA02 |
T0076 | Distort | Twist the narrative. Take information, or artifacts like images, and change the framing around them. | TA02 |
T0077 | Distract | Shift attention to a different narrative or actor, for instance by accusing critics of the same activity that they’ve accused you of (e.g. police brutality). | TA02 |
T0078 | Dismay | Threaten the critic or narrator of events. For instance, threaten journalists or news outlets reporting on a story. | TA02 |
T0079 | Divide | Create conflict between subgroups, to widen divisions in a community | TA02 |
T0128 | Conceal People | Conceal the identity or provenance of the account and people assets. | TA11 |
T0129 | Conceal Operational Activity | tbd | TA11 |
T0130 | Conceal Infrastructure | tbd | TA11 |
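
Typosquatting sketch (referenced from T0099). A minimal sketch, assuming a placeholder domain (`example.com`) and small hand-picked substitution tables, of the three domain-variation mechanisms named in T0099: intentional single-character typos, TLD manipulation, and punycode homoglyph substitution. Nothing here is part of the DISARM framework itself; it only illustrates the mechanism, and defenders can run the same generation over their own domains to monitor for look-alikes.

```python
# Hypothetical illustration of T0099-style typosquatting candidates.
# The domain, substitution tables, and TLD list below are assumptions for the example.

ADJACENT = {"e": "w", "a": "s", "o": "0", "l": "1"}   # easily mistyped/confused characters
HOMOGLYPHS = {"a": "а", "e": "е", "o": "о"}           # Cyrillic look-alikes for punycode spoofs
ALT_TLDS = ["net", "org", "co", "info"]               # alternative top-level domains

def typosquat_candidates(domain: str) -> set[str]:
    name, _, tld = domain.rpartition(".")
    out = set()
    for i in range(len(name)):
        out.add(f"{name[:i]}{name[i+1:]}.{tld}")              # intentional typo: character omission
        sub = ADJACENT.get(name[i])
        if sub:
            out.add(f"{name[:i]}{sub}{name[i+1:]}.{tld}")     # intentional typo: character substitution
    out.update(f"{name}.{alt}" for alt in ALT_TLDS if alt != tld)  # TLD manipulation
    for i, ch in enumerate(name):
        glyph = HOMOGLYPHS.get(ch)
        if glyph:
            spoof = f"{name[:i]}{glyph}{name[i+1:]}.{tld}"
            out.add(spoof.encode("idna").decode("ascii"))     # punycode (xn--...) form of the spoof
    out.discard(domain)
    return out

if __name__ == "__main__":
    for candidate in sorted(typosquat_candidates("example.com")):
        print(candidate)
```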
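
A/B testing sketch (referenced from T0020). A minimal sketch of how trial-headline engagement might be compared, using a standard two-proportion z-test on click-through counts; the click and impression numbers are invented for illustration and are not from any DISARM source.

```python
# Hypothetical A/B comparison of two trial headlines (T0020-style content trialling).
# Counts are made up for the example; the test itself is a standard two-proportion z-test.
from math import sqrt, erf

def two_proportion_z(clicks_a: int, views_a: int, clicks_b: int, views_b: int):
    """Return (z, two-sided p-value) for H0: both headlines have the same click-through rate."""
    rate_a, rate_b = clicks_a / views_a, clicks_b / views_b
    pooled = (clicks_a + clicks_b) / (views_a + views_b)       # pooled click-through rate
    se = sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (rate_a - rate_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))      # two-sided standard-normal tail
    return z, p_value

if __name__ == "__main__":
    # Headline A: 120 clicks / 4,000 impressions; headline B: 180 clicks / 4,100 impressions
    z, p = two_proportion_z(120, 4000, 180, 4100)
    print(f"z = {z:.2f}, p = {p:.4f} -> promote the better-performing headline if p is small")
```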