# DISARM Techniques
disarm_id name summary tactic_id
T0002 Facilitate State Propaganda Organise citizens around pro-state messaging. Coordinate paid or volunteer groups to push state propaganda. TA02
T0003 Leverage Existing Narratives Use or adapt existing narrative themes, where narratives are the baseline stories of a target audience. Narratives form the bedrock of our worldviews. New information is understood through a process firmly grounded in this bedrock. If new information is not consistent with the prevailing narratives of an audience, it will be ignored. Effective campaigns will frame their misinformation in the context of these narratives. Highly effective campaigns will make extensive use of audience-appropriate archetypes and meta-narratives throughout their content creation and amplification practices. TA14
T0004 Develop Competing Narratives Advance competing narratives connected to the same issue, e.g. denying an incident while simultaneously dismissing it. Suppressing or discouraging narratives already spreading requires an alternative. The simplest set of narrative techniques in response would be the construction and promotion of contradictory alternatives centred on denial, deflection, dismissal, counter-charges, excessive standards of proof, bias in prohibition or enforcement, and so on. These competing narratives allow loyalists cover, but are less compelling to opponents and fence-sitters than campaigns built around existing narratives or highly explanatory master narratives. Competing narratives, as such, are especially useful in the "firehose of misinformation" approach. TA14
T0010 Cultivate Ignorant Agents Cultivate propagandists for a cause, the goals of which are not fully comprehended, and who are used cynically by the leaders of the cause. Independent actors use social media and specialised web sites to strategically reinforce and spread messages compatible with their own. Their networks are infiltrated and used by state media disinformation organisations to amplify the state’s own disinformation strategies against target populations. Many are traffickers in conspiracy theories or hoaxes, unified by a suspicion of Western governments and mainstream media. Their narratives, which appeal to leftists hostile to globalism and military intervention and nationalists against immigration, are frequently infiltrated and shaped by state-controlled trolls and altered news items from agencies such as RT and Sputnik. Also known as "useful idiots" or "unwitting agents". TA15
T0014 Prepare Fundraising Campaigns Fundraising campaigns refer to an influence operation’s systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while raising a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipeee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns (see: Develop Information Pathways) to promote operation messaging while raising money to support its activities. TA15
T0014.001 Raise Funds from Malign Actors Raising funds from malign actors may include contributions from foreign agents, cutouts or proxies, shell companies, dark money groups, etc. TA15
T0014.002 Raise Funds from Ignorant Agents Raising funds from ignorant agents may include scams, donations intended for one stated purpose but then used for another, etc. TA15
T0015 Create Hashtags and Search Artefacts Create one or more hashtags and/or hashtag groups. Many incident-based campaigns will create hashtags to promote their fabricated event. Creating a hashtag for an incident can have two important effects: 1. It creates a perception of reality around an event, since surely only "real" events would be discussed in a hashtag; after all, the event has a name. 2. It publicises the story more widely through trending lists and search behaviour. An asset is needed to direct/control/manage the "conversation" connected to launching a new incident/campaign with a new hashtag on applicable social media sites. TA06
T0015.001 Use Existing Hashtag Use a dedicated, existing hashtag for the campaign/incident. This Technique covers behaviours previously documented by T0104.005: Use Hashtags, which has since been deprecated. TA06
T0015.002 Create New Hashtag Create a campaign/incident specific hashtag. This Technique covers behaviours previously documented by T0104.006: Create Dedicated Hashtag, which has since been deprecated. TA06
T0016 Create Clickbait Create attention-grabbing headlines (outrage, doubt, humour) required to drive traffic and engagement. This is a key asset. TA05
T0017 Conduct Fundraising Fundraising campaigns refer to an influence operation’s systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while raising a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipeee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns to promote operation messaging while raising money to support its activities. TA10
T0017.001 Conduct Crowdfunding Campaigns An influence operation may Conduct Crowdfunding Campaigns on platforms such as GoFundMe, GiveSendGo, Tipeee, Patreon, etc. TA10
T0018 Purchase Targeted Advertisements Create or fund advertisements targeted at specific populations. TA05
T0020 Trial Content Iteratively test incident performance (messages, content, etc.), e.g. A/B test headline/content engagement metrics, and website and/or funding campaign conversion rates. TA08
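Trialling content is, at bottom, an A/B measurement problem, and the same statistics are useful to analysts studying engagement data. As an illustrative sketch (the function name and sample figures below are hypothetical, not drawn from any documented operation), a two-proportion z-test can indicate whether two headlines' click-through rates differ by more than chance:

```python
import math

def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
    """Z-statistic for the difference between two click-through rates."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled rate under the null hypothesis that both headlines perform equally
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    return (p_a - p_b) / se

# Hypothetical figures: headline A, 120 clicks on 2,000 views;
# headline B, 90 clicks on 2,000 views
z = two_proportion_z(120, 2000, 90, 2000)
print(round(z, 2))  # ≈ 2.13; |z| > 1.96 suggests a real difference at p < 0.05
```

The same comparison applies to conversion rates on websites or funding campaigns; only the numerator and denominator change.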
T0022 Leverage Conspiracy Theory Narratives "Conspiracy narratives" appeal to the human desire for explanatory order, by invoking the participation of powerful (often sinister) actors in pursuit of their own political goals. These narratives are especially appealing when an audience is low-information, marginalised or otherwise inclined to reject the prevailing explanation. Conspiracy narratives are an important component of the "firehose of falsehoods" model. TA14
T0022.001 Amplify Existing Conspiracy Theory Narratives An influence operation may amplify an existing conspiracy theory narrative that aligns with its incident or campaign goals. By amplifying existing conspiracy theory narratives, operators can leverage the power of the existing communities that support and propagate those theories without needing to expend resources creating new narratives or building momentum and buy in around new narratives. TA14
T0022.002 Develop Original Conspiracy Theory Narratives While this requires more resources than amplifying existing conspiracy theory narratives, an influence operation may develop original conspiracy theory narratives in order to achieve greater control and alignment over the narrative and its campaign goals. A prominent example is the USSR's Operation INFEKTION, a disinformation campaign run by the KGB in the 1980s to plant the idea that the United States had invented HIV/AIDS as part of a biological weapons research project at Fort Detrick, Maryland. More recently, Fort Detrick featured prominently in new conspiracy theory narratives around the origins of the COVID-19 outbreak and pandemic. TA14
T0023 Distort Facts Change, twist, or exaggerate existing facts to construct a narrative that differs from reality. For example, images and ideas can be distorted by being placed in an improper context. TA06
T0023.001 Reframe Context Reframing context refers to removing an event from its surrounding context to distort its intended meaning. Rather than deny that an event occurred, reframing context frames an event in a manner that may lead the target audience to draw a different conclusion about its intentions. TA06
T0023.002 Edit Open-Source Content An influence operation may edit open-source content, such as collaborative blogs or encyclopaedias, to promote its narratives on outlets with existing credibility and audiences. Editing open-source content may allow an operation to post content on platforms without dedicating resources to the creation and maintenance of its own assets. TA06
T0029 Online Polls Create fake online polls, or manipulate existing online polls. This is a data-gathering tactic used to target those who engage, and potentially their networks of friends/followers as well. TA07
T0039 Bait Influencer Influencers are people on social media platforms who have large audiences.

Threat Actors can try to trick Influencers such as celebrities, journalists, or local leaders who aren’t associated with their campaign into amplifying campaign content. This gives them access to the Influencer’s audience without having to go through the effort of building it themselves, and it helps legitimise their message by associating it with the Influencer, benefitting from their audience’s trust in them.
TA17
T0040 Demand Insurmountable Proof Campaigns often leverage tactical and informational asymmetries on the threat surface, as seen in the Distort and Deny strategies, and the "firehose of misinformation". Specifically, conspiracy theorists can be repeatedly wrong, but advocates of the truth need to be perfect. By constantly escalating demands for proof, propagandists can effectively leverage this asymmetry while also priming its future use, often with an even greater asymmetric advantage. The conspiracist is offered freer rein for a broader range of "questions" while the truth teller is burdened with higher and higher standards of proof. TA14
T0042 Seed Kernel of Truth Wrap lies or altered context/facts around truths. Influence campaigns pursue a variety of objectives with respect to target audiences, prominent among them: 1. undermine a narrative commonly referenced in the target audience; or 2. promote a narrative less common in the target audience, but preferred by the attacker. In both cases, the attacker is presented with a heavy lift. They must change the relative importance of various narratives in the interpretation of events, despite contrary tendencies. When messaging makes use of factual reporting to promote these adjustments in the narrative space, they are less likely to be dismissed out of hand; when messaging can juxtapose a (factual) truth about current affairs with the (abstract) truth explicated in these narratives, propagandists can undermine or promote them selectively. Context matters. TA08
T0044 Seed Distortions Try a wide variety of messages in the early hours surrounding an incident or event, to give a misleading account or impression. TA08
T0045 Use Fake Experts Use the fake experts that were set up during Establish Legitimacy. Pseudo-experts are disposable assets that often appear once and then disappear. They give "credibility" to misinformation and take advantage of credential bias. TA08
T0046 Use Search Engine Optimisation Manipulate content engagement metrics (e.g. on Reddit and Twitter) to influence/impact news search results (e.g. Google); this also elevates RT and Sputnik headlines into Google News alert emails. Also known as "black-hat SEO". TA08
T0047 Censor Social Media as a Political Force Use political influence or the power of the state to stop critical social media comments, e.g. government-requested/driven content takedowns (see Google Transparency reports). TA18
T0048 Harass Threatening or harassing believers of opposing narratives refers to the use of intimidation techniques, including cyberbullying and doxing, to discourage opponents from voicing their dissent. An influence operation may threaten or harass believers of the opposing narratives to deter individuals from posting or proliferating conflicting content. TA18
T0048.001 Boycott/"Cancel" Opponents Cancel culture refers to the phenomenon in which individuals collectively refrain from supporting an individual, organisation, business, or other entity, usually following a real or falsified controversy. An influence operation may exploit cancel culture by emphasising an adversary’s problematic or disputed behaviour and presenting its own content as an alternative. TA18
T0048.002 Harass People Based on Identities Examples include social identities like gender, sexuality, race, ethnicity, religion, ability, nationality, etc. as well as roles and occupations like journalist or activist. TA18
T0048.003 Threaten to Dox Doxing refers to online harassment in which individuals publicly release private information about another individual, including names, addresses, employment information, pictures, family members, and other sensitive information. An influence operation may threaten to dox its opposition to discourage the targeted individuals from posting or proliferating conflicting content. TA18
T0048.004 Dox Doxing refers to online harassment in which individuals publicly release private information about another individual, including names, addresses, employment information, pictures, family members, and other sensitive information. An influence operation may dox its opposition to encourage individuals aligned with operation narratives to harass the doxed individuals themselves or otherwise discourage the doxed individuals from posting or proliferating conflicting content. TA18
T0049 Flood Information Space Flooding sources of information (e.g. Social Media feeds) with a high volume of inauthentic content.

This can be done to control/shape online conversations, drown out opposing points of view, or make it harder to find legitimate information.

Bots and/or patriotic trolls are effective tools to achieve this effect.

This Technique previously used the name Flooding the Information Space.
TA17
T0049.001 Trolls Amplify and Manipulate Use trolls to amplify and/or manipulate narratives. Fake profiles/sockpuppets operate to support individuals/narratives from across the political spectrum (the left/right binary). They operate with increased emphasis on promoting local content and promoting real Twitter users generating their own, often divisive political content, as it's easier to amplify existing content than to create new/original content. Trolls operate wherever there's a socially divisive issue (issues that can be, or are, politicised). TA17
T0049.002 Flood Existing Hashtag Hashtags can be used by communities to collate information they post about particular topics (such as their interests, or current events) and users can find communities to join by exploring hashtags they’re interested in.

Threat actors can flood an existing hashtag to try to ruin hashtag functionality, posting content unrelated to the hashtag alongside it, making it a less reliable source of relevant information. They may also try to flood existing hashtags with campaign content, with the intent of maximising exposure to users.

This Technique covers cases where threat actors flood existing hashtags with campaign content.

This Technique covers behaviours previously documented by T0019.002: Hijack Hashtags, which has since been deprecated. This Technique was previously called Hijack Existing Hashtag.
TA17
T0049.003 Bots Amplify via Automated Forwarding and Reposting Automated forwarding and reposting refer to the proliferation of operation content using automated means, such as artificial intelligence or social media bots. An influence operation may use automated activity to increase content exposure without dedicating the resources, including personnel and time, traditionally required to forward and repost content. Use bots to amplify narratives above algorithm thresholds. Bots are automated/programmed profiles designed to amplify content (e.g. automatically retweet or like) and give the appearance that content is more "popular" than it is. They can operate as a network, functioning in a coordinated/orchestrated manner. In some cases (more so now) they are inexpensive/disposable assets used for minimal deployment, as bot detection tools improve and platforms become more responsive. TA17
T0049.004 Utilise Spamoflauge Spamoflauge refers to the practice of disguising spam messages as legitimate. Spam refers to the use of electronic messaging systems to send out unrequested or unwanted messages in bulk. Simple methods of spamoflauge include replacing letters with numbers to fool keyword-based email spam filters, for example, "you've w0n our jackp0t!". Spamoflauge may extend to more complex techniques such as modifying the grammar or word choice of the language, casting messages as images which spam detectors cannot automatically read, or encapsulating messages in password protected attachments, such as .pdf or .zip files. Influence operations may use spamoflauge to avoid spam filtering systems and increase the likelihood of the target audience receiving operation messaging. TA17
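From the defender's side, simple letter-for-number spamouflage can be partially undone by normalising text before keyword matching. A minimal sketch of that idea (the substitution table is illustrative; real spam filters use far richer normalisation and handle images and attachments separately):

```python
# Map common digit/symbol substitutions back to the letters they disguise,
# so keyword-based filters can still match "w0n" as "won".
LEET_MAP = str.maketrans({
    "0": "o", "1": "l", "3": "e", "4": "a",
    "5": "s", "7": "t", "@": "a", "$": "s",
})

def normalise(text: str) -> str:
    """Lowercase the text and undo simple character substitutions."""
    return text.lower().translate(LEET_MAP)

print(normalise("You've w0n our jackp0t!"))  # you've won our jackpot!
```

A filter would run its keyword checks against the normalised form rather than the raw message.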
T0049.005 Conduct Swarming Swarming refers to the coordinated use of accounts to overwhelm the information space with operation content. Unlike information flooding, swarming centres exclusively around a specific event or actor rather than a general narrative. Swarming relies on “horizontal communication” between information assets rather than a top-down, vertical command-and-control approach. TA17
T0049.006 Conduct Keyword Squatting Keyword squatting refers to the creation of online content, such as websites, articles, or social media accounts, around a specific search-engine-optimised term to overwhelm the search results for that term. An influence operation may keyword squat to increase content exposure to target audience members who query the exploited term in a search engine, and to manipulate the narrative around the term. TA17
T0049.007 Inauthentic Sites Amplify News and Narratives Inauthentic sites circulate and cross-post stories and amplify narratives. Often these sites have no masthead, bylines, or attribution. TA17
T0049.008 Generate Information Pollution Information Pollution occurs when threat actors attempt to ruin a source of information by flooding it with lots of inauthentic or unreliable content, intending to make it harder for legitimate users to find the information they’re looking for.

This sub-technique’s objective is to reduce exposure to target information, rather than promoting exposure to campaign content, for which the parent Technique T0049 can be used.

Analysts will need to infer what the motive for flooding an information space was when deciding whether to use T0049 or T0049.008 to tag a case when an information space is flooded. If such inference is not possible, default to T0049.

This Technique previously used the ID T0019.
TA17
T0057 Organise Events Coordinate and promote real-world events across media platforms, e.g. rallies, protests, gatherings in support of incident narratives. TA10
T0057.001 Pay for Physical Action Paying for physical action occurs when an influence operation pays individuals to act in the physical realm. An influence operation may pay for physical action to create specific situations and frame them in a way that supports operation narratives, for example, paying a group of people to burn a car to later post an image of the burning car and frame it as an act of protest. TA10
T0057.002 Conduct Symbolic Action Symbolic action refers to activities specifically intended to advance an operation’s narrative by signalling something to the audience, for example, a military parade supporting a state’s narrative of military superiority. An influence operation may use symbolic action to create falsified evidence supporting operation narratives in the physical information space. TA10
T0059 Play the Long Game Playing the long game refers to two phenomena: 1. Plan messaging and allow it to grow organically without conducting your own amplification. This is methodical and slow, and requires years for the message to take hold. 2. Develop a series of seemingly disconnected messaging narratives that eventually combine into a new narrative. TA11
T0060 Continue to Amplify Continue narrative or message amplification after the main incident work has finished. TA11
T0061 Sell Merchandise Selling merchandise refers to getting the message or narrative into physical space in the offline world while making money. TA10
T0065 Prepare Physical Broadcast Capabilities Create or co-opt broadcast capabilities (e.g. TV, radio). TA15
T0066 Degrade Adversary Plan to degrade an adversary’s image or ability to act. This could include preparation and use of harmful information about the adversary’s actions or reputation. TA02
T0068 Respond to Breaking News Event or Active Crisis Media attention on a story or event is heightened during a breaking news event, where unclear facts and incomplete information increase speculation, rumours, and conspiracy theories, which are all vulnerable to manipulation. TA14
T0072 Segment Audiences Create audience segmentations by features of interest to the influence campaign, including political affiliation, geographic location, income, demographics, and psychographics. TA13
T0072.001 Geographic Segmentation An influence operation may target populations in a specific geographic location, such as a region, state, or city. An influence operation may use geographic segmentation to Create Localised Content (see: Establish Legitimacy). TA13
T0072.002 Demographic Segmentation An influence operation may target populations based on demographic segmentation, including age, gender, and income. Demographic segmentation may be useful for influence operations aiming to change state policies that affect a specific population sector. For example, an influence operation attempting to influence Medicare funding in the United States would likely target U.S. voters over 65 years of age. TA13
T0072.003 Economic Segmentation An influence operation may target populations based on their income bracket, wealth, or other financial or economic division. TA13
T0072.004 Psychographic Segmentation An influence operation may target populations based on psychographic segmentation, which uses audience values and decision-making processes. An operation may individually gather psychographic data with its own surveys or collection tools or externally purchase data from social media companies or online surveys, such as personality quizzes. TA13
T0072.005 Political Segmentation An influence operation may target populations based on their political affiliations, especially when aiming to manipulate voting or change policy. TA13
T0073 Determine Target Audiences Determining the target audiences (segments of the population) who will receive campaign narratives and artefacts intended to achieve the strategic ends. TA01
T0074 Determine Strategic Ends These are the long-term end-states the campaign aims to bring about. They typically involve an advantageous position vis-a-vis competitors in terms of power or influence. The strategic goal may be to improve or simply to hold one’s position. Competition occurs in the public sphere in the domains of war, diplomacy, politics, economics, and ideology, and can play out between armed groups, nation-states, political parties, corporations, interest groups, or individuals. TA01
T0074.001 Geopolitical Advantage Favourable position on the international stage in terms of great power politics or regional rivalry. Geopolitics plays out in the realms of foreign policy, national security, diplomacy, and intelligence. It involves nation-state governments, heads of state, foreign ministers, intergovernmental organisations, and regional security alliances. TA01
T0074.002 Domestic Political Advantage Favourable position vis-à-vis national or sub-national political opponents such as political parties, interest groups, politicians, candidates. TA01
T0074.003 Economic Advantage Favourable position domestically or internationally in the realms of commerce, trade, finance, industry. Economics involves nation-states, corporations, banks, trade blocs, industry associations, cartels. TA01
T0074.004 Ideological Advantage Favourable position domestically or internationally in the market for ideas, beliefs, and world views. Competition plays out among faith systems, political systems, and value systems. It can involve sub-national, national or supra-national movements. TA01
T0075 Dismiss Push back against criticism by dismissing your critics. This might be arguing that the critics use a different standard for you than for other actors or themselves, or arguing that their criticism is biased. TA02
T0075.001 Discredit Credible Sources Plan to delegitimise the media landscape and degrade public trust in reporting by discrediting credible sources. This makes it easier to promote influence operation content. TA02
T0076 Distort Twist the narrative. Take information, or artefacts like images, and change the framing around them. TA02
T0077 Distract Shift attention to a different narrative or actor, for instance by accusing critics of the same activity that they’ve accused you of (e.g. police brutality). TA02
T0078 Dismay Threaten the critic or narrator of events. For instance, threaten journalists or news outlets reporting on a story. TA02
T0079 Divide Create conflict between subgroups, to widen divisions in a community. TA02
T0080 Map Target Audience Information Environment Mapping the target audience information environment analyses the information space itself, including social media analytics, web traffic, and media surveys. Mapping the information environment may help the influence operation determine the most realistic and popular information channels to reach its target audience. Mapping the target audience information environment aids influence operations in determining the most vulnerable areas of the information space to target with messaging. TA13
T0080.001 Monitor Social Media Analytics An influence operation may use social media analytics to determine which factors will increase the operation content’s exposure to its target audience on social media platforms, including views, interactions, and sentiment relating to topics and content types. The social media platform itself or a third-party tool may collect the metrics. TA13
T0080.002 Evaluate Media Surveys An influence operation may evaluate its own or third-party media surveys to determine what type of content appeals to its target audience. Media surveys may provide insight into an audience’s political views, social class, general interests, or other indicators used to tailor operation messaging to its target audience. TA13
T0080.003 Identify Trending Topics/Hashtags An influence operation may identify trending hashtags on social media platforms for later use in boosting operation content. A hashtag refers to a word or phrase preceded by the hash symbol (#) on social media used to identify messages and posts relating to a specific topic. All public posts that use the same hashtag are aggregated onto a centralised page dedicated to the word or phrase and sorted either chronologically or by popularity. TA13
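Identifying trending hashtags is also routine work for analysts monitoring an information space. A minimal sketch of counting hashtag frequency across a sample of posts (the sample posts below are invented for illustration):

```python
import re
from collections import Counter

# "#" followed by word characters, e.g. "#breaking" or "#election2024"
HASHTAG_RE = re.compile(r"#\w+")

def trending(posts):
    """Return (hashtag, count) pairs sorted by frequency, most common first."""
    counts = Counter(
        tag.lower() for post in posts for tag in HASHTAG_RE.findall(post)
    )
    return counts.most_common()

posts = [
    "Big news today #election2024 #breaking",
    "More on this #breaking",
    "Unrelated #cats",
]
print(trending(posts))  # '#breaking' appears first, with a count of 2
```

In practice the posts would come from a platform API or dataset, and counts would be windowed over time to surface what is trending now rather than overall.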
T0080.004 Conduct Web Traffic Analysis An influence operation may conduct web traffic analysis to determine which search engines, keywords, websites, and advertisements gain the most traction with its target audience. TA13
T0080.005 Assess Degree/Type of Media Access An influence operation may survey a target audience’s Internet availability and degree of media freedom to determine which target audience members will have access to operation content and on which platforms. An operation may face more difficulty targeting an information environment with heavy restrictions and media control than an environment with independent media, freedom of speech and of the press, and individual liberties. TA13
T0081 Identify Social and Technical Vulnerabilities Identifying social and technical vulnerabilities determines weaknesses within the target audience information environment for later exploitation. Vulnerabilities include decisive political issues, weak cybersecurity infrastructure, search engine data voids, and other technical and non technical weaknesses in the target information environment. Identifying social and technical vulnerabilities facilitates the later exploitation of the identified weaknesses to advance operation objectives. TA13
T0081.001 Find Echo Chambers Find or plan to create areas (social media groups, search term groups, hashtag groups etc) where individuals only engage with people they agree with. TA13
T0081.002 Identify Data Voids A data void refers to a word or phrase that results in little, manipulative, or low-quality search engine data. Data voids are hard to detect and relatively harmless until exploited by an entity aiming to quickly proliferate false or misleading information during a phenomenon that causes a high number of individuals to query the term or phrase. In the Plan phase, an influence operation may identify data voids for later exploitation in the operation. A 2019 report by Michael Golebiewski identifies five types of data voids. (1) “Breaking news” data voids occur when a keyword gains popularity during a short period of time, allowing an influence operation to publish false content before legitimate news outlets have an opportunity to publish relevant information. (2) An influence operation may create a “strategic new terms” data void by creating their own terms and publishing information online before promoting their keyword to the target audience. (3) An influence operation may publish content on “outdated terms” that have decreased in popularity, capitalising on most search engines’ preferences for recency. (4) “Fragmented concepts” data voids separate connections between similar ideas, isolating segment queries to distinct search engine results. (5) An influence operation may use “problematic queries” that previously resulted in disturbing or inappropriate content to promote messaging until mainstream media recontextualizes the term. TA13
T0081.003 Identify Existing Prejudices An influence operation may exploit existing racial, religious, demographic, or social prejudices to further polarise its target audience from the rest of the public. TA13
T0081.004 Identify Existing Fissures An influence operation may identify existing fissures to pit target populations against one another or facilitate a “divide-and-conquer" approach to tailor operation narratives along the divides. TA13
T0081.005 Identify Existing Conspiracy Narratives/Suspicions An influence operation may assess preexisting conspiracy theories or suspicions in a population to identify existing narratives that support operational objectives. TA13
T0081.006 Identify Wedge Issues A wedge issue is a divisive political issue, usually concerning a social phenomenon, that divides individuals along a defined line. An influence operation may exploit wedge issues by intentionally polarising the public along the wedge issue line and encouraging opposition between factions. TA13
T0081.007 Identify Target Audience Adversaries An influence operation may identify or create a real or imaginary adversary to centre operation narratives against. A real adversary may include certain politicians or political parties, while imaginary adversaries may include falsified “deep state” actors that, according to conspiracies, run the state behind public view. TA13
T0081.008 Identify Media System Vulnerabilities An influence operation may exploit existing weaknesses in a target’s media system. These weaknesses may include existing biases among media agencies, vulnerability to false news agencies on social media, or existing distrust of traditional media sources. An existing distrust among the public in the media system’s credibility holds high potential for exploitation by an influence operation when establishing alternative news agencies to spread operation content. TA13
T0082 Develop New Narratives Actors may develop new narratives to further strategic or tactical goals, especially when existing narratives do not adequately align with the campaign goals. New narratives provide more control in terms of crafting the message to achieve specific goals. However, new narratives may require more effort to disseminate than adapting or adopting existing narratives. TA14
T0083 Integrate Target Audience Vulnerabilities into Narrative An influence operation may seek to exploit the preexisting weaknesses, fears, and enemies of the target audience for integration into the operation’s narratives and overall strategy. Integrating existing vulnerabilities into the operational approach conserves resources by exploiting already weak areas of the target information environment instead of forcing the operation to create new vulnerabilities in the environment. TA14
T0084 Reuse Existing Content An operation may recycle content from its own previous operations or plagiarise content from external operations. An operation may launder information to conserve resources that would otherwise have been utilised to develop new content. TA06
T0084.001 Use Copypasta Copypasta refers to a piece of text that has been copied and pasted multiple times across various online platforms. A copypasta’s final form may differ from its original source text as users add, delete, or otherwise edit the content as they repost the text. TA06
T0084.002 Plagiarise Content An influence operation may take content from other sources without proper attribution. This content may be either misinformation content shared by others without malicious intent but now leveraged by the campaign as disinformation or disinformation content from other sources. TA06
T0084.003 Deceptively Labelled or Translated An influence operation may take authentic content from other sources and add deceptive labels or deceptively translate the content into other languages. TA06
T0084.004 Appropriate Content An influence operation may take content from other sources with proper attribution. This content may be either misinformation content shared by others without malicious intent but now leveraged by the campaign as disinformation or disinformation content from other sources. Examples include the appropriation of content from one inauthentic news site to another inauthentic news site or network in ways that align with the originator's licensing or terms of service. TA06
T0085 Develop Text-Based Content Creating and editing false or misleading text-based artefacts, often aligned with one or more specific narratives, for use in a disinformation campaign. TA06
T0085.001 Develop AI-Generated Text AI-generated text refers to synthetic text composed by computers using text-generating AI technology. Autonomous generation refers to content created by a bot without human input, also known as bot-created content generation. Autonomous generation represents the next step in automation after language generation and may lead to automated journalism. An influence operation may use read fakes or autonomous generation to quickly develop and distribute content to the target audience.

Associated Techniques and Sub-techniques:
T0085.008: Machine Translated Text: Use this sub-technique when AI has been used to generate a translation of a piece of text.
TA06
T0085.003 Develop Inauthentic News Articles An influence operation may develop false or misleading news articles aligned to their campaign goals or narratives. TA06
T0085.004 Develop Document Produce text in the form of a document. TA06
T0085.005 Develop Book Produce text content in the form of a book. 

This technique covers both e-books and physical books, however, the former is more easily deployed by threat actors given the lower cost to develop.
TA06
T0085.006 Develop Opinion Article Opinion articles (aka “Op-Eds” or “Editorials”) are articles or regular columns flagged as “opinion” posted to news sources, and can be contributed by people outside the organisation. 

Flagging articles as opinions allows news organisations to distinguish them from the typical expectations of objective news reporting while distancing the presented opinion from the organisation or its employees.

The use of this technique is not by itself an indication of malicious or inauthentic content; Op-eds are a common format in media. However, threat actors exploit op-eds to, for example, submit opinion articles to local media to promote their narratives.

Examples from the perspective of a news site involve publishing op-eds from perceived prestigious voices to give legitimacy to an inauthentic publication, or supporting causes by hosting op-eds from actors aligned with the organisation’s goals.
TA06
T0085.007 Create Fake Research Create fake academic research. For example, fake social science research is often aimed at hot-button social issues such as gender, race, and sexuality, while fake science research can target the climate science debate or pseudoscience such as anti-vaxx claims.

This Technique previously used the ID T0019.001.
TA06
T0085.008 Machine Translated Text Text which has been translated into another language using machine translation tools, such as AI. TA06
T0086 Develop Image-Based Content Creating and editing false or misleading visual artefacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include photographing staged real-life situations, repurposing existing digital images, or using image creation and editing technologies. TA06
T0086.001 Develop Memes Memes are one of the most important single artefact types in all of computational propaganda. Memes in this framework denote the narrow image-based definition, but that naming is no accident: these items have most of the important properties of Dawkins' original conception as a self-replicating unit of culture. Memes pull together reference and commentary; image and narrative; emotion and message. Memes are a powerful tool and the heart of modern influence campaigns. TA06
T0086.002 Develop AI-Generated Images (Deepfakes) Deepfakes refer to AI-generated falsified photos, videos, or soundbites. An influence operation may use deepfakes to depict an inauthentic situation by synthetically recreating an individual’s face, body, voice, and physical gestures.

Associated Techniques and Sub-techniques:
T0145.002: AI-Generated Account Imagery: Analysts should use this sub-technique to document use of AI generated imagery in accounts’ profile pictures or other account imagery.
TA06
T0086.003 Deceptively Edit Images (Cheap Fakes) Cheap fakes utilise less sophisticated measures of altering an image, video, or audio; for example, slowing, speeding, or cutting footage to create a false context surrounding an image or event. TA06
T0086.004 Aggregate Information into Evidence Collages Evidence collages are image files that aggregate multiple pieces of purported evidence into a single shareable artefact, a format described by researcher Joan Donovan. TA06
T0087 Develop Video-Based Content Creating and editing false or misleading video artefacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include staging videos of purportedly real situations, repurposing existing video artefacts, or using AI-generated video creation and editing technologies (including deepfakes). TA06
T0087.001 Develop AI-Generated Videos (Deepfakes) Deepfakes refer to AI-generated falsified photos, videos, or soundbites. An influence operation may use deepfakes to depict an inauthentic situation by synthetically recreating an individual’s face, body, voice, and physical gestures. TA06
T0087.002 Deceptively Edit Video (Cheap Fakes) Cheap fakes utilise less sophisticated measures of altering an image, video, or audio; for example, slowing, speeding, or cutting footage to create a false context surrounding an image or event. TA06
T0088 Develop Audio-Based Content Creating and editing false or misleading audio artefacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include creating completely new audio content, repurposing existing audio artefacts (including cheap fakes), or using AI-generated audio creation and editing technologies (including deepfakes). TA06
T0088.001 Develop AI-Generated Audio (Deepfakes) Deepfakes refer to AI-generated falsified photos, videos, or soundbites. An influence operation may use deepfakes to depict an inauthentic situation by synthetically recreating an individual’s face, body, voice, and physical gestures. TA06
T0088.002 Deceptively Edit Audio (Cheap Fakes) Cheap fakes utilise less sophisticated measures of altering an image, video, or audio; for example, slowing, speeding, or cutting footage to create a false context surrounding an image or event. TA06
T0089 Obtain Private Documents Procuring documents that are not publicly available, by whatever means -- whether legal or illegal, highly-resourced or less so. These documents can include authentic non-public documents, authentic non-public documents that have been altered, or inauthentic documents intended to appear as if they are authentic non-public documents. All of these types of documents can be "leaked" during later stages in the operation. TA06
T0089.001 Obtain Authentic Documents Procure authentic documents that are not publicly available, by whatever means -- whether legal or illegal, highly-resourced or less so. These documents can be "leaked" during later stages in the operation. TA06
T0089.003 Alter Authentic Documents Alter authentic documents (public or non-public) to achieve campaign goals. The altered documents are intended to appear as if they are authentic and can be "leaked" during later stages in the operation. TA06
T0091 Recruit Malign Actors Operators recruit malign actors by paying, recruiting, or exerting control over individuals; these include trolls, partisans, and contractors. TA15
T0091.001 Recruit Contractors Operators recruit paid contractors to support the campaign. TA15
T0091.002 Recruit Partisans Operators recruit partisans (ideologically-aligned individuals) to support the campaign. TA15
T0091.003 Enlist Troll Accounts An influence operation may hire trolls, or human operators of fake accounts that aim to provoke others by posting and amplifying content about controversial issues. Trolls can serve to discredit an influence operation’s opposition or bring attention to the operation’s cause through debate. Classic trolls refer to regular people who troll for personal reasons, such as attention-seeking or boredom. Classic trolls may advance operation narratives by coincidence but are not directly affiliated with any larger operation. Conversely, hybrid trolls act on behalf of another institution, such as a state or financial organisation, and post content with a specific ideological goal. Hybrid trolls may be highly advanced and institutionalised or less organised and work for a single individual. TA15
T0092 Build Network Operators build their own network, creating links between accounts -- whether authentic or inauthentic -- in order to amplify and promote narratives and artefacts, and encourage further growth of their network, as well as the ongoing sharing of and engagement with operational content. TA15
T0092.001 Create Organisations Influence operations may establish organisations with legitimate or falsified hierarchies, staff, and content to structure operation assets, provide a sense of legitimacy to the operation, or provide institutional backing to operation activities. TA15
T0092.002 Use Follow Trains A follow train is a group of people who follow each other on a social media platform, often as a way for an individual or campaign to grow its social media following. Follow trains may be a violation of platform Terms of Service. They are also known as follow-for-follow groups. TA15
T0092.003 Create Community or Sub-Group When there is not an existing community or sub-group that meets a campaign's goals, an influence operation may seek to create a community or sub-group. TA15
T0093 Acquire/Recruit Network Operators acquire an existing network by paying, recruiting, or exerting control over the leaders of the existing network. TA15
T0093.001 Fund Proxies An influence operation may fund proxies, or external entities that work for the operation. An operation may recruit/train users with existing sympathies towards the operation’s narratives and/or goals as proxies. Funding proxies serves various purposes, including:

- Diversifying operation locations to complicate attribution
- Reducing the workload for direct operation assets
TA15
T0093.002 Acquire Botnets A botnet is a group of bots that can function in coordination with each other. TA15
T0094 Infiltrate Existing Networks Operators deceptively insert social assets into existing networks as group members in order to influence the members of the network and the wider information environment that the network impacts. TA15
T0094.001 Identify Susceptible Targets in Networks When seeking to infiltrate an existing network, an influence operation may identify individuals and groups that might be susceptible to being co-opted or influenced. TA15
T0094.002 Utilise Butterfly Attacks Butterfly attacks occur when operators pretend to be members of a certain social group, usually a group that struggles for representation. An influence operation may mimic a group to insert controversial statements into the discourse, encourage the spread of operation content, or promote harassment among group members. Unlike astroturfing, butterfly attacks aim to infiltrate and discredit existing grassroots movements, organisations, and media campaigns. TA15
T0095 Develop Owned Media Assets An owned media asset refers to an agency or organisation through which an influence operation may create, develop, and host content and narratives. Owned media assets include websites, blogs, social media pages, forums, and other platforms that facilitate the creation and organisation of content. TA15
T0096 Leverage Content Farms Using the services of large-scale content providers for creating and amplifying campaign artefacts at scale. TA15
T0096.001 Create Content Farms An influence operation may create an organisation for creating and amplifying campaign artefacts at scale. TA15
T0096.002 Outsource Content Creation to External Organisations An influence operation may outsource content creation to external companies to avoid attribution, increase the rate of content creation, or improve content quality, i.e., by employing an organisation that can create content in the target audience’s native language. Employed organisations may include marketing companies for tailored advertisements or external content farms for high volumes of targeted media. TA15
T0097 Present Persona This Technique contains different types of personas commonly taken on by threat actors during influence operations.

Analysts should use T0097’s sub-techniques to document the type of persona which an account is presenting. For example, an account which describes itself as being a journalist can be tagged with T0097.102: Journalist Persona.

Personas presented by individuals include:

T0097.100: Individual Persona
T0097.101: Local Persona
T0097.102: Journalist Persona
T0097.103: Activist Persona
T0097.104: Hacktivist Persona
T0097.105: Military Personnel Persona
T0097.106: Recruiter Persona
T0097.107: Researcher Persona
T0097.108: Expert Persona
T0097.109: Romantic Suitor Persona
T0097.110: Party Official Persona
T0097.111: Government Official Persona
T0097.112: Government Employee Persona

This Technique also houses institutional personas commonly taken on by threat actors:

T0097.200: Institutional Persona
T0097.201: Local Institution Persona
T0097.202: News Outlet Persona
T0097.203: Fact Checking Organisation Persona
T0097.204: Think Tank Persona
T0097.205: Business Persona
T0097.206: Government Institution Persona
T0097.207: NGO Persona
T0097.208: Social Cause Persona

By using a persona, a threat actor is adding the perceived legitimacy of the persona to their narratives and activities.
TA16
T0097.100 Individual Persona This sub-technique can be used to indicate that an entity is presenting itself as an individual. If the person is presenting themselves as having one of the personas listed below then these sub-techniques should be used instead, as they indicate both the type of persona they presented and that the entity presented itself as an individual:

T0097.101: Local Persona
T0097.102: Journalist Persona
T0097.103: Activist Persona
T0097.104: Hacktivist Persona
T0097.105: Military Personnel Persona
T0097.106: Recruiter Persona
T0097.107: Researcher Persona
T0097.108: Expert Persona
T0097.109: Romantic Suitor Persona
T0097.110: Party Official Persona
T0097.111: Government Official Persona
T0097.112: Government Employee Persona
TA16
T0097.101 Local Persona A person with a local persona presents themselves as living in a particular geography or having local knowledge relevant to a narrative.

While presenting as a local is not an indication of inauthentic behaviour, an influence operation may have its narratives amplified by people presenting as local to a target area. Threat actors can fabricate locals (T0143.002: Fabricated Persona, T0097.101: Local Persona) to add credibility to their narratives, or to misrepresent the real opinions of locals in the area.

People who are legitimate locals (T0143.001: Authentic Persona, T0097.101: Local Persona) can use their persona for malicious purposes, or be exploited by threat actors. For example, someone could take money for using their position as a local to provide legitimacy to a false narrative or be tricked into doing so without their knowledge.

Associated Techniques and Sub-techniques
T0097.201: Local Institution Persona: Analysts should use this sub-technique to catalogue cases where an institution is presenting as a local, such as a local news organisation or local business.
TA16
T0097.102 Journalist Persona A person with a journalist persona presents themselves as a reporter or journalist delivering news, conducting interviews, investigations etc.

While presenting as a journalist is not an indication of inauthentic behaviour, an influence operation may have its narratives amplified by people presenting as journalists. Threat actors can fabricate journalists to give the appearance of legitimacy, justifying the actor’s requests for interviews, etc (T0143.002: Fabricated Persona, T0097.102: Journalist Persona).

People who have legitimately developed a persona as a journalist (T0143.001: Authentic Persona, T0097.102: Journalist Persona) can use it for malicious purposes, or be exploited by threat actors. For example, someone could take money for using their position as a trusted journalist to provide legitimacy to a false narrative or be tricked into doing so without the journalist’s knowledge.

Associated Techniques and Sub-techniques
T0097.202: News Outlet Persona: People with a journalist persona may present as being part of a news outlet.
T0097.101: Local Persona: People with a journalist persona may present themselves as local reporters.
TA16
T0097.103 Activist Persona A person with an activist persona presents themselves as an activist; an individual who campaigns for a political cause, organises related events, etc.

While presenting as an activist is not an indication of inauthentic behaviour, an influence operation may have its narratives amplified by people presenting as activists. Threat actors can fabricate activists to give the appearance of popular support for an evolving grassroots movement (see T0143.002: Fabricated Persona, T0097.103: Activist Persona).

People who are legitimate activists can use this persona for malicious purposes, or be exploited by threat actors. For example, someone could take money for using their position as an activist to provide visibility to a false narrative or be tricked into doing so without their knowledge (T0143.001: Authentic Persona, T0097.103: Activist Persona).

Associated Techniques and Sub-techniques
T0097.104: Hacktivist Persona: Analysts should use this sub-technique to catalogue cases where an individual is presenting themselves as someone engaged in activism who uses technical tools and methods, including building technical infrastructure and conducting offensive cyber operations, to achieve their goals.
T0097.207: NGO Persona: People with an activist persona may present as being part of an NGO.
T0097.208: Social Cause Persona: Analysts should use this sub-technique to catalogue cases where an online account is presenting as posting content related to a particular social cause, while not presenting as an individual.
TA16
T0097.104 Hacktivist Persona A person with a hacktivist persona presents themselves as an activist who conducts offensive cyber operations or builds technical infrastructure for political purposes, rather than the financial motivations commonly attributed to hackers; hacktivists are hacker activists who use their technical knowledge to take political action.

Hacktivists can build technical infrastructure to support other activists, including secure communication channels and surveillance and censorship circumvention. They can also conduct DDOS attacks and other offensive cyber operations, aiming to take down digital assets or gain access to proprietary information. An influence operation may use hacktivist personas to support their operational narratives and legitimise their operational activities.

Fabricated Hacktivists are sometimes referred to as “Faketivists”.

Associated Techniques and Sub-techniques
T0097.103: Activist Persona: Analysts should use this sub-technique to catalogue cases where an individual is presenting themselves as someone engaged in activism but doesn’t present themselves as using technical tools and methods to achieve their goals.
TA16
T0097.105 Military Personnel Persona A person with a military personnel persona presents themselves as a serving member or veteran of a military organisation operating in an official capacity on behalf of a government.

While presenting as military personnel is not an indication of inauthentic behaviour, an influence operation may have its narratives amplified by people presenting as military personnel. Threat actors can fabricate military personnel (T0143.002: Fabricated Persona, T0097.105: Military Personnel Persona) to pose as experts on military topics, or to discredit geopolitical adversaries by pretending to be one of their military personnel and spreading discontent.

People who have legitimately developed a military persona (T0143.001: Authentic Persona, T0097.105: Military Personnel Persona) can use it for malicious purposes, or be exploited by threat actors. For example, someone could take money for using their position as a member of the military to provide legitimacy to a false narrative or be tricked into doing so without their knowledge.
TA16
T0097.106 Recruiter Persona A person with a recruiter persona presents themselves as a potential employer or provider of freelance work.

While presenting as a recruiter is not an indication of inauthentic behaviour, threat actors fabricate recruiters (T0143.002: Fabricated Persona, T0097.106: Recruiter Persona) to justify asking for personal information from their targets or to trick targets into working for the threat actors (without revealing who they are).

Associated Techniques and Sub-techniques
T0097.205: Business Persona: People with a recruiter persona may present as being part of a business which they are recruiting for.
TA16
T0097.107 Researcher Persona A person with a researcher persona presents themselves as conducting research (e.g. for academic institutions, or think tanks), or having previously conducted research.

While presenting as a researcher is not an indication of inauthentic behaviour, an influence operation may have its narratives amplified by people presenting as researchers. Threat actors can fabricate researchers (T0143.002: Fabricated Persona, T0097.107: Researcher Persona) to add credibility to their narratives.

People who are legitimate researchers (T0143.001: Authentic Persona, T0097.107: Researcher Persona) can use their persona for malicious purposes, or be exploited by threat actors. For example, someone could take money for using their position as a Researcher to provide legitimacy to a false narrative or be tricked into doing so without their knowledge.

Associated Techniques and Sub-techniques
T0097.204: Think Tank Persona: People with a researcher persona may present as being part of a think tank.
T0097.108: Expert Persona: People who present as researching a given topic are likely to also present as having expertise in the area.
TA16
T0097.108 Expert Persona A person with an expert persona presents themselves as having expertise or experience in a field. Commonly the persona’s expertise will be called upon to add credibility to a given narrative.

While presenting as an expert is not an indication of inauthentic behaviour, an influence operation may have its narratives amplified by people presenting as experts. Threat actors can fabricate experts (T0143.002: Fabricated Persona, T0097.108: Expert Persona) to add credibility to their narratives.

People who are legitimate experts (T0143.001: Authentic Persona, T0097.108: Expert Persona) can make mistakes, use their persona for malicious purposes, or be exploited by threat actors. For example, someone could take money for using their position as an expert to provide legitimacy to a false narrative or be tricked into doing so without their knowledge.

Associated Techniques and Sub-techniques
T0097.107: Researcher Persona: People who present as experts may also present as conducting or having conducted research into their specialist subject.
T0097.204: Think Tank Persona: People with an expert persona may present as being part of a think tank.
TA16
T0097.109 Romantic Suitor Persona A person with a romantic suitor persona presents themselves as seeking a romantic or physical connection with another person.

While presenting as seeking a romantic or physical connection is not an indication of inauthentic behaviour, threat actors can use dating apps, social media channels or dating websites to fabricate romantic suitors to lure targets they can blackmail, extract information from, deceive or trick into giving them money (T0143.002: Fabricated Persona, T0097.109: Romantic Suitor Persona).

Honeypotting in espionage and pig butchering in scamming are commonly associated with romantic suitor personas.

Associated Techniques and Sub-techniques
T0151.017: Dating Platform: Analysts can use this sub-technique for tagging cases where an account has been identified as using a dating platform.
TA16
T0097.110 Party Official Persona A person who presents as an official member of a political party, such as leaders of political parties, candidates standing to represent constituents, and campaign staff.

Presenting as an official of a political party is not an indication of inauthentic behaviour, however threat actors may fabricate individuals who work in political parties to add credibility to their narratives (T0143.002: Fabricated Persona, T0097.110: Party Official Persona). They may also impersonate existing officials of political parties (T0143.003: Impersonated Persona, T0097.110: Party Official Persona).

Legitimate members of political parties could use their persona for malicious purposes, or be exploited by threat actors (T0143.001: Authentic Persona, T0097.110: Party Official Persona). For example, an electoral candidate could take money for using their position to provide legitimacy to a false narrative, or be tricked into doing so without their knowledge.

Associated Techniques and Sub-techniques
T0097.111: Government Official Persona: Analysts should use this sub-technique to catalogue cases where an individual is presenting as a member of a government. 

Some party officials will also be government officials. For example, in the United Kingdom the head of government is commonly also the head of their political party.

Some party officials won’t be government officials. For example, members of a party standing in an election, or party officials who work outside of government (e.g. campaign staff).
TA16
T0097.111 Government Official Persona A person who presents as an active or previous government official has the government official persona. These are officials serving in government, such as heads of government departments, leaders of countries, and members of government selected to represent constituents.

Presenting as a government official is not an indication of inauthentic behaviour, however threat actors may fabricate individuals who work in government to add credibility to their narratives (T0143.002: Fabricated Persona, T0097.111: Government Official Persona). They may also impersonate existing members of government (T0143.003: Impersonated Persona, T0097.111: Government Official Persona).

Legitimate government officials could use their persona for malicious purposes, or be exploited by threat actors (T0143.001: Authentic Persona, T0097.111: Government Official Persona). For example, a government official could take money for using their position to provide legitimacy to a false narrative, or be tricked into doing so without their knowledge.

Associated Techniques and Sub-techniques
T0097.110: Party Official Persona: Analysts should use this sub-technique to catalogue cases where an individual is presenting as a member of a political party. 

Not all government officials are political party officials (such as outside experts brought into government) and not all political party officials are government officials (such as people standing for office who are not yet working in government).

T0097.206: Government Institution Persona: People presenting as members of a government may also represent a government institution which they are associated with.

T0097.112: Government Employee Persona: Analysts should use this sub-technique to document people presenting as professionals hired to serve in government institutions and departments, not officials selected to represent constituents, or assigned official roles in government (such as heads of departments).
TA16
T0097.112 Government Employee Persona A person who presents as an active or previous civil servant has the government employee persona. These are professionals hired to serve in government institutions and departments, not officials selected to represent constituents, or assigned official roles in government (such as heads of departments).

Presenting as a government employee is not an indication of inauthentic behaviour, however threat actors may fabricate individuals who work in government to add credibility to their narratives (T0143.002: Fabricated Persona, T0097.112: Government Employee Persona). They may also impersonate existing government employees (T0143.003: Impersonated Persona, T0097.112: Government Employee Persona).

Legitimate government employees could use their persona for malicious purposes, or be exploited by threat actors (T0143.001: Authentic Persona, T0097.112: Government Employee Persona). For example, a government employee could take money for using their position to provide legitimacy to a false narrative, or be tricked into doing so without their knowledge.

Associated Techniques and Sub-techniques
T0097.111: Government Official Persona: Analysts should use this technique to document people who present as an active or previous government official, such as heads of government departments, leaders of countries, and members of government selected to represent constituents.
T0097.206: Government Institution Persona: People presenting as members of a government may also represent a government institution which they are associated with.
TA16
T0097.200 Institutional Persona This Technique can be used to indicate that an entity is presenting itself as an institution. If the organisation is presenting itself as having one of the personas listed below then these Techniques should be used instead, as they indicate both that the entity presented itself as an institution, and the type of persona they presented:

T0097.201: Local Institution Persona
T0097.202: News Outlet Persona
T0097.203: Fact Checking Organisation Persona
T0097.204: Think Tank Persona
T0097.205: Business Persona
T0097.206: Government Institution Persona
T0097.207: NGO Persona
T0097.208: Social Cause Persona
TA16
T0097.201 Local Institution Persona Institutions which present themselves as operating in a particular geography, or as having local knowledge relevant to a narrative, are presenting a local institution persona.

While presenting as a local institution is not an indication of inauthentic behaviour, threat actors may present themselves as such (T0143.002: Fabricated Persona, T0097.201: Local Institution Persona) to add credibility to their narratives, or misrepresent the real opinions of locals in the area.

Legitimate local institutions could use their persona for malicious purposes, or be exploited by threat actors (T0143.001: Authentic Persona, T0097.201: Local Institution Persona). For example, a local institution could take money for using their position to provide legitimacy to a false narrative, or be tricked into doing so without their knowledge.

Associated Techniques and Sub-techniques
T0097.101: Local Persona: Institutions presenting as local may also present locals working within the organisation.
TA16
T0097.202 News Outlet Persona An institution with a news outlet persona presents itself as an organisation which delivers new information to its target audience.

While presenting as a news outlet is not an indication of inauthentic behaviour, an influence operation may have its narratives amplified by news organisations. Threat actors can fabricate news organisations (T0143.002: Fabricated Persona, T0097.202: News Outlet Persona), or they can impersonate existing news outlets (T0143.003: Impersonated Persona, T0097.202: News Outlet Persona).

Legitimate news organisations could use their persona for malicious purposes, or be exploited by threat actors (T0143.001: Authentic Persona, T0097.202: News Outlet Persona).

Associated Techniques and Sub-techniques
T0097.102: Journalist Persona: Institutions presenting as news outlets may also present journalists working within the organisation.
T0097.201: Local Institution Persona: Institutions presenting as news outlets may present as being a local news outlet.
T0097.203: Fact Checking Organisation Persona: Institutions presenting as news outlets may also deliver a fact checking service (e.g. The UK’s BBC News has the fact checking service BBC Verify). When an actor presents as the fact checking arm of a news outlet, they are presenting both a News Outlet Persona and a Fact Checking Organisation Persona.
TA16
T0097.203 Fact Checking Organisation Persona An institution with a fact checking organisation persona presents itself as an organisation which produces reports which assess the validity of others’ reporting / statements.

While presenting as a fact checking organisation is not an indication of inauthentic behaviour, an influence operation may have its narratives amplified by fact checking organisations. Threat actors can fabricate fact checking organisations (T0143.002: Fabricated Persona, T0097.203: Fact Checking Organisation Persona), or they can impersonate existing fact checking outlets (T0143.003: Impersonated Persona, T0097.203: Fact Checking Organisation Persona).

Legitimate fact checking organisations could use their persona for malicious purposes, or be exploited by threat actors (T0143.001: Authentic Persona, T0097.203: Fact Checking Organisation Persona).

Associated Techniques and Sub-techniques
T0097.102: Journalist Persona: Institutions presenting as fact checking organisations may also present journalists working within the organisation.
T0097.202: News Outlet Persona: Fact checking organisations may present as operating as part of a larger news outlet (e.g. The UK’s BBC News has the fact checking service BBC Verify). When an actor presents as the fact checking arm of a news outlet, they are presenting both a News Outlet Persona and a Fact Checking Organisation Persona.
TA16
T0097.204 Think Tank Persona An institution with a think tank persona presents itself as a think tank: an organisation that aims to conduct original research and propose new policies or solutions, especially for social and scientific problems.

While presenting as a think tank is not an indication of inauthentic behaviour, think tank personas are commonly used by threat actors as a front for their operational activity (T0143.002: Fabricated Persona, T0097.204: Think Tank Persona). They may be created to give legitimacy to narratives and allow them to suggest politically beneficial solutions to societal issues.

Legitimate think tanks could have a political bias that they may not be transparent about, they could use their persona for malicious purposes, or they could be exploited by threat actors (T0143.001: Authentic Persona, T0097.204: Think Tank Persona). For example, a think tank could take money for using their position to provide legitimacy to a false narrative, or be tricked into doing so without their knowledge.

Associated Techniques and Sub-techniques
T0097.107: Researcher Persona: Institutions presenting as think tanks may also present researchers working within the organisation.
TA16
T0097.205 Business Persona An institution with a business persona presents itself as a for-profit organisation which provides goods or services for a price.

While presenting as a business is not an indication of inauthentic behaviour, business personas may be used by threat actors as a front for their operational activity (T0143.002: Fabricated Persona, T0097.205: Business Persona).

Threat actors may also impersonate existing businesses (T0143.003: Impersonated Persona, T0097.205: Business Persona) to exploit their brand or cause reputational damage.

Legitimate businesses could use their persona for malicious purposes, or be exploited by threat actors (T0143.001: Authentic Persona, T0097.205: Business Persona). For example, a business could take money for using their position to provide legitimacy to a false narrative, or be tricked into doing so without their knowledge.
TA16
T0097.206 Government Institution Persona Institutions which present themselves as governments, or government ministries, are presenting a government institution persona.

While presenting as a government institution is not an indication of inauthentic behaviour, threat actors may impersonate existing government institutions as part of their operation (T0143.003: Impersonated Persona, T0097.206: Government Institution Persona) to add legitimacy to their narratives or to discredit the government.

Legitimate government institutions could use their persona for malicious purposes, or be exploited by threat actors (T0143.001: Authentic Persona, T0097.206: Government Institution Persona). For example, a government institution could be used by elected officials to spread inauthentic narratives.

Associated Techniques and Sub-techniques
T0097.111: Government Official Persona: Institutions presenting as governments may also present officials working within the organisation.
T0097.112: Government Employee Persona: Institutions presenting as governments may also present employees working within the organisation.
TA16
T0097.207 NGO Persona Institutions which present themselves as an NGO (Non-Governmental Organisation), an organisation which provides services or advocates for public policy (while not being directly affiliated with any government), are presenting an NGO persona.

While presenting as an NGO is not an indication of inauthentic behaviour, NGO personas are commonly used by threat actors (such as intelligence services) as a front for their operational activity (T0143.002: Fabricated Persona, T0097.207: NGO Persona). They are created to give legitimacy to the influence operation and potentially to infiltrate grassroots movements.

Legitimate NGOs could use their persona for malicious purposes, or be exploited by threat actors (T0143.001: Authentic Persona, T0097.207: NGO Persona). For example, an NGO could take money for using their position to provide legitimacy to a false narrative, or be tricked into doing so without their knowledge.

Associated Techniques and Sub-techniques:
T0097.103: Activist Persona: Institutions presenting as NGOs may also present activists working within the organisation.
TA16
T0097.208 Social Cause Persona Online accounts which present themselves as focusing on a social cause are presenting the Social Cause Persona. Examples include accounts which post about current affairs, such as discrimination faced by minorities.

While presenting as an account invested in a social cause is not an indication of inauthentic behaviour, such personas have been used by threat actors to exploit people's legitimate emotional investment in social causes that matter to them (T0143.002: Fabricated Persona, T0097.208: Social Cause Persona).

Legitimate accounts focused on a social cause could use their persona for malicious purposes, or be exploited by threat actors (T0143.001: Authentic Persona, T0097.208: Social Cause Persona). For example, the account holders could take money for using their position to provide legitimacy to a false narrative, or be tricked into doing so without their knowledge.

Associated Techniques and Sub-techniques:
T0097.103: Activist Persona: Analysts should use this sub-technique to catalogue cases where an individual is presenting themselves as an activist related to a social cause. Accounts with social cause personas do not present themselves as individuals, but may have activists controlling the accounts.
TA16
T0098 Establish Inauthentic News Sites Modern computational propaganda makes use of a cadre of imposter news sites spreading globally. These sites, sometimes motivated by concerns other than propaganda (for instance, click-based revenue), often have some superficial markers of authenticity, such as naming and site design. But many can be quickly exposed with reference to their ownership, reporting history and advertising details. TA16
T0098.001 Create Inauthentic News Sites Create Inauthentic News Sites TA16
T0098.002 Leverage Existing Inauthentic News Sites Leverage Existing Inauthentic News Sites TA16
T0100 Co-Opt Trusted Sources An influence operation may co-opt trusted sources by infiltrating or repurposing a source to reach a target audience through existing, previously reliable networks. Co-opted trusted sources may include: - National or local news outlets - Research or academic publications - Online blogs or websites TA16
T0100.001 Co-Opt Trusted Individuals Co-Opt Trusted Individuals TA16
T0100.002 Co-Opt Grassroots Groups Co-Opt Grassroots Groups TA16
T0100.003 Co-Opt Influencers Co-opt Influencers TA16
T0101 Create Localised Content Localised content refers to content that appeals to a specific community of individuals, often in defined geographic areas. An operation may create localised content using local language and dialects to resonate with its target audience and blend in with other local news and social media. Localised content may help an operation increase legitimacy, avoid detection, and complicate external attribution. TA05
T0102 Leverage Echo Chambers/Filter Bubbles An echo chamber refers to an internet subgroup, often along ideological lines, where individuals only engage with “others with which they are already in agreement.” A filter bubble refers to an algorithm's placement of an individual in content that they agree with or regularly engage with, possibly entrapping the user into a bubble of their own making. An operation may create these isolated areas of the internet by matching existing groups, or by aggregating individuals into a single target audience based on shared interests, politics, values, demographics, and other characteristics. Echo chambers and filter bubbles help to reinforce similar biases and content to the same target audience members. TA05
T0102.001 Use Existing Echo Chambers/Filter Bubbles Use existing Echo Chambers/Filter Bubbles TA05
T0102.002 Create Echo Chambers/Filter Bubbles Create Echo Chambers/Filter Bubbles TA05
T0102.003 Exploit Data Voids A data void refers to a word or phrase that results in little, manipulative, or low-quality search engine data. Data voids are hard to detect and relatively harmless until exploited by an entity aiming to quickly proliferate false or misleading information during a phenomenon that causes a high number of individuals to query the term or phrase. In the Plan phase, an influence operation may identify data voids for later exploitation in the operation. A 2019 report by Michael Golebiewski identifies five types of data voids. (1) “Breaking news” data voids occur when a keyword gains popularity during a short period of time, allowing an influence operation to publish false content before legitimate news outlets have an opportunity to publish relevant information. (2) An influence operation may create a “strategic new terms” data void by creating their own terms and publishing information online before promoting their keyword to the target audience. (3) An influence operation may publish content on “outdated terms” that have decreased in popularity, capitalising on most search engines’ preferences for recency. (4) “Fragmented concepts” data voids separate connections between similar ideas, isolating segment queries to distinct search engine results. (5) An influence operation may use “problematic queries” that previously resulted in disturbing or inappropriate content to promote messaging until mainstream media recontextualises the term. TA05
T0107 Bookmarking and Content Curation Platforms for searching, sharing, and curating content and media. Examples include Pinterest, Flipboard, etc. TA07
T0109 Consumer Review Networks Platforms for finding, reviewing, and sharing information about brands, products, services, restaurants, travel destinations, etc. Examples include Yelp, TripAdvisor, etc. TA07
T0110 Formal Diplomatic Channels Leveraging formal, traditional, diplomatic channels to communicate with foreign governments (written documents, meetings, summits, diplomatic visits, etc). This type of diplomacy is conducted by diplomats of one nation with diplomats and other officials of another nation or international organisation. TA07
T0111 Traditional Media Examples include TV, Newspaper, Radio, etc. TA07
T0111.001 TV TV TA07
T0111.002 Newspaper Newspaper TA07
T0111.003 Radio Radio TA07
T0113 Employ Commercial Analytic Firms Commercial analytic firms collect data on target audience activities and evaluate the data to detect trends, such as content receiving high click-rates. An influence operation may employ commercial analytic firms to facilitate external collection on its target audience, complicating attribution efforts and better tailoring the content to audience preferences. TA15
T0114 Deliver Ads Delivering content via any form of paid media or advertising. TA09
T0114.001 Social Media Social Media TA09
T0114.002 Traditional Media Examples include TV, Radio, Newspaper, billboards TA09
T0115 Post Content Delivering content by posting via owned media (assets that the operator controls). TA09
T0115.001 Share Memes Memes are one of the most important single artefact types in all of computational propaganda. Memes in this framework denotes the narrow image-based definition. But that naming is no accident, as these items have most of the important properties of Dawkins' original conception as a self-replicating unit of culture. Memes pull together reference and commentary; image and narrative; emotion and message. Memes are a powerful tool and the heart of modern influence campaigns. TA09
T0115.002 Post Violative Content to Provoke Takedown and Backlash Post Violative Content to Provoke Takedown and Backlash. TA09
T0115.003 One-Way Direct Posting Direct posting refers to a method of posting content via a one-way messaging service, where the recipient cannot directly respond to the poster’s messaging. An influence operation may post directly to promote operation narratives to the target audience without allowing opportunities for fact-checking or disagreement, creating a false sense of support for the narrative. TA09
T0116 Comment or Reply on Content Delivering content by replying or commenting via owned media (assets that the operator controls). TA09
T0116.001 Post Inauthentic Social Media Comment Use government-paid social media commenters, astroturfers, or chat bots (programmed to reply to specific keywords/hashtags) to influence online conversations, product reviews, and website comment forums. TA09
T0117 Attract Traditional Media Deliver content by attracting the attention of traditional media (earned media). TA09
T0118 Amplify Existing Narrative An influence operation may amplify existing narratives that align with its own narratives to support operation objectives. TA17
T0119 Cross-Posting Cross-posting refers to posting the same message to multiple internet discussions, social media platforms or accounts, or news groups at one time. An influence operation may post content online in multiple communities and platforms to increase the chances of content exposure to the target audience. TA17
T0119.001 Post across Groups An influence operation may post content across groups to spread narratives and content to new communities within the target audiences or to new target audiences. TA17
T0119.002 Post across Platform An influence operation may post content across platforms to spread narratives and content to new communities within the target audiences or to new target audiences. Posting across platforms can also remove opposition and context, helping the narrative spread with less opposition on the cross-posted platform. TA17
T0119.003 Post across Disciplines Post Across Disciplines TA17
T0120 Incentivize Sharing Incentivizing content sharing refers to actions that encourage users to share content themselves, reducing the need for the operation itself to post and promote its own content. TA17
T0120.001 Use Affiliate Marketing Programmes Use Affiliate Marketing Programmes TA17
T0120.002 Use Contests and Prizes Use Contests and Prizes TA17
T0121 Manipulate Platform Algorithm Manipulating a platform algorithm refers to conducting activity on a platform in a way that intentionally targets its underlying algorithm. After analysing a platform’s algorithm (see: Select Platforms), an influence operation may use a platform in a way that increases its content exposure, avoids content removal, or otherwise benefits the operation’s strategy. For example, an influence operation may use bots to amplify its posts so that the platform’s algorithm recognises engagement with operation content and further promotes the content on user timelines. TA17
T0121.001 Bypass Content Blocking Bypassing content blocking refers to actions taken to circumvent network security measures that prevent users from accessing certain servers, resources, or other online spheres. An influence operation may bypass content blocking to proliferate its content on restricted areas of the internet. Common strategies for bypassing content blocking include: - Altering IP addresses to avoid IP filtering - Using a Virtual Private Network (VPN) to avoid IP filtering - Using a Content Delivery Network (CDN) to avoid IP filtering - Enabling encryption to bypass packet inspection blocking - Manipulating text to avoid filtering by keywords - Posting content on multiple platforms to avoid platform-specific removals - Using local facilities or modified DNS servers to avoid DNS filtering TA17
T0122 Direct Users to Alternative Platforms Direct users to alternative platforms refers to encouraging users to move from the platform on which they initially viewed operation content and engage with content on alternate information channels, including separate social media channels and inauthentic websites. An operation may drive users to alternative platforms to diversify its information channels and ensure the target audience knows where to access operation content if the initial platform suspends, flags, or otherwise removes original operation assets and content. TA17
T0123 Control Information Environment through Offensive Cyberspace Operations Controlling the information environment through offensive cyberspace operations uses cyber tools and techniques to alter the trajectory of content in the information space to either prioritise operation messaging or block opposition messaging. TA18
T0123.001 Delete Opposing Content Deleting opposing content refers to the removal of content that conflicts with operational narratives from selected platforms. An influence operation may delete opposing content to censor contradictory information from the target audience, allowing operation narratives to take priority in the information space. TA18
T0123.002 Block Content Content blocking refers to actions taken to restrict internet access or render certain areas of the internet inaccessible. An influence operation may restrict content based on both network and content attributes. TA18
T0123.003 Destroy Information Generation Capabilities Destroying information generation capabilities refers to actions taken to limit, degrade, or otherwise incapacitate an actor’s ability to generate conflicting information. An influence operation may destroy an actor’s information generation capabilities by physically dismantling the information infrastructure, disconnecting resources needed for information generation, or redirecting information generation personnel. An operation may destroy an adversary’s information generation capabilities to limit conflicting content exposure to the target audience and crowd the information space with its own narratives. TA18
T0123.004 Conduct Server Redirect A server redirect, also known as a URL redirect, occurs when a server automatically forwards a user from one URL to another using server-side or client-side scripting languages. An influence operation may conduct a server redirect to divert target audience members from one website to another without their knowledge. The redirected website may pose as a legitimate source, host malware, or otherwise aid operation objectives. TA18
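The server-side variant of the redirect described above can be sketched with Python's standard library. This is a minimal illustration, not part of the DISARM framework; `REDIRECT_TARGET` and `start_redirect_server` are hypothetical names chosen for the example:

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

REDIRECT_TARGET = "https://example.com/landing"  # hypothetical destination URL

class RedirectHandler(BaseHTTPRequestHandler):
    """Answers every GET with an HTTP 301, silently forwarding the visitor."""

    def do_GET(self):
        self.send_response(301)                       # 301 = Moved Permanently
        self.send_header("Location", REDIRECT_TARGET)
        self.end_headers()

    def log_message(self, *args):
        pass                                          # suppress request logging

def start_redirect_server(port: int = 0) -> HTTPServer:
    """Run the redirecting server on a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), RedirectHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Any browser requesting a page from this server receives the 301 and is forwarded to the destination URL; client-side equivalents (a meta refresh tag or a JavaScript `location` assignment) achieve the same effect within the page itself.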
T0124 Suppress Opposition Operators can suppress the opposition by exploiting platform content moderation tools and processes like reporting non-violative content to platforms for takedown and goading opposition actors into taking actions that result in platform action or target audience disapproval. TA18
T0124.001 Report Non-Violative Opposing Content Reporting opposing content refers to notifying and providing an instance of a violation of a platform’s guidelines and policies for conduct on the platform. In addition to simply reporting the content, an operation may leverage copyright regulations to trick social media and web platforms into removing opposing content by manipulating the content to appear in violation of copyright laws. Reporting opposing content facilitates the suppression of contradictory information and allows operation narratives to take priority in the information space. TA18
T0124.002 Goad People into Harmful Action (Stop Hitting Yourself) Goad people into actions that violate terms of service or will lead to having their content or accounts taken down. TA18
T0124.003 Exploit Platform TOS/Content Moderation Exploit Platform TOS/Content Moderation TA18
T0125 Platform Filtering Platform filtering refers to the decontextualization of information as claims cross platforms (from Joan Donovan https://www.hks.harvard.edu/publications/disinformation-design-use-evidence-collages-and-platform-filtering-media-manipulation) TA18
T0126 Encourage Attendance at Events Operation encourages attendance at existing real world event. TA10
T0126.001 Call to Action to Attend Call to action to attend an event TA10
T0126.002 Facilitate Logistics or Support for Attendance Facilitate logistics or support for travel, food, housing, etc. TA10
T0127 Physical Violence Physical violence refers to the use of force to injure, abuse, damage, or destroy. An influence operation may conduct or encourage physical violence to discourage opponents from promoting conflicting content or draw attention to operation narratives using shock value. TA10
T0127.001 Conduct Physical Violence An influence operation may directly Conduct Physical Violence to achieve campaign goals. TA10
T0127.002 Encourage Physical Violence An influence operation may Encourage others to engage in Physical Violence to achieve campaign goals. TA10
T0128 Conceal Information Assets Conceal the identity or provenance of campaign information assets such as accounts, channels, pages etc. to avoid takedown and attribution. TA11
T0128.001 Use Pseudonyms An operation may use pseudonyms, or fake names, to mask the identity of operational accounts, channels, pages etc., publish anonymous content, or otherwise use falsified personas to conceal the identity of the operation. An operation may coordinate pseudonyms across multiple platforms, for example, by writing an article under a pseudonym and then posting a link to the article on social media on an account, channel, or page with the same falsified name. TA11
T0128.002 Conceal Network Identity Concealing network identity aims to hide the existence of an influence operation’s network completely. Unlike concealing sponsorship, concealing network identity denies the existence of any sort of organisation. TA11
T0128.003 Distance Reputable Individuals from Operation Distancing reputable individuals from the operation occurs when enlisted individuals, such as celebrities or subject matter experts, actively disengage themselves from operation activities and messaging. Individuals may distance themselves from the operation by deleting old posts or statements, unfollowing operation information assets, or otherwise detaching themselves from the operation’s timeline. An influence operation may want reputable individuals to distance themselves from the operation to reduce operation exposure, particularly if the operation aims to remove all evidence. TA11
T0128.004 Launder Information Assets Laundering occurs when an influence operation acquires control of previously legitimate information assets such as accounts, channels, pages etc. from third parties through sale or exchange and often in contravention of terms of use. Influence operations use laundered assets to reach target audience members from within an existing information community and to complicate attribution. TA11
T0128.005 Change Names of Information Assets Changing names or brand names of information assets such as accounts, channels, pages etc. An operation may change the names or brand names of its assets throughout an operation to avoid detection or alter the names of newly acquired or repurposed assets to fit operational narratives. TA11
T0129 Conceal Operational Activity Conceal the campaign's operational activity to avoid takedown and attribution. TA11
T0129.001 Conceal Network Identity Concealing network identity aims to hide the existence of an influence operation’s network completely. Unlike concealing sponsorship, concealing network identity denies the existence of any sort of organisation. TA11
T0129.002 Generate Content Unrelated to Narrative An influence operation may mix its own operation content with legitimate news or external unrelated content to disguise operational objectives, narratives, or existence. For example, an operation may generate "lifestyle" or "cuisine" content alongside regular operation content. TA11
T0129.003 Break Association with Content Breaking association with content occurs when an influence operation actively separates itself from its own content. An influence operation may break association with content by unfollowing, unliking, or unsharing its content, removing attribution from its content, or otherwise taking actions that distance the operation from its messaging. An influence operation may break association with its content to complicate attribution or regain credibility for a new operation. TA11
T0129.004 Delete URLs URL deletion occurs when an influence operation completely removes its website registration, rendering the URL inaccessible. An influence operation may delete its URLs to complicate attribution or remove online documentation that the operation ever occurred. TA11
T0129.005 Coordinate on Encrypted/Closed Networks Coordinate on encrypted/closed networks TA11
T0129.006 Deny Involvement Without "smoking gun" proof (and even with proof), incident creator can or will deny involvement. This technique also leverages the attacker advantages outlined in "Demand insurmountable proof", specifically the asymmetric disadvantage for truth-tellers in a "firehose of misinformation" environment. TA11
T0129.007 Delete Accounts/Account Activity Deleting accounts and account activity occurs when an influence operation removes its online social media assets, including social media accounts, posts, likes, comments, and other online artefacts. An influence operation may delete its accounts and account activity to complicate attribution or remove online documentation that the operation ever occurred. TA11
T0129.009 Remove Post Origins Removing post origins refers to the elimination of evidence that indicates the initial source of operation content, often to complicate attribution. An influence operation may remove post origins by deleting watermarks, renaming files, or removing embedded links in its content. TA11
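As one concrete illustration of the "removing embedded links" behaviour described above, a short Python sketch; the regex and function name are assumptions for the example, and a real operation would also strip artefacts such as watermarks and file metadata:

```python
import re

# A simple illustrative pattern for http(s) URLs embedded in post text.
URL_PATTERN = re.compile(r"https?://\S+")

def strip_embedded_links(text: str) -> str:
    """Remove embedded URLs from content and collapse leftover whitespace,
    eliminating one common pointer back to a post's original source."""
    return " ".join(URL_PATTERN.sub(" ", text).split())
```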
T0129.010 Misattribute Activity Misattributed activity refers to incorrectly attributed operation activity. For example, a state sponsored influence operation may conduct operation activity in a way that mimics another state so that external entities misattribute activity to the incorrect state. An operation may misattribute their activities to complicate attribution, avoid detection, or frame an adversary for negative behaviour. TA11
T0130 Conceal Infrastructure Conceal the campaign's infrastructure to avoid takedown and attribution. TA11
T0130.001 Conceal Sponsorship Concealing sponsorship aims to mislead or obscure the identity of the hidden sponsor behind an operation rather than the entity publicly running the operation. Operations that conceal sponsorship may maintain visible falsified groups, news outlets, non-profits, or other organisations, but seek to mislead or obscure the identity of those sponsoring, funding, or otherwise supporting these entities. Influence operations may use a variety of techniques to mask the location of their social media accounts to complicate attribution and conceal evidence of foreign interference. Operation accounts may set their location to a false place, often the location of the operation’s target audience, and post in the region’s language. TA11
T0130.002 Utilise Bulletproof Hosting Hosting refers to services through which storage and computing resources are provided to an individual or organisation for the accommodation and maintenance of one or more websites and related services. Services may include web hosting, file sharing, and email distribution. Bulletproof hosting refers to services provided by an entity, such as a domain hosting or web hosting firm, that allows its customer considerable leniency in use of the service. An influence operation may utilise bulletproof hosting to maintain continuity of service for suspicious, illegal, or disruptive operation activities that stricter hosting services would limit, report, or suspend. TA11
T0130.003 Use Shell Organisations Use Shell Organisations to conceal sponsorship. TA11
T0130.004 Use Cryptocurrency Use Cryptocurrency to conceal sponsorship. Examples include Bitcoin, Monero, and Ethereum. TA11
T0130.005 Obfuscate Payment Obfuscate Payment TA11
T0131 Exploit TOS/Content Moderation Exploiting weaknesses in platforms' terms of service and content moderation policies to avoid takedowns and platform actions. TA11
T0131.001 Legacy Web Content Make incident content visible for a long time, e.g. by exploiting platform terms of service, or placing it where it's hard to remove or unlikely to be removed. TA11
T0131.002 Post Borderline Content Post Borderline Content TA11
T0132 Measure Performance A metric used to determine the accomplishment of actions. “Are the actions being executed as planned?” TA12
T0132.001 People Focused Measure the performance of individuals in achieving campaign goals. TA12
T0132.002 Content Focused Measure the performance of campaign content. TA12
T0132.003 View Focused View Focused TA12
T0133 Measure Effectiveness A metric used to measure a current system state. “Are we on track to achieve the intended new system state within the planned timescale?” TA12
T0133.001 Behaviour Changes Monitor and evaluate behaviour changes from misinformation incidents. TA12
T0133.002 Content Measure current system state with respect to the effectiveness of campaign content. TA12
T0133.003 Awareness Measure current system state with respect to the effectiveness of influencing awareness. TA12
T0133.004 Knowledge Measure current system state with respect to the effectiveness of influencing knowledge. TA12
T0133.005 Action/Attitude Measure current system state with respect to the effectiveness of influencing action/attitude. TA12
T0134 Measure Effectiveness Indicators (or KPIs) Ensuring that Key Performance Indicators are identified and tracked, so that the performance and effectiveness of campaigns, and elements of campaigns, can be measured, during and after their execution. TA12
T0134.001 Message Reach Monitor and evaluate message reach in misinformation incidents. TA12
T0134.002 Social Media Engagement Monitor and evaluate social media engagement in misinformation incidents. TA12
T0135 Undermine Weaken, debilitate, or subvert a target or their actions. An influence operation may be designed to disparage an opponent; sabotage an opponent’s systems or processes; compromise an opponent’s relationships or support system; impair an opponent’s capability; or thwart an opponent’s initiative. TA02
T0135.001 Smear Denigrate, disparage, or discredit an opponent. This is a common tactical objective in political campaigns with a larger strategic goal. It differs from efforts to harm a target through defamation. If there is no ulterior motive and the sole aim is to cause harm to the target, then choose sub-technique “Defame” of technique “Cause Harm” instead. TA02
T0135.002 Thwart Prevent the successful outcome of a policy, operation, or initiative. Actors conduct influence operations to stymie or foil proposals, plans, or courses of action which are not in their interest. TA02
T0135.003 Subvert Sabotage, destroy, or damage a system, process, or relationship. The classic example is the Soviet strategy of “active measures” involving deniable covert activities such as political influence, the use of front organisations, the orchestration of domestic unrest, and the spread of disinformation. TA02
T0135.004 Polarise To cause a target audience to divide into two completely opposing groups. This is a special case of subversion. To divide and conquer is an age-old approach to subverting and overcoming an enemy. TA02
T0136 Cultivate Support Grow or maintain the base of support for the actor, ally, or action. This includes hard core recruitment, managing alliances, and generating or maintaining sympathy among a wider audience, including reputation management and public relations. Sub-techniques assume support for actor (self) unless otherwise specified. TA02
T0136.001 Defend Reputation Preserve a positive perception in the public’s mind following an accusation or adverse event. When accused of a wrongful act, an actor may engage in denial, counter accusations, whataboutism, or conspiracy theories to distract public attention and attempt to maintain a positive image. TA02
T0136.002 Justify Action To convince others to exonerate you of a perceived wrongdoing. When an actor finds it untenable to deny doing something, they may attempt to exonerate themselves with disinformation which claims the action was reasonable. This is a special case of “Defend Reputation”. TA02
T0136.003 Energise Supporters Raise the morale of those who support the organisation or group. Invigorate constituents with zeal for the mission or activity. Terrorist groups, political movements, and cults may indoctrinate their supporters with ideologies that are based on warped versions of religion or cause harm to others. TA02
T0136.004 Boost Reputation Elevate the estimation of the actor in the public’s mind. Improve their image or standing. Public relations professionals use persuasive overt communications to achieve this goal; manipulators use covert disinformation. TA02
T0136.005 Cultivate Support for Initiative Elevate or fortify the public backing for a policy, operation, or idea. Domestic and foreign actors can use artificial means to fabricate or amplify public support for a proposal or action. TA02
T0136.006 Cultivate Support for Ally Elevate or fortify the public backing for a partner. Governments may interfere in other countries’ elections by covertly favouring a party or candidate aligned with their interests. They may also mount an influence operation to bolster the reputation of an ally under attack. TA02
T0136.007 Recruit Members Motivate followers to join or subscribe as members of the team. Organisations may mount recruitment drives that use propaganda to entice sympathisers to sign up. TA02
T0136.008 Increase Prestige Improve personal standing within a community. Gain fame, approbation, or notoriety. Conspiracy theorists, those with special access, and ideologues can gain prominence in a community by propagating disinformation, leaking confidential documents, or spreading hate. TA02
T0137 Make Money Profit from disinformation, conspiracy theories, or online harm. In some cases, the sole objective is financial gain, in other cases the objective is both financial and political. Making money may also be a way to sustain a political campaign. TA02
T0137.001 Generate Ad Revenue Earn income from digital advertisements published alongside inauthentic content. Conspiratorial, false, or provocative content drives internet traffic. Content owners earn money from impressions of, clicks on, or conversions of ads published on their websites, social media profiles, or streaming services, or ads published when their content appears in search engine results. Fraudsters simulate impressions, clicks, and conversions, or they spin up inauthentic sites or social media profiles just to generate ad revenue. Conspiracy theorists and political operators generate ad revenue as a byproduct of their operation or as a means of sustaining their campaign. TA02
T0137.002 Scam Defraud a target or trick a target into doing something that benefits the attacker. A typical scam is where a fraudster convinces a target to pay for something without the intention of ever delivering anything in return. Alternatively, the fraudster may promise benefits which never materialise, such as a fake cure. Criminals often exploit a fear or crisis or generate a sense of urgency. They may use deepfakes to impersonate authority figures or individuals in distress. TA02
T0137.003 Raise Funds Solicit donations for a cause. Popular conspiracy theorists can attract financial contributions from their followers. Fighting back against the establishment is a popular crowdfunding narrative. TA02
T0137.004 Sell Items under False Pretences Offer products for sale under false pretences. Campaigns may hijack or create causes built on disinformation to sell promotional merchandise. Or charlatans may amplify victims’ unfounded fears to sell them items of questionable utility such as supplements or survival gear. TA02
T0137.005 Extort Coerce money or favours from a target by threatening to expose or corrupt information. Ransomware criminals typically demand money. Intelligence agencies demand national secrets. Sexual predators demand favours. The leverage may be critical, sensitive, or embarrassing information. TA02
T0137.006 Manipulate Stocks Artificially inflate or deflate the price of stocks or other financial instruments and then trade on these to make profit. The most common securities fraud schemes are called “pump and dump” and “poop and scoop”. TA02
T0138 Motivate to Act Persuade, impel, or provoke the target to behave in a specific manner favourable to the attacker. Some common behaviours are joining, subscribing, voting, buying, demonstrating, fighting, retreating, resigning, boycotting. TA02
T0138.001 Encourage Inspire, animate, or exhort a target to act. An actor can use propaganda, disinformation, or conspiracy theories to stimulate a target to act in its interest. TA02
T0138.002 Provoke Instigate, incite, or arouse a target to act. Social media manipulators exploit moral outrage to propel targets to spread hate, take to the streets to protest, or engage in acts of violence. TA02
T0138.003 Compel Force target to take an action or to stop taking an action it has already started. Actors can use the threat of reputational damage alongside military or economic threats to compel a target. TA02
T0139 Dissuade from Acting Discourage, deter, or inhibit the target from actions which would be unfavourable to the attacker. The actor may want the target to refrain from voting, buying, fighting, or supplying. TA02
T0139.001 Discourage To make a target disinclined or reluctant to act. Manipulators use disinformation to cause targets to question the utility, legality, or morality of taking an action. TA02
T0139.002 Silence Intimidate or incentivise target into remaining silent or prevent target from speaking out. A threat actor may cow a target into silence as a special case of deterrence. Or they may buy the target’s silence. Or they may repress or restrict the target’s speech. TA02
T0139.003 Deter Prevent target from taking an action for fear of the consequences. Deterrence occurs in the mind of the target, who fears they will be worse off if they take an action than if they don’t. When making threats, aggressors may bluff, feign irrationality, or engage in brinksmanship. TA02
T0140 Cause Harm Persecute, malign, or inflict pain upon a target. The objective of a campaign may be to cause fear or emotional distress in a target. In some cases, harm is instrumental to achieving a primary objective, as in coercion, repression, or intimidation. In other cases, harm may be inflicted for the satisfaction of the perpetrator, as in revenge or sadistic cruelty. TA02
T0140.001 Defame Attempt to damage the target’s personal reputation by impugning their character. This can range from subtle attempts to misrepresent or insinuate, to obvious attempts to denigrate or disparage, to blatant attempts to malign or vilify. Slander applies to oral expression. Libel applies to written or pictorial material. Defamation is often carried out by online trolls. The sole aim here is to cause harm to the target. If the threat actor uses defamation as a means of undermining the target, then choose sub-technique “Smear” of technique “Undermine” instead. TA02
T0140.002 Intimidate Coerce, bully, or frighten the target. An influence operation may use intimidation to compel the target to act against their will. Or the goal may be to frighten or even terrify the target into silence or submission. In some cases, the goal is simply to make the victim suffer. TA02
T0140.003 Spread Hate Publish and/or propagate demeaning, derisive, or humiliating content targeting an individual or group of individuals with the intent to cause emotional, psychological, or physical distress. Hate speech can cause harm directly or incite others to harm the target. It often aims to stigmatise the target by singling out immutable characteristics such as colour, race, religion, national or ethnic origin, gender, gender identity, sexual orientation, age, disease, or mental or physical disability. Thus, promoting hatred online may involve racism, antisemitism, Islamophobia, xenophobia, sexism, misogyny, homophobia, transphobia, ageism, ableism, or any combination thereof. Motivations for hate speech range from group preservation to ideological superiority to the unbridled infliction of suffering. TA02
T0143 Persona Legitimacy This Technique contains sub-techniques which analysts can use to assert whether an account is presenting an authentic, fabricated, or parody persona:

T0143.001: Authentic Persona
T0143.002: Fabricated Persona
T0143.003: Impersonated Persona
T0143.004: Parody Persona
TA16
T0143.001 Authentic Persona An individual or institution presenting a persona that legitimately matches who or what they are is presenting an authentic persona.

For example, an account which presents as being managed by a member of a country’s military, and is legitimately managed by that person, would be presenting an authentic persona (T0143.001: Authentic Persona, T0097.105: Military Personnel).

Sometimes people can authentically present themselves as who they are while still participating in malicious/inauthentic activity; a legitimate journalist (T0143.001: Authentic Persona, T0097.102: Journalist Persona) may accept bribes to promote products, or they could be tricked by threat actors into sharing an operation’s narrative.
TA16
T0143.002 Fabricated Persona An individual or institution pretending to have a persona without any legitimate claim to that persona is presenting a fabricated persona, such as a person who presents themselves as a member of a country’s military without having worked in any capacity with the military (T0143.002: Fabricated Persona, T0097.105: Military Personnel).

Sometimes real people can present entirely fabricated personas; they can use real names and photos on social media while also pretending to have credentials or traits they don’t have in real life.
TA16
T0143.003 Impersonated Persona Threat actors may impersonate existing individuals or institutions to conceal their network identity, add legitimacy to content, or harm the impersonated target’s reputation. This Technique covers situations where an actor presents themselves as another existing individual or institution.

This Technique was previously called Prepare Assets Impersonating Legitimate Entities and used the ID T0099.

Associated Techniques and Sub-techniques
T0097: Presented Persona: Analysts can use the sub-techniques of T0097: Presented Persona to categorise the type of impersonation. For example, a document developed by a threat actor which falsely presented as a letter from a government department could be documented using T0085.004: Develop Document, T0143.003: Impersonated Persona, and T0097.206: Government Institution Persona.
T0145.001: Copy Account Imagery: Actors may take existing accounts’ profile pictures as part of their impersonation efforts.
TA16
T0143.004 Parody Persona Parody is a form of artistic expression that imitates the style or characteristics of a particular work, genre, or individual in a humorous or satirical way, often to comment on or critique the original work or subject matter. People may present as parodies to create humour or make a point by exaggerating or altering elements of the original, while still maintaining recognisable elements.

The use of parody is not an indication of inauthentic or malicious behaviour; parody allows people to present ideas or criticisms in a comedic or exaggerated manner, softening the impact of sensitive or contentious topics. Because parody is often protected as a form of free speech or artistic expression, it provides a legal and social framework for discussing controversial issues.

However, parody personas may be perceived as authentic, leading people to mistakenly believe that a parody account’s statements represent the real opinions of the parodied target. Threat actors may also use the guise of parody to spread campaign content. Parody personas may disclaim that they are operating as a parody; however, this is not always the case, and such disclaimers are not always given prominence.

Associated Techniques and Sub-techniques
T0097: Presented Persona: Analysts can use the sub-techniques of T0097: Presented Persona to categorise the type of parody. For example, an account presenting as a parody of a business could be documented using T0097.205: Business Persona and T0143.004: Parody Persona.
T0145.001: Copy Account Imagery: Actors may take existing accounts’ profile pictures as part of their parody efforts.
TA16
T0144 Persona Legitimacy Evidence This Technique contains behaviours which might indicate whether a persona is legitimate, a fabrication, or a parody.

For example, the same persona being consistently presented across platforms is consistent with how authentic users behave on social media. However, threat actors have also displayed this behaviour as a way to increase the perceived legitimacy of their fabricated personas (aka “backstopping”).
TA16
T0144.001 Present Persona across Platforms This sub-technique covers situations where analysts have identified the same persona being presented across multiple platforms.

Having multiple accounts presenting the same persona is not an indicator of inauthentic behaviour; many people create accounts and present as themselves on multiple platforms. However, threat actors are known to present the same persona across multiple platforms, benefiting from an increase in perceived legitimacy.
TA16
T0144.002 Persona Template Threat actors have been observed following a template when filling their accounts’ online profiles. This may be done to enable account holders to quickly present themselves as a real person with a targeted persona.

For example, an actor may be instructed to create many fabricated local accounts for use in an operation using a template of “[flag emojis], [location], [personal quote], [political party] supporter” in their account’s description.

Associated Techniques and Sub-techniques
T0143.002: Fabricated Persona: The use of a templated account biography in a collection of accounts may be an indicator that the personas have been fabricated.
TA16
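As an illustrative sketch (not part of the DISARM framework itself), templated biographies like the one above can be surfaced by reducing each bio to its punctuation and field "shape" and counting repeats; all names here are hypothetical, and the threshold is an assumption an analyst would tune.

```python
import re
from collections import Counter

def bio_shape(bio: str) -> str:
    """Reduce a profile bio to its structural 'shape' by replacing
    each run of word characters with a placeholder token."""
    return re.sub(r"\w+", "W", bio)

def flag_templated(bios: list[str], threshold: int = 3) -> set[str]:
    """Return the set of shapes shared by at least `threshold` bios,
    a possible indicator that the personas follow a template."""
    counts = Counter(bio_shape(b) for b in bios)
    return {shape for shape, n in counts.items() if n >= threshold}
```

On a batch of bios such as "London, dog lover, Blue Party supporter" and "Leeds, tea drinker, Blue Party supporter", the shared shape would be flagged while an organic free-text bio would not.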
T0145 Establish Account Imagery Introduce visual elements to an account where a platform allows this functionality (e.g. a profile picture, a cover photo, etc). 

Threat Actors who don’t want to use pictures of themselves in their social media accounts may use alternate imagery to make their account appear more legitimate.
TA15
T0145.001 Copy Account Imagery Account imagery copied from an existing account.

Analysts may use reverse image search tools to try to identify previous uses of account imagery (e.g. a profile picture) by other accounts.

Threat Actors have been known to copy existing accounts’ imagery to impersonate said accounts, or to provide imagery for unrelated accounts which aren’t intended to impersonate the original assets’ owner.

Associated Techniques and Sub-techniques
T0143.003: Impersonated Persona: Actors may copy existing accounts’ imagery in an attempt to impersonate them.
T0143.004: Parody Persona: Actors may copy existing accounts’ imagery as part of a parody of that account.
TA15
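As a hedged sketch of the reverse-image-search idea mentioned above (not a DISARM artefact): analysts' tooling commonly compares perceptual hashes, where each pixel of a small grayscale thumbnail contributes one bit and near-duplicate images land a small Hamming distance apart. Decoding and downscaling an image to a grayscale pixel list is assumed to be done by an imaging library; only the hashing step is shown.

```python
def average_hash(pixels: list[int]) -> int:
    """Simple perceptual 'average hash': each pixel contributes one
    bit, set when the pixel is brighter than the image's mean."""
    mean = sum(pixels) / len(pixels)
    h = 0
    for p in pixels:
        h = (h << 1) | (1 if p > mean else 0)
    return h

def hamming(a: int, b: int) -> int:
    """Count differing bits; small distances suggest near-duplicate
    imagery (e.g. a re-saved or lightly edited profile picture)."""
    return bin(a ^ b).count("1")
```

Two copies of the same picture with a single altered pixel differ by only a bit or two, while unrelated imagery differs across many bits.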
T0145.002 AI-Generated Account Imagery AI Generated images used in account imagery.

An influence operation might flesh out its account by uploading account imagery (e.g. a profile picture), increasing its perceived legitimacy. By using an AI-generated picture for this purpose, they are able to present themselves as a real person without compromising their own identity, or risking detection by taking a real person’s existing profile picture.

Associated Techniques and Sub-techniques
T0086.002: Develop AI-Generated Images (Deepfakes): Analysts should use this sub-technique to document use of AI generated imagery used to support narratives.
TA15
T0145.003 Animal Account Imagery Animal used in account imagery.

An influence operation might flesh out its account by uploading a profile picture, increasing its perceived authenticity.

People sometimes legitimately use images of animals as their profile pictures (e.g. of their pets), and threat actors can mimic this behaviour to avoid the risk of detection associated with stealing or AI-generating profile pictures (see T0145.001: Copy Account Imagery and T0145.002: AI-Generated Account Imagery).

This Technique is often used by Coordinated Inauthentic Behaviour accounts (CIBs). A collection of accounts displaying the same behaviour using similar account imagery can indicate the presence of CIB.
TA15
T0145.004 Scenery Account Imagery Scenery or nature used in account imagery.

An influence operation might flesh out its account by uploading account imagery (e.g. a profile picture), increasing its perceived authenticity.

People sometimes legitimately use images of scenery as their profile picture, and threat actors can mimic this behaviour to avoid the risk of detection associated with stealing or AI-generating profile pictures (see T0145.001: Copy Account Imagery and T0145.002: AI-Generated Account Imagery).

This Technique is often used by Coordinated Inauthentic Behaviour accounts (CIBs). A collection of accounts displaying the same behaviour using similar account imagery can indicate the presence of CIB.
TA15
T0145.005 Illustrated Character Account Imagery A cartoon/illustrated/anime character used in account imagery.

An influence operation might flesh out its account by uploading account imagery (e.g. a profile picture), increasing its perceived authenticity.

People sometimes legitimately use images of illustrated characters as their profile picture, and threat actors can mimic this behaviour to avoid the risk of detection associated with stealing or AI-generating profile pictures (see T0145.001: Copy Account Imagery and T0145.002: AI-Generated Account Imagery).

This Technique is often used by Coordinated Inauthentic Behaviour accounts (CIBs). A collection of accounts displaying the same behaviour using similar account imagery can indicate the presence of CIB.
TA15
T0145.006 Attractive Person Account Imagery Attractive person used in account imagery.

An influence operation might flesh out its account by uploading account imagery (e.g. a profile picture), increasing its perceived authenticity.

Pictures of physically attractive people can benefit threat actors by increasing attention given to their posts.

People sometimes legitimately use images of attractive people as their profile picture, and threat actors can mimic this behaviour to avoid the risk of detection associated with stealing or AI-generating profile pictures (see T0145.001: Copy Account Imagery and T0145.002: AI-Generated Account Imagery).

This Technique is often used by Coordinated Inauthentic Behaviour accounts (CIBs). A collection of accounts displaying the same behaviour using similar account imagery can indicate the presence of CIB.

Associated Techniques and Sub-techniques
T0097.109: Romantic Suitor Persona: Accounts presenting as a romantic suitor may use an attractive person in their account imagery.
T0151.017: Dating Platform: Analysts can use this sub-technique for tagging cases where an account has been identified as using a dating platform.
TA15
T0145.007 Stock Image Account Imagery Stock images used in account imagery.

Stock image websites produce photos of people in various situations. Threat Actors can purchase or appropriate these images for use in their account imagery, increasing perceived legitimacy while avoiding the risk of detection associated with stealing or AI-generating profile pictures (see T0145.001: Copy Account Imagery and T0145.002: AI-Generated Account Imagery). 

Stock images tend to include physically attractive people, and this can benefit threat actors by increasing attention given to their posts.

This Technique is often used by Coordinated Inauthentic Behaviour accounts (CIBs). A collection of accounts displaying the same behaviour using similar account imagery can indicate the presence of CIB.
TA15
T0146 Account Asset An Account is a user-specific profile that allows access to the features and services of an online platform, typically requiring a username and password for authentication. TA06
T0146.001 Free Account Asset Many online platforms allow users to create free accounts on their platform. A Free Account is an Account which does not require payment at account creation and is not subscribed to paid platform features. TA06
T0146.002 Paid Account Asset Some online platforms afford accounts extra features, or other benefits, if the user pays a fee. For example, as of September 2024, content posted by a Paid Account on X (previously Twitter) is prioritised in the platform’s algorithm. TA06
T0146.003 Verified Account Asset Some online platforms apply badges of verification to accounts which meet certain criteria.

On some platforms (such as dating apps) a verification badge signifies that the account has passed the platform’s identity verification checks. On some platforms (such as X (previously Twitter)) a verification badge signifies that an account has paid for the platform’s service.
TA06
T0146.004 Administrator Account Asset Some accounts have special privileges or control over a Digital Community Hosting Asset; for example, the Admin of a Facebook Page or a Moderator of a Subreddit. TA06
T0146.005 Lookalike Account ID Many platforms which host online communities require creation of a username (or another unique identifier) when an Account is created.

Sometimes people create usernames which are visually similar to other existing accounts’ usernames. While this is not necessarily an indicator of malicious behaviour, actors can create Lookalike Account IDs to support Impersonations or Parody.
TA06
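The visual similarity described above can be checked mechanically. The following is an illustrative sketch (not part of the framework): usernames are normalised through a small confusable-character table so that spellings like "PayPa1" collapse onto "paypal". The substitution table is a minimal assumption; real tooling would use a fuller confusables list.

```python
# A few confusable-character substitutions seen in lookalike IDs
# (illustrative only; real tooling uses much larger tables).
CONFUSABLES = str.maketrans({"0": "o", "1": "l", "3": "e",
                             "5": "s", "@": "a", "$": "s"})

def skeleton(username: str) -> str:
    """Normalise a username so confusable spellings collapse together."""
    return username.lower().translate(CONFUSABLES)

def is_lookalike(candidate: str, target: str) -> bool:
    """Flag a candidate ID whose skeleton matches the target's but
    whose literal spelling differs."""
    return candidate != target and skeleton(candidate) == skeleton(target)
```

For example, "PayPa1_Support" would be flagged as a lookalike of "paypal_support", while an unrelated ID would not.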
T0146.006 Open Access Platform Some online platforms allow users to take advantage of the platform’s features without creating an account. Examples include the Paste Platform Pastebin, and the Image Board Platforms 4chan and 8chan. TA06
T0146.007 Automated Account Asset An Automated Account is an account which is displaying automated behaviour, such as republishing or liking other accounts’ content, or publishing their own content. TA06
T0147 Software Asset Software is a program developed to run on computers or devices that helps users achieve specific goals, such as improving productivity, automating tasks, or having fun. TA06
T0147.001 Game Asset A Game is Software which has been designed for interactive entertainment, where users take on challenges set by the game’s designers.

While Online Game Platforms allow people to play with each other, Games are designed for single-player experiences.
TA06
T0147.002 Game Mod Asset A Game Mod is a modification which can be applied to a Game or Multiplayer Online Game to add new content or functionality to the game.

Users can Modify Games to introduce new content to the game. Modified Games can be distributed on Software Delivery Platforms such as Steam or can be distributed within the Game or Multiplayer Online Game.
TA06
T0147.003 Malware Asset Malware is Software which has been designed to cause harm or facilitate malicious behaviour on electronic devices.

DISARM recommends using the [MITRE ATT&CK Framework](https://attack.mitre.org/) to document malware types and their usage.
TA06
T0147.004 Mobile App Asset A Mobile App is an application which has been designed to run on mobile operating systems, such as Android or iOS.

Mobile Apps can enable access to online platforms (e.g. Facebook’s mobile app) or can provide software which users can run offline on their device.
TA06
T0148 Financial Instrument A Financial Instrument is a platform or software that facilitates the sending, receiving, and management of money, enabling financial transactions between users or organisations.

Threat actors can deploy financial instruments legitimately to manage their own finances or illegitimately to support fraud schemes.
TA06
T0148.001 Online Banking Platform Online Banking Platforms are spaces provided by banks for their customers to manage their Bank Account online.

The Online Banking Platforms available differ by country. In the United Kingdom, examples of banking institutions which provide Online Banking Platforms include Lloyds, Barclays, and Monzo. In the United States, examples include Citibank, Chase, and Capital One.
TA06
T0148.002 Bank Account Asset A Bank Account is a financial account that allows individuals or organisations to store, manage, and access their money, typically for saving, spending, or investment purposes. TA06
T0148.003 Payment Processing Platform Stripe, PayPal, Apple Pay, Chargebee, Recurly, and Zuora are examples of Payment Processing Platforms.

Payment Processing Platforms provide Payment Processing or Subscription Processing capabilities which actors can use to set up online storefronts or to take donations.
TA06
T0148.004 Payment Processing Capability A Payment Processing Capability is a feature of online platforms or software which enables the processing of one-off payments (e.g. an online checkout, or donation processing page).

Payment Processing Capabilities can enable platform users to purchase products or services or can facilitate donations to a given cause.
TA06
T0148.005 Subscription Processing Capability A Subscription Processing Capability is a feature of online platforms or software which enables the processing of recurring payments.

Subscription Processing Capabilities are typically used to enable recurring payments in exchange for continued access to products or services.
TA06
T0148.006 Crowdfunding Platform Kickstarter and GoFundMe are examples of Crowdfunding Platforms.

Crowdfunding Platforms enable users with Accounts to create projects for other platform users to finance, usually in exchange for access to the fruits of the project.
TA06
T0148.007 eCommerce Platform Amazon, eBay and Etsy are examples of eCommerce Platforms.

eCommerce Platforms enable users with Accounts to create online storefronts from which other platform users can purchase goods or services.
TA06
T0148.008 Cryptocurrency Exchange Platform Coinbase and Kraken are examples of Cryptocurrency Exchange Platforms.

Cryptocurrency Exchange Platforms provide users a digital marketplace where they can buy, sell, and trade cryptocurrencies, such as Bitcoin or Ethereum.

Some Cryptocurrency Exchange Platforms allow users to create a Cryptocurrency Wallet.
TA06
T0148.009 Cryptocurrency Wallet A Cryptocurrency Wallet is a digital tool that allows users to store, send, and receive cryptocurrencies. It manages private and public keys, enabling secure access to a user's crypto assets.

An influence operation might use cryptocurrency to conceal that they are conducting operational activities, building assets, or sponsoring aligning entities.
TA06
T0149 Online Infrastructure Online Infrastructure consists of technical assets which enable online activity, such as domains, servers, and IP addresses. TA06
T0149.001 Domain Asset A Domain is a web address (such as “google[.]com”), used to navigate to Websites on the internet.

Domains differ from Websites in that Websites are considered to be developed web pages which host content, whereas Domains do not necessarily host public-facing web content.

A threat actor may register a new domain to bypass the old domain being blocked.
TA06
T0149.002 Email Domain Asset An Email Domain is a Domain (such as “meta[.]com”) which has the ability to send emails (e.g. from an @meta[.]com address).

Any Domain which has an MX (Mail Exchange) record and configured SMTP (Simple Mail Transfer Protocol) settings can send and receive emails, and is therefore an Email Domain.
TA06
T0149.003 Lookalike Domain A Lookalike Domain is a Domain which is visually similar to another Domain, with the potential for web users to mistake one domain for the other.

Threat actors who want to impersonate organisations’ websites have been observed using a variety of domain impersonation methods. For example, actors wanting to create a domain impersonating netflix.com may use methods such as typosquatting (e.g. n3tflix.com), combosquatting (e.g. netflix-billing.com), or TLD swapping (e.g. netflix.top).
TA06
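The typosquatting, combosquatting, and TLD-swapping patterns above can be sketched as a simple detector. This is a minimal illustration, not a production method: the `looks_like` and `levenshtein` helper names and the 2-edit threshold are assumptions, not part of DISARM.

```python
def levenshtein(a: str, b: str) -> int:
    # Classic dynamic-programming edit distance between two strings.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def looks_like(candidate: str, target: str, max_edits: int = 2) -> bool:
    # Strip the TLD so "netflix.top" still matches "netflix.com" (TLD swapping),
    # then flag near-identical second-level labels (typosquatting) or labels
    # containing the target brand (combosquatting, e.g. netflix-billing.com).
    cand_label = candidate.rsplit(".", 1)[0]
    target_label = target.rsplit(".", 1)[0]
    if target_label in cand_label:
        return candidate != target
    return levenshtein(cand_label, target_label) <= max_edits

for domain in ["n3tflix.com", "netflix-billing.com", "netflix.top", "example.com"]:
    print(domain, looks_like(domain, "netflix.com"))
```

Real-world monitoring would also check homoglyphs (e.g. Cyrillic characters) and internationalised domain encodings, which this sketch omits.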
T0149.004 Redirecting Domain Asset A Redirecting Domain is a Domain which has been configured to redirect users to another Domain when visited. TA06
T0149.005 Server Asset A Server is a computer which provides resources, services, or data to other computers over a network. There are different types of servers, such as web servers (which serve web pages and applications to users), database servers (which manage and provide access to databases), and file servers (which store and share files across a network). TA06
T0149.006 IP Address Asset An IP Address is a unique numerical label assigned to each device connected to a computer network that uses the Internet Protocol for communication. IP addresses are commonly a part of any online infrastructure.

IP addresses can be in IPV4 dotted decimal (x.x.x.x) or IPV6 colon-separated hexadecimal (y:y:y:y:y:y:y:y) formats.
TA06
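Both address formats can be parsed and classified with Python's standard `ipaddress` module; a minimal sketch (the `describe` helper is a name invented for this example):

```python
import ipaddress

def describe(addr: str) -> str:
    # ip_address() accepts both dotted-decimal IPv4 and
    # colon-separated hexadecimal IPv6 strings.
    ip = ipaddress.ip_address(addr)
    kind = "IPv4" if ip.version == 4 else "IPv6"
    scope = "private" if ip.is_private else "public"
    return f"{ip} is a {scope} {kind} address"

print(describe("192.0.2.1"))    # documentation-range IPv4 address
print(describe("2001:db8::1"))  # documentation-range IPv6 address
print(describe("8.8.8.8"))      # public IPv4 address
```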
T0149.007 VPN Asset A VPN (Virtual Private Network) is a service which creates secure, encrypted connections over the internet, allowing users to transmit data safely and access network resources remotely. It masks IP Addresses, enhancing privacy and security by preventing unauthorised access and tracking. VPNs are commonly used for protecting sensitive information, bypassing geographic restrictions, and maintaining online anonymity.

VPNs can also allow a threat actor to pose as if they are located in one country while in reality being based in another. By doing so, they can try to either mis-attribute their activities to another actor or better hide their own identity.
TA06
T0149.008 Proxy IP Address Asset A Proxy IP Address allows a threat actor to mask their real IP Address by putting a layer between them and the online content they’re connecting with.

Proxy IP Addresses can hide the connection between the threat actor and their online infrastructure.
TA06
T0149.009 Internet Connected Physical Asset An Internet Connected Physical Asset (sometimes referred to as IoT (Internet of Things)) is a physical asset which has internet connectivity to support online features, such as digital signage, wireless printers, and smart TVs. TA06
T0150 Asset Origin Asset Origin contains a list of ways that an actor can obtain an asset. For example, they can create new accounts on online platforms, or they can compromise existing accounts or websites. TA06
T0150.001 Newly Created Asset A Newly Created Asset is an asset which has been created and used for the first time in a documented potential incident.

For example, analysts who can identify a recent creation date for Accounts participating in the spread of a new narrative can assert that these are Newly Created Assets.

Analysts should use Dormant Asset if the asset was created and then lay dormant for an extended period of time before activity.
TA06
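The coding rule distinguishing Newly Created from Dormant assets can be sketched as a simple comparison between creation date and first observed activity. The 90-day threshold below is an illustrative assumption; DISARM does not define a specific cut-off.

```python
from datetime import datetime, timedelta

def classify_origin(created, first_activity,
                    dormancy_threshold=timedelta(days=90)):
    # Assumption: a 90-day gap between creation and first observed
    # activity separates "newly created" from "dormant" assets.
    gap = first_activity - created
    if gap <= dormancy_threshold:
        return "T0150.001: Newly Created Asset"
    return "T0150.002: Dormant Asset"

print(classify_origin(datetime(2024, 5, 1), datetime(2024, 5, 3)))
# → T0150.001: Newly Created Asset
print(classify_origin(datetime(2022, 1, 1), datetime(2024, 5, 3)))
# → T0150.002: Dormant Asset
```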
T0150.002 Dormant Asset A Dormant Asset is an asset which was inactive for an extended period before being used in a documented potential incident. TA06
T0150.003 Pre-Existing Asset Pre-Existing Assets are assets which existed before the observed incident which have not been Repurposed; i.e. they are still being used for their original purpose.

An example could be an Account which presented itself with a Journalist Persona prior to and during the observed potential incident.
TA06
T0150.004 Repurposed Asset Repurposed Assets are assets which have been identified as being used previously, but are now being used for different purposes, or have new Presented Personas.

Actors have been documented compromising assets, and then repurposing them to present Inauthentic Personas as part of their operations.
TA06
T0150.005 Compromised Asset A Compromised Asset is an asset which was originally created or belonged to another person or organisation, but which an actor has gained access to without their consent.

See also MITRE ATT&CK T1078: Valid Accounts.
TA06
T0150.006 Purchased Asset A Purchased Asset is an asset whose ownership actors have paid for.

For example, threat actors have been observed selling compromised social media accounts on dark web marketplaces, which can be used to disguise operation activity.
TA06
T0150.007 Rented Asset A Rented Asset is an asset which actors are temporarily renting or subscribing to.

For example, threat actors have been observed renting temporary access to legitimate accounts on online platforms in order to disguise operation activity.
TA06
T0150.008 Bulk Created Asset A Bulk Created Asset is an asset which was created alongside many other instances of the same asset.

Actors have been observed bulk creating Accounts on Social Media Platforms such as Facebook. Indicators of bulk asset creation include assets’ creation dates, naming conventions, configuration (e.g. templated personas, visually similar profile pictures), and activity (e.g. post timings, narratives posted).
TA06
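The indicators listed above can be combined into a simple heuristic: flag groups of accounts that share a name template and were created within a short window. This is an analyst's sketch with invented data and an assumed 30-minute window, not a DISARM-defined method.

```python
import re
from datetime import datetime

# Hypothetical observed accounts (names and dates are invented).
accounts = [
    {"name": "maria_4812", "created": datetime(2024, 3, 1, 10, 0)},
    {"name": "maria_4813", "created": datetime(2024, 3, 1, 10, 2)},
    {"name": "maria_4814", "created": datetime(2024, 3, 1, 10, 5)},
    {"name": "longtime_user", "created": datetime(2019, 7, 4, 9, 0)},
]

def bulk_creation_clusters(accounts, window_minutes=30, min_accounts=3):
    # Group accounts by their name template (digits replaced with "#"),
    # then flag templates where several accounts appeared within the window.
    groups = {}
    for acct in accounts:
        template = re.sub(r"\d+", "#", acct["name"])
        groups.setdefault(template, []).append(acct["created"])
    flagged = []
    for template, times in groups.items():
        times.sort()
        span_minutes = (times[-1] - times[0]).total_seconds() / 60
        if len(times) >= min_accounts and span_minutes <= window_minutes:
            flagged.append(template)
    return flagged

print(bulk_creation_clusters(accounts))  # → ['maria_#']
```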
T0151 Digital Community Hosting Asset A Digital Community Hosting Asset is an online asset which can be used by actors to provide spaces for users to interact with each other.

Sub-techniques categorised under Digital Community Hosting Assets can include Content Hosting and Content Delivery capabilities; however, their nominal primary purpose is to provide a space for community interaction.
TA07
T0151.001 Social Media Platform Examples of popular Social Media Platforms include Facebook, Instagram, and VK.

Social Media Platforms allow users to create Accounts, which they can configure to present themselves to other platform users. This typically involves Establishing Account Imagery and Presenting a Persona.

Social Media Platforms typically allow the creation of Online Community Groups and Online Community Pages.

Accounts on Social Media Platforms are typically presented with a feed of content posted to the platform. The content that populates this feed can be aggregated by the platform’s proprietary Content Recommendation Algorithm, or users can “friend” or “follow” other accounts to add their posts to their feed.

Many Social Media Platforms also allow users to send direct messages to other users on the platform.
TA07
T0151.002 Online Community Group Some online platforms allow people with Accounts to create Online Community Groups. Groups are usually created around a specific topic or locality, and allow users to post content to the group, and interact with other users’ posted content.

For example, Meta’s Social Media Platform Facebook allows users to create a “Facebook group”. This feature is not exclusive to Social Media Platforms; the Microblogging Platform X (prev. Twitter) allows users to create “X Communities”, groups based on particular topics which users can join and post to; the Software Delivery Platform Steam allows users to create Steam Community Groups.

Online Community Groups can be open or gated (for example, groups can require admin approval before users can join).
TA07
T0151.003 Online Community Page A Facebook Page is an example of an Online Community Page.

Online Community Pages allow Administrator Accounts to post content to the page, which other users can interact with. Pages can be followed or liked by other users, but these users can’t initiate new posts to the page.
TA07
T0151.004 Chat Platform Examples of popular Chat Platforms include WhatsApp, WeChat, Telegram, and Signal; Slack, Mattermost, and Discord; Zoom, GoTo Meeting, and WebEx.

Chat Platforms allow users to engage in text, audio, or video chats with other platform users.

Different Chat Platforms afford users different capabilities. Examples include Direct Messaging, Chat Rooms, Chat Broadcast Channels, and Chat Community Servers.

Some Chat Platforms enable encrypted communication between platform users.
TA07
T0151.005 Chat Community Server Chat Platforms such as Discord, Slack, and Microsoft Teams allow users to create their own Chat Community Servers, which they can invite other platform users to join.

Chat Community Servers are online communities made up of Chat Rooms (or “Channels”) in which users can discuss the given group’s topic. Groups can either be public (shown in the server’s browsable list of channels, available for any member to view and join) or Gated (users must be added to the chat group by existing members to participate).

Some Chat Community Servers allow users to create Chat Broadcast Groups, in which only specific members (e.g. server administrators) of the chat are able to post new content to the group.
TA07
T0151.006 Chat Room Many platforms which enable community interaction allow users to create Chat Rooms; a room in which members of the group can talk to each other via text, audio, or video.

Most Chat Rooms are Gated; users must be added to the chat group before they can post to it or view its content. For example, on WhatsApp a user can create a Chat Room containing other WhatsApp users whose contact information they have. The user who created the Chat Room has an Administrator Account; they are uniquely able to add other users to the Chat Room.

However, Chat Rooms made on Chat Community Servers such as Discord can be Gated or open. If left open, anyone on the server can view the Chat Room (“channel”), read its contents, and choose to join it.

Examples of Platforms which allow creation of Chat Rooms include:
Instagram, Facebook, X (prev. Twitter) (Group Direct Messaging)
WhatsApp, Telegram, WeChat, Signal (Group Chats)
Discord, Slack, Mattermost, Microsoft Teams (Channels)
TA07
T0151.007 Chat Broadcast Group A Chat Broadcast Group is a type of Chat Group in which only specific members can send content to the channel (typically administrators, or approved group members). Members of the channel may be able to react to content, or comment on it, but can’t directly push new content to the channel.

Examples include:
WhatsApp, Telegram, Discord: Chat Groups in which only admins are able to post new content.
X (prev. Twitter): Spaces (an audio discussion hosting feature) in which admins control who can speak at a given moment.
TA07
T0151.008 Microblogging Platform Examples of Microblogging Platforms include TikTok, Threads, Bluesky, Mastodon, QQ, Tumblr, and X (formerly Twitter).

Microblogging Platforms allow users to create Accounts, which they can configure to present themselves to other platform users. This typically involves Establishing Account Imagery and Presenting a Persona.

Accounts on Microblogging Platforms are able to post short-form text content alongside media.

Content posted to the platforms is aggregated into different feeds and presented to the user. Typical feeds include content posted by other Accounts which the user follows, and content promoted by the platform’s proprietary Content Recommendation Algorithm. Users can also search or use hashtags to discover new content.

Mastodon is an open-source decentralised software which allows anyone to create their own Microblogging Platform that can communicate with other platforms within the “fediverse” (similar to how different email platforms can send emails to each other). Meta’s Threads is a Microblogging Platform which can interact with the fediverse.
TA07
T0151.009 Legacy Online Forum Platform Examples of Legacy Online Forum Platforms include Something Awful (SA Forums), Ars Technica forums, and NeoGAF, and the forums available on the Mumsnet and War Thunder websites.

Legacy Online Forum Platforms are a type of message board (using software such as vBulletin or phpBB) popular in the early 2000s for online communities. They are often used to provide spaces for a community to exist around a given website or topic.

Legacy Online Forum Platforms allow users to create Accounts to join discussion threads posted to any number of Forums and Sub-Forums on the platform. Forums and Sub-Forums can be Gated, allowing access to approved users only. Platforms vary in size: some host a wide set of topics and communities, while others are smaller in scope.
TA07
T0151.010 Community Forum Platform Reddit, Lemmy and Tildes are examples of Community Forum Platforms.

Community Forum Platforms are exemplified by users’ ability to create their own sub-communities (Community Sub-Forums) which other platform users can join.

Platform users can view aggregated content from all Community Sub-Forums they subscribe to, or they can view all content from a particular Community Sub-Forum.
TA07
T0151.011 Community Sub-Forum Community Forum Platforms are made up of many Community Sub-Forums. Sub-Forums provide spaces for platform users to create a community based around any topic.

For example, Reddit (a popular Community Forum Platform) has over 138,000 “subreddits” (Community Sub-Forums), including 1082 unique cat-based communities.

Typically, Sub-Forums allow users to post text, images, or videos, which other platform users can upvote, downvote, or comment on. Sub-Forums may have their own extra rules alongside the platform’s global rules, enforced by community moderators.

While most Sub-Forums are made by users with Accounts on the Community Forum Platform, Sub-Forums can also be created by the platform itself.
TA07
T0151.012 Image Board Platform 4chan and 8chan are examples of Image Board Platforms.

Image Board Platforms provide individual boards on which users can start threads related to the board’s topic. For example, 4chan’s /pol/ board provides a space for users to talk about politics.

Most Image Board Platforms allow users to post without creating an account. Posts are typically made anonymously, although users can choose to post under a pseudonym.
TA07
T0151.013 Question and Answer Platform Quora, Stack Overflow, and Yahoo Answers are examples of Question and Answer Platforms.

Question and Answer Platforms allow users to create Accounts letting them post questions to the platform community, and respond to other platform users’ questions.
TA07
T0151.014 Comments Section Many platforms enable community interaction via Comments Sections on posted content. Comments Sections allow platform users to comment on content posted by other users.

On some platforms Comments Sections are the only place available for community interaction, such as news websites which provide a Comments Section to discuss articles posted to the website.
TA07
T0151.015 Online Game Platform Roblox, Minecraft, Fortnite, League of Legends, and World of Warcraft are examples of Online Game Platforms.

Online Game Platforms allow users to create Accounts which they can use to access Online Game Sessions; i.e. an individual instance of a multiplayer online game.

Many Online Game Platforms support text or voice chat within Online Game Sessions.
TA07
T0151.016 Online Game Session Online Game Sessions are instances of a game played on an Online Game Platform. Examples of Online Game Sessions include a match in Fortnite or League of Legends, or a server in Minecraft, Fortnite, or World of Warcraft.

Some Online Game Platforms (such as Fortnite, League of Legends, and World of Warcraft) host Online Game Sessions on their own Servers, and don’t allow other actors to host Online Game Sessions.

Some Online Game Platforms (such as Roblox and Minecraft) allow users to host instances of Online Game Sessions on their own Servers.
TA07
T0151.017 Dating Platform Tinder, Bumble, Grindr, Tantan, Badoo, Plenty of Fish, Hinge, LOVOO, OkCupid, happn, and Mamba are examples of Dating Platforms.

Dating Platforms allow users to create Accounts, letting them connect with other platform users with the purpose of developing a physical/romantic relationship.
TA07
T0152 Digital Content Hosting Asset Digital Content Hosting Assets are online assets which are primarily designed to allow actors to upload content to the internet.

Sub-techniques categorised under Digital Content Hosting Assets can include Community Hosting and Content Delivery capabilities; however their nominal primary purpose is to host content online.
TA07
T0152.001 Blogging Platform Medium and Substack are examples of Blogging Platforms.

By creating an Account on a Blogging Platform, people are able to create their own Blog.
TA07
T0152.002 Blog Asset Blogs are a collation of posts centred on a particular topic, author, or collection of authors.

Some platforms are designed to support users in hosting content online, such as Blogging Platforms like Substack, which allow users to create Blogs. Other online platforms can also be used to produce a Blog; for example, a Paid Account on X (prev. Twitter) is able to post long-form text content to its timeline in the style of a blog.

Actors may create Accounts on Blogging Platforms to create a Blog, or make their own Blog on a Website.
TA07
T0152.003 Website Hosting Platform Examples of Website Hosting Platforms include Wix, Webflow, Weebly, and Wordpress.

Website Hosting Platforms help users manage the online infrastructure required to host a website, such as securing IP Addresses and Domains.
TA07
T0152.004 Website Asset A Website is a collection of related web pages hosted on a server and accessible via a web browser. Websites have an associated Domain and can host various types of content, such as text, images, videos, and interactive features.

When a Website is fleshed out, it Presents a Persona to site visitors. For example, the Domain “bbc.co.uk/news” hosts a Website which uses the News Outlet Persona.
TA07
T0152.005 Paste Platform Pastebin is an example of a Paste Platform.

Paste Platforms allow people to upload unformatted text to the platform, which they can share via a link. Some Paste Platforms are Open Access Platforms which allow users to upload content without creating an Account first.
TA07
T0152.006 Video Platform YouTube, Vimeo, and LiveLeak are examples of Video Platforms.

Video Platforms allow people to create Accounts which they can use to upload video content for people to watch on the platform.

The ability to host videos is not exclusive to Video Platforms; many online platforms allow users with Accounts to upload video content. However, Video Platforms’ primary purpose is to be a place to host and view video content.
TA07
T0152.007 Audio Platform Soundcloud, Spotify, and YouTube Music; Apple Podcasts, Podbean, and Captivate are examples of Audio Platforms.

Audio Platforms allow people to create Accounts which they can use to upload audio content to the platform.

The ability to host audio is not exclusive to Audio Platforms; many online platforms allow users with Accounts to upload audio content. However, Audio Platforms’ primary purpose is to be a place to host and listen to audio content.
TA07
T0152.008 Live Streaming Platform Twitch.tv and Whatnot are examples of Live Streaming Platforms.

Live Streaming Platforms allow people to create Accounts and stream live content (video or audio). A temporary open Group Chat is created alongside live streamed content for viewers to discuss the stream. Some Live Streaming Platforms allow users to archive streamed content for later non-live viewing.

The ability to stream live media is not exclusive to Live Streaming Platforms; many online platforms allow users with Accounts to stream content (such as the Video Platform YouTube’s “YouTube Live”, and the Social Media Platform Facebook’s “Facebook Live”). However, Live Streaming Platforms’ primary purpose is to be a place for people to stream content live.
TA07
T0152.009 Software Delivery Platform Apple’s App Store, Google’s Google Play Store, and Valve’s Steam are examples of Software Delivery Platforms.

Software Delivery Platforms are designed to enable users to download programmes uploaded to the platform. Software can be purchased, or downloaded for free.

Some Software Delivery Platforms require users to have an Account before they can download software, and software they acquire becomes associated with the account (i.e. the account owns a licence to download the software). Some platforms don’t require users to make accounts before downloading software.

Actors may create their own Software Delivery Platform on a Domain they own.
TA07
T0152.010 File Hosting Platform Dropbox and Google Drive are examples of File Hosting Platforms.

File Hosting Platforms allow people to create Accounts which they can use to host files on another server, enabling access to content on any machine, and the ability to easily share files with anyone online.

Actors may also create their own File Hosting Platform on a Website or Server they control.
TA07
T0152.011 Wiki Platform Wikipedia, Fandom, Ruwiki, TV Tropes, and the SCP Foundation are examples of Wiki Platforms.

Wikis use wiki software to allow platform users to collaboratively create and maintain an encyclopedia of information related to a given topic.
TA07
T0152.012 Subscription Service Platform Patreon, Fansly, and OnlyFans are examples of Subscription Service Platforms.

Subscription Service Platforms enable users with Accounts to host online content which other platform users can subscribe to access. Content typically requires a Paid Subscription to access; however, open content is often also supported.
TA07
T0153 Digital Content Delivery Asset Digital Content Delivery Assets are assets which support the delivery of content to users online.

Sub-techniques categorised under Digital Content Delivery Assets can include Community Hosting and Content Hosting capabilities; however their nominal primary purpose is to support the delivery of content to users online.
TA07
T0153.001 Email Platform Gmail, iCloud mail, and Microsoft Outlook are examples of Email Platforms.

Email Platforms are online platforms which allow people to create Accounts that they can use to send and receive emails to and from other email accounts.

Instead of using an Email Platform, actors may set up their own Email Domain, letting them send and receive emails on a custom domain.

Analysts should default to Email Platform if they cannot confirm whether an email was sent using a privately operated email, or via an account on a public email platform (for example, in situations where analysts are coding third party reporting which does not specify the type of email used).
TA07
T0153.002 Link Shortening Platform Bitly and TinyURL are examples of Link Shortening Platforms.

Link Shortening Platforms are online platforms which allow people to create Accounts that they can use to convert existing URLs into Shortened Links, or into QR Codes.
TA07
T0153.003 Shortened Link Asset A Shortened Link is a compact custom URL which redirects users to another, typically much longer, URL. TA07
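The mechanics can be sketched as a minimal in-memory shortener. This is an illustration only: real platforms persist the mapping server-side and issue redirects from their own Domain, and "sho.rt" is a made-up domain.

```python
import hashlib

class LinkShortener:
    # Minimal sketch: maps a short code to the original URL in memory.
    def __init__(self):
        self._links = {}

    def shorten(self, url: str) -> str:
        # Derive a 7-character code from a hash of the URL (illustrative choice).
        code = hashlib.sha256(url.encode()).hexdigest()[:7]
        self._links[code] = url
        return f"https://sho.rt/{code}"  # hypothetical shortener domain

    def resolve(self, code: str) -> str:
        # A real service would respond with an HTTP redirect to this URL.
        return self._links[code]

s = LinkShortener()
short = s.shorten("https://example.com/some/long/path?with=params")
print(short)
```

Because the destination is hidden behind the short code until resolved, Shortened Links can obscure where a link actually leads, which is why they appear in this taxonomy.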
T0153.004 QR Code Asset A QR Code is a scannable matrix barcode which can encode data such as a URL, allowing people to open it by scanning the code with a smartphone camera. TA07
T0153.005 Online Advertising Platform Google Ads, Facebook Ads, and LinkedIn Marketing Solutions are examples of Online Advertising Platforms.

Online Advertising Platforms are online platforms which allow people to create Accounts that they can use to upload and deliver adverts to people online.
TA07
T0153.006 Content Recommendation Algorithm Many online platforms have Content Recommendation Algorithms, which promote content posted to the platform to users based on metrics the platform operators are trying to meet. Algorithms typically surface platform content which the user is likely to engage with, based on how they and other users have behaved on the platform. TA07
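The behaviour described above can be sketched as a toy ranking function. Real platforms use learned models over many signals; the linear weights, field names, and example data below are arbitrary assumptions for illustration.

```python
def score_post(post, user):
    # Toy linear score weighting recency, author affinity, and engagement.
    recency = 1.0 / (1.0 + post["age_hours"])
    affinity = 1.0 if post["author"] in user["follows"] else 0.2
    engagement = min(post["likes"] / 100.0, 1.0)
    return 0.5 * recency + 0.3 * affinity + 0.2 * engagement

def build_feed(posts, user, k=3):
    # Rank all candidate posts for this user and surface the top k.
    return sorted(posts, key=lambda p: score_post(p, user), reverse=True)[:k]

posts = [
    {"id": "a", "author": "alice", "age_hours": 1, "likes": 10},
    {"id": "b", "author": "bob", "age_hours": 24, "likes": 500},
    {"id": "c", "author": "carol", "age_hours": 2, "likes": 0},
]
user = {"follows": {"alice"}}
print([p["id"] for p in build_feed(posts, user)])  # → ['a', 'b', 'c']
```

Even this toy version shows why such algorithms matter to influence operations: content that games the metrics the platform optimises for (engagement, affinity networks) is surfaced to more users.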
T0153.007 Direct Messaging Many online platforms allow users to contact other platform users via Direct Messaging; private messaging which can be initiated by a user with other platform users.

Examples include messaging on WhatsApp, Telegram, and Signal; direct messages (DMs) on Facebook or Instagram.

Some platforms’ Direct Messaging capabilities provide users with Encrypted Communication.
TA07
T0154 Digital Content Creation Asset Digital Content Creation Assets are Platforms or Software which help actors produce content for publication online. TA07
T0154.001 AI LLM Platform OpenAI’s ChatGPT, Google’s Bard, Microsoft’s Turing-NLG, Google’s T5 (Text-to-Text Transfer Transformer), and Facebook’s BART are examples of AI LLM (Large Language Model) Platforms.

AI LLM Platforms are online platforms which allow people to create Accounts that they can use to interact with the platform’s AI Large Language Model, to produce text-based content.

LLMs can create hyper-realistic synthetic text that is both scalable and persuasive. LLMs can largely automate content production, reducing the overhead in persona creation, and generate culturally appropriate outputs that are less prone to exhibiting conspicuous signs of inauthenticity.

Some platforms implement protections against misuse of AI by their users. Threat Actors have been observed bypassing these protections using prompt injections, poisoning, jailbreaking, or integrity attacks.
TA07
T0154.002 AI Media Platform AI Media Platforms are online platforms that allow people to create Accounts which they can use to produce image, video, or audio content (also known as “deepfakes”) using the platform’s AI Software.

Midjourney, DALL-E, Stable Diffusion, and Adobe Firefly are examples of AI Media Platforms which allow users to Develop AI-Generated Images, AI-Generated Videos and AI-Generated Account Imagery.

Similarly, Reface, Zao, FaceApp, and Wombo are mobile apps which offer features for creating AI-Generated videos, gifs, or trending memes.

AI-Generated Audio technologies such as text-to-speech and voice cloning have revolutionised the creation of synthetic voices that closely mimic human speech. AI Media Platforms such as Descript, Fliki, Murf AI, PlayHT, and Resemble AI can be used to generate synthetic voice.

Some platforms implement protections against misuse of AI by their users. Threat Actors have been observed bypassing these protections using prompt injections, poisoning, jailbreaking, or integrity attacks.
TA07
T0155 Gated Asset Some assets are Gated; closed communities or platforms which can’t be accessed openly. They may be password protected or require admin approval for entry. Many different digital assets can be gated. This technique contains sub-techniques with methods used to gate assets. Analysts can use T0155: Gated Asset if the method of gating is unclear. TA07
T0155.001 Password Gated Asset A Password Gated Asset is an online asset which requires a password to gain access.

Examples include password protected Servers set up to be a File Hosting Platform, or password protected Community Sub-Forums.
TA07
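Password gating can be sketched with a standard salted-hash check. This is a minimal illustration using Python's stdlib; the iteration count is an illustrative choice, not a security recommendation.

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    # PBKDF2-HMAC-SHA256 with a random per-password salt.
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def grant_access(stored_salt, stored_digest, attempt):
    # Constant-time comparison avoids leaking information via timing.
    _, digest = hash_password(attempt, stored_salt)
    return hmac.compare_digest(digest, stored_digest)

salt, digest = hash_password("correct horse battery staple")
print(grant_access(salt, digest, "correct horse battery staple"))  # → True
print(grant_access(salt, digest, "guessed password"))              # → False
```

For analysts, the practical consequence is that the content of a Password Gated Asset is invisible without credentials, so gated communities must be documented from leaked material or insider access.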
T0155.002 Invite Gated Asset An Invite Gated Asset is an online asset which requires an existing user to invite other users for access to the asset.

Examples include Chat Groups in which Administrator Accounts are able to add or remove users, or File Hosting Platforms which allow users to invite other users to access their files.
TA07
T0155.003 Approval Gated Asset An Approval Gated Asset is an online asset which requires approval from Administrator Accounts for access to the asset.

Examples include Online Community Groups on Facebook, which can be configured to require questions and approval before access, and Accounts on Social Media Platforms such as Instagram, which allow users to set their accounts as visible to approved friends only.
TA07
T0155.004 Geoblocked Asset A Geoblocked Asset is an online asset which cannot be accessed in specific geographical locations.

Assets can be Geoblocked by choice of the platform, or can have Geoblocking mandated by regulators, and enforced through Internet Service Providers.
TA07
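A platform-side geoblock can be sketched as a web-server configuration that maps visitor country codes to a block flag. This is a hypothetical nginx sketch: it assumes nginx built with the third-party ngx_http_geoip2 module and a MaxMind country database on disk, and the blocked codes are placeholders.

```nginx
# Assumption: ngx_http_geoip2 module and GeoLite2 database are installed.
geoip2 /etc/nginx/GeoLite2-Country.mmdb {
    $geoip2_country_code country iso_code;
}

# Map visitor country codes to a blocked/allowed flag.
map $geoip2_country_code $geoblocked {
    default 0;
    XA      1;   # placeholder codes; real configs list real ISO country codes
    XB      1;
}

server {
    listen 80;
    if ($geoblocked) {
        return 451;   # HTTP 451: Unavailable For Legal Reasons
    }
    location / {
        root /var/www/html;
    }
}
```

Regulator-mandated geoblocking enforced by Internet Service Providers works differently (DNS or IP-level filtering upstream of the platform), which is why the summary distinguishes the two cases.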
T0155.005 Paid Access Asset A Paid Access Asset is an online asset which requires a single payment for permanent access to the asset. TA07
T0155.006 Subscription Access Asset A Subscription Access Asset is an online asset which requires a continued subscription for access to the asset.

Examples include the Blogging Platform Substack, which affords Blogs hosted on their platform the ability to produce subscriber-only posts, and the Subscription Service Platform Patreon.
TA07
T0155.007 Encrypted Communication Channel Some online platforms support encrypted communication between platform users, for example the Chat Platforms Telegram and Signal. TA07