DISARMframeworks/DISARM_MASTER_DATA/techniques.csv
Sara-Jayne Terp 1bc8d88b63 moved to datasets as CSVs
Changed from data held in excelfiles to data held in CSV files.  This gives us a better view of what's changed in the datasets when we push them to git.
2022-08-25 09:50:52 -04:00


disarm_id,name,name_DE,tactic_id,summary,summary_DE,changes from v0.1,longname
T0002,Facilitate State Propaganda,,TA02,Organize citizens around pro-state messaging. Coordinate paid or volunteer groups to push state propaganda.,,no change,T0002 - Facilitate State Propaganda
T0003,Leverage Existing Narratives,,TA14,"Use or adapt existing narrative themes, where narratives are the baseline stories of a target audience. Narratives form the bedrock of our worldviews. New information is understood through a process firmly grounded in this bedrock. If new information is not consistent with the prevailing narratives of an audience, it will be ignored. Effective campaigns will frame their misinformation in the context of these narratives. Highly effective campaigns will make extensive use of audience-appropriate archetypes and meta-narratives throughout their content creation and amplification practices.",,no change,T0003 - Leverage Existing Narratives
T0004,Develop Competing Narratives,,TA14,"Advance competing narratives connected to the same issue, i.e. on one hand deny an incident while at the same time dismissing it. Suppressing or discouraging narratives already spreading requires an alternative. The simplest set of narrative techniques in response would be the construction and promotion of contradictory alternatives centered on denial, deflection, dismissal, counter-charges, excessive standards of proof, bias in prohibition or enforcement, and so on. These competing narratives allow loyalists cover, but are less compelling to opponents and fence-sitters than campaigns built around existing narratives or highly explanatory master narratives. Competing narratives, as such, are especially useful in the ""firehose of misinformation"" approach.",,no change,T0004 - Develop Competing Narratives
T0007,Create Inauthentic Social Media Pages and Groups,,TA15,"Create key social engineering assets needed to amplify content, manipulate algorithms, and fool the public and/or specific incident/campaign targets. Computational propaganda depends substantially on false perceptions of credibility and acceptance. By creating fake users and groups with a variety of interests and commitments, attackers can ensure that their messages both come from trusted sources and appear more widely adopted than they actually are.",,no change,T0007 - Create Inauthentic Social Media Pages and Groups
T0009,Create fake experts,,TA16,"Stories planted or promoted in computational propaganda operations often make use of experts fabricated from whole cloth, sometimes specifically for the story itself.",,no change,T0009 - Create fake experts
T0009.001,Utilize Academic/Pseudoscientific Justifications,,TA16,Utilize Academic/Pseudoscientific Justifications,,no change,Utilize Academic/Pseudoscientific Justifications -
T0010,Cultivate ignorant agents,,TA15,"Cultivate propagandists for a cause, the goals of which are not fully comprehended, and who are used cynically by the leaders of the cause. Independent actors use social media and specialised web sites to strategically reinforce and spread messages compatible with their own. Their networks are infiltrated and used by state media disinformation organisations to amplify the state's own disinformation strategies against target populations. Many are traffickers in conspiracy theories or hoaxes, unified by a suspicion of Western governments and mainstream media. Their narratives, which appeal to leftists hostile to globalism and military intervention and nationalists against immigration, are frequently infiltrated and shaped by state-controlled trolls and altered news items from agencies such as RT and Sputnik. Also known as ""useful idiots"" or ""unwitting agents"".",,no change,T0010 - Cultivate ignorant agents
T0011,Compromise legitimate accounts,,TA16,Hack or take over legitimate accounts to distribute misinformation or damaging content.,,no change,T0011 - Compromise legitimate accounts
T0013,Create inauthentic websites,,TA15,"Create media assets to support inauthentic organizations (e.g. think tank), people (e.g. experts) and/or serve as sites to distribute malware/launch phishing operations.",,no change,T0013 - Create inauthentic websites
T0014,Prepare fundraising campaigns,,TA15,"Fundraising campaigns refer to an influence operation's systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while raising a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipeee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns (see: Develop Information Pathways) to promote operation messaging while raising money to support its activities.",,no change,T0014 - Prepare fundraising campaigns
T0014.001,Raise funds from malign actors,,TA15,"Raising funds from malign actors may include contributions from foreign agents, cutouts or proxies, shell companies, dark money groups, etc.",,no change,Raise funds from malign actors -
T0014.002,Raise funds from ignorant agents,,TA15,"Raising funds from ignorant agents may include scams, donations intended for one stated purpose but then used for another, etc.",,no change,Raise funds from ignorant agents -
T0015,Create hashtags and search artifacts,,TA06,"Create one or more hashtags and/or hashtag groups. Many incident-based campaigns will create hashtags to promote their fabricated event. Creating a hashtag for an incident can have two important effects: 1. Create a perception of reality around an event. Certainly only ""real"" events would be discussed in a hashtag. After all, the event has a name! 2. Publicize the story more widely through trending lists and search behavior. Asset needed to direct/control/manage the ""conversation"" connected to launching a new incident/campaign with a new hashtag (for applicable social media sites).",,no change,T0015 - Create hashtags and search artifacts
T0016,Create Clickbait,,TA05,"Create attention-grabbing headlines (outrage, doubt, humor) required to drive traffic & engagement. This is a key asset.",,no change,T0016 - Create Clickbait
T0017,Conduct fundraising,,TA10,"Fundraising campaigns refer to an influence operation's systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while raising a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipeee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns to promote operation messaging while raising money to support its activities.",,no change,T0017 - Conduct fundraising
T0017.001,Conduct Crowdfunding Campaigns,,TA10,"An influence operation may Conduct Crowdfunding Campaigns on platforms such as GoFundMe, GiveSendGo, Tipeee, Patreon, etc.",,no change,Conduct Crowdfunding Campaigns -
T0018,Purchase Targeted Advertisements,,TA05,Create or fund advertisements targeted at specific populations.,,no change,T0018 - Purchase Targeted Advertisements
T0019,Generate information pollution,,TA06,"Flood social channels; drive traffic/engagement to all assets; create an aura/sense/perception of pervasiveness/consensus (for or against, or both simultaneously) on an issue or topic. ""Nothing is true, but everything is possible."" Akin to an astroturfing campaign.",,no change,T0019 - Generate information pollution
T0019.001,Create fake research,,TA06,"Create fake academic research. Example: fake social science research is often aimed at hot-button social issues such as gender, race and sexuality. Fake science research can target the climate science debate or pseudoscience like anti-vaxx.",,no change,Create fake research -
T0019.002,Hijack Hashtags,,TA06,"Hashtag hijacking occurs when users “[use] a trending hashtag to promote topics that are substantially different from its recent context” (VanDam and Tan, 2016) or “to promote one's own social media agenda” (Darius and Stephany, 2019).",,no change,Hijack Hashtags -
T0020,Trial content,,TA08,"Iteratively test incident performance (messages, content, etc.), e.g. A/B test headline/content engagement metrics; website and/or funding campaign conversion rates.",,no change,T0020 - Trial content
T0022,Leverage Conspiracy Theory Narratives,,TA14,"""Conspiracy narratives"" appeal to the human desire for explanatory order, by invoking the participation of powerful (often sinister) actors in pursuit of their own political goals. These narratives are especially appealing when an audience is low-information, marginalized or otherwise inclined to reject the prevailing explanation. Conspiracy narratives are an important component of the ""firehose of falsehoods"" model.",,no change,T0022 - Leverage Conspiracy Theory Narratives
T0022.001,Amplify Existing Conspiracy Theory Narratives,,TA14,"An influence operation may amplify an existing conspiracy theory narrative that aligns with its incident or campaign goals. By amplifying existing conspiracy theory narratives, operators can leverage the power of the existing communities that support and propagate those theories without needing to expend resources creating new narratives or building momentum and buy-in around new narratives.",,no change,Amplify Existing Conspiracy Theory Narratives -
T0022.002,Develop Original Conspiracy Theory Narratives,,TA14,"While this requires more resources than amplifying existing conspiracy theory narratives, an influence operation may develop original conspiracy theory narratives in order to achieve greater control and alignment over the narrative and its campaign goals. Prominent examples include the USSR's Operation INFEKTION disinformation campaign run by the KGB in the 1980s to plant the idea that the United States had invented HIV/AIDS as part of a biological weapons research project at Fort Detrick, Maryland. More recently, Fort Detrick featured prominently in new conspiracy theory narratives around the origins of the COVID-19 outbreak and pandemic.",,no change,Develop Original Conspiracy Theory Narratives -
T0023,Distort facts,,TA06,"Change, twist, or exaggerate existing facts to construct a narrative that differs from reality. Examples: images and ideas can be distorted by being placed in an improper context.",,no change,T0023 - Distort facts
T0023.001,Reframe Context,,TA06,"Reframing context refers to removing an event from its surrounding context to distort its intended meaning. Rather than deny that an event occurred, reframing context frames an event in a manner that may lead the target audience to draw a different conclusion about its intentions.",,no change,Reframe Context -
T0023.002,Edit Open-Source Content,,TA06,"An influence operation may edit open-source content, such as collaborative blogs or encyclopedias, to promote its narratives on outlets with existing credibility and audiences. Editing open-source content may allow an operation to post content on platforms without dedicating resources to the creation and maintenance of its own assets.",,no change,Edit Open-Source Content -
T0029,Online polls,,TA07,"Create fake online polls, or manipulate existing online polls. Data gathering tactic to target those who engage, and potentially their networks of friends/followers as well.",,no change,T0029 - Online polls
T0039,Bait legitimate influencers,,TA08,"Credibility in a social media environment is often a function of the size of a user's network. ""Influencers"" are so-called because of their reach, typically understood as: 1) the size of their network (i.e. the number of followers, perhaps weighted by their own influence); and 2) the rate at which their comments are re-circulated (these two metrics are related). Add traditional media players at all levels of credibility and professionalism to this, and the number of potential influential carriers available for unwitting amplification becomes substantial. By targeting high-influence people and organizations in all types of media with narratives and content engineered to appeal to their emotional or ideological drivers, influence campaigns are able to add perceived credibility to their messaging via saturation and adoption by trusted agents such as celebrities, journalists and local leaders.",,no change,T0039 - Bait legitimate influencers
T0040,Demand insurmountable proof,,TA14,"Campaigns often leverage tactical and informational asymmetries on the threat surface, as seen in the Distort and Deny strategies, and the ""firehose of misinformation"". Specifically, conspiracy theorists can be repeatedly wrong, but advocates of the truth need to be perfect. By constantly escalating demands for proof, propagandists can effectively leverage this asymmetry while also priming its future use, often with an even greater asymmetric advantage. The conspiracist is offered freer rein for a broader range of ""questions"" while the truth teller is burdened with higher and higher standards of proof.",,no change,T0040 - Demand insurmountable proof
T0042,Seed Kernel of truth,,TA08,"Wrap lies or altered context/facts around truths. Influence campaigns pursue a variety of objectives with respect to target audiences, prominent among them: 1. undermine a narrative commonly referenced in the target audience; or 2. promote a narrative less common in the target audience, but preferred by the attacker. In both cases, the attacker is presented with a heavy lift. They must change the relative importance of various narratives in the interpretation of events, despite contrary tendencies. When messaging makes use of factual reporting to promote these adjustments in the narrative space, it is less likely to be dismissed out of hand; when messaging can juxtapose a (factual) truth about current affairs with the (abstract) truth explicated in these narratives, propagandists can undermine or promote them selectively. Context matters.",,no change,T0042 - Seed Kernel of truth
T0043,Chat apps,,TA07,"Direct messaging via chat app is an increasingly common method of delivery. These messages are often automated, and new delivery and storage methods make them anonymous, viral, and ephemeral. This is a difficult space to monitor, but also a difficult space in which to build acclaim or notoriety.",,no change,T0043 - Chat apps
T0043.001,Use Encrypted Chat Apps,,TA07,"Examples include Signal, WhatsApp, Discord, Wire, etc.",,no change,Use Encrypted Chat Apps -
T0043.002,Use Unencrypted Chat Apps,,TA07,"Examples include SMS, etc.",,no change,Use Unencrypted Chat Apps -
T0044,Seed distortions,,TA08,"Try a wide variety of messages in the early hours surrounding an incident or event, to give a misleading account or impression.",,no change,T0044 - Seed distortions
T0045,Use fake experts,,TA08,"Use the fake experts that were set up during Establish Legitimacy. Pseudo-experts are disposable assets that often appear once and then disappear. Give ""credibility"" to misinformation. Take advantage of credential bias.",,no change,T0045 - Use fake experts
T0046,Use Search Engine Optimization,,TA08,"Manipulate content engagement metrics (i.e. Reddit & Twitter) to influence/impact news search results (e.g. Google); also elevates RT & Sputnik headlines into Google news alert emails. Aka ""black-hat SEO"".",,no change,T0046 - Use Search Engine Optimization
T0047,Censor social media as a political force,,TA18,"Use political influence or the power of the state to stop critical social media comments. Government-requested/driven content takedowns (see Google Transparency reports).",,no change,T0047 - Censor social media as a political force
T0048,Harass,,TA18,"Threatening or harassing believers of opposing narratives refers to the use of intimidation techniques, including cyberbullying and doxing, to discourage opponents from voicing their dissent. An influence operation may threaten or harass believers of opposing narratives to deter individuals from posting or proliferating conflicting content.",,no change,T0048 - Harass
T0048.001,"Boycott/""Cancel"" Opponents",,TA18,"Cancel culture refers to the phenomenon in which individuals collectively refrain from supporting an individual, organization, business, or other entity, usually following a real or falsified controversy. An influence operation may exploit cancel culture by emphasizing an adversary's problematic or disputed behavior and presenting its own content as an alternative.",,no change,"Boycott/""Cancel"" Opponents -"
T0048.002,Harass People Based on Identities,,TA18,"Examples include social identities like gender, sexuality, race, ethnicity, religion, ability, nationality, etc. as well as roles and occupations like journalist or activist.",,no change,Harass People Based on Identities -
T0048.003,Threaten to Dox,,TA18,"Doxing refers to online harassment in which individuals publicly release private information about another individual, including names, addresses, employment information, pictures, family members, and other sensitive information. An influence operation may threaten to dox its opposition to encourage individuals aligned with operation narratives to harass the doxed individuals themselves or otherwise discourage the doxed individuals from posting or proliferating conflicting content.",,no change,Threaten to Dox -
T0048.004,Dox,,TA18,"Doxing refers to online harassment in which individuals publicly release private information about another individual, including names, addresses, employment information, pictures, family members, and other sensitive information. An influence operation may dox its opposition to encourage individuals aligned with operation narratives to harass the doxed individuals themselves or otherwise discourage the doxed individuals from posting or proliferating conflicting content.",,no change,Dox -
T0049,Flooding the Information Space,,TA17,"Flooding and/or mobbing social media channels, feeds, and/or hashtags with an excessive volume of content to control/shape online conversations and/or drown out opposing points of view. Bots and/or patriotic trolls are effective tools for achieving this effect.",,no change,T0049 - Flooding the Information Space
T0049.001,Trolls amplify and manipulate,,TA17,"Use trolls to amplify narratives and/or manipulate narratives. Fake profiles/sockpuppets operate to support individuals/narratives from the entire political spectrum (left/right binary). They operate with increased emphasis on promoting local content and promoting real Twitter users generating their own, often divisive political content, as it's easier to amplify existing content than to create new/original content. Trolls operate wherever there's a socially divisive issue (issues that can be/are politicized).",,no change,Trolls amplify and manipulate -
T0049.002,Hijack existing hashtag,,TA17,Take over an existing hashtag to drive exposure.,,no change,Hijack existing hashtag -
T0049.003,Bots Amplify via Automated Forwarding and Reposting,,TA17,"Automated forwarding and reposting refer to the proliferation of operation content using automated means, such as artificial intelligence or social media bots. An influence operation may use automated activity to increase content exposure without dedicating the resources, including personnel and time, traditionally required to forward and repost content. Use bots to amplify narratives above algorithm thresholds. Bots are automated/programmed profiles designed to amplify content (i.e. automatically retweet or like) and give the appearance that it is more ""popular"" than it is. They can operate as a network, functioning in a coordinated/orchestrated manner. In some cases (more so now) they are inexpensive/disposable assets used for minimal deployment, as bot detection tools improve and platforms become more responsive.",,no change,Bots Amplify via Automated Forwarding and Reposting -
T0049.004,Utilize Spamoflauge,,TA17,"Spamoflauge refers to the practice of disguising spam messages as legitimate. Spam refers to the use of electronic messaging systems to send out unrequested or unwanted messages in bulk. Simple methods of spamoflauge include replacing letters with numbers to fool keyword-based email spam filters, for example, ""you've w0n our jackp0t!"". Spamoflauge may extend to more complex techniques such as modifying the grammar or word choice of the language, casting messages as images which spam detectors cannot automatically read, or encapsulating messages in password-protected attachments, such as .pdf or .zip files. Influence operations may use spamoflauge to avoid spam filtering systems and increase the likelihood of the target audience receiving operation messaging.",,no change,Utilize Spamoflauge -
T0049.005,Conduct Swarming,,TA17,"Swarming refers to the coordinated use of accounts to overwhelm the information space with operation content. Unlike information flooding, swarming centers exclusively around a specific event or actor rather than a general narrative. Swarming relies on “horizontal communication” between information assets rather than a top-down, vertical command-and-control approach.",,no change,Conduct Swarming -
T0049.006,Conduct Keyword Squatting,,TA17,"Keyword squatting refers to the creation of online content, such as websites, articles, or social media accounts, around a specific search engine-optimized term to overwhelm the search results for that term. An influence operation may keyword squat to increase content exposure to target audience members who query the exploited term in a search engine and to manipulate the narrative around the term.",,no change,Conduct Keyword Squatting -
T0049.007,Inauthentic Sites Amplify News and Narratives,,TA17,"Inauthentic sites circulate and cross-post stories to amplify narratives. Often these sites have no masthead, bylines or attribution.",,no change,Inauthentic Sites Amplify News and Narratives -
T0057,Organize Events,,TA10,"Coordinate and promote real-world events across media platforms, e.g. rallies, protests, gatherings in support of incident narratives.",,no change,T0057 - Organize Events
T0057.001,Pay for Physical Action,,TA10,"Paying for physical action occurs when an influence operation pays individuals to act in the physical realm. An influence operation may pay for physical action to create specific situations and frame them in a way that supports operation narratives, for example, paying a group of people to burn a car in order to later post an image of the burning car and frame it as an act of protest.",,no change,Pay for Physical Action -
T0057.002,Conduct Symbolic Action,,TA10,"Symbolic action refers to activities specifically intended to advance an operation's narrative by signaling something to the audience, for example, a military parade supporting a state's narrative of military superiority. An influence operation may use symbolic action to create falsified evidence supporting operation narratives in the physical information space.",,no change,Conduct Symbolic Action -
T0059,Play the long game,,TA11,"Play the long game refers to two phenomena: 1. To plan messaging and allow it to grow organically without conducting your own amplification. This is methodical and slow and requires years for the message to take hold. 2. To develop a series of seemingly disconnected messaging narratives that eventually combine into a new narrative.",,no change,T0059 - Play the long game
T0060,Continue to Amplify,,TA11,Continue narrative or message amplification after the main incident work has finished.,,no change,T0060 - Continue to Amplify
T0061,Sell Merchandise,,TA10,Selling merchandise refers to getting the message or narrative into physical space in the offline world while making money.,,no change,T0061 - Sell Merchandise
T0065,Prepare Physical Broadcast Capabilities,,TA15,"Create or co-opt broadcast capabilities (e.g. TV, radio, etc.).",,no change,T0065 - Prepare Physical Broadcast Capabilities
T0066,Degrade Adversary,,TA02,Plan to degrade an adversary's image or ability to act. This could include preparation and use of harmful information about the adversary's actions or reputation.,,no change,T0066 - Degrade Adversary
T0068,Respond to Breaking News Event or Active Crisis,,TA14,"Media attention on a story or event is heightened during a breaking news event, where unclear facts and incomplete information increase speculation, rumors, and conspiracy theories, all of which are vulnerable to manipulation.",,no change,T0068 - Respond to Breaking News Event or Active Crisis
T0072,Segment Audiences,,TA13,"Create audience segmentations by features of interest to the influence campaign, including political affiliation, geographic location, income, demographics, and psychographics.",,no change,T0072 - Segment Audiences
T0072.001,Geographic Segmentation,,TA13,"An influence operation may target populations in a specific geographic location, such as a region, state, or city. An influence operation may use geographic segmentation to Create Localized Content (see: Establish Legitimacy).",,no change,Geographic Segmentation -
T0072.002,Demographic Segmentation,,TA13,"An influence operation may target populations based on demographic segmentation, including age, gender, and income. Demographic segmentation may be useful for influence operations aiming to change state policies that affect a specific population sector. For example, an influence operation attempting to influence Medicare funding in the United States would likely target U.S. voters over 65 years of age.",,no change,Demographic Segmentation -
T0072.003,Economic Segmentation,,TA13,"An influence operation may target populations based on their income bracket, wealth, or other financial or economic division.",,no change,Economic Segmentation -
T0072.004,Psychographic Segmentation,,TA13,"An influence operation may target populations based on psychographic segmentation, which uses audience values and decision-making processes. An operation may individually gather psychographic data with its own surveys or collection tools, or externally purchase data from social media companies or online surveys, such as personality quizzes.",,no change,Psychographic Segmentation -
T0072.005,Political Segmentation,,TA13,"An influence operation may target populations based on their political affiliations, especially when aiming to manipulate voting or change policy.",,no change,Political Segmentation -
T0073,Determine Target Audiences,,TA01,Determining the target audiences (segments of the population) who will receive campaign narratives and artifacts intended to achieve the strategic ends.,,new,T0073 - Determine Target Audiences
T0074,Determine Strategic Ends,,TA01,"Determining the campaign's goals or objectives. Examples include achieving geopolitical advantage (such as undermining trust in an adversary), gaining domestic political advantage, achieving financial gain, or attaining a policy change.",,new,T0074 - Determine Strategic Ends
T0075,Dismiss,,TA02,"Push back against criticism by dismissing your critics. This might be arguing that the critics use a different standard for you than for other actors or themselves, or arguing that their criticism is biased.",,Split from T0001,T0075 - Dismiss
T0075.001,Discredit Credible Sources,,TA02,"Plan to delegitimize the media landscape and degrade public trust in reporting by discrediting credible sources. This makes it easier to promote influence operation content.",,no change,Discredit Credible Sources -
T0076,Distort,,TA02,"Twist the narrative. Take information, or artifacts like images, and change the framing around them.",,Split from T0001,T0076 - Distort
T0077,Distract,,TA02,"Shift attention to a different narrative or actor, for instance by accusing critics of the same activity that they've accused you of (e.g. police brutality).",,Split from T0001,T0077 - Distract
T0078,Dismay,,TA02,"Threaten the critic or narrator of events. For instance, threaten journalists or news outlets reporting on a story.",,Split from T0001,T0078 - Dismay
T0079,Divide,,TA02,"Create conflict between subgroups, to widen divisions in a community.",,Split from T0001,T0079 - Divide
T0080,Map Target Audience Information Environment,,TA13,"Mapping the target audience information environment analyzes the information space itself, including social media analytics, web traffic, and media surveys. Mapping the information environment may help the influence operation determine the most realistic and popular information channels for reaching its target audience. Mapping the target audience information environment also aids influence operations in determining the most vulnerable areas of the information space to target with messaging.",,new,T0080 - Map Target Audience Information Environment
T0080.001,Monitor Social Media Analytics,,TA13,"An influence operation may use social media analytics to determine which factors will increase the operation content's exposure to its target audience on social media platforms, including views, interactions, and sentiment relating to topics and content types. The social media platform itself or a third-party tool may collect the metrics.",,no change,Monitor Social Media Analytics -
T0080.002,Evaluate Media Surveys,,TA13,"An influence operation may evaluate its own or third-party media surveys to determine what type of content appeals to its target audience. Media surveys may provide insight into an audience's political views, social class, general interests, or other indicators used to tailor operation messaging to its target audience.",,no change,Evaluate Media Surveys -
T0080.003,Identify Trending Topics/Hashtags,,TA13,"An influence operation may identify trending hashtags on social media platforms for later use in boosting operation content. A hashtag refers to a word or phrase preceded by the hash symbol (#) on social media, used to identify messages and posts relating to a specific topic. All public posts that use the same hashtag are aggregated onto a centralized page dedicated to the word or phrase, and sorted either chronologically or by popularity.",,no change,Identify Trending Topics/Hashtags -
T0080.004,Conduct Web Traffic Analysis,,TA13,"An influence operation may conduct web traffic analysis to determine which search engines, keywords, websites, and advertisements gain the most traction with its target audience.",,no change,Conduct Web Traffic Analysis -
T0080.005,Assess Degree/Type of Media Access,,TA13,"An influence operation may survey a target audience's Internet availability and degree of media freedom to determine which target audience members will have access to operation content and on which platforms. An operation may face more difficulty targeting an information environment with heavy restrictions and media control than an environment with independent media, freedom of speech and of the press, and individual liberties.",,no change,Assess Degree/Type of Media Access -
82T0081Identify Social and Technical VulnerabilitiesTA13Identifying social and technical vulnerabilities determines weaknesses within the target audience information environment for later exploitation. Vulnerabilities include decisive political issues, weak cybersecurity infrastructure, search engine data voids, and other technical and non technical weaknesses in the target information environment. Identifying social and technical vulnerabilities facilitates the later exploitation of the identified weaknesses to advance operation objectives. newT0081 - Identify Social and Technical Vulnerabilities
83T0081.001Find Echo ChambersTA13Find or plan to create areas (social media groups, search term groups, hashtag groups etc) where individuals only engage with people they agree with. no changeFind Echo Chambers -
84T0081.002Identify Data VoidsTA13A data void refers to a word or phrase that results in little, manipulative, or low-quality search engine data. Data voids are hard to detect and relatively harmless until exploited by an entity aiming to quickly proliferate false or misleading information during a phenomenon that causes a high number of individuals to query the term or phrase. In the Plan phase, an influence operation may identify data voids for later exploitation in the operation. A 2019 report by Michael Golebiewski identifies five types of data voids. (1) “Breaking news” data voids occur when a keyword gains popularity during a short period of time, allowing an influence operation to publish false content before legitimate news outlets have an opportunity to publish relevant information. (2) An influence operation may create a “strategic new terms” data void by creating their own terms and publishing information online before promoting their keyword to the target audience. (3) An influence operation may publish content on “outdated terms” that have decreased in popularity, capitalizing on most search engines preferences for recency. (4) “Fragmented concepts” data voids separate connections between similar ideas, isolating segment queries to distinct search engine results. (5) An influence operation may use “problematic queries” that previously resulted in disturbing or inappropriate content to promote messaging until mainstream media recontextualizes the term. no changeIdentify Data Voids -
85T0081.003Identify Existing PrejudicesTA13An influence operation may exploit existing racial, religious, demographic, or social prejudices to further polarize its target audience from the rest of the public.no changeIdentify Existing Prejudices -
86T0081.004Identify Existing FissuresTA13An influence operation may identify existing fissures to pit target populations against one another or facilitate a “divide-and-conquer" approach to tailor operation narratives along the divides.no changeIdentify Existing Fissures -
87T0081.005Identify Existing Conspiracy Narratives/SuspicionsTA13An influence operation may assess preexisting conspiracy theories or suspicions in a population to identify existing narratives that support operational objectives. no changeIdentify Existing Conspiracy Narratives/Suspicions -
88T0081.006Identify Wedge IssuesTA13A wedge issue is a divisive political issue, usually concerning a social phenomenon, that divides individuals along a defined line. An influence operation may exploit wedge issues by intentionally polarizing the public along the wedge issue line and encouraging opposition between factions.no changeIdentify Wedge Issues -
89T0081.007Identify Target Audience AdversariesTA13An influence operation may identify or create a real or imaginary adversary to center operation narratives against. A real adversary may include certain politicians or political parties while imaginary adversaries may include falsified “deep state” actors that, according to conspiracies, run the state behind public view. no changeIdentify Target Audience Adversaries -
90T0081.008Identify Media System VulnerabilitiesTA13An influence operation may exploit existing weaknesses in a target's media system. These weaknesses may include existing biases among media agencies, vulnerability to false news agencies on social media, or existing distrust of traditional media sources. An existing distrust among the public in the media system's credibility holds high potential for exploitation by an influence operation when establishing alternative news agencies to spread operation content. no changeIdentify Media System Vulnerabilities -
91T0082Develop New NarrativesTA14Actors may develop new narratives to further strategic or tactical goals, especially when existing narratives do not adequately align with the campaign goals. New narratives provide more control in terms of crafting the message to achieve specific goals. However, new narratives may require more effort to disseminate than adapting or adopting existing narratives. newT0082 - Develop New Narratives
92T0083Integrate Target Audience Vulnerabilities into NarrativeTA14An influence operation may seek to exploit the preexisting weaknesses, fears, and enemies of the target audience for integration into the operation's narratives and overall strategy. Integrating existing vulnerabilities into the operational approach conserves resources by exploiting already weak areas of the target information environment instead of forcing the operation to create new vulnerabilities in the environment.newT0083 - Integrate Target Audience Vulnerabilities into Narrative
93T0084Reuse Existing ContentTA06An operation may recycle content from its own previous operations or plagiarize from external operations, laundering information to conserve resources that would have otherwise been utilized to develop new content. newT0084 - Reuse Existing Content
94T0084.001Use CopypastaTA06Copypasta refers to a piece of text that has been copied and pasted multiple times across various online platforms. A copypasta's final form may differ from its original source text as users add, delete, or otherwise edit the content as they repost the text. no changeUse Copypasta -
95T0084.002Plagiarize ContentTA06An influence operation may take content from other sources without proper attribution. This content may be either misinformation content shared by others without malicious intent but now leveraged by the campaign as disinformation or disinformation content from other sources. no changePlagiarize Content -
96T0084.003Deceptively Labeled or TranslatedTA06An influence operation may take authentic content from other sources and add deceptive labels or deceptively translate the content into other languages. no changeDeceptively Labeled or Translated -
97T0084.004Appropriate ContentTA06An influence operation may take content from other sources with proper attribution. This content may be either misinformation content shared by others without malicious intent but now leveraged by the campaign as disinformation or disinformation content from other sources. Examples include the appropriation of content from one inauthentic news site to another inauthentic news site or network in ways that align with the originator's licensing or terms of service.no changeAppropriate Content -
98T0085Develop Text-based ContentTA06Creating and editing false or misleading text-based artifacts, often aligned with one or more specific narratives, for use in a disinformation campaign.newT0085 - Develop Text-based Content
99T0085.001Develop AI-Generated TextTA06AI-generated text refers to synthetic text composed by computers using text-generating AI technology. Autonomous generation refers to content created by a bot without human input, also known as bot-created content generation. Autonomous generation represents the next step in automation after language generation and may lead to automated journalism. An influence operation may use read fakes or autonomous generation to quickly develop and distribute content to the target audience.no changeDevelop AI-Generated Text -
100T0085.002Develop False or Altered DocumentsTA06Develop False or Altered Documentsno changeDevelop False or Altered Documents -
101T0085.003Develop Inauthentic News ArticlesTA06An influence operation may develop false or misleading news articles aligned to their campaign goals or narratives. no changeDevelop Inauthentic News Articles -
102T0086Develop Image-based ContentTA06Creating and editing false or misleading visual artifacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include photographing staged real-life situations, repurposing existing digital images, or using image creation and editing technologies.newT0086 - Develop Image-based Content
103T0086.001Develop MemesTA06Memes are one of the most important single artefact types in all of computational propaganda. Memes in this framework denotes the narrow image-based definition. But that naming is no accident, as these items have most of the important properties of Dawkins' original conception as a self-replicating unit of culture. Memes pull together reference and commentary; image and narrative; emotion and message. Memes are a powerful tool and the heart of modern influence campaigns.no changeDevelop Memes -
104T0086.002Develop AI-Generated Images (Deepfakes)TA06Deepfakes refer to AI-generated falsified photos, videos, or soundbites. An influence operation may use deepfakes to depict an inauthentic situation by synthetically recreating an individual's face, body, voice, and physical gestures.no changeDevelop AI-Generated Images (Deepfakes) -
105T0086.003Deceptively Edit Images (Cheap fakes)TA06Cheap fakes utilize less sophisticated measures of altering an image, video, or audio, for example slowing, speeding, or cutting footage, to create a false context surrounding an image or event.no changeDeceptively Edit Images (Cheap fakes) -
106T0086.004Aggregate Information into Evidence CollagesTA06Image files that aggregate positive evidence (Joan Donovan)no changeAggregate Information into Evidence Collages -
107T0087Develop Video-based ContentTA06Creating and editing false or misleading video artifacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include staging videos of purportedly real situations, repurposing existing video artifacts, or using AI-generated video creation and editing technologies (including deepfakes).newT0087 - Develop Video-based Content
108T0087.001Develop AI-Generated Videos (Deepfakes)TA06Deepfakes refer to AI-generated falsified photos, videos, or soundbites. An influence operation may use deepfakes to depict an inauthentic situation by synthetically recreating an individual's face, body, voice, and physical gestures.no changeDevelop AI-Generated Videos (Deepfakes) -
109T0087.002Deceptively Edit Video (Cheap fakes)TA06Cheap fakes utilize less sophisticated measures of altering an image, video, or audio, for example slowing, speeding, or cutting footage, to create a false context surrounding an image or event.no changeDeceptively Edit Video (Cheap fakes) -
110T0088Develop Audio-based ContentTA06Creating and editing false or misleading audio artifacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include creating completely new audio content, repurposing existing audio artifacts (including cheap fakes), or using AI-generated audio creation and editing technologies (including deepfakes).newT0088 - Develop Audio-based Content
111T0088.001Develop AI-Generated Audio (Deepfakes)TA06Deepfakes refer to AI-generated falsified photos, videos, or soundbites. An influence operation may use deepfakes to depict an inauthentic situation by synthetically recreating an individual's face, body, voice, and physical gestures.no changeDevelop AI-Generated Audio (Deepfakes) -
112T0088.002Deceptively Edit Audio (Cheap fakes)TA06Cheap fakes utilize less sophisticated measures of altering an image, video, or audio, for example slowing, speeding, or cutting footage, to create a false context surrounding an image or event.no changeDeceptively Edit Audio (Cheap fakes) -
113T0089Obtain Private DocumentsTA06Procuring documents that are not publicly available, by whatever means -- whether legal or illegal, highly-resourced or less so. These documents can include authentic non-public documents, authentic non-public documents that have been altered, or inauthentic documents intended to appear as if they are authentic non-public documents. All of these types of documents can be "leaked" during later stages in the operation.newT0089 - Obtain Private Documents
114T0089.001Obtain Authentic DocumentsTA06Procure authentic documents that are not publicly available, by whatever means -- whether legal or illegal, highly-resourced or less so. These documents can be "leaked" during later stages in the operation.no changeObtain Authentic Documents -
115T0089.002Create Inauthentic DocumentsTA06Create inauthentic documents intended to appear as if they are authentic non-public documents. These documents can be "leaked" during later stages in the operation.no changeCreate Inauthentic Documents -
116T0089.003Alter Authentic DocumentsTA06Alter authentic documents (public or non-public) to achieve campaign goals. The altered documents are intended to appear as if they are authentic and can be "leaked" during later stages in the operation.no changeAlter Authentic Documents -
117T0090Create Inauthentic AccountsTA15Inauthentic accounts include bot accounts, cyborg accounts, sockpuppet accounts, and anonymous accounts.newT0090 - Create Inauthentic Accounts
118T0090.001Create Anonymous AccountsTA15Anonymous accounts or anonymous users refer to users that access network resources without providing a username or password. An influence operation may use anonymous accounts to spread content without direct attribution to the operation. no changeCreate Anonymous Accounts -
119T0090.002Create Cyborg AccountsTA15Cyborg accounts refer to partly manned, partly automated social media accounts. Cyborg accounts primarily act as bots, but a human operator periodically takes control of the account to engage with real social media users by responding to comments and posting original content. Influence operations may use cyborg accounts to reduce the amount of direct human input required to maintain a regular account but increase the apparent legitimacy of the cyborg account by occasionally breaking its bot-like behavior with human interaction. no changeCreate Cyborg Accounts -
120T0090.003Create Bot AccountsTA15Bots refer to autonomous internet users that interact with systems or other users while imitating traditional human behavior. Bots use a variety of tools to stay active without direct human operation, including artificial intelligence and big data analytics. For example, an individual may program a Twitter bot to retweet a tweet every time it contains a certain keyword or hashtag. An influence operation may use bots to increase its exposure and artificially promote its content across the internet without dedicating additional time or human resources. Amplifier bots promote operation content through reposts, shares, and likes to increase the content's online popularity. Hacker bots are traditionally covert bots running on computer scripts that rarely engage with users and work primarily as agents of larger cyberattacks, such as Distributed Denial of Service attacks. Spammer bots are programmed to post content on social media or in comment sections, usually as a supplementary tool. Impersonator bots pose as real people by mimicking human behavior, complicating their detection. no changeCreate Bot Accounts -
121T0090.004Create Sockpuppet AccountsTA15Sockpuppet accounts refer to falsified accounts that either promote the influence operation's own material or attack critics of the material online. Individuals who control sockpuppet accounts also man at least one other user account. Sockpuppet accounts help legitimize operation narratives by providing an appearance of external support for the material and discrediting opponents of the operation. no changeCreate Sockpuppet Accounts -
122T0091Recruit malign actorsTA15Operators recruit bad actors by paying, recruiting, or exerting control over individuals, including trolls, partisans, and contractors.newT0091 - Recruit malign actors
123T0091.001Recruit ContractorsTA15Operators recruit paid contractors to support the campaign.no changeRecruit Contractors -
124T0091.002Recruit PartisansTA15Operators recruit partisans (ideologically-aligned individuals) to support the campaign.no changeRecruit Partisans -
125T0091.003Enlist Troll AccountsTA15An influence operation may hire trolls, or human operators of fake accounts that aim to provoke others by posting and amplifying content about controversial issues. Trolls can serve to discredit an influence operation's opposition or bring attention to the operation's cause through debate. Classic trolls refer to regular people who troll for personal reasons, such as attention-seeking or boredom. Classic trolls may advance operation narratives by coincidence but are not directly affiliated with any larger operation. Conversely, hybrid trolls act on behalf of another institution, such as a state or financial organization, and post content with a specific ideological goal. Hybrid trolls may be highly advanced and institutionalized or less organized and work for a single individual. no changeEnlist Troll Accounts -
126T0092Build NetworkTA15Operators build their own network, creating links between accounts -- whether authentic or inauthentic -- in order to amplify and promote narratives and artifacts, and encourage further growth of their network, as well as the ongoing sharing and engagement with operational content.newT0092 - Build Network
127T0092.001Create OrganizationsTA15Influence operations may establish organizations with legitimate or falsified hierarchies, staff, and content to structure operation assets, provide a sense of legitimacy to the operation, or provide institutional backing to operation activities.no changeCreate Organizations -
128T0092.002Use Follow TrainsTA15A follow train is a group of people who follow each other on a social media platform, often as a way for an individual or campaign to grow its social media following. Follow trains may be a violation of platform Terms of Service. They are also known as follow-for-follow groups. no changeUse Follow Trains -
129T0092.003Create Community or Sub-groupTA15When there is not an existing community or sub-group that meets a campaign's goals, an influence operation may seek to create a community or sub-group. no changeCreate Community or Sub-group -
130T0093Acquire/Recruit NetworkTA15Operators acquire an existing network by paying, recruiting, or exerting control over the leaders of the existing network. newT0093 - Acquire/Recruit Network
131T0093.001Fund ProxiesTA15An influence operation may fund proxies, or external entities that work for the operation. An operation may recruit/train users with existing sympathies towards the operation's narratives and/or goals as proxies. Funding proxies serves various purposes including: - Diversifying operation locations to complicate attribution - Reducing the workload for direct operation assets no changeFund Proxies -
132T0093.002Acquire BotnetsTA15A botnet is a group of bots that can function in coordination with each other. no changeAcquire Botnets -
133T0094Infiltrate Existing NetworksTA15Operators deceptively insert social assets into existing networks as group members in order to influence the members of the network and the wider information environment that the network impacts.newT0094 - Infiltrate Existing Networks
134T0094.001Identify susceptible targets in networksTA15When seeking to infiltrate an existing network, an influence operation may identify individuals and groups that might be susceptible to being co-opted or influenced.no changeIdentify susceptible targets in networks -
135T0094.002Utilize Butterfly AttacksTA15Butterfly attacks occur when operators pretend to be members of a certain social group, usually a group that struggles for representation. An influence operation may mimic a group to insert controversial statements into the discourse, encourage the spread of operation content, or promote harassment among group members. Unlike astroturfing, butterfly attacks aim to infiltrate and discredit existing grassroots movements, organizations, and media campaigns. no changeUtilize Butterfly Attacks -
136T0095Develop Owned Media AssetsTA15An owned media asset refers to an agency or organization through which an influence operation may create, develop, and host content and narratives. Owned media assets include websites, blogs, social media pages, forums, and other platforms that facilitate the creation and organization of content.newT0095 - Develop Owned Media Assets
137T0096Leverage Content FarmsTA15Using the services of large-scale content providers for creating and amplifying campaign artifacts at scale.newT0096 - Leverage Content Farms
138T0096.001Create Content FarmsTA15An influence operation may create an organization for creating and amplifying campaign artifacts at scale.no changeCreate Content Farms -
139T0096.002Outsource Content Creation to External OrganizationsTA15An influence operation may outsource content creation to external companies to avoid attribution, increase the rate of content creation, or improve content quality, e.g., by employing an organization that can create content in the target audience's native language. Employed organizations may include marketing companies for tailored advertisements or external content farms for high volumes of targeted media. no changeOutsource Content Creation to External Organizations -
140T0097Create personasTA16Creating fake people, often with accounts across multiple platforms. These personas can be as simple as a name, can contain slightly more background like location, profile pictures, backstory, or can be effectively backstopped with indicators like fake identity documents. newT0097 - Create personas
141T0097.001Backstop personas TA16Create other assets/dossier/cover/fake relationships and/or connections or documents, sites, bylines, attributions, to establish/augment/inflate credibility/believabilityno changeBackstop personas -
142T0098Establish Inauthentic News SitesTA16Modern computational propaganda makes use of a cadre of imposter news sites spreading globally. These sites, sometimes motivated by concerns other than propaganda--for instance, click-based revenue--often have some superficial markers of authenticity, such as naming and site-design. But many can be quickly exposed with reference to their ownership, reporting history and advertising details.newT0098 - Establish Inauthentic News Sites
143T0098.001Create Inauthentic News SitesTA16Create Inauthentic News Sitesno changeCreate Inauthentic News Sites -
144T0098.002Leverage Existing Inauthentic News SitesTA16Leverage Existing Inauthentic News Sitesno changeLeverage Existing Inauthentic News Sites -
145T0099Prepare Assets Impersonating Legitimate EntitiesTA16An influence operation may prepare assets impersonating legitimate entities to further conceal its network identity and add a layer of legitimacy to its operation content. Users will more likely believe and less likely fact-check news from recognizable sources rather than unknown sites. Legitimate entities may include authentic news outlets, public figures, organizations, or state entities. An influence operation may use a wide variety of cyber techniques to impersonate a legitimate entity's website or social media account. Typosquatting is the intentional registration of a domain name with purposeful variations of the impersonated domain name through intentional typos, top-level domain (TLD) manipulation, or punycode. Typosquatting facilitates the creation of falsified websites by creating similar domain names in the URL box, leaving it to the user to confirm that the URL is correct. newT0099 - Prepare Assets Impersonating Legitimate Entities
146T0099.001AstroturfingTA16Astroturfing occurs when an influence operation disguises itself as a grassroots movement or organization that supports operation narratives. Unlike butterfly attacks, astroturfing aims to increase the appearance of popular support for the operation cause and does not infiltrate existing groups to discredit their objectives. no changeAstroturfing -
147T0099.002Spoof/parody account/siteTA16An influence operation may prepare assets impersonating legitimate entities to further conceal its network identity and add a layer of legitimacy to its operation content. Users will more likely believe and less likely fact-check news from recognizable sources rather than unknown sites. Legitimate entities may include authentic news outlets, public figures, organizations, or state entities. no changeSpoof/parody account/site -
148T0100Co-opt Trusted SourcesTA16An influence operation may co-opt trusted sources by infiltrating or repurposing a source to reach a target audience through existing, previously reliable networks. Co-opted trusted sources may include: - National or local news outlets - Research or academic publications - Online blogs or websites newT0100 - Co-opt Trusted Sources
149T0100.001Co-Opt Trusted IndividualsTA16Co-Opt Trusted Individualsno changeCo-Opt Trusted Individuals -
150T0100.002Co-Opt Grassroots GroupsTA16Co-Opt Grassroots Groupsno changeCo-Opt Grassroots Groups -
151T0100.003Co-opt InfluencersTA16Co-opt Influencersno changeCo-opt Influencers -
152T0101Create Localized ContentTA05Localized content refers to content that appeals to a specific community of individuals, often in defined geographic areas. An operation may create localized content using local language and dialects to resonate with its target audience and blend in with other local news and social media. Localized content may help an operation increase legitimacy, avoid detection, and complicate external attribution.newT0101 - Create Localized Content
153T0102Leverage Echo Chambers/Filter BubblesTA05An echo chamber refers to an internet subgroup, often along ideological lines, where individuals only engage with “others with which they are already in agreement.” A filter bubble refers to an algorithm's placement of an individual in content that they agree with or regularly engage with, possibly entrapping the user into a bubble of their own making. An operation may create these isolated areas of the internet by matching existing groups, or aggregating individuals into a single target audience based on shared interests, politics, values, demographics, and other characteristics. Echo chambers and filter bubbles help to reinforce similar biases and content to the same target audience members. newT0102 - Leverage Echo Chambers/Filter Bubbles
154T0102.001Use existing Echo Chambers/Filter BubblesTA05Use existing Echo Chambers/Filter Bubblesno changeUse existing Echo Chambers/Filter Bubbles -
155T0102.002Create Echo Chambers/Filter BubblesTA05Create Echo Chambers/Filter Bubblesno changeCreate Echo Chambers/Filter Bubbles -
156T0102.003Exploit Data VoidsTA05A data void refers to a word or phrase that results in little, manipulative, or low-quality search engine data. Data voids are hard to detect and relatively harmless until exploited by an entity aiming to quickly proliferate false or misleading information during a phenomenon that causes a high number of individuals to query the term or phrase. In the Plan phase, an influence operation may identify data voids for later exploitation in the operation. A 2019 report by Michael Golebiewski identifies five types of data voids. (1) “Breaking news” data voids occur when a keyword gains popularity during a short period of time, allowing an influence operation to publish false content before legitimate news outlets have an opportunity to publish relevant information. (2) An influence operation may create a “strategic new terms” data void by creating their own terms and publishing information online before promoting their keyword to the target audience. (3) An influence operation may publish content on “outdated terms” that have decreased in popularity, capitalizing on most search engines' preferences for recency. (4) “Fragmented concepts” data voids separate connections between similar ideas, isolating segment queries to distinct search engine results. (5) An influence operation may use “problematic queries” that previously resulted in disturbing or inappropriate content to promote messaging until mainstream media recontextualizes the term. no changeExploit Data Voids -
157T0103LivestreamTA07A livestream refers to an online broadcast capability that allows for real-time communication to closed or open networks.newT0103 - Livestream
158T0103.001Video LivestreamTA07A video livestream refers to an online video broadcast capability that allows for real-time communication to closed or open networks.no changeVideo Livestream -
159T0103.002Audio LivestreamTA07An audio livestream refers to an online audio broadcast capability that allows for real-time communication to closed or open networks.no changeAudio Livestream -
160T0104Social NetworksTA07Social media are interactive digital channels that facilitate the creation and sharing of information, ideas, interests, and other forms of expression through virtual communities and networks.newT0104 - Social Networks
161T0104.001Mainstream Social NetworksTA07Examples include Facebook, Twitter, LinkedIn, etc.no changeMainstream Social Networks -
162T0104.002Dating AppsTA07Dating apps are platforms that match and connect individuals seeking romantic or social relationships and allow messaging between matched users. Examples include Tinder, Bumble, and Grindr. no changeDating Apps -
163T0104.003Private/Closed Social NetworksTA07Private or closed social networks restrict content, membership, or both to approved users. Examples include private Facebook Groups and invite-only messaging channels. no changePrivate/Closed Social Networks -
164T0104.004Interest-Based NetworksTA07Examples include smaller and niche networks including Gettr, Truth Social, Parler, etc.no changeInterest-Based Networks -
165T0104.005Use hashtagsTA07Use a dedicated, existing hashtag for the campaign/incident.no changeUse hashtags -
166T0104.006Create dedicated hashtagTA07Create a campaign/incident specific hashtag.no changeCreate dedicated hashtag -
167T0105Media Sharing NetworksTA07Media sharing networks refer to services whose primary function is the hosting and sharing of specific forms of media. Examples include Instagram, Snapchat, TikTok, Youtube, SoundCloud.newT0105 - Media Sharing Networks
168T0105.001Photo SharingTA07Examples include Instagram, Snapchat, Flickr, etc.no changePhoto Sharing -
169T0105.002Video SharingTA07Examples include Youtube, TikTok, ShareChat, Rumble, etc.no changeVideo Sharing -
170T0105.003Audio sharingTA07Examples include podcasting apps, Soundcloud, etc.no changeAudio sharing -
171T0106Discussion ForumsTA07Platforms for finding, discussing, and sharing information and opinions. Examples include Reddit, Quora, Digg, message boards, interest-based discussion forums, etc.newT0106 - Discussion Forums
172T0106.001Anonymous Message BoardsTA07Examples include the Chansno changeAnonymous Message Boards -
173T0107Bookmarking and Content CurationTA07Platforms for searching, sharing, and curating content and media. Examples include Pinterest, Flipboard, etc.newT0107 - Bookmarking and Content Curation
174T0108Blogging and Publishing NetworksTA07Examples include WordPress, Blogger, Weebly, Tumblr, Medium, etc. newT0108 - Blogging and Publishing Networks
175T0109Consumer Review NetworksTA07Platforms for finding, reviewing, and sharing information about brands, products, services, restaurants, travel destinations, etc. Examples include Yelp, TripAdvisor, etc.newT0109 - Consumer Review Networks
176T0110Formal Diplomatic ChannelsTA07Leveraging formal, traditional, diplomatic channels to communicate with foreign governments (written documents, meetings, summits, diplomatic visits, etc). This type of diplomacy is conducted by diplomats of one nation with diplomats and other officials of another nation or international organization.newT0110 - Formal Diplomatic Channels
177T0111Traditional MediaTA07Examples include TV, Newspaper, Radio, etc.newT0111 - Traditional Media
178T0111.001TVTA07TVno changeTV -
179T0111.002NewspaperTA07Newspaperno changeNewspaper -
180T0111.003RadioTA07Radiono changeRadio -
181T0112EmailTA07Delivering content and narratives via email. This can include using list management or high-value individually targeted messaging.newT0112 - Email
182T0113Employ Commercial Analytic FirmsTA08Commercial analytic firms collect data on target audience activities and evaluate the data to detect trends, such as content receiving high click-rates. An influence operation may employ commercial analytic firms to facilitate external collection on its target audience, complicating attribution efforts and better tailoring the content to audience preferences. newT0113 - Employ Commercial Analytic Firms
183T0114Deliver AdsTA09Delivering content via any form of paid media or advertising.newT0114 - Deliver Ads
184T0114.001Social mediaTA09Social Mediano changeSocial media -
185T0114.002Traditional MediaTA09Examples include TV, Radio, Newspaper, billboardsno changeTraditional Media -
186T0115Post ContentTA09Delivering content by posting via owned media (assets that the operator controls). newT0115 - Post Content
187T0115.001Share MemesTA09Memes are one of the most important single artefact types in all of computational propaganda. Memes in this framework denotes the narrow image-based definition. But that naming is no accident, as these items have most of the important properties of Dawkins' original conception as a self-replicating unit of culture. Memes pull together reference and commentary; image and narrative; emotion and message. Memes are a powerful tool and the heart of modern influence campaigns.no changeShare Memes -
188T0115.002Post Violative Content to Provoke Takedown and BacklashTA09Post Violative Content to Provoke Takedown and Backlash.no changePost Violative Content to Provoke Takedown and Backlash -
189T0115.003One-Way Direct PostingTA09Direct posting refers to a method of posting content via a one-way messaging service, where the recipient cannot directly respond to the poster's messaging. An influence operation may post directly to promote operation narratives to the target audience without allowing opportunities for fact-checking or disagreement, creating a false sense of support for the narrative. no changeOne-Way Direct Posting -
190T0116Comment or Reply on ContentTA09Delivering content by replying or commenting via owned media (assets that the operator controls). newT0116 - Comment or Reply on Content
191T0116.001Post inauthentic social media commentTA09Use government-paid social media commenters, astroturfers, and chat bots (programmed to reply to specific key words/hashtags) to influence online conversations, product reviews, and website comment forums.no changePost inauthentic social media comment -
192T0117Attract Traditional MediaTA09Deliver content by attracting the attention of traditional media (earned media).newT0117 - Attract Traditional Media
193T0118Amplify Existing NarrativeTA17An influence operation may amplify existing narratives that align with its narratives to support operation objectives. newT0118 - Amplify Existing Narrative
194T0119Cross-PostingTA17Cross-posting refers to posting the same message to multiple internet discussions, social media platforms or accounts, or news groups at one time. An influence operation may post content online in multiple communities and platforms to increase the chances of content exposure to the target audience. newT0119 - Cross-Posting
195T0119.001Post Across GroupsTA17An influence operation may post content across groups to spread narratives and content to new communities within the target audiences or to new target audiences. no changePost Across Groups -
196T0119.002Post Across PlatformTA17An influence operation may post content across platforms to spread narratives and content to new communities within the target audiences or to new target audiences. Posting across platforms can also remove opposition and context, helping the narrative spread with less opposition on the cross-posted platform. no changePost Across Platform -
197T0119.003Post Across DisciplinesTA17Post Across Disciplinesno changePost Across Disciplines -
198T0120Incentivize SharingTA17Incentivizing content sharing refers to actions that encourage users to share content themselves, reducing the need for the operation itself to post and promote its own content.newT0120 - Incentivize Sharing
199T0120.001Use Affiliate Marketing ProgramsTA17Use Affiliate Marketing Programsno changeUse Affiliate Marketing Programs -
200T0120.002Use Contests and PrizesTA17Use Contests and Prizesno changeUse Contests and Prizes -
201T0121Manipulate Platform AlgorithmTA17Manipulating a platform algorithm refers to conducting activity on a platform in a way that intentionally targets its underlying algorithm. After analyzing a platform's algorithm (see: Select Platforms), an influence operation may use a platform in a way that increases its content exposure, avoids content removal, or otherwise benefits the operation's strategy. For example, an influence operation may use bots to amplify its posts so that the platform's algorithm recognizes engagement with operation content and further promotes the content on user timelines. newT0121 - Manipulate Platform Algorithm
202T0121.001Bypass Content BlockingTA17Bypassing content blocking refers to actions taken to circumvent network security measures that prevent users from accessing certain servers, resources, or other online spheres. An influence operation may bypass content blocking to proliferate its content on restricted areas of the internet. Common strategies for bypassing content blocking include: - Altering IP addresses to avoid IP filtering - Using a Virtual Private Network (VPN) to avoid IP filtering - Using a Content Delivery Network (CDN) to avoid IP filtering - Enabling encryption to bypass packet inspection blocking - Manipulating text to avoid filtering by keywords - Posting content on multiple platforms to avoid platform-specific removals - Using local facilities or modified DNS servers to avoid DNS filtering no changeBypass Content Blocking -
203T0122Direct Users to Alternative PlatformsTA17Directing users to alternative platforms refers to encouraging users to move from the platform on which they initially viewed operation content and engage with content on alternate information channels, including separate social media channels and inauthentic websites. An operation may drive users to alternative platforms to diversify its information channels and ensure the target audience knows where to access operation content if the initial platform suspends, flags, or otherwise removes original operation assets and content. newT0122 - Direct Users to Alternative Platforms
204T0123Control Information Environment through Offensive Cyberspace OperationsTA18Controlling the information environment through offensive cyberspace operations uses cyber tools and techniques to alter the trajectory of content in the information space to either prioritize operation messaging or block opposition messaging.newT0123 - Control Information Environment through Offensive Cyberspace Operations
205T0123.001Delete Opposing ContentTA18Deleting opposing content refers to the removal of content that conflicts with operational narratives from selected platforms. An influence operation may delete opposing content to censor contradictory information from the target audience, allowing operation narratives to take priority in the information space.no changeDelete Opposing Content -
206T0123.002Block ContentTA18Content blocking refers to actions taken to restrict internet access or render certain areas of the internet inaccessible. An influence operation may restrict content based on both network and content attributes. no changeBlock Content -
207T0123.003Destroy Information Generation CapabilitiesTA18Destroying information generation capabilities refers to actions taken to limit, degrade, or otherwise incapacitate an actor's ability to generate conflicting information. An influence operation may destroy an actor's information generation capabilities by physically dismantling the information infrastructure, disconnecting resources needed for information generation, or redirecting information generation personnel. An operation may destroy an adversary's information generation capabilities to limit conflicting content exposure to the target audience and crowd the information space with its own narratives. no changeDestroy Information Generation Capabilities -
208T0123.004Conduct Server RedirectTA18A server redirect, also known as a URL redirect, occurs when a server automatically forwards a user from one URL to another using server-side scripting languages. An influence operation may conduct a server redirect to divert target audience members from one website to another without their knowledge. The redirected website may pose as a legitimate source, host malware, or otherwise aid operation objectives.no changeConduct Server Redirect -
209T0124Suppress OppositionTA18Operators can suppress the opposition by exploiting platform content moderation tools and processes like reporting non-violative content to platforms for takedown and goading opposition actors into taking actions that result in platform action or target audience disapproval. newT0124 - Suppress Opposition
210T0124.001Report Non-Violative Opposing ContentTA18Reporting opposing content refers to notifying and providing an instance of a violation of a platform's guidelines and policies for conduct on the platform. In addition to simply reporting the content, an operation may leverage copyright regulations to trick social media and web platforms into removing opposing content by manipulating the content to appear in violation of copyright laws. Reporting opposing content facilitates the suppression of contradictory information and allows operation narratives to take priority in the information space. no changeReport Non-Violative Opposing Content -
211T0124.002Goad People into Harmful Action (Stop Hitting Yourself)TA18Goad people into actions that violate terms of service or will lead to having their content or accounts taken down. no changeGoad People into Harmful Action (Stop Hitting Yourself) -
212T0124.003Exploit Platform TOS/Content ModerationTA18Exploit Platform TOS/Content Moderationno changeExploit Platform TOS/Content Moderation -
213T0125Platform FilteringTA18Platform filtering refers to the decontextualization of information as claims cross platforms (from Joan Donovan https://www.hks.harvard.edu/publications/disinformation-design-use-evidence-collages-and-platform-filtering-media-manipulation)newT0125 - Platform Filtering
214T0126Encourage Attendance at EventsTA10Operation encourages attendance at an existing real-world event.newT0126 - Encourage Attendance at Events
215T0126.001Call to action to attend TA10Call to action to attend an eventno changeCall to action to attend -
216T0126.002Facilitate logistics or support for attendanceTA10Facilitate logistics or support for travel, food, housing, etc.no changeFacilitate logistics or support for attendance -
217T0127Physical ViolenceTA10Physical violence refers to the use of force to injure, abuse, damage, or destroy. An influence operation may conduct or encourage physical violence to discourage opponents from promoting conflicting content or draw attention to operation narratives using shock value. newT0127 - Physical Violence
218T0127.001Conduct Physical ViolenceTA10An influence operation may directly Conduct Physical Violence to achieve campaign goals. no changeConduct Physical Violence -
219T0127.002Encourage Physical ViolenceTA10An influence operation may Encourage others to engage in Physical Violence to achieve campaign goals. no changeEncourage Physical Violence -
220T0128Conceal PeopleTA11Conceal the identity or provenance of a campaign account and people assets to avoid takedown and attribution.split from T0012T0128 - Conceal People
221T0128.001Use PseudonymsTA11An operation may use pseudonyms, or fake names, to mask the identity of operation accounts, publish anonymous content, or otherwise use falsified personas to conceal the identity of the operation. An operation may coordinate pseudonyms across multiple platforms, for example, by writing an article under a pseudonym and then posting a link to the article on social media on an account with the same falsified name. no changeUse Pseudonyms -
222T0128.002Conceal Network IdentityTA11Concealing network identity aims to hide the existence of an influence operation's network completely. Unlike concealing sponsorship, concealing network identity denies the existence of any sort of organization. no changeConceal Network Identity -
223T0128.003Distance Reputable Individuals from OperationTA11Distancing reputable individuals from the operation occurs when enlisted individuals, such as celebrities or subject matter experts, actively disengage themselves from operation activities and messaging. Individuals may distance themselves from the operation by deleting old posts or statements, unfollowing operation information assets, or otherwise detaching themselves from the operation's timeline. An influence operation may want reputable individuals to distance themselves from the operation to reduce operation exposure, particularly if the operation aims to remove all evidence.no changeDistance Reputable Individuals from Operation -
224T0128.004Launder AccountsTA11Account laundering occurs when an influence operation acquires control of previously legitimate online accounts from third parties through sale or exchange and often in contravention of terms of use. Influence operations use laundered accounts to reach target audience members from an existing information channel and complicate attribution. no changeLaunder Accounts -
225T0128.005Change Names of AccountsTA11Changing names of accounts occurs when an operation changes the name of an existing social media account. An operation may change the names of its accounts throughout an operation to avoid detection or alter the names of newly acquired or repurposed accounts to fit operational narratives. no changeChange Names of Accounts -
226T0129Conceal Operational ActivityTA11Conceal the campaign's operational activity to avoid takedown and attribution.split from T0012T0129 - Conceal Operational Activity
227T0129.001Conceal Network IdentityTA11Concealing network identity aims to hide the existence of an influence operation's network completely. Unlike concealing sponsorship, concealing network identity denies the existence of any sort of organization. no changeConceal Network Identity -
228T0129.002Generate Content Unrelated to NarrativeTA11An influence operation may mix its own operation content with legitimate news or external unrelated content to disguise operational objectives, narratives, or existence. For example, an operation may generate "lifestyle" or "cuisine" content alongside regular operation content. no changeGenerate Content Unrelated to Narrative -
229T0129.003Break Association with ContentTA11Breaking association with content occurs when an influence operation actively separates itself from its own content. An influence operation may break association with content by unfollowing, unliking, or unsharing its content, removing attribution from its content, or otherwise taking actions that distance the operation from its messaging. An influence operation may break association with its content to complicate attribution or regain credibility for a new operation. no changeBreak Association with Content -
230T0129.004Delete URLsTA11URL deletion occurs when an influence operation completely removes its website registration, rendering the URL inaccessible. An influence operation may delete its URLs to complicate attribution or remove online documentation that the operation ever occurred.no changeDelete URLs -
231T0129.005Coordinate on encrypted/closed networksTA11Coordinate on encrypted/ closed networksno changeCoordinate on encrypted/closed networks -
232T0129.006Deny involvementTA11Without "smoking gun" proof (and even with proof), the incident creator can or will deny involvement. This technique also leverages the attacker advantages outlined in "Demand insurmountable proof", specifically the asymmetric disadvantage for truth-tellers in a "firehose of misinformation" environment.no changeDeny involvement -
233T0129.007Delete Accounts/Account ActivityTA11Deleting accounts and account activity occurs when an influence operation removes its online social media assets, including social media accounts, posts, likes, comments, and other online artifacts. An influence operation may delete its accounts and account activity to complicate attribution or remove online documentation that the operation ever occurred. no changeDelete Accounts/Account Activity -
234T0129.008Redirect URLsTA11An influence operation may redirect its falsified or typosquatted URLs to legitimate websites to increase the operation's appearance of legitimacy, complicate attribution, and avoid detection. no changeRedirect URLs -
235T0129.009Remove Post OriginsTA11Removing post origins refers to the elimination of evidence that indicates the initial source of operation content, often to complicate attribution. An influence operation may remove post origins by deleting watermarks, renaming files, or removing embedded links in its content. no changeRemove Post Origins -
236T0129.010Misattribute ActivityTA11Misattributed activity refers to incorrectly attributed operation activity. For example, a state sponsored influence operation may conduct operation activity in a way that mimics another state so that external entities misattribute activity to the incorrect state. An operation may misattribute their activities to complicate attribution, avoid detection, or frame an adversary for negative behavior. no changeMisattribute Activity -
237T0130Conceal InfrastructureTA11Conceal the campaign's infrastructure to avoid takedown and attribution.split from T0012T0130 - Conceal Infrastructure
238T0130.001Conceal SponsorshipTA11Concealing sponsorship aims to mislead or obscure the identity of the hidden sponsor behind an operation rather than the entity publicly running the operation. Operations that conceal sponsorship may maintain visible falsified groups, news outlets, non-profits, or other organizations, but seek to mislead or obscure the identity of those sponsoring, funding, or otherwise supporting these entities. Influence operations may use a variety of techniques to mask the location of their social media accounts to complicate attribution and conceal evidence of foreign interference. Operation accounts may set their location to a false place, often the location of the operation's target audience, and post in the region's language.no changeConceal Sponsorship -
239T0130.002Utilize Bulletproof HostingTA11Hosting refers to services through which storage and computing resources are provided to an individual or organization for the accommodation and maintenance of one or more websites and related services. Services may include web hosting, file sharing, and email distribution. Bulletproof hosting refers to services provided by an entity, such as a domain hosting or web hosting firm, that allows its customer considerable leniency in use of the service. An influence operation may utilize bulletproof hosting to maintain continuity of service for suspicious, illegal, or disruptive operation activities that stricter hosting services would limit, report, or suspend. no changeUtilize Bulletproof Hosting -
240T0130.003Use Shell OrganizationsTA11Use Shell Organizations to conceal sponsorship.no changeUse Shell Organizations -
241T0130.004Use CryptocurrencyTA11Use Cryptocurrency to conceal sponsorship. Examples include Bitcoin, Monero, and Ethereum. no changeUse Cryptocurrency -
242T0130.005Obfuscate PaymentTA11Obfuscate Paymentno changeObfuscate Payment -
243T0131Exploit TOS/Content ModerationTA11Exploiting weaknesses in platforms' terms of service and content moderation policies to avoid takedowns and platform actions.newT0131 - Exploit TOS/Content Moderation
244T0131.001Legacy web contentTA11Make incident content visible for a long time, e.g. by exploiting platform terms of service, or placing it where it's hard to remove or unlikely to be removed.no changeLegacy web content -
245T0131.002Post Borderline ContentTA11Post Borderline Contentno changePost Borderline Content -
246T0132Measure PerformanceTA12A metric used to determine the accomplishment of actions. “Are the actions being executed as planned?”newT0132 - Measure Performance
247T0132.001People FocusedTA12Measure the performance of individuals in achieving campaign goalsno changePeople Focused -
248T0132.002Content FocusedTA12Measure the performance of campaign contentno changeContent Focused -
249T0132.003View FocusedTA12View Focusedno changeView Focused -
250T0133Measure EffectivenessTA12A metric used to measure a current system state. “Are we on track to achieve the intended new system state within the planned timescale?”newT0133 - Measure Effectiveness
251T0133.001Behavior changesTA12Monitor and evaluate behavior changes from misinformation incidents. no changeBehavior changes -
252T0133.002ContentTA12Measure current system state with respect to the effectiveness of campaign content. no changeContent -
253T0133.003AwarenessTA12Measure current system state with respect to the effectiveness of influencing awareness. no changeAwareness -
254T0133.004KnowledgeTA12Measure current system state with respect to the effectiveness of influencing knowledge. no changeKnowledge -
255T0133.005Action/attitudeTA12Measure current system state with respect to the effectiveness of influencing action/attitude. no changeAction/attitude -
256T0134Measure Effectiveness Indicators (or KPIs)TA12Ensuring that Key Performance Indicators are identified and tracked, so that the performance and effectiveness of campaigns, and elements of campaigns, can be measured, during and after their execution.newT0134 - Measure Effectiveness Indicators (or KPIs)
257T0134.001Message reachTA12Monitor and evaluate message reach in misinformation incidents. no changeMessage reach -
258T0134.002Social media engagementTA12Monitor and evaluate social media engagement in misinformation incidents.no changeSocial media engagement -