# DISARM Techniques:
disarm_id name summary tactic_id
T0001 5Ds (dismiss, distort, distract, dismay, divide) Nimmo's "4Ds of propaganda": dismiss, distort, distract, dismay (MisinfosecWG added divide in 2019). Misinformation promotes an agenda by advancing narratives supportive of that agenda. This is most effective when the advanced narrative pre-dates the revelation of the specific misinformation content. This is often not possible. TA01
T0002 Facilitate State Propaganda Organize citizens around pro-state messaging. Coordinate paid or volunteer groups to push state propaganda. TA01
T0003	Leverage Existing Narratives	Use or adapt existing narrative themes, where narratives are the baseline stories of a target audience. Narratives form the bedrock of our worldviews. New information is understood through a process firmly grounded in this bedrock. If new information is not consistent with the prevailing narratives of an audience, it will be ignored. Effective campaigns will frame their misinformation in the context of these narratives. Highly effective campaigns will make extensive use of audience-appropriate archetypes and meta-narratives throughout their content creation and amplification practices.	TA01
T0004	Devise Competing Narratives	Advance competing narratives connected to the same issue, e.g. denying an incident while simultaneously dismissing it. Suppressing or discouraging narratives already spreading requires an alternative. The simplest set of narrative techniques in response would be the construction and promotion of contradictory alternatives centered on denial, deflection, dismissal, counter-charges, excessive standards of proof, bias in prohibition or enforcement, and so on. These competing narratives allow loyalists cover, but are less compelling to opponents and fence-sitters than campaigns built around existing narratives or highly explanatory master narratives. Competing narratives, as such, are especially useful in the "firehose of misinformation" approach.	TA01
T0006	Develop Narrative Concepts	The promotion of beneficial master narratives is perhaps the most effective method for achieving long-term strategic narrative dominance. From a "whole of society" perspective the promotion of the society's core master narratives should occupy a central strategic role. From a misinformation campaign / cognitive security perspective the tactics around master narratives center more precisely on the day-to-day promotion and reinforcement of this messaging. In other words, beneficial, high-coverage master narratives are a central strategic goal and their promotion constitutes an ongoing tactical struggle carried out at a whole-of-society level. Tactically, their promotion covers a broad spectrum of activities both on- and offline.	TA02
T0007 Create fake Social Media Profiles / Pages / Groups Create key social engineering assets needed to amplify content, manipulate algorithms, fool public and/or specific incident/campaign targets. Computational propaganda depends substantially on false perceptions of credibility and acceptance. By creating fake users and groups with a variety of interests and commitments, attackers can ensure that their messages both come from trusted sources and appear more widely adopted than they actually are. TA03
T0008	Create fake or imposter news sites	Modern computational propaganda makes use of a cadre of imposter news sites spreading globally. These sites, sometimes motivated by concerns other than propaganda--for instance, click-based revenue--often have some superficial markers of authenticity, such as naming and site design. But many can be quickly exposed with reference to their ownership, reporting history and advertising details.	TA03
T0009 Create fake experts Stories planted or promoted in computational propaganda operations often make use of experts fabricated from whole cloth, sometimes specifically for the story itself. TA03
T0010	Cultivate ignorant agents	Cultivate propagandists for a cause, the goals of which are not fully comprehended, and who are used cynically by the leaders of the cause. Independent actors use social media and specialised web sites to strategically reinforce and spread messages compatible with their own. Their networks are infiltrated and used by state media disinformation organisations to amplify the state’s own disinformation strategies against target populations. Many are traffickers in conspiracy theories or hoaxes, unified by a suspicion of Western governments and mainstream media. Their narratives, which appeal to leftists hostile to globalism and military intervention and nationalists against immigration, are frequently infiltrated and shaped by state-controlled trolls and altered news items from agencies such as RT and Sputnik. Also known as "useful idiots" or "unwitting agents".	TA04
T0011	Compromise legitimate account	Hack or take over legitimate accounts to distribute misinformation or damaging content.	TA04
T0012	Use concealment	Use anonymous social media profiles. Examples include page or group administrators, masked "whois" website directory data, no bylines connected to news articles, and no masthead connected to news websites.	TA04
T0013 Create fake websites Create media assets to support fake organizations (e.g. think tank), people (e.g. experts) and/or serve as sites to distribute malware/launch phishing operations. TA04
T0014	Prepare fundraising campaigns	Generate revenue through new or existing funding campaigns, e.g. gather data and advance a credible persona via GoFundMe or Patreon, or via a fake website connecting to PayPal or Stripe.	TA04
T0015	Create hashtags	Create one or more hashtags and/or hashtag groups. Many incident-based campaigns will create hashtags to promote their fabricated event. Creating a hashtag for an incident can have two important effects: 1. Create a perception of reality around an event. Certainly only "real" events would be discussed in a hashtag; after all, the event has a name! 2. Publicize the story more widely through trending lists and search behavior. This is an asset needed to direct/control/manage the "conversation" connected to launching a new incident/campaign with a new hashtag on applicable social media sites.	TA04
T0016	Clickbait	Create attention-grabbing headlines (outrage, doubt, humor) required to drive traffic & engagement. This is a key asset.	TA05
T0017	Conduct Fundraising Campaigns	Drive traffic/engagement to funding campaign sites; this helps provide measurable metrics to assess conversion rates.	TA05
T0018 Purchase advertisements Create or fund advertisements targeted at specific populations TA05
T0019	Generate information pollution	Flood social channels; drive traffic/engagement to all assets; create an aura/sense/perception of pervasiveness/consensus (for or against or both simultaneously) of an issue or topic. "Nothing is true, but everything is possible." Akin to an astroturfing campaign.	TA06
T0020	Trial content	Iteratively test incident performance (messages, content etc), e.g. A/B test headline/content engagement metrics; website and/or funding campaign conversion rates.	TA06
T0021	Memes	Memes are one of the most important single artefact types in all of computational propaganda. "Memes" in this framework denotes the narrow image-based definition, but that naming is no accident, as these items have most of the important properties of Dawkins' original conception as a self-replicating unit of culture. Memes pull together reference and commentary; image and narrative; emotion and message. Memes are a powerful tool and the heart of modern influence campaigns.	TA06
T0022	Conspiracy narratives	Conspiracy narratives appeal to the human desire for explanatory order, by invoking the participation of powerful (often sinister) actors in pursuit of their own political goals. These narratives are especially appealing when an audience is low-information, marginalized or otherwise inclined to reject the prevailing explanation. Conspiracy narratives are an important component of the "firehose of falsehoods" model.	TA06
T0023	Distort facts	Change, twist, or exaggerate existing facts to construct a narrative that differs from reality. Example: images and ideas can be distorted by being placed in an improper context.	TA06
T0024 Create fake videos and images Create fake videos and/or images by manipulating existing content or generating new content (e.g. deepfakes). TA06
T0025 Leak altered documents Obtain documents (eg by theft or leak), then alter and release, possibly among factual documents/sources. TA06
T0026	Create pseudoscientific or disingenuous research	Create fake academic research. Example: fake social science research is often aimed at hot-button social issues such as gender, race and sexuality. Fake science research can target the climate science debate or pseudoscience such as anti-vaxx claims.	TA06
T0027	Adapt existing narratives	Adapting existing narratives to current operational goals is the tactical sweet spot for an effective misinformation campaign. Leveraging existing narratives is not only more effective, it requires substantially less resourcing, as the promotion of new master narratives operates on a much larger scale, in both time and scope. Fluid, dynamic and often interchangeable key master narratives ("The morally corrupt West") can be adapted to divide (LGBT propaganda) or to distort (individuals working as CIA operatives). For Western audiences, different but equally powerful framings are available, such as "The USA has a fraught history in race relations, especially in criminal justice areas."	TA06
T0028	Create competing narratives	Misinformation promotes an agenda by advancing narratives supportive of that agenda. This is most effective when the advanced narrative pre-dates the revelation of the specific misinformation content. But this is often not possible. Suppressing or discouraging narratives already spreading requires an alternative. The simplest set of narrative techniques in response would be the construction and promotion of contradictory alternatives centered on denial, deflection, dismissal, counter-charges, excessive standards of proof, bias in prohibition or enforcement, and so on. These competing narratives allow loyalists cover, but are less compelling to opponents and fence-sitters than campaigns built around existing narratives or highly explanatory master narratives. Competing narratives, as such, are especially useful in the *firehose of misinformation* approach.	TA06
T0029	Manipulate online polls	Create fake online polls, or manipulate existing online polls. This is a data-gathering tactic to target those who engage, and potentially their networks of friends/followers as well.	TA07
T0030	Backstop personas	Create other assets (dossiers, cover stories, fake relationships and/or connections, documents, sites, bylines, attributions) to establish/augment/inflate credibility/believability.	TA07
T0031 YouTube Use YouTube as a narrative dissemination channel TA07
T0032 Reddit Use Reddit as a narrative dissemination channel TA07
T0033 Instagram Use Instagram as a narrative dissemination channel TA07
T0034 LinkedIn Use LinkedIn as a narrative dissemination channel TA07
T0035 Pinterest Use Pinterest as a narrative dissemination channel TA07
T0036 WhatsApp Use WhatsApp as a narrative dissemination channel TA07
T0037 Facebook Use Facebook as a narrative dissemination channel TA07
T0038 Twitter Use Twitter as a narrative dissemination channel TA07
T0039	Bait legitimate influencers	Credibility in a social media environment is often a function of the size of a user's network. "Influencers" are so-called because of their reach, typically understood as: 1) the size of their network (i.e. the number of followers, perhaps weighted by their own influence); and 2) the rate at which their comments are re-circulated (these two metrics are related). Add traditional media players at all levels of credibility and professionalism to this, and the number of potential influential carriers available for unwitting amplification becomes substantial. By targeting high-influence people and organizations in all types of media with narratives and content engineered to appeal to their emotional or ideological drivers, influence campaigns are able to add perceived credibility to their messaging via saturation and adoption by trusted agents such as celebrities, journalists and local leaders.	TA08
T0040 Demand unsurmountable proof Campaigns often leverage tactical and informational asymmetries on the threat surface, as seen in the Distort and Deny strategies, and the "firehose of misinformation". Specifically, conspiracy theorists can be repeatedly wrong, but advocates of the truth need to be perfect. By constantly escalating demands for proof, propagandists can effectively leverage this asymmetry while also priming its future use, often with an even greater asymmetric advantage. The conspiracist is offered freer rein for a broader range of "questions" while the truth teller is burdened with higher and higher standards of proof. TA08
T0041 Deny involvement Without "smoking gun" proof (and even with proof), incident creator can or will deny involvement. This technique also leverages the attacker advantages outlined in T0040 "Demand unsurmountable proof", specifically the asymmetric disadvantage for truth-tellers in a "firehose of misinformation" environment. TA08
T0042 Kernel of Truth Wrap lies or altered context/facts around truths. Influence campaigns pursue a variety of objectives with respect to target audiences, prominent among them: 1. undermine a narrative commonly referenced in the target audience; or 2. promote a narrative less common in the target audience, but preferred by the attacker. In both cases, the attacker is presented with a heavy lift. They must change the relative importance of various narratives in the interpretation of events, despite contrary tendencies. When messaging makes use of factual reporting to promote these adjustments in the narrative space, they are less likely to be dismissed out of hand; when messaging can juxtapose a (factual) truth about current affairs with the (abstract) truth explicated in these narratives, propagandists can undermine or promote them selectively. Context matters. TA08
T0043	Use SMS/ WhatsApp/ Chat apps	Direct messaging via encrypted apps is an increasingly common delivery method. These messages are often automated, and new delivery and storage methods make them anonymous, viral, and ephemeral. This is a difficult space to monitor, but also a difficult space to build acclaim or notoriety.	TA08
T0044 Seed distortions Try a wide variety of messages in the early hours surrounding an incident or event, to give a misleading account or impression. TA08
T0045	Use fake experts	Use the fake experts that were set up in T0009. Pseudo-experts are disposable assets that often appear once and then disappear. Gives "credibility" to misinformation. Takes advantage of credential bias.	TA08
T0046	Search Engine Optimization	Manipulate content engagement metrics (ie: Reddit & Twitter) to influence/impact news search results (e.g. Google); this also elevates RT & Sputnik headlines into Google News alert emails. Aka "black-hat SEO".	TA08
T0047	Muzzle social media as a political force	Use political influence or the power of state to stop critical social media comments. Government-requested/driven content takedowns (see Google Transparency reports).	TA09
T0048	Cow online opinion leaders	Intimidate, coerce, or threaten critics/dissidents/journalists via trolling and doxing.	TA09
T0049	Flooding	Flooding and/or mobbing social media channels, feeds, and/or hashtags with an excessive volume of content to control/shape online conversations and/or drown out opposing points of view. Bots and/or patriotic trolls are effective tools to achieve this effect.	TA09
T0050	Cheerleading domestic social media ops	Deploy state-coordinated social media commenters and astroturfers. Covers both internal/domestic and external social media influence operations.	TA09
T0051	Fabricate social media comment	Use government-paid social media commenters, astroturfers, and chat bots (programmed to reply to specific key words/hashtags) to influence online conversations, product reviews, and website comment forums.	TA09
T0052	Tertiary sites amplify news	Create content/news/opinion websites to cross-post stories. Tertiary sites circulate and amplify narratives. Often these sites have no masthead, bylines or attribution.	TA09
T0053	Twitter trolls amplify and manipulate	Use trolls to amplify and/or manipulate narratives. Fake profiles/sockpuppets operate to support individuals/narratives from across the entire political spectrum (left/right binary). They operate with increased emphasis on promoting local content and promoting real Twitter users generating their own, often divisive political content, as it's easier to amplify existing content than to create new/original content. Trolls operate wherever there's a socially divisive issue (issues that can be or are politicized).	TA09
T0054	Twitter bots amplify	Use bots to amplify narratives above algorithm thresholds. Bots are automated/programmed profiles designed to amplify content (ie: automatically retweet or like) and give the appearance that it's more "popular" than it is. They can operate as a network, to function in a coordinated/orchestrated manner. In some cases (more so now) they are inexpensive/disposable assets used for minimal deployment as bot detection tools improve and platforms become more responsive.	TA09
T0055	Use hashtag	Use a dedicated hashtag for the incident: either create a campaign/incident-specific hashtag, or take over an existing hashtag.	TA09
T0056 Dedicated channels disseminate information pollution Output information pollution (e.g. articles on an unreported false story/event) through channels controlled by or related to the incident creator. TA09
T0057 Organise remote rallies and events Coordinate and promote real-world events across media platforms, e.g. rallies, protests, gatherings in support of incident narratives. TA10
T0058 Legacy web content Make incident content visible for a long time, e.g. by exploiting platform terms of service, or placing it where it's hard to remove or unlikely to be removed. TA11
T0059	Play the long game	Playing the long game can mean a couple of things: 1. To plan messaging and allow it to grow organically without conducting your own amplification; this is methodical and slow, and requires years for the message to take hold. 2. To develop a series of seemingly disconnected messaging narratives that eventually combine into a new narrative.	TA11
T0060	Continue to amplify	Continue narrative or message amplification after the main incident work has finished.	TA11
T0061	Sell merchandising	Sell hats, t-shirts, flags and other branded content that's designed to be seen in the real world.	TA10
T0062 Behaviour changes Monitor and evaluate behaviour changes from misinformation incidents. TA12
T0063 Message reach Monitor and evaluate message reach in misinformation incidents. TA12
T0064 Social media engagement Monitor and evaluate social media engagement in misinformation incidents. TA12
T0065 Use physical broadcast capabilities Create or coopt broadcast capabilities (e.g. TV, radio etc). TA04
T0066 Degrade adversary Plan to degrade an adversary’s image or ability to act. This could include preparation and use of harmful information about the adversary’s actions or reputation. TA02
T0067 Plan to discredit credible sources Plan to delegitimize the media landscape and degrade public trust in reporting, by discrediting credible sources. This makes it easier to promote influence operation content. TA02
T0068 Respond to breaking news event Media attention on a story or event is heightened during a breaking news event, where unclear facts and incomplete information increase speculation, rumors, and conspiracy theories, which are all vulnerable to manipulation. TA02
T0069 Respond to active crisis Panic, rumors, and speculation are heightened during active crises (violent events, natural disasters, public health phenomena, etc); these are all vulnerable to manipulation. TA02
T0070 Analyze existing communities Assess influence operation potential of existing social media communities, where communities share interests, experiences, politics, or other characteristics that join online users together. Assessment includes the potential use of social group trauma that could be targeted to gain support, using emotional appeals to shared grievances in a set population. TA02
T0071 Find echo chambers Find or plan to create areas (social media groups, search term groups, hashtag groups etc) where individuals only engage with people they agree with. TA13
T0072 Segment audiences Create audience segmentations by features of interest to the influence campaign, including political affiliation, geographic location, income, demographics, and psychographics. TA13