DISARMframeworks/DISARM_MASTER_DATA/countermeasures.csv
disarm_id,name,metatechnique,summary,how_found,references,incident_ids,tactic,responsetype,notes,tags,longname
C00022,Inoculate. Positive campaign to promote feeling of safety,M001 - resilience,Used to counter ability-based and fear-based attacks.,2019-11-workshop,,,TA01 Strategic Planning,D04,,narrative,C00022 - Inoculate. Positive campaign to promote feeling of safety
C00006,Charge for social media,M004 - friction,"Include a paid-for privacy option, e.g. pay Facebook for an option of them not collecting your personal information. There are examples of this not working, e.g. most people don't use ProtonMail etc.",2019-11-workshop,,,TA01 Strategic Planning,D02,,action,C00006 - Charge for social media
C00008,Create shared fact-checking database,M006 - scoring,"Share fact-checking resources - tips, responses, countermessages - across response groups.","2019-11-workshop, 2019-11-search",,"I00049,I00050",TA01 Strategic Planning,D04,,information,C00008 - Create shared fact-checking database
C00009,Educate high profile influencers on best practices,M001 - resilience,"Find online influencers. Provide training in the mechanisms of disinformation, how to spot campaigns, and/or how to contribute to responses by countermessaging, boosting information sites etc.",2019-11-workshop,,,TA02 Objective Planning,D02,,education,C00009 - Educate high profile influencers on best practices
C00010,Enhanced privacy regulation for social media,M004 - friction,"Implement stronger privacy standards, to reduce the ability to microtarget community members.",2019-11-workshop,,,TA01 Strategic Planning,D02,,regulation,C00010 - Enhanced privacy regulation for social media
C00011,Media literacy. Games to identify fake news,M001 - resilience,"Create and use games to show people the mechanics of disinformation, and how to counter them.",2019-11-workshop,,,TA02 Objective Planning,D02,,education,C00011 - Media literacy. Games to identify fake news
C00012,Platform regulation,M007 - metatechnique,"Empower existing regulators to govern social media. Also covers Destroy. Includes: include the role of social media in the regulatory framework for media (the U.S. approach will need to be carefully crafted to protect First Amendment principles, create needed transparency, ensure liability, and impose costs for noncompliance); create policy that makes social media platforms police disinformation; use fraud legislation to clean up social media.","2019-11-workshop, 2019-11-search",Hicks19,,TA01 Strategic Planning,D02,,regulation,C00012 - Platform regulation
C00013,Rating framework for news,M006 - scoring,"This is ""strategic inoculation"": raising the standards of what people expect in terms of evidence when consuming news. Examples: journalistic ethics, or a journalistic licensing body. Include full transcripts, link sources, add items.",2019-11-workshop,,,TA01 Strategic Planning,D02,,information,C00013 - Rating framework for news
C00014,Real-time updates to fact-checking database,M006 - scoring,Update fact-checking databases and resources in real time. Especially important for time-limited events like natural disasters.,2019-11-workshop,,,TA06 Develop Content,D04,,information,C00014 - Real-time updates to fact-checking database
C00016,Censorship,M005 - removal,Alter and/or block the publication/dissemination of information controlled by disinformation creators. Not recommended.,grugq,Taylor81,,TA01 Strategic Planning,D02,,action,C00016 - Censorship
C00017,Repair broken social connections,M010 - countermessaging,"For example, use a media campaign to promote in-group to out-group in-person communication/activities. Technique could be in terms of forcing a reality-check by talking to people instead of reading about bogeymen.",2019-11-workshop,,,TA01 Strategic Planning,D03,,action,C00017 - Repair broken social connections
C00019,Reduce effect of division-enablers,M003 - daylight,"Includes promoting constructive communication by shaming division-enablers, and promoting playbooks to call out division-enablers.",2019-11-workshop,,,TA01 Strategic Planning,D03,,action,C00019 - Reduce effect of division-enablers
C00021,Encourage in-person communication,M001 - resilience,Encourage offline communication.,2019-11-workshop,,,TA01 Strategic Planning,D04,,action,C00021 - Encourage in-person communication
C00024,Promote healthy narratives,M001 - resilience,"Includes promoting constructive narratives, i.e. not polarising ones (e.g. pro-life, pro-choice, pro-USA). Includes promoting identity-neutral narratives.",2019-11-workshop,,,TA01 Strategic Planning,D04,,narrative,C00024 - Promote healthy narratives
C00026,Shore up democracy based messages,M010 - countermessaging,"Messages about e.g. peace, freedom. And make it sexy. Includes Deploy Information and Narrative-Building in Service of Statecraft: promote a narrative of transparency, truthfulness, liberal values, and democracy; implement a compelling narrative via effective mechanisms of communication; continually reassess messages, mechanisms, and audiences over time; counteract efforts to manipulate media, undermine free markets, and suppress political freedoms via public diplomacy.","2019-11-workshop, 2019-11-search",Hicks19,,TA01 Strategic Planning,D04,,narrative,C00026 - Shore up democracy based messages
C00027,Create culture of civility,M001 - resilience,This is passive. Includes promoting civility as an identity that people will defend.,2019-11-workshop,,,TA01 Strategic Planning,D07,,narrative,C00027 - Create culture of civility
C00029,Create fake website to issue counter narrative and counter narrative through physical merchandise,M002 - diversion,Create websites in disinformation voids - spaces where people are looking for known disinformation.,2019-11-workshop,,,TA02 Objective Planning,D03,,narrative,C00029 - Create fake website to issue counter narrative and counter narrative through physical merchandise
C00028,Make information provenance available,M011 - verification,"Blockchain audit log and validation with collaborative decryption to post comments. Use blockchain technology to require collaborative validation before posts or comments are submitted. This could be used to adjust upvote weight via a trust factor of people and organisations you trust, or other criteria.",2019-11-workshop,,,TA02 Objective Planning,D03,,information,C00028 - Make information provenance available
C00030,Develop a compelling counter narrative (truth based),M002 - diversion,,2019-11-workshop,,,TA02 Objective Planning,D03,,narrative,C00030 - Develop a compelling counter narrative (truth based)
C00031,"Dilute the core narrative - create multiple permutations, target / amplify",M009 - dilution,"Create competing narratives. Included ""Facilitate State Propaganda"", as diluting the narrative could have an effect on the pro-state narrative used by volunteers, or lower their involvement.",2019-11-workshop,,,TA02 Objective Planning,D03,"CAVEAT: some element of disinformation is simply filling the information space with so much data that it overwhelms people and they shut down. Any swarm-counter-narrative needs to be cautious of this outcome.",narrative,"C00031 - Dilute the core narrative - create multiple permutations, target / amplify"
C00042,Address truth contained in narratives,M010 - countermessaging,"Focus on and boost truths in misinformation narratives, removing misinformation from them.",2019-11-workshop,,,TA15 Establish Social Assets,D04,,narrative,C00042 - Address truth contained in narratives
C00032,Hijack content and link to truth-based info,M002 - diversion,Link to platform.,2019-11-workshop,,,TA06 Develop Content,D03,,information,C00032 - Hijack content and link to truth-based info
C00034,Create more friction at account creation,M004 - friction,Counters fake account creation.,2019-11-workshop,,,TA15 Establish Social Assets,D04,,action,C00034 - Create more friction at account creation
C00036,Infiltrate the in-group to discredit leaders (divide),M013 - targeting,All of these would be highly affected by infiltration or false claims of infiltration.,2019-11-workshop,,,TA15 Establish Social Assets,D02,,action,C00036 - Infiltrate the in-group to discredit leaders (divide)
C00040,Third party verification for people,M011 - verification,Counters fake experts.,2019-11-workshop,,,TA15 Establish Social Assets,D02,,information,C00040 - Third party verification for people
C00067,Denigrate the recipient/project (of online funding),M013 - targeting,Reduce the credibility of groups behind misinformation-linked funding campaigns.,2019-11-workshop,,,TA15 Establish Social Assets,D03,,narrative,C00067 - Denigrate the recipient/project (of online funding)
C00044,Keep people from posting to social media immediately,M004 - friction,"Platforms can introduce friction to slow down activities, e.g. force a small delay between posts, or between replies to posts.",2019-11-workshop,,,TA15 Establish Social Assets,D03,,action,C00044 - Keep people from posting to social media immediately
C00046,Marginalise and discredit extremist groups,M013 - targeting,Reduce the credibility of extremist groups posting misinformation.,2019-11-workshop,,,TA15 Establish Social Assets,D04,,action,C00046 - Marginalise and discredit extremist groups
C00047,Honeypot with coordinated inauthentics,M008 - data pollution,"Flood disinformation spaces with obviously fake content, to dilute core misinformation narratives in them.",2019-11-workshop,,,TA15 Establish Social Assets,D05,,action,C00047 - Honeypot with coordinated inauthentics
C00048,Name and Shame Influencers,M003 - daylight,"Think about the different levels: individual vs state-sponsored accounts. Includes ""call them out"" and ""name and shame"". Identifying social media accounts as sources of propaganda - ""calling them out"" - might be helpful to prevent the spread of their message to audiences that would otherwise consider them factual. Identify, monitor, and, if necessary, target externally-based non-attributed social media accounts. On the impact of and dealing with trolls: Chatham House has observed that trolls also sometimes function as decoys, as a way of ""keeping the infantry busy"" that ""aims to wear down the other side"" (Lough et al., 2014). Another type of troll involves ""false accounts posing as authoritative information sources on social media"".","2019-11-workshop, 2019-11-search","Rand2237, Dalton19",,TA15 Establish Social Assets,D07,,information,C00048 - Name and Shame Influencers
C00051,Counter social engineering training,M001 - resilience,"Includes anti-elicitation training, phishing prevention education.",2019-11-workshop,,,TA15 Establish Social Assets,D02,,education,C00051 - Counter social engineering training
C00052,Infiltrate platforms,M013 - targeting,Detect and degrade.,2019-11-workshop,,,TA15 Establish Social Assets,D04,,action,C00052 - Infiltrate platforms
C00053,Delete old accounts / Remove unused social media accounts,M012 - cleaning,"Remove, or remove access to (e.g. stop the ability to update), old social media accounts, to reduce the pool of accounts available for takeover, botnets etc.","2019-11-workshop, 2019-11-search",,I00004,TA15 Establish Social Assets,D04,,action,C00053 - Delete old accounts / Remove unused social media accounts
C00056,Encourage people to leave social media,M004 - friction,Encourage people to leave social media. We don't expect this to work.,2019-11-workshop,,,TA15 Establish Social Assets,D02,,action,C00056 - Encourage people to leave social media
C00058,Report crowdfunder as violator,M005 - removal,"Counters crowdfunding. Includes ""expose online funding as fake"".",2019-11-workshop,,,TA15 Establish Social Assets,D02,,information,C00058 - Report crowdfunder as violator
C00059,Verification of project before posting fund requests,M011 - verification,"Third-party verification of projects posting funding campaigns, before those campaigns can be posted.",2019-11-workshop,,,TA15 Establish Social Assets,D02,,information,C00059 - Verification of project before posting fund requests
C00060,Legal action against for-profit engagement factories,M013 - targeting,"Take legal action against for-profit ""factories"" creating misinformation.",2019-11-workshop,,,TA02 Objective Planning,D03,,regulation,C00060 - Legal action against for-profit engagement factories
C00062,Free open library sources worldwide,M010 - countermessaging,"Open-source libraries could be created that aid in some way for each technique. Even for strategic planning, open-source frameworks such as DISARM can be created to counter adversarial efforts.",2019-11-workshop,,,TA15 Establish Social Assets,D04,,information,C00062 - Free open library sources worldwide
C00065,Reduce political targeting,M005 - removal,"Includes ""ban political microtargeting"" and ""ban political ads"".",2019-11-workshop,,,TA05 Microtargeting,D03,,action,C00065 - Reduce political targeting
C00066,Co-opt a hashtag and drown it out (hijack it back),M009 - dilution,Flood a disinformation-related hashtag with other content.,2019-11-workshop,,,TA05 Microtargeting,D03,,information,C00066 - Co-opt a hashtag and drown it out (hijack it back)
C00080,Create competing narrative,M002 - diversion,"Create counternarratives, or narratives that compete in the same spaces as misinformation narratives. Could also be degrade.",2019-11-workshop,,,TA06 Develop Content,D03,,narrative,C00080 - Create competing narrative
C00070,Block access to disinformation resources,M005 - removal,"Resources = accounts, channels etc. Block access to platform. DDoS an attacker. TA02*: DDoS at the critical time, to deny an adversary's time-bound objective. T0008: a quick response to a proto-viral story will affect its ability to spread and raise questions about its legitimacy. Hashtag: against the platform, by drowning the hashtag. T0046 - Search Engine Optimization: sub-optimal website performance affects its search engine rank, which I interpret as ""blocking access to a platform"".",2019-11-workshop,,,TA02 Objective Planning,D02,,action,C00070 - Block access to disinformation resources
C00071,Block source of pollution,M005 - removal,"Block websites, accounts, groups etc. connected to misinformation and other information pollution.",2019-11-workshop,,,TA06 Develop Content,D02,,action,C00071 - Block source of pollution
C00072,Remove non-relevant content from special interest groups - not recommended,M005 - removal,"Check special-interest groups (e.g. medical, knitting) for unrelated and misinformation-linked content, and remove it.",2019-11-workshop,,,TA06 Develop Content,D02,,action,C00072 - Remove non-relevant content from special interest groups - not recommended
C00073,Inoculate populations through media literacy training,M001 - resilience,"Use training to build the resilience of at-risk populations. Educate on how to handle information pollution. Push out targeted education on why it's pollution. Build cultural resistance to false content, e.g. cultural resistance to bullshit. Influence literacy training, to inoculate against ""cult"" recruiting. Media literacy training: leverage librarians/libraries for media literacy training. Inoculate at language level. Strategic planning included, as inoculating the population has strategic value. To bring concepts of media literacy to a mass audience, authorities could launch a public information campaign; such a program will take time to develop and establish impact, so curriculum-based training is recommended. Covers detect, deny, and degrade.","2019-11-workshop, 2019-11-search",Rand2237,,TA01 Strategic Planning,D02,,education,C00073 - Inoculate populations through media literacy training
C00074,Identify and delete or rate limit identical content,M012 - cleaning,C00000,2019-11-workshop,,,TA06 Develop Content,D02,,action,C00074 - Identify and delete or rate limit identical content
C00075,Normalise language,M010 - countermessaging,Normalise the language around disinformation and misinformation; give people the words for artifact and effect types.,2019-11-workshop,,,TA06 Develop Content,D02,,information,C00075 - Normalise language
C00076,Prohibit images in political discourse channels,M005 - removal,Make political discussion channels text-only.,2019-11-workshop,,,TA06 Develop Content,D02,,action,C00076 - Prohibit images in political discourse channels
C00077,"Active defence: run TA15 ""develop people"" - not recommended",M013 - targeting,Develop networks of communities and influencers around counter-misinformation. Match them to misinformation creators.,2019-11-workshop,,,TA15 Establish Social Assets,D03,,action,"C00077 - Active defence: run TA15 ""develop people"" - not recommended"
C00078,Change Search Algorithms for Disinformation Content,M002 - diversion,"Includes ""change image search algorithms for hate groups and extremists"" and ""change search algorithms for hate and extremist queries to show content sympathetic to the opposite side"".",2019-11-workshop,,,TA06 Develop Content,D03,,action,C00078 - Change Search Algorithms for Disinformation Content
C00084,"Modify disinformation narratives, and rebroadcast them",M002 - diversion,"Includes ""poison pill recasting of message"" and ""steal their truths"". Many techniques involve promotion, which could be manipulated. For example, online fundings or rallies could be advertised, through compromised or fake channels, as being associated with ""far-up/down/left/right"" actors. ""Long game"" narratives could be treated in a similar way, with negative connotations. Can also replay technique T0003.",2019-11-workshop,,,TA06 Develop Content,D03,,narrative,"C00084 - Modify disinformation narratives, and rebroadcast them"
C00081,"Highlight flooding and noise, and explain motivations",M003 - daylight,"Discredit by pointing out the ""noise"" and informing the public that ""flooding"" is a technique of disinformation campaigns; point out the intended objective of the ""noise"".",2019-11-workshop,,,TA06 Develop Content,D03,,information,"C00081 - Highlight flooding and noise, and explain motivations"
C00082,Ground truthing as automated response to pollution,M010 - countermessaging,Also inoculation.,2019-11-workshop,,,TA06 Develop Content,D03,,information,C00082 - Ground truthing as automated response to pollution
C00087,Make more noise than the disinformation,M009 - dilution,,2019-11-workshop,,,TA06 Develop Content,D04,,narrative,C00087 - Make more noise than the disinformation
C00085,Mute content,M003 - daylight,"Rate-limit disinformation content. Reduces its effects, whilst not running afoul of censorship concerns. Online archives of content (archives of websites, social media profiles, media, copies of published advertisements; or archives of comments attributed to bad actors, as well as anonymized metadata about users who interacted with them and analysis of the effect) are useful for intelligence analysis and public transparency, but will need similar muting or tagging/shaming as associated with bad actors.",2019-11-workshop,,,TA06 Develop Content,D03,,action,C00085 - Mute content
C00086,Distract from noise with addictive content,M002 - diversion,"Example: interject addictive links or content into discussions of disinformation materials, and measure a ""conversion rate"" of users who engage with your content and away from the social media channel's ""information bubble"" around the disinformation item. Use bots to amplify and upvote the addictive content.",2019-11-workshop,,,TA06 Develop Content,D04,,information,C00086 - Distract from noise with addictive content
C00112,"""Prove they are not an op!""",M004 - friction,Challenge misinformation creators to prove they're not an information operation.,2019-11-workshop,,,TA08 Pump Priming,D02,,narrative,"C00112 - ""Prove they are not an op!"""
C00090,Fake engagement system,M002 - diversion,"Create honeypots for misinformation creators to engage with, reducing the resources they have available for misinformation campaigns.",2019-11-workshop,,,TA07 Channel Selection,D05,,action,C00090 - Fake engagement system
C00091,Honeypot social community,M002 - diversion,"Set honeypots, e.g. communities, in networks likely to be used for disinformation.",2019-11-workshop,,,TA06 Develop Content,D05,,action,C00091 - Honeypot social community
C00092,Establish a truth teller reputation score for influencers,M006 - scoring,"Includes ""establish a truth teller reputation score for influencers"" and ""reputation scores for social media users"". Influencers are individuals or accounts with many followers.",2019-11-workshop,,,TA02 Objective Planning,D07,,information,C00092 - Establish a truth teller reputation score for influencers
C00093,Influencer code of conduct,M001 - resilience,Establish a tailored code of conduct for individuals with many followers. Can be a platform code of conduct; can also be a community code.,2019-11-workshop,,,TA15 Establish Social Assets,D07,,information,C00093 - Influencer code of conduct
C00094,Force full disclosure on corporate sponsor of research,M003 - daylight,Accountability move: make sure research is published with its funding sources.,2019-11-workshop,,,TA06 Develop Content,D04,,information,C00094 - Force full disclosure on corporate sponsor of research
C00096,Strengthen institutions that are always truth tellers,M006 - scoring,"Increase the credibility, visibility, and reach of positive influencers in the information space.",2019-11-workshop,,,TA01 Strategic Planning,D07,,information,C00096 - Strengthen institutions that are always truth tellers
C00097,Require use of verified identities to contribute to poll or comment,M004 - friction,Reduce poll flooding by only taking comments or poll entries from verified accounts.,2019-11-workshop,,,TA07 Channel Selection,D02,,information,C00097 - Require use of verified identities to contribute to poll or comment
C00098,"Revocation of allowlisted or ""verified"" status",M004 - friction,Remove blue checkmarks etc. from known misinformation accounts.,2019-11-workshop,,,TA07 Channel Selection,D02,,action,"C00098 - Revocation of allowlisted or ""verified"" status"
C00099,Strengthen verification methods,M004 - friction,"Improve content verification methods available to groups, individuals etc.",2019-11-workshop,,,TA07 Channel Selection,D02,,information,C00099 - Strengthen verification methods
C00100,Hashtag jacking,M002 - diversion,Post large volumes of unrelated content on known misinformation hashtags.,2019-11-workshop,,,TA08 Pump Priming,D03,,information,C00100 - Hashtag jacking
C00101,Create friction by rate-limiting engagement,M004 - friction,"Create participant friction. Includes making repeat voting hard, and throttling the number of forwards.",2019-11-workshop,,,TA07 Channel Selection,D04,,action,C00101 - Create friction by rate-limiting engagement
C00103,Create a bot that engages / distracts trolls,M002 - diversion,"This is a reactive, not active, measure (honeypots are active). It's a platform-controlled measure.",2019-11-workshop,,,TA07 Channel Selection,D05,,action,C00103 - Create a bot that engages / distracts trolls
C00105,Buy more advertising than misinformation creators,M009 - dilution,Shift influence and algorithms by posting more adverts into spaces than misinformation creators do.,2019-11-workshop,,,TA07 Channel Selection,D03,,information,C00105 - Buy more advertising than misinformation creators
C00106,Click-bait centrist content,M002 - diversion,Create emotive centrist content that gets more clicks.,2019-11-workshop,,,TA06 Develop Content,D03,,information,C00106 - Click-bait centrist content
C00107,Content moderation,"M006 - scoring, M005 - removal","Includes social media content take-downs, e.g. Facebook or Twitter content take-downs.","2019-11-workshop, 2019-11-search",,"I00005,I00009,I00056",TA06 Develop Content,D02,,action,C00107 - Content moderation
C00109,Dampen Emotional Reaction,M001 - resilience,"Reduce emotional responses to misinformation through calming messages, etc.",2019-11-workshop,,,TA09 Exposure,D03,,information,C00109 - Dampen Emotional Reaction
C00111,Reduce polarisation by connecting and presenting sympathetic renditions of opposite views,M001 - resilience,,2019-11-workshop,,,TA01 Strategic Planning,D04,,information,C00111 - Reduce polarisation by connecting and presenting sympathetic renditions of opposite views
C00118,Repurpose images with new text,M010 - countermessaging,Add countermessage text to images used in misinformation incidents.,2019-11-workshop,,,TA08 Pump Priming,D04,,narrative,C00118 - Repurpose images with new text
C00113,Debunk and defuse a fake expert / credentials,M003 - daylight,"Debunk fake experts, their credentials, and potentially also their audience quality.",2019-11-workshop,,,TA08 Pump Priming,D02,,information,C00113 - Debunk and defuse a fake expert / credentials
C00114,Don't engage with payloads,M004 - friction,Stop passing on misinformation.,2019-11-workshop,,,TA08 Pump Priming,D02,,information,C00114 - Don't engage with payloads
C00115,Expose actor and intentions,M003 - daylight,Debunk misinformation creators and posters.,2019-11-workshop,,,TA08 Pump Priming,D02,,information,C00115 - Expose actor and intentions
C00116,Provide proof of involvement,M003 - daylight,Build and post information about the involvement of groups etc. in misinformation incidents.,2019-11-workshop,,,TA08 Pump Priming,D02,,information,C00116 - Provide proof of involvement
C00117,Downgrade / de-amplify so message is seen by fewer people,M010 - countermessaging,Label content and promote counters to disinformation.,2019-11-workshop,,,TA08 Pump Priming,D04,,information,C00117 - Downgrade / de-amplify so message is seen by fewer people
C00125,Prebunking,M001 - resilience,"Produce material in advance of misinformation incidents, by anticipating the narratives used in them, and debunking them.",2019-11-workshop,,,TA09 Exposure,D03,,narrative,C00125 - Prebunking
C00119,Engage payload and debunk,M010 - countermessaging,Debunk misinformation content. Provide a link to facts.,2019-11-workshop,,,TA08 Pump Priming,D07,,information,C00119 - Engage payload and debunk
C00120,Open dialogue about design of platforms to produce different outcomes,M007 - metatechnique,Redesign platforms and algorithms to reduce the effectiveness of disinformation.,2019-11-workshop,,,TA08 Pump Priming,D07,,action,C00120 - Open dialogue about design of platforms to produce different outcomes
C00121,Tool transparency and literacy for channels people follow,M001 - resilience,"Make algorithms in platforms explainable, and visible to people using those platforms.",2019-11-workshop,,,TA08 Pump Priming,D07,,information,C00121 - Tool transparency and literacy for channels people follow
C00122,Content moderation,M004 - friction,Beware: content moderation misused becomes censorship.,2019-11-workshop,,,TA09 Exposure,D02,,action,C00122 - Content moderation
C00123,Remove or rate limit botnets,M004 - friction,Reduce the visibility of known botnets online.,2019-11-workshop,,,TA09 Exposure,D03,,action,C00123 - Remove or rate limit botnets
C00124,Don't feed the trolls,M004 - friction,Don't engage with individuals relaying misinformation.,2019-11-workshop,,,TA09 Exposure,D03,,action,C00124 - Don't feed the trolls
C00211,Use humorous counter-narratives,M010 - countermessaging,,2019-11-search,,I00004,TA09 Exposure,D03,,narrative,C00211 - Use humorous counter-narratives
C00126,Social media amber alert,M003 - daylight,"Create an alert system around disinformation and misinformation artifacts, narratives, and incidents.",2019-11-workshop,,,TA09 Exposure,D03,,information,C00126 - Social media amber alert
C00128,"Create friction by marking content with ridicule or other ""decelerants""",M009 - dilution,"Repost or comment on misinformation artifacts, using ridicule or other content to reduce the likelihood of reposting.",2019-11-workshop,,,TA09 Exposure,D03,,information,"C00128 - Create friction by marking content with ridicule or other ""decelerants"""
C00129,Use banking to cut off access,M014 - reduce resources,Fiscal sanctions; parallel to counter-terrorism.,2019-11-workshop,,,TA09 Exposure,D02,,action,C00129 - Use banking to cut off access
C00130,"Mentorship: elders, youth, credit. Learn vicariously.",M001 - resilience,Train local influencers in countering misinformation.,2019-11-workshop,,,TA05 Microtargeting,D07,,education,"C00130 - Mentorship: elders, youth, credit. Learn vicariously."
C00131,Seize and analyse botnet servers,M005 - removal,Take botnet servers offline by seizing them.,2019-11-workshop,,,TA11 Persistence,D02,,action,C00131 - Seize and analyse botnet servers
C00133,Deplatform Account*,M005 - removal,"Note: similar to ""Deplatform People"" but less generic. Perhaps both should be left.",2019-11-workshop,,,TA15 Establish Social Assets,D03,,action,C00133 - Deplatform Account*
C00135,Deplatform message groups and/or message boards,M005 - removal,Merged two rows here.,2019-11-workshop,,,TA15 Establish Social Assets,D03,,action,C00135 - Deplatform message groups and/or message boards
C00136,Microtarget most likely targets then send them countermessages,M010 - countermessaging,"Find communities likely to be targeted by misinformation campaigns, and send them countermessages or pointers to information sources.",2019-11-workshop,,,TA08 Pump Priming,D03,,information,C00136 - Microtarget most likely targets then send them countermessages
C00138,Spam domestic actors with lawsuits,M014 - reduce resources,"File multiple lawsuits against known misinformation creators and posters, to distract them from disinformation creation.",2019-11-workshop,,,TA11 Persistence,D03,,regulation,C00138 - Spam domestic actors with lawsuits
C00139,Weaponise YouTube content matrices,M004 - friction,God knows what this is. Keeping temporarily in case we work it out.,2019-11-workshop,,,TA11 Persistence,D03,,information,C00139 - Weaponise YouTube content matrices
C00140,"""Bomb"" link shorteners with lots of calls",M008 - data pollution,"Applies to most of the content used by exposure techniques except ""T0055 - Use hashtag"". Applies to analytics.",2019-11-workshop,,,TA12 Measure Effectiveness,D03,,action,"C00140 - ""Bomb"" link shorteners with lots of calls"
C00142,Platform adds warning label and decision point when sharing content,M004 - friction,"Includes ""this has been disproved: do you want to forward it?"". Includes a ""'Hey, this story is old' popup when messaging with an old URL"" - this assumes that the technique is based on visits to a URL shortener or a captured news site that can publish a message of our choice. Includes ""mark clickbait visually"".",2019-11-workshop,,,TA06 Develop Content,D04,,information,C00142 - Platform adds warning label and decision point when sharing content
C00143,(botnet) DMCA takedown requests to waste group time,M013 - targeting,Use copyright infringement claims to remove videos etc.,2019-11-workshop,,,TA11 Persistence,D04,,regulation,C00143 - (botnet) DMCA takedown requests to waste group time
C00144,Buy out troll farm employees / offer them jobs,M014 - reduce resources,Degrade the infrastructure. Could e.g. pay them not to act for 30 days. Not recommended.,2019-11-workshop,,,TA02 Objective Planning,D04,,action,C00144 - Buy out troll farm employees / offer them jobs
C00147,Make amplification of social media posts expire (e.g. can't like/retweet after n days),M004 - friction,"Stop new community activity (likes, comments) on old social media posts.",2019-11-workshop,,,TA09 Exposure,D03,,action,C00147 - Make amplification of social media posts expire (e.g. can't like/retweet after n days)
C00148,Add random links to network graphs,M008 - data pollution,"If creators are using network analysis to determine how to attack networks, then adding random extra links to those networks might throw that analysis out enough to change attack outcomes. Unsure which DISARM techniques this counters.",2019-11-workshop,,,TA12 Measure Effectiveness,D04,,action,C00148 - Add random links to network graphs
C00149,Poison the monitoring & evaluation data,M008 - data pollution,"Includes polluting the A/B-testing data feeds. Polluting A/B testing requires knowledge of MOEs and MOPs. A/B testing must be caught early, when there is relatively little data available, so infiltration of TAs and understanding of how content is migrated from testing to larger audiences is fundamental.",2019-11-workshop,,,TA12 Measure Effectiveness,D04,,action,C00149 - Poison the monitoring & evaluation data
C00153,Take pre-emptive action against actors' infrastructure,M013 - targeting,"Align offensive cyber action with information operations and counter-disinformation approaches, where appropriate.",2019-11-search,Dalton19,,TA01 Strategic Planning,D03,,action,C00153 - Take pre-emptive action against actors' infrastructure
C00154,Ask media not to report false information,M005 - removal,"Train media to spot and respond to misinformation, and ask them not to post or transmit misinformation they've found.",2019-11-search,,I00022,TA08 Pump Priming,D02,,information,C00154 - Ask media not to report false information
C00155,Ban incident actors from funding sites,M005 - removal,Ban misinformation creators and posters from funding sites.,2019-11-search,,I00002,TA15 Establish Social Assets,D02,,action,C00155 - Ban incident actors from funding sites
C00156,Better tell your country or organization story,M010 - countermessaging,"Civil engagement activities conducted on the part of EFP forces. NATO should likewise provide support and training, where needed, to local public affairs and other communication personnel. Local government and military public affairs personnel can play their part in creating and disseminating entertaining and sharable content that supports the EFP mission.",2019-11-search,Rand2237,,TA02 Objective Planning,D03,,information,C00156 - Better tell your country or organization story
C00159,Have a disinformation response plan,M007 - metatechnique,"e.g. create a campaign plan and toolkit for competition short of armed conflict (this used to be called ""the grey zone""). The campaign plan should account for own vulnerabilities and strengths, and not over-rely on any one tool of statecraft or line of effort. It will identify and employ a broad spectrum of national power to deter, compete with, and counter (where necessary) other countries' approaches, and will include an understanding of own capabilities, the capabilities of disinformation creators, and international standards of conduct to compete in, shrink the size of, and ultimately deter the use of competition short of armed conflict.",2019-11-search,Hicks19,,TA01 Strategic Planning,D03,,action,C00159 - Have a disinformation response plan
C00160,Find and train influencers,M001 - resilience,"Identify key influencers (e.g. using network analysis), then reach out to identified users and offer support, through either training or resources.",2019-11-search,Rand2237,,TA15 Establish Social Assets,D02,,education,C00160 - Find and train influencers
C00161,Coalition Building with stakeholders and Third-Party Inducements,M007 - metatechnique,"Advance coalitions across borders and sectors, spanning public and private, as well as foreign and domestic, divides. Improve mechanisms to collaborate, share information, and develop coordinated approaches with the private sector at home, and with allies and partners abroad.",2019-11-search,Dalton19,,TA01 Strategic Planning,D07,,action,C00161 - Coalition Building with stakeholders and Third-Party Inducements
C00162,Unravel/target the Potemkin villages,M013 - targeting,"The Kremlin's narrative spin extends through constellations of ""civil society"" organizations, political parties, churches, and other actors. Moscow leverages think tanks, human rights groups, election observers, Eurasianist integration groups, and orthodox groups. A collection of Russian civil society organizations, such as the Federal Agency for the Commonwealth of Independent States Affairs, Compatriots Living Abroad, and International Humanitarian Cooperation, together receive at least US$100 million per year, in addition to government-organized non-governmental organizations (NGOs), at least 150 of which are funded by Russian presidential grants totaling US$70 million per year.",2019-11-search,Rand2237,,TA15 Establish Social Assets,D03,,information,C00162 - Unravel/target the Potemkin villages
C00164,Compatriot policy,M013 - targeting,"Protect the interests of this population and, more importantly, influence the population to support pro-Russia causes and effectively influence the politics of its neighbors.",2019-11-search,Rand2237,,TA02 Objective Planning,D03,,action,C00164 - Compatriot policy
C00165,Ensure integrity of official documents,M004 - friction,"e.g. for leaked legal documents, use court motions to limit future discovery actions.",2019-11-search,,I00015,TA06 Develop Content,D02,,information,C00165 - Ensure integrity of official documents
C00169,Develop a creative content hub,M010 - countermessaging,"International donors will donate to a basket fund that will pay a committee of local experts who will, in turn, manage and distribute the money to Russian-language producers and broadcasters that pitch various projects.",2019-11-search,Rand2237,,TA02 Objective Planning,D03,,action,C00169 - Develop a creative content hub
C00170,Elevate information as a critical domain of statecraft,M007 - metatechnique,"Shift from reactive to proactive response, with priority on sharing relevant information with the public and mobilizing private-sector engagement. Recent advances in data-driven technologies have elevated information as a source of power to influence the political and economic environment, to foster economic growth, to enable a decision-making advantage over competitors, and to communicate securely and quickly.",2019-11-search,Dalton19,,TA01 Strategic Planning,D03,,action,C00170 - Elevate information as a critical domain of statecraft
C00172,Social media source removal,M005 - removal,"Removing accounts, pages, groups, e.g. Facebook page removal.",2019-11-search,,I00035,TA15 Establish Social Assets,D02,,action,C00172 - Social media source removal
C00174,Create a healthier news environment,"M007 - metatechnique, M002 - diversion","Free and fair press: create a bipartisan, patriotic commitment to press freedom. Note the difference between news and editorialising. Build alternative news sources: create alternative local-language news sources to counter local-language propaganda outlets. Delegitimise the 24-hour news cycle. Includes providing an alternative to disinformation content by expanding and improving local content: develop content that can displace geopolitically-motivated narratives in the entire media environment, both new and old media alike.","2019-11-workshop, 2019-11-search","Hicks19, p143 of Corker18, Rand2237",,TA01 Strategic Planning,D02,,action,C00174 - Create a healthier news environment
C00176,Improve Coordination amongst stakeholders: public and private,M007 - metatechnique,"Coordinated disinformation challenges are increasingly multidisciplinary, and there are few organizations within the national security structures that are equipped with the broad-spectrum capability to effectively counter large-scale conflict-short-of-war tactics in real time. Institutional hurdles currently impede diverse subject matter experts, hailing from outside of the traditional national security and foreign policy disciplines (e.g. physical science, engineering, media, legal, and economics fields), from contributing to the direct development of national security countermeasures to emerging conflict-short-of-war threat vectors. A Cognitive Security Action Group (CSAG), akin to the Counterterrorism Security Group (CSG), could drive interagency alignment across equivalents of DHS, DoS, DoD, the Intelligence Community, and other implementing agencies, in areas including strategic narrative and the nexus of cyber and information operations.",2019-11-search,Dalton19,,TA01 Strategic Planning,D07,,action,C00176 - Improve Coordination amongst stakeholders: public and private
C00178,Fill information voids with non-disinformation content,"M009 - dilution, M008 - data pollution","1) Pollute the data voids with wholesome content (kittens! Baby Shark!). 2) Fill data voids with relevant information, e.g. increase Russian-language programming in areas subject to Russian disinformation.","2019-11-workshop, 2019-11-search",Rand2237,,TA05 Microtargeting,D04,,information,C00178 - Fill information voids with non-disinformation content
C00182,Redirection / malware detection / remediation,M005 - removal,"Detect redirection or malware, then quarantine or delete.",2019-11-search,Rand2237,,TA09 Exposure,D02,,action,C00182 - Redirection / malware detection / remediation
C00184,Media exposure,M003 - daylight,Highlight misinformation activities and actors in the media.,2019-11-search,,"I00010,I00015,I00032,I00044",TA08 Pump Priming,D04,,information,C00184 - Media exposure
C00188,Newsroom/Journalist training to counter influence moves,M001 - resilience,"Includes SEO influence. Includes promotion of a ""higher standard of journalism"": journalism training ""would be helpful, especially for the online community"". Includes strengthening local media: improve the effectiveness of local media outlets.",2019-11-search,Rand2237,,TA08 Pump Priming,D03,,education,C00188 - Newsroom/Journalist training to counter influence moves
C00189,Ensure that platforms are taking down flagged accounts,M003 - daylight,"Use ongoing analysis/monitoring of ""flagged"" profiles. Confirm whether platforms are actively removing flagged accounts, and raise pressure via e.g. government organizations to encourage removal.",2019-11-search,Rand2237,,TA15 Establish Social Assets,D06,,action,C00189 - Ensure that platforms are taking down flagged accounts
C00190,Open engagement with civil society,M001 - resilience,"Government open engagement with civil society as an independent check on government action and messaging. Government seeks to coordinate and synchronize narrative themes with allies and partners, while calibrating action in cases where elements in these countries may have been co-opted by competitor nations. Includes ""fight in the light"": use leadership in the arts, entertainment, and media to highlight and build on fundamental tenets of democracy.",2019-11-search,"Dalton19, Hicks19",,TA01 Strategic Planning,D03,,action,C00190 - Open engagement with civil society
C00195,Redirect searches away from disinformation or extremist content,M002 - diversion,"Use Google AdWords to identify instances in which people search Google about particular fake-news stories or propaganda themes. Includes monetizing centrist SEO by subsidizing the difference in greater clicks towards extremist content.","2019-11-workshop, 2019-11-search",Rand2237,,TA07 Channel Selection,D02,,information,C00195 - Redirect searches away from disinformation or extremist content
C00197,Remove suspicious accounts,M005 - removal,"Standard reporting for false profiles (identity issues). Includes detecting hijacked accounts and reallocating them - if possible, back to their original owners.","2019-11-search, 2019-11-workshop",,I00022,TA15 Establish Social Assets,D02,,action,C00197 - Remove suspicious accounts
C00200,Respected figure (influencer) disavows misinfo,M010 - countermessaging,FIXIT: standardise the language used for influencer/respected figure.,2019-11-search,,I00044,TA09 Exposure,D03,,information,C00200 - Respected figure (influencer) disavows misinfo
C00202,Set data 'honeytraps',M002 - diversion,Set honeytraps in content likely to be accessed for disinformation.,2019-11-search,,"I00004,I00022",TA06 Develop Content,D02,,action,C00202 - Set data 'honeytraps'
C00203,Stop offering press credentials to propaganda outlets,M004 - friction,Remove access to official press events from known misinformation actors.,2019-11-search,,I00022,TA15 Establish Social Assets,D03,,action,C00203 - Stop offering press credentials to propaganda outlets
C00205,Strong dialogue between the federal government and private sector to encourage better reporting,M007 - metatechnique,"Increase civic resilience by partnering with the business community to combat gray zone threats, and ensuring adequate reporting and enforcement mechanisms.",2019-11-search,Hicks19,,TA01 Strategic Planning,D03,,action,C00205 - Strong dialogue between the federal government and private sector to encourage better reporting
C00207,Run a competing disinformation campaign - not recommended,M013 - targeting,,2019-11-search,,I00042,TA02 Objective Planning,D07,,action,C00207 - Run a competing disinformation campaign - not recommended
C00212,Build public resilience by making civil society more vibrant,M001 - resilience,"Increase public service experience, and support wider civics and history education.",2019-11-search,Hicks19,,TA01 Strategic Planning,D03,,action,C00212 - Build public resilience by making civil society more vibrant
C00216,Use advertiser controls to stem flow of funds to bad actors,M014 - reduce resources,Prevent ad revenue going to disinformation domains.,2019-11-workshop,,,TA05 Microtargeting,D02,,action,C00216 - Use advertiser controls to stem flow of funds to bad actors
C00219,Add metadata to content that's out of the control of disinformation creators,M003 - daylight,"Steganography. Adding dates, signatures etc. to stop the issue of photo relabelling etc.",grugq,,,TA06 Develop Content,D04,,information,C00219 - Add metadata to content that's out of the control of disinformation creators
C00220,Develop a monitoring and intelligence plan,M007 - metatechnique,"Create a plan for misinformation and disinformation response, before it's needed. Include connections/contacts needed, expected countermessages etc.",Counters cleanup,,,TA01 Strategic Planning,D03,,action,C00220 - Develop a monitoring and intelligence plan
C00221,"Run a disinformation red team, and design mitigation factors",M007 - metatechnique,"Include PACE plans - Primary, Alternate, Contingency, Emergency.",Counters cleanup,,,TA01 Strategic Planning,D03,,action,"C00221 - Run a disinformation red team, and design mitigation factors"
C00222,Tabletop simulations,M007 - metatechnique,"Simulate misinformation and disinformation campaigns, and responses to them, before campaigns happen.",,,,TA02 Objective Planning,D03,,education,C00222 - Tabletop simulations
C00223,Strengthen Trust in social media platforms,M001 - resilience,"Improve trust in the misinformation responses from social media and other platforms. Examples include creating greater transparency on their actions and algorithms.",,,,TA01 Strategic Planning,D03,,action,C00223 - Strengthen Trust in social media platforms