# DISARM Counters:
<table border="1">
<tr>
<th>disarm_id</th>
<th>name</th>
<th>summary</th>
<th>metatechnique</th>
<th>tactic</th>
<th>responsetype</th>
</tr>
<tr>
<td><a href="counters/C00006.md">C00006</a></td>
<td>Charge for social media</td>
<td>Include a paid-for privacy option, e.g. pay Facebook for an option of them not collecting your personal information. There are examples of this not working, e.g. most people don't use Proton Mail etc. </td>
<td>M004 - friction</td>
<td>TA01 Strategic Planning</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00008.md">C00008</a></td>
<td>Create shared fact-checking database</td>
<td>Share fact-checking resources - tips, responses, countermessages - across response groups. Snopes is the best-known example of a fact-checking site. </td>
<td>M006 - scoring</td>
<td>TA01 Strategic Planning</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00009.md">C00009</a></td>
<td>Educate high profile influencers on best practices</td>
<td>Find online influencers. Provide training in the mechanisms of disinformation, how to spot campaigns, and/or how to contribute to responses by countermessaging, boosting information sites etc. </td>
<td>M001 - resilience</td>
<td>TA02 Objective Planning</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00010.md">C00010</a></td>
<td>Enhanced privacy regulation for social media</td>
<td>Implement stronger privacy standards, to reduce the ability to microtarget community members. </td>
<td>M004 - friction</td>
<td>TA01 Strategic Planning</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00011.md">C00011</a></td>
<td>Media literacy. Games to identify fake news</td>
<td>Create and use games to show people the mechanics of disinformation, and how to counter them. </td>
<td>M001 - resilience</td>
<td>TA02 Objective Planning</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00012.md">C00012</a></td>
<td>Platform regulation</td>
<td>Empower existing regulators to govern social media. Also covers Destroy. Includes: include the role of social media in the regulatory framework for media - the European Union created significant new regulations in 2018; the U.S. approach will need to be carefully crafted to protect First Amendment principles, create needed transparency, ensure liability, and impose costs for noncompliance. Includes: create policy that makes social media police disinformation - example: the German model, where Facebook is forced by law to police content. Includes: use fraud legislation to clean up social media.</td>
<td>M007 - metatechnique</td>
<td>TA01 Strategic Planning</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00013.md">C00013</a></td>
<td>Rating framework for news</td>
<td>This is "strategic innoculation", raising the standards of what people expect in terms of evidence when consuming news. Example: journalistic ethics, or journalistic licensing body. Include full transcripts, link source, add items. </td>
<td>M006 - scoring</td>
<td>TA01 Strategic Planning</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00014.md">C00014</a></td>
<td>Real-time updates to fact-checking database</td>
<td>Update fact-checking databases and resources in real time. Especially important for time-limited events like natural disasters. Existing examples at BuzzFeed and FEMA.</td>
<td>M006 - scoring</td>
<td>TA06 Develop Content</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00016.md">C00016</a></td>
<td>Censorship</td>
<td>Alter and/or block the publication/dissemination of information controlled by disinformation creators. Not recommended. </td>
<td>M005 - removal</td>
<td>TA01 Strategic Planning</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00017.md">C00017</a></td>
<td>Repair broken social connections</td>
<td>For example, use a media campaign to promote in-group to out-group in-person communication / activities. The technique could force a reality-check by having people talk to each other instead of reading about bogeymen. </td>
<td>M010 - countermessaging</td>
<td>TA01 Strategic Planning</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00019.md">C00019</a></td>
<td>Reduce effect of division-enablers</td>
<td>Includes promoting constructive communication by shaming division-enablers, and promoting playbooks to call out division-enablers.</td>
<td>M003 - daylight</td>
<td>TA01 Strategic Planning</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00021.md">C00021</a></td>
<td>Encourage in-person communication</td>
<td>Encourage offline communication</td>
<td>M001 - resilience</td>
<td>TA01 Strategic Planning</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00022.md">C00022</a></td>
<td>Inoculate. Positive campaign to promote feeling of safety</td>
<td>Used to counter ability-based and fear-based attacks</td>
<td>M001 - resilience</td>
<td>TA01 Strategic Planning</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00024.md">C00024</a></td>
<td>Promote healthy narratives</td>
<td>Includes promoting constructive narratives i.e. not polarising (e.g. pro-life, pro-choice, pro-USA). Includes promoting identity neutral narratives. </td>
<td>M001 - resilience</td>
<td>TA01 Strategic Planning</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00026.md">C00026</a></td>
<td>Shore up democracy based messages</td>
<td>Messages about e.g. peace, freedom. And make it sexy. Includes Deploy Information and Narrative-Building in Service of Statecraft: Promote a narrative of transparency, truthfulness, liberal values, and democracy. Implement a compelling narrative via effective mechanisms of communication. Continually reassess U.S. messages, mechanisms, and audiences over time. Counteract efforts to manipulate media, undermine free markets, and suppress political freedoms via public diplomacy</td>
<td>M010 - countermessaging</td>
<td>TA01 Strategic Planning</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00027.md">C00027</a></td>
<td>Create culture of civility</td>
<td>This is passive. Includes promoting civility as an identity that people will defend. </td>
<td>M001 - resilience</td>
<td>TA01 Strategic Planning</td>
<td>D07</td>
</tr>
<tr>
<td><a href="counters/C00028.md">C00028</a></td>
<td>Make information provenance available</td>
<td>Blockchain audit log and validation with collaborative decryption to post comments. Use blockchain technology to require collaborative validation before posts or comments are submitted.
This could be used to adjust upvote weight via a trust factor of people and organisations you trust, or other criteria.</td>
<td>M011 - verification</td>
<td>TA02 Objective Planning</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00029.md">C00029</a></td>
<td>Create fake website to issue counter narrative and counter narrative through physical merchandise</td>
<td>Create websites in disinformation voids - spaces where people are looking for known disinformation. </td>
<td>M002 - diversion</td>
<td>TA02 Objective Planning</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00030.md">C00030</a></td>
<td>Develop a compelling counter narrative (truth based)</td>
<td>Example: Reality Team.
https://www.isdglobal.org/wp-content/uploads/2016/06/Counter-narrative-Handbook_1.pdf </td>
<td>M002 - diversion</td>
<td>TA02 Objective Planning</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00031.md">C00031</a></td>
<td>Dilute the core narrative - create multiple permutations, target / amplify</td>
<td>Create competing narratives. Included "Facilitate State Propaganda" as diluting the narrative could have an effect on the pro-state narrative used by volunteers, or lower their involvement.</td>
<td>M009 - dilution</td>
<td>TA02 Objective Planning</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00032.md">C00032</a></td>
<td>Hijack content and link to truth-based info</td>
<td>Link to platform</td>
<td>M002 - diversion</td>
<td>TA06 Develop Content</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00034.md">C00034</a></td>
<td>Create more friction at account creation</td>
<td>Counters fake accounts</td>
<td>M004 - friction</td>
<td>TA03 Develop People</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00036.md">C00036</a></td>
<td>Infiltrate the in-group to discredit leaders (divide)</td>
<td>All of these would be highly affected by infiltration or false-claims of infiltration.</td>
<td>M013 - targeting</td>
<td>TA03 Develop People</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00040.md">C00040</a></td>
<td>third party verification for people</td>
<td>counters fake experts</td>
<td>M011 - verification</td>
<td>TA03 Develop People</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00042.md">C00042</a></td>
<td>Address truth contained in narratives</td>
<td>Focus on and boost truths in misinformation narratives, removing misinformation from them. </td>
<td>M010 - countermessaging</td>
<td>TA03 Develop People</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00044.md">C00044</a></td>
<td>Keep people from posting to social media immediately</td>
<td>Platforms can introduce friction to slow down activities, force a small delay between posts, or replies to posts.</td>
<td>M004 - friction</td>
<td>TA03 Develop People</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00046.md">C00046</a></td>
<td>Marginalise and discredit extremist groups</td>
<td>Reduce the credibility of extremist groups posting misinformation.</td>
<td>M013 - targeting</td>
<td>TA03 Develop People</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00047.md">C00047</a></td>
<td>Honeypot with coordinated inauthentics</td>
<td>Flood disinformation spaces with obviously fake content, to dilute core misinformation narratives in them. </td>
<td>M008 - data pollution</td>
<td>TA04 Develop Networks</td>
<td>D05</td>
</tr>
<tr>
<td><a href="counters/C00048.md">C00048</a></td>
<td>Name and Shame Influencers</td>
<td>Think about the different levels: individual vs state-sponsored account. Includes “call them out” and “name and shame”. Note that USAID operations were at a different level. Identifying social media accounts as sources of propaganda — “calling them out” — might be helpful to prevent the spread of their message to audiences that otherwise would consider them factual. USAID has been restructuring its programs to address predatory Chinese development projects and the information operations that support them. USAID's new strategy has tailored programs to counter Chinese educational exchange programs and to support free and fair elections, youth empowerment, democratic governance, and free press. USAID's Russia regional teams have also been compiling a strategy for Russia's information operations. One strong point of USAID's programming is a system of indicators and measurements for a country's vulnerability to foreign influence and information operations. Identify, monitor, and, if necessary, target externally-based nonattributed social media accounts. Impact of and Dealing with Trolls - "Chatham House has observed that trolls also sometimes function as decoys, as a way of “keeping the infantry busy” that “aims to wear down the other side” (Lough et al., 2014)." Another type of troll involves “false accounts posing as authoritative information sources on social media”.</td>
<td>M003 - daylight</td>
<td>TA03 Develop People</td>
<td>D07</td>
</tr>
<tr>
<td><a href="counters/C00051.md">C00051</a></td>
<td>Counter social engineering training</td>
<td>Includes anti-elicitation training, phishing prevention education. </td>
<td>M001 - resilience</td>
<td>TA03 Develop People</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00052.md">C00052</a></td>
<td>Infiltrate platforms</td>
<td>Detect and degrade</td>
<td>M013 - targeting</td>
<td>TA04 Develop Networks</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00053.md">C00053</a></td>
<td>Delete old accounts / Remove unused social media accounts</td>
<td>Remove, or remove access to (e.g. stop the ability to update), old social media accounts, to reduce the pool of accounts available for takeover, botnets etc. </td>
<td>M012 - cleaning</td>
<td>TA04 Develop Networks</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00056.md">C00056</a></td>
<td>Encourage people to leave social media</td>
<td>Encourage people to leave social media. We don't expect this to work.</td>
<td>M004 - friction</td>
<td>TA04 Develop Networks</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00058.md">C00058</a></td>
<td>Report crowdfunder as violator</td>
<td>Counters crowdfunding. Includes “Expose online funding as fake”. </td>
<td>M005 - removal</td>
<td>TA03 Develop People</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00059.md">C00059</a></td>
<td>Verification of project before posting fund requests</td>
<td>third-party verification of projects posting funding campaigns before those campaigns can be posted. </td>
<td>M011 - verification</td>
<td>TA04 Develop Networks</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00060.md">C00060</a></td>
<td>Legal action against for-profit engagement factories</td>
<td>Take legal action against for-profit "factories" creating misinformation. </td>
<td>M013 - targeting</td>
<td>TA02 Objective Planning</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00062.md">C00062</a></td>
<td>Free open library sources worldwide</td>
<td>Open-source libraries could be created that aid in some way for each technique. Even for Strategic Planning, some open-source frameworks such as DISARM can be created to counter the adversarial efforts.</td>
<td>M010 - countermessaging</td>
<td>TA04 Develop Networks</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00065.md">C00065</a></td>
<td>Reduce political targeting</td>
<td>Includes “ban political micro targeting” and “ban political ads”</td>
<td>M005 - removal</td>
<td>TA05 Microtargeting</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00066.md">C00066</a></td>
<td>Co-opt a hashtag and drown it out (hijack it back)</td>
<td>Flood a disinformation-related hashtag with other content. Examples include kPop stans flooding extremist hashtags with pop videos and images. </td>
<td>M009 - dilution</td>
<td>TA05 Microtargeting</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00067.md">C00067</a></td>
<td>Denigrate the recipient/ project (of online funding)</td>
<td>Reduce the credibility of groups behind misinformation-linked funding campaigns. </td>
<td>M013 - targeting</td>
<td>TA03 Develop People</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00070.md">C00070</a></td>
<td>Block access to disinformation resources</td>
<td>Resources = accounts, channels etc. Block access to platform. DDOS an attacker.
TA02*: DDOS at the critical time (i.e. the midterm-2018 elections DDOS against troll farms) to deny an adversary's time-bound objective.
T0008: A quick response to a proto-viral story will affect its ability to spread and raise questions about its legitimacy.
Hashtag: Against the platform, by drowning the hashtag.
T0046 - Search Engine Optimization: Sub-optimal website performance affects its search engine rank, which I interpret as "blocking access to a platform".</td>
<td>M005 - removal</td>
<td>TA02 Objective Planning</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00071.md">C00071</a></td>
<td>Block source of pollution</td>
<td>Block websites, accounts, groups etc connected to misinformation and other information pollution. </td>
<td>M005 - removal</td>
<td>TA06 Develop Content</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00072.md">C00072</a></td>
<td>Remove non-relevant content from special interest groups - not recommended</td>
<td>Check special-interest groups (e.g. medical, knitting) for unrelated and misinformation-linked content, and remove it. </td>
<td>M005 - removal</td>
<td>TA06 Develop Content</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00073.md">C00073</a></td>
<td>Inoculate populations through media literacy training</td>
<td>Use training to build the resilience of at-risk populations. Educate on how to handle info pollution. Push out targeted education on why it's pollution. Build cultural resistance to false content, e.g. cultural resistance to bullshit. Influence literacy training, to inoculate against “cult” recruiting. Media literacy training: leverage librarians / libraries for media literacy training. Inoculate at language. Strategic planning included as inoculating the population has strategic value. An example is the "Learn to Discern" Program, funded by the Canadian government, which operated in Ukraine from July 2015 to March 2016. The program trained 15,000 Ukrainians in safe, informed media consumption techniques, including avoiding emotional manipulation, verifying sources, identifying hate speech, verifying expert credentials, detecting censorship, and debunking news, photos, and videos. Example: the NGO Baltic Centre for Media Excellence, with some international funding, provides training to journalists in the Baltics and conducts media literacy training in the region. In addition to helping journalists avoid becoming “unwitting multipliers of misleading information,” the organization works with school teachers in the region to help them “decode media and incorporate media research into teaching.” Also recommends that authorities launch a public information campaign teaching concepts of media literacy to a mass audience; such a program will take time to develop and establish impact, so curriculum-based training is recommended. Covers detect, deny, and degrade. </td>
<td>M001 - resilience</td>
<td>TA01 Strategic Planning</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00074.md">C00074</a></td>
<td>Identify and delete or rate limit identical content</td>
<td>C00000</td>
<td>M012 - cleaning</td>
<td>TA06 Develop Content</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00075.md">C00075</a></td>
<td>normalise language</td>
<td>normalise the language around disinformation and misinformation; give people the words for artifact and effect types. </td>
<td>M010 - countermessaging</td>
<td>TA06 Develop Content</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00076.md">C00076</a></td>
<td>Prohibit images in political discourse channels</td>
<td>Make political discussion channels text-only. </td>
<td>M005 - removal</td>
<td>TA06 Develop Content</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00077.md">C00077</a></td>
<td>Active defence: run TA03 "develop people" - not recommended</td>
<td>Develop networks of communities and influencers around counter-misinformation. Match them to misinformation creators. </td>
<td>M013 - targeting</td>
<td>TA03 Develop People</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00078.md">C00078</a></td>
<td>Change Search Algorithms for Disinformation Content</td>
<td>Includes “change image search algorithms for hate groups and extremists” and “Change search algorithms for hate and extremist queries to show content sympathetic to opposite side”</td>
<td>M002 - diversion</td>
<td>TA06 Develop Content</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00080.md">C00080</a></td>
<td>Create competing narrative</td>
<td>Create counternarratives, or narratives that compete in the same spaces as misinformation narratives. Could also be degrade</td>
<td>M002 - diversion</td>
<td>TA06 Develop Content</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00081.md">C00081</a></td>
<td>Highlight flooding and noise, and explain motivations</td>
<td>Discredit by pointing out the "noise" and informing public that "flooding" is a technique of disinformation campaigns; point out intended objective of "noise"</td>
<td>M003 - daylight</td>
<td>TA06 Develop Content</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00082.md">C00082</a></td>
<td>Ground truthing as automated response to pollution</td>
<td>e.g. Reality Team's work, which adds clear information to spaces containing disinformation. Also inoculation.</td>
<td>M010 - countermessaging</td>
<td>TA06 Develop Content</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00084.md">C00084</a></td>
<td>Modify disinformation narratives, and rebroadcast them</td>
<td>Includes “poison pill recasting of message” and “steal their truths”. Many techniques involve promotion, which could be manipulated. For example, online fundraising or rallies could be advertised, through compromised or fake channels, as being associated with "far-up/down/left/right" actors. "Long Game" narratives could be recast in a similar way, with negative connotations. Can also replay technique T0003. </td>
<td>M002 - diversion</td>
<td>TA06 Develop Content</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00085.md">C00085</a></td>
<td>Mute content</td>
<td>Rate-limit disinformation content. Reduces its effects, whilst not running afoul of censorship concerns.
Online archives of content (archives of websites, social media profiles, media, copies of published advertisements; or archives of comments attributed to bad actors, as well as anonymized metadata about users who interacted with them and analysis of the effect) are useful for intelligence analysis and public transparency, but will need similar muting or tagging/shaming as associated with bad actors.</td>
<td>M003 - daylight</td>
<td>TA06 Develop Content</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00086.md">C00086</a></td>
<td>Distract from noise with addictive content</td>
<td>Example: Interject addictive links or content into discussions of disinformation materials and measure a "conversion rate" of users who engage with your content and away from the social media channel's "information bubble" around the disinformation item. Use bots to amplify and upvote the addictive content.
Note: This sounds eerily like many Reddit communities where the most upvoted comments are all jokes, preventing serious discussion from being discovered by those who filter by upvotes.</td>
<td>M002 - diversion</td>
<td>TA06 Develop Content</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00087.md">C00087</a></td>
<td>Make more noise than the disinformation</td>
<td>Examples: kPop stans, #proudboys takeover by LGBT community</td>
<td>M009 - dilution</td>
<td>TA06 Develop Content</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00090.md">C00090</a></td>
<td>Fake engagement system</td>
<td>Create honeypots for misinformation creators to engage with, and reduce the resources they have available for misinformation campaigns. </td>
<td>M002 - diversion</td>
<td>TA07 Channel Selection</td>
<td>D05</td>
</tr>
<tr>
<td><a href="counters/C00091.md">C00091</a></td>
<td>Honeypot social community</td>
<td>Set honeypots, e.g. communities, in networks likely to be used for disinformation. </td>
<td>M002 - diversion</td>
<td>TA06 Develop Content</td>
<td>D05</td>
</tr>
<tr>
<td><a href="counters/C00092.md">C00092</a></td>
<td>Establish a truth teller reputation score for influencers</td>
<td>Includes "Establish a truth teller reputation score for influencers” and “Reputation scores for social media users”. Influencers are individuals or accounts with many followers. </td>
<td>M006 - scoring</td>
<td>TA02 Objective Planning</td>
<td>D07</td>
</tr>
<tr>
<td><a href="counters/C00093.md">C00093</a></td>
<td>Influencer code of conduct</td>
<td>Establish a tailored code of conduct for individuals with many followers. Can be a platform code of conduct; can also be a community code, e.g. the Pro-Truth Pledge. </td>
<td>M001 - resilience</td>
<td>TA03 Develop People</td>
<td>D07</td>
</tr>
<tr>
<td><a href="counters/C00094.md">C00094</a></td>
<td>Force full disclosure on corporate sponsor of research</td>
<td>Accountability move: make sure research is published with its funding sources. </td>
<td>M003 - daylight</td>
<td>TA06 Develop Content</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00096.md">C00096</a></td>
<td>Strengthen institutions that are always truth tellers</td>
<td>Increase credibility, visibility, and reach of positive influencers in the information space. </td>
<td>M006 - scoring</td>
<td>TA01 Strategic Planning</td>
<td>D07</td>
</tr>
<tr>
<td><a href="counters/C00097.md">C00097</a></td>
<td>Require use of verified identities to contribute to poll or comment</td>
<td>Reduce poll flooding by only taking comments or poll entries from verified accounts. </td>
<td>M004 - friction</td>
<td>TA07 Channel Selection</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00098.md">C00098</a></td>
<td>Revocation of allowlisted or "verified" status</td>
<td>remove blue checkmarks etc from known misinformation accounts. </td>
<td>M004 - friction</td>
<td>TA07 Channel Selection</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00099.md">C00099</a></td>
<td>Strengthen verification methods</td>
<td>Improve content verification methods available to groups, individuals etc. </td>
<td>M004 - friction</td>
<td>TA07 Channel Selection</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00100.md">C00100</a></td>
<td>Hashtag jacking</td>
<td>Post large volumes of unrelated content on known misinformation hashtags </td>
<td>M002 - diversion</td>
<td>TA08 Pump Priming</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00101.md">C00101</a></td>
<td>Create friction by rate-limiting engagement</td>
<td>Create participant friction. Includes Make repeat voting hard, and throttle number of forwards. </td>
<td>M004 - friction</td>
<td>TA07 Channel Selection</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00103.md">C00103</a></td>
<td>Create a bot that engages / distracts trolls</td>
<td>This is a reactive, not an active, measure (honeypots are active). It's a platform-controlled measure.</td>
<td>M002 - diversion</td>
<td>TA07 Channel Selection</td>
<td>D05</td>
</tr>
<tr>
<td><a href="counters/C00105.md">C00105</a></td>
<td>Buy more advertising than misinformation creators</td>
<td>Shift influence and algorithms by posting more adverts into spaces than misinformation creators. </td>
<td>M009 - dilution</td>
<td>TA07 Channel Selection</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00106.md">C00106</a></td>
<td>Click-bait centrist content</td>
<td>Create emotive centrist content that gets more clicks</td>
<td>M002 - diversion</td>
<td>TA06 Develop Content</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00107.md">C00107</a></td>
<td>Content moderation</td>
<td>Includes social media content take-downs, e.g. Facebook or Twitter content take-downs</td>
<td>M006 - scoring, M005 - removal</td>
<td>TA06 Develop Content</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00109.md">C00109</a></td>
<td>Dampen Emotional Reaction</td>
<td>Reduce emotional responses to misinformation through calming messages, etc. </td>
<td>M001 - resilience</td>
<td>TA09 Exposure</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00111.md">C00111</a></td>
<td>Reduce polarisation by connecting and presenting sympathetic renditions of opposite views</td>
<td>Example: The Commons Project (BuildUp) work. </td>
<td>M001 - resilience</td>
<td>TA01 Strategic Planning</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00112.md">C00112</a></td>
<td>"Prove they are not an op!"</td>
<td>Challenge misinformation creators to prove they're not an information operation. </td>
<td>M004 - friction</td>
<td>TA08 Pump Priming</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00113.md">C00113</a></td>
<td>Debunk and defuse a fake expert / credentials.</td>
<td>Debunk fake experts, their credentials, and potentially also their audience quality</td>
<td>M003 - daylight</td>
<td>TA08 Pump Priming</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00114.md">C00114</a></td>
<td>Don't engage with payloads</td>
<td>Stop passing on misinformation</td>
<td>M004 - friction</td>
<td>TA08 Pump Priming</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00115.md">C00115</a></td>
<td>Expose actor and intentions</td>
<td>Debunk misinformation creators and posters. </td>
<td>M003 - daylight</td>
<td>TA08 Pump Priming</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00116.md">C00116</a></td>
<td>Provide proof of involvement</td>
<td>Build and post information about the involvement of groups etc. in misinformation incidents. </td>
<td>M003 - daylight</td>
<td>TA08 Pump Priming</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00117.md">C00117</a></td>
<td>Downgrade / de-amplify so message is seen by fewer people</td>
<td>Label promote counter to disinformation</td>
<td>M010 - countermessaging</td>
<td>TA08 Pump Priming</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00118.md">C00118</a></td>
<td>Repurpose images with new text</td>
<td>Add countermessage text to images used in misinformation incidents. </td>
<td>M010 - countermessaging</td>
<td>TA08 Pump Priming</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00119.md">C00119</a></td>
<td>Engage payload and debunk.</td>
<td>Debunk misinformation content. Provide links to facts. </td>
<td>M010 - countermessaging</td>
<td>TA08 Pump Priming</td>
<td>D07</td>
</tr>
<tr>
<td><a href="counters/C00120.md">C00120</a></td>
<td>Open dialogue about design of platforms to produce different outcomes</td>
<td>Redesign platforms and algorithms to reduce the effectiveness of disinformation</td>
<td>M007 - metatechnique</td>
<td>TA08 Pump Priming</td>
<td>D07</td>
</tr>
<tr>
<td><a href="counters/C00121.md">C00121</a></td>
<td>Tool transparency and literacy for channels people follow. </td>
<td>Make algorithms in platforms explainable, and visible to people using those platforms. </td>
<td>M001 - resilience</td>
<td>TA08 Pump Priming</td>
<td>D07</td>
</tr>
<tr>
<td><a href="counters/C00122.md">C00122</a></td>
<td>Content moderation</td>
<td>Beware: content moderation misused becomes censorship. </td>
<td>M004 - friction</td>
<td>TA09 Exposure</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00123.md">C00123</a></td>
<td>Remove or rate limit botnets</td>
<td>reduce the visibility of known botnets online. </td>
<td>M004 - friction</td>
<td>TA09 Exposure</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00124.md">C00124</a></td>
<td>Don't feed the trolls</td>
<td>Don't engage with individuals relaying misinformation. </td>
<td>M004 - friction</td>
<td>TA09 Exposure</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00125.md">C00125</a></td>
<td>Prebunking</td>
<td>Produce material in advance of misinformation incidents, by anticipating the narratives used in them, and debunking them. </td>
<td>M001 - resilience</td>
<td>TA09 Exposure</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00126.md">C00126</a></td>
<td>Social media amber alert</td>
<td>Create an alert system around disinformation and misinformation artifacts, narratives, and incidents </td>
<td>M003 - daylight</td>
<td>TA09 Exposure</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00128.md">C00128</a></td>
<td>Create friction by marking content with ridicule or other "decelerants"</td>
<td>Repost or comment on misinformation artifacts, using ridicule or other content to reduce the likelihood of reposting. </td>
<td>M009 - dilution</td>
<td>TA09 Exposure</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00129.md">C00129</a></td>
<td>Use banking to cut off access </td>
<td>Fiscal sanctions; parallel to counter-terrorism</td>
<td>M014 - reduce resources</td>
<td>TA09 Exposure</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00130.md">C00130</a></td>
<td>Mentorship: elders, youth, credit. Learn vicariously.</td>
<td>Train local influencers in countering misinformation. </td>
<td>M001 - resilience</td>
<td>TA05 Microtargeting</td>
<td>D07</td>
</tr>
<tr>
<td><a href="counters/C00131.md">C00131</a></td>
<td>Seize and analyse botnet servers</td>
<td>Take botnet servers offline by seizing them. </td>
<td>M005 - removal</td>
<td>TA11 Persistence</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00133.md">C00133</a></td>
<td>Deplatform Account*</td>
<td>Note: Similar to Deplatform People but less generic. Perhaps both should be left.</td>
<td>M005 - removal</td>
<td>TA03 Develop People</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00135.md">C00135</a></td>
<td>Deplatform message groups and/or message boards</td>
<td>Merged two rows here. </td>
<td>M005 - removal</td>
<td>TA04 Develop Networks</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00136.md">C00136</a></td>
<td>Microtarget most likely targets then send them countermessages</td>
<td>Find communities likely to be targeted by misinformation campaigns, and send them countermessages or pointers to information sources. </td>
<td>M010 - countermessaging</td>
<td>TA08 Pump Priming</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00138.md">C00138</a></td>
<td>Spam domestic actors with lawsuits</td>
<td>File multiple lawsuits against known misinformation creators and posters, to distract them from disinformation creation. </td>
<td>M014 - reduce resources</td>
<td>TA11 Persistence</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00139.md">C00139</a></td>
<td>Weaponise youtube content matrices</td>
<td>God knows what this is. Keeping temporarily in case we work it out. </td>
<td>M004 - friction</td>
<td>TA11 Persistence</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00140.md">C00140</a></td>
<td>"Bomb" link shorteners with lots of calls</td>
<td>Applies to most of the content used by exposure techniques except “T0055 - Use hashtag”. Applies to analytics</td>
<td>M008 - data pollution</td>
<td>TA12 Measure Effectiveness</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00142.md">C00142</a></td>
<td>Platform adds warning label and decision point when sharing content</td>
<td>Includes “this has been disproved: do you want to forward it?”. Includes a “Hey, this story is old” popup when messaging with an old URL - this assumes that this technique is based on visits to a URL shortener or a captured news site that can publish a message of our choice. Includes “mark clickbait visually”. </td>
<td>M004 - friction</td>
<td>TA06 Develop Content</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00143.md">C00143</a></td>
<td>(botnet) DMCA takedown requests to waste group time</td>
<td>Use copyright infringement claims to remove videos etc. </td>
<td>M013 - targeting</td>
<td>TA11 Persistence</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00144.md">C00144</a></td>
<td>Buy out troll farm employees / offer them jobs</td>
<td>Degrade the infrastructure. Could e.g. pay to not act for 30 days. Not recommended</td>
<td>M014 - reduce resources</td>
<td>TA02 Objective Planning</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00147.md">C00147</a></td>
<td>Make amplification of social media posts expire (e.g. can't like/ retweet after n days)</td>
<td>Stop new community activity (likes, comments) on old social media posts. </td>
<td>M004 - friction</td>
<td>TA09 Exposure</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00148.md">C00148</a></td>
<td>Add random links to network graphs</td>
<td>If creators are using network analysis to determine how to attack networks, then adding random extra links to those networks might throw that analysis out enough to change attack outcomes. Unsure which DISARM techniques.</td>
<td>M008 - data pollution</td>
<td>TA12 Measure Effectiveness</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00149.md">C00149</a></td>
<td>Poison the monitoring & evaluation data</td>
<td>Includes Pollute the AB-testing data feeds: Polluting A/B testing requires knowledge of MOEs and MOPs. A/B testing must be caught early when there is relatively little data available so infiltration of TAs and understanding of how content is migrated from testing to larger audiences is fundamental.</td>
<td>M008 - data pollution</td>
<td>TA12 Measure Effectiveness</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00153.md">C00153</a></td>
<td>Take pre-emptive action against actors' infrastructure</td>
<td>Align offensive cyber action with information operations and counter disinformation approaches, where appropriate.</td>
<td>M013 - targeting</td>
<td>TA01 Strategic Planning</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00154.md">C00154</a></td>
<td>Ask media not to report false information</td>
<td>Train media to spot and respond to misinformation, and ask them not to post or transmit misinformation they've found. </td>
<td>M005 - removal</td>
<td>TA08 Pump Priming</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00155.md">C00155</a></td>
<td>Ban incident actors from funding sites</td>
<td>Ban misinformation creators and posters from funding sites</td>
<td>M005 - removal</td>
<td>TA03 Develop People</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00156.md">C00156</a></td>
<td>Better tell your country or organization story</td>
<td>Civil engagement activities conducted on the part of EFP forces. In Latvia, for example, U.S. soldiers have reportedly conducted numerous civil engagements with the local populations. In one example, soldiers cut firewood for local Russian-speaking Latvians. Locals were reportedly overheard saying, “A Russian soldier wouldn't do that.” NATO should likewise provide support and training, where needed, to local public affairs and other communication personnel. Local government and military public affairs personnel can play their part in creating and disseminating entertaining and sharable content that supports the EFP mission. </td>
<td>M010 - countermessaging</td>
<td>TA02 Objective Planning</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00159.md">C00159</a></td>
<td>Have a disinformation response plan</td>
<td>e.g. Create a campaign plan and toolkit for competition short of armed conflict (this used to be called “the grey zone”). The campaign plan should account for own vulnerabilities and strengths, and not over-rely on any one tool of statecraft or line of effort. It will identify and employ a broad spectrum of national power to deter, compete, and counter (where necessary) other countries' approaches, and will include understanding of own capabilities, capabilities of disinformation creators, and international standards of conduct to compete in, shrink the size of, and ultimately deter use of competition short of armed conflict.</td>
<td>M007 - metatechnique</td>
<td>TA01 Strategic Planning</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00160.md">C00160</a></td>
<td>find and train influencers</td>
<td>Identify key influencers (e.g. use network analysis), then reach out to identified users and offer support, through either training or resources.</td>
<td>M001 - resilience</td>
<td>TA03 Develop People</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00161.md">C00161</a></td>
<td>Coalition Building with stakeholders and Third-Party Inducements</td>
<td>Advance coalitions across borders and sectors, spanning public and private, as well as foreign and domestic, divides. Improve mechanisms to collaborate, share information, and develop coordinated approaches with the private sector at home and allies and partners abroad.</td>
<td>M007 - metatechnique</td>
<td>TA01 Strategic Planning</td>
<td>D07</td>
</tr>
<tr>
<td><a href="counters/C00162.md">C00162</a></td>
<td>Unravel/target the Potemkin villages</td>
<td>The Kremlin's narrative spin extends through constellations of “civil society” organizations, political parties, churches, and other actors. Moscow leverages think tanks, human rights groups, election observers, Eurasianist integration groups, and orthodox groups. A collection of Russian civil society organizations, such as the Federal Agency for the Commonwealth of Independent States Affairs, Compatriots Living Abroad, and International Humanitarian Cooperation, together receive at least US$100 million per year, in addition to government-organized nongovernmental organizations (NGOs), at least 150 of which are funded by Russian presidential grants totaling US$70 million per year.</td>
<td>M013 - targeting</td>
<td>TA04 Develop Networks</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00164.md">C00164</a></td>
<td>compatriot policy</td>
<td>protect the interests of this population and, more importantly, influence the population to support pro-Russia causes and effectively influence the politics of its neighbors</td>
<td>M013 - targeting</td>
<td>TA02 Objective Planning</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00165.md">C00165</a></td>
<td>Ensure integrity of official documents</td>
<td>e.g. for leaked legal documents, use court motions to limit future discovery actions</td>
<td>M004 - friction</td>
<td>TA06 Develop Content</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00169.md">C00169</a></td>
<td>develop a creative content hub</td>
<td>international donors will donate to a basket fund that will pay a committee of local experts who will, in turn, manage and distribute the money to Russian-language producers and broadcasters that pitch various projects.</td>
<td>M010 - countermessaging</td>
<td>TA02 Objective Planning</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00170.md">C00170</a></td>
<td>elevate information as a critical domain of statecraft</td>
<td>Shift from reactive to proactive response, with priority on sharing relevant information with the public and mobilizing private-sector engagement. Recent advances in data-driven technologies have elevated information as a source of power to influence the political and economic environment, to foster economic growth, to enable a decision-making advantage over competitors, and to communicate securely and quickly.</td>
<td>M007 - metatechnique</td>
<td>TA01 Strategic Planning</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00172.md">C00172</a></td>
<td>social media source removal</td>
<td>Removing accounts, pages, groups, e.g. facebook page removal</td>
<td>M005 - removal</td>
<td>TA04 Develop Networks</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00174.md">C00174</a></td>
<td>Create a healthier news environment</td>
<td>Free and fair press: create bipartisan, patriotic commitment to press freedom. Note the difference between news and editorialising. Build alternative news sources: create alternative local-language news sources to counter local-language propaganda outlets. Delegitimize the 24-hour news cycle. Includes: provide an alternative to disinformation content by expanding and improving local content: develop content that can displace geopolitically-motivated narratives in the entire media environment, both new and old media alike.</td>
<td>M007 - metatechnique, M002 - diversion</td>
<td>TA01 Strategic Planning</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00176.md">C00176</a></td>
<td>Improve Coordination amongst stakeholders: public and private</td>
<td>Coordinated disinformation challenges are increasingly multidisciplinary, yet there are few organizations within the national security structures that are equipped with the broad-spectrum capability to effectively counter large-scale conflict-short-of-war tactics in real time. Institutional hurdles currently impede diverse subject matter experts, hailing from outside of the traditional national security and foreign policy disciplines (e.g., physical science, engineering, media, legal, and economics fields), from contributing to the direct development of national security countermeasures to emerging conflict-short-of-war threat vectors. A Cognitive Security Action Group (CSAG), akin to the Counterterrorism Security Group (CSG), could drive interagency alignment across equivalents of DHS, DoS, DoD, the Intelligence Community, and other implementing agencies, in areas including strategic narrative, and the nexus of cyber and information operations. </td>
<td>M007 - metatechnique</td>
<td>TA01 Strategic Planning</td>
<td>D07</td>
</tr>
<tr>
<td><a href="counters/C00178.md">C00178</a></td>
<td>Fill information voids with non-disinformation content</td>
<td>1) Pollute the data voids with wholesome content (Kittens! Babyshark!). 2) Fill data voids with relevant information, e.g. increase Russian-language programming in areas subject to Russian disinformation. Examples include using Current Time videos (viewed 40 million times online) to draw viewers away from Russian TV programming in RT and Sputnik. Potential content for this includes conventional entertainment programming (source: The Economist, “America's Answer to Russian Propaganda TV,” 2017).</td>
<td>M009 - dilution, M008 - data pollution</td>
<td>TA05 Microtargeting</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00182.md">C00182</a></td>
<td>Redirection / malware detection/ remediation</td>
<td>Detect redirection or malware, then quarantine or delete. Example: (2015) Trustwave reported that a Bedep Trojan malware kit had begun infecting machines and forcing them to browse certain sites, artificially inflating traffic to a set of pro-Russia sites.</td>
<td>M005 - removal</td>
<td>TA09 Exposure</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00184.md">C00184</a></td>
<td>Media exposure</td>
<td>highlight misinformation activities and actors in media</td>
<td>M003 - daylight</td>
<td>TA08 Pump Priming</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00188.md">C00188</a></td>
<td>Newsroom/Journalist training to counter influence moves</td>
<td>Includes SEO influence. In August 2014, Russian news agency Rossiya Segodnya commissioned a poll in France with poorly worded questions and a statistically insignificant subsample that RT used to back a story titled “15% of French people back ISIS [Islamic State of Iraq and Syria] militants, poll finds.” The story and summary infographic circulated on the internet, initially appearing primarily on French sites. After a week, the generally respectable digital U.S. news outlet Vox ran the story, now titled “One in Six French People Say They Support ISIS.” Although this effect has now worn off or been overwritten, for a time—despite a later story from The Washington Post debunking the claim—typing “ISIS France” into Google resulted in an autosuggestion of “ISIS France support” (Borthwick, 2015). Includes promotion of a “higher standard of journalism”: journalism training “would be helpful, especially for the online community”. Includes Strengthen local media: Improve effectiveness of local media outlets. Using eastern Latvia media outlets as an example, one expert noted that the media outlets are “very weak,” are often politically affiliated, or have “little local oligarchs that control them.”</td>
<td>M001 - resilience</td>
<td>TA08 Pump Priming</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00189.md">C00189</a></td>
<td>Ensure that platforms are taking down flagged accounts</td>
<td>Use ongoing analysis/monitoring of "flagged" profiles. Confirm whether platforms are actively removing flagged accounts, and raise pressure via e.g. government organizations to encourage removal</td>
<td>M003 - daylight</td>
<td>TA03 Develop People</td>
<td>D06</td>
</tr>
<tr>
<td><a href="counters/C00190.md">C00190</a></td>
<td>open engagement with civil society</td>
<td>Government open engagement with civil society as an independent check on government action and messaging. Government seeks to coordinate and synchronize narrative themes with allies and partners while calibrating action in cases where elements in these countries may have been co-opted by competitor nations. Includes “fight in the light”: Use leadership in the arts, entertainment, and media to highlight and build on fundamental tenets of democracy.</td>
<td>M001 - resilience</td>
<td>TA01 Strategic Planning</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00195.md">C00195</a></td>
<td>Redirect searches away from disinformation or extremist content </td>
<td>Use Google AdWords to identify instances in which people search Google about particular fake-news stories or Russian propaganda themes. Taking advantage of the technology behind Google AdWords, this method identifies potential ISIS recruits through their Google searches and exposes them to curated YouTube videos debunking ISIS recruiting themes. Apply this method to Russian propaganda. Includes Monetize centrist SEO by subsidizing the difference in greater clicks towards extremist content. </td>
<td>M002 - diversion</td>
<td>TA07 Channel Selection</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00197.md">C00197</a></td>
<td>remove suspicious accounts</td>
<td>Standard reporting for false profiles (identity issues). Includes detecting hijacked accounts and reallocating them - if possible, back to original owners. </td>
<td>M005 - removal</td>
<td>TA03 Develop People</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00200.md">C00200</a></td>
<td>Respected figure (influencer) disavows misinfo</td>
<td>Has been done in e.g. India. FIXIT: standardize language used for influencer/ respected figure. </td>
<td>M010 - countermessaging</td>
<td>TA09 Exposure</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00202.md">C00202</a></td>
<td>Set data 'honeytraps'</td>
<td>Set honeytraps in content likely to be accessed for disinformation. NB Macron election team modified docs to spike a hack and leak. </td>
<td>M002 - diversion</td>
<td>TA06 Develop Content</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00203.md">C00203</a></td>
<td>Stop offering press credentials to propaganda outlets</td>
<td>Remove access to official press events from known misinformation actors. </td>
<td>M004 - friction</td>
<td>TA04 Develop Networks</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00205.md">C00205</a></td>
<td>strong dialogue between the federal government and private sector to encourage better reporting</td>
<td>Increase civic resilience by partnering with business community to combat gray zone threats and ensuring adequate reporting and enforcement mechanisms. </td>
<td>M007 - metatechnique</td>
<td>TA01 Strategic Planning</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00207.md">C00207</a></td>
<td>Run a competing disinformation campaign - not recommended</td>
<td>E.g. Saudi tit-for-tat campaign </td>
<td>M013 - targeting</td>
<td>TA02 Objective Planning</td>
<td>D07</td>
</tr>
<tr>
<td><a href="counters/C00211.md">C00211</a></td>
<td>Use humorous counter-narratives</td>
<td>Examples:
* Baltic Elves. https://balkaninsight.com/2019/06/07/disinformation-nation-the-slovaks-fighting-in-defence-of-facts/
* Taiwan “humor over rumor” strategy. https://www.theguardian.com/commentisfree/2021/feb/17/humour-over-rumour-taiwan-fake-news
* Taiwan “humor over rumour”. https://internews.org/story/using-comedy-and-social-media-educate-disinformation </td>
<td>M010 - countermessaging</td>
<td>TA09 Exposure</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00212.md">C00212</a></td>
<td>build public resilience by making civil society more vibrant</td>
<td>Increase public service experience, and support wider civics and history education.</td>
<td>M001 - resilience</td>
<td>TA01 Strategic Planning</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00216.md">C00216</a></td>
<td>Use advertiser controls to stem flow of funds to bad actors</td>
<td>Prevent ad revenue going to disinformation domains</td>
<td>M014 - reduce resources</td>
<td>TA05 Microtargeting</td>
<td>D02</td>
</tr>
<tr>
<td><a href="counters/C00219.md">C00219</a></td>
<td>Add metadata to content that's out of the control of disinformation creators</td>
<td>Steganography. Adding dates, signatures etc. to stop the issue of photo relabelling etc. </td>
<td>M003 - daylight</td>
<td>TA06 Develop Content</td>
<td>D04</td>
</tr>
<tr>
<td><a href="counters/C00220.md">C00220</a></td>
<td>Develop a monitoring and intelligence plan</td>
<td>Create a plan for misinformation and disinformation response, before it's needed. Include connections / contacts needed, expected countermessages etc. </td>
<td>M007 - metatechnique</td>
<td>TA01 Strategic Planning</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00221.md">C00221</a></td>
<td>Run a disinformation red team, and design mitigation factors</td>
<td>Include PACE plans - Primary, Alternate, Contingency, Emergency</td>
<td>M007 - metatechnique</td>
<td>TA01 Strategic Planning</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00222.md">C00222</a></td>
<td>Tabletop simulations</td>
<td>Simulate misinformation and disinformation campaigns, and responses to them, before campaigns happen. </td>
<td>M007 - metatechnique</td>
<td>TA02 Objective Planning</td>
<td>D03</td>
</tr>
<tr>
<td><a href="counters/C00223.md">C00223</a></td>
<td>Strengthen Trust in social media platforms</td>
<td>Improve trust in the misinformation responses from social media and other platforms. Examples include creating greater transparency on their actions and algorithms. </td>
<td>M001 - resilience</td>
<td>TA01 Strategic Planning</td>
<td>D03</td>
</tr>
</table>