# DISARM Blue framework: Latest Framework
The DISARM Blue framework catalogues defensive countermeasures against disinformation, indexed against the following tactic stages:

## Tactics

- TA01 Plan Strategy
- TA02 Plan Objectives
- TA05 Microtarget
- TA06 Develop Content
- TA07 Select Channels and Affordances
- TA08 Conduct Pump Priming
- TA09 Deliver Content
- TA10 Drive Offline Activity
- TA11 Persist in the Information Environment
- TA12 Assess Effectiveness
- TA13 Target Audience Analysis
- TA14 Develop Narratives
- TA15 Establish Social Assets
- TA16 Establish Legitimacy
- TA17 Maximize Exposure
- TA18 Drive Online Harms
## Countermeasures

Countermeasures are listed by ID:

- C00006 Charge for social media
- C00008 Create shared fact-checking database
- C00009 Educate high profile influencers on best practices
- C00010 Enhanced privacy regulation for social media
- C00011 Media literacy. Games to identify fake news
- C00012 Platform regulation
- C00013 Rating framework for news
- C00014 Real-time updates to fact-checking database
- C00016 Censorship
- C00017 Repair broken social connections
- C00019 Reduce effect of division-enablers
- C00021 Encourage in-person communication
- C00022 Innoculate. Positive campaign to promote feeling of safety
- C00024 Promote healthy narratives
- C00026 Shore up democracy based messages
- C00027 Create culture of civility
- C00028 Make information provenance available
- C00029 Create fake website to issue counter narrative and counter narrative through physical merchandise
- C00030 Develop a compelling counter narrative (truth based)
- C00031 Dilute the core narrative - create multiple permutations, target / amplify
- C00032 Hijack content and link to truth-based info
- C00034 Create more friction at account creation
- C00036 Infiltrate the in-group to discredit leaders (divide)
- C00040 Third party verification for people
- C00042 Address truth contained in narratives
- C00044 Keep people from posting to social media immediately
- C00046 Marginalise and discredit extremist groups
- C00047 Honeypot with coordinated inauthentics
- C00048 Name and Shame Influencers
- C00051 Counter social engineering training
- C00052 Infiltrate platforms
- C00053 Delete old accounts / Remove unused social media accounts
- C00056 Encourage people to leave social media
- C00058 Report crowdfunder as violator
- C00059 Verification of project before posting fund requests
- C00060 Legal action against for-profit engagement factories
- C00062 Free open library sources worldwide
- C00065 Reduce political targeting
- C00066 Co-opt a hashtag and drown it out (hijack it back)
- C00067 Denigrate the recipient/project (of online funding)
- C00070 Block access to disinformation resources
- C00071 Block source of pollution
- C00072 Remove non-relevant content from special interest groups - not recommended
- C00073 Inoculate populations through media literacy training
- C00074 Identify and delete or rate limit identical content
- C00075 Normalise language
- C00076 Prohibit images in political discourse channels
- C00077 Active defence: run TA15 "develop people" - not recommended
- C00078 Change Search Algorithms for Disinformation Content
- C00080 Create competing narrative
- C00081 Highlight flooding and noise, and explain motivations
- C00082 Ground truthing as automated response to pollution
- C00084 Modify disinformation narratives, and rebroadcast them
- C00085 Mute content
- C00086 Distract from noise with addictive content
- C00087 Make more noise than the disinformation
- C00090 Fake engagement system
- C00091 Honeypot social community
- C00092 Establish a truth teller reputation score for influencers
- C00093 Influencer code of conduct
- C00094 Force full disclosure on corporate sponsor of research
- C00096 Strengthen institutions that are always truth tellers
- C00097 Require use of verified identities to contribute to poll or comment
- C00098 Revocation of allowlisted or "verified" status
- C00099 Strengthen verification methods
- C00100 Hashtag jacking
- C00101 Create friction by rate-limiting engagement
- C00103 Create a bot that engages / distract trolls
- C00105 Buy more advertising than misinformation creators
- C00106 Click-bait centrist content
- C00107 Content moderation
- C00109 Dampen Emotional Reaction
- C00111 Reduce polarisation by connecting and presenting sympathetic renditions of opposite views
- C00112 "Prove they are not an op!"
- C00113 Debunk and defuse a fake expert / credentials
- C00114 Don't engage with payloads
- C00115 Expose actor and intentions
- C00116 Provide proof of involvement
- C00117 Downgrade / de-amplify so message is seen by fewer people
- C00118 Repurpose images with new text
- C00119 Engage payload and debunk
- C00120 Open dialogue about design of platforms to produce different outcomes
- C00121 Tool transparency and literacy for channels people follow
- C00122 Content moderation
- C00123 Remove or rate limit botnets
- C00124 Don't feed the trolls
- C00125 Prebunking
- C00126 Social media amber alert
- C00128 Create friction by marking content with ridicule or other "decelerants"
- C00129 Use banking to cut off access
- C00130 Mentorship: elders, youth, credit. Learn vicariously.
- C00131 Seize and analyse botnet servers
- C00133 Deplatform Account*
- C00135 Deplatform message groups and/or message boards
- C00136 Microtarget most likely targets then send them countermessages
- C00138 Spam domestic actors with lawsuits
- C00139 Weaponise youtube content matrices
- C00140 "Bomb" link shorteners with lots of calls
- C00142 Platform adds warning label and decision point when sharing content
- C00143 (botnet) DMCA takedown requests to waste group time
- C00144 Buy out troll farm employees / offer them jobs
- C00147 Make amplification of social media posts expire (e.g. can't like/retweet after n days)
- C00148 Add random links to network graphs
- C00149 Poison the monitoring & evaluation data
- C00153 Take pre-emptive action against actors' infrastructure
- C00154 Ask media not to report false information
- C00155 Ban incident actors from funding sites
- C00156 Better tell your country or organization story
- C00159 Have a disinformation response plan
- C00160 Find and train influencers
- C00161 Coalition Building with stakeholders and Third-Party Inducements
- C00162 Unravel/target the Potemkin villages
- C00164 Compatriot policy
- C00165 Ensure integrity of official documents
- C00169 Develop a creative content hub
- C00170 Elevate information as a critical domain of statecraft
- C00172 Social media source removal
- C00174 Create a healthier news environment
- C00176 Improve Coordination amongst stakeholders: public and private
- C00178 Fill information voids with non-disinformation content
- C00182 Redirection / malware detection/remediation
- C00184 Media exposure
- C00188 Newsroom/Journalist training to counter influence moves
- C00189 Ensure that platforms are taking down flagged accounts
- C00190 Open engagement with civil society
- C00195 Redirect searches away from disinformation or extremist content
- C00197 Remove suspicious accounts
- C00200 Respected figure (influencer) disavows misinfo
- C00202 Set data 'honeytraps'
- C00203 Stop offering press credentials to propaganda outlets
- C00205 Strong dialogue between the federal government and private sector to encourage better reporting
- C00207 Run a competing disinformation campaign - not recommended
- C00211 Use humorous counter-narratives
- C00212 Build public resilience by making civil society more vibrant
- C00216 Use advertiser controls to stem flow of funds to bad actors
- C00219 Add metadata to content that's out of the control of disinformation creators
- C00220 Develop a monitoring and intelligence plan
- C00221 Run a disinformation red team, and design mitigation factors
- C00222 Tabletop simulations
- C00223 Strengthen Trust in social media platforms
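For teams that want to query the catalogue programmatically, the ID-to-name pairs above can be held in a simple mapping. The sketch below is a minimal illustration, not an official DISARM data model: the `COUNTERMEASURES` dict holds only a small subset of the full catalogue, and the `lookup`/`search` helpers are hypothetical names chosen for this example.

```python
# Minimal sketch: a subset of the DISARM Blue countermeasure catalogue
# as an ID -> name map. IDs and names are taken from the list above;
# the full catalogue contains many more entries.
COUNTERMEASURES = {
    "C00008": "Create shared fact-checking database",
    "C00073": "Inoculate populations through media literacy training",
    "C00125": "Prebunking",
    "C00133": "Deplatform Account*",
    "C00220": "Develop a monitoring and intelligence plan",
}


def lookup(cm_id: str) -> str:
    """Return the countermeasure name for an ID (raises KeyError if unknown)."""
    return COUNTERMEASURES[cm_id]


def search(keyword: str) -> list[str]:
    """Return the IDs whose names contain the keyword, case-insensitively."""
    kw = keyword.lower()
    return [cid for cid, name in COUNTERMEASURES.items() if kw in name.lower()]


if __name__ == "__main__":
    print(lookup("C00125"))         # Prebunking
    print(search("fact-checking"))  # ['C00008']
```

A flat mapping like this is enough for lookups and keyword filtering; a fuller representation would also carry each countermeasure's tactic mapping and metatechnique, which are part of the framework but not recoverable from this listing alone.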