-Include a paid-for privacy option, e.g. pay Facebook for an option of them not collecting your personal information. There are examples of this not working, e.g. most people don’t use proton mail etc.
+Include a paid-for privacy option, e.g. pay Facebook for an option of them not collecting your personal information. There are examples of this not working, e.g. most people don’t use Proton Mail etc.
M004 - friction
Educate high profile influencers on best practices
-Find online influencers. Provide training in the mechanisms of disinformation, how to spot campaigns, and/or how to contribute to responses by countermessaging, boosting information sites etc.
+Find online influencers. Provide training in the mechanisms of disinformation, how to spot campaigns, and/or how to contribute to responses by countermessaging, boosting information sites etc.
M001 - resilience
-Empower existing regulators to govern social media. Also covers Destroy. Includes: Include the role of social media in the regulatory framework for media. The U.S. approach will need to be carefully crafted to protect First Amendment principles, create needed transparency, ensure liability, and impose costs for noncompliance. Includes Create policy that makes social media police disinformation. Includes: Use fraud legislation to clean up social media
+Empower existing regulators to govern social media. Also covers Destroy. Includes: Include the role of social media in the regulatory framework for media. The U.S. approach will need to be carefully crafted to protect First Amendment principles, create needed transparency, ensure liability, and impose costs for noncompliance. Includes: Create policy that makes social media police disinformation. Includes: Use fraud legislation to clean up social media.
M007 - metatechnique
This is "strategic innoculation", raising the standards of what people expect in terms of evidence when consuming news. Example: journalistic ethics, or journalistic licensing body. Include full transcripts, link source, add items.
-
M006 - scoring
+
This is "strategic innoculation", raising the standards of what people expect in terms of evidence when consuming news. Example: journalistic ethics, or journalistic licencing body. Include full transcripts, link source, add items.
-For example, use a media campaign to promote in-group to out-group in person communication / activities . Technique could be in terms of forcing a reality-check by talking to people instead of reading about bogeymen.
+For example, use a media campaign to promote in-group to out-group in-person communication/activities. Technique could be in terms of forcing a reality-check by talking to people instead of reading about bogeymen.
M010 - countermessaging
Messages about e.g. peace, freedom. And make it sexy. Includes Deploy Information and Narrative-Building in Service of Statecraft: Promote a narrative of transparency, truthfulness, liberal values, and democracy. Implement a compelling narrative via effective mechanisms of communication. Continually reassess messages, mechanisms, and audiences over time. Counteract efforts to manipulate media, undermine free markets, and suppress political freedoms via public diplomacy
-Blockchain audit log and validation with collaborative decryption to post comments. Use blockchain technology to require collaborative validation before posts or comments are submitted.
-
-This could be used to adjust upvote weight via a trust factor of people and organisations you trust, or other criteria.
-
+Blockchain audit log and validation with collaborative decryption to post comments. Use blockchain technology to require collaborative validation before posts or comments are submitted. This could be used to adjust upvote weight via a trust factor of people and organisations you trust, or other criteria.
M011 - verification
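A minimal sketch of the two mechanisms this counter combines, assuming a simple per-account trust table rather than a real blockchain or collaborative-decryption scheme: a hash-chained audit log (so earlier entries cannot be silently edited) plus trust-weighted upvote scoring. All identifiers and trust values are illustrative.

```python
import hashlib
import json

def append_entry(log, event):
    """Append an event to a hash-chained audit log; tampering with any
    earlier entry changes every later hash."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    payload = json.dumps(event, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    log.append({"event": event, "prev": prev_hash, "hash": entry_hash})
    return log

def weighted_score(upvotes, trust):
    """Sum of upvotes, each weighted by how much we trust the voter
    (0.1 assumed for unknown accounts)."""
    return sum(trust.get(voter, 0.1) for voter in upvotes)

log = []
append_entry(log, {"action": "post", "comment_id": "c1", "author": "alice"})
append_entry(log, {"action": "upvote", "comment_id": "c1", "voter": "bob"})

trust = {"bob": 1.0, "carol": 0.9, "sockpuppet42": 0.0}
print(weighted_score(["bob", "carol", "sockpuppet42"], trust))  # 1.9
```

A production system would replace the hard-coded trust dict with attestations from the people and organisations each reader has chosen to trust.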
Create competing narratives. Included "Facilitate State Propaganda" as diluting the narrative could have an effect on the pro-state narrative used by volunteers, or lower their involvement.
-M009 - dilution
+M009 - Dilution
TA02 Objective Planning
D03
@@ -175,7 +173,7 @@ This could be used to adjust upvote weight via a trust factor of people and orga
-Think about the different levels: individual vs state-sponsored account. Includes “call them out” and “name and shame”. Identify social media accounts as sources of propaganda—“calling them out”— might be helpful to prevent the spread of their message to audiences that otherwise would consider them factual. Identify, monitor, and, if necessary, target externally-based nonattributed social media accounts. Impact of and Dealing with Trolls - "Chatham House has observed that trolls also sometimes function as decoys, as a way of “keeping the infantry busy” that “aims to wear down the other side” (Lough et al., 2014). Another type of troll involves “false accounts posing as authoritative information sources on social media”.
+Think about the different levels: individual vs state-sponsored account. Includes “call them out” and “name and shame”. Identifying social media accounts as sources of propaganda—“calling them out”—might be helpful to prevent the spread of their message to audiences that otherwise would consider them factual. Identify, monitor, and, if necessary, target externally-based nonattributed social media accounts. Impact of and Dealing with Trolls - "Chatham House has observed that trolls also sometimes function as decoys, as a way of “keeping the infantry busy” that “aims to wear down the other side” (Lough et al., 2014)." Another type of troll involves “false accounts posing as authoritative information sources on social media”.
M003 - daylight
Delete old accounts / Remove unused social media accounts
-remove or remove access to (e.g. stop the ability to update) old social media accounts, to reduce the pool of accounts available for takeover, botnets etc.
+Remove, or remove access to (e.g. stop the ability to update), old social media accounts, to reduce the pool of accounts available for takeover, botnets etc.
M012 - cleaning
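A sketch of how this cleanup could be automated, assuming a hypothetical export of (account, last-active timestamp) pairs; the 180-day dormancy window is an arbitrary assumption.

```python
from datetime import datetime, timedelta

DORMANCY = timedelta(days=180)  # assumed cutoff, tune per platform

def dormant_accounts(accounts, now=None):
    """accounts: iterable of (account_id, last_active datetime).
    Returns ids that have been inactive longer than the window."""
    now = now or datetime.utcnow()
    return [acct for acct, last_active in accounts
            if now - last_active > DORMANCY]

accounts = [
    ("alice", datetime(2024, 5, 1)),
    ("old_bot", datetime(2021, 1, 1)),
]
print(dormant_accounts(accounts, now=datetime(2024, 6, 1)))  # ['old_bot']
```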
Open-source libraries could be created that aid in some way for each technique. Even for Strategic Planning, some open-source frameworks such as DISARM can be created to counter the adversarial efforts.
-M010 - countermessaging
+M010 - Countermessaging
TA15 Establish Social Assets
D04
@@ -311,63 +309,55 @@ This could be used to adjust upvote weight via a trust factor of people and orga
-Resources = accounts, channels etc. Block access to platform. DDOS an attacker.
-
-TA02*: DDOS at the critical time, to deny an adversary's time-bound objective.
-
-T0008: A quick response to a proto-viral story will affect it's ability to spread and raise questions about their legitimacy.
-
-Hashtag: Against the platform, by drowning the hashtag.
-
-T0046 - Search Engine Optimization: Sub-optimal website performance affect its search engine rank, which I interpret as "blocking access to a platform".
+Resources = accounts, channels etc. Block access to platform. DDOS an attacker. TA02*: DDOS at the critical time, to deny an adversary's time-bound objective. T0008: A quick response to a proto-viral story will affect its ability to spread and raise questions about its legitimacy. Hashtag: Against the platform, by drowning the hashtag. T0046 - Search Engine Optimisation: Sub-optimal website performance affects its search engine rank, which I interpret as "blocking access to a platform".
M005 - removal
Inoculate populations through media literacy training
-Use training to build the resilience of at-risk populations. Educate on how to handle info pollution. Push out targeted education on why it's pollution. Build cultural resistance to false content, e.g. cultural resistance to bullshit. Influence literacy training, to inoculate against “cult” recruiting. Media literacy training: leverage librarians / library for media literacy training. Inoculate at language. Strategic planning included as inoculating population has strategic value. Concepts of media literacy to a mass audience that authorities launch a public information campaign that teaches the program will take time to develop and establish impact, recommends curriculum-based training. Covers detect, deny, and degrade.
-M001 - resilience
+Use training to build the resilience of at-risk populations. Educate on how to handle info pollution. Push out targeted education on why it's pollution. Build cultural resistance to false content, e.g. cultural resistance to bullshit. Influence literacy training, to inoculate against “cult” recruiting. Media literacy training: leverage librarians / library for media literacy training. Inoculate at language. Strategic planning included as inoculating population has strategic value. Teaching concepts of media literacy to a mass audience via an authority-launched public information campaign will take time to develop and establish impact; curriculum-based training is recommended. Covers detect, deny, and degrade.
+M001 - Resilience
TA01 Strategic Planning
D02
@@ -375,31 +365,31 @@ T0046 - Search Engine Optimization: Sub-optimal website performance affect its s
Change Search Algorithms for Disinformation Content
Includes “change image search algorithms for hate groups and extremists” and “Change search algorithms for hate and extremist queries to show content sympathetic to opposite side”
Highlight flooding and noise, and explain motivations
Discredit by pointing out the "noise" and informing public that "flooding" is a technique of disinformation campaigns; point out intended objective of "noise"
-M003 - daylight
+M003 - Daylight
TA06 Develop Content
D03
@@ -431,33 +421,31 @@ T0046 - Search Engine Optimization: Sub-optimal website performance affect its s
Modify disinformation narratives, and rebroadcast them
-Includes “poison pill recasting of message” and “steal their truths”. Many techniques involve promotion which could be manipulated. For example, online fundings or rallies could be advertised, through compromised or fake channels, as being associated with "far-up/down/left/right" actors. "Long Game" narratives could be subjected in a similar way with negative connotations. Can also replay technique T0003.
+Includes “poison pill recasting of message” and “steal their truths”. Many techniques involve promotion which could be manipulated. For example, online fundraising or rallies could be advertised, through compromised or fake channels, as being associated with "far-up/down/left/right" actors. "Long Game" narratives could be recast in a similar way with negative connotations. Can also replay technique T0003.
M002 - diversion
-Rate-limit disinformation content. Reduces its effects, whilst not running afoul of censorship concerns.
-
-Online archives of content (archives of websites, social media profiles, media, copies of published advertisements; or archives of comments attributed to bad actors, as well as anonymized metadata about users who interacted with them and analysis of the effect) is useful for intelligence analysis and public transparency, but will need similar muting or tagging/ shaming as associated with bad actors.
+Rate-limit disinformation content. Reduces its effects, whilst not running afoul of censorship concerns. Online archives of content (archives of websites, social media profiles, media, copies of published advertisements; or archives of comments attributed to bad actors, as well as anonymised metadata about users who interacted with them and analysis of the effect) are useful for intelligence analysis and public transparency, but will need similar muting or tagging/shaming as associated with bad actors.
M003 - daylight
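Rate limiting here is the classic token-bucket pattern. A minimal sketch, assuming one bucket per flagged item; the 10-reshares-per-hour budget and burst size are made-up parameters. Beyond the budget, reshares are delayed or refused, adding friction without outright removal.

```python
import time

class TokenBucket:
    def __init__(self, rate_per_sec, burst):
        self.rate = rate_per_sec      # tokens added per second
        self.capacity = burst         # maximum stored tokens
        self.tokens = burst
        self.last = time.monotonic()

    def allow(self):
        """Refill based on elapsed time, then spend one token if available."""
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# e.g. at most 10 reshares per hour of a flagged item, bursts of 3
limiter = TokenBucket(rate_per_sec=10 / 3600, burst=3)
print([limiter.allow() for _ in range(5)])  # [True, True, True, False, False]
```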
-Example: Interject addictive links or contents into discussions of disinformation materials and measure a "conversion rate" of users who engage with your content and away from the social media channel's "information bubble" around the disinformation item. Use bots to amplify and upvote the addictive content.
-M002 - diversion
+Example: Interject addictive links or content into discussions of disinformation materials and measure a "conversion rate" of users who engage with your content and away from the social media channel's "information bubble" around the disinformation item. Use bots to amplify and upvote the addictive content.
+M002 - Diversion
TA06 Develop Content
D04
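The "conversion rate" in the entry above reduces to a set intersection over event logs. A toy sketch, with hypothetical event fields:

```python
def conversion_rate(events):
    """Fraction of users exposed to the diversion content who then
    stopped engaging with the disinformation thread."""
    exposed = {e["user"] for e in events if e["type"] == "exposed"}
    converted = {e["user"] for e in events if e["type"] == "left_thread"}
    return len(converted & exposed) / len(exposed) if exposed else 0.0

events = [
    {"user": "u1", "type": "exposed"},
    {"user": "u2", "type": "exposed"},
    {"user": "u1", "type": "left_thread"},
]
print(conversion_rate(events))  # 0.5
```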
@@ -465,111 +453,111 @@ Online archives of content (archives of websites, social media profiles, media,
Establish a truth teller reputation score for influencers
-Includes "Establish a truth teller reputation score for influencers” and “Reputation scores for social media users”. Influencers are individuals or accounts with many followers.
+Includes “Establish a truth teller reputation score for influencers” and “Reputation scores for social media users”. Influencers are individuals or accounts with many followers.
M006 - scoring
Platform adds warning label and decision point when sharing content
-Includes “this has been disproved: do you want to forward it”. Includes “"Hey this story is old" popup when messaging with old URL” - this assumes that this technique is based on visits to an URL shortener or a captured news site that can publish a message of our choice. Includes “mark clickbait visually”.
+Includes “this has been disproved: do you want to forward it?”. Includes “‘Hey, this story is old’ popup when messaging with an old URL” - this assumes that this technique is based on visits to a URL shortener or a captured news site that can publish a message of our choice. Includes “mark clickbait visually”.
M004 - friction
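A sketch of the decision point, assuming a hypothetical fact-check feed and an arbitrary one-year staleness cutoff: before a share goes through, the platform checks the URL and returns the warning the UI should interpose, if any.

```python
from datetime import datetime, timedelta

DISPROVED = {"example.com/fake-story"}   # assumed fact-check feed
OLD_AFTER = timedelta(days=365)          # assumed staleness cutoff

def share_warning(url, published, now=None):
    """Return the friction message to show before sharing, or None."""
    now = now or datetime.utcnow()
    if url in DISPROVED:
        return "This has been disproved: do you want to forward it?"
    if now - published > OLD_AFTER:
        return "Hey, this story is old. Share anyway?"
    return None  # no friction needed

print(share_warning("example.com/fake-story", datetime(2024, 1, 1),
                    now=datetime(2024, 6, 1)))
print(share_warning("example.com/news", datetime(2020, 1, 1),
                    now=datetime(2024, 6, 1)))
```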
If creators are using network analysis to determine how to attack networks, then adding random extra links to those networks might throw that analysis out enough to change attack outcomes. Unsure which DISARM techniques.
-M008 - data pollution
+M008 - Data Pollution
TA12 Measure Effectiveness
D04
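A toy illustration of the network-analysis note above (adding random extra links): on a small graph, a handful of random edges blurs which node a degree-based analysis would pick as the hub. The graph and edge count are invented.

```python
import random

def degree(graph):
    return {n: len(nbrs) for n, nbrs in graph.items()}

def add_random_edges(graph, k, rng):
    """Return a copy of the graph with k random undirected edges added."""
    g = {n: set(nbrs) for n, nbrs in graph.items()}
    nodes = list(g)
    for _ in range(k):
        a, b = rng.sample(nodes, 2)
        g[a].add(b)
        g[b].add(a)
    return g

rng = random.Random(0)
graph = {"a": {"b", "c", "d"}, "b": {"a"}, "c": {"a"}, "d": {"a"}, "e": set()}
deg = degree(graph)
print(max(deg, key=deg.get))          # 'a' is the obvious hub
noisy = add_random_edges(graph, k=4, rng=rng)
print(sorted(degree(noisy).items()))  # degrees after pollution
```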
@@ -849,7 +837,7 @@ Online archives of content (archives of websites, social media profiles, media,
Includes Pollute the AB-testing data feeds: Polluting A/B testing requires knowledge of MOEs and MOPs. A/B testing must be caught early when there is relatively little data available so infiltration of TAs and understanding of how content is migrated from testing to larger audiences is fundamental.
-M008 - data pollution
+M008 - Data Pollution
TA12 Measure Effectiveness
D04
@@ -857,15 +845,15 @@ Online archives of content (archives of websites, social media profiles, media,
-Civil engagement activities conducted on the part of EFP forces. NATO should likewise provide support and training, where needed, to local public affairs and other communication personnel. Local government and military public affairs personnel can play their part in creating and disseminating entertaining and sharable content that supports the EFP mission.
+Better tell your country or organisation story
+Civil engagement activities conducted on the part of EFP forces. NATO should likewise provide support and training, where needed, to local public affairs and other communication personnel. Local government and military public affairs personnel can play their part in creating and disseminating entertaining and shareable content that supports the EFP mission.
M010 - countermessaging
-e.g. Create a campaign plan and toolkit for competition short of armed conflict (this used to be called “the grey zone”). The campaign plan should account for own vulnerabilities and strengths, and not over-rely on any one tool of statecraft or line of effort. It will identify and employ a broad spectrum of national power to deter, compete, and counter (where necessary) other countries’ approaches, and will include understanding of own capabilities, capabilities of disinformation creators, and international standards of conduct to compete in, shrink the size, and ultimately deter use of competition short of armed conflict.
-M007 - metatechnique
+e.g. Create a campaign plan and toolkit for competition short of armed conflict (this used to be called “the grey zone”). The campaign plan should account for own vulnerabilities and strengths, and not over-rely on any one tool of statecraft or line of effort. It will identify and employ a broad spectrum of national power to deter, compete, and counter (where necessary) other countries’ approaches, and will include understanding of own capabilities, capabilities of disinformation creators, and international standards of conduct to compete in, shrink the size of, and ultimately deter use of competition short of armed conflict.
+M007 - Metatechnique
TA01 Strategic Planning
D03
@@ -897,7 +885,7 @@ Online archives of content (archives of websites, social media profiles, media,
Coalition Building with stakeholders and Third-Party Inducements
Advance coalitions across borders and sectors, spanning public and private, as well as foreign and domestic, divides. Improve mechanisms to collaborate, share information, and develop coordinated approaches with the private sector at home and allies and partners abroad.
-Kremlin’s narrative spin extends through constellations of “civil society” organizations, political parties, churches, and other actors. Moscow leverages think tanks, human rights groups, election observers, Eurasianist integration groups, and orthodox groups. A collection of Russian civil society organizations, such as the Federal Agency for the Commonwealth of Independent States Affairs, Compatriots Living Abroad, and International Humanitarian Cooperation, together receive at least US$100 million per year, in addition to government-organized nongovernmental organizations (NGOs), at least 150 of which are funded by Russian presidential grants totaling US$70 million per year.
+Kremlin’s narrative spin extends through constellations of “civil society” organisations, political parties, churches, and other actors. Moscow leverages think tanks, human rights groups, election observers, Eurasianist integration groups, and orthodox groups. A collection of Russian civil society organisations, such as the Federal Agency for the Commonwealth of Independent States Affairs, Compatriots Living Abroad, and International Humanitarian Cooperation, together receive at least US$100 million per year, in addition to government-organised nongovernmental organisations (NGOs), at least 150 of which are funded by Russian presidential grants totalling US$70 million per year.
M013 - targeting
-protect the interests of this population and, more importantly, influence the population to support pro-Russia causes and effectively influence the politics of its neighbors
-M013 - targeting
+protect the interests of this population and, more importantly, influence the population to support pro-Russia causes and effectively influence the politics of its neighbours
+M013 - Targeting
TA02 Objective Planning
D03
@@ -929,7 +917,7 @@ Online archives of content (archives of websites, social media profiles, media,
international donors will donate to a basket fund that will pay a committee of local experts who will, in turn, manage and distribute the money to Russian-language producers and broadcasters that pitch various projects.
elevate information as a critical domain of statecraft
-Shift from reactive to proactive response, with priority on sharing relevant information with the public and mobilizing private-sector engagement. Recent advances in data-driven technologies have elevated information as a source of power to influence the political and economic environment, to foster economic growth, to enable a decision-making advantage over competitors, and to communicate securely and quickly.
-M007 - metatechnique
+Shift from reactive to proactive response, with priority on sharing relevant information with the public and mobilising private-sector engagement. Recent advances in data-driven technologies have elevated information as a source of power to influence the political and economic environment, to foster economic growth, to enable a decision-making advantage over competitors, and to communicate securely and quickly.
+M007 - Metatechnique
TA01 Strategic Planning
D03
@@ -953,39 +941,39 @@ Online archives of content (archives of websites, social media profiles, media,
-Free and fair press: create bipartisan, patriotic commitment to press freedom. Note difference between news and editorialising. Build alternative news sources: create alternative local-language news sources to counter local-language propaganda outlets. Delegitimize the 24 hour news cycle. includes Provide an alternative to disinformation content by expanding and improving local content: Develop content that can displace geopolitically-motivated narratives in the entire media environment, both new and old media alike.
+Free and fair press: create bipartisan, patriotic commitment to press freedom. Note difference between news and editorialising. Build alternative news sources: create alternative local-language news sources to counter local-language propaganda outlets. Delegitimise the 24-hour news cycle. Includes: Provide an alternative to disinformation content by expanding and improving local content: Develop content that can displace geopolitically-motivated narratives in the entire media environment, both new and old media alike.
M007 - metatechnique, M002 - diversion
Improve Coordination amongst stakeholders: public and private
-Coordinated disinformation challenges are increasingly multidisciplinary, there are few organizations within the national security structures that are equipped with the broad-spectrum capability to effectively counter large-scale conflict short of war tactics in real-time. Institutional hurdles currently impede diverse subject matter experts, hailing from outside of the traditional national security and foreign policy disciplines (e.g., physical science, engineering, media, legal, and economics fields), from contributing to the direct development of national security countermeasures to emerging conflict short of war threat vectors. A Cognitive Security Action Group (CSAG), akin to the Counterterrorism Security Group (CSG), could drive interagency alignment across equivalents of DHS, DoS, DoD, Intelligence Community, and other implementing agencies, in areas including strategic narrative, and the nexus of cyber and information operations.
+Coordinated disinformation challenges are increasingly multidisciplinary; there are few organisations within the national security structures that are equipped with the broad-spectrum capability to effectively counter large-scale conflict short of war tactics in real time. Institutional hurdles currently impede diverse subject matter experts, hailing from outside of the traditional national security and foreign policy disciplines (e.g., physical science, engineering, media, legal, and economics fields), from contributing to the direct development of national security countermeasures to emerging conflict short of war threat vectors. A Cognitive Security Action Group (CSAG), akin to the Counterterrorism Security Group (CSG), could drive interagency alignment across equivalents of DHS, DoS, DoD, the Intelligence Community, and other implementing agencies, in areas including strategic narrative and the nexus of cyber and information operations.
M007 - metatechnique
Fill information voids with non-disinformation content
-1) Pollute the data voids with wholesome content (Kittens! Babyshark!). 2) fill data voids with relevant information, e.g. increase Russian-language programming in areas subject to Russian disinformation.
+1) Pollute the data voids with wholesome content (Kittens! Babyshark!). 2) Fill data voids with relevant information, e.g. increase Russian-language programming in areas subject to Russian disinformation.
M009 - dilution, M008 - data pollution
Newsroom/Journalist training to counter influence moves
-Includes SEO influence. Includes promotion of a “higher standard of journalism”: journalism training “would be helpful, especially for the online community. Includes Strengthen local media: Improve effectiveness of local media outlets.
+Includes SEO influence. Includes promotion of a “higher standard of journalism”: journalism training “would be helpful, especially for the online community”. Includes: Strengthen local media: Improve effectiveness of local media outlets.
M001 - resilience
Ensure that platforms are taking down flagged accounts
-Use ongoing analysis/monitoring of "flagged" profiles. Confirm whether platforms are actively removing flagged accounts, and raise pressure via e.g. government organizations to encourage removal
+Use ongoing analysis/monitoring of "flagged" profiles. Confirm whether platforms are actively removing flagged accounts, and raise pressure via e.g. government organisations to encourage removal.
M003 - daylight
-Government open engagement with civil society as an independent check on government action and messaging. Government seeks to coordinate and synchronize narrative themes with allies and partners while calibrating action in cases where elements in these countries may have been co-opted by competitor nations. Includes “fight in the light”: Use leadership in the arts, entertainment, and media to highlight and build on fundamental tenets of democracy.
+Government open engagement with civil society as an independent check on government action and messaging. Government seeks to coordinate and synchronise narrative themes with allies and partners while calibrating action in cases where elements in these countries may have been co-opted by competitor nations. Includes “fight in the light”: Use leadership in the arts, entertainment, and media to highlight and build on fundamental tenets of democracy.
M001 - resilience
-Redirect searches away from disinformation or extremist content
-Use Google AdWords to identify instances in which people search Google about particular fake-news stories or propaganda themes. Includes Monetize centrist SEO by subsidizing the difference in greater clicks towards extremist content.
+Redirect searches away from disinformation or extremist content
+Use Google AdWords to identify instances in which people search Google about particular fake-news stories or propaganda themes. Includes: Monetise centrist SEO by subsidising the difference in greater clicks towards extremist content.
M002 - diversion
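A sketch of the matching step behind such redirect campaigns: map queries that contain all keywords of a known disinformation theme to vetted counter-content. The theme list and landing URLs are hypothetical; a real deployment would buy keyword-targeted ads through the platform's own tooling rather than intercept queries.

```python
# Hypothetical theme -> counter-content mapping
REDIRECTS = {
    ("miracle", "cure"): "https://example.org/health-factcheck",
    ("election", "rigged"): "https://example.org/election-explainer",
}

def redirect_for(query):
    """Return a counter-content link if every keyword of a theme
    appears in the query, else None."""
    words = set(query.lower().split())
    for theme, landing in REDIRECTS.items():
        if set(theme) <= words:
            return landing
    return None

print(redirect_for("is the election really rigged"))  # election explainer
print(redirect_for("weather tomorrow"))               # None
```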
-Standard reporting for false profiles (identity issues). Includes detecting hijacked accounts and reallocating them - if possible, back to original owners.
+Standard reporting for false profiles (identity issues). Includes detecting hijacked accounts and reallocating them - if possible, back to original owners.
M005 - removal
strong dialogue between the federal government and private sector to encourage better reporting
-Increase civic resilience by partnering with business community to combat gray zone threats and ensuring adequate reporting and enforcement mechanisms.
-M007 - metatechnique
+Increase civic resilience by partnering with the business community to combat grey zone threats and ensuring adequate reporting and enforcement mechanisms.
+M007 - Metatechnique
TA01 Strategic Planning
D03
@@ -1073,7 +1061,7 @@ Online archives of content (archives of websites, social media profiles, media,
-Create a plan for misinformation and disinformation response, before it's needed. Include connections / contacts needed, expected counteremessages etc.
-M007 - metatechnique
+Create a plan for misinformation and disinformation response, before it's needed. Include connections/contacts needed, expected countermessages etc.
+M007 - Metatechnique
TA01 Strategic Planning
D03
@@ -1121,23 +1109,23 @@ Online archives of content (archives of websites, social media profiles, media,
-Improve trust in the misinformation responses from social media and other platforms. Examples include creating greater transparancy on their actions and algorithms.
+Improve trust in the misinformation responses from social media and other platforms. Examples include creating greater transparency on their actions and algorithms.
M001 - resilience
-Note: In each case, depending on the platform there may be a way to identify a fence-sitter. For example, online polls may have a neutral option or a "somewhat this-or-that" option, and may reveal who voted for that to all visitors. This information could be of use to data analysts.
-
-In TA08-11, the engagement level of victims could be identified to detect and respond to increasing engagement.
+Note: In each case, depending on the platform there may be a way to identify a fence-sitter. For example, online polls may have a neutral option or a "somewhat this-or-that" option, and may reveal who voted for that to all visitors. This information could be of use to data analysts. In TA08-11, the engagement level of victims could be identified to detect and respond to increasing engagement.
TA15 Establish Social Assets
D01
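A sketch of both halves of the note above, assuming hypothetical poll and engagement exports: pick out neutral-option voters, and flag users whose latest week of engagement at least doubles the previous week's. The field names and the 2x threshold are assumptions.

```python
NEUTRAL = {"neutral", "somewhat agree", "somewhat disagree"}  # assumed options

def fence_sitters(votes):
    """votes: list of {'user': ..., 'choice': ...} poll records."""
    return [v["user"] for v in votes if v["choice"] in NEUTRAL]

def rising_engagement(weekly_counts, factor=2.0):
    """weekly_counts: {user: [week1, week2, ...] interaction counts}.
    Flags users whose last week is >= factor x the week before."""
    return [u for u, weeks in weekly_counts.items()
            if len(weeks) >= 2 and weeks[-1] >= factor * max(weeks[-2], 1)]

votes = [{"user": "u1", "choice": "agree"}, {"user": "u2", "choice": "neutral"}]
print(fence_sitters(votes))                             # ['u2']
print(rising_engagement({"u3": [2, 5], "u4": [5, 5]}))  # ['u3']
```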
@@ -262,7 +260,7 @@ In TA08-11, the engagement level of victims could be identified to detect and re
-Original Comment: Shortcomings: intentional falsehood. Doesn't solve accuracy. Can't be mandatory.
-
-Technique should be in terms of "strategic innoculation", raising the standards of what people expect in terms of evidence when consuming news.
+News content provenance certification.
+Original Comment: Shortcomings: intentional falsehood. Doesn't solve accuracy. Can't be mandatory. Technique should be in terms of "strategic inoculation", raising the standards of what people expect in terms of evidence when consuming news.
TA06 Develop Content
D01
@@ -343,7 +339,7 @@ Technique should be in terms of "strategic innoculation", raising the standards
-*Deplatform People: This technique needs to be a bit more specific to distinguish it from "account removal" or DDOS and other techniques that get more specific when applied to content.
-
-For example, other ways of deplatforming people include attacking their sources of funds, their allies, their followers, etc.
+*Deplatform People: This technique needs to be a bit more specific to distinguish it from "account removal" or DDOS and other techniques that get more specific when applied to content. For example, other ways of deplatforming people include attacking their sources of funds, their allies, their followers, etc.
TA10 Go Physical
D01
@@ -490,7 +484,7 @@ For example, other ways of deplatforming people include attacking their sources
-I assume this was a transcript error. Otherwise, "Identify Susceptible Influences" as in the various methods of influences that may work against a victim could also be a technique. Nope, wasn't a transcript error: original note says influencers, as in find people of influence that might be targetted.
+I assume this was a transcript error. Otherwise, "Identify Susceptible Influences" as in the various methods of influence that may work against a victim could also be a technique. Nope, wasn't a transcript error: original note says influencers, as in find people of influence that might be targeted.
TA10 Go Physical
D01
@@ -554,7 +548,7 @@ For example, other ways of deplatforming people include attacking their sources
-a developing methodology for identifying statistical differences in how social groups use language and quantifying how common those statistical differences are within a larger population. In essence, it hypothesizes how much affinity might exist for a specific group within a general population, based on the language its members employ
+a developing methodology for identifying statistical differences in how social groups use language and quantifying how common those statistical differences are within a larger population. In essence, it hypothesises how much affinity might exist for a specific group within a general population, based on the language its members employ.
D01
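The methodology described reads like a smoothed log-odds comparison of word frequencies between two groups' corpora. A toy sketch under that assumption (corpora invented): positive weights mark group-A-like words, and a new text's affinity is the sum of its words' weights.

```python
from collections import Counter
import math

def log_odds(group_a, group_b, alpha=1.0):
    """Smoothed log-odds ratio per word; positive = more group-A-like."""
    ca, cb = Counter(group_a), Counter(group_b)
    na, nb = sum(ca.values()), sum(cb.values())
    vocab = set(ca) | set(cb)
    v = len(vocab)
    return {w: math.log((ca[w] + alpha) / (na + alpha * v))
             - math.log((cb[w] + alpha) / (nb + alpha * v)) for w in vocab}

def affinity(text, weights):
    """Score a text by summing the weights of its known words."""
    return sum(weights.get(w, 0.0) for w in text.split())

group_a = "folks patriots awake sheeple truth".split()
group_b = "report sources evidence study truth".split()
weights = log_odds(group_a, group_b)
print(round(affinity("the sheeple must awake", weights), 2))  # > 0, A-like
```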
@@ -594,7 +588,7 @@ For example, other ways of deplatforming people include attacking their sources
collect intel/recon on black/covert content creators/manipulators
-Players at the level of covert attribution, referred to as “black” in the grayscale of deniability, produce content on user-generated media, such as YouTube, but also add fear-mongering commentary to and amplify content produced by others and supply exploitable content to data dump websites. These activities are conducted by a network of trolls, bots, honeypots, and hackers.
+Players at the level of covert attribution, referred to as “black” in the greyscale of deniability, produce content on user-generated media, such as YouTube, but also add fear-mongering commentary to and amplify content produced by others and supply exploitable content to data dump websites. These activities are conducted by a network of trolls, bots, honeypots, and hackers.
D01
@@ -602,7 +596,7 @@ For example, other ways of deplatforming people include attacking their sources
-brand ambassador programs could be used with influencers across a variety of social media channels. It could also target other prominent experts, such as academics, business leaders, and other potentially prominent people. Authorities must ultimately take care in implementing such a program given the risk that contact with U.S. or NATO authorities might damage influencer reputations. Engagements must consequently be made with care, and, if possible, government interlocutors should work through local NGOs.
+brand ambassador programmes could be used with influencers across a variety of social media channels. It could also target other prominent experts, such as academics, business leaders, and other potentially prominent people. Authorities must ultimately take care in implementing such a programme given the risk that contact with U.S. or NATO authorities might damage influencer reputations. Engagements must consequently be made with care, and, if possible, government interlocutors should work through local NGOs.
D01
@@ -610,7 +604,7 @@ For example, other ways of deplatforming people include attacking their sources
-significant amounts of quality open-source information are now available and should be leveraged to build products and analysis prior to problem prioritization in the areas of observation, attribution, and intent. Successfully distinguishing the gray zone campaign signal through the global noise requires action through the entirety of the national security community. Policy, process, and tools must all adapt and evolve to detect, discern, and act upon a new type of signal
+significant amounts of quality open-source information are now available and should be leveraged to build products and analysis prior to problem prioritisation in the areas of observation, attribution, and intent. Successfully distinguishing the grey zone campaign signal through the global noise requires action through the entirety of the national security community. Policy, process, and tools must all adapt and evolve to detect, discern, and act upon a new type of signal.
D01
@@ -618,15 +612,14 @@ For example, other ways of deplatforming people include attacking their sources
Monitor/collect audience engagement data connected to “useful idiots”
-Target audience connected to "useful idiots rather than the specific profiles because - The active presence of such sources complicates targeting of Russian propaganda, given that it is often difficult to discriminate between authentic views and opinions on the internet and those disseminated by the Russian state.
+Target audience connected to "useful idiots" rather than the specific profiles, because the active presence of such sources complicates targeting of Russian propaganda, given that it is often difficult to discriminate between authentic views and opinions on the internet and those disseminated by the Russian state.
-Gray zone threats are challenging given that warning requires detection of a weak signal through global noise and across threat vectors and regional boundaries.Three interconnected gray zone elements characterize the nature of the activity:
-Temporality: The nature of gray zone threats truly requires a “big picture view” over long timescales and across regions and functional topics.
-Attribution: requiring an “almost certain” or “nearly certain analytic assessment before acting costs time and analytic effort
-Intent: judgement of adversarial intent to conduct gray zone activity. Indeed, the purpose of countering gray zone threats is to deter adversaries from fulfilling their intent to act. While attribution is one piece of the puzzle, closing the space around intent often means synthesizing multiple relevant indicators and warnings, including the state’s geopolitical ambitions, military ties, trade and investment, level of corruption, and media landscape, among others.
+Grey zone threats are challenging given that warning requires detection of a weak signal through global noise and across threat vectors and regional boundaries. Three interconnected grey zone elements characterise the nature of the activity: Temporality: The nature of grey zone threats truly requires a “big picture view” over long timescales and across regions and functional topics. Attribution: requiring an “almost certain” or “nearly certain” analytic assessment before acting costs time and analytic effort. Intent: judgement of adversarial intent to conduct grey zone activity. Indeed, the purpose of countering grey zone threats is to deter adversaries from fulfilling their intent to act. While attribution is one piece of the puzzle, closing the space around intent often means synthesising multiple relevant indicators and warnings, including the state’s geopolitical ambitions, military ties, trade and investment, level of corruption, and media landscape, among others.
@@ -710,15 +700,15 @@ Intent: judgement of adversarial intent to conduct gray zone activity. Indeed, t
-United States has not adequately adapted its information indicators and thresholds for warning policymakers to account for gray zone tactics. Competitors have undertaken a marked shift to slow-burn, deceptive, non-military, and indirect challenges to U.S. interests. Relative to traditional security indicators and warnings, these are more numerous and harder to detect and make it difficult for analysts to infer intent.
+The United States has not adequately adapted its information indicators and thresholds for warning policymakers to account for grey zone tactics. Competitors have undertaken a marked shift to slow-burn, deceptive, non-military, and indirect challenges to U.S. interests. Relative to traditional security indicators and warnings, these are more numerous and harder to detect and make it difficult for analysts to infer intent.
-Recognize campaigns from weak signals, including rivals’ intent, capability, impact, interactive effects, and impact on U.S. interests... focus on adversarial covert action aspects of campaigning.
+Revitalise an “active measures working group,”
+Recognise campaigns from weak signals, including rivals’ intent, capability, impact, interactive effects, and impact on U.S. interests... focus on adversarial covert action aspects of campaigning.
D01
@@ -726,7 +716,7 @@ Intent: judgement of adversarial intent to conduct gray zone activity. Indeed, t
"Gray zone" is second level of content producers and circulators, composed of outlets with uncertain attribution. This category covers conspiracy websites, far-right or far-left websites, news aggregators, and data dump websites
+
"Grey zone" is second level of content producers and circulators, composed of outlets with uncertain attribution. This category covers conspiracy websites, far-right or far-left websites, news aggregators, and data dump websites
TA15 Establish Social Assets
D01
@@ -742,7 +732,7 @@ Intent: judgement of adversarial intent to conduct gray zone activity. Indeed, t
-This might include working with relevant technology firms to ensure that contracted analytic support is available. Contracted support is reportedly valuable because technology to monitor social media data is continually evolving, and such firms can provide the expertise to help identify and analyze trends, and they can more effectively stay abreast of the changing systems and develop new models as they are required
+This might include working with relevant technology firms to ensure that contracted analytic support is available. Contracted support is reportedly valuable because technology to monitor social media data is continually evolving, and such firms can provide the expertise to help identify and analyse trends, and they can more effectively stay abreast of the changing systems and develop new models as they are required.
TA01 Strategic Planning
D01
@@ -750,7 +740,7 @@ Intent: judgement of adversarial intent to conduct gray zone activity. Indeed, t
Reduce the resources available to disinformation creators
diff --git a/generated_pages/phases/P04.md b/generated_pages/phases/P04.md
index 20023ce..23ec520 100644
--- a/generated_pages/phases/P04.md
+++ b/generated_pages/phases/P04.md
@@ -1,5 +1,5 @@
# Phase P04: Assess
-* **Summary:** Evaluate effectiveness of action, for use in future plans
+* **Summary:** Evaluate effectiveness of action, for use in future plans
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/phases_index.md b/generated_pages/phases_index.md
index 73f20c9..2b27a95 100644
--- a/generated_pages/phases_index.md
+++ b/generated_pages/phases_index.md
@@ -24,6 +24,6 @@
-Completely break or interrupt the flow of information, for a fixed amount of time. (Deny, for a limited time period). Not allowing any efficacy, for a short amount of time.
+Completely break or interrupt the flow of information, for a fixed amount of time. (Deny, for a limited time period). Not allowing any efficacy, for a short amount of time.
-Reduce the effectiveness or efficiency of disinformation creators’ command and control or communications systems, and information collection efforts or means, either indefinitely, or for a limited time period.
+Reduce the effectiveness or efficiency of disinformation creators’ command and control or communications systems, and information collection efforts or means, either indefinitely, or for a limited time period.
-Damage a system or entity so badly that it cannot perform any function or be restored to a usable condition without being entirely rebuilt. Destroy is permanent, e.g. you can rebuild a website, but it’s not the same website.
+Damage a system or entity so badly that it cannot perform any function or be restored to a usable condition without being entirely rebuilt. Destroy is permanent, e.g. you can rebuild a website, but it’s not the same website.
diff --git a/generated_pages/tactics/TA02.md b/generated_pages/tactics/TA02.md
index afb28ca..9c2db32 100644
--- a/generated_pages/tactics/TA02.md
+++ b/generated_pages/tactics/TA02.md
@@ -1,10 +1,6 @@
# Tactic TA02: Plan Objectives
-* **Summary:** Set clearly defined, measurable, and achievable objectives. Achieving objectives ties execution of tactical tasks to reaching the desired end state. There are four primary considerations:
-- Each desired effect should link directly to one or more objectives
-- The effect should be measurable
-- The objective statement should not specify the way and means of accomplishment
-- The effect should be distinguishable from the objective it supports as a condition for success, not as another objective or task.
+* **Summary:** Set clearly defined, measurable, and achievable objectives. Achieving objectives ties execution of tactical tasks to reaching the desired end state. There are four primary considerations: - Each desired effect should link directly to one or more objectives - The effect should be measurable - The objective statement should not specify the way and means of accomplishment - The effect should be distinguishable from the objective it supports as a condition for success, not as another objective or task.
* **Belongs to phase:** P01
@@ -45,7 +41,7 @@
| [C00030 Develop a compelling counter narrative (truth based)](../../generated_pages/counters/C00030.md) | D03 |
| [C00031 Dilute the core narrative - create multiple permutations, target / amplify](../../generated_pages/counters/C00031.md) | D03 |
| [C00060 Legal action against for-profit engagement factories](../../generated_pages/counters/C00060.md) | D03 |
-| [C00156 Better tell your country or organization story](../../generated_pages/counters/C00156.md) | D03 |
+| [C00156 Better tell your country or organisation story](../../generated_pages/counters/C00156.md) | D03 |
| [C00164 compatriot policy](../../generated_pages/counters/C00164.md) | D03 |
| [C00169 develop a creative content hub](../../generated_pages/counters/C00169.md) | D03 |
| [C00222 Tabletop simulations](../../generated_pages/counters/C00222.md) | D03 |
diff --git a/generated_pages/tactics/TA05.md b/generated_pages/tactics/TA05.md
index 94ec184..080375a 100644
--- a/generated_pages/tactics/TA05.md
+++ b/generated_pages/tactics/TA05.md
@@ -16,7 +16,7 @@
| ---------- |
| [T0016 Create Clickbait](../../generated_pages/techniques/T0016.md) |
| [T0018 Purchase Targeted Advertisements](../../generated_pages/techniques/T0018.md) |
-| [T0101 Create Localized Content](../../generated_pages/techniques/T0101.md) |
+| [T0101 Create Localised Content](../../generated_pages/techniques/T0101.md) |
| [T0102 Leverage Echo Chambers/Filter Bubbles](../../generated_pages/techniques/T0102.md) |
| [T0102.001 Use Existing Echo Chambers/Filter Bubbles](../../generated_pages/techniques/T0102.001.md) |
| [T0102.002 Create Echo Chambers/Filter Bubbles](../../generated_pages/techniques/T0102.002.md) |
diff --git a/generated_pages/tactics/TA06.md b/generated_pages/tactics/TA06.md
index ff0e209..da64537 100644
--- a/generated_pages/tactics/TA06.md
+++ b/generated_pages/tactics/TA06.md
@@ -16,7 +16,7 @@
| Techniques |
| ---------- |
-| [T0015 Create Hashtags and Search Artifacts](../../generated_pages/techniques/T0015.md) |
+| [T0015 Create Hashtags and Search Artefacts](../../generated_pages/techniques/T0015.md) |
| [T0019 Generate Information Pollution](../../generated_pages/techniques/T0019.md) |
| [T0019.001 Create Fake Research](../../generated_pages/techniques/T0019.001.md) |
| [T0019.002 Hijack Hashtags](../../generated_pages/techniques/T0019.002.md) |
@@ -25,8 +25,8 @@
| [T0023.002 Edit Open-Source Content](../../generated_pages/techniques/T0023.002.md) |
| [T0084 Reuse Existing Content](../../generated_pages/techniques/T0084.md) |
| [T0084.001 Use Copypasta](../../generated_pages/techniques/T0084.001.md) |
-| [T0084.002 Plagiarize Content](../../generated_pages/techniques/T0084.002.md) |
-| [T0084.003 Deceptively Labeled or Translated](../../generated_pages/techniques/T0084.003.md) |
+| [T0084.002 Plagiarise Content](../../generated_pages/techniques/T0084.002.md) |
+| [T0084.003 Deceptively Labelled or Translated](../../generated_pages/techniques/T0084.003.md) |
| [T0084.004 Appropriate Content](../../generated_pages/techniques/T0084.004.md) |
| [T0085 Develop Text-Based Content](../../generated_pages/techniques/T0085.md) |
| [T0085.001 Develop AI-Generated Text](../../generated_pages/techniques/T0085.001.md) |
diff --git a/generated_pages/tactics/TA07.md b/generated_pages/tactics/TA07.md
index 97086b0..244444a 100644
--- a/generated_pages/tactics/TA07.md
+++ b/generated_pages/tactics/TA07.md
@@ -1,6 +1,6 @@
# Tactic TA07: Select Channels and Affordances
-* **Summary:** Selecting platforms and affordances assesses which online or offline platforms and their associated affordances maximize an influence operation’s ability to reach its target audience. To select the most appropriate platform(s), an operation may assess the technological affordances including platform algorithms, terms of service, permitted content types, or other attributes that determine platform usability and accessibility. Selecting platforms includes both choosing platforms on which the operation will publish its own content and platforms on which the operation will attempt to restrict adversarial content.
+* **Summary:** Selecting platforms and affordances assesses which online or offline platforms and their associated affordances maximise an influence operation’s ability to reach its target audience. To select the most appropriate platform(s), an operation may assess the technological affordances including platform algorithms, terms of service, permitted content types, or other attributes that determine platform usability and accessibility. Selecting platforms includes both choosing platforms on which the operation will publish its own content and platforms on which the operation will attempt to restrict adversarial content.
* **Belongs to phase:** P02
@@ -51,7 +51,7 @@
| [C00097 Require use of verified identities to contribute to poll or comment](../../generated_pages/counters/C00097.md) | D02 |
| [C00098 Revocation of allowlisted or "verified" status](../../generated_pages/counters/C00098.md) | D02 |
| [C00099 Strengthen verification methods](../../generated_pages/counters/C00099.md) | D02 |
-| [C00195 Redirect searches away from disinformation or extremist content ](../../generated_pages/counters/C00195.md) | D02 |
+| [C00195 Redirect searches away from disinformation or extremist content](../../generated_pages/counters/C00195.md) | D02 |
| [C00105 Buy more advertising than misinformation creators](../../generated_pages/counters/C00105.md) | D03 |
| [C00101 Create friction by rate-limiting engagement](../../generated_pages/counters/C00101.md) | D04 |
| [C00090 Fake engagement system](../../generated_pages/counters/C00090.md) | D05 |
diff --git a/generated_pages/tactics/TA08.md b/generated_pages/tactics/TA08.md
index 497d3d3..39c73cc 100644
--- a/generated_pages/tactics/TA08.md
+++ b/generated_pages/tactics/TA08.md
@@ -1,6 +1,6 @@
# Tactic TA08: Conduct Pump Priming
-* **Summary:** Release content on a targetted small scale, prior to general release, including releasing seed. Used for preparation before broader release, and as message honing. Used for preparation before broader release, and as message honing.
+* **Summary:** Release content on a targeted small scale, prior to general release, including releasing seed. Used for preparation before broader release, and as message honing.
* **Belongs to phase:** P03
@@ -17,11 +17,11 @@
| Techniques |
| ---------- |
| [T0020 Trial Content](../../generated_pages/techniques/T0020.md) |
-| [T0039 Bait Legitimate Influencers](../../generated_pages/techniques/T0039 .md) |
+| [T0039 Bait Legitimate Influencers](../../generated_pages/techniques/T0039.md) |
| [T0042 Seed Kernel of Truth](../../generated_pages/techniques/T0042.md) |
| [T0044 Seed Distortions](../../generated_pages/techniques/T0044.md) |
| [T0045 Use Fake Experts](../../generated_pages/techniques/T0045.md) |
-| [T0046 Use Search Engine Optimization](../../generated_pages/techniques/T0046.md) |
+| [T0046 Use Search Engine Optimisation](../../generated_pages/techniques/T0046.md) |
| [T0113 Employ Commercial Analytic Firms](../../generated_pages/techniques/T0113.md) |
@@ -42,7 +42,7 @@
| [C00184 Media exposure](../../generated_pages/counters/C00184.md) | D04 |
| [C00119 Engage payload and debunk.](../../generated_pages/counters/C00119.md) | D07 |
| [C00120 Open dialogue about design of platforms to produce different outcomes](../../generated_pages/counters/C00120.md) | D07 |
-| [C00121 Tool transparency and literacy for channels people follow. ](../../generated_pages/counters/C00121.md) | D07 |
+| [C00121 Tool transparency and literacy for channels people follow.](../../generated_pages/counters/C00121.md) | D07 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/tactics/TA09.md b/generated_pages/tactics/TA09.md
index 077a2f3..8cfda70 100644
--- a/generated_pages/tactics/TA09.md
+++ b/generated_pages/tactics/TA09.md
@@ -32,7 +32,7 @@
| Counters | Response types |
| -------- | -------------- |
| [C00122 Content moderation](../../generated_pages/counters/C00122.md) | D02 |
-| [C00129 Use banking to cut off access ](../../generated_pages/counters/C00129.md) | D02 |
+| [C00129 Use banking to cut off access](../../generated_pages/counters/C00129.md) | D02 |
| [C00182 Redirection / malware detection/ remediation](../../generated_pages/counters/C00182.md) | D02 |
| [C00109 Dampen Emotional Reaction](../../generated_pages/counters/C00109.md) | D03 |
| [C00123 Remove or rate limit botnets](../../generated_pages/counters/C00123.md) | D03 |
diff --git a/generated_pages/tactics/TA10.md b/generated_pages/tactics/TA10.md
index d30b9c5..a1ebf8a 100644
--- a/generated_pages/tactics/TA10.md
+++ b/generated_pages/tactics/TA10.md
@@ -1,6 +1,6 @@
# Tactic TA10: Drive Offline Activity
-* **Summary:** Move incident/campaign from online to offline. Encouraging users to move from the platform on which they initially viewed operation content and engage in the physical information space or offline world. This may include operation-aligned rallies or protests, radio, newspaper, or billboards. An influence operation may drive to physical forums to diversify its information channels and facilitate spaces where the target audience can engage with both operation content and like-minded individuals offline.
+* **Summary:** Move incident/campaign from online to offline. Encouraging users to move from the platform on which they initially viewed operation content and engage in the physical information space or offline world. This may include operation-aligned rallies or protests, radio, newspaper, or billboards. An influence operation may drive to physical forums to diversify its information channels and facilitate spaces where the target audience can engage with both operation content and like-minded individuals offline.
* **Belongs to phase:** P03
@@ -16,7 +16,7 @@
| ---------- |
| [T0017 Conduct Fundraising](../../generated_pages/techniques/T0017.md) |
| [T0017.001 Conduct Crowdfunding Campaigns](../../generated_pages/techniques/T0017.001.md) |
-| [T0057 Organize Events](../../generated_pages/techniques/T0057.md) |
+| [T0057 Organise Events](../../generated_pages/techniques/T0057.md) |
| [T0057.001 Pay for Physical Action](../../generated_pages/techniques/T0057.001.md) |
| [T0057.002 Conduct Symbolic Action](../../generated_pages/techniques/T0057.002.md) |
| [T0061 Sell Merchandise](../../generated_pages/techniques/T0061.md) |
diff --git a/generated_pages/tactics/TA11.md b/generated_pages/tactics/TA11.md
index 909e9f8..e6aa785 100644
--- a/generated_pages/tactics/TA11.md
+++ b/generated_pages/tactics/TA11.md
@@ -1,6 +1,6 @@
# Tactic TA11: Persist in the Information Environment
-* **Summary:** Persist in the Information Space refers to taking measures that allow an operation to maintain its presence and avoid takedown by an external entity. Techniques in Persist in the Information Space help campaigns operate without detection and appear legitimate to the target audience and platform monitoring services. Influence operations on social media often persist online by varying the type of information assets and platforms used throughout the campaign.
+* **Summary:** Persist in the Information Space refers to taking measures that allow an operation to maintain its presence and avoid takedown by an external entity. Techniques in Persist in the Information Space help campaigns operate without detection and appear legitimate to the target audience and platform monitoring services. Influence operations on social media often persist online by varying the type of information assets and platforms used throughout the campaign.
* **Belongs to phase:** P03
@@ -40,8 +40,8 @@
| [T0129.010 Misattribute Activity](../../generated_pages/techniques/T0129.010.md) |
| [T0130 Conceal Infrastructure](../../generated_pages/techniques/T0130.md) |
| [T0130.001 Conceal Sponsorship](../../generated_pages/techniques/T0130.001.md) |
-| [T0130.002 Utilize Bulletproof Hosting](../../generated_pages/techniques/T0130.002.md) |
-| [T0130.003 Use Shell Organizations](../../generated_pages/techniques/T0130.003.md) |
+| [T0130.002 Utilise Bulletproof Hosting](../../generated_pages/techniques/T0130.002.md) |
+| [T0130.003 Use Shell Organisations](../../generated_pages/techniques/T0130.003.md) |
| [T0130.004 Use Cryptocurrency](../../generated_pages/techniques/T0130.004.md) |
| [T0130.005 Obfuscate Payment](../../generated_pages/techniques/T0130.005.md) |
| [T0131 Exploit TOS/Content Moderation](../../generated_pages/techniques/T0131.md) |
diff --git a/generated_pages/tactics/TA12.md b/generated_pages/tactics/TA12.md
index 2b9c3d8..9983ec3 100644
--- a/generated_pages/tactics/TA12.md
+++ b/generated_pages/tactics/TA12.md
@@ -1,6 +1,6 @@
# Tactic TA12: Assess Effectiveness
-* **Summary:** Assess effectiveness of action, for use in future plans
+* **Summary:** Assess effectiveness of action, for use in future plans
* **Belongs to phase:** P04
@@ -22,7 +22,7 @@
| [T0132.002 Content Focused](../../generated_pages/techniques/T0132.002.md) |
| [T0132.003 View Focused](../../generated_pages/techniques/T0132.003.md) |
| [T0133 Measure Effectiveness](../../generated_pages/techniques/T0133.md) |
-| [T0133.001 Behavior Changes](../../generated_pages/techniques/T0133.001.md) |
+| [T0133.001 Behaviour Changes](../../generated_pages/techniques/T0133.001.md) |
| [T0133.002 Content](../../generated_pages/techniques/T0133.002.md) |
| [T0133.003 Awareness](../../generated_pages/techniques/T0133.003.md) |
| [T0133.004 Knowledge](../../generated_pages/techniques/T0133.004.md) |
diff --git a/generated_pages/tactics/TA13.md b/generated_pages/tactics/TA13.md
index 2c9f7f5..fba78fe 100644
--- a/generated_pages/tactics/TA13.md
+++ b/generated_pages/tactics/TA13.md
@@ -1,6 +1,6 @@
# Tactic TA13: Target Audience Analysis
-* **Summary:** Identifying and analyzing the target audience examines target audience member locations, political affiliations, financial situations, and other attributes that an influence operation may incorporate into its messaging strategy. During this tactic, influence operations may also identify existing similarities and differences between target audience members to unite like groups and divide opposing groups. Identifying and analyzing target audience members allows influence operations to tailor operation strategy and content to their analysis.
+* **Summary:** Identifying and analysing the target audience examines target audience member locations, political affiliations, financial situations, and other attributes that an influence operation may incorporate into its messaging strategy. During this tactic, influence operations may also identify existing similarities and differences between target audience members to unite like groups and divide opposing groups. Identifying and analysing target audience members allows influence operations to tailor operation strategy and content to their analysis.
* **Belongs to phase:** P01
diff --git a/generated_pages/tactics/TA14.md b/generated_pages/tactics/TA14.md
index 5d1af02..a724cab 100644
--- a/generated_pages/tactics/TA14.md
+++ b/generated_pages/tactics/TA14.md
@@ -1,6 +1,6 @@
# Tactic TA14: Develop Narratives
-* **Summary:** The promotion of beneficial master narratives is perhaps the most effective method for achieving long-term strategic narrative dominance. From a ""whole of society"" perspective the promotion of the society's core master narratives should occupy a central strategic role. From a misinformation campaign / cognitive security perpectve the tactics around master narratives center more precisely on the day-to-day promotion and reinforcement of this messaging. In other words, beneficial, high-coverage master narratives are a central strategic goal and their promotion constitutes an ongoing tactical struggle carried out at a whole-of-society level. Tactically, their promotion covers a broad spectrum of activities both on- and offline.
+* **Summary:** The promotion of beneficial master narratives is perhaps the most effective method for achieving long-term strategic narrative dominance. From a "whole of society" perspective the promotion of the society's core master narratives should occupy a central strategic role. From a misinformation campaign / cognitive security perspective the tactics around master narratives centre more precisely on the day-to-day promotion and reinforcement of this messaging. In other words, beneficial, high-coverage master narratives are a central strategic goal and their promotion constitutes an ongoing tactical struggle carried out at a whole-of-society level. Tactically, their promotion covers a broad spectrum of activities both on- and offline.
* **Belongs to phase:** P02
diff --git a/generated_pages/tactics/TA15.md b/generated_pages/tactics/TA15.md
index 2c96a17..3c36e21 100644
--- a/generated_pages/tactics/TA15.md
+++ b/generated_pages/tactics/TA15.md
@@ -1,7 +1,6 @@
# Tactic TA15: Establish Social Assets
-* **Summary:** Establishing information assets generates messaging tools, including social media accounts, operation personnel, and organizations, including directly and indirectly managed assets. For assets under their direct control, the operation can add, change, or remove these assets at will.
-Establishing information assets allows an influence operation to promote messaging directly to the target audience without navigating through external entities. Many online influence operations create or compromise social media accounts as a primary vector of information dissemination.
+* **Summary:** Establishing information assets generates messaging tools, including social media accounts, operation personnel, and organisations, including directly and indirectly managed assets. For assets under their direct control, the operation can add, change, or remove these assets at will. Establishing information assets allows an influence operation to promote messaging directly to the target audience without navigating through external entities. Many online influence operations create or compromise social media accounts as a primary vector of information dissemination.
* **Belongs to phase:** P02
@@ -40,7 +39,7 @@ Establishing information assets allows an influence operation to promote messagi
| [T0091.002 Recruit Partisans](../../generated_pages/techniques/T0091.002.md) |
| [T0091.003 Enlist Troll Accounts](../../generated_pages/techniques/T0091.003.md) |
| [T0092 Build Network](../../generated_pages/techniques/T0092.md) |
-| [T0092.001 Create Organizations](../../generated_pages/techniques/T0092.001.md) |
+| [T0092.001 Create Organisations](../../generated_pages/techniques/T0092.001.md) |
| [T0092.002 Use Follow Trains](../../generated_pages/techniques/T0092.002.md) |
| [T0092.003 Create Community or Sub-Group](../../generated_pages/techniques/T0092.003.md) |
| [T0093 Acquire/Recruit Network](../../generated_pages/techniques/T0093.md) |
@@ -48,11 +47,11 @@ Establishing information assets allows an influence operation to promote messagi
| [T0093.002 Acquire Botnets](../../generated_pages/techniques/T0093.002.md) |
| [T0094 Infiltrate Existing Networks](../../generated_pages/techniques/T0094.md) |
| [T0094.001 Identify Susceptible Targets in Networks](../../generated_pages/techniques/T0094.001.md) |
-| [T0094.002 Utilize Butterfly Attacks](../../generated_pages/techniques/T0094.002.md) |
+| [T0094.002 Utilise Butterfly Attacks](../../generated_pages/techniques/T0094.002.md) |
| [T0095 Develop Owned Media Assets](../../generated_pages/techniques/T0095.md) |
| [T0096 Leverage Content Farms](../../generated_pages/techniques/T0096.md) |
| [T0096.001 Create Content Farms](../../generated_pages/techniques/T0096.001.md) |
-| [T0096.002 Outsource Content Creation to External Organizations](../../generated_pages/techniques/T0096.002.md) |
+| [T0096.002 Outsource Content Creation to External Organisations](../../generated_pages/techniques/T0096.002.md) |
diff --git a/generated_pages/tactics/TA16.md b/generated_pages/tactics/TA16.md
index f5f611e..14c8663 100644
--- a/generated_pages/tactics/TA16.md
+++ b/generated_pages/tactics/TA16.md
@@ -14,7 +14,7 @@
| Techniques |
| ---------- |
| [T0009 Create Fake Experts](../../generated_pages/techniques/T0009.md) |
-| [T0009.001 Utilize Academic/Pseudoscientific Justifications](../../generated_pages/techniques/T0009.001.md) |
+| [T0009.001 Utilise Academic/Pseudoscientific Justifications](../../generated_pages/techniques/T0009.001.md) |
| [T0011 Compromise Legitimate Accounts](../../generated_pages/techniques/T0011.md) |
| [T0097 Create Personas](../../generated_pages/techniques/T0097.md) |
| [T0097.001 Backstop Personas](../../generated_pages/techniques/T0097.001.md) |
diff --git a/generated_pages/tactics/TA17.md b/generated_pages/tactics/TA17.md
index 540bf4b..080336e 100644
--- a/generated_pages/tactics/TA17.md
+++ b/generated_pages/tactics/TA17.md
@@ -1,6 +1,6 @@
-# Tactic TA17: Maximize Exposure
+# Tactic TA17: Maximise Exposure
-* **Summary:** Maximize exposure of the target audience to incident/campaign content via flooding, amplifying, and cross-posting.
+* **Summary:** Maximise exposure of the target audience to incident/campaign content via flooding, amplifying, and cross-posting.
* **Belongs to phase:** P03
@@ -17,7 +17,7 @@
| [T0049.001 Trolls Amplify and Manipulate](../../generated_pages/techniques/T0049.001.md) |
| [T0049.002 Hijack Existing Hashtag](../../generated_pages/techniques/T0049.002.md) |
| [T0049.003 Bots Amplify via Automated Forwarding and Reposting](../../generated_pages/techniques/T0049.003.md) |
-| [T0049.004 Utilize Spamoflauge](../../generated_pages/techniques/T0049.004.md) |
+| [T0049.004 Utilise Spamoflauge](../../generated_pages/techniques/T0049.004.md) |
| [T0049.005 Conduct Swarming](../../generated_pages/techniques/T0049.005.md) |
| [T0049.006 Conduct Keyword Squatting](../../generated_pages/techniques/T0049.006.md) |
| [T0049.007 Inauthentic Sites Amplify News and Narratives](../../generated_pages/techniques/T0049.007.md) |
@@ -27,7 +27,7 @@
| [T0119.002 Post across Platform](../../generated_pages/techniques/T0119.002.md) |
| [T0119.003 Post across Disciplines](../../generated_pages/techniques/T0119.003.md) |
| [T0120 Incentivize Sharing](../../generated_pages/techniques/T0120.md) |
-| [T0120.001 Use Affiliate Marketing Programs](../../generated_pages/techniques/T0120.001.md) |
+| [T0120.001 Use Affiliate Marketing Programmes](../../generated_pages/techniques/T0120.001.md) |
| [T0120.002 Use Contests and Prizes](../../generated_pages/techniques/T0120.002.md) |
| [T0121 Manipulate Platform Algorithm](../../generated_pages/techniques/T0121.md) |
| [T0121.001 Bypass Content Blocking](../../generated_pages/techniques/T0121.001.md) |
diff --git a/generated_pages/tactics/TA18.md b/generated_pages/tactics/TA18.md
index 221de45..20e0223 100644
--- a/generated_pages/tactics/TA18.md
+++ b/generated_pages/tactics/TA18.md
@@ -1,6 +1,6 @@
# Tactic TA18: Drive Online Harms
-* **Summary:** Actions taken by an influence operation to harm their opponents in online spaces through harassment, suppression, releasing private information, and controlling the information space through offensive cyberspace operations.
+* **Summary:** Actions taken by an influence operation to harm their opponents in online spaces through harassment, suppression, releasing private information, and controlling the information space through offensive cyberspace operations.
* **Belongs to phase:** P03
diff --git a/generated_pages/tactics_index.md b/generated_pages/tactics_index.md
index 46d8ebe..3a81bdd 100644
--- a/generated_pages/tactics_index.md
+++ b/generated_pages/tactics_index.md
@@ -16,11 +16,7 @@
Set clearly defined, measurable, and achievable objectives. Achieving objectives ties execution of tactical tasks to reaching the desired end state. There are four primary considerations:
-- Each desired effect should link directly to one or more objectives
-- The effect should be measurable
-- The objective statement should not specify the way and means of accomplishment
-- The effect should be distinguishable from the objective it supports as a condition for success, not as another objective or task.
+
Set clearly defined, measurable, and achievable objectives. Achieving objectives ties execution of tactical tasks to reaching the desired end state. There are four primary considerations: - Each desired effect should link directly to one or more objectives - The effect should be measurable - The objective statement should not specify the way and means of accomplishment - The effect should be distinguishable from the objective it supports as a condition for success, not as another objective or task.
Selecting platforms and affordances assesses which online or offline platforms and their associated affordances maximize an influence operation’s ability to reach its target audience. To select the most appropriate platform(s), an operation may assess the technological affordances including platform algorithms, terms of service, permitted content types, or other attributes that determine platform usability and accessibility. Selecting platforms includes both choosing platforms on which the operation will publish its own content and platforms on which the operation will attempt to restrict adversarial content.
+
Selecting platforms and affordances assesses which online or offline platforms and their associated affordances maximise an influence operation’s ability to reach its target audience. To select the most appropriate platform(s), an operation may assess the technological affordances including platform algorithms, terms of service, permitted content types, or other attributes that determine platform usability and accessibility. Selecting platforms includes both choosing platforms on which the operation will publish its own content and platforms on which the operation will attempt to restrict adversarial content.
Release content on a targetted small scale, prior to general release, including releasing seed. Used for preparation before broader release, and as message honing. Used for preparation before broader release, and as message honing.
+
Release content on a targeted small scale, prior to general release, including releasing seed. Used for preparation before broader release, and as message honing.
Move incident/campaign from online to offline. Encouraging users to move from the platform on which they initially viewed operation content and engage in the physical information space or offline world. This may include operation-aligned rallies or protests, radio, newspaper, or billboards. An influence operation may drive to physical forums to diversify its information channels and facilitate spaces where the target audience can engage with both operation content and like-minded individuals offline.
+
Move incident/campaign from online to offline. Encouraging users to move from the platform on which they initially viewed operation content and engage in the physical information space or offline world. This may include operation-aligned rallies or protests, radio, newspaper, or billboards. An influence operation may drive to physical forums to diversify its information channels and facilitate spaces where the target audience can engage with both operation content and like-minded individuals offline.
Persist in the Information Space refers to taking measures that allow an operation to maintain its presence and avoid takedown by an external entity. Techniques in Persist in the Information Space help campaigns operate without detection and appear legitimate to the target audience and platform monitoring services. Influence operations on social media often persist online by varying the type of information assets and platforms used throughout the campaign.
+
Persist in the Information Space refers to taking measures that allow an operation to maintain its presence and avoid takedown by an external entity. Techniques in Persist in the Information Space help campaigns operate without detection and appear legitimate to the target audience and platform monitoring services. Influence operations on social media often persist online by varying the type of information assets and platforms used throughout the campaign.
Identifying and analyzing the target audience examines target audience member locations, political affiliations, financial situations, and other attributes that an influence operation may incorporate into its messaging strategy. During this tactic, influence operations may also identify existing similarities and differences between target audience members to unite like groups and divide opposing groups. Identifying and analyzing target audience members allows influence operations to tailor operation strategy and content to their analysis.
+
Identifying and analysing the target audience examines target audience member locations, political affiliations, financial situations, and other attributes that an influence operation may incorporate into its messaging strategy. During this tactic, influence operations may also identify existing similarities and differences between target audience members to unite like groups and divide opposing groups. Identifying and analysing target audience members allows influence operations to tailor operation strategy and content to their analysis.
The promotion of beneficial master narratives is perhaps the most effective method for achieving long-term strategic narrative dominance. From a ""whole of society"" perspective the promotion of the society's core master narratives should occupy a central strategic role. From a misinformation campaign / cognitive security perpectve the tactics around master narratives center more precisely on the day-to-day promotion and reinforcement of this messaging. In other words, beneficial, high-coverage master narratives are a central strategic goal and their promotion constitutes an ongoing tactical struggle carried out at a whole-of-society level. Tactically, their promotion covers a broad spectrum of activities both on- and offline.
+
The promotion of beneficial master narratives is perhaps the most effective method for achieving long-term strategic narrative dominance. From a "whole of society" perspective the promotion of the society's core master narratives should occupy a central strategic role. From a misinformation campaign / cognitive security perspective the tactics around master narratives centre more precisely on the day-to-day promotion and reinforcement of this messaging. In other words, beneficial, high-coverage master narratives are a central strategic goal and their promotion constitutes an ongoing tactical struggle carried out at a whole-of-society level. Tactically, their promotion covers a broad spectrum of activities both on- and offline.
Establishing information assets generates messaging tools, including social media accounts, operation personnel, and organizations, including directly and indirectly managed assets. For assets under their direct control, the operation can add, change, or remove these assets at will.
-Establishing information assets allows an influence operation to promote messaging directly to the target audience without navigating through external entities. Many online influence operations create or compromise social media accounts as a primary vector of information dissemination.
+
Establishing information assets generates messaging tools, including social media accounts, operation personnel, and organisations, including directly and indirectly managed assets. For assets under their direct control, the operation can add, change, or remove these assets at will. Establishing information assets allows an influence operation to promote messaging directly to the target audience without navigating through external entities. Many online influence operations create or compromise social media accounts as a primary vector of information dissemination.
P02
@@ -98,14 +93,14 @@ Establishing information assets allows an influence operation to promote messagi
Actions taken by an influence operation to harm their opponents in online spaces through harassment, suppression, releasing private information, and controlling the information space through offensive cyberspace operations.
+
Actions taken by an influence operation to harm their opponents in online spaces through harassment, suppression, releasing private information, and controlling the information space through offensive cyberspace operations.
P03
diff --git a/generated_pages/tasks/TK0001.md b/generated_pages/tasks/TK0001.md
index 38b0f93..8bd9e4d 100644
--- a/generated_pages/tasks/TK0001.md
+++ b/generated_pages/tasks/TK0001.md
@@ -1,6 +1,6 @@
# Task TK0001: Goal setting
-* **Summary:** Set the goals for this incident.
+* **Summary:** Set the goals for this incident.
* **Belongs to tactic stage:** TA01
diff --git a/generated_pages/tasks/TK0002.md b/generated_pages/tasks/TK0002.md
index 52e9e12..895bd33 100644
--- a/generated_pages/tasks/TK0002.md
+++ b/generated_pages/tasks/TK0002.md
@@ -1,6 +1,6 @@
# Task TK0002: Population research / audience analysis (centre of gravity)
-* **Summary:** Research intended audience. Includes audience segmentation, hot-button issues etc.
+* **Summary:** Research intended audience. Includes audience segmentation, hot-button issues etc.
* **Belongs to tactic stage:** TA01
diff --git a/generated_pages/tasks_index.md b/generated_pages/tasks_index.md
index 5965080..c6de61c 100644
--- a/generated_pages/tasks_index.md
+++ b/generated_pages/tasks_index.md
@@ -10,13 +10,13 @@
Population research / audience analysis (centre of gravity)
-
Research intended audience. Includes audience segmentation, hot-button issues etc.
+
Research intended audience. Includes audience segmentation, hot-button issues etc.
TA01
diff --git a/generated_pages/techniques/T0002.md b/generated_pages/techniques/T0002.md
index 482733e..ba0220c 100644
--- a/generated_pages/techniques/T0002.md
+++ b/generated_pages/techniques/T0002.md
@@ -1,6 +1,6 @@
# Technique T0002: Facilitate State Propaganda
-* **Summary**: Organize citizens around pro-state messaging. Coordinate paid or volunteer groups to push state propaganda.
+* **Summary**: Organise citizens around pro-state messaging. Coordinate paid or volunteer groups to push state propaganda.
* **Belongs to tactic stage**: TA02
@@ -14,31 +14,11 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00008 Create shared fact-checking database](../../generated_pages/counters/C00008.md) | D04 |
-| [C00011 Media literacy. Games to identify fake news](../../generated_pages/counters/C00011.md) | D02 |
-| [C00012 Platform regulation](../../generated_pages/counters/C00012.md) | D02 |
-| [C00013 Rating framework for news](../../generated_pages/counters/C00013.md) | D02 |
-| [C00014 Real-time updates to fact-checking database](../../generated_pages/counters/C00014.md) | D04 |
-| [C00016 Censorship](../../generated_pages/counters/C00016.md) | D02 |
-| [C00017 Repair broken social connections](../../generated_pages/counters/C00017.md) | D03 |
-| [C00019 Reduce effect of division-enablers](../../generated_pages/counters/C00019.md) | D03 |
-| [C00021 Encourage in-person communication](../../generated_pages/counters/C00021.md) | D04 |
-| [C00022 Innoculate. Positive campaign to promote feeling of safety](../../generated_pages/counters/C00022.md) | D04 |
-| [C00024 Promote healthy narratives](../../generated_pages/counters/C00024.md) | D04 |
-| [C00026 Shore up democracy based messages](../../generated_pages/counters/C00026.md) | D04 |
-| [C00027 Create culture of civility](../../generated_pages/counters/C00027.md) | D07 |
| [C00029 Create fake website to issue counter narrative and counter narrative through physical merchandise](../../generated_pages/counters/C00029.md) | D03 |
| [C00030 Develop a compelling counter narrative (truth based)](../../generated_pages/counters/C00030.md) | D03 |
| [C00031 Dilute the core narrative - create multiple permutations, target / amplify](../../generated_pages/counters/C00031.md) | D03 |
| [C00082 Ground truthing as automated response to pollution](../../generated_pages/counters/C00082.md) | D03 |
| [C00084 Modify disinformation narratives, and rebroadcast them](../../generated_pages/counters/C00084.md) | D03 |
-| [C00125 Prebunking](../../generated_pages/counters/C00125.md) | D03 |
-| [C00126 Social media amber alert](../../generated_pages/counters/C00126.md) | D03 |
-| [C00156 Better tell your country or organization story](../../generated_pages/counters/C00156.md) | D03 |
-| [C00161 Coalition Building with stakeholders and Third-Party Inducements](../../generated_pages/counters/C00161.md) | D07 |
-| [C00162 Unravel/target the Potemkin villages](../../generated_pages/counters/C00162.md) | D03 |
-| [C00164 compatriot policy](../../generated_pages/counters/C00164.md) | D03 |
-| [C00169 develop a creative content hub](../../generated_pages/counters/C00169.md) | D03 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0003.md b/generated_pages/techniques/T0003.md
index 5d6034a..5b576d2 100644
--- a/generated_pages/techniques/T0003.md
+++ b/generated_pages/techniques/T0003.md
@@ -1,6 +1,6 @@
# Technique T0003: Leverage Existing Narratives
-* **Summary**: Use or adapt existing narrative themes, where narratives are the baseline stories of a target audience. Narratives form the bedrock of our worldviews. New information is understood through a process firmly grounded in this bedrock. If new information is not consitent with the prevailing narratives of an audience, it will be ignored. Effective campaigns will frame their misinformation in the context of these narratives. Highly effective campaigns will make extensive use of audience-appropriate archetypes and meta-narratives throughout their content creation and amplifiction practices.
+* **Summary**: Use or adapt existing narrative themes, where narratives are the baseline stories of a target audience. Narratives form the bedrock of our worldviews. New information is understood through a process firmly grounded in this bedrock. If new information is not consistent with the prevailing narratives of an audience, it will be ignored. Effective campaigns will frame their misinformation in the context of these narratives. Highly effective campaigns will make extensive use of audience-appropriate archetypes and meta-narratives throughout their content creation and amplification practices.
* **Belongs to tactic stage**: TA14
@@ -12,37 +12,8 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00008 Create shared fact-checking database](../../generated_pages/counters/C00008.md) | D04 |
-| [C00011 Media literacy. Games to identify fake news](../../generated_pages/counters/C00011.md) | D02 |
-| [C00012 Platform regulation](../../generated_pages/counters/C00012.md) | D02 |
-| [C00013 Rating framework for news](../../generated_pages/counters/C00013.md) | D02 |
-| [C00014 Real-time updates to fact-checking database](../../generated_pages/counters/C00014.md) | D04 |
-| [C00016 Censorship](../../generated_pages/counters/C00016.md) | D02 |
-| [C00017 Repair broken social connections](../../generated_pages/counters/C00017.md) | D03 |
-| [C00019 Reduce effect of division-enablers](../../generated_pages/counters/C00019.md) | D03 |
-| [C00021 Encourage in-person communication](../../generated_pages/counters/C00021.md) | D04 |
-| [C00022 Innoculate. Positive campaign to promote feeling of safety](../../generated_pages/counters/C00022.md) | D04 |
-| [C00024 Promote healthy narratives](../../generated_pages/counters/C00024.md) | D04 |
-| [C00027 Create culture of civility](../../generated_pages/counters/C00027.md) | D07 |
-| [C00029 Create fake website to issue counter narrative and counter narrative through physical merchandise](../../generated_pages/counters/C00029.md) | D03 |
-| [C00030 Develop a compelling counter narrative (truth based)](../../generated_pages/counters/C00030.md) | D03 |
-| [C00031 Dilute the core narrative - create multiple permutations, target / amplify](../../generated_pages/counters/C00031.md) | D03 |
| [C00080 Create competing narrative](../../generated_pages/counters/C00080.md) | D03 |
| [C00081 Highlight flooding and noise, and explain motivations](../../generated_pages/counters/C00081.md) | D03 |
-| [C00082 Ground truthing as automated response to pollution](../../generated_pages/counters/C00082.md) | D03 |
-| [C00084 Modify disinformation narratives, and rebroadcast them](../../generated_pages/counters/C00084.md) | D03 |
-| [C00125 Prebunking](../../generated_pages/counters/C00125.md) | D03 |
-| [C00126 Social media amber alert](../../generated_pages/counters/C00126.md) | D03 |
-| [C00156 Better tell your country or organization story](../../generated_pages/counters/C00156.md) | D03 |
-| [C00161 Coalition Building with stakeholders and Third-Party Inducements](../../generated_pages/counters/C00161.md) | D07 |
-| [C00162 Unravel/target the Potemkin villages](../../generated_pages/counters/C00162.md) | D03 |
-| [C00164 compatriot policy](../../generated_pages/counters/C00164.md) | D03 |
-| [C00169 develop a creative content hub](../../generated_pages/counters/C00169.md) | D03 |
-| [C00176 Improve Coordination amongst stakeholders: public and private](../../generated_pages/counters/C00176.md) | D07 |
-| [C00188 Newsroom/Journalist training to counter influence moves](../../generated_pages/counters/C00188.md) | D03 |
-| [C00190 open engagement with civil society](../../generated_pages/counters/C00190.md) | D03 |
-| [C00205 strong dialogue between the federal government and private sector to encourage better reporting](../../generated_pages/counters/C00205.md) | D03 |
-| [C00212 build public resilience by making civil society more vibrant](../../generated_pages/counters/C00212.md) | D03 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0004.md b/generated_pages/techniques/T0004.md
index bf64de0..02be11e 100644
--- a/generated_pages/techniques/T0004.md
+++ b/generated_pages/techniques/T0004.md
@@ -1,6 +1,6 @@
# Technique T0004: Develop Competing Narratives
-* **Summary**: Advance competing narratives connected to same issue ie: on one hand deny incident while at same time expresses dismiss. Suppressing or discouraging narratives already spreading requires an alternative. The most simple set of narrative techniques in response would be the construction and promotion of contradictory alternatives centered on denial, deflection, dismissal, counter-charges, excessive standards of proof, bias in prohibition or enforcement, and so on. These competing narratives allow loyalists cover, but are less compelling to opponents and fence-sitters than campaigns built around existing narratives or highly explanatory master narratives. Competing narratives, as such, are especially useful in the "firehose of misinformation" approach.
+* **Summary**: Advance competing narratives connected to the same issue, i.e. deny an incident while at the same time expressing dismissal. Suppressing or discouraging narratives already spreading requires an alternative. The simplest set of narrative techniques in response would be the construction and promotion of contradictory alternatives centred on denial, deflection, dismissal, counter-charges, excessive standards of proof, bias in prohibition or enforcement, and so on. These competing narratives allow loyalists cover, but are less compelling to opponents and fence-sitters than campaigns built around existing narratives or highly explanatory master narratives. Competing narratives, as such, are especially useful in the "firehose of misinformation" approach.
* **Belongs to tactic stage**: TA14
@@ -13,17 +13,6 @@
| Counters | Response types |
| -------- | -------------- |
| [C00042 Address truth contained in narratives](../../generated_pages/counters/C00042.md) | D04 |
-| [C00080 Create competing narrative](../../generated_pages/counters/C00080.md) | D03 |
-| [C00081 Highlight flooding and noise, and explain motivations](../../generated_pages/counters/C00081.md) | D03 |
-| [C00082 Ground truthing as automated response to pollution](../../generated_pages/counters/C00082.md) | D03 |
-| [C00084 Modify disinformation narratives, and rebroadcast them](../../generated_pages/counters/C00084.md) | D03 |
-| [C00125 Prebunking](../../generated_pages/counters/C00125.md) | D03 |
-| [C00126 Social media amber alert](../../generated_pages/counters/C00126.md) | D03 |
-| [C00156 Better tell your country or organization story](../../generated_pages/counters/C00156.md) | D03 |
-| [C00161 Coalition Building with stakeholders and Third-Party Inducements](../../generated_pages/counters/C00161.md) | D07 |
-| [C00162 Unravel/target the Potemkin villages](../../generated_pages/counters/C00162.md) | D03 |
-| [C00164 compatriot policy](../../generated_pages/counters/C00164.md) | D03 |
-| [C00169 develop a creative content hub](../../generated_pages/counters/C00169.md) | D03 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0007.md b/generated_pages/techniques/T0007.md
index 5760ddb..ee72108 100644
--- a/generated_pages/techniques/T0007.md
+++ b/generated_pages/techniques/T0007.md
@@ -21,23 +21,7 @@
| Counters | Response types |
| -------- | -------------- |
| [C00006 Charge for social media](../../generated_pages/counters/C00006.md) | D02 |
-| [C00012 Platform regulation](../../generated_pages/counters/C00012.md) | D02 |
-| [C00016 Censorship](../../generated_pages/counters/C00016.md) | D02 |
-| [C00034 Create more friction at account creation](../../generated_pages/counters/C00034.md) | D04 |
-| [C00036 Infiltrate the in-group to discredit leaders (divide)](../../generated_pages/counters/C00036.md) | D02 |
| [C00040 third party verification for people](../../generated_pages/counters/C00040.md) | D02 |
-| [C00085 Mute content](../../generated_pages/counters/C00085.md) | D03 |
-| [C00097 Require use of verified identities to contribute to poll or comment](../../generated_pages/counters/C00097.md) | D02 |
-| [C00099 Strengthen verification methods](../../generated_pages/counters/C00099.md) | D02 |
-| [C00107 Content moderation](../../generated_pages/counters/C00107.md) | D02 |
-| [C00122 Content moderation](../../generated_pages/counters/C00122.md) | D02 |
-| [C00133 Deplatform Account*](../../generated_pages/counters/C00133.md) | D03 |
-| [C00135 Deplatform message groups and/or message boards](../../generated_pages/counters/C00135.md) | D03 |
-| [C00153 Take pre-emptive action against actors' infrastructure](../../generated_pages/counters/C00153.md) | D03 |
-| [C00172 social media source removal](../../generated_pages/counters/C00172.md) | D02 |
-| [C00176 Improve Coordination amongst stakeholders: public and private](../../generated_pages/counters/C00176.md) | D07 |
-| [C00189 Ensure that platforms are taking down flagged accounts](../../generated_pages/counters/C00189.md) | D06 |
-| [C00197 remove suspicious accounts](../../generated_pages/counters/C00197.md) | D02 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0009.001.md b/generated_pages/techniques/T0009.001.md
index 9fbc45e..2058c3f 100644
--- a/generated_pages/techniques/T0009.001.md
+++ b/generated_pages/techniques/T0009.001.md
@@ -1,6 +1,6 @@
-# Technique T0009.001: Utilize Academic/Pseudoscientific Justifications
+# Technique T0009.001: Utilise Academic/Pseudoscientific Justifications
-* **Summary**: Utilize Academic/Pseudoscientific Justifications
+* **Summary**: Utilise Academic/Pseudoscientific Justifications
* **Belongs to tactic stage**: TA16
diff --git a/generated_pages/techniques/T0009.md b/generated_pages/techniques/T0009.md
index f25485f..e87f35f 100644
--- a/generated_pages/techniques/T0009.md
+++ b/generated_pages/techniques/T0009.md
@@ -1,6 +1,6 @@
# Technique T0009: Create Fake Experts
-* **Summary**: Stories planted or promoted in computational propaganda operations often make use of experts fabricated from whole cloth, sometimes specifically for the story itself.
+* **Summary**: Stories planted or promoted in computational propaganda operations often make use of experts fabricated from whole cloth, sometimes specifically for the story itself.
* **Belongs to tactic stage**: TA16
@@ -13,19 +13,9 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00008 Create shared fact-checking database](../../generated_pages/counters/C00008.md) | D04 |
-| [C00011 Media literacy. Games to identify fake news](../../generated_pages/counters/C00011.md) | D02 |
-| [C00012 Platform regulation](../../generated_pages/counters/C00012.md) | D02 |
-| [C00014 Real-time updates to fact-checking database](../../generated_pages/counters/C00014.md) | D04 |
-| [C00016 Censorship](../../generated_pages/counters/C00016.md) | D02 |
| [C00034 Create more friction at account creation](../../generated_pages/counters/C00034.md) | D04 |
-| [C00040 third party verification for people](../../generated_pages/counters/C00040.md) | D02 |
-| [C00097 Require use of verified identities to contribute to poll or comment](../../generated_pages/counters/C00097.md) | D02 |
-| [C00099 Strengthen verification methods](../../generated_pages/counters/C00099.md) | D02 |
| [C00107 Content moderation](../../generated_pages/counters/C00107.md) | D02 |
-| [C00133 Deplatform Account*](../../generated_pages/counters/C00133.md) | D03 |
| [C00176 Improve Coordination amongst stakeholders: public and private](../../generated_pages/counters/C00176.md) | D07 |
-| [C00189 Ensure that platforms are taking down flagged accounts](../../generated_pages/counters/C00189.md) | D06 |
| [C00197 remove suspicious accounts](../../generated_pages/counters/C00197.md) | D02 |
diff --git a/generated_pages/techniques/T0010.md b/generated_pages/techniques/T0010.md
index 11ffdc3..9dd47de 100644
--- a/generated_pages/techniques/T0010.md
+++ b/generated_pages/techniques/T0010.md
@@ -24,32 +24,16 @@
| Counters | Response types |
| -------- | -------------- |
| [C00009 Educate high profile influencers on best practices](../../generated_pages/counters/C00009.md) | D02 |
-| [C00036 Infiltrate the in-group to discredit leaders (divide)](../../generated_pages/counters/C00036.md) | D02 |
| [C00046 Marginalise and discredit extremist groups](../../generated_pages/counters/C00046.md) | D04 |
| [C00048 Name and Shame Influencers](../../generated_pages/counters/C00048.md) | D07 |
| [C00051 Counter social engineering training](../../generated_pages/counters/C00051.md) | D02 |
-| [C00072 Remove non-relevant content from special interest groups - not recommended](../../generated_pages/counters/C00072.md) | D02 |
-| [C00073 Inoculate populations through media literacy training](../../generated_pages/counters/C00073.md) | D02 |
-| [C00092 Establish a truth teller reputation score for influencers](../../generated_pages/counters/C00092.md) | D07 |
-| [C00093 Influencer code of conduct](../../generated_pages/counters/C00093.md) | D07 |
| [C00111 Reduce polarisation by connecting and presenting sympathetic renditions of opposite views](../../generated_pages/counters/C00111.md) | D04 |
-| [C00125 Prebunking](../../generated_pages/counters/C00125.md) | D03 |
-| [C00126 Social media amber alert](../../generated_pages/counters/C00126.md) | D03 |
| [C00130 Mentorship: elders, youth, credit. Learn vicariously.](../../generated_pages/counters/C00130.md) | D07 |
-| [C00136 Microtarget most likely targets then send them countermessages](../../generated_pages/counters/C00136.md) | D03 |
-| [C00156 Better tell your country or organization story](../../generated_pages/counters/C00156.md) | D03 |
-| [C00160 find and train influencers](../../generated_pages/counters/C00160.md) | D02 |
| [C00162 Unravel/target the Potemkin villages](../../generated_pages/counters/C00162.md) | D03 |
| [C00169 develop a creative content hub](../../generated_pages/counters/C00169.md) | D03 |
-| [C00174 Create a healthier news environment](../../generated_pages/counters/C00174.md) | D02 |
-| [C00176 Improve Coordination amongst stakeholders: public and private](../../generated_pages/counters/C00176.md) | D07 |
-| [C00184 Media exposure](../../generated_pages/counters/C00184.md) | D04 |
-| [C00188 Newsroom/Journalist training to counter influence moves](../../generated_pages/counters/C00188.md) | D03 |
-| [C00190 open engagement with civil society](../../generated_pages/counters/C00190.md) | D03 |
-| [C00195 Redirect searches away from disinformation or extremist content ](../../generated_pages/counters/C00195.md) | D02 |
+| [C00195 Redirect searches away from disinformation or extremist content](../../generated_pages/counters/C00195.md) | D02 |
| [C00200 Respected figure (influencer) disavows misinfo](../../generated_pages/counters/C00200.md) | D03 |
| [C00203 Stop offering press credentials to propaganda outlets](../../generated_pages/counters/C00203.md) | D03 |
-| [C00212 build public resilience by making civil society more vibrant](../../generated_pages/counters/C00212.md) | D03 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0011.md b/generated_pages/techniques/T0011.md
index 3e1a69b..d33f5d4 100644
--- a/generated_pages/techniques/T0011.md
+++ b/generated_pages/techniques/T0011.md
@@ -14,12 +14,7 @@
| Counters | Response types |
| -------- | -------------- |
| [C00053 Delete old accounts / Remove unused social media accounts](../../generated_pages/counters/C00053.md) | D04 |
-| [C00098 Revocation of allowlisted or "verified" status](../../generated_pages/counters/C00098.md) | D02 |
-| [C00133 Deplatform Account*](../../generated_pages/counters/C00133.md) | D03 |
-| [C00153 Take pre-emptive action against actors' infrastructure](../../generated_pages/counters/C00153.md) | D03 |
| [C00182 Redirection / malware detection/ remediation](../../generated_pages/counters/C00182.md) | D02 |
-| [C00189 Ensure that platforms are taking down flagged accounts](../../generated_pages/counters/C00189.md) | D06 |
-| [C00197 remove suspicious accounts](../../generated_pages/counters/C00197.md) | D02 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0013.md b/generated_pages/techniques/T0013.md
index b18986a..b2b8272 100644
--- a/generated_pages/techniques/T0013.md
+++ b/generated_pages/techniques/T0013.md
@@ -1,6 +1,6 @@
# Technique T0013: Create Inauthentic Websites
-* **Summary**: Create media assets to support inauthentic organizations (e.g. think tank), people (e.g. experts) and/or serve as sites to distribute malware/launch phishing operations.
+* **Summary**: Create media assets to support inauthentic organisations (e.g. think tank), people (e.g. experts) and/or serve as sites to distribute malware/launch phishing operations.
* **Belongs to tactic stage**: TA15
@@ -12,15 +12,6 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00008 Create shared fact-checking database](../../generated_pages/counters/C00008.md) | D04 |
-| [C00011 Media literacy. Games to identify fake news](../../generated_pages/counters/C00011.md) | D02 |
-| [C00012 Platform regulation](../../generated_pages/counters/C00012.md) | D02 |
-| [C00014 Real-time updates to fact-checking database](../../generated_pages/counters/C00014.md) | D04 |
-| [C00016 Censorship](../../generated_pages/counters/C00016.md) | D02 |
-| [C00107 Content moderation](../../generated_pages/counters/C00107.md) | D02 |
-| [C00153 Take pre-emptive action against actors' infrastructure](../../generated_pages/counters/C00153.md) | D03 |
-| [C00172 social media source removal](../../generated_pages/counters/C00172.md) | D02 |
-| [C00176 Improve Coordination amongst stakeholders: public and private](../../generated_pages/counters/C00176.md) | D07 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0014.001.md b/generated_pages/techniques/T0014.001.md
index 573cff4..a2414e7 100644
--- a/generated_pages/techniques/T0014.001.md
+++ b/generated_pages/techniques/T0014.001.md
@@ -1,6 +1,6 @@
# Technique T0014.001: Raise Funds from Malign Actors
-* **Summary**: Raising funds from malign actors may include contributions from foreign agents, cutouts or proxies, shell companies, dark money groups, etc.
+* **Summary**: Raising funds from malign actors may include contributions from foreign agents, cutouts or proxies, shell companies, dark money groups, etc.
* **Belongs to tactic stage**: TA15
diff --git a/generated_pages/techniques/T0014.002.md b/generated_pages/techniques/T0014.002.md
index 3dc51e5..3e9b6c4 100644
--- a/generated_pages/techniques/T0014.002.md
+++ b/generated_pages/techniques/T0014.002.md
@@ -1,6 +1,6 @@
# Technique T0014.002: Raise Funds from Ignorant Agents
-* **Summary**: Raising funds from ignorant agents may include scams, donations intended for one stated purpose but then used for another, etc.
+* **Summary**: Raising funds from ignorant agents may include scams, donations intended for one stated purpose but then used for another, etc.
* **Belongs to tactic stage**: TA15
diff --git a/generated_pages/techniques/T0014.md b/generated_pages/techniques/T0014.md
index 6e524af..9329053 100644
--- a/generated_pages/techniques/T0014.md
+++ b/generated_pages/techniques/T0014.md
@@ -1,6 +1,6 @@
# Technique T0014: Prepare Fundraising Campaigns
-* **Summary**: Fundraising campaigns refer to an influence operation’s systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while raising a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns (see: Develop Information Pathways) to promote operation messaging while raising money to support its activities.
+* **Summary**: Fundraising campaigns refer to an influence operation’s systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while raising a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns (see: Develop Information Pathways) to promote operation messaging while raising money to support its activities.
* **Belongs to tactic stage**: TA15
@@ -12,19 +12,8 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00008 Create shared fact-checking database](../../generated_pages/counters/C00008.md) | D04 |
-| [C00012 Platform regulation](../../generated_pages/counters/C00012.md) | D02 |
-| [C00014 Real-time updates to fact-checking database](../../generated_pages/counters/C00014.md) | D04 |
-| [C00016 Censorship](../../generated_pages/counters/C00016.md) | D02 |
| [C00059 Verification of project before posting fund requests](../../generated_pages/counters/C00059.md) | D02 |
-| [C00070 Block access to disinformation resources](../../generated_pages/counters/C00070.md) | D02 |
-| [C00107 Content moderation](../../generated_pages/counters/C00107.md) | D02 |
-| [C00122 Content moderation](../../generated_pages/counters/C00122.md) | D02 |
-| [C00129 Use banking to cut off access ](../../generated_pages/counters/C00129.md) | D02 |
-| [C00153 Take pre-emptive action against actors' infrastructure](../../generated_pages/counters/C00153.md) | D03 |
| [C00155 Ban incident actors from funding sites](../../generated_pages/counters/C00155.md) | D02 |
-| [C00176 Improve Coordination amongst stakeholders: public and private](../../generated_pages/counters/C00176.md) | D07 |
-| [C00189 Ensure that platforms are taking down flagged accounts](../../generated_pages/counters/C00189.md) | D06 |
| [C00216 Use advertiser controls to stem flow of funds to bad actors](../../generated_pages/counters/C00216.md) | D02 |
diff --git a/generated_pages/techniques/T0015.md b/generated_pages/techniques/T0015.md
index 973ccaa..aceddf4 100644
--- a/generated_pages/techniques/T0015.md
+++ b/generated_pages/techniques/T0015.md
@@ -1,6 +1,6 @@
-# Technique T0015: Create Hashtags and Search Artifacts
+# Technique T0015: Create Hashtags and Search Artefacts
-* **Summary**: Create one or more hashtags and/or hashtag groups. Many incident-based campaigns will create hashtags to promote their fabricated event. Creating a hashtag for an incident can have two important effects: 1. Create a perception of reality around an event. Certainly only "real" events would be discussed in a hashtag. After all, the event has a name!, and 2. Publicize the story more widely through trending lists and search behavior. Asset needed to direct/control/manage "conversation" connected to launching new incident/campaign with new hashtag for applicable social media sites).
+* **Summary**: Create one or more hashtags and/or hashtag groups. Many incident-based campaigns will create hashtags to promote their fabricated event. Creating a hashtag for an incident can have two important effects: 1. Create a perception of reality around an event. Certainly only "real" events would be discussed in a hashtag. After all, the event has a name! 2. Publicise the story more widely through trending lists and search behaviour. Asset needed to direct/control/manage "conversation" connected to launching new incident/campaign with new hashtag for applicable social media sites.
* **Belongs to tactic stage**: TA06
@@ -13,14 +13,7 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00006 Charge for social media](../../generated_pages/counters/C00006.md) | D02 |
-| [C00012 Platform regulation](../../generated_pages/counters/C00012.md) | D02 |
-| [C00016 Censorship](../../generated_pages/counters/C00016.md) | D02 |
| [C00066 Co-opt a hashtag and drown it out (hijack it back)](../../generated_pages/counters/C00066.md) | D03 |
-| [C00070 Block access to disinformation resources](../../generated_pages/counters/C00070.md) | D02 |
-| [C00107 Content moderation](../../generated_pages/counters/C00107.md) | D02 |
-| [C00122 Content moderation](../../generated_pages/counters/C00122.md) | D02 |
-| [C00176 Improve Coordination amongst stakeholders: public and private](../../generated_pages/counters/C00176.md) | D07 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0016.md b/generated_pages/techniques/T0016.md
index ff73576..efef607 100644
--- a/generated_pages/techniques/T0016.md
+++ b/generated_pages/techniques/T0016.md
@@ -1,6 +1,6 @@
# Technique T0016: Create Clickbait
-* **Summary**: Create attention grabbing headlines (outrage, doubt, humor) required to drive traffic & engagement. This is a key asset.
+* **Summary**: Create attention-grabbing headlines (outrage, doubt, humour) required to drive traffic & engagement. This is a key asset.
* **Belongs to tactic stage**: TA05
@@ -13,19 +13,11 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00012 Platform regulation](../../generated_pages/counters/C00012.md) | D02 |
-| [C00016 Censorship](../../generated_pages/counters/C00016.md) | D02 |
| [C00073 Inoculate populations through media literacy training](../../generated_pages/counters/C00073.md) | D02 |
| [C00076 Prohibit images in political discourse channels](../../generated_pages/counters/C00076.md) | D02 |
| [C00105 Buy more advertising than misinformation creators](../../generated_pages/counters/C00105.md) | D03 |
| [C00106 Click-bait centrist content](../../generated_pages/counters/C00106.md) | D03 |
-| [C00107 Content moderation](../../generated_pages/counters/C00107.md) | D02 |
-| [C00122 Content moderation](../../generated_pages/counters/C00122.md) | D02 |
-| [C00142 Platform adds warning label and decision point when sharing content](../../generated_pages/counters/C00142.md) | D04 |
-| [C00176 Improve Coordination amongst stakeholders: public and private](../../generated_pages/counters/C00176.md) | D07 |
| [C00178 Fill information voids with non-disinformation content](../../generated_pages/counters/C00178.md) | D04 |
-| [C00195 Redirect searches away from disinformation or extremist content ](../../generated_pages/counters/C00195.md) | D02 |
-| [C00216 Use advertiser controls to stem flow of funds to bad actors](../../generated_pages/counters/C00216.md) | D02 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0017.md b/generated_pages/techniques/T0017.md
index 46d65b8..f145cc9 100644
--- a/generated_pages/techniques/T0017.md
+++ b/generated_pages/techniques/T0017.md
@@ -1,6 +1,6 @@
# Technique T0017: Conduct Fundraising
-* **Summary**: Fundraising campaigns refer to an influence operation’s systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while raising a profit. Many influence operations have engaged in crowdfunding services166 on platforms including Tipee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns to promote operation messaging while raising money to support its activities.
+* **Summary**: Fundraising campaigns refer to an influence operation’s systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while raising a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns to promote operation messaging while raising money to support its activities.
* **Belongs to tactic stage**: TA10
@@ -13,21 +13,8 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00012 Platform regulation](../../generated_pages/counters/C00012.md) | D02 |
-| [C00016 Censorship](../../generated_pages/counters/C00016.md) | D02 |
| [C00058 Report crowdfunder as violator](../../generated_pages/counters/C00058.md) | D02 |
| [C00067 Denigrate the recipient/ project (of online funding)](../../generated_pages/counters/C00067.md) | D03 |
-| [C00070 Block access to disinformation resources](../../generated_pages/counters/C00070.md) | D02 |
-| [C00085 Mute content](../../generated_pages/counters/C00085.md) | D03 |
-| [C00093 Influencer code of conduct](../../generated_pages/counters/C00093.md) | D07 |
-| [C00107 Content moderation](../../generated_pages/counters/C00107.md) | D02 |
-| [C00122 Content moderation](../../generated_pages/counters/C00122.md) | D02 |
-| [C00129 Use banking to cut off access ](../../generated_pages/counters/C00129.md) | D02 |
-| [C00153 Take pre-emptive action against actors' infrastructure](../../generated_pages/counters/C00153.md) | D03 |
-| [C00155 Ban incident actors from funding sites](../../generated_pages/counters/C00155.md) | D02 |
-| [C00176 Improve Coordination amongst stakeholders: public and private](../../generated_pages/counters/C00176.md) | D07 |
-| [C00178 Fill information voids with non-disinformation content](../../generated_pages/counters/C00178.md) | D04 |
-| [C00216 Use advertiser controls to stem flow of funds to bad actors](../../generated_pages/counters/C00216.md) | D02 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0018.md b/generated_pages/techniques/T0018.md
index 1a532f9..a280ffa 100644
--- a/generated_pages/techniques/T0018.md
+++ b/generated_pages/techniques/T0018.md
@@ -15,21 +15,7 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00006 Charge for social media](../../generated_pages/counters/C00006.md) | D02 |
-| [C00010 Enhanced privacy regulation for social media](../../generated_pages/counters/C00010.md) | D02 |
-| [C00012 Platform regulation](../../generated_pages/counters/C00012.md) | D02 |
-| [C00016 Censorship](../../generated_pages/counters/C00016.md) | D02 |
| [C00065 Reduce political targeting](../../generated_pages/counters/C00065.md) | D03 |
-| [C00076 Prohibit images in political discourse channels](../../generated_pages/counters/C00076.md) | D02 |
-| [C00105 Buy more advertising than misinformation creators](../../generated_pages/counters/C00105.md) | D03 |
-| [C00107 Content moderation](../../generated_pages/counters/C00107.md) | D02 |
-| [C00122 Content moderation](../../generated_pages/counters/C00122.md) | D02 |
-| [C00129 Use banking to cut off access ](../../generated_pages/counters/C00129.md) | D02 |
-| [C00153 Take pre-emptive action against actors' infrastructure](../../generated_pages/counters/C00153.md) | D03 |
-| [C00176 Improve Coordination amongst stakeholders: public and private](../../generated_pages/counters/C00176.md) | D07 |
-| [C00178 Fill information voids with non-disinformation content](../../generated_pages/counters/C00178.md) | D04 |
-| [C00195 Redirect searches away from disinformation or extremist content ](../../generated_pages/counters/C00195.md) | D02 |
-| [C00216 Use advertiser controls to stem flow of funds to bad actors](../../generated_pages/counters/C00216.md) | D02 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0019.md b/generated_pages/techniques/T0019.md
index c2d3bf6..ab68f37 100644
--- a/generated_pages/techniques/T0019.md
+++ b/generated_pages/techniques/T0019.md
@@ -24,10 +24,8 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00042 Address truth contained in narratives](../../generated_pages/counters/C00042.md) | D04 |
| [C00071 Block source of pollution](../../generated_pages/counters/C00071.md) | D02 |
| [C00072 Remove non-relevant content from special interest groups - not recommended](../../generated_pages/counters/C00072.md) | D02 |
-| [C00073 Inoculate populations through media literacy training](../../generated_pages/counters/C00073.md) | D02 |
| [C00074 Identify and delete or rate limit identical content](../../generated_pages/counters/C00074.md) | D02 |
diff --git a/generated_pages/techniques/T0020.md b/generated_pages/techniques/T0020.md
index 0391f3a..895f401 100644
--- a/generated_pages/techniques/T0020.md
+++ b/generated_pages/techniques/T0020.md
@@ -17,10 +17,6 @@
| Counters | Response types |
| -------- | -------------- |
| [C00090 Fake engagement system](../../generated_pages/counters/C00090.md) | D05 |
-| [C00136 Microtarget most likely targets then send them countermessages](../../generated_pages/counters/C00136.md) | D03 |
-| [C00149 Poison the monitoring & evaluation data](../../generated_pages/counters/C00149.md) | D04 |
-| [C00178 Fill information voids with non-disinformation content](../../generated_pages/counters/C00178.md) | D04 |
-| [C00211 Use humorous counter-narratives](../../generated_pages/counters/C00211.md) | D03 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0022.001.md b/generated_pages/techniques/T0022.001.md
index f9f3744..97c3c35 100644
--- a/generated_pages/techniques/T0022.001.md
+++ b/generated_pages/techniques/T0022.001.md
@@ -1,6 +1,6 @@
# Technique T0022.001: Amplify Existing Conspiracy Theory Narratives
-* **Summary**: An influence operation may amplify an existing conspiracy theory narrative that aligns with its incident or campaign goals. By amplifying existing conspiracy theory narratives, operators can leverage the power of the existing communities that support and propagate those theories without needing to expend resources creating new narratives or building momentum and buy in around new narratives.
+* **Summary**: An influence operation may amplify an existing conspiracy theory narrative that aligns with its incident or campaign goals. By amplifying existing conspiracy theory narratives, operators can leverage the power of the existing communities that support and propagate those theories without needing to expend resources creating new narratives or building momentum and buy-in around new narratives.
* **Belongs to tactic stage**: TA14
diff --git a/generated_pages/techniques/T0022.002.md b/generated_pages/techniques/T0022.002.md
index b1a355d..f6d43c6 100644
--- a/generated_pages/techniques/T0022.002.md
+++ b/generated_pages/techniques/T0022.002.md
@@ -1,6 +1,6 @@
# Technique T0022.002: Develop Original Conspiracy Theory Narratives
-* **Summary**: While this requires more resources than amplifying existing conspiracy theory narratives, an influence operation may develop original conspiracy theory narratives in order to achieve greater control and alignment over the narrative and their campaign goals. Prominent examples include the USSR's Operation INFEKTION disinformation campaign run by the KGB in the 1980s to plant the idea that the United States had invented HIV/AIDS as part of a biological weapons research project at Fort Detrick, Maryland. More recently, Fort Detrick featured prominently in a new conspiracy theory narratives around the origins of the COVID-19 outbreak and pandemic.
+* **Summary**: While this requires more resources than amplifying existing conspiracy theory narratives, an influence operation may develop original conspiracy theory narratives in order to achieve greater control and alignment over the narrative and its campaign goals. Prominent examples include the USSR's Operation INFEKTION disinformation campaign run by the KGB in the 1980s to plant the idea that the United States had invented HIV/AIDS as part of a biological weapons research project at Fort Detrick, Maryland. More recently, Fort Detrick featured prominently in new conspiracy theory narratives around the origins of the COVID-19 outbreak and pandemic.
* **Belongs to tactic stage**: TA14
diff --git a/generated_pages/techniques/T0022.md b/generated_pages/techniques/T0022.md
index 2e53182..c604c98 100644
--- a/generated_pages/techniques/T0022.md
+++ b/generated_pages/techniques/T0022.md
@@ -1,6 +1,6 @@
# Technique T0022: Leverage Conspiracy Theory Narratives
-* **Summary**: "Conspiracy narratives" appeal to the human desire for explanatory order, by invoking the participation of poweful (often sinister) actors in pursuit of their own political goals. These narratives are especially appealing when an audience is low-information, marginalized or otherwise inclined to reject the prevailing explanation. Conspiracy narratives are an important component of the "firehose of falsehoods" model.
+* **Summary**: "Conspiracy narratives" appeal to the human desire for explanatory order, by invoking the participation of poweful (often sinister) actors in pursuit of their own political goals. These narratives are especially appealing when an audience is low-information, marginalised or otherwise inclined to reject the prevailing explanation. Conspiracy narratives are an important component of the "firehose of falsehoods" model.
* **Belongs to tactic stage**: TA14
@@ -13,48 +13,11 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00012 Platform regulation](../../generated_pages/counters/C00012.md) | D02 |
-| [C00016 Censorship](../../generated_pages/counters/C00016.md) | D02 |
-| [C00017 Repair broken social connections](../../generated_pages/counters/C00017.md) | D03 |
-| [C00019 Reduce effect of division-enablers](../../generated_pages/counters/C00019.md) | D03 |
-| [C00021 Encourage in-person communication](../../generated_pages/counters/C00021.md) | D04 |
-| [C00022 Innoculate. Positive campaign to promote feeling of safety](../../generated_pages/counters/C00022.md) | D04 |
-| [C00024 Promote healthy narratives](../../generated_pages/counters/C00024.md) | D04 |
-| [C00027 Create culture of civility](../../generated_pages/counters/C00027.md) | D07 |
-| [C00029 Create fake website to issue counter narrative and counter narrative through physical merchandise](../../generated_pages/counters/C00029.md) | D03 |
-| [C00030 Develop a compelling counter narrative (truth based)](../../generated_pages/counters/C00030.md) | D03 |
-| [C00042 Address truth contained in narratives](../../generated_pages/counters/C00042.md) | D04 |
-| [C00046 Marginalise and discredit extremist groups](../../generated_pages/counters/C00046.md) | D04 |
-| [C00072 Remove non-relevant content from special interest groups - not recommended](../../generated_pages/counters/C00072.md) | D02 |
-| [C00073 Inoculate populations through media literacy training](../../generated_pages/counters/C00073.md) | D02 |
-| [C00074 Identify and delete or rate limit identical content](../../generated_pages/counters/C00074.md) | D02 |
-| [C00080 Create competing narrative](../../generated_pages/counters/C00080.md) | D03 |
-| [C00081 Highlight flooding and noise, and explain motivations](../../generated_pages/counters/C00081.md) | D03 |
-| [C00082 Ground truthing as automated response to pollution](../../generated_pages/counters/C00082.md) | D03 |
-| [C00084 Modify disinformation narratives, and rebroadcast them](../../generated_pages/counters/C00084.md) | D03 |
-| [C00085 Mute content](../../generated_pages/counters/C00085.md) | D03 |
| [C00096 Strengthen institutions that are always truth tellers](../../generated_pages/counters/C00096.md) | D07 |
-| [C00107 Content moderation](../../generated_pages/counters/C00107.md) | D02 |
-| [C00117 Downgrade / de-amplify so message is seen by fewer people](../../generated_pages/counters/C00117.md) | D04 |
| [C00119 Engage payload and debunk.](../../generated_pages/counters/C00119.md) | D07 |
-| [C00122 Content moderation](../../generated_pages/counters/C00122.md) | D02 |
-| [C00125 Prebunking](../../generated_pages/counters/C00125.md) | D03 |
-| [C00126 Social media amber alert](../../generated_pages/counters/C00126.md) | D03 |
-| [C00156 Better tell your country or organization story](../../generated_pages/counters/C00156.md) | D03 |
+| [C00156 Better tell your country or organisation story](../../generated_pages/counters/C00156.md) | D03 |
| [C00161 Coalition Building with stakeholders and Third-Party Inducements](../../generated_pages/counters/C00161.md) | D07 |
-| [C00162 Unravel/target the Potemkin villages](../../generated_pages/counters/C00162.md) | D03 |
| [C00164 compatriot policy](../../generated_pages/counters/C00164.md) | D03 |
-| [C00169 develop a creative content hub](../../generated_pages/counters/C00169.md) | D03 |
-| [C00174 Create a healthier news environment](../../generated_pages/counters/C00174.md) | D02 |
-| [C00176 Improve Coordination amongst stakeholders: public and private](../../generated_pages/counters/C00176.md) | D07 |
-| [C00188 Newsroom/Journalist training to counter influence moves](../../generated_pages/counters/C00188.md) | D03 |
-| [C00190 open engagement with civil society](../../generated_pages/counters/C00190.md) | D03 |
-| [C00195 Redirect searches away from disinformation or extremist content ](../../generated_pages/counters/C00195.md) | D02 |
-| [C00200 Respected figure (influencer) disavows misinfo](../../generated_pages/counters/C00200.md) | D03 |
-| [C00203 Stop offering press credentials to propaganda outlets](../../generated_pages/counters/C00203.md) | D03 |
-| [C00205 strong dialogue between the federal government and private sector to encourage better reporting](../../generated_pages/counters/C00205.md) | D03 |
-| [C00211 Use humorous counter-narratives](../../generated_pages/counters/C00211.md) | D03 |
-| [C00212 build public resilience by making civil society more vibrant](../../generated_pages/counters/C00212.md) | D03 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0023.001.md b/generated_pages/techniques/T0023.001.md
index 745c0af..b4f7b31 100644
--- a/generated_pages/techniques/T0023.001.md
+++ b/generated_pages/techniques/T0023.001.md
@@ -1,6 +1,6 @@
# Technique T0023.001: Reframe Context
-* **Summary**: Reframing context refers to removing an event from its surrounding context to distort its intended meaning. Rather than deny that an event occurred, reframing context frames an event in a manner that may lead the target audience to draw a different conclusion about its intentions.
+* **Summary**: Reframing context refers to removing an event from its surrounding context to distort its intended meaning. Rather than deny that an event occurred, reframing context frames an event in a manner that may lead the target audience to draw a different conclusion about its intentions.
* **Belongs to tactic stage**: TA06
diff --git a/generated_pages/techniques/T0023.002.md b/generated_pages/techniques/T0023.002.md
index e804563..86fbba7 100644
--- a/generated_pages/techniques/T0023.002.md
+++ b/generated_pages/techniques/T0023.002.md
@@ -1,6 +1,6 @@
# Technique T0023.002: Edit Open-Source Content
-* **Summary**: An influence operation may edit open-source content, such as collaborative blogs or encyclopedias, to promote its narratives on outlets with existing credibility and audiences. Editing open-source content may allow an operation to post content on platforms without dedicating resources to the creation and maintenance of its own assets.
+* **Summary**: An influence operation may edit open-source content, such as collaborative blogs or encyclopaedias, to promote its narratives on outlets with existing credibility and audiences. Editing open-source content may allow an operation to post content on platforms without dedicating resources to the creation and maintenance of its own assets.
* **Belongs to tactic stage**: TA06
diff --git a/generated_pages/techniques/T0023.md b/generated_pages/techniques/T0023.md
index 54c7028..04d4124 100644
--- a/generated_pages/techniques/T0023.md
+++ b/generated_pages/techniques/T0023.md
@@ -14,19 +14,6 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00017 Repair broken social connections](../../generated_pages/counters/C00017.md) | D03 |
-| [C00019 Reduce effect of division-enablers](../../generated_pages/counters/C00019.md) | D03 |
-| [C00021 Encourage in-person communication](../../generated_pages/counters/C00021.md) | D04 |
-| [C00022 Innoculate. Positive campaign to promote feeling of safety](../../generated_pages/counters/C00022.md) | D04 |
-| [C00024 Promote healthy narratives](../../generated_pages/counters/C00024.md) | D04 |
-| [C00027 Create culture of civility](../../generated_pages/counters/C00027.md) | D07 |
-| [C00046 Marginalise and discredit extremist groups](../../generated_pages/counters/C00046.md) | D04 |
-| [C00072 Remove non-relevant content from special interest groups - not recommended](../../generated_pages/counters/C00072.md) | D02 |
-| [C00073 Inoculate populations through media literacy training](../../generated_pages/counters/C00073.md) | D02 |
-| [C00081 Highlight flooding and noise, and explain motivations](../../generated_pages/counters/C00081.md) | D03 |
-| [C00082 Ground truthing as automated response to pollution](../../generated_pages/counters/C00082.md) | D03 |
-| [C00092 Establish a truth teller reputation score for influencers](../../generated_pages/counters/C00092.md) | D07 |
-| [C00096 Strengthen institutions that are always truth tellers](../../generated_pages/counters/C00096.md) | D07 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0039.md b/generated_pages/techniques/T0039.md
index b6c8aa2..c1c3b11 100644
--- a/generated_pages/techniques/T0039.md
+++ b/generated_pages/techniques/T0039.md
@@ -1,17 +1,24 @@
-# Technique T0039 : Bait Legitimate Influencers
+# Technique T0039: Bait Legitimate Influencers
-* **Summary**: Credibility in a social media environment is often a function of the size of a user's network. "Influencers" are so-called because of their reach, typically understood as: 1) the size of their network (i.e. the number of followers, perhaps weighted by their own influence); and 2) The rate at which their comments are re-circulated (these two metrics are related). Add traditional media players at all levels of credibility and professionalism to this, and the number of potential influencial carriers available for unwitting amplification becomes substantial. By targeting high-influence people and organizations in all types of media with narratives and content engineered to appeal their emotional or ideological drivers, influence campaigns are able to add perceived credibility to their messaging via saturation and adoption by trusted agents such as celebrities, journalists and local leaders.
+* **Summary**: Credibility in a social media environment is often a function of the size of a user's network. "Influencers" are so-called because of their reach, typically understood as: 1) the size of their network (i.e. the number of followers, perhaps weighted by their own influence); and 2) the rate at which their comments are re-circulated (these two metrics are related). Add traditional media players at all levels of credibility and professionalism to this, and the number of potential influential carriers available for unwitting amplification becomes substantial. By targeting high-influence people and organisations in all types of media with narratives and content engineered to appeal to their emotional or ideological drivers, influence campaigns are able to add perceived credibility to their messaging via saturation and adoption by trusted agents such as celebrities, journalists and local leaders.
* **Belongs to tactic stage**: TA08
| Incident | Descriptions given for this incident |
| -------- | -------------------- |
+| [I00006 Columbian Chemicals](../../generated_pages/incidents/I00006.md) | bait journalists/media/politicians |
+| [I00010 ParklandTeens](../../generated_pages/incidents/I00010.md) | journalist/media baiting |
+| [I00015 ConcordDiscovery](../../generated_pages/incidents/I00015.md) | journalist/media baiting |
| Counters | Response types |
| -------- | -------------- |
+| [C00087 Make more noise than the disinformation](../../generated_pages/counters/C00087.md) | D04 |
+| [C00114 Don't engage with payloads](../../generated_pages/counters/C00114.md) | D02 |
+| [C00154 Ask media not to report false information](../../generated_pages/counters/C00154.md) | D02 |
+| [C00160 find and train influencers](../../generated_pages/counters/C00160.md) | D02 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0040.md b/generated_pages/techniques/T0040.md
index cad11af..df5588c 100644
--- a/generated_pages/techniques/T0040.md
+++ b/generated_pages/techniques/T0040.md
@@ -14,7 +14,6 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00073 Inoculate populations through media literacy training](../../generated_pages/counters/C00073.md) | D02 |
| [C00112 "Prove they are not an op!"](../../generated_pages/counters/C00112.md) | D02 |
diff --git a/generated_pages/techniques/T0042.md b/generated_pages/techniques/T0042.md
index 3d34fa0..113dd14 100644
--- a/generated_pages/techniques/T0042.md
+++ b/generated_pages/techniques/T0042.md
@@ -12,9 +12,6 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00009 Educate high profile influencers on best practices](../../generated_pages/counters/C00009.md) | D02 |
-| [C00042 Address truth contained in narratives](../../generated_pages/counters/C00042.md) | D04 |
-| [C00112 "Prove they are not an op!"](../../generated_pages/counters/C00112.md) | D02 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0043.md b/generated_pages/techniques/T0043.md
index 3857e3d..f2e1553 100644
--- a/generated_pages/techniques/T0043.md
+++ b/generated_pages/techniques/T0043.md
@@ -13,13 +13,7 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00006 Charge for social media](../../generated_pages/counters/C00006.md) | D02 |
-| [C00012 Platform regulation](../../generated_pages/counters/C00012.md) | D02 |
-| [C00016 Censorship](../../generated_pages/counters/C00016.md) | D02 |
-| [C00074 Identify and delete or rate limit identical content](../../generated_pages/counters/C00074.md) | D02 |
-| [C00121 Tool transparency and literacy for channels people follow. ](../../generated_pages/counters/C00121.md) | D07 |
-| [C00135 Deplatform message groups and/or message boards](../../generated_pages/counters/C00135.md) | D03 |
-| [C00176 Improve Coordination amongst stakeholders: public and private](../../generated_pages/counters/C00176.md) | D07 |
+| [C00121 Tool transparency and literacy for channels people follow.](../../generated_pages/counters/C00121.md) | D07 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0044.md b/generated_pages/techniques/T0044.md
index c9a889e..149918c 100644
--- a/generated_pages/techniques/T0044.md
+++ b/generated_pages/techniques/T0044.md
@@ -1,6 +1,6 @@
# Technique T0044: Seed Distortions
-* **Summary**: Try a wide variety of messages in the early hours surrounding an incident or event, to give a misleading account or impression.
+* **Summary**: Try a wide variety of messages in the early hours surrounding an incident or event, to give a misleading account or impression.
* **Belongs to tactic stage**: TA08
@@ -13,34 +13,8 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00008 Create shared fact-checking database](../../generated_pages/counters/C00008.md) | D04 |
-| [C00009 Educate high profile influencers on best practices](../../generated_pages/counters/C00009.md) | D02 |
-| [C00011 Media literacy. Games to identify fake news](../../generated_pages/counters/C00011.md) | D02 |
-| [C00014 Real-time updates to fact-checking database](../../generated_pages/counters/C00014.md) | D04 |
-| [C00016 Censorship](../../generated_pages/counters/C00016.md) | D02 |
-| [C00019 Reduce effect of division-enablers](../../generated_pages/counters/C00019.md) | D03 |
-| [C00021 Encourage in-person communication](../../generated_pages/counters/C00021.md) | D04 |
-| [C00022 Innoculate. Positive campaign to promote feeling of safety](../../generated_pages/counters/C00022.md) | D04 |
-| [C00024 Promote healthy narratives](../../generated_pages/counters/C00024.md) | D04 |
-| [C00027 Create culture of civility](../../generated_pages/counters/C00027.md) | D07 |
-| [C00042 Address truth contained in narratives](../../generated_pages/counters/C00042.md) | D04 |
-| [C00046 Marginalise and discredit extremist groups](../../generated_pages/counters/C00046.md) | D04 |
-| [C00072 Remove non-relevant content from special interest groups - not recommended](../../generated_pages/counters/C00072.md) | D02 |
-| [C00073 Inoculate populations through media literacy training](../../generated_pages/counters/C00073.md) | D02 |
-| [C00076 Prohibit images in political discourse channels](../../generated_pages/counters/C00076.md) | D02 |
-| [C00078 Change Search Algorithms for Disinformation Content](../../generated_pages/counters/C00078.md) | D03 |
-| [C00081 Highlight flooding and noise, and explain motivations](../../generated_pages/counters/C00081.md) | D03 |
-| [C00082 Ground truthing as automated response to pollution](../../generated_pages/counters/C00082.md) | D03 |
-| [C00085 Mute content](../../generated_pages/counters/C00085.md) | D03 |
| [C00086 Distract from noise with addictive content](../../generated_pages/counters/C00086.md) | D04 |
-| [C00087 Make more noise than the disinformation](../../generated_pages/counters/C00087.md) | D04 |
-| [C00092 Establish a truth teller reputation score for influencers](../../generated_pages/counters/C00092.md) | D07 |
-| [C00117 Downgrade / de-amplify so message is seen by fewer people](../../generated_pages/counters/C00117.md) | D04 |
| [C00118 Repurpose images with new text](../../generated_pages/counters/C00118.md) | D04 |
-| [C00119 Engage payload and debunk.](../../generated_pages/counters/C00119.md) | D07 |
-| [C00122 Content moderation](../../generated_pages/counters/C00122.md) | D02 |
-| [C00174 Create a healthier news environment](../../generated_pages/counters/C00174.md) | D02 |
-| [C00184 Media exposure](../../generated_pages/counters/C00184.md) | D04 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0045.md b/generated_pages/techniques/T0045.md
index bb58687..a6ee98e 100644
--- a/generated_pages/techniques/T0045.md
+++ b/generated_pages/techniques/T0045.md
@@ -13,38 +13,8 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00008 Create shared fact-checking database](../../generated_pages/counters/C00008.md) | D04 |
-| [C00011 Media literacy. Games to identify fake news](../../generated_pages/counters/C00011.md) | D02 |
-| [C00012 Platform regulation](../../generated_pages/counters/C00012.md) | D02 |
-| [C00014 Real-time updates to fact-checking database](../../generated_pages/counters/C00014.md) | D04 |
-| [C00016 Censorship](../../generated_pages/counters/C00016.md) | D02 |
-| [C00036 Infiltrate the in-group to discredit leaders (divide)](../../generated_pages/counters/C00036.md) | D02 |
-| [C00046 Marginalise and discredit extremist groups](../../generated_pages/counters/C00046.md) | D04 |
-| [C00048 Name and Shame Influencers](../../generated_pages/counters/C00048.md) | D07 |
-| [C00073 Inoculate populations through media literacy training](../../generated_pages/counters/C00073.md) | D02 |
-| [C00085 Mute content](../../generated_pages/counters/C00085.md) | D03 |
-| [C00092 Establish a truth teller reputation score for influencers](../../generated_pages/counters/C00092.md) | D07 |
-| [C00097 Require use of verified identities to contribute to poll or comment](../../generated_pages/counters/C00097.md) | D02 |
-| [C00099 Strengthen verification methods](../../generated_pages/counters/C00099.md) | D02 |
-| [C00107 Content moderation](../../generated_pages/counters/C00107.md) | D02 |
| [C00113 Debunk and defuse a fake expert / credentials.](../../generated_pages/counters/C00113.md) | D02 |
-| [C00117 Downgrade / de-amplify so message is seen by fewer people](../../generated_pages/counters/C00117.md) | D04 |
-| [C00119 Engage payload and debunk.](../../generated_pages/counters/C00119.md) | D07 |
-| [C00122 Content moderation](../../generated_pages/counters/C00122.md) | D02 |
-| [C00125 Prebunking](../../generated_pages/counters/C00125.md) | D03 |
-| [C00126 Social media amber alert](../../generated_pages/counters/C00126.md) | D03 |
-| [C00133 Deplatform Account*](../../generated_pages/counters/C00133.md) | D03 |
-| [C00133 Deplatform Account*](../../generated_pages/counters/C00133.md) | D03 |
-| [C00154 Ask media not to report false information](../../generated_pages/counters/C00154.md) | D02 |
-| [C00174 Create a healthier news environment](../../generated_pages/counters/C00174.md) | D02 |
-| [C00176 Improve Coordination amongst stakeholders: public and private](../../generated_pages/counters/C00176.md) | D07 |
| [C00184 Media exposure](../../generated_pages/counters/C00184.md) | D04 |
-| [C00184 Media exposure](../../generated_pages/counters/C00184.md) | D04 |
-| [C00188 Newsroom/Journalist training to counter influence moves](../../generated_pages/counters/C00188.md) | D03 |
-| [C00195 Redirect searches away from disinformation or extremist content ](../../generated_pages/counters/C00195.md) | D02 |
-| [C00200 Respected figure (influencer) disavows misinfo](../../generated_pages/counters/C00200.md) | D03 |
-| [C00203 Stop offering press credentials to propaganda outlets](../../generated_pages/counters/C00203.md) | D03 |
-| [C00211 Use humorous counter-narratives](../../generated_pages/counters/C00211.md) | D03 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0046.md b/generated_pages/techniques/T0046.md
index 9bc4d05..ab36822 100644
--- a/generated_pages/techniques/T0046.md
+++ b/generated_pages/techniques/T0046.md
@@ -1,6 +1,6 @@
-# Technique T0046: Use Search Engine Optimization
+# Technique T0046: Use Search Engine Optimisation
-* **Summary**: Manipulate content engagement metrics (ie: Reddit & Twitter) to influence/impact news search results (e.g. Google), also elevates RT & Sputnik headline into Google news alert emails. aka "Black-hat SEO"
+* **Summary**: Manipulate content engagement metrics (e.g. on Reddit and Twitter) to influence or impact news search results (e.g. on Google); this also elevates RT and Sputnik headlines into Google news alert emails. Also known as "black-hat SEO".
* **Belongs to tactic stage**: TA08
@@ -24,13 +24,7 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00012 Platform regulation](../../generated_pages/counters/C00012.md) | D02 |
-| [C00070 Block access to disinformation resources](../../generated_pages/counters/C00070.md) | D02 |
-| [C00078 Change Search Algorithms for Disinformation Content](../../generated_pages/counters/C00078.md) | D03 |
| [C00117 Downgrade / de-amplify so message is seen by fewer people](../../generated_pages/counters/C00117.md) | D04 |
-| [C00149 Poison the monitoring & evaluation data](../../generated_pages/counters/C00149.md) | D04 |
-| [C00188 Newsroom/Journalist training to counter influence moves](../../generated_pages/counters/C00188.md) | D03 |
-| [C00195 Redirect searches away from disinformation or extremist content ](../../generated_pages/counters/C00195.md) | D02 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0047.md b/generated_pages/techniques/T0047.md
index b035d19..f87aeef 100644
--- a/generated_pages/techniques/T0047.md
+++ b/generated_pages/techniques/T0047.md
@@ -13,9 +13,6 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00012 Platform regulation](../../generated_pages/counters/C00012.md) | D02 |
-| [C00060 Legal action against for-profit engagement factories](../../generated_pages/counters/C00060.md) | D03 |
-| [C00093 Influencer code of conduct](../../generated_pages/counters/C00093.md) | D07 |
| [C00120 Open dialogue about design of platforms to produce different outcomes](../../generated_pages/counters/C00120.md) | D07 |
diff --git a/generated_pages/techniques/T0048.001.md b/generated_pages/techniques/T0048.001.md
index 2068424..60486ab 100644
--- a/generated_pages/techniques/T0048.001.md
+++ b/generated_pages/techniques/T0048.001.md
@@ -1,6 +1,6 @@
# Technique T0048.001: Boycott/"Cancel" Opponents
-* **Summary**: Cancel culture refers to the phenomenon in which individuals collectively refrain from supporting an individual, organization, business, or other entity, usually following a real or falsified controversy. An influence operation may exploit cancel culture by emphasizing an adversary’s problematic or disputed behavior and presenting its own content as an alternative.
+* **Summary**: Cancel culture refers to the phenomenon in which individuals collectively refrain from supporting an individual, organisation, business, or other entity, usually following a real or falsified controversy. An influence operation may exploit cancel culture by emphasising an adversary’s problematic or disputed behaviour and presenting its own content as an alternative.
* **Belongs to tactic stage**: TA18
diff --git a/generated_pages/techniques/T0048.003.md b/generated_pages/techniques/T0048.003.md
index b1046b9..fc50358 100644
--- a/generated_pages/techniques/T0048.003.md
+++ b/generated_pages/techniques/T0048.003.md
@@ -1,6 +1,6 @@
# Technique T0048.003: Threaten to Dox
-* **Summary**: Doxing refers to online harassment in which individuals publicly release private information about another individual, including names, addresses, employment information, pictures, family members, and other sensitive information. An influence operation may dox its opposition to encourage individuals aligned with operation narratives to harass the doxed individuals themselves or otherwise discourage the doxed individuals from posting or proliferating conflicting content.
+* **Summary**: Doxing refers to online harassment in which individuals publicly release private information about another individual, including names, addresses, employment information, pictures, family members, and other sensitive information. An influence operation may dox its opposition to encourage individuals aligned with operation narratives to harass the doxed individuals themselves or otherwise discourage the doxed individuals from posting or proliferating conflicting content.
* **Belongs to tactic stage**: TA18
diff --git a/generated_pages/techniques/T0048.004.md b/generated_pages/techniques/T0048.004.md
index 6cc6b23..731fcf8 100644
--- a/generated_pages/techniques/T0048.004.md
+++ b/generated_pages/techniques/T0048.004.md
@@ -1,6 +1,6 @@
# Technique T0048.004: Dox
-* **Summary**: Doxing refers to online harassment in which individuals publicly release private information about another individual, including names, addresses, employment information, pictures, family members, and other sensitive information. An influence operation may dox its opposition to encourage individuals aligned with operation narratives to harass the doxed individuals themselves or otherwise discourage the doxed individuals from posting or proliferating conflicting content.
+* **Summary**: Doxing refers to online harassment in which individuals publicly release private information about another individual, including names, addresses, employment information, pictures, family members, and other sensitive information. An influence operation may dox its opposition to encourage individuals aligned with operation narratives to harass the doxed individuals themselves or otherwise discourage the doxed individuals from posting or proliferating conflicting content.
* **Belongs to tactic stage**: TA18
diff --git a/generated_pages/techniques/T0048.md b/generated_pages/techniques/T0048.md
index a374048..c4dcd34 100644
--- a/generated_pages/techniques/T0048.md
+++ b/generated_pages/techniques/T0048.md
@@ -1,6 +1,6 @@
# Technique T0048: Harass
-* **Summary**: Threatening or harassing believers of opposing narratives refers to the use of intimidation techniques, including cyberbullying and doxing, to discourage opponents from voicing their dissent. An influence operation may threaten or harass believers of the opposing narratives to deter individuals from posting or proliferating conflicting content.
+* **Summary**: Threatening or harassing believers of opposing narratives refers to the use of intimidation techniques, including cyberbullying and doxing, to discourage opponents from voicing their dissent. An influence operation may threaten or harass believers of the opposing narratives to deter individuals from posting or proliferating conflicting content.
* **Belongs to tactic stage**: TA18
@@ -13,18 +13,6 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00009 Educate high profile influencers on best practices](../../generated_pages/counters/C00009.md) | D02 |
-| [C00012 Platform regulation](../../generated_pages/counters/C00012.md) | D02 |
-| [C00027 Create culture of civility](../../generated_pages/counters/C00027.md) | D07 |
-| [C00046 Marginalise and discredit extremist groups](../../generated_pages/counters/C00046.md) | D04 |
-| [C00048 Name and Shame Influencers](../../generated_pages/counters/C00048.md) | D07 |
-| [C00073 Inoculate populations through media literacy training](../../generated_pages/counters/C00073.md) | D02 |
-| [C00087 Make more noise than the disinformation](../../generated_pages/counters/C00087.md) | D04 |
-| [C00093 Influencer code of conduct](../../generated_pages/counters/C00093.md) | D07 |
-| [C00114 Don't engage with payloads](../../generated_pages/counters/C00114.md) | D02 |
-| [C00115 Expose actor and intentions](../../generated_pages/counters/C00115.md) | D02 |
-| [C00154 Ask media not to report false information](../../generated_pages/counters/C00154.md) | D02 |
-| [C00160 find and train influencers](../../generated_pages/counters/C00160.md) | D02 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0049.003.md b/generated_pages/techniques/T0049.003.md
index 4bfec7c..944d37d 100644
--- a/generated_pages/techniques/T0049.003.md
+++ b/generated_pages/techniques/T0049.003.md
@@ -1,7 +1,6 @@
# Technique T0049.003: Bots Amplify via Automated Forwarding and Reposting
-* **Summary**: Automated forwarding and reposting refer to the proliferation of operation content using automated means, such as artificial intelligence or social media bots. An influence operation may use automated activity to increase content exposure without dedicating the resources, including personnel and time, traditionally required to forward and repost content.
-Use bots to amplify narratives above algorithm thresholds. Bots are automated/programmed profiles designed to amplify content (ie: automatically retweet or like) and give appearance it's more "popular" than it is. They can operate as a network, to function in a coordinated/orchestrated manner. In some cases (more so now) they are an inexpensive/disposable assets used for minimal deployment as bot detection tools improve and platforms are more responsive.
+* **Summary**: Automated forwarding and reposting refer to the proliferation of operation content using automated means, such as artificial intelligence or social media bots. An influence operation may use automated activity to increase content exposure without dedicating the resources, including personnel and time, traditionally required to forward and repost content. Use bots to amplify narratives above algorithm thresholds. Bots are automated/programmed profiles designed to amplify content (i.e. automatically retweet or like) and give the appearance it's more "popular" than it is. They can operate as a network, to function in a coordinated/orchestrated manner. In some cases (more so now) they are inexpensive/disposable assets used for minimal deployment as bot detection tools improve and platforms become more responsive. (See the detection sketch below.)
* **Belongs to tactic stage**: TA17
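Illustrative aside (not part of the generated page): the summary above describes bots amplifying the same content as a coordinated network. Below is a minimal sketch of how a defender might surface that coordination from a repost log; the account names, `WINDOW`, and `MIN_SHARED` values are hypothetical placeholders, not values from this framework.

```python
from collections import defaultdict
from itertools import combinations

# Toy repost log: (account, original_post_id, unix_timestamp). Invented data.
REPOSTS = [
    ("bot_a", "p1", 1000), ("bot_b", "p1", 1002), ("bot_c", "p1", 1003),
    ("bot_a", "p2", 2000), ("bot_b", "p2", 2001), ("bot_c", "p2", 2004),
    ("user_x", "p1", 5000), ("user_y", "p2", 9000),
]

WINDOW = 10     # seconds: reposts of the same item this close count as co-amplification
MIN_SHARED = 2  # a pair must co-amplify at least this many distinct posts to be flagged

def coordinated_pairs(reposts, window=WINDOW, min_shared=MIN_SHARED):
    """Return account pairs that repeatedly repost the same posts near-simultaneously."""
    by_post = defaultdict(list)
    for account, post, ts in reposts:
        by_post[post].append((account, ts))

    shared = defaultdict(set)  # (acct_a, acct_b) -> posts they co-amplified
    for post, events in by_post.items():
        for (a, ta), (b, tb) in combinations(sorted(events), 2):
            if a != b and abs(ta - tb) <= window:
                shared[tuple(sorted((a, b)))].add(post)

    return {pair: posts for pair, posts in shared.items() if len(posts) >= min_shared}

if __name__ == "__main__":
    for (a, b), posts in coordinated_pairs(REPOSTS).items():
        print(f"{a} and {b} co-amplified {sorted(posts)}")
```

Pairwise co-timing is a deliberately crude signal; a real pipeline would combine it with account age, content similarity, and client metadata before acting on it.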
diff --git a/generated_pages/techniques/T0049.004.md b/generated_pages/techniques/T0049.004.md
index a574d11..fac68ff 100644
--- a/generated_pages/techniques/T0049.004.md
+++ b/generated_pages/techniques/T0049.004.md
@@ -1,6 +1,6 @@
-# Technique T0049.004: Utilize Spamoflauge
+# Technique T0049.004: Utilise Spamoflauge
-* **Summary**: Spamoflauge refers to the practice of disguising spam messages as legitimate. Spam refers to the use of electronic messaging systems to send out unrequested or unwanted messages in bulk. Simple methods of spamoflauge include replacing letters with numbers to fool keyword-based email spam filters, for example, "you've w0n our jackp0t!". Spamoflauge may extend to more complex techniques such as modifying the grammar or word choice of the language, casting messages as images which spam detectors cannot automatically read, or encapsulating messages in password protected attachments, such as .pdf or .zip files. Influence operations may use spamoflauge to avoid spam filtering systems and increase the likelihood of the target audience receiving operation messaging.
+* **Summary**: Spamoflauge refers to the practice of disguising spam messages as legitimate. Spam refers to the use of electronic messaging systems to send out unrequested or unwanted messages in bulk. Simple methods of spamoflauge include replacing letters with numbers to fool keyword-based email spam filters, for example, "you've w0n our jackp0t!". Spamoflauge may extend to more complex techniques such as modifying the grammar or word choice of the language, casting messages as images which spam detectors cannot automatically read, or encapsulating messages in password-protected attachments, such as .pdf or .zip files. Influence operations may use spamoflauge to avoid spam filtering systems and increase the likelihood of the target audience receiving operation messaging. (See the filter sketch below.)
* **Belongs to tactic stage**: TA17
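Illustrative aside (not part of the generated page): the simplest spamoflauge variant described above swaps letters for look-alike characters to slip past keyword filters. A minimal counter-sketch follows, assuming a hypothetical `LEET_MAP` and `BLOCKLIST`; it normalises those substitutions before matching, and does not address the image-based or password-protected-attachment variants the summary also mentions.

```python
import re

# Common character swaps used to evade keyword filters, mapped back to letters.
# The map and blocklist below are invented examples, not a vetted ruleset.
LEET_MAP = str.maketrans({"0": "o", "1": "l", "3": "e", "4": "a",
                          "5": "s", "7": "t", "@": "a", "$": "s"})
BLOCKLIST = {"won", "jackpot", "prize"}

def is_spam(message: str) -> bool:
    """Keyword filter that normalises leetspeak before matching."""
    normalised = message.lower().translate(LEET_MAP)
    words = re.findall(r"[a-z]+", normalised)
    return any(word in BLOCKLIST for word in words)

assert is_spam("you've w0n our jackp0t!")         # caught once normalised
assert not is_spam("you have a message waiting")  # benign text passes
```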
diff --git a/generated_pages/techniques/T0049.005.md b/generated_pages/techniques/T0049.005.md
index e3a547b..699d738 100644
--- a/generated_pages/techniques/T0049.005.md
+++ b/generated_pages/techniques/T0049.005.md
@@ -1,6 +1,6 @@
# Technique T0049.005: Conduct Swarming
-* **Summary**: Swarming refers to the coordinated use of accounts to overwhelm the information space with operation content. Unlike information flooding, swarming centers exclusively around a specific event or actor rather than a general narrative. Swarming relies on “horizontal communication” between information assets rather than a top-down, vertical command-and-control approach.
+* **Summary**: Swarming refers to the coordinated use of accounts to overwhelm the information space with operation content. Unlike information flooding, swarming centres exclusively on a specific event or actor rather than a general narrative. Swarming relies on “horizontal communication” between information assets rather than a top-down, vertical command-and-control approach.
* **Belongs to tactic stage**: TA17
diff --git a/generated_pages/techniques/T0049.006.md b/generated_pages/techniques/T0049.006.md
index 3a46735..068f583 100644
--- a/generated_pages/techniques/T0049.006.md
+++ b/generated_pages/techniques/T0049.006.md
@@ -1,6 +1,6 @@
# Technique T0049.006: Conduct Keyword Squatting
-* **Summary**: Keyword squatting refers to the creation of online content, such as websites, articles, or social media accounts, around a specific search engine-optimized term to overwhelm the search results of that term. An influence may keyword squat to increase content exposure to target audience members who query the exploited term in a search engine and manipulate the narrative around the term.
+* **Summary**: Keyword squatting refers to the creation of online content, such as websites, articles, or social media accounts, around a specific search engine-optimised term to overwhelm the search results of that term. An influence operation may keyword squat to increase content exposure to target audience members who query the exploited term in a search engine and manipulate the narrative around the term. (See the detection sketch below.)
* **Belongs to tactic stage**: TA17
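Illustrative aside (not part of the generated page): keyword squatting typically shows up as a burst of new assets created around a single term. The sketch below flags a term when several domains containing it are registered within a short window; the domains, dates, and thresholds are invented for illustration, not drawn from this framework.

```python
from datetime import date

# Toy registration feed: (domain, registration_date). Invented data.
DOMAINS = [
    ("acme-vaccine-facts.com", date(2021, 3, 1)),
    ("vaccine-facts-today.net", date(2021, 3, 2)),
    ("the-vaccine-facts.org", date(2021, 3, 3)),
    ("gardening-weekly.com", date(2020, 7, 9)),
]

def squatting_burst(domains, term, window_days=7, min_count=3):
    """Flag a term if several domains containing it were registered in a short burst."""
    hits = sorted(reg for name, reg in domains if term in name)
    for i, start in enumerate(hits):
        in_window = [d for d in hits[i:] if (d - start).days <= window_days]
        if len(in_window) >= min_count:
            return True
    return False

print(squatting_burst(DOMAINS, "vaccine-facts"))  # True: three registrations in three days
print(squatting_burst(DOMAINS, "gardening"))      # False: a single, older registration
```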
diff --git a/generated_pages/techniques/T0049.md b/generated_pages/techniques/T0049.md
index 82c2548..d5f0a90 100644
--- a/generated_pages/techniques/T0049.md
+++ b/generated_pages/techniques/T0049.md
@@ -14,22 +14,7 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00012 Platform regulation](../../generated_pages/counters/C00012.md) | D02 |
-| [C00016 Censorship](../../generated_pages/counters/C00016.md) | D02 |
-| [C00044 Keep people from posting to social media immediately](../../generated_pages/counters/C00044.md) | D03 |
-| [C00072 Remove non-relevant content from special interest groups - not recommended](../../generated_pages/counters/C00072.md) | D02 |
-| [C00074 Identify and delete or rate limit identical content](../../generated_pages/counters/C00074.md) | D02 |
-| [C00085 Mute content](../../generated_pages/counters/C00085.md) | D03 |
-| [C00086 Distract from noise with addictive content](../../generated_pages/counters/C00086.md) | D04 |
-| [C00087 Make more noise than the disinformation](../../generated_pages/counters/C00087.md) | D04 |
-| [C00091 Honeypot social community](../../generated_pages/counters/C00091.md) | D05 |
-| [C00101 Create friction by rate-limiting engagement](../../generated_pages/counters/C00101.md) | D04 |
-| [C00122 Content moderation](../../generated_pages/counters/C00122.md) | D02 |
-| [C00123 Remove or rate limit botnets](../../generated_pages/counters/C00123.md) | D03 |
-| [C00128 Create friction by marking content with ridicule or other "decelerants"](../../generated_pages/counters/C00128.md) | D03 |
| [C00131 Seize and analyse botnet servers](../../generated_pages/counters/C00131.md) | D02 |
-| [C00153 Take pre-emptive action against actors' infrastructure](../../generated_pages/counters/C00153.md) | D03 |
-| [C00176 Improve Coordination amongst stakeholders: public and private](../../generated_pages/counters/C00176.md) | D07 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0057.001.md b/generated_pages/techniques/T0057.001.md
index d88af6a..fdc79a5 100644
--- a/generated_pages/techniques/T0057.001.md
+++ b/generated_pages/techniques/T0057.001.md
@@ -1,6 +1,6 @@
# Technique T0057.001: Pay for Physical Action
-* **Summary**: Paying for physical action occurs when an influence operation pays individuals to act in the physical realm. An influence operation may pay for physical action to create specific situations and frame them in a way that supports operation narratives, for example, paying a group of people to burn a car to later post an image of the burning car and frame it as an act of protest.
+* **Summary**: Paying for physical action occurs when an influence operation pays individuals to act in the physical realm. An influence operation may pay for physical action to create specific situations and frame them in a way that supports operation narratives, for example, paying a group of people to burn a car to later post an image of the burning car and frame it as an act of protest.
* **Belongs to tactic stage**: TA10
diff --git a/generated_pages/techniques/T0057.002.md b/generated_pages/techniques/T0057.002.md
index c8108bb..4a2f0ab 100644
--- a/generated_pages/techniques/T0057.002.md
+++ b/generated_pages/techniques/T0057.002.md
@@ -1,6 +1,6 @@
# Technique T0057.002: Conduct Symbolic Action
-* **Summary**: Symbolic action refers to activities specifically intended to advance an operation’s narrative by signaling something to the audience, for example, a military parade supporting a state’s narrative of military superiority. An influence operation may use symbolic action to create falsified evidence supporting operation narratives in the physical information space.
+* **Summary**: Symbolic action refers to activities specifically intended to advance an operation’s narrative by signalling something to the audience, for example, a military parade supporting a state’s narrative of military superiority. An influence operation may use symbolic action to create falsified evidence supporting operation narratives in the physical information space.
* **Belongs to tactic stage**: TA10
diff --git a/generated_pages/techniques/T0057.md b/generated_pages/techniques/T0057.md
index b3dd95a..908eb81 100644
--- a/generated_pages/techniques/T0057.md
+++ b/generated_pages/techniques/T0057.md
@@ -1,4 +1,4 @@
-# Technique T0057: Organize Events
+# Technique T0057: Organise Events
* **Summary**: Coordinate and promote real-world events across media platforms, e.g. rallies, protests, gatherings in support of incident narratives.
@@ -16,24 +16,7 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00012 Platform regulation](../../generated_pages/counters/C00012.md) | D02 |
-| [C00016 Censorship](../../generated_pages/counters/C00016.md) | D02 |
-| [C00036 Infiltrate the in-group to discredit leaders (divide)](../../generated_pages/counters/C00036.md) | D02 |
-| [C00046 Marginalise and discredit extremist groups](../../generated_pages/counters/C00046.md) | D04 |
-| [C00048 Name and Shame Influencers](../../generated_pages/counters/C00048.md) | D07 |
-| [C00070 Block access to disinformation resources](../../generated_pages/counters/C00070.md) | D02 |
-| [C00074 Identify and delete or rate limit identical content](../../generated_pages/counters/C00074.md) | D02 |
-| [C00085 Mute content](../../generated_pages/counters/C00085.md) | D03 |
-| [C00107 Content moderation](../../generated_pages/counters/C00107.md) | D02 |
-| [C00122 Content moderation](../../generated_pages/counters/C00122.md) | D02 |
-| [C00126 Social media amber alert](../../generated_pages/counters/C00126.md) | D03 |
-| [C00128 Create friction by marking content with ridicule or other "decelerants"](../../generated_pages/counters/C00128.md) | D03 |
-| [C00129 Use banking to cut off access ](../../generated_pages/counters/C00129.md) | D02 |
-| [C00149 Poison the monitoring & evaluation data](../../generated_pages/counters/C00149.md) | D04 |
-| [C00153 Take pre-emptive action against actors' infrastructure](../../generated_pages/counters/C00153.md) | D03 |
-| [C00176 Improve Coordination amongst stakeholders: public and private](../../generated_pages/counters/C00176.md) | D07 |
-| [C00211 Use humorous counter-narratives](../../generated_pages/counters/C00211.md) | D03 |
-| [C00216 Use advertiser controls to stem flow of funds to bad actors](../../generated_pages/counters/C00216.md) | D02 |
+| [C00129 Use banking to cut off access](../../generated_pages/counters/C00129.md) | D02 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0059.md b/generated_pages/techniques/T0059.md
index b8cd911..5c2140b 100644
--- a/generated_pages/techniques/T0059.md
+++ b/generated_pages/techniques/T0059.md
@@ -12,8 +12,6 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00042 Address truth contained in narratives](../../generated_pages/counters/C00042.md) | D04 |
-| [C00084 Modify disinformation narratives, and rebroadcast them](../../generated_pages/counters/C00084.md) | D03 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0060.md b/generated_pages/techniques/T0060.md
index 076414b..b51991b 100644
--- a/generated_pages/techniques/T0060.md
+++ b/generated_pages/techniques/T0060.md
@@ -12,17 +12,6 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00046 Marginalise and discredit extremist groups](../../generated_pages/counters/C00046.md) | D04 |
-| [C00048 Name and Shame Influencers](../../generated_pages/counters/C00048.md) | D07 |
-| [C00073 Inoculate populations through media literacy training](../../generated_pages/counters/C00073.md) | D02 |
-| [C00074 Identify and delete or rate limit identical content](../../generated_pages/counters/C00074.md) | D02 |
-| [C00078 Change Search Algorithms for Disinformation Content](../../generated_pages/counters/C00078.md) | D03 |
-| [C00085 Mute content](../../generated_pages/counters/C00085.md) | D03 |
-| [C00117 Downgrade / de-amplify so message is seen by fewer people](../../generated_pages/counters/C00117.md) | D04 |
-| [C00119 Engage payload and debunk.](../../generated_pages/counters/C00119.md) | D07 |
-| [C00122 Content moderation](../../generated_pages/counters/C00122.md) | D02 |
-| [C00123 Remove or rate limit botnets](../../generated_pages/counters/C00123.md) | D03 |
-| [C00131 Seize and analyse botnet servers](../../generated_pages/counters/C00131.md) | D02 |
| [C00138 Spam domestic actors with lawsuits](../../generated_pages/counters/C00138.md) | D03 |
| [C00143 (botnet) DMCA takedown requests to waste group time](../../generated_pages/counters/C00143.md) | D04 |
| [C00147 Make amplification of social media posts expire (e.g. can't like/ retweet after n days)](../../generated_pages/counters/C00147.md) | D03 |
diff --git a/generated_pages/techniques/T0061.md b/generated_pages/techniques/T0061.md
index d804f01..b859aab 100644
--- a/generated_pages/techniques/T0061.md
+++ b/generated_pages/techniques/T0061.md
@@ -12,20 +12,6 @@
| Counters | Response types |
| -------- | -------------- |
-| [C00012 Platform regulation](../../generated_pages/counters/C00012.md) | D02 |
-| [C00046 Marginalise and discredit extremist groups](../../generated_pages/counters/C00046.md) | D04 |
-| [C00048 Name and Shame Influencers](../../generated_pages/counters/C00048.md) | D07 |
-| [C00058 Report crowdfunder as violator](../../generated_pages/counters/C00058.md) | D02 |
-| [C00067 Denigrate the recipient/ project (of online funding)](../../generated_pages/counters/C00067.md) | D03 |
-| [C00074 Identify and delete or rate limit identical content](../../generated_pages/counters/C00074.md) | D02 |
-| [C00085 Mute content](../../generated_pages/counters/C00085.md) | D03 |
-| [C00107 Content moderation](../../generated_pages/counters/C00107.md) | D02 |
-| [C00122 Content moderation](../../generated_pages/counters/C00122.md) | D02 |
-| [C00128 Create friction by marking content with ridicule or other "decelerants"](../../generated_pages/counters/C00128.md) | D03 |
-| [C00129 Use banking to cut off access ](../../generated_pages/counters/C00129.md) | D02 |
-| [C00153 Take pre-emptive action against actors' infrastructure](../../generated_pages/counters/C00153.md) | D03 |
-| [C00176 Improve Coordination amongst stakeholders: public and private](../../generated_pages/counters/C00176.md) | D07 |
-| [C00216 Use advertiser controls to stem flow of funds to bad actors](../../generated_pages/counters/C00216.md) | D02 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
\ No newline at end of file
diff --git a/generated_pages/techniques/T0068.md b/generated_pages/techniques/T0068.md
index 3afdeaf..235f391 100644
--- a/generated_pages/techniques/T0068.md
+++ b/generated_pages/techniques/T0068.md
@@ -1,6 +1,6 @@
# Technique T0068: Respond to Breaking News Event or Active Crisis
-* **Summary**: Media attention on a story or event is heightened during a breaking news event, where unclear facts and incomplete information increase speculation, rumors, and conspiracy theories, which are all vulnerable to manipulation.
+* **Summary**: Media attention on a story or event is heightened during a breaking news event, where unclear facts and incomplete information increase speculation, rumours, and conspiracy theories, which are all vulnerable to manipulation.
* **Belongs to tactic stage**: TA14
diff --git a/generated_pages/techniques/T0072.001.md b/generated_pages/techniques/T0072.001.md
index 366e630..7f5732f 100644
--- a/generated_pages/techniques/T0072.001.md
+++ b/generated_pages/techniques/T0072.001.md
@@ -1,6 +1,6 @@
# Technique T0072.001: Geographic Segmentation
-* **Summary**: An influence operation may target populations in a specific geographic location, such as a region, state, or city. An influence operation may use geographic segmentation to Create Localized Content (see: Establish Legitimacy).
+* **Summary**: An influence operation may target populations in a specific geographic location, such as a region, state, or city. An influence operation may use geographic segmentation to Create Localised Content (see: Establish Legitimacy).
* **Belongs to tactic stage**: TA13
diff --git a/generated_pages/techniques/T0072.002.md b/generated_pages/techniques/T0072.002.md
index 5cf2870..8163ce7 100644
--- a/generated_pages/techniques/T0072.002.md
+++ b/generated_pages/techniques/T0072.002.md
@@ -1,6 +1,6 @@
# Technique T0072.002: Demographic Segmentation
-* **Summary**: An influence operation may target populations based on demographic segmentation, including age, gender, and income. Demographic segmentation may be useful for influence operations aiming to change state policies that affect a specific population sector. For example, an influence operation attempting to influence Medicare funding in the United States would likely target U.S. voters over 65 years of age.
+* **Summary**: An influence operation may target populations based on demographic segmentation, including age, gender, and income. Demographic segmentation may be useful for influence operations aiming to change state policies that affect a specific population sector. For example, an influence operation attempting to influence Medicare funding in the United States would likely target U.S. voters over 65 years of age.
* **Belongs to tactic stage**: TA13
diff --git a/generated_pages/techniques/T0072.003.md b/generated_pages/techniques/T0072.003.md
index c5e5cdb..ab8aa20 100644
--- a/generated_pages/techniques/T0072.003.md
+++ b/generated_pages/techniques/T0072.003.md
@@ -1,6 +1,6 @@
# Technique T0072.003: Economic Segmentation
-* **Summary**: An influence operation may target populations based on their income bracket, wealth, or other financial or economic division.
+* **Summary**: An influence operation may target populations based on their income bracket, wealth, or other financial or economic division.
* **Belongs to tactic stage**: TA13
diff --git a/generated_pages/techniques/T0072.004.md b/generated_pages/techniques/T0072.004.md
index 9351bc2..8f8c01b 100644
--- a/generated_pages/techniques/T0072.004.md
+++ b/generated_pages/techniques/T0072.004.md
@@ -1,6 +1,6 @@
# Technique T0072.004: Psychographic Segmentation
-* **Summary**: An influence operation may target populations based on psychographic segmentation, which uses audience values and decision-making processes. An operation may individually gather psychographic data with its own surveys or collection tools or externally purchase data from social media companies or online surveys, such as personality quizzes.
+* **Summary**: An influence operation may target populations based on psychographic segmentation, which uses audience values and decision-making processes. An operation may individually gather psychographic data with its own surveys or collection tools or externally purchase data from social media companies or online surveys, such as personality quizzes.
* **Belongs to tactic stage**: TA13
diff --git a/generated_pages/techniques/T0072.005.md b/generated_pages/techniques/T0072.005.md
index 03e6fbe..f5a4fcf 100644
--- a/generated_pages/techniques/T0072.005.md
+++ b/generated_pages/techniques/T0072.005.md
@@ -1,6 +1,6 @@
# Technique T0072.005: Political Segmentation
-* **Summary**: An influence operation may target populations based on their political affiliations, especially when aiming to manipulate voting or change policy.
+* **Summary**: An influence operation may target populations based on their political affiliations, especially when aiming to manipulate voting or change policy.
* **Belongs to tactic stage**: TA13
diff --git a/generated_pages/techniques/T0072.md b/generated_pages/techniques/T0072.md
index 3d8760b..97a8b5b 100644
--- a/generated_pages/techniques/T0072.md
+++ b/generated_pages/techniques/T0072.md
@@ -1,6 +1,6 @@
# Technique T0072: Segment Audiences
-* **Summary**: Create audience segmentations by features of interest to the influence campaign, including political affiliation, geographic location, income, demographics, and psychographics.
+* **Summary**: Create audience segmentations by features of interest to the influence campaign, including political affiliation, geographic location, income, demographics, and psychographics.
* **Belongs to tactic stage**: TA13
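Read together, the T0072 sub-techniques above amount to filtering a population by combinations of features. As a concrete illustration only, the sketch below applies demographic and composite geographic/political predicates to a toy user table; the records, field names, and `segment` helper are all invented for this example and stand in for the far richer data a real campaign would draw on.

```python
# Minimal audience-segmentation sketch over an invented user table.
USERS = [
    {"id": 1, "region": "midwest", "age": 71, "party": "A"},
    {"id": 2, "region": "coast",   "age": 34, "party": "B"},
    {"id": 3, "region": "midwest", "age": 68, "party": "B"},
]

def segment(users, predicate):
    """Return the subset of users matching a segmentation predicate."""
    return [u for u in users if predicate(u)]

# Demographic segment: voters over 65 (cf. the Medicare example in T0072.002).
over_65 = segment(USERS, lambda u: u["age"] > 65)

# Composite segment: geographic plus political criteria (T0072.001 + T0072.005).
midwest_b = segment(USERS, lambda u: u["region"] == "midwest" and u["party"] == "B")

print(len(over_65), len(midwest_b))  # -> 2 1
```

The same predicate pattern extends to the economic and psychographic features named in the other sub-techniques.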
diff --git a/generated_pages/techniques/T0073.md b/generated_pages/techniques/T0073.md
index 19bc2fd..a97d932 100644
--- a/generated_pages/techniques/T0073.md
+++ b/generated_pages/techniques/T0073.md
@@ -1,6 +1,6 @@
# Technique T0073: Determine Target Audiences
-* **Summary**: Determining the target audiences (segments of the population) who will receive campaign narratives and artifacts intended to achieve the strategic ends.
+* **Summary**: Determining the target audiences (segments of the population) who will receive campaign narratives and artefacts intended to achieve the strategic ends.
* **Belongs to tactic stage**: TA01
diff --git a/generated_pages/techniques/T0074.md b/generated_pages/techniques/T0074.md
index 65649f9..6790db4 100644
--- a/generated_pages/techniques/T0074.md
+++ b/generated_pages/techniques/T0074.md
@@ -1,6 +1,6 @@
# Technique T0074: Determine Strategic Ends
-* **Summary**: Determining the campaigns goals or objectives. Examples include achieving achieving geopolitical advantage like undermining trust in an adversary, gaining domestic political advantage, achieving financial gain, or attaining a policy change,
+* **Summary**: Determining the campaign's goals or objectives. Examples include achieving geopolitical advantage, such as undermining trust in an adversary, gaining domestic political advantage, achieving financial gain, or attaining a policy change.
* **Belongs to tactic stage**: TA01
diff --git a/generated_pages/techniques/T0075.md b/generated_pages/techniques/T0075.md
index ff55257..9b4f458 100644
--- a/generated_pages/techniques/T0075.md
+++ b/generated_pages/techniques/T0075.md
@@ -1,6 +1,6 @@
# Technique T0075: Dismiss
-* **Summary**: Push back against criticism by dismissing your critics. This might be arguing that the critics use a different standard for you than with other actors or themselves; or arguing that their criticism is biased.
+* **Summary**: Push back against criticism by dismissing your critics. This might be arguing that the critics use a different standard for you than with other actors or themselves; or arguing that their criticism is biased.
* **Belongs to tactic stage**: TA02
diff --git a/generated_pages/techniques/T0076.md b/generated_pages/techniques/T0076.md
index b391342..38ed0d6 100644
--- a/generated_pages/techniques/T0076.md
+++ b/generated_pages/techniques/T0076.md
@@ -1,6 +1,6 @@
# Technique T0076: Distort
-* **Summary**: Twist the narrative. Take information, or artifacts like images, and change the framing around them.
+* **Summary**: Twist the narrative. Take information, or artefacts like images, and change the framing around them.
* **Belongs to tactic stage**: TA02
diff --git a/generated_pages/techniques/T0080.001.md b/generated_pages/techniques/T0080.001.md
index b43c5a5..b0987ee 100644
--- a/generated_pages/techniques/T0080.001.md
+++ b/generated_pages/techniques/T0080.001.md
@@ -1,6 +1,6 @@
# Technique T0080.001: Monitor Social Media Analytics
-* **Summary**: An influence operation may use social media analytics to determine which factors will increase the operation content’s exposure to its target audience on social media platforms, including views, interactions, and sentiment relating to topics and content types. The social media platform itself or a third-party tool may collect the metrics.
+* **Summary**: An influence operation may use social media analytics to determine which factors will increase the operation content’s exposure to its target audience on social media platforms, including views, interactions, and sentiment relating to topics and content types. The social media platform itself or a third-party tool may collect the metrics.
* **Belongs to tactic stage**: TA13
diff --git a/generated_pages/techniques/T0080.002.md b/generated_pages/techniques/T0080.002.md
index 7120a99..713c2f6 100644
--- a/generated_pages/techniques/T0080.002.md
+++ b/generated_pages/techniques/T0080.002.md
@@ -1,6 +1,6 @@
# Technique T0080.002: Evaluate Media Surveys
-* **Summary**: An influence operation may evaluate its own or third-party media surveys to determine what type of content appeals to its target audience. Media surveys may provide insight into an audience’s political views, social class, general interests, or other indicators used to tailor operation messaging to its target audience.
+* **Summary**: An influence operation may evaluate its own or third-party media surveys to determine what type of content appeals to its target audience. Media surveys may provide insight into an audience’s political views, social class, general interests, or other indicators used to tailor operation messaging to its target audience.
* **Belongs to tactic stage**: TA13
diff --git a/generated_pages/techniques/T0080.003.md b/generated_pages/techniques/T0080.003.md
index 9923af4..6afed42 100644
--- a/generated_pages/techniques/T0080.003.md
+++ b/generated_pages/techniques/T0080.003.md
@@ -1,6 +1,6 @@
# Technique T0080.003: Identify Trending Topics/Hashtags
-* **Summary**: An influence operation may identify trending hashtags on social media platforms for later use in boosting operation content. A hashtag40 refers to a word or phrase preceded by the hash symbol (#) on social media used to identify messages and posts relating to a specific topic. All public posts that use the same hashtag are aggregated onto a centralized page dedicated to the word or phrase and sorted either chronologically or by popularity.
+* **Summary**: An influence operation may identify trending hashtags on social media platforms for later use in boosting operation content. A hashtag refers to a word or phrase preceded by the hash symbol (#) on social media used to identify messages and posts relating to a specific topic. All public posts that use the same hashtag are aggregated onto a centralised page dedicated to the word or phrase and sorted either chronologically or by popularity.
* **Belongs to tactic stage**: TA13
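The aggregation mechanism this summary describes is easy to picture in code. The sketch below is a toy, assuming a list of raw post strings; a real monitoring pipeline would pull posts from a platform API and window the counts by time, so that it captures "trending" rather than merely frequent tags.

```python
import re
from collections import Counter

HASHTAG_RE = re.compile(r"#(\w+)")

def top_hashtags(posts, n=10):
    """Count hashtag occurrences across post texts, most frequent first."""
    counts = Counter(
        tag.lower() for post in posts for tag in HASHTAG_RE.findall(post)
    )
    return counts.most_common(n)

sample = [
    "Big story developing #breaking #election",
    "More on this #breaking #politics",
    "Unrelated post #cats",
]
print(top_hashtags(sample))  # [('breaking', 2), ('election', 1), ...]
```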
diff --git a/generated_pages/techniques/T0080.004.md b/generated_pages/techniques/T0080.004.md
index cd86e6d..b77b6e5 100644
--- a/generated_pages/techniques/T0080.004.md
+++ b/generated_pages/techniques/T0080.004.md
@@ -1,6 +1,6 @@
# Technique T0080.004: Conduct Web Traffic Analysis
-* **Summary**: An influence operation may conduct web traffic analysis to determine which search engines, keywords, websites, and advertisements gain the most traction with its target audience.
+* **Summary**: An influence operation may conduct web traffic analysis to determine which search engines, keywords, websites, and advertisements gain the most traction with its target audience.
* **Belongs to tactic stage**: TA13
diff --git a/generated_pages/techniques/T0080.005.md b/generated_pages/techniques/T0080.005.md
index 1ce7359..637792b 100644
--- a/generated_pages/techniques/T0080.005.md
+++ b/generated_pages/techniques/T0080.005.md
@@ -1,6 +1,6 @@
# Technique T0080.005: Assess Degree/Type of Media Access
-* **Summary**: An influence operation may survey a target audience’s Internet availability and degree of media freedom to determine which target audience members will have access to operation content and on which platforms. An operation may face more difficulty targeting an information environment with heavy restrictions and media control than an environment with independent media, freedom of speech and of the press, and individual liberties.
+* **Summary**: An influence operation may survey a target audience’s Internet availability and degree of media freedom to determine which target audience members will have access to operation content and on which platforms. An operation may face more difficulty targeting an information environment with heavy restrictions and media control than an environment with independent media, freedom of speech and of the press, and individual liberties.
* **Belongs to tactic stage**: TA13
diff --git a/generated_pages/techniques/T0080.md b/generated_pages/techniques/T0080.md
index 6e40cde..b11fe98 100644
--- a/generated_pages/techniques/T0080.md
+++ b/generated_pages/techniques/T0080.md
@@ -1,7 +1,6 @@
# Technique T0080: Map Target Audience Information Environment
-* **Summary**: Mapping the target audience information environment analyzes the information space itself, including social media analytics, web traffic, and media surveys. Mapping the information environment may help the influence operation determine the most realistic and popular information channels to reach its target audience.
-Mapping the target audience information environment aids influence operations in determining the most vulnerable areas of the information space to target with messaging.
+* **Summary**: Mapping the target audience information environment analyses the information space itself, including social media analytics, web traffic, and media surveys. Mapping the information environment may help the influence operation determine the most realistic and popular information channels to reach its target audience. Mapping the target audience information environment aids influence operations in determining the most vulnerable areas of the information space to target with messaging.
* **Belongs to tactic stage**: TA13
diff --git a/generated_pages/techniques/T0081.001.md b/generated_pages/techniques/T0081.001.md
index a9af349..53ccc20 100644
--- a/generated_pages/techniques/T0081.001.md
+++ b/generated_pages/techniques/T0081.001.md
@@ -1,6 +1,6 @@
# Technique T0081.001: Find Echo Chambers
-* **Summary**: Find or plan to create areas (social media groups, search term groups, hashtag groups etc) where individuals only engage with people they agree with.
+* **Summary**: Find or plan to create areas (social media groups, search term groups, hashtag groups, etc.) where individuals only engage with people they agree with.
* **Belongs to tactic stage**: TA13
diff --git a/generated_pages/techniques/T0081.002.md b/generated_pages/techniques/T0081.002.md
index 5247e57..e9d2c8b 100644
--- a/generated_pages/techniques/T0081.002.md
+++ b/generated_pages/techniques/T0081.002.md
@@ -1,7 +1,6 @@
# Technique T0081.002: Identify Data Voids
-* **Summary**: A data void refers to a word or phrase that results in little, manipulative, or low-quality search engine data. Data voids are hard to detect and relatively harmless until exploited by an entity aiming to quickly proliferate false or misleading information during a phenomenon that causes a high number of individuals to query the term or phrase. In the Plan phase, an influence operation may identify data voids for later exploitation in the operation.
-A 2019 report by Michael Golebiewski identifies five types of data voids. (1) “Breaking news” data voids occur when a keyword gains popularity during a short period of time, allowing an influence operation to publish false content before legitimate news outlets have an opportunity to publish relevant information. (2) An influence operation may create a “strategic new terms” data void by creating their own terms and publishing information online before promoting their keyword to the target audience. (3) An influence operation may publish content on “outdated terms” that have decreased in popularity, capitalizing on most search engines’ preferences for recency. (4) “Fragmented concepts” data voids separate connections between similar ideas, isolating segment queries to distinct search engine results. (5) An influence operation may use “problematic queries” that previously resulted in disturbing or inappropriate content to promote messaging until mainstream media recontextualizes the term.
+* **Summary**: A data void refers to a word or phrase that results in little, manipulative, or low-quality search engine data. Data voids are hard to detect and relatively harmless until exploited by an entity aiming to quickly proliferate false or misleading information during a phenomenon that causes a high number of individuals to query the term or phrase. In the Plan phase, an influence operation may identify data voids for later exploitation in the operation. A 2019 report by Michael Golebiewski identifies five types of data voids. (1) “Breaking news” data voids occur when a keyword gains popularity during a short period of time, allowing an influence operation to publish false content before legitimate news outlets have an opportunity to publish relevant information. (2) An influence operation may create a “strategic new terms” data void by creating their own terms and publishing information online before promoting their keyword to the target audience. (3) An influence operation may publish content on “outdated terms” that have decreased in popularity, capitalising on most search engines’ preferences for recency. (4) “Fragmented concepts” data voids separate connections between similar ideas, isolating segment queries to distinct search engine results. (5) An influence operation may use “problematic queries” that previously resulted in disturbing or inappropriate content to promote messaging until mainstream media recontextualises the term.
* **Belongs to tactic stage**: TA13
diff --git a/generated_pages/techniques/T0081.003.md b/generated_pages/techniques/T0081.003.md
index 3a8168d..e81d50d 100644
--- a/generated_pages/techniques/T0081.003.md
+++ b/generated_pages/techniques/T0081.003.md
@@ -1,6 +1,6 @@
# Technique T0081.003: Identify Existing Prejudices
-* **Summary**: An influence operation may exploit existing racial, religious, demographic, or social prejudices to further polarize its target audience from the rest of the public.
+* **Summary**: An influence operation may exploit existing racial, religious, demographic, or social prejudices to further polarise its target audience from the rest of the public.
* **Belongs to tactic stage**: TA13
diff --git a/generated_pages/techniques/T0081.004.md b/generated_pages/techniques/T0081.004.md
index 627da38..a76001d 100644
--- a/generated_pages/techniques/T0081.004.md
+++ b/generated_pages/techniques/T0081.004.md
@@ -1,6 +1,6 @@
# Technique T0081.004: Identify Existing Fissures
-* **Summary**: An influence operation may identify existing fissures to pit target populations against one another or facilitate a “divide-and-conquer" approach to tailor operation narratives along the divides.
+* **Summary**: An influence operation may identify existing fissures to pit target populations against one another or facilitate a “divide-and-conquer” approach to tailor operation narratives along the divides.
* **Belongs to tactic stage**: TA13
diff --git a/generated_pages/techniques/T0081.005.md b/generated_pages/techniques/T0081.005.md
index ca663d8..17a206b 100644
--- a/generated_pages/techniques/T0081.005.md
+++ b/generated_pages/techniques/T0081.005.md
@@ -1,6 +1,6 @@
# Technique T0081.005: Identify Existing Conspiracy Narratives/Suspicions
-* **Summary**: An influence operation may assess preexisting conspiracy theories or suspicions in a population to identify existing narratives that support operational objectives.
+* **Summary**: An influence operation may assess preexisting conspiracy theories or suspicions in a population to identify existing narratives that support operational objectives.
* **Belongs to tactic stage**: TA13
diff --git a/generated_pages/techniques/T0081.006.md b/generated_pages/techniques/T0081.006.md
index a51463c..d6c394a 100644
--- a/generated_pages/techniques/T0081.006.md
+++ b/generated_pages/techniques/T0081.006.md
@@ -1,6 +1,6 @@
# Technique T0081.006: Identify Wedge Issues
-* **Summary**: A wedge issue is a divisive political issue, usually concerning a social phenomenon, that divides individuals along a defined line. An influence operation may exploit wedge issues by intentionally polarizing the public along the wedge issue line and encouraging opposition between factions.
+* **Summary**: A wedge issue is a divisive political issue, usually concerning a social phenomenon, that divides individuals along a defined line. An influence operation may exploit wedge issues by intentionally polarising the public along the wedge issue line and encouraging opposition between factions.
* **Belongs to tactic stage**: TA13
diff --git a/generated_pages/techniques/T0081.007.md b/generated_pages/techniques/T0081.007.md
index 9f6cb96..2f8f484 100644
--- a/generated_pages/techniques/T0081.007.md
+++ b/generated_pages/techniques/T0081.007.md
@@ -1,6 +1,6 @@
# Technique T0081.007: Identify Target Audience Adversaries
-* **Summary**: An influence operation may identify or create a real or imaginary adversary to center operation narratives against. A real adversary may include certain politicians or political parties while imaginary adversaries may include falsified “deep state”62 actors that, according to conspiracies, run the state behind public view.
+* **Summary**: An influence operation may identify or create a real or imaginary adversary to centre operation narratives against. A real adversary may include certain politicians or political parties while imaginary adversaries may include falsified “deep state” actors that, according to conspiracies, run the state behind public view.
* **Belongs to tactic stage**: TA13
diff --git a/generated_pages/techniques/T0081.008.md b/generated_pages/techniques/T0081.008.md
index 03a651c..b641bb8 100644
--- a/generated_pages/techniques/T0081.008.md
+++ b/generated_pages/techniques/T0081.008.md
@@ -1,6 +1,6 @@
# Technique T0081.008: Identify Media System Vulnerabilities
-* **Summary**: An influence operation may exploit existing weaknesses in a target’s media system. These weaknesses may include existing biases among media agencies, vulnerability to false news agencies on social media, or existing distrust of traditional media sources. An existing distrust among the public in the media system’s credibility holds high potential for exploitation by an influence operation when establishing alternative news agencies to spread operation content.
+* **Summary**: An influence operation may exploit existing weaknesses in a target’s media system. These weaknesses may include existing biases among media agencies, vulnerability to false news agencies on social media, or existing distrust of traditional media sources. An existing distrust among the public in the media system’s credibility holds high potential for exploitation by an influence operation when establishing alternative news agencies to spread operation content.
* **Belongs to tactic stage**: TA13
diff --git a/generated_pages/techniques/T0081.md b/generated_pages/techniques/T0081.md
index dc6b997..4a1680c 100644
--- a/generated_pages/techniques/T0081.md
+++ b/generated_pages/techniques/T0081.md
@@ -1,7 +1,6 @@
# Technique T0081: Identify Social and Technical Vulnerabilities
-* **Summary**: Identifying social and technical vulnerabilities determines weaknesses within the target audience information environment for later exploitation. Vulnerabilities include decisive political issues, weak cybersecurity infrastructure, search engine data voids, and other technical and non technical weaknesses in the target information environment.
-Identifying social and technical vulnerabilities facilitates the later exploitation of the identified weaknesses to advance operation objectives.
+* **Summary**: Identifying social and technical vulnerabilities determines weaknesses within the target audience information environment for later exploitation. Vulnerabilities include decisive political issues, weak cybersecurity infrastructure, search engine data voids, and other technical and non-technical weaknesses in the target information environment. Identifying social and technical vulnerabilities facilitates the later exploitation of the identified weaknesses to advance operation objectives.
* **Belongs to tactic stage**: TA13
diff --git a/generated_pages/techniques/T0082.md b/generated_pages/techniques/T0082.md
index 4ff7c6d..6f65637 100644
--- a/generated_pages/techniques/T0082.md
+++ b/generated_pages/techniques/T0082.md
@@ -1,6 +1,6 @@
# Technique T0082: Develop New Narratives
-* **Summary**: Actors may develop new narratives to further strategic or tactical goals, especially when existing narratives adequately align with the campaign goals. New narratives provide more control in terms of crafting the message to achieve specific goals. However, new narratives may require more effort to disseminate than adapting or adopting existing narratives.
+* **Summary**: Actors may develop new narratives to further strategic or tactical goals, especially when existing narratives do not adequately align with the campaign goals. New narratives provide more control in terms of crafting the message to achieve specific goals. However, new narratives may require more effort to disseminate than adapting or adopting existing narratives.
* **Belongs to tactic stage**: TA14
diff --git a/generated_pages/techniques/T0083.md b/generated_pages/techniques/T0083.md
index 0ff7fea..7bd17c8 100644
--- a/generated_pages/techniques/T0083.md
+++ b/generated_pages/techniques/T0083.md
@@ -1,6 +1,6 @@
# Technique T0083: Integrate Target Audience Vulnerabilities into Narrative
-* **Summary**: An influence operation may seek to exploit the preexisting weaknesses, fears, and enemies of the target audience for integration into the operation’s narratives and overall strategy. Integrating existing vulnerabilities into the operational approach conserves resources by exploiting already weak areas of the target information environment instead of forcing the operation to create new vulnerabilities in the environment.
+* **Summary**: An influence operation may seek to exploit the preexisting weaknesses, fears, and enemies of the target audience for integration into the operation’s narratives and overall strategy. Integrating existing vulnerabilities into the operational approach conserves resources by exploiting already weak areas of the target information environment instead of forcing the operation to create new vulnerabilities in the environment.
* **Belongs to tactic stage**: TA14
diff --git a/generated_pages/techniques/T0084.001.md b/generated_pages/techniques/T0084.001.md
index bbd7ce2..cf09645 100644
--- a/generated_pages/techniques/T0084.001.md
+++ b/generated_pages/techniques/T0084.001.md
@@ -1,6 +1,6 @@
# Technique T0084.001: Use Copypasta
-* **Summary**: Copypasta refers to a piece of text that has been copied and pasted multiple times across various online platforms. A copypasta’s final form may differ from its original source text as users add, delete, or otherwise edit the content as they repost the text.
+* **Summary**: Copypasta refers to a piece of text that has been copied and pasted multiple times across various online platforms. A copypasta’s final form may differ from its original source text as users add, delete, or otherwise edit the content as they repost the text.
* **Belongs to tactic stage**: TA06
diff --git a/generated_pages/techniques/T0084.002.md b/generated_pages/techniques/T0084.002.md
index 485b2e0..973b504 100644
--- a/generated_pages/techniques/T0084.002.md
+++ b/generated_pages/techniques/T0084.002.md
@@ -1,6 +1,6 @@
-# Technique T0084.002: Plagiarize Content
+# Technique T0084.002: Plagiarise Content
-* **Summary**: An influence operation may take content from other sources without proper attribution. This content may be either misinformation content shared by others without malicious intent but now leveraged by the campaign as disinformation or disinformation content from other sources.
+* **Summary**: An influence operation may take content from other sources without proper attribution. This content may be either misinformation content shared by others without malicious intent but now leveraged by the campaign as disinformation or disinformation content from other sources.
* **Belongs to tactic stage**: TA06
diff --git a/generated_pages/techniques/T0084.003.md b/generated_pages/techniques/T0084.003.md
index c1dedbe..967e960 100644
--- a/generated_pages/techniques/T0084.003.md
+++ b/generated_pages/techniques/T0084.003.md
@@ -1,6 +1,6 @@
-# Technique T0084.003: Deceptively Labeled or Translated
+# Technique T0084.003: Deceptively Labelled or Translated
-* **Summary**: An influence operation may take authentic content from other sources and add deceptive labels or deceptively translate the content into other langauges.
+* **Summary**: An influence operation may take authentic content from other sources and add deceptive labels or deceptively translate the content into other languages.
* **Belongs to tactic stage**: TA06
diff --git a/generated_pages/techniques/T0084.004.md b/generated_pages/techniques/T0084.004.md
index d0f4be7..d144b9c 100644
--- a/generated_pages/techniques/T0084.004.md
+++ b/generated_pages/techniques/T0084.004.md
@@ -1,6 +1,6 @@
# Technique T0084.004: Appropriate Content
-* **Summary**: An influence operation may take content from other sources with proper attribution. This content may be either misinformation content shared by others without malicious intent but now leveraged by the campaign as disinformation or disinformation content from other sources. Examples include the appropriation of content from one inauthentic news site to another inauthentic news site or network in ways that align with the originators licensing or terms of service.
+* **Summary**: An influence operation may take content from other sources with proper attribution. This content may be either misinformation content shared by others without malicious intent but now leveraged by the campaign as disinformation or disinformation content from other sources. Examples include the appropriation of content from one inauthentic news site to another inauthentic news site or network in ways that align with the originator's licensing or terms of service.
* **Belongs to tactic stage**: TA06
diff --git a/generated_pages/techniques/T0084.md b/generated_pages/techniques/T0084.md
index b3ef09e..71248e5 100644
--- a/generated_pages/techniques/T0084.md
+++ b/generated_pages/techniques/T0084.md
@@ -1,6 +1,6 @@
# Technique T0084: Reuse Existing Content
-* **Summary**: When an operation recycles content from its own previous operations or plagiarizes from external operations. An operation may launder information to conserve resources that would have otherwise been utilized to develop new content.
+* **Summary**: When an operation recycles content from its own previous operations or plagiarises from external operations. An operation may launder information to conserve resources that would have otherwise been utilised to develop new content.
* **Belongs to tactic stage**: TA06
diff --git a/generated_pages/techniques/T0085.001.md b/generated_pages/techniques/T0085.001.md
index ffdd029..8bb0573 100644
--- a/generated_pages/techniques/T0085.001.md
+++ b/generated_pages/techniques/T0085.001.md
@@ -1,6 +1,6 @@
# Technique T0085.001: Develop AI-Generated Text
-* **Summary**: AI-generated texts refers to synthetic text composed by computers using text-generating AI technology. Autonomous generation refers to content created by a bot without human input, also known as bot-created content generation. Autonomous generation represents the next step in automation after language generation and may lead to automated journalism. An influence operation may use read fakes or autonomous generation to quickly develop and distribute content to the target audience.
+* **Summary**: AI-generated text refers to synthetic text composed by computers using text-generating AI technology. Autonomous generation refers to content created by a bot without human input, also known as bot-created content generation. Autonomous generation represents the next step in automation after language generation and may lead to automated journalism. An influence operation may use read fakes or autonomous generation to quickly develop and distribute content to the target audience.
* **Belongs to tactic stage**: TA06
diff --git a/generated_pages/techniques/T0085.003.md b/generated_pages/techniques/T0085.003.md
index 77e5737..bf05712 100644
--- a/generated_pages/techniques/T0085.003.md
+++ b/generated_pages/techniques/T0085.003.md
@@ -1,6 +1,6 @@
# Technique T0085.003: Develop Inauthentic News Articles
-* **Summary**: An influence operation may develop false or misleading news articles aligned to their campaign goals or narratives.
+* **Summary**: An influence operation may develop false or misleading news articles aligned to their campaign goals or narratives.
* **Belongs to tactic stage**: TA06
diff --git a/generated_pages/techniques/T0085.md b/generated_pages/techniques/T0085.md
index e09acc3..422db58 100644
--- a/generated_pages/techniques/T0085.md
+++ b/generated_pages/techniques/T0085.md
@@ -1,6 +1,6 @@
# Technique T0085: Develop Text-Based Content
-* **Summary**: Creating and editing false or misleading text-based artifacts, often aligned with one or more specific narratives, for use in a disinformation campaign.
+* **Summary**: Creating and editing false or misleading text-based artefacts, often aligned with one or more specific narratives, for use in a disinformation campaign.
* **Belongs to tactic stage**: TA06
diff --git a/generated_pages/techniques/T0086.003.md b/generated_pages/techniques/T0086.003.md
index b04ac9d..96b088d 100644
--- a/generated_pages/techniques/T0086.003.md
+++ b/generated_pages/techniques/T0086.003.md
@@ -1,6 +1,6 @@
# Technique T0086.003: Deceptively Edit Images (Cheap Fakes)
-* **Summary**: Cheap fakes utilize less sophisticated measures of altering an image, video, or audio for example, slowing, speeding, or cutting footage to create a false context surrounding an image or event.
+* **Summary**: Cheap fakes utilise less sophisticated measures of altering an image, video, or audio; for example, slowing, speeding, or cutting footage to create a false context surrounding an image or event.
* **Belongs to tactic stage**: TA06
diff --git a/generated_pages/techniques/T0086.md b/generated_pages/techniques/T0086.md
index 0664c9e..df5dd64 100644
--- a/generated_pages/techniques/T0086.md
+++ b/generated_pages/techniques/T0086.md
@@ -1,6 +1,6 @@
# Technique T0086: Develop Image-Based Content
-* **Summary**: Creating and editing false or misleading visual artifacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include photographing staged real-life situations, repurposing existing digital images, or using image creation and editing technologies.
+* **Summary**: Creating and editing false or misleading visual artefacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include photographing staged real-life situations, repurposing existing digital images, or using image creation and editing technologies.
* **Belongs to tactic stage**: TA06
diff --git a/generated_pages/techniques/T0087.002.md b/generated_pages/techniques/T0087.002.md
index ee4fa04..68e47a3 100644
--- a/generated_pages/techniques/T0087.002.md
+++ b/generated_pages/techniques/T0087.002.md
@@ -1,6 +1,6 @@
# Technique T0087.002: Deceptively Edit Video (Cheap Fakes)
-* **Summary**: Cheap fakes utilize less sophisticated measures of altering an image, video, or audio for example, slowing, speeding, or cutting footage to create a false context surrounding an image or event.
+* **Summary**: Cheap fakes utilise less sophisticated measures of altering an image, video, or audio; for example, slowing, speeding, or cutting footage to create a false context surrounding an image or event.
* **Belongs to tactic stage**: TA06
diff --git a/generated_pages/techniques/T0087.md b/generated_pages/techniques/T0087.md
index 341b912..e9cac90 100644
--- a/generated_pages/techniques/T0087.md
+++ b/generated_pages/techniques/T0087.md
@@ -1,6 +1,6 @@
# Technique T0087: Develop Video-Based Content
-* **Summary**: Creating and editing false or misleading video artifacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include staging videos of purportedly real situations, repurposing existing video artifacts, or using AI-generated video creation and editing technologies (including deepfakes).
+* **Summary**: Creating and editing false or misleading video artefacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include staging videos of purportedly real situations, repurposing existing video artefacts, or using AI-generated video creation and editing technologies (including deepfakes).
* **Belongs to tactic stage**: TA06
diff --git a/generated_pages/techniques/T0088.002.md b/generated_pages/techniques/T0088.002.md
index c3f43e8..af88bf0 100644
--- a/generated_pages/techniques/T0088.002.md
+++ b/generated_pages/techniques/T0088.002.md
@@ -1,6 +1,6 @@
# Technique T0088.002: Deceptively Edit Audio (Cheap Fakes)
-* **Summary**: Cheap fakes utilize less sophisticated measures of altering an image, video, or audio for example, slowing, speeding, or cutting footage to create a false context surrounding an image or event.
+* **Summary**: Cheap fakes utilise less sophisticated measures of altering an image, video, or audio; for example, slowing, speeding, or cutting footage to create a false context surrounding an image or event.
* **Belongs to tactic stage**: TA06
diff --git a/generated_pages/techniques/T0088.md b/generated_pages/techniques/T0088.md
index c28ad11..4f6e9c8 100644
--- a/generated_pages/techniques/T0088.md
+++ b/generated_pages/techniques/T0088.md
@@ -1,6 +1,6 @@
# Technique T0088: Develop Audio-Based Content
-* **Summary**: Creating and editing false or misleading audio artifacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include creating completely new audio content, repurposing existing audio artifacts (including cheap fakes), or using AI-generated audio creation and editing technologies (including deepfakes).
+* **Summary**: Creating and editing false or misleading audio artefacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include creating completely new audio content, repurposing existing audio artefacts (including cheap fakes), or using AI-generated audio creation and editing technologies (including deepfakes).
* **Belongs to tactic stage**: TA06
diff --git a/generated_pages/techniques/T0090.001.md b/generated_pages/techniques/T0090.001.md
index bfe8dcf..f51f764 100644
--- a/generated_pages/techniques/T0090.001.md
+++ b/generated_pages/techniques/T0090.001.md
@@ -1,6 +1,6 @@
# Technique T0090.001: Create Anonymous Accounts
-* **Summary**: Anonymous accounts or anonymous users refer to users that access network resources without providing a username or password. An influence operation may use anonymous accounts to spread content without direct attribution to the operation.
+* **Summary**: Anonymous accounts or anonymous users refer to users that access network resources without providing a username or password. An influence operation may use anonymous accounts to spread content without direct attribution to the operation.
* **Belongs to tactic stage**: TA15
diff --git a/generated_pages/techniques/T0090.002.md b/generated_pages/techniques/T0090.002.md
index 52698a5..f0990a1 100644
--- a/generated_pages/techniques/T0090.002.md
+++ b/generated_pages/techniques/T0090.002.md
@@ -1,6 +1,6 @@
# Technique T0090.002: Create Cyborg Accounts
-* **Summary**: Cyborg accounts refer to partly manned, partly automated social media accounts. Cyborg accounts primarily act as bots, but a human operator periodically takes control of the account to engage with real social media users by responding to comments and posting original content. Influence operations may use cyborg accounts to reduce the amount of direct human input required to maintain a regular account but increase the apparent legitimacy of the cyborg account by occasionally breaking its bot-like behavior with human interaction.
+* **Summary**: Cyborg accounts refer to partly manned, partly automated social media accounts. Cyborg accounts primarily act as bots, but a human operator periodically takes control of the account to engage with real social media users by responding to comments and posting original content. Influence operations may use cyborg accounts to reduce the amount of direct human input required to maintain a regular account but increase the apparent legitimacy of the cyborg account by occasionally breaking its bot-like behaviour with human interaction.
* **Belongs to tactic stage**: TA15
diff --git a/generated_pages/techniques/T0090.003.md b/generated_pages/techniques/T0090.003.md
index dacce4e..7581efb 100644
--- a/generated_pages/techniques/T0090.003.md
+++ b/generated_pages/techniques/T0090.003.md
@@ -1,7 +1,6 @@
# Technique T0090.003: Create Bot Accounts
-* **Summary**: Bots refer to autonomous internet users that interact with systems or other users while imitating traditional human behavior. Bots use a variety of tools to stay active without direct human operation, including artificial intelligence and big data analytics. For example, an individual may program a Twitter bot to retweet a tweet every time it contains a certain keyword or hashtag. An influence operation may use bots to increase its exposure and artificially promote its content across the internet without dedicating additional time or human resources.
-Amplifier bots promote operation content through reposts, shares, and likes to increase the content’s online popularity. Hacker bots are traditionally covert bots running on computer scripts that rarely engage with users and work primarily as agents of larger cyberattacks, such as a Distributed Denial of Service attacks. Spammer bots are programmed to post content on social media or in comment sections, usually as a supplementary tool. Impersonator bots102 pose as real people by mimicking human behavior, complicating their detection.
+* **Summary**: Bots refer to autonomous internet users that interact with systems or other users while imitating traditional human behaviour. Bots use a variety of tools to stay active without direct human operation, including artificial intelligence and big data analytics. For example, an individual may program a Twitter bot to retweet a tweet every time it contains a certain keyword or hashtag. An influence operation may use bots to increase its exposure and artificially promote its content across the internet without dedicating additional time or human resources. Amplifier bots promote operation content through reposts, shares, and likes to increase the content’s online popularity. Hacker bots are traditionally covert bots running on computer scripts that rarely engage with users and work primarily as agents of larger cyberattacks, such as Distributed Denial of Service attacks. Spammer bots are programmed to post content on social media or in comment sections, usually as a supplementary tool. Impersonator bots pose as real people by mimicking human behaviour, complicating their detection.
* **Belongs to tactic stage**: TA15
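Because the trigger-and-repost pattern described above is so mechanical, it also leaves a signature defenders can look for. The sketch below is a toy heuristic for flagging likely amplifier bots, not a validated detector; the `AccountStats` fields and every threshold are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class AccountStats:
    """Hypothetical per-account metrics gathered from a platform API."""
    posts_total: int              # everything the account has published
    reposts_total: int            # how many of those are reposts/retweets
    distinct_trigger_terms: int   # keywords its reposts cluster around
    median_seconds_to_repost: float

def looks_like_amplifier_bot(a: AccountStats) -> bool:
    """Flag accounts that mostly repost, around very few trigger terms,
    implausibly fast -- the amplifier-bot signature described above."""
    if a.posts_total == 0:
        return False
    repost_ratio = a.reposts_total / a.posts_total
    return (
        repost_ratio > 0.9
        and a.distinct_trigger_terms <= 3
        and a.median_seconds_to_repost < 5.0
    )

print(looks_like_amplifier_bot(AccountStats(1000, 990, 2, 1.4)))  # True
```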
diff --git a/generated_pages/techniques/T0090.004.md b/generated_pages/techniques/T0090.004.md
index 86ce4f2..3a2b097 100644
--- a/generated_pages/techniques/T0090.004.md
+++ b/generated_pages/techniques/T0090.004.md
@@ -1,6 +1,6 @@
# Technique T0090.004: Create Sockpuppet Accounts
-* **Summary**: Sockpuppet accounts refer to falsified accounts that either promote the influence operation’s own material or attack critics of the material online. Individuals who control sockpuppet accounts also man at least one other user account.67 Sockpuppet accounts help legitimize operation narratives by providing an appearance of external support for the material and discrediting opponents of the operation.
+* **Summary**: Sockpuppet accounts refer to falsified accounts that either promote the influence operation’s own material or attack critics of the material online. Individuals who control sockpuppet accounts also man at least one other user account. Sockpuppet accounts help legitimise operation narratives by providing an appearance of external support for the material and discrediting opponents of the operation.
* **Belongs to tactic stage**: TA15
diff --git a/generated_pages/techniques/T0091.003.md b/generated_pages/techniques/T0091.003.md
index 19fdffd..3933ed8 100644
--- a/generated_pages/techniques/T0091.003.md
+++ b/generated_pages/techniques/T0091.003.md
@@ -1,7 +1,6 @@
# Technique T0091.003: Enlist Troll Accounts
-* **Summary**: An influence operation may hire trolls, or human operators of fake accounts that aim to provoke others by posting and amplifying content about controversial issues. Trolls can serve to discredit an influence operation’s opposition or bring attention to the operation’s cause through debate.
-Classic trolls refer to regular people who troll for personal reasons, such as attention-seeking or boredom. Classic trolls may advance operation narratives by coincidence but are not directly affiliated with any larger operation. Conversely, hybrid trolls act on behalf of another institution, such as a state or financial organization, and post content with a specific ideological goal. Hybrid trolls may be highly advanced and institutionalized or less organized and work for a single individual.
+* **Summary**: An influence operation may hire trolls, or human operators of fake accounts that aim to provoke others by posting and amplifying content about controversial issues. Trolls can serve to discredit an influence operation’s opposition or bring attention to the operation’s cause through debate. Classic trolls refer to regular people who troll for personal reasons, such as attention-seeking or boredom. Classic trolls may advance operation narratives by coincidence but are not directly affiliated with any larger operation. Conversely, hybrid trolls act on behalf of another institution, such as a state or financial organisation, and post content with a specific ideological goal. Hybrid trolls may be highly advanced and institutionalised or less organised and work for a single individual.
* **Belongs to tactic stage**: TA15
diff --git a/generated_pages/techniques/T0092.001.md b/generated_pages/techniques/T0092.001.md
index e1414ce..d96441d 100644
--- a/generated_pages/techniques/T0092.001.md
+++ b/generated_pages/techniques/T0092.001.md
@@ -1,6 +1,6 @@
-# Technique T0092.001: Create Organizations
+# Technique T0092.001: Create Organisations
-* **Summary**: Influence operations may establish organizations with legitimate or falsified hierarchies, staff, and content to structure operation assets, provide a sense of legitimacy to the operation, or provide institutional backing to operation activities.
+* **Summary**: Influence operations may establish organisations with legitimate or falsified hierarchies, staff, and content to structure operation assets, provide a sense of legitimacy to the operation, or provide institutional backing to operation activities.
* **Belongs to tactic stage**: TA15
diff --git a/generated_pages/techniques/T0092.002.md b/generated_pages/techniques/T0092.002.md
index 7c68835..171aad5 100644
--- a/generated_pages/techniques/T0092.002.md
+++ b/generated_pages/techniques/T0092.002.md
@@ -1,6 +1,6 @@
# Technique T0092.002: Use Follow Trains
-* **Summary**: A follow train is a group of people who follow each other on a social media platform, often as a way for an individual or campaign to grow its social media following. Follow trains may be a violation of platform Terms of Service. They are also known as follow-for-follow groups.
+* **Summary**: A follow train is a group of people who follow each other on a social media platform, often as a way for an individual or campaign to grow its social media following. Follow trains may be a violation of platform Terms of Service. They are also known as follow-for-follow groups.
* **Belongs to tactic stage**: TA15
diff --git a/generated_pages/techniques/T0092.003.md b/generated_pages/techniques/T0092.003.md
index 2f1db8c..9ef4f2e 100644
--- a/generated_pages/techniques/T0092.003.md
+++ b/generated_pages/techniques/T0092.003.md
@@ -1,6 +1,6 @@
# Technique T0092.003: Create Community or Sub-Group
-* **Summary**: When there is not an existing community or sub-group that meets a campaign's goals, an influence operation may seek to create a community or sub-group.
+* **Summary**: When there is not an existing community or sub-group that meets a campaign's goals, an influence operation may seek to create a community or sub-group.
* **Belongs to tactic stage**: TA15
diff --git a/generated_pages/techniques/T0092.md b/generated_pages/techniques/T0092.md
index f21ade5..8023991 100644
--- a/generated_pages/techniques/T0092.md
+++ b/generated_pages/techniques/T0092.md
@@ -1,6 +1,6 @@
# Technique T0092: Build Network
-* **Summary**: Operators build their own network, creating links between accounts -- whether authentic or inauthentic -- in order amplify and promote narratives and artifacts, and encourage further growth of ther network, as well as the ongoing sharing and engagement with operational content.
+* **Summary**: Operators build their own network, creating links between accounts -- whether authentic or inauthentic -- in order to amplify and promote narratives and artefacts, and encourage further growth of their network, as well as the ongoing sharing and engagement with operational content.
* **Belongs to tactic stage**: TA15
diff --git a/generated_pages/techniques/T0093.001.md b/generated_pages/techniques/T0093.001.md
index 1888591..6d4edbe 100644
--- a/generated_pages/techniques/T0093.001.md
+++ b/generated_pages/techniques/T0093.001.md
@@ -1,8 +1,6 @@
# Technique T0093.001: Fund Proxies
-* **Summary**: An influence operation may fund proxies, or external entities that work for the operation. An operation may recruit/train users with existing sympathies towards the operation’s narratives and/or goals as proxies. Funding proxies serves various purposes including:
-- Diversifying operation locations to complicate attribution
-- Reducing the workload for direct operation assets
+* **Summary**: An influence operation may fund proxies, or external entities that work for the operation. An operation may recruit/train users with existing sympathies towards the operation’s narratives and/or goals as proxies. Funding proxies serves various purposes, including diversifying operation locations to complicate attribution and reducing the workload for direct operation assets.
* **Belongs to tactic stage**: TA15
diff --git a/generated_pages/techniques/T0093.002.md b/generated_pages/techniques/T0093.002.md
index 21dc025..f8062a7 100644
--- a/generated_pages/techniques/T0093.002.md
+++ b/generated_pages/techniques/T0093.002.md
@@ -1,6 +1,6 @@
# Technique T0093.002: Acquire Botnets
-* **Summary**: A botnet is a group of bots that can function in coordination with each other.
+* **Summary**: A botnet is a group of bots that can function in coordination with each other.
* **Belongs to tactic stage**: TA15
diff --git a/generated_pages/techniques/T0093.md b/generated_pages/techniques/T0093.md
index 44ef131..68cd97a 100644
--- a/generated_pages/techniques/T0093.md
+++ b/generated_pages/techniques/T0093.md
@@ -1,6 +1,6 @@
# Technique T0093: Acquire/Recruit Network
-* **Summary**: Operators acquire an existing network by paying, recruiting, or exerting control over the leaders of the existing network.
+* **Summary**: Operators acquire an existing network by paying, recruiting, or exerting control over the leaders of the existing network.
* **Belongs to tactic stage**: TA15
diff --git a/generated_pages/techniques/T0094.002.md b/generated_pages/techniques/T0094.002.md
index ab4bd45..7890671 100644
--- a/generated_pages/techniques/T0094.002.md
+++ b/generated_pages/techniques/T0094.002.md
@@ -1,6 +1,6 @@
-# Technique T0094.002: Utilize Butterfly Attacks
+# Technique T0094.002: Utilise Butterfly Attacks
-* **Summary**: Butterfly attacks occur when operators pretend to be members of a certain social group, usually a group that struggles for representation. An influence operation may mimic a group to insert controversial statements into the discourse, encourage the spread of operation content, or promote harassment among group members. Unlike astroturfing, butterfly attacks aim to infiltrate and discredit existing grassroots movements, organizations, and media campaigns.
+* **Summary**: Butterfly attacks occur when operators pretend to be members of a certain social group, usually a group that struggles for representation. An influence operation may mimic a group to insert controversial statements into the discourse, encourage the spread of operation content, or promote harassment among group members. Unlike astroturfing, butterfly attacks aim to infiltrate and discredit existing grassroots movements, organisations, and media campaigns.
* **Belongs to tactic stage**: TA15
diff --git a/generated_pages/techniques/T0095.md b/generated_pages/techniques/T0095.md
index e5a3fd1..e49bd28 100644
--- a/generated_pages/techniques/T0095.md
+++ b/generated_pages/techniques/T0095.md
@@ -1,6 +1,6 @@
# Technique T0095: Develop Owned Media Assets
-* **Summary**: An owned media asset refers to an agency or organization through which an influence operation may create, develop, and host content and narratives. Owned media assets include websites, blogs, social media pages, forums, and other platforms that facilitate the creation and organization of content.
+* **Summary**: An owned media asset refers to an agency or organisation through which an influence operation may create, develop, and host content and narratives. Owned media assets include websites, blogs, social media pages, forums, and other platforms that facilitate the creation and organisation of content.
* **Belongs to tactic stage**: TA15
diff --git a/generated_pages/techniques/T0096.001.md b/generated_pages/techniques/T0096.001.md
index 1156cfa..64e75bd 100644
--- a/generated_pages/techniques/T0096.001.md
+++ b/generated_pages/techniques/T0096.001.md
@@ -1,6 +1,6 @@
# Technique T0096.001: Create Content Farms
-* **Summary**: An influence operation may create an organization for creating and amplifying campaign artifacts at scale.
+* **Summary**: An influence operation may create an organisation for creating and amplifying campaign artefacts at scale.
* **Belongs to tactic stage**: TA15
diff --git a/generated_pages/techniques/T0096.002.md b/generated_pages/techniques/T0096.002.md
index c6cafc8..49e2a7d 100644
--- a/generated_pages/techniques/T0096.002.md
+++ b/generated_pages/techniques/T0096.002.md
@@ -1,6 +1,6 @@
-# Technique T0096.002: Outsource Content Creation to External Organizations
+# Technique T0096.002: Outsource Content Creation to External Organisations
-* **Summary**: An influence operation may outsource content creation to external companies to avoid attribution, increase the rate of content creation, or improve content quality, i.e., by employing an organization that can create content in the target audience’s native language. Employed organizations may include marketing companies for tailored advertisements or external content farms for high volumes of targeted media.
+* **Summary**: An influence operation may outsource content creation to external companies to avoid attribution, increase the rate of content creation, or improve content quality, i.e., by employing an organisation that can create content in the target audience’s native language. Employed organisations may include marketing companies for tailored advertisements or external content farms for high volumes of targeted media.
* **Belongs to tactic stage**: TA15
diff --git a/generated_pages/techniques/T0096.md b/generated_pages/techniques/T0096.md
index d644027..1ea5ef5 100644
--- a/generated_pages/techniques/T0096.md
+++ b/generated_pages/techniques/T0096.md
@@ -1,6 +1,6 @@
# Technique T0096: Leverage Content Farms
-* **Summary**: Using the services of large-scale content providers for creating and amplifying campaign artifacts at scale.
+* **Summary**: Using the services of large-scale content providers for creating and amplifying campaign artefacts at scale.
* **Belongs to tactic stage**: TA15
diff --git a/generated_pages/techniques/T0097.md b/generated_pages/techniques/T0097.md
index beb22bd..c13c0fb 100644
--- a/generated_pages/techniques/T0097.md
+++ b/generated_pages/techniques/T0097.md
@@ -1,6 +1,6 @@
# Technique T0097: Create Personas
-* **Summary**: Creating fake people, often with accounts across multiple platforms. These personas can be as simple as a name, can contain slightly more background like location, profile pictures, backstory, or can be effectively backstopped with indicators like fake identity documents.
+* **Summary**: Creating fake people, often with accounts across multiple platforms. These personas can be as simple as a name, can contain slightly more background like location, profile pictures, backstory, or can be effectively backstopped with indicators like fake identity documents.
* **Belongs to tactic stage**: TA16
diff --git a/generated_pages/techniques/T0099.001.md b/generated_pages/techniques/T0099.001.md
index 36ed72b..8d1e688 100644
--- a/generated_pages/techniques/T0099.001.md
+++ b/generated_pages/techniques/T0099.001.md
@@ -1,6 +1,6 @@
# Technique T0099.001: Astroturfing
-* **Summary**: Astroturfing occurs when an influence operation disguises itself as grassroots movement or organization that supports operation narratives. Unlike butterfly attacks, astroturfing aims to increase the appearance of popular support for the operation cause and does not infiltrate existing groups to discredit their objectives.
+* **Summary**: Astroturfing occurs when an influence operation disguises itself as a grassroots movement or organisation that supports operation narratives. Unlike butterfly attacks, astroturfing aims to increase the appearance of popular support for the operation cause and does not infiltrate existing groups to discredit their objectives.
* **Belongs to tactic stage**: TA16
diff --git a/generated_pages/techniques/T0099.002.md b/generated_pages/techniques/T0099.002.md
index a529eaf..2de758c 100644
--- a/generated_pages/techniques/T0099.002.md
+++ b/generated_pages/techniques/T0099.002.md
@@ -1,6 +1,6 @@
# Technique T0099.002: Spoof/Parody Account/Site
-* **Summary**: An influence operation may prepare assets impersonating legitimate entities to further conceal its network identity and add a layer of legitimacy to its operation content. Users will more likely believe and less likely fact-check news from recognizable sources rather than unknown sites. Legitimate entities may include authentic news outlets, public figures, organizations, or state entities.
+* **Summary**: An influence operation may prepare assets impersonating legitimate entities to further conceal its network identity and add a layer of legitimacy to its operation content. Users are more likely to believe, and less likely to fact-check, news from recognisable sources than from unknown sites. Legitimate entities may include authentic news outlets, public figures, organisations, or state entities.
* **Belongs to tactic stage**: TA16
diff --git a/generated_pages/techniques/T0099.md b/generated_pages/techniques/T0099.md
index e57a621..ae8d9e0 100644
--- a/generated_pages/techniques/T0099.md
+++ b/generated_pages/techniques/T0099.md
@@ -1,7 +1,6 @@
# Technique T0099: Prepare Assets Impersonating Legitimate Entities
-* **Summary**: An influence operation may prepare assets impersonating legitimate entities to further conceal its network identity and add a layer of legitimacy to its operation content. Users will more likely believe and less likely fact-check news from recognizable sources rather than unknown sites. Legitimate entities may include authentic news outlets, public figures, organizations, or state entities.
-An influence operation may use a wide variety of cyber techniques to impersonate a legitimate entity’s website or social media account. Typosquatting87 is the international registration of a domain name with purposeful variations of the impersonated domain name through intentional typos, top-level domain (TLD) manipulation, or punycode. Typosquatting facilitates the creation of falsified websites by creating similar domain names in the URL box, leaving it to the user to confirm that the URL is correct.
+* **Summary**: An influence operation may prepare assets impersonating legitimate entities to further conceal its network identity and add a layer of legitimacy to its operation content. Users are more likely to believe, and less likely to fact-check, news from recognisable sources rather than unknown sites. Legitimate entities may include authentic news outlets, public figures, organisations, or state entities. An influence operation may use a wide variety of cyber techniques to impersonate a legitimate entity’s website or social media account. Typosquatting is the intentional registration of a domain name with purposeful variations of the impersonated domain name through typos, top-level domain (TLD) manipulation, or punycode. Typosquatting facilitates the creation of falsified websites by producing domain names that look similar in the URL bar, leaving it to the user to confirm that the URL is correct.
* **Belongs to tactic stage**: TA16
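The typosquatting variations named in the summary above (typos, TLD manipulation, homoglyph/punycode lookalikes) are mechanical enough to enumerate. Below is a minimal Python sketch that generates candidate lookalike domains for a brand so defenders can monitor or pre-register them; the variation rules and homoglyph table are illustrative assumptions, not an exhaustive catalogue.

```python
# Sketch: enumerate simple typosquat candidates for a domain so defenders
# can monitor or pre-register them. Rules cover the variations named in
# the summary: typos, TLD manipulation, and homoglyph substitutions of
# the kind punycode abuse relies on. Illustrative, not exhaustive.

def typosquat_candidates(name: str, tld: str) -> set[str]:
    candidates = set()
    for i in range(len(name)):
        candidates.add(name[:i] + name[i + 1:] + "." + tld)                # omit a char
        candidates.add(name[:i] + name[i] * 2 + name[i + 1:] + "." + tld)  # double a char
    for i in range(len(name) - 1):                                         # swap neighbours
        candidates.add(name[:i] + name[i + 1] + name[i] + name[i + 2:] + "." + tld)
    for alt_tld in ("co", "cm", "net", "org", "info"):                     # TLD manipulation
        if alt_tld != tld:
            candidates.add(name + "." + alt_tld)
    homoglyphs = {"l": "1", "o": "0", "i": "l", "m": "rn"}                 # lookalike chars
    for src, dst in homoglyphs.items():
        if src in name:
            candidates.add(name.replace(src, dst) + "." + tld)
    candidates.discard(name + "." + tld)  # never flag the legitimate domain itself
    return candidates

print(sorted(typosquat_candidates("example", "com"))[:10])
```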
diff --git a/generated_pages/techniques/T0100.md b/generated_pages/techniques/T0100.md
index b88849d..8f29d42 100644
--- a/generated_pages/techniques/T0100.md
+++ b/generated_pages/techniques/T0100.md
@@ -1,9 +1,6 @@
# Technique T0100: Co-Opt Trusted Sources
-* **Summary**: An influence operation may co-opt trusted sources by infiltrating or repurposing a source to reach a target audience through existing, previously reliable networks. Co-opted trusted sources may include:
-- National or local new outlets
-- Research or academic publications
-- Online blogs or websites
+* **Summary**: An influence operation may co-opt trusted sources by infiltrating or repurposing a source to reach a target audience through existing, previously reliable networks. Co-opted trusted sources may include: - National or local news outlets - Research or academic publications - Online blogs or websites
* **Belongs to tactic stage**: TA16
diff --git a/generated_pages/techniques/T0101.md b/generated_pages/techniques/T0101.md
index dd8506d..676d595 100644
--- a/generated_pages/techniques/T0101.md
+++ b/generated_pages/techniques/T0101.md
@@ -1,6 +1,6 @@
-# Technique T0101: Create Localized Content
+# Technique T0101: Create Localised Content
-* **Summary**: Localized content refers to content that appeals to a specific community of individuals, often in defined geographic areas. An operation may create localized content using local language and dialects to resonate with its target audience and blend in with other local news and social media. Localized content may help an operation increase legitimacy, avoid detection, and complicate external attribution.
+* **Summary**: Localised content refers to content that appeals to a specific community of individuals, often in defined geographic areas. An operation may create localised content using local language and dialects to resonate with its target audience and blend in with other local news and social media. Localised content may help an operation increase legitimacy, avoid detection, and complicate external attribution.
* **Belongs to tactic stage**: TA05
diff --git a/generated_pages/techniques/T0102.003.md b/generated_pages/techniques/T0102.003.md
index 0b059f7..7ab30d8 100644
--- a/generated_pages/techniques/T0102.003.md
+++ b/generated_pages/techniques/T0102.003.md
@@ -1,7 +1,6 @@
# Technique T0102.003: Exploit Data Voids
-* **Summary**: A data void refers to a word or phrase that results in little, manipulative, or low-quality search engine data. Data voids are hard to detect and relatively harmless until exploited by an entity aiming to quickly proliferate false or misleading information during a phenomenon that causes a high number of individuals to query the term or phrase. In the Plan phase, an influence operation may identify data voids for later exploitation in the operation.
-A 2019 report by Michael Golebiewski identifies five types of data voids. (1) “Breaking news” data voids occur when a keyword gains popularity during a short period of time, allowing an influence operation to publish false content before legitimate news outlets have an opportunity to publish relevant information. (2) An influence operation may create a “strategic new terms” data void by creating their own terms and publishing information online before promoting their keyword to the target audience. (3) An influence operation may publish content on “outdated terms” that have decreased in popularity, capitalizing on most search engines’ preferences for recency. (4) “Fragmented concepts” data voids separate connections between similar ideas, isolating segment queries to distinct search engine results. (5) An influence operation may use “problematic queries” that previously resulted in disturbing or inappropriate content to promote messaging until mainstream media recontextualizes the term.
+* **Summary**: A data void refers to a word or phrase that results in little, manipulative, or low-quality search engine data. Data voids are hard to detect and relatively harmless until exploited by an entity aiming to quickly proliferate false or misleading information during a phenomenon that causes a high number of individuals to query the term or phrase. In the Plan phase, an influence operation may identify data voids for later exploitation in the operation. A 2019 report by Michael Golebiewski identifies five types of data voids. (1) “Breaking news” data voids occur when a keyword gains popularity during a short period of time, allowing an influence operation to publish false content before legitimate news outlets have an opportunity to publish relevant information. (2) An influence operation may create a “strategic new terms” data void by creating their own terms and publishing information online before promoting their keyword to the target audience. (3) An influence operation may publish content on “outdated terms” that have decreased in popularity, capitalising on most search engines’ preferences for recency. (4) “Fragmented concepts” data voids separate connections between similar ideas, isolating segment queries to distinct search engine results. (5) An influence operation may use “problematic queries” that previously resulted in disturbing or inappropriate content to promote messaging until mainstream media recontextualises the term.
* **Belongs to tactic stage**: TA05
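The “breaking news” void described above turns on one observable signal: query demand spiking faster than the supply of indexed results. A minimal sketch of that check follows; the `search_stats` figures are stand-in data where a real pipeline would pull from trend and search-index APIs, which are not modelled here.

```python
# Sketch: flag candidate "breaking news" data voids by comparing query
# demand against the supply of indexed results. search_stats is stand-in
# data; real figures would come from trend and search-index APIs.

search_stats = {
    # term: (queries in the last hour, indexed results available)
    "city explosion":     (54_000, 120),
    "election results":   (80_000, 2_400_000),
    "obscure old slogan": (300, 45),
}

def is_data_void(queries: int, results: int,
                 min_queries: int = 10_000, max_results: int = 1_000) -> bool:
    # High demand plus thin supply is the window an operation can exploit.
    return queries >= min_queries and results <= max_results

for term, (queries, results) in search_stats.items():
    if is_data_void(queries, results):
        print(f"possible data void: {term!r} ({queries} queries, {results} results)")
```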
diff --git a/generated_pages/techniques/T0102.md b/generated_pages/techniques/T0102.md
index 9d94b89..e4cbf74 100644
--- a/generated_pages/techniques/T0102.md
+++ b/generated_pages/techniques/T0102.md
@@ -1,6 +1,6 @@
# Technique T0102: Leverage Echo Chambers/Filter Bubbles
-* **Summary**: An echo chamber refers to an internet subgroup, often along ideological lines, where individuals only engage with “others with which they are already in agreement.” A filter bubble refers to an algorithm's placement of an individual in content that they agree with or regularly engage with, possibly entrapping the user into a bubble of their own making. An operation may create these isolated areas of the internet by match existing groups, or aggregating individuals into a single target audience based on shared interests, politics, values, demographics, and other characteristics. Echo chambers and filter bubbles help to reinforce similar biases and content to the same target audience members.
+* **Summary**: An echo chamber refers to an internet subgroup, often along ideological lines, where individuals only engage with “others with which they are already in agreement.” A filter bubble refers to an algorithm's placement of an individual in content that they agree with or regularly engage with, possibly entrapping the user into a bubble of their own making. An operation may create these isolated areas of the internet by matching existing groups, or aggregating individuals into a single target audience based on shared interests, politics, values, demographics, and other characteristics. Echo chambers and filter bubbles help to reinforce similar biases and content to the same target audience members.
* **Belongs to tactic stage**: TA05
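Echo chambers of this kind often appear in interaction graphs as densely connected clusters joined by few bridges, which standard community detection can surface. A sketch using networkx on a toy edge list (the graph data is illustrative, standing in for a follow/reply graph):

```python
# Sketch: surface tightly knit subgroups (candidate echo chambers) in an
# interaction graph using modularity-based community detection.

import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

edges = [
    ("a", "b"), ("b", "c"), ("a", "c"),   # dense cluster 1
    ("x", "y"), ("y", "z"), ("x", "z"),   # dense cluster 2
    ("c", "x"),                           # single weak bridge between them
]
graph = nx.Graph(edges)

for i, community in enumerate(greedy_modularity_communities(graph)):
    print(f"community {i}: {sorted(community)}")
```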
diff --git a/generated_pages/techniques/T0104.003.md b/generated_pages/techniques/T0104.003.md
index 2c305b1..b9a68db 100644
--- a/generated_pages/techniques/T0104.003.md
+++ b/generated_pages/techniques/T0104.003.md
@@ -1,6 +1,6 @@
# Technique T0104.003: Private/Closed Social Networks
-* **Summary**: An audio livestream refers to an online audio broadcast capability that allows for real-time communication to closed or open networks. Examples include Twitter Spaces,
+* **Summary**: Private and closed social networks restrict posting and membership to approved participants, allowing operation content to circulate within gated communities that are difficult for outsiders to monitor. Examples include invite-only groups and private messaging channels.
* **Belongs to tactic stage**: TA07
diff --git a/generated_pages/techniques/T0108.md b/generated_pages/techniques/T0108.md
index 6d1fc28..679b708 100644
--- a/generated_pages/techniques/T0108.md
+++ b/generated_pages/techniques/T0108.md
@@ -1,6 +1,6 @@
# Technique T0108: Blogging and Publishing Networks
-* **Summary**: Examples include WordPress, Blogger, Weebly, Tumblr, Medium, etc.
+* **Summary**: Examples include WordPress, Blogger, Weebly, Tumblr, Medium, etc.
* **Belongs to tactic stage**: TA07
diff --git a/generated_pages/techniques/T0109.md b/generated_pages/techniques/T0109.md
index 8984ed7..745e0ac 100644
--- a/generated_pages/techniques/T0109.md
+++ b/generated_pages/techniques/T0109.md
@@ -1,6 +1,6 @@
# Technique T0109: Consumer Review Networks
-* **Summary**: Platforms for finding, reviewing, and sharing information about brands, products, services, restaurants, travel destinations, etc. Examples include Yelp, TripAdvisor, etc.
+* **Summary**: Platforms for finding, reviewing, and sharing information about brands, products, services, restaurants, travel destinations, etc. Examples include Yelp, TripAdvisor, etc.
* **Belongs to tactic stage**: TA07
diff --git a/generated_pages/techniques/T0110.md b/generated_pages/techniques/T0110.md
index c963cdb..5e1030a 100644
--- a/generated_pages/techniques/T0110.md
+++ b/generated_pages/techniques/T0110.md
@@ -1,6 +1,6 @@
# Technique T0110: Formal Diplomatic Channels
-* **Summary**: Leveraging formal, traditional, diplomatic channels to communicate with foreign governments (written documents, meetings, summits, diplomatic visits, etc). This type of diplomacy is conducted by diplomats of one nation with diplomats and other officials of another nation or international organization.
+* **Summary**: Leveraging formal, traditional, diplomatic channels to communicate with foreign governments (written documents, meetings, summits, diplomatic visits, etc). This type of diplomacy is conducted by diplomats of one nation with diplomats and other officials of another nation or international organisation.
* **Belongs to tactic stage**: TA07
diff --git a/generated_pages/techniques/T0113.md b/generated_pages/techniques/T0113.md
index ff0418c..2bc3d64 100644
--- a/generated_pages/techniques/T0113.md
+++ b/generated_pages/techniques/T0113.md
@@ -1,6 +1,6 @@
# Technique T0113: Employ Commercial Analytic Firms
-* **Summary**: Commercial analytic firms collect data on target audience activities and evaluate the data to detect trends, such as content receiving high click-rates. An influence operation may employ commercial analytic firms to facilitate external collection on its target audience, complicating attribution efforts and better tailoring the content to audience preferences.
+* **Summary**: Commercial analytic firms collect data on target audience activities and evaluate the data to detect trends, such as content receiving high click-rates. An influence operation may employ commercial analytic firms to facilitate external collection on its target audience, complicating attribution efforts and better tailoring the content to audience preferences.
* **Belongs to tactic stage**: TA08
diff --git a/generated_pages/techniques/T0115.003.md b/generated_pages/techniques/T0115.003.md
index 4c36c05..ecc9819 100644
--- a/generated_pages/techniques/T0115.003.md
+++ b/generated_pages/techniques/T0115.003.md
@@ -1,6 +1,6 @@
# Technique T0115.003: One-Way Direct Posting
-* **Summary**: Direct posting refers to a method of posting content via a one-way messaging service, where the recipient cannot directly respond to the poster’s messaging. An influence operation may post directly to promote operation narratives to the target audience without allowing opportunities for fact-checking or disagreement, creating a false sense of support for the narrative.
+* **Summary**: Direct posting refers to a method of posting content via a one-way messaging service, where the recipient cannot directly respond to the poster’s messaging. An influence operation may post directly to promote operation narratives to the target audience without allowing opportunities for fact-checking or disagreement, creating a false sense of support for the narrative.
* **Belongs to tactic stage**: TA09
diff --git a/generated_pages/techniques/T0115.md b/generated_pages/techniques/T0115.md
index df570c3..83f772c 100644
--- a/generated_pages/techniques/T0115.md
+++ b/generated_pages/techniques/T0115.md
@@ -1,6 +1,6 @@
# Technique T0115: Post Content
-* **Summary**: Delivering content by posting via owned media (assets that the operator controls).
+* **Summary**: Delivering content by posting via owned media (assets that the operator controls).
* **Belongs to tactic stage**: TA09
diff --git a/generated_pages/techniques/T0116.md b/generated_pages/techniques/T0116.md
index c17af30..a2166fb 100644
--- a/generated_pages/techniques/T0116.md
+++ b/generated_pages/techniques/T0116.md
@@ -1,6 +1,6 @@
# Technique T0116: Comment or Reply on Content
-* **Summary**: Delivering content by replying or commenting via owned media (assets that the operator controls).
+* **Summary**: Delivering content by replying or commenting via owned media (assets that the operator controls).
* **Belongs to tactic stage**: TA09
diff --git a/generated_pages/techniques/T0118.md b/generated_pages/techniques/T0118.md
index f93295a..4a50d29 100644
--- a/generated_pages/techniques/T0118.md
+++ b/generated_pages/techniques/T0118.md
@@ -1,6 +1,6 @@
# Technique T0118: Amplify Existing Narrative
-* **Summary**: An influence operation may amplify existing narratives that align with its narratives to support operation objectives.
+* **Summary**: An influence operation may amplify existing narratives that align with its own narratives to support operation objectives.
* **Belongs to tactic stage**: TA17
diff --git a/generated_pages/techniques/T0119.001.md b/generated_pages/techniques/T0119.001.md
index 0911d8c..5ab34a2 100644
--- a/generated_pages/techniques/T0119.001.md
+++ b/generated_pages/techniques/T0119.001.md
@@ -1,6 +1,6 @@
# Technique T0119.001: Post across Groups
-* **Summary**: An influence operation may post content across groups to spread narratives and content to new communities within the target audiences or to new target audiences.
+* **Summary**: An influence operation may post content across groups to spread narratives and content to new communities within the target audiences or to new target audiences.
* **Belongs to tactic stage**: TA17
diff --git a/generated_pages/techniques/T0119.002.md b/generated_pages/techniques/T0119.002.md
index 5a0129e..d7849e8 100644
--- a/generated_pages/techniques/T0119.002.md
+++ b/generated_pages/techniques/T0119.002.md
@@ -1,6 +1,6 @@
# Technique T0119.002: Post across Platform
-* **Summary**: An influence operation may post content across platforms to spread narratives and content to new communities within the target audiences or to new target audiences. Posting across platforms can also remove opposition and context, helping the narrative spread with less opposition on the cross-posted platform.
+* **Summary**: An influence operation may post content across platforms to spread narratives and content to new communities within the target audiences or to new target audiences. Posting across platforms can also remove opposition and context, helping the narrative spread with less opposition on the cross-posted platform.
* **Belongs to tactic stage**: TA17
diff --git a/generated_pages/techniques/T0119.md b/generated_pages/techniques/T0119.md
index cf5df93..0c3a8ca 100644
--- a/generated_pages/techniques/T0119.md
+++ b/generated_pages/techniques/T0119.md
@@ -1,6 +1,6 @@
# Technique T0119: Cross-Posting
-* **Summary**: Cross-posting refers to posting the same message to multiple internet discussions, social media platforms or accounts, or news groups at one time. An influence operation may post content online in multiple communities and platforms to increase the chances of content exposure to the target audience.
+* **Summary**: Cross-posting refers to posting the same message to multiple internet discussions, social media platforms or accounts, or news groups at one time. An influence operation may post content online in multiple communities and platforms to increase the chances of content exposure to the target audience.
* **Belongs to tactic stage**: TA17
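Because cross-posting reuses the same message verbatim, a simple fingerprint-and-group pass over collected posts can surface it. A sketch, assuming the post data shown is representative sample input:

```python
# Sketch: detect cross-posting by fingerprinting message text and grouping
# identical fingerprints across platforms. The posts are sample data.

import hashlib
from collections import defaultdict

posts = [
    ("forum",     "The incident was staged. Share widely!"),
    ("microblog", "The incident was staged. Share widely!"),
    ("blog",      "A completely unrelated soup recipe."),
]

groups = defaultdict(list)
for platform, text in posts:
    digest = hashlib.sha256(text.strip().lower().encode()).hexdigest()
    groups[digest].append(platform)

for platforms in groups.values():
    if len(platforms) > 1:
        print("same message cross-posted on:", platforms)
```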
diff --git a/generated_pages/techniques/T0120.001.md b/generated_pages/techniques/T0120.001.md
index b78500d..b67424e 100644
--- a/generated_pages/techniques/T0120.001.md
+++ b/generated_pages/techniques/T0120.001.md
@@ -1,6 +1,6 @@
-# Technique T0120.001: Use Affiliate Marketing Programs
+# Technique T0120.001: Use Affiliate Marketing Programmes
-* **Summary**: Use Affiliate Marketing Programs
+* **Summary**: Use Affiliate Marketing Programmes
* **Belongs to tactic stage**: TA17
diff --git a/generated_pages/techniques/T0120.md b/generated_pages/techniques/T0120.md
index 075b3af..1b22404 100644
--- a/generated_pages/techniques/T0120.md
+++ b/generated_pages/techniques/T0120.md
@@ -1,6 +1,6 @@
-# Technique T0120: Incentivize Sharing
+# Technique T0120: Incentivise Sharing
-* **Summary**: Incentivizing content sharing refers to actions that encourage users to share content themselves, reducing the need for the operation itself to post and promote its own content.
+* **Summary**: Incentivising content sharing refers to actions that encourage users to share content themselves, reducing the need for the operation itself to post and promote its own content.
* **Belongs to tactic stage**: TA17
diff --git a/generated_pages/techniques/T0121.001.md b/generated_pages/techniques/T0121.001.md
index eedb1a5..2c47c1a 100644
--- a/generated_pages/techniques/T0121.001.md
+++ b/generated_pages/techniques/T0121.001.md
@@ -1,12 +1,6 @@
# Technique T0121.001: Bypass Content Blocking
-* **Summary**: Bypassing content blocking refers to actions taken to circumvent network security measures that prevent users from accessing certain servers, resources, or other online spheres. An influence operation may bypass content blocking to proliferate its content on restricted areas of the internet. Common strategies for bypassing content blocking include:
-- Altering IP addresses to avoid IP filtering
-- Using a Virtual Private Network (VPN) to avoid IP filtering
-- Using a Content Delivery Network (CDN) to avoid IP filtering
-- Enabling encryption to bypass packet inspection blocking
-- Manipulating text to avoid filtering by keywords
-- Posting content on multiple platforms to avoid platform-specific removals - Using local facilities or modified DNS servers to avoid DNS filtering
+* **Summary**: Bypassing content blocking refers to actions taken to circumvent network security measures that prevent users from accessing certain servers, resources, or other online spheres. An influence operation may bypass content blocking to proliferate its content on restricted areas of the internet. Common strategies for bypassing content blocking include: - Altering IP addresses to avoid IP filtering - Using a Virtual Private Network (VPN) to avoid IP filtering - Using a Content Delivery Network (CDN) to avoid IP filtering - Enabling encryption to bypass packet inspection blocking - Manipulating text to avoid filtering by keywords - Posting content on multiple platforms to avoid platform-specific removals - Using local facilities or modified DNS servers to avoid DNS filtering
* **Belongs to tactic stage**: TA17
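One item in the list above, manipulating text to avoid keyword filtering, has an equally mechanical counter on the defender side: normalise text before matching. A sketch follows; the leetspeak substitution table is an illustrative assumption, not a complete mapping.

```python
# Sketch: defender-side normalisation that undoes common keyword-filter
# evasion ("v@cc1ne", "v a c c i n e") before matching a blocklist.

import re

LEET = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a",
                      "5": "s", "7": "t", "@": "a", "$": "s"})

def normalise(text: str) -> str:
    text = text.lower().translate(LEET)
    return re.sub(r"[\s.\-_*]+", "", text)  # strip inserted separators

def matches_keyword(text: str, keywords: list[str]) -> bool:
    flat = normalise(text)
    return any(normalise(keyword) in flat for keyword in keywords)

print(matches_keyword("buy v@cc1ne truth here", ["vaccine"]))  # True
```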
diff --git a/generated_pages/techniques/T0121.md b/generated_pages/techniques/T0121.md
index a0196db..dad314f 100644
--- a/generated_pages/techniques/T0121.md
+++ b/generated_pages/techniques/T0121.md
@@ -1,6 +1,6 @@
# Technique T0121: Manipulate Platform Algorithm
-* **Summary**: Manipulating a platform algorithm refers to conducting activity on a platform in a way that intentionally targets its underlying algorithm. After analyzing a platform’s algorithm (see: Select Platforms), an influence operation may use a platform in a way that increases its content exposure, avoids content removal, or otherwise benefits the operation’s strategy. For example, an influence operation may use bots to amplify its posts so that the platform’s algorithm recognizes engagement with operation content and further promotes the content on user timelines.
+* **Summary**: Manipulating a platform algorithm refers to conducting activity on a platform in a way that intentionally targets its underlying algorithm. After analysing a platform’s algorithm (see: Select Platforms), an influence operation may use a platform in a way that increases its content exposure, avoids content removal, or otherwise benefits the operation’s strategy. For example, an influence operation may use bots to amplify its posts so that the platform’s algorithm recognises engagement with operation content and further promotes the content on user timelines.
* **Belongs to tactic stage**: TA17
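The bot-amplification example above leaves a detectable trace: many accounts engaging with the same posts within seconds of publication. A sketch that flags account pairs co-amplifying multiple posts inside a short window (the `events` list and `WINDOW` threshold are synthetic assumptions):

```python
# Sketch: flag account pairs that amplify the same posts within seconds,
# a simple signal of coordinated bot amplification.

from collections import defaultdict
from itertools import combinations

# (account, post id, seconds after the post was published)
events = [
    ("bot_a", "p1", 3), ("bot_b", "p1", 5), ("bot_c", "p1", 6),
    ("user_x", "p1", 3600),
    ("bot_a", "p2", 4), ("bot_b", "p2", 7),
]

WINDOW = 30  # faster than humans plausibly react

early = defaultdict(set)
for account, post, seconds in events:
    if seconds <= WINDOW:
        early[post].add(account)

pair_counts = defaultdict(int)
for accounts in early.values():
    for pair in combinations(sorted(accounts), 2):
        pair_counts[pair] += 1

# Pairs co-amplifying several posts inside the window look coordinated.
print({pair: n for pair, n in pair_counts.items() if n >= 2})
```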
diff --git a/generated_pages/techniques/T0122.md b/generated_pages/techniques/T0122.md
index d62f51f..936e8d1 100644
--- a/generated_pages/techniques/T0122.md
+++ b/generated_pages/techniques/T0122.md
@@ -1,6 +1,6 @@
# Technique T0122: Direct Users to Alternative Platforms
-* **Summary**: Direct users to alternative platforms refers to encouraging users to move from the platform on which they initially viewed operation content and engage with content on alternate information channels, including separate social media channels and inauthentic websites. An operation may drive users to alternative platforms to diversify its information channels and ensure the target audience knows where to access operation content if the initial platform suspends, flags, or otherwise removes original operation assets and content.
+* **Summary**: Directing users to alternative platforms refers to encouraging users to move from the platform on which they initially viewed operation content and engage with content on alternate information channels, including separate social media channels and inauthentic websites. An operation may drive users to alternative platforms to diversify its information channels and ensure the target audience knows where to access operation content if the initial platform suspends, flags, or otherwise removes original operation assets and content.
* **Belongs to tactic stage**: TA17
diff --git a/generated_pages/techniques/T0123.001.md b/generated_pages/techniques/T0123.001.md
index bedac35..6910d7f 100644
--- a/generated_pages/techniques/T0123.001.md
+++ b/generated_pages/techniques/T0123.001.md
@@ -1,6 +1,6 @@
# Technique T0123.001: Delete Opposing Content
-* **Summary**: Deleting opposing content refers to the removal of content that conflicts with operational narratives from selected platforms. An influence operation may delete opposing content to censor contradictory information from the target audience, allowing operation narratives to take priority in the information space.
+* **Summary**: Deleting opposing content refers to the removal of content that conflicts with operational narratives from selected platforms. An influence operation may delete opposing content to censor contradictory information from the target audience, allowing operation narratives to take priority in the information space.
* **Belongs to tactic stage**: TA18
diff --git a/generated_pages/techniques/T0123.002.md b/generated_pages/techniques/T0123.002.md
index 2900bbe..604a93d 100644
--- a/generated_pages/techniques/T0123.002.md
+++ b/generated_pages/techniques/T0123.002.md
@@ -1,6 +1,6 @@
# Technique T0123.002: Block Content
-* **Summary**: Content blocking refers to actions taken to restrict internet access or render certain areas of the internet inaccessible. An influence operation may restrict content based on both network and content attributes.
+* **Summary**: Content blocking refers to actions taken to restrict internet access or render certain areas of the internet inaccessible. An influence operation may restrict content based on both network and content attributes.
* **Belongs to tactic stage**: TA18
diff --git a/generated_pages/techniques/T0123.003.md b/generated_pages/techniques/T0123.003.md
index 1431f29..a8b0104 100644
--- a/generated_pages/techniques/T0123.003.md
+++ b/generated_pages/techniques/T0123.003.md
@@ -1,6 +1,6 @@
# Technique T0123.003: Destroy Information Generation Capabilities
-* **Summary**: Destroying information generation capabilities refers to actions taken to limit, degrade, or otherwise incapacitate an actor’s ability to generate conflicting information. An influence operation may destroy an actor’s information generation capabilities by physically dismantling the information infrastructure, disconnecting resources needed for information generation, or redirecting information generation personnel. An operation may destroy an adversary’s information generation capabilities to limit conflicting content exposure to the target audience and crowd the information space with its own narratives.
+* **Summary**: Destroying information generation capabilities refers to actions taken to limit, degrade, or otherwise incapacitate an actor’s ability to generate conflicting information. An influence operation may destroy an actor’s information generation capabilities by physically dismantling the information infrastructure, disconnecting resources needed for information generation, or redirecting information generation personnel. An operation may destroy an adversary’s information generation capabilities to limit conflicting content exposure to the target audience and crowd the information space with its own narratives.
* **Belongs to tactic stage**: TA18
diff --git a/generated_pages/techniques/T0123.004.md b/generated_pages/techniques/T0123.004.md
index e535930..5c6177d 100644
--- a/generated_pages/techniques/T0123.004.md
+++ b/generated_pages/techniques/T0123.004.md
@@ -1,6 +1,6 @@
# Technique T0123.004: Conduct Server Redirect
-* **Summary**: A server redirect, also known as a URL redirect, occurs when a server automatically forwards a user from one URL to another using server-side scripting languages. An influence operation may conduct a server redirect to divert target audience members from one website to another without their knowledge. The redirected website may pose as a legitimate source, host malware, or otherwise aid operation objectives.
+* **Summary**: A server redirect, also known as a URL redirect, occurs when a server automatically forwards a user from one URL to another using server-side scripting languages. An influence operation may conduct a server redirect to divert target audience members from one website to another without their knowledge. The redirected website may pose as a legitimate source, host malware, or otherwise aid operation objectives.
* **Belongs to tactic stage**: TA18
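A redirect chain can be made visible by walking it hop by hop rather than following it silently, letting an analyst see where a link actually leads. A sketch using the requests library; the starting URL is a placeholder and real use needs network access.

```python
# Sketch: walk a redirect chain hop by hop so an analyst can see where a
# link actually leads. Relative Location headers are resolved against
# the current hop.

import requests
from urllib.parse import urljoin

def trace_redirects(url: str, max_hops: int = 10) -> list[str]:
    chain = [url]
    for _ in range(max_hops):
        response = requests.get(url, allow_redirects=False, timeout=10)
        if response.status_code not in (301, 302, 303, 307, 308):
            break
        url = urljoin(url, response.headers["Location"])
        chain.append(url)
    return chain

for hop in trace_redirects("http://example.com/"):
    print(hop)
```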
diff --git a/generated_pages/techniques/T0123.md b/generated_pages/techniques/T0123.md
index 4f9c1bd..9b64334 100644
--- a/generated_pages/techniques/T0123.md
+++ b/generated_pages/techniques/T0123.md
@@ -1,6 +1,6 @@
# Technique T0123: Control Information Environment through Offensive Cyberspace Operations
-* **Summary**: Controlling the information environment through offensive cyberspace operations uses cyber tools and techniques to alter the trajectory of content in the information space to either prioritize operation messaging or block opposition messaging.
+* **Summary**: Controlling the information environment through offensive cyberspace operations uses cyber tools and techniques to alter the trajectory of content in the information space to either prioritise operation messaging or block opposition messaging.
* **Belongs to tactic stage**: TA18
diff --git a/generated_pages/techniques/T0124.001.md b/generated_pages/techniques/T0124.001.md
index 5ad8e72..a50a8ed 100644
--- a/generated_pages/techniques/T0124.001.md
+++ b/generated_pages/techniques/T0124.001.md
@@ -1,6 +1,6 @@
# Technique T0124.001: Report Non-Violative Opposing Content
-* **Summary**: Reporting opposing content refers to notifying and providing an instance of a violation of a platform’s guidelines and policies for conduct on the platform. In addition to simply reporting the content, an operation may leverage copyright regulations to trick social media and web platforms into removing opposing content by manipulating the content to appear in violation of copyright laws. Reporting opposing content facilitates the suppression of contradictory information and allows operation narratives to take priority in the information space.
+* **Summary**: Reporting opposing content refers to notifying a platform of content that violates its guidelines and policies for conduct, providing an instance of the violation. In addition to simply reporting the content, an operation may leverage copyright regulations to trick social media and web platforms into removing opposing content by manipulating the content to appear in violation of copyright laws. Reporting opposing content facilitates the suppression of contradictory information and allows operation narratives to take priority in the information space.
* **Belongs to tactic stage**: TA18
diff --git a/generated_pages/techniques/T0124.002.md b/generated_pages/techniques/T0124.002.md
index f815929..f4119e8 100644
--- a/generated_pages/techniques/T0124.002.md
+++ b/generated_pages/techniques/T0124.002.md
@@ -1,6 +1,6 @@
# Technique T0124.002: Goad People into Harmful Action (Stop Hitting Yourself)
-* **Summary**: Goad people into actions that violate terms of service or will lead to having their content or accounts taken down.
+* **Summary**: Goad people into actions that violate terms of service or will lead to having their content or accounts taken down.
* **Belongs to tactic stage**: TA18
diff --git a/generated_pages/techniques/T0124.md b/generated_pages/techniques/T0124.md
index 559708e..9af1137 100644
--- a/generated_pages/techniques/T0124.md
+++ b/generated_pages/techniques/T0124.md
@@ -1,6 +1,6 @@
# Technique T0124: Suppress Opposition
-* **Summary**: Operators can suppress the opposition by exploiting platform content moderation tools and processes like reporting non-violative content to platforms for takedown and goading opposition actors into taking actions that result in platform action or target audience disapproval.
+* **Summary**: Operators can suppress the opposition by exploiting platform content moderation tools and processes, such as reporting non-violative content to platforms for takedown, or by goading opposition actors into taking actions that result in platform action or target audience disapproval.
* **Belongs to tactic stage**: TA18
diff --git a/generated_pages/techniques/T0127.001.md b/generated_pages/techniques/T0127.001.md
index a4f2177..b8a6edf 100644
--- a/generated_pages/techniques/T0127.001.md
+++ b/generated_pages/techniques/T0127.001.md
@@ -1,6 +1,6 @@
# Technique T0127.001: Conduct Physical Violence
-* **Summary**: An influence operation may directly Conduct Physical Violence to achieve campaign goals.
+* **Summary**: An influence operation may directly Conduct Physical Violence to achieve campaign goals.
* **Belongs to tactic stage**: TA10
diff --git a/generated_pages/techniques/T0127.002.md b/generated_pages/techniques/T0127.002.md
index a143db6..c6dc8fa 100644
--- a/generated_pages/techniques/T0127.002.md
+++ b/generated_pages/techniques/T0127.002.md
@@ -1,6 +1,6 @@
# Technique T0127.002: Encourage Physical Violence
-* **Summary**: An influence operation may Encourage others to engage in Physical Violence to achieve campaign goals.
+* **Summary**: An influence operation may Encourage others to engage in Physical Violence to achieve campaign goals.
* **Belongs to tactic stage**: TA10
diff --git a/generated_pages/techniques/T0127.md b/generated_pages/techniques/T0127.md
index 035d055..79008f2 100644
--- a/generated_pages/techniques/T0127.md
+++ b/generated_pages/techniques/T0127.md
@@ -1,6 +1,6 @@
# Technique T0127: Physical Violence
-* **Summary**: Physical violence refers to the use of force to injure, abuse, damage, or destroy. An influence operation may conduct or encourage physical violence to discourage opponents from promoting conflicting content or draw attention to operation narratives using shock value.
+* **Summary**: Physical violence refers to the use of force to injure, abuse, damage, or destroy. An influence operation may conduct or encourage physical violence to discourage opponents from promoting conflicting content or draw attention to operation narratives using shock value.
* **Belongs to tactic stage**: TA10
diff --git a/generated_pages/techniques/T0128.001.md b/generated_pages/techniques/T0128.001.md
index a728815..0fd2d34 100644
--- a/generated_pages/techniques/T0128.001.md
+++ b/generated_pages/techniques/T0128.001.md
@@ -1,6 +1,6 @@
# Technique T0128.001: Use Pseudonyms
-* **Summary**: An operation may use pseudonyms, or fake names, to mask the identity of operation accounts, publish anonymous content, or otherwise use falsified personas to conceal identity of the operation. An operation may coordinate pseudonyms across multiple platforms, for example, by writing an article under a pseudonym and then posting a link to the article on social media on an account with the same falsified name.
+* **Summary**: An operation may use pseudonyms, or fake names, to mask the identity of operation accounts, publish anonymous content, or otherwise use falsified personas to conceal the identity of the operation. An operation may coordinate pseudonyms across multiple platforms, for example, by writing an article under a pseudonym and then posting a link to the article on social media on an account with the same falsified name.
* **Belongs to tactic stage**: TA11
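Pseudonyms coordinated across platforms, as described above, can sometimes be linked by normalising handles before comparison. A sketch over illustrative account lists (platform names and handles are made up):

```python
# Sketch: link candidate pseudonym reuse across platforms by normalising
# handles before comparison.

import re
from collections import defaultdict

accounts = {
    "microblog": ["Free.Voice_22", "randomuser"],
    "blog":      ["freevoice22", "another_author"],
}

def normalise(handle: str) -> str:
    return re.sub(r"[^a-z0-9]", "", handle.lower())

seen = defaultdict(list)
for platform, handles in accounts.items():
    for handle in handles:
        seen[normalise(handle)].append((platform, handle))

for hits in seen.values():
    if len({platform for platform, _ in hits}) > 1:
        print("possible pseudonym reuse:", hits)
```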
diff --git a/generated_pages/techniques/T0128.002.md b/generated_pages/techniques/T0128.002.md
index 10a456c..11602f0 100644
--- a/generated_pages/techniques/T0128.002.md
+++ b/generated_pages/techniques/T0128.002.md
@@ -1,6 +1,6 @@
# Technique T0128.002: Conceal Network Identity
-* **Summary**: Concealing network identity aims to hide the existence an influence operation’s network completely. Unlike concealing sponsorship, concealing network identity denies the existence of any sort of organization.
+* **Summary**: Concealing network identity aims to hide the existence of an influence operation’s network completely. Unlike concealing sponsorship, concealing network identity denies the existence of any sort of organisation.
* **Belongs to tactic stage**: TA11
diff --git a/generated_pages/techniques/T0128.003.md b/generated_pages/techniques/T0128.003.md
index 5d387c7..a075c64 100644
--- a/generated_pages/techniques/T0128.003.md
+++ b/generated_pages/techniques/T0128.003.md
@@ -1,6 +1,6 @@
# Technique T0128.003: Distance Reputable Individuals from Operation
-* **Summary**: Distancing reputable individuals from the operation occurs when enlisted individuals, such as celebrities or subject matter experts, actively disengage themselves from operation activities and messaging. Individuals may distance themselves from the operation by deleting old posts or statements, unfollowing operation information assets, or otherwise detaching themselves from the operation’s timeline. An influence operation may want reputable individuals to distance themselves from the operation to reduce operation exposure, particularly if the operation aims to remove all evidence.
+* **Summary**: Distancing reputable individuals from the operation occurs when enlisted individuals, such as celebrities or subject matter experts, actively disengage themselves from operation activities and messaging. Individuals may distance themselves from the operation by deleting old posts or statements, unfollowing operation information assets, or otherwise detaching themselves from the operation’s timeline. An influence operation may want reputable individuals to distance themselves from the operation to reduce operation exposure, particularly if the operation aims to remove all evidence.
* **Belongs to tactic stage**: TA11
diff --git a/generated_pages/techniques/T0128.004.md b/generated_pages/techniques/T0128.004.md
index 47e97db..68cac93 100644
--- a/generated_pages/techniques/T0128.004.md
+++ b/generated_pages/techniques/T0128.004.md
@@ -1,6 +1,6 @@
# Technique T0128.004: Launder Accounts
-* **Summary**: Account laundering occurs when an influence operation acquires control of previously legitimate online accounts from third parties through sale or exchange and often in contravention of terms of use. Influence operations use laundered accounts to reach target audience members from an existing information channel and complicate attribution.
+* **Summary**: Account laundering occurs when an influence operation acquires control of previously legitimate online accounts from third parties through sale or exchange and often in contravention of terms of use. Influence operations use laundered accounts to reach target audience members from an existing information channel and complicate attribution.
* **Belongs to tactic stage**: TA11
diff --git a/generated_pages/techniques/T0128.005.md b/generated_pages/techniques/T0128.005.md
index 7e30b10..442b712 100644
--- a/generated_pages/techniques/T0128.005.md
+++ b/generated_pages/techniques/T0128.005.md
@@ -1,6 +1,6 @@
# Technique T0128.005: Change Names of Accounts
-* **Summary**: Changing names of accounts occurs when an operation changes the name of an existing social media account. An operation may change the names of its accounts throughout an operation to avoid detection or alter the names of newly acquired or repurposed accounts to fit operational narratives.
+* **Summary**: Changing names of accounts occurs when an operation changes the name of an existing social media account. An operation may change the names of its accounts throughout an operation to avoid detection or alter the names of newly acquired or repurposed accounts to fit operational narratives.
* **Belongs to tactic stage**: TA11
diff --git a/generated_pages/techniques/T0129.001.md b/generated_pages/techniques/T0129.001.md
index 83c268f..1075049 100644
--- a/generated_pages/techniques/T0129.001.md
+++ b/generated_pages/techniques/T0129.001.md
@@ -1,6 +1,6 @@
# Technique T0129.001: Conceal Network Identity
-* **Summary**: Concealing network identity aims to hide the existence an influence operation’s network completely. Unlike concealing sponsorship, concealing network identity denies the existence of any sort of organization.
+* **Summary**: Concealing network identity aims to hide the existence of an influence operation’s network completely. Unlike concealing sponsorship, concealing network identity denies the existence of any sort of organisation.
* **Belongs to tactic stage**: TA11
diff --git a/generated_pages/techniques/T0129.002.md b/generated_pages/techniques/T0129.002.md
index ad01b55..937b7c0 100644
--- a/generated_pages/techniques/T0129.002.md
+++ b/generated_pages/techniques/T0129.002.md
@@ -1,6 +1,6 @@
# Technique T0129.002: Generate Content Unrelated to Narrative
-* **Summary**: An influence operation may mix its own operation content with legitimate news or external unrelated content to disguise operational objectives, narratives, or existence. For example, an operation may generate "lifestyle" or "cuisine" content alongside regular operation content.
+* **Summary**: An influence operation may mix its own operation content with legitimate news or external unrelated content to disguise operational objectives, narratives, or existence. For example, an operation may generate "lifestyle" or "cuisine" content alongside regular operation content.
* **Belongs to tactic stage**: TA11
diff --git a/generated_pages/techniques/T0129.003.md b/generated_pages/techniques/T0129.003.md
index 285fea5..1eee125 100644
--- a/generated_pages/techniques/T0129.003.md
+++ b/generated_pages/techniques/T0129.003.md
@@ -1,6 +1,6 @@
# Technique T0129.003: Break Association with Content
-* **Summary**: Breaking association with content occurs when an influence operation actively separates itself from its own content. An influence operation may break association with content by unfollowing, unliking, or unsharing its content, removing attribution from its content, or otherwise taking actions that distance the operation from its messaging. An influence operation may break association with its content to complicate attribution or regain credibility for a new operation.
+* **Summary**: Breaking association with content occurs when an influence operation actively separates itself from its own content. An influence operation may break association with content by unfollowing, unliking, or unsharing its content, removing attribution from its content, or otherwise taking actions that distance the operation from its messaging. An influence operation may break association with its content to complicate attribution or regain credibility for a new operation.
* **Belongs to tactic stage**: TA11
diff --git a/generated_pages/techniques/T0129.004.md b/generated_pages/techniques/T0129.004.md
index 674518d..31f49c7 100644
--- a/generated_pages/techniques/T0129.004.md
+++ b/generated_pages/techniques/T0129.004.md
@@ -1,6 +1,6 @@
# Technique T0129.004: Delete URLs
-* **Summary**: URL deletion occurs when an influence operation completely removes its website registration, rendering the URL inaccessible. An influence operation may delete its URLs to complicate attribution or remove online documentation that the operation ever occurred.
+* **Summary**: URL deletion occurs when an influence operation completely removes its website registration, rendering the URL inaccessible. An influence operation may delete its URLs to complicate attribution or remove online documentation that the operation ever occurred.
* **Belongs to tactic stage**: TA11
diff --git a/generated_pages/techniques/T0129.007.md b/generated_pages/techniques/T0129.007.md
index 82e6c6d..03b9708 100644
--- a/generated_pages/techniques/T0129.007.md
+++ b/generated_pages/techniques/T0129.007.md
@@ -1,6 +1,6 @@
# Technique T0129.007: Delete Accounts/Account Activity
-* **Summary**: Deleting accounts and account activity occurs when an influence operation removes its online social media assets, including social media accounts, posts, likes, comments, and other online artifacts. An influence operation may delete its accounts and account activity to complicate attribution or remove online documentation that the operation ever occurred.
+* **Summary**: Deleting accounts and account activity occurs when an influence operation removes its online social media assets, including social media accounts, posts, likes, comments, and other online artefacts. An influence operation may delete its accounts and account activity to complicate attribution or remove online documentation that the operation ever occurred.
* **Belongs to tactic stage**: TA11
diff --git a/generated_pages/techniques/T0129.008.md b/generated_pages/techniques/T0129.008.md
index b76c268..db6f75c 100644
--- a/generated_pages/techniques/T0129.008.md
+++ b/generated_pages/techniques/T0129.008.md
@@ -1,6 +1,6 @@
# Technique T0129.008: Redirect URLs
-* **Summary**: An influence operation may redirect its falsified or typosquatted URLs to legitimate websites to increase the operation's appearance of legitimacy, complicate attribution, and avoid detection.
+* **Summary**: An influence operation may redirect its falsified or typosquatted URLs to legitimate websites to increase the operation's appearance of legitimacy, complicate attribution, and avoid detection.
* **Belongs to tactic stage**: TA11
diff --git a/generated_pages/techniques/T0129.009.md b/generated_pages/techniques/T0129.009.md
index 6e663f2..8ba019f 100644
--- a/generated_pages/techniques/T0129.009.md
+++ b/generated_pages/techniques/T0129.009.md
@@ -1,6 +1,6 @@
# Technique T0129.009: Remove Post Origins
-* **Summary**: Removing post origins refers to the elimination of evidence that indicates the initial source of operation content, often to complicate attribution. An influence operation may remove post origins by deleting watermarks, renaming files, or removing embedded links in its content.
+* **Summary**: Removing post origins refers to the elimination of evidence that indicates the initial source of operation content, often to complicate attribution. An influence operation may remove post origins by deleting watermarks, renaming files, or removing embedded links in its content.
* **Belongs to tactic stage**: TA11
diff --git a/generated_pages/techniques/T0129.010.md b/generated_pages/techniques/T0129.010.md
index b87b46e..0f2c7a5 100644
--- a/generated_pages/techniques/T0129.010.md
+++ b/generated_pages/techniques/T0129.010.md
@@ -1,6 +1,6 @@
# Technique T0129.010: Misattribute Activity
-* **Summary**: Misattributed activity refers to incorrectly attributed operation activity. For example, a state sponsored influence operation may conduct operation activity in a way that mimics another state so that external entities misattribute activity to the incorrect state. An operation may misattribute their activities to complicate attribution, avoid detection, or frame an adversary for negative behavior.
+* **Summary**: Misattributed activity refers to incorrectly attributed operation activity. For example, a state-sponsored influence operation may conduct operation activity in a way that mimics another state so that external entities misattribute activity to the incorrect state. An operation may misattribute its activities to complicate attribution, avoid detection, or frame an adversary for negative behaviour.
* **Belongs to tactic stage**: TA11
diff --git a/generated_pages/techniques/T0130.001.md b/generated_pages/techniques/T0130.001.md
index 1143d01..196b4d9 100644
--- a/generated_pages/techniques/T0130.001.md
+++ b/generated_pages/techniques/T0130.001.md
@@ -1,7 +1,6 @@
# Technique T0130.001: Conceal Sponsorship
-* **Summary**: Concealing sponsorship aims to mislead or obscure the identity of the hidden sponsor behind an operation rather than entity publicly running the operation. Operations that conceal sponsorship may maintain visible falsified groups, news outlets, non-profits, or other organizations, but seek to mislead or obscure the identity sponsoring, funding, or otherwise supporting these entities.
-Influence operations may use a variety of techniques to mask the location of their social media accounts to complicate attribution and conceal evidence of foreign interference. Operation accounts may set their location to a false place, often the location of the operation’s target audience, and post in the region’s language
+* **Summary**: Concealing sponsorship aims to mislead or obscure the identity of the hidden sponsor behind an operation rather than the entity publicly running the operation. Operations that conceal sponsorship may maintain visible falsified groups, news outlets, non-profits, or other organisations, but seek to mislead or obscure the identity of those sponsoring, funding, or otherwise supporting these entities. Influence operations may use a variety of techniques to mask the location of their social media accounts to complicate attribution and conceal evidence of foreign interference. Operation accounts may set their location to a false place, often the location of the operation’s target audience, and post in the region’s language.
* **Belongs to tactic stage**: TA11
diff --git a/generated_pages/techniques/T0130.002.md b/generated_pages/techniques/T0130.002.md
index ce8afba..7604d65 100644
--- a/generated_pages/techniques/T0130.002.md
+++ b/generated_pages/techniques/T0130.002.md
@@ -1,6 +1,6 @@
-# Technique T0130.002: Utilize Bulletproof Hosting
+# Technique T0130.002: Utilise Bulletproof Hosting
-* **Summary**: Hosting refers to services through which storage and computing resources are provided to an individual or organization for the accommodation and maintenance of one or more websites and related services. Services may include web hosting, file sharing, and email distribution. Bulletproof hosting refers to services provided by an entity, such as a domain hosting or web hosting firm, that allows its customer considerable leniency in use of the service. An influence operation may utilize bulletproof hosting to maintain continuity of service for suspicious, illegal, or disruptive operation activities that stricter hosting services would limit, report, or suspend.
+* **Summary**: Hosting refers to services through which storage and computing resources are provided to an individual or organisation for the accommodation and maintenance of one or more websites and related services. Services may include web hosting, file sharing, and email distribution. Bulletproof hosting refers to services provided by an entity, such as a domain hosting or web hosting firm, that allows its customer considerable leniency in use of the service. An influence operation may utilise bulletproof hosting to maintain continuity of service for suspicious, illegal, or disruptive operation activities that stricter hosting services would limit, report, or suspend.
* **Belongs to tactic stage**: TA11
diff --git a/generated_pages/techniques/T0130.003.md b/generated_pages/techniques/T0130.003.md
index b923382..5a73608 100644
--- a/generated_pages/techniques/T0130.003.md
+++ b/generated_pages/techniques/T0130.003.md
@@ -1,6 +1,6 @@
-# Technique T0130.003: Use Shell Organizations
+# Technique T0130.003: Use Shell Organisations
-* **Summary**: Use Shell Organizations to conceal sponsorship.
+* **Summary**: Use Shell Organisations to conceal sponsorship.
* **Belongs to tactic stage**: TA11
diff --git a/generated_pages/techniques/T0130.004.md b/generated_pages/techniques/T0130.004.md
index 560d999..47fb566 100644
--- a/generated_pages/techniques/T0130.004.md
+++ b/generated_pages/techniques/T0130.004.md
@@ -1,6 +1,6 @@
# Technique T0130.004: Use Cryptocurrency
-* **Summary**: Use Cryptocurrency to conceal sponsorship. Examples include Bitcoin, Monero, and Etherium.
+* **Summary**: Use Cryptocurrency to conceal sponsorship. Examples include Bitcoin, Monero, and Ethereum.
* **Belongs to tactic stage**: TA11
diff --git a/generated_pages/techniques/T0133.001.md b/generated_pages/techniques/T0133.001.md
index 72018d2..45b9818 100644
--- a/generated_pages/techniques/T0133.001.md
+++ b/generated_pages/techniques/T0133.001.md
@@ -1,6 +1,6 @@
-# Technique T0133.001: Behavior Changes
+# Technique T0133.001: Behaviour Changes
-* **Summary**: Monitor and evaluate behaviour changes from misinformation incidents.
+* **Summary**: Monitor and evaluate behaviour changes from misinformation incidents.
* **Belongs to tactic stage**: TA12
diff --git a/generated_pages/techniques/T0133.002.md b/generated_pages/techniques/T0133.002.md
index c080cd2..867d9aa 100644
--- a/generated_pages/techniques/T0133.002.md
+++ b/generated_pages/techniques/T0133.002.md
@@ -1,6 +1,6 @@
# Technique T0133.002: Content
-* **Summary**: Measure current system state with respect to the effectiveness of campaign content.
+* **Summary**: Measure current system state with respect to the effectiveness of campaign content.
* **Belongs to tactic stage**: TA12
diff --git a/generated_pages/techniques/T0133.003.md b/generated_pages/techniques/T0133.003.md
index 0276888..2bc01f3 100644
--- a/generated_pages/techniques/T0133.003.md
+++ b/generated_pages/techniques/T0133.003.md
@@ -1,6 +1,6 @@
# Technique T0133.003: Awareness
-* **Summary**: Measure current system state with respect to the effectiveness of influencing awareness.
+* **Summary**: Measure current system state with respect to the effectiveness of influencing awareness.
* **Belongs to tactic stage**: TA12
diff --git a/generated_pages/techniques/T0133.004.md b/generated_pages/techniques/T0133.004.md
index a01e431..e045986 100644
--- a/generated_pages/techniques/T0133.004.md
+++ b/generated_pages/techniques/T0133.004.md
@@ -1,6 +1,6 @@
# Technique T0133.004: Knowledge
-* **Summary**: Measure current system state with respect to the effectiveness of influencing knowledge.
+* **Summary**: Measure current system state with respect to the effectiveness of influencing knowledge.
* **Belongs to tactic stage**: TA12
diff --git a/generated_pages/techniques/T0133.005.md b/generated_pages/techniques/T0133.005.md
index 12ed204..87bd8c6 100644
--- a/generated_pages/techniques/T0133.005.md
+++ b/generated_pages/techniques/T0133.005.md
@@ -1,6 +1,6 @@
# Technique T0133.005: Action/Attitude
-* **Summary**: Measure current system state with respect to the effectiveness of influencing action/attitude.
+* **Summary**: Measure current system state with respect to the effectiveness of influencing action/attitude.
* **Belongs to tactic stage**: TA12
diff --git a/generated_pages/techniques/T0134.001.md b/generated_pages/techniques/T0134.001.md
index 6d116af..189ced3 100644
--- a/generated_pages/techniques/T0134.001.md
+++ b/generated_pages/techniques/T0134.001.md
@@ -1,6 +1,6 @@
# Technique T0134.001: Message Reach
-* **Summary**: Monitor and evaluate message reach in misinformation incidents.
+* **Summary**: Monitor and evaluate message reach in misinformation incidents.
* **Belongs to tactic stage**: TA12
diff --git a/generated_pages/techniques_index.md b/generated_pages/techniques_index.md
index b06a859..331bf13 100644
--- a/generated_pages/techniques_index.md
+++ b/generated_pages/techniques_index.md
@@ -10,19 +10,19 @@
Use or adapt existing narrative themes, where narratives are the baseline stories of a target audience. Narratives form the bedrock of our worldviews. New information is understood through a process firmly grounded in this bedrock. If new information is not consitent with the prevailing narratives of an audience, it will be ignored. Effective campaigns will frame their misinformation in the context of these narratives. Highly effective campaigns will make extensive use of audience-appropriate archetypes and meta-narratives throughout their content creation and amplifiction practices.
+
Use or adapt existing narrative themes, where narratives are the baseline stories of a target audience. Narratives form the bedrock of our worldviews. New information is understood through a process firmly grounded in this bedrock. If new information is not consistent with the prevailing narratives of an audience, it will be ignored. Effective campaigns will frame their misinformation in the context of these narratives. Highly effective campaigns will make extensive use of audience-appropriate archetypes and meta-narratives throughout their content creation and amplification practices.
Advance competing narratives connected to same issue ie: on one hand deny incident while at same time expresses dismiss. Suppressing or discouraging narratives already spreading requires an alternative. The most simple set of narrative techniques in response would be the construction and promotion of contradictory alternatives centered on denial, deflection, dismissal, counter-charges, excessive standards of proof, bias in prohibition or enforcement, and so on. These competing narratives allow loyalists cover, but are less compelling to opponents and fence-sitters than campaigns built around existing narratives or highly explanatory master narratives. Competing narratives, as such, are especially useful in the "firehose of misinformation" approach.
+
Advance competing narratives connected to the same issue, i.e. on the one hand deny the incident while at the same time expressing dismissal. Suppressing or discouraging narratives already spreading requires an alternative. The simplest set of narrative techniques in response would be the construction and promotion of contradictory alternatives centred on denial, deflection, dismissal, counter-charges, excessive standards of proof, bias in prohibition or enforcement, and so on. These competing narratives allow loyalists cover, but are less compelling to opponents and fence-sitters than campaigns built around existing narratives or highly explanatory master narratives. Competing narratives, as such, are especially useful in the "firehose of misinformation" approach.
Stories planted or promoted in computational propaganda operations often make use of experts fabricated from whole cloth, sometimes specifically for the story itself.
+
Stories planted or promoted in computational propaganda operations often make use of experts fabricated from whole cloth, sometimes specifically for the story itself.
Create media assets to support inauthentic organizations (e.g. think tank), people (e.g. experts) and/or serve as sites to distribute malware/launch phishing operations.
+
Create media assets to support inauthentic organisations (e.g. think tank), people (e.g. experts) and/or serve as sites to distribute malware/launch phishing operations.
Fundraising campaigns refer to an influence operation’s systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while turning a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns (see: Develop Information Pathways) to promote operation messaging while raising money to support its activities.
+
Fundraising campaigns refer to an influence operation’s systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while turning a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns (see: Develop Information Pathways) to promote operation messaging while raising money to support its activities.
Create one or more hashtags and/or hashtag groups. Many incident-based campaigns will create hashtags to promote their fabricated event. Creating a hashtag for an incident can have two important effects: 1. Create a perception of reality around an event (certainly only "real" events would be discussed in a hashtag; after all, the event has a name!), and 2. Publicize the story more widely through trending lists and search behavior. An asset is needed to direct/control/manage the "conversation" connected to launching a new incident/campaign with a new hashtag for applicable social media sites.
+
Create Hashtags and Search Artefacts
+
Create one or more hashtags and/or hashtag groups. Many incident-based campaigns will create hashtags to promote their fabricated event. Creating a hashtag for an incident can have two important effects: 1. Create a perception of reality around an event (certainly only "real" events would be discussed in a hashtag; after all, the event has a name!), and 2. Publicise the story more widely through trending lists and search behaviour. An asset is needed to direct/control/manage the "conversation" connected to launching a new incident/campaign with a new hashtag for applicable social media sites.
Fundraising campaigns refer to an influence operation’s systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while turning a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns to promote operation messaging while raising money to support its activities.
+
Fundraising campaigns refer to an influence operation’s systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while turning a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns to promote operation messaging while raising money to support its activities.
"Conspiracy narratives" appeal to the human desire for explanatory order, by invoking the participation of poweful (often sinister) actors in pursuit of their own political goals. These narratives are especially appealing when an audience is low-information, marginalized or otherwise inclined to reject the prevailing explanation. Conspiracy narratives are an important component of the "firehose of falsehoods" model.
+
"Conspiracy narratives" appeal to the human desire for explanatory order, by invoking the participation of poweful (often sinister) actors in pursuit of their own political goals. These narratives are especially appealing when an audience is low-information, marginalised or otherwise inclined to reject the prevailing explanation. Conspiracy narratives are an important component of the "firehose of falsehoods" model.
An influence operation may amplify an existing conspiracy theory narrative that aligns with its incident or campaign goals. By amplifying existing conspiracy theory narratives, operators can leverage the power of the existing communities that support and propagate those theories without needing to expend resources creating new narratives or building momentum and buy in around new narratives.
+
An influence operation may amplify an existing conspiracy theory narrative that aligns with its incident or campaign goals. By amplifying existing conspiracy theory narratives, operators can leverage the power of the existing communities that support and propagate those theories without needing to expend resources creating new narratives or building momentum and buy in around new narratives.
While this requires more resources than amplifying existing conspiracy theory narratives, an influence operation may develop original conspiracy theory narratives in order to achieve greater control and alignment over the narrative and their campaign goals. Prominent examples include the USSR's Operation INFEKTION disinformation campaign run by the KGB in the 1980s to plant the idea that the United States had invented HIV/AIDS as part of a biological weapons research project at Fort Detrick, Maryland. More recently, Fort Detrick featured prominently in new conspiracy theory narratives around the origins of the COVID-19 outbreak and pandemic.
+
While this requires more resources than amplifying existing conspiracy theory narratives, an influence operation may develop original conspiracy theory narratives in order to achieve greater control and alignment over the narrative and their campaign goals. Prominent examples include the USSR's Operation INFEKTION disinformation campaign run by the KGB in the 1980s to plant the idea that the United States had invented HIV/AIDS as part of a biological weapons research project at Fort Detrick, Maryland. More recently, Fort Detrick featured prominently in new conspiracy theory narratives around the origins of the COVID-19 outbreak and pandemic.
Reframing context refers to removing an event from its surrounding context to distort its intended meaning. Rather than deny that an event occurred, reframing context frames an event in a manner that may lead the target audience to draw a different conclusion about its intentions.
+
Reframing context refers to removing an event from its surrounding context to distort its intended meaning. Rather than deny that an event occurred, reframing context frames an event in a manner that may lead the target audience to draw a different conclusion about its intentions.
An influence operation may edit open-source content, such as collaborative blogs or encyclopedias, to promote its narratives on outlets with existing credibility and audiences. Editing open-source content may allow an operation to post content on platforms without dedicating resources to the creation and maintenance of its own assets.
+
An influence operation may edit open-source content, such as collaborative blogs or encyclopaedias, to promote its narratives on outlets with existing credibility and audiences. Editing open-source content may allow an operation to post content on platforms without dedicating resources to the creation and maintenance of its own assets.
Credibility in a social media environment is often a function of the size of a user's network. "Influencers" are so-called because of their reach, typically understood as: 1) the size of their network (i.e. the number of followers, perhaps weighted by their own influence); and 2) the rate at which their comments are re-circulated (these two metrics are related). Add traditional media players at all levels of credibility and professionalism to this, and the number of potential influential carriers available for unwitting amplification becomes substantial. By targeting high-influence people and organizations in all types of media with narratives and content engineered to appeal to their emotional or ideological drivers, influence campaigns are able to add perceived credibility to their messaging via saturation and adoption by trusted agents such as celebrities, journalists and local leaders.
+
Credibility in a social media environment is often a function of the size of a user's network. "Influencers" are so-called because of their reach, typically understood as: 1) the size of their network (i.e. the number of followers, perhaps weighted by their own influence); and 2) the rate at which their comments are re-circulated (these two metrics are related). Add traditional media players at all levels of credibility and professionalism to this, and the number of potential influential carriers available for unwitting amplification becomes substantial. By targeting high-influence people and organisations in all types of media with narratives and content engineered to appeal to their emotional or ideological drivers, influence campaigns are able to add perceived credibility to their messaging via saturation and adoption by trusted agents such as celebrities, journalists and local leaders.
Threatening or harassing believers of opposing narratives refers to the use of intimidation techniques, including cyberbullying and doxing, to discourage opponents from voicing their dissent. An influence operation may threaten or harass believers of the opposing narratives to deter individuals from posting or proliferating conflicting content.
+
Threatening or harassing believers of opposing narratives refers to the use of intimidation techniques, including cyberbullying and doxing, to discourage opponents from voicing their dissent. An influence operation may threaten or harass believers of the opposing narratives to deter individuals from posting or proliferating conflicting content.
Cancel culture refers to the phenomenon in which individuals collectively refrain from supporting an individual, organization, business, or other entity, usually following a real or falsified controversy. An influence operation may exploit cancel culture by emphasizing an adversary’s problematic or disputed behavior and presenting its own content as an alternative.
+
Cancel culture refers to the phenomenon in which individuals collectively refrain from supporting an individual, organisation, business, or other entity, usually following a real or falsified controversy. An influence operation may exploit cancel culture by emphasising an adversary’s problematic or disputed behaviour and presenting its own content as an alternative.
Doxing refers to online harassment in which individuals publicly release private information about another individual, including names, addresses, employment information, pictures, family members, and other sensitive information. An influence operation may dox its opposition to encourage individuals aligned with operation narratives to harass the doxed individuals themselves or otherwise discourage the doxed individuals from posting or proliferating conflicting content.
+
Doxing refers to online harassment in which individuals publicly release private information about another individual, including names, addresses, employment information, pictures, family members, and other sensitive information. An influence operation may dox its opposition to encourage individuals aligned with operation narratives to harass the doxed individuals themselves or otherwise discourage the doxed individuals from posting or proliferating conflicting content.
Doxing refers to online harassment in which individuals publicly release private information about another individual, including names, addresses, employment information, pictures, family members, and other sensitive information. An influence operation may dox its opposition to encourage individuals aligned with operation narratives to harass the doxed individuals themselves or otherwise discourage the doxed individuals from posting or proliferating conflicting content.
+
Doxing refers to online harassment in which individuals publicly release private information about another individual, including names, addresses, employment information, pictures, family members, and other sensitive information. An influence operation may dox its opposition to encourage individuals aligned with operation narratives to harass the doxed individuals themselves or otherwise discourage the doxed individuals from posting or proliferating conflicting content.
Bots Amplify via Automated Forwarding and Reposting
-
Automated forwarding and reposting refer to the proliferation of operation content using automated means, such as artificial intelligence or social media bots. An influence operation may use automated activity to increase content exposure without dedicating the resources, including personnel and time, traditionally required to forward and repost content.
-Use bots to amplify narratives above algorithm thresholds. Bots are automated/programmed profiles designed to amplify content (ie: automatically retweet or like) and give appearance it's more "popular" than it is. They can operate as a network, to function in a coordinated/orchestrated manner. In some cases (more so now) they are an inexpensive/disposable assets used for minimal deployment as bot detection tools improve and platforms are more responsive.
+
Automated forwarding and reposting refer to the proliferation of operation content using automated means, such as artificial intelligence or social media bots. An influence operation may use automated activity to increase content exposure without dedicating the resources, including personnel and time, traditionally required to forward and repost content. Use bots to amplify narratives above algorithm thresholds. Bots are automated/programmed profiles designed to amplify content (i.e. automatically retweet or like) and give the appearance that it's more "popular" than it is. They can operate as a network, to function in a coordinated/orchestrated manner. In some cases (more so now) they are inexpensive/disposable assets used for minimal deployment as bot detection tools improve and platforms are more responsive.
Spamoflauge refers to the practice of disguising spam messages as legitimate. Spam refers to the use of electronic messaging systems to send out unrequested or unwanted messages in bulk. Simple methods of spamoflauge include replacing letters with numbers to fool keyword-based email spam filters, for example, "you've w0n our jackp0t!". Spamoflauge may extend to more complex techniques such as modifying the grammar or word choice of the language, casting messages as images which spam detectors cannot automatically read, or encapsulating messages in password protected attachments, such as .pdf or .zip files. Influence operations may use spamoflauge to avoid spam filtering systems and increase the likelihood of the target audience receiving operation messaging.
+
Utilise Spamoflauge
+
Spamoflauge refers to the practice of disguising spam messages as legitimate. Spam refers to the use of electronic messaging systems to send out unrequested or unwanted messages in bulk. Simple methods of spamoflauge include replacing letters with numbers to fool keyword-based email spam filters, for example, "you've w0n our jackp0t!". Spamoflauge may extend to more complex techniques such as modifying the grammar or word choice of the language, casting messages as images which spam detectors cannot automatically read, or encapsulating messages in password protected attachments, such as .pdf or .zip files. Influence operations may use spamoflauge to avoid spam filtering systems and increase the likelihood of the target audience receiving operation messaging.
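To make the evasion concrete, here is a minimal Python sketch under stated assumptions: a naive keyword filter misses the document's own "jackp0t" example, while normalising a small, illustrative table of digit-for-letter substitutions before matching recovers it. Real spam filters use far richer normalisation and statistical models.

```python
# Illustrative sketch of why naive keyword filters miss "spamoflauge":
# a filter matching "jackpot" fails on "jackp0t", but normalising common
# digit-for-letter substitutions before matching recovers the keyword.
# The substitution table below is an assumption, not an exhaustive mapping.

LEET_MAP = str.maketrans({"0": "o", "1": "l", "3": "e", "4": "a", "5": "s", "7": "t"})

def normalise(text: str) -> str:
    """Lowercase the text and undo common leetspeak substitutions."""
    return text.lower().translate(LEET_MAP)

def flag_spam(message: str, keywords: set[str]) -> bool:
    """Return True if any keyword appears in the normalised message."""
    norm = normalise(message)
    return any(kw in norm for kw in keywords)

msg = "Congratulations, you've w0n our jackp0t!"
print(flag_spam(msg, {"jackpot"}))  # True: caught after normalisation
print("jackpot" in msg.lower())     # False: the naive filter misses it
```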
Swarming refers to the coordinated use of accounts to overwhelm the information space with operation content. Unlike information flooding, swarming centers exclusively around a specific event or actor rather than a general narrative. Swarming relies on “horizontal communication” between information assets rather than a top-down, vertical command-and-control approach.
+
Swarming refers to the coordinated use of accounts to overwhelm the information space with operation content. Unlike information flooding, swarming centres exclusively around a specific event or actor rather than a general narrative. Swarming relies on “horizontal communication” between information assets rather than a top-down, vertical command-and-control approach.
Keyword squatting refers to the creation of online content, such as websites, articles, or social media accounts, around a specific search engine-optimized term to overwhelm the search results of that term. An influence operation may keyword squat to increase content exposure to target audience members who query the exploited term in a search engine and manipulate the narrative around the term.
+
Keyword squatting refers to the creation of online content, such as websites, articles, or social media accounts, around a specific search engine-optimised term to overwhelm the search results of that term. An influence operation may keyword squat to increase content exposure to target audience members who query the exploited term in a search engine and manipulate the narrative around the term.
TA17
@@ -316,20 +315,20 @@ Use bots to amplify narratives above algorithm thresholds. Bots are automated/pr
Paying for physical action occurs when an influence operation pays individuals to act in the physical realm. An influence operation may pay for physical action to create specific situations and frame them in a way that supports operation narratives, for example, paying a group of people to burn a car to later post an image of the burning car and frame it as an act of protest.
+
Paying for physical action occurs when an influence operation pays individuals to act in the physical realm. An influence operation may pay for physical action to create specific situations and frame them in a way that supports operation narratives, for example, paying a group of people to burn a car to later post an image of the burning car and frame it as an act of protest.
Symbolic action refers to activities specifically intended to advance an operation’s narrative by signaling something to the audience, for example, a military parade supporting a state’s narrative of military superiority. An influence operation may use symbolic action to create falsified evidence supporting operation narratives in the physical information space.
+
Symbolic action refers to activities specifically intended to advance an operation’s narrative by signalling something to the audience, for example, a military parade supporting a state’s narrative of military superiority. An influence operation may use symbolic action to create falsified evidence supporting operation narratives in the physical information space.
TA10
@@ -365,61 +364,61 @@ Use bots to amplify narratives above algorithm thresholds. Bots are automated/pr
Media attention on a story or event is heightened during a breaking news event, where unclear facts and incomplete information increase speculation, rumors, and conspiracy theories, which are all vulnerable to manipulation.
+
Media attention on a story or event is heightened during a breaking news event, where unclear facts and incomplete information increase speculation, rumours, and conspiracy theories, which are all vulnerable to manipulation.
Create audience segmentations by features of interest to the influence campaign, including political affiliation, geographic location, income, demographics, and psychographics.
+
Create audience segmentations by features of interest to the influence campaign, including political affiliation, geographic location, income, demographics, and psychographics.
An influence operation may target populations in a specific geographic location, such as a region, state, or city. An influence operation may use geographic segmentation to Create Localized Content (see: Establish Legitimacy).
+
An influence operation may target populations in a specific geographic location, such as a region, state, or city. An influence operation may use geographic segmentation to Create Localised Content (see: Establish Legitimacy).
An influence operation may target populations based on demographic segmentation, including age, gender, and income. Demographic segmentation may be useful for influence operations aiming to change state policies that affect a specific population sector. For example, an influence operation attempting to influence Medicare funding in the United States would likely target U.S. voters over 65 years of age.
+
An influence operation may target populations based on demographic segmentation, including age, gender, and income. Demographic segmentation may be useful for influence operations aiming to change state policies that affect a specific population sector. For example, an influence operation attempting to influence Medicare funding in the United States would likely target U.S. voters over 65 years of age.
An influence operation may target populations based on psychographic segmentation, which uses audience values and decision-making processes. An operation may individually gather psychographic data with its own surveys or collection tools or externally purchase data from social media companies or online surveys, such as personality quizzes.
+
An influence operation may target populations based on psychographic segmentation, which uses audience values and decision-making processes. An operation may individually gather psychographic data with its own surveys or collection tools or externally purchase data from social media companies or online surveys, such as personality quizzes.
Determining the target audiences (segments of the population) who will receive campaign narratives and artifacts intended to achieve the strategic ends.
+
Determining the target audiences (segments of the population) who will receive campaign narratives and artefacts intended to achieve the strategic ends.
Determining the campaign's goals or objectives. Examples include achieving geopolitical advantage like undermining trust in an adversary, gaining domestic political advantage, achieving financial gain, or attaining a policy change.
+
Determining the campaign's goals or objectives. Examples include achieving geopolitical advantage like undermining trust in an adversary, gaining domestic political advantage, achieving financial gain, or attaining a policy change.
Push back against criticism by dismissing your critics. This might be arguing that the critics use a different standard for you than with other actors or themselves; or arguing that their criticism is biased.
+
Push back against criticism by dismissing your critics. This might be arguing that the critics use a different standard for you than with other actors or themselves; or arguing that their criticism is biased.
TA02
@@ -431,7 +430,7 @@ Use bots to amplify narratives above algorithm thresholds. Bots are automated/pr
Mapping the target audience information environment analyzes the information space itself, including social media analytics, web traffic, and media surveys. Mapping the information environment may help the influence operation determine the most realistic and popular information channels to reach its target audience.
-Mapping the target audience information environment aids influence operations in determining the most vulnerable areas of the information space to target with messaging.
+
Mapping the target audience information environment analyses the information space itself, including social media analytics, web traffic, and media surveys. Mapping the information environment may help the influence operation determine the most realistic and popular information channels to reach its target audience. Mapping the target audience information environment aids influence operations in determining the most vulnerable areas of the information space to target with messaging.
An influence operation may use social media analytics to determine which factors will increase the operation content’s exposure to its target audience on social media platforms, including views, interactions, and sentiment relating to topics and content types. The social media platform itself or a third-party tool may collect the metrics.
+
An influence operation may use social media analytics to determine which factors will increase the operation content’s exposure to its target audience on social media platforms, including views, interactions, and sentiment relating to topics and content types. The social media platform itself or a third-party tool may collect the metrics.
An influence operation may evaluate its own or third-party media surveys to determine what type of content appeals to its target audience. Media surveys may provide insight into an audience’s political views, social class, general interests, or other indicators used to tailor operation messaging to its target audience.
+
An influence operation may evaluate its own or third-party media surveys to determine what type of content appeals to its target audience. Media surveys may provide insight into an audience’s political views, social class, general interests, or other indicators used to tailor operation messaging to its target audience.
An influence operation may identify trending hashtags on social media platforms for later use in boosting operation content. A hashtag refers to a word or phrase preceded by the hash symbol (#) on social media used to identify messages and posts relating to a specific topic. All public posts that use the same hashtag are aggregated onto a centralized page dedicated to the word or phrase and sorted either chronologically or by popularity.
+
An influence operation may identify trending hashtags on social media platforms for later use in boosting operation content. A hashtag refers to a word or phrase preceded by the hash symbol (#) on social media used to identify messages and posts relating to a specific topic. All public posts that use the same hashtag are aggregated onto a centralised page dedicated to the word or phrase and sorted either chronologically or by popularity.
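As a toy illustration of the aggregation described above, the sketch below extracts hashtags with a regular expression and ranks them by raw frequency. Real trending algorithms also weight velocity and recency; ignoring those is an assumption of this example.

```python
# Minimal sketch: surface trending hashtags in a sample of posts by frequency.
# Illustrative only; real platforms weight recency and growth rate, not raw counts.
import re
from collections import Counter

HASHTAG_RE = re.compile(r"#\w+")

def trending_hashtags(posts: list[str], top_n: int = 5) -> list[tuple[str, int]]:
    counts: Counter = Counter()
    for text in posts:
        # Lowercase so #Vote2024 and #vote2024 aggregate onto the same tag.
        counts.update(tag.lower() for tag in HASHTAG_RE.findall(text))
    return counts.most_common(top_n)

posts = [
    "Big rally today #Vote2024 #democracy",
    "Don't forget to #vote2024!",
    "#democracy needs defending",
]
print(trending_hashtags(posts))  # [('#vote2024', 2), ('#democracy', 2)]
```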
An influence operation may conduct web traffic analysis to determine which search engines, keywords, websites, and advertisements gain the most traction with its target audience.
+
An influence operation may conduct web traffic analysis to determine which search engines, keywords, websites, and advertisements gain the most traction with its target audience.
An influence operation may survey a target audience’s Internet availability and degree of media freedom to determine which target audience members will have access to operation content and on which platforms. An operation may face more difficulty targeting an information environment with heavy restrictions and media control than an environment with independent media, freedom of speech and of the press, and individual liberties.
+
An influence operation may survey a target audience’s Internet availability and degree of media freedom to determine which target audience members will have access to operation content and on which platforms. An operation may face more difficulty targeting an information environment with heavy restrictions and media control than an environment with independent media, freedom of speech and of the press, and individual liberties.
Identifying social and technical vulnerabilities determines weaknesses within the target audience information environment for later exploitation. Vulnerabilities include divisive political issues, weak cybersecurity infrastructure, search engine data voids, and other technical and non-technical weaknesses in the target information environment.
-Identifying social and technical vulnerabilities facilitates the later exploitation of the identified weaknesses to advance operation objectives.
+
Identifying social and technical vulnerabilities determines weaknesses within the target audience information environment for later exploitation. Vulnerabilities include divisive political issues, weak cybersecurity infrastructure, search engine data voids, and other technical and non-technical weaknesses in the target information environment. Identifying social and technical vulnerabilities facilitates the later exploitation of the identified weaknesses to advance operation objectives.
A data void refers to a word or phrase that results in little, manipulative, or low-quality search engine data. Data voids are hard to detect and relatively harmless until exploited by an entity aiming to quickly proliferate false or misleading information during a phenomenon that causes a high number of individuals to query the term or phrase. In the Plan phase, an influence operation may identify data voids for later exploitation in the operation.
-A 2019 report by Michael Golebiewski identifies five types of data voids. (1) “Breaking news” data voids occur when a keyword gains popularity during a short period of time, allowing an influence operation to publish false content before legitimate news outlets have an opportunity to publish relevant information. (2) An influence operation may create a “strategic new terms” data void by creating their own terms and publishing information online before promoting their keyword to the target audience. (3) An influence operation may publish content on “outdated terms” that have decreased in popularity, capitalizing on most search engines’ preferences for recency. (4) “Fragmented concepts” data voids separate connections between similar ideas, isolating segment queries to distinct search engine results. (5) An influence operation may use “problematic queries” that previously resulted in disturbing or inappropriate content to promote messaging until mainstream media recontextualizes the term.
+
A data void refers to a word or phrase that results in little, manipulative, or low-quality search engine data. Data voids are hard to detect and relatively harmless until exploited by an entity aiming to quickly proliferate false or misleading information during a phenomenon that causes a high number of individuals to query the term or phrase. In the Plan phase, an influence operation may identify data voids for later exploitation in the operation. A 2019 report by Michael Golebiewski identifies five types of data voids. (1) “Breaking news” data voids occur when a keyword gains popularity during a short period of time, allowing an influence operation to publish false content before legitimate news outlets have an opportunity to publish relevant information. (2) An influence operation may create a “strategic new terms” data void by creating their own terms and publishing information online before promoting their keyword to the target audience. (3) An influence operation may publish content on “outdated terms” that have decreased in popularity, capitalising on most search engines’ preferences for recency. (4) “Fragmented concepts” data voids separate connections between similar ideas, isolating segment queries to distinct search engine results. (5) An influence operation may use “problematic queries” that previously resulted in disturbing or inappropriate content to promote messaging until mainstream media recontextualises the term.
An influence operation may exploit existing racial, religious, demographic, or social prejudices to further polarize its target audience from the rest of the public.
+
An influence operation may exploit existing racial, religious, demographic, or social prejudices to further polarise its target audience from the rest of the public.
An influence operation may identify existing fissures to pit target populations against one another or facilitate a “divide-and-conquer" approach to tailor operation narratives along the divides.
+
An influence operation may identify existing fissures to pit target populations against one another or facilitate a “divide-and-conquer" approach to tailor operation narratives along the divides.
An influence operation may assess preexisting conspiracy theories or suspicions in a population to identify existing narratives that support operational objectives.
+
An influence operation may assess preexisting conspiracy theories or suspicions in a population to identify existing narratives that support operational objectives.
A wedge issue is a divisive political issue, usually concerning a social phenomenon, that divides individuals along a defined line. An influence operation may exploit wedge issues by intentionally polarizing the public along the wedge issue line and encouraging opposition between factions.
+
A wedge issue is a divisive political issue, usually concerning a social phenomenon, that divides individuals along a defined line. An influence operation may exploit wedge issues by intentionally polarising the public along the wedge issue line and encouraging opposition between factions.
An influence operation may identify or create a real or imaginary adversary to center operation narratives against. A real adversary may include certain politicians or political parties while imaginary adversaries may include falsified “deep state” actors that, according to conspiracies, run the state behind public view.
+
An influence operation may identify or create a real or imaginary adversary to centre operation narratives against. A real adversary may include certain politicians or political parties while imaginary adversaries may include falsified “deep state” actors that, according to conspiracies, run the state behind public view.
An influence operation may exploit existing weaknesses in a target’s media system. These weaknesses may include existing biases among media agencies, vulnerability to false news agencies on social media, or existing distrust of traditional media sources. An existing distrust among the public in the media system’s credibility holds high potential for exploitation by an influence operation when establishing alternative news agencies to spread operation content.
+
An influence operation may exploit existing weaknesses in a target’s media system. These weaknesses may include existing biases among media agencies, vulnerability to false news agencies on social media, or existing distrust of traditional media sources. An existing distrust among the public in the media system’s credibility holds high potential for exploitation by an influence operation when establishing alternative news agencies to spread operation content.
Actors may develop new narratives to further strategic or tactical goals, especially when existing narratives inadequately align with the campaign goals. New narratives provide more control in terms of crafting the message to achieve specific goals. However, new narratives may require more effort to disseminate than adapting or adopting existing narratives.
+
Actors may develop new narratives to further strategic or tactical goals, especially when existing narratives inadequately align with the campaign goals. New narratives provide more control in terms of crafting the message to achieve specific goals. However, new narratives may require more effort to disseminate than adapting or adopting existing narratives.
Integrate Target Audience Vulnerabilities into Narrative
-
An influence operation may seek to exploit the preexisting weaknesses, fears, and enemies of the target audience for integration into the operation’s narratives and overall strategy. Integrating existing vulnerabilities into the operational approach conserves resources by exploiting already weak areas of the target information environment instead of forcing the operation to create new vulnerabilities in the environment.
+
An influence operation may seek to exploit the preexisting weaknesses, fears, and enemies of the target audience for integration into the operation’s narratives and overall strategy. Integrating existing vulnerabilities into the operational approach conserves resources by exploiting already weak areas of the target information environment instead of forcing the operation to create new vulnerabilities in the environment.
When an operation recycles content from its own previous operations or plagiarizes from external operations. An operation may launder information to conserve resources that would have otherwise been utilized to develop new content.
+
When an operation recycles content from its own previous operations or plagiarises from external operations. An operation may launder information to conserve resources that would have otherwise been utilised to develop new content.
Copypasta refers to a piece of text that has been copied and pasted multiple times across various online platforms. A copypasta’s final form may differ from its original source text as users add, delete, or otherwise edit the content as they repost the text.
+
Copypasta refers to a piece of text that has been copied and pasted multiple times across various online platforms. A copypasta’s final form may differ from its original source text as users add, delete, or otherwise edit the content as they repost the text.
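Because copypasta accumulates small edits as it spreads, analysts typically look for near-duplicates rather than exact matches. A minimal sketch, assuming word-trigram shingles and an illustrative 0.6 Jaccard similarity threshold:

```python
# Hedged sketch: detect likely copypasta by measuring word-shingle overlap
# (Jaccard similarity) between posts. The shingle size and threshold are
# illustrative choices, not established constants.

def shingles(text: str, k: int = 3) -> set[tuple[str, ...]]:
    """Split text into overlapping k-word shingles."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def jaccard(a: set, b: set) -> float:
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def likely_copypasta(post: str, corpus: list[str], threshold: float = 0.6) -> bool:
    """Flag a post whose shingles heavily overlap any known post."""
    s = shingles(post)
    return any(jaccard(s, shingles(other)) >= threshold for other in corpus)

corpus = ["please copy and paste this to spread the word about the event"]
edited = "PLEASE copy and paste this to spread the word about the big event"
print(likely_copypasta(edited, corpus))  # True despite small edits
```

Shingling tolerates the insertions and deletions that defeat exact hashing, which is why it (and scalable variants such as MinHash) suits copypasta detection.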
An influence operation may take content from other sources without proper attribution. This content may be either misinformation content shared by others without malicious intent but now leveraged by the campaign as disinformation or disinformation content from other sources.
+
Plagiarise Content
+
An influence operation may take content from other sources without proper attribution. This content may be either misinformation content shared by others without malicious intent but now leveraged by the campaign as disinformation or disinformation content from other sources.
An influence operation may take authentic content from other sources and add deceptive labels or deceptively translate the content into other languages.
+
Deceptively Labelled or Translated
+
An influence operation may take authentic content from other sources and add deceptive labels or deceptively translate the content into other languages.
An influence operation may take content from other sources with proper attribution. This content may be either misinformation content shared by others without malicious intent but now leveraged by the campaign as disinformation or disinformation content from other sources. Examples include the appropriation of content from one inauthentic news site to another inauthentic news site or network in ways that align with the originator's licensing or terms of service.
+
An influence operation may take content from other sources with proper attribution. This content may be either misinformation content shared by others without malicious intent but now leveraged by the campaign as disinformation or disinformation content from other sources. Examples include the appropriation of content from one inauthentic news site to another inauthentic news site or network in ways that align with the originator's licensing or terms of service.
Creating and editing false or misleading text-based artifacts, often aligned with one or more specific narratives, for use in a disinformation campaign.
+
Creating and editing false or misleading text-based artefacts, often aligned with one or more specific narratives, for use in a disinformation campaign.
AI-generated text refers to synthetic text composed by computers using text-generating AI technology. Autonomous generation refers to content created by a bot without human input, also known as bot-created content generation. Autonomous generation represents the next step in automation after language generation and may lead to automated journalism. An influence operation may use readfakes or autonomous generation to quickly develop and distribute content to the target audience.
+
AI-generated text refers to synthetic text composed by computers using text-generating AI technology. Autonomous generation refers to content created by a bot without human input, also known as bot-created content generation. Autonomous generation represents the next step in automation after language generation and may lead to automated journalism. An influence operation may use readfakes or autonomous generation to quickly develop and distribute content to the target audience.
TA06
@@ -608,13 +604,13 @@ A 2019 report by Michael Golebiewski identifies five types of data voids. (1)
Creating and editing false or misleading visual artifacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include photographing staged real-life situations, repurposing existing digital images, or using image creation and editing technologies.
+
Creating and editing false or misleading visual artefacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include photographing staged real-life situations, repurposing existing digital images, or using image creation and editing technologies.
TA06
@@ -632,7 +628,7 @@ A 2019 report by Michael Golebiewski identifies five types of data voids. (1)
Cheap fakes utilize less sophisticated measures of altering an image, video, or audio, for example slowing, speeding, or cutting footage to create a false context surrounding an image or event.
+
Cheap fakes utilise less sophisticated measures of altering an image, video, or audio, for example slowing, speeding, or cutting footage to create a false context surrounding an image or event.
TA06
@@ -644,7 +640,7 @@ A 2019 report by Michael Golebiewski identifies five types of data voids. (1)
Creating and editing false or misleading video artifacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include staging videos of purportedly real situations, repurposing existing video artifacts, or using AI-generated video creation and editing technologies (including deepfakes).
+
Creating and editing false or misleading video artefacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include staging videos of purportedly real situations, repurposing existing video artefacts, or using AI-generated video creation and editing technologies (including deepfakes).
TA06
@@ -656,13 +652,13 @@ A 2019 report by Michael Golebiewski identifies five types of data voids. (1)
Cheap fakes utilize less sophisticated measures of altering an image, video, or audio, for example slowing, speeding, or cutting footage to create a false context surrounding an image or event.
+
Cheap fakes utilise less sophisticated measures of altering an image, video, or audio, for example slowing, speeding, or cutting footage to create a false context surrounding an image or event.
Creating and editing false or misleading audio artifacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include creating completely new audio content, repurposing existing audio artifacts (including cheap fakes), or using AI-generated audio creation and editing technologies (including deepfakes).
+
Creating and editing false or misleading audio artefacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include creating completely new audio content, repurposing existing audio artefacts (including cheap fakes), or using AI-generated audio creation and editing technologies (including deepfakes).
TA06
@@ -674,7 +670,7 @@ A 2019 report by Michael Golebiewski identifies five types of data voids. (1)
Cheap fakes utilize less sophisticated measures of altering an image, video, or audio, for example slowing, speeding, or cutting footage to create a false context surrounding an image or event.
+
Cheap fakes utilise less sophisticated measures of altering an image, video, or audio, for example slowing, speeding, or cutting footage to create a false context surrounding an image or event.
TA06
@@ -710,26 +706,25 @@ A 2019 report by Michael Golebiewski identifies five types of data voids. (1)
Anonymous accounts or anonymous users refer to users that access network resources without providing a username or password. An influence operation may use anonymous accounts to spread content without direct attribution to the operation.
+
Anonymous accounts or anonymous users refer to users that access network resources without providing a username or password. An influence operation may use anonymous accounts to spread content without direct attribution to the operation.
Cyborg accounts refer to partly manned, partly automated social media accounts. Cyborg accounts primarily act as bots, but a human operator periodically takes control of the account to engage with real social media users by responding to comments and posting original content. Influence operations may use cyborg accounts to reduce the amount of direct human input required to maintain a regular account but increase the apparent legitimacy of the cyborg account by occasionally breaking its bot-like behavior with human interaction.
+
Cyborg accounts refer to partly manned, partly automated social media accounts. Cyborg accounts primarily act as bots, but a human operator periodically takes control of the account to engage with real social media users by responding to comments and posting original content. Influence operations may use cyborg accounts to reduce the amount of direct human input required to maintain a regular account but increase the apparent legitimacy of the cyborg account by occasionally breaking its bot-like behaviour with human interaction.
Bots refer to autonomous internet users that interact with systems or other users while imitating traditional human behavior. Bots use a variety of tools to stay active without direct human operation, including artificial intelligence and big data analytics. For example, an individual may program a Twitter bot to retweet a tweet every time it contains a certain keyword or hashtag. An influence operation may use bots to increase its exposure and artificially promote its content across the internet without dedicating additional time or human resources.
-Amplifier bots promote operation content through reposts, shares, and likes to increase the content’s online popularity. Hacker bots are traditionally covert bots running on computer scripts that rarely engage with users and work primarily as agents of larger cyberattacks, such as a Distributed Denial of Service attacks. Spammer bots are programmed to post content on social media or in comment sections, usually as a supplementary tool. Impersonator bots102 pose as real people by mimicking human behavior, complicating their detection.
+
Bots refer to autonomous internet users that interact with systems or other users while imitating traditional human behaviour. Bots use a variety of tools to stay active without direct human operation, including artificial intelligence and big data analytics. For example, an individual may program a Twitter bot to retweet a tweet every time it contains a certain keyword or hashtag. An influence operation may use bots to increase its exposure and artificially promote its content across the internet without dedicating additional time or human resources. Amplifier bots promote operation content through reposts, shares, and likes to increase the content’s online popularity. Hacker bots are traditionally covert bots running on computer scripts that rarely engage with users and work primarily as agents of larger cyberattacks, such as Distributed Denial of Service attacks. Spammer bots are programmed to post content on social media or in comment sections, usually as a supplementary tool. Impersonator bots pose as real people by mimicking human behaviour, complicating their detection.
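The machine-like regularity described here is also what makes such bots detectable. The following sketch scores an account by combining its repost share with the uniformity of its posting intervals; the field names, equal weights, and the interpretation of the score are assumptions for illustration, not a validated classifier.

```python
# Illustrative detection heuristic (not a production classifier): amplifier
# bots tend to combine a very high share of reposts with near-constant
# posting intervals. Field names and weights are assumptions for the sketch.
from statistics import pstdev

def bot_likeness(posts: list[dict]) -> float:
    """posts: [{"is_repost": bool, "timestamp": seconds_since_epoch}, ...]
    Returns a score in [0, 1]; higher suggests more bot-like behaviour."""
    if len(posts) < 3:
        return 0.0  # too little evidence to score
    repost_ratio = sum(p["is_repost"] for p in posts) / len(posts)
    times = sorted(p["timestamp"] for p in posts)
    gaps = [b - a for a, b in zip(times, times[1:])]
    # Machine-like accounts post at nearly uniform intervals (low variance).
    regularity = 1.0 / (1.0 + pstdev(gaps))
    return 0.5 * repost_ratio + 0.5 * regularity

# Twenty reposts, exactly one minute apart: maximally bot-like under this toy score.
account = [{"is_repost": True, "timestamp": 60.0 * i} for i in range(20)]
print(round(bot_likeness(account), 2))  # 1.0
```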
Sockpuppet accounts refer to falsified accounts that either promote the influence operation’s own material or attack critics of the material online. Individuals who control sockpuppet accounts also man at least one other user account. Sockpuppet accounts help legitimize operation narratives by providing an appearance of external support for the material and discrediting opponents of the operation.
+
Sockpuppet accounts refer to falsified accounts that either promote the influence operation’s own material or attack critics of the material online. Individuals who control sockpuppet accounts also man at least one other user account. Sockpuppet accounts help legitimise operation narratives by providing an appearance of external support for the material and discrediting opponents of the operation.
TA15
@@ -753,52 +748,49 @@ Amplifier bots promote operation content through reposts, shares, and likes to i
An influence operation may hire trolls, or human operators of fake accounts that aim to provoke others by posting and amplifying content about controversial issues. Trolls can serve to discredit an influence operation’s opposition or bring attention to the operation’s cause through debate.
-Classic trolls refer to regular people who troll for personal reasons, such as attention-seeking or boredom. Classic trolls may advance operation narratives by coincidence but are not directly affiliated with any larger operation. Conversely, hybrid trolls act on behalf of another institution, such as a state or financial organization, and post content with a specific ideological goal. Hybrid trolls may be highly advanced and institutionalized or less organized and work for a single individual.
+
An influence operation may hire trolls, or human operators of fake accounts that aim to provoke others by posting and amplifying content about controversial issues. Trolls can serve to discredit an influence operation’s opposition or bring attention to the operation’s cause through debate. Classic trolls refer to regular people who troll for personal reasons, such as attention-seeking or boredom. Classic trolls may advance operation narratives by coincidence but are not directly affiliated with any larger operation. Conversely, hybrid trolls act on behalf of another institution, such as a state or financial organisation, and post content with a specific ideological goal. Hybrid trolls may be highly advanced and institutionalised or less organised and work for a single individual.
Operators build their own network, creating links between accounts -- whether authentic or inauthentic -- in order to amplify and promote narratives and artifacts, and encourage further growth of their network, as well as the ongoing sharing and engagement with operational content.
+
Operators build their own network, creating links between accounts -- whether authentic or inauthentic -- in order to amplify and promote narratives and artefacts, and encourage further growth of their network, as well as the ongoing sharing and engagement with operational content.
Influence operations may establish organizations with legitimate or falsified hierarchies, staff, and content to structure operation assets, provide a sense of legitimacy to the operation, or provide institutional backing to operation activities.
+
Create Organisations
+
Influence operations may establish organisations with legitimate or falsified hierarchies, staff, and content to structure operation assets, provide a sense of legitimacy to the operation, or provide institutional backing to operation activities.
A follow train is a group of people who follow each other on a social media platform, often as a way for an individual or campaign to grow its social media following. Follow trains may be a violation of platform Terms of Service. They are also known as follow-for-follow groups.
+
A follow train is a group of people who follow each other on a social media platform, often as a way for an individual or campaign to grow its social media following. Follow trains may be a violation of platform Terms of Service. They are also known as follow-for-follow groups.
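One way an analyst might surface follow-train behaviour is to measure follow reciprocity, since follow-for-follow groups create unusually dense mutual-follow clusters. A minimal sketch over an assumed in-memory follow graph (a real system would read this from platform data):

```python
# Hedged sketch: follow trains produce dense mutual-follow clusters.
# Given a follow graph as {account: set_of_accounts_it_follows}, measure
# the share of an account's follows that are reciprocated.

def reciprocity(graph: dict[str, set[str]], account: str) -> float:
    """Fraction of `account`'s follows that follow it back."""
    follows = graph.get(account, set())
    if not follows:
        return 0.0
    mutual = sum(1 for other in follows if account in graph.get(other, set()))
    return mutual / len(follows)

graph = {
    "a": {"b", "c", "d"},
    "b": {"a", "c"},
    "c": {"a", "b"},
    "d": set(),  # follows nobody back
}
print(round(reciprocity(graph, "a"), 2))  # 0.67: b and c reciprocate, d does not
```

A persistently high reciprocity across a cluster of accounts that all joined around the same time is one (weak, assumption-laden) signal of a follow train.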
When there is not an existing community or sub-group that meets a campaign's goals, an influence operation may seek to create a community or sub-group.
+
When there is not an existing community or sub-group that meets a campaign's goals, an influence operation may seek to create a community or sub-group.
An influence operation may fund proxies, or external entities that work for the operation. An operation may recruit/train users with existing sympathies towards the operation’s narratives and/or goals as proxies. Funding proxies serves various purposes including:
-- Diversifying operation locations to complicate attribution
-- Reducing the workload for direct operation assets
+
An influence operation may fund proxies, or external entities that work for the operation. An operation may recruit/train users with existing sympathies towards the operation’s narratives and/or goals as proxies. Funding proxies serves various purposes, including diversifying operation locations to complicate attribution and reducing the workload for direct operation assets.
Butterfly attacks occur when operators pretend to be members of a certain social group, usually a group that struggles for representation. An influence operation may mimic a group to insert controversial statements into the discourse, encourage the spread of operation content, or promote harassment among group members. Unlike astroturfing, butterfly attacks aim to infiltrate and discredit existing grassroots movements, organizations, and media campaigns.
+
Utilise Butterfly Attacks
+
Butterfly attacks occur when operators pretend to be members of a certain social group, usually a group that struggles for representation. An influence operation may mimic a group to insert controversial statements into the discourse, encourage the spread of operation content, or promote harassment among group members. Unlike astroturfing, butterfly attacks aim to infiltrate and discredit existing grassroots movements, organisations, and media campaigns.
An owned media asset refers to an agency or organization through which an influence operation may create, develop, and host content and narratives. Owned media assets include websites, blogs, social media pages, forums, and other platforms that facilitate the creation and organization of content.
+
An owned media asset refers to an agency or organisation through which an influence operation may create, develop, and host content and narratives. Owned media assets include websites, blogs, social media pages, forums, and other platforms that facilitate the creation and organisation of content.
Outsource Content Creation to External Organizations
-
An influence operation may outsource content creation to external companies to avoid attribution, increase the rate of content creation, or improve content quality, i.e., by employing an organization that can create content in the target audience’s native language. Employed organizations may include marketing companies for tailored advertisements or external content farms for high volumes of targeted media.
+
Outsource Content Creation to External Organisations
+
An influence operation may outsource content creation to external companies to avoid attribution, increase the rate of content creation, or improve content quality, i.e., by employing an organisation that can create content in the target audience’s native language. Employed organisations may include marketing companies for tailored advertisements or external content farms for high volumes of targeted media.
Creating fake people, often with accounts across multiple platforms. These personas can be as simple as a name, can contain slightly more background like location, profile pictures, backstory, or can be effectively backstopped with indicators like fake identity documents.
+
Creating fake people, often with accounts across multiple platforms. These personas can be as simple as a name, can contain slightly more background like location, profile pictures, backstory, or can be effectively backstopped with indicators like fake identity documents.
TA16
@@ -876,29 +868,25 @@ Classic trolls refer to regular people who troll for personal reasons, such as a
An influence operation may prepare assets impersonating legitimate entities to further conceal its network identity and add a layer of legitimacy to its operation content. Users will more likely believe and less likely fact-check news from recognizable sources rather than unknown sites. Legitimate entities may include authentic news outlets, public figures, organizations, or state entities.
-An influence operation may use a wide variety of cyber techniques to impersonate a legitimate entity’s website or social media account. Typosquatting87 is the international registration of a domain name with purposeful variations of the impersonated domain name through intentional typos, top-level domain (TLD) manipulation, or punycode. Typosquatting facilitates the creation of falsified websites by creating similar domain names in the URL box, leaving it to the user to confirm that the URL is correct.
+
An influence operation may prepare assets impersonating legitimate entities to further conceal its network identity and add a layer of legitimacy to its operation content. Users are more likely to believe, and less likely to fact-check, news from recognisable sources rather than unknown sites. Legitimate entities may include authentic news outlets, public figures, organisations, or state entities. An influence operation may use a wide variety of cyber techniques to impersonate a legitimate entity’s website or social media account. Typosquatting is the intentional registration of a domain name with purposeful variations of the impersonated domain name, through deliberate typos, top-level domain (TLD) manipulation, or punycode. Typosquatting facilitates the creation of falsified websites by producing near-identical domain names in the URL bar, leaving it to the user to notice that the URL is not the one intended.
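As a concrete illustration of the typosquatting mechanics described above, the sketch below enumerates candidate lookalike domains for a brand so that defenders can monitor or pre-register them. The homoglyph table, TLD list, and the domain example.com are illustrative assumptions, not an exhaustive confusables set.

```python
# Minimal sketch: enumerate candidate typosquat domains for a monitored brand.
# Mirrors the techniques named above: character typos, TLD swaps, and
# punycode-style homoglyph substitution. All tables here are illustrative.
import itertools

HOMOGLYPHS = {"a": "а", "e": "е", "o": "о", "i": "і"}  # Latin -> Cyrillic lookalikes
TLDS = ["com", "net", "org", "co", "info"]

def typo_variants(name):
    """Yield single-character deletions, duplications, and transpositions."""
    for i in range(len(name)):
        yield name[:i] + name[i + 1:]            # deletion
        yield name[:i] + name[i] + name[i:]      # duplication
    for i in range(len(name) - 1):
        yield name[:i] + name[i + 1] + name[i] + name[i + 2:]  # transposition

def homoglyph_variants(name):
    """Yield names with one letter swapped for a confusable glyph."""
    for i, ch in enumerate(name):
        if ch in HOMOGLYPHS:
            yield name[:i] + HOMOGLYPHS[ch] + name[i + 1:]

def candidates(domain):
    name, _, _tld = domain.rpartition(".")
    seen = set()
    for variant in itertools.chain([name], typo_variants(name), homoglyph_variants(name)):
        for tld in TLDS:
            cand = f"{variant}.{tld}"
            if cand != domain and cand not in seen:
                seen.add(cand)
                yield cand

if __name__ == "__main__":
    for cand in sorted(candidates("example.com"))[:25]:
        print(cand)
```

Open-source tools such as dnstwist implement the same enumeration far more thoroughly.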
Astroturfing occurs when an influence operation disguises itself as grassroots movement or organization that supports operation narratives. Unlike butterfly attacks, astroturfing aims to increase the appearance of popular support for the operation cause and does not infiltrate existing groups to discredit their objectives.
+
Astroturfing occurs when an influence operation disguises itself as a grassroots movement or organisation that supports operation narratives. Unlike butterfly attacks, astroturfing aims to increase the appearance of popular support for the operation cause and does not infiltrate existing groups to discredit their objectives.
An influence operation may prepare assets impersonating legitimate entities to further conceal its network identity and add a layer of legitimacy to its operation content. Users will more likely believe and less likely fact-check news from recognizable sources rather than unknown sites. Legitimate entities may include authentic news outlets, public figures, organizations, or state entities.
+
An influence operation may prepare assets impersonating legitimate entities to further conceal its network identity and add a layer of legitimacy to its operation content. Users are more likely to believe, and less likely to fact-check, news from recognisable sources rather than unknown sites. Legitimate entities may include authentic news outlets, public figures, organisations, or state entities.
An influence operation may co-opt trusted sources by infiltrating or repurposing a source to reach a target audience through existing, previously reliable networks. Co-opted trusted sources may include:
-- National or local new outlets
-- Research or academic publications
-- Online blogs or websites
+
An influence operation may co-opt trusted sources by infiltrating or repurposing a source to reach a target audience through existing, previously reliable networks. Co-opted trusted sources may include:
- National or local news outlets
- Research or academic publications
- Online blogs or websites
TA16
@@ -921,14 +909,14 @@ An influence operation may use a wide variety of cyber techniques to impersonate
Localized content refers to content that appeals to a specific community of individuals, often in defined geographic areas. An operation may create localized content using local language and dialects to resonate with its target audience and blend in with other local news and social media. Localized content may help an operation increase legitimacy, avoid detection, and complicate external attribution.
+
Create Localised Content
+
Localised content refers to content that appeals to a specific community of individuals, often in defined geographic areas. An operation may create localised content using local language and dialects to resonate with its target audience and blend in with other local news and social media. Localised content may help an operation increase legitimacy, avoid detection, and complicate external attribution.
An echo chamber refers to an internet subgroup, often along ideological lines, where individuals only engage with “others with which they are already in agreement.” A filter bubble refers to an algorithm's placement of an individual in content that they agree with or regularly engage with, possibly entrapping the user into a bubble of their own making. An operation may create these isolated areas of the internet by match existing groups, or aggregating individuals into a single target audience based on shared interests, politics, values, demographics, and other characteristics. Echo chambers and filter bubbles help to reinforce similar biases and content to the same target audience members.
+
An echo chamber refers to an internet subgroup, often along ideological lines, where individuals only engage with “others with which they are already in agreement.” A filter bubble refers to an algorithm's placement of an individual in content that they agree with or regularly engage with, possibly entrapping the user into a bubble of their own making. An operation may create these isolated areas of the internet by matching existing groups or by aggregating individuals into a single target audience based on shared interests, politics, values, demographics, and other characteristics. Echo chambers and filter bubbles help to reinforce similar biases and content to the same target audience members.
TA05
@@ -946,8 +934,7 @@ An influence operation may use a wide variety of cyber techniques to impersonate
A data void refers to a word or phrase that results in little, manipulative, or low-quality search engine data. Data voids are hard to detect and relatively harmless until exploited by an entity aiming to quickly proliferate false or misleading information during a phenomenon that causes a high number of individuals to query the term or phrase. In the Plan phase, an influence operation may identify data voids for later exploitation in the operation.
-A 2019 report by Michael Golebiewski identifies five types of data voids. (1) “Breaking news” data voids occur when a keyword gains popularity during a short period of time, allowing an influence operation to publish false content before legitimate news outlets have an opportunity to publish relevant information. (2) An influence operation may create a “strategic new terms” data void by creating their own terms and publishing information online before promoting their keyword to the target audience. (3) An influence operation may publish content on “outdated terms” that have decreased in popularity, capitalizing on most search engines’ preferences for recency. (4) “Fragmented concepts” data voids separate connections between similar ideas, isolating segment queries to distinct search engine results. (5) An influence operation may use “problematic queries” that previously resulted in disturbing or inappropriate content to promote messaging until mainstream media recontextualizes the term.
+
A data void refers to a word or phrase that results in little, manipulative, or low-quality search engine data. Data voids are hard to detect and relatively harmless until exploited by an entity aiming to quickly proliferate false or misleading information during a phenomenon that causes a high number of individuals to query the term or phrase. In the Plan phase, an influence operation may identify data voids for later exploitation in the operation. A 2019 report by Michael Golebiewski identifies five types of data voids. (1) “Breaking news” data voids occur when a keyword gains popularity during a short period of time, allowing an influence operation to publish false content before legitimate news outlets have an opportunity to publish relevant information. (2) An influence operation may create a “strategic new terms” data void by creating their own terms and publishing information online before promoting their keyword to the target audience. (3) An influence operation may publish content on “outdated terms” that have decreased in popularity, capitalising on most search engines’ preferences for recency. (4) “Fragmented concepts” data voids separate connections between similar ideas, isolating segment queries to distinct search engine results. (5) An influence operation may use “problematic queries” that previously resulted in disturbing or inappropriate content to promote messaging until mainstream media recontextualises the term.
TA05
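A rough way to operationalise data-void triage, under heavy assumptions: a term is a candidate void when it is being queried but returns few results, or few results from credible outlets. In the sketch below, `search_result_domains` is a hypothetical stand-in for whatever search API is available, and the allowlist and thresholds are placeholders.

```python
# Hedged sketch of data-void triage. `search_result_domains` is a hypothetical
# wrapper around a real search API; CREDIBLE and the thresholds are assumptions.
CREDIBLE = {"reuters.com", "apnews.com", "bbc.co.uk"}  # illustrative allowlist

def credible_share(domains):
    """Fraction of result domains that come from the allowlist."""
    return sum(d in CREDIBLE for d in domains) / len(domains) if domains else 0.0

def flag_data_voids(terms, search_result_domains, min_results=10, min_share=0.2):
    """Flag terms whose result sets are sparse or lack credible coverage."""
    flagged = []
    for term in terms:
        domains = search_result_domains(term)  # hypothetical search API call
        if len(domains) < min_results or credible_share(domains) < min_share:
            flagged.append(term)
    return flagged

if __name__ == "__main__":
    # Toy stand-in for a real search API: a fresh rumour term has no credible hits.
    fake_api = lambda t: ["obscureblog.example"] * 3 if "rumour" in t else ["reuters.com"] * 12
    print(flag_data_voids(["fresh rumour term", "election results"], fake_api))
```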
@@ -989,7 +976,7 @@ A 2019 report by Michael Golebiewski identifies five types of data voids. (1)
An audio livestream refers to an online audio broadcast capability that allows for real-time communication to closed or open networks. Examples include Twitter Spaces,
+
An audio livestream refers to an online audio broadcast capability that allows for real-time communication to closed or open networks. Examples include Twitter Spaces,
TA07
@@ -1055,19 +1042,19 @@ A 2019 report by Michael Golebiewski identifies five types of data voids. (1)
Platforms for finding, reviewing, and sharing information about brands, products, services, restaurants, travel destinations, etc. Examples include Yelp, TripAdvisor, etc.
+
Platforms for finding, reviewing, and sharing information about brands, products, services, restaurants, travel destinations, etc. Examples include Yelp, TripAdvisor, etc.
Leveraging formal, traditional, diplomatic channels to communicate with foreign governments (written documents, meetings, summits, diplomatic visits, etc). This type of diplomacy is conducted by diplomats of one nation with diplomats and other officials of another nation or international organization.
+
Leveraging formal, traditional, diplomatic channels to communicate with foreign governments (written documents, meetings, summits, diplomatic visits, etc.). This type of diplomacy is conducted by diplomats of one nation with diplomats and other officials of another nation or international organisation.
TA07
@@ -1103,7 +1090,7 @@ A 2019 report by Michael Golebiewski identifies five types of data voids. (1)
Commercial analytic firms collect data on target audience activities and evaluate the data to detect trends, such as content receiving high click-rates. An influence operation may employ commercial analytic firms to facilitate external collection on its target audience, complicating attribution efforts and better tailoring the content to audience preferences.
+
Commercial analytic firms collect data on target audience activities and evaluate the data to detect trends, such as content receiving high click-rates. An influence operation may employ commercial analytic firms to facilitate external collection on its target audience, complicating attribution efforts and better tailoring the content to audience preferences.
TA08
@@ -1127,7 +1114,7 @@ A 2019 report by Michael Golebiewski identifies five types of data voids. (1)
Direct posting refers to a method of posting content via a one-way messaging service, where the recipient cannot directly respond to the poster’s messaging. An influence operation may post directly to promote operation narratives to the target audience without allowing opportunities for fact-checking or disagreement, creating a false sense of support for the narrative.
+
Direct posting refers to a method of posting content via a one-way messaging service, where the recipient cannot directly respond to the poster’s messaging. An influence operation may post directly to promote operation narratives to the target audience without allowing opportunities for fact-checking or disagreement, creating a false sense of support for the narrative.
Cross-posting refers to posting the same message to multiple internet discussions, social media platforms or accounts, or news groups at one time. An influence operation may post content online in multiple communities and platforms to increase the chances of content exposure to the target audience.
+
Cross-posting refers to posting the same message to multiple internet discussions, social media platforms or accounts, or news groups at one time. An influence operation may post content online in multiple communities and platforms to increase the chances of content exposure to the target audience.
An influence operation may post content across groups to spread narratives and content to new communities within the target audiences or to new target audiences.
+
An influence operation may post content across groups to spread narratives and content to new communities within the target audiences or to new target audiences.
An influence operation may post content across platforms to spread narratives and content to new communities within the target audiences or to new target audiences. Posting across platforms can also remove opposition and context, helping the narrative spread with less opposition on the cross-posted platform.
+
An influence operation may post content across platforms to spread narratives and content to new communities within the target audiences or to new target audiences. Posting across platforms can also remove opposition and context, helping the narrative spread with less opposition on the cross-posted platform.
TA17
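Since cross-posting leaves near-identical text on several platforms, one defensive sketch is to normalise post text and group matching fingerprints across platforms. The post data and field layout below are illustrative.

```python
# Sketch: detect likely cross-posting by normalising text and grouping
# identical fingerprints across platforms. Toy data; fields are illustrative.
import hashlib
import re
import unicodedata
from collections import defaultdict

def fingerprint(text):
    """Normalise text so superficially different reposts collide."""
    text = unicodedata.normalize("NFKC", text).lower()
    text = re.sub(r"https?://\S+", "", text)   # shortened URLs differ per platform
    text = re.sub(r"[^\w\s]", "", text)        # drop punctuation and emoji
    text = re.sub(r"\s+", " ", text).strip()
    return hashlib.sha256(text.encode()).hexdigest()

posts = [  # toy data: (platform, post text)
    ("twitter", "BREAKING: the dam has failed!!! http://t.co/x"),
    ("facebook", "Breaking: the dam has failed! https://fb.me/y"),
    ("forum", "an unrelated post"),
]

platforms_by_fp = defaultdict(set)
for platform, text in posts:
    platforms_by_fp[fingerprint(text)].add(platform)

for fp, platforms in platforms_by_fp.items():
    if len(platforms) > 1:
        print("likely cross-post seen on:", sorted(platforms))
```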
@@ -1199,13 +1186,13 @@ A 2019 report by Michael Golebiewski identifies five types of data voids. (1)
Incentivizing content sharing refers to actions that encourage users to share content themselves, reducing the need for the operation itself to post and promote its own content.
+
Incentivizing content sharing refers to actions that encourage users to share content themselves, reducing the need for the operation itself to post and promote its own content.
Manipulating a platform algorithm refers to conducting activity on a platform in a way that intentionally targets its underlying algorithm. After analyzing a platform’s algorithm (see: Select Platforms), an influence operation may use a platform in a way that increases its content exposure, avoids content removal, or otherwise benefits the operation’s strategy. For example, an influence operation may use bots to amplify its posts so that the platform’s algorithm recognizes engagement with operation content and further promotes the content on user timelines.
+
Manipulating a platform algorithm refers to conducting activity on a platform in a way that intentionally targets its underlying algorithm. After analysing a platform’s algorithm (see: Select Platforms), an influence operation may use a platform in a way that increases its content exposure, avoids content removal, or otherwise benefits the operation’s strategy. For example, an influence operation may use bots to amplify its posts so that the platform’s algorithm recognises engagement with operation content and further promotes the content on user timelines.
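One observable trace of this technique is that the same small set of accounts engages with many operation posts within minutes of publication. A minimal detection sketch, assuming you can collect early-engagement account sets per post, is to flag post pairs whose amplifier sets overlap strongly; the 0.5 Jaccard threshold is an arbitrary placeholder.

```python
# Hedged sketch: flag post pairs boosted by the same early-amplifier cluster.
# Assumes per-post sets of accounts engaging shortly after publication.
from itertools import combinations

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

early_amplifiers = {  # toy data: post_id -> accounts engaging in first minutes
    "post1": {"bot1", "bot2", "bot3", "alice"},
    "post2": {"bot1", "bot2", "bot3", "bob"},
    "post3": {"carol", "dave"},
}

for (p1, s1), (p2, s2) in combinations(early_amplifiers.items(), 2):
    sim = jaccard(s1, s2)
    if sim >= 0.5:  # threshold is a tunable assumption
        print(f"{p1} and {p2} share early amplifiers (Jaccard={sim:.2f})")
```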
Bypassing content blocking refers to actions taken to circumvent network security measures that prevent users from accessing certain servers, resources, or other online spheres. An influence operation may bypass content blocking to proliferate its content on restricted areas of the internet. Common strategies for bypassing content blocking include:
-- Altering IP addresses to avoid IP filtering
-- Using a Virtual Private Network (VPN) to avoid IP filtering
-- Using a Content Delivery Network (CDN) to avoid IP filtering
-- Enabling encryption to bypass packet inspection blocking
-- Manipulating text to avoid filtering by keywords
-- Posting content on multiple platforms to avoid platform-specific removals - Using local facilities or modified DNS servers to avoid DNS filtering
+
Bypassing content blocking refers to actions taken to circumvent network security measures that prevent users from accessing certain servers, resources, or other online spheres. An influence operation may bypass content blocking to proliferate its content on restricted areas of the internet. Common strategies for bypassing content blocking include:
- Altering IP addresses to avoid IP filtering
- Using a Virtual Private Network (VPN) to avoid IP filtering
- Using a Content Delivery Network (CDN) to avoid IP filtering
- Enabling encryption to bypass packet inspection blocking
- Manipulating text to avoid filtering by keywords
- Posting content on multiple platforms to avoid platform-specific removals
- Using local facilities or modified DNS servers to avoid DNS filtering
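The "manipulating text" item above is the one most directly answerable in code: a filter that folds homoglyphs and leetspeak before matching is much harder to evade than one doing raw substring checks. A minimal sketch with a deliberately tiny confusables table follows; production systems would use a full Unicode confusables map.

```python
# Defensive sketch: normalise homoglyphs and leetspeak before keyword matching
# so trivial obfuscation no longer slips past a filter. Tables are illustrative.
import unicodedata

HOMOGLYPHS = str.maketrans({"а": "a", "е": "e", "о": "o", "і": "i"})  # Cyrillic -> Latin
LEET = str.maketrans({"0": "o", "1": "l", "3": "e", "4": "a", "5": "s", "@": "a", "$": "s"})

def normalise(text):
    text = text.translate(HOMOGLYPHS)            # fold known confusable glyphs
    text = unicodedata.normalize("NFKD", text)   # strip accents, fullwidth forms
    text = text.encode("ascii", "ignore").decode()
    return text.lower().translate(LEET)

def contains_blocked(text, keywords=("vaccine",)):
    folded = normalise(text)
    return any(kw in folded for kw in keywords)

assert contains_blocked("v@ccine")   # leetspeak variant
assert contains_blocked("vаccine")   # Cyrillic 'а' variant
```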
Direct users to alternative platforms refers to encouraging users to move from the platform on which they initially viewed operation content and engage with content on alternate information channels, including separate social media channels and inauthentic websites. An operation may drive users to alternative platforms to diversify its information channels and ensure the target audience knows where to access operation content if the initial platform suspends, flags, or otherwise removes original operation assets and content.
+
Direct users to alternative platforms refers to encouraging users to move from the platform on which they initially viewed operation content and engage with content on alternate information channels, including separate social media channels and inauthentic websites. An operation may drive users to alternative platforms to diversify its information channels and ensure the target audience knows where to access operation content if the initial platform suspends, flags, or otherwise removes original operation assets and content.
Control Information Environment through Offensive Cyberspace Operations
-
Controlling the information environment through offensive cyberspace operations uses cyber tools and techniques to alter the trajectory of content in the information space to either prioritize operation messaging or block opposition messaging.
+
Controlling the information environment through offensive cyberspace operations uses cyber tools and techniques to alter the trajectory of content in the information space to either prioritise operation messaging or block opposition messaging.
Deleting opposing content refers to the removal of content that conflicts with operational narratives from selected platforms. An influence operation may delete opposing content to censor contradictory information from the target audience, allowing operation narratives to take priority in the information space.
+
Deleting opposing content refers to the removal of content that conflicts with operational narratives from selected platforms. An influence operation may delete opposing content to censor contradictory information from the target audience, allowing operation narratives to take priority in the information space.
Content blocking refers to actions taken to restrict internet access or render certain areas of the internet inaccessible. An influence operation may restrict content based on both network and content attributes.
+
Content blocking refers to actions taken to restrict internet access or render certain areas of the internet inaccessible. An influence operation may restrict content based on both network and content attributes.
Destroying information generation capabilities refers to actions taken to limit, degrade, or otherwise incapacitate an actor’s ability to generate conflicting information. An influence operation may destroy an actor’s information generation capabilities by physically dismantling the information infrastructure, disconnecting resources needed for information generation, or redirecting information generation personnel. An operation may destroy an adversary’s information generation capabilities to limit conflicting content exposure to the target audience and crowd the information space with its own narratives.
+
Destroying information generation capabilities refers to actions taken to limit, degrade, or otherwise incapacitate an actor’s ability to generate conflicting information. An influence operation may destroy an actor’s information generation capabilities by physically dismantling the information infrastructure, disconnecting resources needed for information generation, or redirecting information generation personnel. An operation may destroy an adversary’s information generation capabilities to limit conflicting content exposure to the target audience and crowd the information space with its own narratives.
A server redirect, also known as a URL redirect, occurs when a server automatically forwards a user from one URL to another using server-side scripting languages. An influence operation may conduct a server redirect to divert target audience members from one website to another without their knowledge. The redirected website may pose as a legitimate source, host malware, or otherwise aid operation objectives.
+
A server redirect, also known as a URL redirect, occurs when a server automatically forwards a user from one URL to another using server-side scripting languages. An influence operation may conduct a server redirect to divert target audience members from one website to another without their knowledge. The redirected website may pose as a legitimate source, host malware, or otherwise aid operation objectives.
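Because each hop in a redirect chain is visible to an HTTP client, defenders can expose where a link actually lands before trusting it. A minimal sketch using the third-party requests library follows; the target URL is a placeholder.

```python
# Sketch: expose a server redirect chain before trusting a link.
# Requires the third-party requests library (pip install requests).
import requests

def redirect_chain(url):
    resp = requests.get(url, allow_redirects=True, timeout=10)
    # resp.history holds one Response per intermediate 3xx hop.
    return [r.url for r in resp.history] + [resp.url]

if __name__ == "__main__":
    for hop in redirect_chain("http://example.com"):  # placeholder URL
        print(hop)
```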
Operators can suppress the opposition by exploiting platform content moderation tools and processes like reporting non-violative content to platforms for takedown and goading opposition actors into taking actions that result in platform action or target audience disapproval.
+
Operators can suppress the opposition by exploiting platform content moderation tools and processes like reporting non-violative content to platforms for takedown and goading opposition actors into taking actions that result in platform action or target audience disapproval.
Reporting opposing content refers to notifying and providing an instance of a violation of a platform’s guidelines and policies for conduct on the platform. In addition to simply reporting the content, an operation may leverage copyright regulations to trick social media and web platforms into removing opposing content by manipulating the content to appear in violation of copyright laws. Reporting opposing content facilitates the suppression of contradictory information and allows operation narratives to take priority in the information space.
+
Reporting opposing content refers to flagging content to a platform as a violation of its guidelines and policies for on-platform conduct. In addition to simply reporting the content, an operation may leverage copyright regulations to trick social media and web platforms into removing opposing content by manipulating it to appear in violation of copyright law. Reporting opposing content facilitates the suppression of contradictory information and allows operation narratives to take priority in the information space.
Physical violence refers to the use of force to injure, abuse, damage, or destroy. An influence operation may conduct or encourage physical violence to discourage opponents from promoting conflicting content or draw attention to operation narratives using shock value.
+
Physical violence refers to the use of force to injure, abuse, damage, or destroy. An influence operation may conduct or encourage physical violence to discourage opponents from promoting conflicting content or draw attention to operation narratives using shock value.
An operation may use pseudonyms, or fake names, to mask the identity of operation accounts, publish anonymous content, or otherwise use falsified personas to conceal identity of the operation. An operation may coordinate pseudonyms across multiple platforms, for example, by writing an article under a pseudonym and then posting a link to the article on social media on an account with the same falsified name.
+
An operation may use pseudonyms, or fake names, to mask the identity of operation accounts, publish anonymous content, or otherwise use falsified personas to conceal the identity of the operation. An operation may coordinate pseudonyms across multiple platforms, for example, by writing an article under a pseudonym and then posting a link to the article on social media on an account with the same falsified name.
Concealing network identity aims to hide the existence an influence operation’s network completely. Unlike concealing sponsorship, concealing network identity denies the existence of any sort of organization.
+
Concealing network identity aims to hide the existence of an influence operation’s network completely. Unlike concealing sponsorship, concealing network identity denies the existence of any sort of organisation.
Distancing reputable individuals from the operation occurs when enlisted individuals, such as celebrities or subject matter experts, actively disengage themselves from operation activities and messaging. Individuals may distance themselves from the operation by deleting old posts or statements, unfollowing operation information assets, or otherwise detaching themselves from the operation’s timeline. An influence operation may want reputable individuals to distance themselves from the operation to reduce operation exposure, particularly if the operation aims to remove all evidence.
+
Distancing reputable individuals from the operation occurs when enlisted individuals, such as celebrities or subject matter experts, actively disengage themselves from operation activities and messaging. Individuals may distance themselves from the operation by deleting old posts or statements, unfollowing operation information assets, or otherwise detaching themselves from the operation’s timeline. An influence operation may want reputable individuals to distance themselves from the operation to reduce operation exposure, particularly if the operation aims to remove all evidence.
Account laundering occurs when an influence operation acquires control of previously legitimate online accounts from third parties through sale or exchange and often in contravention of terms of use. Influence operations use laundered accounts to reach target audience members from an existing information channel and complicate attribution.
+
Account laundering occurs when an influence operation acquires control of previously legitimate online accounts from third parties through sale or exchange and often in contravention of terms of use. Influence operations use laundered accounts to reach target audience members from an existing information channel and complicate attribution.
Changing names of accounts occurs when an operation changes the name of an existing social media account. An operation may change the names of its accounts throughout an operation to avoid detection or alter the names of newly acquired or repurposed accounts to fit operational narratives.
+
Changing names of accounts occurs when an operation changes the name of an existing social media account. An operation may change the names of its accounts throughout an operation to avoid detection or alter the names of newly acquired or repurposed accounts to fit operational narratives.
TA11
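Renames are detectable when observations are keyed on the platform's stable account identifier rather than the display handle. A toy sketch, assuming periodic crawls that record (account_id, handle, seen_at):

```python
# Sketch: display names change, numeric account IDs usually do not. Keying
# observations on the stable ID surfaces rename events. Data is illustrative.
from collections import defaultdict

observations = [  # toy data: (account_id, handle, seen_at) from periodic crawls
    (42, "local_news_tulsa", "2024-01-01"),
    (42, "patriot_voice_99", "2024-02-01"),
    (7, "gardening_tips", "2024-01-15"),
]

history = defaultdict(list)
for account_id, handle, seen_at in sorted(observations, key=lambda o: o[2]):
    if not history[account_id] or history[account_id][-1][0] != handle:
        history[account_id].append((handle, seen_at))

for account_id, names in history.items():
    if len(names) > 1:
        print(f"account {account_id} renamed:", " -> ".join(h for h, _ in names))
```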
@@ -1379,25 +1360,25 @@ A 2019 report by Michael Golebiewski identifies five types of data voids. (1)
Concealing network identity aims to hide the existence an influence operation’s network completely. Unlike concealing sponsorship, concealing network identity denies the existence of any sort of organization.
+
Concealing network identity aims to hide the existence of an influence operation’s network completely. Unlike concealing sponsorship, concealing network identity denies the existence of any sort of organisation.
An influence operation may mix its own operation content with legitimate news or external unrelated content to disguise operational objectives, narratives, or existence. For example, an operation may generate "lifestyle" or "cuisine" content alongside regular operation content.
+
An influence operation may mix its own operation content with legitimate news or external unrelated content to disguise operational objectives, narratives, or existence. For example, an operation may generate "lifestyle" or "cuisine" content alongside regular operation content.
Breaking association with content occurs when an influence operation actively separates itself from its own content. An influence operation may break association with content by unfollowing, unliking, or unsharing its content, removing attribution from its content, or otherwise taking actions that distance the operation from its messaging. An influence operation may break association with its content to complicate attribution or regain credibility for a new operation.
+
Breaking association with content occurs when an influence operation actively separates itself from its own content. An influence operation may break association with content by unfollowing, unliking, or unsharing its content, removing attribution from its content, or otherwise taking actions that distance the operation from its messaging. An influence operation may break association with its content to complicate attribution or regain credibility for a new operation.
URL deletion occurs when an influence operation completely removes its website registration, rendering the URL inaccessible. An influence operation may delete its URLs to complicate attribution or remove online documentation that the operation ever occurred.
+
URL deletion occurs when an influence operation completely removes its website registration, rendering the URL inaccessible. An influence operation may delete its URLs to complicate attribution or remove online documentation that the operation ever occurred.
TA11
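URL deletion is best countered by preserving evidence while pages are live. One low-effort option, sketched below, is to request a snapshot from the Internet Archive's public Save Page Now endpoint; whether the final response URL points at the archived copy can vary, so treat this as a starting point rather than a guaranteed API contract.

```python
# Sketch: counter URL deletion by snapshotting operation pages while live,
# via the Internet Archive's Save Page Now endpoint. Heavier pipelines would
# also keep local WARC copies. Requires requests (pip install requests).
import requests

def snapshot(url):
    resp = requests.get(f"https://web.archive.org/save/{url}", timeout=60)
    resp.raise_for_status()
    # The archived copy's location is typically reflected in the final URL,
    # though the endpoint's response format is not a stable contract.
    return resp.url

if __name__ == "__main__":
    print(snapshot("http://example.com"))  # placeholder URL
```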
@@ -1415,25 +1396,25 @@ A 2019 report by Michael Golebiewski identifies five types of data voids. (1)
Deleting accounts and account activity occurs when an influence operation removes its online social media assets, including social media accounts, posts, likes, comments, and other online artifacts. An influence operation may delete its accounts and account activity to complicate attribution or remove online documentation that the operation ever occurred.
+
Deleting accounts and account activity occurs when an influence operation removes its online social media assets, including social media accounts, posts, likes, comments, and other online artefacts. An influence operation may delete its accounts and account activity to complicate attribution or remove online documentation that the operation ever occurred.
An influence operation may redirect its falsified or typosquatted URLs to legitimate websites to increase the operation's appearance of legitimacy, complicate attribution, and avoid detection.
+
An influence operation may redirect its falsified or typosquatted URLs to legitimate websites to increase the operation's appearance of legitimacy, complicate attribution, and avoid detection.
Removing post origins refers to the elimination of evidence that indicates the initial source of operation content, often to complicate attribution. An influence operation may remove post origins by deleting watermarks, renaming files, or removing embedded links in its content.
+
Removing post origins refers to the elimination of evidence that indicates the initial source of operation content, often to complicate attribution. An influence operation may remove post origins by deleting watermarks, renaming files, or removing embedded links in its content.
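One weak but cheap signal of origin-scrubbing is media whose embedded metadata has been stripped. The sketch below flags images with an empty EXIF block using Pillow; note that many platforms strip EXIF routinely, so absence alone proves nothing. The filename is illustrative.

```python
# Sketch: flag images whose EXIF block is empty -- a weak signal of
# origin-scrubbing, since platforms also strip EXIF routinely.
# Requires Pillow (pip install Pillow).
from PIL import Image

def has_exif(path):
    with Image.open(path) as img:
        return len(img.getexif()) > 0

if __name__ == "__main__":
    for path in ["suspect_post.jpg"]:  # illustrative filename
        print(path, "has EXIF" if has_exif(path) else "EXIF stripped or absent")
```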
Misattributed activity refers to incorrectly attributed operation activity. For example, a state sponsored influence operation may conduct operation activity in a way that mimics another state so that external entities misattribute activity to the incorrect state. An operation may misattribute their activities to complicate attribution, avoid detection, or frame an adversary for negative behavior.
+
Misattributed activity refers to incorrectly attributed operation activity. For example, a state-sponsored influence operation may conduct operation activity in a way that mimics another state so that external entities misattribute activity to the incorrect state. An operation may misattribute their activities to complicate attribution, avoid detection, or frame an adversary for negative behaviour.
TA11
@@ -1445,26 +1426,25 @@ A 2019 report by Michael Golebiewski identifies five types of data voids. (1)
Concealing sponsorship aims to mislead or obscure the identity of the hidden sponsor behind an operation rather than entity publicly running the operation. Operations that conceal sponsorship may maintain visible falsified groups, news outlets, non-profits, or other organizations, but seek to mislead or obscure the identity sponsoring, funding, or otherwise supporting these entities.
-Influence operations may use a variety of techniques to mask the location of their social media accounts to complicate attribution and conceal evidence of foreign interference. Operation accounts may set their location to a false place, often the location of the operation’s target audience, and post in the region’s language
+
Concealing sponsorship aims to mislead or obscure the identity of the hidden sponsor behind an operation rather than the entity publicly running the operation. Operations that conceal sponsorship may maintain visible falsified groups, news outlets, non-profits, or other organisations, but seek to mislead or obscure the identity of those sponsoring, funding, or otherwise supporting these entities. Influence operations may use a variety of techniques to mask the location of their social media accounts to complicate attribution and conceal evidence of foreign interference. Operation accounts may set their location to a false place, often the location of the operation’s target audience, and post in the region’s language.
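The location-masking behaviour described here often leaves a detectable residue: accounts keep their true posting rhythm even when their profile claims another region. A hedged heuristic is to estimate the UTC offset that best places an account's posts into normal waking hours and compare it with the claimed location; the timestamps and claimed offset below are illustrative.

```python
# Hedged sketch: infer the UTC offset at which an account's activity best
# fits normal waking hours and compare it with the claimed location.
# Post hours and the claimed offset are toy data.
def best_offset(utc_hours, waking=range(8, 23)):
    """Pick the UTC offset that puts the most posts into waking hours."""
    def score(offset):
        return sum(((h + offset) % 24) in waking for h in utc_hours)
    return max(range(-12, 13), key=score)

utc_post_hours = [2, 3, 3, 4, 5, 23, 1, 2]  # hours (UTC) of observed posts
claimed_offset = -5                          # profile claims US East Coast

estimate = best_offset(utc_post_hours)
if abs(estimate - claimed_offset) >= 6:      # mismatch threshold is an assumption
    print(f"claimed UTC{claimed_offset:+d} but activity fits UTC{estimate:+d}")
```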
Hosting refers to services through which storage and computing resources are provided to an individual or organization for the accommodation and maintenance of one or more websites and related services. Services may include web hosting, file sharing, and email distribution. Bulletproof hosting refers to services provided by an entity, such as a domain hosting or web hosting firm, that allows its customer considerable leniency in use of the service. An influence operation may utilize bulletproof hosting to maintain continuity of service for suspicious, illegal, or disruptive operation activities that stricter hosting services would limit, report, or suspend.
+
Utilise Bulletproof Hosting
+
Hosting refers to services through which storage and computing resources are provided to an individual or organisation for the accommodation and maintenance of one or more websites and related services. Services may include web hosting, file sharing, and email distribution. Bulletproof hosting refers to services provided by an entity, such as a domain hosting or web hosting firm, that allows its customer considerable leniency in use of the service. An influence operation may utilise bulletproof hosting to maintain continuity of service for suspicious, illegal, or disruptive operation activities that stricter hosting services would limit, report, or suspend.