Amended technique names to MLA Title Case in the following sheets: techniques, detections, and countermeasures

This commit is contained in:
Stephen Campbell 2023-05-25 15:57:53 -04:00
parent c4275fe3f8
commit c39577572d
184 changed files with 585 additions and 605 deletions

View file

@ -1,4 +1,4 @@
# Technique T0009: Create fake experts
# Technique T0009: Create Fake Experts
* **Summary**: Stories planted or promoted in computational propaganda operations often make use of experts fabricated from whole cloth, sometimes specifically for the story itself.

View file

@ -1,4 +1,4 @@
# Technique T0010: Cultivate ignorant agents
# Technique T0010: Cultivate Ignorant Agents
* **Summary**: Cultivate propagandists for a cause, the goals of which are not fully comprehended, and who are used cynically by the leaders of the cause. Independent actors use social media and specialised web sites to strategically reinforce and spread messages compatible with their own. Their networks are infiltrated and used by state media disinformation organisations to amplify the state's own disinformation strategies against target populations. Many are traffickers in conspiracy theories or hoaxes, unified by a suspicion of Western governments and mainstream media. Their narratives, which appeal to leftists hostile to globalism and military intervention and to nationalists against immigration, are frequently infiltrated and shaped by state-controlled trolls and altered news items from agencies such as RT and Sputnik. Also known as "useful idiots" or "unwitting agents".

View file

@ -1,4 +1,4 @@
# Technique T0011: Compromise legitimate accounts
# Technique T0011: Compromise Legitimate Accounts
* **Summary**: Hack or take over legitimate accounts to distribute misinformation or damaging content.

View file

@ -1,4 +1,4 @@
# Technique T0013: Create inauthentic websites
# Technique T0013: Create Inauthentic Websites
* **Summary**: Create media assets to support inauthentic organizations (e.g. think tanks), people (e.g. experts), and/or serve as sites to distribute malware/launch phishing operations.

View file

@ -1,4 +1,4 @@
# Technique T0014.001: Raise funds from malign actors
# Technique T0014.001: Raise Funds from Malign Actors
* **Summary**: Raising funds from malign actors may include contributions from foreign agents, cutouts or proxies, shell companies, dark money groups, etc.

View file

@ -1,4 +1,4 @@
# Technique T0014.002: Raise funds from ignorant agents
# Technique T0014.002: Raise Funds from Ignorant Agents
* **Summary**: Raising funds from ignorant agents may include scams, donations intended for one stated purpose but then used for another, etc.

View file

@ -1,4 +1,4 @@
# Technique T0014: Prepare fundraising campaigns
# Technique T0014: Prepare Fundraising Campaigns
* **Summary**: Fundraising campaigns refer to an influence operation's systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while raising a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns (see: Develop Information Pathways) to promote operation messaging while raising money to support its activities.

View file

@ -1,4 +1,4 @@
# Technique T0015: Create hashtags and search artifacts
# Technique T0015: Create Hashtags and Search Artifacts
* **Summary**: Create one or more hashtags and/or hashtag groups. Many incident-based campaigns will create hashtags to promote their fabricated event. Creating a hashtag for an incident can have two important effects: 1. create a perception of reality around an event (surely only "real" events would be discussed in a hashtag; after all, the event has a name!); and 2. publicize the story more widely through trending lists and search behavior. An asset is needed to direct/control/manage the "conversation" connected to launching a new incident/campaign with a new hashtag on applicable social media sites.

View file

@ -1,4 +1,4 @@
# Technique T0017: Conduct fundraising
# Technique T0017: Conduct Fundraising
* **Summary**: Fundraising campaigns refer to an influence operation's systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while raising a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns to promote operation messaging while raising money to support its activities.

View file

@ -1,4 +1,4 @@
# Technique T0019.001: Create fake research
# Technique T0019.001: Create Fake Research
* **Summary**: Create fake academic research. Example: fake social science research is often aimed at hot-button social issues such as gender, race, and sexuality. Fake science research can target the climate science debate or pseudoscience such as anti-vaxx claims.

View file

@ -1,4 +1,4 @@
# Technique T0019: Generate information pollution
# Technique T0019: Generate Information Pollution
* **Summary**: Flood social channels; drive traffic/engagement to all assets; create aura/sense/perception of pervasiveness/consensus (for or against or both simultaneously) of an issue or topic. "Nothing is true, but everything is possible." Akin to astroturfing campaign.

View file

@ -1,4 +1,4 @@
# Technique T0020: Trial content
# Technique T0020: Trial Content
* **Summary**: Iteratively test incident performance (messages, content, etc.), e.g. A/B test headline/content engagement metrics, website and/or funding campaign conversion rates.

View file

@ -1,4 +1,4 @@
# Technique T0023: Distort facts
# Technique T0023: Distort Facts
* **Summary**: Change, twist, or exaggerate existing facts to construct a narrative that differs from reality. Examples: images and ideas can be distorted by being placed in an improper context.

View file

@ -1,4 +1,4 @@
# Technique T0029: Online polls
# Technique T0029: Online Polls
* **Summary**: Create fake online polls, or manipulate existing online polls. This is a data-gathering tactic targeting those who engage, and potentially their networks of friends/followers as well.

View file

@ -1,37 +1,17 @@
# Technique T0039: Prepare fundraising campaigns
# Technique T0039: Bait Legitimate Influencers
* **Summary**: Fundraising campaigns refer to an influence operation's systematic effort to seek financial support for a charity, cause, or other enterprise using online activities that further promote operation information pathways while raising a profit. Many influence operations have engaged in crowdfunding services on platforms including Tipee, Patreon, and GoFundMe. An operation may use its previously prepared fundraising campaigns (see: Develop Information Pathways) to promote operation messaging while raising money to support its activities.
* **Summary**: Credibility in a social media environment is often a function of the size of a user's network. "Influencers" are so-called because of their reach, typically understood as: 1) the size of their network (i.e. the number of followers, perhaps weighted by their own influence); and 2) the rate at which their comments are re-circulated (these two metrics are related). Add traditional media players at all levels of credibility and professionalism to this, and the number of potentially influential carriers available for unwitting amplification becomes substantial. By targeting high-influence people and organizations in all types of media with narratives and content engineered to appeal to their emotional or ideological drivers, influence campaigns are able to add perceived credibility to their messaging via saturation and adoption by trusted agents such as celebrities, journalists, and local leaders.
* **Belongs to tactic stage**: TA15
* **Belongs to tactic stage**: TA08
| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00006 Columbian Chemicals](../generated_pages/incidents/I00006.md) | bait journalists/media/politicians |
| [I00010 ParklandTeens](../generated_pages/incidents/I00010.md) | journalist/media baiting |
| [I00015 ConcordDiscovery](../generated_pages/incidents/I00015.md) | journalist/media baiting |
| Counters | Response types |
| -------- | -------------- |
| [C00009 Educate high profile influencers on best practices](../generated_pages/counters/C00009.md) | D02 |
| [C00027 Create culture of civility](../generated_pages/counters/C00027.md) | D07 |
| [C00046 Marginalise and discredit extremist groups](../generated_pages/counters/C00046.md) | D04 |
| [C00072 Remove non-relevant content from special interest groups - not recommended](../generated_pages/counters/C00072.md) | D02 |
| [C00073 Inoculate populations through media literacy training](../generated_pages/counters/C00073.md) | D02 |
| [C00076 Prohibit images in political discourse channels](../generated_pages/counters/C00076.md) | D02 |
| [C00087 Make more noise than the disinformation](../generated_pages/counters/C00087.md) | D04 |
| [C00092 Establish a truth teller reputation score for influencers](../generated_pages/counters/C00092.md) | D07 |
| [C00093 Influencer code of conduct](../generated_pages/counters/C00093.md) | D07 |
| [C00114 Don't engage with payloads](../generated_pages/counters/C00114.md) | D02 |
| [C00154 Ask media not to report false information](../generated_pages/counters/C00154.md) | D02 |
| [C00160 find and train influencers](../generated_pages/counters/C00160.md) | D02 |
| [C00162 Unravel/target the Potemkin villages](../generated_pages/counters/C00162.md) | D03 |
| [C00169 develop a creative content hub](../generated_pages/counters/C00169.md) | D03 |
| [C00184 Media exposure](../generated_pages/counters/C00184.md) | D04 |
| [C00188 Newsroom/Journalist training to counter influence moves](../generated_pages/counters/C00188.md) | D03 |
| [C00203 Stop offering press credentials to propaganda outlets](../generated_pages/counters/C00203.md) | D03 |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW

View file

@ -1,4 +1,4 @@
# Technique T0040: Demand insurmountable proof
# Technique T0040: Demand Insurmountable Proof
* **Summary**: Campaigns often leverage tactical and informational asymmetries on the threat surface, as seen in the Distort and Deny strategies, and the "firehose of misinformation". Specifically, conspiracy theorists can be repeatedly wrong, but advocates of the truth need to be perfect. By constantly escalating demands for proof, propagandists can effectively leverage this asymmetry while also priming its future use, often with an even greater asymmetric advantage. The conspiracist is offered freer rein for a broader range of "questions" while the truth teller is burdened with higher and higher standards of proof.

View file

@ -1,4 +1,4 @@
# Technique T0042: Seed Kernel of truth
# Technique T0042: Seed Kernel of Truth
* **Summary**: Wrap lies or altered context/facts around truths. Influence campaigns pursue a variety of objectives with respect to target audiences, prominent among them: 1. undermine a narrative commonly referenced in the target audience; or 2. promote a narrative less common in the target audience, but preferred by the attacker. In both cases, the attacker is presented with a heavy lift. They must change the relative importance of various narratives in the interpretation of events, despite contrary tendencies. When messaging makes use of factual reporting to promote these adjustments in the narrative space, they are less likely to be dismissed out of hand; when messaging can juxtapose a (factual) truth about current affairs with the (abstract) truth explicated in these narratives, propagandists can undermine or promote them selectively. Context matters.

View file

@ -1,4 +1,4 @@
# Technique T0043: Chat apps
# Technique T0043: Chat Apps
* **Summary**: Direct messaging via chat app is an increasing method of delivery. These messages are often automated and new delivery and storage methods make them anonymous, viral, and ephemeral. This is a difficult space to monitor, but also a difficult space to build acclaim or notoriety.

View file

@ -1,4 +1,4 @@
# Technique T0044: Seed distortions
# Technique T0044: Seed Distortions
* **Summary**: Try a wide variety of messages in the early hours surrounding an incident or event, to give a misleading account or impression.

View file

@ -1,4 +1,4 @@
# Technique T0045: Use fake experts
# Technique T0045: Use Fake Experts
* **Summary**: Use the fake experts that were set up during Establish Legitimacy. Pseudo-experts are disposable assets that often appear once and then disappear. They give "credibility" to misinformation and take advantage of credential bias.

View file

@ -1,4 +1,4 @@
# Technique T0047: Censor social media as a political force
# Technique T0047: Censor Social Media as a Political Force
* **Summary**: Use political influence or the power of the state to stop critical social media comments, e.g. government-requested/driven content takedowns (see Google Transparency reports).

View file

@ -1,4 +1,4 @@
# Technique T0049.001: Trolls amplify and manipulate
# Technique T0049.001: Trolls Amplify and Manipulate
* **Summary**: Use trolls to amplify narratives and/or manipulate narratives. Fake profiles/sockpuppets operate to support individuals/narratives from the entire political spectrum (left/right binary), with increased emphasis on promoting local content and promoting real Twitter users generating their own, often divisive political content, as it's easier to amplify existing content than to create new/original content. Trolls operate wherever there's a socially divisive issue (issues that can be or are politicized).

View file

@ -1,4 +1,4 @@
# Technique T0049.002: Hijack existing hashtag
# Technique T0049.002: Hijack Existing Hashtag
* **Summary**: Take over an existing hashtag to drive exposure.

View file

@ -1,4 +1,4 @@
# Technique T0059: Play the long game
# Technique T0059: Play the Long Game
* **Summary**: Play the long game refers to two phenomena: 1. To plan messaging and allow it to grow organically without conducting your own amplification. This is methodical and slow and requires years for the message to take hold. 2. To develop a series of seemingly disconnected messaging narratives that eventually combine into a new narrative.

View file

@ -1,4 +1,4 @@
# Technique T0085: Develop Text-based Content
# Technique T0085: Develop Text-Based Content
* **Summary**: Creating and editing false or misleading text-based artifacts, often aligned with one or more specific narratives, for use in a disinformation campaign.

View file

@ -1,4 +1,4 @@
# Technique T0086.003: Deceptively Edit Images (Cheap fakes)
# Technique T0086.003: Deceptively Edit Images (Cheap Fakes)
* **Summary**: Cheap fakes utilize less sophisticated measures of altering an image, video, or audio, for example slowing, speeding, or cutting footage, to create a false context surrounding an image or event.

View file

@ -1,4 +1,4 @@
# Technique T0086: Develop Image-based Content
# Technique T0086: Develop Image-Based Content
* **Summary**: Creating and editing false or misleading visual artifacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include photographing staged real-life situations, repurposing existing digital images, or using image creation and editing technologies.

View file

@ -1,4 +1,4 @@
# Technique T0087.002: Deceptively Edit Video (Cheap fakes)
# Technique T0087.002: Deceptively Edit Video (Cheap Fakes)
* **Summary**: Cheap fakes utilize less sophisticated measures of altering an image, video, or audio, for example slowing, speeding, or cutting footage, to create a false context surrounding an image or event.

View file

@ -1,4 +1,4 @@
# Technique T0087: Develop Video-based Content
# Technique T0087: Develop Video-Based Content
* **Summary**: Creating and editing false or misleading video artifacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include staging videos of purportedly real situations, repurposing existing video artifacts, or using AI-generated video creation and editing technologies (including deepfakes).

View file

@ -1,4 +1,4 @@
# Technique T0088.002: Deceptively Edit Audio (Cheap fakes)
# Technique T0088.002: Deceptively Edit Audio (Cheap Fakes)
* **Summary**: Cheap fakes utilize less sophisticated measures of altering an image, video, or audio, for example slowing, speeding, or cutting footage, to create a false context surrounding an image or event.

View file

@ -1,4 +1,4 @@
# Technique T0088: Develop Audio-based Content
# Technique T0088: Develop Audio-Based Content
* **Summary**: Creating and editing false or misleading audio artifacts, often aligned with one or more specific narratives, for use in a disinformation campaign. This may include creating completely new audio content, repurposing existing audio artifacts (including cheap fakes), or using AI-generated audio creation and editing technologies (including deepfakes).

View file

@ -1,4 +1,4 @@
# Technique T0091: Recruit malign actors
# Technique T0091: Recruit Malign Actors
* **Summary**: Operators recruit bad actors by paying, recruiting, or exerting control over individuals, including trolls, partisans, and contractors.

View file

@ -1,4 +1,4 @@
# Technique T0092.003: Create Community or Sub-group
# Technique T0092.003: Create Community or Sub-Group
* **Summary**: When there is not an existing community or sub-group that meets a campaign's goals, an influence operation may seek to create a community or sub-group.

View file

@ -1,4 +1,4 @@
# Technique T0094.001: Identify susceptible targets in networks
# Technique T0094.001: Identify Susceptible Targets in Networks
* **Summary**: When seeking to infiltrate an existing network, an influence operation may identify individuals and groups that might be susceptible to being co-opted or influenced.

View file

@ -1,4 +1,4 @@
# Technique T0097.001: Backstop personas
# Technique T0097.001: Backstop Personas
* **Summary**: Create other assets/dossiers/cover/fake relationships and/or connections or documents, sites, bylines, and attributions to establish/augment/inflate credibility/believability.

View file

@ -1,4 +1,4 @@
# Technique T0097: Create personas
# Technique T0097: Create Personas
* **Summary**: Creating fake people, often with accounts across multiple platforms. These personas can be as simple as a name, can contain slightly more background like location, profile pictures, backstory, or can be effectively backstopped with indicators like fake identity documents.

View file

@ -1,4 +1,4 @@
# Technique T0099.002: Spoof/parody account/site
# Technique T0099.002: Spoof/Parody Account/Site
* **Summary**: An influence operation may prepare assets impersonating legitimate entities to further conceal its network identity and add a layer of legitimacy to its operation content. Users will more likely believe and less likely fact-check news from recognizable sources rather than unknown sites. Legitimate entities may include authentic news outlets, public figures, organizations, or state entities.

View file

@ -1,4 +1,4 @@
# Technique T0100.003: Co-opt Influencers
# Technique T0100.003: Co-Opt Influencers
* **Summary**: Co-opt Influencers

View file

@ -1,4 +1,4 @@
# Technique T0100: Co-opt Trusted Sources
# Technique T0100: Co-Opt Trusted Sources
* **Summary**: An influence operation may co-opt trusted sources by infiltrating or repurposing a source to reach a target audience through existing, previously reliable networks. Co-opted trusted sources may include:
- National or local news outlets

View file

@ -1,4 +1,4 @@
# Technique T0102.001: Use existing Echo Chambers/Filter Bubbles
# Technique T0102.001: Use Existing Echo Chambers/Filter Bubbles
* **Summary**: Use existing Echo Chambers/Filter Bubbles

View file

@ -1,4 +1,4 @@
# Technique T0104.005: Use hashtags
# Technique T0104.005: Use Hashtags
* **Summary**: Use a dedicated, existing hashtag for the campaign/incident.

View file

@ -1,4 +1,4 @@
# Technique T0104.006: Create dedicated hashtag
# Technique T0104.006: Create Dedicated Hashtag
* **Summary**: Create a campaign/incident specific hashtag.

View file

@ -1,4 +1,4 @@
# Technique T0105.003: Audio sharing
# Technique T0105.003: Audio Sharing
* **Summary**: Examples include podcasting apps, Soundcloud, etc.

View file

@ -1,4 +1,4 @@
# Technique T0114.001: Social media
# Technique T0114.001: Social Media
* **Summary**: Social Media

View file

@ -1,4 +1,4 @@
# Technique T0116.001: Post inauthentic social media comment
# Technique T0116.001: Post Inauthentic Social Media Comment
* **Summary**: Use government-paid social media commenters, astroturfers, and chat bots (programmed to reply to specific keywords/hashtags) to influence online conversations, product reviews, and website comment forums.

View file

@ -1,4 +1,4 @@
# Technique T0119.001: Post Across Groups
# Technique T0119.001: Post across Groups
* **Summary**: An influence operation may post content across groups to spread narratives and content to new communities within the target audiences or to new target audiences.

View file

@ -1,4 +1,4 @@
# Technique T0119.002: Post Across Platform
# Technique T0119.002: Post across Platform
* **Summary**: An influence operation may post content across platforms to spread narratives and content to new communities within the target audiences or to new target audiences. Posting across platforms can also remove opposition and context, helping the narrative spread with less opposition on the cross-posted platform.

View file

@ -1,4 +1,4 @@
# Technique T0119.003: Post Across Disciplines
# Technique T0119.003: Post across Disciplines
* **Summary**: Post Across Disciplines

View file

@ -1,4 +1,4 @@
# Technique T0126.001: Call to action to attend
# Technique T0126.001: Call to Action to Attend
* **Summary**: Call to action to attend an event

View file

@ -1,4 +1,4 @@
# Technique T0126.002: Facilitate logistics or support for attendance
# Technique T0126.002: Facilitate Logistics or Support for Attendance
* **Summary**: Facilitate logistics or support for travel, food, housing, etc.

View file

@ -1,4 +1,4 @@
# Technique T0129.005: Coordinate on encrypted/closed networks
# Technique T0129.005: Coordinate on Encrypted/Closed Networks
* **Summary**: Coordinate on encrypted/closed networks.

View file

@ -1,4 +1,4 @@
# Technique T0129.006: Deny involvement
# Technique T0129.006: Deny Involvement
* **Summary**: Without "smoking gun" proof (and even with proof), incident creator can or will deny involvement. This technique also leverages the attacker advantages outlined in "Demand insurmountable proof", specifically the asymmetric disadvantage for truth-tellers in a "firehose of misinformation" environment.

View file

@ -1,4 +1,4 @@
# Technique T0131.001: Legacy web content
# Technique T0131.001: Legacy Web Content
* **Summary**: Make incident content visible for a long time, e.g. by exploiting platform terms of service, or placing it where it's hard to remove or unlikely to be removed.

View file

@ -1,4 +1,4 @@
# Technique T0133.001: Behavior changes
# Technique T0133.001: Behavior Changes
* **Summary**: Monitor and evaluate behaviour changes from misinformation incidents.

View file

@ -1,4 +1,4 @@
# Technique T0133.005: Action/attitude
# Technique T0133.005: Action/Attitude
* **Summary**: Measure current system state with respect to the effectiveness of influencing action/attitude.

View file

@ -1,4 +1,4 @@
# Technique T0134.001: Message reach
# Technique T0134.001: Message Reach
* **Summary**: Monitor and evaluate message reach in misinformation incidents.

View file

@ -1,4 +1,4 @@
# Technique T0134.002: Social media engagement
# Technique T0134.002: Social Media Engagement
* **Summary**: Monitor and evaluate social media engagement in misinformation incidents.