Mirror of https://github.com/DISARMFoundation/DISARMframeworks.git, synced 2024-10-01 01:45:36 -04:00
Some formatting changes to incidents and removing some techniques
parent d83d55c722
commit 9c2735869d
Binary file not shown.
@@ -21,14 +21,14 @@
| Technique | Description given for this incident |
| --------- | ------------------------- |
| [T0084.002 Plagiarise Content](../../generated_pages/techniques/T0084.002.md) | IT00000335 <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
| [T0084.002 Plagiarise Content](../../generated_pages/techniques/T0084.002.md) | IT00000335 <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
| [T0097.101 Local Persona](../../generated_pages/techniques/T0097.101.md) | IT00000252 <i>“In addition to directly posting material on social media, we observed some personas in the network [of inauthentic accounts attributed to Iran] leverage legitimate print and online media outlets in the U.S. and Israel to promote Iranian interests via the submission of letters, guest columns, and blog posts that were then published. We also identified personas that we suspect were fabricated for the sole purpose of submitting such letters, but that do not appear to maintain accounts on social media. The personas claimed to be based in varying locations depending on the news outlets they were targeting for submission; for example, a persona that listed their location as Seattle, WA in a letter submitted to the Seattle Times subsequently claimed to be located in Baytown, TX in a letter submitted to The Baytown Sun. Other accounts in the network then posted links to some of these letters on social media.”</i><br><br> In this example actors fabricated individuals who lived in areas which were being targeted for influence through the use of letters to local papers (T0097.101: Local Persona, T0143.002: Fabricated Persona). |
| [T0097.102 Journalist Persona](../../generated_pages/techniques/T0097.102.md) | IT00000257 <i>“Accounts in the network [of inauthentic accounts attributed to Iran], under the guise of journalist personas, also solicited various individuals over Twitter for interviews and chats, including real journalists and politicians. The personas appear to have successfully conducted remote video and audio interviews with U.S. and UK-based individuals, including a prominent activist, a radio talk show host, and a former U.S. Government official, and subsequently posted the interviews on social media, showing only the individual being interviewed and not the interviewer. The interviewees expressed views that Iran would likely find favorable, discussing topics such as the February 2019 Warsaw summit, an attack on a military parade in the Iranian city of Ahvaz, and the killing of Jamal Khashoggi.<br><br> “The provenance of these interviews appear to have been misrepresented on at least one occasion, with one persona appearing to have falsely claimed to be operating on behalf of a mainstream news outlet; a remote video interview with a US-based activist about the Jamal Khashoggi killing was posted by an account adopting the persona of a journalist from the outlet Newsday, with the Newsday logo also appearing in the video. We did not identify any Newsday interview with the activist in question on this topic. In another instance, a persona posing as a journalist directed tweets containing audio of an interview conducted with a former U.S. Government official at real media personalities, calling on them to post about the interview.”</i><br><br> In this example actors fabricated journalists (T0097.102: Journalist Persona, T0143.002: Fabricated Persona) who worked at existing news outlets (T0097.202: News Outlet Persona, T0143.003: Impersonated Persona) in order to conduct interviews with targeted individuals. |
| [T0097.110 Party Official Persona](../../generated_pages/techniques/T0097.110.md) | IT00000334 <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
| [T0097.110 Party Official Persona](../../generated_pages/techniques/T0097.110.md) | IT00000334 <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
| [T0097.202 News Outlet Persona](../../generated_pages/techniques/T0097.202.md) | IT00000255 <i>“Accounts in the network [of inauthentic accounts attributed to Iran], under the guise of journalist personas, also solicited various individuals over Twitter for interviews and chats, including real journalists and politicians. The personas appear to have successfully conducted remote video and audio interviews with U.S. and UK-based individuals, including a prominent activist, a radio talk show host, and a former U.S. Government official, and subsequently posted the interviews on social media, showing only the individual being interviewed and not the interviewer. The interviewees expressed views that Iran would likely find favorable, discussing topics such as the February 2019 Warsaw summit, an attack on a military parade in the Iranian city of Ahvaz, and the killing of Jamal Khashoggi.<br><br> “The provenance of these interviews appear to have been misrepresented on at least one occasion, with one persona appearing to have falsely claimed to be operating on behalf of a mainstream news outlet; a remote video interview with a US-based activist about the Jamal Khashoggi killing was posted by an account adopting the persona of a journalist from the outlet Newsday, with the Newsday logo also appearing in the video. We did not identify any Newsday interview with the activist in question on this topic. In another instance, a persona posing as a journalist directed tweets containing audio of an interview conducted with a former U.S. Government official at real media personalities, calling on them to post about the interview.”</i><br><br> In this example actors fabricated journalists (T0097.102: Journalist Persona, T0143.002: Fabricated Persona) who worked at existing news outlets (T0097.202: News Outlet Persona, T0143.003: Impersonated Persona) in order to conduct interviews with targeted individuals. |
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000253 <i>“In addition to directly posting material on social media, we observed some personas in the network [of inauthentic accounts attributed to Iran] leverage legitimate print and online media outlets in the U.S. and Israel to promote Iranian interests via the submission of letters, guest columns, and blog posts that were then published. We also identified personas that we suspect were fabricated for the sole purpose of submitting such letters, but that do not appear to maintain accounts on social media. The personas claimed to be based in varying locations depending on the news outlets they were targeting for submission; for example, a persona that listed their location as Seattle, WA in a letter submitted to the Seattle Times subsequently claimed to be located in Baytown, TX in a letter submitted to The Baytown Sun. Other accounts in the network then posted links to some of these letters on social media.”</i><br><br> In this example actors fabricated individuals who lived in areas which were being targeted for influence through the use of letters to local papers (T0097.101: Local Persona, T0143.002: Fabricated Persona). |
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | IT00000333 <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | IT00000332 <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | IT00000333 <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
| [T0145.001 Copy Account Imagery](../../generated_pages/techniques/T0145.001.md) | IT00000332 <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -21,10 +21,10 @@
| Technique | Description given for this incident |
| --------- | ------------------------- |
| [T0097.102 Journalist Persona](../../generated_pages/techniques/T0097.102.md) | IT00000281 <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br> “Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler noun|linking verb|noun/verb/adjective|,” which appears to reveal the formula used to write Twitter bios for the accounts.”</i><br><br> The Jenny Powel account used in this influence operation presents as both a journalist and an activist (T0097.102: Journalist Persona, T0097.103: Activist Persona, T0143.002: Fabricated Persona). This example shows how threat actors can easily follow a template to present a fabricated persona to their target audience (T0144.002: Persona Template). |
| [T0097.103 Activist Persona](../../generated_pages/techniques/T0097.103.md) | IT00000282 <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br> “Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler noun|linking verb|noun/verb/adjective|,” which appears to reveal the formula used to write Twitter bios for the accounts.”</i><br><br> The Jenny Powel account used in this influence operation presents as both a journalist and an activist (T0097.102: Journalist Persona, T0097.103: Activist Persona, T0143.002: Fabricated Persona). This example shows how threat actors can easily follow a template to present a fabricated persona to their target audience (T0144.002: Persona Template). |
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000283 <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br> “Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler noun|linking verb|noun/verb/adjective|,” which appears to reveal the formula used to write Twitter bios for the accounts.”</i><br><br> The Jenny Powel account used in this influence operation presents as both a journalist and an activist (T0097.102: Journalist Persona, T0097.103: Activist Persona, T0143.002: Fabricated Persona). This example shows how threat actors can easily follow a template to present a fabricated persona to their target audience (T0144.002: Persona Template). |
| [T0144.002 Persona Template](../../generated_pages/techniques/T0144.002.md) | IT00000284 <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br> “Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler noun|linking verb|noun/verb/adjective|,” which appears to reveal the formula used to write Twitter bios for the accounts.”</i><br><br> The Jenny Powel account used in this influence operation presents as both a journalist and an activist (T0097.102: Journalist Persona, T0097.103: Activist Persona, T0143.002: Fabricated Persona). This example shows how threat actors can easily follow a template to present a fabricated persona to their target audience (T0144.002: Persona Template). |
| [T0097.102 Journalist Persona](../../generated_pages/techniques/T0097.102.md) | IT00000281 <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br> “Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler (noun) (linking verb) (noun/verb/adjective),” which appears to reveal the formula used to write Twitter bios for the accounts.”</i><br><br> The Jenny Powel account used in this influence operation presents as both a journalist and an activist (T0097.102: Journalist Persona, T0097.103: Activist Persona, T0143.002: Fabricated Persona). This example shows how threat actors can easily follow a template to present a fabricated persona to their target audience (T0144.002: Persona Template). |
| [T0097.103 Activist Persona](../../generated_pages/techniques/T0097.103.md) | IT00000282 <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br> “Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler (noun) (linking verb) (noun/verb/adjective),” which appears to reveal the formula used to write Twitter bios for the accounts.”</i><br><br> The Jenny Powel account used in this influence operation presents as both a journalist and an activist (T0097.102: Journalist Persona, T0097.103: Activist Persona, T0143.002: Fabricated Persona). This example shows how threat actors can easily follow a template to present a fabricated persona to their target audience (T0144.002: Persona Template). |
| [T0143.002 Fabricated Persona](../../generated_pages/techniques/T0143.002.md) | IT00000283 <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br> “Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler (noun) (linking verb) (noun/verb/adjective),” which appears to reveal the formula used to write Twitter bios for the accounts.”</i><br><br> The Jenny Powel account used in this influence operation presents as both a journalist and an activist (T0097.102: Journalist Persona, T0097.103: Activist Persona, T0143.002: Fabricated Persona). This example shows how threat actors can easily follow a template to present a fabricated persona to their target audience (T0144.002: Persona Template). |
| [T0144.002 Persona Template](../../generated_pages/techniques/T0144.002.md) | IT00000284 <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br> “Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler (noun) (linking verb) (noun/verb/adjective),” which appears to reveal the formula used to write Twitter bios for the accounts.”</i><br><br> The Jenny Powel account used in this influence operation presents as both a journalist and an activist (T0097.102: Journalist Persona, T0097.103: Activist Persona, T0143.002: Fabricated Persona). This example shows how threat actors can easily follow a template to present a fabricated persona to their target audience (T0144.002: Persona Template). |
| [T0145.007 Stock Image Account Imagery](../../generated_pages/techniques/T0145.007.md) | IT00000280 <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br>“Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler noun|linking verb|noun/verb/adjective|,” which appears to reveal the formula used to write Twitter bios for the accounts.”</I><br><br>This behaviour matches T0145.007: Stock Image Account Imagery because the account was identified as using a stock image as its profile picture. |
@@ -1,17 +0,0 @@
# Technique T0009.001: Utilise Academic/Pseudoscientific Justifications

* **Summary**: Utilise Academic/Pseudoscientific Justifications

* **Belongs to tactic stage**: TA16


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |


DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -7,7 +7,7 @@
| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00076 Network of Social Media Accounts Impersonates U.S. Political Candidates, Leverages U.S. and Israeli Media in Support of Iranian Interests](../../generated_pages/incidents/I00076.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
| [I00076 Network of Social Media Accounts Impersonates U.S. Political Candidates, Leverages U.S. and Israeli Media in Support of Iranian Interests](../../generated_pages/incidents/I00076.md) | <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
| [I00079 Three thousand fake tanks](../../generated_pages/incidents/I00079.md) | <i>“The sixth [website to repost a confirmed false narrative investigated in this report] is an apparent think tank, the Center for Global Strategic Monitoring. This website describes itself, in English apparently written by a non-native speaker, as a “nonprofit and nonpartisan research and analysis institution dedicated to providing insights of the think tank community publications”. It does, indeed, publish think-tank reports on issues such as Turkey and US-China relations; however, the reports are the work of other think tanks, often unattributed (the two mentioned in this sentence were actually produced by the Brookings Institution, although the website makes no mention of the fact). It also fails to provide an address, or any other contact details other than an email, and its (long) list of experts includes entries apparently copied and pasted from other institutions. Thus, the “think tank” website which shared the fake story appears to be a fake itself.”</i> In this example a website which amplified a false narrative presented itself as a think tank (T0097.204: Think Tank Persona).<br><br> This is an entirely fabricated persona (T0143.002: Fabricated Persona); it republished content from other think tanks without attribution (T0084.002: Plagiarise Content) and fabricated experts (T0097.108: Expert Persona, T0143.002: Fabricated Persona) to make it more believable that they were a real think tank. |
@@ -1,23 +0,0 @@
# Technique T0097.001: Produce Evidence for Persona

* **Summary**: People may produce evidence which supports the persona they are deploying (T0097) (aka “backstopping” the persona).

This Technique covers situations where evidence is developed or produced as part of an influence operation to increase the perceived legitimacy of a persona used during IO, including creating accounts for the same persona on multiple platforms.

The use of personas (T0097), and providing evidence to improve people’s perception of one’s persona (T0097.001), are not necessarily malicious or inauthentic. However, sometimes people use personas to increase the perceived legitimacy of narratives for malicious purposes.

This Technique was previously called Backstop Personas.

* **Belongs to tactic stage**: TA16


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |


DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@@ -8,7 +8,7 @@
| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00076 Network of Social Media Accounts Impersonates U.S. Political Candidates, Leverages U.S. and Israeli Media in Support of Iranian Interests](../../generated_pages/incidents/I00076.md) | <i>“Accounts in the network [of inauthentic accounts attributed to Iran], under the guise of journalist personas, also solicited various individuals over Twitter for interviews and chats, including real journalists and politicians. The personas appear to have successfully conducted remote video and audio interviews with U.S. and UK-based individuals, including a prominent activist, a radio talk show host, and a former U.S. Government official, and subsequently posted the interviews on social media, showing only the individual being interviewed and not the interviewer. The interviewees expressed views that Iran would likely find favorable, discussing topics such as the February 2019 Warsaw summit, an attack on a military parade in the Iranian city of Ahvaz, and the killing of Jamal Khashoggi.<br><br> “The provenance of these interviews appear to have been misrepresented on at least one occasion, with one persona appearing to have falsely claimed to be operating on behalf of a mainstream news outlet; a remote video interview with a US-based activist about the Jamal Khashoggi killing was posted by an account adopting the persona of a journalist from the outlet Newsday, with the Newsday logo also appearing in the video. We did not identify any Newsday interview with the activist in question on this topic. In another instance, a persona posing as a journalist directed tweets containing audio of an interview conducted with a former U.S. Government official at real media personalities, calling on them to post about the interview.”</i><br><br> In this example actors fabricated journalists (T0097.102: Journalist Persona, T0143.002: Fabricated Persona) who worked at existing news outlets (T0097.202: News Outlet Persona, T0143.003: Impersonated Persona) in order to conduct interviews with targeted individuals. |
| [I00080 Hundreds Of Propaganda Accounts Targeting Iran And Qatar Have Been Removed From Facebook](../../generated_pages/incidents/I00080.md) | <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br> “Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler noun|linking verb|noun/verb/adjective|,” which appears to reveal the formula used to write Twitter bios for the accounts.”</i><br><br> The Jenny Powel account used in this influence operation presents as both a journalist and an activist (T0097.102: Journalist Persona, T0097.103: Activist Persona, T0143.002: Fabricated Persona). This example shows how threat actors can easily follow a template to present a fabricated persona to their target audience (T0144.002: Persona Template). |
| [I00080 Hundreds Of Propaganda Accounts Targeting Iran And Qatar Have Been Removed From Facebook](../../generated_pages/incidents/I00080.md) | <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br> “Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler (noun) (linking verb) (noun/verb/adjective),” which appears to reveal the formula used to write Twitter bios for the accounts.”</i><br><br> The Jenny Powel account used in this influence operation presents as both a journalist and an activist (T0097.102: Journalist Persona, T0097.103: Activist Persona, T0143.002: Fabricated Persona). This example shows how threat actors can easily follow a template to present a fabricated persona to their target audience (T0144.002: Persona Template). |
| [I00082 Meta’s November 2021 Adversarial Threat Report ](../../generated_pages/incidents/I00082.md) | <I>“[Meta] removed 41 Facebook accounts, five Groups, and four Instagram accounts for violating our policy against coordinated inauthentic behavior. This activity originated in Belarus and primarily targeted audiences in the Middle East and Europe.<br><br> “The core of this activity began in October 2021, with some accounts created as recently as mid-November. The people behind it used newly-created fake accounts — many of which were detected and disabled by our automated systems soon after creation — to pose as journalists and activists from the European Union, particularly Poland and Lithuania. Some of the accounts used profile photos likely generated using artificial intelligence techniques like generative adversarial networks (GAN). These fictitious personas posted criticism of Poland in English, Polish, and Kurdish, including pictures and videos about Polish border guards allegedly violating migrants’ rights, and compared Poland’s treatment of migrants against other countries’. They also posted to Groups focused on the welfare of migrants in Europe. A few accounts posted in Russian about relations between Belarus and the Baltic States.”</i><br><br> This example shows how accounts identified as participating in coordinated inauthentic behaviour were presenting themselves as journalists and activists while spreading operation narratives (T0097.102: Journalist Persona, T0097.103: Activist Persona).<br><br> Additionally, analysts at Meta identified accounts which were participating in coordinated inauthentic behaviour that had likely used AI-Generated images as their profile pictures (T0145.002: AI-Generated Account Imagery). |
@@ -9,7 +9,7 @@
| -------- | -------------------- |
| [I00069 Uncharmed: Untangling Iran's APT42 Operations](../../generated_pages/incidents/I00069.md) | <I>“In March 2023, [Iranian state-sponsored cyber espionage actor] APT42 sent a spear-phishing email with a fake Google Meet invitation, allegedly sent on behalf of Mona Louri, a likely fake persona leveraged by APT42, claiming to be a human rights activist and researcher. Upon entry, the user was presented with a fake Google Meet page and asked to enter their credentials, which were subsequently sent to the attackers.”</i><br><br>In this example APT42, an Iranian state-sponsored cyber espionage actor, created an account which presented as a human rights activist (T0097.103: Activist Persona) and researcher (T0097.107: Researcher Persona). The analysts assert that it was likely the persona was fabricated (T0143.002: Fabricated Persona) |
| [I00077 Fronts & Friends: An Investigation into Two Twitter Networks Linked to Russian Actors](../../generated_pages/incidents/I00077.md) | <i>“The Syria portion of the network [of inauthentic accounts attributed to Russia] included additional sockpuppet accounts. One of these claimed to be a gay rights defender in Syria. Several said they were Syrian journalists. Another account, @SophiaHammer3, said she was born in Syria but currently lives in London. “I’m fond of history and politics. I struggle for justice.” Twitter users had previously observed that Sophia was likely a sockpuppet.”</i><br><br> This behaviour matches T0097.103: Activist Persona because the account presents itself as defending a political cause - in this case gay rights.<br><br> Twitter’s technical indicators allowed their analysts to assert that these accounts were “reliably tied to Russian state actors”, meaning the presented personas were entirely fabricated (T0143.002: Fabricated Persona); these accounts are not legitimate gay rights defenders or journalists, they’re assets controlled by Russia publishing narratives beneficial to their agenda. |
| [I00080 Hundreds Of Propaganda Accounts Targeting Iran And Qatar Have Been Removed From Facebook](../../generated_pages/incidents/I00080.md) | <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br> “Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler noun|linking verb|noun/verb/adjective|,” which appears to reveal the formula used to write Twitter bios for the accounts.”</i><br><br> The Jenny Powel account used in this influence operation presents as both a journalist and an activist (T0097.102: Journalist Persona, T0097.103: Activist Persona, T0143.002: Fabricated Persona). This example shows how threat actors can easily follow a template to present a fabricated persona to their target audience (T0144.002: Persona Template). |
| [I00080 Hundreds Of Propaganda Accounts Targeting Iran And Qatar Have Been Removed From Facebook](../../generated_pages/incidents/I00080.md) | <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br> “Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler (noun) (linking verb) (noun/verb/adjective),” which appears to reveal the formula used to write Twitter bios for the accounts.”</i><br><br> The Jenny Powell account used in this influence operation presents as both a journalist and an activist (T0097.102: Journalist Persona, T0097.103: Activist Persona, T0143.002: Fabricated Persona). This example shows how threat actors can easily follow a template to present a fabricated persona to their target audience (T0144.002: Persona Template). |
| [I00082 Meta’s November 2021 Adversarial Threat Report ](../../generated_pages/incidents/I00082.md) | <i>“[Meta] removed a network of accounts in Vietnam for violating our Inauthentic Behavior policy against mass reporting. They coordinated the targeting of activists and other people who publicly criticized the Vietnamese government and used false reports of various violations in an attempt to have these users removed from our platform. The people behind this activity relied primarily on authentic and duplicate accounts to submit hundreds — in some cases, thousands — of complaints against their targets through our abuse reporting flows.<br><br>“Many operators also maintained fake accounts — some of which were detected and disabled by our automated systems — to pose as their targets so they could then report the legitimate accounts as fake. They would frequently change the gender and name of their fake accounts to resemble the target individual. Among the most common claims in this misleading reporting activity were complaints of impersonation, and to a much lesser extent inauthenticity. The network also advertised abusive services in their bios and constantly evolved their tactics in an attempt to evade detection.“</i><br><br>In this example actors repurposed their accounts to impersonate targeted activists (T0097.103: Activist Persona, T0143.003: Impersonated Persona) in order to falsely report the activists’ legitimate accounts as impersonations (T0124.001: Report Non-Violative Opposing Content). |
@ -8,7 +8,7 @@
| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00075 How Russia Meddles Abroad for Profit: Cash, Trolls and a Cult Leader](../../generated_pages/incidents/I00075.md) | <I>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, “If you accept this deal you will have money” according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example actors created a fake Facebook page styled to look like the candidate’s official page, to trick targets into thinking the candidate had announced he was dropping out of the election and supporting Mr. Rajoelina (T0097.110: Party Official Persona, T0143.003: Impersonated Persona). |
| [I00076 Network of Social Media Accounts Impersonates U.S. Political Candidates, Leverages U.S. and Israeli Media in Support of Iranian Interests](../../generated_pages/incidents/I00076.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
| [I00076 Network of Social Media Accounts Impersonates U.S. Political Candidates, Leverages U.S. and Israeli Media in Support of Iranian Interests](../../generated_pages/incidents/I00076.md) | <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
@ -1,17 +0,0 @@
# Technique T0099.002: Spoof/Parody Account/Site

* **Summary**: An influence operation may prepare assets impersonating legitimate entities to further conceal its network identity and add a layer of legitimacy to its operation content. Users will more likely believe and less likely fact-check news from recognisable sources rather than unknown sites. Legitimate entities may include authentic news outlets, public figures, organisations, or state entities.

* **Belongs to tactic stage**: TA16


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |


DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@ -1,19 +0,0 @@
# Technique T0099.003: Impersonate Existing Organisation

* **Summary**: A situation where a threat actor styles their online assets or content to mimic an existing organisation.

This can be done to take advantage of peoples’ trust in the organisation to increase narrative believability, to smear the organisation, or to make the organisation less trustworthy.

* **Belongs to tactic stage**: TA16


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |


DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@ -1,19 +0,0 @@
# Technique T0099.004: Impersonate Existing Media Outlet

* **Summary**: A situation where a threat actor styles their online assets or content to mimic an existing media outlet.

This can be done to take advantage of peoples’ trust in the outlet to increase narrative believability, to smear the outlet, or to make the outlet less trustworthy.

* **Belongs to tactic stage**: TA16


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |


DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@ -1,17 +0,0 @@
# Technique T0099.005: Impersonate Existing Official

* **Summary**: A situation where a threat actor styles their online assets or content to impersonate an official (including government officials, organisation officials, etc).

* **Belongs to tactic stage**: TA16


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |


DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@ -1,17 +0,0 @@
# Technique T0099.006: Impersonate Existing Influencer

* **Summary**: A situation where a threat actor styles their online assets or content to impersonate an influencer or celebrity, typically to exploit users’ existing faith in the impersonated target.

* **Belongs to tactic stage**: TA16


| Incident | Descriptions given for this incident |
| -------- | -------------------- |


| Counters | Response types |
| -------- | -------------- |


DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
@ -13,7 +13,7 @@
| [I00077 Fronts & Friends: An Investigation into Two Twitter Networks Linked to Russian Actors](../../generated_pages/incidents/I00077.md) | <i>“Two accounts [in the second network of accounts taken down by Twitter] appear to have been operated by Oriental Review and the Strategic Culture Foundation, respectively. Oriental Review bills itself as an “open source site for free thinking”, though it trades in outlandish conspiracy theories and posts content bylined by fake people. Stanford Internet Observatory researchers and investigative journalists have previously noted the presence of content bylined by fake “reporter” personas tied to the GRU-linked front Inside Syria Media Center, posted on Oriental Review.”</i><br><br> In an effort to make Oriental Review’s stories appear more credible, the threat actors created fake journalists and pretended they wrote the articles on their website (aka “bylined” them).<br><br> In DISARM terms, they fabricated journalists (T0143.002: Fabricated Persona, T0097.102: Journalist Persona), and presented Oriental Review as a news outlet publishing these journalists’ work to increase perceived legitimacy (T0097.202: News Outlet Persona, T0143.002: Fabricated Persona). |
| [I00078 Meta’s September 2020 Removal of Coordinated Inauthentic Behavior](../../generated_pages/incidents/I00078.md) | <i>“[Meta has] removed one Page, five Facebook accounts, one Group and three Instagram accounts for foreign or government interference which is coordinated inauthentic behavior on behalf of a foreign or government entity. This small network originated in Russia and focused primarily on Turkey and Europe, and also on the United States.<br><br> “This operation relied on fake accounts — some of which had been already detected and removed by our automated systems — to manage their Page and their Group, and to drive people to their site purporting to be an independent think-tank based primarily in Turkey. These accounts posed as locals based in Turkey, Canada and the US. They also recruited people to write for their website. This network had almost no following on our platforms when we removed it.”</i><br><br> Meta identified that a network of accounts originating in Russia was driving people off platform to a site which presented itself as a think-tank (T0097.204: Think Tank Persona). Meta did not make an attribution about the authenticity of this off-site think tank, so neither T0143.001: Authentic Persona nor T0143.002: Fabricated Persona is used here.<br><br> Meta had access to technical data for accounts on its platform, and asserted that they were fabricated individuals posing as locals who recruited targets to write content for their website (T0097.101: Local Persona, T0097.106: Recruiter Persona, T0143.002: Fabricated Persona). |
| [I00079 Three thousand fake tanks](../../generated_pages/incidents/I00079.md) | <i>“The sixth [website to repost a confirmed false narrative investigated in this report] is an apparent think tank, the Center for Global Strategic Monitoring. This website describes itself, in English apparently written by a non-native speaker, as a “nonprofit and nonpartisan research and analysis institution dedicated to providing insights of the think tank community publications”. It does, indeed, publish think-tank reports on issues such as Turkey and US-China relations; however, the reports are the work of other think tanks, often unattributed (the two mentioned in this sentence were actually produced by the Brookings Institution, although the website makes no mention of the fact). It also fails to provide an address, or any other contact details other than an email, and its (long) list of experts includes entries apparently copied and pasted from other institutions. Thus, the “think tank” website which shared the fake story appears to be a fake itself.”</i><br><br> In this example a website which amplified a false narrative presented itself as a think tank (T0097.204: Think Tank Persona).<br><br> This is an entirely fabricated persona (T0143.002: Fabricated Persona); it republished content from other think tanks without attribution (T0084.002: Plagiarise Content) and fabricated experts (T0097.108: Expert Persona, T0143.002: Fabricated Persona) to make it more believable that they were a real think tank. |
| [I00080 Hundreds Of Propaganda Accounts Targeting Iran And Qatar Have Been Removed From Facebook](../../generated_pages/incidents/I00080.md) | <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br> “Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler noun|linking verb|noun/verb/adjective|,” which appears to reveal the formula used to write Twitter bios for the accounts.”</i><br><br> The Jenny Powell account used in this influence operation presents as both a journalist and an activist (T0097.102: Journalist Persona, T0097.103: Activist Persona, T0143.002: Fabricated Persona). This example shows how threat actors can easily follow a template to present a fabricated persona to their target audience (T0144.002: Persona Template). |
| [I00080 Hundreds Of Propaganda Accounts Targeting Iran And Qatar Have Been Removed From Facebook](../../generated_pages/incidents/I00080.md) | <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br> “Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler (noun) (linking verb) (noun/verb/adjective),” which appears to reveal the formula used to write Twitter bios for the accounts.”</i><br><br> The Jenny Powell account used in this influence operation presents as both a journalist and an activist (T0097.102: Journalist Persona, T0097.103: Activist Persona, T0143.002: Fabricated Persona). This example shows how threat actors can easily follow a template to present a fabricated persona to their target audience (T0144.002: Persona Template). |
| [I00081 Belarus KGB created fake accounts to criticize Poland during border crisis, Facebook parent company says](../../generated_pages/incidents/I00081.md) | <i>“Meta said it also removed 31 Facebook accounts, four groups, two events and four Instagram accounts that it believes originated in Poland and targeted Belarus and Iraq. Those allegedly fake accounts posed as Middle Eastern migrants posting about the border crisis. Meta did not link the accounts to a specific group.<br><br> ““These fake personas claimed to be sharing their own negative experiences of trying to get from Belarus to Poland and posted about migrants’ difficult lives in Europe,” Meta said. “They also posted about Poland’s strict anti-migrant policies and anti-migrant neo-Nazi activity in Poland. They also shared links to news articles criticizing the Belarusian government’s handling of the border crisis and off-platform videos alleging migrant abuse in Europe.””</i><br><br> In this example accounts falsely presented themselves as having local insight into the border crisis narrative (T0097.101: Local Persona, T0143.002: Fabricated Persona). |
| [I00086 #WeAreNotSafe – Exposing How a Post-October 7th Disinformation Network Operates on Israeli Social Media](../../generated_pages/incidents/I00086.md) | In this report accounts were identified as part of “a sophisticated and extensive coordinated network orchestrating a disinformation campaign targeting Israeli digital spaces since October 7th, 2023”.<br><br> <i>“A core component of the detection methodology was applying qualitative linguistic analysis. This involved checking the fingerprint of language, syntax, and style used in the comments and profile of the suspected account. Each account bio consistently incorporated a combination of specific elements: emojis, nationality, location, educational institution or occupation, age, and a personal quote, sports team or band. The recurrence of this specific formula across multiple accounts hinted at a standardized template for bio construction.”</i><br><br> This example shows how actors can follow a templated formula to present a persona on social media platforms (T0143.002: Fabricated Persona, T0144.002: Persona Template). A minimal sketch of how such a templated bio structure might be flagged programmatically follows this table. |
| [I00089 Hackers Use Fake Facebook Profiles of Attractive Women to Spread Viruses, Steal Passwords](../../generated_pages/incidents/I00089.md) | <I>“On Facebook, Rita, Alona and Christina appeared to be just like the millions of other U.S citizens sharing their lives with the world. They discussed family outings, shared emojis and commented on each other's photographs.<br><br> “In reality, the three accounts were part of a highly-targeted cybercrime operation, used to spread malware that was able to steal passwords and spy on victims.<br><br> “Hackers with links to Lebanon likely ran the covert scheme using a strain of malware dubbed "Tempting Cedar Spyware," according to researchers from Prague-based anti-virus company Avast, which detailed its findings in a report released on Wednesday.<br><br> “In a honey trap tactic as old as time, the culprits' targets were mostly male, and lured by fake attractive women.<br><br> “In the attack, hackers would send flirtatious messages using Facebook to the chosen victims, encouraging them to download a second, booby-trapped, chat application known as Kik Messenger to have "more secure" conversations. Upon analysis, Avast experts found that "many fell for the trap.””</i><br><br> In this example threat actors took on the persona of a romantic suitor on Facebook, directing their targets to another platform (T0097.109: Romantic Suitor Persona, T0145.006: Attractive Person Account Imagery, T0143.002: Fabricated Persona). |
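The I00086 row above describes a templated formula for account bios: each bio combined the same elements (emojis, nationality, location, occupation, age, a quote). The following is a minimal sketch of one way such structural repetition might be flagged programmatically; the token classes, the emoji ranges, and the group-size threshold are illustrative assumptions, not the report's actual methodology.

```python
# Sketch: reduce each bio to a coarse structural fingerprint (a sequence of
# token classes) and flag groups of accounts whose bios share one structure.
import re
from collections import defaultdict

EMOJI = re.compile("[\U0001F300-\U0001FAFF\u2600-\u27BF]")

def token_class(token):
    if EMOJI.search(token):
        return "EMOJI"
    if token.isdigit():
        return "NUM"
    if token.startswith(("\u201c", '"')):
        return "QUOTE"
    if token[:1].isupper():
        return "PROPER"
    return "word"

def fingerprint(bio):
    return "-".join(token_class(t) for t in bio.split())

def templated_groups(bios, min_size=3):
    """bios: mapping of account name -> bio text."""
    groups = defaultdict(list)
    for account, bio in bios.items():
        groups[fingerprint(bio)].append(account)
    # Many distinct accounts collapsing to one structure suggests a template.
    return {fp: accounts for fp, accounts in groups.items()
            if len(accounts) >= min_size}
```

As with any structural heuristic, matches are candidates for the kind of qualitative review the report describes, not proof of coordination.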
@ -13,7 +13,7 @@
| [I00070 Eli Lilly Clarifies It’s Not Offering Free Insulin After Tweet From Fake Verified Account—As Chaos Unfolds On Twitter](../../generated_pages/incidents/I00070.md) | <i>“Twitter Blue launched [November 2022], giving any users who pay $8 a month the ability to be verified on the site, a feature previously only available to public figures, government officials and journalists as a way to show they are who they claim to be.<br><br> “[A day after the launch], an account with the handle @EliLillyandCo labeled itself with the name “Eli Lilly and Company,” and by using the same logo as the company in its profile picture and with the verification checkmark, was indistinguishable from the real company (the picture has since been removed and the account has labeled itself as a parody profile).<br><br> The parody account tweeted “we are excited to announce insulin is free now.””</i><br><br> In this example an account impersonated the pharmaceutical company Eli Lilly (T0097.205: Business Persona, T0143.003: Impersonated Persona) by copying its name, profile picture (T0145.001: Copy Account Imagery), and paying for verification. |
| [I00071 Russia-aligned hacktivists stir up anti-Ukrainian sentiments in Poland](../../generated_pages/incidents/I00071.md) | <i>“The August 17 [2022] Telegram post [which contained a falsified letter from the Ukrainian Minister of Foreign Affairs asking Poland to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII] also contained screenshots of Facebook posts that appeared on two Facebook accounts belonging to Polish nationals Piotr Górka, an expert in the history of the Polish Air Force, and Dariusz Walusiak, a Polish historian and documentary maker. The Górka post suggested that he fully supported the Polish government’s decision to change Belwederska Street to Stepan Bandera Street.<br><br> “In a statement to the DFRLab, Górka said his account was accessed without his consent. “This is not my post loaded to my Facebook page,” he explained. “My site was hacked, some days ago.” At the time of publishing, Piotr Górka’s post and his Facebook account were no longer accessible.<br><br> “The post on Górka’s Facebook page was shared by Dariusz Walusiak’s Facebook account; the account also reposted it on the Facebook walls of more than twenty other Facebook users, including Adam Kalita, currently working at Krakow branch of the Institute of National Remembrance; Jan Kasprzyk, head of the Office for War Veterans and Victims of Oppression; and Alicja Kondraciuk, a Polish public figure living in Krakow.<br><br> “Walusiak’s Facebook account is also no longer accessible. Given his work on Polish history and identity, it seems highly unlikely he would support the Bandera measure; the DFRLab has also reached out to him for comment.<br><br> “The fact that Joker DPR’s Telegram post included screenshots of their Facebook posts raises the strong possibility that both Facebook accounts were compromised, and that hackers planted false statements on their pages that would seem out of character for them in order to gain further attention to the forged documents.”</I><br><br> In this example, threat actors used compromised accounts (T0141.001: Acquire Compromised Account) of Polish historians who have enough relevant knowledge to plausibly weigh in on the forged letter’s narrative (T0143.003: Impersonated Persona, T0097.101: Local Persona, T0097.108: Expert Persona). <br><br> This matches T0097.108: Expert Persona because the impersonation exploited Górka and Walusiak’s existing personas as experts in Polish history. |
| [I00075 How Russia Meddles Abroad for Profit: Cash, Trolls and a Cult Leader](../../generated_pages/incidents/I00075.md) | <I>“In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused.<br><br> “The Russians made the same proposal to the history professor running for president, saying, “If you accept this deal you will have money” according to Ms. Rasamimanana, the professor’s campaign manager.<br><br> When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.”</i><br><br> In this example actors created a fake Facebook page styled to look like the candidate’s official page, to trick targets into thinking the candidate had announced he was dropping out of the election and supporting Mr. Rajoelina (T0097.110: Party Official Persona, T0143.003: Impersonated Persona). |
| [I00076 Network of Social Media Accounts Impersonates U.S. Political Candidates, Leverages U.S. and Israeli Media in Support of Iranian Interests](../../generated_pages/incidents/I00076.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
| [I00076 Network of Social Media Accounts Impersonates U.S. Political Candidates, Leverages U.S. and Israeli Media in Support of Iranian Interests](../../generated_pages/incidents/I00076.md) | <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
| [I00082 Meta’s November 2021 Adversarial Threat Report ](../../generated_pages/incidents/I00082.md) | <i>“[Meta] removed a network of accounts in Vietnam for violating our Inauthentic Behavior policy against mass reporting. They coordinated the targeting of activists and other people who publicly criticized the Vietnamese government and used false reports of various violations in an attempt to have these users removed from our platform. The people behind this activity relied primarily on authentic and duplicate accounts to submit hundreds — in some cases, thousands — of complaints against their targets through our abuse reporting flows.<br><br>“Many operators also maintained fake accounts — some of which were detected and disabled by our automated systems — to pose as their targets so they could then report the legitimate accounts as fake. They would frequently change the gender and name of their fake accounts to resemble the target individual. Among the most common claims in this misleading reporting activity were complaints of impersonation, and to a much lesser extent inauthenticity. The network also advertised abusive services in their bios and constantly evolved their tactics in an attempt to evade detection.“</i><br><br>In this example actors repurposed their accounts to impersonate targeted activists (T0097.103: Activist Persona, T0143.003: Impersonated Persona) in order to falsely report the activists’ legitimate accounts as impersonations (T0124.001: Report Non-Violative Opposing Content). |
| [I00087 Challenging Truth and Trust: A Global Inventory of Organized Social Media Manipulation](../../generated_pages/incidents/I00087.md) | <i>“Another actor operating in China is the American-based company Devumi. Most of the Twitter accounts managed by Devumi resemble real people, and some are even associated with a kind of large-scale social identity theft. At least 55,000 of the accounts use the names, profile pictures, hometowns and other personal details of real Twitter users, including minors, according to The New York Times (Confessore et al., 2018).”</i><br><br>In this example accounts impersonated real locals while spreading operation narratives (T0143.003: Impersonated Persona, T0097.101: Local Persona). The impersonation included stealing the legitimate accounts’ profile pictures (T0145.001: Copy Account Imagery). |
| [I00094 A glimpse inside a Chinese influence campaign: How bogus news websites blur the line between true and false](../../generated_pages/incidents/I00094.md) | Researchers identified websites managed by a Chinese marketing firm which presented themselves as news organisations.<br><br> <i>“On its official website, the Chinese marketing firm boasted that they were in contact with news organizations across the globe, including one in South Korea called the “Chungcheng Times.” According to the joint team, this outlet is a fictional news organization created by the offending company. The Chinese company sought to disguise the site’s true identity and purpose by altering the name attached to it by one character—making it very closely resemble the name of a legitimate outlet operating out of Chungchengbuk-do.<br><br> “The marketing firm also established a news organization under the Korean name “Gyeonggido Daily,” which closely resembles legitimate news outlets operating out of Gyeonggi province such as “Gyeonggi Daily,” “Daily Gyeonggi Newspaper,” and “Gyeonggi N Daily.” One of the fake news sites was named “Incheon Focus,” a title that could be easily mistaken for the legitimate local news outlet, “Focus Incheon.” Furthermore, the Chinese marketing company operated two fake news sites with names identical to two separate local news organizations, one of which ceased operations in December 2022.<br><br> “In total, fifteen out of eighteen Chinese fake news sites incorporated the correct names of real regions in their fake company names. “If the operators had created fake news sites similar to major news organizations based in Seoul, however, the intended deception would have easily been uncovered,” explained Song Tae-eun, an assistant professor in the Department of National Security & Unification Studies at the Korea National Diplomatic Academy, to The Readable. “There is also the possibility that they are using the regional areas as an attempt to form ties with the local community; that being the government, the private sector, and religious communities.””</i><br><br> The firm styled their news sites to resemble existing local news outlets in their target region (T0097.201: Local Institution Persona, T0097.202: News Outlet Persona, T0143.003: Impersonated Persona). |
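The single-character name alterations described in this row can be surfaced with a simple string-similarity pass over candidate outlet names. The sketch below uses only Python's standard library; the outlet lists and the 0.85 threshold are illustrative assumptions, not values from the joint team's investigation.

```python
# Sketch: flag outlet names that are suspiciously close to, but not exactly,
# the name of a known legitimate outlet.
from difflib import SequenceMatcher

LEGITIMATE = ["Gyeonggi Daily", "Daily Gyeonggi Newspaper", "Focus Incheon"]

def suspicious_lookalikes(candidates, known=LEGITIMATE, threshold=0.85):
    flagged = []
    for name in candidates:
        for real in known:
            ratio = SequenceMatcher(None, name.lower(), real.lower()).ratio()
            if threshold <= ratio < 1.0:  # near-identical but not the same
                flagged.append((name, real, round(ratio, 2)))
    return flagged

# suspicious_lookalikes(["Gyeonggido Daily"]) flags the name against
# "Gyeonggi Daily" with a similarity ratio of roughly 0.93.
```

Names identical to a real outlet (ratio 1.0) need a different check, such as comparing registration details or hosting, since string similarity alone cannot separate them.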
@ -7,7 +7,7 @@
| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00080 Hundreds Of Propaganda Accounts Targeting Iran And Qatar Have Been Removed From Facebook](../../generated_pages/incidents/I00080.md) | <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br> “Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler noun|linking verb|noun/verb/adjective|,” which appears to reveal the formula used to write Twitter bios for the accounts.”</i><br><br> The Jenny Powell account used in this influence operation presents as both a journalist and an activist (T0097.102: Journalist Persona, T0097.103: Activist Persona, T0143.002: Fabricated Persona). This example shows how threat actors can easily follow a template to present a fabricated persona to their target audience (T0144.002: Persona Template). |
| [I00080 Hundreds Of Propaganda Accounts Targeting Iran And Qatar Have Been Removed From Facebook](../../generated_pages/incidents/I00080.md) | <i>“One example of a fake reporter account targeting Americans is “Jenny Powell,” a self-described Washington-based journalist, volunteer, and environmental activist. At first glance, Powell’s Twitter timeline looks like it belongs to a young and eager reporter amplifying her interests. But her profile photo is a stock image, and many of her links go to the propaganda sites.<br><br> “Powell, who joined the platform just last month, shares links to stories from major US news media outlets, retweets local news about Washington, DC, and regularly promotes content from The Foreign Code and The Economy Club. Other fake journalist accounts behaved similarly to Powell and had generic descriptions. One of the accounts, for a fake Bruce Lopez in Louisiana, has a bio that describes him as a “Correspondent Traveler (noun) (linking verb) (noun/verb/adjective),” which appears to reveal the formula used to write Twitter bios for the accounts.”</i><br><br> The Jenny Powell account used in this influence operation presents as both a journalist and an activist (T0097.102: Journalist Persona, T0097.103: Activist Persona, T0143.002: Fabricated Persona). This example shows how threat actors can easily follow a template to present a fabricated persona to their target audience (T0144.002: Persona Template). |
| [I00086 #WeAreNotSafe – Exposing How a Post-October 7th Disinformation Network Operates on Israeli Social Media](../../generated_pages/incidents/I00086.md) | In this report accounts were identified as part of “a sophisticated and extensive coordinated network orchestrating a disinformation campaign targeting Israeli digital spaces since October 7th, 2023”.<br><br> <i>“A core component of the detection methodology was applying qualitative linguistic analysis. This involved checking the fingerprint of language, syntax, and style used in the comments and profile of the suspected account. Each account bio consistently incorporated a combination of specific elements: emojis, nationality, location, educational institution or occupation, age, and a personal quote, sports team or band. The recurrence of this specific formula across multiple accounts hinted at a standardized template for bio construction.”</i><br><br> This example shows how actors can follow a templated formula to present a persona on social media platforms (T0143.002: Fabricated Persona, T0144.002: Persona Template). |
@ -8,7 +8,7 @@
| Incident | Descriptions given for this incident |
| -------- | -------------------- |
| [I00070 Eli Lilly Clarifies It’s Not Offering Free Insulin After Tweet From Fake Verified Account—As Chaos Unfolds On Twitter](../../generated_pages/incidents/I00070.md) | <i>“Twitter Blue launched [November 2022], giving any users who pay $8 a month the ability to be verified on the site, a feature previously only available to public figures, government officials and journalists as a way to show they are who they claim to be.<br><br> “[A day after the launch], an account with the handle @EliLillyandCo labeled itself with the name “Eli Lilly and Company,” and by using the same logo as the company in its profile picture and with the verification checkmark, was indistinguishable from the real company (the picture has since been removed and the account has labeled itself as a parody profile).<br><br> The parody account tweeted “we are excited to announce insulin is free now.””</i><br><br> In this example an account impersonated the pharmaceutical company Eli Lilly (T0097.205: Business Persona, T0143.003: Impersonated Persona) by copying its name, profile picture (T0145.001: Copy Account Imagery), and paying for verification. |
| [I00076 Network of Social Media Accounts Impersonates U.S. Political Candidates, Leverages U.S. and Israeli Media in Support of Iranian Interests](../../generated_pages/incidents/I00076.md) | <i>“Only three of the Russian operatives identified by local hires of the campaign responded to requests for comment. All acknowledged visiting Madagascar last year, but only one admitted working as a pollster on behalf of the president.<br><br> “The others said they were simply tourists. Pyotr Korolyov, described as a sociologist on one spreadsheet, spent much of the summer of 2018 and fall hunched over a computer, deep in polling data at La Résidence Ankerana, a hotel the Russians used as their headquarters, until he was hospitalized with the measles, according to one person who worked with him.<br><br> “In an email exchange, Mr. Korolyov confirmed that he had come down with the measles, but rejected playing a role in a Russian operation. He did defend the idea of one, though.<br><br> ““Russia should influence elections around the world, the same way the United States influences elections,” he wrote. “Sooner or later Russia will return to global politics as a global player,” he added. “And the American establishment will just have to accept that.””</i><br><br> This behaviour matches T0129.006: Deny Involvement because the actors contacted by journalists denied that they had participated in election interference (in spite of the evidence to the contrary).<i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
| [I00076 Network of Social Media Accounts Impersonates U.S. Political Candidates, Leverages U.S. and Israeli Media in Support of Iranian Interests](../../generated_pages/incidents/I00076.md) | <i>“Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network.<br><br> “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month”<br><br> [...]<br><br> “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.”</I><br><br> In this example actors impersonated existing political candidates (T0097.110: Member of Political Party Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying legitimate accounts’ imagery (T0145.001: Copy Account Imagery), and copying its previous posts (T0084.002: Plagiarise Content). |
| [I00086 #WeAreNotSafe – Exposing How a Post-October 7th Disinformation Network Operates on Israeli Social Media](../../generated_pages/incidents/I00086.md) | <i>“In the wake of the Hamas attack on October 7th, the Israel Defense Forces (IDF) Information Security Department revealed a campaign of Instagram accounts impersonating young, attractive Israeli women who were actively engaging Israeli soldiers, attempting to extract information through direct messages.<br><br> [...]<br><br> “Some profiles underwent a reverse-image search of their photos to ascertain their authenticity. Many of the images searched were found to be appropriated from genuine social media profiles or sites such as Pinterest. When this was the case, the account was marked as confirmed to be inauthentic. One innovative method involves using photos that are initially frames from videos, which allows for evading reverse searches in most cases. This is seen in Figure 4, where an image uploaded by an inauthentic account was a screenshot taken from a TikTok video.”</i><br><br> In this example accounts associated with an influence operation used account imagery showing <i>“young, attractive Israeli women”</i> (T0145.006: Attractive Person Account Imagery), with some of these assets taken from existing accounts not associated with the operation (T0145.001: Copy Account Imagery). A sketch of a comparable local image-matching check follows this table. |
| [I00087 Challenging Truth and Trust: A Global Inventory of Organized Social Media Manipulation](../../generated_pages/incidents/I00087.md) | <i>“In 2017, Tanya O'Carroll, a technology and human rights adviser for Amnesty International, published an investigation of the political impact of bots and trolls in Mexico (O’Carroll, 2017). An article by the BBC describes a video showing the operation of a "troll farm" in Mexico, where people were tweeting in support of Enrique Peña Nieto of the PRI in 2012 (Martinez, 2018).<br><br>“According to a report published by El País, the main target of parties’ online strategies are young people, including 14 million new voters who are expected to play a decisive role in the outcome of the July 2018 election (Peinado et al., 2018). Thus, one of the strategies employed by these bots was the use of profile photos of attractive people from other countries (Soloff, 2017).”</i><br><br>In this example accounts copied the profile pictures of attractive people from other countries (T0145.001: Copy Account Imagery, T0145.006: Attractive Person Account Imagery). |
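The reverse-image searching described in the I00086 row above can be approximated locally when analysts hold a reference set of known images (for example, previously catalogued stolen profile photos). This is a minimal sketch assuming the `Pillow` and `imagehash` packages; the hash distance threshold is an illustrative assumption.

```python
# Sketch: compare profile photos against a reference set of known images
# using perceptual hashes, a local stand-in for reverse-image search.
from PIL import Image
import imagehash

def build_reference_hashes(paths):
    return {path: imagehash.phash(Image.open(path)) for path in paths}

def find_copied_imagery(profile_photo, reference_hashes, max_distance=6):
    probe = imagehash.phash(Image.open(profile_photo))
    # Subtracting two ImageHash values yields their Hamming distance.
    return [path for path, h in reference_hashes.items()
            if probe - h <= max_distance]
```

As the report notes, imagery lifted from video frames usually evades index-based reverse search; a local comparison like this only helps if the source frames happen to be in the reference set.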