Technique T0143.003: Impersonated Persona
-
Summary: Threat actors may impersonate existing individuals or institutions to conceal their network identity, add legitimacy to content, or harm the impersonated target’s reputation. This Technique covers situations where an actor presents themselves as another existing individual or institution.
This Technique was previously called Prepare Assets Impersonating Legitimate Entities and used the ID T0099.
Associated Techniques and Sub-techniques
T0097: Presented Persona: Analysts can use the sub-techniques of T0097: Presented Persona to categorise the type of impersonation. For example, a document developed by a threat actor which falsely presented as a letter from a government department could be documented using T0085.004: Develop Document, T0143.003: Impersonated Persona, and T0097.206: Government Institution Persona.
T0145.001: Copy Account Imagery: Actors may take existing accounts’ profile pictures as part of their impersonation efforts.
Belongs to tactic stage: TA16
Incident | Descriptions given for this incident |
---|---|
I00064 Tinder nightmares: the promise and peril of political bots | “In the days leading up to the UK’s [2019] general election, youths looking for love online encountered a whole new kind of Tinder nightmare. A group of young activists built a Tinder chatbot to co-opt profiles and persuade swing voters to support Labour. The bot accounts sent 30,000-40,000 messages to targeted 18-25 year olds in battleground constituencies like Dudley North, which Labour ended up winning by only 22 votes. [...] “The activists maintain that the project was meant to foster democratic engagement. But screenshots of the bots’ activity expose a harsher reality. Images of conversations between real users and these bots, posted on i-D, Mashable, as well as on Fowler and Goodman’s public Twitter accounts, show that the bots did not identify themselves as automated accounts, instead posing as the user whose profile they had taken over. While conducting research for this story, it turned out that a number of [the reporters’ friends] living in Oxford had interacted with the bot in the lead up to the election and had no idea that it was not a real person.” In this example users voluntarily handed over access to their real dating accounts for the automation of political messaging; the actors convinced these users to lend their accounts to the operation (T0141.001: Acquire Compromised Account). The actors maintained each account’s existing persona and presented themselves as potential romantic suitors for legitimate platform users (T0097.109: Romantic Suitor Persona, T0143.003: Impersonated Persona). |
I00068 Attempted Audio Deepfake Call Targets LastPass Employee | “While reports of [...] deepfake calls targeting private companies are luckily still rare, LastPass itself experienced a deepfake attempt earlier today that we are sharing with the larger community to help raise awareness that this tactic is spreading and all companies should be on the alert. In our case, an employee received a series of calls, texts, and at least one voicemail featuring an audio deepfake from a threat actor impersonating our CEO via WhatsApp. As the attempted communication was outside of normal business communication channels and due to the employee’s suspicion regarding the presence of many of the hallmarks of a social engineering attempt (such as forced urgency), our employee rightly ignored the messages and reported the incident to our internal security team so that we could take steps to both mitigate the threat and raise awareness of the tactic both internally and externally.” In this example attackers impersonated the CEO of LastPass (T0097.100: Individual Persona, T0143.003: Impersonated Persona), targeting one of its employees over WhatsApp (T0043.001: Use Encrypted Chat Apps) using deepfaked audio (T0088.001: Develop AI-Generated Audio (Deepfakes)). |
I00069 Uncharmed: Untangling Iran's APT42 Operations | “[Iranian state-sponsored cyber espionage actor] APT42 cloud operations attack lifecycle can be described in details as follows: - “Social engineering schemes involving decoys and trust building, which includes masquerading as legitimate NGOs and conducting ongoing correspondence with the target, sometimes lasting several weeks. - The threat actor masqueraded as well-known international organizations in the legal and NGO fields and sent emails from domains typosquatting the original NGO domains, for example aspenlnstitute[.]org. - The Aspen Institute became aware of this spoofed domain and collaborated with industry partners, including blocking it in SafeBrowsing, thus protecting users of Google Chrome and additional browsers. - To increase their credibility, APT42 impersonated high-ranking personnel working at the aforementioned organizations when creating the email personas. - APT42 enhanced their campaign credibility by using decoy material inviting targets to legitimate and relevant events and conferences. In one instance, the decoy material was hosted on an attacker-controlled SharePoint folder, accessible only after the victim entered their credentials. Mandiant did not identify malicious elements in the files, suggesting they were used solely to gain the victim’s trust.” In this example APT42, an Iranian state-sponsored cyber espionage actor, created a domain impersonating the existing NGO The Aspen Institute (T0143.003: Impersonated Persona, T0097.207: NGO Persona). They increased the perceived legitimacy of the impersonation by also impersonating high-ranking employees of the NGO (T0097.100: Individual Persona, T0143.003: Impersonated Persona). A sketch of how this kind of lookalike domain can be flagged appears after the incident table. |
I00070 Eli Lilly Clarifies It’s Not Offering Free Insulin After Tweet From Fake Verified Account—As Chaos Unfolds On Twitter | “Twitter Blue launched [November 2022], giving any users who pay $8 a month the ability to be verified on the site, a feature previously only available to public figures, government officials and journalists as a way to show they are who they claim to be. “[A day after the launch], an account with the handle @EliLillyandCo labeled itself with the name “Eli Lilly and Company,” and by using the same logo as the company in its profile picture and with the verification checkmark, was indistinguishable from the real company (the picture has since been removed and the account has labeled itself as a parody profile). The parody account tweeted “we are excited to announce insulin is free now.”” In this example an account impersonated the pharmaceutical company Eli Lilly (T0097.205: Business Persona, T0143.003: Impersonated Persona) by copying its name and profile picture (T0145.001: Copy Account Imagery) and by paying for verification. |
I00071 Russia-aligned hacktivists stir up anti-Ukrainian sentiments in Poland | “The August 17 [2022] Telegram post [which contained a falsified letter from the Ukrainian Minister of Foreign Affairs asking Poland to rename Belwederska Street in Warsaw — the location of the Russian embassy building — as Stepan Bandera Street, in honor of the far-right nationalist who led the Ukrainian Insurgent Army during WWII] also contained screenshots of Facebook posts that appeared on two Facebook accounts belonging to Polish nationals Piotr Górka, an expert in the history of the Polish Air Force, and Dariusz Walusiak, a Polish historian and documentary maker. The Górka post suggested that he fully supported the Polish government’s decision to change Belwederska Street to Stepan Bandera Street. “In a statement to the DFRLab, Górka said his account was accessed without his consent. “This is not my post loaded to my Facebook page,” he explained. “My site was hacked, some days ago.” At the time of publishing, Piotr Górka’s post and his Facebook account were no longer accessible. “The post on Górka’s Facebook page was shared by Dariusz Walusiak’s Facebook account; the account also reposted it on the Facebook walls of more than twenty other Facebook users, including Adam Kalita, currently working at Krakow branch of the Institute of National Remembrance; Jan Kasprzyk, head of the Office for War Veterans and Victims of Oppression; and Alicja Kondraciuk, a Polish public figure living in Krakow. “Walusiak’s Facebook account is also no longer accessible. Given his work on Polish history and identity, it seems highly unlikely he would support the Bandera measure; the DFRLab has also reached out to him for comment. “The fact that Joker DPR’s Telegram post included screenshots of their Facebook posts raises the strong possibility that both Facebook accounts were compromised, and that hackers planted false statements on their pages that would seem out of character for them in order to gain further attention to the forged documents.” In this example, threat actors used compromised accounts (T0141.001: Acquire Compromised Account) of Polish historians who have enough relevant knowledge to plausibly weigh in on the forged letter’s narrative (T0143.003: Impersonated Persona, T0097.101: Local Persona, T0097.108: Expert Persona). This matches T0097.108: Expert Persona because the impersonation exploited Górka and Walusiak’s existing personas as experts in Polish history. |
I00075 How Russia Meddles Abroad for Profit: Cash, Trolls and a Cult Leader | “In the campaign’s final weeks, Pastor Mailhol said, the team of Russians made a request: Drop out of the race and support Mr. Rajoelina. He refused. “The Russians made the same proposal to the history professor running for president, saying, “If you accept this deal you will have money” according to Ms. Rasamimanana, the professor’s campaign manager. When the professor refused, she said, the Russians created a fake Facebook page that mimicked his official page and posted an announcement on it that he was supporting Mr. Rajoelina.” In this example actors created a Facebook page styled to look like a presidential candidate’s official page, using it to post a false announcement that the candidate was supporting Mr. Rajoelina (T0097.110: Party Official Persona, T0143.003: Impersonated Persona). |
I00076 Network of Social Media Accounts Impersonates U.S. Political Candidates, Leverages U.S. and Israeli Media in Support of Iranian Interests | “Some Twitter accounts in the network [of inauthentic accounts attributed to Iran] impersonated Republican political candidates that ran for House of Representatives seats in the 2018 U.S. congressional midterms. These accounts appropriated the candidates’ photographs and, in some cases, plagiarized tweets from the real individuals’ accounts. Aside from impersonating real U.S. political candidates, the behavior and activity of these accounts resembled that of the others in the network. “For example, the account @livengood_marla impersonated Marla Livengood, a 2018 candidate for California’s 9th Congressional District, using a photograph of Livengood and a campaign banner for its profile and background pictures. The account began tweeting on Sept. 24, 2018, with its first tweet plagiarizing one from Livengood’s official account earlier that month” [...] “In another example, the account @ButlerJineea impersonated Jineea Butler, a 2018 candidate for New York’s 13th Congressional District, using a photograph of Butler for its profile picture and incorporating her campaign slogans into its background picture, as well as claiming in its Twitter bio to be a “US House candidate, NY-13” and linking to Butler’s website, jineeabutlerforcongress[.]com.” In this example actors impersonated existing political candidates (T0097.110: Party Official Persona, T0143.003: Impersonated Persona), strengthening the impersonation by copying the legitimate accounts’ imagery (T0145.001: Copy Account Imagery) and plagiarising their previous posts (T0084.002: Plagiarise Content). |
I00082 Meta’s November 2021 Adversarial Threat Report | “[Meta] removed a network of accounts in Vietnam for violating our Inauthentic Behavior policy against mass reporting. They coordinated the targeting of activists and other people who publicly criticized the Vietnamese government and used false reports of various violations in an attempt to have these users removed from our platform. The people behind this activity relied primarily on authentic and duplicate accounts to submit hundreds — in some cases, thousands — of complaints against their targets through our abuse reporting flows. “Many operators also maintained fake accounts — some of which were detected and disabled by our automated systems — to pose as their targets so they could then report the legitimate accounts as fake. They would frequently change the gender and name of their fake accounts to resemble the target individual. Among the most common claims in this misleading reporting activity were complaints of impersonation, and to a much lesser extent inauthenticity. The network also advertised abusive services in their bios and constantly evolved their tactics in an attempt to evade detection.” In this example actors repurposed their accounts to impersonate targeted activists (T0097.103: Activist Persona, T0143.003: Impersonated Persona) in order to falsely report the activists’ legitimate accounts as impersonations (T0124.001: Report Non-Violative Opposing Content). |
I00087 Challenging Truth and Trust: A Global Inventory of Organized Social Media Manipulation | “Another actor operating in China is the American-based company Devumi. Most of the Twitter accounts managed by Devumi resemble real people, and some are even associated with a kind of large-scale social identity theft. At least 55,000 of the accounts use the names, profile pictures, hometowns and other personal details of real Twitter users, including minors, according to The New York Times (Confessore et al., 2018).” In this example accounts impersonated real locals while spreading operation narratives (T0143.003: Impersonated Persona, T0097.101: Local Persona). The impersonation included stealing the legitimate accounts’ profile pictures (T0145.001: Copy Account Imagery). |
I00094 A glimpse inside a Chinese influence campaign: How bogus news websites blur the line between true and false | Researchers identified websites managed by a Chinese marketing firm which presented themselves as news organisations. “On its official website, the Chinese marketing firm boasted that they were in contact with news organizations across the globe, including one in South Korea called the “Chungcheng Times.” According to the joint team, this outlet is a fictional news organization created by the offending company. The Chinese company sought to disguise the site’s true identity and purpose by altering the name attached to it by one character—making it very closely resemble the name of a legitimate outlet operating out of Chungchengbuk-do. “The marketing firm also established a news organization under the Korean name “Gyeonggido Daily,” which closely resembles legitimate news outlets operating out of Gyeonggi province such as “Gyeonggi Daily,” “Daily Gyeonggi Newspaper,” and “Gyeonggi N Daily.” One of the fake news sites was named “Incheon Focus,” a title that could be easily mistaken for the legitimate local news outlet, “Focus Incheon.” Furthermore, the Chinese marketing company operated two fake news sites with names identical to two separate local news organizations, one of which ceased operations in December 2022. “In total, fifteen out of eighteen Chinese fake news sites incorporated the correct names of real regions in their fake company names. “If the operators had created fake news sites similar to major news organizations based in Seoul, however, the intended deception would have easily been uncovered,” explained Song Tae-eun, an assistant professor in the Department of National Security & Unification Studies at the Korea National Diplomatic Academy, to The Readable. “There is also the possibility that they are using the regional areas as an attempt to form ties with the local community; that being the government, the private sector, and religious communities.”” The firm styled its news sites to resemble existing local news outlets in its target regions (T0097.201: Local Institution Persona, T0097.202: News Outlet Persona, T0143.003: Impersonated Persona). A sketch of this kind of name-similarity screening also appears after the table. |
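The typosquatted domain in the APT42 incident above (aspenlnstitute[.]org, with a lowercase "l" standing in for the "i" of aspeninstitute.org) illustrates a pattern defenders can screen for mechanically. Below is a minimal sketch, not a production detector: it assumes a defender-supplied list of protected domains, and the homoglyph map covers only a few common confusable characters (dedicated tools such as dnstwist implement this idea far more thoroughly).

```python
# Minimal sketch: flag candidate domains that are a homoglyph swap or a
# single edit away from a protected domain. Illustrative only.

# Map visually confusable characters to one canonical form.
CANONICAL = str.maketrans({"l": "i", "1": "i", "0": "o"})

def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def is_lookalike(candidate: str, protected: str, max_edits: int = 1) -> bool:
    """True if the candidate's first label imitates the protected domain's."""
    cand = candidate.lower().split(".")[0]
    prot = protected.lower().split(".")[0]
    if cand == prot:
        return False  # an identical name is a different problem
    return (cand.translate(CANONICAL) == prot.translate(CANONICAL)
            or edit_distance(cand, prot) <= max_edits)

print(is_lookalike("aspenlnstitute.org", "aspeninstitute.org"))  # True
print(is_lookalike("example.org", "aspeninstitute.org"))         # False
```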
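The one-character renamings in the I00094 incident (e.g. "Gyeonggido Daily" alongside the legitimate "Gyeonggi Daily") lend themselves to a similar screen based on string similarity rather than exact lookup. The sketch below assumes an analyst-maintained list of legitimate outlet names (taken here from the quoted reporting) and uses Python's standard-library difflib; the 0.8 threshold is an illustrative choice, not a calibrated one.

```python
# Minimal sketch: rank a suspect outlet name against known legitimate
# outlets by similarity ratio, surfacing likely lookalikes.
from difflib import SequenceMatcher

LEGITIMATE_OUTLETS = [
    "Gyeonggi Daily",
    "Daily Gyeonggi Newspaper",
    "Gyeonggi N Daily",
    "Focus Incheon",
]

def lookalike_matches(suspect: str, known: list[str], threshold: float = 0.8):
    """Return (score, name) pairs above the threshold, most similar first."""
    scored = [(SequenceMatcher(None, suspect.lower(), name.lower()).ratio(), name)
              for name in known]
    return sorted((pair for pair in scored if pair[0] >= threshold), reverse=True)

print(lookalike_matches("Gyeonggido Daily", LEGITIMATE_OUTLETS))
# "Incheon Focus" reorders the words of "Focus Incheon", so its ratio is
# lower; a fuller screen would also compare token sets to catch transpositions.
print(lookalike_matches("Incheon Focus", LEGITIMATE_OUTLETS, threshold=0.5))
```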
Counters | Response types |
---|---|
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW