# Incident I00068: Attempted Audio Deepfake Call Targets LastPass Employee
- Summary: “In a new blog post from LastPass, the password management firm used by countless personal and corporate clients to help protect their login information, the company explains that someone used AI voice-cloning tech to spoof the voice of its CEO in an attempt to trick one of its employees.”
- Incident type:
- Year started:
- Countries:
- Found via:
- Date added:
| Reference | Pub Date | Authors | Org | Archive |
| --- | --- | --- | --- | --- |
| https://blog.lastpass.com/posts/2024/04/attempted-audio-deepfake-call-targets-lastpass-employee | 2024/04/10 | Mike Kosak | LastPass | https://web.archive.org/web/20240619143325/https://blog.lastpass.com/posts/2024/04/attempted-audio-deepfake-call-targets-lastpass-employee |
| Technique | Description given for this incident |
| --- | --- |
| T0088.001 Develop AI-Generated Audio (Deepfakes) | IT00000220 “While reports of [...] deepfake calls targeting private companies are luckily still rare, LastPass itself experienced a deepfake attempt earlier today that we are sharing with the larger community to help raise awareness that this tactic is spreading and all companies should be on the alert. In our case, an employee received a series of calls, texts, and at least one voicemail featuring an audio deepfake from a threat actor impersonating our CEO via WhatsApp. As the attempted communication was outside of normal business communication channels and due to the employee’s suspicion regarding the presence of many of the hallmarks of a social engineering attempt (such as forced urgency), our employee rightly ignored the messages and reported the incident to our internal security team so that we could take steps to both mitigate the threat and raise awareness of the tactic both internally and externally.” In this example, attackers impersonated the CEO of LastPass (T0097.100: Individual Persona, T0143.003: Impersonated Persona), targeting one of its employees over WhatsApp (T0151.004: Chat Platform, T0155.007: Encrypted Communication Channel) using deepfaked audio (T0088.001: Develop AI-Generated Audio (Deepfakes)). |
| T0097.100 Individual Persona | IT00000221 “While reports of [...] deepfake calls targeting private companies are luckily still rare, LastPass itself experienced a deepfake attempt earlier today that we are sharing with the larger community to help raise awareness that this tactic is spreading and all companies should be on the alert. In our case, an employee received a series of calls, texts, and at least one voicemail featuring an audio deepfake from a threat actor impersonating our CEO via WhatsApp. As the attempted communication was outside of normal business communication channels and due to the employee’s suspicion regarding the presence of many of the hallmarks of a social engineering attempt (such as forced urgency), our employee rightly ignored the messages and reported the incident to our internal security team so that we could take steps to both mitigate the threat and raise awareness of the tactic both internally and externally.” In this example, attackers impersonated the CEO of LastPass (T0097.100: Individual Persona, T0143.003: Impersonated Persona), targeting one of its employees over WhatsApp (T0151.004: Chat Platform, T0155.007: Encrypted Communication Channel) using deepfaked audio (T0088.001: Develop AI-Generated Audio (Deepfakes)). |
| T0143.003 Impersonated Persona | IT00000222 “While reports of [...] deepfake calls targeting private companies are luckily still rare, LastPass itself experienced a deepfake attempt earlier today that we are sharing with the larger community to help raise awareness that this tactic is spreading and all companies should be on the alert. In our case, an employee received a series of calls, texts, and at least one voicemail featuring an audio deepfake from a threat actor impersonating our CEO via WhatsApp. As the attempted communication was outside of normal business communication channels and due to the employee’s suspicion regarding the presence of many of the hallmarks of a social engineering attempt (such as forced urgency), our employee rightly ignored the messages and reported the incident to our internal security team so that we could take steps to both mitigate the threat and raise awareness of the tactic both internally and externally.” In this example, attackers created an account on WhatsApp which impersonated the CEO of LastPass (T0097.100: Individual Persona, T0143.003: Impersonated Persona, T0146: Account Asset, T0151.004: Chat Platform, T0155.007: Encrypted Communication Channel). They used this asset to target an employee using deepfaked audio (T0088.001: Develop AI-Generated Audio (Deepfakes)). |
| T0151.004 Chat Platform | IT00000219 “While reports of [...] deepfake calls targeting private companies are luckily still rare, LastPass itself experienced a deepfake attempt earlier today that we are sharing with the larger community to help raise awareness that this tactic is spreading and all companies should be on the alert. In our case, an employee received a series of calls, texts, and at least one voicemail featuring an audio deepfake from a threat actor impersonating our CEO via WhatsApp. As the attempted communication was outside of normal business communication channels and due to the employee’s suspicion regarding the presence of many of the hallmarks of a social engineering attempt (such as forced urgency), our employee rightly ignored the messages and reported the incident to our internal security team so that we could take steps to both mitigate the threat and raise awareness of the tactic both internally and externally.” In this example, attackers created an account on WhatsApp which impersonated the CEO of LastPass (T0097.100: Individual Persona, T0143.003: Impersonated Persona, T0146: Account Asset, T0151.004: Chat Platform, T0155.007: Encrypted Communication Channel). They used this asset to target an employee using deepfaked audio (T0088.001: Develop AI-Generated Audio (Deepfakes)). |
| T0155.007 Encrypted Communication Channel | IT00000547 “While reports of [...] deepfake calls targeting private companies are luckily still rare, LastPass itself experienced a deepfake attempt earlier today that we are sharing with the larger community to help raise awareness that this tactic is spreading and all companies should be on the alert. In our case, an employee received a series of calls, texts, and at least one voicemail featuring an audio deepfake from a threat actor impersonating our CEO via WhatsApp. As the attempted communication was outside of normal business communication channels and due to the employee’s suspicion regarding the presence of many of the hallmarks of a social engineering attempt (such as forced urgency), our employee rightly ignored the messages and reported the incident to our internal security team so that we could take steps to both mitigate the threat and raise awareness of the tactic both internally and externally.” In this example, attackers created an account on WhatsApp which impersonated the CEO of LastPass (T0097.100: Individual Persona, T0143.003: Impersonated Persona, T0146: Account Asset, T0151.004: Chat Platform, T0155.007: Encrypted Communication Channel). They used this asset to target an employee using deepfaked audio (T0088.001: Develop AI-Generated Audio (Deepfakes)). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW