# Incident I00068: Attempted Audio Deepfake Call Targets LastPass Employee
* **Summary:** <I>“In a new blog post from LastPass, the password management firm used by countless personal and corporate clients to help protect their login information, the company explains that someone used AI voice-cloning tech to spoof the voice of its CEO in an attempt to trick one of its employees.”</I>
* **Incident type:** 
* **Year started:**
* **Countries:** ,
* **Found via:**
* **Date added:** 

| Reference | Pub Date | Authors | Org | Archive |
| --------- | -------- | ------- | --- | ------- |
| [https://blog.lastpass.com/posts/2024/04/attempted-audio-deepfake-call-targets-lastpass-employee](https://blog.lastpass.com/posts/2024/04/attempted-audio-deepfake-call-targets-lastpass-employee) | 2024/04/10 | Mike Kosak | LastPass | [https://web.archive.org/web/20240619143325/https://blog.lastpass.com/posts/2024/04/attempted-audio-deepfake-call-targets-lastpass-employee](https://web.archive.org/web/20240619143325/https://blog.lastpass.com/posts/2024/04/attempted-audio-deepfake-call-targets-lastpass-employee) |

| Technique | Description given for this incident |
| --------- | ------------------------- |
| [T0043.001 Use Encrypted Chat Apps](../../generated_pages/techniques/T0043.001.md) | IT00000219 <i>“While reports of [...] deepfake calls targeting private companies are luckily still rare, LastPass itself experienced a deepfake attempt earlier today that we are sharing with the larger community to help raise awareness that this tactic is spreading and all companies should be on the alert. In our case, an employee received a series of calls, texts, and at least one voicemail featuring an audio deepfake from a threat actor impersonating our CEO via WhatsApp. As the attempted communication was outside of normal business communication channels and due to the employee’s suspicion regarding the presence of many of the hallmarks of a social engineering attempt (such as forced urgency), our employee rightly ignored the messages and reported the incident to our internal security team so that we could take steps to both mitigate the threat and raise awareness of the tactic both internally and externally.”</i><br><br> In this example attackers impersonated the CEO of LastPass (T0097.100: Individual Persona, T0143.003: Impersonated Persona), targeting one of its employees over WhatsApp (T0043.001: Use Encrypted Chat Apps) using deepfaked audio (T0088.001: Develop AI-Generated Audio (Deepfakes)). |
| [T0088.001 Develop AI-Generated Audio (Deepfakes)](../../generated_pages/techniques/T0088.001.md) | IT00000220 <i>“While reports of [...] deepfake calls targeting private companies are luckily still rare, LastPass itself experienced a deepfake attempt earlier today that we are sharing with the larger community to help raise awareness that this tactic is spreading and all companies should be on the alert. In our case, an employee received a series of calls, texts, and at least one voicemail featuring an audio deepfake from a threat actor impersonating our CEO via WhatsApp. As the attempted communication was outside of normal business communication channels and due to the employee’s suspicion regarding the presence of many of the hallmarks of a social engineering attempt (such as forced urgency), our employee rightly ignored the messages and reported the incident to our internal security team so that we could take steps to both mitigate the threat and raise awareness of the tactic both internally and externally.”</i><br><br> In this example attackers impersonated the CEO of LastPass (T0097.100: Individual Persona, T0143.003: Impersonated Persona), targeting one of its employees over WhatsApp (T0043.001: Use Encrypted Chat Apps) using deepfaked audio (T0088.001: Develop AI-Generated Audio (Deepfakes)). |
| [T0097.100 Individual Persona](../../generated_pages/techniques/T0097.100.md) | IT00000221 <i>“While reports of [...] deepfake calls targeting private companies are luckily still rare, LastPass itself experienced a deepfake attempt earlier today that we are sharing with the larger community to help raise awareness that this tactic is spreading and all companies should be on the alert. In our case, an employee received a series of calls, texts, and at least one voicemail featuring an audio deepfake from a threat actor impersonating our CEO via WhatsApp. As the attempted communication was outside of normal business communication channels and due to the employee’s suspicion regarding the presence of many of the hallmarks of a social engineering attempt (such as forced urgency), our employee rightly ignored the messages and reported the incident to our internal security team so that we could take steps to both mitigate the threat and raise awareness of the tactic both internally and externally.”</i><br><br> In this example attackers impersonated the CEO of LastPass (T0097.100: Individual Persona, T0143.003: Impersonated Persona), targeting one of its employees over WhatsApp (T0043.001: Use Encrypted Chat Apps) using deepfaked audio (T0088.001: Develop AI-Generated Audio (Deepfakes)). |
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | IT00000222 <i>“While reports of [...] deepfake calls targeting private companies are luckily still rare, LastPass itself experienced a deepfake attempt earlier today that we are sharing with the larger community to help raise awareness that this tactic is spreading and all companies should be on the alert. In our case, an employee received a series of calls, texts, and at least one voicemail featuring an audio deepfake from a threat actor impersonating our CEO via WhatsApp. As the attempted communication was outside of normal business communication channels and due to the employee’s suspicion regarding the presence of many of the hallmarks of a social engineering attempt (such as forced urgency), our employee rightly ignored the messages and reported the incident to our internal security team so that we could take steps to both mitigate the threat and raise awareness of the tactic both internally and externally.”</i><br><br> In this example attackers impersonated the CEO of LastPass (T0097.100: Individual Persona, T0143.003: Impersonated Persona), targeting one of its employees over WhatsApp (T0043.001: Use Encrypted Chat Apps) using deepfaked audio (T0088.001: Develop AI-Generated Audio (Deepfakes)). |

DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW
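
Note: for anyone consuming these generated incident pages programmatically, the technique mapping in the table above can be captured as a small structured record. The sketch below is illustrative only; the `Incident` dataclass and its field names are assumptions made for this note and are not part of the DISARM tooling or build pipeline.

```python
# Illustrative sketch only: a minimal structured record of this incident's
# DISARM technique mapping. The dataclass and field names are assumptions
# chosen for this example, not part of any official DISARM schema.
from dataclasses import dataclass, field


@dataclass
class Incident:
    incident_id: str
    title: str
    source_url: str
    pub_date: str  # YYYY/MM/DD, as given in the reference table above
    techniques: dict[str, str] = field(default_factory=dict)  # technique ID -> name


i00068 = Incident(
    incident_id="I00068",
    title="Attempted Audio Deepfake Call Targets LastPass Employee",
    source_url="https://blog.lastpass.com/posts/2024/04/attempted-audio-deepfake-call-targets-lastpass-employee",
    pub_date="2024/04/10",
    techniques={
        "T0043.001": "Use Encrypted Chat Apps",
        "T0088.001": "Develop AI-Generated Audio (Deepfakes)",
        "T0097.100": "Individual Persona",
        "T0143.003": "Impersonated Persona",
    },
)

if __name__ == "__main__":
    # Print one line per technique observed in this incident.
    for tid, name in i00068.techniques.items():
        print(f"{i00068.incident_id}: {tid} {name}")
```

Running the snippet prints one line per technique recorded for this incident (T0043.001, T0088.001, T0097.100, T0143.003), mirroring the table above.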