mirror of
https://github.com/DISARMFoundation/DISARMframeworks.git
synced 2025-10-04 09:28:33 -04:00
Corrected columns for urls sheet, added back asset into technique names, and tidied up mapping of existing incidents to amended techniques
This commit is contained in:
parent 964938bd15
commit 84f0700c2e
200 changed files with 625 additions and 701 deletions
@@ -23,8 +23,9 @@
| Technique | Description |
| --------- | ------------------------- |
| [T0088.001 Develop AI-Generated Audio (Deepfakes)](../../generated_pages/techniques/T0088.001.md) | IT00000220 <i>“While reports of [...] deepfake calls targeting private companies are luckily still rare, LastPass itself experienced a deepfake attempt earlier today that we are sharing with the larger community to help raise awareness that this tactic is spreading and all companies should be on the alert. In our case, an employee received a series of calls, texts, and at least one voicemail featuring an audio deepfake from a threat actor impersonating our CEO via WhatsApp. As the attempted communication was outside of normal business communication channels and due to the employee’s suspicion regarding the presence of many of the hallmarks of a social engineering attempt (such as forced urgency), our employee rightly ignored the messages and reported the incident to our internal security team so that we could take steps to both mitigate the threat and raise awareness of the tactic both internally and externally.”</i><br><br>In this example, attackers impersonated the CEO of LastPass (T0097.100: Individual Persona, T0143.003: Impersonated Persona), targeting one of its employees over WhatsApp (T0151.004: Chat Platform, T0155.007: Encrypted Communication Channel) using deepfaked audio (T0088.001: Develop AI-Generated Audio (Deepfakes)). |
| [T0097.100 Individual Persona](../../generated_pages/techniques/T0097.100.md) | IT00000221 <i>“While reports of [...] deepfake calls targeting private companies are luckily still rare, LastPass itself experienced a deepfake attempt earlier today that we are sharing with the larger community to help raise awareness that this tactic is spreading and all companies should be on the alert. In our case, an employee received a series of calls, texts, and at least one voicemail featuring an audio deepfake from a threat actor impersonating our CEO via WhatsApp. As the attempted communication was outside of normal business communication channels and due to the employee’s suspicion regarding the presence of many of the hallmarks of a social engineering attempt (such as forced urgency), our employee rightly ignored the messages and reported the incident to our internal security team so that we could take steps to both mitigate the threat and raise awareness of the tactic both internally and externally.”</i><br><br>In this example, attackers impersonated the CEO of LastPass (T0097.100: Individual Persona, T0143.003: Impersonated Persona), targeting one of its employees over WhatsApp (T0151.004: Chat Platform, T0155.007: Encrypted Communication Channel) using deepfaked audio (T0088.001: Develop AI-Generated Audio (Deepfakes)). |
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | IT00000222 <i>“While reports of [...] deepfake calls targeting private companies are luckily still rare, LastPass itself experienced a deepfake attempt earlier today that we are sharing with the larger community to help raise awareness that this tactic is spreading and all companies should be on the alert. In our case, an employee received a series of calls, texts, and at least one voicemail featuring an audio deepfake from a threat actor impersonating our CEO via WhatsApp. As the attempted communication was outside of normal business communication channels and due to the employee’s suspicion regarding the presence of many of the hallmarks of a social engineering attempt (such as forced urgency), our employee rightly ignored the messages and reported the incident to our internal security team so that we could take steps to both mitigate the threat and raise awareness of the tactic both internally and externally.”</i><br><br>In this example, attackers created an account on WhatsApp which impersonated the CEO of LastPass (T0097.100: Individual Persona, T0143.003: Impersonated Persona, T0146: Account, T0151.004: Chat Platform, T0155.007: Encrypted Communication Channel). They used this asset to target an employee using deepfaked audio (T0088.001: Develop AI-Generated Audio (Deepfakes)). |
| [T0151.004 Chat Platform](../../generated_pages/techniques/T0151.004.md) | IT00000219 <i>“While reports of [...] deepfake calls targeting private companies are luckily still rare, LastPass itself experienced a deepfake attempt earlier today that we are sharing with the larger community to help raise awareness that this tactic is spreading and all companies should be on the alert. In our case, an employee received a series of calls, texts, and at least one voicemail featuring an audio deepfake from a threat actor impersonating our CEO via WhatsApp. As the attempted communication was outside of normal business communication channels and due to the employee’s suspicion regarding the presence of many of the hallmarks of a social engineering attempt (such as forced urgency), our employee rightly ignored the messages and reported the incident to our internal security team so that we could take steps to both mitigate the threat and raise awareness of the tactic both internally and externally.”</i><br><br>In this example, attackers created an account on WhatsApp which impersonated the CEO of LastPass (T0097.100: Individual Persona, T0143.003: Impersonated Persona, T0146: Account, T0151.004: Chat Platform, T0155.007: Encrypted Communication Channel). They used this asset to target an employee using deepfaked audio (T0088.001: Develop AI-Generated Audio (Deepfakes)). |
| [T0143.003 Impersonated Persona](../../generated_pages/techniques/T0143.003.md) | IT00000222 <i>“While reports of [...] deepfake calls targeting private companies are luckily still rare, LastPass itself experienced a deepfake attempt earlier today that we are sharing with the larger community to help raise awareness that this tactic is spreading and all companies should be on the alert. In our case, an employee received a series of calls, texts, and at least one voicemail featuring an audio deepfake from a threat actor impersonating our CEO via WhatsApp. As the attempted communication was outside of normal business communication channels and due to the employee’s suspicion regarding the presence of many of the hallmarks of a social engineering attempt (such as forced urgency), our employee rightly ignored the messages and reported the incident to our internal security team so that we could take steps to both mitigate the threat and raise awareness of the tactic both internally and externally.”</i><br><br>In this example, attackers created an account on WhatsApp which impersonated the CEO of LastPass (T0097.100: Individual Persona, T0143.003: Impersonated Persona, T0146: Account Asset, T0151.004: Chat Platform, T0155.007: Encrypted Communication Channel). They used this asset to target an employee using deepfaked audio (T0088.001: Develop AI-Generated Audio (Deepfakes)). |
| [T0151.004 Chat Platform](../../generated_pages/techniques/T0151.004.md) | IT00000219 <i>“While reports of [...] deepfake calls targeting private companies are luckily still rare, LastPass itself experienced a deepfake attempt earlier today that we are sharing with the larger community to help raise awareness that this tactic is spreading and all companies should be on the alert. In our case, an employee received a series of calls, texts, and at least one voicemail featuring an audio deepfake from a threat actor impersonating our CEO via WhatsApp. As the attempted communication was outside of normal business communication channels and due to the employee’s suspicion regarding the presence of many of the hallmarks of a social engineering attempt (such as forced urgency), our employee rightly ignored the messages and reported the incident to our internal security team so that we could take steps to both mitigate the threat and raise awareness of the tactic both internally and externally.”</i><br><br>In this example, attackers created an account on WhatsApp which impersonated the CEO of LastPass (T0097.100: Individual Persona, T0143.003: Impersonated Persona, T0146: Account Asset, T0151.004: Chat Platform, T0155.007: Encrypted Communication Channel). They used this asset to target an employee using deepfaked audio (T0088.001: Develop AI-Generated Audio (Deepfakes)). |
| [T0155.007 Encrypted Communication Channel](../../generated_pages/techniques/T0155.007.md) | IT00000547 <i>“While reports of [...] deepfake calls targeting private companies are luckily still rare, LastPass itself experienced a deepfake attempt earlier today that we are sharing with the larger community to help raise awareness that this tactic is spreading and all companies should be on the alert. In our case, an employee received a series of calls, texts, and at least one voicemail featuring an audio deepfake from a threat actor impersonating our CEO via WhatsApp. As the attempted communication was outside of normal business communication channels and due to the employee’s suspicion regarding the presence of many of the hallmarks of a social engineering attempt (such as forced urgency), our employee rightly ignored the messages and reported the incident to our internal security team so that we could take steps to both mitigate the threat and raise awareness of the tactic both internally and externally.”</i><br><br>In this example, attackers created an account on WhatsApp which impersonated the CEO of LastPass (T0097.100: Individual Persona, T0143.003: Impersonated Persona, T0146: Account Asset, T0151.004: Chat Platform, T0155.007: Encrypted Communication Channel). They used this asset to target an employee using deepfaked audio (T0088.001: Develop AI-Generated Audio (Deepfakes)). |
DO NOT EDIT ABOVE THIS LINE - PLEASE ADD NOTES BELOW